
Data processing apparatus, data processing method, and non-transitory storage medium

Patent number
US10867162B2
Publication date
2020-12-15
Applicant
NEC Corporation (Tokyo, JP)
Inventors
Jianquan Liu; Shoji Nishimura; Takuya Araki; Yasufumi Hirakawa
IPC classification
G06K9/00; G06F16/783; G06F16/00
Technical field
person extraction; moving image data; appearance frequency
Region: Tokyo

Abstract

A data processing apparatus (1) of the present invention includes a unit that retrieves a predetermined subject from moving image data. The data processing apparatus includes a person extraction unit (10) that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, and an output unit (20) that outputs information regarding the extracted person.

Description

Then, the person extraction unit 10 groups detection IDs obtained from different frames such that those having associated feature values that are similar to each other by a predetermined level or more belong to the same group. As a result, in a case where the same person is detected from a plurality of frames, the detection IDs of the plurality of detections can be grouped. The person extraction unit 10 then assigns a person ID to each group, the person ID identifying each “person detected in the moving image data to be analyzed”. As a result, detected person information shown in FIG. 6 is generated. In the detected person information shown in the diagram, a person ID for identifying a “person detected in the moving image data to be analyzed” is associated with a detection ID.
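A minimal sketch of this grouping step (Python; the cosine-similarity criterion, the 0.8 threshold, and all function names are illustrative assumptions, not taken from the patent):

import numpy as np

def group_detections(detections, similarity_threshold=0.8):
    """Group detection IDs whose associated feature values are similar to
    each other by a predetermined level or more, then assign a person ID
    (here simply the group index) to each group.

    detections: iterable of (detection_id, feature_vector) pairs obtained
    from different frames; feature_vector is a 1-D numpy array.
    Returns a dict mapping person_id -> list of detection_ids (cf. FIG. 6).
    """
    groups = []  # each entry: (normalized representative feature, [detection_ids])
    for det_id, feat in detections:
        feat = np.asarray(feat, dtype=float)
        feat = feat / (np.linalg.norm(feat) + 1e-12)   # normalize for cosine similarity
        for rep, ids in groups:
            if float(np.dot(rep, feat)) >= similarity_threshold:
                ids.append(det_id)                     # same person seen in another frame
                break
        else:
            groups.append((feat, [det_id]))            # new person -> new group
    return {person_id: ids for person_id, (_, ids) in enumerate(groups)}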

After the detected person information shown in FIG. 6 is completed by processing all the frames to be processed, the person extraction unit 10 determines whether or not each person (person corresponding to each person ID) detected in the moving image data 100 to be analyzed appears in each of the plurality of time windows based on the information shown in FIGS. 5 and 6 and on the relationship between each of the plurality of time windows and the frames included in each time window. As a result, a determination result shown in FIG. 4 is obtained.
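The window-membership determination might be sketched as follows (fixed-length, non-overlapping time windows and plain dicts standing in for the FIG. 5 / FIG. 6 information; all of this is an assumption for illustration):

def persons_per_window(detection_frames, person_to_detections,
                       total_frames, window_size):
    """Determine, for each time window, which detected persons appear in it.

    detection_frames: dict mapping detection_id -> index of the frame in
        which that detection was made (FIG. 5-style information).
    person_to_detections: dict mapping person_id -> list of detection_ids
        (FIG. 6-style detected person information).
    total_frames: number of frames to be processed.
    window_size: number of consecutive frames per time window.
    Returns a dict mapping window index -> set of person_ids (cf. FIG. 4).
    """
    num_windows = (total_frames + window_size - 1) // window_size
    windows = {w: set() for w in range(num_windows)}
    for person_id, det_ids in person_to_detections.items():
        for det_id in det_ids:
            frame = detection_frames[det_id]
            windows[frame // window_size].add(person_id)
    return windows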

Then, the person extraction unit 10 calculates appearance frequencies as described in the first example embodiment, and extracts a person whose appearance frequency satisfies a predetermined condition.
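Finally, a sketch of the frequency calculation and extraction, assuming (as one possible form of the "predetermined condition") that a person must appear in at least a given fraction of the time windows:

def extract_frequent_persons(windows, min_fraction=0.5):
    """Count, per person, the number of time windows in which the person
    appears, and extract the person IDs meeting the threshold condition.

    windows: dict mapping window index -> set of person_ids, as produced
        by persons_per_window above.
    min_fraction: illustrative "predetermined condition" (fraction of windows).
    """
    num_windows = len(windows)
    counts = {}
    for persons in windows.values():
        for pid in persons:
            counts[pid] = counts.get(pid, 0) + 1
    return [pid for pid, count in counts.items()
            if count / num_windows >= min_fraction]

Chained together (group_detections, then persons_per_window, then extract_frequent_persons), these sketches reproduce the flow described above under the stated assumptions; the extracted person IDs would then be passed to the output unit.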

Claims

1