
Data processing apparatus, data processing method, and non-transitory storage medium

Patent No.
US10867162B2
Publication date
2020-12-15
Applicant
NEC Corporation (Tokyo, JP)
Inventors
Jianquan Liu; Shoji Nishimura; Takuya Araki; Yasufumi Hirakawa
IPC classes
G06K9/00; G06F16/783; G06F16/00
Region: Tokyo

ABSTRACT

A data processing apparatus (1) of the present invention includes a unit that retrieves a predetermined subject from moving image data. The data processing apparatus includes a person extraction unit (10) that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, and an output unit (20) that outputs information regarding the extracted person.
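The abstract describes extracting the persons whose appearance frequency in the analyzed moving image data satisfies a predetermined condition. A minimal sketch of that frequency-based extraction, assuming detections are already available as (frame index, person ID) pairs from some upstream person detector/matcher (a hypothetical representation, not specified by the patent):

```python
from collections import Counter

def extract_frequent_persons(detections, threshold=None, top_n=None):
    """Extract person IDs whose appearance frequency satisfies a condition.

    detections: list of (frame_index, person_id) pairs.
    Either keep persons detected at least `threshold` times, or keep the
    `top_n` persons in descending order of appearance count (as in the
    "predetermined number of persons in descending order" claim language).
    """
    counts = Counter(person_id for _, person_id in detections)
    ranked = counts.most_common()  # (person_id, count), descending by count
    if top_n is not None:
        ranked = ranked[:top_n]
    if threshold is not None:
        ranked = [(p, c) for p, c in ranked if c >= threshold]
    return ranked

# Example: person "A" is detected in 3 frames, person "B" in 1
dets = [(0, "A"), (1, "A"), (1, "B"), (2, "A")]
print(extract_frequent_persons(dets, top_n=1))  # [('A', 3)]
```

The output unit of the claims would then render these (person, frequency) pairs, e.g. alongside a representative image of each person.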

DESCRIPTION

CROSS REFERENCE TO RELATED APPLICATIONS

This application is a National Stage of International Application No. PCT/JP2016/081526 filed Oct. 25, 2016, claiming priority based on Japanese Patent Application No. 2015-218164 filed Nov. 6, 2015, the entire contents of which are incorporated herein.

TECHNICAL FIELD

The present invention relates to a data processing apparatus, a data processing method, and a program.

BACKGROUND ART

Patent Document 1 discloses a retrieval apparatus that retrieves a specified person from an image. The retrieval apparatus includes: a first acquisition unit that acquires an image including a plurality of frames; a first extraction unit that extracts a plurality of persons included in the frames and extracts a plurality of types of first attributes characterizing the person; a second extraction unit that extracts a plurality of types of second attributes characterizing the person from a first person specified by the user; a retrieval unit that retrieves the first person from the frames by using at least one type of the second attributes as a retrieval condition; and an addition unit that, in a case where the first person is retrieved by the retrieval unit and the first person includes an attribute different from the first attribute and the second attribute, adds at least one type of a different attribute as a new condition for retrieval.

Patent Documents 2 and 3 disclose an index generation apparatus that generates indexes in which a plurality of nodes are hierarchized.

RELATED DOCUMENT Patent Document

[Patent Document 1] Japanese Patent Application Publication No. 2014-16968

CLAIMS
The invention claimed is:

1. A data processing apparatus, comprising: a memory comprising instructions; and at least one processor configured to execute the instructions to implement: an extraction unit that analyzes data to be analyzed and extracts a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output unit that outputs information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction unit comprises a person extraction unit that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output unit outputs information regarding the extracted person, and wherein the person extraction unit extracts a predetermined number of persons in descending order of appearance frequency.

2. The data processing apparatus according to claim 1, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and the person extraction unit determines whether or not each person detected in the moving image data to be analyzed appears in each of the plurality of pieces of child data, and calculates an appearance frequency for each detected person based on a result of the determination.

3. The data processing apparatus according to claim 1, wherein the moving image data to be analyzed comprises moving image data captured at the same place over a predetermined time period, and the person extraction unit analyzes the moving image data to be analyzed in units of time windows, each time window having a time width smaller than the predetermined time period, determines whether or not each person detected in the moving image data to be analyzed appears in each of a plurality of the time windows, and calculates an appearance frequency of each detected person based on a determination result.

4. The data processing apparatus according to claim 3, wherein the at least one processor is further configured to execute the instructions to implement: an input receiving unit that receives a user input to set a time width of the time window.

5. The data processing apparatus according to claim 3, wherein the at least one processor is further configured to execute the instructions to implement: an input receiving unit that receives a user input to individually set a start position and an end position of each of a plurality of the time windows.

6. The data processing apparatus according to claim 1, wherein the person extraction unit extracts a person whose appearance frequency is equal to or higher than a predetermined level.

7. The data processing apparatus according to claim 1, wherein the output unit outputs an image of the person acquired from the moving image data to be analyzed as information regarding the extracted person.

8. The data processing apparatus according to claim 7, wherein the output unit displays a list of a plurality of images of the person acquired from a plurality of frames different from each other as information regarding the extracted person.

9. The data processing apparatus according to claim 7, wherein the output unit outputs an appearance frequency in the moving image data to be analyzed as information regarding the extracted person.

10. The data processing apparatus according to claim 7, wherein the output unit displays information regarding each of a plurality of extracted persons as a list in descending order of appearance frequency.

11. The data processing apparatus according to claim 7, wherein the moving image data to be analyzed comprises moving image data captured at the same place over a predetermined time period, and the output unit outputs information indicating a temporal change in appearance frequency as information regarding the extracted person.

12. The data processing apparatus according to claim 7, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and the output unit outputs information indicating at least one of a place where the person has appeared and the number of places where the person has appeared as information regarding the extracted person.

13. The data processing apparatus according to claim 1, wherein the person extraction unit executes processing to determine whether or not a person detected in a frame to be processed is similar to a person detected in a previously processed frame in outer appearance feature values by a predetermined level or more, and the data processing apparatus further comprises a unit that receives a user input to set the predetermined level in the processing.

14. A data processing method executed by a computer, the method comprising: an extraction step of analyzing data to be analyzed and extracting a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output step of outputting information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction step comprises analyzing moving image data to be analyzed and extracting a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output step comprises outputting information regarding the extracted person, and wherein the extraction step further comprises extracting a predetermined number of persons in descending order of appearance frequency.

15. A non-transitory storage medium storing a program causing a computer to function as: an extraction unit that analyzes data to be analyzed and extracts a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output unit that outputs information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction unit comprises a person extraction unit that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output unit outputs information regarding the extracted person, and wherein the person extraction unit extracts a predetermined number of persons in descending order of appearance frequency.

16. A data processing apparatus, comprising: a memory comprising instructions; and at least one processor configured to execute the instructions to implement: an extraction unit that analyzes data to be analyzed and extracts a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output unit that outputs information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction unit comprises a person extraction unit that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output unit outputs information regarding the extracted person, wherein the output unit outputs an image of the person acquired from the moving image data to be analyzed as information regarding the extracted person, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and wherein the output unit outputs information indicating at least one of a place where the person has appeared and the number of places where the person has appeared as information regarding the extracted person.

17. The data processing apparatus according to claim 16, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and the person extraction unit determines whether or not each person detected in the moving image data to be analyzed appears in each of the plurality of pieces of child data, and calculates an appearance frequency for each detected person based on a result of the determination.

18. The data processing apparatus according to claim 16, wherein the moving image data to be analyzed comprises moving image data captured at the same place over a predetermined time period, and the person extraction unit analyzes the moving image data to be analyzed in units of time windows, each time window having a time width smaller than the predetermined time period, determines whether or not each person detected in the moving image data to be analyzed appears in each of a plurality of the time windows, and calculates an appearance frequency of each detected person based on a determination result.

19. The data processing apparatus according to claim 18, wherein the at least one processor is further configured to execute the instructions to implement: an input receiving unit that receives a user input to set a time width of the time window.

20. The data processing apparatus according to claim 18, wherein the at least one processor is further configured to execute the instructions to implement: an input receiving unit that receives a user input to individually set a start position and an end position of each of a plurality of the time windows.

21. The data processing apparatus according to claim 16, wherein the person extraction unit extracts a person whose appearance frequency is equal to or higher than a predetermined level.

22. The data processing apparatus according to claim 16, wherein the output unit outputs an image of the person acquired from the moving image data to be analyzed as information regarding the extracted person.

23. The data processing apparatus according to claim 22, wherein the output unit displays a list of a plurality of images of the person acquired from a plurality of frames different from each other as information regarding the extracted person.

24. The data processing apparatus according to claim 22, wherein the output unit outputs an appearance frequency in the moving image data to be analyzed as information regarding the extracted person.

25. The data processing apparatus according to claim 22, wherein the output unit displays information regarding each of a plurality of extracted persons as a list in descending order of appearance frequency.

26. The data processing apparatus according to claim 22, wherein the moving image data to be analyzed comprises moving image data captured at the same place over a predetermined time period, and the output unit outputs information indicating a temporal change in appearance frequency as information regarding the extracted person.

27. The data processing apparatus according to claim 16, wherein the person extraction unit executes processing to determine whether or not a person detected in a frame to be processed is similar to a person detected in a previously processed frame in outer appearance feature values by a predetermined level or more, and the data processing apparatus further comprises a unit that receives a user input to set the predetermined level in the processing.

28. A data processing method executed by a computer, the method comprising: an extraction step of analyzing data to be analyzed and extracting a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output step of outputting information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction step comprises analyzing moving image data to be analyzed and extracting a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output step comprises outputting information regarding the extracted person, wherein the output step further comprises outputting an image of the person acquired from the moving image data to be analyzed as information regarding the extracted person, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and wherein the output step further comprises outputting information indicating at least one of a place where the person has appeared and the number of places where the person has appeared as information regarding the extracted person.

29. A non-transitory storage medium storing a program causing a computer to function as: an extraction unit that analyzes data to be analyzed and extracts a subject whose appearance frequency in the data to be analyzed satisfies a predetermined condition among subjects detected in the data to be analyzed; and an output unit that outputs information regarding the extracted subject, wherein the appearance frequency comprises a number of appearances or an appearance rate, wherein the extraction unit comprises a person extraction unit that analyzes moving image data to be analyzed and extracts a person whose appearance frequency in the moving image data to be analyzed satisfies a predetermined condition among persons detected in the moving image data to be analyzed, wherein the output unit outputs information regarding the extracted person, wherein the output unit outputs an image of the person acquired from the moving image data to be analyzed as information regarding the extracted person, wherein the moving image data to be analyzed comprises a plurality of pieces of child data captured at a plurality of places different from each other, and wherein the output unit outputs information indicating at least one of a place where the person has appeared and the number of places where the person has appeared as information regarding the extracted person.
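Claims 3 and 18 compute an appearance frequency by splitting a fixed-duration recording into time windows and checking, per window, whether each person appears. A minimal sketch of that calculation, assuming detections are (timestamp in seconds, person ID) pairs (a hypothetical representation; the patent does not fix a concrete data format):

```python
def appearance_rate_per_person(detections, window_width, total_duration):
    """Appearance rate over fixed-width time windows.

    detections: list of (timestamp_seconds, person_id) pairs.
    The recording of length `total_duration` is split into windows of
    `window_width` seconds; a person "appears" in a window if detected
    there at least once, and the rate is
    (number of windows appeared in) / (total number of windows).
    """
    n_windows = max(1, int(total_duration // window_width))
    seen = {}  # person_id -> set of window indices the person appears in
    for t, pid in detections:
        w = min(int(t // window_width), n_windows - 1)  # clamp to last window
        seen.setdefault(pid, set()).add(w)
    return {pid: len(ws) / n_windows for pid, ws in seen.items()}

# 60 s recording, 20 s windows -> 3 windows
dets = [(5, "A"), (25, "A"), (55, "A"), (10, "B")]
rates = appearance_rate_per_person(dets, window_width=20, total_duration=60)
# "A" appears in all 3 windows (rate 1.0); "B" only in the first (rate 1/3)
```

Repeated detections within one window count once, which is what makes the window width a meaningful user-tunable knob (claims 4 and 19): wider windows smooth out burstiness, narrower ones resolve it.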