IMAGING SYSTEM, IMAGING DEVICE, METHOD OF IMAGING, AND STORAGE MEDIUM
CROSS REFERENCE TO RELATED APPLICATION
This is a Continuation of PCT Application No. PCT/JP2016/059907, filed on 28 Mar. 2016. The contents of the above-mentioned application are incorporated herein by reference.
FIELD OF THE INVENTION
The present invention relates to an imaging system, an imaging device, a method of imaging, and a storage medium.
BACKGROUND
A technique that acquires the three-dimensional shape of an object has been developed (for example, see Patent Literature 1). To acquire the three-dimensional shape, for example, an object is detected with a plurality of fields of view, and a partial model obtained from a detection result with a first field of view and a partial model obtained from a detection result with a second field of view are integrated.
[Patent Literature 1] Japanese Unexamined Patent Application Publication No. 2010-134546
For example, when information indicating a correspondence between the first field of view and the second field of view is insufficient, it is difficult to associate the information obtained with the first field of view and the information obtained with the second field of view with each other with high precision.
SUMMARY
A first aspect of the present invention provides an imaging system, including: a first body; a first imager that is provided in the first body and images an object; a first information calculator that is provided in the first body and calculates first model information including at least one of shape information and texture information of the object based on an imaging result of the first imager; a pattern setter that sets a reference pattern indicating at least a part of the first model information calculated by the first information calculator; a first projector that projects the reference pattern toward the object; a second body; a second imager that is provided in the second body and images the object onto which the reference pattern is projected; a second information calculator that is provided in the second body and calculates second model information including at least one of shape information and texture information of the object based on an imaging result of the second imager; and a pattern extractor that extracts the reference pattern projected by the first projector from the imaging result of the second imager.
A second aspect of the present invention provides an imaging system, including: a first body; a first projector that is provided in the first body and projects a reference pattern indicating at least a part of first model information including at least one of shape information and texture information of an object toward the object; a second body; a second imager that is provided in the second body and images the object onto which the reference pattern is projected; an information calculator that calculates second model information including at least a part of shape information and texture information of the object based on an imaging result of the second imager; and a pattern extractor that extracts the reference pattern projected by the first projector from the imaging result of the second imager.
A third aspect of the present invention provides an imaging device, including: a body; an imager that is provided in the body and images an object and a reference pattern projected onto the object; a pattern extractor that extracts the reference pattern projected onto the object from an imaging result of the imager; and an information calculator that is provided in the body and uses the imaging result of the imager and the reference pattern extracted by the pattern extractor to calculate model information including at least one of shape information and texture information of the object.
A fourth aspect of the present invention provides an imaging device, including: a body; an imager that is provided in the body and images a feature part provided on a surface of an object; a pattern extractor that extracts the feature part of the object from an imaging result of the imager; and an information calculator that is provided in the body and uses the imaging result of the imager and the feature part to calculate model information including at least one of shape information and texture information of the object.
A fifth aspect of the present invention provides a method of imaging, including: imaging an object by a first imager provided in a first body; calculating first model information including at least one of shape information and texture information of the object based on an imaging result of the first imager by a first information calculator provided in the first body; projecting a reference pattern indicating at least a part of the first model information calculated by the first information calculator toward the object; imaging the object onto which the reference pattern is projected by a second imager provided in a second body; calculating second model information including at least one of shape information and texture information of the object based on an imaging result of the second imager by a second information calculator provided in the second body; and extracting the reference pattern projected by the first projector from the imaging result of the second imager.
A sixth aspect of the present invention provides a storage medium storing therein a program that causes a computer to execute: imaging a feature part provided on a surface of an object by an imager provided in a body; extracting the feature part of the object from an imaging result of the imager; and calculating model information including at least one of shape information and texture information of the object by using the imaging result of the imager and the feature part by an information calculator provided in the body.
DESCRIPTION OF EMBODIMENTS
First Embodiment
A first embodiment is now described. The first imaging device 2 The first imaging device 2 For example, the object OB in For example, the first imaging device 2 The second imaging device 2 In the first embodiment, the first imaging device 2 For example, the first imaging device 2 For example, the second imaging device 2 The second imaging device 2 For example, the second imaging device 2 The information processing device 3 includes, for example, a computer system. The information processing device 3 is communicably connected to the first imaging device 2 The information processing device 3 acquires information from the first imaging device 2 For example, the information processing device 3 uses information on the integrated model to execute image processing (for example, rendering processing, recognition processing using model information) as information processing.
For example, on the basis of setting information on a viewpoint (an imaging direction) input to the input device 4 by a user, the information processing device 3 calculates data on an estimated image of the object OB that is viewed from this viewpoint. The input device 4 includes, for example, at least one of a keyboard, a mouse, a touch panel, a sensor such as an acceleration sensor, a voice input machine, and a touch pen, and is connected to the information processing device 3. For example, the input device 4 receives an input of information from a user, and supplies the input information to the information processing device 3. The display device 5 includes, for example, a liquid crystal display or a touch panel display, and is connected to the information processing device 3. For example, the display device 5 displays an image (for example, an estimated image by the rendering processing) by using image data supplied from the information processing device 3. For example, at least one imaging device (for example, the first imaging device 2 At least one imaging device (for example, the first imaging device 2 Next, each unit in the imaging devices 2 and each unit in the information processing device 3 are described. First, an example of each unit in the first imaging device 2 The first imager 15 is provided in the first body 14. The first imager 15 images the object. For example, the first projector 18 is provided in the first body 14. The first projector 18 is capable of projecting a pattern toward the object OB. For example, the first imager 15 and the first projector 18 are projector cameras. The first imaging device 2 For example, the first imager 15 is capable of taking a visible light image and an infrared light image. The first imager 15 includes, for example, an image sensor 20 For example, the image sensor 20 The image-forming optical system 22 includes, for example, a plurality of lenses, and forms an image of an object surface (for example, the object OB). For example, the image-forming optical system 22 is held in a lens barrel, and is mounted on the first body 14 together with the lens barrel. The image-forming optical system 22 and the lens barrel are, for example, interchangeable lenses, and are detachable from the first body 14. The lens barrel may be a part of the first body 14, and may be undetachable from the first body 14. The dichroic mirror 21 has characteristics of passing light (for example, visible light) with a wavelength band to which the image sensor 20 In the first imager 15, the image-forming optical system 22 functions as both a first optical system that forms the visible light image and a second optical system that forms the infrared light image; the second optical system may instead be provided separately from the first optical system. The first imager 15 may acquire one of the visible light image and the infrared light image and does not necessarily need to acquire the other. For example, the first imager 15 may include the image sensor 20 For example, the first projector 18 is capable of projecting a visible light image and an infrared light image.
For example, the first projector 18 includes a first light source 22 The visible light emitted from the first optical engine 23 When the first projector 18 projects the visible light pattern, the first imager 15 images the object OB onto which the visible light pattern is projected by the image sensor 20 For example, when the first imaging device 2 For example, the first distance measuring pattern is set as a dot pattern including a plurality of dots. In this case, for example, by detecting, at each point on the surface of the object OB, how many pixels the spot forming each dot occupies in the image taken by the first imager 15, a distance between this point and the first imaging device 2 The first imager 15 can perform imaging concurrently with the pattern projection by the first projector 18. For example, the first imager 15 may acquire the infrared light image by the image sensor 20 The first projector 18 is capable of projecting the visible light image and the infrared light image; only one of the visible light image and the infrared light image may be projected. For example, the first projector 18 may project the visible light image and does not necessarily need to project the infrared light image. In this case, the first projector 18 does not necessarily need to include the second light source 22 The first imaging device 2 When the first detector 11 detects the distance to the object OB without projecting any pattern onto the object OB, the first projector 18 may be provided separately from the first detector 11. For example, the first projector 18 may be a unit to be externally connected to the first body 14 or arranged at a position separate from the first body 14. Referring back to First, the first information calculator 12 uses the detection result (for example, the depth information) of the first detector 11 to calculate the point group data (point group data processing). For example, the first information calculator 12 calculates the point group data through perspective transformation from a distance image indicated by the depth information to a planar image. For example, the first information calculator 12 estimates a surface between a point selected from among the points included in the point group data and a point in the vicinity thereof and transforms the point group data into polygon data having plane information between points (surface processing, surface modeling). For example, the first information calculator 12 transforms the point group data into polygon data by an algorithm using the least-squares method. For example, an algorithm published in a point group processing library may be applied as this algorithm. Next, the first information calculator 12 calculates texture information by, for example, inverse rendering. The texture information includes, for example, information on at least one item of pattern information representing a pattern of the surface of the object OB, light source information on light applied to the object OB, and optical characteristics information representing optical characteristics (for example, reflectivity and scattering rate) of the surface of the object OB. The light source information includes, for example, information on at least one item of the position of a light source, the direction of light applied from the light source to the object, the wavelength of light applied from the light source, and the type of the light source.
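A minimal sketch of such point group data processing follows. It assumes a pinhole camera model with known intrinsic parameters (fx, fy, cx, cy); the function name, parameter values, and use of NumPy are illustrative and not taken from the embodiment.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a distance (depth) image into 3-D points.

    Each pixel (u, v) with depth z is mapped to camera coordinates by
    inverting a pinhole perspective projection. The intrinsics fx, fy,
    cx, cy are assumed known from calibration of the first imager.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]  # drop pixels with no depth reading

# Example: a synthetic 4x4 distance image, 1 m everywhere.
cloud = depth_to_point_cloud(np.ones((4, 4)), fx=500.0, fy=500.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3)
```

The resulting point group data would then be handed to the surface processing step, which estimates surfaces between neighboring points to form polygon data.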
For example, the first information calculator 12 calculates the light source information by using a model that assumes Lambertian reflectance or a model including Albedo estimation. For example, the first information calculator 12 estimates, among pixel values of respective pixels in an image taken by the first imager 15, a component derived from light diffused by the object OB and a component specularly reflected by the object OB. For example, the first information calculator 12 uses the result of estimating the component specularly reflected by the object OB and the shape information to calculate the direction in which light enters the object OB from the light source. For example, the first information calculator 12 uses the calculated light source information and shape information to estimate reflection characteristics of the object OB and calculates the optical characteristics information including the estimation result of the reflection characteristics. For example, the first information calculator 12 uses the calculated light source information and optical characteristics information to remove the influence of illumination light from visible light image data and calculates pattern information. The feature extractor 16 extracts a feature part (for example, a feature point or a singular point) that can be identified from other parts of the first model information calculated by the first information calculator 12. For example, in the object OB, its edge or ridge line may include straight line parts. In this case, a point of intersection of a straight line part of the edge and another straight line part of the edge can be identified from the other parts and, for example, can be determined to be a point corresponding to a corner of a surface surrounded by this edge. When a plurality of corners on an edge are detected, the position of a certain division point (for example, a midpoint) between corners can be calculated from the coordinates of the two corners and is capable of being identified from the other parts. For example, when the edge or the ridge line includes a curve part, a point at which the slope of its tangential line changes by a certain threshold or more can be identified from the other parts by calculating a slope at each point. For example, the feature extractor 16 extracts a plurality of feature points from the first model information in accordance with various kinds of algorithms. Feature point data indicating a feature amount at a feature point extracted from at least one of shape information and texture information of an object may be contained in the first model information. For example, the feature extractor 16 may be a part of the first information calculator 12, and the feature extractor 16 may calculate first feature point data indicating the feature amount at the feature point as a part of the processing that the first information calculator 12 calculates the first model information. The pattern setter 17 sets a reference pattern indicating at least a part of the first model information calculated by the first information calculator 12. The reference pattern is projected onto the whole or a part of the object OB by the first projector 18 and, for example, is detected by the second imaging device 2 For example, the pattern setter 17 may set a pattern that displays a code RP2 (for example, “B”) in a part in which another surface SF2 (for example, a plane) than the surface SF1 is defined as the surface information of the object OB as the reference pattern.
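The corner and division-point computations just described admit a short sketch. The embodiment only states that straight edge parts are intersected and that a division point such as a midpoint is taken between two corners; the formulas, function names, and the two-dimensional setting below are illustrative assumptions.

```python
import numpy as np

def line_intersection(p1, d1, p2, d2):
    """Intersect two straight edge parts given as point + direction.

    Solves p1 + t*d1 == p2 + s*d2; the intersection is a candidate
    corner feature point. Returns None for (near-)parallel lines.
    """
    p1, d1 = np.asarray(p1, float), np.asarray(d1, float)
    p2, d2 = np.asarray(p2, float), np.asarray(d2, float)
    A = np.column_stack([d1, -d2])
    if abs(np.linalg.det(A)) < 1e-9:
        return None
    t, _ = np.linalg.solve(A, p2 - p1)
    return p1 + t * d1

def midpoint(c1, c2):
    """A division point (here the midpoint) between two detected corners."""
    return (np.asarray(c1, float) + np.asarray(c2, float)) / 2.0

# Two straight edge parts of a face meeting at (1, 1):
corner = line_intersection([0, 1], [1, 0], [1, 0], [0, 1])
print(corner)                    # [1. 1.]
print(midpoint(corner, [3, 1]))  # [2. 1.], identifiable from the two corners
```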
When the code RP2 of the object OB onto which this reference pattern is projected is not detected by the second imaging device 2 The pattern setter 17 may set a pattern that displays a code RP3 (for example, a line) in a part defined as an edge or a ridge line of a surface defined as the surface information of the object OB as the reference pattern. The pattern setter 17 may set a pattern that displays a code RP4 (for example, a figure such as a rectangle or an arrow) in a part in which a corner (for example, an apex) of a surface is defined as the surface information of the object OB as the reference pattern. For example, the pattern setter 17 may set a pattern that displays a code RP5 (for example, a mesh or a grid) in a part in which a surface SF3 (for example, a curved surface) is defined as the surface information of the object OB as the reference pattern. An area in which the code RP5 is detected by the second imaging device 2 For example, the reference pattern can be used for comparing the first partial model provided in the first model information and the object OB in real space. In this case, for example, by detecting deviation between the reference part projected onto the object OB and the object OB, the accuracy of the first model information can be evaluated. For example, the reference pattern can be used as information (a signal) indicating the operation timing of the first imaging device 2 Illustrated in, for example, For example, the pattern setter 17 sets the reference pattern to a pattern associated with information on the feature point extracted by the feature extractor 16. In this case, for example, the reference part includes the feature point (for example, a corner, an edge, a ridge line, a protrusion, or a recess) extracted by the feature extractor 16. For example, the pattern setter 17 sets the reference pattern to a pattern associated with the surface information calculated by the first information calculator 12. In this case, for example, the reference part includes a part corresponding to a point (for example, a corner), a line (for example, an edge or a ridge line), or a surface defined in the surface information. For example, the pattern setter 17 sets the reference pattern to a pattern the light intensity distribution of which spatially changes. For example, the pattern setter 17 sets the reference pattern to a pattern including a code indicating the reference part. This code may include one of or two or more of a figure (for example, a line, an arrow, a polygon, or a circular shape), a character (for example, a number, an alphabet, or a symbol), a two-dimensional or three-dimensional barcode, and a texture such as a mesh or a grid. For example, the pattern setter 17 may set a pattern the light intensity distribution of which temporally changes as the reference pattern. For example, the pattern setter 17 may indicate the reference part with a blinking pattern or with color (for example, a plurality of colors) to make the reference part distinguishable from the other parts. The pattern setter 17 may set a pattern associated with identification information (for example, a number, a code, or an ID) that distinguishes one imaging device (for example, the first imaging device 2 For example, the pattern setter 17 generates image data indicating the reference pattern.
For example, the type (for example, a figure or a character) of the code included in the reference pattern is stored in the memory (described below) in association with the type (for example, a corner, an edge, or a surface) of the reference part. For example, it is defined in advance that when the type of the code is an alphabet, the reference part corresponding to this code is a surface. For example, the pattern setter 17 reads the type of the pattern corresponding to the type of the reference part from the memory 27 and arranges the code of this type at a position on an image corresponding to the position of the reference part to generate the image data indicating the reference pattern (hereinafter referred to as “reference image data”). For example, the pattern setter 17 stores the generated reference image data in the memory 27. For example, the pattern setter 17 generates collation data that associates the code included in the reference pattern and the reference part indicated by this code with each other. For example, when a first code is assigned to a first feature point of the first partial model, the pattern setter 17 associates position information of the first feature point in the first partial model and the type of the first code with each other to generate the collation data. For example, the pattern setter 17 stores the collation data in the memory 27. The first imaging device 2 The inputter 26 is, for example, an operation button provided to the first body 14, a touch panel provided to the display 25, a voice input machine that recognizes voice of a user, or a release button. For example, the inputter 26 detects an operation by a user and receives an input of information from the user. The inputter 26 transmits the input information to the first controller 19. The memory 27 is, for example, a non-volatile memory such as a USB memory or a memory card and stores therein various kinds of information. The memory 27 may include a storage device incorporated in the first imaging device 2 The header information may include at least one of the identification information, the position of the first imaging device 2 For example, the communicator 28 includes at least one of an I/O port such as a USB port and a communication device that performs wireless communication by radio waves or infrared rays. The communicator 28 is controlled by the first controller 19 to read information stored in the memory 27 and transmit the read information to an external device. For example, the communicator 28 transmits at least a part of the calculation results of the first information calculator 12 (for example, the model information) to the information processing device 3. For example, the communicator 28 receives information including an instruction from an external device. The communicator 28 is capable of storing the received information in the memory 27 and supplying the received information to the first controller 19. When the first imaging device 2 The first controller 19 is held by the first body 14. For example, the first controller 19 controls each unit in the first imaging device 2 For example, the first controller 19 controls the first information calculator 12 to calculate the first model information on the basis of the taken image obtained by imaging the object OB onto which the first distance measuring pattern is projected by the first imager 15. 
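The generation of reference image data and collation data by the pattern setter 17, as described above, might look as follows. This is a sketch under stated assumptions: the reference parts are reduced to a dictionary of part types with projector-image positions, a character canvas stands in for real image data, and every name is illustrative rather than from the embodiment.

```python
import itertools
import string
import numpy as np

def make_reference_pattern(parts, width=64, height=48):
    """Sketch of reference image data plus collation data.

    `parts` maps a reference-part id to (part_type, (row, col)) in
    projector-image coordinates. Surfaces get successive letters
    ("A", "B", ...), echoing the codes RP1/RP2 above; corners and
    edges get fixed glyphs. Relies on dict insertion order (Py 3.7+).
    """
    glyph_for = {"corner": "*", "edge": "|"}
    letters = itertools.cycle(string.ascii_uppercase)
    canvas = np.full((height, width), " ", dtype="<U1")  # stand-in image
    collation = {}
    for part_id, (part_type, (r, c)) in parts.items():
        code = next(letters) if part_type == "surface" else glyph_for[part_type]
        canvas[r, c] = code
        # Collation data: which code stands for which part of the model.
        collation[part_id] = {"code": code, "position": (r, c)}
    return canvas, collation

image, collation = make_reference_pattern(
    {"SF1": ("surface", (10, 20)), "SF2": ("surface", (30, 50)),
     "P7": ("corner", (5, 40))})
print(collation["SF2"]["code"])  # "B", the second surface encountered
```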
For example, the first controller 19 stores at least a part of the first model information calculated by the first information calculator 12 in the memory 27. For example, the first controller 19 controls the feature extractor 16 to execute feature extraction processing to extract the feature point from the first model information. For example, after the first model information has been calculated or after the feature extraction processing has been executed, the first controller 19 controls the pattern setter 17 to execute reference pattern setting processing. For example, the first controller 19 causes the pattern setter 17 to execute processing to generate the reference image data. For example, the first controller 19 stores the reference image data in the memory 27. The first controller 19 supplies information indicating a projection condition for the first projector 18 to the second imaging device 2 For example, the first controller 19 causes the display 25 to display an image indicating at least a part of the information stored in the memory 27. The first controller 19 controls the communicator 28 to execute transmission of information and reception of information via the communicator 28. Next, an example of each unit in the second imaging device 2 The second imager 34 is provided in the second body 31. The second imager 34 images the object OB. For example, the second projector 35 is provided in the second body 31. The second projector 35 is capable of projecting a pattern toward the object OB. For example, the second imager 34 and the second projector 35 are projector cameras. The second imaging device 2 The second imager 34 images the object OB onto which the reference pattern is projected by the first projector 18. The second imager 34 includes an image sensor having sensitivity to the wavelength band of light emitted from the first projector 18. For example, when the first projector 18 projects the visible light image as the reference pattern, the second imager 34 includes an image sensor having sensitivity to the wavelength band of visible light. For example, when the first projector 18 projects the infrared light image as the reference pattern, the second imager 34 includes an image sensor having sensitivity to the wavelength band of infrared light. The pattern extractor 36 extracts (detects) the reference pattern projected onto the object OB by the first projector 18 from the imaging result of the second imager 34 including the reference pattern. For example, the second imager 34 images the object OB in a non-projection state, in which the reference pattern is not projected from the first projector 18, and in a projection state, in which the reference pattern is projected from the first projector 18. For example, the pattern extractor 36 calculates a difference between an image acquired by the second imager 34 in the non-projection state and an image acquired by the second imager 34 in the projection state to extract the reference pattern projected onto the object OB. In this case, for example, the pattern extractor 36 can separate the reference pattern from the texture of the object OB. For example, when the reference pattern includes a code such as a character or a figure, the pattern extractor 36 performs OCR processing, pattern recognition processing, or the like on the extracted reference pattern to read the code included in the reference pattern.
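The difference method used by the pattern extractor 36 can be sketched as follows. The threshold value, the grayscale representation, and all names are assumptions; real processing would operate on the actual taken images of the second imager 34.

```python
import numpy as np

def extract_reference_pattern(img_on, img_off, threshold=25):
    """Separate the projected reference pattern from the object texture.

    Subtracts the image taken in the non-projection state from the
    image taken in the projection state; pixels that brightened by
    more than `threshold` (an assumed value) are attributed to the
    projected reference pattern rather than to the object's texture.
    """
    diff = img_on.astype(np.int32) - img_off.astype(np.int32)
    mask = diff > threshold
    pattern = np.zeros_like(img_on)
    pattern[mask] = img_on[mask]
    return pattern, mask

# Synthetic example: a flat scene with one projected bright dot.
off = np.full((8, 8), 40, dtype=np.uint8)   # non-projection state
on = off.copy()
on[4, 4] = 200                              # the projected code
pattern, mask = extract_reference_pattern(on, off)
print(np.argwhere(mask))                    # [[4 4]]
```

The extracted pattern would then go to OCR or pattern recognition processing to read the code it contains.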
For example, the pattern extractor 36 acquires information on the code used for the reference pattern from a memory that stores therein the information on the code. The pattern extractor 36 collates the information on the code and the extracted reference pattern to read the code included in the reference pattern. The information on the code may be stored in a memory 39 (described later) in the second imaging device 2 The second information calculator 33 is provided in the second body 31. The second information calculator 33 uses the imaging result of the second imager 34 to calculate the second model information including at least one of shape information and texture information of the object OB. To calculate the second model information, the second information calculator 33 detects the distance from the second imaging device 2 For example, the second information calculator 33 uses the reference pattern extracted by the pattern extractor 36 to calculate the second model information. For example, when the pattern extractor 36 has detected a code indicating a feature point about the shape information, the second information calculator 33 calculates at least the position of the code and the shape information around the position of the code in the taken image of the second imager 34. For example, when the pattern extractor 36 has detected a code indicating a feature point about the texture information, the second information calculator 33 calculates at least the position of the code and the texture information around the position of the code in the taken image of the second imager 34. When the reference pattern includes a code indicating a feature point, for example, the second information calculator 33 can reduce load on processing to extract the feature point and increase the accuracy of detecting the feature point (make the feature point easily recognized). For example, the first embodiment can reduce load on communication compared with a case in which information on the feature point is acquired from the first imaging device 2 For example, when the pattern extractor 36 has detected the code RP3 (for example, an edge of a surface of the object OB) illustrated in The second imaging device 2 The second imaging device 2 The second controller 37 is held by the second body 31. For example, the second controller 37 controls each unit in the second imaging device 2 For example, the second controller 37 controls the communicator 40 to acquire information indicating the projection condition for the first projector 18 from the first imaging device 2 For example, the second controller 37 controls the second information calculator 33 to calculate the second model information on the basis of the taken image obtained by imaging the object OB onto which the second distance measuring pattern is projected by the second imager 34 and the information on the reference pattern extracted by the pattern extractor 36. For example, the second controller 37 stores at least a part of the second model information calculated by the second information calculator 33 in the memory 39. Next, an example of each unit in the information processing device 3 is described. For example, the information processing device 3 includes a communicator 51, a memory 52, a model integrator 53, a rendering processor 54, and a controller 55.
For example, the communicator 51 includes at least one of a USB port, a network card, or a communication device that performs wireless communication by radio waves or infrared rays. The communicator 51 is communicable with the communicator 28 in the first imaging device 2 For example, the memory 52 includes a removable storage medium such as a USB memory or an external or built-in large-capacity storage device such as a hard disk. For example, the memory 52 stores therein data on at least a part of information received via the communicator 51, an imaging control program for controlling the imaging devices 2, and a processing program for executing each processing in the information processing device 3. The model integrator 53 integrates the first model information calculated on the basis of the result (a first detection result) of detecting the object OB from the first direction and the second model information calculated on the basis of the result (a second detection result) of detecting the object OB from the second direction to generate integrated model information. For example, the model integrator 53 uses the first model information and the collation data supplied from the first imaging device 2 For example, the model integrator 53 can acquire information on the feature point common to the first partial model indicated by the first model information and the second partial model indicated by the second model information from the reference information. For example, the model integrator 53 can associate the first partial model and the second partial model by using this information on the feature point. For example, the model integrator 53 can reduce load on processing to search for the feature point common to the first partial model and the second partial model or omit this processing. In the model integration processing, for example, the model integrator 53 collates the collation data from the first imaging device 2 The rendering processor 54 performs rendering processing on the basis of one or both of at least a part of the first model information and at least a part of the second model information. For example, the rendering processor 54 uses the integrated model information calculated by the model integrator 53 on the basis of the first model information and the second model information to perform the rendering processing. The rendering processor 54 includes, for example, a graphics processing unit (GPU). The rendering processor 54 may be configured such that a CPU and a memory execute each processing in accordance with an image processing program. In the rendering processing, for example, the rendering processor 54 executes at least one of drawing processing, texture mapping processing, or shading processing. In the drawing processing, for example, the rendering processor 54 can calculate an estimated image (for example, a reconstructed image) in which the shape defined by shape information in model information is viewed from a freely selected viewpoint. In the following description, the shape indicated by shape information is referred to as “model shape”. For example, the rendering processor 54 uses at least a part (for example, the shape information) of the first model information to perform the drawing processing. The rendering processor 54 can also perform the drawing processing by using at least a part of the second model information (for example, the shape information).
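Returning briefly to the model integration described above: once the reference pattern makes feature points common to both partial models, the second partial model can be aligned to the coordinate system of the first. The document does not prescribe an alignment algorithm, so the sketch below uses the well-known Kabsch method as one possible choice; all names and the synthetic data are illustrative.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with R @ src_i + t ~= dst_i.

    One common way (the Kabsch method) to align the second partial
    model to the first using feature points matched via the codes of
    the reference pattern; this choice of algorithm is an assumption.
    """
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Feature points of the second partial model, and the same points as
# expressed in the coordinate system of the first partial model:
second = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
angle = np.pi / 6
Rz = np.array([[np.cos(angle), -np.sin(angle), 0],
               [np.sin(angle),  np.cos(angle), 0],
               [0.,             0.,            1]])
first = second @ Rz.T + np.array([0.5, -0.2, 1.0])
R, t = rigid_transform(second, first)
print(np.allclose(R @ second.T + t[:, None], first.T))  # True
```

In practice, the correspondences would come from collating the codes read by the pattern extractor 36 with the collation data generated by the pattern setter 17.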
The rendering processor 54 may perform the drawing processing by using at least a part of the first model information and at least a part of the second model information or, for example, may perform the drawing processing by using at least a part of the integrated model information. For example, the rendering processor 54 can reconstruct a model shape (for example, an estimated image) from model information (for example, shape information) through the drawing processing. For example, the rendering processor 54 stores data on the calculated estimated image in the memory 52. The imaging devices 2 are each capable of transmitting at least a part of the model information to the information processing device 3, and hence, for example, the information processing device 3 can reduce load on the rendering processing. For example, the imaging devices 2 do not need to transmit all images taken by the first imager 15 to the information processing device 3, but can transmit at least a part of the model information (for example, shape information and texture information) calculated by the first information calculator 12 to the information processing device 3. Consequently, the imaging devices 2 according to the first embodiment can each reduce communication load on information necessary for the drawing processing by the rendering processor 54. In the texture mapping processing, for example, the rendering processor 54 can calculate an estimated image obtained by attaching an image indicated by the texture information in the model information to the surface of the object on the estimated image. The rendering processor 54 can also calculate an estimated image obtained by attaching a texture other than that of the object OB to the surface of the object on the estimated image. In the shading processing, for example, the rendering processor 54 can calculate an estimated image in which the shade formed by a light source indicated by the light source information in the model information is added to the object on the estimated image. In the shading processing, for example, the rendering processor 54 can calculate an estimated image in which the shade formed by a freely selected light source is added to the object on the estimated image. For example, the controller 55 controls each unit in the information processing device 3, the imaging devices 2, the input device 4, and the display device 5. For example, the controller 55 controls the communicator 51 to transmit an instruction (a control signal) and setting information to each of the imaging devices 2. For example, the controller 55 stores information received by the communicator 51 from the imaging devices 2 in the memory 52. For example, the controller 55 controls the rendering processor 54 to execute the rendering processing. For example, the controller 55 controls the imaging devices 2 by transmitting an instruction (a signal) to the imaging devices 2 via the communicator 51. For example, the controller 55 controls the communicator 51 to transmit, to the imaging devices 2, an instruction (a request signal) that requests transmission of certain information. The controller 55 may transmit, to the imaging devices 2, an instruction that instructs the imaging devices 2 to execute each processing. For example, the controller 55 may transmit an instruction that instructs the first detector 11 in the first imaging device 2 For example, the communicator 28 transmits information calculated by the first information calculator 12 selectively for each item.
For example, the setting information stored in the memory 27 includes transmission item information that defines whether to transmit information on each item in the model information and transmission order information that defines the order of transmitting the information on each item. For example, the setting information can be updated by operation of the inputter 26 or an instruction from the information processing device 3. For example, the first controller 19 controls the communicator 28 to transmit the information on items determined by the transmission item information in the order determined by the transmission order information. For example, the first controller 19 may control the communicator 28 to transmit the information on items (for example, shape information and texture information) determined by the transmission item information at a time on the basis of a certain data format. For example, the transmission item information may be set in accordance with whether the corresponding information is used for the rendering processing by the information processing device 3. For example, in some cases, the rendering processing involves combining the shape of the object OB with a texture different from that of the object OB. In this case, for example, the information processing device 3 can execute the rendering processing by using the shape information of the object OB without using the texture information of the object OB. For example, the transmission item information is set as information that defines that the shape information is transmitted but the texture information is not transmitted. In this case, the first information calculator 12 does not necessarily need to calculate the texture information. For example, the rendering processing may involve calculating an image in which illumination conditions on the object OB are changed. For example, the information processing device 3 can execute the rendering processing by using the shape information, the pattern information, and the optical characteristics information on the object OB without using the light source information. In this case, for example, the transmission item information is set as information that defines that the shape information, the pattern information, and the optical characteristics information are transmitted but the light source information is not transmitted. For example, the transmission item information is set as information that defines that at least one piece of information of the shape information, the texture information, the pattern information, the light source information, or the optical characteristics information is transmitted. The imaging devices 2 can reduce load on communication when a part of the model information is transmitted, for example. For example, the transmission order information may be set depending on the priority order in the rendering processing by the information processing device 3. For example, the transmission order information may be set such that information on an item that is used first in the rendering processing is transmitted first. For example, in some cases, the rendering processing involves calculating an image of an object OB having no texture while changing viewpoints and, after determining the viewpoint, calculating an image of an object OB having texture viewed from the viewpoint.
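In that case the shape information is needed before the texture information, which the transmission item information and transmission order information can express. A minimal sketch follows, with assumed item names and a plain dictionary standing in for the model information; none of these identifiers come from the embodiment.

```python
# Stand-in for the model information held by the first imaging device.
model_information = {
    "shape": "...polygon data...",
    "texture": "...pattern information...",
    "light_source": "...light source information...",
    "optical": "...optical characteristics information...",
}

# Transmission item information: which items to transmit at all.
transmission_items = {"shape": True, "texture": True,
                      "light_source": False, "optical": False}
# Transmission order information: shape first, texture after, matching
# the viewpoint-selection example above.
transmission_order = ["shape", "texture"]

def transmit(model, items, order):
    """Yield only the enabled items, in the defined order."""
    for key in order:
        if items.get(key):
            yield key, model[key]

for key, payload in transmit(model_information, transmission_items,
                             transmission_order):
    print("sending", key)  # "sending shape", then "sending texture"
```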
For example, the information processing device 3 can calculate an image of the object OB having no texture while changing the viewpoint by using the shape information without using the texture information. For example, the transmission order information is set as information that defines that the shape information is transmitted first and the texture information is transmitted after the shape information. For example, when the information on each item in the model information is transmitted in the order corresponding to the priority order in the rendering processing by the information processing device 3, the imaging devices 2 can transmit the information in parallel with a part of the rendering processing by the information processing device 3. For example, the first controller 19 in the first imaging device 2 For example, the controller 55 in the information processing device 3 stores information input to the input device 4 in the memory 52. The information input to the input device 4 includes, for example, setting information in the rendering processing. The setting information includes, for example, at least one of data subjected to the drawing processing (for example, the shape information in the model information), information on the viewpoint in the drawing processing, data on an object to which texture is attached in the texture mapping processing, information (for example, the texture information in the model information) on the texture to be attached in the texture mapping processing, or information (for example, the light source information in the model information) on a light source in the shading processing. For example, the rendering processor 54 executes the rendering processing in accordance with the setting information. For example, the controller 55 displays an image indicating various kinds of information stored in the memory 52 on the display device 5. For example, the controller 55 displays the setting information in the rendering processing on the display device 5 and receives a change of the setting information by the input device 4. The controller 55 displays an image indicated by the estimated image data stored in the memory 52 on the display device 5. The information processing device 3 does not necessarily need to display the estimated image obtained by the rendering processing on the display device 5, and in this case, the imaging system 1 does not necessarily need to include the display device 5. For example, the information processing device 3 may transmit at least a part of the estimated image data calculated through the rendering processing to another device (a reproduction device) via the communicator 51, and the other device may display the image. For example, the information processing device 3 may transmit the estimated image data to the communicator 28 in the first imaging device 2 The information processing device 3 may receive various kinds of setting information from another device via the communicator 51, and in this case, the imaging system 1 does not necessarily need to include the input device 4. For example, the first imaging device 2 The first imaging device 2 When executing the above-mentioned various kinds of processing under the control of the information processing device 3, the first imaging device 2 Next, a method of imaging according to the first embodiment is described on the basis of an operation of the imaging system 1.
In Step S4, the imaging system 1 causes the first imaging device 2 In Step S5, the imaging system 1 detects the reference pattern by the second imaging device 2 In Step S6, the imaging system 1 calculates the second model information. For example, the second controller 37 controls the second information calculator 33 to calculate the second model information of the object OB on the basis of the imaging result of the second imager 34 in Step S3 and the detection result of the reference pattern in Step S5. In Step S7, the imaging system 1 integrates the first model information and the second model information. For example, the controller 55 in the information processing device 3 acquires the first model information from the first imaging device 2 Next, an example of the operation of the imaging system 1 is described. The second projector 35 in the second imaging device 2 The first information calculator 12 in the first imaging device 2 The communicator 28 in the first imaging device 2 In Step S25, the communicator 28 in the first imaging device 2 In Step S49, the model integrator 53 in the information processing device 3 extracts the feature point from the first model information and generates the feature point data (the first feature point data) indicating the feature amount at the feature point. In Step S50, the communicator 51 in the information processing device 3 transmits the feature point data to the communicator 28 in the first imaging device 2 Thus, the first imaging device 2 In Step S70, the first imaging device 2 In Step S75, the model integrator 53 in the information processing device 3 matches the feature point included in the first feature point data and the feature point indicated in the reference information (the second feature point data). In Step S76, the information processing device 3 determines and sets a first assigned area to be assigned to the first model information from the first imaging device 2 In Step S77, the controller 55 in the information processing device 3 controls the communicator 51 to transmit an instruction (a first request signal) that requests transmission of the first model information in the first assigned area to the communicator 28 in the first imaging device 2 In Step S82 after the processing in Step S77, the controller 55 in the information processing device 3 controls the communicator 51 to transmit an instruction (a second request signal) that requests transmission of the second model information in the second assigned area to the communicator 40 in the second imaging device 2 By using the reference pattern (see the code RP5 in
Second Embodiment
A second embodiment is described. In the second embodiment, the same configurations as those in the above-mentioned embodiment are denoted by the same reference symbols and descriptions thereof are simplified or omitted. As an example, the imaging system 1 can fail to detect recessed parts and gaps such as openings of the object OB. For example, the imaging system 1 may fail to distinguish an inner wall of an opening and a virtual surface blocking the opening. In such a case, when a wire frame or the like is projected onto an object on the basis of the first model information, for example, the wire frame is discontinuous between the inner wall of the opening and the outside of the opening, whereby the imaging system 1 can detect a part deficient in accuracy in the first model information.
The imaging system 1 according to the second embodiment includes an information comparator (a comparer) 56 that calculates information used for the evaluation of the model information. For example, the information comparator 56 is provided in the information processing device 3, or may be provided in at least one (for example, the first imaging device 2 For example, the rendering processor 54 in the information processing device 3 acquires the first model information from the first imaging device 2 For example, the information comparator 56 calculates a difference between a taken image in the projection state and a taken image in the non-projection state acquired by the first imager 15 and detects the second reference pattern on the object OB in the field of view of the first imager 15. For example, the information comparator 56 calculates a deviation amount between the texture added in the rendering processing and the second reference pattern detected in the field of view of the first imager 15. For example, the information comparator 56 may determine (evaluate) that the accuracy of the first model information is insufficient in a part in which the calculated deviation amount is larger than a threshold. For example, the second imager 34 images the object OB in the projection state, in which the second reference pattern is projected, and in the non-projection state, in which the second reference pattern is not projected. For example, the information comparator 56 calculates a difference between a taken image in the projection state and a taken image in the non-projection state acquired by the second imager 34 and detects the second reference pattern on the object OB in the field of view of the second imager 34. For example, the information comparator 56 calculates a deviation amount of the second reference pattern detected in the field of view of the second imager 34 and may determine (evaluate) that the accuracy of the first model information is insufficient in a part in which the calculated deviation amount is larger than a threshold. For example, when the second reference pattern projected by the first imaging device 2 For example, the imaging system 1 may correct the model information (for example, the first model information, the second model information, and the integrated model information) by using at least one of the deviation amount of the second reference pattern detected in the field of view of the first imager 15 and the deviation amount of the second reference pattern detected in the field of view of the second imager 34. For example, the first imaging device 2 Next, a method of imaging according to the second embodiment is described on the basis of the operation of the imaging system 1. In Step S95, the imaging system 1 performs rendering processing by the rendering processor 54 on the basis of model information (for example, the first model information). In Step S96, the imaging system 1 sets the reference pattern by the pattern setter 17 on the basis of a result of the rendering processing in Step S95. In Step S97, the imaging system 1 projects the reference pattern toward the object OB by the first imaging device 2 The first projector 18 may project the reference pattern set on the basis of the model information while changing a projection magnification. For example, the pattern setter may set a pattern indicating the contour of the object OB as the reference pattern, whereas the first projector 18 may project the reference pattern enlarged or reduced.
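Before the projection magnification is discussed further, the deviation evaluation performed by the information comparator 56 can be sketched as follows, assuming the expected and detected positions of the second reference pattern are available as image coordinates; the point representation and the threshold value are assumptions, not from the embodiment.

```python
import numpy as np

def deviation_amount(expected_pts, detected_pts):
    """Mean distance between where the second reference pattern should
    appear (per the first model information) and where it is actually
    detected in the taken image; a sketch of the comparator's metric."""
    return float(np.mean(np.linalg.norm(expected_pts - detected_pts, axis=1)))

def accuracy_sufficient(expected_pts, detected_pts, threshold=2.0):
    """True when the deviation stays below the (assumed) threshold."""
    return deviation_amount(expected_pts, detected_pts) < threshold

expected = np.array([[10.0, 10.0], [20.0, 12.0], [30.0, 15.0]])
detected = expected + np.array([[0.3, -0.2], [0.1, 0.4], [5.0, 6.0]])
print(accuracy_sufficient(expected, detected))  # False: the last point
# deviates strongly, suggesting the model is deficient in that part
```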
For example, the first projector 18 first sets the projection magnification that causes the pattern indicating the contour of the object OB to be displayed on the object OB. The first projector 18 projects the pattern while increasing the projection magnification, and the first imager 15 detects a change in the pattern indicating the contour of the object OB. In this case, for example, the scale (for example, the actual size) of the object OB can be estimated from the projection magnification. For example, when a part of the pattern indicating the contour is not displayed on the object OB, information on the part that is not displayed can be used for the evaluation of the validity of the modeling result. In the above-mentioned embodiments, for example, the imaging devices 2 can obtain information indicating a correspondence (positional relation) between a plurality of fields of view and can associate information obtained with a first field of view and information obtained with a second field of view with each other with high precision. The technical scope of the present invention is not limited to the above-mentioned embodiments or modifications thereof. For example, the controller in the above-mentioned embodiments reads an imaging program stored in a storage device (for example, the memory) and executes the above-mentioned various kinds of processing in accordance with this imaging program. For example, this imaging program causes a computer to execute the above-mentioned various kinds of processing. This imaging program may be recorded in a computer-readable recording medium to be provided. For example, at least one of the elements described in the above-mentioned embodiments or modifications thereof is sometimes omitted. The elements described in the above-mentioned embodiments or modifications thereof can be combined as appropriate. In the above-mentioned embodiments, the imaging devices 2 communicate the projection condition to perform the projection and the imaging of the reference pattern in synchronization with each other; the projection condition does not necessarily need to be communicated. For example, the imaging devices 2 may perform various kinds of processing such as the projection processing and the imaging processing in accordance with a preset schedule. Schedule information indicating this schedule may be stored in each memory of the imaging devices 2.
At least one imaging device (for example, the first imaging device 2
DESCRIPTION OF REFERENCE SIGNS
1 Imaging system
2 First imaging device
2 Second imaging device
3 Information processing device
4 Input device
5 Display device
12 First information calculator
14 First body
15 First imager
16 Feature extractor
17 Pattern setter
18 First projector
19 First controller
31 Second body
33 Second information calculator
34 Second imager
35 Second projector
36 Pattern extractor
37 Second controller
53 Model integrator
54 Rendering processor
55 Controller
56 Information comparator
1. An imaging system, comprising:
a first body; a first imager that is provided in the first body and images an object; a first information calculator that is provided in the first body and calculates first model information including at least one of shape information and texture information of the object based on an imaging result of the first imager; a pattern setter that sets a reference pattern indicating at least a part of the first model information calculated by the first information calculator; a first projector that projects the reference pattern toward the object; a second body; a second imager that is provided in the second body and images the object onto which the reference pattern is projected; a second information calculator that is provided in the second body and calculates second model information including at least one of shape information and texture information of the object based on an imaging result of the second imager; and a pattern extractor that extracts the reference pattern projected by the first projector from the imaging result of the second imager.
2. The imaging system according to
3. The imaging system according to the pattern setter sets a pattern associated with information on the feature part as the reference pattern.
4. The imaging system according to
5. The imaging system according to the first information calculator calculates surface information including coordinates of a plurality of points on a surface of the object and link information among the points as the shape information, and the pattern setter sets a pattern associated with the surface information as the reference pattern.
6. The imaging system according to the pattern setter sets a pattern associated with at least one of shape information and texture information of the object based on a processing result of the rendering processor as the reference pattern.
7. The imaging system according to
8. The imaging system according to
9. The imaging system according to
10. The imaging system according to
11. The imaging system according to
12. The imaging system according to a first controller that is held by the first body and controls the first projector; and a second controller that is held by the second body and controls the second imager, wherein the first controller supplies information indicating a projection condition for the first projector to the second controller.
13. The imaging system according to the second information calculator calculates at least one of shape information and texture information of the object based on a taken image obtained by imaging the second distance measuring pattern projected onto the object by the second projector by the second imager.
14. The imaging system according to the projection condition includes timing of projection of the first projector, and the second controller causes the second projector to perform projection at timing different from the first projector.
15. The imaging system according to the first controller causes the first projector to project a first distance measuring pattern onto the object, the first information calculator calculates at least one of shape information and texture information of the object based on a taken image obtained by imaging the object onto which the first distance measuring pattern is projected by the first imager, and the second controller causes the second projector to project the second distance measuring pattern in a period from when the first distance measuring pattern is projected until the reference pattern is projected.
16.
An imaging system, comprising:
a first body; a first projector that is provided in the first body and projects a reference pattern indicating at least a part of first model information including at least one of shape information and texture information of an object toward the object; a second body; a second imager that is provided in the second body and images the object onto which the reference pattern is projected; an information calculator that calculates second model information including at least a part of shape information and texture information of the object based on an imaging result of the second imager; and a pattern extractor that extracts the reference pattern projected by the first projector from the imaging result of the second imager.
17. The imaging system according to
18. An imaging device, comprising:
a body; an imager that is provided in the body and images an object and a reference pattern projected onto the object; a pattern extractor that extracts the reference pattern projected onto the object from an imaging result of the imager; and an information calculator that is provided in the body and uses the imaging result of the imager and the reference pattern extracted by the pattern extractor to calculate model information including at least one of shape information and texture information of the object.
19. The imaging device according to
20. An imaging device, comprising:
a body; an imager that is provided in the body and images a feature part provided on a surface of an object; a pattern extractor that extracts the feature part of the object from an imaging result of the imager; and an information calculator that is provided in the body and uses the imaging result of the imager and the feature part to calculate model information including at least one of shape information and texture information of the object.
21. The imaging device according to
22. The imaging device according to
23. The imaging device according to
24. The imaging device according to
25. The imaging device according to
26. A method of imaging, comprising:
imaging an object by a first imager provided in a first body; calculating first model information including at least one of shape information and texture information of the object based on an imaging result of the first imager by a first information calculator provided in the first body; projecting a reference pattern indicating at least a part of the first model information calculated by the first information calculator toward the object; imaging the object onto which the reference pattern is projected by a second imager provided in a second body; calculating second model information including at least one of shape information and texture information of the object based on an imaging result of the second imager by a second information calculator provided in the second body; and extracting the reference pattern projected by the first projector from the imaging result of the second imager.
27. A storage medium storing therein a program that causes a computer to execute:
imaging a feature part provided on a surface of an object by an imager provided in a body; extracting the feature part of the object from an imaging result of the imager; and calculating model information including at least one of shape information and texture information of the object by using the imaging result of the imager and the feature part by an information calculator provided in the body.









