DATA COLLECTION AND POINT CLOUD GENERATION SYSTEM AND METHOD
CROSS REFERENCE TO RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application Ser. No. 61/578,375 entitled “Mobile Hand-held Device and Post Process for Rapid 2D and 3D Spatial, Video, and Audio Data Collection and Transformation into Visually and Dimensionally Accurate Geometric Features,” filed, which is incorporated herein by reference in its entirety.

BACKGROUND

Light detection and ranging (LIDAR) is oftentimes used in laser devices to measure distances of objects from the laser device. In this regard, the laser device emits a pulse, and a receiver positioned near the laser device receives a reflection of the emitted pulse from the object. The travel time from the time when the pulse was emitted to the time when the reflection is received by the receiver is used to calculate the distance, i.e., the distance is equal to the product of the speed of light and the time of travel divided by two (2).

Additionally, an inertial measurement unit (IMU) may be used to measure the linear acceleration and angular velocity of an object, which are indicative of the object's velocity and relative position, using various components, which may include accelerometers, gyroscopes, and/or magnetometers. The IMU detects a rate of acceleration of the object using the accelerometers and detects changes in attitude (relative to rotation of the object), including pitch, roll, and yaw, using the gyroscopes.
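This time-of-flight relationship reduces to a one-line computation. The following is a minimal sketch in Python, offered only as an illustration; the constant name, function name, and example values are assumptions of this sketch, not part of the disclosure.

```python
# Minimal sketch of the LIDAR time-of-flight range computation described
# above; the names and example values are assumptions of this illustration.
SPEED_OF_LIGHT_M_S = 299_792_458.0  # meters per second

def range_from_time_of_flight(elapsed_s: float) -> float:
    """Distance to the reflecting object: (speed of light * travel time) / 2,
    since the pulse travels to the object and back."""
    return (SPEED_OF_LIGHT_M_S * elapsed_s) / 2.0

# A reflection received ~66.7 nanoseconds after emission corresponds to
# an object roughly 10 meters away.
print(range_from_time_of_flight(66.7e-9))  # ~10.0
```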
BRIEF DESCRIPTION OF THE DRAWINGS

The disclosure can be better understood with reference to the following drawings. The elements of the drawings are not necessarily to scale relative to each other, emphasis instead being placed upon clearly illustrating the principles of the disclosure. Furthermore, like reference numerals designate corresponding parts throughout the several views.

DETAILED DESCRIPTION

The present disclosure generally pertains to a data collection and point cloud generation system. In particular, the data collection and point cloud generation system collects data indicative of a layout of a building, processes the data, and creates an image of the layout of the building. In one embodiment, a laser device is used to collect data indicative of the distance of walls from the laser device. In addition, one or more inertial measurement units (IMUs) are used to collect velocity and rotation data indicative of movement of the range-finding laser device during the period of time that the distance data is being collected. Based upon the distance data and the velocity and rotation data, the data collection and point cloud generation system can store and render an image of the layout of the building to a display device. Further, the display device may be a touch screen, and the data collection and point cloud generation system may be operated and/or controlled via the touch screen by an operator.

The network 31 may be any type of network that enables the computing device 32 to communicate with the mobile unit 10. In this regard, the network 31 may be any type of network known in the art or future-developed. As an example, the network 31 may be a wireless local area network (WLAN or WiFi), and the computing device 32 may communicate with the mobile unit 10 via wireless transceivers (not shown).

The mobile unit 10 comprises a range-finding laser device 9, three inertial measurement units (IMUs) 6-8, a mobile computing device 11, an output device 2, a camera 29, an input device 3, and a power device 12. The three IMUs 6-8 include an attitude IMU 8 and two zero velocity update (zupt) IMUs 6 and 7. Note that the listed components are exemplary components. Additional or fewer components may be used in other embodiments to effectuate functionality of the system 30, to add functionality to the system 30, or to limit functionality of the system 30.

The mobile computing device 11 may be any type of computing device known in the art or future-developed. The mobile computing device 11 is described further herein.

The output device 2 is any type of output device known in the art or future-developed that outputs information. In one embodiment, the output device 2 is a display device that displays data to an operator (not shown) of the mobile unit 10, and such data may include images, for example. The images displayed to the output device 2 may be a rendering of a point cloud based upon data collected during operation via the range-finding laser device 9 and the IMUs 6-8. In this regard, the display device 2 may be a light emitting diode (LED) display device or the like. Other output devices may be used in other embodiments. For example, the output device 2 may be a headset.

In one embodiment, the range-finding laser device 9 employs LIDAR (light detection and ranging) in order to measure distances to an object, e.g., a wall or a military target. In one embodiment, a receiver 14 is coupled to the mobile unit 10, and in particular, the receiver 14 may be coupled to the range-finding laser device 9. The mobile unit 10 may use the range-finding laser to determine a distance of an object, e.g., a wall, from the mobile unit 10. In one embodiment, the range-finding laser device 9 collects range data indicative of distances of structures from the receiver 14. In one embodiment, the range-finding laser device 9 performs a scan and collects data (hereinafter referred to as scan data) indicative of a plurality of pulses received after reflection from the structure. As an example, the laser may rotate about a center axis and transmit and receive 1081 pulses during a scan, which can sweep 270°. In this regard, the first pulse in the scan is at index 1, and between the first pulse and the final pulse reflection receipt at index 1081, the laser has rotated 270° and collected data (hereinafter referred to as a scan) indicative of the distance of objects within the field of view of the range-finding laser device 9.

The power device 12 is any type of device presently known or future-developed for supplying power to the mobile computing device 11, the range-finding laser device 9, the attitude IMU 8, the zupt IMUs 6 and 7, the camera 29, and the display device 2. The power device 12 may be, for example, a battery.

The input device 3 is any type of input device that allows an operator to provide input to the mobile computing device 11. In one embodiment, the input device 3 is a keyboard. However, other input devices are possible in other embodiments. For example, the input device 3 may be a microphone and headphones for inputting voice commands and hearing prompts.

In one embodiment, the data collection and point cloud generation system 30 collects data indicative of locations of walls (i.e., data defining rooms) within a building. In this regard, an operator dons the mobile unit 10 and travels in and out of the rooms in the building. As the operator travels in and out of the rooms, the range-finding laser device 9 collects data indicative of the locations of the walls relative to the device over a period of time. The range-finding laser device 9 collects range data and angle data.

In this regard, as described hereinabove, the range-finding laser device 9 performs a scan having a field of regard defined by the specific range-finding laser device employed. For example, the range-finding laser device 9 may have an opening allowing a scan of 270°. In addition, the range-finding laser device 9 may be configured to emit a pulse and receive a reflection of the pulse every ¼° (note that this is an approximation for exemplary purposes). Thus, a single scan by the range-finding laser device 9 comprises 1081 data points indicating time elapsed from emission to receipt of a pulse, and the index of each data point in the scan indicates a relative angular displacement, which may be measured from a central axis of the laser.
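Under these exemplary numbers (a 270° sweep sampled every ¼°, yielding 1081 data points), the mapping from scan index to angular offset can be sketched as follows. The ±135° span about the central axis and all names here are assumptions of this illustration.

```python
# Sketch of the index-to-angle relationship for the exemplary scan
# described above; the -135°..+135° span about the central axis and the
# names are assumptions of this illustration.
NUM_PULSES = 1081
ANGULAR_STEP_DEG = 0.25   # one pulse every 1/4 degree
START_ANGLE_DEG = -135.0  # a 270° sweep centered on the central axis

def angle_for_index(index: int) -> float:
    """Angular offset (degrees) of the pulse at a 1-based scan index."""
    if not 1 <= index <= NUM_PULSES:
        raise ValueError("index out of range for a single scan")
    return START_ANGLE_DEG + (index - 1) * ANGULAR_STEP_DEG

print(angle_for_index(1))     # -135.0 (first pulse)
print(angle_for_index(541))   # 0.0 (on the central axis)
print(angle_for_index(1081))  # 135.0 (final pulse, 270° later)
```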
The attitude IMU 8 is fixed relative to the range-finding laser device 9. In this regard, the attitude IMU 8 may be coupled to the housing of the range-finding laser device 9. The attitude IMU 8 collects inertial data indicative of yaw, pitch, and roll relative to the range-finding laser device's frame of reference. The zupt IMUs 6 and 7 collect angular rate and linear acceleration data. In one embodiment, the zupt IMUs 6 and 7 are coupled to the operator's feet, as described further herein.

The attitude measured by the attitude IMU 8, the data indicative of the position, velocity, and yaw calculated by the zupt IMUs 6 and 7, and the range and angle measurements collected by the range-finding laser device 9 are transmitted to the mobile computing device 11. The mobile computing device 11 determines the estimated position and attitude of the range-finding laser device 9 based upon the data received from the attitude IMU 8, the zupt IMUs 6 and 7, and the range-finding laser device 9. Once the estimated position and attitude are determined, the mobile computing device 11 generates a point cloud using the estimated position and attitude. In one embodiment, the mobile computing device 11 may render in real time an image representing one particular scan and/or combined scan(s) during operation. The image may show, for example, an outline of a wall, which is part of a layout for which the operator is collecting data with the system 30. Additionally, the point cloud may be transmitted to the computing device 32 via the network 31 (or via another transfer method such as a memory stick). The computing device 32 may comprise additional imaging tools that allow a user to study, manipulate, and/or modify images generated from the point cloud.

During operation, the data collection and point cloud generation system 30 may further collect video via the camera 29. The video may be time synchronized with the other components of the system 30, i.e., the range-finding laser device 9 and the IMUs 6-8, such that subsequently the video may be used in conjunction with the collected data to provide additional information about particular characteristics of structures detected during operation.

The mobile unit 10 comprises the range-finding laser device 9 communicatively coupled to the mobile computing device 11 via a cable 13. Note that the attitude IMU 8 is coupled to the range-finding laser device 9, as described hereinabove. As described herein, the mobile unit 10 further comprises the output device 2. Any type of output device presently known or future-developed may be used in the mobile unit 10. In one embodiment, the system 30 may comprise a wrist display device 39. The wrist display device 39 is communicatively coupled to the mobile computing device 11 such that images representative of data collected during operation may be rendered to the wrist display device 39.
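The disclosure does not detail how the video is time synchronized with the scan data; one plausible post-processing approach is to pair each scan with the video frame nearest in time. A hedged sketch, with the nearest-timestamp rule and all names assumed for illustration only:

```python
import bisect

# Hedged sketch of pairing video frames with laser scans by timestamp.
# The disclosure only states that the video is time synchronized; this
# nearest-timestamp rule and the names here are assumptions.
def nearest_frame(frame_times, scan_time):
    """Return the index of the video frame timestamp closest to scan_time.

    frame_times must be sorted ascending (seconds).
    """
    i = bisect.bisect_left(frame_times, scan_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(frame_times)]
    return min(candidates, key=lambda j: abs(frame_times[j] - scan_time))

frames = [0.000, 0.033, 0.066, 0.100]  # ~30 fps frame timestamps
print(nearest_frame(frames, 0.05))     # 2 -> the frame at 0.066 s
```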
In one embodiment, the output device 2 is a display device, and the display device is adjustably affixed to the backpack apparatus 26 via an arm. The arm comprises a front member 4.

Further, the zupt IMUs 6 and 7 are fixedly coupled to the operator's shoes 130 and 131, which are coupled to the operator's feet (not shown). The zupt IMUs 6 and 7 collect linear accelerations and angular rates related to movement of the operator's feet and transmit data indicative of the feet's position, velocity, and yaw to the mobile computing device 11, either wirelessly or otherwise.

During operation, the laser rotates 360° such that light propagates out of the housing through the aperture 123. In one embodiment, the aperture 123 has an arc length that allows data to be collected for a 270° scan of a field of regard. During a single scan, light from the laser begins to propagate from the aperture at edge 123. In another embodiment, the mobile unit 80 comprises the range-finding laser device 9.

In addition, the mobile computing device 11 comprises control logic 404. The control logic 404 can be implemented in software, hardware, firmware, or any combination thereof. In the exemplary mobile computing device 11, the processing unit 400 may be a digital processor or other type of circuitry configured to run the control logic 404 by processing and executing the instructions of the control logic 404. The processing unit 400 communicates to and drives the other elements within the mobile computing device 11 via the local interface 402, which can include one or more buses. In addition, the network interface 407 may support any type of communication device (e.g., a modem) that communicatively couples the mobile computing device 11 with the network 31. The camera interface 490 is any type of interface known in the art or future-developed for communicating with the camera 29. The input device 3 is any type of input device known in the art or future-developed for receiving input from the operator 1.

During operation, the control logic 404 receives from the IMUs 6-8, via the IMU interface 481, zupt IMU position, velocity, and yaw data 410 (zupt IMUs 6 and 7) and attitude IMU attitude data 413 (attitude IMU 8). Upon receipt, the control logic 404 stores the data 410 and 413 in memory 401. Further, the control logic 404 receives from the range-finding laser device 9 range and angle data 411 and stores the range and angle data 411 in memory 401. Upon receipt, the control logic 404 converts the latest range and angle data to Cartesian data. Further, the control logic 404 compares the latest Cartesian data with the last Cartesian data and derives a change in position and attitude based upon the comparison, which the control logic 404 stores as change in position and attitude data 414 in memory 401. The control logic 404 processes the data 410, 414, and 413 to generate data indicative of an estimated position and attitude 415 of the range-finding laser device 9. The estimated position and attitude data 415 of the range-finding laser device 9 is then used to transform scan data, derived from range-finding device range data 411, to a three-dimensional frame of reference so it can be added to the point cloud data 412. Note that the point cloud data 412 is a collection of laser scan data over time and at any given moment, when displayed, is indicative of a layout of a structure that has been walked through.
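The conversion of the latest range and angle data 411 to Cartesian data, described above, can be sketched as follows. A planar scan is assumed, and the function name and sample values are illustrative only.

```python
import math

# Minimal sketch of the range/angle-to-Cartesian conversion that the
# control logic 404 is described as performing; the names and sample
# values are assumptions of this illustration.
def scan_to_cartesian(ranges_m, angles_deg):
    """Convert one scan's range/angle pairs to (x, y, 0) data points in
    the range-finding laser device's frame of reference."""
    return [
        (r * math.cos(math.radians(a)), r * math.sin(math.radians(a)), 0.0)
        for r, a in zip(ranges_m, angles_deg)
    ]

# A wall point 5 m away on the central axis (0°) and a point 3 m away
# at a +90° angular offset.
print(scan_to_cartesian([5.0, 3.0], [0.0, 90.0]))
# [(5.0, 0.0, 0.0), (~0.0, 3.0, 0.0)]
```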
Further, the control logic 404 may display an image indicative of the point cloud data 412 to the display device 2. In one embodiment, the control logic 404 stores the point cloud data 412, which may at a subsequent time be transferred to the computing device 32.

For purposes of discussion in explaining the data collection and point cloud generation system 30, an exemplary operation is now described. As described hereinabove, the range-finding laser device 9 collects range data. The range data collected is a plurality of data points, each data point indicative of the distance from the range-finding laser device 9 to the wall struck by the emitted pulse as the laser scans its span of 270°. For purposes of explanation, a set of data points corresponding to a single scan of the laser is hereinafter referred to as scan data. In the example provided, wherein the aperture 123 allows a 270° scan, the range-finding laser device 9 determines time differentials (Δt) for each pulse emitted/received and calculates the distance traveled by the pulse, which indicates range (or distance) to the wall detected. In addition, the index of a particular data point in the scan data also provides angular information indicative of the angular offset of the laser beam (which may be relative to a central axis 27 of the laser).

Note that a laser sweep having the field of regard of ScanA produces a data set hereinafter identified as ScanN that comprises range and angle data for a single scan taken at time t1 having values associated with a plurality of data points corresponding to the “x” identifiers. In location A, the range-finding laser device 9 has an attitude (hereinafter referred to as AttitudeA), which is measured by the attitude IMU 8. Further, in location A, the zupt IMUs 6 and 7 provide data indicative of the position, velocity, and yaw of the operator's feet.

The square symbol 702 represents the range-finding laser device 9 when it has been rotated such that it collects data for a different section of the walls 604 and 605, i.e., the field of regard has changed based upon rotation of the range-finding laser device 9. In this regard, the square symbol 702 represents the range-finding laser device 9 and depicts a location (hereinafter referred to as “location B”) of the range-finding laser device 9 during a scan having a field of regard hereinafter identified as ScanB. Note that the scan having the field of regard of ScanB produces a data set hereinafter identified as ScanN+1 that comprises range and angle data for a single scan taken at time t2 having values associated with a plurality of data points corresponding to the “o” identifiers. In location B, the range-finding laser device 9 has an attitude (hereinafter referred to as AttitudeB), which is measured by the attitude IMU 8. Further, in location B, the zupt IMUs 6 and 7 provide data indicative of the position, velocity, and yaw of the operator's feet.

Additionally, the control logic 404 calculates the operator's body center position and attitude based on the operator's feet position and attitude provided by the zupt IMUs 6 and 7. Once the operator's body center position and attitude are determined, the control logic 404 adds a predetermined offset that has been measured between the operator's body center and the range-finding laser device's center point.
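One plausible reading of this body-center calculation is the midpoint of the two foot positions, with the measured offset then added to reach the range-finding laser device's center point. The midpoint rule and all names below are assumptions of this sketch, not the disclosed method.

```python
# Hedged sketch of estimating the laser device position from the two
# foot positions reported by the zupt IMUs: midpoint of the feet plus a
# predetermined offset to the laser's center point. The midpoint rule
# and the names here are assumptions of this illustration.
def laser_position_from_feet(left_foot, right_foot, laser_offset):
    """left_foot, right_foot, laser_offset: (x, y, z) tuples in meters."""
    body_center = tuple((l + r) / 2.0 for l, r in zip(left_foot, right_foot))
    return tuple(c + o for c, o in zip(body_center, laser_offset))

# Feet straddling the origin, laser mounted 1.2 m above the body center.
print(laser_position_from_feet((-0.2, 0.0, 0.0), (0.2, 0.0, 0.0),
                               (0.0, 0.0, 1.2)))
# (0.0, 0.0, 1.2)
```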
In calculating a global pose of the range-finding laser device 9, the mobile computing device 11 receives AttitudeN data from the attitude IMU 8, ScanN from the range-finding laser device 9, and position, velocity, and yaw from the zupt IMUs 6 and 7. Such data is indicative of measurements taken at time t1. Additionally, the mobile computing device 11 receives AttitudeN+1 data from the attitude IMU 8, ScanN+1 from the range-finding laser device 9, and position, velocity, and yaw from the zupt IMUs 6 and 7. Such data is indicative of measurements taken at time t2.

The control logic 404 calculates a change in attitude from t1 to t2. Such change is a calculated attitude difference indicative of the difference between AttitudeB (at t2) and AttitudeA (at t1). The difference is hereinafter referred to as “Delta Attitude.” Further, the control logic 404 calculates a change in position from t1 to t2. Such change is derived from the difference between location B (at t2) and location A (at t1). The difference is hereinafter referred to as “Delta Position.”

The control logic 404 performs a variety of operations on the range and angle data 411 in order to calculate the estimated change in position and attitude data 414 needed to determine the global pose of the range-finding laser device 9. Initially, the range and angle data 411 is measured in a spherical coordinate system from the range-finding laser device's frame of reference. The control logic 404 converts the range and angle data to Cartesian coordinates in an X-Y plane, thereby generating, for each data point in ScanN and ScanN+1, (x, y, 0). In this regard, the data is in the range-finding laser device's frame of reference. Using the latest computed pitch and roll from the attitude IMU 8, the control logic 404 converts the Cartesian coordinates (x, y, 0) of ScanN+1 to three-dimensional coordinates, noted as (x′, y′, z′). At this point in the process, the three-dimensional coordinates (x′, y′, z′) are also in the frame of reference of the range-finding laser device 9. The control logic 404 then projects the three-dimensional coordinates onto a horizontal plane (not shown) by setting the z′-value of each data point to zero (0), noted as (x′, y′, 0). In the embodiment of the mobile unit 80, the control logic 404 does not perform the projection onto a horizontal plane.

The control logic 404 then performs a scan matching method on the ScanN data (i.e., the last scan) and the ScanN+1 data (i.e., the latest scan). In this regard, the control logic 404 compares data points contained in ScanN+1 with ScanN to determine a change in position and attitude, which is indicative of Delta Position and Delta Attitude. Any type of scan matching method known in the art or future-developed may be used to compare ScanN+1 with ScanN to determine change in position in accordance with an embodiment of the present disclosure.

The control logic 404 then uses a filter to determine an estimated change in position and change in attitude, indicative of a change in global pose, using a combination of the change in position and change in attitude calculated from two sources: the scan matching method and the zupt process. In one embodiment, the control logic 404 employs an Extended Kalman Filter (EKF). The inputs to the EKF include the results of the scan matching method (the difference between ScanN+1 and ScanN) and the results of the zupt process. The control logic 404 determines a latest global pose, i.e., (x, y, z, roll, pitch, yaw), based on the change in global pose. In this regard, the control logic 404 calculates the latest global pose by adding the latest change in global pose to the last global pose. The control logic 404 then transforms ScanN+1 for time t2 (i.e., the ScanN+1 data points) from the sensor frame of reference to the global (or room) frame of reference. The transform is performed using the Cartesian coordinates converted from the range and angle data 411 received from the range-finding laser device 9.
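The conversion of (x, y, 0) points to (x′, y′, z′) using the latest pitch and roll, followed by the projection onto a horizontal plane, can be sketched as below. The rotation-axis conventions are assumptions of this example; the disclosure does not fix them.

```python
import math

# Sketch of the pitch/roll correction and horizontal projection described
# above, assuming roll is a rotation about the x-axis and pitch about the
# y-axis; these conventions are assumptions of this illustration.
def tilt_correct(point, pitch_rad, roll_rad, project=True):
    x, y, z = point
    # Roll: rotation about the x-axis.
    y, z = (y * math.cos(roll_rad) - z * math.sin(roll_rad),
            y * math.sin(roll_rad) + z * math.cos(roll_rad))
    # Pitch: rotation about the y-axis.
    x, z = (x * math.cos(pitch_rad) + z * math.sin(pitch_rad),
            -x * math.sin(pitch_rad) + z * math.cos(pitch_rad))
    # Projection onto the horizontal plane: zero the z' component.
    # (Skipped for the mobile unit 80 embodiment, which keeps 3D points.)
    return (x, y, 0.0) if project else (x, y, z)

# A point 4 m ahead, seen while the laser is pitched down about 10°.
print(tilt_correct((4.0, 0.0, 0.0), math.radians(-10.0), 0.0))
# (~3.94, 0.0, 0.0) after projection
```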
During the course of scanning structures and obtaining data indicative of the structures, there may be spurious data points that fall outside the prevalent general location of other data points. In this regard, quick movements of the operator or a malfunction in equipment (e.g., the IMUs or the laser) may cause such statistical outliers. In one embodiment of the system 30, the control logic 404 may perform a filtering method for removing such statistical outliers from the transformed ScanN+1 data before it is added to the point cloud data 412.

Further, during the course of operation, the operator 1 may hold the range-finding laser device 9 still for a period of time and not physically move, such that data obtained by the range-finding laser device 9 becomes redundant. Thus, before adding transformed ScanN+1 data to the point cloud data 412, the control logic 404 may determine when the range-finding laser device 9 was not moving, i.e., a period of non-movement of the operator, and eliminate redundant data during that period of non-movement, thereby generating data hereinafter referred to as new transformed scan data. The control logic 404 adds the new transformed scan data to the point cloud data 412. Thus, the point cloud data 412 after the addition reflects the latest data points indicative of the structures scanned by the range-finding laser device 9.

The flowchart depicts five processes, including processes A, B, C, D, and E. The parent process, process A, generates the point cloud data 412.

Process B comprises three steps, 2000-2002. In one embodiment, steps 2000-2002 are performed by the zupt IMUs 6 and 7. In step 2000, independent processors (not shown) of the zupt IMUs 6 and 7 receive data indicative of angular rates and linear accelerations of the foot to which the respective zupt IMU 6 or 7 is attached. Such angular rates and linear accelerations relate to motion characteristics of the operator's feet as he/she moves or traverses a room(s) in the building of interest. Upon receipt of the angular rates and linear accelerations, each processor performs a zero velocity update (zupt) in step 2001. A zero velocity update is a method whereby zero velocity intervals are detected and any error contained in the measurements is reset or set to zero. In the particular system 30, zero velocity occurs when the operator's foot is at rest, which may be a very quick moment in time while the operator walks. In step 2002, the processor calculates the position and velocity of the operator's foot based upon the measured angular rates and linear accelerations received. Note that in addition to position and velocity, the zupt IMUs 6 and 7 further provide data indicative of attitude. Once process B derives position, velocity, and yaw of the operator's feet, process B begins again at step 2000. In this regard, process B is a continual process on each zupt IMU 6 and 7 that runs during operation of the system 30 such that data indicative of the position, velocity, and yaw of the operator's feet is continually updated based upon movement of the operator.
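The zero velocity update of steps 2000-2002 can be sketched as a stance detector plus a velocity reset. The stance test, the thresholds, and all names below are assumptions of this illustration, not the disclosed implementation.

```python
# Hedged sketch of a zero velocity update (step 2001): when the foot is
# momentarily at rest (acceleration magnitude near gravity, negligible
# angular rate), the integrated velocity is reset to zero, discarding
# accumulated drift. Thresholds and names are assumptions.
GRAVITY = 9.81  # m/s^2

def is_stance(accel_xyz, gyro_xyz, accel_tol=0.3, gyro_tol=0.05):
    """Detect a zero-velocity (stance) interval from one IMU sample."""
    accel_mag = sum(a * a for a in accel_xyz) ** 0.5
    gyro_mag = sum(g * g for g in gyro_xyz) ** 0.5
    return abs(accel_mag - GRAVITY) < accel_tol and gyro_mag < gyro_tol

def integrate_velocity(velocity, accel_xyz, gyro_xyz, dt):
    """Propagate velocity, applying a zupt whenever a stance is detected."""
    if is_stance(accel_xyz, gyro_xyz):
        return (0.0, 0.0, 0.0)  # foot at rest: zero out accumulated drift
    ax, ay, az = accel_xyz
    vx, vy, vz = velocity
    # Free integration between stances (gravity removed on the z axis).
    return (vx + ax * dt, vy + ay * dt, vz + (az - GRAVITY) * dt)

# Mid-stance sample: acceleration ~gravity, negligible rotation.
print(integrate_velocity((0.4, 0.0, 0.0), (0.0, 0.0, 9.81),
                         (0.0, 0.0, 0.0), 0.01))
# (0.0, 0.0, 0.0) -- velocity reset during the detected stance
```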
Process C comprises three steps, 2003-2005. Steps 2003-2005 are performed by the control logic 404. In step 2003, the control logic 404 computes data indicative of an estimated body center of the operator based upon the position, velocity, and yaw from each foot computed independently by the processors of the zupt IMUs 6 and 7 in step 2002. Once position, velocity, and yaw data are derived for the range-finding laser device 9 based on the zupt IMU position, velocity, and yaw, the control logic 404 calculates a difference between the latest derived position and yaw and the last derived position and yaw to determine an estimated change in position and yaw. Process C begins again at step 2003. In this regard, process C is a recurring process that runs during operation of the system 30 such that data indicative of the change in the range-finding laser device's position and yaw based upon the zupt IMUs 6 and 7 is continually updated based upon movement of the operator and synchronized to each range-finding laser scan cycle (t).

Process D comprises five steps, 4000-4004. Steps 4000-4004 are performed by the control logic 404. In step 4000, the control logic 404 receives spherical data indicative of range and angle from the range-finding laser device 9. In step 4001, the control logic 404 converts the range and angle spherical data to Cartesian data, i.e., each data point having a radial distance (the distance from the range-finding laser device 9 to the walls) and an angle is converted to x, y coordinates, represented as (x, y, 0) in Cartesian notation. Note that there is no z component considered in these coordinates because the range-finding laser device 9 collects data in the x-y (horizontal) plane only. In step 4002, the control logic 404 converts the Cartesian data points (x, y, 0) for each data point in the scan to three-dimensional data based upon data indicative of the attitude (pitch and roll) provided by the attitude IMU 8. In step 4003, the control logic 404 projects each three-dimensional data point onto a horizontal plane, i.e., the x-y horizontal field of regard of the range-finding laser device 9. The resulting data is hereinafter referred to as (x′, y′, 0). In the embodiment of the mobile unit 80, this projection is not performed, as described hereinabove. In step 4004, the control logic 404 compares the latest scan data (i.e., ScanN+1 at t2) to the last scan data (i.e., ScanN at t1) using a scan matching method to obtain the scan-matching-derived change in position and attitude. Such data is hereinafter identified as (dX, dY, dZ) and (dYaw, dPitch, dRoll). Process D begins again at step 4000. In this regard, process D is a recurring and iterative process that runs during operation of the system 30 such that data indicative of the change in position and yaw based upon consecutive scans from the range-finding laser device 9 is continually updated based upon movement of the range-finding laser device 9.

Process E comprises two steps, 3000-3001. Steps 3000-3001 are performed by the control logic 404. In step 3000, the control logic 404 receives attitude data indicative of roll, pitch, and yaw from the attitude IMU 8. In step 3001, the control logic 404 calculates a change in attitude using a difference between the latest attitude and the last attitude. Process E begins again at step 3000. In this regard, process E is a recurring and iterative process that runs during operation of the system 30 such that data indicative of the change in attitude based upon the attitude IMU 8 is continually updated based upon movement of the operator and the range-finding laser device 9.
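The disclosure permits any scan matching method for step 4004. As one illustrative ingredient of such a method, the sketch below recovers a rigid 2D motion (dX, dY, dYaw) from already-corresponded points of ScanN and ScanN+1 via the Kabsch/Procrustes step used inside ICP-style matchers; the correspondence search that a full matcher also needs is omitted, and all names are assumptions of this example.

```python
import numpy as np

def rigid_motion_2d(last_pts, latest_pts):
    """Recover (dX, dY, dYaw) aligning latest_pts onto last_pts.

    last_pts, latest_pts: (N, 2) arrays of corresponding scan points.
    """
    last_c = last_pts.mean(axis=0)
    latest_c = latest_pts.mean(axis=0)
    # Kabsch/Procrustes: SVD of the cross-covariance of centered points.
    H = (latest_pts - latest_c).T @ (last_pts - last_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = last_c - R @ latest_c
    return t[0], t[1], np.arctan2(R[1, 0], R[0, 0])

# Demo: the latest scan is the last scan shifted by (0.1, 0) m and then
# rotated by 2 degrees; the recovered motion inverts that transform.
theta = np.radians(2.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
last = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.5], [2.0, -1.0]])
latest = (last - [0.1, 0.0]) @ rot.T
print(rigid_motion_2d(last, latest))  # ~(0.1, 0.0, -0.0349 rad)
```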
Process A is the parent process that receives each set of data from the respective processes C, D, and E. In this regard, process C provides data indicative of change in position and yaw, process D provides data indicative of change in position and attitude, and process E provides data indicative of change in attitude. In step 1003, the control logic 404 fuses the data from processes C, D, and E to obtain a fused estimated change in position and attitude of the range-finding laser device 9. In step 1004, the control logic 404 calculates a latest global pose of the range-finding laser device 9 based upon the fused data by adding the fused estimated change in position and attitude to the last global pose. In step 1005, the control logic 404 uses the latest global pose to transform the latest scan Cartesian points from the range-finding laser device's frame of reference to the global frame of reference. In step 1006, the control logic 404 applies a statistical outlier removal filter to the transformed scan data that lies in the global frame of reference, as described hereinabove. Further, in step 1007, the control logic 404 performs a filter method that removes redundant scan data resulting from non-movement of the operator 1 during data collection. In this regard, when the operator does not move and the sensors, i.e., the range-finding laser device 9, the zupt IMUs 6 and 7, and the attitude IMU 8, continue to collect measurements and perform calculations, redundant scan data will unnecessarily accumulate. Thus, in order to ensure that such redundant data does not unnecessarily appear in the point cloud data 412, the control logic 404 removes such redundant scan data and does not add that data to the point cloud. In step 1008, the control logic 404 adds the latest set of scan data, if not removed by step 1007, to the point cloud data 412. Process A begins again at step 1003. In this regard, process A is a recurring and iterative process that runs during operation of the system 30 such that the point cloud data 412 is continually updated based upon movement of the range-finding laser device 9 and collection of data.

A system has a range-finding laser device coupled to an operator that performs a latest scan measuring a plurality of data points indicative of range and angle, an attitude inertial measurement unit (IMU) that is affixed to the range-finding laser device that measures pitch, roll, and yaw of the range-finding laser device, and two zero-velocity update (zupt) IMUs coupled to the operator that estimate position, velocity, and yaw of the operator. Further, the system has logic that transforms a plurality of data points from a sensor frame of reference, based upon measurements made, to a global frame of reference using data indicative of a latest global pose to obtain data indicative of transformed data points and merges the data indicative of the transformed data points with a point cloud.
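Steps 1004 and 1005 described above can be sketched in a reduced (x, y, yaw) form: compose the fused change in pose with the last global pose, then transform the scan points into the global frame. The full disclosed pose is (x, y, z, roll, pitch, yaw); this planar reduction and all names are assumptions of the example.

```python
import math

# Reduced (x, y, yaw) sketch of steps 1004-1005: pose composition and
# transform to the global frame. The planar simplification and the names
# here are assumptions of this illustration.
def compose_pose(last_pose, delta_pose):
    """Apply a (dx, dy, dyaw) change, expressed in the sensor frame,
    to the last global pose (x, y, yaw)."""
    x, y, yaw = last_pose
    dx, dy, dyaw = delta_pose
    return (x + dx * math.cos(yaw) - dy * math.sin(yaw),
            y + dx * math.sin(yaw) + dy * math.cos(yaw),
            yaw + dyaw)

def to_global_frame(points, pose):
    """Transform sensor-frame (x, y) scan points into the global frame."""
    x0, y0, yaw = pose
    return [(x0 + px * math.cos(yaw) - py * math.sin(yaw),
             y0 + px * math.sin(yaw) + py * math.cos(yaw))
            for px, py in points]

# Step 0.5 m forward while facing +y, then place one scan point.
pose = compose_pose((2.0, 1.0, math.pi / 2), (0.5, 0.0, 0.0))
print(pose)                                 # ~(2.0, 1.5, 1.5708)
print(to_global_frame([(1.0, 0.0)], pose))  # ~[(2.0, 2.5)]
```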
1. A system, comprising:
a range-finding laser device coupled to an operator and configured to perform a latest scan measuring a plurality of data points indicative of range and angle relative to the location of the range-finding laser and surrounding structure which is indicative of spatial structure in the field of view of the range-finding laser device;
an attitude inertial measurement unit (IMU) that is affixed to the range-finding laser device and configured to measure pitch, roll, and yaw of the range-finding laser device;
two zero-velocity update (zupt) IMUs coupled to the operator, the zupt IMUs configured to estimate position, velocity, and yaw of the operator;
logic configured to convert each of the plurality of data points to Cartesian data points thereby generating latest scan data, compare the latest scan data with last scan data to derive data indicative of a first estimated change in position and attitude of the range-finding laser device via a scan matching method, wherein the last scan data comprises data indicative of a plurality of Cartesian data points indicative of a previous scan performed by the range-finding laser device, the logic further configured to convert the zupt IMU estimated position, velocity, and yaw to data indicative of a second estimated change in the position and attitude of the range-finding laser device, fuse the first estimated change in position and attitude and the second estimated change in position and attitude to obtain data indicative of a fused change in position and attitude of the range-finding laser device, calculate data indicative of a latest global pose based upon the data indicative of the fused change in position and attitude and data indicative of a last global pose, the logic further configured to transform the plurality of data points from a sensor frame of reference to a global frame of reference using the data indicative of the latest global pose to obtain data indicative of transformed data points and merge the data indicative of the transformed data points with a point cloud.
2. The system of
3. The system of
4. The system of
5. The system of
6. The system of
7. The system of
8. The system of
9. The system of
10. The system of
11. The system of
12. The system of
13. The system of
14. The system of
15. The system of
16. The system of
17. The system of
18. The system of
19. The system of
20. A method, comprising:
performing a latest scan measuring a plurality of data points indicative of range and angle relative to a location of a range-finding laser and surrounding structure which is indicative of spatial structure in the field of view of the range-finding laser device;
measuring pitch, roll, and yaw of the range-finding laser device via an attitude inertial measurement unit (IMU) that is affixed to the range-finding laser device;
estimating position, velocity, and yaw of the operator via two zero-velocity update (zupt) IMUs coupled to the operator;
converting each of the plurality of data points to Cartesian data points thereby generating latest scan data;
comparing the latest scan data with last scan data to derive data indicative of a first estimated change in position and attitude of the range-finding laser device via a scan matching method, wherein the last scan data comprises data indicative of a plurality of Cartesian data points indicative of a previous scan performed by the range-finding laser device;
converting the zupt IMU estimated position, velocity, and yaw to data indicative of a second estimated change in the position and attitude of the range-finding laser device;
fusing the first estimated change in position and attitude and the second estimated change in position and attitude to obtain data indicative of a fused change in position and attitude of the range-finding laser device;
calculating data indicative of a latest global pose based upon the data indicative of the fused change in position and attitude and data indicative of a last global pose;
transforming the plurality of data points from a sensor frame of reference to a global frame of reference using the data indicative of the latest global pose to obtain data indicative of transformed data points; and
merging the data indicative of the transformed data points with a point cloud.
21. The method of
22. The method of
23. The method of
24. The method of
25. The method of
26. The method of
27. The method of
Index     Angle             Range
1         ∠ = −135°         Range1
2         ∠ = −134.75°      Range2
...       ...               ...
1080      ∠ = 134.75°       Range1080
1081      ∠ = 135°          Range1081
Thus, for each set of scan data, there is range data indicating the range measured by the range-finding laser device 9, and there is angular data indicating an angle difference between the central axis 27 of the range-finding laser device 9 and the direction in which the pulse was emitted.