US20080195316A1 - System and method for motion estimation using vision sensors - Google Patents

Info

Publication number
US20080195316A1
US20080195316A1 (application US 11/673,893)
Authority
US
United States
Prior art keywords
features
motion
location
image
images
Prior art date
2007-02-12
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/673,893
Inventor
Kailash Krishnaswamy
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honeywell International Inc
Original Assignee
Honeywell International Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2007-02-12
Filing date
2007-02-12
Publication date
2008-08-14
2007-02-12 Application filed by Honeywell International Inc
2007-02-12 Priority to US11/673,893
2007-02-12 Assigned to Honeywell International Inc. (assignor: Kailash Krishnaswamy)
2008-02-12 Priority to IL189454A
2008-02-12 Priority to GB0802629A
2008-08-14 Publication of US20080195316A1
Status: Abandoned

Classifications

    • G06T 7/238: Image analysis; analysis of motion using block-matching with non-full search, e.g. three-step search
    • G01C 21/1656: Navigation by dead reckoning, integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments, with passive imaging devices, e.g. cameras
    • G05D 1/0251: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means with a video camera and image processing, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D 1/0253: Control of position or course in two dimensions, specially adapted to land vehicles, using optical position detecting means with a video camera and image processing, extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D 1/027: Control of position or course in two dimensions, specially adapted to land vehicles, using internal positioning means comprising inertial navigation means, e.g. azimuth detector
    • G05D 1/101: Simultaneous control of position or course in three dimensions, specially adapted for aircraft
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • H04N 5/145: Picture signal circuitry for the video frequency region; movement estimation
    • G05D 1/0278: Control of position or course in two dimensions, specially adapted to land vehicles, using signals provided by a source external to the vehicle, such as satellite positioning signals, e.g. GPS

Abstract

A motion estimation system is provided. The motion estimation system comprises two image sensors, each image sensor configured to obtain a first image at a first time and a second image at a second time; an inertial measurement unit (IMU) configured to obtain motion data for the time period between the first time and the second time; and a processing unit coupled to the IMU and the two image sensors, wherein the processing unit is configured to estimate motion by comparing the location of one or more features in the two first images with the location of the one or more features in the respective two second images, wherein the processing unit determines the location of the one or more features in at least one of the second images based at least in part on the IMU data for the time period between the first time and the second time.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is related to co-pending U.S. patent application Ser. No. __/______, filed on ______ entitled “SENSOR FUSION FOR NAVIGATION”, attorney docket number H0013072-5607, hereby incorporated herein by reference, and referred to herein as the “'13072 Application”.

  • BACKGROUND
  • The need to know one's location arises in many situations. For example, an operator of an aircraft, spacecraft, land vehicle, etc. needs to know the location of the vehicle or craft in order to properly maneuver the vehicle and avoid dangerous conditions. In addition, unmanned vehicles, such as unmanned aerial vehicles (UAVs) or robots, also need accurate position and velocity information in order to properly navigate an area.

  • Various systems have been developed to provide the needed position and/or velocity data. One such system uses an inertial measurement unit (IMU) to measure acceleration in a plurality of directions. The measured accelerations are then used to determine position and velocity in each of the measured directions. Unfortunately, the IMU measurements are subject to integration drift which negatively affects the accuracy of the IMU measurements. Therefore, a mechanism must be used to update the IMU measurements or otherwise improve the accuracy of the position and velocity data. One such mechanism involves the use of global positioning system (GPS) satellite data. The GPS data is used periodically to correct for errors in the IMU measurements. However, GPS satellite signals are not always available. For example, a GPS satellite signal may not be available when a vehicle is traveling in a city among tall buildings. When the GPS signal is not available, the IMU measurements are not updated.

  • SUMMARY
  • In one embodiment, a motion estimation system is provided. The motion estimation system comprises two image sensors, each image sensor configured to obtain a first image at a first time and a second image at a second time; an inertial measurement unit (IMU) configured to obtain motion data for the time period between the first time and the second time; and a processing unit coupled to the IMU and the two image sensors, wherein the processing unit is configured to estimate motion by comparing the location of one or more features in the two first images with the location of the one or more features in the respective two second images, wherein the processing unit determines the location of the one or more features in at least one of the second images based at least in part on the IMU data for the time period between the first time and the second time.

  • DRAWINGS
  • FIG. 1 is a high level block diagram depicting a motion estimation system according to one embodiment of the present invention.
  • FIG. 2 is an exemplary diagram depicting the location of an object at different times as detected by vision sensors according to one embodiment of the present invention.
  • FIG. 3 is a flow chart depicting a method of estimating motion using vision sensors according to one embodiment of the present invention.

  • DETAILED DESCRIPTION
  • In the following detailed description, reference is made to the accompanying drawings that form a part hereof, and in which is shown by way of illustration specific illustrative embodiments in which the invention may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the invention, and it is to be understood that other embodiments may be utilized and that logical, mechanical and electrical changes may be made without departing from the scope of the present invention. It should be understood that the exemplary method illustrated may include additional or fewer steps or may be performed in the context of a larger processing scheme. Furthermore, the method presented in the drawing figures or the specification is not to be construed as limiting the order in which the individual steps may be performed. The following detailed description is, therefore, not to be taken in a limiting sense.

  • Embodiments of the present invention enable accurate motion estimates even in the absence of a global positioning system satellite signal. In particular, embodiments of the present invention use an inertial measurement unit in conjunction with a stereo vision sensor to obtain the accurate motion estimates. Furthermore, embodiments of the present invention improve the processing rate necessary to use the stereo vision sensor to obtain motion estimates.

  • FIG. 1 is a high level block diagram depicting a motion estimation system 100 according to one embodiment of the present invention. System 100 includes a processing unit 106, an inertial measurement unit (IMU) 102, and a stereo vision sensor 104. Stereo vision sensor 104 and IMU 102 are each coupled to processing unit 106 and provide input data to processing unit 106 for estimating motion (e.g. position and/or velocity).
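
For illustration only, the data flow among these three components might be organized as below; the class and field names are assumptions made for this sketch, not identifiers defined in the patent.

```python
# Illustrative sketch of the FIG. 1 data flow; names are assumptions, not the patent's API.
from dataclasses import dataclass

import numpy as np


@dataclass
class ImuSample:
    accel: np.ndarray    # acceleration along three orthogonal axes (m/s^2)
    angular: np.ndarray  # angular motion about the same three axes
    timestamp: float     # seconds


@dataclass
class StereoFrame:
    left: np.ndarray     # image from the first image sensor (e.g. image A1)
    right: np.ndarray    # image from the second image sensor (e.g. image B1)
    timestamp: float


class ProcessingUnit:
    """Consumes IMU samples and stereo frames and produces motion estimates."""

    def update_imu(self, sample: ImuSample) -> None:
        """Integrate the measured motion to propagate position and velocity."""

    def update_stereo(self, frame: StereoFrame) -> None:
        """Correlate features between frames to correct the IMU-based estimate."""
```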

  • In this example, IMU 102 measures motion in six degrees of freedom. In particular, IMU 102 includes sensors to measure acceleration along three orthogonal coordinate axes and angular acceleration about each of the three orthogonal axes. For example, IMU 102 can include, but is not limited to, three linear accelerometers configured to obtain acceleration along the three coordinate axes, and three gyroscopes to measure angular acceleration about the same three coordinate axes. In other words, the gyroscope measurements are used to estimate the attitude or orientation of a vehicle and the accelerometer measurements are used to estimate position and velocity, including the effects of gravity. Therefore, IMU 102 provides data to processing unit 106 which is used to determine position based on the motion measured by IMU 102.

  • However, measurements provided by IMU 102 suffer from integration drift. In other words, position and velocity are obtained by integrating the acceleration measurements provided by IMU 102. This integration increases errors in position and velocity due to errors in the measured acceleration. Acceleration measurements can suffer from interference (i.e. noisy signals), and the accelerometers and gyroscopes often have a certain level of bias associated with them. For example, a bias of 1 milli-g (9.8/1000 m/s²) means that even though a vehicle in which system 100 is located is stationary, sensors in IMU 102 will measure 1 milli-g of acceleration. This error is further magnified as the measured acceleration is integrated to obtain position and velocity. Therefore, the calculated position based on IMU 102 measurements becomes increasingly less reliable with time.
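
As a worked illustration of the drift described above, double-integrating the patent's example bias of 1 milli-g gives a position error that grows quadratically with time:

```python
# Position error produced by a constant 1 milli-g accelerometer bias when the
# measurement is integrated twice (velocity, then position) with no correction.
bias = 9.8e-3  # 1 milli-g, in m/s^2

for t in (1.0, 10.0, 60.0):                # seconds of unaided integration
    velocity_error = bias * t              # m/s
    position_error = 0.5 * bias * t ** 2   # m
    print(f"t = {t:4.0f} s   v_err = {velocity_error:.3f} m/s   p_err = {position_error:.2f} m")
# After 60 s the position error alone already exceeds 17 m, which is why the
# IMU-only position becomes increasingly unreliable with time.
```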

  • Positions based on IMU 102 measurements can be corrected or updated with measurements from an optional Global Positioning System (GPS) sensor 108. Position data from GPS sensor 108 does not drift the way data from IMU 102 does. However, GPS data is not always available. For example, the GPS satellite signal can be lost when traveling near moist trees, tall buildings, etc. In such cases, the measurements from IMU 102 cannot be updated by data from GPS sensor 108, and the calculated position becomes increasingly unreliable the longer the GPS satellite signal is unavailable.

  • Positions based on IMU 102 can also be updated or corrected with measurements from stereo vision sensor 104. Stereo vision sensor 104 can be used in conjunction with GPS sensor 108. For example, stereo vision sensor 104 can be used whenever a GPS satellite signal is unavailable, or concurrently with GPS sensor 108 regardless of the availability of a GPS satellite signal. Alternatively, GPS sensor 108 can be omitted in some embodiments. In such embodiments, data from IMU 102 is updated only with measurements from stereo vision sensor 104.

  • Stereo vision sensor 104 comprises at least two image sensors 112, each configured to obtain images of an area near the system 100. Image sensors 112 are implemented as any appropriate device for collecting image data including, but not limited to, a visible light camera, a laser system, an infrared camera, and any other existing or later developed imaging technology. By correlating the location of features in an image obtained from each sensor at a first moment in time to the location of features in an image obtained from each sensor at a second moment in time, processing unit 106 is able to determine the position and velocity of the vehicle in which system 100 is located, using techniques known to one of skill in the art. Although the position measurements obtained from stereo vision sensor 104 suffer from less drift than measurements from IMU 102, measurements from stereo vision sensor 104 are time intensive. For example, correlation of features can take up to 30 seconds to complete using conventional correlation techniques. The amount of time required for conventional techniques, therefore, renders stereo vision sensor 104 ineffective for use in many vehicles, especially in vehicles moving at relatively high velocities compared to the time required for correlation using conventional techniques.

  • Embodiments of the present invention, however, enable a relatively quick correlation process compared to conventional methods. Therefore, embodiments of the present invention enable the effective use of stereo vision sensor 104 to update or correct measurements obtained from IMU 102. In particular, processing unit 106 is configured to estimate the location of features in images obtained at the second moment in time based on data received from IMU 102 and the location of features in images obtained at the first moment in time. The estimation of the location of features in the second images is described in more detail below with regard to FIGS. 2 and 3. Processing unit 106 is therefore able to locate the features in the second images more quickly by searching for the features in a smaller area located around their estimated location. Once the actual location of the features is identified, processing unit 106 can calculate the change in position of the vehicle using techniques known to one of skill in the art.
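
The search-narrowing idea in this paragraph can be outlined as follows; the `predict` and `local_search` callables stand for the IMU propagation and windowed search described with FIGS. 2 and 3 and are placeholders assumed for this sketch, not functions defined by the patent.

```python
from typing import Callable, Dict, Tuple

Pixel = Tuple[float, float]


def correlate_with_imu_aid(
    features_t1: Dict[int, Pixel],
    predict: Callable[[Pixel], Pixel],       # propagate a T1 location with IMU data
    local_search: Callable[[Pixel], Pixel],  # find the feature near the prediction in the T2 image
) -> Dict[int, Pixel]:
    """Locate each T1 feature in the T2 image by searching only near its prediction."""
    features_t2 = {}
    for feature_id, pixel_t1 in features_t1.items():
        predicted = predict(pixel_t1)                       # estimated location at T2
        features_t2[feature_id] = local_search(predicted)   # small-area search only
    return features_t2
```

Comparing the returned T2 locations with the T1 locations then yields the change in position, as described above.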

  • Processing unit 106 uses instructions for carrying out the various process tasks, calculations, and generation of signals and other data used in the operation of system 100, such as to correlate the location of features in images and to determine the velocity and position of the vehicle in which system 100 is located. The instructions can be implemented in software, firmware, analog or digital electronics, or any computer readable instructions. These instructions are typically stored on any appropriate computer readable medium used for storage of computer readable instructions or data structures. Such computer readable media can be any available media that can be accessed by a general purpose or special purpose computer or processor, or any programmable logic device.

  • Suitable computer readable media may comprise, for example, non-volatile memory devices including semiconductor memory devices such as EPROM, EEPROM, or flash memory devices; magnetic disks such as internal hard disks or removable disks (e.g., floppy disks); magneto-optical disks; CDs, DVDs, or other optical storage disks; nonvolatile ROM, RAM, and other like media. Any of the foregoing may be supplemented by, or incorporated in, specially-designed application-specific integrated circuits (ASICs). When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a computer readable medium. Thus, any such connection is properly termed a computer readable medium. Combinations of the above are also included within the scope of computer readable media.

  • Due to the quicker correlation time, system 100 is able to use data from stereo vision sensor 104 to correct or update the position calculated based on data from IMU 102. In some embodiments, the calculated position based on data from IMU 102 is combined with the calculated position from stereo vision sensor 104. In other embodiments, the calculated position from stereo vision sensor 104 is used in place of the calculated position based on data from IMU 102. In such embodiments, IMU 102 is essentially used only to estimate the location of features in images captured by stereo vision sensor 104.

  • In operation, IMU 102 measures acceleration of a vehicle in six degrees of freedom. The measurements from IMU 102 are passed to processing unit 106. In addition, images captured from each image sensor 112 in stereo vision sensor 104 are passed to processing unit 106. In particular, each of the two sensors in stereo vision sensor 104 captures an image at a time T1 and at a second, later time T2. Processing unit 106 is configured to locate one or more features in each of the two images captured at time T1. Processing unit 106 also correlates the location of the one or more features in one of the two images with the location of the one or more features in the other image.

  • Processing unit 106 is also configured to calculate an estimated position and velocity using the measured acceleration from IMU 102. The estimated position and velocity are used by processing unit 106, together with the location of the one or more features in the images captured at time T1, to estimate the location of the one or more features in at least one of the images captured at time T2. Processing unit 106 then determines the actual location of the one or more features in the image captured at time T2 by focusing on a small area around the estimated position. The small area is defined by the possible error in the measurements of IMU 102. Once the actual location of the one or more features is determined, processing unit 106 calculates the position and velocity of the vehicle based on a comparison of the location of the one or more features between the images captured at time T1 and the images captured at time T2, using techniques known to one of skill in the art.
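
The patent leaves this final T1-to-T2 comparison to techniques known in the art. One common choice, given here only as an assumed example, is to triangulate each correlated feature into 3-D with the stereo pair and fit the rigid motion between the two point sets with an SVD-based (Kabsch) solution:

```python
import numpy as np


def rigid_motion(points_t1: np.ndarray, points_t2: np.ndarray):
    """Least-squares R, t such that points_t2 ~ R @ points_t1 + t.

    points_t1, points_t2 : (N, 3) arrays holding the same features
    triangulated from the stereo pair at times T1 and T2.
    """
    c1, c2 = points_t1.mean(axis=0), points_t2.mean(axis=0)
    h = (points_t1 - c1).T @ (points_t2 - c2)       # 3x3 cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))          # guard against a reflection
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = c2 - r @ c1
    return r, t
```

Dividing the recovered translation by the interval (T2 - T1) gives a velocity estimate; the vehicle's own motion is the inverse of the scene motion recovered here.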

  • Processing unit 106 then combines the estimated positions and velocities from IMU 102 and stereo vision sensor 104 to obtain a more accurate estimate in some embodiments. Alternatively, processing unit 106 uses only the motion estimate from stereo vision sensor 104. The more accurate estimate is then optionally displayed on a display element 110 in some embodiments. For example, an automobile using system 100 can use display element 110 to show a driver of the automobile where the vehicle is located on a map. In other embodiments, processing unit 106 uses the more accurate estimate to determine the actions necessary to reach a programmed destination. In some such embodiments, processing unit 106 generates control signals which are sent to one or more movement actuators 112 to control the movement of the vehicle. For example, processing unit 106 can control the flight of an unmanned aerial vehicle (UAV) based on control signals transmitted to movement actuators (such as the throttle, wing flaps, etc.) in the UAV to control the pitch, yaw, thrust, etc. of the UAV.

  • FIG. 2 is an exemplary diagram depicting the location of an object 208 at different times as detected by a stereo vision sensor (e.g. stereo vision sensor 104 in FIG. 1) according to one embodiment of the present invention. In particular, a first image sensor captures image A1 at time T1 and image A2 at time T2. Similarly, a second image sensor captures image B1 at time T1 and image B2 at time T2. The image sensors are implemented as any appropriate device for collecting image data including, but not limited to, a visible light camera, a thermographic camera, a spectrometer, an infra-red sensor, a millimeter wave radar system, and any other existing or later developed imaging technology. Also, images A1, B1, A2, and B2 are provided by way of explanation and not by way of limitation. It is to be understood that object 208 and images A1, B1, A2, and B2 will vary in operation depending on the area near the vehicle in which a motion estimating system (e.g. system 100) is located. In particular, although object 208 is shown as a rectangle in FIG. 2, it is to be understood that in other embodiments object 208 can be any appropriate object from which a vehicle's relative position can be determined. For example, object 208 can be a tunnel, building, tower, tree, river, etc.

  • Object 208 is comprised of a plurality of features 202-1 . . . 202-N. A feature, as used herein, refers to any set of one or more pixels that have a high contrast with respect to the surrounding pixels in images A1, B1, A2, and B2. Also shown in each of the images A1, B1, A2, and B2 are coordinate axes 204. Coordinate axes 204 are provided for purposes of explanation, but are not necessary for operation of embodiments of the present invention. In particular, coordinate axes 204 help demonstrate the relative position of object 208 in each of images A1, B1, A2, and B2.

  • In operation, images A1 and B1 are each obtained by a sensor (e.g. sensors 112 in FIG. 1) at time T1. As can be seen in FIG. 2, object 208 is offset differently from center 210 in image A1 than in image B1. This difference is due to the relative position of the two sensors which capture images A1 and B1. The two image sensors are fixed on a rigid bar and set apart from each other. Once images A1 and B1 are captured, a processing unit (e.g. processing unit 106) locates features 202-1 . . . 202-N in each of images A1 and B1 and correlates their respective locations. For example, the processing unit correlates the location of feature 202-1 in image A1 with the location of feature 202-1 in image B1. Any appropriate technique can be used for locating features 202-1 . . . 202-N, such as the Harris corner detector or the Kanade-Lucas-Tomasi (KLT) corner detector.
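
One possible realization of this step, using OpenCV as an assumed stand-in for the Harris/KLT detectors named above (single-channel grayscale images assumed):

```python
import cv2
import numpy as np


def detect_and_correlate(image_a1: np.ndarray, image_b1: np.ndarray):
    """Find corner features in image A1 and correlate them with image B1."""
    # Shi-Tomasi/KLT-style corners in the first sensor's image.
    corners_a1 = cv2.goodFeaturesToTrack(image_a1, maxCorners=200,
                                         qualityLevel=0.01, minDistance=7)
    # Pyramidal Lucas-Kanade tracking locates the same features in the
    # second sensor's image, giving the A1-to-B1 correlation.
    corners_b1, status, _ = cv2.calcOpticalFlowPyrLK(image_a1, image_b1,
                                                     corners_a1, None)
    keep = status.ravel() == 1
    return corners_a1[keep], corners_b1[keep]
```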

  • At time T2, images A2 and B2 are captured. Once images A2 and B2 are captured, the processing unit locates and correlates features 202-1 . . . 202-N in at least one of images A2 and B2 with one of images A1 and B1. In this example, the processing unit locates features 202-1 . . . 202-N in image A2 and correlates their location with features 202-1 . . . 202-N in image A1. Once the features are correlated, the position and attitude of the vehicle can be determined using techniques known to one of skill in the art. This process can be very time intensive because of the changes in position of object 208 from time T1 to time T2.

  • For example, in this embodiment, the vehicle has banked and turned to the left as well as pitched downward. The movement of the vehicle causes object 208, as depicted in image A2, to move and rotate to the right and to move up in the image. In addition, the vehicle has moved toward object 208, causing object 208 to appear larger in image A2. Therefore, features 202-1 . . . 202-N in image A2 are not located near the original position (e.g. pixel location) of features 202-1 . . . 202-N in image A1. Locating and correlating features 202-1 . . . 202-N requires large amounts of processing time in conventional systems. Although the process of correlating features in an image captured at time T2 with an image captured at time T1 is time intensive in conventional systems, embodiments of the present invention enable a relatively quick correlation of features.

  • The processing unit uses data received from an IMU (e.g. IMU 102 in FIG. 1) to determine an approximate location of features 202-1 . . . 202-N in images A2 and B2. For example, the processing unit uses the location of feature 202-1 in image A1 as a starting point and propagates the location of feature 202-1 forward based on data received from the IMU over the time period between time T1 and time T2. In this embodiment, in propagating the location forward, the processing unit also identifies area 206 in which feature 202-1 is located. Area 206 results from the known approximate error of measurements obtained from the IMU. For example, if the IMU measures a lateral movement of 2.0 meters to the right over a time period of 1 second but has a known error of 0.1 meters/second, the actual location of feature 202-1 is between 1.9 and 2.1 meters to the right of its location at time T1. The processing unit then evaluates the pixels in area 206 to identify the actual location of feature 202-1 in image A2. Alternatively, the processing unit evaluates pixels successively further away from the estimated location of feature 202-1, starting with pixels adjacent to the estimated location, until the actual pixel location is identified. In such embodiments, the approximate error in the IMU data does not need to be known.
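
A minimal sketch of the propagation that defines area 206, assuming a calibrated pinhole camera, a feature already triangulated at T1, and a known per-interval IMU error (all assumptions of this example, not details specified by the patent):

```python
import numpy as np


def predict_search_window(point_t1: np.ndarray,
                          r_imu: np.ndarray, t_imu: np.ndarray,
                          camera_matrix: np.ndarray,
                          imu_error_m: float, depth_m: float):
    """Propagate a T1 feature with the IMU motion and bound the T2 search area.

    point_t1      : (3,) feature position at T1, in the camera frame (from stereo)
    r_imu, t_imu  : transform from the T1 camera frame to the T2 camera frame,
                    integrated from the IMU data over T1..T2
    camera_matrix : 3x3 intrinsic matrix of the image sensor
    imu_error_m   : known approximate IMU error over the interval, in metres
    depth_m       : approximate feature depth, used to convert that error to pixels
    """
    point_t2 = r_imu @ point_t1 + t_imu        # feature expressed in the T2 camera frame
    uvw = camera_matrix @ point_t2
    center = uvw[:2] / uvw[2]                  # predicted pixel (centre of area 206)
    focal = camera_matrix[0, 0]
    radius_px = int(np.ceil(focal * imu_error_m / depth_m))
    return center, radius_px                   # search only this window
```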

  • The processing unit can use conventional techniques, such as KLT corner detection, to locate feature 202-1 in area 206. However, by focusing on area 206 of image A2 rather than the entire image, the processing unit is able to locate feature 202-1 much more quickly than conventional systems, which have to search larger areas. The processing unit uses IMU data to estimate the location of each of features 202-1 . . . 202-N. It is not necessary to correlate both of images A2 and B2 with images A1 and B1, respectively, because once the actual location of the features in one of images A2 and B2 is determined, the processing unit can calculate position and velocity, which can be used to determine the actual location of the features in the other of images A2 and B2.

  • FIG. 3 is a flow chart depicting an exemplary method 300 of estimating motion using vision sensors according to one embodiment of the present invention. Method 300 can be used in a motion estimation system such as system 100 above. In particular, in some embodiments, method 300 is implemented in a computer readable medium for use by a processing unit (such as processing unit 106). At 302, image data is received for two images captured at a first time T1. In particular, in this embodiment, two image sensors (e.g. sensors 112) each capture an image at time T1 and pass the data to a processing unit. At 304, the processing unit locates one or more features in both of the images. It is to be understood that the one or more features are the same in each of the images, with only the relative location and/or size of the features being different. In addition, locating the one or more features includes correlating the relative location of the features in one image with the location of the features in the other image. The features can be located using any technique known to one of skill in the art, such as KLT corner detection or Harris corner detection.
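
For step 304, the Harris corner measure mentioned above can be sketched directly from the structure tensor. The sketch below is a minimal NumPy/SciPy version, assuming a grayscale image; the window size and the constant k = 0.04 are conventional choices, not values taken from the patent.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def harris_response(img, k=0.04, window=5):
    """Harris corner response R = det(M) - k * trace(M)^2 for a grayscale
    image; large R marks corner-like features suitable for tracking."""
    gray = img.astype(np.float64)
    iy, ix = np.gradient(gray)                  # image gradients
    sxx = uniform_filter(ix * ix, size=window)  # structure tensor entries,
    syy = uniform_filter(iy * iy, size=window)  # summed over a local window
    sxy = uniform_filter(ix * iy, size=window)
    det_m = sxx * syy - sxy * sxy
    trace_m = sxx + syy
    return det_m - k * trace_m * trace_m

def strongest_corners(img, n=50):
    """Pixel coordinates (x, y) of the n largest Harris responses.
    (A real detector would also apply non-maximum suppression.)"""
    r = harris_response(img)
    flat = np.argsort(r.ravel())[-n:]
    ys, xs = np.unravel_index(flat, r.shape)
    return list(zip(xs.tolist(), ys.tolist()))
```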

  • At 306, image data is received for two images captured at a second time T2 from the two image sensors. Time T2 occurs after time T1. In addition, the subject matter of the images captured at time T2 and time T1 is the same; that is, each image captured at the second time T2 corresponds to one of the images captured at the first time T1. At 308, motion data is received from an inertial measurement unit (e.g. IMU 102). In particular, acceleration measurements in six degrees of freedom are received by the processing unit in this embodiment. The processing unit evaluates the IMU data to calculate an estimate for position and/or velocity.
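
A greatly simplified sketch of step 308 follows: integrating the six-degree-of-freedom IMU samples over [T1, T2] into position, velocity, and attitude increments. It assumes the accelerations are already gravity-compensated and expressed in the navigation frame, which a real strapdown mechanization would have to handle explicitly; the function and variable names are illustrative only.

```python
import numpy as np

def integrate_imu(accels, gyros, dt):
    """Crude dead reckoning over [T1, T2].

    accels: (N, 3) accelerations assumed gravity-compensated and already in
            the navigation frame (a simplification; a real mechanization
            would rotate each sample using the attitude propagated from gyros).
    gyros:  (N, 3) angular rates in rad/s.
    dt:     sample period in seconds.
    """
    velocity = np.zeros(3)
    position = np.zeros(3)
    for a in np.asarray(accels, dtype=float):
        velocity += a * dt           # first integration: acceleration -> velocity
        position += velocity * dt    # second integration: velocity -> position
    # Small-angle attitude increment (the bank/turn/pitch change described above).
    attitude_delta = np.sum(np.asarray(gyros, dtype=float) * dt, axis=0)
    return position, velocity, attitude_delta
```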

  • At 310, the processing unit estimates the location of the one or more features in at least one of the images captured at time T2. The location estimate is based on the IMU data received at 308 and the location of the features in the images captured at time T1. For example, the processing unit calculates the motion that occurred from time T1 to time T2 based on the IMU data. Based on the calculated motion and the pixel location of the one or more features located at 304 in one of the first images, the processing unit selects a second pixel location in one of the second images for each of the one or more features.

  • The processing unit then evaluates pixels near the second pixel location to identify the actual pixel location of the feature. In addition, in some embodiments, the processing unit calculates an area to be searched around the second pixel location of each feature based on the known approximate error in the IMU measurements. By limiting the search to an area defined by the known approximate error, the processing unit searches more efficiently by not unnecessarily evaluating pixels outside the area. In other embodiments, the processing unit does not limit the search to a calculated area. In such embodiments, the processing unit successively evaluates pixels near the second pixel location until the actual pixel location of the corresponding feature is identified, starting with pixels adjacent to the second pixel location. In such embodiments, the approximate error in the IMU data does not need to be known.
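
The two search strategies above, a bounded window sized by the known IMU error or an outward-expanding search when that error is unknown, can be illustrated with a simple sum-of-squared-differences match. The acceptance threshold and maximum radius below are arbitrary illustrative values, not parameters from the patent.

```python
import numpy as np

def ssd(a, b):
    """Sum of squared differences between two equal-sized patches."""
    d = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sum(d * d))

def expanding_search(img2, template, predicted_top_left, max_radius=20, accept=1e3):
    """Look for `template` (cut from the first image around a feature) in the
    second image, testing candidate positions at increasing Chebyshev distance
    from the IMU-predicted top-left position until an acceptable match is found."""
    h, w = template.shape
    py, px = predicted_top_left
    best_pos, best_score = None, np.inf
    for r in range(max_radius + 1):
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                if max(abs(dy), abs(dx)) != r:      # only the ring at radius r
                    continue
                y, x = py + dy, px + dx
                if y < 0 or x < 0 or y + h > img2.shape[0] or x + w > img2.shape[1]:
                    continue
                score = ssd(template, img2[y:y + h, x:x + w])
                if score < best_score:
                    best_pos, best_score = (y, x), score
        if best_score <= accept:                    # good enough: stop expanding
            break
    return best_pos, best_score
```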

  • The processing unit evaluates pixels using techniques known to one of skill in the art to determine the actual location of the features in at least one of the images captured at time T2. For example, such techniques include but are not limited to the Harris corner detection or KLT corner detection algorithms. In addition, determining the actual location of the features also includes correlating the location of features in at least one of the images captured at time T2 to the location of the features in the corresponding image captured at time T1. In some embodiments, the location of the one or more features is determined in both of the images captured at time T2 based on the IMU data, as described above. However, this is not required. Once the pixel location of features in one of the second images is correlated with the pixel location of the features in the corresponding image obtained at time T1, the change in pixel location can be used to determine the actual pixel location of the features in the other image obtained at time T2.
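
The transfer of correspondences to the second camera can be approximated as follows. This sketch assumes the two sensors are rigidly mounted and close together, so that a feature's pixel displacement is nearly the same in both views; that is an illustrative simplification the patent does not state, and the result would only serve as a starting guess to be refined.

```python
import numpy as np

def transfer_to_other_image(pts_b1, pts_a1, pts_a2):
    """Predict feature locations in image B2 from the pixel motion already
    measured in camera A, instead of running a second full correlation.
    pts_* are (N, 2) arrays of matching (x, y) feature coordinates."""
    motion_a = np.asarray(pts_a2, dtype=float) - np.asarray(pts_a1, dtype=float)
    return np.asarray(pts_b1, dtype=float) + motion_a   # starting guesses for B2
```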

  • At 312, the processing unit calculates a motion estimate (e.g. position and/or velocity) based on a comparison of the location of the one or more features in the first images captured at time T1 with the location of the one or more features in the second images captured at time T2, using techniques known to one of skill in the art. In addition, in some embodiments, the processing unit calculates a motion estimate based on the IMU data and combines the IMU motion estimate with the motion estimate based on the comparison of feature locations to obtain a composite motion estimate. Similarly, in some embodiments, a GPS signal is also received and the processing unit calculates a motion estimate based on the GPS signal when available. In some such embodiments, the GPS signal and IMU data are used exclusively when the GPS signal is available, and the image sensor data is only used when the GPS signal is not available.
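
Step 312's combination of estimates might look like the following inverse-variance blend plus a GPS preference switch. The weighting scheme and the variance inputs are assumptions for illustration; the patent only states that the estimates are combined and that GPS is preferred when available.

```python
import numpy as np

def fuse_estimates(est_vision, var_vision, est_imu, var_imu):
    """Inverse-variance blend of the image-based and IMU-based motion
    estimates into a single composite estimate."""
    w_v, w_i = 1.0 / var_vision, 1.0 / var_imu
    est_vision = np.asarray(est_vision, dtype=float)
    est_imu = np.asarray(est_imu, dtype=float)
    return (w_v * est_vision + w_i * est_imu) / (w_v + w_i)

def select_motion_estimate(gps_available, est_gps_imu, est_vision_imu):
    """Prefer the GPS/IMU solution when a fix exists; otherwise fall back to
    the vision/IMU solution, mirroring the embodiment described above."""
    return est_gps_imu if gps_available else est_vision_imu
```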

  • Although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement, which is calculated to achieve the same purpose, may be substituted for the specific embodiment shown. This application is intended to cover any adaptations or variations of the present invention. Therefore, it is manifestly intended that this invention be limited only by the claims and the equivalents thereof.

Claims (20)

1. A motion estimation system, comprising:

two image sensors, each image sensor configured to obtain a first image at a first time and a second image at a second time;

an inertial measurement unit (IMU) configured to obtain motion data for the time period between the first time and the second time; and

a processing unit coupled to the IMU and the two image sensors, wherein the processing unit is configured to estimate motion by comparing the location of one or more features in the two first images with the location of the one or more features in the respective two second images, wherein the processing unit determines the location of the one or more features in at least one of the second images based at least in part on the IMU data for the time period between the first time and the second time.

2. The motion estimation system of claim 1, wherein the processing unit is further configured to determine a first pixel location of the one or more features in each first image, to select a second pixel location in the at least one second image based on the IMU data, and to evaluate pixels near the second pixel location to identify the actual pixel location of the one or more features in the at least one second image.

3. The motion estimation system of claim 1, further comprising a motion actuator coupled to the processing unit, wherein the motion actuator is configured to control motion based on control signals received from the processing unit.

4. The motion estimation system of claim 1, further comprising a display element coupled to the processing unit, wherein the display element is configured to display motion estimates based on signals received from the processing unit.

5. The motion estimation system of claim 1, wherein each of the two image sensors includes one of a visible light camera, a laser system, or an infrared camera.

6. The motion estimation system of claim 1, wherein the processing unit is configured to combine a motion estimate from the IMU data and a motion estimate from the image sensors to obtain a composite motion estimate.

7. The motion estimation system of claim 1, further comprising a global positioning system (GPS) sensor coupled to the processing unit, wherein the processing unit is configured to estimate motion based on data received from the GPS sensor, IMU, and two image sensors.

8. The motion estimation system of claim 7, wherein the processing unit is configured to estimate motion based on data received from the GPS sensor and IMU when a GPS signal is available and to estimate motion based solely on data from the two image sensors and IMU when the GPS signal is not available.

9. A method of estimating motion, the method comprising:

receiving a first image from each of two image sensors at a first time;

locating one or more features in each of the first images;

receiving a second image from each of the two image sensors at a second time;

receiving data from an inertial measurement unit (IMU) for a time period between the first and second times;

locating the one or more features in at least one of the second images based at least in part on the IMU data; and

calculating a motion estimate based on a comparison of the location of the one or more features in the first images to the location of the one or more features in the second images.

10. The method of estimating motion of claim 9, wherein locating the one or more features includes locating the one or more features using one of a Kanade-Lucas-Tomasi (KLT) corner detection algorithm or a Harris corner detection algorithm.

11. The method of estimating motion of claim 9, wherein locating one or more features in each of the first images includes correlating the relative location of the features in one of the first images with the location of the features in the other first image.

12. The method of estimating motion of claim 9, further comprising combining a motion estimate based on IMU data with the motion estimate based on a comparison of the location of the one or more features in the first images to the location of the one or more features in the second images.

13. The method of estimating motion of claim 9, further comprising:

receiving a global positioning system (GPS) signal; and

estimating motion based, at least in part, on the GPS signal when the GPS signal is available.

14. The method of estimating motion of claim 9, wherein locating the one or more features in at least one of the second images based, at least in part, on the IMU data includes:

determining a first pixel location of the one or more features in each first image;

selecting a second pixel location in the at least one second image based on the IMU data and the first pixel location; and

evaluating pixels near the second pixel location to identify the actual pixel location of the one or more features in the at least one second image.

15. The method of estimating motion of claim 14, wherein evaluating pixels near the second pixel location includes evaluating pixels in an area around the second pixel location, wherein the size of the area is determined based on the approximate error in the IMU data.

16. A program product comprising program instructions embodied on a processor-readable medium for execution by a programmable processor, wherein the program instructions are operable to cause the programmable processor to:

locate one or more features in each of two first images received at a first time;

evaluate data received from an inertial measurement unit (IMU) for a time period between the first time and a second time;

locate the one or more features in at least one of two second images received at the second time based at least in part on the IMU data;

calculate a motion estimate based, at least in part, on a comparison of the location of the one or more features in the first images to the location of the one or more features in the second images; and

output the calculated motion estimate.

17. The program product of claim 16, wherein the program instructions are further operable to cause the programmable processor to:

calculate a motion estimate based on the IMU data; and

combine the IMU motion estimate with the motion estimate based on a comparison of the location of the one or more features in the first images to the location of the one or more features in the second images.

18. The program product of claim 16, wherein the program instructions are further operable to cause the programmable processor to locate the one or more features using one of a Kanade-Lucas-Tomasi (KLT) corner detection algorithm or a Harris corner detection algorithm.

19. The program product of claim 16, wherein the program instructions are further operable to cause the programmable processor to:

determine a first pixel location of the one or more features in each first image;

select a second pixel location in the at least one second image based on the IMU data and the first pixel location; and

evaluate pixels near the second pixel location to identify the actual pixel location of the one or more features in the at least one second image.

20. The program product of claim 19, wherein the program instructions are further operable to cause the programmable processor to:

evaluate pixels in an area around the second pixel location to identify the actual pixel location of the one or more features in the at least one second image, wherein the area is determined by an approximate error in the IMU data.

US11/673,893 2007-02-12 2007-02-12 System and method for motion estimation using vision sensors Abandoned US20080195316A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/673,893 US20080195316A1 (en) 2007-02-12 2007-02-12 System and method for motion estimation using vision sensors
IL189454A IL189454A0 (en) 2007-02-12 2008-02-12 System and method for motion estimation using vision sensors
GB0802629A GB2446713A (en) 2007-02-12 2008-02-12 Image feature motion estimation based in part on inertial measurement data

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/673,893 US20080195316A1 (en) 2007-02-12 2007-02-12 System and method for motion estimation using vision sensors

Publications (1)

Publication Number Publication Date
US20080195316A1 true US20080195316A1 (en) 2008-08-14

Family

ID=39247558

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/673,893 Abandoned US20080195316A1 (en) 2007-02-12 2007-02-12 System and method for motion estimation using vision sensors

Country Status (3)

Country Link
US (1) US20080195316A1 (en)
GB (1) GB2446713A (en)
IL (1) IL189454A0 (en)

Cited By (55)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100010741A1 (en) * 2008-07-10 2010-01-14 Lockheed Martin Missiles And Fire Control Inertial measurement with an imaging sensor and a digitized map
US20100256907A1 (en) * 2009-04-06 2010-10-07 Honeywell International Inc. Technique to improve navigation performance through carouselling
CN102052924A (en) * 2010-11-25 2011-05-11 哈尔滨工程大学 Combined navigation and positioning method of small underwater robot
US20110141485A1 (en) * 2009-12-16 2011-06-16 Industrial Technology Research Institute System and Method for Localizing a Carrier, Estimating a Posture of the Carrier and Establishing a Map
US20110218733A1 (en) * 2010-03-04 2011-09-08 Honeywell International Inc. Method and apparatus for vision aided navigation using image registration
US20120084457A1 (en) * 2009-09-18 2012-04-05 Kabushiki Kaisha Toshiba Relay device, relay method and relay system
US20140036072A1 (en) * 2012-06-20 2014-02-06 Honeywell International Inc. Cargo sensing
EP2434256A3 (en) * 2010-09-24 2014-04-30 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking
WO2014076294A1 (en) * 2012-11-19 2014-05-22 Inria Institut National De Recherche En Informatique Et En Automatique Method for determining, in a fixed 3d frame of reference, the location of a moving craft, and associated computer program and device
WO2014079632A1 (en) * 2012-11-26 2014-05-30 Robert Bosch Gmbh Autonomous transportation device
WO2015073406A1 (en) * 2013-11-13 2015-05-21 Elwha Llc Dead reckoning system for vehicles
CN104821134A (en) * 2014-02-05 2015-08-05 财团法人工业技术研究院 method and system for generating indoor map
JP2015193347A (en) * 2014-03-31 2015-11-05 三菱プレシジョン株式会社 Inertia navigation system of missile
US20160068267A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Context-based flight mode selection
US20160070264A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Velocity control for an unmanned aerial vehicle
JP2016045767A (en) * 2014-08-25 2016-04-04 株式会社豊田中央研究所 Motion amount estimation device and program
FR3029281A1 (en) * 2014-12-01 2016-06-03 Commissariat Energie Atomique ELECTRONIC METHOD AND COMPUTER FOR DETERMINING THE TRACK OF A MOBILE OBJECT
WO2016187760A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
WO2017008246A1 (en) * 2015-07-14 2017-01-19 SZ DJI Technology Co., Ltd. Method, apparatus, and system for determining a movement of a mobile platform
US20170045895A1 (en) * 2014-12-31 2017-02-16 SZ DJI Technology Co., Ltd. Selective processing of sensor data
EP3158411A4 (en) * 2015-05-23 2017-07-05 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
EP3155369A4 (en) * 2015-06-26 2017-07-19 SZ DJI Technology Co., Ltd. System and method for measuring a displacement of a mobile platform
WO2017206179A1 (en) * 2016-06-03 2017-12-07 SZ DJI Technology Co., Ltd. Simple multi-sensor calibration
WO2018028649A1 (en) * 2016-08-10 2018-02-15 纳恩博(北京)科技有限公司 Mobile device, positioning method therefor, and computer storage medium
WO2018055591A1 (en) * 2016-09-23 2018-03-29 DunAn Precision, Inc. Eye-in-hand visual inertial measurement unit
WO2018127328A1 (en) * 2017-01-03 2018-07-12 Valeo Schalter Und Sensoren Gmbh Determining movement information with environment sensors
US10037028B2 (en) 2015-07-24 2018-07-31 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for on-board sensing and control of micro aerial vehicles
EP3159123A4 (en) * 2014-06-17 2018-08-08 Yujin Robot Co., Ltd. Device for controlling driving of mobile robot having wide-angle cameras mounted thereon, and method therefor
CN108426576A (en) * 2017-09-15 2018-08-21 辽宁科技大学 Aircraft paths planning method and system based on identification point vision guided navigation and SINS
WO2018182524A1 (en) * 2017-03-29 2018-10-04 Agency For Science, Technology And Research Real time robust localization via visual inertial odometry
US10178316B2 (en) 2014-07-17 2019-01-08 Elbit Systems Ltd. Stabilization and display of remote images
JP2019023865A (en) * 2018-07-12 2019-02-14 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method, system, and program for performing error recovery
US10240930B2 (en) 2013-12-10 2019-03-26 SZ DJI Technology Co., Ltd. Sensor fusion
US10315306B2 (en) 2015-10-21 2019-06-11 F Robotics Acquisitions Ltd. Domestic robotic system
CN110006423A (en) * 2019-04-04 2019-07-12 北京理工大学 An Adaptive Inertial Navigation and Vision Combined Navigation Method
US10395115B2 (en) 2015-01-27 2019-08-27 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for robotic remote sensing for precision agriculture
US10429839B2 (en) 2014-09-05 2019-10-01 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping
CN110782459A (en) * 2019-01-08 2020-02-11 北京嘀嘀无限科技发展有限公司 Image processing method and device
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
US10732647B2 (en) 2013-11-27 2020-08-04 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
CN111792034A (en) * 2015-05-23 2020-10-20 深圳市大疆创新科技有限公司 Method and system for estimating state information of movable objects using sensor fusion
US20200409385A1 (en) * 2019-06-28 2020-12-31 Ford Global Technologies, Llc Vehicle visual odometry
CN112179338A (en) * 2020-09-07 2021-01-05 西北工业大学 Low-altitude unmanned aerial vehicle self-positioning method based on vision and inertial navigation fusion
US10884430B2 (en) 2015-09-11 2021-01-05 The Trustees Of The University Of Pennsylvania Systems and methods for generating safe trajectories for multi-vehicle teams
US10914590B2 (en) * 2014-03-24 2021-02-09 SZ DJI Technology Co., Ltd. Methods and systems for determining a state of an unmanned aerial vehicle
WO2022019975A1 (en) * 2020-07-22 2022-01-27 Microsoft Technology Licensing, Llc Systems and methods for reducing a search area for identifying correspondences between images
US11348274B2 (en) 2017-01-23 2022-05-31 Oxford University Innovation Limited Determining the location of a mobile device
US11391819B2 (en) * 2018-07-18 2022-07-19 Qualcomm Incorporate Object verification using radar images
US11436749B2 (en) 2017-01-23 2022-09-06 Oxford University Innovation Limited Determining the location of a mobile device
US11460851B2 (en) 2019-05-24 2022-10-04 Ford Global Technologies, Llc Eccentricity image fusion
US20220374016A1 (en) * 2021-05-18 2022-11-24 Ford Global Technologies, Llc Intersection node-assisted high-definition mapping
US11521494B2 (en) 2019-06-11 2022-12-06 Ford Global Technologies, Llc Vehicle eccentricity mapping
US11783707B2 (en) 2018-10-09 2023-10-10 Ford Global Technologies, Llc Vehicle path planning
EP4249421A3 (en) * 2019-10-09 2023-10-11 MetraLabs GmbH Neue Technologien und Systeme Autonomous industrial truck
US12046047B2 (en) 2021-12-07 2024-07-23 Ford Global Technologies, Llc Object detection

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB9716240D0 (en) * 1997-07-31 1997-10-08 Tricorder Technology Plc Scanning apparatus and methods
FR2903200B1 (en) * 2006-06-29 2008-12-19 Thales Sa HYBRID STABILIZATION OF IMAGES FOR VIDEO CAMERA
EP1921867B1 (en) * 2006-10-17 2016-05-25 Harman Becker Automotive Systems GmbH Sensor assisted video compression

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6445983B1 (en) * 2000-07-07 2002-09-03 Case Corporation Sensor-fusion navigator for automated guidance of off-road vehicles
US20030048357A1 (en) * 2001-08-29 2003-03-13 Geovantage, Inc. Digital imaging system for airborne applications
US20040257441A1 (en) * 2001-08-29 2004-12-23 Geovantage, Inc. Digital imaging system for airborne applications
US20040215419A1 (en) * 2002-10-04 2004-10-28 Havens Steven W. Method and apparatus for acquiring and processing transducer data
US20050031167A1 (en) * 2003-08-04 2005-02-10 Guohui Hu Method of three dimensional positioning using feature matching
US20070104354A1 (en) * 2005-11-10 2007-05-10 Holcomb Derrold W Remote sensing system capable of coregistering data from sensors potentially having unique perspectives

Cited By (100)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515611B2 (en) 2008-07-10 2013-08-20 Lockheed Martin Corporation Inertial measurement with an imaging sensor and a digitized map
US20100010741A1 (en) * 2008-07-10 2010-01-14 Lockheed Martin Missiles And Fire Control Inertial measurement with an imaging sensor and a digitized map
US8265817B2 (en) * 2008-07-10 2012-09-11 Lockheed Martin Corporation Inertial measurement with an imaging sensor and a digitized map
US20100256907A1 (en) * 2009-04-06 2010-10-07 Honeywell International Inc. Technique to improve navigation performance through carouselling
US9599474B2 (en) * 2009-04-06 2017-03-21 Honeywell International Inc. Technique to improve navigation performance through carouselling
US8825893B2 (en) * 2009-09-18 2014-09-02 Kabushiki Kaisha Toshiba Relay device, relay method and relay system
US20120084457A1 (en) * 2009-09-18 2012-04-05 Kabushiki Kaisha Toshiba Relay device, relay method and relay system
US20110141485A1 (en) * 2009-12-16 2011-06-16 Industrial Technology Research Institute System and Method for Localizing a Carrier, Estimating a Posture of the Carrier and Establishing a Map
US8310684B2 (en) 2009-12-16 2012-11-13 Industrial Technology Research Institute System and method for localizing a carrier, estimating a posture of the carrier and establishing a map
TWI397671B (en) * 2009-12-16 2013-06-01 Ind Tech Res Inst System and method for locating carrier, estimating carrier posture and building map
US20110218733A1 (en) * 2010-03-04 2011-09-08 Honeywell International Inc. Method and apparatus for vision aided navigation using image registration
US9547910B2 (en) * 2010-03-04 2017-01-17 Honeywell International Inc. Method and apparatus for vision aided navigation using image registration
EP2372656A3 (en) * 2010-03-04 2012-10-17 Honeywell International Inc. Method and apparatus for vision aided navigation using image registration
EP2434256A3 (en) * 2010-09-24 2014-04-30 Honeywell International Inc. Camera and inertial measurement unit integration with navigation data feedback for feature tracking
CN102052924A (en) * 2010-11-25 2011-05-11 哈尔滨工程大学 Combined navigation and positioning method of small underwater robot
US20140036072A1 (en) * 2012-06-20 2014-02-06 Honeywell International Inc. Cargo sensing
FR2998363A1 (en) * 2012-11-19 2014-05-23 Inst Nat Rech Inf Automat METHOD FOR DETERMINING, IN A 3D FIXED REFERENTIAL, THE LOCATION OF A MOVING GEAR, ASSOCIATED DEVICE AND COMPUTER PROGRAM
US9329258B2 (en) 2012-11-19 2016-05-03 Inria Institut National De Recherche En Informatiq Method for determining, in a fixed 3D frame of reference, the location of a moving craft, and associated computer program and device
WO2014076294A1 (en) * 2012-11-19 2014-05-22 Inria Institut National De Recherche En Informatique Et En Automatique Method for determining, in a fixed 3d frame of reference, the location of a moving craft, and associated computer program and device
WO2014079632A1 (en) * 2012-11-26 2014-05-30 Robert Bosch Gmbh Autonomous transportation device
WO2015073406A1 (en) * 2013-11-13 2015-05-21 Elwha Llc Dead reckoning system for vehicles
US10732647B2 (en) 2013-11-27 2020-08-04 The Trustees Of The University Of Pennsylvania Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft micro-aerial vehicle (MAV)
EP2895819B1 (en) * 2013-12-10 2020-05-20 SZ DJI Technology Co., Ltd. Sensor fusion
EP3754381A1 (en) * 2013-12-10 2020-12-23 SZ DJI Technology Co., Ltd. Sensor fusion
US10240930B2 (en) 2013-12-10 2019-03-26 SZ DJI Technology Co., Ltd. Sensor fusion
CN104821134A (en) * 2014-02-05 2015-08-05 财团法人工业技术研究院 method and system for generating indoor map
US10914590B2 (en) * 2014-03-24 2021-02-09 SZ DJI Technology Co., Ltd. Methods and systems for determining a state of an unmanned aerial vehicle
JP2015193347A (en) * 2014-03-31 2015-11-05 三菱プレシジョン株式会社 Inertia navigation system of missile
EP3159123A4 (en) * 2014-06-17 2018-08-08 Yujin Robot Co., Ltd. Device for controlling driving of mobile robot having wide-angle cameras mounted thereon, and method therefor
US10178316B2 (en) 2014-07-17 2019-01-08 Elbit Systems Ltd. Stabilization and display of remote images
JP2016045767A (en) * 2014-08-25 2016-04-04 株式会社豊田中央研究所 Motion amount estimation device and program
US10421543B2 (en) 2014-09-05 2019-09-24 SZ DJI Technology Co., Ltd. Context-based flight mode selection
US20160068267A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Context-based flight mode selection
US9625909B2 (en) 2014-09-05 2017-04-18 SZ DJI Technology Co., Ltd Velocity control for an unmanned aerial vehicle
US9625907B2 (en) 2014-09-05 2017-04-18 SZ DJ Technology Co., Ltd Velocity control for an unmanned aerial vehicle
US20160070264A1 (en) 2014-09-05 2016-03-10 SZ DJI Technology Co., Ltd Velocity control for an unmanned aerial vehicle
US10029789B2 (en) 2014-09-05 2018-07-24 SZ DJI Technology Co., Ltd Context-based flight mode selection
US10429839B2 (en) 2014-09-05 2019-10-01 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping
US10001778B2 (en) 2014-09-05 2018-06-19 SZ DJI Technology Co., Ltd Velocity control for an unmanned aerial vehicle
US9604723B2 (en) 2014-09-05 2017-03-28 SZ DJI Technology Co., Ltd Context-based flight mode selection
US11370540B2 (en) 2014-09-05 2022-06-28 SZ DJI Technology Co., Ltd. Context-based flight mode selection
US11914369B2 (en) 2014-09-05 2024-02-27 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping
US10901419B2 (en) 2014-09-05 2021-01-26 SZ DJI Technology Co., Ltd. Multi-sensor environmental mapping
US10845805B2 (en) 2014-09-05 2020-11-24 SZ DJI Technology Co., Ltd. Velocity control for an unmanned aerial vehicle
US9592911B2 (en) 2014-09-05 2017-03-14 SZ DJI Technology Co., Ltd Context-based flight mode selection
US20170268873A1 (en) * 2014-12-01 2017-09-21 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method and electronic calculator for determining the trajectory of a mobile object
FR3029281A1 (en) * 2014-12-01 2016-06-03 Commissariat Energie Atomique ELECTRONIC METHOD AND COMPUTER FOR DETERMINING THE TRACK OF A MOBILE OBJECT
US10571265B2 (en) * 2014-12-01 2020-02-25 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method and electronic calculator for determining the trajectory of a mobile object
WO2016087755A1 (en) 2014-12-01 2016-06-09 Commissariat à l'énergie atomique et aux énergies alternatives Method and electronic calculator for determining the trajectory of a mobile object
US9778661B2 (en) * 2014-12-31 2017-10-03 SZ DJI Technology Co., Ltd. Selective processing of sensor data
US20170045895A1 (en) * 2014-12-31 2017-02-16 SZ DJI Technology Co., Ltd. Selective processing of sensor data
US10802509B2 (en) 2014-12-31 2020-10-13 SZ DJI Technology Co., Ltd. Selective processing of sensor data
US10395115B2 (en) 2015-01-27 2019-08-27 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for robotic remote sensing for precision agriculture
CN111792034A (en) * 2015-05-23 2020-10-20 深圳市大疆创新科技有限公司 Method and system for estimating state information of movable objects using sensor fusion
US10565732B2 (en) 2015-05-23 2020-02-18 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN113093808A (en) * 2015-05-23 2021-07-09 深圳市大疆创新科技有限公司 Sensor fusion using inertial and image sensors
EP3158411A4 (en) * 2015-05-23 2017-07-05 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN107850901A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
WO2016187760A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN107850899A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
US10527416B2 (en) 2015-06-26 2020-01-07 SZ DJI Technology Co., Ltd. System and method for measuring a displacement of a mobile platform
EP3155369A4 (en) * 2015-06-26 2017-07-19 SZ DJI Technology Co., Ltd. System and method for measuring a displacement of a mobile platform
US11346666B2 (en) 2015-06-26 2022-05-31 SZ DJI Technology Co., Ltd. System and method for measuring a displacement of a mobile platform
US10760907B2 (en) 2015-06-26 2020-09-01 SZ DJI Technology Co., Ltd. System and method for measuring a displacement of a mobile platform
JP2017523382A (en) * 2015-07-14 2017-08-17 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method, apparatus and system for determining movement of a mobile platform
US10895458B2 (en) 2015-07-14 2021-01-19 SZ DJI Technology Co., Ltd. Method, apparatus, and system for determining a movement of a mobile platform
WO2017008246A1 (en) * 2015-07-14 2017-01-19 SZ DJI Technology Co., Ltd. Method, apparatus, and system for determining a movement of a mobile platform
US10037028B2 (en) 2015-07-24 2018-07-31 The Trustees Of The University Of Pennsylvania Systems, devices, and methods for on-board sensing and control of micro aerial vehicles
US10884430B2 (en) 2015-09-11 2021-01-05 The Trustees Of The University Of Pennsylvania Systems and methods for generating safe trajectories for multi-vehicle teams
US11865708B2 (en) 2015-10-21 2024-01-09 Mtd Products Inc Domestic robotic system
US10315306B2 (en) 2015-10-21 2019-06-11 F Robotics Acquisitions Ltd. Domestic robotic system
CN109219785A (en) * 2016-06-03 2019-01-15 深圳市大疆创新科技有限公司 Simple multisensor calibration
WO2017206179A1 (en) * 2016-06-03 2017-12-07 SZ DJI Technology Co., Ltd. Simple multi-sensor calibration
US11822353B2 (en) 2016-06-03 2023-11-21 SZ DJI Technology Co., Ltd. Simple multi-sensor calibration
US11036241B2 (en) 2016-06-03 2021-06-15 SZ DJI Technology Co., Ltd. Simple multi-sensor calibration
WO2018028649A1 (en) * 2016-08-10 2018-02-15 纳恩博(北京)科技有限公司 Mobile device, positioning method therefor, and computer storage medium
WO2018055591A1 (en) * 2016-09-23 2018-03-29 DunAn Precision, Inc. Eye-in-hand visual inertial measurement unit
WO2018127328A1 (en) * 2017-01-03 2018-07-12 Valeo Schalter Und Sensoren Gmbh Determining movement information with environment sensors
US11436749B2 (en) 2017-01-23 2022-09-06 Oxford University Innovation Limited Determining the location of a mobile device
US11348274B2 (en) 2017-01-23 2022-05-31 Oxford University Innovation Limited Determining the location of a mobile device
WO2018182524A1 (en) * 2017-03-29 2018-10-04 Agency For Science, Technology And Research Real time robust localization via visual inertial odometry
US11747144B2 (en) 2017-03-29 2023-09-05 Agency For Science, Technology And Research Real time robust localization via visual inertial odometry
CN108426576A (en) * 2017-09-15 2018-08-21 辽宁科技大学 Aircraft paths planning method and system based on identification point vision guided navigation and SINS
JP2019023865A (en) * 2018-07-12 2019-02-14 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method, system, and program for performing error recovery
US11391819B2 (en) * 2018-07-18 2022-07-19 Qualcomm Incorporate Object verification using radar images
US11783707B2 (en) 2018-10-09 2023-10-10 Ford Global Technologies, Llc Vehicle path planning
CN110782459A (en) * 2019-01-08 2020-02-11 北京嘀嘀无限科技发展有限公司 Image processing method and device
CN110006423A (en) * 2019-04-04 2019-07-12 北京理工大学 An Adaptive Inertial Navigation and Vision Combined Navigation Method
US11460851B2 (en) 2019-05-24 2022-10-04 Ford Global Technologies, Llc Eccentricity image fusion
US11521494B2 (en) 2019-06-11 2022-12-06 Ford Global Technologies, Llc Vehicle eccentricity mapping
US11662741B2 (en) * 2019-06-28 2023-05-30 Ford Global Technologies, Llc Vehicle visual odometry
US20200409385A1 (en) * 2019-06-28 2020-12-31 Ford Global Technologies, Llc Vehicle visual odometry
EP4249421A3 (en) * 2019-10-09 2023-10-11 MetraLabs GmbH Neue Technologien und Systeme Autonomous industrial truck
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
US11436742B2 (en) 2020-07-22 2022-09-06 Microsoft Technology Licensing, Llc Systems and methods for reducing a search area for identifying correspondences between images
WO2022019975A1 (en) * 2020-07-22 2022-01-27 Microsoft Technology Licensing, Llc Systems and methods for reducing a search area for identifying correspondences between images
CN112179338A (en) * 2020-09-07 2021-01-05 西北工业大学 Low-altitude unmanned aerial vehicle self-positioning method based on vision and inertial navigation fusion
US11914378B2 (en) * 2021-05-18 2024-02-27 Ford Global Technologies, Llc Intersection node-assisted high-definition mapping
US20220374016A1 (en) * 2021-05-18 2022-11-24 Ford Global Technologies, Llc Intersection node-assisted high-definition mapping
US12046047B2 (en) 2021-12-07 2024-07-23 Ford Global Technologies, Llc Object detection

Also Published As

Publication number Publication date
GB0802629D0 (en) 2008-03-19
IL189454A0 (en) 2008-11-03
GB2446713A (en) 2008-08-20

Similar Documents

Publication Publication Date Title
US20080195316A1 (en) 2008-08-14 System and method for motion estimation using vision sensors
KR102463176B1 (en) 2022-11-04 Device and method to estimate position
US7463340B2 (en) 2008-12-09 Ladar-based motion estimation for navigation
CN109887057B (en) 2023-03-24 Method and device for generating high-precision map
EP2133662B1 (en) 2012-02-01 Methods and system of navigation using terrain features
EP1926007B1 (en) 2014-03-19 Method and system for navigation of an unmanned aerial vehicle in an urban environment
US8315794B1 (en) 2012-11-20 Method and system for GPS-denied navigation of unmanned aerial vehicles
CN104729506B (en) 2017-11-14 A kind of unmanned plane Camera calibration method of visual information auxiliary
CN106289275B (en) 2021-05-14 Unit and method for improving positioning accuracy
US8260036B2 (en) 2012-09-04 Object detection using cooperative sensors and video triangulation
US8213706B2 (en) 2012-07-03 Method and system for real-time visual odometry
US20080195304A1 (en) 2008-08-14 Sensor fusion for navigation
KR20200044420A (en) 2020-04-29 Method and device to estimate position
JP2019074532A (en) 2019-05-16 Method for giving real dimensions to slam data and position measurement using the same
US7792330B1 (en) 2010-09-07 System and method for determining range in response to image data
JP2009019992A (en) 2009-01-29 Position detection device and position detection method
US20170074678A1 (en) 2017-03-16 Positioning and orientation data analysis system and method thereof
CN105953795A (en) 2016-09-21 Navigation apparatus and method for surface inspection of spacecraft
US12140975B2 (en) 2024-11-12 Devices, systems and methods for navigating a mobile platform
KR20150041898A (en) 2015-04-20 Apparatus and method for modifying gps/ins position information
KR20200109116A (en) 2020-09-22 Method and system for position estimation of unmanned aerial vehicle using graph structure based on multi module
US11061145B2 (en) 2021-07-13 Systems and methods of adjusting position information
Indelman et al. 2009 Real-time mosaic-aided aerial navigation: II. Sensor fusion
JP7333565B1 (en) 2023-08-25 Aircraft and method of controlling the aircraft
Sahmoudi et al. 2016 Analysis of a navigation system based on partially tight integration of IMU-visual odometry with loosely coupled GPS

Legal Events

Date Code Title Description
2007-02-12 AS Assignment

Owner name: HONEYWELL INTERNATIONAL INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:KRISHNASWAMY, KAILASH;REEL/FRAME:018881/0665

Effective date: 20070201

2011-04-25 STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION