US20170329331A1 - Control system for semi-autonomous control of vehicle along learned route - Google Patents


Control system for semi-autonomous control of vehicle along learned route Download PDF

Info

Publication number
US20170329331A1
US20170329331A1
US
United States
Prior art keywords
vehicle
control
route
learned
control system
Prior art date
2016-05-16
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/596,348
Inventor
Wenxue Gao
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magna Electronics Inc
Original Assignee
Magna Electronics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2016-05-16
Filing date
2017-05-16
Publication date
2017-11-16
2017-05-16 Application filed by Magna Electronics Inc filed Critical Magna Electronics Inc
2017-05-16 Priority to US15/596,348 priority Critical patent/US20170329331A1/en
2017-11-16 Publication of US20170329331A1 publication Critical patent/US20170329331A1/en
2020-07-22 Priority to US16/947,184 priority patent/US12187326B2/en
Status Abandoned legal-status Critical Current


Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0055Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements
    • G05D1/0061Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots with safety arrangements for transition from automatic pilot to manual pilot and vice versa
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • B60W60/0059Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/18Conjoint control of vehicle sub-units of different type or different function including control of braking systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W10/00Conjoint control of vehicle sub-units of different type or different function
    • B60W10/20Conjoint control of vehicle sub-units of different type or different function including control of steering systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/10Path keeping
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/14Adaptive cruise control
    • B60W30/16Control of distance between vehicles, e.g. keeping a distance to preceding vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/0098Details of control systems ensuring comfort, safety or stability not otherwise provided for
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/082Selecting or switching between different modes of propelling
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • B60W60/0053Handover processes from vehicle to occupant
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W60/00Drive control systems specially adapted for autonomous road vehicles
    • B60W60/005Handover processes
    • B60W60/0061Aborting handover process
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • B60K28/06Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver responsive to incapacity of driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W2050/0062Adapting control system settings
    • B60W2050/0075Automatic parameter input, automatic initialising or calibrating means
    • B60W2050/0083Setting, resetting, calibration
    • B60W2050/0088Adaptive recalibration
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • B60W2420/42
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2540/00Input parameters relating to occupants
    • B60W2540/26Incapacity
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/40Dynamic objects, e.g. animals, windblown objects
    • B60W2554/402Type
    • B60W2554/4029Pedestrians
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20Ambient conditions, e.g. wind or rain
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/10Historical data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • B60W2556/50External transmission of data to or from the vehicle of positioning data, e.g. GPS [Global Positioning System] data

Definitions

  • the present invention relates generally to a vehicle control system and, more particularly, to a vehicle control system for autonomously or semi-autonomously driving a vehicle along a road.
  • Driver assist systems are known where a system can autonomously or semi-autonomously control a vehicle during certain driving tasks, such as highway driving and/or parking maneuvers. Such systems may be responsive to processing of image data captured by one or more cameras of the vehicle and/or processing of other sensor data captured by one or more other sensors of the vehicle.
  • the present invention provides a vehicle control system that controls the vehicle to drive the vehicle along a road.
  • the system learns the road during initial driving passes along the road and, after sufficiently learning the road or path of travel typically taken by the vehicle, the system can autonomously or semi-autonomously control the vehicle along the road or path.
  • the system learns the road during multiple drives by the driver of the road or path or route (such as when the driver drives from home to work and/or from work to home) and, after multiple learning passes (where the system “learns” the road, the route, and/or typical driving speeds and traffic conditions), the system can autonomously or semi-autonomously control the vehicle (such as when a driver selects a semi-autonomous driving function) along the route.
  • the control system of the present invention provides user-selected semi-autonomous control of the vehicle to allow the driver to rest or be less active in driving the vehicle during routine, often-repeated trips, such as to and from work and the like.
  • FIG. 1 is a plan view of a vehicle with a vision system that incorporates cameras in accordance with the present invention
  • FIG. 2 shows a flow chart of the learning and driving execution of a system according to the invention.
  • FIG. 3 shows a plan view of a route (dotted line) that may lead along streets from a shopping area to the driver's home, passing intersections, traffic lights and curves.
  • a vehicle control system or vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction.
  • the vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data.
  • the vision system may provide display, such as a rearview display or a top down or bird's eye or surround view display or the like.
  • a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14 a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward facing camera 14 b at the front (or at the windshield) of the vehicle, and a sideward/rearward facing camera 14 c, 14 d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera ( FIG. 1 ).
  • a forward viewing camera may be disposed at the windshield of the vehicle and view through the windshield and forward of the vehicle, such as for a machine vision system (such as for traffic sign recognition, headlamp control, pedestrian detection, collision avoidance, lane marker detection and/or the like).
  • the vision system 12 includes a control or electronic control unit (ECU) or processor 18 that is operable to process image data captured by the camera or cameras and may detect objects or the like and/or provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle).
  • the data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.
  • When driving in cities and on overland routes, the environment is less defined, so there is the possibility of opposing traffic, cross traffic, slow traffic participants (such as, for example, horse carriages, fork lifts, tractors, rollerbladers, skateboarders and cyclists) and ‘irresponsible’ road users such as children. Additionally, the roads may be narrower and the routes may have turning points at roundabouts, U-turns and intersections.
  • the semi-autonomous vehicle may employ an artificial intelligence capable of learning the typical path and its scenery.
  • the artificial intelligence may comprise a neural network (NN), such as a Deep Neural Network (DNN), or a different suitable network or algorithm derivative such as, for example, a CNN (Convolutional Neural Network) or SVM (Support Vector Machine), given sufficient processing and storage resources within the semi-autonomous vehicle.
  • the system of the present invention aims to drive autonomously, being specially trained for known (planned and foreseeable) routes under good weather conditions. Since it is intended to fulfil SAE Level 3, unforeseen events and driving tasks may be left to the human driver.
  • the system's limitations are compensated for by its other advantages.
  • the system is able to generate a high level of autonomy or autonomous control, while the sensor requirements and training efforts are comparatively low and inexpensive. By its nature, the system may imitate the human driver in terms of speed control and curve turning. In this way, critical issues common in autonomous driving development, such as distant curve radius detection and/or speed limit sign recognition, can be ignored.
  • the system is a special implementation of autonomous driving systems.
  • the vehicle may collect data of the GPS positions, speed, selected gear, acceleration areas (position- and traffic-situation-dependent), areas where typical braking takes place (position- and traffic-situation-dependent), traffic sign and signal positions, and the road topography.
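The per-waypoint data described above can be pictured as a simple log structure. This is a minimal sketch for illustration only, not the patent's implementation; all type and field names (`WaypointSample`, `RouteLog`, etc.) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class WaypointSample:
    """One logged sample along a driven route (hypothetical fields)."""
    lat: float           # GPS latitude
    lon: float           # GPS longitude
    speed_mps: float     # vehicle speed at this position
    gear: int            # selected gear
    accel_mps2: float    # longitudinal acceleration (negative while braking)

@dataclass
class RouteLog:
    """Accumulates samples for one learning pass along the route."""
    samples: list = field(default_factory=list)

    def record(self, sample: WaypointSample) -> None:
        self.samples.append(sample)

# Two example samples from one pass: cruising, then a typical braking area.
log = RouteLog()
log.record(WaypointSample(42.5803, -83.0302, 13.4, 4, 0.2))
log.record(WaypointSample(42.5810, -83.0300, 12.1, 4, -0.8))
```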
  • the semi-autonomous vehicle may optionally include cameras, LIDAR sensors, RADAR sensors, ultrasound sensors and/or inertial sensor systems as environment detection sensors, together with corresponding data processing systems (especially for fusion and scene understanding).
  • the NN may take the vehicle's stability sensor information into account, such as rotational (gyro) and acceleration sensors, wheel speed sensors, gear, clutch condition, and ASR, ABS and ESC interaction and the like.
  • the learning or training may cover street driving and optionally the parking-out and parking-in maneuvers at the start and end of the route.
  • the system may filter ‘free ride’ (with no vehicles in front of the subject vehicle that might influence the human driver's driving) from ‘interfered driving’ (which includes situations where interference with pedestrians or other moving objects around the vehicle, or with vehicles in front, at the rear, in other lanes or in opposing traffic, may take place).
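The ‘free ride’ versus ‘interfered driving’ split might be realized as a small per-segment classifier. The inputs and the 80 m gap threshold below are invented for illustration; the patent does not specify them.

```python
def classify_segment(lead_gap_m, pedestrians_nearby, opposing_traffic):
    """Label a driven segment: 'free_ride' if nothing around the vehicle
    could have influenced the human driver's actions, else 'interfered'."""
    FREE_RIDE_GAP_M = 80.0  # assumed gap beyond which a lead vehicle is ignored
    if pedestrians_nearby or opposing_traffic:
        return "interfered"
    if lead_gap_m is not None and lead_gap_m < FREE_RIDE_GAP_M:
        return "interfered"
    return "free_ride"
```

A pass with no lead vehicle (`lead_gap_m=None`) counts as a free ride; a close lead vehicle or any nearby pedestrian marks the segment as interfered.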
  • in an NN or SVM, data collection is a matter of strengthening weighting structures or weight vectors.
  • the learning may place a higher weighting on way sections where ‘free ride’ was possible, since the learning aims at the best manner of achieving the driving task at a specific waypoint.
  • Interference such as braking maneuvers that may, for example, be induced by a slow-driving vehicle in front on a curvy road may derail the system's learning of how to master these curves when it later has the task of autonomously driving through those curves with no slow vehicle slowing down the ride. Because of this, ‘no free ride’ sections may be fully ignored or weighted low during the learning procedure.
  • the learning may be a patchwork of free-ride sections that were repeatedly driven, so that the DNN was able to learn their characteristics. The more often a way section was learned, the higher the confidence level may grow for that section.
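The growth of per-section confidence with repeated passes can be sketched as below. The weights (free-ride passes count fully, interfered passes barely) and the release threshold are assumptions made for illustration, not values from the patent.

```python
def update_confidence(confidence, section_id, free_ride,
                      w_free=1.0, w_interfered=0.1):
    """Add one learning pass for a way section; interfered passes
    contribute far less weight than free-ride passes."""
    gain = w_free if free_ride else w_interfered
    confidence[section_id] = confidence.get(section_id, 0.0) + gain
    return confidence

def confident_sections(confidence, threshold=3.0):
    """Sections learned often enough to be released as a 'green route'."""
    return {s for s, c in confidence.items() if c >= threshold}

# Three clean passes over one section, one interfered pass over another.
conf = {}
for _ in range(3):
    update_confidence(conf, "home_to_work_km2", free_ride=True)
update_confidence(conf, "home_to_work_km3", free_ride=False)
```

With these numbers, only the thrice-driven free-ride section reaches the release threshold.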
  • This method provides reinforcement learning.
  • the system may predict the near future (time-wise) or near path (way-wise) in time segments or way segments (a segment may span 2.5 meters or a quarter second) ahead of it, and may later rate how closely the human-driven segment complied with the prediction, which result then reinforces the data set for that single segment.
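That predict-then-rate loop could be sketched per segment as follows, comparing one predicted quantity (speed, here) against what the human actually drove and reinforcing the segment's weight in proportion. The linear scoring formula, tolerance and learning rate are all assumptions for illustration.

```python
def compliance_score(predicted_speed, actual_speed, tolerance_mps=2.0):
    """1.0 when the human drove exactly as predicted, falling linearly
    to 0.0 once the error reaches the tolerance."""
    error = abs(predicted_speed - actual_speed)
    return max(0.0, 1.0 - error / tolerance_mps)

def reinforce(segment_weight, score, lr=0.2):
    """Strengthen the segment's learned data set in proportion to how
    well the prediction matched the human-driven segment."""
    return segment_weight + lr * score
```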
  • once sufficiently learned, a section of the route (called a ‘confident section’) will be released (highlighted as a “green route”) to the driver to be driven in semi-autonomous driving mode, at the driver's choice to engage, or to disengage and take over by himself or herself whenever he or she likes anywhere on that trained route or section.
  • when a confident section ends, the semi-autonomous driving mode may signal the driver to take over within a certain takeover time (such as, for example, three seconds or thereabouts).
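The end-of-section takeover could be modeled as a polled countdown. `driver_took_over` is a stand-in for the real driver-input check (steering, pedal or button activity) and is purely illustrative.

```python
def handover_countdown(takeover_time_s, driver_took_over, step_s=1.0):
    """Poll the driver once per step during the takeover window.
    Returns True if the driver resumed control in time, else False."""
    remaining = takeover_time_s
    while remaining > 0:
        if driver_took_over(remaining):  # stub for the driver-input check
            return True
        remaining -= step_s
    return False
```

For example, a three-second window with the driver responding at the two-second mark succeeds; a driver who never responds exhausts the window.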
  • the hand over and the hand over time determination, as well as the triggering of emergency vehicle handling when the driver does not take over, are described in U.S. provisional application Ser. No. 62/401,310, filed Sep. 29, 2016, which is hereby incorporated herein by reference in its entirety.
  • FIG. 2 shows a flow chart of the learning and driving execution of a system according to the invention, including the sensors.
  • the learning may also take place when the vehicle is driving autonomously, by testing minimal driving deviations in the lateral guidance and lateral acceleration (and its higher harmonics), and longitudinally by varying the speed, acceleration and their higher harmonics, for improving/optimizing the ride.
  • the optimization may always tend to minimize the accelerations, travel time, fuel consumption and vehicle wear while maximizing comfort and safety.
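That optimization target can be written as a single scalar cost to minimize. The weights below are arbitrary placeholders for the trade-off the text describes (comfort versus time versus fuel); the patent does not give a formula.

```python
def ride_cost(accels_mps2, travel_time_s, fuel_l,
              w_accel=1.0, w_time=0.5, w_fuel=2.0):
    """Scalar cost to minimize: penalize harsh accelerations (discomfort
    and vehicle wear), travel time, and fuel consumption."""
    discomfort = sum(a * a for a in accels_mps2)  # squared accel favors smoothness
    return w_accel * discomfort + w_time * travel_time_s + w_fuel * fuel_l

# A smooth ride should cost less than a slightly faster but harsh one.
smooth = ride_cost([0.2] * 5, travel_time_s=600.0, fuel_l=0.50)
harsh = ride_cost([2.0] * 5, travel_time_s=590.0, fuel_l=0.55)
```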
  • the optimization may be done by reflecting the driver's driving interventions and the driver's mood (assessed by a face detection AI).
  • the taking over may be signaled by a countdown, so that the driver is always aware how much time remains to him for non-driving activities (when legal) or just not driving but supervising the vehicle driving (such as, for example, when this is legally required). Since the system may not learn well to handle situational interactions, such as dealing with opposing traffic on road sections smaller than the vehicles to pass at the same time (or determined deviations from the learned features of the learned route), the system may optionally hand over these challenging tasks to the human driver and may offer to take back when the situation is passed.
  • the system may have the ability to slow down when approaching a detected or determined obstacle (as may be detected via a camera system and/or other sensor system of the vehicle) or a narrow section of road, or slow running traffic in front, such as in a similar manner as may be provided by an advanced or adaptive cruise control (ACC) collision avoidance system or other driver assistance system or anti-collision controls for assisted or autonomous (piloted) highway driving or the like.
  • ACC advanced or adaptive cruise control
  • the system may, responsive to a determination that the driver has not taken over control of the vehicle (such as after a period of time elapses following when an alert to the driver to take over control of the vehicle), function to slow or stop the vehicle. For example, if a pedestrian or deer or other vehicle is determined (such as via processing of image data captured by one or more cameras of the vehicle or processing of sensor data captured by one or more radar or lidar sensors of the vehicle) to be present in the path or route (or approaching the path or route) where the system does not expect such objects, the system may generate an alert to have the driver take over, and if, after, for example, 1 second or 2 seconds, the driver has not started manually driving/controlling the vehicle, the system may slow and/or stop the vehicle.
  • a pedestrian or deer or other vehicle is determined (such as via processing of image data captured by one or more cameras of the vehicle or processing of sensor data captured by one or more radar or lidar sensors of the vehicle) to be present in the path or route (or approaching the path or route) where the system does
  • the system may determine various deviations from the determined or learned features of the route, such as determination of the presence of pedestrians or other vehicles or objects, or determination of a change in weather conditions or road conditions or the like.
  • the determination of deviations may be made responsive to processing of data captured by one or more sensors of the vehicle (such as one or more cameras or radar or lidar sensors of the vehicle), or responsive to receipt of a communication from a remote transmitter or communication device, such as a V2V communication (from another vehicle) or V2X communication (from an infrastructure) or the like, or responsive to determination of a change in weather (such as responsive to a GPS system that receives communications indicative of weather conditions local to the vehicle's current geographical location).
  • FIG. 3 shows a plan view to a green route (dotted line) which leads along a street way from a shopping area to the driver's home, passing intersections, traffic lights and curves.
  • An already trained system according the invention may be capable of handling the driving tasks on the way home from the shopping center autonomously (also called machine driven), with the learned path marked as “Machine”, while the system may request a driver take over (manual drive) at route sections with exceptional (unforeseen, uncommon, unsafe) driving tasks (such as when the system determines a deviation from the learned features, such as change in weather or presence of pedestrians or other vehicles or the like) marked as “Human” also when the vehicle is on a green route, see also FIG. 2 .
  • FIG. 3 shows a deer close to the street.
  • the deer was not learned as a static property in the data set and is a moving object, and by that the system does a human driver intervention request (request overtake).
  • a human driver intervention request request overtake
  • Another similar example is a pedestrian passing a cross walk which is captured in the data set without humans on it. By that the driver may have passed the cross walk at higher speed as common as when pedestrian close to the cross walk. Because of that, the system, responsive to that information, may request that the driver take over control of the vehicle.
  • the vehicle may offer the driver to continue machine driven.
  • weather conditions or road conditions such as having the street covered by ice, hail, snow, dust or sand or when the sensors signals diminish too much due to weather influence, the system may also hand over the driving task to the driver.
  • the system may have a set of pre-installed safety functions like emergency braking.
  • the system when autonomously driving the vehicle along a learned path, may also have a maximum limit of speed, e.g., 130 kph, even on roads where no speed limit applies.
  • Information may come from vehicle inherent environmental sensors, such as visual cameras, infrared cameras, time of flight sensors, structured light sensors, RADAR, LIDAR, ultrasound sensors or any other kind of ranging sensor, preferably having a long range.
  • vehicle may have a plurality of cameras and/or RADAR sensors and/or LIDAR sensors and/or ultrasonic sensors and/or the like.
  • the system may utilize sensors, such as radar or lidar sensors or the like.
  • the sensing system may utilize aspects of the systems described in U.S. Pat. Nos.
  • information may come from remote sources, such as from a detected obstacle itself (such as, for example, where the blocking object is a broken down vehicle with remote data transmission capability maintained such as by having an E-Call system in a car) or the information may be transmitted by another vehicle (or more than one vehicle, optionally partitionally sent by different peers (such as by utilizing aspects of the systems described in U.S. Publication No. US-2015-0344028, which is hereby incorporated herein by reference in its entirety) or infrastructure which detects the blockage by its own (inherent) sensors.
  • a detected obstacle such as, for example, where the blocking object is a broken down vehicle with remote data transmission capability maintained such as by having an E-Call system in a car
  • the information may be transmitted by another vehicle (or more than one vehicle, optionally partitionally sent by different peers (such as by utilizing aspects of the systems described in U.S. Publication No. US-2015-0344028, which is hereby incorporated herein by reference in its entirety) or infrastructure which detects the blockage
  • the system may also communicate with other systems, such as via a vehicle-to-vehicle communication system or a vehicle-to-infrastructure communication system or the like.
  • vehicle communication systems may utilize aspects of the systems described in U.S. Pat. Nos. 6,690,268; 6,693,517 and/or 7,580,795, and/or U.S. Publication Nos.
  • the vehicle may include a control system and sensors that sense/determine the presence of other vehicles ahead of or in the side lanes adjacent to the lane in which the subject vehicle is traveling.
  • the sensors may comprise cameras or RADAR or LIDAR or ultrasonic sensors or the like, whereby the system (responsive to processing of sensor data) may know when a path is available, such as when it is safe to change lanes into an adjacent lane.
  • the camera or sensor may comprise any suitable camera or sensor.
  • the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.
  • the system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras.
  • the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects.
  • the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.
  • the vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like.
  • the imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640 ⁇ 480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array.
  • the photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns.
  • the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels.
  • the imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like.
  • the logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
  • the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935
  • the system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.
  • the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle.
  • the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos.
  • the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos.


Abstract

A control system for controlling a vehicle includes a vehicle control operable to control driving of the vehicle. The vehicle control includes circuitry and associated software and is operable to learn features of a route driven by the vehicle during multiple repetitive drives of the route by the vehicle. After the vehicle control has sufficiently learned the route, the vehicle control is operable to at least semi-autonomously control the vehicle to drive the vehicle along the route. While the vehicle control is at least semi-autonomously controlling the vehicle to drive the vehicle along the route, and responsive to determination of a deviation from the learned features of the route, the vehicle control adjusts the at least semi-autonomous control of the vehicle.

Description

    CROSS REFERENCE TO RELATED APPLICATION
  • The present application claims the filing benefits of U.S. provisional application Ser. No. 62/336,883, filed May 16, 2016, which is hereby incorporated herein by reference in its entirety.

  • FIELD OF THE INVENTION
  • The present invention relates generally to a vehicle control system and, more particularly, to a vehicle control system for autonomously or semi-autonomously driving a vehicle along a road.

  • BACKGROUND OF THE INVENTION
  • Driver assist systems are known where a system can autonomously or semi-autonomously control a vehicle during certain driving tasks, such as highway driving and/or parking maneuvers. Such systems may be responsive to processing of image data captured by one or more cameras of the vehicle and/or processing of other sensor data captured by one or more other sensors of the vehicle.

  • SUMMARY OF THE INVENTION
  • The present invention provides a vehicle control system that controls the vehicle to drive the vehicle along a road. The system learns the road during initial driving passes along the road and, after sufficiently learning the road or path of travel typically taken by the vehicle, the system can autonomously or semi-autonomously control the vehicle along the road or path. The system learns the road during multiple drives by the driver of the road or path or route (such as when the driver drives from home to work and/or from work to home) and, after multiple learning passes (where the system “learns” the road, the route, and/or typical driving speeds and traffic conditions), the system can autonomously or semi-autonomously control the vehicle (such as when a driver selects a semi-autonomous driving function) along the route. The driver can at any time (such as when desired or when the vehicle approaches the end of its learned route) take over control of the vehicle and override the semi-autonomous control of the vehicle by the control system of the present invention. Thus, the control system of the present invention provides a user selected semi-autonomous control of the vehicle to allow the driver to rest or be less active in driving the vehicle during routine, often repeated trips, such as to and from work and the like.

  • These and other objects, advantages, purposes and features of the present invention will become apparent upon review of the following specification in conjunction with the drawings.

  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a plan view of a vehicle with a vision system that incorporates cameras in accordance with the present invention;
  • FIG. 2 shows a flow chart of the learning and driving execution of a system according to the invention; and
  • FIG. 3 shows a plan view of a route (dotted line) which may lead along a street way from a shopping area to the driver's home, passing intersections, traffic lights and curves.

  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • A vehicle control system or vision system and/or driver assist system and/or object detection system and/or alert system operates to capture images exterior of the vehicle and may process the captured image data to display images and to detect objects at or near the vehicle and in the predicted path of the vehicle, such as to assist a driver of the vehicle in maneuvering the vehicle in a rearward direction. The vision system includes an image processor or image processing system that is operable to receive image data from one or more cameras and provide an output to a display device for displaying images representative of the captured image data. Optionally, the vision system may provide a display, such as a rearview display or a top down or bird's eye or surround view display or the like.

  • Referring now to the drawings and the illustrative embodiments depicted therein, a vehicle 10 includes an imaging system or vision system 12 that includes at least one exterior facing imaging sensor or camera, such as a rearward facing imaging sensor or camera 14a (and the system may optionally include multiple exterior facing imaging sensors or cameras, such as a forward facing camera 14b at the front (or at the windshield) of the vehicle, and a sideward/rearward facing camera 14c, 14d at respective sides of the vehicle), which captures images exterior of the vehicle, with the camera having a lens for focusing images at or onto an imaging array or imaging plane or imager of the camera (FIG. 1). Optionally, a forward viewing camera may be disposed at the windshield of the vehicle and view through the windshield and forward of the vehicle, such as for a machine vision system (such as for traffic sign recognition, headlamp control, pedestrian detection, collision avoidance, lane marker detection and/or the like). The vision system 12 includes a control or electronic control unit (ECU) or processor 18 that is operable to process image data captured by the camera or cameras and may detect objects or the like and/or provide displayed images at a display device 16 for viewing by the driver of the vehicle (although shown in FIG. 1 as being part of or incorporated in or at an interior rearview mirror assembly 20 of the vehicle, the control and/or the display device may be disposed elsewhere at or in the vehicle). The data transfer or signal communication from the camera to the ECU may comprise any suitable data or communication link, such as a vehicle network bus or the like of the equipped vehicle.

  • Many (maybe over 30 percent of) vehicle driving situations are repeating routes, such as, for example, from home to office, from home to school, or from home to supermarkets, and vice versa. With the boom of artificial intelligence (AI), such as deep learning AI algorithms or the like, driving such routes autonomously may come into reach. This may be an extended functionality of a highway chauffeur. Highway chauffeurs are able to master a limited set of driving tasks and situational response tasks in a defined (highway) environment. When driving in cities and on overland routes, the environment is less defined, and thus there is the possibility of opposing traffic, cross traffic, slow traffic participants (such as, for example, horse-drawn carriages, forklifts, tractors, rollerbladers, skateboarders and cyclists) and 'irresponsible' road users such as children. Additionally, the roads may be narrower and the routes may have turning points at roundabouts, U-turns and intersections.

  • For enabling a semi-autonomous vehicle to drive consecutively driven routes, such as the typical way from a driver's home to work, the semi-autonomous vehicle may employ an artificial intelligence capable of learning the typical path and its scenery. This may be done using a neural network (NN) learning algorithm, such as a Deep Neural Network (DNN) or another suitable network or algorithm derivative such as, for example, a CNN (Convolutional Neural Network) or SVM (Support Vector Machine), having sufficient processing and storage resources within the semi-autonomous vehicle. In contrast to conventional autonomous driving approaches using artificial learning procedures and data, which typically aim to generalize what is learned in order to master all upcoming situations (which are only partially plannable or foreseeable), the system of the present invention aims to drive autonomously while specially trained for known (planned and foreseeable) routes under good weather conditions. Since it is intended to fulfil SAE Level 3, unforeseen events and driving tasks may be left to the human driver. The system's limitations are compensated by its other advantages: the system is able to provide a high level of autonomy or autonomous control, while the sensor requirements and training efforts are comparably low and inexpensive. By its nature, the system may imitate the human driver in terms of speed control and curve turning. In this way, issues critical to common autonomous driving development, such as distant curve radius detection and/or speed limit sign recognition, can be ignored.

  • Therefore, the system is a special implementation of autonomous driving systems. During the NN learning phase, the vehicle may collect data on the GPS positions, speed, selected gear (position and traffic situation dependent), acceleration areas (position and traffic situation dependent) and areas where typical braking takes place (position and traffic situation dependent), traffic sign and signal positions, and the road topography. The semi-autonomous vehicle may optionally include cameras, LIDAR sensors, RADAR sensors, ultrasound sensors and/or inertial sensor systems as environment detection sensors, and corresponding data processing systems (especially for fusion and scene understanding). The NN may take the vehicle's stability sensor information into account, such as rotatory (gyro) and accelerometer sensors, wheel speed sensors, gear, clutch condition, ASR, ABS, ESC interaction and the like.
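As a non-limiting sketch, the per-way-point record collected during the learning phase might look as follows. The field names, the `sensor_bus` stand-in for the vehicle network readout, and the braking heuristic are illustrative assumptions; the application only lists the collected quantities (GPS position, speed, gear, acceleration/braking, stability sensor data).

```python
from dataclasses import dataclass

@dataclass
class DriveSample:
    """One training record collected while the human drives (illustrative)."""
    lat: float
    lon: float
    speed_mps: float
    gear: int
    accel_mps2: float
    braking: bool
    yaw_rate: float  # from the vehicle's gyro/stability sensors

def collect_sample(sensor_bus: dict) -> DriveSample:
    # sensor_bus is a stand-in for the vehicle network (e.g. CAN bus) readout
    return DriveSample(
        lat=sensor_bus["gps"][0],
        lon=sensor_bus["gps"][1],
        speed_mps=sensor_bus["speed"],
        gear=sensor_bus["gear"],
        accel_mps2=sensor_bus["accel"],
        braking=sensor_bus["accel"] < -0.5,  # simple deceleration threshold
        yaw_rate=sensor_bus["yaw_rate"],
    )

sample = collect_sample({"gps": (48.1, 11.5), "speed": 13.9, "gear": 4,
                         "accel": -1.2, "yaw_rate": 0.02})
```

Records of this kind, taken per way segment, would form the training set fed to the NN/DNN.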

  • The learning or training may cover street driving and optionally the parking-out and parking-in maneuvers at the start and the end of the route. Optionally, there may be a training set for several route variants when there is more than one way from one repeatedly driven destination to the other. The system may filter 'free ride' driving (with no vehicles in front of the subject vehicle whose driving actions would influence the human driver's driving) from 'interfered driving' (which includes situations where interference with pedestrians or other moving objects around the vehicle, or with vehicles in front, behind, in other lanes or in opposing traffic, may take place). As is known, data collection in an NN or SVM is a matter of strengthening weighting structures or weighted vectors. Optionally, the learning may put a higher weighting on way sections where a 'free ride' was possible, since the learning aims at the best manner of achieving the driving task at a specific way point. Interference, such as braking maneuvers induced by a slow-driving vehicle in front on a curvy lane road, may derail the system's learning experience of how to master those curves when it later has the task of autonomously driving through them with no slow vehicle slowing down the ride. Because of that, 'no free ride' sections may be fully ignored or weighted low during the learning procedure.
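The free-ride versus interfered-driving weighting described above might be sketched as follows. The gap threshold, the weight values and the function name are illustrative assumptions, not taken from the application:

```python
def segment_weight(lead_vehicle_gap_m, moving_objects_nearby, low_weight=0.1):
    """Return a training weight for one way segment.

    'Free ride' segments (no lead vehicle close enough to influence the
    driver, no pedestrians/moving objects nearby) get full weight.
    'Interfered' segments are weighted low, or ignored entirely with
    low_weight=0.0, since e.g. braking behind a slow vehicle would corrupt
    the learned way of taking a curve.
    """
    FREE_GAP_M = 80.0  # assumed gap beyond which a lead vehicle has no influence
    free_ride = (lead_vehicle_gap_m is None or lead_vehicle_gap_m > FREE_GAP_M) \
        and not moving_objects_nearby
    return 1.0 if free_ride else low_weight
```

A weight of this kind would scale each sample's contribution when the NN weighting structures are strengthened.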

  • The learning may be a patchwork of free ride sections which were repeatedly driven, so that the DNN was able to learn their characteristics. The more often a way section was learned, the higher the confidence level may grow for that section. This method provides reinforcement learning. The system may predict the near-future (time-wise) or near-path (way-wise) time segment or way segment (a segment may span 2.5 meters or a quarter second) ahead of it, and may later rate how closely the human-driven segment complied with the prediction, which result then reinforces the data set for that single segment. When a certain minimal confidence level is exceeded (e.g., around 95 percent) on all segments of a route or partial section of a route, the route or section (called a 'confident section') will be released (highlighted as a "green route") to the driver to be driven in semi-autonomous driving mode, at the driver's choice to engage or disengage and take over himself or herself whenever he/she likes, anywhere on that trained route or section. When a confident section ends, the semi-autonomous driving mode may signal the driver to take over within a certain taking-over time (such as, for example, three seconds or thereabouts). The hand-over and the hand-over time determination, as well as the triggering of emergency vehicle handling when the driver does not take over, are described in U.S. provisional application Ser. No. 62/401,310, filed Sep. 29, 2016, which is hereby incorporated herein by reference in its entirety.
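The per-segment confidence growth and green-route release might be sketched as follows. The ~95 percent threshold and the all-segments condition follow the text; the exponential-moving-average update rule and class/method names are illustrative assumptions:

```python
class SegmentConfidence:
    """Track per-segment prediction confidence across repeated drives.

    After each human-driven pass, every ~2.5 m / quarter-second segment is
    scored by how closely the system's prediction matched what the driver
    actually did (agreement in [0, 1]); a running confidence grows with
    agreement. A route or partial section is released as a 'green route'
    only when every one of its segments exceeds the threshold.
    """
    def __init__(self, n_segments, threshold=0.95, alpha=0.2):
        self.conf = [0.0] * n_segments
        self.threshold = threshold
        self.alpha = alpha  # learning rate of the running average (assumed)

    def update(self, seg, agreement):
        # Move the segment's confidence toward the observed agreement.
        self.conf[seg] += self.alpha * (agreement - self.conf[seg])

    def green_route(self):
        # Release only when all segments are confident.
        return all(c >= self.threshold for c in self.conf)
```

With this rule, a section driven consistently many times converges toward full confidence, while a single pass is never enough to release it.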

  • FIG. 2 shows a flow chart of the learning and driving execution of a system according to the invention, including the sensors. Optionally (not shown in the flow chart of FIG. 2), the learning may also take place when the vehicle is driving autonomously, by testing minimal driving deviations in the lateral guidance and lateral acceleration (and its higher harmonics), and longitudinally by varying the speed, acceleration and its higher harmonics, for improving/optimizing the ride. The optimization may always tend to minimize the accelerations, travel time, fuel consumption and vehicle wear while maximizing comfort and safety. The optimization may be done by reflecting the driver's driving interference and the driver's mood (assessed by a face detection AI).
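The ride optimization objective described above (minimize accelerations, travel time, fuel consumption and wear while maximizing comfort and safety) might be sketched as a scalar cost over candidate rides. The weighted-sum form, the weights and the squared-acceleration comfort proxy are illustrative assumptions:

```python
def ride_cost(accel_profile, travel_time_s, fuel_l,
              w_accel=1.0, w_time=0.5, w_fuel=0.8):
    """Scalar cost of one candidate ride over a learned section (illustrative).

    Penalizes mean squared acceleration (a common comfort/wear proxy)
    plus travel time and fuel consumption; the autonomous controller
    would prefer the deviation with the lowest cost.
    """
    comfort_penalty = sum(a * a for a in accel_profile) / max(len(accel_profile), 1)
    return w_accel * comfort_penalty + w_time * travel_time_s + w_fuel * fuel_l
```

Under such a cost, a smoother acceleration profile beats a harsher one at equal time and fuel, which is the direction of optimization the text describes.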

  • Optionally, since the end of a section is known, the taking over may be signaled by a countdown, so that the driver is always aware how much time remains for non-driving activities (when legal) or for not driving but supervising the vehicle's driving (such as, for example, when this is legally required). Since the system may not learn well how to handle situational interactions, such as dealing with opposing traffic on road sections too narrow for both vehicles to pass at the same time (or determined deviations from the learned features of the learned route), the system may optionally hand over these challenging tasks to the human driver and may offer to take back control when the situation has passed. In general, the system may have the ability to slow down when approaching a detected or determined obstacle (as may be detected via a camera system and/or other sensor system of the vehicle), a narrow section of road, or slow-moving traffic in front, such as in a similar manner as may be provided by an advanced or adaptive cruise control (ACC) collision avoidance system or other driver assistance system or anti-collision controls for assisted or autonomous (piloted) highway driving or the like.

  • Optionally, the system may, responsive to a determination that the driver has not taken over control of the vehicle (such as after a period of time elapses following an alert to the driver to take over control of the vehicle), function to slow or stop the vehicle. For example, if a pedestrian or deer or other vehicle is determined (such as via processing of image data captured by one or more cameras of the vehicle or processing of sensor data captured by one or more radar or lidar sensors of the vehicle) to be present in the path or route (or approaching the path or route) where the system does not expect such objects, the system may generate an alert to have the driver take over, and if, after, for example, 1 second or 2 seconds, the driver has not started manually driving/controlling the vehicle, the system may slow and/or stop the vehicle. The system may determine various deviations from the determined or learned features of the route, such as determination of the presence of pedestrians or other vehicles or objects, or determination of a change in weather conditions or road conditions or the like. The determination of deviations may be made responsive to processing of data captured by one or more sensors of the vehicle (such as one or more cameras or radar or lidar sensors of the vehicle), or responsive to receipt of a communication from a remote transmitter or communication device, such as a V2V communication (from another vehicle) or V2X communication (from an infrastructure) or the like, or responsive to determination of a change in weather (such as responsive to a GPS system that receives communications indicative of weather conditions local to the vehicle's current geographical location).
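The alert-then-timeout hand-over behavior described above might be sketched as one decision step of a small state machine. The state names and the 2-second default are illustrative (the text gives 1 or 2 seconds as examples):

```python
import enum

class Mode(enum.Enum):
    MACHINE = "machine"      # system drives the learned route
    ALERTING = "alerting"    # deviation detected, driver alerted to take over
    HUMAN = "human"          # driver has taken over (manual drive)
    SAFE_STOP = "safe_stop"  # driver did not take over in time; slow/stop

def step(mode, deviation_detected, driver_has_control, seconds_in_alert,
         alert_timeout_s=2.0):
    """One decision step of the hand-over logic (illustrative sketch).

    On an unexpected deviation (e.g. a pedestrian or deer in the learned
    path), alert the driver; if the driver takes over, switch to HUMAN;
    if the alert times out without a take-over, slow/stop the vehicle.
    """
    if mode is Mode.MACHINE and deviation_detected:
        return Mode.ALERTING
    if mode is Mode.ALERTING:
        if driver_has_control:
            return Mode.HUMAN
        if seconds_in_alert >= alert_timeout_s:
            return Mode.SAFE_STOP
    return mode
```

Such a step would be evaluated every control cycle, with the deviation flag fed by the camera/radar/lidar processing or V2X messages described above.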

  • FIG. 3 shows a plan view of a green route (dotted line) which leads along a street way from a shopping area to the driver's home, passing intersections, traffic lights and curves. An already trained system according to the invention may be capable of handling the driving tasks on the way home from the shopping center autonomously (also called machine driven), with the learned path marked as "Machine", while the system may request a driver take-over (manual drive) at route sections with exceptional (unforeseen, uncommon, unsafe) driving tasks (such as when the system determines a deviation from the learned features, such as a change in weather or the presence of pedestrians or other vehicles or the like), marked as "Human", even when the vehicle is on a green route (see also FIG. 2). For example, FIG. 3 shows a deer close to the street. The deer was not learned as a static property in the data set and is a moving object, and because of that the system issues a human driver intervention request (take-over request). Another similar example is a pedestrian passing a crosswalk which was captured in the data set without humans on it. Because of that, the driver may have passed the crosswalk at a higher speed than is common when a pedestrian is close to the crosswalk. Consequently, the system, responsive to that information, may request that the driver take over control of the vehicle.

  • When the exceptional situation diminishes or is passed and the vehicle is still on a green route, the vehicle may offer the driver to continue machine driven. When weather conditions or road conditions occur that are not well trained or learned, such as the street being covered by ice, hail, snow, dust or sand, or when the sensor signals diminish too much due to weather influence, the system may also hand over the driving task to the driver.

  • Optionally, if cases arise that were not covered by the training process, the system may have a set of pre-installed safety functions, such as emergency braking. The system (when autonomously driving the vehicle along a learned path) may also have a maximum speed limit, e.g., 130 kph, even on roads where no posted speed limit applies.
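The speed ceiling mentioned above amounts to a simple clamp on the commanded speed. A minimal sketch, assuming a machine-driving mode flag and the 130 kph cap used as an example in the text (the function name and signature are illustrative, not from the patent):

```python
MACHINE_MAX_SPEED_KPH = 130.0  # example cap from the text

def clamp_commanded_speed(requested_kph, road_limit_kph=None, machine_driven=True):
    """Limit the speed command while the system drives autonomously.

    road_limit_kph is None on roads where no posted limit applies; the
    machine-driving cap still applies in that case.
    """
    speed = requested_kph
    if road_limit_kph is not None:
        speed = min(speed, road_limit_kph)  # never exceed a posted limit
    if machine_driven:
        speed = min(speed, MACHINE_MAX_SPEED_KPH)  # autonomous-mode ceiling
    return speed
```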

  • Information may come from vehicle inherent environmental sensors, such as visual cameras, infrared cameras, time of flight sensors, structured light sensors, RADAR, LIDAR, ultrasound sensors or any other kind of ranging sensor, preferably having a long range. For example, the vehicle may have a plurality of cameras and/or RADAR sensors and/or LIDAR sensors and/or ultrasonic sensors and/or the like. The system may utilize sensors, such as radar or lidar sensors or the like. The sensing system may utilize aspects of the systems described in U.S. Pat. Nos. 9,146,898; 9,036,026; 8,027,029; 8,013,780; 6,825,455; 7,053,357; 7,408,627; 7,405,812; 7,379,163; 7,379,100; 7,375,803; 7,352,454; 7,340,077; 7,321,111; 7,310,431; 7,283,213; 7,212,663; 7,203,356; 7,176,438; 7,157,685; 6,919,549; 6,906,793; 6,876,775; 6,710,770; 6,690,354; 6,678,039; 6,674,895 and/or 6,587,186, and/or International Publication No. WO 2011/090484 and/or U.S. Publication No. US-2010-0245066 and/or U.S. patent application Ser. No. 15/467,247, filed Mar. 23, 2017 (Attorney Docket MAG04 P-2978), Ser. No. 15/446,220, filed Mar. 1, 2017 (Attorney Docket MAG04 P-2955), and/or Ser. No. 15/420,238, filed Jan. 31, 2017 (Attorney Docket MAG04 P-2935), and/or U.S. provisional applications, Ser. No. 62/375,161, filed Aug. 15, 2016, Ser. No. 62/361,586, filed Jul. 13, 2016, Ser. No. 62/359,913, filed Jul. 8, 2016, and/or Ser. No. 62/349,874, filed Jun. 14, 2016, which are hereby incorporated herein by reference in their entireties.

  • Alternatively, information may come from remote sources, such as from a detected obstacle itself (such as, for example, where the blocking object is a broken down vehicle with remote data transmission capability maintained, such as by having an E-Call system in the car), or the information may be transmitted by another vehicle (or by more than one vehicle, optionally sent in parts by different peers, such as by utilizing aspects of the systems described in U.S. Publication No. US-2015-0344028, which is hereby incorporated herein by reference in its entirety) or by infrastructure that detects the blockage via its own (inherent) sensors.

  • The system may also communicate with other systems, such as via a vehicle-to-vehicle communication system or a vehicle-to-infrastructure communication system or the like. Such car2car or vehicle to vehicle (V2V) and vehicle-to-infrastructure (car2X or V2X or V2I or 4G or 5G) technology provides for communication between vehicles and/or infrastructure based on information provided by one or more vehicles and/or information provided by a remote server or the like. Such vehicle communication systems may utilize aspects of the systems described in U.S. Pat. Nos. 6,690,268; 6,693,517 and/or 7,580,795, and/or U.S. Publication Nos. US-2014-0375476; US-2014-0218529; US-2013-0222592; US-2012-0218412; US-2012-0062743; US-2015-0251599; US-2015-0158499; US-2015-0124096; US-2015-0352953; US-2016-0036917 and/or US-2016-0210853, which are hereby incorporated herein by reference in their entireties.

  • The vehicle may include a control system and sensors that sense/determine the presence of other vehicles ahead of the subject vehicle or in the side lanes adjacent to the lane in which the subject vehicle is traveling. The sensors may comprise cameras or RADAR or LIDAR or ultrasonic sensors or the like, whereby the system (responsive to processing of sensor data) may know when a path is available, such as when it is safe to change lanes into an adjacent lane.
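The adjacent-lane check described above can be illustrated with a simple longitudinal-gap test over detected objects. This is a sketch under assumed data structures (a list of detections with a lane assignment and a signed longitudinal distance); a real system would also weigh relative speed and time-to-collision.

```python
def lane_change_is_safe(detections, target_lane,
                        min_gap_ahead_m=30.0, min_gap_behind_m=15.0):
    """Return True if no detected object in the target lane lies within the
    required longitudinal gaps of the subject vehicle.

    detections: iterable of (lane, distance_m) pairs, where distance_m is
    positive ahead of the subject vehicle and negative behind it.
    """
    for lane, distance_m in detections:
        if lane != target_lane:
            continue  # object in some other lane; irrelevant to this check
        if -min_gap_behind_m <= distance_m <= min_gap_ahead_m:
            return False  # object inside the safety envelope
    return True
```

The gap thresholds here are placeholders; the patent text does not specify numeric values.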

  • The camera or sensor may comprise any suitable camera or sensor. Optionally, the camera may comprise a “smart camera” that includes the imaging sensor array and associated circuitry and image processing circuitry and electrical connectors and the like as part of a camera module, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2013/081984 and/or WO 2013/081985, which are hereby incorporated herein by reference in their entireties.

  • The system includes an image processor operable to process image data captured by the camera or cameras, such as for detecting objects or other vehicles or pedestrians or the like in the field of view of one or more of the cameras. For example, the image processor may comprise an image processing chip selected from the EyeQ family of image processing chips available from Mobileye Vision Technologies Ltd. of Jerusalem, Israel, and may include object detection software (such as the types described in U.S. Pat. Nos. 7,855,755; 7,720,580 and/or 7,038,577, which are hereby incorporated herein by reference in their entireties), and may analyze image data to detect vehicles and/or other objects. Responsive to such image processing, and when an object or other vehicle is detected, the system may generate an alert to the driver of the vehicle and/or may generate an overlay at the displayed image to highlight or enhance display of the detected object or vehicle, in order to enhance the driver's awareness of the detected object or vehicle or hazardous condition during a driving maneuver of the equipped vehicle.

  • The vehicle may include any type of sensor or sensors, such as imaging sensors or radar sensors or lidar sensors or ladar sensors or ultrasonic sensors or the like. The imaging sensor or camera may capture image data for image processing and may comprise any suitable camera or sensing device, such as, for example, a two dimensional array of a plurality of photosensor elements arranged in at least 640 columns and 480 rows (at least a 640×480 imaging array, such as a megapixel imaging array or the like), with a respective lens focusing images onto respective portions of the array. The photosensor array may comprise a plurality of photosensor elements arranged in a photosensor array having rows and columns. Preferably, the imaging array has at least 300,000 photosensor elements or pixels, more preferably at least 500,000 photosensor elements or pixels and more preferably at least 1 million photosensor elements or pixels. The imaging array may capture color image data, such as via spectral filtering at the array, such as via an RGB (red, green and blue) filter or via a red/red complement filter or such as via an RCC (red, clear, clear) filter or the like. The logic and control circuit of the imaging sensor may function in any known manner, and the image processing and algorithmic processing may comprise any suitable means for processing the images and/or image data.
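The pixel-count thresholds above are easy to check numerically: a 640×480 array has 307,200 photosensors, which clears the 300,000 minimum but not the preferred 500,000 or 1 million tiers. A small illustrative helper (the function name and tier labels are this sketch's own, not the patent's):

```python
def imaging_array_tier(columns, rows):
    """Classify an imaging array against the pixel-count tiers in the text."""
    pixels = columns * rows
    if pixels >= 1_000_000:
        return "most preferred (>= 1 M pixels)"
    if pixels >= 500_000:
        return "more preferred (>= 500 k pixels)"
    if pixels >= 300_000:
        return "preferred minimum (>= 300 k pixels)"
    return "below the described minimum"
```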

  • For example, the vision system and/or processing and/or camera and/or circuitry may utilize aspects described in U.S. Pat. Nos. 9,233,641; 9,146,898; 9,174,574; 9,090,234; 9,077,098; 8,818,042; 8,886,401; 9,077,962; 9,068,390; 9,140,789; 9,092,986; 9,205,776; 8,917,169; 8,694,224; 7,005,974; 5,760,962; 5,877,897; 5,796,094; 5,949,331; 6,222,447; 6,302,545; 6,396,397; 6,498,620; 6,523,964; 6,611,202; 6,201,642; 6,690,268; 6,717,610; 6,757,109; 6,802,617; 6,806,452; 6,822,563; 6,891,563; 6,946,978; 7,859,565; 5,550,677; 5,670,935; 6,636,258; 7,145,519; 7,161,616; 7,230,640; 7,248,283; 7,295,229; 7,301,466; 7,592,928; 7,881,496; 7,720,580; 7,038,577; 6,882,287; 5,929,786 and/or 5,786,772, and/or U.S. Publication Nos. US-2014-0340510; US-2014-0313339; US-2014-0347486; US-2014-0320658; US-2014-0336876; US-2014-0307095; US-2014-0327774; US-2014-0327772; US-2014-0320636; US-2014-0293057; US-2014-0309884; US-2014-0226012; US-2014-0293042; US-2014-0218535; US-2014-0218535; US-2014-0247354; US-2014-0247355; US-2014-0247352; US-2014-0232869; US-2014-0211009; US-2014-0160276; US-2014-0168437; US-2014-0168415; US-2014-0160291; US-2014-0152825; US-2014-0139676; US-2014-0138140; US-2014-0104426; US-2014-0098229; US-2014-0085472; US-2014-0067206; US-2014-0049646; US-2014-0052340; US-2014-0025240; US-2014-0028852; US-2014-005907; US-2013-0314503; US-2013-0298866; US-2013-0222593; US-2013-0300869; US-2013-0278769; US-2013-0258077; US-2013-0258077; US-2013-0242099; US-2013-0215271; US-2013-0141578 and/or US-2013-0002873, which are all hereby incorporated herein by reference in their entireties. The system may communicate with other communication systems via any suitable means, such as by utilizing aspects of the systems described in International Publication Nos. WO 2010/144900; WO 2013/043661 and/or WO 2013/081985, and/or U.S. Pat. No. 9,126,525, which are hereby incorporated herein by reference in their entireties.

  • Optionally, the vision system may include a display for displaying images captured by one or more of the imaging sensors for viewing by the driver of the vehicle while the driver is normally operating the vehicle. Optionally, for example, the vision system may include a video display device, such as by utilizing aspects of the video display systems described in U.S. Pat. Nos. 5,530,240; 6,329,925; 7,855,755; 7,626,749; 7,581,859; 7,446,650; 7,338,177; 7,274,501; 7,255,451; 7,195,381; 7,184,190; 5,668,663; 5,724,187; 6,690,268; 7,370,983; 7,329,013; 7,308,341; 7,289,037; 7,249,860; 7,004,593; 4,546,551; 5,699,044; 4,953,305; 5,576,687; 5,632,092; 5,677,851; 5,708,410; 5,737,226; 5,802,727; 5,878,370; 6,087,953; 6,173,508; 6,222,460; 6,513,252 and/or 6,642,851, and/or U.S. Publication Nos. US-2012-0162427; US-2006-0050018 and/or US-2006-0061008, which are all hereby incorporated herein by reference in their entireties. Optionally, the vision system (utilizing the forward facing camera and a rearward facing camera and other cameras disposed at the vehicle with exterior fields of view) may be part of or may provide a display of a top-down view or birds-eye view system of the vehicle or a surround view at the vehicle, such as by utilizing aspects of the vision systems described in International Publication Nos. WO 2010/099416; WO 2011/028686; WO 2012/075250; WO 2013/019795; WO 2012/075250; WO 2012/145822; WO 2013/081985; WO 2013/086249 and/or WO 2013/109869, and/or U.S. Publication No. US-2012-0162427, which are hereby incorporated herein by reference in their entireties.

  • Changes and modifications in the specifically described embodiments can be carried out without departing from the principles of the invention, which is intended to be limited only by the scope of the appended claims, as interpreted according to the principles of patent law including the doctrine of equivalents.

Claims (20)

1. A control system for controlling a vehicle, said control system comprising:

a vehicle control comprising circuitry and associated software, wherein said vehicle control is operable to control driving of the vehicle;

wherein said vehicle control is operable to learn features of a route driven by the vehicle during multiple repetitive drives of the route by the vehicle;

wherein, after said vehicle control has sufficiently learned the route, said vehicle control is operable to at least semi-autonomously control the vehicle to drive the vehicle along the route; and

wherein, while said vehicle control is at least semi-autonomously controlling the vehicle to drive the vehicle along the route, and responsive to determination of a deviation from the learned features of the route, said vehicle control adjusts the at least semi-autonomous control of the vehicle.

2. The control system of claim 1, wherein, while said vehicle control is at least semi-autonomously controlling the vehicle to drive the vehicle along the route, and responsive to determination of a deviation from the learned features of the route, said vehicle control generates an alert to have a driver of the vehicle take over driving of the vehicle.

3. The control system of claim 2, wherein said vehicle control, responsive to determination that the driver of the vehicle has not taken over driving of the vehicle after the alert is generated, controls driving of the vehicle to slow or stop the vehicle.

4. The control system of claim 1, wherein said vehicle control determines a deviation from the learned features of the route via processing of data captured by one or more sensors of the vehicle.

5. The control system of claim 4, wherein said vehicle control determines a deviation by determining that a pedestrian is present along the learned route.

6. The control system of claim 1, wherein said vehicle control determines a deviation from the learned features of the route responsive to a remote communication.

7. The control system of claim 1, wherein said vehicle control determines a deviation from the learned features of the route via determination of a change in weather conditions.

8. The control system of claim 1, wherein said vehicle control learns features of the route via a neural network algorithm.

9. The control system of claim 1, wherein said vehicle control learns road contour features and road lane features and road curvature features of the route.

10. The control system of claim 1, wherein said vehicle control semi-autonomously controls the vehicle to drive the vehicle along the route responsive to a user input when the vehicle is at a location along the learned route.

11. The control system of claim 10, wherein said vehicle control provides an alert to a driver of the vehicle as the vehicle approaches an end of the learned route.

12. The control system of claim 11, wherein the alert comprises a countdown as the vehicle approaches the end of the learned route.

13. The control system of claim 1, wherein said vehicle control controls at least the steering of the vehicle, the accelerator of the vehicle and the braking system of the vehicle as said vehicle control controls the vehicle to drive the vehicle along the learned route.

14. The control system of claim 1, wherein said vehicle control controls the vehicle at least in part responsive to a GPS system of the vehicle.

15. The control system of claim 14, wherein said vehicle control controls the vehicle at least in part responsive to processing of data captured by at least one sensor of the vehicle that senses forward of the vehicle.

16. The control system of claim 15, wherein said at least one sensor comprises a forward viewing camera of the vehicle that is operable to capture image data, and wherein said vehicle control includes a processor operable to process image data captured by said camera to detect the presence of objects present in the field of view of said camera.

17. The control system of claim 15, wherein, responsive to an output of said at least one sensor being indicative of an obstacle in the path of travel of the vehicle along the learned route, said vehicle control one of (i) controls the vehicle to avoid the obstacle and (ii) generates an alert so that a driver can take over control of the vehicle.

18. A control system for controlling a vehicle, said control system comprising:

a vehicle control comprising circuitry and associated software, wherein said vehicle control is operable to control driving of the vehicle;

wherein said vehicle control is operable to learn features of a route driven by the vehicle during multiple repetitive drives of the route by the vehicle as the vehicle is driven by a driver of the vehicle;

wherein, after said vehicle control has sufficiently learned the route, said vehicle control is operable to at least semi-autonomously control the vehicle to drive the vehicle along the route;

wherein, while said vehicle control is at least semi-autonomously controlling the vehicle to drive the vehicle along the route, and responsive to determination of a deviation from the learned features of the route, said vehicle control generates an alert to have the driver of the vehicle take over driving of the vehicle; and

wherein said vehicle control determines a deviation from the learned features of the route via processing of data captured by one or more sensors of the vehicle.

19. The control system of claim 18, wherein said vehicle control, responsive to determination that the driver of the vehicle has not taken over driving of the vehicle after the alert is generated, controls driving of the vehicle to slow or stop the vehicle.

20. A control system for controlling a vehicle, said control system comprising:

a vehicle control comprising circuitry and associated software, wherein said vehicle control is operable to control driving of the vehicle;

wherein said vehicle control is operable to learn features of a route driven by the vehicle during multiple repetitive drives of the route by the vehicle;

wherein, after said vehicle control has sufficiently learned the route, said vehicle control is operable to at least semi-autonomously control the vehicle to drive the vehicle along the route;

wherein said vehicle control semi-autonomously controls the vehicle to drive the vehicle along the route responsive to a user input when the vehicle is at a location along the learned route;

wherein, while said vehicle control is at least semi-autonomously controlling the vehicle to drive the vehicle along the route, and responsive to determination of a deviation from the learned features of the route, said vehicle control at least one of (i) adjusts the at least semi-autonomous control of the vehicle and (ii) generates an alert to have a driver of the vehicle take over driving of the vehicle;

wherein said vehicle control determines a deviation from the learned features of the route via processing of data captured by one or more sensors of the vehicle; and

wherein said vehicle control provides an alert to the driver of the vehicle as the vehicle approaches an end of the learned route.

US15/596,348 2016-05-16 2017-05-16 Control system for semi-autonomous control of vehicle along learned route Abandoned US20170329331A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/596,348 US20170329331A1 (en) 2016-05-16 2017-05-16 Control system for semi-autonomous control of vehicle along learned route
US16/947,184 US12187326B2 (en) 2016-05-16 2020-07-22 Control system for semi-autonomous control of vehicle along learned route

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662336883P 2016-05-16 2016-05-16
US15/596,348 US20170329331A1 (en) 2016-05-16 2017-05-16 Control system for semi-autonomous control of vehicle along learned route

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/947,184 Continuation US12187326B2 (en) 2016-05-16 2020-07-22 Control system for semi-autonomous control of vehicle along learned route

Publications (1)

Publication Number Publication Date
US20170329331A1 true US20170329331A1 (en) 2017-11-16

Family

ID=60294709

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/596,348 Abandoned US20170329331A1 (en) 2016-05-16 2017-05-16 Control system for semi-autonomous control of vehicle along learned route
US16/947,184 Active 2038-08-26 US12187326B2 (en) 2016-05-16 2020-07-22 Control system for semi-autonomous control of vehicle along learned route

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/947,184 Active 2038-08-26 US12187326B2 (en) 2016-05-16 2020-07-22 Control system for semi-autonomous control of vehicle along learned route

Country Status (1)

Country Link
US (2) US20170329331A1 (en)

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170253237A1 (en) * 2016-03-02 2017-09-07 Magna Electronics Inc. Vehicle vision system with automatic parking function
US20180105171A1 (en) * 2016-10-14 2018-04-19 Honda Motor Co., Ltd Travel control device and travel control method
US10133273B2 (en) * 2016-09-20 2018-11-20 2236008 Ontario Inc. Location specific assistance for autonomous vehicle control system
US10222229B1 (en) * 2017-11-30 2019-03-05 Toyota Jidosha Kabushiki Kaisha Autonomous feature optimization for a connected vehicle based on a navigation route
CN109597317A (en) * 2018-12-26 2019-04-09 广州小鹏汽车科技有限公司 A kind of Vehicular automatic driving method, system and electronic equipment based on self study
CN109857118A (en) * 2019-03-12 2019-06-07 百度在线网络技术(北京)有限公司 For planning the method, apparatus, equipment and storage medium of driving strategy
GB2571154A (en) * 2018-02-15 2019-08-21 Jaguar Land Rover Ltd Vehicle Control System And Control Method
US10496090B2 (en) 2016-09-29 2019-12-03 Magna Electronics Inc. Handover procedure for driver of autonomous vehicle
US20200166746A1 (en) * 2018-11-26 2020-05-28 International Business Machines Corporation Heads up display system
US10839230B2 (en) * 2018-09-06 2020-11-17 Ford Global Technologies, Llc Multi-tier network for task-oriented deep neural network
US10906554B2 (en) 2017-05-23 2021-02-02 Magna Electronics Inc. Autonomous driving system
US20210094571A1 (en) * 2019-09-30 2021-04-01 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for controlling vehicle, device and storage medium
CN112660128A (en) * 2019-10-15 2021-04-16 现代自动车株式会社 Apparatus for determining lane change path of autonomous vehicle and method thereof
US20210166090A1 (en) * 2018-07-31 2021-06-03 Valeo Schalter Und Sensoren Gmbh Driving assistance for the longitudinal and/or lateral control of a motor vehicle
US11055803B2 (en) * 2017-12-26 2021-07-06 Toyota Jidosha Kabushiki Kaisha Vehicle dispatch management device and storage medium
US11059517B2 (en) * 2016-07-22 2021-07-13 Robert Bosch Gmbh Driving assistance method, driving assistance system and vehicle
US11119480B2 (en) 2016-10-20 2021-09-14 Magna Electronics Inc. Vehicle control system that learns different driving characteristics
US11136034B2 (en) * 2017-06-20 2021-10-05 Nissan Motor Co., Ltd. Travel control method and travel control device
US11169537B2 (en) * 2016-04-15 2021-11-09 Honda Motor Co., Ltd. Providing driving support in response to changes in driving environment
US11288963B2 (en) * 2017-08-31 2022-03-29 Uatc, Llc Autonomous vehicles featuring vehicle intention system
US20220163972A1 (en) * 2019-06-21 2022-05-26 Uatc, Llc Methods and Systems for Autonomous Vehicle Motion Deviation
US20220315055A1 (en) * 2021-04-02 2022-10-06 Tsinghua University Safety control method and system based on environmental risk assessment for intelligent connected vehicle
US11493920B2 (en) * 2018-02-02 2022-11-08 Uatc, Llc Autonomous vehicle integrated user alert and environmental labeling
US20220355823A1 (en) * 2019-06-18 2022-11-10 Ihi Corporation Travel route generation device and control device
US20230055023A1 (en) * 2020-01-17 2023-02-23 Hitachi Astemo. Ltd. Electronic control device and vehicle control system
US20230059053A1 (en) * 2018-09-30 2023-02-23 Strong Force Tp Portfolio 2022, Llc Optimizing margin of safety based on human operator interaction data from operators or vehicle safety events
US11613249B2 (en) 2018-04-03 2023-03-28 Ford Global Technologies, Llc Automatic navigation using deep reinforcement learning
US11618438B2 (en) 2018-03-26 2023-04-04 International Business Machines Corporation Three-dimensional object localization for obstacle avoidance using one-shot convolutional neural network
US12187326B2 (en) 2016-05-16 2025-01-07 Magna Electronics Inc. Control system for semi-autonomous control of vehicle along learned route

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112874521B (en) * 2021-01-22 2022-06-14 北京罗克维尔斯科技有限公司 Vehicle follow-up stop control method and device and vehicle

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100152967A1 (en) * 2008-12-15 2010-06-17 Delphi Technologies, Inc. Object detection system with learned position information and method
US20100148948A1 (en) * 2008-12-15 2010-06-17 Delphi Technologies, Inc. Vehicle lane departure warning system and method
US20140156134A1 (en) * 2012-11-30 2014-06-05 Google Inc. Engaging and disengaging for autonomous driving
US20150073620A1 (en) * 2013-09-11 2015-03-12 Toyota Jidosha Kabushiki Kaisha Driving assistance device
US20150168157A1 (en) * 2013-12-17 2015-06-18 Volkswagen Ag Method and system for determining parameters of a model for the longitudinal guidance and for the determination of a longitudinal guide for a vehicle
US20170052028A1 (en) * 2015-08-20 2017-02-23 Zendrive, Inc. Method for accelerometer-assisted navigation
US20170135621A1 (en) * 2015-11-16 2017-05-18 Samsung Electronics Co., Ltd. Apparatus and method to train autonomous driving model, and autonomous driving apparatus
US20170197635A1 (en) * 2014-09-02 2017-07-13 Aisin Aw Co., Ltd. Automatic driving assistance system, automatic driving assistance method, and computer program
US20170371334A1 (en) * 2014-12-29 2017-12-28 Robert Bosch Gmbh Drive state indicator for an autonomous vehicle
US20180178766A1 (en) * 2015-07-02 2018-06-28 Sony Corporation Vehicle control device, vehicle control method, and program

Family Cites Families (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6693517B2 (en) 2000-04-21 2004-02-17 Donnelly Corporation Vehicle mirror assembly communicating wirelessly with vehicle accessories and occupants
US6477464B2 (en) 2000-03-09 2002-11-05 Donnelly Corporation Complete mirror-based global-positioning system (GPS) navigation solution
US6512838B1 (en) 1999-09-22 2003-01-28 Canesta, Inc. Methods for enhancing performance and data acquired from three-dimensional image systems
US6710770B2 (en) 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
AU2001243285A1 (en) 2000-03-02 2001-09-12 Donnelly Corporation Video mirror systems incorporating an accessory module
US6587186B2 (en) 2000-06-06 2003-07-01 Canesta, Inc. CMOS-compatible three-dimensional image sensing using reduced peak energy
US6906793B2 (en) 2000-12-11 2005-06-14 Canesta, Inc. Methods and devices for charge management for three-dimensional sensing
US7352454B2 (en) 2000-11-09 2008-04-01 Canesta, Inc. Methods and devices for improved charge management for three-dimensional and color sensing
US6690354B2 (en) 2000-11-19 2004-02-10 Canesta, Inc. Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions
WO2002095679A2 (en) 2001-05-23 2002-11-28 Canesta, Inc. Enhanced dynamic range conversion in 3-d imaging
US7310431B2 (en) 2002-04-10 2007-12-18 Canesta, Inc. Optical methods for remotely measuring objects
US7203356B2 (en) 2002-04-11 2007-04-10 Canesta, Inc. Subject segmentation and tracking using 3D sensing technology for video compression in multimedia applications
AU2003245628A1 (en) 2002-06-19 2004-01-06 Canesta, Inc. System and method for determining 3-d coordinates of a surface using a coded array
EP1614159B1 (en) 2003-04-11 2014-02-26 Microsoft Corporation Method and system to differentially enhance sensor dynamic range
US7176438B2 (en) 2003-04-11 2007-02-13 Canesta, Inc. Method and system to differentially enhance sensor dynamic range using enhanced common mode reset
US7379100B2 (en) 2004-02-12 2008-05-27 Canesta, Inc. Method and system to increase dynamic range of time-of-flight (TOF) and/or imaging sensors
US7321111B2 (en) 2004-04-12 2008-01-22 Canesta, Inc. Method and system to enhance differential dynamic range and signal/noise in CMOS range finding systems using differential sensors
US7157685B2 (en) 2004-04-12 2007-01-02 Canesta, Inc. Method and system to enhance differential dynamic range and signal/noise in CMOS range finding systems using differential sensors
US7379163B2 (en) 2005-02-08 2008-05-27 Canesta, Inc. Method and system for automatic gain control of sensors in time-of-flight systems
US7283213B2 (en) 2005-02-08 2007-10-16 Canesta, Inc. Method and system to correct motion blur and reduce signal transients in time-of-flight sensor systems
US7408627B2 (en) 2005-02-08 2008-08-05 Canesta, Inc. Methods and system to quantify depth data accuracy in three-dimensional sensors using single frame capture
US7375803B1 (en) 2006-05-18 2008-05-20 Canesta, Inc. RGBZ (red, green, blue, z-depth) filter system usable with sensor systems, including sensor systems with synthetic mirror enhanced three-dimensional imaging
US7405812B1 (en) 2006-05-18 2008-07-29 Canesta, Inc. Method and system to avoid inter-system interference for phase-based time-of-flight systems
US8013780B2 (en) 2007-01-25 2011-09-06 Magna Electronics Inc. Radar sensing system for vehicle
US20100245066A1 (en) 2007-10-23 2010-09-30 Sarioglu Guner R Automotive Ultrasonic Sensor System with Independent Wire Harness
US8027029B2 (en) 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
EP2401176B1 (en) 2009-02-27 2019-05-08 Magna Electronics Alert system for vehicle
WO2010144900A1 (en) 2009-06-12 2010-12-16 Magna Electronics Inc. Scalable integrated electronic control unit for vehicle
US9146898B2 (en) 2011-10-27 2015-09-29 Magna Electronics Inc. Driver assist system with algorithm switching
US9269263B2 (en) 2012-02-24 2016-02-23 Magna Electronics Inc. Vehicle top clearance alert system
DE102013217430A1 (en) 2012-09-04 2014-03-06 Magna Electronics, Inc. Driver assistance system for a motor vehicle
US20140218529A1 (en) 2013-02-04 2014-08-07 Magna Electronics Inc. Vehicle data recording system
US20140375476A1 (en) 2013-06-24 2014-12-25 Magna Electronics Inc. Vehicle alert system
US9881220B2 (en) 2013-10-25 2018-01-30 Magna Electronics Inc. Vehicle vision system utilizing communication system
US9499139B2 (en) 2013-12-05 2016-11-22 Magna Electronics Inc. Vehicle monitoring system
US9688199B2 (en) 2014-03-04 2017-06-27 Magna Electronics Inc. Vehicle alert system utilizing communication system
US10328932B2 (en) 2014-06-02 2019-06-25 Magna Electronics Inc. Parking assist system with annotated map generation
US20150352953A1 (en) 2014-06-04 2015-12-10 Magna Electronics Inc. Vehicle control system with mobile device interface
US9729636B2 (en) 2014-08-01 2017-08-08 Magna Electronics Inc. Smart road system for vehicles
EP3256815A1 (en) * 2014-12-05 2017-12-20 Apple Inc. Autonomous navigation system
US10032369B2 (en) 2015-01-15 2018-07-24 Magna Electronics Inc. Vehicle vision system with traffic monitoring and alert
JP6376059B2 (en) * 2015-07-06 2018-08-22 トヨタ自動車株式会社 Control device for autonomous driving vehicle
US20170222311A1 (en) 2016-02-01 2017-08-03 Magna Electronics Inc. Vehicle sensing system with radar antenna radome
US10863335B2 (en) 2016-03-04 2020-12-08 Magna Electronics Inc. Vehicle trailer angle detection system using short range communication devices
US10571562B2 (en) 2016-03-25 2020-02-25 Magna Electronics Inc. Vehicle short range sensing system using RF sensors
US20170329331A1 (en) 2016-05-16 2017-11-16 Magna Electronics Inc. Control system for semi-autonomous control of vehicle along learned route
US10768298B2 (en) 2016-06-14 2020-09-08 Magna Electronics Inc. Vehicle sensing system with 360 degree near range sensing
US10239446B2 (en) 2016-07-13 2019-03-26 Magna Electronics Inc. Vehicle sensing system using daisy chain of sensors
US10641867B2 (en) 2016-08-15 2020-05-05 Magna Electronics Inc. Vehicle radar system with shaped radar antennas
US10496090B2 (en) 2016-09-29 2019-12-03 Magna Electronics Inc. Handover procedure for driver of autonomous vehicle


Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170253237A1 (en) * 2016-03-02 2017-09-07 Magna Electronics Inc. Vehicle vision system with automatic parking function
US11400919B2 (en) * 2016-03-02 2022-08-02 Magna Electronics Inc. Vehicle vision system with autonomous parking function
US11169537B2 (en) * 2016-04-15 2021-11-09 Honda Motor Co., Ltd. Providing driving support in response to changes in driving environment
US12187326B2 (en) 2016-05-16 2025-01-07 Magna Electronics Inc. Control system for semi-autonomous control of vehicle along learned route
US11059517B2 (en) * 2016-07-22 2021-07-13 Robert Bosch Gmbh Driving assistance method, driving assistance system and vehicle
US10133273B2 (en) * 2016-09-20 2018-11-20 2236008 Ontario Inc. Location specific assistance for autonomous vehicle control system
US11550319B2 (en) 2016-09-29 2023-01-10 Magna Electronics Inc. Vehicular control system with handover procedure for driver of controlled vehicle
US11137760B2 (en) 2016-09-29 2021-10-05 Magna Electronics Inc. Handover procedure for driver of controlled vehicle
US11927954B2 (en) 2016-09-29 2024-03-12 Magna Electronics Inc. Vehicular control system with handover procedure for driver of controlled vehicle
US10496090B2 (en) 2016-09-29 2019-12-03 Magna Electronics Inc. Handover procedure for driver of autonomous vehicle
US10576980B2 (en) * 2016-10-14 2020-03-03 Honda Motor Co., Ltd. Travel control device and travel control method
US20180105171A1 (en) * 2016-10-14 2018-04-19 Honda Motor Co., Ltd. Travel control device and travel control method
US12084063B2 (en) 2016-10-20 2024-09-10 Magna Electronics Inc. Vehicular cabin monitoring system
US11586204B2 (en) 2016-10-20 2023-02-21 Magna Electronics Inc. Vehicular driving assist system that learns different driving styles
US11119480B2 (en) 2016-10-20 2021-09-14 Magna Electronics Inc. Vehicle control system that learns different driving characteristics
US10906554B2 (en) 2017-05-23 2021-02-02 Magna Electronics Inc. Autonomous driving system
US11136034B2 (en) * 2017-06-20 2021-10-05 Nissan Motor Co., Ltd. Travel control method and travel control device
US11288963B2 (en) * 2017-08-31 2022-03-29 Uatc, Llc Autonomous vehicles featuring vehicle intention system
US10222229B1 (en) * 2017-11-30 2019-03-05 Toyota Jidosha Kabushiki Kaisha Autonomous feature optimization for a connected vehicle based on a navigation route
US10816356B2 (en) 2017-11-30 2020-10-27 Toyota Jidosha Kabushiki Kaisha Autonomous feature optimization for a connected vehicle based on a navigation route
US11055803B2 (en) * 2017-12-26 2021-07-06 Toyota Jidosha Kabushiki Kaisha Vehicle dispatch management device and storage medium
US11493920B2 (en) * 2018-02-02 2022-11-08 Uatc, Llc Autonomous vehicle integrated user alert and environmental labeling
GB2571154A (en) * 2018-02-15 2019-08-21 Jaguar Land Rover Ltd Vehicle Control System And Control Method
GB2571154B (en) * 2018-02-15 2020-04-22 Jaguar Land Rover Ltd Vehicle Control System And Control Method
US11618438B2 (en) 2018-03-26 2023-04-04 International Business Machines Corporation Three-dimensional object localization for obstacle avoidance using one-shot convolutional neural network
US11613249B2 (en) 2018-04-03 2023-03-28 Ford Global Technologies, Llc Automatic navigation using deep reinforcement learning
US20210166090A1 (en) * 2018-07-31 2021-06-03 Valeo Schalter Und Sensoren Gmbh Driving assistance for the longitudinal and/or lateral control of a motor vehicle
US10839230B2 (en) * 2018-09-06 2020-11-17 Ford Global Technologies, Llc Multi-tier network for task-oriented deep neural network
US12174626B2 (en) 2018-09-30 2024-12-24 Strong Force Tp Portfolio 2022, Llc Artificial intelligence system for vehicle in-seat advertising
US12153417B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Intelligent transportation systems
US12248316B2 (en) 2018-09-30 2025-03-11 Strong Force Tp Portfolio 2022, Llc Expert system for vehicle configuration recommendations of vehicle or user experience parameters
US12248317B2 (en) 2018-09-30 2025-03-11 Strong Force Tp Portfolio 2022, Llc Neural net-based use of perceptrons to mimic human senses associated with a vehicle occupant
US12153422B2 (en) * 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Optimizing margin of safety based on human operator interaction data from operators or vehicle safety events
US12153421B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Neural network for improving the state of a rider in intelligent transportation systems
US20230059053A1 (en) * 2018-09-30 2023-02-23 Strong Force Tp Portfolio 2022, Llc Optimizing margin of safety based on human operator interaction data from operators or vehicle safety events
US12153418B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Method for improving a state of a rider through optimization of operation of a vehicle
US12153419B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Recommendation system for recommending a configuration of a vehicle
US12253851B2 (en) 2018-09-30 2025-03-18 Strong Force Tp Portfolio 2022, Llc Cognitive system reward management
US12153424B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Three different neural networks to optimize the state of the vehicle using social data
US20240126284A1 (en) * 2018-09-30 2024-04-18 Strong Force Tp Portfolio 2022, Llc Robotic process automation for achieving an optimized margin of vehicle operational safety
US12153426B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Transportation system to use a neural network to determine a variation in driving performance to promote a desired hormonal state of an occupant
US12153423B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Inducing variation in user experience parameters based on outcomes that promote rider safety in intelligent transportation systems
US12147227B2 (en) * 2018-09-30 2024-11-19 Strong Force Tp Portfolio 2022, Llc Robotic process automation for achieving an optimized margin of vehicle operational safety
US12153425B2 (en) 2018-09-30 2024-11-26 Strong Force Tp Portfolio 2022, Llc Artificial intelligence system for processing voice of rider to improve emotional state and optimize operating parameter of vehicle
US11079593B2 (en) * 2018-11-26 2021-08-03 International Business Machines Corporation Heads up display system
US20200166746A1 (en) * 2018-11-26 2020-05-28 International Business Machines Corporation Heads up display system
CN109597317A (en) * 2018-12-26 2019-04-09 Guangzhou Xiaopeng Motors Technology Co., Ltd. Self-learning-based vehicle automatic driving method, system, and electronic device
CN109857118A (en) * 2019-03-12 2019-06-07 Baidu Online Network Technology (Beijing) Co., Ltd. Method, apparatus, device, and storage medium for planning a driving strategy
US20220355823A1 (en) * 2019-06-18 2022-11-10 Ihi Corporation Travel route generation device and control device
US11999384B2 (en) * 2019-06-18 2024-06-04 Ihi Corporation Travel route generation device and control device
US20220163972A1 (en) * 2019-06-21 2022-05-26 Uatc, Llc Methods and Systems for Autonomous Vehicle Motion Deviation
US11628861B2 (en) * 2019-09-30 2023-04-18 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method and apparatus for controlling vehicle, device and storage medium
US20210094571A1 (en) * 2019-09-30 2021-04-01 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for controlling vehicle, device and storage medium
CN112660128A (en) * 2019-10-15 2021-04-16 现代自动车株式会社 Apparatus for determining lane change path of autonomous vehicle and method thereof
US20230055023A1 (en) * 2020-01-17 2023-02-23 Hitachi Astemo, Ltd. Electronic control device and vehicle control system
US11518409B2 (en) * 2021-04-02 2022-12-06 Tsinghua University Safety control method and system based on environmental risk assessment for intelligent connected vehicle
US20220315055A1 (en) * 2021-04-02 2022-10-06 Tsinghua University Safety control method and system based on environmental risk assessment for intelligent connected vehicle

Also Published As

Publication number Publication date
US20200348667A1 (en) 2020-11-05
US12187326B2 (en) 2025-01-07

Similar Documents

Publication Publication Date Title
US12187326B2 (en) 2025-01-07 Control system for semi-autonomous control of vehicle along learned route
US11222544B2 (en) 2022-01-11 Lane change system for platoon of vehicles
US11713038B2 (en) 2023-08-01 Vehicular control system with rear collision mitigation
US11814045B2 (en) 2023-11-14 Autonomous vehicle with path planning system
US12228926B1 (en) 2025-02-18 System and method for predicting behaviors of detected objects through environment representation
US10688993B2 (en) 2020-06-23 Vehicle control system with traffic driving control
JP7043450B2 (en) 2022-03-29 Vehicle control devices, vehicle control methods, and programs
JP6051162B2 (en) 2016-12-27 System and method for predicting the behavior of detected objects
US11679780B2 (en) 2023-06-20 Methods and systems for monitoring vehicle motion with driver safety alerts
US20220048509A1 (en) 2022-02-17 Vehicular control system with traffic jam assist
US20220366175A1 (en) 2022-11-17 Long-range object detection, localization, tracking and classification for autonomous vehicles
US20240359691A1 (en) 2024-10-31 Vehicular control system
EP3679441B1 (en) 2023-06-28 Mobile robot having collision avoidance system for crossing a road from a pedestrian pathway
WO2021235043A1 (en) 2021-11-25 Vehicle control device
CN112639808A (en) 2021-04-09 Driving assistance for longitudinal and/or lateral control of a motor vehicle
US20240317256A1 (en) 2024-09-26 Vehicular driving assistance system with enhanced road curve management
EP3825902A1 (en) 2021-05-26 A training method and an automatic speed control system for a motor vehicle

Legal Events

Date Code Title Description
2019-03-24 STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

2019-04-26 STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

2019-08-07 STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

2019-10-30 STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

2020-03-19 STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

2020-11-23 STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION