US20020044081A1 - Track map generator - Google Patents

Track map generator

Info

Publication number
US20020044081A1
US20020044081A1 (application US09/877,493)
Authority
US
United States
Prior art keywords
data points
track
map
heading
ang
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/877,493
Other versions
US6420997B1 (en)
Inventor
Shan Cong
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Automotive Systems Laboratory Inc
Original Assignee
Automotive Systems Laboratory Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Automotive Systems Laboratory Inc filed Critical Automotive Systems Laboratory Inc
Priority to US09/877,493 priority Critical patent/US6420997B1/en
Assigned to AUTOMOTIVE SYSTEMS LABORATORY, INC. reassignment AUTOMOTIVE SYSTEMS LABORATORY, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CONG, SHAN
Publication of US20020044081A1 publication Critical patent/US20020044081A1/en
Application granted granted Critical
Publication of US6420997B1 publication Critical patent/US6420997B1/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66 Radar-tracking systems; Analogous systems
    • G01S13/72 Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
    • G01S13/723 Radar-tracking systems; Analogous systems for two-dimensional tracking by using numerical data
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9322 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using additional data, e.g. driver condition, road state or weather data
    • G01S2013/9329 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles cooperating with reflectors or transponders


Abstract

A radar system (14, 20, 22) generates a plurality of data points (x, y) representative of the position of a tracked object (48), and a representation of an associated path (50, 60) is formed therefrom. At least one quality measure and at least one heading measure of said representation are calculated, corresponding to at least one map coordinate (44). The quality and heading measures are stored in memory as a track map. Data from other tracked objects (58) is used to update the track map (38), resulting in a plurality of heading values at associated map coordinates (44) that are representative of a path (42) followed by the tracked objects (48, 58).

Description

  • The instant application claims the benefit of U.S. Provisional Application Ser. No. 60/210,193 filed on Jun. 8, 2000 (“ASL-264-PRO”), and of U.S. Provisional Application Ser. No. 60/210,878 filed on Jun. 9, 2000 (“ASL-266-PRO”), both of which applications are incorporated herein by reference.[0001]
  • In the accompanying drawings: [0002]
  • FIG. 1 illustrates a block diagram of a radar processing system that incorporates the instant invention; [0003]
  • FIG. 2 illustrates an example of a target tracking situation during a first period of time; [0004]
  • FIG. 3 illustrates an example of a target tracking situation during a second period of time after the first period of time of FIG. 2; [0005]
  • FIG. 4 illustrates a general process in accordance with the instant invention; [0006]
  • FIG. 5 illustrates a more detailed process in accordance with the instant invention; [0007]
  • FIG. 6 illustrates a relationship of angular quantities; [0008]
  • FIG. 7 illustrates an example of a target tracking situation during a first period of time, but with a host vehicle centered track map; and [0009]
  • FIG. 8 illustrates an example of a target tracking situation during a second period of time after the first period of time of FIG. 7, but with a host vehicle centered track map.[0010]
  • There exists a need for an improved predictive collision sensing or collision avoidance system for automotive applications that can sense and identify the environment of a host vehicle with sufficient range and accuracy so that proper countermeasures can be selected and taken sufficiently early either to avoid a collision or to mitigate injury therefrom, whether to occupants of the host vehicle or to nearby pedestrians. As used herein, the term predictive collision sensing system also refers to a collision avoidance system, meaning a system that can sense and track targets in the environment of the host vehicle and then either suggest, or automatically take, countermeasures that would improve safety. Generally, a predictive collision sensing system tracks the motion of the host vehicle relative to its environment, or vice versa, for example using a radar system with an associated target tracker. The environment may include both stationary and moving targets. An automotive environment is distinguished from other target tracking environments, for example that of air or sea vessels, in that automotive vehicles are primarily operated in an environment that is constrained by roadways. There are, of course, exceptions, for example parking lots or off-road driving conditions, but these exceptions generally occupy a relatively small amount of vehicular operating time, or pose a relatively small risk of collisions that would benefit from a predictive collision sensing system. [0011]
  • If available, knowledge of roadway constraints, for example, in the form of a map, can be useful in improving the performance of a predictive crash sensing system. For example, if the host vehicle is known to be operating on a particular roadway having a particular path geometry, and an on-board navigation system detects that the trajectory of the vehicle is departing from that path, for example as a result of a skid or driver inattention, then the predictive collision system could identify and react to this situation, whether or not the on-board radar detected a potential target with which the host vehicle might collide. [0012]
  • Knowledge of roadway constraints can also be used to improve the convergence of an associated target tracker, or to improve an estimate of the environment or situation in which the host vehicle is operated. [0013]
  • Roadway constraints are generally characterized in the form of a map of associated coordinates in a two-dimensional space. For example, road trajectories can be plotted in an X-Y coordinate system with positive X directed North and positive Y directed East. Whereas digitized road maps are presently widely available, their utility in a predictive crash sensing system of a host vehicle is at least partially dependent upon a navigation process to locate the host vehicle on the map, assuming that the maps are of sufficient accuracy. For example, the location and direction of a vehicle can be measured by a GPS receiver; by a dead reckoning system using measurements of vehicle heading from a compass or directional gyroscope, and vehicle distance and heading from wheel speed or rotation measurements, in conjunction with a map matching algorithm; or by a combination of the two. However, heretofore available navigation systems are generally not sufficiently accurate to provide a map of the roadway traveled by the host vehicle that is accurate enough for predictive crash sensing. Moreover, road conditions can change over time, for example as a result of road construction, and these changes may not always be timely entered into the associated digital maps. [0014]
  • Accordingly, there exists a need for an improved system and method for generating a map of the track or tracks followed by the host vehicle and by other vehicles in the environment of the host vehicle. [0015]
  • Referring to FIGS. 1-3, a track map generator 10 is illustrated in a radar processing system 12 for processing radar data from a radar system 14 incorporated in a host vehicle 16. The host vehicle 16 is equipped with a sensor for sensing targets in its environment and for actuating and/or controlling associated countermeasures responsive to the relative motion of the host vehicle and one or more targets. The sensor for sensing targets is, for example, a radar or lidar sensor system that senses and tracks the location of targets relative to the host vehicle and predicts whether a collision between the host vehicle and the target is likely to occur, for example as disclosed in U.S. Pat. No. 6,085,151, assigned to the assignee of the instant invention and incorporated by reference herein. [0016]
  • For example, referring to FIG. 1, illustrating a block diagram of a radar processing system 12, each block in the block diagram comprises associated software modules that receive, prepare and/or process data provided by a radar system 14 mounted in the host vehicle. Data from the radar system 14 is preprocessed by a preprocessor 18 so as to generate radar data, for example range, range rate, azimuth angle, and the quality thereof, suitable for target tracking. The radar data typically covers a wide field of view forward of the host vehicle, at least ±5 degrees from the host vehicle's longitudinal axis, and possibly extending to ±180 degrees or larger depending upon the radar and associated antenna configuration. A present exemplary system has a field of view of ±55 degrees. [0017]
  • A tracker 20 converts radar output data (range, range rate, and azimuth angle) into target speed and x-y coordinates specifying the location of a target. For example, the system of Application '035 discloses a system for tracking multiple targets, and for clustering associated radar data for a single target. The associator 22 relates older track data to that from the latest scan, compiling a track history of each target. [0018]
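As an aside, the polar-to-Cartesian step described above can be sketched in a few lines of Python. This is illustrative only; the function name and the assumption that the azimuth is given in radians, measured from the host's longitudinal axis, are not taken from the patent.

```python
import math

def radar_to_xy(r, azimuth):
    """Convert one radar report (range r, azimuth in radians from the host's
    longitudinal axis) to host-relative x-y coordinates.  A minimal sketch:
    the tracker in the text additionally filters successive reports over time
    and estimates target speed, which is omitted here."""
    x = r * math.cos(azimuth)   # along the host's longitudinal axis
    y = r * math.sin(azimuth)   # lateral offset
    return x, y
```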
  • The track map generator 10—described more fully hereinbelow—stores this data, and generates a track map therefrom as a record of the progress of target motions relative to the host vehicle. [0019]
  • A situation awareness processor 24 uses 1) the track map, 2) data indicating the motion of the host vehicle, i.e. host vehicle information 26, and possibly 3) “environmental data”, to determine the most likely or appropriate driving situation from a set of possible driving situations. For example, the “environmental data” can include GPS data, digital maps, real-time radio inputs of highway geometry and nearby vehicles, and data from real-time transponders such as electromagnetic or optical markers built into highways. [0020]
  • For example, the “environmental data” can be used by a road curvature estimator 28 to provide an estimate of road curvature to the situation awareness processor 24. [0021]
  • The situation awareness processor 24 stores and interprets the track map from the track map generator 10, and compares the progress over time of several target tracks. Evaluation of the relative positions and progress of the tracked targets permits identification of various driving situations, for example a location situation, a traffic situation, a driving maneuver situation, or the occurrence of sudden events. [0022]
  • The situation estimated by the situation awareness processor 24, together with collision and crash severity estimates from a collision estimator 30 and a crash severity estimator 32 respectively, are used as inputs to a response generator 34 to select an appropriate countermeasure 36, for example using a decision matrix. The decision of a particular response by the response generator 34 may be based on, for example, a rule-based system (an expert system), a neural network, or another decision means. [0023]
  • Examples of countermeasures 36 that can be activated include a warning device to warn the driver to take corrective action, for example a 3D audio warning (for example, as disclosed in U.S. Pat. No. 5,979,586, assigned to the assignee of the instant invention and incorporated by reference herein); various means for taking evasive action to avoid a collision, for example the engine throttle, the vehicle transmission, the vehicle braking system, or the vehicle steering system; and various means for mitigating injury to an occupant if a collision is unavoidable, for example a motorized safety belt pretensioner, or internal or external airbags. The particular one or more countermeasures 36 selected, and the manner by which that one or more countermeasures 36 are activated, actuated, or controlled, depends upon the situation identified by the situation awareness processor 24, and upon the collision and crash severity estimates. By way of example, encroachment into the host's lane of travel may require a different response depending upon whether the target is coming from the opposite direction or going the same way as the host vehicle but cutting into the host's lane. By considering the traffic situation giving rise to the threat, the countermeasures 36 can be better adapted to mitigating that threat. By using a radar system, or generally a predictive collision sensing system, to sense targets within range of the host vehicle, the countermeasures 36 may be implemented prior to an actual collision so as to either avoid the collision or to mitigate occupant injury from the collision. [0024]
  • The track map generator 10 operates on the assumption that objects in the field of view of a tracking sensor are subject to evolving but common path constraints, so that, although they have independent controls, their associated trajectories are correlated with one another. This assumption is generally true for ground target trajectory estimation, where the targets are most likely motor vehicles driving on roads that impose strong path restrictions. In such cases, using the constraints improves both target kinematic state tracking performance and target status estimation performance. Here, target status refers to decisions such as the relative position of a target on a road and whether a target is making an abnormal maneuver. [0025]
  • Generally, a road map that can be updated in real time to provide an instant road condition report is not available, nor can sufficient computation resources be allocated to generate such a road map in real time. This limitation is overcome by building a dynamic map from previous tracks and using it as a map. A map constructed in this way provides no information about the first object passing by the sensor; however, as more and more targets are encountered, the accuracy of the map improves, so that it can provide more significant information. This map is referred to as a track map 38 because it is built by accumulating previous target tracks. [0026]
  • A track map 38 can be used in a number of ways. For example, by accumulating information in previously existing tracks 40, the convergence speed of a new track can be improved and tracking error at the early stage of a track can be reduced. As another example, as track information is combined into a map, vital information about the current and future traffic situation can also be deduced. [0027]
  • A particular track 40 follows a path 42, and can be characterized by the content and quality of the associated information. The information content is the proposition that at location L a target has states S, and the information quality is the strength with which that proposition is believed to be true. The information content can have different forms, including but not limited to target speed, heading, acceleration, and type. The associated information quality can be evaluated under various reasoning frameworks, such as probability, fuzzy logic, evidential reasoning, or random sets. [0028]
  • For example, the track map 38 comprises a Cartesian grid of individual cells located by coordinates i and j; each cell 44—denoted grid(i,j)—has associated state and quality information, grid(i,j).states and grid(i,j).quality. The resolution and size of the track map 38 depend upon the computing bandwidth of the processor used to generate the track map. [0029]
  • A path 42 is then represented as a collection of grid cells: {grid(i,j)}. Each time a track is obtained, its associated path 42 can also be deduced. A newly recognized path 42 can then be registered into the track map 38, and the associated information content and quality of the map are also updated. This procedure is called map building. Once a map has been built up, it can be used to improve tracks 40 obtained thereafter. This procedure is called track-map fusion. [0030]
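For concreteness, the grid just described can be sketched with a small data structure. The Python below is illustrative only: the names Cell and TrackMap, the sparse dictionary storage, and the reduction of grid(i,j).states to a single heading value are assumptions made for exposition, not the patent's implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    heading: float = 0.0   # grid(i,j).states, reduced here to a single heading
    quality: float = 0.0   # grid(i,j).quality, the belief in that heading

@dataclass
class TrackMap:
    dx: float              # cell size in x (Delta_x)
    dy: float              # cell size in y (Delta_y)
    x0: float = 0.0        # centre of grid(0,0)
    y0: float = 0.0
    cells: dict = field(default_factory=dict)   # sparse {(i, j): Cell}

    def cell(self, x, y):
        """Return the cell containing point (x, y), creating it on first access."""
        key = (round((x - self.x0) / self.dx), round((y - self.y0) / self.dy))
        return self.cells.setdefault(key, Cell())
```

A dense two-dimensional array sized to the sensor's field of view would serve equally well; the sparse dictionary simply keeps the sketch short.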
  • Referring to FIG. 4, illustrating a map building process, in step (402) the track data is read from the tracker 20/associator 22. In step (404), the backprojected path of the track is determined, which is then related to the map grid of the track map 38 in step (406). Then in step (408) the map grid is updated, after which the process is repeated in step (402). [0031]
  • Referring to FIG. 5, illustrating the map building and fusion processes in greater detail, in step (502) the track data comprising position (x, y), velocity (Vx, Vy), target angle ang_h, and data quality S_t is read from the tracker 20/associator 22. If, in step (504), the track is new, then in step (506) the track fit data is initialized; for example, the accumulators of the summation processes associated with the fitting process are initialized to zero before new track data is accumulated thereinto. Otherwise, in step (508), the new track data is accumulated in the associated accumulators used in the fitting process. Then in step (510), which corresponds to step (404) of FIG. 4, the track history data is processed, for example by a smoothing, regression, or other curve fitting process, to generate a representation of the path of the track. [0032]
  • Referring in greater detail to step (404) of FIG. 4 and step (510) of FIG. 5, generally the backprojected path 42 of a track 40 comprises the smoothed trajectory of an existing track 40. As the tracks are obtained from a stochastic calculation, at a specific time an existing track 40 represents the estimated target states based on the information up to that moment. When more information about the target arrives, a better estimate can be derived about the earlier moments. The backprojected path 42 is the improved history of target states. To obtain a backprojected path 42, one may use a Kalman smoother, autoregression, or a line or curve fitting process. The particular approach depends on a trade-off among the nature of the target model, performance, and the available computation bandwidth. [0033]
  • For example, autoregression would be appropriate if the backprojected path 42 is subject to strong trajectory restriction and has a smooth trajectory. A typical curve function in parametric form is: [0034]
  • $$y = f(x) = a_n x^n + a_{n-1} x^{n-1} + \ldots + a_1 x + a_0$$
  • where {a_i, i = 0, 1, ..., n} are the results of the regression, and jointly define a curve. [0035]
  • As another example, the backprojected path may be found by one-dimensional line fitting of the associated track history, as follows: [0036]
  • $$S_x(k) = \sum_{j=1}^{k} x_j\, f_{fit}^{\,j-k} = S_x(k-1)\, f_{fit} + x_k$$
  • $$S_y(k) = \sum_{j=1}^{k} y_j\, f_{fit}^{\,j-k} = S_y(k-1)\, f_{fit} + y_k$$
  • $$S_{xy}(k) = \sum_{j=1}^{k} x_j y_j\, f_{fit}^{\,j-k} = S_{xy}(k-1)\, f_{fit} + x_k y_k$$
  • $$S_{x2}(k) = \sum_{j=1}^{k} x_j^2\, f_{fit}^{\,j-k} = S_{x2}(k-1)\, f_{fit} + x_k^2$$
  • $$S_w(k) = \sum_{j=1}^{k} f_{fit}^{\,j-k} = S_w(k-1)\, f_{fit} + 1$$
  • $$a = (S_x S_y - S_w S_{xy}) / (S_x^2 - S_w S_{x2})$$
  • $$b = (S_x S_{xy} - S_{x2} S_y) / (S_x^2 - S_w S_{x2})$$
  • $$d_x = x_1 - x_k, \qquad d_y = y_1 - y_k$$
  • where a and b are the fitting results, which jointly define a straight line pattern: [0037]
  • $$y = a x + b$$
  • and dx and dy define the distance from the starting point to the current point. Note also that a fading factor is applied to gradually decrease the contribution of the early part of a track to the current fitting result, for example: [0038]
  • $$f_{fit} = 0.98$$
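The recursive sums above lend themselves to a small incremental implementation. The sketch below is illustrative and assumes track positions arrive one scan at a time; the class name FadedLineFit and the explicit guard on the denominator (reflecting the well-definedness condition given later in the text) are choices made here, not the patent's code.

```python
class FadedLineFit:
    """Recursive least-squares line fit y = a*x + b with a fading factor,
    maintaining the running sums S_x, S_y, S_xy, S_x2, S_w from the text."""

    def __init__(self, f_fit=0.98):
        self.f = f_fit
        self.Sx = self.Sy = self.Sxy = self.Sx2 = self.Sw = 0.0
        self.start = None   # first (x, y) of the track
        self.last = None    # most recent (x, y)

    def add(self, x, y):
        # Each new point fades the previous sums by f_fit before accumulating.
        self.Sx = self.Sx * self.f + x
        self.Sy = self.Sy * self.f + y
        self.Sxy = self.Sxy * self.f + x * y
        self.Sx2 = self.Sx2 * self.f + x * x
        self.Sw = self.Sw * self.f + 1.0
        if self.start is None:
            self.start = (x, y)
        self.last = (x, y)

    def fit(self):
        """Return (a, b, dx, dy), or None while the fit is not well defined."""
        denom = self.Sx ** 2 - self.Sw * self.Sx2
        if abs(denom) < 1e-12:            # |S_x^2 - S_w*S_x2| > 0 required
            return None
        a = (self.Sx * self.Sy - self.Sw * self.Sxy) / denom
        b = (self.Sx * self.Sxy - self.Sx2 * self.Sy) / denom
        dx = self.start[0] - self.last[0]
        dy = self.start[1] - self.last[1]
        return a, b, dx, dy
```

Calling add() once per scan and fit() whenever the path is needed keeps the line parameters current without storing the whole history.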
  • Given a backprojected target trajectory y = f(x), x ∈ [x_1, x_N], from either step (404) or step (510), where x_1 ≤ x_N, in steps (406) and (512) the path 42 of the backprojected target trajectory is associated with the particular cells 44 of the track map 38. For the size of each cell 44 given as Δ_x and Δ_y for the x and y dimensions respectively, and for the center of grid(0,0) located at (x_g0, y_g0), the starting and ending cell locations are given by: [0039]
  • $$x_{g1} = \mathrm{mod}(x_1 - x_{g0}, \Delta_x)\,\Delta_x + x_{g0} \quad \text{and} \quad y_{g1} = \mathrm{mod}(y_1 - y_{g0}, \Delta_y)\,\Delta_y + y_{g0}$$
  • $$x_{gN} = \mathrm{mod}(x_N - x_{g0}, \Delta_x)\,\Delta_x + x_{g0} \quad \text{and} \quad y_{gN} = \mathrm{mod}(y_N - y_{g0}, \Delta_y)\,\Delta_y + y_{g0}$$
  • If |df(x)/dx| ≤ 1, cell locations between the starting and ending cells 44 are found by [0040]
  • $$x_i = x_1 + i\,\Delta_x, \qquad y_i = f(x_i)$$
  • and [0041]
  • $$x_{gi} = \mathrm{mod}(x_i - x_{g0}, \Delta_x)\,\Delta_x + x_{g0} \quad \text{and} \quad y_{gi} = \mathrm{mod}(y_i - y_{g0}, \Delta_y)\,\Delta_y + y_{g0}$$
  • otherwise, [0042]
  • $$y_i = y_1 + i\,\Delta_y, \qquad x_i = f^{-1}(y_i)$$
  • and [0043]
  • $$x_{gi} = \mathrm{mod}(x_i - x_{g0}, \Delta_x)\,\Delta_x + x_{g0} \quad \text{and} \quad y_{gi} = \mathrm{mod}(y_i - y_{g0}, \Delta_y)\,\Delta_y + y_{g0}$$
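To make the cell-association step concrete, here is a simplified sketch that always steps along x in increments of the cell width and snaps each sample to the nearest cell index; the text instead switches to stepping along y (via f^-1) whenever |df/dx| > 1 so that steep paths do not skip cells. The function name, the rounding-based snap, and the deduplication are assumptions made for illustration.

```python
def path_to_cells(f, x1, xN, dx, dy, x_g0=0.0, y_g0=0.0):
    """Return the grid indices (i, j) crossed by a backprojected path
    y = f(x) on [x1, xN], sampling the curve once per cell width."""
    n = max(1, int(round(abs(xN - x1) / dx)))
    step = (xN - x1) / n
    cells = []
    for k in range(n + 1):
        xi = x1 + k * step
        yi = f(xi)
        ij = (int(round((xi - x_g0) / dx)),   # snap to the nearest cell centre
              int(round((yi - y_g0) / dy)))
        if ij not in cells:
            cells.append(ij)
    return cells

# Example: a gently curving path rasterised onto 2 m x 2 m cells.
print(path_to_cells(lambda x: 0.02 * x * x, 0.0, 30.0, dx=2.0, dy=2.0))
```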
  • The track map 38 is updated by first calculating the associated quality measures in steps (514), (516) and (518) as follows: [0044]
  • The fitting quality measure of step (518) contains two parts: one for the contradiction between the curve fitting result from step (510) and the output of the tracker 20/associator 22 from step (502), and one for the difference between a fitted curve (or path 42) and a track 40, i.e., [0045]
  • $$M_{fit} = M_{cont}\, M_{diff}$$
  • M_diff is defined by the distance between the newly found curve and the existing track: [0046]
  • $$M_{diff} = f\big(\{\|\hat{X}_k - X_k\|,\ k = 1, 2, \ldots, N\}\big)$$
  • where q_trk is the quality of the current track, ‖·‖ is the distance between a point on a track at time k, X_k, and the corresponding backprojected point X̂_k, and N is the number of points in the track. The simplest form of the distance is a Euclidean distance: [0047]
  • $$\|\hat{X}_k - X_k\| = \Big(\sum_{i=0}^{n} a_i x_k^i - y_k\Big)^2$$
  • Depending upon the particular situation, the function f({‖X̂_k − X_k‖, k = 1, 2, ..., N}) can be defined as any monotonic function with output between 0 and 1, as long as it consistently conveys the quality of the fitting and has a set of well-defined operations. [0048]
  • M_cont denotes the contradiction component of quality, corresponding to the discrepancy between the fitting result and the output of the tracker 20/associator 22 from step (502). In each cell 44, the contradiction is reflected by the difference between the target heading angle given by the tracker 20/associator 22 and the target heading angle defined by the tangent of the curve at the center of the grid, i.e., [0049]
  • $$f(ang\_f, ang\_h) = \begin{cases} |ang\_f - ang\_h|, & \text{if } |ang\_f - ang\_h| < \pi \\ 2\pi - |ang\_f - ang\_h|, & \text{if } |ang\_f - ang\_h| \ge \pi \end{cases}$$
  • where [0050]
  • $$M_{cont} = f(ang\_f, ang\_h)$$
  • As in the case of M_diff, f(ang_f, ang_h) can be defined as any monotonic function between 0 and 1, as long as the difference between the two angles is reflected. [0051]
  • Finally, before the newly obtained information can be used for updating each grid, one must check the quality of the curve fitting. To this end, each fitting has to satisfy the following: a) the track should be long enough to guarantee a meaningful fitting; b) the fitting should be well-defined for numerical stability; and c) M_diff itself should be small enough to guarantee a good fitting quality. [0052]
  • As a particular example, the contradiction component of quality can be given by: [0053]
  • $$M_{cont} = S_t\, e^{-2 f(ang\_f,\, ang\_h)}$$
  • where [0054]
  • $$ang\_h = \operatorname{atan2}(v_y, v_x)$$
  • $$ang\_f = \begin{cases} \arctan(f'(x_g)), & \text{if } |\arctan(f'(x_g)) - ang\_h| < \tfrac{\pi}{2} \\ \arctan(f'(x_g)) + \pi, & \text{if } |\arctan(f'(x_g)) - ang\_h| \ge \tfrac{\pi}{2} \end{cases}$$
  • in which f'(x_g) denotes df(x)/dx evaluated at the center of the grid cell, and S_t is a quality measure from the tracker 20/associator 22 of the current track; ang_f and ang_h are the estimates of the target heading angle obtained from the line fitting and from the tracker 20/associator 22, respectively. For the line fit y = ax + b, ang_f and ang_h are defined as: [0055]
  • $$ang\_h = \operatorname{atan2}(v_y, v_x)$$
  • $$ang\_f = \begin{cases} \arctan(a), & \text{if } |\arctan(a) - ang\_h| < \tfrac{\pi}{2} \\ \arctan(a) + \pi, & \text{if } |\arctan(a) - ang\_h| \ge \tfrac{\pi}{2} \end{cases}$$
  • wherein v_x and v_y are the estimated target velocities in the x and y directions, and the angular relationships are illustrated in FIG. 6. [0056]
  • The difference between a fitted line and the original track is: [0057]
  • $$diff = \sum_{i=k-2}^{k} \big[(x_{fi} - x_i)^2 + (y_{fi} - y_i)^2\big], \qquad M_{diff} = e^{-diff/4}$$
  • where (x_fi, y_fi) and (x_i, y_i) are points on the fitted line and on the track, respectively. Here, to save computation bandwidth, the most recent three points are used to measure the difference. [0059]
  • The resulting fitting quality measure is the product of M_cont and M_diff: [0060]
  • $$M_{fit} = M_{cont}\, M_{diff}$$
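Collecting the example formulas, a sketch of the fit-quality computation might look as follows. It assumes the reconstructed form M_cont = S_t·e^(−2·f(ang_f, ang_h)) and the squared-difference form of diff given above, and evaluates M_diff over the most recent three points; the function and argument names are illustrative, not from the patent.

```python
import math

def angle_diff(a, b):
    """Smallest unsigned difference between two headings, per f(ang_f, ang_h)."""
    d = abs(a - b) % (2.0 * math.pi)
    return d if d < math.pi else 2.0 * math.pi - d

def fit_quality(S_t, ang_f, ang_h, fitted_pts, track_pts):
    """M_fit = M_cont * M_diff using the example forms in the text.
    fitted_pts and track_pts are the most recent three (x, y) pairs on the
    fitted line and on the track; S_t is the tracker's quality measure."""
    M_cont = S_t * math.exp(-2.0 * angle_diff(ang_f, ang_h))
    diff = sum((xf - x) ** 2 + (yf - y) ** 2
               for (xf, yf), (x, y) in zip(fitted_pts, track_pts))
    M_diff = math.exp(-diff / 4.0)
    return M_cont * M_diff
```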
  • Prior to updating the cells 44 with new information, the old information therein is aged, or faded, in step (520), so as to reduce the significance of cells 44 for which tracking data is no longer measured, and to eventually clear out old data from the cells 44. For example, this may be accomplished by multiplying the information in each grid by a fading factor f_d as follows: [0061]
  • $$grid(i,j).quality = f_d(i,j) \times grid(i,j).quality$$
  • wherein the fading factor is between zero and one so that the quality of each grid can be degraded to zero if no new information is received therein. [0062]
  • For example, a fading factor may be given by: [0063]
  • $$f_d = 0.14\, e^{-(n_r + 1)/(|v| + 6)} + 0.85$$
  • where n_r is the number of reports and v is the velocity, if available. The numbers such as 0.14, 0.85 and 6 are chosen to tune the rate of decrease. [0064]
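A sketch of this aging step using the example fading factor follows. It applies a single factor to every cell, whereas the text writes f_d(i,j) per grid, so treat the uniform application (and the parameter names n_reports and speed) as simplifying assumptions.

```python
import math

def fade_cells(cells, n_reports, speed):
    """Age stored quality before new information is written in (step 520).
    `cells` is any iterable of objects with a .quality attribute, such as
    the Cell instances sketched earlier."""
    f_d = 0.14 * math.exp(-(n_reports + 1) / (abs(speed) + 6.0)) + 0.85
    for cell in cells:
        cell.quality *= f_d   # quality decays toward zero without new reports
    return f_d
```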
  • The track map 38 is then updated in steps (522) and (524) by a backprojection process that updates the grids from the starting point to the current point. This updating process may be subject to conditions, for example, [0065]
  • a) that the track has existed for more than six scans (wherein track length—the number of times a track has been updated—is used as an additional measure for association quality and filtering quality); [0066]
  • b) that the fitting algorithm is well defined, i.e., [0067]
  • $$|S_x^2 - S_w S_{x2}| > 0$$
  • c) that diff<10 so that the fit is of relatively good quality; or [0068]
  • d) that the track is long enough to establish a meaningful fitting, i.e., [0069]
  • $$\max(|d_x|, |d_y|) > 2$$
  • To update a grid, first the existing information in the grid is compared with the newly obtained information, as follows: [0070]
  • $$p_{cont} = 1 - p_g\, e^{-f_c(ang\_g,\, ang\_f)}$$
  • where p_g is the quality of the current information content in this grid, and [0071]
  • $$f_c(ang\_g, ang\_f) = \begin{cases} |ang\_f - ang\_g|, & \text{if } |ang\_f - ang\_g| \le \pi \\ 2\pi - |ang\_f - ang\_g|, & \text{if } |ang\_f - ang\_g| > \pi \end{cases}$$
  • The information quality is normalized by using: [0072]
  • $$p_{cur} = M_{fit} / (M_{fit} + p_{cont} + 0.05)$$
  • where 0.05 accounts for unknowns and for numerical stability, and p_cur defines the quality of the newly obtained information. With p_g representing the quality of the information content in the original grid, the information in the cells 44 is updated as follows (Bayesian): [0073]
  • (a) Heading: [0074]
  • $$grid.heading = (grid.heading \cdot p_{cur} + ang\_f \cdot p_g) / (p_{cur} + p_g)$$
  • (b) Quality: [0075]
  • $$grid.quality = p_g\, p_{cur} / \big(p_g\, p_{cur} + (1 - p_{cur})(1 - p_g)\big)$$
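A per-cell update sketch combining the comparison and the Bayesian blending above follows. The heading and quality formulas are written as given in the text; the inline wrapped-angle difference stands in for f_c, `cell` is any object with heading and quality attributes (such as the Cell sketched earlier), and the early return for an all-zero denominator is a defensive addition of this sketch.

```python
import math

def update_cell(cell, ang_f, M_fit):
    """Blend new path information into one cell (steps 522/524)."""
    p_g = cell.quality
    d = abs(ang_f - cell.heading) % (2.0 * math.pi)
    f_c = d if d <= math.pi else 2.0 * math.pi - d      # wrapped angle difference
    p_cont = 1.0 - p_g * math.exp(-f_c)                  # contradiction with the cell
    p_cur = M_fit / (M_fit + p_cont + 0.05)              # quality of the new information
    if p_cur + p_g == 0.0:
        return                                           # nothing to blend yet
    cell.heading = (cell.heading * p_cur + ang_f * p_g) / (p_cur + p_g)
    cell.quality = (p_g * p_cur) / (p_g * p_cur + (1.0 - p_cur) * (1.0 - p_g))
```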
  • Forward projection is used to look ahead of the current track to the region where this target has no history and to revive the history of the previous tracks. Basically the same technique used in 3.3 can be applied here. The difference lies in the stop condition. In back projection, well defined starting and ending points exist. However, in forward projection, no clearly defined ending point of the projection exists. Instead, we introduce the following two conditions: [0076]
  • (a) Stop Condition 1: Reach a point with no information, i.e.[0077]
  • $$grid.quality \le \xi, \qquad \xi \ge 0$$
  • (b) Stop Condition 2: Reach a grid where the information content is too different from the projection, i.e. [0078]
  • $$p_{cont} \ge T, \qquad T \ge 0$$
  • Here, T serves as a threshold. [0079]
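A forward-projection loop with these two stop conditions can be sketched as below. The cell_at lookup, the max_steps bound, and the example thresholds xi and T are assumptions of this sketch, not values from the text; it reuses the same wrapped-angle contradiction measure as the cell update.

```python
import math

def forward_project(cell_at, f, df, x_start, step, max_steps=200, xi=0.0, T=0.9):
    """Walk the fitted curve y = f(x) ahead of the current track, reviving
    cells until Stop Condition 1 (an empty cell) or Stop Condition 2 (a cell
    whose stored heading contradicts the projection).  cell_at(x, y) returns
    the cell at that point or None outside the map; df is the curve slope."""
    revived = []
    x = x_start
    for _ in range(max_steps):
        x += step
        cell = cell_at(x, f(x))
        if cell is None or cell.quality <= xi:           # Stop Condition 1
            break
        ang_f = math.atan(df(x))                         # heading of the projection
        d = abs(ang_f - cell.heading) % (2.0 * math.pi)
        f_c = d if d <= math.pi else 2.0 * math.pi - d
        p_cont = 1.0 - cell.quality * math.exp(-f_c)
        if p_cont >= T:                                  # Stop Condition 2
            break
        revived.append(cell)
    return revived
```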
  • The operation of the track map generator 10 is illustrated in FIGS. 2 and 3. Referring to FIG. 2, a host vehicle 16 at a first position 16.1 travels over a first trajectory 46 over a first period of time to a second position 16.2. Simultaneously, a first target vehicle 48 at a first position 48.1 travels over a second trajectory 50 over the first period of time to a second position 48.2. The track of the first target vehicle 48 is measured by the radar system 14 and associated elements of the radar processing system 12 at associated sampling times 52. The second trajectory 50 intersects a first set of cells 54 that are updated by the track map generator 10. [0080]
  • Referring to FIG. 3, during a second period of time, the host vehicle 16 travels from the second position 16.2 to a third position 16.3 over the first trajectory 46, and simultaneously, a second target vehicle 58 at a first position 58.1 travels over a third trajectory 60 to a second position 58.2. The track of the second target vehicle 58 is measured by the radar system 14 and associated elements of the radar processing system 12 at associated sampling times 54. The third trajectory 60 intersects a second set of cells 62, some of which are also in the first set of cells 56, that are updated by the track map generator 10. The resulting updated cells each contain an associated direction and quality representing a composite of the paths of the first 48 and second 58 target vehicles, relative to the host vehicle 16. [0081]
  • FIGS. 2 and 3 illustrate track maps 38 with associated cells that are fixed in space, as may be suitable for use in conjunction with absolute navigation data. Referring to FIGS. 7 and 8—which correspond to the situations illustrated in FIGS. 2 and 3—the coordinate system of the track map may also be adapted to move with the host vehicle 16, so as to provide a map that is localized to the host vehicle 16. [0082]
  • While specific embodiments have been described in detail in the foregoing detailed description and illustrated in the accompanying drawings, those with ordinary skill in the art will appreciate that various modifications and alternatives to those details could be developed in light of the overall teachings of the disclosure. Accordingly, the particular arrangements disclosed are meant to be illustrative only and not limiting as to the scope of the invention, which is to be given the full breadth of the appended claims and any and all equivalents thereof. [0083]

Claims (7)

I claim:
1. A method of generating a track map, comprising:
a. reading a first plurality of first data points, wherein each first data point of said first plurality of first data points is representative of a position of a first tracked object;
b. generating a first representation of a path of said first tracked object, from said first plurality of first data points;
c. calculating at least one quality measure of said representation, wherein said at least one quality measure corresponds to at least one map coordinate;
d. calculating at least one heading measure from said representation, wherein said at least one heading measure corresponds to said at least one map coordinate; and
e. storing said at least one quality measure and said at least one heading measure in a memory.
2. A method of generating a track map as recited in claim 1, wherein said first representation of said path is a mathematical representation selected from a linear fit of said first plurality of first data points, a curve fit of said first plurality of first data points, a Kalman smoothing of said first plurality of first data points, and an autoregression of said first plurality of first data points.
3. A method of generating a track map as recited in claim 1, wherein said at least one quality measure is responsive to a measure of fit by said representation of said first plurality of first data points.
4. A method of generating a track map as recited in claim 1, further comprising:
a. reading a first plurality of second data points, wherein each second data point of said first plurality of second data points is representative of a velocity of said first tracked object, and said first plurality of second data points correspond in time to said first plurality of first data points; and
b. calculating a first plurality of headings of said first tracked object from said first plurality of second data points, wherein said at least one quality measure is responsive to a heading error, wherein said heading error is responsive to a difference between at least one heading calculated from said first representation and at least one heading of said first plurality of headings.
5. A method of generating a track map as recited in claim 1, further comprising:
a. reading a first plurality of third data points, wherein each third data point of said first plurality of third data points is representative of a heading of said first tracked object, so that said first plurality of third data points are a first plurality of headings, said first plurality of third data points correspond in time to said first plurality of first data points, wherein said heading error is responsive to a difference between at least one heading calculated from said first representation and at least one heading of said first plurality of headings.
6. A method of generating a track map as recited in claim 1, further comprising modifying at least one stored quality measure responsive to at least one quality measure.
7. A method of generating a track map as recited in claim 1, further comprising modifying at least one stored heading measure responsive to at least one quality measure.
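For orientation only, the following is a minimal, non-limiting Python sketch of the kind of processing recited in claims 1 through 4, assuming a least-squares polynomial curve fit as the path representation (one of the options enumerated in claim 2) and a quality measure built from the fit residual and a heading error derived from velocity data (claims 3 and 4). All names, the cell quantization, and the particular quality formula are assumptions made for this sketch, not the claimed implementation.

```python
import math
import numpy as np

def build_track_map(positions, velocities, cell_size=10.0, order=2):
    """Sketch of claims 1-4: fit a path to tracked-object positions, derive
    heading and quality measures along the fit, and store them per map cell.

    positions  : sequence of (x, y) points for one tracked object      (claim 1a)
    velocities : sequence of (vx, vy) points, time-aligned with positions (claim 4a)
    """
    pts = np.asarray(positions, dtype=float)
    vel = np.asarray(velocities, dtype=float)

    # (1b) Path representation: a least-squares polynomial y = f(x),
    # i.e. a curve fit, one of the representations listed in claim 2.
    poly = np.poly1d(np.polyfit(pts[:, 0], pts[:, 1], order))
    dpoly = poly.deriv()

    track_map = {}
    for (x, y), (vx, vy) in zip(pts, vel):
        # (1d) Heading measure from the representation: slope of the fit
        # (assumes travel in the direction of increasing x).
        heading_fit = math.atan2(dpoly(x), 1.0)

        # (4b) Heading derived from the velocity data points, and the
        # heading error between the two, wrapped into [0, pi].
        heading_vel = math.atan2(vy, vx)
        diff = heading_fit - heading_vel
        heading_err = abs(math.atan2(math.sin(diff), math.cos(diff)))

        # (1c, 3, 4) Quality measure responsive to the fit residual at this
        # point and to the heading error.
        residual = abs(y - poly(x))
        quality = 1.0 / (1.0 + residual + heading_err)

        # (1e) Store the measures keyed by a map coordinate (grid cell).
        cell = (int(x // cell_size), int(y // cell_size))
        track_map[cell] = {"heading": heading_fit, "quality": quality}

    return track_map
```

Claims 6 and 7 further recite modifying stored quality and heading measures responsive to a newly calculated quality measure; in a sketch of this kind that could be a simple quality-weighted blend, for example:

```python
def update_cell(stored, new, gain=0.5):
    """Hypothetical update for claims 6 and 7: modify the stored heading and
    quality measures responsive to a newly calculated quality measure."""
    if stored is None:
        return dict(new)
    w = gain * new["quality"]  # weight the update by the new quality measure
    # Blend headings through sin/cos so the result wraps correctly.
    s = (1.0 - w) * math.sin(stored["heading"]) + w * math.sin(new["heading"])
    c = (1.0 - w) * math.cos(stored["heading"]) + w * math.cos(new["heading"])
    return {"heading": math.atan2(s, c),
            "quality": max(stored["quality"], new["quality"])}
```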
US09/877,493 2000-06-08 2001-06-08 Track map generator Expired - Fee Related US6420997B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/877,493 US6420997B1 (en) 2000-06-08 2001-06-08 Track map generator

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US21019300P 2000-06-08 2000-06-08
US21087800P 2000-06-09 2000-06-09
US09/877,493 US6420997B1 (en) 2000-06-08 2001-06-08 Track map generator

Publications (2)

Publication Number Publication Date
US20020044081A1 true US20020044081A1 (en) 2002-04-18
US6420997B1 US6420997B1 (en) 2002-07-16

Family

ID=26904923

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/877,493 Expired - Fee Related US6420997B1 (en) 2000-06-08 2001-06-08 Track map generator

Country Status (4)

Country Link
US (1) US6420997B1 (en)
EP (1) EP1290467A4 (en)
JP (1) JP2003536096A (en)
WO (1) WO2001094970A1 (en)

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002221570A (en) * 2001-01-26 2002-08-09 Nec Corp Track tracking device and track tracking method
US6944543B2 (en) * 2001-09-21 2005-09-13 Ford Global Technologies Llc Integrated collision prediction and safety systems control for improved vehicle safety
JP3896852B2 (en) * 2002-01-16 2007-03-22 株式会社デンソー Vehicle collision damage reduction device
WO2003059697A1 (en) * 2002-01-16 2003-07-24 Autoliv Development Ab A camera arrangement
US20050086003A1 (en) * 2002-01-17 2005-04-21 Tarabishy M. N. Method for collision avoidance and collision mitigation
US7522091B2 (en) 2002-07-15 2009-04-21 Automotive Systems Laboratory, Inc. Road curvature estimation system
DE10235414A1 (en) 2002-08-02 2004-02-12 Robert Bosch Gmbh Method and device for determining the impending inevitable collision
US6691018B1 (en) * 2002-11-21 2004-02-10 Visteon Global Technologies, Inc. Method and system for identifying a lane change
US7057532B2 (en) * 2003-10-15 2006-06-06 Yossef Shiri Road safety warning system and method
US7081849B2 (en) * 2004-10-28 2006-07-25 Northrop Grumman Corporation Process for sensor resources management
US7167127B2 (en) * 2004-11-08 2007-01-23 Northrop Grumman Corporation Process for tracking vehicles
US7892078B2 (en) * 2005-12-30 2011-02-22 Microsoft Corporation Racing line optimization
ATE422185T1 (en) * 2006-08-24 2009-02-15 Harman Becker Automotive Sys METHOD FOR IMAGING THE ENVIRONMENT OF A VEHICLE AND SYSTEM THEREFOR
EP1898232B1 (en) 2006-09-08 2010-09-01 Ford Global Technologies, LLC Method and system for collision avoidance
US20080065328A1 (en) 2006-09-08 2008-03-13 Andreas Eidehall Method and system for collision avoidance
US7626535B2 (en) * 2006-11-09 2009-12-01 Raytheon Company Track quality based multi-target tracker
US7675458B2 (en) * 2006-11-09 2010-03-09 Raytheon Canada Limited Dual beam radar system
JP5078637B2 (en) * 2008-01-29 2012-11-21 富士通テン株式会社 Radar apparatus and target detection method
EP2291302A4 (en) * 2008-05-06 2013-04-03 Jeff R Crandall System and method for minimizing occupant injury during vehicle crash events
DE112008004159B4 (en) * 2008-12-09 2014-03-13 Toyota Jidosha Kabushiki Kaisha Object detection device and object detection method
WO2010070708A1 (en) * 2008-12-18 2010-06-24 トヨタ自動車株式会社 Radar system
US8831869B2 (en) * 2009-03-31 2014-09-09 GM Global Technology Operations LLC Using V2X-based in-network message generation, aggregation, distribution and processing protocols to enable road hazard condition warning applications
US20110025548A1 (en) * 2009-07-31 2011-02-03 Gm Global Technology Operations, Inc. System and method for vehicle sensor fusion
JP5251800B2 (en) * 2009-09-16 2013-07-31 株式会社豊田中央研究所 Object tracking device and program
JP5291610B2 (en) * 2009-12-21 2013-09-18 日本電信電話株式会社 Azimuth estimation apparatus, method, and program
CN102753939B (en) 2009-12-23 2016-08-03 通腾北美有限公司 Time- and/or accuracy-dependent weights for network generation in a digital map
DE102010050167B4 (en) * 2010-10-30 2012-10-25 Audi Ag Method and device for determining a plausible lane for guiding a vehicle and motor vehicles
KR101170094B1 2010-11-04 2012-07-31 목포대학교산학협력단 System Controlling Collision Evasion for Multiple Traffic Vessels
JP5867177B2 (en) * 2012-03-06 2016-02-24 富士通株式会社 MAP DISPLAY PROGRAM, MAP GENERATION DEVICE, MAP DISPLAY METHOD, AND MAP GENERATION SYSTEM
US9255989B2 (en) * 2012-07-24 2016-02-09 Toyota Motor Engineering & Manufacturing North America, Inc. Tracking on-road vehicles with sensors of different modalities
US8976059B2 (en) 2012-12-21 2015-03-10 Raytheon Canada Limited Identification and removal of a false detection in a radar system
EP2826687B1 (en) 2013-07-16 2019-03-06 Honda Research Institute Europe GmbH Technique for lane assignment in a vehicle
US10598764B2 (en) * 2017-10-30 2020-03-24 Yekutiel Josefsberg Radar target detection and imaging system for autonomous vehicles with ultra-low phase noise frequency synthesizer

Family Cites Families (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2709804A (en) 1945-09-14 1955-05-31 Chance Britton Automatic range and azimuth tracking system
US3177485A (en) 1960-03-02 1965-04-06 Jr John W Taylor Automatic radar target tracking system
US3603994A (en) 1966-04-26 1971-09-07 Hughes Aircraft Co System for automatically generating smoothing parameters in an automatic track-while-scan radar system
US3699573A (en) 1966-05-05 1972-10-17 Hughes Aircraft Co System for automatic initiation of target tracking in track-while-scan radar
US3725918A (en) 1970-11-18 1973-04-03 Sperry Rand Corp Collision avoidance display apparatus for maneuverable craft
GB1430389A (en) 1972-06-21 1976-03-31 Solartron Electronic Group Computing apparatus for tracking moving objects
US3971018A (en) 1974-06-24 1976-07-20 Sperry Rand Corporation Marine traffic conflict assessment system
GB8304686D0 (en) 1983-02-19 1983-03-23 Sperry Ltd Collision avoidance apparatus
IT1240974B (en) * 1990-07-05 1993-12-27 Fiat Ricerche METHOD AND EQUIPMENT TO AVOID THE COLLISION OF A VEHICLE AGAINST OBSTACLES.
US5170440A (en) 1991-01-30 1992-12-08 Nec Research Institute, Inc. Perceptual grouping by multiple hypothesis probabilistic data association
US5638281A (en) * 1991-01-31 1997-06-10 Ail Systems, Inc. Target prediction and collision warning system
US5051751A (en) 1991-02-12 1991-09-24 The United States Of America As Represented By The Secretary Of The Navy Method of Kalman filtering for estimating the position and velocity of a tracked object
US5307289A (en) 1991-09-12 1994-04-26 Sesco Corporation Method and system for relative geometry tracking utilizing multiple distributed emitter/detector local nodes and mutual local node tracking
US5138321A (en) 1991-10-15 1992-08-11 International Business Machines Corporation Method for distributed data association and multi-target tracking
IL100175A (en) * 1991-11-27 1994-11-11 State Of Israel Ministry Of De Collision warning apparatus for a vehicle
US5202691A (en) 1992-04-28 1993-04-13 The United States Of America As Represented By The Secretary Of The Air Force Hick's probabilistic data association method
US5406289A (en) 1993-05-18 1995-04-11 International Business Machines Corporation Method and system for tracking multiple regional objects
US5402129A (en) * 1993-08-04 1995-03-28 Vorad Safety Systems, Inc. Monopulse azimuth radar system for automotive vehicle tracking
US5983161A (en) 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
US6553130B1 (en) 1993-08-11 2003-04-22 Jerome H. Lemelson Motor vehicle warning and control system and method
US5633642A (en) 1993-11-23 1997-05-27 Siemens Aktiengesellschaft Radar method and device for carrying out the method
US5959574A (en) 1993-12-21 1999-09-28 Colorado State University Research Foundation Method and system for tracking multiple regional objects by multi-dimensional relaxation
US5537119A (en) 1993-12-21 1996-07-16 Colorado State University Research Foundation Method and system for tracking multiple regional objects by multi-dimensional relaxation
GB9417170D0 (en) 1994-08-25 1994-10-12 Isis Innovation Non-linear filtering
US5587929A (en) * 1994-09-02 1996-12-24 Caterpillar Inc. System and method for tracking objects using a detection system
US5689264A (en) 1994-10-05 1997-11-18 Mazda Motor Corporation Obstacle detecting system for vehicles
JP3400875B2 (en) * 1994-10-20 2003-04-28 本田技研工業株式会社 Moving object detection device
JP3721594B2 (en) * 1995-03-15 2005-11-30 日産自動車株式会社 Road shape estimation device
US5657251A (en) * 1995-10-02 1997-08-12 Rockwell International Corporation System and process for performing optimal target tracking
US5703593A (en) 1995-12-12 1997-12-30 Northrop Grumman Corporation Adaptive DPCA subsystem
JP3653862B2 (en) * 1996-04-22 2005-06-02 日産自動車株式会社 Vehicle curve diameter estimation device and target preceding vehicle detection device
US5646613A (en) 1996-05-20 1997-07-08 Cho; Myungeun System for minimizing automobile collision damage
JP3229226B2 (en) * 1996-11-07 2001-11-19 ダイハツ工業株式会社 Leading vehicle recognition device and recognition method
US5948043A (en) * 1996-11-08 1999-09-07 Etak, Inc. Navigation system using GPS data
US6085151A (en) * 1998-01-20 2000-07-04 Automotive Systems Laboratory, Inc. Predictive collision sensing system
JPH1137730A (en) * 1997-07-18 1999-02-12 Nissan Motor Co Ltd Road shape estimating apparatus
US6275231B1 (en) 1997-08-01 2001-08-14 American Calcar Inc. Centralized control and management system for automobiles
DE19855400A1 (en) * 1998-12-01 2000-06-15 Bosch Gmbh Robert Method and device for determining a future course range of a vehicle
US6161071A (en) 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030109296A1 (en) * 2001-11-15 2003-06-12 Leach Gary Mark Generation method
US20030135304A1 (en) * 2002-01-11 2003-07-17 Brian Sroub System and method for managing transportation assets
US6721659B2 (en) * 2002-02-01 2004-04-13 Ford Global Technologies, Llc Collision warning and safety countermeasure system
EP1537440A4 (en) * 2002-07-15 2012-03-28 Automotive Systems Lab Road curvature estimation and automotive target state estimation system
EP1537440A2 (en) * 2002-07-15 2005-06-08 Automotive Systems Laboratory Inc. Road curvature estimation and automotive target state estimation system
US6937165B2 (en) 2002-09-23 2005-08-30 Honeywell International, Inc. Virtual rumble strip
US20040121829A1 (en) * 2002-12-23 2004-06-24 Nintendo Software Technology Corporation Method and apparatus for modeling a track in video games using arcs and splines that enables efficient collision detection
US8784171B2 (en) * 2002-12-23 2014-07-22 Nintendo Co., Ltd. Method and apparatus for modeling a track in video games using arcs and splines that enables efficient collision detection
US20060031015A1 (en) * 2004-08-09 2006-02-09 M/A-Com, Inc. Imminent-collision detection system and process
US7409295B2 (en) * 2004-08-09 2008-08-05 M/A-Com, Inc. Imminent-collision detection system and process
US20100017180A1 (en) * 2006-12-05 2010-01-21 Martin Randler Method and device for object tracking in a driver assistance system of a motor vehicle
US8140210B2 (en) * 2006-12-05 2012-03-20 Robert Bosch Gmbh Method and device for object tracking in a driver assistance system of a motor vehicle
US20120116663A1 (en) * 2008-06-05 2012-05-10 Toyota Jidosha Kabushiki Kaisha Obstacle detection device and obstacle detection system
US9297658B2 (en) 2012-06-12 2016-03-29 Trx Systems, Inc. Wi-Fi enhanced tracking algorithms
US20150354965A1 (en) * 2012-06-12 2015-12-10 Trx Systems, Inc. Irregular Feature Mapping
US9441973B2 (en) * 2012-06-12 2016-09-13 Trx Systems, Inc. Irregular feature mapping
US9746327B2 (en) 2012-06-12 2017-08-29 Trx Systems, Inc. Fusion of sensor and map data using constraint based optimization
US9778044B2 (en) 2012-06-12 2017-10-03 Trx Systems, Inc. Irregular feature mapping
US10571270B2 (en) 2012-06-12 2020-02-25 Trx Systems, Inc. Fusion of sensor and map data using constraint based optimization
US10203409B2 (en) * 2014-11-17 2019-02-12 Volkswagen Aktiengesellschaft Method and device for the localization of a vehicle from a fixed reference map
EP3364211A1 (en) * 2017-02-21 2018-08-22 Continental Automotive GmbH Method and device for detecting a possible collision, and vehicle
US20210256321A1 (en) * 2020-02-14 2021-08-19 Ford Global Technologies, Llc Enhanced object detection with clustering
US11586862B2 (en) * 2020-02-14 2023-02-21 Ford Global Technologies, Llc Enhanced object detection with clustering

Also Published As

Publication number Publication date
EP1290467A4 (en) 2006-02-15
EP1290467A1 (en) 2003-03-12
JP2003536096A (en) 2003-12-02
WO2001094970A1 (en) 2001-12-13
US6420997B1 (en) 2002-07-16

Similar Documents

Publication Publication Date Title
US6420997B1 (en) Track map generator
US6470272B2 (en) Situation awareness processor
US11155258B2 (en) System and method for radar cross traffic tracking and maneuver risk estimation
US10073456B2 (en) Automated co-pilot control for autonomous vehicles
US7034742B2 (en) Road curvature estimation and automotive target state estimation system
US5926126A (en) Method and system for detecting an in-path target obstacle in front of a vehicle
EP3879455A1 (en) Multi-sensor data fusion method and device
US8558679B2 (en) Method of analyzing the surroundings of a vehicle
EP1540564B1 (en) Collision avoidance and warning system, method for preventing collisions
US11156717B2 (en) Method and apparatus crosstalk and multipath noise reduction in a LIDAR system
US10416679B2 (en) Method and apparatus for object surface estimation using reflections delay spread
EP1273930B1 (en) A method for collision avoidance and collision mitigation
US6643588B1 (en) Geometric based path prediction method using moving and stop objects
US20220032955A1 (en) Vehicle control device and vehicle control method
US11433897B2 (en) Method and apparatus for determination of optimal cruising lane in an assisted driving system
US10928507B2 (en) Apparatus and method for improved radar beamforming
US11042160B2 (en) Autonomous driving trajectory determination device
CN112498347A (en) Method and apparatus for real-time lateral control and steering actuation evaluation
CN109835338B (en) Turning control method and device and automatic driving vehicle
CN113581181B (en) Intelligent vehicle overtaking track planning method
US20220105927A1 (en) Vehicle traveling control apparatus
US20050086003A1 (en) Method for collision avoidance and collision mitigation
CN111144432B (en) Method for eliminating fuzzy detection in sensor fusion system
CN117055019A (en) Vehicle speed calculation method based on vehicle-mounted radar and corresponding device and module
CN111144432A (en) Method for eliminating fuzzy detection in sensor fusion system

Legal Events

Date Code Title Description
AS Assignment

Owner name: AUTOMOTIVE SYSTEMS LABORATORY, INC., MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CONG, SHAN;REEL/FRAME:012493/0543

Effective date: 20010717

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

CC Certificate of correction
FPAY Fee payment

Year of fee payment: 4

FPAY Fee payment

Year of fee payment: 8

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20140716