01/30 2026
In the swift progression of self-driving technology, a vehicle's capacity to accurately adhere to a predetermined path stands as a crucial indicator of system maturity. Nevertheless, whether it involves experimental models or self-driving taxis already navigating urban roads, occasional abnormal driving behavior—namely, trajectory drift—can arise. This phenomenon typically presents itself as subtle, snake-like swerves during straight-line driving, an inability to maintain proximity to the centerline in curves, or even significant lateral displacement under specific conditions. What, then, causes trajectory drift in self-driving cars?
Inherent Sensor Characteristics and Cumulative Errors in Dead Reckoning
Self-driving cars predominantly rely on a fused positioning system that integrates Global Navigation Satellite Systems (GNSS) and Inertial Measurement Units (IMUs) to ascertain their location. Ideally, this system can deliver centimeter-level accuracy, yet errors are inevitable in the complex physical environment. Satellite signals experience considerable accuracy fluctuations when traversing the atmosphere, reflecting off tall buildings, or being entirely blocked in tunnels. When satellite signals are disrupted by environmental obstructions, the vehicle must resort to dead reckoning mode, which primarily depends on gyroscopes and accelerometers within the IMU to estimate positional changes.
However, inertial sensors inherently possess insurmountable physical limitations. Microelectromechanical system (MEMS)-based IMUs inevitably generate random noise and zero-point offsets in their output data. Mathematically, a vehicle's position is derived through double time integration of acceleration. This implies that even a minuscule, constant zero-point bias in the sensor will rapidly amplify over time, growing quadratically. This phenomenon is vividly recognized in the industry as 'temperature drift' or 'zero drift.' For low- to mid-cost sensors, in the absence of external reference signals (e.g., satellite signals), longitudinal drift can accumulate to several meters after traveling just a few hundred meters.
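The quadratic growth described above can be checked with a few lines of simulation. This is a minimal sketch under an idealized constant-bias assumption; the 0.01 m/s² bias (roughly 1 mg) is an illustrative figure, not the spec of any particular IMU.

```python
# Minimal sketch: a constant accelerometer bias integrated twice
# produces a position error that grows quadratically with time.
# The bias value is illustrative, not from any particular sensor.

def dead_reckoning_error(bias_mps2: float, dt: float, duration_s: float) -> float:
    """Euler-integrate a constant acceleration bias twice."""
    velocity_err = 0.0
    position_err = 0.0
    for _ in range(int(round(duration_s / dt))):
        velocity_err += bias_mps2 * dt      # first integration: velocity error
        position_err += velocity_err * dt   # second integration: position error
    return position_err

# 0.01 m/s^2 bias (~1 mg) after 30 s of pure dead reckoning:
err = dead_reckoning_error(0.01, 0.01, 30.0)   # closed form: 0.5*b*t^2 = 4.5 m
```

Doubling the outage time roughly quadruples the error, which is exactly why even short GNSS dropouts matter.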
To counteract this cumulative error, self-driving systems incorporate high-definition maps and sensor feature matching. The vehicle uses LiDAR or cameras to scan roadside features such as utility poles, traffic signs, and lane markings, and compares them with the precise coordinates pre-stored in the high-definition map to correct the accumulated dead-reckoning error. However, this approach falters in environments with very few distinctive features, such as open desert roads or long tunnels with smooth walls, where the matching algorithm has too little to lock onto and becomes ineffective, allowing dead-reckoning errors to build up again. Furthermore, if the high-definition map itself contains surveying inaccuracies, or if on-site conditions differ from the map because of road construction, the matching process can introduce new interference and exacerbate trajectory drift.
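A one-dimensional toy model makes the correction loop concrete: odometry drifts at a fixed rate, and an occasional landmark fix blends the estimate back toward truth. This is a hypothetical sketch, not any production algorithm, and all numbers are illustrative.

```python
# Hypothetical 1-D sketch of map-matching correction: dead reckoning
# drifts steadily, and a periodic landmark observation pulls the
# estimate back toward the true position.

def worst_position_error(true_speed, drift_per_step, landmark_every, steps, gain=0.8):
    true_pos, est_pos, worst = 0.0, 0.0, 0.0
    for k in range(1, steps + 1):
        true_pos += true_speed
        est_pos += true_speed + drift_per_step     # odometry with a small bias
        if k % landmark_every == 0:                # landmark seen: blend in the fix
            est_pos += gain * (true_pos - est_pos)
        worst = max(worst, abs(est_pos - true_pos))
    return worst

# With corrections the error stays bounded; without them it grows linearly:
bounded = worst_position_error(1.0, 0.05, landmark_every=20, steps=1000)
unbounded = worst_position_error(1.0, 0.05, landmark_every=10**9, steps=1000)
```

The bounded case also shows why feature-poor stretches hurt: the longer the gap between landmark fixes, the larger the drift that accumulates in between.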
Sensor Synchronization Errors and Disturbances from Motion Distortion
Self-driving vehicles are outfitted with multiple sensors operating at different frequencies and based on diverse principles. Cameras may capture images at 30 frames per second, while a mechanical rotating LiDAR completes ten full sweeps per second. If the information from these sensors is not temporally aligned, the system effectively fuses observations taken at different moments as if they were simultaneous. For instance, at 60 km/h (about 16.7 m/s), a mere 10-millisecond timing error corresponds to a positional discrepancy of nearly 17 centimeters. If the perception module fuses outdated LiDAR point-cloud data with current camera images, the system misjudges obstacle positions, leading to a deviated trajectory plan.
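The arithmetic behind that figure is simply speed multiplied by the time offset; a small helper (the function name is ours) makes it easy to check other combinations:

```python
# Positional discrepancy caused by a timestamp offset: speed x time error.

def sync_error_m(speed_kmh: float, offset_ms: float) -> float:
    return (speed_kmh / 3.6) * (offset_ms / 1000.0)

err = sync_error_m(60.0, 10.0)   # ~0.167 m at 60 km/h with a 10 ms offset
```

The error scales linearly with speed, which is why highway driving tightens the synchronization budget.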
To ensure all sensors operate on the same 'clock pulse,' specialized time synchronization protocols are implemented. A common approach utilizes satellite-provided pulse-per-second signals as a reference, with dedicated synchronization boards controlling sensor acquisition timing within microsecond-level tolerances across the vehicle.
Even after resolving inter-sensor timing alignment, motion distortion within individual sensors remains a formidable challenge. Take mechanical rotating LiDAR as an example: generating a complete point cloud frame takes approximately 100 milliseconds. During this brief interval, the vehicle is not stationary but continuously moving and rotating. This means the first and last laser beams in the point cloud scan objects while the vehicle is in different poses.
Without distortion correction, the vehicle's perception of the world becomes distorted—straight lane markings may appear curved, and roadside lampposts may seem tilted. This perception distortion caused by the vehicle's own motion directly interferes with the positioning algorithm's estimation of the vehicle's true pose. Currently, the mainstream solution involves compensating each laser beam's coordinates using high-frequency motion data from the IMU, restoring the distorted point cloud to the true physical coordinate system. However, the compensation algorithm relies on the accuracy of the motion model. If the vehicle experiences severe jolts on rough roads, residual motion distortion can still induce minor oscillations in trajectory prediction.
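A stripped-down 2-D version of this compensation might look as follows. It assumes a constant velocity and yaw rate over the sweep, which real pipelines replace with interpolated full 6-DoF poses from the IMU; the interface here is our own invention for illustration.

```python
# Toy 2-D point-cloud de-skew: each point is re-expressed in the vehicle
# frame at scan start, assuming constant velocity and yaw rate over the sweep.

import math

def deskew(points, v_mps, yaw_rate_rps):
    """points: (x, y, t_offset_s) tuples in the sensor frame at capture time.
    Returns coordinates re-expressed in the frame at scan start."""
    corrected = []
    for x, y, dt in points:
        yaw = yaw_rate_rps * dt                   # vehicle rotation since scan start
        tx = v_mps * dt * math.cos(yaw / 2.0)     # chord of the driven arc
        ty = v_mps * dt * math.sin(yaw / 2.0)
        # Rigid transform: rotate the point by the vehicle's yaw,
        # then translate by how far the vehicle has moved.
        cx = tx + x * math.cos(yaw) - y * math.sin(yaw)
        cy = ty + x * math.sin(yaw) + y * math.cos(yaw)
        corrected.append((cx, cy))
    return corrected

# Straight driving at 10 m/s: a point seen 100 ms into the sweep at x = 5 m
# ahead of the sensor was actually 6 m ahead of where the scan started.
pts = deskew([(5.0, 0.0, 0.1)], v_mps=10.0, yaw_rate_rps=0.0)
```

The residual-distortion problem mentioned above shows up here directly: if the true motion during the sweep deviates from the constant-twist assumption (a pothole, a sharp jolt), the correction is only approximate.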
Tire Mechanics and Environmental Uncertainties
Intuitively, many perceive a self-driving vehicle's trajectory as a geometric curve calculated by algorithms. However, a car is not a mass point that can slide freely on ice; it is subject to complex physical constraints, particularly the interaction between tires and the road surface. Trajectory drift often stems from conflicts between a vehicle's physical limits and simplified algorithmic models.
During high-speed cornering, a tire's actual rolling direction does not fully align with the wheel's orientation due to rubber elastic deformation, creating a discrepancy known as the slip angle. When the slip angle exceeds a certain threshold (typically between 5 and 15 degrees), the vehicle enters a skidding state. While self-driving systems aim to avoid extreme drifting, slip angles are ubiquitous. If the control algorithm relies solely on a simple geometric kinematic model—assuming the vehicle goes where the wheels point while ignoring tire forces—the vehicle will 'push out' toward the outside of a curve during cornering due to insufficient centripetal force. This error, caused by neglecting dynamic constraints, becomes particularly pronounced at high speeds and on low-friction surfaces (e.g., rainy or snowy conditions).
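The gap between the two model families can be seen in a steady-state yaw-rate comparison. The wheelbase and understeer gradient below are illustrative passenger-car values, not data for any specific vehicle.

```python
# Steady-state yaw rate: pure kinematic bicycle model vs. a linear
# single-track model with an understeer gradient K (illustrative values).

import math

L = 2.8       # wheelbase [m]
K = 0.0025    # understeer gradient [s^2/m], illustrative

def yaw_rate_kinematic(v, delta):
    """Kinematic model: assumes wheels roll exactly where they point."""
    return v * math.tan(delta) / L

def yaw_rate_dynamic(v, delta):
    """Linear dynamic model: tire slip reduces the achieved yaw rate."""
    return v * delta / (L + K * v * v)

v, delta = 25.0, 0.05                 # 90 km/h, ~2.9 deg road-wheel angle
r_kin = yaw_rate_kinematic(v, delta)
r_dyn = yaw_rate_dynamic(v, delta)
# r_dyn < r_kin: ignoring tire forces over-predicts the turn, so the real
# vehicle runs wide ("pushes out") relative to the planned path.
```

At parking-lot speeds the two models nearly agree, which is why purely kinematic controllers work fine at low speed and degrade as speed rises.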
Additionally, environmental fluctuations complicate trajectory tracking. Road friction coefficients, vehicle load distribution, and even minor tire pressure differences all alter steering characteristics. A fully loaded self-driving car has significantly greater rotational inertia during cornering than when empty, resulting in slower steering responses. If the control algorithm cannot perceive these parameter changes in real time and adjust steering force accordingly, the vehicle cannot precisely follow the planned trajectory.
Self-driving algorithms attempt to incorporate these physical constraints into optimization frameworks using techniques like model predictive control (MPC), simulating future motion trends in each computational cycle to anticipate and counteract slip angle effects. However, this proactive calculation relies on extremely high model accuracy; any minor model mismatch translates into subtle trajectory drift during execution.
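A deliberately tiny MPC loop illustrates the receding-horizon idea: simulate each candidate command forward, score the predicted trajectory, and pick the best. The lateral-error model and all constants here are toy placeholders, not a real vehicle model.

```python
# Toy MPC: brute-force search over candidate steering rates, each rolled
# forward over a short horizon with a simplified lateral-error model.

def predicted_cost(y, heading, steer, v, dt, horizon):
    cost = 0.0
    for _ in range(horizon):
        heading += steer * dt          # steering rate changes heading error
        y += v * heading * dt          # heading error changes lateral offset
        cost += y * y + 0.1 * steer * steer   # penalize offset and effort
    return cost

def mpc_step(y, heading, v=15.0, dt=0.05, horizon=20):
    candidates = [c * 0.02 for c in range(-10, 11)]
    return min(candidates,
               key=lambda s: predicted_cost(y, heading, s, v, dt, horizon))

# Vehicle 0.5 m left of the path: the controller steers back toward it.
steer = mpc_step(y=0.5, heading=0.0)
```

Only the first command of the chosen sequence is executed; the optimization repeats in the next cycle, which is what lets MPC absorb small model mismatches instead of letting them accumulate, though any persistent mismatch still leaves a residual drift.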
Actuator Delays and Oscillatory Effects from Communication Latency
The final underlying cause of trajectory drift lies in the time lag between electronic commands and mechanical actions. Correcting trajectory deviations in a self-driving system involves a sequence of operations: sensor data acquisition, algorithm recognition, path planning, control command generation, and transmission via a bus to steering actuators.
Physical hysteresis in actuators is the primary source of this delay. Motors in steering systems need time to overcome friction and build torque, while hydraulic mechanisms need time to establish pressure. These mechanical lags mean the vehicle acts on sensor data that is already tens or even hundreds of milliseconds old. Known in control theory as dead time, or transport delay, this lag creates a vicious cycle if the control system fails to anticipate it: by the time a left-steering command issued in response to a rightward deviation actually turns the wheels, the vehicle has deviated further right. The system then corrects more aggressively with a larger steering angle, swinging the vehicle left across the centerline. This cycle of overcorrection and undercorrection manifests visually as the vehicle swaying left and right within the lane, unable to hold a smooth trajectory.
Self-driving systems overcome this latency interference by adopting state augmentation and predictive compensation techniques. The control algorithm considers not only the current pose but also control commands issued in previous cycles that have not yet been fully executed. Model predictive control again plays a crucial role here, focusing not just on current deviations but continuously predicting future vehicle states over a horizon, issuing steering commands in advance to 'bridge' the temporal gap at the physical execution level. While upgrades to computing platforms and adoption of in-vehicle communication protocols (e.g., automotive Ethernet) have reduced internal data transmission delays, the physical bottlenecks of mechanical actuators remain unavoidable.
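The contrast can be simulated with a toy proportional controller on a simplified lateral-error model: the naive version acts on the stale measurement and rings, while the compensated version forward-simulates the commands still in flight (the state-augmentation idea) and settles. Every constant here is an illustrative placeholder.

```python
# Toy simulation of actuation delay: a proportional lateral controller
# with and without forward-prediction of the commands already in flight.

from collections import deque

def worst_late_error(delay_steps, compensate, gain=1.2, dt=0.05, v=15.0, steps=400):
    y = 1.0                                    # start 1 m off the path
    pipeline = deque([0.0] * delay_steps)      # commands issued but not yet executed
    worst = 0.0
    for k in range(steps):
        if compensate:
            y_pred = y
            for cmd in pipeline:               # roll the pending commands forward
                y_pred += v * cmd * dt
            command = -gain * y_pred           # act on the predicted error
        else:
            command = -gain * y                # act on the stale measured error
        pipeline.append(command)
        heading = pipeline.popleft()           # command takes effect after the lag
        y += v * heading * dt
        if k > steps // 2:
            worst = max(worst, abs(y))         # residual error after settling time
    return worst

ringing = worst_late_error(8, compensate=False)   # 8 x 50 ms = 400 ms of lag
settled = worst_late_error(8, compensate=True)    # same lag, predicted away
```

In this toy plant the prediction is exact, so the compensated loop behaves as if there were no delay at all; in a real vehicle the prediction model is imperfect, which is why mechanical lag can be mitigated but never fully erased.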
Various technical approaches exist within the industry to address trajectory drift. Tesla's solution, for instance, emphasizes large-scale end-to-end deep learning, leveraging vast amounts of real-world driving data to train models in handling diverse disturbances. Other manufacturers focus more on rigorous high-definition map and multi-sensor fusion, using precise physical models and closed-loop detection to counteract errors. These approaches reflect differing technical philosophies: trusting data-driven intuition versus relying on rigorous physical modeling.
Challenges in Trajectory Prediction Bias and Scenario Adaptability
Beyond perception, dynamics, and control errors, trajectory drift sometimes originates from planning-level decisions. In complex urban traffic, self-driving vehicles must continuously predict the intentions of surrounding pedestrians and vehicles. If prediction algorithms misjudge a neighboring vehicle's lane-change intent or oscillate between multiple potential avoidance maneuvers, the planned trajectory undergoes frequent changes.
This decision-level 'indecision' manifests at the execution level as unnatural lateral swaying over short timeframes. Although its physical origin is entirely different from sensor zero-point drift, from the passenger's perspective it is an equally unsettling and dangerous form of trajectory drift.
Predicting pedestrian trajectories is particularly challenging because human behavior is multimodal. A pedestrian at the same location may sprint across the road or suddenly stop and wait, so the prediction model must maintain multiple probabilistic branches. When the relative weights of these branches flip rapidly within short intervals, the vehicle's path planner is forced to rewrite its future trajectory again and again.
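One common mitigation is simple hysteresis on the mode decision: the planner commits to a branch and only switches when the alternative clearly dominates. The sketch below is a hypothetical illustration with made-up thresholds, not a production planner.

```python
# Hypothetical mode selection with hysteresis: stick with the current
# predicted branch ("cross" vs. "wait") unless the other branch wins
# by a clear probability margin.

def select_mode(prob_cross, current, margin=0.2):
    """Switch modes only when the other branch wins by a clear margin."""
    if current == "cross" and prob_cross < 0.5 - margin:
        return "wait"
    if current == "wait" and prob_cross > 0.5 + margin:
        return "cross"
    return current

# Noisy probabilities oscillating around 0.5 no longer trigger replanning;
# only a decisive reading (0.8) causes a single mode switch.
mode, switches = "wait", 0
for p in [0.55, 0.45, 0.6, 0.4, 0.65, 0.35, 0.8]:
    new_mode = select_mode(p, mode)
    switches += new_mode != mode
    mode = new_mode
```

Without the margin, every crossing of the 0.5 line would rewrite the planned path, producing exactly the short-interval lateral swaying described above.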
Such abrupt trajectory jumps are a significant source of the 'mechanical' and 'uncertain' feel in self-driving systems. To mitigate this, interactive perception algorithms have been introduced, observing not just individual targets but also understanding interdependencies among traffic participants to provide more stable, human-like predictive paths.
Final Thoughts
Trajectory drift in self-driving systems results from the layered accumulation of various system errors. It begins with minor sensor inaccuracies, gradually amplifies through tiny temporal and spatial misalignments, faces physical limitations from tire grip and ground friction, and ultimately manifests due to inevitable control system delays. As sensors become more precise, vehicle dynamic models more closely resemble reality, and predictive control algorithms see further ahead, this drift is being progressively reduced. However, as long as vehicles operate in the real world, overcoming trajectory drift will remain a core technical challenge in the self-driving domain.
-- END --