GM dreams of cars that change lanes just by following your eyes

General Motors is sketching a future in which a glance toward the next lane could be all it takes for a vehicle to slide over and pass a slower car. The company is building on its existing driver assistance portfolio to explore lane changes that respond directly to where a driver is looking, folding eye tracking into a broader push toward partially autonomous, and eventually “eyes off,” driving. The vision is ambitious, but it rests on a foundation of hardware, software, and mapping that GM has already begun to deploy at scale.

From hands free to gaze guided

GM’s interest in gaze driven lane changes does not emerge in a vacuum; it grows out of a maturing ecosystem of automated driver assist features that already manage steering, speed, and lane positioning on mapped roads. Reporting on GM’s internal process flows describes a system that would monitor a driver’s eyes, interpret a sustained look toward an adjacent lane as intent, and then decide whether to initiate a lane change based on traffic, road markings, and navigation goals. The idea is to keep the human in the loop as the decision maker, while letting the vehicle handle the mechanical precision of moving across lanes once that intent is clear, an evolution of today’s automated lane change functions rather than a wholesale replacement.

That approach aligns with GM’s broader strategy of layering intelligence on top of existing driver assistance rather than leaping straight to full autonomy. The company has already detailed how its current systems use cameras, onboard computers, sensors, GPS data, and high precision maps to keep a vehicle centered in its lane and to check blind spots before changing lanes. By adding eye tracking to that stack, GM is effectively proposing a new input channel for the same decision engine, one that could make lane changes feel more natural to drivers who already look where they intend to go long before they move the steering wheel.

Super Cruise as the test bed

The most obvious proving ground for gaze based lane changes is Super Cruise, GM’s flagship hands free driving system. According to the company’s own technical descriptions, vehicles equipped with Super Cruise rely on real time cameras, a network of sensors, GPS, and detailed road data to maintain speed and lane position on compatible highways, while also monitoring the driver’s attention. The system already supports automated lane changes on certain models, using its sensor suite to confirm that adjacent lanes are clear and that the maneuver aligns with navigation, then executing the move with minimal driver input. In that context, swapping a turn signal tap for an eye movement is less a radical leap and more a refinement of how the driver communicates with the car.

Current documentation for Super Cruise also underscores how carefully GM has defined the boundary between human and machine control. Drivers can steer manually at any time when the Super Cruise technology is engaged, and the guidance around changing lanes makes clear that the person behind the wheel remains responsible even when the system is handling the mechanics of the maneuver. That philosophy would likely carry over to any eye based feature, which would still depend on the same underlying sensors and maps that today support hands free driving on one of the largest compatible road networks in North America. GM’s decision to roll Super Cruise out across vehicles such as the GMC Sierra EV, where it is standard on AT4 and Denali trims and available as an upgrade on the base configuration, suggests the company sees this platform as the backbone for more advanced capabilities.

Eyes off, not just eyes on

While gaze controlled lane changes focus on reading what the driver wants to do, GM is simultaneously working on technology that allows the driver to look away from the road altogether. The company has introduced a partially autonomous “eyes off” driving system that it showcased during a major event in New York, positioning it as a step beyond today’s hands free offerings. In that configuration, the vehicle’s AI is designed to handle more of the driving task on its own, with the system monitoring not only the environment but also the driver’s readiness to retake control, and with GM explicitly framing the goal as moving “leaps and bounds ahead of the competition” in vehicle AI.

Public demonstrations have highlighted how this “eyes off” capability is expected to arrive first on high end models, including a version of the Cadillac Escalade IQ that GM says will offer self driving functionality in 2028. Separate commentary from GM focused segments has contrasted this approach with systems from Tesla, which still require the driver to remain responsible for supervision even when advanced assistance is active. By contrast, GM’s messaging around its “eyes off” feature emphasizes that the system is designed to manage the vehicle in defined conditions without constant human oversight, although the company continues to stress that the driver remains the ultimate authority and must be ready to intervene when requested.

Rivals are already blinking

GM is not the first automaker to experiment with eye based control, which underscores both the promise and the risks of the concept. BMW has already introduced a 5 Series sedan that allows drivers to initiate a lane change using their eyes, a feature that was widely noted when the model was unveiled. In that system, the driver activates an assisted driving mode, then looks toward the mirror in the direction of the desired lane, and the vehicle interprets that gaze as a command to move over if conditions permit. The BMW example shows that eye tracking hardware and software are mature enough to be deployed in production vehicles, at least in limited scenarios, and it gives GM a concrete benchmark for how such a feature might be received by regulators and customers.

At the same time, GM’s broader ecosystem could give it an advantage if it can integrate gaze control into a more comprehensive automated driving stack. The company already supports hands free driving on extensive stretches of divided highways in North America through Super Cruise, and it is preparing to extend that capability into “eyes off” territory on future Cadillac models. Competitors in the technology space, such as Mobileye with its REM (Road Experience Management) platform, are also pushing toward Level 3 functionality that can operate on sections of interstate roads with separated lanes using a single chip supercomputer. In that competitive landscape, GM’s exploration of eye driven lane changes looks less like a novelty and more like one piece of a larger race to define how humans and AI will share control of the car.

Safety, trust, and the road ahead

The prospect of a car that moves across lanes based on where the driver is looking raises immediate questions about safety and unintended consequences. GM’s existing guidance for Super Cruise, which emphasizes that the driver can always intervene manually and that changing lanes is subject to strict checks of blind spots and surrounding traffic, hints at how carefully any gaze based feature would need to be constrained. The system would have to distinguish between a quick glance at a side mirror and a deliberate look that signals intent, filter out distractions, and cross check every potential maneuver against sensor data and maps before acting. That is a nontrivial challenge, but it builds on the same research methods and sensor fusion techniques that already underpin Super Cruise and similar systems.

Bobby Clark