Tesla Autopilot Faces Las Vegas Lawsuit as California DUI Case Revives Safety Questions

Tesla Autopilot has moved back into the spotlight for two very different reasons: a case in California ended with a DUI arrest after a driver was found unconscious behind the wheel, while a separate Las Vegas lawsuit alleges the system pushed a vehicle into oncoming traffic. Taken together, the incidents underline a basic tension at the center of the technology: it is marketed as assistance, yet some drivers appear to treat it as a substitute for responsibility.
Why the California arrest matters now
In Vacaville, California, police said they were called shortly after 11 a.m. on Wednesday, March 25, after a community member spotted a driver slumped over and apparently unconscious in a moving car. Officers caught up with the vehicle and brought it to a safe stop near Elmira Road and Shasta Drive. Investigators later said the driver was under the influence of alcohol and marijuana and was arrested on suspicion of DUI.
Photos released by Vacaville police showed a four-pack of Cabernet Sauvignon and a pizza box inside the cabin. Police also used the moment to remind drivers that assistive driving features still require a conscious, alert operator. That warning is central to the wider Tesla Autopilot debate: the system may assist with driving, but it does not erase legal or operational responsibility.
What the Las Vegas lawsuit alleges
In Las Vegas, plaintiffs Simen Ghassan Shamoun and Steven Shamoun filed suit in Clark County District Court, demanding a jury trial after an April 3, 2024, crash involving a 2024 Tesla Model Y. The lawsuit says the automated steering system made an “unexpected and unwarranted wide right turn” from northbound Spencer Street into westbound traffic on Richmar Avenue, where the vehicle collided with another car.
The complaint says both men suffered “significant personal injuries,” with more than $13,000 in medical expenses for Simen Ghassan Shamoun and more than $10,000 for Steven Shamoun. It also alleges the maneuver was caused by a malfunction in the steering control software or related sensors that failed to properly interpret the roadway geometry. Tesla could not immediately be reached for comment.
How the two cases connect
These cases are not the same, but they point in the same direction. In one, a driver allegedly relied on Tesla Autopilot while intoxicated and unconscious. In the other, plaintiffs say the vehicle itself made a dangerous move into traffic. Together, they frame the same broader question: how much risk enters the road when driver-assist systems are treated as if they can replace judgment, or when their limitations are misunderstood.
Vacaville police described the California stop as a potential medical emergency before it became a DUI arrest. The Las Vegas complaint, by contrast, focuses on product liability, negligence, negligent misrepresentation and breach of warranty. The legal theories differ, but both cases revolve around control: who had it, who was supposed to have it, and what happens when that expectation fails.
Expert and official warnings about responsibility
Vacaville police said California drivers may use newer assistive driving safety features but still must remain conscious, alert and sober. That statement aligns with the core distinction of SAE Level 2 systems: the driver remains responsible at all times. In practical terms, that means falling asleep, driving drunk or trusting the system to manage a route without supervision can all create serious danger.
The pattern is not limited to one city or one type of misuse. The repeated appearance of Tesla Autopilot in cases involving driver inattention or alleged malfunction suggests a gap between public expectations and system design. That gap matters especially when a technology is named and marketed in a way that can imply greater autonomy than it actually has.
Regional and broader impact on road safety
For California and Nevada, the issue is no longer just about one crash or one arrest. It is about how local law enforcement, courts and drivers interpret systems that can steer, accelerate and brake while still depending on human oversight. The Vacaville arrest shows how quickly a vehicle can continue moving even when the driver cannot respond. The Las Vegas suit shows how costly a steering failure can become when plaintiffs believe the automated system chose the wrong path.
As these disputes unfold, regulators and courts will keep facing the same difficult question: when a car can operate with partial automation, how much clearer do the warnings need to be before drivers stop confusing assistance with autonomy? The answer could shape not only Tesla Autopilot claims but also the public’s trust in driver-assist technology more broadly.
For now, the two cases leave one unresolved question hanging over the road: if a system is expected to help steer the future of driving, who will be held accountable when it steers the wrong way?