Autonomous vehicles – cars that drive themselves using software, sensors, and AI – are expected to reduce human error, but they also raise a critical question: If you’re not driving, can you still be liable for an accident? In South Carolina, this is largely uncharted territory. Fully self-driving cars (Level 4 or 5 automation, where no human intervention is needed, at least within defined operating conditions) are not yet in regular use on our roads. However, many drivers already use partially autonomous features like Tesla’s Autopilot, GM’s Super Cruise, or Ford’s BlueCruise. These systems can steer, brake, and accelerate on their own under certain conditions, but they still require a human driver to monitor and intervene. Understanding how liability works in these scenarios is crucial.
- Partially Autonomous Vehicles (Driver-Assist Systems): In a car with an “autopilot” or driver-assist feature, the law still views the human behind the wheel as the responsible driver. If you engage a Level 2 or Level 3 automation feature that still expects you to take over in an emergency, you can be deemed negligent for failing to pay attention or respond to warnings. For example, if you ignore your Tesla’s alerts to put your hands back on the wheel and an accident happens, a court could find you negligent for relying too heavily on the system. In fact, in a recent California jury trial, Tesla successfully argued that the human driver bore ultimate responsibility for the crash even though Autopilot was engaged, and the jury declined to hold Tesla liable. This illustrates that current legal norms still expect an attentive human in semi-automated driving. South Carolina, similarly, would likely treat such a case under ordinary negligence principles: Did the driver act as a reasonable person would, given that the car was not truly self-driving? If the driver was watching a movie or taking a nap when they should have been ready to take control, that’s strong evidence of negligence on the driver’s part. The more autonomous the technology becomes, however, the more that balance may shift toward the manufacturer’s responsibility.
- Fully Autonomous Vehicles (No Human Driver): In a scenario where a car is genuinely self-driving – for instance, a future Waymo or Cruise robotaxi with no driver at all – holding a human “driver” liable becomes impractical. If you are just a passenger in a driverless taxi, you have no control over the vehicle, so it wouldn’t make sense to blame you for an accident. Instead, liability would shift to others: potentially the company operating the autonomous vehicle, the vehicle’s manufacturer, or the software developers. Legal experts anticipate a transition from the current driver-negligence regime to a products liability regime for fully autonomous cars. In other words, when a car itself is making the decisions, any accident likely implicates a flaw in the vehicle’s sensors, software, or decision-making – which points to the makers of the technology. As one early scholar of self-driving law put it, we may see a “shift from a compensation regime… largely premised on vehicular negligence to one that increasingly implicates product liability”, since the “driver” will effectively be the vehicle’s hardware and software. If the autonomous system is defective or “unreasonably dangerous,” the manufacturer can be held strictly liable for the harm it causes.
Notably, South Carolina has not yet enacted any specific statute addressing autonomous vehicle liability or operation. According to a 2024 state-by-state survey, South Carolina is one of the states with no autonomous vehicle statute on the books. This means there is currently a gray area: we would rely on existing general principles (negligence, products liability, etc.) to handle AV accidents. By contrast, some other states have begun passing laws to clarify these issues. For example, Tennessee’s “Automated Vehicles Act” explicitly provides that when a car is driving itself with an autonomous system engaged, the automated driving system is considered the driver for liability purposes. That kind of law basically says: if a fully self-driving car causes a crash, the law will treat the vehicle’s system (and thus its creator/owner) as the responsible party instead of any occupant. South Carolina has no such provision yet, but if AVs become common, the legislature may consider similar rules.
Even without specific statutes, courts are starting to wrestle with autonomous vehicle accidents elsewhere, and those cases provide hints for South Carolina. One high-profile incident occurred in San Francisco in 2023, when a Cruise robotaxi (with no human driver onboard) ran over a pedestrian who had already been struck by another car. Experts noted this case “doesn’t fit neatly in a mold,” because traditional driver negligence and novel product liability issues blur together. The human driver who initially hit the pedestrian would bear much of the blame, but the robotaxi’s actions arguably exacerbated the injury, raising the question of the vehicle’s liability. A human driver might have reacted differently (e.g. not dragging the victim); if the robotaxi’s response was subpar, that opens the company (GM’s Cruise) to liability. One plaintiff’s lawyer commented that such a case “falls squarely within the product liability realm”, since no human was driving the AV – it was essentially a product that failed to behave safely. Indeed, a law professor noted that in a case like that, a victim wouldn’t need to prove a specific negligent act by a person; they could prevail by showing the AV “functioned in an unreasonable fashion” due to a design defect in the software.
These examples highlight a likely trend: as automation increases, driver liability will tend to decrease while manufacturer and software-provider liability increases. The core policy question is how to balance encouraging innovation with protecting the public. On one hand, we want safer cars and shouldn’t over-penalize companies for every accident if AVs reduce crashes overall. On the other hand, if a company’s autonomous driving system makes a poor decision that a reasonable human driver wouldn’t (like misidentifying a pedestrian), fairness suggests the company should pay for the harm. Some scholars have even proposed bold solutions; one proposal, for instance, would hold AV manufacturers liable for all crashes involving their self-driving vehicles, regardless of fault, to push them to design vehicles that are “superhuman defensive drivers”. (This is a radical idea not adopted in law, but it shows how experts are rethinking liability in the AV era.)
South Carolina’s likely approach, absent new legislation, is to apply existing law flexibly. If you use a car’s autonomous features, you should know that you are not automatically off the hook for liability:
- If the feature is “driver-assist” (not full self-driving), you are expected to remain alert. Failing to intervene or misusing the technology can make you negligent. However, if the technology clearly malfunctions independently of you, you and any victims may have a claim against the manufacturer.
- If you are riding in a completely driverless vehicle (for example, someday hailing a robotaxi in Charleston), you generally would not be liable for anything it does wrong; you’re essentially a passenger. If an accident occurs, your claims would be against the operator or manufacturer of the AV, not against fellow passengers.
- If you are an owner who sends your autonomous car to run errands on its own, liability could attach to you vicariously (as the vehicle’s owner or as the party who put it in autonomous mode), depending on how courts view ownership responsibility. We don’t have a South Carolina case on that yet, but courts might treat it much like letting someone else drive your car, except that “someone else” is now a computer you entrusted with driving.
It’s also worth mentioning that South Carolina’s traffic laws currently presume a human driver. For instance, state law requires drivers to follow the rules of the road (stop at signals, etc.), and offenses like reckless driving apply to a “driver.” If an AV runs a red light or speeds, a law has technically been broken, but by whom? Without an update to the statutes, the ticket would theoretically go to the owner. States like Florida have addressed this by defining the “operator” of an autonomous vehicle as the person who engages the automated driving system, even if they’re not in the driver’s seat. South Carolina has not updated its definitions yet, which leaves another gray area. In practice, companies testing AVs typically work closely with regulators and carry insurance to cover any mishaps. If AVs become more common here, expect legislative updates to define these responsibilities clearly.