Preferences

> The company must pay $329 million in damages to victims and survivors, including compensatory and punitive damages.

> A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.

> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.

On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly. Had the system been used correctly and Tesla been assigned more of the blame, would this be a 1 billion dollar case? This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road


> This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road

I hope we haven't internalized the idea that corporations should be treated the same as people.

There's essentially no difference between a $3M and a $300M fine against most individuals, but $3M means very little to Tesla. If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

That's another difference - fining an individual is not going to change risks much; the individual's behavior changing is not that meaningful compared to Tesla's behavior changing. And it's not like a huge fine is gonna make a difference in other drivers deciding to be better, whereas other automakers will notice a huge fine.

>I hope we haven't internalized the idea that corporations should be treated the same as people.

Only when it comes to rights. When it comes to responsibilities the corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.

Would be nice to see executions of corporations as punishment.
Perhaps better to achieve symmetry by ceasing to execute humans.

You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human, so you can't make the scores even but you can stop putting more on the total for human misery.

Historically it was impractical to permanently warehouse large numbers of humans, so death was more practical, but the US has been doing it for all sorts of crap for decades so that's not a problem here.

The US would still have much harsher punishments than Western Europe even without the death penalty, because it routinely uses life-means-life sentences where no matter what you're never seeing the outside again.

>You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human

What if we garnished 100% of the future wages of all the employees in perpetuity as well as dissolving the corporate entity? You know, to make sure the company stays all the way dead.

I guess my bad for not specifying that it'd need to be wicked for the corporation, not the humans.
> Would be nice to see executions of corporations as punishment

Fines. Massive fines.

"Corporate death penalty" is a genius invention of corporate lawyers to distract from the punitive effect of massive fines.

Fines and license revocations are precedented. They take real money from the corporation and its owners. A corporate death penalty is a legal morass that doesn’t actually punish shareholders; it just cancels a legal entity. If I own an LLC and have a choice between a fine and the LLC being dissolved, I’d almost always opt for the latter.

But fines are boring. Corporate death penalty sounds exciting. The anti-corporate folks take to it like catnip, thus dissolving the coalition for holding large companies accountable. (Which, again, a corporate "execution" doesn't do. Nullifying my LLC doesn't actually hurt me, it just creates a little bit of work for my lawyer, and frankly, getting out of a fuckup by poofing the LLC without touching the underlying assets is sort of the selling point of incorporation.)

Corporate fines are a genius invention of corporate execs' personal lawyers to distract from the fact that all corporate malfeasance is conducted by actual people who could be held accountable.
> Corporate fines are a genius invention of corporate execs' personal lawyers

Ahistoric and orthogonal. Corporate fines and personal sanctions have coëxisted since corporations were a thing. Charter revocations, on the other hand, have almost always followed individual liability, because again, just poofing a corporation doesn't actually do anything to its assets, the part with actual value. (In the English world, corporations frequently came pinned with trade charters. The actual punishment was losing a trade monopoly. Not a legal fiction being dissolved.)

Nothing about corporate death penalties or corporate fines prevents personal liability. And neither particularly promotes it, either, particularly if guilt is never acknowledged as part of the proceedings.

I'm guessing that dissolving your LLC as a punishment would include the forfeiture of all the associated assets, not distributing them to shareholders.
Everyone spewing opinions in this thread so they can get upvotes from like-minded readers is missing the 800lb gorilla.

It's not in the state's interest to kill profitable things most of the time except occasionally as a deterrent example. It's the same reason richer people (who pay a lot of taxes, engage in activity spawning yet more taxes, etc) tend to get probation instead of jail. Likewise, the state is happy to kill small(er) businesses. It does this all the time and it doesn't make the news. Whereas with the big ones it just takes its pound of flesh and keeps things running.

As long as that incentive mostly remains, the outcomes will mostly remain.

> Only when it comes to rights.

"Corporations are people" means a corporation is people, not a corporation is a person.

People have rights, whether they are acting through a corporation or not. That's what Citizens United determined.

I hope you think about who misled you to thinking that "corporations are people" meant a corporation is a person and trust them a little less.

In most jurisdictions, a corporation is a juridical person[1]. When not explicitly mentioning natural persons, whether a corporation is a "person" is thus ambiguous.

[1]: https://en.wikipedia.org/wiki/Juridical_person

I agree that Tesla should be hit with punitive damages. And the size of the punitive damages must be enough to discourage bad behavior.

I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.

I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?

>> it "feels" wrong to give a corporation-sized punishment to a small group of individuals

This feeling has a name: loss aversion.

It's a really interesting human trait. About 66% of people feel bad when someone else does well. The impact of this feeling on behavior (even self-harming behavior) is instructive.

The concept of "Fairness" comes into play as well. Many people have an expectation that the "world is fair" despite every evidence that it isn't. That results in "everything I don't get is unfair" whereas "everything I get I earned on my own." Someone rlse getting a windfall is thus "unfair".

It is definitely not loss aversion. It also has nothing to do with whether or not someone else is getting the money. Handing me nearly half a billion because a parent died would certainly be welcome, but I think it would feel equally disproportionate and out-of-place.
Killed, not just died.
That really doesn't sound like loss aversion.
> I'm not necessarily sure the victim(s) should get all of the punitive damages.

I have some great news for you, then: the attorney probably took a third (more if they win an appeal).

> But is there an even better possible place to disburse the funds from these types of fines?

Oh, my mistake: I thought you meant way worse.

$300M means very little to Tesla. The stock didn't even drop a bit (other than the usual market fluctuations today). Perhaps $4.20B or $6.90B would've been meaningful. Elon notices these numbers.
Not doing what it asks - “keep your hands on the wheel” and “eyes on the road” - and crashing the car is somehow Elon Musk’s fault LOL hn logic. Can’t wait to sue lane assist when I drive drunk and crash!
It's supposed to stop if objects appear in its path. For sure you're an idiot if you trust Tesla's autopilot, but I think it's reasonable to partially fault Tesla for setting the consumer's expectation that the car stops when obstacles get in the way even if the vehicle isn't being operated exactly as suggested by the manufacturer.
Maybe the system should sound an alarm and slow down immediately when there are no hands on the wheel and eyes are not on the road. That would have avoided this accident, so it seems Tesla was at fault for not doing that.
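A minimal sketch of what that escalation policy could look like (all thresholds, timings, and sensor inputs here are invented for illustration; this is not how any shipping system actually works):

```python
# Hypothetical driver-monitoring policy: escalate when the driver's hands
# are off the wheel AND their gaze is off the road for too long.
# Thresholds and sensor inputs are invented for illustration.

def monitor_step(hands_on_wheel: bool, eyes_on_road: bool,
                 inattentive_seconds: float, dt: float = 0.1):
    """Return (new_inattentive_seconds, action) for one control tick."""
    if hands_on_wheel and eyes_on_road:
        return 0.0, "normal"                    # attentive: reset the timer
    inattentive_seconds += dt
    if inattentive_seconds < 2.0:
        return inattentive_seconds, "normal"    # brief glance: tolerate
    if inattentive_seconds < 5.0:
        return inattentive_seconds, "alarm"     # audible/visual warning
    return inattentive_seconds, "slow_and_stop" # decelerate to a safe stop

# A distracted driver escalates through the states.
t, action = 0.0, "normal"
for _ in range(60):                             # 6 simulated seconds, distracted
    t, action = monitor_step(False, False, t)
print(action)  # -> slow_and_stop
```

The point is only that the escalation logic itself is trivial to express; the hard part is reliably sensing hands and gaze.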
"Somehow?" We're literally discussing a court case where culpability was proven and accepted by a judge.
To add, this is also punitive for endangering countless other road users that aren’t suing in this particular instance.
> If you want Tesla's behavior to change - and other automakers take notice and not repeat the behavior - then the fine must be meaningful to them.

What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in droves.

This is obviously targeted and the court system should not be playing favorites or going after political opponents

> What behavior do you want them to change?

Don't advertise their driver assist system as "full self driving".

The system involved in this crash was never advertised as "full self driving".
From https://web.archive.org/web/20211002045634/https://www.tesla...:

> Tesla cars come standard with advanced hardware capable of providing Autopilot features, and *full self-driving capabilities* — through software updates designed to improve functionality over time.

> Tesla's Autopilot AI team drives the future of autonomy of current and new generations of vehicles. Learn about the team and apply to help accelerate the world with *full self-driving*.

Now you can say that can be interpreted multiple ways - which means the copywriter is either incompetent, or intentionally misleading. Interestingly, the text from 2019 (https://web.archive.org/web/20191225054133/tesla.com/autopil...) is written a bit differently:

> ...full self-driving capabilities *in the future*...

- FSD came out in October 2020; I suppose rounding up gets you to "nearly 10 years since". It also, literally, doubles the actual figure.

- There have been a lot more than one incident. This is one court case about one incident.

- There are an insane number of accidents reported; does it only matter to you if someone dies? A lot more than one person has died in an accident that involved a vehicle that was equipped with FSD.

- Your comment is obviously targeted and disingenuous.

There was even a recall over it: https://www.eastbaytimes.com/2023/02/16/tesla-full-self-driv...

So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.

> This is obviously targeted and the court system should not be playing favorites or going after political opponents

This was a jury trial of a civil case - the family of the deceased took Tesla to court, not an anti-Tesla/Musk court system conspiracy.

> It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while fetching his cell phone. You would think if it was really dangerous, people would be dying in droves.

And how many times have humans had to take over and save themselves and others from Tesla killing or injuring them? Tesla won't tell us those numbers, guess why? The tech might be safe as a backup driver, but so far you need a human paying attention to save himself from the car's bugs/errors/glitches etc.

I really hate these bullshit safety claims pulled from someone's ass. It's like me trying to convince you to be operated on by an AI doctor by claiming: "It's better than an old, drunk doctor; it only killed a few people when the humans supervising it weren't paying attention, but otherwise it was very safe. We can't tell you how many times real doctors had to do the hard work while our AI doctor only did the stitching; those numbers need to stay secret. But trust us, the human doctors who have to intervene are only there because of evil laws; it could do the full job itself. We wouldn't call it a Fully Competent Doctor if it couldn't perform all expected tasks."

I went into a Tesla dealership nearly 10 years ago to take a look at the cars, and the salespeople were telling me - in no uncertain terms - that the cars were fully self-driving.

I knew that was complete nonsense, because I knew people who worked on Tesla's self-driving software, but that's how Tesla salespeople were selling the cars.

10 years ago Tesla didn't even have Autopilot. All they had as far as I can tell was lane departure warnings, speed alerts, manual cruise control, some sort of automatic parking, and low speed summoning on private property.

Could the dealer have been referring to the automatic parking or the summoning?

Autopilot launched in 2014. "Full Self Driving" has been offered as an upgrade since 2016. Musk has been saying that fully autonomous driving is just around the corner (1-3 years away) since 2015.
Tesla was found partially liable for this crash. The reason they were liable was they sold something claiming (practically speaking) that it could do something. The customer believed that claim. It failed to do that thing and killed people.

So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.

And the fine needs to be high enough to prevent them from just saying - oh, well, we can make money if we keep doing it.

If you could only fine a person for committing murder, you wouldn't fine a billionaire $5m, and then hope he wouldn't go on killing everyone he thinks he'd rather have dead than $5m.

The US justice system uses punitive damages very heavily. And Tesla should absolutely get some punishment here.

In most other places you'd see it paying hundreds of millions in fines and a few million in damages.

I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

"[Plaintiffs] claimed Tesla’s Autopilot technology was flawed and deceptively marketed."

> I imagine the jury heard "autopilot" and then assigned blame to the company that called it that.

It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it does the same to jurors.

There's another similar argument to be made about the massive amount awarded as damages, which may be lowered on appeal. If people (Tesla included) can argue that when one car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even for a time). There are millions of Teslas on the road today, so proportionally it's a low amount per unsafe car.
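For a rough sense of scale on that "per unsafe car" framing (the fleet size below is an assumed round number, not a figure from the case):

```python
# Back-of-envelope: spread the total award across an assumed fleet.
total_award = 329_000_000   # USD, compensatory + punitive combined
fleet_size = 5_000_000      # assumed number of Teslas on the road
per_car = total_award / fleet_size
print(f"${per_car:.2f} per car")  # -> $65.80 per car
```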

"Autopilot" isn't even the most egregious Tesla marketing term since that honour goes to "Full Self-Driving", which according to the fine text "[does] not make the vehicle autonomous".

Tesla's self-driving advertising is all fucking garbage and then some George McGee browses Facebook while believing that his car is driving itself.

do you think they heard "autopilot" or "full self driving"?
I don't think these terms are meaningfully different in the heads of most people.

I know autopilot in airplanes is a set of assistive systems which don't remotely pretend to replace or obsolete humans. But that's not typically how it's used colloquially, and Tesla's marketing benefits heavily from the colloquial use of "autopilot" as something that can pilot a vehicle autonomously.

You really think the defense wouldn’t have objected if the wrong term was used, or that the judge would allow its continued use?
As gets pointed out ad nauseum, the very first "cruise control" product in cars was in fact called "Auto-Pilot". Also real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free.

This is a fake argument (post hoc rationalization): It invents a meaning to a phrase that seems reasonable but that has never been rigorously applied ever, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.

> real "autopilot" systems in aircraft (where the term of art comes from!) aren't remotely supervision-free

Pilot here. If my G1000’s autopilot were flying and I dropped my phone, I’d pick it up. If my Subaru’s lane-keeping were engaged and I dropped my phone, I might try to feel around for it, but I would not go spelunking for several seconds.

I can't tell which side of the argument you're on here. The driver in the Tesla case didn't "drop a pen". Your Subaru is a recent car and not a 2018 Tesla Model S (which was launched before the Full Self Driving product everyone here seems to think they're arguing about existed!).

And... no pilot is allowed to operate any automatic pilot system without supervision, I genuinely can't imagine that's what you're implying[1].

[1] Your "drop a pen" example seems deliberately constructed to invent a scenario where you think you're allowed to stop supervising the aircraft because it sounds harmless. It's not. You aren't. And if the FAA traces that post to your license I bet anything they'll suspend it.

That’s why we have a jury.

Autopilot quite literally means automatic pilot. Not “okay well maybe sometimes it’s automatic”.

This is why a jury is made up of average people. The technical details of the language simply do not matter.

Couldn't agree more. This thing where words have a common definition and then a secret custom definition that only applies in courts is garbage. Everyone knows what "full self driving" means, either deliver that, come up with a new phrase or get your pants sued off for deceptive marketing.
> Everyone knows what "full self driving" means

Sadly most people don't know that this case involved a comparatively ancient Tesla that did not have FSD. Seems like better attention to the "meaning of words" (like, the ones in the article you seem not to have read) might have helped things and not hurt them?

Autopilot is used when referring to a plane (until Tesla started using it as a name for their cruise control that can steer and keep distance).

In the context of a plane, autopilot always meant automatic piloting at altitude, and everyone knew it was the human pilots that were taking off and landing the plane.

Did they?

I think you may be overestimating how much average people know about autopilot systems.

The first cruise control system in cars was released in 1908, before aircraft autopilots existed, and was called a "governor." It maintained throttle position.

The first modern cruise control (tied to speed) was released in 1948, and was called a "speedostat." The first commercial use of the speedostat was in 1958, where the speedostat was called "Auto Pilot" in select Chrysler luxury models. Chrysler almost immediately renamed "autopilot" to "cruise-control" the following year in 1959, because the use of the term "auto pilot" was deemed misleading (airplane autopilots in 1959 could maintain speed and heading).

Or in other words...the history of cruise control is that the name "auto pilot" was explicitly rejected because of the dangerous connotations the term implied about the vehicle's capabilities.

https://www.smithsonianmag.com/innovation/sightless-visionar...

The market Tesla is advertising to is not airplane pilots. It is the general car buying public.

If they are using any terms in their ads in ways other than the way the people the ads are aimed at (the general car buying public) can reasonably be expected to understand them, then I'd expect that could be considered to be negligent.

Much of the general public is going to get their entire idea of what an autopilot can do from what autopilots do in fiction.

The dictionary definition for Americans is:

> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.

https://ahdictionary.com/word/search.html?q=automatic+pilot

Note that “autopilot” and “automatic pilot” are synonyms.

https://ahdictionary.com/word/search.html?q=Autopilot

An autopilot is supposed to be an automatic system, which doesn’t imply supervision.

https://ahdictionary.com/word/search.html?q=automatic

> Self-regulating: an automatic washing machine.

Notably, an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way. It's just that the sky is very big and other aircraft are very small, so random collisions are extremely unlikely.
> an aircraft autopilot will NOT avoid hitting anything in its path, or slow down for it, or react to it in any way

TAWS (terrain) and ACAS (traffic) are built into modern autopilots.

And Tesla lied about its autopilot’s capabilities in proximity to this crash: “In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own. ‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)”

https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...

Airplanes and automobiles differ in a number of ways.
As is also pointed out ad nauseam, the claims made about Autopilot (Tesla's) go far beyond the name, partly because they sold a lot of cars on lies about imminent "FSD" and partly because as always Elon Musk can't keep his mouth shut. The issue isn't just the name; it's that the name was part of a full-court press to mislead customers and probably regulators.
Everyone here seems to think this is a case about full self driving. That product didn't exist yet when the car in question was manufactured. No one was making the claims you believe were made.
It's also worth mentioning he would have been required to keep his hands on the wheel while using Autopilot, or else the car starts beeping and eventually disables the feature entirely. The system makes it very clear you're still in control, and it will permanently disable itself if it thinks you're not paying attention too many times (you get 5 strikes for your ownership duration).
The strikes reset after a week, they do not persist for the duration of your ownership of the vehicle.

https://www.tesla.com/support/autopilot - section “How long does Autopilot suspension last?”

>you get 5 strikes for your ownership duration).

It isn't ownership if the device disables things against you. That's licensing at best.

Is there any contextual difference between the first instance of cruise control (which has since been relabeled cruise control, perhaps with reason), automatic flight control, and a company whose CEO and fanboys incessantly talk about vehicle autonomy?
> On one hand I don't think you can apply a price to a human life

Yes, although courts do this all the time. Even if you view this as solely manufacturer error, there are precedents. Consider the General Motors ignition switch recalls. This affected 800k vehicles and resulted in 124 deaths.

> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches

So about $5m per death, and $300m to the government. By comparison, $329m for one death seems excessive, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.

https://en.wikipedia.org/wiki/General_Motors_ignition_switch...
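The comparison works out roughly as follows (this treats the $329m verdict as attaching to a single fatal crash, which is how the thread frames it):

```python
# Per-death compensation in the GM ignition-switch settlement vs. this verdict.
gm_victim_compensation = 600_000_000  # USD, paid to surviving victims
gm_deaths = 124
tesla_award = 329_000_000             # USD, this verdict (one death)

gm_per_death = gm_victim_compensation / gm_deaths
print(f"GM:    ~${gm_per_death / 1e6:.1f}M per death")    # -> ~$4.8M per death
print(f"Tesla: ${tesla_award / 1e6:.0f}M for one death")  # -> $329M for one death
```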

If you make a manufacturing error without intentionally deceiving your customers through deceptive naming of features, you have to pay millions per death.

If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.

Makes sense to me.

Wouldn't that logic mean any automaker advertising a "collision avoidance system" should be held liable whenever a car crashes into something?

In practice, they are not, because the fine print always clarifies that the feature works only under specific conditions and that the driver remains responsible. Tesla's Autopilot and FSD come with the same kind of disclaimers. The underlying principle is the same.

There are plenty of accurate names Tesla could have selected.

They could have named it "adaptive cruise control with assisted lane-keeping".

Instead their customers are intentionally led to believe it's as safe and autonomous as an airliner's autopilot.

Disclaimers don't compensate for a deceptive name, endless false promises and nonstop marketing hype.

If it was called "comprehensive collision avoidance system" then yes.
Right, this is the frustrating thing about courtroom activism and the general anger towards Tesla. By any reasonable measure, this technology is safe and useful: over 3.6 billion miles driven, currently about 8m miles per day. I could see why plaintiffs go after Tesla. They have a big target on their back for whatever reason, and activist judges go along. But I don't get how someone on the outside can look at this and think that this technology or the marketing over the last 10 years is somehow deceptive or dangerous.

https://teslanorth.com/2025/03/28/teslas-full-self-driving-s...

> activist judges

Wait what? What activism is the judge doing here? The jury is the one that comes up with the verdict and damage award, no?

The product simply should not be called Autopilot. Anyone with any common sense could predict that many people will (quite reasonably) assume that a feature called Autopilot functions as a true autopilot, and that misunderstanding will lead to fatalities.
> feature called Autopilot functions as a true autopilot

What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.

I would argue you are creating a definition of "autopilot" that most people do not agree with.

It can be called anything in an airplane because there the pilot has some level of training with the system and understands its limits. You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.

A car is another thing entirely because the general population's definition of "autopilot" does come into play and sometimes without proper training or education. I can just go rent a tesla right now.

>You don't get a pilot hopping on a 767 and flying a bunch of people around solely because Boeing used autopilot in a deceptive marketing ad, then getting the surprise of a lifetime when the autopilot doesn't avoid flying into a mountain.

...Um... You did get pilots hopping into a 737 MAX, getting training that barely mentioned an automated anti-stall and flight envelope management system called MCAS that eventually flew several planeloads of people into the ground. That was management's idea too, btw.

IANAP but I think they can take their hands off the controls and pick up a dropped phone.
So you can in a Tesla when it's used correctly. Can you enable a plane's autopilot and still crash into a mountain?
Modern autopilots? No, they will not crash into a mountain or another plane.
The word literally means automatic pilot. Legal departments create an alternate definition of the word to avoid getting sued, but most people will interpret the word literally.
What matters is the definition most people use.

https://www.hackerneue.com/item?id=44761341

I guess I don't understand how.

> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.

Applies here. As far as I can tell the system did do exactly that. But the details of the actual event are unclear (I've heard parked car but also just hitting the back of another car?)

It’s an emergent technology. The onus is on Tesla to be crystal clear about capabilities, and consistently so. People might quite reasonably infer that something which is labeled as “auto-“ or “automatic” doesn’t require supervision.
It didn't maintain the course. It kept going at a T intersection and ploughed into another car.
Anyone who has used it knows its limitations. IDK, maybe in 2019 it was different tho; now it's full of warnings that make it barely usable when distracted. Ironically you are better off disabling it and staring into your phone, which seems to be what regulators actually want.

And by the way, what is a true autopilot? Is the average joe a 787 pilot who's also an autopilot master?

Funny that pretty much every car ships with autosteer now. Ones I've used didn't seem to have many warnings, explanations, disclaimers or agreements that pundits here assume they should.

A true autopilot is a system on a boat or aircraft that keeps it on a set heading. ISTM in this case that's what the autopilot did.
Boats and aircraft are both different from automobiles. They have full freedom of movement. You can't set a course in the same way with an automobile, because the automobile will need to follow a roadway and possibly make turns at intersections. Boats and aircraft can have waypoints, but those waypoints can be mostly arbitrary, whereas a car needs to conform its path to the roadway, traffic, signage, etc.

It's an entirely different domain.

Yes, an autopilot is not what you need on a car.
There are two conflicting goals here: Tesla's marketing department would really like to make you think the car is fully autonomous for financial reasons (hence "Autopilot" and "Full Self Driving"), and then there's Tesla's legal department, which would prefer to blame somebody else for their poor software.
> On one hand I don't think you can apply a price to a human life, but on the other 329 million feels too high, (...)

Let me stop you right there. That's not how damages work.

Damages have two goals: to compensate victims, and to dissuade offenders from repeating the same mistakes. The latter is what punitive damages are for.

That's where these high values come from. They are intended to force the likes of Tesla not to ignore the lives being ended by their failures.

If damages were low, the likes of Tesla would do absolutely nothing, absorb them as operational expenses, and continue to cause deaths while claiming they are unavoidable.

Once the likes of Tesla are forced to pay significant sums in damages, they suddenly find the motivation to take their design problems seriously.

I tend to agree; however, the government is not an unincentivized incentivizer. By being able to impose such fines, the government is itself potentially incentivized not to prevent these accidents, because they are a source of this kind of revenue.

There are ways to mitigate this, such as forcing the government to use these revenues in a way that is relevant to the issue at hand, i.e. creating safety jobs, strengthening control authorities, or something else.

You could also say that the amount is insignificant, but that could change with every lawsuit, and it accumulates. Or one could speculate that the interests are not really monetarily aligned at all (e.g. prisons), or that the judicial system is independent enough to stop the propagation of these incentives. I still think it is necessary to consider, and deliberately align, these motives between the relevant actors.

The government does not receive any portion of this damage award. There is no incentive for them here.
> Damages have two goals: compensate victims, and dissuade offenders

Let me stop you right there. The compensatory damages alone were $129 million, and most of that was apportioned to the driver; no corporate boost there.

The fault with an individual can be reasonably constrained to the one prosecuted death they caused. The fault with "autopilot by Tesla", a system that was marketed and deployed at scale, cannot.

And if you want to draw parallels with individuals: an individual's driver's license would be automatically suspended or revoked when they are found at fault for manslaughter. Would you propose a minimum 1-3 year ban on Autopilot-by-Tesla within the US instead?

The $329 million is not just compensatory damages (the value of the human life) but also punitive damages. That number floats up to whatever it takes to disincentivize Tesla in the future.
*Punitive damages*. From another article: "They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident." If Tesla is destroying evidence then yeah they ought to feel the pain, and those personally responsible should be charged as well. If you make it cheaper to evade the law than comply, what good is the court at all?
I just did some googling around:

> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.

> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.

-https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...

Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered be turned over during discovery. If that's what happened here, then I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.

Is $329 million too high? If you had the money and handing it over would save your life, would you rather keep the money as a corpse?
So wrongful death liability should be infinite, or maybe just equal to the money supply (pick one, I guess)?
I think the conceptually messed-up part is that when such an award includes separate components for compensatory and punitive damages, the plaintiff receives the punitive damages even if they're part of a much broader class that was impacted by the conduct in question. E.g., how many people's choice to purchase a Tesla was influenced by the deceptive marketing? How many other people had accidents or damages? I think there ought to be a mechanism where the punitive portion rolls into a fund for the whole class, which could be used to defray some of the costs of bringing a larger class action, or disbursed directly to other affected parties.
So if I file a lawsuit and prove there is a small possibility my toaster can cut my arm off, because that's what it did to me, and win $400,000,000, should I only get $400 if it turns out they sold 1 million units?

It's not a class action lawsuit. If they want their cash, they should sue too. That's how our system works.

No, you misread that pretty significantly. They're only talking about splitting up the punitive damages.

Using the Tesla numbers, you'd get somewhere between $43 and $129 million personally, plus a share of the $200 million. And your share of the $200 million would probably be a lot higher than that of someone merely defrauded. The $200 million would also probably get a lot bigger in a proper class action.

If you want numbers for your theoretical you'll have to specify how much was compensatory and how much was punitive.
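The figures traded in this thread can be sanity-checked against the reported award (a rough sketch; the class size at the end is invented purely for illustration):

```python
# Rough arithmetic using the figures reported in the thread.
compensatory = 59_000_000 + 70_000_000   # awards to the estate and to the surviving boyfriend
punitive = 200_000_000
total = compensatory + punitive
print(compensatory)  # 129000000
print(total)         # 329000000

# Under the proposed scheme, only the punitive portion would be pooled
# for the broader class. The class size here is a made-up placeholder:
class_size = 10_000
print(punitive / class_size)  # 20000.0
```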

If there's an argument to be made that the damages are too high (I'm not making that argument, to be clear), it might be about the $129 million in compensatory damages prescribed by the jury, but that raises the question of what the cost of a human life is. Money won't bring someone back, but if you're in a courtroom and forced to calculate this figure, it's better to overestimate, IMO.

But the punitive damages at $200 million are appropriate: they're what the jury thought would discourage Tesla's behavior.

>I don't think you can apply a price to a human life

Not only can we, but it is also done routinely. For example, see this Practical Engineering video: https://www.youtube.com/watch?v=xQbaVdge7kU

If anyone's confused about what to expect of autopilot and/or FSD, it's Tesla's doing and they should be getting fined into oblivion for the confusion and risks they're creating.
On the flip side: penalties should be scaled relative to one's means so that the wealthy (whether people or corporations) actually feel the pain and learn from their mistakes. Otherwise penalties for the wealthy are like a cup of coffee for the average Joe: just a "cost of doing business."

I'm also a big proponent of exponential backoff for repeat offenders.
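Combining the two ideas above, a means-scaled fine with an exponential multiplier for repeat offenses might look like this (a hypothetical sketch; the function name, rates, and revenue figures are all invented for illustration):

```python
# Hypothetical sketch: a penalty expressed as a fraction of the offender's
# revenue, doubling with each prior offense. Nothing here reflects any
# actual legal formula.

def scaled_penalty(base_rate: float, annual_revenue: float, prior_offenses: int) -> float:
    """Fine as a fraction of revenue, doubling for each prior offense."""
    return base_rate * annual_revenue * (2 ** prior_offenses)

# A $50k/year individual and a $50B/year corporation both lose 0.1%
# of revenue for a first offense -- the same relative "pain":
print(scaled_penalty(0.001, 50_000, 0))          # 50.0
print(scaled_penalty(0.001, 50_000_000_000, 0))  # 50000000.0

# A third offense costs four times as much as the first:
print(scaled_penalty(0.001, 50_000_000_000, 2))  # 200000000.0
```

The doubling makes a pattern of repeat violations rapidly more expensive than any single fine, which is the point of escalation for offenders who would otherwise treat flat fines as a cost of business.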

Let's fine Apple too, since they allow smartphones to be used while driving.
It's the sum of damages and punitive damages.
"This doesn't hold up logically unless I'm missing something, certainly the victim wouldn't be getting fined 329 million if it was decided to be his fault for not looking at the road"

There was no "$329 million fine."

There was (a) a $59 million compensatory damages award to the representative of the estate of the deceased, and (b) a $70 million compensatory damages award to her boyfriend, who survived.

The punitive damages were likely the result of Tesla's misconduct in deliberately concealing evidence, not its percentage of fault in causing the accident.

HN front page: https://www.hackerneue.com/item?id=44787780

Why would Tesla conceal evidence? That question is left as an exercise for the reader.

Indeed, the HN commenter missed several things.

It's sort of like the Prius acceleration debacle we had: people always want to blame the car rather than their own actions.

