I hope we haven't internalized the idea that corporations should be treated the same as people.
There's essentially no difference between a $3M and a $300M fine for most individuals, but $3M means very little to Tesla. If you want Tesla's behavior to change - and other automakers to take notice and not repeat the behavior - then the fine must be meaningful to them.
That's another difference - fining an individual is not going to change risk much; one individual's behavior changing is not that meaningful compared to Tesla's behavior changing. And it's not like a huge fine is going to make other drivers decide to be better, whereas other automakers will notice a huge fine.
Only when it comes to rights. When it comes to responsibilities the corporations stop being people and go back to being amorphous, abstract things that are impossible to punish.
You're never going to make executing the wrong corporation as thoroughly wicked as the numerous occasions on which we've executed the wrong human, so you can't make the scores even, but you can stop adding to the total of human misery.
Historically it was impractical to permanently warehouse large numbers of humans, so death was more practical - but the US has been doing exactly that for all sorts of crap for decades, so that's not a problem here.
The US would still have much harsher punishments than Western Europe even without the death penalty, because it routinely uses life-means-life sentences where no matter what you're never seeing the outside again.
What if we garnished 100% of the future wages of all the employees in perpetuity as well as dissolving the corporate entity? You know, to make sure the company stays all the way dead.
Fines. Massive fines.
"Corporate death penalty" is a genius invention of corporate lawyers to distract from the punitive effect of massive fines.
Fines and license revocations are precedented. They take real money from the corporation and its owners. Corporate death penalties are a legal morass that doesn't actually punish shareholders; it just cancels a legal entity. If I own an LLC and have a choice between a fine and the LLC being dissolved, I'd almost always opt for the latter.
But fines are boring. Corporate death penalty sounds exciting. The anti-corporate folks tend to run with it like catnip, thus dissolving the coalition for holding large companies accountable. (Which, again, a corporate "execution" doesn't do. Nullifying my LLC doesn't actually hurt me, it just creates a little bit of work for my lawyer, and frankly, getting out of a fuckup by poofing the LLC without touching the underlying assets is sort of the selling point of incorporation.)
Ahistoric and orthogonal. Corporate fines and personal sanctions have coëxisted since corporations were a thing. Charter revocations, on the other hand, have almost always followed individual liability, because again, just poofing a corporation doesn't actually do anything to its assets, the part with actual value. (In the English world, corporations frequently came pinned with trade charters. The actual punishment was losing a trade monopoly. Not a legal fiction being dissolved.)
Nothing about corporate death penalties or corporate fines prevents personal liability. And neither particularly promotes it, either, particularly if guilt is never acknowledged as part of the proceedings.
It's not in the state's interest to kill profitable things most of the time except occasionally as a deterrent example. It's the same reason richer people (who pay a lot of taxes, engage in activity spawning yet more taxes, etc) tend to get probation instead of jail. Likewise, the state is happy to kill small(er) businesses. It does this all the time and it doesn't make the news. Whereas with the big ones it just takes its pound of flesh and keeps things running.
As long as that incentive mostly remains, the outcomes will mostly remain.
"Corporations are people" means a corporation is people, not a corporation is a person.
People have rights, whether they are acting through a corporation or not. That's what Citizens United determined.
I hope you think about who misled you into thinking that "corporations are people" meant a corporation is a person, and trust them a little less.
I'm not necessarily sure the victim(s) should get all of the punitive damages. $329 million is a gargantuan sum of money; it "feels" wrong to give a corporation-sized punishment to a small group of individuals. I could certainly see some proportion going toward funding regulatory agencies, but I fear the government getting the bulk of punitive damages would set up some perverse incentives.
I think in the absence of another alternative, giving it to the victim(s) is probably the best option. But is there an even better possible place to disburse the funds from these types of fines?
This feeling has a name: loss aversion.
It's a really interesting human trait. About 66% of people feel bad when someone else does well. The impact of this feeling on behavior (even behavior that is self-harming) is instructive.
The concept of "fairness" comes into play as well. Many people have an expectation that the "world is fair" despite all evidence that it isn't. That results in "everything I don't get is unfair" whereas "everything I get I earned on my own." Someone else getting a windfall is thus "unfair".
I have some great news for you, then: the attorney probably took a third (more if they win an appeal).
> But is there an even better possible place to disburse the funds from these types of fines?
Oh, my mistake: I thought you meant way worse.
What behavior do you want them to change? Remove FSD from their cars? It's been nearly 10 years since it was released, with over 3bn miles driven. There's one case where someone died while the driver was fetching his cell phone. You would think if it was really dangerous, people would be dying in scores.
This is obviously targeted and the court system should not be playing favorites or going after political opponents
Don't advertise their driver assist system as "full self driving".
> Tesla cars come standard with advanced hardware capable of providing Autopilot features, and *full self-driving capabilities* — through software updates designed to improve functionality over time.
> Tesla's Autopilot AI team drives the future of autonomy of current and new generations of vehicles. Learn about the team and apply to help accelerate the world with *full self-driving*.
Now you can say that can be interpreted multiple ways - which means the copywriter is either incompetent, or intentionally misleading. Interestingly, the text from 2019 (https://web.archive.org/web/20191225054133/tesla.com/autopil...) is written a bit differently:
> ...full self-driving capabilities *in the future*...
- There have been a lot more than one incident. This is one court case about one incident.
- There are an insane number of accidents reported; does it only matter to you if someone dies? A lot more than one person has died in an accident that involved a vehicle that was equipped with FSD.
- Your comment is obviously targeted and disingenuous.
There was even a recall over it: https://www.eastbaytimes.com/2023/02/16/tesla-full-self-driv...
So to answer your question of what one might want to come out of it, perhaps another recall where they fix the system or stop making false claims about what it can do.
This was a jury trial of a civil case - the family of the deceased took Tesla to court, not an anti-Tesla/Musk court system conspiracy.
And how many times did humans have to take over and save themselves and others from Tesla killing or injuring them? Tesla won't tell us these numbers - guess why? The tech might be safe as a backup driver, but so far you need a human paying attention to save himself from the car's bugs/errors/glitches, etc.
I really hate these bullshit safety claims pulled from someone's ass. It's like me trying to convince you to get operated on by an AI doctor by claiming: "It's better than an old and drunk doctor; it only killed a few people when the people supervising it didn't pay attention, but otherwise it was very safe. We can't tell you how many times real doctors had to do the hard work while our AI doctor only did stitching - those numbers need to stay secret. But trust us, the human doctors who had to intervene are only there because of the evil laws; it could do the full job itself. We wouldn't call it a Fully Competent Doctor if it couldn't fully perform all expected tasks."
I knew that was complete nonsense, because I knew people who worked on Tesla's self-driving software, but that's how Tesla salespeople were selling the cars.
Could the dealer have been referring to the automatic parking or the summoning?
So the question then is - how much did Tesla benefit from claiming they could do this thing? That seems like a reasonable starting point for damages.
If you could only fine a person for committing murder, you wouldn't fine a billionaire $5m, and then hope he wouldn't go on killing everyone he thinks he'd rather have dead than $5m.
In most other places you'd see it paying hundreds of millions in fines and a few million in damages.
"[Plaintiffs] claimed Tesla’s Autopilot technology was flawed and deceptively marketed."
It's only fair. If the name was fine when it was attracting buyers who were misled about the real capabilities, it must be fine when it's misleading jurors the same way.
There's another similar argument to be made about the massive amount awarded as damages, which maybe will be lowered on appeal. If people (Tesla included) can make the argument that when a car learns something or gets an "IQ" improvement they all do, then it stands to reason that when one car is dangerous they all are (or were, even for a time). There are millions of Teslas on the road today so proportionally it's a low amount per unsafe car.
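A rough back-of-the-envelope version of that "per unsafe car" framing, as a sketch only: the 2 million US fleet figure below is an assumed round number for illustration, not something from the article.

    # Illustrative only: spread the total award across an assumed fleet size.
    total_award_usd = 329_000_000    # compensatory + punitive, per the verdict
    assumed_fleet_size = 2_000_000   # assumption: rough count of Teslas on US roads
    per_car = total_award_usd / assumed_fleet_size
    print(f"≈ ${per_car:,.2f} per car")  # prints ≈ $164.50 per car under these assumptions

Even doubling or halving the assumed fleet size keeps the per-car figure in the low hundreds of dollars.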
Tesla's self-driving advertising is all fucking garbage, and then some George McGee browses Facebook while believing that his car is driving itself.
I know autopilot in airplanes is a set of assistive systems which don't remotely pretend to replace or obsolete humans. But that's not typically how it's used colloquially, and Tesla's marketing benefits heavily from the colloquial use of "autopilot" as something that can pilot a vehicle autonomously.
This is a fake argument (post hoc rationalization): It invents a meaning to a phrase that seems reasonable but that has never been rigorously applied ever, and demands that one speaker, and only that one speaker, adhere to the ad hoc standard.
Pilot here. If my G1000's autopilot were flying and I dropped my phone, I'd pick it up. If my Subaru's lane-keeping were engaged and I dropped my phone, I might try to feel around for it, but I would not go spelunking for several seconds.
And... no pilot is allowed to operate any automatic pilot system without supervision, I genuinely can't imagine that's what you're implying[1].
[1] Your "drop a pen" example seems deliberately constructed to invent a scenario where you think you're allowed to stop supervising the aircraft because it sounds harmless. It's not. You aren't. And if the FAA traces that post to your license I bet anything they'll suspend it.
Autopilot quite literally means automatic pilot. Not “okay well maybe sometimes it’s automatic”.
This is why a jury is made up of average people. The technical details of the language simply do not matter.
Sadly most people don't know that this case involved a comparatively ancient Tesla that did not have FSD. Seems like better attention to the "meaning of words" (like, the ones in the article you seem not to have read) might have helped things and not hurt them?
In the context of a plane, autopilot always meant automatic piloting at altitude, and everyone knew it was the human pilots that were taking off and landing the plane.
I think you may be overestimating how much average people know about autopilot systems.
The first modern cruise control (tied to speed) was released in 1948, and was called a "speedostat." The first commercial use of the speedostat was in 1958, where the speedostat was called "Auto Pilot" in select Chrysler luxury models. Chrysler almost immediately renamed "autopilot" to "cruise-control" the following year in 1959, because the use of the term "auto pilot" was deemed misleading (airplane autopilots in 1959 could maintain speed and heading).
Or in other words...the history of cruise control is that the name "auto pilot" was explicitly rejected because of the dangerous connotations the term implied about the vehicle's capabilities.
https://www.smithsonianmag.com/innovation/sightless-visionar...
If they are using any terms in their ads in ways other than the way the people the ads are aimed at (the general car buying public) can reasonably be expected to understand them, then I'd expect that could be considered to be negligent.
Much of the general public is going to get their entire idea of what an autopilot can do from what autopilots do in fiction.
> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.
https://ahdictionary.com/word/search.html?q=automatic+pilot
Note that “autopilot” and “automatic pilot” are synonyms.
https://ahdictionary.com/word/search.html?q=Autopilot
An autopilot is supposed to be an automatic system, which doesn’t imply supervision.
https://ahdictionary.com/word/search.html?q=automatic
> Self-regulating: an automatic washing machine.
TAWS (terrain) and ACAS (traffic) are built into modern autopilots.
And Tesla lied about its autopilot’s capabilities in proximity to this crash: “In 2016, the company posted a video of what appears to be a car equipped with Autopilot driving on its own. ‘The person in the driver's seat is only there for legal reasons,’ reads a caption that flashes at the beginning of the video. ‘He is not doing anything. The car is driving itself.’ (Six years later, a senior Tesla engineer conceded as part of a separate lawsuit that the video was staged and did not represent the true capabilities of the car.)”
https://www.npr.org/2025/07/14/nx-s1-5462851/tesla-lawsuit-a...
https://www.tesla.com/support/autopilot - section “How long does Autopilot suspension last?”
It isn't ownership if the device disables things against you. That's licensing at best.
Yes, although courts do this all the time. Even if you view this as solely a manufacturer error, there are precedents. Consider the General Motors ignition switch recalls. These affected 800k vehicles and resulted in 124 deaths.
> As part of the Deferred Prosecution Agreement, GM agreed to forfeit $900 million to the United States.[4][51] GM gave $600 million in compensation to surviving victims of accidents caused by faulty ignition switches
So about $5m per death, and $300m to the government. This seems excessive for one death, even if you believe Tesla was completely at fault. And the fact that this is the only such case (?) since 2019 suggests the fault isn't really on the manufacturer's side.
https://en.wikipedia.org/wiki/General_Motors_ignition_switch...
If you intentionally give the feature a deceptive name like "autopilot", and then customers rely on that deceptive name to take their eyes off the road, then you have to pay hundreds of millions per death.
Makes sense to me.
In practice, they are not, because the fine print always clarifies that the feature works only under specific conditions and that the driver remains responsible. Tesla's Autopilot and FSD come with the same kind of disclaimers. The underlying principle is the same.
They could have named it "adaptive cruise control with assisted lane-keeping".
Instead their customers are intentionally led to believe it's as safe and autonomous as an airliner's autopilot.
Disclaimers don't compensate for a deceptive name, endless false promises and nonstop marketing hype.
https://teslanorth.com/2025/03/28/teslas-full-self-driving-s...
Wait what? What activism is the judge doing here? The jury is the one that comes up with the verdict and damage award, no?
What's a "true autopilot"? In airplanes, autopilot systems traditionally keep heading, altitude, and speed, but pilots are still required to monitor and take over when necessary. It's not hands-off or fully autonomous.
I would argue you are creating a definition of "autopilot" that most people do not agree with.
A car is another thing entirely because the general population's definition of "autopilot" does come into play and sometimes without proper training or education. I can just go rent a tesla right now.
...Um... You did get pilots hopping into a 737 MAX, getting training that barely mentioned an automated anti-stall and flight envelope management system called MCAS that eventually flew two planeloads of people into the ground. That was management's idea too, btw.
> A navigation mechanism, as on an aircraft, that automatically maintains a preset course.
Applies here. As far as I can tell the system did do exactly that. But the details of the actual event are unclear (I've heard parked car but also just hitting the back of another car?)
And by the way, what is a true autopilot? Is the average Joe a 787 pilot who's also an autopilot master?
Funny that pretty much every car ships with autosteer now. The ones I've used didn't seem to have the warnings, explanations, disclaimers, or agreements that pundits here assume they should.
It's an entirely different domain.
Let me stop you right there. That's not how damages work.
Damages have two goals: compensate victims, and dissuade offenders from repeating the same mistakes. The latter involves punishments that discourage repeat offenses.
That's where these high values come from. They are intended to force the likes of Tesla to not ignore the lives they are ending due to their failures.
If damages were low, the likes of Tesla would do absolutely nothing and absorb them as operational expenses, and continue to cause deaths claiming they are unavoidable.
Once the likes of Tesla are forced to pay significant volumes of cash in damages, they suddenly find motives to take their design problems seriously.
There are ways to mitigate this, such as forcing the government to use these revenues in a way that is relevant to the issue at hand, e.g. creating safety jobs, strengthening control authorities, or something else.
You could also say that the amount is insignificant, but that could of course change with every lawsuit, and it of course accumulates. Or one could speculate that the interests are not really monetarily aligned at all (e.g. prisons), or that the judicial system is independent enough to stop propagation of these incentives. I still think it's necessary to consider these motives and deliberately try to align them between the relevant actors.
Let me stop you right there. Just the compensatory damages were 129 million. And most of that was charged to the driver, no corporate boost there.
And if you want to draw parallels with individuals, an individual driver's license would be automatically suspended and revoked when found at fault for manslaughter. Would you propose a minimum 1~3 year ban on autopilot-by-Tesla within the US, instead?
> The case also included startling charges by lawyers for the family of the deceased, 22-year-old, Naibel Benavides Leon, and for her injured boyfriend, Dillon Angulo. They claimed Tesla either hid or lost key evidence, including data and video recorded seconds before the accident.
> Tesla has previously faced criticism that it is slow to cough up crucial data by relatives of other victims in Tesla crashes, accusations that the car company has denied. In this case, the plaintiffs showed Tesla had the evidence all along, despite its repeated denials, by hiring a forensic data expert who dug it up. Tesla said it made a mistake after being shown the evidence and honestly hadn’t thought it was there.
-https://lasvegassun.com/news/2025/aug/01/jury-orders-tesla-t...
Nothing enrages a judge faster than an attempt to conceal evidence that a court has ordered be turned over during discovery. If that's what happened here, then I suspect the punitive damages have as much to do with disregard for the legal process as with the case itself.
It’s not a class action lawsuit. If they want their cash they should sue too. That’s how our system works.
Using the Tesla numbers you'd get somewhere between $43 and $129 million personally, plus a share of the $200 million. And your share of the $200 million would probably be a lot higher than someone merely defrauded. And the $200 million would probably get a lot bigger in a proper class action.
If you want numbers for your theoretical you'll have to specify how much was compensatory and how much was punitive.
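For what it's worth, here's a sketch of where that $43-129 million range could come from, using the ~$129 million compensatory figure cited elsewhere in the thread. The one-third fault share assigned to Tesla is my assumption here, not something stated in this thread.

    # Sketch: bounds on what a plaintiff might see from ~$129M compensatory damages.
    compensatory_usd = 129_000_000
    assumed_tesla_fault_share = 1 / 3                 # assumption: Tesla held ~33% at fault
    low_end = compensatory_usd * assumed_tesla_fault_share   # ~$43M, Tesla's share only
    high_end = compensatory_usd                              # full compensatory award
    print(f"${low_end:,.0f} to ${high_end:,.0f}")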
But the punitive damages at $200 million are appropriate — it's what the jury thought would be appropriate to discourage Tesla's behaviors.
Not only can we, but it's also done routinely. For example, see this Practical Engineering video: https://www.youtube.com/watch?v=xQbaVdge7kU
I'm also a big proponent of exponential backoff for repeat offenders.
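To sketch what "exponential backoff" could mean applied to fines - the base amount and multiplier below are made-up illustrative values, not anything from this case:

    def fine_for_offense(base_fine: float, offense_number: int, multiplier: float = 2.0) -> float:
        # Each repeat offense multiplies the previous fine, so treating
        # penalties as a routine line item gets geometrically more expensive.
        return base_fine * multiplier ** (offense_number - 1)

    # e.g. with a $100M base fine: 1st offense $100M, 2nd $200M, 3rd $400M
    for n in (1, 2, 3):
        print(n, fine_for_offense(100_000_000, n))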
There was no "329 million fine"
There were (a) a $59 million compensatory damages award to the representative of the estate of the deceased and (b) a $70 million compensatory damages award to her boyfriend, who survived
The punitive damages were likely the result of Tesla's misconduct in deliberately concealing evidence, not its percentage of fault in causing the accident
HN front page: https://www.hackerneue.com/item?id=44787780
Why would Tesla conceal evidence? That question is left as one for the reader
Indeed, the HN commenter missed several things
> A Tesla owner named George McGee was driving his Model S electric sedan while using the company’s Enhanced Autopilot, a partially automated driving system.
> While driving, McGee dropped his mobile phone that he was using and scrambled to pick it up. He said during the trial that he believed Enhanced Autopilot would brake if an obstacle was in the way. His Model S accelerated through an intersection at just over 60 miles per hour, hitting a nearby empty parked car and its owners, who were standing on the other side of their vehicle.
On one hand I don't think you can put a price on a human life, but on the other, $329 million feels too high, especially since Tesla is only partially to blame, it wasn't FSD, and the driver wasn't using the system correctly. Had the system been used correctly and Tesla assigned more of the blame, would this have been a $1 billion case? This doesn't hold up logically unless I'm missing something; certainly the driver wouldn't be hit with a $329 million judgment if it had been decided to be entirely his fault for not looking at the road.