Hacker News
Uber exec in March: “we shouldn’t be hitting things every 15,000 miles” (arstechnica.com)
83 points by ProAm on Dec 11, 2018 | hide | past | favorite | 83 comments


It really never ceases to amaze me the level of protections we have in this country for corporations, and the amount of personal insulation that the members of those corporations receive from their actions. If I had hit someone with my car like that I would be sitting in jail right now. I'm not arguing that we shouldn't push things for the sake of science, but the level of willful negligence displayed by tech companies on every front these days is just staggering because of it. It's the default attitude. You see it an absolutely everything, from illegally dumping millions of scooters everywhere in public, to arbitraging the subversion of well established social contracts that in some cases took decades or centuries of literal blood and sweat to work out, like labor rights (Uber) or urban zoning (AirBnB) etc.


>>"If I had hit someone with my car like that I would be sitting in jail right now."

This is tangential to your point, but in many (I'd guess most) parts of this country, that is not true. Unless you are under the influence, distracted drivers frequently kill others, and the cops don't press charges, chalking it up to a "mistake".


>This is tangential to your point, but in many (I'd guess most) parts of this country, that is not true. Unless you are under the influence, distracted drivers frequently kill others, and the cops don't press charges, chalking it up to a "mistake".

Wasn't it proven that their system recognized the pedestrian and continued driving? That means business logic killed a person. Somebody, somewhere within Uber made the calculation that more miles of uninterrupted data was more valuable to them than public safety. If a jury could be convinced that I, as an individual, recognized a person and chose to purposely continue driving to save myself time, that's manslaughter.


Computer systems aren't people. You're anthropomorphizing the self-driving car. We don't hold cars morally accountable. The software in the car may "recognize a pedestrian", but that's not the same as a person "recognizing a pedestrian". A person would be liable for continuing to drive under those circumstances because they are thought to be able to choose otherwise[1]. The software in the car could not do otherwise. It is not a person, but an object programmed by a person. So whereas a person is expected to "perceive information" -> "act reasonably", a machine by definition can only "receive input" -> "process as instructed".

So the question here has to be whether the creator of the machine acted reasonably in designing the machine as they did.

[1] This may be a fiction, but one at the heart of how we see each other as morally accountable agents, and one that underlies all legal systems.


> So the question here has to be whether the creator of the machine acted reasonably in designing the machine as they did.

I'm wary of this line of reasoning. No programmer outside of a terrorist organization will ever intentionally design a machine poorly. If a civil engineer -- specifically a Professional Engineer in the U.S. -- signs off on an unsafe bridge, and that bridge later collapses and kills someone, you can bet your arse they're getting sued and possibly charged with a crime. Unfortunately, we have no real equivalent to this in software engineering/programming -- the concept of a Professional Engineer doesn't exist, and so no one can be held accountable for faulty designs.

Yes, this would bring software development to a grinding halt. I for one don't necessarily consider that a bad thing. It doesn't have to apply to every project, either -- after all, you don't need a PE to sign off before you build a shed in your backyard. Likewise, a web browser or video game probably doesn't need that level of accountability. But for a self-driving car? If someone isn't willing to be held accountable for the software failing, I personally feel the system shouldn't be allowed on the road.


You don't need professional regulation to have accountability. Every day in America, people who negligently cause injury to others are held accountable by the tort system.

If a self-driving car hurts someone, the people who created and operated it can be hauled into court and forced to justify their decisions. Jurors will hear the testimony of experts in the relevant fields and then decide who should bear the burden of the injury.


> If a jury could be convinced that I, as an individual, recognized a person and purposely continued driving, that's manslaughter.

Seems more like murder, of either the depraved heart or plain intentional kind.


If you hit something every fifteen thousand miles, the cops will stop chalking it up to "mistake" pretty quick.


No, they really wouldn't. That's roughly yearly for an individual. In how many of these incidents would the cops even be involved? I doubt that in most instances of vehicle damage the cops ever get more involved than a report filed days later to satisfy insurance.


Even this isn't true, though. 15,000 miles is 15 months of driving. I would bet a lot of people hit "something" that often, with a more serious incident every 5-10 years.


> 15,000 miles is 15 months of driving

No, it's more like one year or less[1]

> I would bet a lot of people hit "something" that often

That's quite a claim... just from personal experience, I'd wager this is a very wrong assumption. The majority of people go most of their lives, if not their entire lives, without hitting "something". More serious incidents are even rarer.

[1] https://www.fhwa.dot.gov/ohim/onh00/bar8.htm


I'm pretty sure that either having an accident every year or never having one in a lifetime is unusual. I base that on the following back of the envelope analysis.

My collision insurance is $270 a year, which I think is fairly reasonable for NY. I was involved in an accident a couple of years ago where the corner of my car was smashed in a bit, pretty much the definition of a "fender bender". It cost about $3500 to fix, and if I had been at fault, I would have had a $1000 deductible. Repairs could have been $2500 if I had been luckier in exactly what was bent. So, that suggests my insurance company breaks even if I go about 5-10 years between at-fault accidents, or 75,000-150,000 miles. In the aggregate, I assume they do estimate risk pretty accurately, so I expect that 5-10 years is normal. Whenever people mention what they are paying for insurance, it usually seems to be higher than mine.
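The back-of-the-envelope math can be made explicit. This is only a sketch using the figures quoted in the comment ($270 premium, $1000 deductible, a repair bill somewhere between $2500 and $3500), ignoring insurer overhead and profit:

```python
# How many years of collision premiums does the insurer need to
# collect to cover one at-fault payout? (Figures from the comment.)
annual_premium = 270    # collision premium, $/year
deductible = 1000       # the driver's share of an at-fault repair

for repair_cost in (2500, 3500):       # "lucky" vs actual repair bill
    payout = repair_cost - deductible  # what the insurer actually pays
    years = payout / annual_premium    # premium-years per claim
    print(f"${repair_cost} repair -> {years:.1f} years to break even")
```

Both endpoints land in the 5-10 year range the comment arrives at.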


> Whenever people mention what they are paying for insurance, it usually seems to be higher than mine

That's almost surely because of your $1000 deductible. That's very high... mine is $100 for just one reference point.

> So, that suggests my insurance company breaks even if I go about 5-10 years between at-fault accidents

That would be a "break even" point, but insurance companies attempt to turn a profit, so this suggests they expect the majority of their customers not to have an event every 5-10 years but, on average, at a longer interval. They also account for non-traffic incidents, such as vandalism, broken windows, theft, etc. -- none of which involve the driver being "at fault" or hitting something, which was the original assertion by the GP.


The amount of the deductible was just an example, but irrelevant to my point because it doesn't change the end result significantly. Why would it, given the economics? For a $100 deductible, I'm quoted $593/year. $3400/$593 = 5.7 years, so the answer is basically the same.

The non-collisions aren't relevant because I quoted the price of my collision premium only and not comprehensive or liability. That was on purpose, to be consistent with the topic of "hitting things" and with the sort of fender bender I was talking about.

Profit is not very large relative to premiums, so "breaking even" is a reasonable simplification. For example Progressive (not my insurer, just the 10-K I was able to find the quickest) made about 6%.

Not-at-fault accidents are some other company's at-fault accidents, so counting them would be double counting and would inflate my estimate.

All I'm saying is that, without false precision, evidence suggests people hit things every 5-10 years on average. Having an accident every year or never in a lifetime is not normal.
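The sensitivity check in the comment is easy to reproduce. A sketch using the commenter's own numbers (the $593 quote at a $100 deductible against the same $3500 repair):

```python
# Same break-even estimate at the lower deductible: the answer barely moves
quote = 593               # $/year collision premium at a $100 deductible
payout = 3500 - 100       # insurer's share of the same $3,500 repair
years = payout / quote    # premium-years needed per at-fault claim
print(round(years, 1))    # ~5.7 years, consistent with the 5-10 year range
```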


From your source, the combined average is 13,476 miles per year. Only if you look at men alone, or more specifically men aged 20-64, do you get to 15,000 miles in less than an average year of driving.


Nah, your insurance would go up, unless you started killing people.

Besides, you basically just described most teenage/young adult or inexperienced drivers.


> "If I had hit someone with my car like that I would be sitting in jail right now."

The best way to kill someone in America is with your car. As long as you're not drunk and can afford a lawyer you're going to get a slap on the wrist.


If you ever kill anyone with your car, your post will be used as evidence in court against you. I don't think it was too smart to post this, honestly.


Stating a hypothetical is not evidence of a crime.


In many high profile murder cases, you'll hear details about what they were searching for or watching on Youtube before that.


Just because the media reports it doesn't necessarily mean the court will consider it admissible evidence, but even so, looking up ways to kill people online is completely different from a tongue-in-cheek remark about how the legal system treats vehicular deaths.


I think you can argue the idea was out there.

http://freakonomics.com/podcast/the-perfect-crime-a-new-frea...


I have never driven a car.


They haven't even charged the actual driver in that car, the one watching Hulu on their phone.

It was only natural that the law-insulating aura that operating a vehicle provides in the US is extended to "self"-driving vehicles.


It was deemed that the pedestrian was also partially at fault. She was crossing the road at night on a dimly lit stretch, nowhere near an intersection, in dark-colored clothes, with seemingly no awareness of oncoming traffic -- the opposite of pedestrian safety practice for walking or running at night. If you watch the video, even a human with average reaction speed probably wouldn't have hit the brakes in time; the person was only visible for 1.5 seconds before impact. I believe the term is contributory negligence, when both parties make mistakes that lead to the accident.

Grounding the fleet would have saved her life, but it's arguable that any other driver could have hit her, applying the brakes only for a short period of time -- not enough to stop or slow to a less lethal speed before impact.


You're watching a compressed-to-shit video from a cheap dashcam. Human eyes have literally a thousand times the dynamic range of the pinhole sensor in that thing.

Sorry, but this is a human life we are talking about here.


Tax reform proposal: the limited liability shield is incredibly inexpensive, bordering on free ($600/yr here in MA). On the other hand, it’s insanely valuable. Let’s generate tax revenue by charging fairly for this incredible gift to business operators. In lieu of system wide income taxes, we can charge a percentage of revenue in return for limited liability protection. Don’t like the tax? Chance it, and shareholders can take legal responsibility for the actions of their organizations.


Well let's call it what it is. You want a revenue tax on corporations. The obvious point is: high turnover businesses that make profits through volume would pay a disproportionate tax - a tax which ostensibly is to do with liability insurance, but doesn't correlate with the things that are likely to incur liability.


I don’t think we should lean on the negative impacts progress has on social contracts, since this is one of the primary arguments the alt-right uses to justify their position. Social contracts always change, especially so when a country is composed of an ever increasing portion of immigrants, asylum seekers, and migrants. Despite the blood, sweat, and tears, being open to social change ensures society doesn’t get locked into a local minimum via well intended but ultimately misplaced concern.


Do you mean to say that a change to the social contract is necessary where it could be acceptable in some cases for someone to get killed by a car? I am just trying to understand what you mean by this comment.


I don't think they're saying that change is bad but that THIS change is bad - a regression for a hard fought thing. Characterizing all such critiques as the same reduces the toolset we have to answer things like the alt right.


I honestly do not understand your point. What is confusing about corporate vs individual liability? The corporation (Uber) will be held accountable at a civil level, and employees can and do get held accountable at criminal levels (and that is also possible at this point). All this "let's frog-march the execs" talk because "if I did X I'd be in jail now" stuff is tiresome. I honestly do not think you understand the law well enough. Perhaps you should be upset over the enforcement of the law (which has been repeatedly and intentionally gutted in multiple areas of regulation by a major political party in the US). But that is a social and cultural problem, not a problem with the legal entity of a corporation.


Potato, potahto.

The law is written such that the corporations can do more with less accountability than people.

The law is enforced such that the corporations can do more with less accountability than people.

It seems like a very small difference at this point.

It just seems that corporations can commit crimes with far less disincentives (i.e. punishments) than people.


Actually, I'd add that this is more related to the importance of the individual or company as a node in the country's economic network.

The larger the impact on the network in terms of either magnitude or volume of souls owned, the greater the ability to get away with actions that forward individual gain over general good.

Shareholders and directors of many corporations do get prosecuted.

The disincentive for the government is always going to be the effect of removing an important node from a network. Especially when the government and people gain from the stability of the network, even if certain negatives happen.

Not to say the way we treat corporations isn't really horrible. It is. Adam Smith would roll in his grave if he knew we were justifying our corporatocracy on his writings.

I'm simply pointing out that large nodes are always treated differently because it is logical to do so. If 3 people do something and you take them out, the economy keeps up exactly the same.

But take out the head of a company that owns... er... hires 100,000 souls? Or maybe the company only hires 300 souls, but they are developing nuclear bombs, or maybe developing a new gene editing tech, or green fuel? Then we have magnitude instead of volume. You got yourself political backlash if you remove these nodes.


That's called cowardice, and it's used as justification all the time, but that doesn't make it right or even rational. In fact, there is no reason to believe the economy won't be stronger for allowing corporations to fail and holding directors and executives accountable.

Proponents of the banking bailout used this same rationale, that allowing big banks to fail would lead to economic collapse, but that was only the rationale, not the reason for that course of action. Bernanke, Paulson & Geithner were instead worried about blowback to their own careers and economic futures, because it would be their friends in high places taking the hit.

If some large banks had failed, other economic actors would most certainly have swooped in to pick up the pieces and resume business. These were lucrative, real assets after all. The danger was not economic collapse, but how long the rebound would take, and more pointedly, the effect on Bernanke and friends' careers.

We retard progress and innovation if we don't hold institutions accountable and allow them to fail. Bankruptcy is the mechanism we use to allow institutions (and people) to bounce back quickly from failure, so they can use their learnings to try another tactic.


My comment was on why governments don't intervene as quickly when detecting bad behavior from important nodes. This applies to the US as to the other countries. I seem to remember a quote from some Russian book I read that went along the lines of "we crucify small thieves and celebrate the big ones".

Your comment on the bailout is something I agree with. Using a non-intervention consideration to justify intervention... has nothing to do with the validity of the non-intervention consideration.

I find it a very odd tangent to use as a reply, since it sounds like a rebuttal but doesn't address the point. It does add to it, in a way, I guess.


> corporations can commit crimes with far less disincentives (i.e. punishments) than people

And that's exactly by design. Wealthy people own corporations and pay lobbyists to stay above the law.


Lobbyists, by definition, are used to try to influence legislation (i.e. shape laws), not avoid prosecution once laws have been broken.


> employees can and do get held accountable at criminal levels (and that is also possible at this point)

What? American employees never get held accountable for white collar crime in startups

Every startup I was a part of in America was engaged in fraud, and the co-founders would laugh about it in front of employees


Did you report it? If not, do you understand you are complicit in the problem?


That's naive to think it would change anything. The investors certainly don't want to know and the police don't care. At least in USA


Individual liability carries both criminal and civil penalties, with the criminal prosecution generally making up the bulk of the work.

Corporate liability only carries civil penalties, even when the corporation's behavior is well over the line of a constructive mens rea. A harmed individual is left trying to prove an essentially novel case against a well-funded company - compared to the criminal system where most defendants are presumed guilty and eagerly prosecuted by the state.

If an individual modified their car to develop self-driving software, and the car hit and killed someone, that person would likely be sitting in jail for gross negligence. Uber's insurance carrier will pay out, push Uber to do some "internal reforms", and the actual people responsible for the poor culture won't be affected. That is the double standard people are bemoaning.


You have missed the distinction between not understanding the law, and understanding it perfectly well and disagreeing with it.


Sometimes we have to have trust in private entities in order to achieve technical advancement. Also we don't want to discourage it for others


We certainly do want to discourage willful negligence.


Do you understand the concept of limited liability?


I believe this is precisely the concept under critique.


>Do you understand the concept of limited liability?

I sure do. But I didn't when I was younger; growing up as a member of the lower class, I was never exposed to business. That's why I find it so appalling. The concept of being able to run a business into the ground, maxing out debt, ignoring all reality while paying yourself a paycheck, then just walking away from it in bankruptcy and doing it again is just mind-blowing to the blue-collar, working-class mentality. There's a whole class of people in our society who may or may not provide any benefit, but who nevertheless get by making vast sums of money in prestigious, high-paying jobs and have no concept whatsoever of what it is like to work for a living. The whole system is completely rigged against anyone who doesn't meet that threshold, and I'm increasingly convinced that this is the entire basis of capitalism.


Limited liability is as important to the little guy as anyone else. If you want to open your own store, start your own carpentry business, or run your own taxi, then you should form some kind of business entity with limited liability, such as an LLC. That way, if it turns out that a product you sold is defective and hurts someone, if a staircase you built collapses on someone, or if you get into a car accident, you're only personally on the hook for whatever you invested into the business, not everything you're worth.


I'm just a dumb worker, but I would like business owners to be forced to do their absolute best to avoid making dangerous collapsing staircases.

I don't know how it would work, I guess they'd have to pay for more insurance.

But it seems that the way it is now, everyone has to pay for their ability to move fast and break things.

Just throwing ideas around, my economics knowledge is rudimentary at best.


We all benefit from innovation and risk-taking. The advances and comforts of modern society would be impossible without the risk imposed by factories, machinery, and so forth.

If starting or joining a business venture meant putting everything you're worth on the line in the case that one employee is negligent, then no sane person would do it.

When limited liability is abused, such as when a corporation is undercapitalized[1], courts can "pierce the corporate veil" and hold its owners personally liable for judgments against the company.

[1] Undercapitalization is when not enough assets are kept in the business to pay for its debts or reasonably expected liabilities.


They clearly don't, and obviously don't understand why it exists. Nothing interesting would be possible without it. Nobody on Earth would be a C-level executive if they had criminal liability for all their employees. I sometimes think that people that want to end it have never been in charge of anything nontrivial and have no idea how terrible the consequences of ending it would be.


>They clearly don't, and obviously don't understand why it exists. Nothing interesting would be possible without it. Nobody on Earth would be a C-level executive if they had criminal liability for all their employees. I sometimes think that people that want to end it have never been in charge of anything nontrivial and have no idea how terrible the consequences of ending it would be.

Well, perhaps those things should be illegal or made untenable. If you are socializing risks in such a way that you are privately profiting from public endangerment, you really don't have any moral or legal right to that money. Yet here we are again in a situation where people are killed, an executive says "we're sorry", and they continue pocketing the money.


How do you propose we address the problems we face?


Per my earlier comment (https://news.ycombinator.com/item?id=18657024): regulating, and enforcing regulations. This already exists and, when not deregulated or captured, works well. There is no magic system that removes the human element; the cost here is a vigilant population that holds elected officials responsible and doesn't allow deregulation or regulatory capture. No possible solution works absent a population that considers it important. This is exemplified in jurisdictions where police turn a blind eye because so-and-so is the son of so-and-so and it's tolerated. Criminal liability for executives (vs limited liability and corporate regulation) is just as weak and has large unintended consequences.


Sounds like they're "moving fast and breaking things"


in their case it is "moving fast and killing people"


> Miller also suggested that Uber might want to drastically scale back its testing program. "I suspect an 85% reduction in fleet size wouldn’t slow development," he wrote.

The vehicles create a firehose of data for the engineers/devs to deal with when there are large fleets. Things get ignored, and you have to dedicate more and more time to building good tools to filter through anomalies, incidents, and disengagements. With a smaller fleet, they'd see fewer problems come up, but perhaps be able to diagnose and act on more of them.


And that is going to put a crimp in their IPO plans.

This sort of corporate malfeasance, pursuit of product in the face of a clear and present danger to the public, is the sort of thing plaintiff attorneys use to drain said corporations of all the capital they need to operate.


Surely this is corporate manslaughter at the very least? Systematically ignoring such dangerous problems should result in jail time for someone.


Why? There is no such legal concept. Manslaughter is criminal law, and criminal law only applies to people.

In other words, either that woman watching hulu behind the wheel is responsible, or perhaps some other employee, but a person is.

The corporation is a legal fiction that was not driving any car for the same reason the tooth fairy wasn't riding a pink unicorn next to it.


Citizens United gave them the constitutional right of free speech granted to citizens. The problem is that we let the corporations have it both ways. There are no easy answers but if I am a corporation of 1 person and commit manslaughter do you think it would be handled the same as Uber?


That's an edge case. I would say no, because 1 person companies are not treated the same.

Generally to get this sort of treatment it needs to be an employee that fucks up (meaning not the owner) and the company needs to guarantee it'll take civil responsibility (meaning pay up damages).


If corporations don't have the responsibilities of people then they shouldn't have the privileges either.


In the US they have way more privileges, if anything.


Look, there are 2 forces at work here in what we, as humans, seem to want:

People, when working as employees, seem to really protest the idea of being held responsible for anything at all. This has led to laws that state that employees are almost never responsible for their actions during work time (NOT legal advice, but generally speaking: if you aren't actively sabotaging your employer and it isn't a crime, your employer is responsible. Even if you are extremely negligent.)

(Large) corporations tend to be a lot easier to extract money from, and a lot more likely to play fair (i.e. actually pay what they're ordered to pay) compared to individuals. So even if an employee commits a crime during work hours, the employer takes "civil responsibility" (that's the part of the law where money, in an amount to be determined, can take the place of almost any legal remedy). And criminal responsibility? Frankly, usually nobody cares, and employers are generally happy to just fire whoever it was on the spot.

And yes, this is generally true everywhere in the justice system. If the court can trust that you will do whatever it deems necessary, you will get a lot more flexibility out of it. That doesn't mean you can just do anything (American courts especially have historically shown that there are circumstances where they will bankrupt a company if the facts are bad enough, to the point that it's a bit of a joke).

Given the above, of course it wouldn't be reasonable to extract heavy punishments against corporations when employees drive a truck through a kindergarten because they fell asleep at the wheel.

This was a woman that was watching Hulu on her cell phone behind the wheel of a car who, we can be pretty sure, was very clearly told to be at any point ready to take over the driving. And, I mean, we've all seen the video of her during that accident ... WTF. I am wondering why you would demand the company be punished ?

This is then made further problematic because actual punishments of companies tend to cause large job losses, large tax-base losses, and may cause other companies to leave, exacerbating the problem. So there are political reasons as well not to go too far.


> In other words, either that woman watching hulu behind the wheel is responsible, or perhaps some other employee, but a person is.

Right. If a group of people came together with a scheme that caused car accidents at the rate Uber does, they'd be held liable and they'd receive conspiracy charges.

Perhaps the same should happen to people who come up with such schemes behind the corporate veil.


Precisely - I think this falls to the classic problem of who to blame in accidents involving self-driving cars. Is the manufacturer at fault? The driver? Certainly statements like this suggest it could be argued in this case.


That's not true. Corporations are found criminally guilty in the US not infrequently. I'm not aware of any being convicted of manslaughter though.

There's even a specific crime in the UK of Corporate Homicide.


I've told so many people that I don't use Uber and why. It just goes in one ear, right out the other. Consumer society in the US has no sense of culpability. I don't mean that in the sense that I'm somehow superior. I'm involved in this too. "business as usual" has so much momentum.

It'd help if lawmakers backed up educated advocates, but the lawmakers are at this point bought by lobbies.


I think there's a sense of powerlessness in general. Nothing actually represents the people. It's a race to the bottom.

I vote with my wallet as well, but I don't think the majority of people even consider that.

Even if I'm not making a difference, at least I feel better about what I'm doing.


I expect that his recommendations are a good indication of how seriously Waymo takes safety.


That is why you don't run tests in a production environment


What is the average number of miles per accident for humans? Is it less than 15000? If yes, this is still good.


The average driver drives about 13,500 miles a year [1].

The average driver files a claim for collision once every 17.9 years [2].

That makes human drivers 16 times safer than Uber self-driving cars. This is concerning for people who think self-driving cars are right around the corner! Improved algorithms would need to avoid 94% of the collisions that self-driving cars currently get into in order to have the same failure rate as humans.

[1] https://www.fhwa.dot.gov/ohim/onh00/bar8.htm

[2] https://www.forbes.com/sites/moneybuilder/2011/07/27/how-man...
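The comparison is a quick sketch to reproduce, using the two figures quoted above (13,476 miles/year from the FHWA table and one collision claim per 17.9 years from the Forbes piece):

```python
# Rough comparison of human vs Uber-AV collision rates
miles_per_year = 13_476   # average annual miles per US driver (FHWA)
years_per_claim = 17.9    # average years between collision claims (Forbes)

human_miles_per_claim = miles_per_year * years_per_claim  # ~241,000 miles
uber_miles_per_incident = 15_000                          # from the article

ratio = human_miles_per_claim / uber_miles_per_incident
print(round(ratio, 1))    # humans go ~16x farther between incidents

# Share of its current incidents the AV would need to avoid
# to match the human rate
needed = 1 - uber_miles_per_incident / human_miles_per_claim
print(f"{needed:.0%}")    # ~94%
```

Note the two rates aren't strictly comparable: the human figure counts only filed insurance claims, while the Uber figure counts any "hitting things", so 16x is an order-of-magnitude estimate, not a precise one.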


This is the thing about most of what we're doing with machine learning: it gets exponentially harder the closer you get to 100%. For most of the things we use machine learning for, where it works pretty well, 99% is really pretty great. 99% won't cut it for self-driving cars.

Which isn't to say that self driving cars won't get there, just that the fact that it looks like we are almost there doesn't mean as much as you'd think; we still have the really hard bits in front of us.


> That makes human drivers 16 times safer than Uber self-driving cars.

While that's a difference large enough to make wide-scale deployment of those cars irresponsible, it is not large enough to make one pessimistic about self-driving cars existing soon. It's the kind of difference that often enough goes away with a few years of engineering.

And I expect these to be the worst stats among the self-driving car companies out there, since the others seem to be doing things in a less "move fast and break things" way.


Many accidents have no claim filed


"The average driver" isn't our cutoff for allowing people to drive. Once the self-driving cars are safer than the most dangerous drivers that we allow, there's theoretically an advantage moving those people to self-driving cars.


"The average driver files a claim for collision once every 17.9 years"

That leaves open the question of how much damage that driver is at fault for on average.


Looks like there's still a lot of work left on that front. I was trying to get some context for the 15000 miles number.


AI should be safer than humans... not vice versa.


Citation desperately needed.

I would theorize the opposite.



