Speaking as someone in tech talking to others in tech I'm constantly surprised how optimistic people are about the timetable for fully self-driving cars. Honestly, I think we're still 20+ years away.
Here's another thing to consider: people act differently when it's a machine rather than a person they're affecting. There are countless examples of this, e.g.:
- I live in a building with a doorman. The main purpose of a doorman is to be a person who prevents criminal action from starting. After all, a person can be fooled, distracted, or bribed, yet people seem less inclined to mess with an unpredictable human than with, say, an unmanned security system on a similar building.
- Forget humans, this applies to animals. It's well-documented that just having a dog (or even saying you do) reduces the likelihood you'll get burgled. Thieves will generally pick easier, more predictable prey.
- ... and this brings us back to cars. Reading some of the coverage of Waymo, Uber, Tesla, etc., you see that other drivers behave differently around a car they realize has no human driver than they would around a human-driven one. Cutting it off, messing with it, etc.
How exactly does an autonomous car deal with that aspect of human nature?
> as someone in tech talking to others in tech I'm constantly surprised how optimistic people are
Me too. I can't explain it. Here's one point I try to make (in vain): When people imagine such tech, the level of "just-works" is very high, almost sci-fi high. The car would perform on a superhuman level, making near-perfect split-second life-and-death decisions. There's very little room for software/hardware failures. Where do you, as someone in tech, see anything consumer-facing that is even close to that level of polish? It's <current year>, and I still have to restart my Firefox browser every few hours because it gets bloated [0]. We haven't figured out bluetooth yet [1]. Heck, state-of-the-art CAPTCHAs are using street signs [2].
Of course these are tongue-in-cheek anecdotes, but the point stands: it seems we just haven't figured out how to make complex code work reliably yet.
Consumer software simply hasn't chosen bug-free as a path. You can build software the way NASA does, but then you have to spend money like NASA does, and develop as slowly as NASA does.
Sadly NASA does not pay as well as you think it does, at least from my anecdotal experience (worked with NASA and DARPA folks that left the gov sector for better pay.) You'll get better pay at a tech company in a major US city.
I think that GP is commenting on the overall cost of software at NASA as opposed to individual salaries. At that level, software/hardware bugs will kill people, so testing/safety is paramount.
Building non-safety-critical software is cheap when compared to building software that has been verified to virtually never fail.
Add country to the equation, and you can easily add another 20-50 years onto the 20+ you already mention.
Take India, for example.
-- The infrastructure is poor.
-- Drivers ignore rules to suit their convenience.
-- A lot of people depend on driving for their livelihood. Union minister Nitin Gadkari has already made it clear that they will not let tech that affects the livelihood of so many people come in and take jobs. Most governments that rely on the vote bank of the poorer sections of society will not dare to move against this mandate.
-- The preferred and most popular mode of transportation is still two-wheelers. I don't see anyone building self-driving two-wheelers. That rules out the transportation of almost half of the Indian population.
I don't think India will be ready for another 50 years to have fully self-driving vehicles.
The poor infrastructure could work to their advantage, they could build it to suit new applications. In the West we have a huge legacy infrastructure that is hard to deal with. It’s somewhat similar to the way Africa skipped traditional telecommunications and went straight to mobile.
Congested cities like Bangalore are already reeling under bad city planning. Adding self-driving vehicles is not going to be of much help.
The govt does not have money to spend on road repairs. Corruption at the contract level is rampant. Putting two and two together, I doubt India will use it to their advantage.
China, on the other hand, has horrible traffic problems and nowhere to build new roads in its most dense cities. They also have an autocratic government that can say "no more manually driven cars on or inside the 4th ring road" to optimize road usage.
In my city in Europe, my average speed is 23 km/h, but when I'm actually moving I rarely go slower than 40, more often 60-70 - meaning that I spend a lot of time waiting, and I'm alone in the car. Yes, shared autonomous cars would help my city a lot.
Maybe TomMarius means that autonomous vehicles could negotiate intersections without stopping. No traffic lights required, only rules of precedence. It's going to be a slow procession but maybe the average speed would increase.
I'm not sure how that would mix with pedestrian crossings. In some dense areas there would be a non-stop flow of people walking across the street, and cars wouldn't be able to move.
The cars can work together to reduce riding distances, move off in tandem when a light turns green, and so on. They virtually make the road much wider by eliminating human inefficiency in driving.
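To make the intersection idea above concrete, here's a toy sketch of reservation-based intersection management, in the spirit of what the parent describes. Everything here (the first-come-first-served policy, the 2-second crossing time) is an illustrative assumption, not how any real system works:

```python
# Toy reservation-based intersection manager: each car requests a time
# window to cross, and the manager grants the earliest slot at which the
# intersection is free. No traffic lights, only an ordering rule.

def schedule(requests, crossing_time=2.0):
    """requests: list of (car_id, earliest_arrival_s).
    Returns a dict of granted crossing start times."""
    granted = {}
    free_at = 0.0
    # Serve cars in order of arrival (first come, first served).
    for car, arrival in sorted(requests, key=lambda r: r[1]):
        start = max(arrival, free_at)   # wait if intersection is busy
        granted[car] = start
        free_at = start + crossing_time
    return granted

print(schedule([("A", 0.0), ("B", 1.0), ("C", 5.0)]))
# A crosses immediately, B is delayed until A clears, C arrives after both
```

A real scheme would track conflicting paths through the intersection rather than locking the whole thing, but even this toy version shows how precedence rules could replace lights.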
Driving is a social activity. You’re interacting with other humans, with only their behavior (and maybe the occasional gesture) from which to infer intent.
From that perspective, think of what a computer has to "know" in order to get along. It's way more complex than just following the road and not hitting things.
Maybe the best chance for a fully autonomous driving experience is some city that takes the next step beyond congestion pricing and reserves their dense core for autonomous vehicles. If computers only have to deal with other computers, and pedestrians, that seems a much more tractable problem.
They'll be on motorways first; there are just so many fewer edge cases.
Not from what I've seen. Heavy construction, sudden gridlock with huge lane speed differentials, multi-vehicle collisions, road debris, heavy trucks, severe inclement weather, high speeds, gore points; these things combine to make for an extremely challenging and dangerous driving environment.
An autonomous car driving through a residential neighbourhood is moving slowly and can stop if a child chases a basketball onto the road. On the freeway, when you're going over 100km/h, the car simply can't stop if one from the next lane spins out in front of you.
> On the freeway, when you're going over 100km/h, the car simply can't stop if one from the next lane spins out in front of you.
Neither can you.
In fact, an autonomous car is much more likely to be able to react fast and well to save lives and property in that situation than a human.
Self-driving cars don't need to be absolutely 100% perfect and accident-free to be worthwhile. They just need to be better than us, and frankly, we're pretty lousy at operating tons of metal flying along the road at over 100km/h. Hell, we're not even that great at it when going 50 km/h.
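To put rough numbers on the reaction-time advantage mentioned above (the figures are textbook-style assumptions for illustration: ~1.5 s human reaction time, ~0.2 s machine reaction time, ~7 m/s² braking on dry pavement):

```python
# Rough stopping-distance comparison at 100 km/h.
# Assumed figures, illustrative only: human reaction ~1.5 s,
# machine reaction ~0.2 s, braking deceleration ~7 m/s^2.

def stopping_distance(speed_kmh, reaction_s, decel=7.0):
    v = speed_kmh / 3.6                       # km/h -> m/s
    return v * reaction_s + v**2 / (2 * decel)  # reaction + braking distance

human = stopping_distance(100, 1.5)
machine = stopping_distance(100, 0.2)
print(round(human, 1), round(machine, 1), round(human - machine, 1))
# ~96.8 m vs ~60.7 m: the faster reaction alone buys ~36 m of road
```

Under these assumptions, the shorter reaction time is worth several car lengths, which is exactly the margin that decides whether a spin-out ahead becomes a collision.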
Humans have the advantage of understanding context well enough to (try to) avoid being in those situations in the first place. You may not be able to avoid a car suddenly spinning out right in front of you, but you may have realized 7 minutes ago that the driver kept drifting outside his lane, and backed off to give him a bigger gap. Getting autonomous cars up to that level would require strong AI tech.
Not at all. That level of pattern recognition is much easier than the actual hard problems involved in getting driverless cars working.
The hard problems, right now, are things like recognizing where the road is when it (and everything beside it) is under an inch and a half of snow, or where you're supposed to drive when there's construction and the lanes are shifted (assuming no new standard means are developed to indicate this), or how to recognize, and what to do, when the road you want to take is under a foot of rushing water—or washed out altogether.
This is the thing that continues to baffle me. There are genuine, hard problems between where we are now and a nation of completely autonomous cars. But every time there's a discussion about it, even on a site like Hacker News where people should have the background to recognize the difference, most of the problems people bring up are the kind that self-driving cars either are now or can fairly easily become good at dealing with.
People say this, but I suspect that "epsilon better" is not enough. Personally, to give up my (fallible) agency, I would require an order of magnitude better, perhaps two.
My dream - which I'm sure I won't live long enough to see come to fruition - is that manual driving is BANNED.
That's right. Some day, I hope and believe, the average consumer actually won't be able to manually operate their vehicle (edge cases aside). I love to drive, but there are some true idiots that we share the road with. I'd gladly give up my right to drive if it meant those idiots also lost theirs. Driving manually would require a specialized license, contingent on passing some stringent checks.
So that's how I can see it being dealt with in the future. But the early stages? That's a very interesting question and one which I don't have any idea for.
I think it would be just as easy for the law to turn the other way and say you must have a driver competent to take control of the autonomous car at all times for it to be allowed on the road. This negates the pipe dreams of empty cars appearing to scuttle people away, sleeping on your way to work, riding one while blackout drunk, etc., but it is perfectly in the context of safety-first auto regulations. Road laws are made to favor safety above all else; it's why highways are still capped at 70mph across most of the U.S. despite cars being much safer at speed than when these highways were built in the 1950s.
While I fear you are correct, I enjoy driving. Taking away the hobby I enjoy, and my family has enjoyed for generations, will not be done with my consent, nor the consent of many of the car clubs, often populated by those who own the companies that will enact such measures. I respect your vision, but you have a fight ahead. I think an alternate type of transportation than cars is the future. Most current emissions are heat and brake dust, and autonomous cars are only marginally better there.
That must have been what people thought in 1994/1995 at the final presentation of PROMETHEUS:
"The first was the final presentation of the PROMETHEUS project in October 1994 on Autoroute 1 near the airport Charles-de-Gaulle in Paris. With guests on board, the twin vehicles of Daimler-Benz (VITA-2) and UniBwM (VaMP) drove more than 1,000 kilometres (620 mi) on the three-lane highway in standard heavy traffic at speeds up to 130 kilometres per hour (81 mph). Driving in free lanes, convoy driving with distance keeping depending on speed, and lane changes left and right with autonomous passing have been demonstrated; the latter required interpreting the road scene also in the rear hemisphere. Two cameras with different focal lengths for each hemisphere have been used in parallel for this purpose."[1]
How could you not expect all cars to drive autonomously in the year 2000 at the latest if we were already so far? Everything else was just a bit of doing, right? Unfortunately, sometimes we don't know what we don't know, so things may seem "just around the corner" when they are many years away.
> How exactly does an autonomous car deal with that aspect of human nature?
Sure. One way they could handle it is to drive safely enough that it doesn't matter at all whether this happens.
Having the car in front of you hit the brakes really hard is of no consequence if you were maintaining a safe follow distance to begin with. Let the human slam on the brakes. The self driving car will have been driving the speed limit, at a safe distance in the first place.
The other week I drove a hire car with "intelligent cruise control" which, as you'd expect, tried to keep a safe following distance from the car in front.
The problem was, the cruise control following distance was not the ~2 seconds recommended by my country's highway code, but more like ~3.5 seconds. Other drivers would interpret that as an invitation to merge, and my car would slow down further to restore its ~3.5 second following distance.
I don't think you can drive in such a way that other drivers' behaviour is of no consequence.
There are plenty of cities in the U.S. where the local driving culture is so aggressive that if you are unwilling to cut someone off, you just aren't going to be able to merge at all or get through the intersection. That will undermine the brand if people are supposed to be commuting in these cars through rush hour and it takes twice as long because the cars are too passive.
It's common to merge onto a highway under speed, knowing you will be hammering the throttle and the car in the right lane behind you will slow down. But could self-driving cars make that same judgement to merge, or would they stay in the onramp lane waiting for a nonexistent opening? And if they do make that judgement and other cars do let themselves be cut off, what happens the 1/1000 time this maneuver results in an accident and the self-driving company is brought to court? These companies are going to be held responsible for judgement calls that human drivers make every day. There's a highway near me where, to merge onto the next interchange, you have to cross four lanes of traffic in 0.25-0.5 miles. Could a self-driving car fight for that opening during rush hour? I don't think there would be a safe way to do it unless every single car on the road is self-driving, and that's just not possible when most drivers can barely afford $2000 cars from 30 years ago.
I originally typed out a reply saying that if I bought a sure-to-be-expensive self-driving car, I would be disappointed if it made my trip slower.
But then I thought that maybe if I wasn't the one fighting to get into lanes I wouldn't care that my trip ended up a couple of minutes longer, I would just relax and read a book and not even pay attention while all that was happening.
Not sure if that would work in reality though. In my city, if a car that passive were recognized, people would just take total advantage of it and never let it in, cut it off, etc. I think I would still get annoyed watching this as a passenger.
The car would have to make sudden stops to avoid accidents every time someone cut it off as well, which would be unpleasant for a passenger.
My firm belief is that we will have to design roads for self-driving cars before self-driving cars can become the majority. Right now, roads are approximately designed for human sensors. We need roads designed for machine sensors. If we started today, and during every repaving we built roads for machine sensors, we'd have the roadways where most miles are driven repaved in about a decade. I don't know exactly what it will look like. It might look like RFIDs embedded in roads, it might look like self-driving car lanes (similar to the carpool lanes of today), etc. Until this process of redesigning roads for machine sensors begins, I'm not really optimistic about self-driving cars. I think you're right about the 20+ mark.
I agree with you and I doubt we'll see self-driving cars before we see self-driving trains (like trams in cities).
The problem space is vastly simpler for self-driving trains: trains move on rails, rails do not move, trains/trams cannot move sideways at will, stations do not move, and so on.
I've thought it might be possible to make intracity rail traffic more efficient via an on-demand mini-tram service, given self-driving trams and computer-optimized routing. A kind of "packet routing", if the packets were trams.