[Boards: 3 / a / aco / adv / an / asp / b / biz / c / cgl / ck / cm / co / d / diy / e / fa / fit / g / gd / gif / h / hc / his / hm / hr / i / ic / int / jp / k / lgbt / lit / m / mlp / mu / n / news / o / out / p / po / pol / qa / r / r9k / s / s4s / sci / soc / sp / t / tg / toy / trash / trv / tv / u / v / vg / vp / vr / w / wg / wsg / wsr / x / y ] [Home]
Autonomous Automobiles

You are currently reading a thread in /g/ - Technology

Thread replies: 22
Thread images: 3
File: 1449739793280.jpg (84 KB, 960x640)
I am going to assume that everyone here knows it's coming. I could easily be wrong about the timetable, but I think it will be fairly common (not ubiquitous, but common) for cars to be able to drive themselves in most conditions under human supervision within ten years, and maybe twice that until people are eating, sleeping, and reading in the back seat of a robot. I think the former type of car will be technologically possible within five years, but it'll take a while for it to become common, both because the price will need to come down and because there are legal barriers that will need to be overcome.

I mostly want to discuss those legal barriers, but this can be a robot-car general.

I've heard people saying that firmware for autonomous cars should have to be government regulated and certified the way that aircraft software is, and I have a question about that: Why not just give a car a driving test, the way that we give people driving tests? Does it make sense to hold people to lower standards than computers? Why be super anal and perfectionist about the competence of a computer when any mentally unstable jackass who can parallel park for an instructor gets a license? The goal of maximizing road safety is certainly a good one, but I think that this is an extremely lopsided approach that stems from a fear of the unknown. Is there any reason that the standards shouldn't be the same for computers and people?

I think we should grant driver's licenses to individual firmware image/machine pairs the same way we grant them to people, using some combination of the same licensing and inspection frameworks that we already use. This way somebody who modifies their physical machine in a stupid way that causes it not to be able to drive itself properly loses the right to let it drive itself, and somebody who modifies their firmware in a way that doesn't make it less effective can still use it legally. People could easily use third party firmware, etc.

What do you think?
>>
Another reason to do it this way is that cars do not exist permanently in the state in which they first leave the showroom. Even more commonly than people make stupid modifications, they allow their cars to fall into disrepair. What happens when you push a meticulously certified firmware update to a fifteen year old car with broken sensors and a corroded electrical system?

Maybe you require firmware not to boot if sensors don't pass tests, and maybe you even require sensors to be redundant so that they can fail on the road without causing problems, but you're naive if you think people aren't going to be buying $10 dummy sensors on eBay rather than $120 genuine OEM ones to get their jalopies back on the road.
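A boot-time sensor gate like the one described above might be sketched as follows (a minimal illustration in Python; all sensor names, fields, and thresholds are hypothetical, not from any real vehicle platform):

```python
# Sketch of a boot-time sensor gate: autonomy may only engage when every
# required logical sensor has at least one healthy physical unit behind it.
# All names and thresholds here are hypothetical.
REQUIRED_SENSORS = {
    "lidar_front": ["lidar_front_a", "lidar_front_b"],  # redundant pair
    "radar_rear": ["radar_rear_a", "radar_rear_b"],
}

def self_test(sensor_id, readings):
    """Pass only if this unit reported recently and flagged its data as valid."""
    r = readings.get(sensor_id)
    return r is not None and r["age_s"] < 0.5 and r["ok"]

def may_engage_autonomy(readings):
    # A $10 dummy sensor that never produces valid data fails self_test,
    # so a car full of dummies can't silently re-enable self-driving.
    return all(
        any(self_test(unit, readings) for unit in units)
        for units in REQUIRED_SENSORS.values()
    )
```

Of course a sufficiently clever dummy could spoof plausible-looking data; cross-checking redundant units against each other would be a stronger (and costlier) defense than a per-unit self-test.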
>>
>does it make sense to hold people to lower standards than computers?
Nice wording dipshit. Try:
>if we can make computers better at driving than people should we?


Also, we know that humans can adapt to different situations; we don't need to test that. A short computer program with a set of instructions could pass a driving test. The question is: can it adapt to every different scenario?
>>
>>53698305
>if we can make computers better at driving than people should we?
That is not a matter of "wording". This is a completely separate question. Whether we can and should make them better has no bearing on whether or not we should -require- them to be better. Why should a computer have to be better at something than a person, rather than simply "as good" in order to be allowed to do it instead of a person?

My driving test involved driving on city streets in traffic, not on a closed course. It was a short demonstration of a random and unpredictable (due to the random and unpredictable nature of traffic) subset of the skills required of a competent driver. No "short program" could pass such a test, and if that is true of any test then it is no more valid as a test for humans than it is as a test for computers.

It's true that a driving test could miss bugs and quirks, but in humans the same sort of testing can miss things like night blindness, alcoholism, schizophrenia, dementia, etc (all of which could potentially develop for the first time after the test, even if we did use tests that would catch them). That's not to mention that a large percentage of people forget or permanently disregard a huge percentage of traffic laws and safety guidelines immediately upon completing their tests, make terrible decisions like texting and driving, suffer various forms of fatigue, etc. Basically, driving tests are woefully and completely inadequate for determining whether or not someone is likely to behave in ways that are dangerous in any of the infinite possible driving situations that they might ever find themselves in, but we figure that it would be impractical to do better, so it's good enough, despite the fact that people kill each other with cars literally all day every day.

Why should it be different for computers, which completely unlike humans are at least guaranteed to always perform as well as they did on their driving test in the same conditions?
>>
>>53699093
See, you're missing the point of all this - not just autonomous vehicles, but AI in general... maybe even technology in general.

Consider the technology of transportation. Thousands of years ago, faster transportation was a faster person. Then horses, then trains, then cars, then planes, etc..
When we made each development, the definition of "fast transportation" changed. You wouldn't judge the speed of a person by the speed of a horse, or the speed of a car by the speed of a plane. New technology creates its -own- standards.

Relating this to AI and autonomous vehicles: a smarter AI is only currently judged against human intelligence in things like Go because we humans are still the "faster transportation". But in driving, computers are better than us - just look at Google's self-driving car statistics. The main reason to use autonomous vehicles is that they're better - they won't drunk drive, text, or be tired. We must hold them to a higher, new standard because 1) they've created a new standard, and 2) otherwise... what's the point?
>>
/o/ here. I will continue driving my glorious five speed RWD V8 powered 90s shitbox forever and ever.
>>
>>53699459

i doubt self-driving cars will be able to manage weather conditions like icy roads mixed with contextual conditions like an erratic cyclist weaving around on the shoulder during heavy rain
>>
>>53699530
>Humans are so much better! XDD
>Computer will never be able to do X or Y or Z!!!!
I.... Ok
>>
>>53699459
>You wouldn't judge the speed of a person by the speed of a horse
Indeed not. Rather, you judge the speed of a horse by the speed of a human. That is to say that we wouldn't use horses for transportation if they weren't faster, or at least superior in some respect like tow capacity or whatever. Humans were the benchmark by which we judged the usefulness of horses as replacements for human labor. And why shouldn't that have been the case? That's exactly what I'm talking about doing with AI. Use human capability as the benchmark. Well, that's half of it at least.

Beyond simply saying that they should have to be demonstrably as safe as humans, and no safer, I'm also saying that there's no reason that the method of demonstration should be radically different. After all, unlike a human who is walking vs a human who is flying a passenger jet, a human who is driving a car and a computer that is driving a car are both doing exactly the same thing.
>>
>Rather, you judge the speed of a horse by the speed of a human.

Sure, if you want to figure out which is faster. But we know horses are faster, so why would you do that? To determine if a horse is fast, you judge it by the speed of other horses.

Following the analogy, if we know computers are better drivers than humans then they need their own, new, comparative tests.

If computers are not better drivers than humans, however, then you're right - let's start with basic drivers' tests. But at that point all you've proven is that computers are equal to humans at driving, which is not what we want. The entire point of self-driving cars is to be better.
>>
>>53699459
>But in driving, computers are better than us - just look at google's self driving car statistics. The main reason to use autonomous vehicles is that they're better - they won't drunk drive, text, or be tired. We must hold them to a higher, new standard because 1) they've created a new standard, and 2) otherwise... what's the point?
>they've created a new standard
No they haven't, and the evidence that they haven't is that >>53699493 will still be allowed to drive his 90s shitbox despite not being as good at it as a computer would be. As long as humans are still allowed to drive, we will either have multiple standards, or humans will be the standard. To have multiple standards in these circumstances could actually cost lives by increasing the difficulty of switching to autonomy (making it slower, more expensive, less appealing, etc). Even if that were not the case (and it might not be), why should we be allowed to drive manually, not be allowed to use a tool that's statistically 0.01% less likely to crash than we are (which is obviously terrible for AI), but allowed to use a tool that's 80% less likely to crash? Where is the sense in the idea that it has to be a LOT better, to some arbitrary degree, or you can't have it?

>otherwise... what's the point
Even if the statistical safety of autonomous cars was exactly the same as that of humans and would never get better, there would still be reasons to use them. They have the potential to tremendously improve the independence of people with various disabilities, for example.
>>
I agree, except I'm a little confused here
>To have multiple standards in these circumstances could actually cost lives by increasing the difficulty of switching to autonomy (making it slower, more expensive, less appealing, etc)
Are you saying that if half of the people in the world switched to high standard safe autonomous cars, and the other half didn't because of your given reasons, that it would be a more dangerous road?
Or are you just saying that we should force autonomy on people?
>>
>>53700164
The biggest, strongest, fastest and best duck in the world is still not useful to humans as transportation. We use(d) horses and not ducks because horses are faster than humans, not because they're faster than each other. We are the benchmark. We never outlawed slow horses because horses can theoretically be fast. Why should we outlaw incompetent AI because theoretically they can be good?

Regardless of the law, people will still try to get the best one that they can, because it makes sense to do that, and that will drive development and improve safety and so on and so forth.

>>53700361
The first one.
>>
File: cctv.jpg (33 KB, 400x300)
can't wait until a car drives you to the nearest police station because you said the government is bad on an anonymous board, all autonomously.
>>
File: lit on nihilism.png (984 KB, 3180x2088)
>>53696885
b-but muh trolley problem
>>
>>53700199
>will still be allowed to drive his 90s shitbox despite not being as good at it as a computer would be

Yeah right, his insurance company is going to gouge the shit out of him for not using an autonomous car. In their eyes, a human controlling nearly 2 tons of metal is a gigantic liability when everyone else has sophisticated AI controlling their vehicles.
>>
>>53700428
>We never outlawed slow horses because horses can theoretically be fast.
True, but if you needed a fast horse, you didn't use a slow horse.
Use slow horses for plowing the field, not doing fast horse things.

>The first one
Okay, this is a bit ridiculous. Uber drivers make up a very small percentage of drivers on the road, yet they've had massive impacts on crash rates in cities because they cut down on drunk/tired/stupid driving.
If you have 50% autonomous cars with a 2% chance of crashing and 50% human-driven cars with a 5% chance of crashing, then the total expected crash rate is .5(.02) + .5(.05) = 3.5%, vs all human-driven cars at 1(.05) = 5%.
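The weighted-average arithmetic there can be sanity-checked directly (the 2% and 5% rates are the post's illustrative numbers, not real-world data):

```python
# Expected fleet crash rate = share-weighted average of per-group rates.
mixed = 0.5 * 0.02 + 0.5 * 0.05  # 50% autonomous at 2%, 50% human at 5%
all_human = 1.0 * 0.05
print(mixed, all_human)  # ~0.035 vs 0.05, i.e. 3.5% vs 5%
```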
>>
>>53700452
A simple, accessible certification process could prevent this by allowing third party firmware, a la OpenWrt, or whatever.

>>53700571
>Yeah right, his insurance company is going to gouge the shit out of him for not using an autonomous car. In their eyes, a human controlling nearly 2 tons of metal is a gigantic liability when everyone else has sophisticated AI controlling their vehicles.
Personally I've thought for a long time that driver's licenses should be at least five times as hard to get. I know people who absolutely never check their mirrors, obliviously drive around in cars at risk of sudden failure, don't know what all of their controls do, etc, and I don't think they should be allowed to drive. The argument is that you can't set the barrier too high because people need to be able to get to work or whatever, but IMO these people should learn to drive or move to a walkable city, period. What I suspect (and hope) will happen is that this argument will eventually stop working and that the barrier will rise enough that a driver's license is a meaningful certification of actual skill, and its status as such will allow competent people to continue to drive if they want to. There will be less leeway for things like epilepsy, less forgiveness for things like DUIs, etc. Basically, a driver's license will be seen as a privilege and not a right, like a pilot's license.
>>
Honestly, if you can't drive properly you should have been killed long ago. And if you ever get in an accident that isn't a mechanical failure or some idiot slamming into you, you should never get behind the wheel ever again.
>>
>>53700602
>Okay, this is a bit ridiculous. Uber drivers make up a very small percentage of drivers on the road, yet they've had massive impacts on crash rates in cities because they cut down on drunk/tired/stupid driving.
>If you have 50% autonomous cars with a 2% chance of crashing and 50% human-driven cars with a 5% chance of crashing, then the total expected percent is .5(.02)+.5(.05) =3.5% vs all human-driven cars at 1(.05) = 5%
You've completely missed the point.

When I agree "that if half of the people in the world switched to high standard safe autonomous cars, and the other half didn't because of your given reasons, that it would be a more dangerous road", I'm saying that it would be more dangerous than a world where the barrier for autonomy is lower, not more dangerous than a world without autonomy.

Imagine a world where autonomous cars are expensive because designing and certifying them is hard. 50% of people can afford them, 50% can't, and crappy ones don't exist. Now imagine a world where autonomous cars are cheap because designing and certifying them is easy. 50% of people have good ones, 40% of people have crappy ones that are still better than humans, and 10% of people don't have them. Which world is safer? It seems like the second world is safer.
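With made-up rates that respect the post's assumption that even crappy autonomous cars beat humans (say 2%, 4%, and 5% crash rates), the two-worlds comparison can be made concrete:

```python
# Toy comparison of the two worlds; all rates are invented for illustration.
def fleet_crash_rate(groups):
    """Expected crash rate: share-weighted average over fleet segments."""
    return sum(share * rate for share, rate in groups)

HUMAN, GOOD_AV, CHEAP_AV = 0.05, 0.02, 0.04  # illustrative rates only

world_1 = fleet_crash_rate([(0.5, GOOD_AV), (0.5, HUMAN)])  # high barrier
world_2 = fleet_crash_rate([(0.5, GOOD_AV), (0.4, CHEAP_AV), (0.1, HUMAN)])  # low barrier
# world_1 ~ 0.035, world_2 ~ 0.031: the low-barrier world crashes less
```

The exact numbers are invented; the point is only that replacing a big human segment with a "crappy but still better than human" segment lowers the fleet-wide rate.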

There are of course various reasons why lowering the barrier may or may not result in a safer road, but I don't think the answer is obvious. To be clear, it was never an important point in any of my arguments.
>>
>>53701175
Ah, I see, so now it just depends on the numbers.

this is just a basic optimization problem depending on crash rate and accessibility

Ignoring the many extraneous variables that will come into play, if you can make 90% instead of 50% of people use self-driving cars by accepting only a minute increase in the crash rate of the accessible cars, then, yes, you're totally right. But if you have to accept too high a crash rate to make the cars accessible, then it won't be worth it.
>>
>>53701287
As far as I can tell, you are never worsening the crash rate by getting people into autonomous cars that are better than they are by any degree, unless people are choosing crappy ones over good ones. That is, if people will settle for one that's "good enough" instead of buying the best one that they can get, then introducing sub-par ones could negatively impact safety. If people will try to buy safe ones, though, then I don't think it could.

In terms of performance and luxury and so on, people will absolutely settle, but in terms of safety I don't think they do. A four star vs a five star safety rating could easily be the deciding factor between two entry-level cars for many people, I think.