You are mixing two fundamentally different things. I made it clear I am not referring to robots containing humans. A person wielding a hammer is fundamentally different from a pre-programmed hammer some human unleashed on its own. I consider the latter a serious threat.
If you don't mean "right" then don't use that word. As is, you have redefined it to be practically meaningless.
I'm not convinced you did re-define it. Your edit makes it more clear:
"it's not because I'm giving robots special treatment"
Yes, you really are. You are arguing that since it's owned by someone it has "plenty of right" to use the road. That's wrong. Par with humans is as special as treatment gets.
F_r = -F_h
m_r · a_r = -m_h · a_h
a_r = -a_h (assuming equal masses)
Which a is more important? Should we treat them even remotely the same when one a ends a life and the other a causes a financial loss?
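For what it's worth, the asymmetry can be sketched in a few lines; the contact force and masses here are made-up illustration values, not real crash data:

```python
# Newton's third law: the contact force in a collision is equal and
# opposite, so each party's acceleration is the force over its own mass.
# Force and masses below are made-up illustration values.

def collision_accelerations(force_n, m_robot_kg, m_human_kg):
    """Magnitudes of each vehicle's acceleration for a shared contact force."""
    return force_n / m_robot_kg, force_n / m_human_kg

a_r, a_h = collision_accelerations(50_000.0, 1500.0, 1500.0)
# Equal masses give equal |a|; the difference is what each a is
# applied to: an empty machine vs. a human body.
```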
I mean... oops. Sorry for the hammer malfunction. We sent out a patch and fired someone though.
Admit it: the end solution is just to remove humans from having direct control over their KE and direction, right? Programmers are always smarter, aren't we?
I often wonder if the pre-programmed car proponents realize what side they are on in the war on general purpose computing.
> If you don't mean "right" then don't use that word.
If you so desire, I won't use it, sure.
> A person wielding a hammer is fundamentally different from a pre-programmed hammer some human unleashed on its own. I consider the latter a serious threat.
> Which a is more important? Should we treat them even remotely the same when one a ends a life and the other a causes a financial loss?
It's more complicated than that. You can drive a car remotely, and you can have a robocar with a human in it.
A car with a human in it deserves more protection, but there's no reason it must have priority in using lanes. A car en route to pick up a human might as well have the same priority as one delivering a human.
Whether a car is being controlled by computer or human being shouldn't matter at all. Whether it's carrying a human should matter in some ways but not in others. One case where it shouldn't matter is blocking it; neither should be blocked.
"Whether a car is being controlled by computer or human being shouldn't matter at all."
There is human-equivalent software? That's astonishing. Where can I find it? Does it run on ARM?
Edit: Bummer. I called Denso, they were adamant that their human-level code v3.11 is a trade secret.
"A car with a human in it deserves more protection"
We agree. _please_ explain how to accomplish that.
Remember, KE is relative, therefore velocity and mass limits for the pre-programmed machine are not relevant.
Clearly my empty pre-programmed human-carrying drone should enjoy the same lanes as a loaded 777. Let's set aside the distraction about remote control, which totally ignores the "skin in the physics game" that transportation of life requires.
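A rough sketch of the "KE is relative" point: the energy available to do damage in a head-on crash scales with closing speed, not with either vehicle's capped speed alone. The masses and speeds here are assumed round numbers:

```python
# Energy dissipated in a perfectly inelastic head-on collision is
# (1/2) * mu * v_rel^2, where mu is the reduced mass. Masses and
# speeds below are assumed round numbers for illustration.
MPH_TO_MS = 0.44704

def dissipated_ke_j(m1_kg, m2_kg, v1_mph, v2_mph):
    mu = m1_kg * m2_kg / (m1_kg + m2_kg)   # reduced mass
    v_rel = (v1_mph + v2_mph) * MPH_TO_MS  # head-on closing speed
    return 0.5 * mu * v_rel ** 2

# A drone capped at 20 mph still meets a 50 mph oncoming car at a
# 70 mph closing speed; its own speed cap alone bounds nothing.
capped = dissipated_ke_j(1500.0, 1500.0, 20.0, 50.0)
uncapped = dissipated_ke_j(1500.0, 1500.0, 50.0, 50.0)
```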
Did I imply that? I don't see how that's relevant.
> We agree. _please_ explain how to accomplish that.
If you agree with me then why do I have to explain anything?
You're being confusing with all your talk of kinetic energy. Are you trying to imply that robocars inherently make the roads more dangerous? I don't think that's true at all. As an extreme example, even with today's tech you could flood the roads with 20mph robocars and make things safer overall.
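The 20 mph claim has straightforward physics behind it: kinetic energy goes as the square of speed, so slow robocars carry far less energy into any crash. The mass and speeds here are assumed round numbers:

```python
# KE = (1/2) * m * v^2: energy grows with the square of speed.
# Mass and speeds below are assumed round numbers for illustration.
MPH_TO_MS = 0.44704

def kinetic_energy_j(mass_kg, speed_mph):
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v * v

ke_20 = kinetic_energy_j(1500.0, 20.0)
ke_60 = kinetic_energy_j(1500.0, 60.0)
# Tripling speed means roughly 9x the crash energy.
```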
> Clearly my empty pre-programmed human carrying drone should enjoy the same lanes as a loaded 777. Lets set aside the distraction about remote control which totally ignores the "skin in the physics game" which transportation of life requires.
If you want to pay the same airport fees, go for it. Seems fair to me.
If you're worried about congestion and road funding when it comes to cars then limit priority to one robocar per person. But when I have exactly one car, my ability to use the roads to drive it to the store shouldn't depend on whether my butt is inside of it.
Why would "skin in the game" be necessary? A taxi with a driver inside can act exactly the same as a taxi without a driver inside.
I believe a vehicle akin to the one you're describing ("act exactly the same" is a bit unnecessarily specific, but "drive safely on a road under its own direction from a point A to a point B chosen by a human") is definitely what Waymo is prototyping, and such vehicles have already driven on public roads (with human occupants, but without human occupants operating the control surfaces).
There will never be conventional digital pre-programmed cars that "act exactly the same" as, or even remotely like, a human driver. What is going to happen is that, for a while, the control writers will blame humans for their failures, followed shortly by the public catching on and, in many places, an eventual ban on this unworkable idea.
Tightly restricted societies that put "safety" over "freedom" will just end up banning non-elite direct human control of KE>nJ. The repercussions of that mistake have ugly ends.