1. TechSpot is dedicated to computer enthusiasts and power users. Ask a question and give support. Join the community here.

Accidents have eroded public trust in self-driving vehicle technology

By Shawn Knight · 37 replies
May 22, 2018
  1. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    How many other instances will need to be accounted for that are, as yet, unknown?
  2. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    For me, unless an argument is supported by sound science, it remains an argument.
  3. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    Unless all self-driving vehicles can avoid accidents reliably, this is marketing designed to do only one thing - sell self-driving vehicles.
  4. m4a4

    m4a4 TS Evangelist Posts: 1,488   +1,074

    Well, look into the stats. I'm sure you'll be surprised how many hours of autopilot have actually been logged with no incident. I'm not saying they're almost perfect (far from it), all I'm saying is that a lot of people are being ignorant over them (hearing 1 or 2 bad things and basing their entire opinion on that).

    As for the block, it wasn't technically autonomous self-driving tech. It was a glorified driver assist. It didn't know what the obstacle was and let the driver know that they (the driver) needed to react. Heck, the driver was supposed to be paying attention anyway, per the warnings that come with using the system. It was 100% the driver's fault for (funny enough) being ignorant (and if the majority of people actually looked into situations like this without being ignorant, they'd know that too).

    As for where the tech stands, and what stage it's in, I trust that it will keep on improving (and that it's already safer in the basic ways). I see an id!ot driver pretty much every time I go out driving, so how could this be worse? :p
  5. axiomatic13

    axiomatic13 TS Maniac Posts: 228   +161

    This is nothing more than overzealous management moving the research too quickly. It's all they know how to do.
    wiyosaya likes this.
  6. hood6558

    hood6558 TS Evangelist Posts: 353   +110

    Fake results are better than no results.
  7. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    Perhaps I did not make myself clear. THE CAR SHOULD HAVE STOPPED. It has sensors that can recognize obstacles. It is not like the kangaroo that jumped out of the way.

    Also, if it is not "autopilot," then Tesla/Musk should not market it as such, even though laws allow them to with reams of fine print. Musk is a spoiled brat who has no honor.

    Drawing a parallel between idjit drivers and self-driving AI is a red herring/irrelevant conclusion - https://en.wikipedia.org/wiki/Irrelevant_conclusion#Red_herring
    Last edited: May 24, 2018
  8. m4a4

    m4a4 TS Evangelist Posts: 1,488   +1,074

    Perhaps I did not make myself clear. THE PERSON SHOULD HAVE STOPPED. There, I can "yell" too. It was his responsibility to pay attention and he (regrettably) dropped the ball for the several seconds that he was warned for.

    And I agree with your marketing comment; it shouldn't be called Autopilot. But you won't catch me being ignorant enough to assume its function based only on the name.

    Lol, all I'm saying is that I'd like to replace a lot of the idjit drivers (especially the drunk ones) with decent autopilot tech.
  9. Abraka

    Abraka TS Addict Posts: 176   +54

    Artificial Intelligence is no match for Natural Stupidity.
  10. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    It's plain that we are not going to agree on this.

    I'll pose another scenario.

    Hypothetically speaking: suppose the driver had a heart attack or some other medical event that left them unable to control the vehicle. This does happen, and such conditions can go undetected, so there may be no advance warning that this kind of thing might happen to a driver. In other words, the person is legally able to drive the car but unexpectedly loses the ability to control it due to the medical condition, so there is no "driver's fault" here.

    Are you saying that in such a scenario, the car should just continue to drive at high speed into a solid object that it can easily detect with its advanced sensors, which include radar and lidar and the ability to determine "hey, that thing is not moving"? As I see it, it could simply pull off to the side of the road and stop.

    As I see it, self-driving cars need to have that ability and should have the ability to determine if a driver is paying attention and then simply refuse to go any further if they are not. We are talking about people's lives here, and for me, this becomes an ethical matter. Would you prefer your self-driving car drives you into the wall in this type of scenario?

    In fact, there are reports that Tesla refutes (and unless they want their a$$ sued off, they have to refute them) that Tesla engineers wanted to include such safety features and they were rejected because of cost. https://www.techspot.com/community/...fety-features-over-costs.246608/#post-1684441

    So if you don't like seeing people driving who you believe are drunk, why not call 911 when you see them? Program 911 into your hands-free speed dial and let the police take care of them. You might be doing the drunk driver, and other people for that matter, a favor.
    Last edited: May 24, 2018
  11. m4a4

    m4a4 TS Evangelist Posts: 1,488   +1,074

    Yes (to be blunt).
    In its current state, it is a driver assist. It did not know what the object was, and sudden deceleration could also pose a problem (hence why it warned the driver to take charge). I understand what you're trying to get at, but it is not the system's fault in this context. Whether or not this specific system was active, your hypothetical situation would still end the same way.

    We're not talking about a competent autonomous driving AI (because it doesn't exist in 2018). It's still in its infancy and still needs a human to monitor it. And frankly, you don't seem to understand that (signalling an end to this conversation).

    Otherwise, you missed the context with my idjit drivers comment again. It's a wish. Nothing more.
  12. wiyosaya

    wiyosaya TS Evangelist Posts: 4,218   +2,484

    The car supposedly told the driver several times to put his hands on the wheel for some time before the accident. If there is time to warn the driver several times of a potential problem, there was likely time for the car to decelerate safely. Depending on who you listen to, it was the choice of Tesla not to incorporate the feature.

    It is not that much of a stretch, as I see it from my perspective as a programmer with over 20 years' experience, to incorporate a program branch that says: "car needs driver control, car warns driver, driver is not responding, car decelerates and pulls off to the road shoulder or as far off the road as safely possible, car refuses to go any further if the driver does not respond."

    Personally, I really do not care what features the car had, this type of feature should be incorporated by default as a matter of ethics.
    m4a4 likes this.
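    [Editor's note: the escalation logic described in the comment above can be sketched as a small state machine. This is purely an illustrative Python sketch; the state names, warning threshold, and function are invented for the example and do not reflect Tesla's actual software.]

    ```python
    from enum import Enum, auto

    class AvState(Enum):
        MONITORING = auto()  # driver engaged, system assisting normally
        WARNING = auto()     # driver unresponsive, system issuing warnings
        SAFE_STOP = auto()   # decelerating and pulling off to the shoulder
        HALTED = auto()      # stopped; refuses to continue without driver

    # Hypothetical threshold; a real system would tune this carefully.
    WARNINGS_BEFORE_STOP = 3

    def next_state(state: AvState, driver_responding: bool,
                   warnings_issued: int) -> AvState:
        """One step of the escalation branch described above: warn the
        driver, and if they keep not responding, decelerate to the
        shoulder and refuse to go any further."""
        if state is AvState.MONITORING:
            return AvState.MONITORING if driver_responding else AvState.WARNING
        if state is AvState.WARNING:
            if driver_responding:
                return AvState.MONITORING
            if warnings_issued >= WARNINGS_BEFORE_STOP:
                return AvState.SAFE_STOP
            return AvState.WARNING
        if state is AvState.SAFE_STOP:
            # Vehicle completes its controlled deceleration to the shoulder.
            return AvState.HALTED
        # HALTED: stay put until the driver responds again.
        return AvState.MONITORING if driver_responding else AvState.HALTED
    ```

    The point of the sketch is that the branch is simple: the hard engineering is in perception and in executing a safe pull-over, not in the decision logic itself.
    
    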
  13. Impudicus

    Impudicus TS Addict Posts: 159   +119

    Good thing cars driven by humans don't have accidents. Let's stick with what works.
