
Driverless cars - a moral conundrum


MK Shrimper (Striker · joined Aug 6, 2005 · 52,643 messages)
Here's an interesting one: It's the year 2026 and Person A is in a Driverless Skoda. Lovely and safe.

However, there's an accident up ahead, the car is doing 30mph and the computer has two options...

a) Save the driver but run into, and probably kill, 10 pedestrians.
b) Swerve into a wall, killing the driver but saving the pedestrians.

How should the car of the future be programmed?

Now replace Person A with you and your family........
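For what it's worth, the dilemma above boils down to a crude utilitarian calculation. Here's a toy sketch of that framing (not a real control system - the option names and numbers are made up for illustration): the computer scores each option by expected deaths and picks the minimum.

```python
# Toy sketch of the utilitarian framing of the dilemma above.
# All option names and casualty numbers are invented for illustration.

def choose_action(options):
    """Pick the option with the lowest expected number of deaths."""
    return min(options, key=lambda o: o["expected_deaths"])

options = [
    {"name": "continue", "expected_deaths": 10},  # plough into the pedestrians
    {"name": "swerve",   "expected_deaths": 1},   # hit the wall, kill the driver
]

print(choose_action(options)["name"])  # a pure utilitarian picks "swerve"
```

Of course, the whole argument of the thread is whether anyone would buy a car programmed this way.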
 

Has to be answer B... the Skoda driver is the lowest priority in life.
 
Before you get too upset: a driverless car removes human error, so it would not be speeding or tailgating. It will react to the accident ahead because it won't be reading the latest Shrimperzone thread on a phone. So the car will glide to a safe halt in front of the accident.
 
c) The car performs an emergency stop and, because it is programmed to keep a safe distance from other traffic, stops well short of the pedestrians without killing the passenger.
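The rough numbers back this up. A quick back-of-the-envelope stopping distance from 30 mph (the deceleration and system-latency figures below are illustrative assumptions, not from any real vehicle):

```python
# Rough physics behind the "it would just stop" argument: distance covered
# during system reaction latency plus braking distance v^2 / (2a).
# Deceleration (6.5 m/s^2) and latency (0.2 s) are assumed values.

MPH_TO_MS = 0.44704  # exact miles-per-hour to metres-per-second conversion

def stopping_distance(speed_mph, decel=6.5, latency=0.2):
    """Total stopping distance in metres: latency travel plus braking."""
    v = speed_mph * MPH_TO_MS
    return v * latency + v ** 2 / (2 * decel)

print(f"{stopping_distance(30):.1f} m")  # roughly 16-17 m from 30 mph
```

So as long as the car keeps a gap bigger than that, the trolley problem never arises.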
 
That article doesn't seem to be about programmers, just asking the general public what they should do.

In the situation above the car would, unlike a human driver, have been travelling at a distance and speed it could stop from, so it would just stop.

It's an interesting discussion though, as it's possible there are situations where the car won't be doing what it should.

For example, what if you are driving along and another vehicle hits your car? Suddenly it's in a situation where it wasn't in control of its speed and direction. What should it do if the human-driven car knocked you into the path of a group of nuns and the only alternative route was a lamppost?
 
As a software developer with some experience with AI I propose a Ted Kennedy algorithm where the computer drives off a bridge instead.
 
D) Someone hacks into your car's computer and does whatever the **** they want.

Maybe retinal recognition or fingerprint technology would prevent that, or a simple PIN.

It'd be great to have a camper van - go to bed in Southend, wake up in Cornwall!
 
its a very good point

Those saying it will NEVER happen live in a different world to me . Maybe in your world computers never fail, your PC never slows down, and planes never crash because of system issues/ defects /lack of understanding
 
It'd be great to have a camper van - go to bed in Southend, wake up in Cornwall!

Rob Noxious would like that very much after a home game visit, I think!
 
Saw this earlier.

Couldn't recognise a white truck against a bright sky.

Sounds fair enough.

Should work a treat at night time.
 

If a crash into a wall at 30mph is going to kill you, we need to be looking at making the car's crumple zones and safety features more effective before worrying about who, or what, is driving it.

As for the driverless cars, I don't get it. Why don't we get computers to cook food for us, select birthday presents for the family, read to the kids, etc?

Live life folks!
 
First death reported due to driverless technology.

http://www.bbc.co.uk/news/technology-36680043

There's a lot of confusion about this: Tesla's Autopilot is not a self-driving car. It's essentially a more capable version of cruise control that can spot most dangers and knows when it's safe to change lane, but it is not capable of spotting everything. When using it you are expected to keep both hands on the wheel, with full power to steer manually or hit the brakes if you see a danger you think the car hasn't. It's a second pair of eyes, not a replacement for your own (although it sounds like some Tesla drivers ignore this advice and stop paying attention, which is a very dangerous risk to take). True driverless technology is still being tested and is not available to the public because it simply isn't ready yet.
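The key design point is precedence: in a driver-assist system the human's input always overrides the computer's suggestion. A minimal sketch of that idea (function and value names are invented for illustration - this is not Tesla's actual software):

```python
# Minimal sketch of driver-assist override precedence: when the driver
# provides any input, it wins; otherwise the assist system's suggestion
# is used. Names here are invented for illustration only.

def steering_command(driver_input, autopilot_suggestion):
    """Driver input, when present, overrides the assist system."""
    return driver_input if driver_input is not None else autopilot_suggestion

print(steering_command(None, "hold lane"))     # no driver input: assist holds lane
print(steering_command("brake", "hold lane"))  # driver brakes: driver wins
```

Which is exactly why the system fails when the driver stops paying attention: there is no input to override with.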

The question is should partial driverless technology be allowed if drivers are going to get complacent when using it?
 