Let’s imagine a child steps out in front of an oncoming driverless car. The car calculates two possible decisions. It could A) swerve to avoid the child but hit a tree, leaving the occupant with a 70% chance of survival, or B) apply the brakes but continue straight, leaving the occupant with a 99% chance of survival and the child with only a 1% chance. Under these circumstances, most of us would choose what seems to be the logical but selfless option A, where the child is not harmed and the driver is left with a 70% survival rate. But unlike us, this driverless car has no emotion, acting only on what it has been programmed to do. The car takes option B, continuing straight, giving the driver a 99% chance of survival but killing the child instantly. Presently, driverless cars are being programmed to make decisions like this, whereby they protect the occupant of the car at all costs. Do you think self-driving cars are a good idea? Because I don’t want that child to be my child in the future.
They have multiple safety issues that developers have not explored thoroughly enough for them to be deemed safe. We all know the technology is bound to fail at some point, but I don’t think people have thought about how easily this can occur, or about the horrendous effects it could have compared to an accident with a normal car. An example that could occur every day is roadworks, where the road is not the same as it appears on the car’s GPS mapping system. This is an issue because the car does not register that the road is closed. These cars could end up driving into construction zones, causing damage to the car, workers, machinery, and the occupant. This is not acceptable, and it would not happen if a human were driving the car. The GPS mapping in these cars cannot account for roadworks like this unless someone is constantly updating the system and feeding it information, which is not the case. This therefore makes driverless cars a safety issue of the highest order.