I recently wrote an article about self-driving cars (http://novaknows.com/trusting-in-an-ai-that-could-kill-you/). I personally like the idea, but I have major issues with the way the technology is being rolled out, and I believe my article ended up quite biased as a result. To make it easier to look into the claims I made about self-driving cars, I decided to create an annotated bibliography containing the sources I used and the way I interpreted the information.
Dizikes, Peter, and MIT News Office. “How Should Autonomous Vehicles Be Programmed?” MIT News, 24 Oct. 2018, news.mit.edu/2018/how-autonomous-vehicles-programmed-1024.
MIT created versions of the “Trolley Problem” that simulated situations a self-driving car may encounter in which it must choose between two actions that will both result in casualties. They surveyed around 2 million people from around 200 countries to gauge preferences on which action the car should take. I personally believe that ethical algorithms are less important than actually perfecting the software. Everyone would make a different decision when put in the actual situation an ethics problem describes, which is part of our humanity. If those decisions were programmed into a machine, I feel it could cause major problems.
Denton, Jack. “Is the Trolley Problem Derailing the Ethics of Self-Driving Cars?” Pacific Standard, 29 Nov. 2018, psmag.com/economics/is-the-trolley-problem-derailing-the-ethics-of-self-driving-cars.
Pamela Robinson, a philosopher at the University of Massachusetts, is working on programming ethics into the software used for autonomous vehicles. The same concerns from the previous article apply. I personally believe a philosopher should not be involved with programming ethics at all, since they cannot predict how the code will actually behave. It could also have devastating effects if the device or database were hacked or the code were corrupted.
Marshall, Aarian, and Alex Davies. “Uber’s Self-Driving Car Saw the Woman It Killed, Report Says.” Wired, Conde Nast, 24 May 2018, www.wired.com/story/uber-self-driving-crash-arizona-ntsb-report/.
During a crash involving an Uber self-driving car, the software was aware of the imminent collision right before it happened but was unable to brake. The car detected the pedestrian around six seconds before the crash, identifying her first as an unknown object, then as a vehicle, and then as a bicycle. Each time it changed classifications, it adjusted its predictions for her path of travel. About a second before impact, the car determined it needed to brake, but it didn’t because of how the self-driving system was programmed. Uber was not held criminally liable for the crash, because control was handed over to the human driver in the final moment before impact.
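The sequence described in the report can be sketched in Python. This is a hypothetical illustration, not Uber’s actual software: the function names, the timestamps in `detections`, and the `emergency_braking_enabled` flag are all assumptions made to show how reclassifying an object can discard its tracking history, and how a brake decision can be suppressed by configuration.

```python
def predict_path(classification, history):
    """A new classification restarts the motion model, so history
    gathered under the previous label is effectively discarded."""
    return {"class": classification, "frames_used": len(history)}

def should_brake(seconds_to_impact, emergency_braking_enabled):
    # Roughly a second before impact the system decided braking was
    # needed, but the command only fires if emergency braking is enabled.
    needs_braking = seconds_to_impact <= 1.3
    return needs_braking and emergency_braking_enabled

# Hypothetical detection timeline: (seconds before impact, label)
detections = [(6.0, "unknown object"), (4.0, "vehicle"), (2.0, "bicycle")]

history = []
for t, label in detections:
    if history and history[-1][1] != label:
        history = []          # reclassification: tracking starts over
    history.append((t, label))
    print(predict_path(label, history))

# Braking was deemed necessary but suppressed under computer control.
print(should_brake(1.2, emergency_braking_enabled=False))  # False
```

The sketch shows why each reclassification was costly: the path prediction only ever sees one frame of history, and the final safeguard is gated behind a flag that was off.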
Wamsley, Laurel. “Uber Not Criminally Liable In Death Of Woman Hit By Self-Driving Car, Prosecutor Says.” NPR, NPR, 6 Mar. 2019, www.npr.org/2019/03/06/700801945/uber-not-criminally-liable-in-death-of-woman-hit-by-self-driving-car-says-prosec.
Contains mostly the same information as above. The autonomous system was not designed to alert the driver, so almost no warning was given before the crash. The driver had been streaming the show “The Voice” and not paying attention to the road. Because of the terms “self-driving car” and “autonomous vehicle,” a lot of people have been putting too much trust in the car itself. The software is not at a level where it can operate without guidance, but due to advertising, people don’t always seem to know that.
Madrigal, Alexis C. “7 Arguments Against the Autonomous-Vehicle Utopia.” The Atlantic, Atlantic Media Company, 20 Dec. 2018, www.theatlantic.com/technology/archive/2018/12/7-arguments-against-the-autonomous-vehicle-utopia/578638/.
This article presents arguments made by those opposed to self-driving cars: that they won’t work until cars are as smart as humans, that they can be hacked, that they won’t work as a transportation service, that they cannot be proved safe, that they will eventually exist but take far longer than promised, that the result will mostly be computer-assisted cars rather than self-driving ones, and that self-driving cars may make emissions and traffic worse. The major issue here is that all of these problems are valid and have evidence to back them up. Human drivers exchange signals while driving that cars may not be able to interpret. Current vehicles, electronic or not, can already be seized by hackers, and the range of control will only increase once self-driving cars are fully implemented. They may not work as a transportation service because of their cost, or because they can’t go faster than a normal car and otherwise work basically the same way. The people who believe they will work, but not anytime soon, could be right, since the components self-driving cars require cannot yet be mass-produced at low cost. Some argue the outcome will be computer-assisted driving rather than actual autonomy. I personally believe the bigger issue is companies overstepping and skipping the computer-assisted stage. Uber, for example, never did computer assistance, and as a result the link between the system and the driver doesn’t seem to work very well, as shown by its failure to alert the driver of an imminent crash.
“Seven Problems Self-Driving Cars Need to Overcome.” Smith’s Lawyers, www.smithslawyers.com.au/post/self-driving-car-problems.
This article discusses issues with current self-driving cars. A major one is finding a way to detect or replace the normal human interaction that occurs while driving. Until self-driving cars can simulate this, they may create danger in situations that would normally be resolved by communication among drivers. Another issue is that, without a human driver’s experience, crashes that would normally be avoided thanks to previous encounters may not be avoided by an autonomous vehicle. Snow can also seriously impair an autonomous vehicle’s ability to detect and follow a road by covering the lane lines. An issue mainly seen in Australia is that certain animals move in ways that are harder to predict; until animals like kangaroos have been worked into the system, they could cause major problems in certain locations. The current laws for self-driving cars are also in a strange position, which has caused, and will continue to cause, issues regarding who is responsible for crashes. The cost of autonomous vehicles is currently far too high for most people, and because they are being developed around customers well off enough to buy one in the first place, less data is being collected. With all these issues and others, there is an overall lack of trust in self-driving cars, and that lack of trust is the main obstacle they will need to overcome in order to succeed.
Stewart, Jack. “For a Much-Needed Win, Self-Driving Cars Should Aim Lower.” Wired, Conde Nast, 21 Apr. 2018, www.wired.com/story/uber-self-driving-crash-strategy/.
This article argues that even though crash rates are statistically higher for normal cars than for self-driving ones, it’s not a statistic most people can relate to. The media also amplifies the deaths caused by autonomous vehicles, making it harder for people to simply interpret the numbers. To actually promote their vehicles, companies should deliver more tangible benefits and help more people rather than pushing a statistic that is hard to see as a clear advantage. I also find it odd that companies push a statistic about car crashes when the number of autonomous vehicles versus normal vehicles is massively disproportionate.
“Bill Regulating Self-Driving Cars Awaits Governor’s Signature.” News On 6, www.newson6.com/story/40382523/bill-regulating-selfdriving-cars-awaits-governors-signature.
Oklahoma’s central location and extensive road network make it a great place for testing autonomous vehicles. However, the laws to regulate these vehicles are only now being established.
Laing, Keith. “Carmakers Face Pressure to Set Self-Driving Car Rules.” Detroit News, The Detroit News, 29 Apr. 2019, www.detroitnews.com/story/business/autos/mobility/2019/04/29/carmakers-face-pressure-set-self-driving-car-rules/3554772002/.
Autonomous vehicle production is speeding ahead, but Congress isn’t keeping up. Rather than setting clear rules for testing vehicles and for the eventual sale of fully autonomous ones, it allows each carmaker a certain number of exemptions per year. The U.S. Department of Transportation has released three sets of voluntary guidelines for autonomous vehicle producers, and because they are voluntary, companies can ignore them if they wish. Many officials believe self-regulation among car companies matters more than government-set rules. Even the regulations that are planned don’t have a set timeline, since the government cannot predict where autonomous vehicle development will go in the next few years.
From the information I could find, I believe that putting autonomous vehicles on our roads in their current state could be a danger, and that development has been pushed too far ahead. Though some statistics say you’re less likely to crash in an autonomous vehicle, the data samples are wildly disproportionate, and the fact that the crash rate remains notable even with a much smaller sample is a bad sign. Regulation is moving far too slowly, and the government isn’t holding companies responsible for most incidents, even going as far as allowing them a certain number of exemptions from current laws. I am open to discussion on the topic, but in their current state, I believe autonomous vehicles are unsafe and unnecessary.