Provide data analysis and business recommendations for Tesla


Problem: Interview Response

As with any innovation, self-driving cars bring many technical challenges, but they raise moral ones as well. Chief among them: there are no clear parameters for how safe is safe enough to put a self-driving car on the road. At the federal level in the United States, the guidelines in place are voluntary, and laws vary from state to state. And even if parameters are defined, there is no set standard for measuring whether they have been met.

Human-controlled driving today is already a remarkably safe activity - in the United States, there is approximately one death for every 100 million miles driven. Self-driving cars would, presumably, need to do better than that, which is what the companies behind them say they will do. But how much better isn't an easy answer. Do they need to be 10 percent safer? 100 percent safer? And is it acceptable to wait for autonomous vehicles to meet super-high safety standards if it means more people die in the meantime?

Testing safety is another challenge. Gathering enough data to prove self-driving cars are safe would require hundreds of millions, even billions, of miles to be driven. It's a potentially enormously expensive endeavor, which is why researchers are trying to figure out other ways to validate driverless-car safety, such as computer simulations and test tracks.
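The "hundreds of millions, even billions, of miles" figure can be sanity-checked with a simple statistical sketch. Assuming the passage's benchmark of roughly one fatality per 100 million human-driven miles, the exact binomial "rule of three" tells us how many fatality-free miles an autonomous fleet would need to log before we could claim, at a given confidence level, that its fatality rate is below that benchmark. The function name and constants below are illustrative, not from any standard library:

```python
import math

# Benchmark from the passage: roughly 1 fatality per 100 million
# miles driven by humans in the United States (an assumption here,
# used only to illustrate the scale of the testing problem).
HUMAN_FATALITY_RATE = 1 / 100_000_000  # fatalities per mile

def miles_to_demonstrate(target_rate: float, confidence: float = 0.95) -> float:
    """Fatality-free miles needed to claim, with the given confidence,
    that the true per-mile fatality rate is below target_rate.

    If zero fatalities occur over n miles, the largest rate still
    consistent with the data satisfies (1 - rate)^n >= 1 - confidence,
    so n = ln(1 - confidence) / ln(1 - target_rate).
    """
    return math.log(1 - confidence) / math.log(1 - target_rate)

# Miles of fatality-free driving needed just to match human drivers:
print(f"{miles_to_demonstrate(HUMAN_FATALITY_RATE):,.0f} miles")
# roughly 300 million miles

# To show the cars are, say, 10x safer, the bar rises about tenfold:
print(f"{miles_to_demonstrate(HUMAN_FATALITY_RATE / 10):,.0f} miles")
# roughly 3 billion miles
```

The numbers that fall out (on the order of 300 million miles to match humans, billions to demonstrate a large improvement) line up with the passage's claim, which is why simulation and test-track validation are attractive alternatives to on-road mileage alone.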

(Code #1) Moral implications of self-driving cars
(Code #2) Lack of safety standards
(Code #3) The difficulty of testing safety
(Code #4) Cybersecurity concerns
