Staying true to the company vision of autonomous driving, Tesla vehicles can now read speed limit signs and detect green or red lights. When the camera detects a speed limit sign, it will be displayed on the driver dashboard in the car and used to set a speed limit warning, according to the software release notes.
But even though Tesla calls that version of Autopilot “full self-driving,” and despite CEO Elon Musk’s promises of fully autonomous Teslas by this year, steering-wheel-free cars are now tipped to appear in 2021 at the earliest. So far, the technology isn’t there: Teslas still need humans behind the wheel. The “full self-driving” package, which starts at around $7,000, gives customers access to incremental improvements such as the speed limit and stop sign controls.
Despite these advancements, Tesla has always maintained that its constantly improving Autopilot software is not designed to be a substitute for a human driver.
In the arena of driverless cars, the future was supposed to be now. In 2020, you’ll be a “permanent backseat driver,” the Guardian confidently predicted in 2015. “10 million self-driving cars will be on the road by 2020,” blared a Business Insider headline from four years ago.
Those declarations were accompanied by announcements from the largest automotive companies that they would have self-driving cars on the road by 2020. Elon Musk himself forecast that Tesla would do it by 2018. That year has come and gone, and the self-driving cars haven’t arrived.
Despite extraordinary efforts from leading names in tech and in automaking, fully autonomous cars remain out of reach except in extremely limited trial programs.
With trials of driverless cars ongoing in different parts of the world, you can buy a car, such as a Tesla, that will automatically brake or slow down when it anticipates a collision, or one that helps keep you in your lane. But will it take over your role as the driver? For that, we will have to wait.
In the world of autonomous cars, there is no widely accepted and agreed basis for ensuring that the machine learning algorithms used in the cars are safe. There is no agreement across the industry, or across standardisation bodies, on how this machine learning should be trained, tested or validated.
What there are instead, for self-driving cars, are emerging regulations for particular functions, such as automated lane-keeping systems. That said, international standards for autonomous systems, including autonomous vehicles, do exist; these set relevant requirements for future reference but do not solve the machine learning problem. Here’s hoping that over time they will, because without recognised regulations and standards, no self-driving car, whether declared safe or not, will make it onto the open road.