

  • Matthew Spencer - Tech Journalist

Tesla FSD Beta facing questions: Does it drive like a human or a robot?

The EV maker Tesla has long faced questions about whether its cars are safe enough to deploy on public roads, around pedestrians, and on the highway. We are well past that stage: EV demand has skyrocketed since the question was first raised, and Tesla's self-driving cars are now a common sight. But with the latest update, drivers are asking whether the system makes decisions like a human or like a robot.



The Tesla Model Y was programmed to automatically roll past stop signs at low speed. Humans make this kind of decision all the time: they can either follow the traffic law and come to a complete stop, or perform a rolling stop. Rolling stops are part of everyday driving in some areas, but strictly prohibited in others.


It is safe to assume regional traffic laws are encoded into the AI of self-driving cars, but to what extent? A report by NBC News highlighted one such incident, in which Tesla's experimental driver-assistance feature rolled past a stop sign at 5.6 miles per hour without stopping.

Tesla has recalled nearly 54,000 cars and SUVs because their Full Self-Driving software rolled through stop signs. In some places, such as Texas, rolling past a stop sign is illegal no matter how slow the roll is. When should an automated mobile robot weighing several tonnes be making this decision?


CEO Elon Musk has always prioritised automated driving in carbon-neutral vehicles. In a recent tweet, he said, “Tesla AI might play a role in AGI. It trains against the outside world, especially with the advent of Optimus.” Musk tends to respond sarcastically to troublesome comments about him and his company, but of course not everything is perfect, and he concedes as much.


Think of an event such as Halloween, when many kids are crossing the street to collect candy and have fun. On a night like that, children in costumes do not look the way they do for the rest of the year. Does the AI account for Halloween? How does it handle all those small kids running around? It is a valid problem that manufacturers and programmers need to solve, because there is no self-learning happening there.



Tesla’s experimental features have drawn controversy for rolling through stop signs. How does the algorithm work in this case? Is it some sort of IF-ELSE or IF-THEN rule: IF it is nighttime, and there is no traffic, and there is a stop sign, THEN roll through at low speed? If that kind of script exists, it should be removed immediately. In some situations, with several cars queued behind, rolling through makes the most sense to a casual human driver. But an automated rolling stop may also trigger citations captured by CCTV or roadside cameras. Robots should not be given the power to roll through a stop sign.
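For illustration only, here is what such a naive hand-written rule could look like in Python. Nothing here reflects Tesla's actual software; the function, its inputs, and the speed threshold are placeholders invented to show the kind of IF-THEN logic the question above describes.

```python
# Purely illustrative sketch of a rule-based decision like the one speculated
# about above. These names and values are hypothetical, not Tesla's code.

def stop_sign_behaviour(is_nighttime: bool,
                        traffic_detected: bool,
                        stop_sign_ahead: bool) -> str:
    """Return the action a naive rule-based planner might take."""
    if stop_sign_ahead and is_nighttime and not traffic_detected:
        # The controversial branch: creep through at low speed
        # instead of coming to a complete stop.
        return "roll through at low speed"
    if stop_sign_ahead:
        return "come to a complete stop"
    return "continue at normal speed"


print(stop_sign_behaviour(is_nighttime=True,
                          traffic_detected=False,
                          stop_sign_ahead=True))
```

The trouble with a rule like this is exactly what the recall exposed: it bakes a human habit into software that then applies it everywhere, including places where any rolling stop is illegal.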


Last year, “assertive mode” was released as a premium version of the self-driving assistance feature. It featured rolling stops, a smaller following distance, and a propensity to “not exit passing lanes.” The rolling-stop behaviour was removed in a recent update, but the average pedestrian never opted into the self-driving beta, which keeps questions about road safety alive. Even though Tesla revolutionised autonomous electric vehicles, it still has a long way to go before perfection, and with competing manufacturers trying to pull off the same thing, that day may come sooner than expected.
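Again purely as an illustration, a driving profile such as “assertive mode” could be imagined as a small configuration object. The field names and numbers below are assumptions made for this sketch, not Tesla's real settings.

```python
# Hypothetical sketch of how driving "profiles" might be parameterised.
# All fields and values are illustrative assumptions, not Tesla's configuration.

from dataclasses import dataclass


@dataclass
class DrivingProfile:
    name: str
    following_distance_s: float   # seconds of headway to the car ahead
    allow_rolling_stop: bool      # creep through empty stop signs?
    exit_passing_lane: bool       # move back over after overtaking?


ASSERTIVE = DrivingProfile(
    name="assertive",
    following_distance_s=1.0,
    allow_rolling_stop=True,      # the behaviour since removed by an update
    exit_passing_lane=False,
)

CHILL = DrivingProfile(
    name="chill",
    following_distance_s=3.0,
    allow_rolling_stop=False,
    exit_passing_lane=True,
)
```

Framed this way, the safety question becomes which of these switches a manufacturer should ever be allowed to set to the aggressive value in the first place.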


On Thursday, Elon Musk said Tesla is opening an assembly plant in Austin, Texas, where the FSD Beta self-driving program and other features will be rolled out. Phil Koopman, an engineering professor at Carnegie Mellon University and an expert in advanced driver-assistance systems and autonomous-vehicle technology, questioned Musk about such features: “You said they would be perfect drivers. Why are you teaching them bad human habits?”
