I could have quoted much more from this article, but this will have to do, and it is not even the most worrying part. Tesla's own engineers report that Full Self-Driving is unsafe, that Musk exaggerates what the system can do in order to fire up his investors, and that removing the ultrasonic sensors and radar, and rejecting LIDAR, makes the cars dangerous.
Tesla should be required to disable all "autopilot" technology until Musk stops stripping the platform of the hardware it relies on. Radar has been switched off on cars where it was previously installed, because Musk believes cameras are enough. Radar and ultrasonic sensors are no longer installed at all, to save money.
People sit and tag objects in images so that the cameras can "see" what the cars should avoid.
“One of the key advantages of lidar is that it will never fail to see a train or truck, even if it doesn’t know what it is,” said Brad Templeton, a longtime self-driving car developer and consultant who worked on Google’s self-driving car. “It knows there is an object in front and the vehicle can stop without knowing more than that.”
Cameras need to understand what they see to be effective, relying on Tesla workers who label images the vehicles record, including things like stop signs and trains, to help the software understand how to react.
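The labeling workflow described above can be sketched as a minimal annotation record. Everything here is illustrative: the class name, fields, and categories are assumptions for the sake of the example, not Tesla's actual labeling schema.

```python
from dataclasses import dataclass

# Hypothetical annotation record -- illustrative only, not Tesla's real format.
@dataclass
class BoxLabel:
    category: str   # e.g. "stop_sign", "train"
    x: float        # top-left corner, normalized to [0, 1]
    y: float
    width: float
    height: float

def label_frame(frame_id: str, labels: list[BoxLabel]) -> dict:
    """Bundle human-drawn boxes for one camera frame into a training example."""
    return {"frame": frame_id, "labels": [vars(b) for b in labels]}

example = label_frame("cam_front_000123", [
    BoxLabel("stop_sign", 0.42, 0.18, 0.05, 0.09),
])
print(example["labels"][0]["category"])  # stop_sign
```

The point of the sketch is only that a camera-based system learns categories from records like these; anything a human never labeled, the network has no name for, which is the gap the lidar quote above is pointing at.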
Toward the end of 2020, Autopilot employees turned on their computers to find in-house workplace monitoring software installed, former employees said. It monitored keystrokes and mouse clicks, and kept track of their image labeling. If the mouse did not move for a period of time, a timer started — and employees could be reprimanded, up to being fired, for periods of inactivity, the former employees said.
After a group pushing to unionize Tesla's Buffalo factory raised concerns about its workplace monitoring last month, Tesla responded in a blog post. "The reason there is time monitoring for image labeling is to improve the ease of use of our labeling software," it said, adding "its purpose is to calculate how long it takes to label an image."
Musk had championed the "vision-only" approach as simpler, cheaper and more intuitive. "The road system is designed for cameras (eyes) & neural nets (brains)," he tweeted in February 2022.
Some of the people who spoke with The Post said that approach has introduced risks. “I just knew that putting that software out in the streets would not be safe,” said a former Tesla Autopilot engineer who spoke on the condition of anonymity for fear of retaliation. “You can’t predict what the car’s going to do.”