Channel: Business – FlashNews18 – Breaking news and insights

Two young drivers: one is a teenager, the other is a Tesla.



At the corner of Iris Avenue and Folsom in North Boulder, my Model 3 Tesla was self-driving when it noticed two human drivers in the lane directly in front of us. The three of us were turning left — the two human-only drivers and my robot — when the two humans violated a basic traffic rule and veered into the right-hand lane.

The Tesla took the inside lane, as the driver's manual states is the proper rule of the road. I wish my teenage son had been watching this.

Milo is 15, with a learner's permit. I imagine that by the time he gets his license, he will have developed the intuition, memory, and mundane habits that have proven elusive in the short lifespan of self-driving cars. On the other hand, my son is less likely than the Tesla's software to suddenly disengage and stop steering altogether, a failure that forces me to take over the controls.

The brains of both the machine and the teen are still in development: the human brain shaped by millions of years of evolutionary biology, the algorithms shaped by decades of engineering. Viewed through the lens of cognition and neuroscience, this contrast says a lot about the next generation of drivers.

So far only the Tesla (not my son) has been involved in accidents. The federal government reported in April that Tesla's Autopilot technology was involved in 956 crashes between January 2018 and August 2023, including 29 deaths. The National Highway Traffic Safety Administration's report found that, “Autopilot's system controls may be inadequate for driver assistance systems that require continuous monitoring by a human driver.”

Recently, Elon Musk concluded talks with regulators in Beijing, clearing the way to bring Autopilot to Chinese roads. Plenty of other companies (General Motors, BMW, Mercedes, Lincoln, Kia, and others) are developing their own versions, most of which take some control in limited situations, such as on the highway.

In short, don't doubt that these cars will be on the roads soon, just as my son will get his license within a year. He has inherited a tough job; statistically speaking, driving is the most dangerous routine activity most of us do in our lives.

After years of reporting on driver safety, I can say with certainty why we face such risk: a mismatch between the capacity of the human brain and the complexity of the road. The road presents an onslaught of rapidly changing stimuli, inputs, and risks. Cars, pedestrians and cyclists come and go in our frame; our brains get tired, get distracted, miss inputs. We are humans, with biologically limited attention and cognition, piloting what amounts to a missile at highway speeds.

From this perspective, I watch my son learn in our Toyota Highlander. To stay focused, he prefers to keep the radio off, and listen to parental commentary at a low volume. His serious work is revealed by his firm grip on the wheel, his body leaning forward, as if to connect a little more with whatever is happening outside the windshield. When he settles into a rhythm, I ask him to identify the different inputs around him, the car in his blind spot, the cyclist turning right without signaling, the pedestrian looking down at the phone. He gets tired. Safe driving requires effort. Despite the control and teenage glory that comes with taking the wheel, sometimes, he just doesn't want to do it.

When it comes to monitoring our leased Tesla, there's one aspect of the technology that highlights what is, for me, a study in contrasts: On the screen where the map is displayed, an animation shows the surrounding inputs that the Tesla picks up with its many cameras and sensors. Cars appear around us, intersections appear as we approach, along with the presence of cyclists or pedestrians. It sees everything, everywhere, simultaneously, processing multiple streams of information in parallel. For example, when the Tesla “sees” something on the right, it does not do so at the expense of seeing something on the left; my son can only see one side at a time.

The algorithm works at night and in the rain. It also follows the rules to a fault. Its strict adherence to the speed limit has frustrated other drivers, and my teenage passengers. “It's sus,” one teenage passenger told me, meaning suspicious, “because it only goes 20 mph.” We were in a school zone.

“Even if autonomous vehicles aren’t perfect, they do an astonishingly good job and are getting better all the time,” David Strayer, a cognitive neuroscientist at the University of Utah and one of the world’s foremost experts on driver distraction, told me. “We really need to focus on relative risk,” he added, meaning that computers can save many more lives than they can risk, mainly because of their cognitive advantage.

“They're not distracted. They're not tired. They're not drunk or high. They only drive fast when the driver tells them to.”

However, they do glitch. In the months I've been trying this technology, there have been times when Tesla's software suddenly shut down, forcing me to immediately take control. I feel like the attendant of a high-speed Disney ride that inexplicably leaves the track and heads for a burger hut. It's jarring. Take control! Save us all!

To be fair, the system constantly warns me to keep my hands on the wheel and be ready to take over. Sometimes the self-driving disengages because I tug the wheel too hard, which it reads as my intent to take control. Other times, when it bugs out, who knows what happened inside that mysterious algorithm? Did a zero get crossed with a one?

In its latest report on Tesla's crash record, the federal government said the cars still rely heavily on human supervision and that, at the same time, humans aren't always up to the task. Crash consequences can be severe “when a driver is separated from a Tesla vehicle operating in Autopilot and the vehicle encounters a situation outside of Autopilot's object or event detection response capabilities.”

What a powerful statement: Cars and humans must pay attention when neither has a brain fully adapted to the task.

Still, for the time being, I'd sooner trust my son to drive me home than the Tesla; I don't know when the car will glitch, or why, or what I can do to prevent it.

Soon, humans will give up control. When that happens, I'm not sure it will be because robots have nearly unlimited knowledge and will keep us safer, even if that is true. The real reason robots will take over the wheel is that humans have better things to do than drive: stream shows, stretch their legs, take a nap. At that point, when machines rule the road, I might tell my teenager to go ahead and watch TikTok behind the wheel, though I suspect I'll be deep in my own feeds by then.


