Plane Lands Itself After In-Flight Emergency for the First Time
https://www.vice.com/en/article/plane-lands-itself-after-in-flight-emergency-for-the-first-time/
Tue, 30 Dec 2025

For everyone out there who thinks planes are so automated these days they’re just taking off and landing themselves, leaving the pilots with nothing to do in between, this story is for you. For the first time outside of a demo or test flight, an airplane successfully landed itself after an in-flight emergency.

Contrary to popular belief, this is a line in the sand the aviation industry had never crossed until now, when it was forced to in order to save lives. Not many lives, but lives nonetheless.

On December 20, a Beechcraft Super King Air 200 flying over Colorado experienced a sudden loss of cabin pressure. Garmin’s Emergency Autoland system took over, flew the aircraft, talked to air traffic control, and landed safely at Rocky Mountain Metropolitan Airport near Denver.

Operated by charter company Buffalo River Aviation, the flight had no passengers on board, just two pilots who willingly relinquished control while still maintaining control of the situation.

When the cabin altitude exceeded safe levels, the pilots put on oxygen masks. At that point, Autoland automatically engaged, as designed. Rather than disengaging it, the pilots decided to let it do its thing while keeping their hands close in case something went wrong. A real “Jesus, take the wheel” kind of moment.

Autoland isn’t the same thing as the autoland systems airlines use in foggy conditions. This technology is built specifically for emergencies where pilots might be incapacitated or overwhelmed. Once it’s activated, either automatically or via a very literal, very conspicuous big red button, the system takes full control. It chooses an appropriate airport based on distance and runway length, communicates with air traffic control using an automated voice, avoids terrain, and lands the plane without human input.
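
To give a concrete sense of the airport-selection step described above, here is a minimal, entirely hypothetical sketch of ranking nearby airports by distance and runway length. Garmin’s actual selection logic is more sophisticated and is not public; every name and threshold below is invented for illustration.

```python
# Hypothetical sketch of emergency-airport selection. This is NOT Garmin's
# actual algorithm; the criteria and the minimum-runway threshold are invented.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Airport:
    name: str
    lat: float               # degrees
    lon: float                # degrees
    runway_length_ft: float   # longest available runway

def distance_nm(lat1, lon1, lat2, lon2):
    """Great-circle distance in nautical miles (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * asin(sqrt(a)) * 3440.1  # Earth's radius in nautical miles

def pick_airport(aircraft_lat, aircraft_lon, candidates, min_runway_ft=3000):
    """Return the nearest candidate airport whose runway meets a minimum length."""
    usable = [a for a in candidates if a.runway_length_ft >= min_runway_ft]
    if not usable:
        return None
    return min(usable, key=lambda a: distance_nm(aircraft_lat, aircraft_lon, a.lat, a.lon))
```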

In this case, the system announced to controllers that it had taken over due to “pilot incapacitation,” which, as you can imagine, sparked a little bit of concern at first. Buffalo River Aviation later clarified that this was just how the system reports emergencies, not a literal interpretation of the conditions in the cockpit. The first responder video shows both pilots exiting the craft unharmed after safely landing.

Garmin says Autoland is currently installed on around 1,700 aircraft, mostly smaller private and charter planes. This was the first real-world proof that a fully autonomous emergency landing system can work exactly as intended, under pressure, without a safety net.

The FAA is investigating, but the outcome is promising. The aviation industry might finally have a true failsafe that, one hopes, will not panic in an emergency when there are no other options.

You’re Running on Autopilot Way More Often Than You Think
https://www.vice.com/en/article/youre-running-on-autopilot-way-more-often-than-you-think/
Tue, 23 Sep 2025

Think about what you did this morning. You woke up, brushed your teeth, made coffee, maybe scrolled your phone, maybe drove the same route to work that you do every day. How many of those things did you actually think about? According to new research, probably none. Scientists say nearly nine out of every ten daily actions happen on autopilot, with our brains running the show long before conscious thought gets involved.

The study, published in Psychology & Health, tracked 105 people for a week. Participants were pinged six times a day and had to report what they were doing, along with how deliberate or automatic it felt.

Across more than 3,700 reports, researchers found that 88 percent of behaviors were carried out automatically, while about two-thirds were triggered by habit rather than decision-making.

Lead researcher Amanda Rebar, an associate professor at the University of South Carolina, explained that this automation shows up in two ways.

“Habitual instigation occurs when environmental cues automatically trigger the decision to do something, like reaching for your phone when you hear a notification. Habitual execution happens when you perform an action smoothly without thinking about the mechanics, such as brushing your teeth or driving a familiar route,” she said in a statement.

Most people like to imagine themselves as rational actors, carefully weighing each choice they make. In practice, the study shows, life is closer to a string of well-worn loops. And those loops don’t vary much. Age, gender, and relationship status had no real effect on how habitual someone’s behavior looked.

One exception was exercise. People were more likely to start workouts based on cues, which could mean a reminder on their phone or a regular time of day, but still had to engage consciously once they got moving. Running, lifting, or cycling doesn’t complete itself, even if the decision to start feels automatic.

Habits, it turns out, often line up with what people want. Almost half of all reported behaviors were both intentional and automatic, while only a small fraction clashed with someone’s goals. That makes habits a surprisingly strong ally for anyone hoping to change.

Benjamin Gardner, a psychology professor at the University of Surrey and co-author of the study, said strategies for habit formation are more effective than willpower alone.

“For people who want to break their bad habits, simply telling them to ‘try harder’ isn’t enough,” he said. Building cues for healthier choices—or dismantling the ones tied to unhelpful patterns—might be the clearest path to change.

Most of what you do today will unfold without much thought. The trick, researchers suggest, is shaping those automatic moments so they nudge you in the direction you actually want to go.

Tesla Updating Autopilot In Nearly Every Car In U.S. After Investigation Into Crashes
https://www.vice.com/en/article/tesla-autopilot-recall-crash-investigation/
Wed, 13 Dec 2023
The update will ensure drivers using the company’s so-called “Autopilot,” which is not autonomous, remain in control of the vehicle.

For the second time this year, Tesla has issued what is termed a recall notice at the behest of federal regulators due to safety issues with its driver-assist software. This time, the recall affects nearly all of the two million vehicles Tesla has sold in the United States.

Specifically, the recall is regarding Autosteer, a component of Autopilot which is available as an add-on package with every Tesla. Despite the name, Autopilot does not drive the car by itself. It is only available for use on highways and other limited-access roads, and drivers are supposed to remain alert while it is activated. In theory, if the car doesn’t detect the driver’s hands on the wheel, it will sound alarms and eventually hand control of the vehicle back to the driver.
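
As a purely illustrative sketch of the kind of hands-off escalation described above, the logic might look something like the following. Tesla’s actual stages, timings, and thresholds are not public; everything below is invented for illustration.

```python
# Hypothetical sketch of a hands-off-the-wheel escalation sequence. This is
# NOT Tesla's implementation; the stages and timings are invented.
def escalation_stage(seconds_without_detected_hands: float) -> str:
    """Map time without detected steering-wheel input to an alert stage."""
    if seconds_without_detected_hands < 15:
        return "none"
    elif seconds_without_detected_hands < 30:
        return "visual_alert"   # message flashes on the center screen
    elif seconds_without_detected_hands < 45:
        return "audible_alarm"  # chimes sound in addition to the visual alert
    else:
        return "disengage"      # system hands control back to the driver
```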

In practice, drivers have long misunderstood the capabilities of Autopilot, especially after Tesla released a separate software upgrade called “Full Self-Driving Beta” that, like Autopilot, must be supervised by an attentive driver but, unlike Autopilot, can be enabled on most kinds of roads, including city streets. In addition, for at least five years, some Tesla owners have been experimenting with how to disable or circumvent the features that attempt to ensure the software is being used properly.

But, just in time for the holidays, here comes the National Highway Traffic Safety Administration right down Transportation Safety Lane. Wrapping up a two-year-plus investigation, in which NHTSA said it reviewed logs from nearly 1,000 crashes involving Autopilot, the agency has slapped Tesla on the wrist yet again, resulting in Tesla issuing an over-the-air software update to two million vehicles in the U.S. that will “incorporate additional controls and alerts to those already existing on affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged.”

The recall notice is somewhat vague on what exactly this will result in, but it does say vehicles will now have visual alerts with more “prominence,” simpler engagement and disengagement of Autosteer, and “additional checks” that Autosteer isn’t being used on roads it is not intended for, an issue the Washington Post recently detailed.

In February, Tesla issued an NHTSA-initiated recall for approximately 362,000 vehicles enabled with Full Self-Driving Beta because vehicles using the experimental software on public roads sometimes ran through “stale yellow” lights. In both recalls, Tesla did not agree with the agency’s analysis but volunteered to issue the recall anyway in order to close the issue.

People Think Their Cars Are Self-Driving Even Though They’re Not, Study Finds
https://www.vice.com/en/article/people-think-their-cars-are-self-driving-even-though-theyre-not-study-finds/
Wed, 12 Oct 2022
Yet more confusion over what is or isn’t self-driving.

A new survey of about 600 owners of cars with advanced driving assist features like Tesla Autopilot and GM Super Cruise found that even the people who own these cars are confused about what they are technologically capable of. The survey, conducted by the Insurance Institute for Highway Safety (IIHS), found 53 percent of Super Cruise drivers, 42 percent of Autopilot users, and 12 percent of Nissan ProPILOT Assist drivers “were comfortable treating their vehicles as fully self-driving” even though they are not. 

The survey’s findings add to the mounting evidence that drivers don’t know exactly what self-driving cars are, which leads them to believe their own cars are more capable than they actually are. The confusion is a product of the wide gap between complex engineering jargon and the colloquial terms people use all the time.

The study asked approximately 200 owners of each of three prominent driver assist technology suites sold in cars today—GM’s Super Cruise, Tesla’s Autopilot, and Nissan’s ProPILOT Assist—how comfortable they are doing various things while these systems are engaged. These systems are, in industry jargon, “Level 2” driving assist, meaning they can perform some driving tasks on their own but must have a driver paying attention at all times. Specifically, the cars can maintain a set speed and brake for any slower-moving cars ahead (adaptive cruise control) and, under ideal conditions, maintain a lane. All three will offer varying degrees of alerts if they detect the driver is not paying attention, with Super Cruise being the strictest, using eye-tracking technology to ensure drivers look at the road.
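
For a concrete, if toy, picture of what those two core Level 2 behaviors amount to, a rough sketch follows. The logic and numbers are invented for illustration; real systems are vastly more complex.

```python
# Toy sketch of Level 2 driver assistance: adaptive cruise control plus lane
# centering. All values and gains are invented for illustration only.
def target_speed(set_speed_mph, lead_vehicle_speed_mph=None):
    """Hold the driver's set speed unless a slower vehicle ahead requires matching it."""
    if lead_vehicle_speed_mph is None:
        return set_speed_mph
    return min(set_speed_mph, lead_vehicle_speed_mph)

def steering_correction(lane_center_offset_m, gain=0.5):
    """Proportional nudge back toward the lane center."""
    return -gain * lane_center_offset_m
```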

While the main finding of the survey is that more than half of all Super Cruise drivers and almost half of all Autopilot users think their cars are self-driving, the actual survey results make clear this is mostly because people do not understand or care about the engineering definition of “self-driving.”

For engineers and safety experts, “self-driving” means the car can literally drive itself under all circumstances and no human is required. But it is obvious the survey respondents were interpreting the term differently, possibly to mean something more like “I am literally not doing anything to drive the car at that particular time” even if they are still remaining somewhat alert.

For example, only three percent of GM owners and seven percent of Tesla owners thought it was safe to sleep while their driver assist systems were on. That’s worryingly high considering the correct answer is that it is absolutely not safe to do that, but it demonstrates ordinary people think “self-driving” means something very different than safety experts and engineers do. (It is also worth noting that one cannot sleep while Super Cruise is engaged because the eye tracking system notices when eyelids are closed, even with sunglasses on. Which adds another caveat to the survey results: Asking people whether they feel something is safe is not the same as asking whether they actually do it.)

Similarly, fewer than 15 percent of owners of both GMs and Teslas think it is safe to read while the systems are on. And the share of Autopilot users who think it is safe to “look away from the road for more than a few seconds” when Autopilot is engaged (36 percent) is smaller than the share who think the car is self-driving (42 percent). Again, these are worryingly high numbers considering they are incorrect from a technological and safety perspective, but they also show people do not so much think their cars are self-driving as they have made up their own definitions of what self-driving is, or are seeing the survey as a type of evaluation of the system’s capabilities.

Which is, of course, exactly the problem. There is a fundamental tension with these so-called driver assist features. The car companies cannot market them as safety features because then they’d have to prove they make driving safer. It is much easier, cheaper, and less legally risky to market them as convenience features instead. So owners assume they can do other things while the features are on, since that is what would make them convenient. And it is unlikely finger-wagging by safety experts or government watchdogs over definitional differences will overcome the misperceptions already cemented in people’s minds.

Government Data on Computer-Assisted Driving Car Crashes ‘Raise More Questions Than They Answer’
https://www.vice.com/en/article/government-data-on-computer-assisted-driving-car-crashes-raise-more-questions-than-they-answer/
Wed, 15 Jun 2022
The first-ever data release doesn’t tell us anything new, but may signal a first step towards real oversight for systems manufacturers insist are safe.

For the last year, the National Highway Traffic Safety Administration, an agency within the Department of Transportation, has been collecting data from vehicle manufacturers on car crashes involving computer-assisted driving systems, known in the industry as Advanced Driver Assistance Systems (ADAS). This includes Tesla’s Autopilot and Full Self-Driving (which is not self-driving) as well as similar systems available from most manufacturers these days.

NHTSA released the first set of data on Wednesday, a landmark in federal oversight of this still-nascent technology, because until now nobody had any idea how safe these systems are. This data does not answer that question, but it does get us a step closer.

NHTSA received 392 reports of crashes involving ADAS, defined as any crash where such a system was active within 30 seconds of the crash, and 273 of them were in Teslas. The crashes were self-reported by the manufacturers themselves. Most of those reports were generated based on vehicle telematics data, but 35 percent came from complaints or claims submitted by motorists. In most crashes (294), NHTSA doesn’t know if anyone was injured as a result, but 11 of the 98 crashes where injury information was available involved a serious injury or fatality. And the cars crashed into all types of stuff: other cars, “other fixed objects,” animals, poles and trees, buses, trucks, vans, three pedestrians, and one cyclist. In 146 cases, NHTSA doesn’t know what the car crashed into.

To put these numbers into perspective, there are approximately 5.2 million car crashes in the U.S. per year, according to the Bureau of Transportation Statistics, although that’s just an estimate because nobody officially tracks that. At least 36,500 drivers crash their cars into buildings every year, but that’s just an estimate by an independent researcher because nobody tracks that either. Even if someone died in each of the 392 ADAS crashes, it would account for less than one percent of the 42,915 estimated road deaths last year in the U.S. 
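
As a quick back-of-the-envelope check of that last comparison (a reader’s calculation, not part of NHTSA’s release):

```python
# Back-of-the-envelope check of the comparison above.
adas_crash_reports = 392        # ADAS crash reports received by NHTSA
estimated_road_deaths = 42_915  # estimated U.S. road deaths last year

worst_case_share = adas_crash_reports / estimated_road_deaths
print(f"{worst_case_share:.2%}")  # -> 0.91%, i.e. under one percent
```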

For now, there are too many issues with the ADAS crash data to draw any conclusions, something NHTSA acknowledges. NHTSA head Steven Cliff told reporters, “The data may raise more questions than they answer.” Among the problems with the data: There is likely some double-counting, there are surely many crashes that never get reported, manufacturers with more centralized data harvesting capabilities (like Tesla) will be more aware of crashes than manufacturers that don’t regularly analyze car data, and none of the data is weighted for how many ADAS-capable vehicles are on the road for each manufacturer, not to mention the percentage of time those systems are actually active.
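
To make that last caveat concrete, raw crash counts only become comparable once they are divided by exposure, something like crashes per million miles driven with the system active. A minimal sketch of that normalization follows; the function and every example number are invented for illustration, since the data needed to compute such a rate was not released.

```python
# Hypothetical sketch of the exposure normalization the raw counts lack.
# NHTSA's release does not include fleet sizes, mileage, or system-active time,
# so every number in the example call below is invented.
def crashes_per_million_active_miles(crashes, fleet_size, annual_miles_per_vehicle,
                                     fraction_of_miles_system_active):
    active_miles = fleet_size * annual_miles_per_vehicle * fraction_of_miles_system_active
    return crashes / (active_miles / 1_000_000)

# Example with made-up inputs: 300 crashes, a 1,000,000-vehicle fleet,
# 12,000 miles per vehicle per year, system active on 15% of those miles.
rate = crashes_per_million_active_miles(300, 1_000_000, 12_000, 0.15)
```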

Nevertheless, the simple act of acknowledging this is a problem under NHTSA’s purview is a significant and noteworthy break from NHTSA’s past, when the agency more or less took everything auto manufacturers said about the safety of these systems at face value.

More Phantom Braking in Teslas as It Keeps Fixing Then Busting Its Software
https://www.vice.com/en/article/more-phantom-braking-in-teslas-as-it-keeps-fixing-then-busting-its-software/
Mon, 15 Nov 2021
The more updates, the more potential for bugs.

Teslas appear to once again be struggling with the “phantom braking” phenomenon, in which a car on Autopilot applies the brakes when it detects an obstruction that doesn’t exist, according to Electrek and National Highway Traffic Safety Administration (NHTSA) complaints.

These instances pose serious safety issues because other drivers are not expecting the cars in front of them to slam on the brakes, increasing the odds of rear-end collisions. They also undermine the general narrative that Autopilot and similar automatic-braking systems make driving safer, because they introduce a new hazard of sudden, inexplicable brake-slamming that didn’t previously exist with human drivers.

Automakers have had problems with phantom braking for years, but Teslas seem to be especially prone to it, as extensively documented in various driver forums such as the popular TeslaMotors subreddit. Because Tesla owners tend to be more technically savvy and attentive to software variations, numerous theories have always been floated whenever the phantom braking gets particularly bad for any given driver. But because NHTSA, the regulatory agency tasked with overseeing road safety, has so far failed to take advanced driver assist features like Autopilot seriously as a potential safety issue, it is little more than a guessing game as to how well these systems work and why they might be buggy.

Which is exactly where we find ourselves with the most recent problems reported by Electrek. The website said “things have been seemingly getting worse lately for many Tesla owners” in the last few weeks in particular. This is seemingly a separate and distinct issue from the recent Full Self-Driving Beta recall, which involved aggressive phantom braking among other issues, because it impacts Teslas without FSD Beta, running plain old Autopilot. The issues with Autopilot appear to be linked to the recent 2021.40 software update, but are only affecting some vehicles running that software. Electrek concludes that “There’s undeniably a significant uptick in phantom braking events, but it doesn’t seem to be affecting all cars the same way.”

For now, Tesla gets an outsized amount of attention on the phantom braking issue both because of its cavalier attitude towards using public roads as software testing grounds and because it updates its software the most frequently. But as nearly every other automaker moves towards frequent over-the-air updates to roll out new features as a new revenue stream, it is possible these problems will become more common.

The phantom braking issue highlights a problem with automatic safety features in cars that is often ignored or even dismissed by its proponents: Software quality ebbs and flows over time. It is not enough to simply make one good version once, because every future version has the potential to break what previously worked. For example, Musk said phantom braking would be “fixed” in October 2020; maybe it was, maybe it wasn’t, but either way it is obviously back. This concept is nothing new in computing. Almost every app update features “bug fixes and performance improvements.” That’s fine when talking about the latest update to Google Docs, less so when it has the potential to cause a pileup on the interstate.

Tesla Recalls ‘Self-Driving’ Software Update That Made Cars ‘Undrivable’
https://www.vice.com/en/article/tesla-recalls-self-driving-software-update-that-made-cars-undrivable/
Mon, 25 Oct 2021
For a little more than a day, thousands of Teslas had software causing some cars to slam on their brakes for no apparent reason.

Over the weekend, Tesla rolled out a new version of Full Self-Driving Beta to thousands of cars that was so full of safety-critical bugs it was uninstalled by the company within 36 hours. 

The software, version 10.3, rolled out Friday to Teslas enrolled in the FSD Beta program with a Safety Score of 99 or better, a metric that can easily be gamed. On Saturday, Elon Musk tweeted, “Regression in some left turns at traffic lights found by internal QA in 10.3. Fix in work, probably releasing tomorrow.” But that was not the biggest problem with 10.3. A few hours later, he said the company was “rolling back to 10.2 temporarily” after “seeing some issues with 10.3.”

What was the problem? According to hundreds of comments on the subreddit r/Teslamotors, the biggest issue appeared to be that the car’s forward collision warning (FCW) system stopped functioning properly, warning drivers of imminent crashes when there was plenty of distance or even no cars around at all. Several commenters said the car slammed on the brakes for no reason multiple times on short drives, almost causing them to get hit from behind. One commenter said, “I’ve had entirely too many phantom forward collision warnings and it is undrivable in its current state.” The bug affected the car’s FCW system even when Full Self-Driving mode was not enabled. 

Because Musk has disbanded Tesla’s public relations department and functionally made himself the company’s spokesperson, here is what he said about this release: “Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA, hence public beta.”

Tesla and its fans have long used the “it’s a beta, bugs are expected, it will get better” line to justify missteps, bugs, and half-baked ideas like the Safety Score (also a beta). That’s all well and good when talking about a weather app’s beta program to get the latest redesign. It’s another matter entirely for safety-critical software. Other car companies, airplane manufacturers, and similar industries of course have beta software as well, but it gets tested in private before rolling out to the public for precisely this reason.

Federal regulators are increasingly interested in Tesla’s software because of the company’s cavalier attitude towards using public roads as testing grounds for unproven technology that can radically change in capability from one software update to the next. The very existence of a “public beta” on safety critical software exemplifies the problem. 

Tesla supporters like to argue the company is unfairly maligned by detractors for being on the vanguard of a revolution. But if any car company sent out a software update that made thousands of its vehicles dangerous to drive, NHTSA would be asking questions of them, too.

Early Monday morning, Musk tweeted, “10.3.1 rolling out now.” 

A Tesla on Autopilot Crashed Into Highway Patrol Car
https://www.vice.com/en/article/a-tesla-reportedly-on-autopilot-has-crashed-into-highway-patrol-car/
Mon, 30 Aug 2021
The crash occurred days after federal regulators announced a probe into this very type of crash.

A Tesla reportedly on Autopilot crashed into a stationary police vehicle with its emergency lights on early Saturday morning in Orlando, just days after federal regulators announced an investigation into this type of crash.

According to the Orlando Sentinel, the crash occurred around 5 a.m. on I-4 near downtown Orlando. The police officer had stopped to help a disabled vehicle, and the Tesla crashed into the patrol car, which then crashed into the disabled vehicle. The Tesla “narrowly missed” the trooper, Florida Highway Patrol spokesperson Kim Montes told the Sentinel. The two drivers sustained minor injuries and the officer was not hurt.

According to the press release issued by FHP, the 26-year-old Tesla driver said Autopilot was activated at the time of the crash.

Either way, the description of the crash fits with previous instances of Teslas crashing into emergency vehicles, the subject of a recently announced National Highway Traffic Safety Administration (NHTSA) investigation. NHTSA is looking into 11 confirmed cases where a Tesla on Autopilot or Traffic Aware Cruise Control crashed into an emergency vehicle parked on the side of a road, typically at night, and with the first responders using some kind of high visibility warning like flashing lights, flares, or road cones. The 11 crashes resulted in 17 injuries and one death. FHP Lieutenant Kim Montes told Motherboard that FHP’s fleet manager is reaching out to NHTSA today and that the crash remains under investigation.

Update: This article has been updated with a response from FHP that confirmed the Tesla driver told police the car was on Autopilot at the time of the crash.

The Government Is Finally Catching Up With Tesla’s Wild Autopilot Claims
https://www.vice.com/en/article/the-government-is-finally-catching-up-with-teslas-wild-autopilot-claims/
Wed, 18 Aug 2021
After years of looking the other way, regulators might finally be getting around to caring about Tesla's deceptive self-driving claims.

Tesla, the world’s most frustrating company, simultaneously makes what are widely regarded as the best electric vehicles and the most functional and comprehensive charging network while also selling the world’s most dangerous and widely abused driver-assist features. Thanks to years of the company’s misleading marketing of the “Autopilot” and “Full Self-Driving” packages—as well as frequent wild claims by the extremely online CEO Elon Musk, such as the 2019 prediction that there would be one million Tesla robotaxis by 2020—owners perceive the technology to be far more capable than it is.

After years of looking the other way, it’s possible that maybe, just maybe, the government is finally going to do something about Tesla’s massive beta test in which we are all experiment subjects.

On Monday, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into 11 cases where a Tesla on Autopilot crashed into emergency vehicles. NHTSA has previously disclosed it is also investigating 30 other Tesla crashes in which 10 people died, most involving Autopilot or FSD.

NHTSA’s investigations alone indicate a new degree of seriousness from the agency under the Biden administration, but Tesla faces criticism from elsewhere in the government, too. On Wednesday, Senators Richard Blumenthal and Ed Markey sent a letter to Federal Trade Commission chief Lina Khan asking the agency to open its own investigation into Tesla’s deceptive marketing practices around Autopilot and FSD. The letter cites a video Tesla posted to YouTube in 2019, with 18 million views, showing someone “driving” the car without touching the wheel for more than a minute, in violation of Tesla’s own stated safety policies.

Even taking Tesla’s policies more or less at face value—and setting aside the highly publicized ways Teslas have for years been easily tricked into driving on their own for extended periods, bugs Tesla could fix with a software update—Tesla has always tried to have it both ways. It promotes these driver assist features as if they basically drive the car themselves—the names are “Autopilot” and “Full Self-Driving,” after all—and you can pay $10,000 for the privilege of using them, a premium price for what’s being sold as a premium experience. But, in the fine legal print, the company says these features are no more reliable than any other Level 2 driver assist system, which can be found from virtually every other manufacturer, and that the driver must still pay close attention at all times. Some drivers tragically find this out the hard way, like George McGee, a man in Florida who reached down to pick up his phone thinking Autopilot was in control when the car promptly slammed into another car, killing a woman. When police arrived, he referred to the car’s capabilities as “stupid cruise control.”

Whether anything will come of these investigations remains to be seen—or, in the FTC’s case, whether an investigation will be opened at all. But if the last five years or so have taught us anything, it’s that Tesla won’t stop until someone makes them.
