China regulator says Tesla recalling 107,293 China-made Model 3, Model Y vehicles

China’s market regulatory agency said on Monday that Tesla Inc is recalling 107,293 China-made Model 3 and Model Y vehicles due to overheating that may cause the centre touchscreen display to malfunction, among other issues.

The overheating could also lead to other malfunctions, including issues with the windshield settings and gear displays, according to a statement published by the State Administration for Market Regulation (SAMR).

Tesla Autopilot Stirs U.S. Alarm as ‘Disaster Waiting to Happen’

(Bloomberg) — Derrick Monet and his wife, Jenna, were driving on an Indiana interstate in 2019 when their Tesla Model 3 sedan operating on Autopilot crashed into a parked fire truck. Derrick, then 25, sustained spine, neck, shoulder, rib and leg fractures. Jenna, 23, died at the hospital.

The incident was one of a dozen in the last four years in which Teslas using this driver-assistance system collided with first-responder vehicles, raising questions about the safety of technology the world’s most valuable car company considers one of its crown jewels.

Now, U.S. regulators are applying greater scrutiny to Autopilot than ever before. The National Highway Traffic Safety Administration, which has the authority to force recalls, has opened two formal defect investigations that could ultimately lead Tesla Inc. to have to retrofit cars and restrict use of Autopilot in situations it still can’t safely handle.

A clampdown on Autopilot could tarnish Tesla’s reputation with consumers and spook investors whose belief in the company’s self-driving bona fides has helped make Tesla Chief Executive Officer Elon Musk the world’s wealthiest person. It could damage confidence in technology other auto and software companies are spending billions to develop in hope of reversing a troubling trend of soaring U.S. traffic fatalities.

It could also bring long-simmering tensions between Washington and Tesla to a boil. The iconoclastic Musk has already derided NHTSA as the “fun police” and chafed at President Joe Biden’s unwillingness to lavish the pioneering company with praise. He’s not shy about lambasting lawmakers and regulators on Twitter, the social media platform he has offered to purchase for $43 billion.

Tesla, which reports earnings later this week, has lately had an aura of invincibility. As larger rivals were hobbled by the global chip shortage and other pandemic disruptions, the electric-car maker managed to substantially increase production. A modestly funded, slow-moving government agency is one of few obstacles threatening to throw it off course.

Musk and Tesla did not respond to requests for comment. “Making our vehicles safer is foundational to our company culture and how we innovate new technologies,” Rohan Patel, Tesla’s senior director of public policy and business development, wrote in a March letter to lawmakers.

A crackdown from NHTSA would follow repeated pleas from the National Transportation Safety Board, the independent accident-investigation agency, to tighten oversight of automated vehicles. The NTSB, which doesn’t have the power to compel carmakers to follow its recommendations, has suggested Tesla embrace automated-driving system safeguards that General Motors Co. and Ford Motor Co. have adopted for their systems. Tesla hasn’t responded to the NTSB’s guidance and has instead continued its riskier approach.

“We essentially have the Wild West on our roads right now,” Jennifer Homendy, the chair of the NTSB, said in an interview. She describes Tesla’s deployment of features marketed as Autopilot and Full Self-Driving as artificial-intelligence experiments using untrained operators of 5,000-pound vehicles. “It is a disaster waiting to happen.”

Light Touch

Musk has taken advantage of a light-touch approach in the U.S. to regulating self-driving technology. Within days of Tesla releasing a software update that enabled Autopilot in October 2015, YouTubers posted videos of themselves ignoring the company’s warnings against taking their hands off the wheel. One nearly auto-steered off the road; the other almost veered into an oncoming car.

Two months before a Tesla driver in Florida died when his Model S on Autopilot plowed into an 18-wheel trailer in May 2016, NHTSA said existing laws in the country posed few barriers to driver-assistance systems. Then-Transportation Secretary Anthony Foxx said weeks after the crash that NHTSA would release guidelines, rather than rules, for the technology. Congress hasn’t enacted any laws specifically addressing oversight of vehicle automation.

Musk alluded to this regulatory permissiveness in March when he was asked when Europeans will get to test Full Self-Driving, or FSD, a set of beta features available in the U.S. Contrary to the name, FSD doesn’t render Tesla cars capable of driving themselves.

“In the U.S., things are legal by default,” Musk said. “In Europe, they’re illegal by default. So we have to get approval beforehand, whereas in the U.S., you can kind of do it on your own cognizance, more or less.”

Tesla’s approach to automated-driving features contrasts with that of legacy automakers GM and Ford, whose systems use cameras behind the steering wheel to monitor whether drivers are paying attention. The companies also restrict use of the systems to highways their engineers have mapped and tested out before deploying the technology to drivers.

“Tesla sticks out like a sore thumb,” said David Friedman, who was deputy and acting administrator of NHTSA from 2013 to 2015. “And it has for years.”

NHTSA has repeatedly reminded the public — including in comments provided for this story — that no commercially available vehicle can drive itself. The agency has opened 31 special investigations into crashes involving driver-assistance systems, 24 of which involved Teslas. But the company keeps hawking FSD — and charges $12,000 for it.

There’s growing discomfort with this state of play in Washington.

“I really dislike a lot of what Tesla has done, and at the top of the list in bright, bold letters, is Elon Musk’s habit of making false public claims, and using his podium in a way that creates safety risks,” Heidi King, a deputy and acting administrator of NHTSA during the Trump administration, said in an interview.

“We all admire his visionary attributes,” King said of Musk. “But visionary exaggerations about a consumer product can be very, very dangerous.”

Growing Scrutiny

King was one of several acting heads of NHTSA during what has been a five-year leadership vacuum. The last Senate-confirmed administrator left the post in January 2017. A vote to permanently place Biden’s pick to run the agency, Steve Cliff, in the position is being held up indefinitely.

Impermanent leadership — along with a tight budget and modest headcount — may have prolonged Autopilot’s free ride. But a series of moves NHTSA has made over the last 10 months suggests it may not last much longer:

In June, NHTSA ordered automakers to report crashes in which automated-driving systems are activated

In August, NHTSA opened the defect investigation related to first-responder crash scenes

In September, NHTSA sought documents from a dozen Tesla competitors about their automated systems

In October, NHTSA grilled Tesla over why it neglected to do a recall when it deployed a software update to improve emergency-vehicle detection, and sought information about expanded availability of FSD

In November, Tesla recalled a version of FSD

In February, Tesla conducted another FSD-related recall to disable a setting that allowed vehicles to roll through stop signs, and NHTSA opened a second Autopilot defect investigation

Former safety officials are encouraged by the growing scrutiny on Autopilot, seeing it as long overdue. They are calling for NHTSA to put its recall authority to use and seek additional powers and resources from Congress that would allow it to modernize safety standards.

“NHTSA is empowered with robust tools and authorities to protect the public, to investigate potential safety issues, and to compel recalls when we find evidence of noncompliance or an unreasonable risk to safety,” a spokesperson for the agency said in a statement. “NHTSA has collected data and conducted research, developed test procedures and measured their effectiveness, which are all necessary requirements before a safety standard can be developed.”

Two Democratic Senators — Ed Markey and Richard Blumenthal — have called for the Federal Trade Commission to probe whether Tesla has deceptively marketed Autopilot and FSD. FTC Chair Lina Khan told the lawmakers in September she couldn’t reveal information regarding any non-public investigations.

Recall Options

In the event NHTSA determines from either of its investigations that there are defects pertaining to Autopilot, it can order Tesla to conduct recalls. Those could take a variety of forms, because Tesla is permitted by law to choose how exactly it responds to such an order.

Addressing a defect could be as simple as beaming an over-the-air update to Tesla cars using their internet connection, much in the way smartphones receive software updates. Tesla has already carried out several recalls this way, and could update Autopilot’s software to keep the system from operating in certain domains it’s not yet able to safely navigate.

But pricier fixes may end up being needed. One example: Tesla could determine it needs to install cameras behind its steering wheel to monitor whether drivers are paying attention while using its systems, as other automakers do.

While the company has put cabin-facing cameras in its cars for years, they’re positioned over the rear-view mirror, rather than directly in front of the driver. Musk has said the cameras are meant for a robotaxi service that doesn’t yet exist.

It’s unlikely Tesla would opt for the most expensive outcome of all: replacing vehicles entirely. But a third option for manufacturers to remedy vehicles they’re forced to recall is to issue refunds, which also would be costly. Tesla has steadily increased the price of FSD, and used to charge thousands of dollars for Autopilot before making it a standard feature in 2019.

Tesla will have had it coming if NHTSA does take action on Autopilot, according to Friedman.

“The NTSB has been pointing out since that 2016 crash — where the Tesla literally couldn’t see the broadside of an 18-wheeler — that there are serious concerns,” Friedman, who is now vice president of advocacy for Consumer Reports, said in an interview. “How is it that an automated vehicle can’t safely maneuver around an emergency vehicle? That’s literally one of the first things you learn in driver’s ed: if there’s an emergency vehicle, you don’t run into it.”

Taking the Mantle

When NHTSA first investigated more than five years ago whether Autopilot was defective, it found that the driver of the Tesla Model S that crashed into a trailer in Florida had ignored his Tesla’s warnings to maintain control. In a report stating it found no defect and was closing its probe, NHTSA said Tesla supplied data that showed Tesla vehicles’ crash rate dropped almost 40% after installation of Autosteer, an Autopilot feature.

Two years later, a data-analysis company issued a report calling that finding into question. Quality Control Systems, a firm that sued the Transportation Department to obtain the mileage and crash figures NHTSA studied, found the data was incomplete and criticized the company and regulator for making “tenuous” safety claims.

“NHTSA never, ever, ever, should have just taken Tesla at their word,” Friedman said. “It’s NHTSA’s responsibility to do high-quality analysis, and dot their i’s and cross their t’s. In this case, it doesn’t look like they did either.”

An agency spokesperson said NHTSA made no claim in its report regarding the effectiveness of Autosteer, and that it lacked critical information to do so.

NHTSA will have a fresh advantage in its latest probes of Autopilot: Now that other companies have followed Tesla to market with automated-driving features, the agency has other systems to compare against.

Friedman likens the situation to decades ago, when it wasn’t unusual for carmakers to put gas tanks behind or hovering over the rear axle. When other manufacturers began moving tanks inboard and Ford didn’t with its Pinto model, leaving the car prone to catching fire, the agency deemed the design an unreasonable safety risk.

“Only NHTSA knows their intentions relative to this,” Friedman said of the agency’s Autopilot investigations. “But it is certainly great to see NHTSA spending more time doing its core job when it comes to putting safety first.”
