Tesla fires worker reviewing its full self-driving feature on YouTube


Elon Musk-run Tesla has fired an employee who reviewed the electric car maker’s full self-driving (FSD) beta software on his YouTube channel.

John Bernal posted the video that showed his Tesla hitting a bollard on his YouTube channel AI Addict.

As reported by CNBC, Bernal said that prior to his dismissal, he was told verbally by his managers that he “broke Tesla policy” and that his YouTube channel was a “conflict of interest”.

However, his written separation notice did not specify a reason for his dismissal, reports The Verge.

The video had more than 250,000 views and was shared widely on social networks like Twitter.

Bernal said that after posting the video, “A manager from my Autopilot team tried to dissuade me from posting any negative or critical content in the future that involved FSD Beta. They held a video conference with me but never put anything in writing.”

Tesla’s social media policy for employees does not forbid criticism of the company’s products in public, but says that the company “relies on the common sense and good judgment of its employees to engage in responsible social media activity”.

Bernal says that after being fired, his access to the FSD Beta software was revoked.

Meanwhile, US senators have rejected Elon Musk-run Tesla’s claim that its Autopilot and FSD features are safe for driving, calling the company’s response “more evasion and deflection from Tesla”.

Rohan Patel, Senior Director of Public Policy at Tesla, wrote in a letter to US Senators Richard Blumenthal (D-CT) and Ed Markey (D-MA) that Tesla’s Autopilot and FSD capability features “enhance the ability of our customers to drive safer than the average driver in the US”.

Patel was responding to the senators, who had raised “significant concerns” about Autopilot and FSD and urged federal regulators to crack down on Tesla to prevent further misuse of the company’s advanced driver-assist features.

The FSD beta mode recently resulted in a Tesla Model Y crashing in Los Angeles.

No one was injured in the crash, but the vehicle was reportedly “severely damaged”.

The crash was reported to the National Highway Traffic Safety Administration (NHTSA), which has multiple, overlapping investigations into Tesla’s Autopilot system.

Tesla’s FSD beta aims to enable Tesla vehicles to virtually drive themselves on both highways and city streets once a destination is entered in the navigation system, but it is still considered a Level 2 driver-assist system since it requires driver supervision at all times.

The driver remains responsible for the vehicle and needs to keep their hands on the steering wheel and be ready to take control.

Several Tesla Autopilot-related crashes are currently under investigation by the NHTSA.
