Elon Musk’s Tesla Motors has remained mostly mum about the Autopilot controversy that began in May, when the first fatal crash involving one of its electric cars was reported. However, the Palo Alto-headquartered automotive and energy storage company appears to have subtly addressed its critics on Twitter and reminded everyone how Autopilot really works.

Earlier today, Tesla took to its official Twitter account (@TeslaMotors) to share Wired’s report on how the company’s Autopilot actually works. Wired’s video report explores the different hardware components that enable the self-driving feature to function. Using a Tesla Model S, Wired reviewed how Autopilot’s radar and camera work to make driving safer and easier.

In the video, Autopilot noticeably prompts the driver to keep his hands on the wheel, but the driver demurs, saying, “I’m supposed to keep my hands on the wheel, but you can see how relaxing it does feel. It lulls you to this sense of security.” Nevertheless, it is pretty clear that the video report aims to inform consumers that Autopilot does not entail the absence of driver supervision, and that Tesla’s cars may be self-driving but are not at all designed to be driverless.

Tesla’s tweet comes more than a week after another Tesla crash. According to Electrek, the new case took place in Kaufman, Texas, on Aug. 7, and it involved luxury home builder Mark Molthan, who survived the accident and admitted that he was not paying attention when his Model S hit a guardrail.

“I used Autopilot all the time on that stretch of the highway. But now I feel like this is extremely dangerous. It gives you a false sense of security. I’m not ready to be a test pilot. It missed the curve and drove straight into the guardrail. The car didn’t stop — it actually continued to accelerate after the first impact into the guardrail,” Molthan told Bloomberg in a phone interview.

However, Electrek defended Tesla, stating that Autopilot does not give a “false sense” of security as Molthan suggests. Instead, the feature is there to reduce the driver’s workload and make driving safer, but the driver still needs to be vigilant while on the road.

Just two weeks ago, another accident involving a Model S made headlines. The accident happened in Beijing, China, and the driver reportedly admitted that he was looking at his phone at the time of the crash, according to Electrek. Still, the driver put the blame on Autopilot.

In June, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Tesla’s Autopilot after the first fatal crash involving a Tesla Model S happened in central Florida a month prior, according to The Verge.

At the time, Tesla CEO Elon Musk took to Twitter to explain that the car’s radar did not help prevent the crash into the huge tractor-trailer because the “radar tunes out what looks like an overhead road sign to avoid false braking events.” Musk also expressed his sympathy over the death of the driver, identified as Joshua Brown, in a separate tweet, which read: “Our condolences for the tragic loss.”