James Robinson | 12 Sept 2016

Autopilot update on Tesla's radar

Tesla's software update paints a radar-guided picture of the world

Tesla has announced Version 8 of its Autopilot software, which enhances the radar signal processing of its vehicles.

Installed in all Teslas from 2014 onwards, the radar was originally part of the Autopilot suite and was intended only as a supplementary sensor to the car's primary camera and image processing system.

After careful consideration and much research, however, Tesla believes the radar can now be used as a primary control sensor, without needing to rely on the car's camera system for visual-image confirmation.

According to Tesla, the benefit of using this technology is that the radar's wavelength travels easily through visual obstructions such as fog, dust, rain and snow.

However, Tesla has admitted this is no small feat, as a radar's view of the world is vastly different to that of a camera.

To radar, anything metallic looks like a mirror, while humans register as only partially translucent. Something made of wood or painted plastic, meanwhile, appears to radar almost as transparent as glass.

According to Tesla, however, the biggest problem is caused by any metal surface with a dish shape, as the dish not only reflects the radar signal but amplifies it, making the object appear far larger than it really is.

Tesla cites the example of a soft drink can. If discarded on the road with its concave bottom facing towards the car, the drink can could appear to the radar as a large and dangerous object.

Tesla has recognised that these technical difficulties create a serious risk of cars braking unnecessarily to avoid objects that aren't really there, which in turn could do far more harm than good.

While the company has solutions to these problems, they are quite technical, so we will let Tesla explain them:

The first part of solving that problem is having a more detailed point cloud. Software 8.0 unlocks access to six times as many radar objects with the same hardware with a lot more information per object.

The second part consists of assembling those radar snapshots, which take place every tenth of a second, into a 3D "picture" of the world. It is hard to tell from a single frame whether an object is moving or stationary or to distinguish spurious reflections. By comparing several contiguous frames against vehicle velocity and expected path, the car can tell if something is real and assess the probability of collision.
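Tesla's multi-frame comparison could be sketched roughly as follows. This is purely illustrative — the function name, the simple range-closure model and the tolerance are assumptions, not Tesla's actual implementation — but it shows the idea: a real, stationary object's range should shrink by the car's speed times the frame interval on every snapshot, while a spurious reflection jumps around inconsistently.

```python
# Illustrative sketch of checking consecutive radar frames against vehicle
# velocity, as the quoted passage describes. All names/thresholds are assumed.

def is_real_stationary_object(ranges, ego_speed, frame_dt=0.1, tol=0.5):
    """Given range readings (metres) to one radar object over consecutive
    frames taken every frame_dt seconds, and the car's speed (m/s), decide
    whether the readings are consistent with a real, stationary object
    on the car's path."""
    expected_closure = ego_speed * frame_dt  # how much closer it should get per frame
    for prev, curr in zip(ranges, ranges[1:]):
        closure = prev - curr
        if abs(closure - expected_closure) > tol:
            return False  # jumps inconsistently: likely a spurious reflection
    return True

# At 20 m/s the range to a fixed object should close by ~2 m per 0.1 s frame:
print(is_real_stationary_object([50.0, 48.0, 46.1], 20.0))  # consistent
print(is_real_stationary_object([50.0, 30.0, 55.0], 20.0))  # spurious
```

A production system would of course work on full 3D point clouds and the car's expected path rather than a single range track, but the per-frame consistency test is the core of the idea.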

The third part is a lot more difficult. When the car is approaching an overhead highway road sign positioned on a rise in the road or a bridge where the road dips underneath, this often looks like a collision course. The navigation data and height accuracy of the GPS are not enough to know whether the car will pass under the object or not. By the time the car is close and the road pitch changes, it is too late to brake.

This is where fleet learning comes in handy. Initially, the vehicle fleet will take no action except to note the position of road signs, bridges and other stationary objects, mapping the world according to radar. The car computer will then silently compare when it would have braked to the driver action and upload that to the Tesla database. If several cars drive safely past a given radar object, whether Autopilot is turned on or off, then that object is added to the geocoded whitelist.
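The geocoded whitelist Tesla describes could be sketched like this. The class, the pass threshold and the coordinate-rounding granularity are all assumptions made for illustration; the point is simply that an object is whitelisted once enough cars have driven safely past its location.

```python
# Hypothetical sketch of a fleet-learned radar whitelist; not Tesla's code.
from collections import defaultdict

class RadarWhitelist:
    def __init__(self, safe_passes_required=5):
        self.safe_passes_required = safe_passes_required
        self._passes = defaultdict(int)   # location key -> safe-pass count
        self._whitelist = set()

    @staticmethod
    def _key(lat, lon):
        # Geocode by rounding coordinates to ~10 m cells (assumed granularity).
        return (round(lat, 4), round(lon, 4))

    def record_safe_pass(self, lat, lon):
        """A car drove safely past this radar object, Autopilot on or off."""
        key = self._key(lat, lon)
        self._passes[key] += 1
        if self._passes[key] >= self.safe_passes_required:
            self._whitelist.add(key)

    def is_whitelisted(self, lat, lon):
        return self._key(lat, lon) in self._whitelist
```

In this sketch, an overhead sign only stops triggering alerts once several independent safe passes have been logged against its geocoded cell.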

When the data shows that false braking events would be rare, the car will begin mild braking using radar, even if the camera doesn't notice the object ahead. As the system confidence level rises, the braking force will gradually increase to full strength when it is approximately 99.99% certain of a collision. This may not always prevent a collision entirely, but the impact speed will be dramatically reduced to the point where there are unlikely to be serious injuries to the vehicle occupants.
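The confidence-graded braking described above might look something like this in outline. The confidence floor and the linear ramp are assumptions for illustration; only the ~99.99% full-braking figure comes from Tesla's description.

```python
# Illustrative mapping from collision probability to braking force fraction.
# The floor value and linear ramp are assumed, not Tesla's actual curve.

def braking_force(collision_probability, confidence_floor=0.99):
    """Return a braking force fraction in [0, 1]: nothing below the
    confidence floor, ramping up to full braking at ~99.99% certainty."""
    full_at = 0.9999
    if collision_probability < confidence_floor:
        return 0.0          # too uncertain: risk of a false braking event
    if collision_probability >= full_at:
        return 1.0          # near-certain collision: full-strength braking
    # Linear ramp between mild and full braking as confidence rises
    return (collision_probability - confidence_floor) / (full_at - confidence_floor)
```

Even when full braking comes too late to avoid the collision entirely, reducing impact speed this way is what makes serious injury unlikely.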

We're guessing that was written by an engineer...

In addition to the points already noted, a Tesla will also be able to bounce a radar signal underneath the car in front of it using radar pulse signature and photon time of flight to establish what is in front of the car ahead.

So even if you are following a car in dense fog with zero visibility, a Tesla will still be able to brake in an emergency if it detects a hazard that perhaps the car in front can't avoid.
