Chinese real-world self-driving test: 36 cars, 216 crashes, with Tesla on top

Chinese media outlet Dongchedi closed down a real highway for a multi-day test of 36 different car driver assist systems in complicated, dangerous real-world driving situations, and most came up severely lacking – though Tesla escaped the tests relatively unscathed.

Over the years, we’ve seen our fair share of crash tests, often conducted in labs to detect the severity of a crash and the probability of injury to occupants. These tests focus on how well a car’s physical crash structures protect occupants, and occasionally other road-users, in the event of a crash.

Then there are “real-world” tests, like the famous “moose test” where a driver has to do a rapid direction change to avoid an intruding object in the road, testing vehicle dynamics and whether a car is able to handle quick changes in direction at high speed without rolling over.

More recently, crash tests have started to incorporate systems like Automatic Emergency Braking (AEB), which is intended to apply the brakes when a crash is imminent in order to reduce its severity, and, more recently still, the advanced driver aids collectively known as Advanced Driver Assistance Systems, or ADAS.

You’ve likely heard of these ADAS systems before, usually offered and branded by individual automakers, like Tesla’s Full Self-Driving, BYD’s God’s Eye, or Mercedes’ Drive Pilot. They’re not actually self-driving (well, Drive Pilot can drive you in certain circumstances, and Tesla says unsupervised FSD is Coming This Time Next Year™… for about the tenth year running), but they can fully control the vehicle on highways.

All of these fit under a common umbrella of SAE Level 2 systems that are meant to take some of the responsibilities of driving away from a human driver and let a computer handle them. This can help reduce driving fatigue, but more importantly, could also lead to safer driving as computers don’t lose attention or get tired and can theoretically make decisions much faster than a human could – or at least, that’s what auto industry marketing would like you to believe.

Despite the theoretical superiority of these computer systems, in the real world, anyone who has had experience with them knows that they can be strangely indecisive, and will often make different decisions even when encountering the same situation multiple times. That applies to these level 2 systems, and also to true self-driving systems like Waymo’s level 4 system.

Even if you haven’t driven in one, you’re probably skeptical. By now, we’ve all seen the Tesla Robotaxi fail videos, and heard about Autopilot deaths (including ones that get wrongly credited to Autopilot despite just being pedal confusion).

We’ve also seen that Tesla Wile E. Coyote video, where American YouTuber Mark Rober tested Tesla’s vision-only approach versus the vision+LiDAR approach – the latter of which most professionals agree is a more robust solution.

But one kind of test has been missing from all of this: a real-world, on-highway comparison of several brands of car, run by an independent source, in various complicated but plausible situations, with other cars driving nearby and the full ADAS system activated.

Well, in comes Dongchedi with a test that beats the scale of any we’ve seen yet, posted on its YouTube channel, DCARSTUDIO.

It’s a 92-minute video, available only in Chinese (with English subtitles), in which DCAR ran 36 separate cars available in China through six different situations to see how their ADAS performed. It’s a great video that merits a watch, even if the language barrier and length may be a tough sell.

And, spoiler alert: things didn’t go all that well for most of the cars tested.

The six tests went as follows, and most included other active vehicles nearby to increase complexity and realism:

  1. A situation where you are following a lead vehicle, and the lead vehicle suddenly darts into another lane, revealing a stopped car ahead of you, with traffic on your left restricting your ability to swerve or merge.
  2. A temporary construction zone in your lane, with short lead-up, requiring a merge.
  3. A construction zone forcing a merge, with a stationary truck parked on the shoulder and partially intruding into the active lane, at night.
  4. A stationary car with lights off, parked across two lanes, simulating a recently-crashed vehicle, at night.
  5. A vehicle joining the highway from an on-ramp and aggressively merging across lanes into the left lane in front of you, with no room to evade to the left due to guardrails.
  6. A boar darting across the highway.

Each test is a clearly difficult situation, and one which has led to many real-life accidents with human drivers. Each is also plausible; I would even hazard that most of us have seen something similar to at least one of these with our own eyes while driving (even beyond the simple construction zone test).

But if ADAS is supposed to be better and faster than humans, it should be able to handle these challenges, right? That is, after all, how many people use these systems, and how automakers market them (which is currently subject to legal action in California).

The Xiaomi SU7 reacted quickly in test 1, then let off the brakes, then hit them again, but couldn’t avoid a crash

What makes this test different than others that we’ve seen (for example, the Mark Rober video) is that it happened on an actual public highway. Some automakers restrict certain ADAS features to public roads, or specifically to public highways, which are well-marked and thus less likely to offer unpredictable situations to systems that are still not ready to brave chaotic city roads. Doing the test on an actual highway means that these systems can run at their full potential.

In each of the tests, a majority of the cars either failed miserably or did only so-so. At first it almost seemed like the tests were configured deliberately to be impossible for the ADAS systems – but in each test, a few cars managed to avoid any accident, a few reduced the accident to a minor, survivable collision, and sometimes a few even seemed to behave like a human would, stopping and then creeping around the obstacle in question as safely as they could.

Several cars were damaged, with the Mercedes losing its radar sensor on the boar test

Not all vehicles did all six tests, some due to damage that made it impossible for them to continue (e.g. the Mercedes C-Class broke its radar sensor on the boar test), and some because DCAR trimmed the field down to the best-performing vehicle of each brand for some of the more difficult tests, like the aggressive merging test. So, 216 crashes might be a little poetic license on DCAR’s part, but maybe they just didn’t want to spoil the results in the title.

Some vehicles also showed oddly conflicting behavior between their AEB and ADAS systems. Cars that DCAR had previously tested and given a passing grade for AEB performance sometimes did worse with ADAS engaged than those results would suggest. For example, there was a moment when the Xiaomi SU7 indicated it was activating AEB during test 1, then stopped decelerating for a few moments, then started decelerating again but was unable to avoid a crash.

The only test the Model X failed was the construction zone test

Another interesting pattern revealed itself: to avoid crashing into an object ahead of them in the lane, many of the systems tried to swerve first and only hit the brakes afterward. Swerving is often the less safe behavior, at least in situations like these on crowded highways, because it can spread an incident to other lanes, and because you don’t always know what’s right beside you at all times, given you only have two eyes on a swivel.

ADAS systems theoretically don’t have this disadvantage, since they can have cameras and sensors all around the car.

And yet, despite having those sensors and knowing there was no room to move over without hitting neighboring vehicles, the cars would quite often try to swerve into a side lane anyway, forcing those vehicles to take evasive action even when they were close to the median. Only after creating a more dangerous situation would they return to their lane and attempt to brake, by then unable to stop in time, because the time spent swerving and unsettling the car’s mass could have been better spent slowing the vehicle to avoid a collision or minimize its severity. Systems that are supposed to think much faster than a human showed the same potentially fatal indecision that so many human drivers show.

Of course, the best way to avoid all of this is just to leave more space between yourself and the car ahead. DCAR’s test driver often mentioned that the vehicles seemed to be following far too close before these accidents happened.

This is what happens when you swerve instead of braking: you don’t have enough time to slow down, and you hit the car in front of you.
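
To put rough numbers on that trade-off, here’s a minimal back-of-the-envelope sketch in Python. The speed, deceleration, and delay values are illustrative assumptions on my part, not measurements from DCAR’s test; the point is simply how much stopping distance a one-second swerve detour gives up at highway speed.

```python
# Back-of-the-envelope stopping-distance sketch.
# All numbers are illustrative assumptions, not figures from DCAR's test.

def stopping_distance(speed_kmh: float, decel_ms2: float, delay_s: float) -> float:
    """Metres travelled from the moment a hazard appears until the car stops,
    assuming full speed is held for delay_s and then constant braking."""
    v = speed_kmh / 3.6                   # km/h -> m/s
    return v * delay_s + v ** 2 / (2 * decel_ms2)

SPEED = 120.0   # km/h, a typical highway speed
DECEL = 7.5     # m/s^2, rough full-braking deceleration on dry asphalt

brake_immediately = stopping_distance(SPEED, DECEL, delay_s=0.5)  # detection delay only
swerve_then_brake = stopping_distance(SPEED, DECEL, delay_s=1.5)  # ~1 s lost to the swerve

print(f"Brake immediately:  {brake_immediately:.0f} m")                      # ~91 m
print(f"Swerve, then brake: {swerve_then_brake:.0f} m")                      # ~124 m
print(f"Distance given up:  {swerve_then_brake - brake_immediately:.0f} m")  # ~33 m
```

Under those assumptions, one extra second before braking costs roughly 33 meters at 120 km/h, around seven car lengths, which is often the entire margin between stopping short of an obstacle and hitting it.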

In the end, across all of the tests, Tesla came out on top, with both the Model 3 and Model X passing 5/6 tests. But they failed different tests – the Model X drove into a well-marked construction zone, while the Model 3 recognized the boar but didn’t slow quickly enough to avoid it (only one vehicle avoided the boar: the Model X).

This is an interesting result, because Tesla uses a vision-only system, relying on cameras and no other sensors, while the rest of the field was equipped with a variety of systems, some vision-only and some also including LiDAR and radar. The LiDAR-equipped systems should have had the advantage at night, though none of the tests happened in inclement weather (heavy rain or fog), conditions often cited as the strongest argument for adding sensors beyond cameras.

LiDAR did not save the Leapmotor C10 from driving right into the back of a truck at night

But Tesla also has more experience offering driver-assist systems than the other brands. Tesla has been offering some form of driver assist since 2014, which is well before many of these companies even existed. That, along with the millions of miles of data collected from its vehicle fleet, surely helped Tesla get its crown in these tests.

But despite Tesla’s strong performance, there is still a worrying pattern in the tests – even Tesla’s. Strangely, even cars within the same brand showed wildly differing results on the same tests.

The Tesla Model 3 passed the aggressive merge test

For example, the top-range Aito M9 passed 3/6 tests, but the next step down, the M8, passed 1/6 tests. The lower-end Aito, the M7, passed 2/5 tests, faring better than the M8. The Aito M9 has the most sophisticated system the brand offers, but still failed the construction truck test, while the M7 passed it. DCAR compiled the results into tables in the video, but they’re all in Chinese – so CarNewsChina helpfully compiled a table in English text form.

And as mentioned above, the Teslas each failed a different test, despite having the same systems installed. It’s possible they were on different versions of FSD, but an individual update usually doesn’t make that much difference in capability.

This inconsistency doesn’t inspire confidence. Given that the systems showed wildly differing results in the same situation, it makes one think some of them might simply have had a good or bad day, and that a future test could flip the results completely. The problem is, we don’t know exactly what went wrong, because we can’t examine the rules in the code that led to these decisions… there is no ruleset behind the machine-learning models used by today’s ADAS systems.

Every car except the Model X failed to avoid the boar, though a few cars slowed enough for a minor collision

In the video, DCAR interviewed Lu Guang Quan, from the Beijing University of Aeronautics and Astronautics, who pointed to this opacity as a concern with today’s ADAS systems. Since so many of them use machine learning to learn the rules of driving, when mistakes happen it’s impossible to pinpoint which rule in the computer’s programming led to the error.

“A learning model is just collecting experience. It knows how to drive but not why,” said Lu. “These so-called ‘long tail scenarios’ barely ever happen, but the risk is sky high. You won’t find them in any training dataset. The systems straight up haven’t learned this stuff.”

Lu said that “rule based models would provide stronger failsafes,” because then it would be possible to correct errors in the code, rather than the black box that machine learning models currently offer.
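
To make Lu’s distinction concrete, here is a minimal sketch of what a rule-based failsafe layered on top of a learned planner could look like. The names, thresholds, and the simple time-to-collision rule are my own illustration, not code from any system in the test; the point is that when a rule like this fires (or fails to fire), an engineer can point at the exact line responsible, which is precisely what a trained black-box model doesn’t allow.

```python
# Illustrative sketch of a rule-based failsafe sitting above a learned planner.
# Class names, thresholds, and actions are hypothetical, not taken from any tested system.

from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float         # gap to the obstacle in our lane
    closing_speed_ms: float   # ego speed minus obstacle speed

TTC_BRAKE_THRESHOLD_S = 2.0   # below this time-to-collision, brake, no exceptions

def failsafe(obstacle: Obstacle, planned_action: str) -> str:
    """Override the learned planner's action whenever a simple, auditable rule is violated."""
    if obstacle.closing_speed_ms <= 0:
        return planned_action                    # not closing in, defer to the planner
    ttc = obstacle.distance_m / obstacle.closing_speed_ms
    if ttc < TTC_BRAKE_THRESHOLD_S:
        return "FULL_BRAKE"                      # this exact rule fired; easy to audit later
    return planned_action

# Stopped car 50 m ahead while closing at 33 m/s (~120 km/h): TTC is about 1.5 s.
print(failsafe(Obstacle(distance_m=50.0, closing_speed_ms=33.0), planned_action="SWERVE_LEFT"))
# -> FULL_BRAKE
```

Hand-written rules like this only cover situations someone thought to write down, which is why modern stacks lean on learned models in the first place, but as a failsafe layer their auditability is exactly the appeal Lu describes.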

Many cars failed the nighttime “crashed car in middle of road” simulation

Given the results of its tests, DCAR concludes the video by saying “We hope everyone takes a rational look at this. These highway crash recreations show the limits of ADAS. Given their current capabilities, they cannot support full hands-free or feet-free driving. No matter what marketing claims, we should treat ADAS only as a safety assist. Human driving must remain primary. ADAS only helps reduce your driving fatigue. That 1% risk, once it happens, it can lead to 100% casualties.”

So we at Electrek also hope this is a reminder to everyone who has gotten comfortable with using these systems routinely. Not only is there still a lot they can’t do, but even if your car does show it’s capable of handling a situation once, there’s always a chance it might do something different the next time around. So keep your eyes on the road – and don’t just leave it to God’s Eye to watch what’s going on.

