TORONTO, Ont. — Drivers can be forgiven for getting annoyed when truck brakes apply all by themselves. It’s shocking, to say the least, and even more so when there’s no apparent reason for it. But ask any driver who has driven one of the early automatic emergency braking systems and then a newer generation of the technology, and they will surely admit that such “false positives” are happening less frequently.
The truth is the systems are getting better at figuring out what’s going on around them and at sorting the actual threats from developing situations that just look threatening. Sensors are better, there are more of them adding to every equation, and more powerful microprocessors are analyzing the data and initiating responses.
All these different sensors have strengths and weaknesses. For example, a camera is not so good in snow or fog, whereas radar is better in those conditions. Radar is good at detecting range and speed, but it’s hard for radar to say if the target out front is a passenger car or a commercial vehicle. Cameras can do that.
Bringing all that technology together provides a more complete picture.
“Sensor fusion takes information from cameras, from radar, and perhaps from some vehicle-to-vehicle communication rather than relying on just one or the other,” says Dan Williams, ZF’s director of ADAS (advanced driver assistance systems) and autonomy. “We try to fuse together all the data we get from those sensors into one common scenario that can be interpreted by the automation.”
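The idea can be illustrated with a toy example. The sketch below is purely hypothetical — the field names, structure, and confidence handling are illustrative assumptions, not ZF’s implementation. Radar contributes range and closing speed, the camera contributes the object’s class, and the fused track carries both:

```python
# Toy "sensor fusion" step: merge one radar detection and one camera
# detection of the same object into a single fused track.
# All names and values here are hypothetical illustrations.

def fuse_detections(radar, camera):
    """Combine radar's range/speed strengths with the camera's classification."""
    return {
        "range_m": radar["range_m"],              # radar measures distance well
        "closing_speed_mps": radar["speed_mps"],  # and relative speed
        "object_class": camera["class"],          # camera identifies object type
        # Conservative choice: trust the fused object only as much as the
        # least confident sensor.
        "confidence": min(radar["confidence"], camera["confidence"]),
    }

radar_hit = {"range_m": 42.0, "speed_mps": -3.5, "confidence": 0.9}
camera_hit = {"class": "passenger_car", "confidence": 0.8}
obj = fuse_detections(radar_hit, camera_hit)
print(obj["object_class"], obj["range_m"])  # passenger_car 42.0
```

Real fusion stacks also have to decide which radar return and which camera detection belong to the same physical object, which is a much harder association problem than this sketch shows.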
Obviously, the more data sources, the better the overall picture will be. Bendix was one of the pioneers in the very first levels of vehicle automation, beginning with antilock brakes (ABS) back in the 1990s. The tone rings on tractor and trailer wheels, which tell the system whether the wheels are actually turning, played a key role in the evolving technology.
“Wheel speed sensors did two things for the braking system: they gave it eyes and a brain,” says Bendix’s ADAS chief, Fred Andersky. “We could now help drivers avoid skids by releasing the brakes when the sensors told the electronic control unit the wheels weren’t turning.”
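The decision Andersky describes can be sketched in a few lines. This is an illustrative simplification with a hypothetical slip threshold, not Bendix’s control logic; real ABS controllers cycle this decision many times per second for every wheel:

```python
# Illustrative ABS decision: release the brake on a wheel that is slipping
# (turning much slower than the vehicle is moving). Threshold is hypothetical.

def abs_command(vehicle_speed_mps, wheel_speed_mps, slip_limit=0.2):
    """Return 'release' when a braked wheel slips too much, else 'hold'."""
    if vehicle_speed_mps <= 0:
        return "hold"  # stationary: nothing to do
    slip = (vehicle_speed_mps - wheel_speed_mps) / vehicle_speed_mps
    return "release" if slip > slip_limit else "hold"

print(abs_command(25.0, 24.0))  # hold: wheel tracking vehicle speed
print(abs_command(25.0, 5.0))   # release: wheel nearly locked
```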
Skip ahead 10 years or so to 2004, and three more sensors were added, measuring steering angle, yaw rates, and lateral acceleration. Those are the underpinnings for today’s advanced driver assistance systems.
“That gave us stability control on top of ABS and traction control,” Andersky says. “Next we added a radar unit to the front of the truck, and that gave us insight into the proximity and closure rate of vehicles in front of the truck, and we turned this into adaptive cruise control.”
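The core comparison behind stability control can be sketched as follows. The single-track (bicycle) model, the wheelbase, and the thresholds here are hypothetical simplifications for illustration only: the controller predicts the yaw rate the driver is requesting from steering angle and speed, then compares it with what the yaw-rate sensor measures:

```python
import math

# Illustrative stability-control comparison, using a simplified kinematic
# bicycle model. Wheelbase, gains, and interventions are hypothetical.

def stability_intervention(speed_mps, steer_rad, measured_yaw_rps,
                           wheelbase_m=6.0, threshold_rps=0.1):
    """Compare the yaw rate the driver asks for with the yaw rate measured."""
    # Kinematic estimate of the commanded yaw rate.
    expected_yaw = speed_mps * math.tan(steer_rad) / wheelbase_m
    error = measured_yaw_rps - expected_yaw
    if error > threshold_rps:
        return "oversteer: brake outer front wheel"   # rotating more than asked
    if error < -threshold_rps:
        return "understeer: brake inner rear wheel"   # rotating less than asked
    return "no intervention"

print(stability_intervention(20.0, 0.05, 0.17))  # no intervention
```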
Next came cameras to supplement radar’s limited view of the world. With all the electronic and mechanical sensors, radar, and cameras feeding data into the system, safety system “intelligence” increased exponentially. This enabled “system fusion”, where the brake system’s electronic control unit could work with the engine management system to apply the brakes and de-throttle the engine.
The latest innovation is the integration of steering into the suite of automated vehicle safety systems. In January, Daimler released its Active Lane Assist feature, which uses micro-steering movements to keep the truck centered in its lane. That is supplemented with Lane Departure Protection, which will counter-steer the truck back into its lane if it begins to drift without the turn signal engaged.
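At its simplest, the micro-steering idea reduces to a small corrective command proportional to the truck’s offset from the lane center. The gain and command limit below are hypothetical illustration values, not Daimler’s tuning:

```python
# Illustrative lane-centering correction: steer gently toward the lane
# center, with the command clipped so corrections stay "micro".
# Gain and limit are hypothetical.

def micro_steer(lateral_offset_m, gain=0.02, max_cmd_rad=0.01):
    """Offset > 0 means drifted right; a negative command steers left."""
    cmd = -gain * lateral_offset_m
    return max(-max_cmd_rad, min(max_cmd_rad, cmd))  # clip to the limit

print(micro_steer(0.3))   # small left correction
print(micro_steer(-1.0))  # large drift: correction clipped at the limit
```

Production systems layer far more on top of this (lookahead on lane curvature, damping so the truck doesn’t oscillate around the center line), but the proportional correction is the kernel of the idea.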
Daimler also has taken collision mitigation and adaptive cruise control to the next level with Detroit Assurance 5.0, also announced in January. Combining input from cameras and radar, this system can bring a truck to a complete stop in an emergency situation using the truck’s full braking capacity. Previously, emergency brake assist systems had not brought the truck to a complete stop. Daimler says it’s an industry first.
Radar is also the key enabler in Daimler’s new Side Guard Assist feature. It uses two radar units mounted beneath the doors, one facing forward and the other rearward to cover the full length of a 53-foot trailer, to detect pedestrians and cyclists in the driver’s traditional blind spot. This represents a key advancement in radar; previous systems were unable to detect non-metallic objects.
In its early days, ADAS was seen as the end product. Now it’s seen as the means to another end: automation.
None of these technologies actually changes the way the truck is driven. They are reactive, designed to assist the driver by intervening only when something regrettable seems about to happen. That is more likely than ever to change within the next few years. The technology we already have is enabling autonomous operation; the only remaining question is when we will declare it ready for real-world use.
“These systems need to be significantly better than a human driver, even though about 93% of crashes are caused by human error,” says Dr. Axel Gern, technical project lead for automated vehicles at Daimler. “As we advance through the levels of automation … we will need to validate that the system is better than a human. The challenge is you’d need to drive hundreds of millions of miles to prove that for just a single software update. We need alternative ways to validate that the system is working 100% of the time.”
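The scale Gern alludes to can be estimated with the statistical “rule of three”: driving N miles without a critical failure only bounds the failure rate below roughly 3/N at 95% confidence. The target rate below is a hypothetical illustration, not a regulatory figure:

```python
# Back-of-the-envelope validation estimate using the "rule of three":
# zero failures in N trials bounds the failure rate below ~3/N at 95%
# confidence. The target failure rate here is a hypothetical example.

def miles_to_demonstrate(max_failures_per_mile, confidence_factor=3.0):
    """Failure-free miles needed to bound the rate at ~95% confidence."""
    return confidence_factor / max_failures_per_mile

# To show fewer than one critical failure per 100 million miles:
print(round(miles_to_demonstrate(1e-8)))  # 300000000 failure-free miles
```

And that burden repeats for every software update, which is why Gern argues for alternative validation methods rather than brute-force mileage.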
Level 2 automation is realized with two sensors. Level 4, which requires little interaction from a driver, would require significantly more sensors — all of which would need to perform at a higher level, Daimler says.
This leads to higher data volumes but also places extremely high demands on the quality of data processing. The goal is to recreate a driver’s perception by using different types of sensors to record traffic and vehicle situations.
The goal is not as far-fetched as it would have sounded even five years ago. We are seeing it demonstrated by several Silicon Valley startup companies including TuSimple, Waymo, Plus AI, and others.
It’s evolutionary and revolutionary technology competing in the same space. The startups don’t have the history of companies like Bendix, Wabco, or Daimler, but they are moving forward and showing that the technology works — at least under controlled conditions.
But the real world is anything but controlled.
“Companies like ZF might be more comfortable with the evolutionary approach, working our way up through the SAE levels of automation, but the revolutionary approach can’t be ignored,” says Williams. “Silicon Valley does things differently than Detroit [the hub of traditional automotive manufacturing], and while these two paths might seem parallel at the moment, they will no doubt converge into an on-highway solution.”
Lidar, radar, and cameras explained
For any advanced safety system to work, a truck must know where it is in space and what’s going on around it. Since trucks can’t see in the human sense, this requires devices that detect what is happening to the side and front of the truck.
Unlike a human driver, whose eyes can wander from the road at inopportune moments, the truck’s “eyes” always remain focused.
Onboard microprocessors interpret the data from the sensors much like our brain interprets the data from our eyes. Digital cameras positioned high on the windshield record images, which the computer processes, using preset parameters to determine when a threat might exist.
The cameras are coupled to a radar unit that provides an additional view. Data from both are blended so the computer gets the best possible “picture” of what’s going on around the truck.
The newest emerging technology is called lidar. It’s something like a blend of radar and camera, using reflected lasers to detect objects and create a high-resolution 3D map of the environment. It may not be necessary for today’s advanced driver assistance systems, but many people believe fully autonomous vehicles will require it.
Radar (radio detection and ranging) uses radio waves transmitted from an antenna to detect objects at a distance, and to define their speed and position relative to the transmitter. Lidar (light detection and ranging) uses laser pulses from a specialized optical device to detect and define objects based on reflections.
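Both technologies estimate range the same way: time the pulse’s round trip and multiply by the speed of the wave, which for radio waves and laser light alike is the speed of light. A minimal sketch:

```python
# Time-of-flight ranging, common to both radar and lidar.

SPEED_OF_LIGHT_MPS = 299_792_458  # radio waves and laser light both travel at c

def range_from_echo(round_trip_s):
    """Distance to the target: half the pulse's round-trip distance."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2

# An echo returning after one microsecond puts the target about 150 m out:
print(round(range_from_echo(1e-6), 1))  # 149.9
```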
Each presents some unique advantages and disadvantages.
Radar:
- can operate in fog, rain, snow, and darkness
- is not limited by line of sight
- can’t provide a precise image of an object because of the longer wavelengths

Lidar:
- has a short wavelength that better detects small objects
- can build an exact monochromatic image of an object
- is quite expensive
For the purposes of a collision prevention system like Daimler’s Active Brake Assist or Bendix’s Fusion, radar is ideal, but it lacks the definition needed to recognize whether an object is a pedestrian, a car, or a wall. That added definition would be needed for higher levels of automation.
“As an ADAS solution or in a platooning situation where the driver is still in control, cameras are fine,” says Jason Eichenholz, co-founder and chief technology officer of Luminar Technologies, a lidar provider.
“For Level 4 [automation] and beyond, the lidar sensor becomes the primary sensor in the vehicle, and you have cameras and radar supporting it as secondary sensors.”
But lidar does have limitations. It can’t detect color, for example, which would be important when looking at something like a traffic light. Also, it doesn’t do well with LED-based signage, such as variable speed limit signs.
“Cameras are important,” says Eichenholz. “I think it’s a one-plus-one-equals-three scenario.
“If you want to safely operate a vehicle without a driver behind the steering wheel — and I’m not talking about 99.9%, I’m talking 99.99999% reliability — you better have a lidar system in there with you,” he says.