
Cover Story: Driving Innovation—Cars & Electronics Converge

David Lammers

Fortunately for the semiconductor industry, the outlook for automobiles includes an unprecedented array of innovations that require significantly greater numbers of electronic devices. From the cameras, sensors, and image processors needed for advanced driver assistance systems (ADAS) in conventional autos, to sophisticated power electronics used in electric vehicles (EVs) and advanced sensor technologies for autonomous "self-driving" vehicles, the automotive and semiconductor industries are more tightly linked than ever.

There is strong evidence that the worldwide automotive industry is in a milestone period, one that heralds significant changes that will span several decades. One powerful force behind these events is the fast-forming consensus that climate change is real, with the gases coming out of tailpipes as one of the causes. Little wonder, then, that the smog-challenged Chinese government is pushing adoption of EVs and ADAS-capable cars and highways, or that a China-based EV vendor, BYD (it stands for Build Your Dreams) is building battery-powered buses at its factory near Los Angeles.

There is a virtuous cycle developing, one that benefits semiconductor suppliers focused on automotive electronics. New ADAS safety features, such as automatic emergency braking systems (EBS), are attracting car buyers (and insurance companies) who want to avoid costly crashes. Beyond that self-interest, governments are stepping in: the US National Highway Traffic Safety Administration (NHTSA) will require automatic emergency braking as a standard feature on new cars by 2022, for example.

However, visions of self-driving electric vehicles cruising around while their owners take care of email must be balanced against the realities of high development costs, the need for infrastructure improvements, and the impact on modern family budgets. As University of Michigan Professor James Moyne notes, “There is a small percentage of people, myself included, who make car buying decisions based on ecology, but most people vote their pocketbooks. If gas is cheap, they will go with what keeps money in their pocket. If gas is $5 a gallon, that’s a motivator for people to get hybrids or electric cars.”

Indeed, EV sales are in the early stages. Navigant Research predicts the US market—the largest for plug-in EVs—will go from about 133,000 sold in 2014 to between 860,000 and 1.2 million sold in 2024. In a good year, total car sales in the US hit 17 million.

Yet the net effect of all this is the creation of strong demand for automotive electronics (see “Automotive ICs Lead Market Growth” elsewhere in this issue of Nanochip Fab Solutions).

48 V Mild Hybrids

While progress in the ADAS and EV fields garners much of the media attention, gasoline-powered cars are also adopting new technologies. One development that is seldom discussed but important to the semiconductor industry is the advent of conventional gas-powered cars that receive supplemental electrical propulsion from a 48 V lithium-ion battery and an intelligent energy-capture system. Major car manufacturers are designing cars with a 48 V power network and a high-performance lithium-ion battery that will complement today’s 12 V battery, which would continue to handle traditional loads such as lighting, ignition, entertainment, audio systems, and electronic modules.

Peter Harrop, chairman of market research firm IDTechEx, said these “mild hybrids” will begin to hit markets next year or in 2018. The 48 V battery will be linked to a reversible motor-generator that will capture braking energy and other forms of kinetic energy, storing it in the battery for use when the vehicle is stopped or running at low speeds. Because so much pollution is caused by cars stuck in urban traffic jams, Harrop claims that 48 V systems “will probably contribute more to emissions reduction in the next 15 years than all EVs—strong hybrids and pure electric—combined.”
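A back-of-envelope estimate suggests why captured braking energy matters in stop-and-go urban driving. The vehicle mass, speed, and round-trip efficiency below are illustrative assumptions, not figures from any manufacturer:

```python
# Back-of-envelope estimate of energy recoverable from one braking event.
# All input values are illustrative assumptions.

mass_kg = 1500.0          # typical compact car
v_initial_ms = 50 / 3.6   # braking to a stop from 50 km/h (urban speed)
efficiency = 0.6          # assumed round-trip motor-generator/battery efficiency

# Kinetic energy dissipated in the stop, in joules
kinetic_j = 0.5 * mass_kg * v_initial_ms ** 2

# Energy actually banked in the 48 V battery, converted to watt-hours
recovered_wh = kinetic_j * efficiency / 3600

print(f"Kinetic energy per stop: {kinetic_j / 1000:.1f} kJ")
print(f"Recovered per stop:      {recovered_wh:.1f} Wh")
```

Under these assumptions, each stop banks on the order of a few tens of watt-hours; repeated over the many stops of an urban commute, that is the recovery Harrop is counting on.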

Ford Motor Company, for example, is working with automotive suppliers on a 48 V split-voltage prototype car that reduces fuel consumption by 25%. The car can begin moving in stop-and-go traffic without running the gas engine, powered solely by the battery and a small electric motor. Carmaker Audi estimates a carbon dioxide savings of 10 grams per kilometer because the 48 V lithium-ion battery operates in conjunction with a powerful new alternator to achieve significant energy recovery output.

Sensor Fusion

Emergency braking is among the currently available ADAS options—along with lane-departure warning systems, adaptive cruise control, backup alerts, and parking assistance. It is based largely on cameras and other sensors, according to Yole Développement analysts Guillaume Girardin and Eric Mounier, who noted in email exchanges with this author that “many other advanced autonomous driving capabilities will arrive in the near future” (see figure 1).

Figure 1: Yole predicts “massive opportunities” for sensor providers as cars adopt assisted-driving features.
(Source: Yole Développement)

The successful proliferation of ADAS vehicles will depend in part on the ability of designers to reduce costs, especially for light detection and ranging (LIDAR) modules and high-end GPS. The LIDAR on Google’s self-driving car uses 64 lasers to map the physical world, collecting more than a million data points on its surroundings every second and costing about $50,000. The next-generation Google car, the Yole analysts said, will have an $8,000 version, “still way too expensive for wide consumer adoption.”

Much of the work required for autonomous vehicles depends on sensor fusion, which means integrating LIDAR with ultrasonic sensors, radars, cameras, and inertial sensors. “All of them must be working simultaneously, with redundancy,” according to Mounier and Girardin.
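The fusion idea can be sketched with a toy example: several sensors report an estimated distance to an obstacle, each with its own uncertainty; the fused estimate weights each reading by its confidence, while a redundancy check discards readings that disagree wildly with the rest. The sensor names and values here are invented for illustration, and production systems use far more elaborate estimators (Kalman-family filters, for example):

```python
# Toy inverse-variance fusion of range readings, with a crude redundancy check.
# Sensor names and values are invented for illustration.

readings = {
    # sensor: (distance_m, std_dev_m)
    "lidar":  (24.9, 0.05),
    "radar":  (25.1, 0.30),
    "camera": (24.6, 0.80),
    "ultra":  (3.2,  0.50),   # clearly inconsistent with the other sensors
}

def fuse(readings, outlier_m=5.0):
    # Redundancy check: drop any reading far from the median reading.
    values = sorted(d for d, _ in readings.values())
    median = values[len(values) // 2]
    kept = {k: v for k, v in readings.items() if abs(v[0] - median) < outlier_m}

    # Inverse-variance weighting: trust precise sensors more.
    weights = {k: 1.0 / sd ** 2 for k, (_, sd) in kept.items()}
    total = sum(weights.values())
    fused = sum(weights[k] * kept[k][0] for k in kept) / total
    return fused, sorted(kept)

distance, used = fuse(readings)
print(f"fused distance: {distance:.2f} m using {used}")
```

The precise LIDAR reading dominates the result, the noisier camera contributes little, and the inconsistent ultrasonic reading is excluded entirely—a miniature version of the redundancy Mounier and Girardin describe.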

But these “direct” sensors won’t be enough. The cars will also need to rely on “indirect” sensors—the sensors of other vehicles—because cars will operate in an “Internet of Things” connected mode. (see figure 2)

Sophisticated Communications Systems

The technology for this is already in reach. Vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) capabilities may draw on cellular networks, including the still-under-development 5G wireless standard, to connect to other cars and infrastructure. Peter Rabbeni, senior director of RF technologies at GLOBALFOUNDRIES, said “to make autonomous vehicles a reality requires some pretty sophisticated communications systems. We have to make the cellular system reliable enough for autonomous vehicles, and achieve very high data rates delivered to many, many users.”

Rabbeni said GLOBALFOUNDRIES has deployed its silicon-germanium (SiGe) process for automotive radar transmit-and-receive functions, which are rapidly being adopted for range sensing and detection. The foundry expects to provide automakers with MCUs based on its 22FDX process, which uses fully depleted silicon on insulator (SOI) to reduce power consumption and stay within the thermal envelope permitted by automakers.

Moreover, at the 22nm node GLOBALFOUNDRIES is planning to switch from embedded flash to MRAM for its automotive processor manufacturing, “because traditional e-flash is more difficult to integrate at smaller nodes,” Rabbeni said.

Figure 2. Features of ADAS vehicles requiring sensor technology.


Dave Eggleston, vice president of embedded memory technology at GLOBALFOUNDRIES, said automotive MCUs are driving demand for much larger amounts of embedded memory. “A car with advanced automation systems will have an estimated 300 million lines of code across 50 distributed systems.

“Saying that a car is a smartphone on wheels completely understates the complexity,” Eggleston said. “For something like emergency braking, the speed of decision-making has to be very fast, so you need on-chip integration, including the embedded memory and analog. We’ve also integrated RF into the platform, so the automotive SOC designer now has the key elements of fast compute, motor control, and wireless connectivity.”

Deep Learning for Autonomous Cars

The concept of augmenting (or replacing) the human driver implies an artificial intelligence (AI), based on what is now being referred to as “deep learning.” The AI applications used in augmented driving are accelerating development of a new class of processors, optimized for deep-learning algorithms. NVIDIA, for example, has attracted many of the leading carmakers to its Drive PX 2 development platform, made at TSMC on its 16nm FinFET-based technology.

Based on a combination of graphical processing engines and general-purpose processing cores, the NVIDIA platform delivers up to 24 trillion “deep-learning operations” per second. These are specialized instructions that accelerate the math used in inference engines.
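To put “24 trillion operations per second” in perspective, a rough calculation shows how many camera frames such a budget could theoretically cover. The per-frame operation count and camera count below are illustrative assumptions, not NVIDIA figures:

```python
# Rough frames-per-second budget for a deep-learning accelerator.
# The per-frame operation count and camera count are illustrative assumptions.

platform_ops_per_s = 24e12   # 24 trillion deep-learning operations per second
ops_per_frame = 30e9         # assume ~30 billion ops to run a CNN on one frame
num_cameras = 6              # assume six surround-view cameras

frames_per_s_total = platform_ops_per_s / ops_per_frame
frames_per_s_per_cam = frames_per_s_total / num_cameras

print(f"Total frames/s:      {frames_per_s_total:.0f}")
print(f"Per camera frames/s: {frames_per_s_per_cam:.1f}")
```

Even with generous assumptions, heavy per-frame vision workloads consume such a budget quickly once multiple cameras, radar, and LIDAR streams are processed at once—one reason the article’s sources stress specialized inference hardware.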

In order for cars to drive themselves, the onboard intelligence must quickly learn how to address unexpected road debris, erratic drivers, and construction zones. Vision systems must develop to the point where they can handle rain, snow, fog, and difficult lighting conditions such as sunrise, sunset, and extreme darkness.

“Drivers deal with an infinitely complex world,” said NVIDIA CEO Jen-Hsun Huang. Autonomous vehicles must be “continuously alert,” and eventually achieve “superhuman levels of situational awareness,” he added.

It is little wonder, then, that TSMC co-CEO Mark Liu recently said that automotive semiconductors, for both enhanced safety and improved infotainment systems, “will certainly speed up the adoption of TSMC’s leading edge technology.”

It is clear that cars are driving sensor and semiconductor technologies, ranging from new generations of power devices (see “Automotive Power Device Market Amps Up” elsewhere in this issue of Nanochip Fab Solutions) to embedded memory, processors, and MEMS sensors (see “Piezo Effect Boosts MEMS Microphones, Fingerprint Sensors” elsewhere in this issue of Nanochip Fab Solutions).

In remarks at a Global Semiconductor Alliance (GSA) meeting in Europe, Markus Tremmel, a senior Bosch technology manager, said the semiconductor industry needs to provide more sophisticated processors to support deep learning. “Current microprocessors are not suited to do [deep learning algorithms] efficiently,” Tremmel said in a report by EE Times Europe. “We need new microprocessor architectures.”

Brian Matas, who tracks automotive applications at IC Insights, said automotive processors will need a sharp boost in computational power in order to handle inputs from multiple high-precision sensors, and to execute powerful algorithms that respond quickly to different driving conditions. Yole’s Girardin put it only slightly differently: “The main bottlenecks [facing automotive electronics] are the sensor fusion, image processing, and power computing.”

Jeff Bier, a veteran analyst who tracks embedded vision markets, said vision algorithms “are typically quite demanding of processor performance, and getting that performance at low cost and low power usually involves some sort of heterogeneous processor, a CPU coupled with some sort of coprocessor.”

These stringent processor demands are being met by consumer-oriented companies such as NVIDIA, with its graphical processing expertise, and by the traditional automotive semiconductor suppliers. Akhilesh Kona, senior analyst for automotive semiconductors at market research firm IHS, said chip suppliers with consumer backgrounds “clearly lead on outright performance” in image processing. However, traditional automotive IC suppliers are far more familiar with functional safety standards and how to handle inputs from sensors in an engine, for example.

Yole’s Girardin said: “Two approaches are contending with each other. One is with a central processing unit combined with ‘dumb’ sensors, an approach which requires high transfer rates, and a big processing unit with multipurpose skills. The other approach deals with a ‘delocalized’ intelligence in each sensor.” Girardin said the recent evolution of computing “gives credit to the second option,” with computing intelligence close to the sensors.

Lane Markings and Infrastructure

The demands on semiconductor suppliers are mirrored by the need for cities and national governments to think about outdated roads and signage. Infrastructure and regulatory challenges are likely to lag behind the purely technical development of cars that can “see” far down highways and streets.

The infrastructure situation may be worse in China, where the central government is seeking to take the lead in self-driving technologies. Junyi Zhang, a partner with the consulting firm Roland Berger, commented in a New York Times report that people, animals, three-wheel rickshaws, and trucks all converge on China’s roads, which also have poorly marked lanes.

“It is harder in China, where many roads have pedestrians, bicycles, low-speed vehicles and high speed vehicles all mixed together,” Zhang said. “It is a very complicated environment, and many don’t ride or drive to the same standard.”

How long will it take before self-driving cars populate roads? The Yole analysts said the answer depends largely on the regulatory environment and legal considerations.

“For completely autonomous vehicles in all conditions, we don’t expect such vehicles before 15 to 20 years,” they said, though adoption could come faster if regulations are quickly put in place.

For additional information, contact nanochip_editor@amat.com.