In-depth analysis: Intel in the AI era

Traditional chip giant Intel has had to transform itself, moving beyond the CPU alone and acquiring chips suited to training neural networks, as Moore's Law falters and artificial intelligence (AI) surges back, the business magazine Express said in an in-depth article. At the same time, Intel has begun to abandon its long-held insistence on manufacturing every chip itself, outsourcing some production to reduce costs and ease delivery pressure.

Figure 1: Intel's Nervana chip for training neural networks

Here’s the full text of the article:

As the author walked toward Intel’s visitor center in Santa Clara, California, a large group of South Korean teenagers poured off their bus, happily taking selfies and posing for photos in front of the giant Intel logo. This is the kind of fan devotion you might expect only at Apple or Google, so why does Intel inspire it?

Don’t forget that the “silicon” in Silicon Valley refers to the chip, and its archetypal maker is Intel, whose processors and other technologies provided much of the underlying horsepower for the PC revolution. Today, Intel is 51 years old, and it still retains some of that “star charm.”

Intel’s profound transformation

But Intel is also undergoing a period of profound change, reshaping both the company’s culture and the way it makes its products. As ever, Intel’s core products are the “brains” of desktops, laptops, tablets, and servers: microprocessors that etch millions or billions of transistors onto silicon wafers through specialized processes. Each transistor has two states, on and off, corresponding to the binary digits “1” and “0”.

For decades, Intel achieved steady improvements in processing performance by adding more transistors to its silicon wafers. The pace of improvement was so steady that Gordon Moore, Intel’s co-founder, famously predicted in 1965 that the number of transistors on a chip would double roughly every two years, a prediction now known as Moore’s Law. Analysts say Moore’s Law held for decades, but Intel’s strategy of packing in ever more transistors has reached a point of diminishing returns.
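Moore's prediction is simple compound doubling, which a quick sketch can illustrate. (The 1971 starting point of roughly 2,300 transistors on the Intel 4004 is an illustrative assumption added here, not a figure from the article.)

```python
# Moore's Law sketch: transistor count doubles roughly every two years.
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count for a given year under Moore's Law."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# Illustrative starting point: ~2,300 transistors on the Intel 4004 in 1971.
count_2019 = transistors(2300, 1971, 2019)
print(f"Projected 2019 count: {count_2019:,.0f}")  # on the order of tens of billions
```

Compounding is what made the prediction so powerful: 24 doublings turn a few thousand transistors into tens of billions.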

At the same time, the market’s appetite for processing performance has never been greater. Analysts say AI is now used in core business processes across almost every industry, and its revival has left demand for computing performance outstripping supply. Neural networks require enormous amounts of compute, perform best when many computers work together, and are used far beyond the PCs and servers that originally established Intel’s position.

“Whether it’s smart cities, retail stores, factories, cars, or homes, all of these are becoming computers today,” said Bob Swan, who has been Intel’s CEO since January.

Structural changes brought about by AI, along with Intel’s ambitions to expand its business, have forced the company to adjust the design and functionality of some of its chips. Intel is designing software and chips that work together, and even acquiring companies, to keep pace with the changing world of computing. As AI moves into corporate and personal life, and as the industry increasingly relies on Intel for the chip performance that drives AI, further transformation is imperative.

The Death of Moore’s Law

Currently, the large technology companies that operate data centers use AI in their main businesses, and some of them, such as Amazon, Microsoft, and Google, offer AI as a cloud service to enterprise customers. But AI has begun to spread to other large enterprises, which train AI models to analyze massive amounts of data and take appropriate action.

This shift requires significant computational performance to support, and AI’s “hunger” for compute is where the rise of AI collides head-on with the slowdown of Moore’s Law.

For decades, Moore’s 1965 prediction mattered to the entire technology industry. Hardware manufacturers and software developers grew used to linking their product roadmaps to the performance they could expect from next year’s CPUs. Arguably, Moore’s Law kept everyone dancing to the same tune.

Intel co-founder Moore (right)

Moore’s Law also underpinned Intel’s annual promise of chip performance improvements. For most of its history, Intel made good on that promise by finding ways to add more transistors to silicon wafers, but doing so has become increasingly difficult.

“Chip factories are about to lose the ability to deliver performance improvements to us,” said Patrick Moorhead, chief analyst at market research firm Moor Insights & Strategy.

Although it is still possible to add more transistors to silicon wafers, the cost keeps rising, the time required keeps growing, and the resulting performance gains may not be enough to meet the needs of computer scientists building neural networks. For example, the largest known neural network in 2016 had 100 million parameters; the largest so far in 2019 has 1.5 billion, an order-of-magnitude increase in just a few years.
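To make the mismatch concrete, one can compare the article's parameter figures with the roughly 2.8x gain that Moore's-Law doubling would deliver over the same three years. This is a back-of-the-envelope sketch; the Moore's-Law comparison is an assumption added for illustration.

```python
# Growth of the largest known neural networks (figures from the article)
params_2016 = 100_000_000      # 100 million parameters in 2016
params_2019 = 1_500_000_000    # 1.5 billion parameters in 2019

model_growth = params_2019 / params_2016   # 15x in three years
moore_growth = 2 ** (3 / 2)                # ~2.8x if transistors double every 2 years

print(f"Model size grew {model_growth:.0f}x; "
      f"Moore's Law alone would give ~{moore_growth:.1f}x")
```

A 15x jump in model size against a ~2.8x hardware gain is the gap that drives demand for specialized AI chips.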

Compared with previous computing paradigms, AI presents a very different growth curve, forcing Intel to find new ways to improve the processing performance of its chips.

Swan, however, sees AI more as an opportunity than a challenge. He acknowledged that data centers could be a major source of growth for Intel, because companies need powerful chips for AI training and inference, but he believes Intel also has opportunities to sell AI-capable chips for small devices such as smart cameras and sensors. What sets these devices apart is their small size and low power consumption, rather than raw chip performance.

“I think we need to accelerate our development of three technologies: AI, 5G, and what amounts to computers that move,” said Swan, who took over as Intel’s chief executive last year after Brian Krzanich stepped down over a relationship with an employee.

Intel CEO Swan

In a large, unremarkable conference room at Intel’s headquarters, Swan split Intel’s business into two columns on a whiteboard at the front of the room. On the left was the PC chip business, which now accounts for half of Intel’s revenue; on the right, the data center business, which includes the emerging Internet of Things, self-driving car, and network device markets.

“The world we’re moving into requires more and more data, which requires more processing, storage, retrieval, faster data movement, and the analytics and intelligence to make that data relevant,” Swan said.

Rather than defending its roughly 90 percent share of the $50 billion data center market, Swan wants to capture a 25 percent share of the much larger $300 billion connected-device market, which includes smart cameras, futuristic self-driving cars, and networking equipment. He describes the strategy as “starting with our core competencies and then inventing in some ways, while at the same time expanding our existing business.” It could also be a way for Intel to step quickly out of the shadow of its failed smartphone chip effort, having recently abandoned its massive investment in smartphone baseband chips and sold the business to Apple. In the smartphone chip space, Qualcomm’s long-standing dominance mirrors Intel’s dominance of the PC chip market.

By 2023, the IoT market is expected to reach $2.1 trillion, spanning chips for robots, drones, cars, smart cameras, and other mobile devices. While Intel’s IoT chip revenue is growing at double-digit rates year over year, IoT still accounts for only 7% of Intel’s total revenue.

Data centers are Intel’s second-largest business, accounting for 32% of the company’s revenue, after the PC chip business at 50%. If any part of Intel’s business feels the impact of AI first and most deeply, it is the data center, which is why Intel has been tweaking its strongest CPU family for machine learning tasks. In April, Intel added Deep Learning Boost (DL Boost) to its second-generation Xeon CPUs, delivering greater neural-network performance by using lower-precision arithmetic with little loss of accuracy. For the same reason, Intel will start selling two chips specialized for running large machine learning models next year.
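The article does not explain how DL Boost trades precision for speed, but the underlying idea, running neural-network arithmetic on 8-bit integers instead of 32-bit floats, can be sketched with a minimal symmetric-quantization example. This is illustrative only, not Intel's implementation:

```python
def quantize_int8(values):
    """Map floats to int8 range using one shared scale (symmetric quantization)."""
    scale = max(abs(v) for v in values) / 127.0   # largest magnitude maps to 127
    q = [max(-127, min(127, round(v / scale))) for v in values]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float values from the quantized integers."""
    return [i * scale for i in q]

weights = [0.5, -1.2, 0.03, 0.9]
q, scale = quantize_int8(weights)
recovered = dequantize(q, scale)
error = max(abs(a - b) for a, b in zip(weights, recovered))

# int8 storage is 4x smaller than float32, and integer matrix arithmetic is
# the kind of work that DL Boost-style instructions accelerate.
print(f"max quantization error: {error:.4f}")
```

The recovered values differ from the originals only by a small rounding error, which is why lower-precision inference can run faster while barely affecting a model's accuracy.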

AI’s Revival Exposes a Weakness in Intel’s Chip Lineup

By 2016, the prospect of neural networks being used in all kinds of applications had become clear, from product recommendation algorithms to the natural language processing behind customer service bots. Like other chipmakers, Intel knew it had to provide its large customers with chips designed for AI: chips that would be used to train AI models and to draw inferences from massive amounts of data.

At the time, Intel lacked such a chip. The industry regarded Intel’s Xeon processors as very good at analytics, but the graphics processing units (GPUs) made by Intel’s rival Nvidia as better suited to training AI models, a perception that has weighed on Intel’s business.

So in 2016 Intel paid about $400 million to acquire Nervana, a deep learning chip startup that was already developing ultra-fast chips designed for AI training.

Three years on, the acquisition looks like a wise move. At an event in San Francisco in November, Intel announced two new Nervana neural network processors: one designed to run neural network models and infer meaning from large amounts of data, the other to train neural networks. Intel has partnered with two major customers, Facebook and Baidu, to help validate the chip designs.

Nervana CEO Rao

Nervana was not Intel’s only acquisition in 2016. That same year, Intel also bought Movidius, a company developing small chips that run computer vision models inside devices such as drones and smart cameras. Sales of Intel’s Movidius chips are still modest, but they are growing fast and opening up the Internet of Things market that so excites Swan. At the San Francisco event, Intel also announced a new Movidius chip, to be launched in the first half of next year.

Naveen Rao, Nervana’s founder and CEO, says many Intel customers run at least some of their AI computing on the conventional Intel CPUs in their data center servers, but it is not easy to make those CPUs work together to meet the needs of neural networks. The Nervana chip, by contrast, contains multiple interconnects, so it can readily collaborate with other processors in the data center.

“Now I can take my neural networks and break them up across small systems that work together,” says Rao, “so that we can have an entire server rack, or four racks, solve a problem together.”

Intel expects to generate $3.5 billion in revenue from AI-related products in 2019. Currently, only a handful of Intel customers are using the Nervana chip, but its user base is likely to expand significantly next year.

A Long-Held Chip Philosophy Changes

The launch of the Nervana chips represents an evolution of Intel’s deep-rooted belief that a single CPU can handle all the computing a PC or server needs to do. That belief began to erode with the gaming revolution, which demanded extreme computing power to render complex graphics; it made sense to hand graphics work to a GPU so the CPU did not have to carry that load. A few years ago, Intel began integrating GPUs into its CPUs, and next year it will release standalone GPUs for the first time, Swan said.

The same approach applies to AI models. In a data center server, a certain amount of AI work can be handled by the CPU, but as the workload grows, it becomes more efficient to offload it to a dedicated chip. Intel has been investing in new chip designs that combine CPUs with a range of dedicated accelerators to meet customers’ performance and workload needs.

“When you design a chip, you need to use the power of the whole system to solve problems, and that often requires multiple chips; a single CPU can’t do it all,” Swan said.

In addition, Intel now relies more on software to push the performance and power efficiency of its processors to new levels, which has shifted the balance inside the company. At Intel, software development is now “on par” with hardware development, according to one analyst.

Intel Nervana Inference Chip

In some cases, Intel no longer manufactures its chips itself, a landmark departure from the company’s traditional practice. Now, if Intel’s chip designers believe another company can manufacture a chip better and more efficiently than Intel can, Intel will outsource its production. The new AI training chip, for example, is manufactured under contract by TSMC.

Intel outsources some chip manufacturing for both strategic and economic reasons. Capacity constraints in Intel’s most advanced manufacturing process have left many customers waiting for shipments of the new Xeon CPUs, so Intel has farmed out some of its other chips to outside manufacturers. Earlier this year, Intel wrote to customers apologizing for the shipment delays and announcing plans to catch up.

All of these changes challenge Intel’s long-held beliefs, and they have forced the company to refocus and rebalance its old internal power structure.

Intel’s results have held up well through this transition. Analyst Mike Feibus said Intel’s sales of traditional PC chips are down 25 percent from five years ago, but sales of its data-center-oriented Xeon processors have soared.

Some Intel customers already run AI models on Xeon processors. If their workloads grow, they may consider adding the new Nervana chips. Intel expects the first Nervana customers to be “hyperscale” users, the tech giants that run huge data centers, such as Google, Microsoft, and Facebook.

Intel missed the mobile revolution, and its surrender of the smartphone processor market to Qualcomm is old news. But in reality, mobile devices have become windows onto services delivered to your phone from cloud data centers, so when you watch streaming video on your tablet, Intel chips are probably helping you. The arrival of the 5G era may make real-time services possible, such as gaming in the cloud; a pair of futuristic smart glasses might connect at ultra-fast speeds to algorithms in a data center that identify objects instantly.

All of this converges into a very different era, far from the technology world built around Intel-powered PCs. But as AI models become more complex and more widespread, Intel has the chance to be the company best placed to drive them, just as it has driven our PCs for nearly half a century. (Author/Rain)
