Intel unveils inference-capable chip developed by Haifa team

The Nervana NNP-I chip, already in use by Facebook, is meant for large computing centers, enabling computers to reach valuable conclusions

Members of Intel's Haifa team at the presentation of the new AI chip at the Hot Chips conference; Aug. 20, 2019 (Courtesy)

Intel Corp. on Tuesday unveiled details of a new artificial intelligence chip that enables computers to gain knowledge by inference, that is, to reach conclusions drawn from evidence and reasoning. The technology was developed at the tech giant’s Haifa lab.

Called Intel Nervana NNP-I or Spring Hill, the chip is designed for large computing centers, Intel said in an emailed statement. Intel showcased the product for the first time on Tuesday at the Hot Chips Conference in Silicon Valley, an annual tech symposium.

Social media giant Facebook is already using the product, the statement said.

Artificial intelligence — the field that gives computers the ability to learn — has been around since the 1950s. It is now enjoying a renaissance made possible by the higher computational power of chips. The field is expected to be a $191 billion market by 2025, according to MarketsandMarkets, a research firm.

Intel Corp’s Naveen Rao, vice president and general manager for the Artificial Intelligence Products Group (YouTube screenshot)

Companies such as Intel, Nvidia, Qualcomm and Google, along with startups globally, are on the hunt for new technologies in this field, which involves, among other things, creating the hardware to process huge amounts of information.

This processing hardware is used for two purposes: training, in which computers learn to do new tasks, and inference, in which they apply what they have learned to new data to reach valuable insights.
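The two phases can be illustrated with a minimal sketch (this is illustrative Python, not Intel's code or the workloads the chip actually runs): training adjusts a model's parameters from examples, while inference applies the trained model to unseen input.

```python
def train(examples, epochs=1000, lr=0.01):
    """Fit y = w * x with simple gradient descent (the 'training' phase)."""
    w = 0.0
    for _ in range(epochs):
        for x, y in examples:
            error = w * x - y
            w -= lr * error * x  # nudge w to reduce the error
    return w

def infer(w, x):
    """Apply the trained model to unseen input (the 'inference' phase)."""
    return w * x

# Train on examples of the rule y = 2x, then infer on a new input.
w = train([(1, 2), (2, 4), (3, 6)])
print(round(infer(w, 10)))  # → 20
```

Training is the computationally heavier phase; inference is what a deployed system does millions of times a day, which is why data centers use dedicated accelerators like the NNP-I for it.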

Intel’s new chip, a hardware accelerator card, is meant to handle high workloads and complex artificial intelligence tasks in the field of inference, the statement said.

The chip is based on a 10-nanometer Intel Ice Lake processor core and consumes little power, even when coping with high workloads. The chip can handle as many as 3,600 images per second at 10 watts of power, the statement said.
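The quoted figures imply a power efficiency that can be checked with simple arithmetic (assuming "3,600 images per second at 10 watts" refers to sustained throughput):

```python
# Back-of-the-envelope efficiency from the figures in Intel's statement.
throughput = 3600          # images processed per second
power = 10                 # watts drawn while doing so
efficiency = throughput / power
print(f"{efficiency:.0f} images per second per watt")  # → 360
```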

“To get to a future state of ‘AI everywhere,’ we’ll need to address the crush of data being generated and ensure enterprises are empowered to make efficient use of their data, processing it where it’s collected when it makes sense and making smarter use of their upstream resources,” Naveen Rao, Intel vice president and general manager, Artificial Intelligence Products Group, said in the statement.

Data centers and the cloud need to have access to high performance computing and “specialized acceleration for complex AI applications,” he said. “In this future vision of AI everywhere, a holistic approach is needed—from hardware to software to applications.”

The new chip will give a boost to the existing Intel Xeon processors, already used by large enterprises, the statement said. Intel Xeon processors are used by companies for complicated computational tasks.

Intel has invested in several Israeli startups that develop artificial intelligence technologies, including Habana Labs and NeuroBlade. This chip, however, was developed in-house at the Intel R&D center in Haifa, in collaboration with Israeli startup C2DG, which also develops cores of Intel processors.

Aside from its manufacturing plant in Kiryat Gat, in the northern Negev, Intel has R&D centers in Jerusalem, Petah Tikva and Haifa.
