On August 18, Chinese technology company Baidu announced that it has successfully developed the second-generation Kunlun AI chip in-house and that the chip has achieved mass production. Many companies today, not only those specializing in internet services but also some of the biggest names in the smartphone world, are investing in programs to design and produce their own chips, mainly for use in smartphones.
The announcement came at the 2021 Baidu World Congress, where Baidu founder, chairman, and CEO Robin Li said that the independently developed Kunlun AI chip has entered mass production.
The Kunlun Core 2 is built on a 7nm process and uses Baidu's self-developed second-generation XPU architecture, which is said to deliver two to three times the performance of the first-generation chip.
The Kunlun Core 2 is designed for cloud, terminal, and edge scenarios. In addition to the new XPU architecture and the 7nm process, the chip has completed end-to-end adaptation with a variety of domestic general-purpose processors, among them Feiteng (Phytium). It has likewise been adapted end-to-end to domestic operating systems such as Kylin and to Baidu's self-developed PaddlePaddle deep learning framework.
However, the new chip's uses will not be limited to these systems. Because it ships with full-stack domestic AI integration capabilities, Baidu expects it to integrate seamlessly with a wide range of other domestic software and hardware in the future.
Baidu hopes that in the future its chip will be used in autonomous driving, intelligent transportation, and intelligent assistants.