17 April 2019
Qualcomm has announced plans to begin testing its new Cloud AI 100 chip with partners like Microsoft Corp in mid-2019, with mass production likely to begin in 2020. Qualcomm made the announcement at an event in San Francisco.
The new AI chips are designed for what artificial intelligence researchers call “inference”: the process of running an AI algorithm that has been “trained” on massive amounts of data to perform tasks such as translating audio into text-based requests. According to analysts, chips that speed up inference will become the largest segment of the AI chip market.
Rival Nvidia has already released specialised chips for the task, and Intel is reportedly working with Facebook Inc on chips due for release later this year. Cloud computing vendors such as Amazon.com’s Amazon Web Services and Alphabet Inc’s Google Cloud unit are also developing their own inference chips.
Cristiano Amon, Qualcomm's president and the head of its chip division, said the company is aiming to serve smaller, simpler data centres spread around the world, so that users can benefit from faster response times.
“You can’t rely on big [data centre] buildings with air conditioning,” Amon told reporters at the San Francisco event. “That’s our bet - performance per watt leadership.”

“I think this is a good start for Qualcomm, but they have a lot to prove in the higher performance accelerator space,” said Patrick Moorhead, founder of Moor Insights & Strategy.