What we’re reading from around the web on AI, Machine Learning and Deep Learning.
Facebook out of Compute?
This fascinating post by Tiernan Ray at ZDNet describes how Facebook quite literally ran out of GPU compute power. From the article: “Facebook’s giant ‘XLM-R’ neural network is engineered to work word problems across 100 different languages, including Swahili and Urdu, but it runs up against computing constraints even using 500 of Nvidia’s world-class GPUs.”
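To get a feel for why even hundreds of GPUs get strained, here is a back-of-envelope memory estimate for training a large transformer. The parameter count and GPU capacity below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope training-memory estimate (illustrative assumptions).
# Training with the Adam optimizer in FP32 typically needs, per parameter:
#   4 bytes (weights) + 4 bytes (gradients) + 8 bytes (optimizer state)
#   = 16 bytes, before counting activations, which often dominate
#   at long sequence lengths and large batch sizes.

def training_memory_gb(num_params, bytes_per_param=16):
    """Minimum memory (GB) for weights, gradients, and Adam state."""
    return num_params * bytes_per_param / 1e9

# Assume a ~550M-parameter model (roughly the scale of large multilingual
# transformers; an assumption, not a number from the article).
params = 550e6
state_gb = training_memory_gb(params)
print(f"Weights + grads + optimizer state: {state_gb:.1f} GB")

# On a 16 GB GPU this leaves little headroom once activations and a
# 100-language vocabulary's embedding table are added -- hence the need
# for data and model parallelism across many GPUs.
```

The point of the sketch is that optimizer state alone consumes a large slice of a single accelerator's memory, which is why scaling multilingual models quickly turns into a distributed-systems problem.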
Intel acquires Habana for $2 Billion
The Israeli tech scene is abuzz this week with the news that Intel is making a strategic move to acquire chipmaker Habana Labs. Forbes covered the news with this assessment: “Habana Labs stands out as one of the first to deliver working hardware with impressive performance claims for both training and inference processing. Habana Labs launched its Goya chip for inference processing in September 2018, claiming roughly 3X performance advantage over NVIDIA with lower latency.” NVIDIA has so far won the AI accelerator race, but the AI infrastructure market looks more like a marathon, and Intel is playing the long game.
The Influence of Hardware on Deep Learning
This video is from Yann LeCun, VP and Chief AI Scientist at Facebook and Professor of Computer Science and Data Science at New York University. LeCun concludes that the “influence of hardware on research and development is very important. It’s clear from the history of Neural Net and AI that the type of hardware [researchers] have at their disposal will influence the type of research they do.” Watch the video below.