In recent years, the field of artificial intelligence has seen massive advances. However, according to Facebook's Chief AI Scientist, Yann LeCun, for that growth to continue, the industry may need to focus on producing chips dedicated to deep learning, as well as a new, more efficient programming language.
Artificial intelligence is much older than you might expect - it has been around for decades. LeCun has been part of that history: working at Bell Labs in the 1980s, he developed convolutional neural networks (ConvNets, or CNNs) capable of reading handwritten ZIP codes. In his view, hardware innovation is inseparably tied to progress in deep learning, and we will need even more of it in the future.
This is why, during his keynote address at the 2019 International Solid-State Circuits Conference, LeCun stressed the need for more DL-specific hardware. He believes demand for it will only increase, driven by "new architectural concepts such as dynamic networks, associative-memory structures, and sparse activations".
Following a report from the Financial Times, there are also rumors that Facebook may be working on an AI chip of its own: "Facebook has been known to build its hardware when required — build its own ASIC, for instance. If there's any stone unturned, we're going to work on it," LeCun said. However, Facebook has not officially confirmed this.
Finally, LeCun also suggested that current programming languages may be holding AI back: “There are several projects at Google, Facebook, and other places to kind of design such a compiled language that can be efficient for deep learning, but it’s not clear at all that the community will follow, because people just want to use Python,” he told VentureBeat.
Do you agree with Facebook's chief AI scientist? What do you think the future of AI holds? Let us know in the comments.