In the spirit of helping everyone do artificial intelligence more efficiently, Facebook is giving away some of its seriously powerful deep learning tools for free in the Torch open source library.
More specifically, the social networking company is open-sourcing code that helps it run complex algorithms for deep learning, an increasingly popular type of artificial intelligence. The company will announce the news in a blog post today.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1643479,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"big-data,business,","session":"B"}']Deep learning involves training systems, called artificial neural networks, on lots of information derived from audio, images, and other inputs, and then presenting the systems with new information and receiving inferences about it in response. Torch, an open source library that’s been around since 2002, contains a framework for building and training neural networks.
Facebook is making several modules available via Torch, including one with a convolutional neural network layer featuring highly customized kernels — templates that slide over images to recognize certain objects — that could help researchers and engineers at other companies catch up with some of the performance improvements Facebook has made internally.
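The "sliding kernel" idea is easy to see in a few lines. Below is an illustrative Python/NumPy sketch of a single naive 2D convolution pass; the kernels Facebook is releasing are heavily optimized GPU code, but this is the basic operation they accelerate.

```python
import numpy as np

def conv2d(image, kernel):
    """Naive 2D convolution: slide the kernel over the image and sum the overlaps."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.random.rand(28, 28)            # e.g. a small grayscale image
edge_kernel = np.array([[1, 0, -1],
                        [1, 0, -1],
                        [1, 0, -1]])      # responds to vertical edges
features = conv2d(image, edge_kernel)     # 26x26 map of where the pattern appears
```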
“For everyone out there, these kernels are faster than anything else in the open source community,” Soumith Chintala, a Facebook artificial intelligence researcher and software engineer, told VentureBeat in an interview.
What’s more, Facebook is releasing “containers” to help distribute the work of training neural networks across multiple graphics processors. “You’ll get a good speed-up,” Chintala explained. And the supercharged convolution layer code is as much as 23.5 times faster than the fastest publicly available alternatives to date, Chintala wrote in his blog post.
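Facebook's multi-GPU containers ship as Torch code, but the data-parallel idea behind them can be sketched in a few lines of Python: split each batch across devices, compute gradients independently, and average them into one shared update. Everything below is illustrative, not Facebook's implementation.

```python
import numpy as np

# Conceptual sketch of data-parallel training across several GPUs.
num_devices = 4
weights = np.zeros(10)                         # shared model parameters
batch_x = np.random.rand(64, 10)               # one training batch
batch_y = batch_x @ np.arange(10.0)            # toy regression targets

# One shard of the batch per "device".
shards = np.array_split(np.arange(64), num_devices)

def gradient(w, x, y):
    # Gradient of mean squared error for a linear model -- stands in for a full network.
    return 2 * x.T @ (x @ w - y) / len(y)

# Each device works on its own shard (in a real system these run concurrently, one per GPU).
per_device_grads = [gradient(weights, batch_x[idx], batch_y[idx]) for idx in shards]

# Averaging the per-device gradients yields a single update,
# as if the whole batch had been processed at once -- hence the speed-up.
weights -= 0.01 * np.mean(per_device_grads, axis=0)
```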
Facebook has previously contributed a wide variety of other software to the rest of the world under open source licenses. But these new components demonstrate a serious commitment from within the Facebook Artificial Intelligence Research arm, which the company formed in 2013 with the arrival of deep learning luminary Yann LeCun. More recently, that group brought on Vladimir Vapnik, known for his work on the Support Vector Machine algorithm.
Google, Twitter, Spotify, Netflix, and others have been quickly bringing in their own talent in the domain of deep learning. But open source contributions in the area have been uncommon, making Facebook’s move notable.
By open-sourcing the modules, Facebook could also get people outside the company to improve them, and in the process identify new researchers to bring aboard.
Hiring motivations aside, Chintala believes the Torch project holds serious merit and that the new components should make it still more powerful.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1643479,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"big-data,business,","session":"B"}']
“It’s like building some kind of electronic contraption or, like, a Lego set,” Chintala said. “You just can plug in and plug out all these blocks that have different dynamics and that have complex algorithms within them.”
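That plug-and-play structure is what Torch's module system provides: layers are interchangeable blocks chained together inside a container (Torch's real containers, such as nn.Sequential, are Lua modules). As a rough illustration of the idea, and not Torch's actual API, here is how such a container might look in Python; the class names are hypothetical.

```python
import numpy as np

# Illustrative "Lego block" modules: each layer exposes forward(),
# and a container runs them in order.
class Linear:
    def __init__(self, n_in, n_out):
        self.W = np.random.randn(n_in, n_out) * 0.1
        self.b = np.zeros(n_out)
    def forward(self, x):
        return x @ self.W + self.b

class ReLU:
    def forward(self, x):
        return np.maximum(x, 0)

class Sequential:
    def __init__(self, *blocks):
        self.blocks = blocks             # plug blocks in (or swap them out) freely
    def forward(self, x):
        for block in self.blocks:
            x = block.forward(x)
        return x

net = Sequential(Linear(4, 8), ReLU(), Linear(8, 1))
print(net.forward(np.random.rand(2, 4)).shape)    # (2, 1)
```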
At the same time, he said, Torch is not especially difficult to learn — unlike, say, the Theano library.
“We’ve made it incredibly easy to use,” Chintala said. “We introduce someone to Torch, and they start churning out research really fast.”