Facebook today is publishing an academic paper and a blog post detailing Torchnet, a new piece of open-source software that’s designed to streamline deep learning, a type of artificial intelligence.
Deep learning is a trendy approach that involves training artificial neural networks on lots of data, like photos, and then getting the neural networks to make predictions about new data. Rather than build a completely new deep learning framework, of which there are many, Facebook chose to build on top of Torch, an open-source library to which Facebook has previously contributed.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1986549,"post_type":"exclusive","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"big-data,business,dev,","session":"B"}']“It makes it really easy to, for instance, completely hide the costs for I/O [input/output], which is something that a lot of people need if you want to train a practical large-scale deep learning system,” Laurens van der Maaten, a research scientist in Facebook’s Artificial Intelligence Research (FAIR) lab, told VentureBeat in an interview. “It’s not making Torch faster or slower or anything like that, that’s not the point of the framework.”
Torchnet, which is written in Lua and can run on standard x86 chips or graphics processing units (GPUs), also lets programmers reuse certain code, which means doing less work and lowering the chances of introducing bugs, said van der Maaten.
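The reuse van der Maaten describes comes largely from Torchnet’s engines and meters: the same generic training loop and the same evaluation metrics can be shared across experiments, so researchers rewrite less boilerplate. Again, this is a hedged sketch following the paper’s example code; the small two-layer network is an arbitrary placeholder, and the iterator comes from the sketch above.

-- Hypothetical sketch: a reusable training loop (engine) and metrics (meters).
require 'nn'
local tnt = require 'torchnet'

local net = nn.Sequential()
   :add(nn.Linear(784, 128))
   :add(nn.ReLU())
   :add(nn.Linear(128, 10))

local engine = tnt.SGDEngine()                   -- generic SGD training loop
local meter  = tnt.AverageValueMeter()           -- tracks the average loss
local clerr  = tnt.ClassErrorMeter{topk = {1}}   -- tracks top-1 classification error

engine.hooks.onStartEpoch = function(state)
   meter:reset(); clerr:reset()
end
engine.hooks.onForwardCriterion = function(state)
   meter:add(state.criterion.output)
   clerr:add(state.network.output, state.sample.target)
end

engine:train{
   network   = net,
   iterator  = getIterator(),                    -- iterator from the earlier sketch
   criterion = nn.CrossEntropyCriterion(),
   lr        = 0.1,
   maxepoch  = 5,
}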
Facebook is not the only company building tools for Torch, and specifically the Torch/nn library; Twitter is, too, and, sure enough, the Twitter employees van der Maaten has spoken with seemed excited about Torchnet.
Amazon, Google, and Microsoft, among others, have released totally new deep learning frameworks in recent months, and it’s interesting to see Facebook, which has previously contributed original open-source projects like React Native and Presto, do something different.
If anything, the approach is a bit like the Blocks and Fuel libraries for the Theano framework, van der Maaten said.
And Torchnet might not always be limited to Torch. Its abstractions “can readily be implemented” for other frameworks, like Caffe and Google’s TensorFlow, van der Maaten and his colleagues Ronan Collobert and Armand Joulin wrote in the paper.
Facebook came up with the first version of Torchnet six or seven months ago. “There are a bunch of different teams that use it in different applications,” van der Maaten said.
He wouldn’t identify specific parts of Facebook that rely on Torchnet, but it can be applied to things like image recognition and natural language processing. And that comes in handy for doing things like finding relevant Instagram photos or choosing the best Facebook posts for your News Feed. Facebook wants its content to be more appealing than anything else on the internet, both to keep people coming back and to attract new people, so this is important stuff.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1986549,"post_type":"exclusive","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"big-data,business,dev,","session":"B"}']
For more detail on Torchnet, check out the full paper, which van der Maaten is presenting today at the 2016 International Conference on Machine Learning (ICML) in New York, as well as the blog post.