
SwiftKey’s mobile keyboard app now offers better word predictions built on neural networks

SwiftKey for Android

Image Credit: Paul Sawers / VentureBeat

SwiftKey, the popular mobile keyboard maker that was acquired by Microsoft earlier this year, has given its flagship app a major overhaul this week with a completely new predictive typing engine that promises increased accuracy.

The London-based company gave its first public demonstration of the technology last year through a standalone app called SwiftKey Neural Alpha, effectively an experimental version of its main app powered by a different language engine. It made sense for the company to do it this way while it was still fine-tuning the underlying technology, but the engine is now ready for prime time within the main consumer SwiftKey app.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":2056525,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,cloud,dev,entrepreneur,media,mobile,","session":"D"}']

Founded in 2008, SwiftKey built a big following on Android as an alternative to the stock keyboard app on phones and tablets, though it later launched for iOS after Apple opened up to third-party keyboards. To speed up typing, SwiftKey learns your writing style over time and predicts the next word before you’ve started typing — this is partly based on historical patterns, but it also scans texts from other sources to “learn” popular sequences in which words are often placed. But things are about to get a whole lot better.

SwiftKey and the machines

Artificial neural networks (ANNs) are part of the broader field of machine learning and artificial intelligence (A.I.) and are loosely modeled on the structure and workings of the human brain. Compared to the existing n-gram model, which leans toward computational linguistics and probability, ANNs should be better at predicting and correcting language by considering the context of what someone is trying to say.


Previously, SwiftKey would suggest the next word in a sentence based on the order of words it had seen in the past, and it was pretty good at guessing what you were trying to say. But there were restrictions, because it didn't look at the actual meaning of the words: if it hadn't seen words used in a particular sequence before, its predictions were limited. Neural networks, on the other hand, usher a more human element into the mix by digging deeper into what you're trying to communicate.
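To make that limitation concrete, here is a minimal sketch of a frequency-based next-word predictor in the spirit of an n-gram model. It is purely illustrative and not SwiftKey's actual code: it can only suggest words it has literally seen following the current word, which is exactly the restriction described above.

```python
from collections import Counter, defaultdict

# Minimal bigram next-word predictor (illustrative only, not SwiftKey's code).
# It can only suggest words it has already seen following the current word.

def train_bigram_model(corpus):
    """Count which word follows which in the training text."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for prev, nxt in zip(words, words[1:]):
            model[prev][nxt] += 1
    return model

def predict_next(model, prev_word, k=3):
    """Return the k words most often seen after prev_word."""
    return [word for word, _ in model[prev_word.lower()].most_common(k)]

corpus = [
    "let's meet at the airport",
    "let's meet at the station",
    "see you at the airport",
]
model = train_bigram_model(corpus)
print(predict_next(model, "the"))    # ['airport', 'station']: only sequences seen in training
print(predict_next(model, "hotel"))  # []: never seen, so no suggestion at all
```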

“It [the neural network] gives the ability for SwiftKey to predict and suggest words in a way that’s more meaningful and more like how language is actually used by people,” said SwiftKey chief marketing officer Joe Braidwood in an interview with VentureBeat last year.

An example that SwiftKey itself uses to illustrate the difference: if SwiftKey has previously seen the phrase "Let's meet at the airport," then when a user begins typing a similar sentence in the future, it can now understand that "hotel" or "office" are appropriate alternatives, because they are all places where people commonly meet.
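The sketch below illustrates the intuition behind that example: words are represented as vectors (embeddings), and words with similar meanings sit close together, so a neural model can treat them as interchangeable candidates. The vectors here are invented for demonstration; a real model like SwiftKey's learns them from data.

```python
import math

# Toy word-embedding illustration (vectors are made up for demonstration).
# Words with similar meanings get similar vectors, so "hotel" and "office"
# rank close to "airport" while an unrelated word does not.

embeddings = {
    "airport": [0.90, 0.80, 0.10],
    "hotel":   [0.85, 0.75, 0.15],
    "office":  [0.80, 0.70, 0.20],
    "banana":  [0.10, 0.05, 0.90],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

def similar_to(word, k=3):
    """Rank other words by how close their vectors are to the given word's."""
    scores = {w: cosine(embeddings[word], v) for w, v in embeddings.items() if w != word}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(similar_to("airport"))  # ['hotel', 'office', 'banana'], with 'banana' a distant third
```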

Above: Neural Network

Neural networks have become something of a buzz phrase in recent times, with tech companies increasingly turning their attention to artificial intelligence to improve their products. For example, Shutterstock developed its own convolutional neural network for a new reverse image search feature, while Facebook, too, is using neural networks to understand the content of images.

Interestingly, SwiftKey claims that its latest update represents the first time neural networks have been deployed locally on a smartphone, rather than relying on a connection to the cloud. It's also worth noting that the feature is limited to U.S. and U.K. English for now, though the company says more languages are coming, and that the update is currently available only on Android.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":2056525,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,cloud,dev,entrepreneur,media,mobile,","session":"D"}']
