Google researchers Brendan McMahan and Daniel Ramage report that Google has begun offloading some of its machine learning to mobile devices to keep private data… private. Instead of pulling all your personal info into a central system to train its algorithms, Google has developed the chops to let your phone do that data analysis. They call it federated learning:

Federated Learning allows for smarter models, lower latency, and less power consumption, all while ensuring privacy. And this approach has another immediate benefit: in addition to providing an update to the shared model, the improved model on your phone can also be used immediately, powering experiences personalized by the way you use your phone.

We’re currently testing Federated Learning in Gboard on Android, the Google Keyboard. When Gboard shows a suggested query, your phone locally stores information about the current context and whether you clicked the suggestion. Federated Learning processes that history on-device to suggest improvements to the next iteration of Gboard’s query suggestion model.
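To make that concrete, here is a rough, hypothetical sketch of the kind of record a phone might keep for each suggestion event. The field names are illustrative, not Gboard's actual schema; the point is that these records stay on the device and only feed local training:

```python
from dataclasses import dataclass, field
import time

@dataclass
class SuggestionEvent:
    """One locally stored training example: what Gboard showed, and what you did."""
    context_tokens: list[str]   # words typed just before the suggestion appeared
    suggested_query: str        # the query Gboard proposed
    clicked: bool               # whether you tapped the suggestion
    timestamp: float = field(default_factory=time.time)

# This history never leaves the device; it only feeds the on-device training step.
local_history: list[SuggestionEvent] = []

def record_event(context_tokens: list[str], suggested_query: str, clicked: bool) -> None:
    local_history.append(SuggestionEvent(context_tokens, suggested_query, clicked))
```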

Old way: beam everything you do on your Google keyboard (!!) back to the mothership. New way: keep it all local, and beam back only an encrypted summary of relevant learnings. “Your device downloads the current model, improves it by learning from data on your phone, and then summarizes the changes as a small focused update.” To do this, Google has smartphones running a miniature version of TensorFlow, the open-source software library for machine learning.
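In code, one round of that loop might look something like the sketch below. This is a minimal toy version of the general federated-averaging idea, using a simple logistic "did they click?" model, not Google's TensorFlow implementation; in practice the uploaded delta would also be compressed and encrypted as described above:

```python
import numpy as np

def local_update(global_w, features, labels, lr=0.1, epochs=1):
    """On-device step: start from the downloaded model, train on local data,
    and return only the small weight delta (never the raw data)."""
    w = global_w.copy()
    for _ in range(epochs):
        for x, y in zip(features, labels):
            pred = 1.0 / (1.0 + np.exp(-x @ w))   # predicted click probability
            w -= lr * (pred - y) * x              # one SGD step on this example
    return w - global_w                           # the "small focused update"

def server_aggregate(global_w, client_deltas):
    """Server side of federated averaging: average the deltas, apply once."""
    return global_w + np.mean(client_deltas, axis=0)

# Toy usage: two "phones", each with a handful of local (context, clicked) examples.
rng = np.random.default_rng(0)
global_w = np.zeros(4)
phones = [(rng.normal(size=(5, 4)), rng.integers(0, 2, size=5)) for _ in range(2)]
deltas = [local_update(global_w, X, y) for X, y in phones]
global_w = server_aggregate(global_w, deltas)
```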

One knock against predictive interfaces is how much you have to give up about yourself to get the benefits. If this approach works as promised, new systems may be just as helpful, without a central service having to absorb your nitty-gritty details to learn how.
