
Google's neural network library optimised for mobile

Tyrone Stewart

Google has created a neural network library for mobile that will enable developers to build deep-learning models designed to run on Android.

First announced at Google I/O last month, the new version of TensorFlow, called TensorFlow Lite, has been optimised for mobile. It is designed to be fast and lightweight while working to the same standard as the original framework, giving developers the chance to run their AI apps in real time on mobile phones. Google says it will be released later this year as part of the open-source TensorFlow project.
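The article predates TensorFlow Lite's release, so the exact API was not yet public; as a rough illustration of the workflow it describes (train a model with full TensorFlow, then produce a small artifact for on-device inference), here is a minimal sketch using the converter API that TensorFlow later shipped. The tiny model and file name are hypothetical, purely for demonstration.

```python
import tensorflow as tf

# Hypothetical toy model standing in for a real deep-learning model
# trained with the full TensorFlow framework.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(10, activation="softmax", input_shape=(4,)),
])

# Convert the model into the compact FlatBuffer format that the
# on-device TensorFlow Lite interpreter consumes.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# The resulting bytes would be bundled into an Android app's assets.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)

# On the device, a lightweight interpreter loads the converted model
# and runs inference without the full TensorFlow runtime.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
```

The two-step split is the key design point: heavy training stays on servers, while the phone only carries the small interpreter and the converted model.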

The move from Google follows in the footsteps of Facebook. Toward the end of last year, the social network announced Caffe2Go, a mobile version of its Caffe deep-learning framework. That framework has been used for real-time style transfer, which applies art-like filters to photos and video directly on the phone.

It’s likely these two examples from the tech giants will pave the way for a host of deep-learning frameworks with mobile runtimes. Eventually, such frameworks could become another common layer of the mobile stack.