Google's ML Kit enables developers on iOS and Android to integrate AI models into their apps



Google has launched a new software development kit (SDK) called ML Kit that lets developers on both iOS and Android easily integrate machine learning models into their apps.

First announced at the Google I/O developer conference, ML Kit gives app developers on iOS and Android access to pre-trained machine learning models, with support for text and image recognition, barcode scanning, image labeling, and landmark recognition. The models can run online, offline, or both, depending on the developer's preference.

Google also intends to expand the current set of available APIs with a smart-reply feature, akin to what's available in the Inbox by Gmail app, and a high-density face contour capability for the face detection API.

The offline models are perhaps the caveat: while developers can integrate them directly into their apps free of charge, they offer lower accuracy, given that they are smaller and run on the device. The online models, by contrast, run on Google Cloud and are larger and more accurate.
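The on-device versus cloud choice shows up directly in code. Below is a minimal Kotlin sketch of ML Kit's text recognition on Android, assuming the `firebase-ml-vision` dependency is configured in the app (note that the exact API surface shifted across ML Kit's early releases):

```kotlin
import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

fun recognizeText(bitmap: Bitmap) {
    // Wrap the input image for ML Kit.
    val image = FirebaseVisionImage.fromBitmap(bitmap)

    // On-device recognizer: free and works offline, but uses the
    // smaller, less accurate model.
    val detector = FirebaseVision.getInstance().onDeviceTextRecognizer

    // To use the larger, more accurate cloud-based model instead:
    // val detector = FirebaseVision.getInstance().cloudTextRecognizer

    // Recognition runs asynchronously; results arrive via listeners.
    detector.processImage(image)
        .addOnSuccessListener { result -> println(result.text) }
        .addOnFailureListener { e -> e.printStackTrace() }
}
```

Swapping between the two models is a one-line change, which is the main convenience ML Kit offers over wiring up the Cloud Vision API by hand.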

That said, the experience with Google's existing Cloud machine learning APIs is that they are not as tightly integrated with Firebase and the Firebase Console.

The SDK falls under Google's Firebase brand, which is specifically targeted at making it easy for developers to bring machine learning to their mobile apps. For developers who want to go beyond the pre-trained models, ML Kit is the right SDK, as it also supports custom TensorFlow Lite models, which Google is working on compressing so they fit into more apps.