The environment supports TensorFlow Lite, Caffe2, and other neural-network frameworks, as well as non-neural ML algorithms. It also provides turnkey, integrated ML solutions for voice, vision, and anomaly-detection applications, covering data acquisition and trained models, with user-customizable features.
The development environment can target a broad range of NXP processors including i.MX and Layerscape applications processors.
"Having long recognized that processing at the edge node is really the driver for customer adoption of machine learning, we created scalable ML solutions and eIQ tools, to make transferring artificial intelligence capabilities from the cloud-to-the-edge even more accessible and easy to use," said Geoff Lees, senior vice president and general manager of microcontrollers.
NXP eIQ includes model conversion for neural-network (NN) frameworks and inference engines such as TensorFlow Lite, Caffe2, CNTK, and Arm NN; support for emerging NN compilers such as Glow and XLA; and classical ML algorithms.
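To give a sense of the "classical ML algorithms" category alongside the neural-network tooling, here is a minimal sketch (not eIQ code, just an illustration) of a k-nearest-neighbors classifier in plain Python, the kind of lightweight, dependency-free algorithm suited to resource-constrained edge devices:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote of its k nearest training points."""
    # train: list of (feature_tuple, label); Euclidean distance metric
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Toy data: two small clusters
train = [((0.0, 0.0), "a"), ((0.1, 0.2), "a"),
         ((1.0, 1.0), "b"), ((0.9, 1.1), "b")]
print(knn_predict(train, (0.05, 0.1)))  # → a
print(knn_predict(train, (1.0, 0.95)))  # → b
```

Algorithms like this avoid the memory and compute footprint of a neural network, which is why toolkits targeting microcontrollers typically offer them alongside NN inference engines.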
NXP also recently introduced a software infrastructure called EdgeScale to unify how data is collected, curated, and processed at the edge, with a focus on enabling ML applications.