Deep neural networks (DNNs) are gradually making their way into factories.
For some early adopters, neural networks are a new form of intelligence embedded behind the eyes of machine-vision cameras. Ultimately, these networks will work their way into robotic arms, sensor-network gateways, and controllers, transforming industrial automation. Such changes, however, will not happen soon.
Rob High, chief technology officer of IBM Watson, said: "We are still at an early stage of an era of advanced, next-generation machine learning algorithms that may take decades to evolve, but I think we will see tremendous progress in the next few years."
Neural networks will appear in more and more Linux-based multi-core x86 gateways and controllers in factory environments. High added that emerging 5G cellular networks will one day let neural networks access remote data at any time.
Car and aircraft manufacturers and health care providers are taking early steps, especially with smart cameras. Canon has embedded Nvidia's Jetson board in an industrial camera that runs deep learning. Cognex Corp., an industrial camera supplier, is ramping up its own product line. Horizon Robotics, a Chinese startup, has shipped security surveillance cameras with its deep-learning inference accelerator embedded.
Deepu Talla, general manager of Nvidia's automation business, said: "All the early adopters are deploying deep learning for visual perception, and others are starting to take notice. Implementing perception is not difficult these days; researchers have largely solved that problem.
However, "the biggest challenges now are using artificial intelligence (AI) to interact with humans and handling more sophisticated driving tasks; these are issues that require long-term research. In drone and robot navigation, we have advanced further, to the prototype stage."
Talla calls robotics "the intersection of computing and AI," but many industrial uses of deep learning may be less glamorous, and may arrive sooner.
Doug Olsen, CEO of Harmonic Drive LLC, a supplier of robotic components, said factory robots have not yet started using AI. "Embedded machines in factories can be used to predict failures, collecting daily usage data to determine when a system needs preventive maintenance. That is where AI will take hold first. So in the short term, we should not expect too much from intelligent robotic arms," he said.
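The predictive-maintenance idea Olsen describes can be sketched very simply: compare a machine's latest sensor reading against its historical baseline and flag large deviations. This is a minimal illustration, not any vendor's actual method; the vibration readings and the z-score threshold are hypothetical.

```python
# Minimal sketch of threshold-based predictive maintenance:
# flag a reading that drifts far from the historical baseline,
# so service can be scheduled before a failure.
from statistics import mean, stdev

def needs_maintenance(history, latest, z_threshold=3.0):
    """Return True if `latest` deviates from `history` by more than
    `z_threshold` standard deviations (a simple anomaly test)."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return latest != mu
    return abs(latest - mu) / sigma > z_threshold

# Hypothetical daily vibration readings from a healthy arm joint:
baseline = [1.0, 1.1, 0.9, 1.05, 0.95, 1.02, 0.98]
print(needs_maintenance(baseline, 1.04))  # False: within normal range
print(needs_maintenance(baseline, 2.5))   # True: large drift, schedule service
```

Real deployments replace this fixed threshold with learned models over many sensor channels, but the workflow (baseline, monitor, flag) is the same.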
Several large chip makers agree. Three years ago, Renesas Electronics began experimenting with microcontrollers (MCUs) that support AI at endpoint nodes, using them to detect faults in its semiconductor wafer plants and to predict when production systems need maintenance.
In October, the Japanese chip giant launched its first MCU with a dynamically reconfigurable processor block for real-time image processing. The company aims to follow with controllers that support real-time recognition in 2020 and incremental learning in 2022.
Competitor STMicroelectronics is taking a similar approach with its STM32 chips. In February, the company announced a deep-learning SoC and an accelerator under development, aimed in part at fault detection in factories.
Intelligent robots are coming. Covariant.ai, a startup, is working toward that goal through reinforcement learning. "The ability of robots to watch and act on what they see will be one of the biggest advances in deep learning over the next few years," said Pieter Abbeel of the University of California, Berkeley, who founded Covariant.ai and runs a robotics laboratory at UC Berkeley.
Abbeel has shown how robots can learn tasks in simulation using neural networks, but the technology is still in its early stages. "In fact, part of the reason we founded Covariant.ai is our optimism that the industrial AI field is not yet crowded," he said.
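The simulation-based learning mentioned above can be illustrated with a toy example: an agent learns a policy purely in simulation via reinforcement learning. The sketch below uses tabular Q-learning on a tiny one-dimensional corridor; real robotic training uses deep networks and physics simulators, so this only shows the shape of the learning loop, with all parameters chosen arbitrarily for illustration.

```python
# Toy reinforcement learning in simulation: tabular Q-learning
# on a 1-D corridor where the agent must walk right to a goal.
import random

random.seed(0)      # make the toy run reproducible
N = 5               # corridor states 0..4; the goal is state 4
ACTIONS = [-1, 1]   # move left or move right
Q = [[0.0, 0.0] for _ in range(N)]  # Q-value table: Q[state][action]

alpha, gamma, eps = 0.5, 0.9, 0.2   # learning rate, discount, exploration
for episode in range(500):
    s = 0
    while s != N - 1:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.randrange(2)
        else:
            a = max((0, 1), key=lambda i: Q[s][i])
        s2 = min(max(s + ACTIONS[a], 0), N - 1)   # simulated transition
        r = 1.0 if s2 == N - 1 else 0.0           # reward only at the goal
        # Q-learning update toward the bootstrapped target
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2

# After training, the greedy policy steps right in every non-goal state:
policy = [max((0, 1), key=lambda i: Q[s][i]) for s in range(N - 1)]
print(policy)  # → [1, 1, 1, 1]
```

The point of the sketch is the one Abbeel makes: the policy is learned entirely inside the simulator, with no hand-coded rules about which way to move.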