Science goes online

Although we may not be aware of it, we encounter machine learning techniques everywhere on the Internet today. To a large extent they manage the social media we use so widely, and they rank search engine results. Soon, algorithms generally known as "deep learning" could take charge of our health or our home appliances. Science goes online.

In November 2020, the media reported on a new solution developed by MIT specialists that brings data-processing techniques, so far known from powerful neural networks, to the miniature chips in wearable medical devices, household appliances, and up to 250 billion other devices that now make up what we call the Internet of Things.

The system, called MCUNet, constructs compact neural networks that deliver speed and accuracy of deep learning unprecedented in IoT devices, despite their still very limited memory and processing power. The technology could help expand the Internet of Things universe while saving energy and improving data security.

"Little deep learning"

The Internet of Things (1) is considered to have begun in 1978, when students of Carnegie Mellon University, including Mike Kazar, connected a cola-vending machine to the Internet. Their motivation was mainly laziness: the vending machine stood some distance from the room where they were working, and they wanted to be sure a trip for a drink would pay off, i.e. that there was actually cola in the machine. "We considered it a kind of joke," Kazar, now an engineer at Microsoft, recalled in the media. "Nobody expected billions of devices connected to the network back then."

  1. Internet of Things

Today, just about everything, from wearable heart monitors to smart refrigerators that tell you when you are running out of milk, can be found on the rapidly growing IoT network. Devices on this network often run on microcontrollers: simple computer systems without an operating system, with minimal processing power and a thousand times less memory than a regular smartphone. Pattern-recognition tasks typical of deep learning are therefore difficult to perform locally on IoT devices. When complex analyses are required, the data collected in an IoT system is often sent to the cloud, making it vulnerable to hacker attacks.

As part of the MCUNet project, a group of MIT researchers developed two components for "small deep learning", that is, for running neural networks on IoT microcontrollers. One component is TinyEngine, an inference engine that manages resources much like an operating system. TinyEngine is tailored to run a particular neural network structure, which is selected by the second MCUNet component, TinyNAS, a neural architecture search algorithm.

Available neural architecture search techniques typically start from a large pool of possible network structures based on a predefined template, and then gradually narrow it down to one with high accuracy and low cost. While this method works, it is not the most efficient. "This can work quite well for GPUs or smartphones," says co-author Ji Lin of MIT's Department of Electrical Engineering and Computer Science (EECS). "But it was difficult to apply these techniques directly to tiny microcontrollers."

So Lin developed TinyNAS, a neural architecture search method that creates custom-sized networks. "We have many microcontrollers with different power budgets and different memory sizes," explains Lin. "So we developed the [TinyNAS] algorithm to optimize the search space for different microcontrollers." The configurable nature of TinyNAS lets it generate compact neural networks with the best possible performance for a given microcontroller, without unnecessary parameters. "We then deliver the final efficient model to the microcontroller," Lin continues.
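The core idea of tailoring the search space to the device can be sketched in a few lines. The cost model, budgets, and names below are purely illustrative assumptions, not MCUNet's actual estimator or API; the point is only that every candidate architecture is filtered against the target chip's memory before any search happens.

```python
import itertools

# Toy budgets for a hypothetical microcontroller (illustrative values).
FLASH_BUDGET = 1024 * 1024   # 1 MB flash for int8 weights
SRAM_BUDGET = 320 * 1024     # 320 kB SRAM for int8 activations

def estimate_weights(width_mult):
    # Toy cost model: parameter count grows roughly with width^2.
    base_params = 1_000_000
    return int(base_params * width_mult ** 2)  # bytes

def estimate_activations(width_mult, resolution):
    # Toy model: peak activation memory ~ channels * H * W.
    base_channels = 32
    ch = int(base_channels * width_mult)
    return ch * resolution * resolution  # bytes

def search_space(width_mults, resolutions):
    """Keep only (width, resolution) pairs that fit both budgets."""
    feasible = []
    for w, r in itertools.product(width_mults, resolutions):
        if (estimate_weights(w) <= FLASH_BUDGET
                and estimate_activations(w, r) <= SRAM_BUDGET):
            feasible.append((w, r))
    return feasible

candidates = search_space([0.35, 0.5, 0.75, 1.0], [96, 128, 160, 224])
# A real search would now train and rank the feasible candidates;
# here we simply pick the largest configuration that fits.
best = max(candidates, key=lambda wr: wr[0] * wr[1])
print("feasible:", candidates)
print("chosen (width multiplier, resolution):", best)
```

Under this toy model, a full-width network fits the flash budget only at a reduced input resolution, which is exactly the kind of trade-off a per-device search space encodes.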

To run this little neural network, the microcontroller also needs an inference engine. A typical inference engine carries instructions for tasks it rarely performs. The extra code is no problem for a laptop or smartphone, but it can easily overwhelm a microcontroller. The MIT solution carries no such burden: with only a megabyte of flash memory available, resources had to be managed frugally. TinyEngine generates exactly the code needed to run the TinyNAS neural network, and any additional code is discarded. "We only keep what we need," explains Song Han, Lin's associate. "This is an advantage of system-algorithm co-design."
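The "keep only what we need" idea can be illustrated with a minimal code-generation sketch. The operator library and network description below are hypothetical stand-ins, not TinyEngine's real data structures; the sketch only shows how emitting kernels per network, rather than linking a general-purpose interpreter, sheds unused code.

```python
# Hypothetical kernel library: C source snippets for every operator the
# toolchain knows about. A generic runtime would ship all of them.
KERNEL_LIBRARY = {
    "conv2d":    "void conv2d(...) { /* ... */ }",
    "depthwise": "void depthwise(...) { /* ... */ }",
    "relu":      "void relu(...) { /* ... */ }",
    "avg_pool":  "void avg_pool(...) { /* ... */ }",
    "softmax":   "void softmax(...) { /* ... */ }",
    "lstm":      "void lstm(...) { /* ... */ }",      # unused below
    "transpose": "void transpose(...) { /* ... */ }", # unused below
}

# The specific network chosen by the architecture search.
network = ["conv2d", "relu", "depthwise", "relu", "avg_pool", "softmax"]

def generate_engine(ops):
    """Emit source containing only the kernels this network needs."""
    needed = sorted(set(ops))
    return "\n".join(KERNEL_LIBRARY[op] for op in needed)

source = generate_engine(network)
dropped = set(KERNEL_LIBRARY) - set(network)
print(f"kept {len(set(network))} kernels, dropped {len(dropped)}")
```

Every kernel the network never calls simply never makes it into the compiled binary, which is one way a model-specific engine ends up smaller than a general-purpose one.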


In tests, TinyEngine's compiled binary code was 1.9 to 5 times smaller than that of comparable microcontroller inference engines being developed by Google and ARM. TinyEngine also includes innovations that shorten running time, including an in-place depth-wise convolution mechanism that cuts peak memory usage by almost half.

The first test task for MCUNet was image classification. The researchers used the labeled images of the ImageNet database to train the system and then test its ability to classify new images. On the commercial microcontroller they tested, MCUNet classified 70.7% of images correctly, a significant improvement over the previously tested system (2), which achieved 54% accuracy. The team saw similar results, in both speed and accuracy, in ImageNet tests on three other microcontrollers. MCUNet also outperformed the competition in audio wake-word tasks, in which the user initiates interaction with a computer using vocal cues (e.g. "Hey, Siri") or simply by entering the room.

  2. One of the benchmarks of the MCUNet system

Promising test results give Lin and Han hope that their solution will become the new standard for microcontrollers. The authors of the system point out that MCUNet could also increase the security of IoT devices. "Privacy is a key advantage," says Han. "You don't need to upload any sensitive data to the cloud."

Han envisions smart watches with MCUNet built in that not only monitor heartbeat, blood pressure and blood-oxygen levels, but also analyze that information and help the user understand it on their own, without sending the data anywhere. The solution could also bring deep learning to IoT devices in vehicles and in rural areas with limited Internet access.

AI + IoT = AIoT

The project developed by the MIT specialists falls into a category that already has a name, AIoT (3), formed by combining the Internet of Things (IoT) with artificial intelligence (AI). The network of things collects and processes data that feeds artificial intelligence systems, which make decisions and learn from them. The intelligence and functionality of the entire network keeps growing.

  3. AIoT

Practical examples of AIoT are already quite numerous. Take, for example, intelligent retail systems. In a smart store, a camera system with appropriate algorithms can use facial recognition to identify customers. The system collects information about them, including gender, product preferences and traffic flows over time, analyzes the data to accurately predict consumer behavior, and then uses those predictions to make marketing or shelf-placement decisions. Such techniques are used, for example, in Amazon Go stores.

Another application is vehicle traffic monitoring. When drones are used to track a large area, they can upload traffic data; AI then analyzes it and decides how best to reduce congestion, adjusting speed limits and traffic-light timing without human intervention. ET City Brain, a solution from Alibaba Cloud, optimizes the use of city resources with AIoT. The system can detect accidents and illegal parking, and can change traffic lights to help ambulances reach patients faster, as well as respond to other emergencies.

Smart office buildings are another area where artificial intelligence and the Internet of Things intersect. Some companies install networks of intelligent sensors in their buildings that detect the presence of personnel and adjust temperature and lighting accordingly, managing energy consumption economically. An intelligent building can also control access using facial recognition: a network of cameras combined with artificial intelligence compares real-time images with a database to determine who should be allowed into the building.
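The occupancy-driven part of such a building controller amounts to a simple sensor-to-setpoint rule. The function below is a hypothetical toy sketch of that rule, with made-up setpoints, not a description of any real building-management product.

```python
def control(occupied: bool, outside_temp_c: float) -> dict:
    """Return heating and lighting setpoints for one zone
    based on an occupancy sensor and an outdoor thermometer."""
    if occupied:
        # Occupied zone: comfort settings.
        return {"target_temp_c": 21.0, "lights": "on"}
    # Empty zone: relax the setpoint toward outside conditions
    # to save energy, clamped to a reasonable band.
    setback = 16.0 if outside_temp_c < 16.0 else outside_temp_c
    return {"target_temp_c": min(setback, 26.0), "lights": "off"}

print(control(True, 5.0))   # occupied zone in winter: comfort settings
print(control(False, 5.0))  # empty zone in winter: setback heating
```

The "AI" in a real deployment would sit one level up, learning occupancy patterns to pre-heat zones before people arrive rather than reacting only to the current sensor reading.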

AIoT is currently used in fleet management: monitoring vehicles, reducing fuel costs, and identifying dangerous driver behavior. Thanks to IoT devices such as GPS receivers and other sensors, combined with an artificial intelligence system, companies can manage their fleets better. Another area where AIoT is used today is autonomous vehicle systems such as Tesla's Autopilot, which uses radar, sonar, GPS and cameras to collect data on driving conditions, and then AI to make decisions. Other autonomous robotic systems work in a similar way.

The infrastructure in which this network of intelligent things will learn to become an even more intelligent network of even more intelligent things is to be the emerging 5G Internet, and later the even more advanced 6G. That direction of development seems certain, regardless of whether the processing and neural learning take place in large hubs, in clouds, or locally on small microcontrollers.