HMI: The Technology Connecting Humans and Machines

Have you ever come across the term HMI? If you're interested in IT, you've probably heard of UI. And what about MMI? This article explains what HMI, UI, and MMI mean, walks through specific examples of the technology, and discusses HMI's relationship with edge computing and its future development.

Definition of HMI

HMI is a relatively new term and is an abbreviation of human-machine interface. It is the part that sits between humans and machines, carrying instructions from humans to machines and returning results from machines to humans. In other words, it is the function or component that mediates the dialogue between humans and machines. Incidentally, the same idea is called a UI (user interface) in the computing world and an MMI (man-machine interface) in Japan.

Specific Examples of HMI

Manual focusing has now almost disappeared, pushed aside by autofocus cameras that focus automatically, but until then the focus of a camera was basically adjusted by hand. The lens had a focus ring for adjusting focus (even today's cameras let you switch to manual adjustment). To operate it, you turn the focus ring while checking the focus shown in the viewfinder until the image is sharp. If one turn does not get the focus right, or you want to focus on something else, you turn the ring again. This repetition can be thought of as a dialogue between human and machine. It is a simple mechanism, but also a good HMI.

Let's look at a newer example. The personal computers and smartphones you use every day are designed so that when you give an instruction with a keyboard, mouse, or tap, the corresponding display appears. In other words, humans give instructions with keyboards and mice, and computers show the results. This repetition, too, can be regarded as a dialogue between humans and machines.
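
This instruction-and-response cycle can be pictured as an event loop. The following is a minimal sketch of that idea; the function names here (read_input, handle, render) are hypothetical illustrations, not any particular GUI toolkit's API:

```python
# A minimal sketch of the human-machine dialogue as an event loop.
# All names here (read_input, handle, render) are hypothetical
# illustrations, not a real GUI toolkit's API.

def handle(event: str) -> str:
    """Turn a human instruction into a result the machine can show."""
    return f"result of '{event}'"

def read_input() -> str:
    """Stand-in for a keyboard, mouse, or tap event."""
    return input("instruction (or 'quit'): ")

def render(result: str) -> None:
    """Stand-in for updating the display."""
    print(result)

def dialogue_loop() -> None:
    # Human gives an instruction; machine shows the result; repeat.
    while True:
        event = read_input()
        if event == "quit":
            break
        render(handle(event))

if __name__ == "__main__":
    dialogue_loop()
```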

Furthermore, voice input has recently become possible; this technology is called speech recognition. It is also possible, for example, to have a website found through voice search read aloud, using a technique called speech synthesis. This kind of keyboard-free input can be said to be an HMI that comes closer to human-to-human conversation.
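
The recognize-then-speak pipeline could be sketched as follows. All three helper functions (recognize_speech, search_web, synthesize_speech) are placeholders standing in for real recognition, search, and synthesis engines, not any actual library's API:

```python
# Hypothetical voice-interaction pipeline: speech recognition in,
# speech synthesis out. The three helpers are placeholders standing
# in for real engines, not any actual library's API.

def recognize_speech(audio: bytes) -> str:
    """Placeholder: a real system would run a speech-recognition engine."""
    return "what is edge computing"

def search_web(query: str) -> str:
    """Placeholder: a real system would query a search service."""
    return f"Top result for '{query}': an article about edge computing."

def synthesize_speech(text: str) -> bytes:
    """Placeholder: a real system would return playable audio."""
    print(f"(speaking aloud) {text}")
    return b""

def voice_search(audio: bytes) -> bytes:
    query = recognize_speech(audio)    # human -> machine, no keyboard
    result = search_web(query)
    return synthesize_speech(result)   # machine -> human, spoken aloud

voice_search(b"...microphone samples...")
```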

Current HMI

When driving a car, a human checks the speedometer visually. However, shifting your line of sight while driving creates a momentary lapse, which can lead to an accident. To avoid this, the head-up display (HUD) was developed. A HUD projects driving information onto the windshield of a vehicle, especially a car, by means such as a laser beam as the vehicle travels. Because there is no need to shift the line of sight, safety is said to improve. Based on the information shown on the HUD, the driver operates the accelerator and brakes. If we think of this repetition as a dialogue between the car and the driver, the HUD can be called a mechanism for carrying out the conversation between person and car smoothly.

Automatic drowsiness detection and driver-monitoring functions in automobiles can also be regarded as mechanisms for smooth HMI. Systems have been developed that use sensors and cameras built into the car to detect information such as heart rate and pupil size, judge the driver's condition (drowsiness, agitation, and so on), and then issue alerts or control the car. For example, if the driver is drowsy, the car automatically takes countermeasures such as vibrating the steering wheel or bringing the vehicle to a stop. In the sense that the machine detects the human's condition and acts on it even without conscious instructions, such a system can be said to bring the machine closer to the human side.
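
A highly simplified sketch of such a monitoring loop might look like the following. The sensor readings, thresholds, and interventions are all invented for illustration; a real system would use calibrated models rather than fixed cutoffs:

```python
# Simplified driver-monitoring loop. Sensor readings, thresholds,
# and interventions are invented for illustration only; a real
# system would use calibrated models, not fixed cutoffs.

from dataclasses import dataclass

@dataclass
class DriverState:
    heart_rate_bpm: float
    pupil_diameter_mm: float
    eyes_closed_ratio: float  # fraction of recent frames with eyes closed

def is_drowsy(state: DriverState) -> bool:
    # Toy heuristic: low heart rate plus mostly-closed eyes.
    return state.heart_rate_bpm < 55 and state.eyes_closed_ratio > 0.5

def monitor_step(state: DriverState) -> str:
    if is_drowsy(state):
        return "ALERT: vibrate steering wheel, prepare to stop vehicle"
    return "OK"

print(monitor_step(DriverState(52.0, 3.1, 0.7)))  # -> ALERT...
print(monitor_step(DriverState(72.0, 4.0, 0.1)))  # -> OK
```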

As for the current state of HMI, it is also worth noting that the technologies introduced above are becoming more and more familiar. In the past, speech recognition and speech synthesis systems were complex and expensive. Today, voice input and automatic read-aloud on AI speakers and smartphones are commonplace. These advances owe much to tremendous progress in hardware and software, but the speed of their adoption is also remarkable. Just as smartphones spread rapidly over the last decade, the sophistication and spread of HMI may be entering a new phase.

HMI in the Future

Now let us predict how HMI will progress. As already mentioned, the machine side of HMI tends to move closer to the human side. Entering words through speech recognition instead of a keyboard is probably one of the clearest examples. Beyond words, it is also conceivable that human senses and emotions will be conveyed to machines, and that machines will come to understand them. Let's look at an example.

Have you ever operated a machine by reading its manual? In fact, many people say they rarely do. Manuals matter, but reading one does not mean you can immediately operate the actual machine, because what counts is not just information in text and pictures but the experience of actually handling the device. So consider manuals built with augmented reality or virtual reality. These make it possible to teach operation without using the actual equipment, and because you learn through your senses rather than by reading alone, the educational effect is greater. Such an augmented or virtualized manual requires the machine to respond interactively, at all times, at a level that feels like real manipulation. Advanced HMI technology is essential to achieve this, as the sketch below suggests.
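
One way to picture that interactive requirement is as a step-by-step tutorial state machine that reacts immediately to each user action. The steps and the simulated user actions below are invented for illustration:

```python
# Toy AR/VR manual as a tutorial state machine: each step waits for
# the correct user action and responds immediately. Steps and the
# simulated user actions are invented for illustration.

STEPS = [
    ("Open the side panel", "open_panel"),
    ("Press the green start button", "press_start"),
    ("Turn the feed dial to 3", "set_dial_3"),
]

def run_tutorial(user_actions):
    actions = iter(user_actions)
    for prompt, expected in STEPS:
        print(f"GUIDE: {prompt}")
        while True:
            action = next(actions)  # tracked hand/controller input
            if action == expected:
                print("  feedback: correct, part highlighted green")
                break
            print(f"  feedback: '{action}' is not right, try again")
    print("GUIDE: tutorial complete")

# Simulated trainee: one wrong action, then the right ones.
run_tutorial(["press_start", "open_panel", "press_start", "set_dial_3"])
```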

There is also HMI technology that transmits tactile and force sensations to machines and reproduces them. Research is currently under way on robot hands that can grasp an egg without breaking it, as well as on operating such robot hands remotely. As an HMI technology, this conveys the human's sense of grasping the egg to the robot hand, while the sensation of the robot hand's grasp is transmitted back to the human hand.
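
This two-way exchange is often called bilateral control. A crude sketch of one control step follows; the gains and sensor values are invented for illustration:

```python
# Crude sketch of one step of bilateral teleoperation: the human's
# grip commands the robot, and the robot's measured force is fed
# back to the human. Gains and sensor values are invented.

POSITION_GAIN = 0.8   # how strongly the robot follows the human's grip
FORCE_GAIN = 1.0      # how strongly contact force is echoed back

def bilateral_step(human_grip_mm: float,
                   robot_grip_mm: float,
                   robot_force_n: float) -> tuple[float, float]:
    """Return (robot grip command, force fed back to the human hand)."""
    # Forward path: robot closes toward the human's grip position.
    robot_command = robot_grip_mm + POSITION_GAIN * (human_grip_mm - robot_grip_mm)
    # Backward path: the robot's contact force is reproduced for the human.
    feedback_force = FORCE_GAIN * robot_force_n
    return robot_command, feedback_force

# Human squeezes to 18 mm; robot is at 20 mm and feels 0.4 N on the egg.
cmd, fb = bilateral_step(18.0, 20.0, 0.4)
print(f"robot command: {cmd:.1f} mm, force to human: {fb:.1f} N")
```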

So what happens when machines ultimately come as close as possible to humans? Based on this idea, research is being carried out on brain-machine interface (BMI) technology, which seeks to pick up brain signals directly and use them to interact with machines. There are many approaches to BMI; some embed sensors in the brain, but although this has succeeded in experiments, it is not very practical, and in practice it is more common to detect EEG signals and process them. Directly reading what a person is thinking in the brain and using it to operate a machine could make HMI even more human-friendly, but brain activity is complex, and we will need to wait for further research.
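
To give a taste of what "detect EEG signals and process them" can mean, here is a toy example that estimates alpha-band (8-12 Hz) power, a common first step in EEG-based interfaces. The signal is synthetic and the threshold and control rule are invented; real BMIs use far richer pipelines:

```python
# Toy EEG processing step: estimate alpha-band (8-12 Hz) power from a
# signal and turn it into a binary "command". The signal is synthetic
# and the threshold is invented; real BMIs use far richer pipelines.

import numpy as np

FS = 250  # sampling rate in Hz

def alpha_power(signal: np.ndarray) -> float:
    """Mean spectral power in the 8-12 Hz alpha band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return float(spectrum[band].mean())

def decode(signal: np.ndarray, threshold: float = 1e4) -> str:
    # Invented rule: strong alpha (eyes closed, relaxed) means "stop".
    return "stop" if alpha_power(signal) > threshold else "go"

# Synthetic one-second recording: a 10 Hz alpha wave plus noise.
t = np.arange(FS) / FS
eeg = 50 * np.sin(2 * np.pi * 10 * t) + np.random.randn(FS)
print(decode(eeg))  # likely "stop"
```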

HMI and Edge Computing

We have now looked at HMI's specific technologies and expectations for its future development. One point to note is that achieving the higher level of HMI that will be in growing demand requires processing large amounts of information in close to real time. When a prompt response is required, this may be difficult to achieve with cloud computing alone. One approach, therefore, is to use edge computing as the situation demands, achieving advanced HMI while maintaining processing speed.
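
The idea of "using edge computing according to the situation" can be sketched as a latency-based dispatcher. The latency budgets, deadlines, and task names below are invented for illustration:

```python
# Sketch of latency-aware task placement: run latency-critical HMI
# work at a nearby edge node, send heavy non-urgent work to the cloud.
# Latencies, deadlines, and task names are invented for illustration.

EDGE_LATENCY_MS = 5      # assumed round trip to a nearby edge node
CLOUD_LATENCY_MS = 120   # assumed round trip to a distant data center

def place_task(name: str, deadline_ms: float) -> str:
    """Choose where to run a task given its response deadline."""
    if deadline_ms < CLOUD_LATENCY_MS:
        return f"{name}: run at edge (~{EDGE_LATENCY_MS} ms)"
    return f"{name}: run in cloud (~{CLOUD_LATENCY_MS} ms)"

# Haptic feedback must respond within milliseconds; retraining a
# speech model overnight has no tight deadline.
print(place_task("haptic feedback loop", deadline_ms=10))
print(place_task("speech model retraining", deadline_ms=3_600_000))
```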
