Technical systems are all the systems that can be programmed and controlled (dishwashers, TVs, quadcopters or wheelchairs, for example). Today, we interact with them through multiple interfaces: we use mice, trackpads and keyboards to operate computers, and our fingers to use mobile phones. However, operating different devices with several disparate controllers is impractical.
Due to recent developments in neurorobotics, it is now possible to control several kinds of everyday electrical gadgets with internal signals from the human body [1, 2, 3]. Being able to do so would enable people to easily work with several devices at the same time and would result in faster and more intuitive interaction with technology.
Moreover, for people who have lost or never had control over the movement of their limbs, or who depend on other people’s help, this technology could be life-changing: a starting point for more independence and autonomy.
Set-Up of a Controlling System
Renouncing these human-machine interfaces (HMI) means that we need new data sources that can be used to generate useful control signals to command a technical system. Because this article discusses only internal body signals for controlling technical systems, we will consider only sensors that can be attached to the body to record internal bio-data.
Figure 1: Fundamental set-up of a controlling system.
As seen in Figure 1, the complete setup for controlling a technical system through brain and muscle signals contains sensors, which record the bio-signals, a system which processes these into computer-readable control signals, and the technical system that we would like to operate. The controlling system needs to be calibrated for each user individually in order to effectively control the technical system [2, 3]. Let us take an example to guide us through the different steps below: imagine a user who has lost control of their arms using a robotic arm to eat food from a plate.
Signals from the brain and the muscles can be collected using various measurement systems [8, 9]. For reasons of practical use in daily life, as well as of spatial and temporal resolution, only three brain-computer interfaces (BCI), listed in Table II, will be considered here:
| Name of BCI | Abbreviation |
| --- | --- |

TABLE II: Brain-Computer Interfaces.
In our example, the user could use any of these BCIs to control the technical system.
Classification of Signals into Commands
After recording and filtering the signals, we need to match certain sequences of data points to specific actions that the technical system should execute. We can imagine two commands for our example: (1) collect food and (2) bring food to the mouth.
Features extracted from these signals (e.g. mean absolute value, variance, waveform length, zero crossings) can be used to train machine learning algorithms to identify all further bio-signals with the same characteristics as the training signal. This is called “supervised learning” and is mostly done using artificial neural networks.
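The four features named above are standard time-domain features and are straightforward to compute. The sketch below, assuming NumPy and a single one-dimensional signal window, shows one common way to compute them; the exact definitions vary slightly across the literature.

```python
import numpy as np

def extract_features(window: np.ndarray) -> dict:
    """Common time-domain features for one window of a bio-signal."""
    mav = np.mean(np.abs(window))             # mean absolute value
    var = np.var(window)                      # variance
    wl = np.sum(np.abs(np.diff(window)))      # waveform length: summed sample-to-sample change
    # zero crossings: count sign changes between consecutive samples
    zc = np.sum(np.signbit(window[:-1]) != np.signbit(window[1:]))
    return {"mav": float(mav), "var": float(var),
            "wl": float(wl), "zc": int(zc)}
```

These feature vectors, computed per window, would then serve as the training inputs for the supervised classifier.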
Figure 2: Data Processing (an MUA is a Multi Unit Array, a way to visualize neural activity).
After distinguishing the recorded bio-signals according to the stimuli that produced them, we can begin to group them. As displayed in Figure 2, data sequences resulting from the same stimulus are matched to the same group, which is linked to one specific command.
Every time the decoder has identified a certain data sequence and has been able to assign it to an existing class, it should send the command linked to this specific class to the technical system. In our example, the decoder will therefore either send the command to collect food, or to raise the arm up to the mouth (or will not send any command).
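This last dispatching step amounts to a lookup from the decoder's class label to a device command, with a deliberate "send nothing" path for unrecognised input. A minimal sketch, with hypothetical class and command names for our robotic-arm example:

```python
# Hypothetical mapping from decoder output classes to device commands.
CLASS_TO_COMMAND = {
    "class_collect": "collect_food",
    "class_raise": "bring_food_to_mouth",
}

def dispatch(decoded_class, send) -> bool:
    """Send the command linked to a recognised class; do nothing otherwise."""
    command = CLASS_TO_COMMAND.get(decoded_class)
    if command is None:
        return False  # unrecognised sequence or rest state: send no command
    send(command)
    return True
```

Keeping the "no command" case explicit matters here: a decoder that always emits some command would turn every misclassification into an unwanted movement of the arm.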
Training and Recalibration
After creating preliminary control signals from the received bio-data, we need to train the user step by step in the use of the whole system in order to achieve a higher level of competence. A system developed for one user will need to be recalibrated for another user.
Extensions and Improvements
The complete controlling system can be made more user-friendly by providing additional feedback to the user: visual, audio or haptic signals according to the state of the technical system [2, 3]. For our imagined system, the visual feedback provided by looking at the robotic arm could be complemented by haptic feedback mimicking the resistance the robotic arm encounters when touching the food.
Moreover, improvements of specific properties of the controlling systems can be attempted:
- Increased sensitivity
- Increased speed of the calculation of the control signal
- Reduction of the unwanted actions of the technical system resulting from misclassification errors
Finally, to prevent actions that would otherwise fail, we can program an external supporting system that improves the chances of a correctly executed action. For example, if a robotic hand is next to a bottle and the user triggers the signal for grasping, the supporting system will interpret the grasp as targeting the bottle and will therefore calculate a direction before starting to grasp it [2, 7].
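One simple form such a supporting system can take is "snapping" the triggered action to the nearest known object. The sketch below is an illustration under assumed conditions, not the architecture of [2, 7]: the object positions, the 2D coordinates and the snap distance are all hypothetical.

```python
import math

# Hypothetical map of known objects to 2D positions (metres).
OBJECTS = {"bottle": (0.30, 0.10), "plate": (0.60, 0.00)}

def assisted_grasp_target(hand_xy, max_snap_distance=0.15):
    """When the user triggers a grasp, pick the nearest known object.

    Returns (name, position) of the nearest object if it lies within
    max_snap_distance of the hand, else None (no safe target to grasp).
    """
    name, pos = min(OBJECTS.items(),
                    key=lambda item: math.dist(hand_xy, item[1]))
    if math.dist(hand_xy, pos) <= max_snap_distance:
        return name, pos
    return None
```

Given the snapped target, the supporting system can then plan the approach direction itself, so the user's noisy trigger signal only has to express the intention to grasp, not the precise motion.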
Controlling technical systems using body signals is a great example of the power of machine-learning-enabled applications. Neural networks allow controlling systems to understand users’ intentions and help disabled people gain more independence.
A lot of work still needs to be done to enhance these solutions to the point that they become compact, modern-looking, efficient, easy to use and adaptable. However, promising research in deep learning enables the design of lighter models that can be embedded on smaller devices while remaining just as accurate [5, 11].
Controlling systems will lead to new social etiquette and will have a great impact on people’s daily and professional lives. Everyone will soon need to decide for themselves how extensively they would like to use them.
Credits to Neurable Inc. for the cover photography.
[1] H. Szu, “BCI: Smartphone EEG,” 2017. [Online]. Available: https://www.researchgate.net/publication/315064949_BCI_Smartphone_EEG
[2] J. Vogel, S. Haddadin, B. Jarosiewicz, J. D. Simeral, D. Bacher, L. R. Hochberg, J. P. Donoghue, and P. Van Der Smagt, “An assistive decision-and-control architecture for force-sensitive hand-arm systems driven by human-machine interfaces,” The International Journal of Robotics Research, vol. 34, no. 6, pp. 763–780, 2015.
[3] K. LaFleur, K. Cassady, A. Doud, K. Shades, and B. He, “Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain-computer interface,” Journal of Neural Engineering, 2013.
[4] P. Fedele, C. Fedele, and J. Fath, “Braincontrol Basic Communicator: A Brain-Computer Interface Based Communicator for People with Severe Disabilities,” ser. Lecture Notes in Computer Science, 2014, pp. 487–494. [Online]. Available: https://link.springer.com/10.1007/978-3-319-07437-5_46
[5] E. Strickland, “Building Mind-Controlled Gadgets Just Got Easier,” IEEE Spectrum. [Online]. Available: https://spectrum.ieee.org/biomedical/devices/building-mindcontrolled-gadgets-just-got-easier
[6] J. Dethier, P. Nuyujukian, S. I. Ryu, K. V. Shenoy, and K. Boahen, “Design and validation of a real-time spiking-neural-network decoder for brain-machine interfaces,” Journal of Neural Engineering, vol. 10, no. 3, 2013.
[7] T. Carlson and J. del R. Millán, “Brain-controlled wheelchairs: A robotic architecture,” IEEE Robotics and Automation Magazine, vol. 20, no. 1, pp. 65–73, 2013.
[8] M. Baum, “Monkey Uses Brain Power to Feed Itself With Robotic Arm,” Pitt Chronicle, Sep. 2008.
[9] A. Sarasola-Sanz, N. Irastorza-Landa, F. Shiman, E. Lopez-Larraz, M. Spuler, N. Birbaumer, and A. Ramos-Murguialday, “EMG-based multi-joint kinematics decoding for robot-aided rehabilitation therapies,” IEEE International Conference on Rehabilitation Robotics, pp. 229–234, 2015.
[10] A. Balbinot, A. Júnior, and G. W. Favieiro, “Decoding Arm Movements by Myoelectric Signal and Artificial Neural Networks,” Intelligent Control and Automation, vol. 2013, no. 4, pp. 87–93, 2013. [Online]. Available: http://dx.doi.org/10.4236/ica.2013.41012
[11] S. Y. Rojahn, “Samsung Demos a Tablet Controlled by Your Brain,” MIT Technology Review, 2013. [Online]. Available: https://www.technologyreview.com/s/513861/samsung-demos-a-tablet-controlled-by-your-brain/