Source – https://www.analyticsinsight.net/
A discovery by researchers reveals how AI can now read and interpret our personal preferences
Artificial Intelligence has been disrupting many industries, business processes, and our lifestyles. With artificial intelligence technology, it is now possible to augment human intelligence and apply it to decision-making and customer interactions. The ongoing digital transformation has brought many cutting-edge technologies to the mainstream and underscored the significance of AI and Big Data in revolutionizing industries. The role of artificial intelligence in business has proved to positively redefine operations and encourage cost-efficiency.
But there are still areas connected to AI that researchers are studying to push the simulation of human intelligence far enough to enable sentiment analysis. Researchers at the University of Helsinki and the University of Copenhagen have now made an interesting discovery: AI can read brainwaves to understand and define subjective notions. In a paper published by these universities, they show that AI can interpret data generated from a brain-computer interface to build facial images that appeal to, or attract, different individuals.
A brain-computer interface (BCI), also known as a brain-machine interface, is a communication system that connects the brain with an external machine or device. A brain-computer interface is capable of measuring activity in the central nervous system (CNS). This measured brain activity is converted into electronic and software signals that AI can interpret.
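The conversion step described above can be sketched in miniature: raw multichannel brain recordings are reduced to numeric features that downstream machine-learning code can consume. The snippet below is an illustrative sketch, not the study's pipeline; the band choice, sampling rate, and simulated data are assumptions for demonstration.

```python
import numpy as np

def bandpower_features(epochs, fs=256):
    """Mean spectral power per channel in a crude 8-13 Hz (alpha) band.

    epochs: array of shape (n_trials, n_channels, n_samples)
    Returns an array of shape (n_trials, n_channels).
    """
    n_samples = epochs.shape[-1]
    # Frequency axis for a real-input FFT of each epoch
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    # Power spectrum along the time axis
    spectrum = np.abs(np.fft.rfft(epochs, axis=-1)) ** 2
    # Average power over the alpha-band bins only
    alpha = (freqs >= 8) & (freqs <= 13)
    return spectrum[..., alpha].mean(axis=-1)

# Simulated recording: 10 trials, 4 channels, 1 second at 256 Hz
rng = np.random.default_rng(0)
epochs = rng.standard_normal((10, 4, 256))
features = bandpower_features(epochs)
print(features.shape)  # (10, 4)
```

A real BCI would feed such per-trial features into a classifier that labels each stimulus (here, each face) as appealing or not.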
Electroencephalography (EEG) and electromyography (EMG) are already in use by doctors to understand the neural activities of our brain and muscles, respectively.
BCI is extensively used in the healthcare and medical fields to treat broken neural connections between our brain and other body parts.
Interestingly, this technique gives a literal twist to the old proverb 'beauty is in the eye of the beholder': beauty is in fact inside our brains, and machines powered by a wide range of AI applications can now interpret it.
But jokes apart, this study opens up new avenues for artificial intelligence, machine learning, and data analytics. According to a Daily Mail report, "The team strapped 30 volunteers to an electroencephalography (EEG) monitor that tracks brain waves, then showed them images of 'fake' faces generated from 200,000 real images of celebrities stitched together in different ways."
The machine learning model, a generative adversarial network (GAN), was trained on each individual's facial preferences so that it could generate new faces tailored to that person's brainwave responses.
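One simple way to picture the generation step: if each shown face corresponds to a latent vector in the GAN's input space, and brain responses label faces as liked or not, a preference direction can be estimated and new latents nudged along it. This is a hypothetical sketch under those assumptions, not the paper's method; the labels here are simulated and a real GAN generator is assumed to exist elsewhere.

```python
import numpy as np

rng = np.random.default_rng(42)
latents = rng.standard_normal((200, 64))   # latent vectors of the shown faces
liked = rng.random(200) < 0.3              # per-face labels decoded from EEG (simulated)

# Preference direction: difference between class means in latent space,
# normalized to unit length
direction = latents[liked].mean(axis=0) - latents[~liked].mean(axis=0)
direction /= np.linalg.norm(direction)

def personalize(z, strength=1.5):
    """Shift a latent vector along the estimated preference direction."""
    return z + strength * direction

# A new latent, nudged toward this viewer's inferred taste,
# would then be fed to the GAN generator to render a face
new_face_latent = personalize(rng.standard_normal(64))
print(new_face_latent.shape)  # (64,)
```

The difference-of-means direction is the crudest possible preference model; the actual study used richer modeling, but the geometry of "move the latent toward what this brain liked" is the core idea.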
A report by Technology Networks revealed that, to test the validity of their modeling, the researchers generated new portraits for each participant, predicting that the participants would personally find them attractive. The researchers then tested the portraits in a double-blind procedure against matched controls and found that the new images matched the subjects' preferences with over 80% accuracy.
Connecting artificial neural networks to our brains can now produce results based on our personal preferences through a non-verbal communication process. This development is new: until now, neural networks and BCIs could only establish patterns of activity, not peek into our personal choices.
If it is possible to understand something this unique and personal, AI is not far from augmenting and understanding the human brain to a far greater extent. However, such an intrusion of artificial intelligence and technology into the internal workings of our brains will raise concerns about privacy and ethics. This new development will enable the understanding of individual, subjective biases that are internalized deep in our brains. These innovations and developments in the field of AI will also help AI companies expand their business avenues and services.