The brain has long fascinated researchers in artificial intelligence (AI). With billions of neurons working together to let us think, see, hear, and remember, scientists have been keen to unlock the secrets of the mind.
And now, it seems that AI is on the cusp of doing just that. Take the case of Ann Johnson, a woman from Saskatchewan who suffered a brain-stem stroke at the age of 30, leaving her unable to speak. As part of a clinical trial in California, Ann had a grid of more than 200 electrodes implanted over the region of her brain responsible for speech. The electrodes were connected to a computer running an AI algorithm that used deep-learning techniques to interpret her neural activity.
The results were astounding. Ann was able to communicate clearly with her husband through an avatar that spoke as she thought. The algorithm could turn her neural activity into speech because it had been trained on a dataset of sentences paired with the brain signals they produced. Ann would repeat these sentences in her mind, allowing the AI to learn which brain signal was associated with each sound. Once trained, the algorithm could translate Ann’s thoughts into speech in real time.
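To give a feel for the kind of training loop involved, here is a minimal sketch: windows of multi-channel neural recordings are paired with sound (phoneme) labels, and a small recurrent network learns to classify them. The dimensions, the architecture, and the synthetic data are all assumptions for illustration; the decoder used in the actual trial is far more sophisticated.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: ~250 electrode channels, 39 distinct sounds,
# short windows of neural activity (all placeholders, not the trial's values).
N_CHANNELS, N_SOUNDS, WINDOW = 250, 39, 20

class SpeechDecoder(nn.Module):
    """Toy stand-in for a deep-learning speech decoder: maps a window of
    multi-channel neural activity to a distribution over sounds."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(N_CHANNELS, 128, batch_first=True)
        self.head = nn.Linear(128, N_SOUNDS)

    def forward(self, x):        # x: (batch, time, channels)
        _, h = self.rnn(x)       # h: (1, batch, 128) final hidden state
        return self.head(h[-1])  # logits over the 39 sounds

# Synthetic stand-in for the training data: recorded windows labelled with
# the sound being attempted, playing the role of the repeated sentences.
signals = torch.randn(512, WINDOW, N_CHANNELS)
labels = torch.randint(0, N_SOUNDS, (512,))

model = SpeechDecoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(signals), labels)  # learn signal-to-sound mapping
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.3f}")
```

Once a model like this is trained, decoded sounds can be strung together into words and sentences and fed to a speech synthesizer, which is how real-time communication through an avatar becomes possible.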
“While this AI can currently process about 78 words a minute and has 39 distinctive sounds to form words and sentences, it is not ready for widespread use,” explains Yalda Mohsenzadeh, a computer science professor at Western University. “Each person’s brain activity is unique, and it varies throughout the day. Additionally, the current method requires invasive procedures as the electrodes must be implanted directly into the patient’s head.”
However, progress is being made in overcoming these limitations. Mohsenzadeh’s team has been using wearable electroencephalography (EEG) sensors on participants’ scalps to record brain activity while showing them videos or images. By decoding the neural dynamics, they have been able to determine what a person was looking at and to gain insight into the brain processes behind visual perception.
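A minimal sketch of what such decoding can look like, under assumed data: each trial is a multi-channel scalp recording labelled with the category of image shown, and a simple linear classifier is scored on how well it separates the categories. The channel counts, categories, and random data below are placeholders, not the lab’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical setup: 64 scalp channels sampled over a short post-stimulus
# window, with each trial labelled by the image category the person saw.
rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 400, 64, 100
recordings = rng.standard_normal((n_trials, n_channels, n_times))
categories = rng.integers(0, 4, n_trials)  # e.g. face/object/scene/animal

# Flatten each trial into one feature vector and cross-validate a linear
# classifier; on real recordings, above-chance accuracy is the evidence
# that the viewed content can be "read out" of the brain signal.
X = recordings.reshape(n_trials, -1)
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, categories, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = 0.25)")
```

On the random data above the score hovers around chance; the point of the sketch is the procedure, which is the same one that yields above-chance decoding when applied to genuine recordings.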
“While the data collected through scalp sensors is noisier than that obtained from implanted electrodes, it brings us closer to understanding and translating the brain’s activity,” Mohsenzadeh says. “This technology could give people with severe paralysis, stroke damage, or other conditions that affect speech a means to communicate.”
The potential applications of this technology are vast, extending well beyond medicine. Mohsenzadeh envisions a future in which we interact with computers and the internet using only our thoughts. Rather than typing a question into a search engine, we could simply think of the query, send it wirelessly to the cloud, and have the answer delivered directly back to our brains.
The technology also raises challenges that must be addressed. Mohsenzadeh highlights the importance of privacy, data security, and consent when reading brain signals. “Similar to the ethical considerations we have with social media today, we must ensure that the wrong people don’t have access to our thoughts.”
The field of AI and deep learning is evolving rapidly, with new algorithms, methods, and techniques constantly being developed. While controlling computers and vehicles with our thoughts remains theoretical, it is not far-fetched: it would chiefly require integrating sensor technology with the AI algorithms that already exist for translation and autonomous driving.
In conclusion, the ability of AI to read our minds is not science fiction. “It could very likely happen in the next decade,” says Mohsenzadeh. With advancements in sensor technology and AI techniques, the possibilities are vast. However, it is essential to consider the ethical implications and ensure that privacy and data security are upheld. The future is exciting, but we must proceed with caution as we delve into the realm of mind-reading machines.