IBM Research has revealed 5 technology innovations that will help change lives within the next 5 years.
IBM’s list comprises five scientific innovations with the potential to change the way people work, live, and interact over the next five years.
1. With AI, words will open a window into our mental health
2. Hyperimaging and AI will give us superhero vision
3. Macroscopes will help us understand Earth’s complexity in infinite detail
4. Medical labs on a chip will serve as health detectives for tracing disease at the nanoscale
5. Smart sensors will detect environmental pollution at the speed of light
AI and mental health
In five years, what we say and write will be indicators of our mental health. Patterns in our speech and writing, analyzed by cognitive systems, will give clinicians tell-tale signs of early-stage mental and neurological conditions and prompt us to seek treatment.
Brain disorders, including developmental, psychiatric and neurodegenerative diseases, represent an enormous disease burden, in terms of human suffering and economic cost. The global cost of mental health conditions is projected to surge to US$6 trillion by 2030.
IBM scientists are using transcripts and audio from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech that help clinicians accurately predict and monitor psychosis, schizophrenia, mania and depression. Today, about 300 words are enough to help clinicians predict the probability of psychosis in a patient.
In the future, similar techniques could be used to help patients with Parkinson’s, Alzheimer’s, Huntington’s disease, PTSD and even neurodevelopmental conditions such as autism and ADHD.
Cognitive computers can analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation.
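As a rough illustration of the kind of linguistic signals such a system might start from, the sketch below computes a few simple statistics (word count, average sentence length, lexical diversity) from a transcript. These toy features are stand-ins for the richer meaning, syntax and intonation cues described above; the article does not disclose IBM's actual features or models.

```python
# Toy linguistic-feature extraction from a speech transcript.
# The features here are illustrative proxies, not IBM's real pipeline.
import re

def speech_features(transcript: str) -> dict:
    """Compute simple statistics a text classifier could build on."""
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    return {
        "word_count": len(words),
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "lexical_diversity": len(set(words)) / max(len(words), 1),
    }

sample = "I went to the store. The store was closed. I went home."
print(speech_features(sample))
```

In a real system, features like these would feed a trained classifier rather than being read off directly; the point here is only that transcripts reduce to numbers a model can learn from.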
Hyperimaging and AI
In five years, new imaging devices using hyperimaging technology and AI will help us “see” broadly beyond the domain of visible light, combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be hidden from view. These devices will be portable, affordable and accessible, giving us the ability to perceive or see through objects and opaque environmental conditions anytime, anywhere, so superhero vision can be part of our everyday experiences.
A view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. Using millimeter wave imaging, a camera and other sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is some object up ahead and its distance and size.
Cognitive computing technologies will reason about this data and recognize what might be a tipped-over garbage can versus a deer crossing the road, or a pothole that could result in a flat tire.
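As a toy sketch of how evidence from several spectral bands might be fused into a single hazard call, the example below combines hypothetical visible-light, millimeter-wave, and surface-temperature readings with simple rules. The band names, values, and thresholds are invented for illustration, not taken from any real hyperimaging device.

```python
# Rule-based fusion of multi-band readings into a road-surface label.
# All thresholds below are hypothetical, for illustration only.

def classify_road_surface(visible_reflectance, mmwave_return, surface_temp_c):
    """Flag likely black ice: it looks like a wet road in visible light,
    but the millimeter-wave return and temperature tell a different story."""
    looks_wet = visible_reflectance > 0.6   # shiny surface in the visible band
    smooth_return = mmwave_return > 0.8     # strong, mirror-like mm-wave echo
    freezing = surface_temp_c <= 0.0
    if looks_wet and smooth_return and freezing:
        return "black ice"
    if looks_wet:
        return "wet road"
    return "dry road"

print(classify_road_surface(0.7, 0.9, -2.0))  # → black ice
```

A production system would replace these hand-set rules with a trained model, but the structure — separate bands contributing corroborating evidence — is the same idea the text describes.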
Embedded in our phones, these same technologies could take images of our food to show its nutritional value or whether it’s safe to eat. A hyperimage of a pharmaceutical drug or a bank check could tell us what’s fraudulent and what’s not. What was once beyond human perception will come into view.
IBM scientists are building a compact hyperimaging platform that “sees” across separate portions of the electromagnetic spectrum in one platform to enable a host of practical and affordable devices and applications.
Macroscopes
In five years, we will use machine learning algorithms and software to organize the information about the physical world to help bring the vast and complex data gathered by billions of devices within the range of our vision and understanding.
We call this a “macroscope”: unlike the microscope, which sees the very small, or the telescope, which sees the very far away, it is a system of software and algorithms that brings all of Earth’s complex data together and analyzes it for meaning.
By aggregating, organizing and analyzing data on climate, soil conditions, water levels and their relationship to irrigation practices, for example, a new generation of farmers will have insights that help them determine the right crop choices, where to plant them and how to produce optimal yields while conserving precious water supplies.
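The aggregation step described above can be sketched in miniature: pool sensor readings per field, then rank crop choices against their water needs. The field names, crops, readings, and water-need values below are invented purely for illustration.

```python
# Miniature "macroscope": aggregate heterogeneous field readings,
# then recommend a crop per field. All data here is illustrative.
from collections import defaultdict

readings = [
    {"field": "north", "soil_moisture": 0.32, "rain_mm": 4.0},
    {"field": "north", "soil_moisture": 0.28, "rain_mm": 0.0},
    {"field": "south", "soil_moisture": 0.12, "rain_mm": 1.0},
]

# Hypothetical minimum soil moisture each crop needs.
crop_water_needs = {"rice": 0.30, "wheat": 0.20, "sorghum": 0.10}

# Aggregate: collect soil-moisture readings per field.
totals = defaultdict(list)
for r in readings:
    totals[r["field"]].append(r["soil_moisture"])

# Analyze: recommend the thirstiest crop each field can still support,
# so drier fields get drought-tolerant crops and water is conserved.
recommendations = {}
for field, vals in totals.items():
    avg = sum(vals) / len(vals)
    viable = [c for c, need in crop_water_needs.items() if need <= avg]
    recommendations[field] = max(viable, key=crop_water_needs.get) if viable else None

print(recommendations)  # → {'north': 'rice', 'south': 'sorghum'}
```

The real macroscope would ingest climate, soil, and water data at global scale, but the shape is the same: aggregate, organize, analyze for a decision.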
Medical labs on a chip
In the next five years, new medical labs “on a chip” will serve as nanotechnology health detectives – tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.
The lab-on-a-chip technology could ultimately be packaged in a convenient handheld device, letting people quickly and regularly measure the presence of biomarkers found in small amounts of bodily fluids and stream that information securely to the cloud from the convenience of home. There it could be combined with real-time health data from other IoT-enabled devices, like sleep monitors and smart watches, and analyzed by AI systems for insights. Taken together, this data will give us an in-depth view of our health and alert us to the first signs of trouble, helping to stop disease before it progresses.
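In its simplest form, the “alert at the first signs of trouble” step might look like the sketch below: combine a chip’s biomarker reading with wearable data and flag only when several weak signals line up. The biomarker units, thresholds, and the two-signal rule are all hypothetical.

```python
# Toy early-warning rule combining a lab-on-a-chip biomarker reading
# with wearable data. Units and cut-offs are hypothetical.

def health_alert(biomarker_ng_ml, resting_hr, sleep_hours):
    """Return True when multiple early-warning signals co-occur."""
    signals = [
        biomarker_ng_ml > 5.0,   # elevated (hypothetical) biomarker level
        resting_hr > 85,         # elevated resting heart rate from a watch
        sleep_hours < 5.0,       # persistently short sleep from a monitor
    ]
    # Require corroboration across devices, not one noisy reading.
    return sum(signals) >= 2

print(health_alert(6.2, 90, 7.5))  # two signals co-occur → True
print(health_alert(6.2, 70, 7.5))  # only one signal → False
```

Requiring agreement between independent data sources is what makes combining chip data with other IoT devices more useful than any single measurement alone.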
IBM Research scientists are developing lab-on-a-chip nanotechnology that can separate and isolate bioparticles down to 20 nanometers in diameter, a scale that gives access to DNA, viruses, and exosomes. These particles could be analyzed to potentially reveal the presence of disease even before we have symptoms.
Smart sensors
In five years, sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real time.
Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events.
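A minimal sketch of such continuous monitoring: each sensor keeps a rolling baseline of recent methane readings and flags a sudden rise above it. The sensor ID, ppm values, window size, and threshold below are illustrative, not parameters of any real deployment.

```python
# Toy continuous leak monitoring: flag readings that jump well above
# a rolling baseline. All values here are illustrative.
from collections import deque

class MethaneSensor:
    def __init__(self, sensor_id, window=5, threshold_ppm=2.0):
        self.sensor_id = sensor_id
        self.baseline = deque(maxlen=window)  # recent normal readings
        self.threshold_ppm = threshold_ppm

    def read(self, ppm):
        """Return True if this reading suggests a leak."""
        if len(self.baseline) == self.baseline.maxlen:
            avg = sum(self.baseline) / len(self.baseline)
            if ppm - avg > self.threshold_ppm:
                return True  # sharp rise over baseline: possible leak
        self.baseline.append(ppm)
        return False

s = MethaneSensor("well-7")
for ppm in [1.9, 2.0, 2.1, 1.9, 2.0]:   # fill the baseline window
    s.read(ppm)
print(s.read(6.5))  # large jump over the ~2.0 ppm baseline → True
```

Because each sensor streams to the cloud continuously, a rule like this can surface a leak within minutes of the first anomalous reading, rather than waiting for a periodic manual inspection.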
IBM scientists are pursuing this vision with natural gas producers such as Southwestern Energy, exploring the development of an intelligent methane monitoring system as part of the ARPA-E Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program.