IBM has been placing huge bets on artificial intelligence (AI) over the years, and as part of its annual “IBM 5 in 5” predictions it has revealed how IBM scientists are unravelling mysteries locked inside our brain, using speech as a key forged with AI.
Mental health is of paramount importance for leading a normal life, but brain disorders, including developmental, psychiatric and neurodegenerative diseases, represent an enormous disease burden in terms of both human suffering and economic cost. Over the last few years, mental health issues have been increasing across the globe, including in the US, and diseases such as depression, bipolar disorder and schizophrenia are affecting the lives of millions of people around the world.
In the US alone, one in five adults suffers from mental health issues, and roughly half of individuals with severe psychiatric disorders do not receive any form of treatment. Studies estimate that the global cost of mental health conditions will surge to US$6 trillion by 2030.
IBM says that speech will enable us to unlock the mysteries of our brain: in a matter of just five years, the things we say and write will be used as indicators of our mental health and physical wellbeing. By analyzing patterns in our speech and writing, IBM says, new cognitive systems will provide tell-tale signs of early-stage developmental disorders, mental illness and degenerative neurological diseases, helping doctors and patients better predict, monitor and track these conditions.
Scientists at IBM are using transcripts and audio inputs from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech that help clinicians accurately predict and monitor psychosis, schizophrenia, mania and depression. Today, it takes only about 300 words to help clinicians predict the probability of psychosis in a patient.
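To give a flavour of what "finding patterns in speech" can mean, the toy sketch below computes a few simple linguistic features from a short transcript. This is purely illustrative and is not IBM's actual model: a real system would feed far richer features into trained machine-learning classifiers, but measures like lexical diversity and sentence length are the kind of raw signal such a pipeline might start from.

```python
import re

def extract_features(transcript: str) -> dict:
    """Compute toy linguistic features from an interview transcript.

    Hypothetical illustration only -- feature names and choices are
    assumptions for this sketch, not IBM's method.
    """
    # Split into rough sentences and lowercase word tokens
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", transcript.lower())
    unique_words = set(words)
    return {
        "word_count": len(words),
        # Lexical diversity: vocabulary richness, widely studied in speech analysis
        "type_token_ratio": len(unique_words) / len(words) if words else 0.0,
        # Average sentence length as a crude proxy for syntactic complexity
        "avg_sentence_length": len(words) / len(sentences) if sentences else 0.0,
    }

sample = "I went to the store. The store was closed. I went home."
print(extract_features(sample))
```

In practice, features like these would be computed over the roughly 300 words mentioned above and passed to a trained model that outputs a probability, rather than being interpreted by hand.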
The technology giant is optimistic that in the future, similar techniques could be used to help patients with Parkinson’s, Alzheimer’s, Huntington’s disease, PTSD and even neurodevelopmental conditions such as autism and ADHD.
Cognitive computers can analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation. Combining the results of these measurements with data from wearable devices and imaging systems, collected in a secure network, can paint a more complete picture of the individual for health professionals to better identify, understand and treat the underlying disease.
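The combination step described above can be pictured as merging per-source measurements into a single record a clinician could review. The sketch below assumes hypothetical field names (`speech`, `wearables`, `imaging`); nothing here reflects a real IBM data model, and a production system would of course add encryption, access control and audit logging.

```python
def build_profile(patient_id: str, speech_features: dict,
                  wearable_readings: dict, imaging_notes: str) -> dict:
    """Merge measurements from several sources into one patient record.

    Hypothetical sketch -- field names are assumptions for illustration.
    """
    return {
        "patient_id": patient_id,
        "speech": speech_features,        # e.g. lexical diversity, intonation stats
        "wearables": wearable_readings,   # e.g. sleep hours, step count
        "imaging": imaging_notes,         # e.g. a radiology summary
    }

profile = build_profile(
    "anon-001",
    {"type_token_ratio": 0.67},
    {"sleep_hours": 6.5, "steps": 4200},
    "no abnormality noted",
)
print(profile["patient_id"], profile["wearables"]["sleep_hours"])
```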
What were once invisible signs will become clear signals of patients’ likelihood of entering a certain mental state or how well their treatment plan is working, complementing regular clinical visits with daily assessments from the comfort of their homes.