Artificial intelligence can be used to assess whether a person is suffering from post-traumatic stress disorder by analyzing the subject’s voice patterns, noting and processing variations in speech to predict the diagnosis.
The research is useful not only at close quarters; it also offers a potential telemedical approach for assessing patients located in remote areas, away from specialist medical facilities.
The study comes from NYU Langone Health and the NYU School of Medicine, where the researchers used a specially designed computer program to assess the stress levels of veterans by analyzing their voices. The key findings were presented at the conference of the International Speech Communication Association.
Conventionally, post-traumatic stress disorder is diagnosed through clinical interviews or self-assessment. This can prove a lengthy and variable process, which, alongside the telemedical motivation, was part of the reason for training an artificial intelligence.
To develop the technology, the scientists used a statistical and machine learning tool termed a ‘random forest’. This form of artificial intelligence can “learn” to classify individuals based on example data, using decision-making rules together with mathematical models.
The first step in developing the technology involved recording standard long-form diagnostic interviews (using the Clinician-Administered PTSD Scale) with 53 U.S. veterans of the Iraq and Afghanistan campaigns who had been assessed as suffering from different forms of post-traumatic stress disorder. These were compared with interviews with 78 veterans without the condition.
Each recording was fed into voice software, which extracted a total of 40,526 short speech features. These were used to train the artificial intelligence. Once trained, the technology was tested on a new set of subjects, known to the researchers, some of whom had been assessed as having post-traumatic stress disorder. The next aim is to introduce the artificial intelligence into the clinical setting.
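The workflow described above — learning decision rules from labeled speech features, then classifying new subjects — can be sketched with a standard random-forest classifier. This is an illustrative toy only: the feature vectors, labels, and sample sizes below are synthetic stand-ins, not the study’s actual data or pipeline.

```python
# Sketch of a random-forest classifier on synthetic "speech feature"
# vectors. All data here is invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Pretend each interview yields a vector of acoustic features
# (e.g. pitch variation, speech rate); label 1 = PTSD-positive.
n_samples, n_features = 131, 20   # 53 + 78 veterans in the study
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# The forest "learns" decision rules from the training examples...
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# ...and applies them to classify previously unseen subjects.
preds = clf.predict(X_test)
print(len(preds))
```

In practice the hard part is not the classifier but the upstream feature extraction from raw audio, which the voice software in the study performed.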
Commenting on the study, lead scientist Dr. Charles R. Marmar notes: “Our findings suggest that speech characteristics can be used to diagnose this disease, and with further training and confirmation, they can be used in the clinic in the near future.”
The output from the study has been published in the journal Depression and Anxiety, with the research study titled “Speech‐based markers for posttraumatic stress disorder in US veterans.”
North America’s first digital hospital launches second generation Command Centre
Do the words ‘command centre’ make you think of huge rooms with NASA scientists, expertly making sure a Mars rover lands safely on the Red Planet?
What if a command centre could revolutionize the patient experience in one of the busiest hospitals in North America, bringing a new standard of patient-centered, quality healthcare?
Combining artificial intelligence, machine learning, and professional expertise, Humber River Hospital (HRH) in Toronto has launched, in partnership with GE Healthcare Partners (GEHC), what it describes as the world’s first clinical analytic applications of their kind.
Displayed on large-screen monitors at HRH’s 4,500 square-foot Command Centre, these four new applications or analytic ’tiles’ use “standardized early warning systems, predictive analytics, real-time information from multiple digital systems,” alongside the professional expertise of experienced nurses.
Canadian Patient Safety Institute and Canadian Institute for Health Information data shows that 1 in 18 hospital stays in Canada involved at least one harmful event. This addition to the Command Centre means quicker alerts for clinical staff, and better protection for patients with conditions that make them vulnerable to risks of adverse events, or adverse outcomes.
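A standardized early warning system of the kind the tiles draw on reduces a patient’s vital signs to a single risk score and raises an alert when it crosses a threshold. The sketch below is purely illustrative: the thresholds and weights are invented for demonstration and are not a validated clinical scale such as NEWS2, nor HRH’s actual analytics.

```python
# Toy early-warning score: higher means escalate sooner.
# Thresholds below are invented, NOT a real clinical scale.
def early_warning_score(heart_rate, resp_rate, spo2):
    score = 0
    if heart_rate > 110 or heart_rate < 50:
        score += 2
    if resp_rate > 24 or resp_rate < 10:
        score += 2
    if spo2 < 92:
        score += 3
    return score

# A Command Centre tile would flag patients whose score
# crosses an alert threshold, prompting clinical review.
patients = {"bed-12": (118, 26, 90), "bed-14": (76, 16, 98)}
alerts = [bed for bed, vitals in patients.items()
          if early_warning_score(*vitals) >= 4]
print(alerts)
```

The real systems combine such scores with real-time feeds from multiple digital systems, which is what makes a fully digital hospital a natural home for them.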
The Humber River Hospital is the Greater Toronto Area’s (GTA) largest acute care centre, serving a catchment area of more than 850,000 in the city’s northwest. Opened in 2015, it was also North America’s first fully digital hospital.
Just two years later, HRH opened the first generation of its Command Centre, a data-driven ‘mission control’ offering real-time insight on patient flow, via advanced algorithms and predictive analytics. As a result, the hospital has “unlocked” the equivalent of 35 additional beds — and the ability to treat thousands of additional patients.
DX Journal covers the impact of digital transformation (DX) initiatives worldwide across multiple industries.
Canadian startup Deep Genomics uses AI to speed up drug discovery
One of the biggest challenges pharmaceutical companies face is the time taken to discover new drugs, develop them and get them to market. This lengthy process is punctuated with false starts. Startup Deep Genomics uses AI to accelerate the process.
Canadian startup Deep Genomics has been using artificial intelligence as a mechanism to speed up the drug discovery process, combining digital simulation technology with biological science and automation. The company has built a platform which uses machine learning to delve into the molecular basis of genetic diseases. The platform can analyze potential candidate drugs and identify those which appear most promising for further development by scientists.
The drug development process depends on many factors, such as combining molecules (noting the interactions between hundreds of biological entities) and assessing biomedical data. The data review required at these stages is highly complex. For these reasons, many researchers are seeking algorithms to help extract and analyze the data.
According to MaRS, Deep Genomics is addressing the time consuming element involved in the initial stages of drug discovery. The artificial intelligence system that the company has designed is able to process 69 billion molecules, comparing each one against around one million cellular processes. This type of analysis would have taken a conventional computer (or a team of humans) many years to run the necessary computations.
Within a few months, Deep Genomics’ AI narrowed the billions of combinations down to a shortlist of 1,000 potential drugs. This process is not only faster; it reduces the number of experiments that need to be run, saving on laboratory tests and ensuring that only drugs with a high chance of success progress to the clinical trial stage.
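The shortlisting step — reducing a vast candidate space to the top 1,000 by a predicted score — can be sketched in a few lines. The scoring function and molecule identifiers below are invented placeholders; Deep Genomics’ actual models and data are proprietary.

```python
# Illustrative shortlisting of candidate molecules by predicted score.
import heapq
import random

random.seed(0)

# Stand-in for a vast candidate space: (molecule_id, predicted_score)
# pairs streamed lazily rather than held in memory all at once.
candidates = ((f"mol-{i}", random.random()) for i in range(1_000_000))

# Keep only the top 1,000 by score, mirroring how a shortlist is
# drawn from the full search space for laboratory follow-up.
shortlist = heapq.nlargest(1000, candidates, key=lambda c: c[1])
print(len(shortlist))
```

Using a streaming top-k selection rather than sorting everything is what makes this kind of triage tractable at scale; the expensive part in practice is the scoring model itself.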
This type of system goes some way to addressing the lengthy typical time to market, which stands at around 14 years for a candidate drug; as well as reducing the costs for drug development, which run into the billions of dollars per drug.
Health service partners with Alexa to provide medical support
The U.K. National Health Service (NHS) is to partner with Amazon’s Alexa in order to provide health information. This is being piloted as an alternative to medical advice helplines and to reduce the number of medical appointments.
While the U.K. NHS is much admired around the world as a free-at-the-point-of-use healthcare system, health officials are always keen to find ways to reduce the strain on the system, especially around medical visits, where booking appointments and waiting for sessions with doctors can be lengthy. The average wait for a non-emergency appointment with a general practitioner is around two weeks.
Although a non-emergency medical helpline is active (accessed by dialling 111), plus an online system, health officials are keen to explore other ways by which the U.K. population can access medical services. For this reason, NHS England is partnering with Amazon.
The use of Alexa voice technology not only offers an alternative service for digitally-savvy patients, it provides a potentially easier route for elderly and visually impaired citizens, as well as those who cannot access the Internet through a keyboard, to gain access to health information. This fits in with NHSX, a new initiative from the U.K. Government supporting the NHS Long Term Plan’s aim of making more NHS services available digitally.
As PharmaPhorum reports, Alexa can now answer questions such as “Alexa, how do I treat a migraine?” and “Alexa, what are the symptoms of flu?”
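Behind the scenes, a voice skill of this kind normalises the spoken query and routes it to vetted health content. The toy sketch below stands in for that routing step; the questions, answers, and fallback text are invented examples, not the NHS content Alexa actually serves.

```python
# Toy routing of a spoken health query to a canned answer.
# All content here is an invented placeholder.
HEALTH_ANSWERS = {
    "how do i treat a migraine": (
        "Rest in a quiet, dark room and consider over-the-counter "
        "pain relief; seek medical advice if symptoms persist."),
    "what are the symptoms of flu": (
        "Common symptoms include fever, aches, tiredness and a "
        "dry cough."),
}

def answer(utterance: str) -> str:
    """Normalise the spoken query and look up a matching response."""
    key = utterance.lower().strip().rstrip("?")
    return HEALTH_ANSWERS.get(
        key, "I don't know that one; please contact the 111 service.")

print(answer("What are the symptoms of flu?"))
```

A production skill would use intent and slot matching rather than exact string lookup, and would defer to clinical sources for the answers themselves.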
Outside of the U.K., Amazon is working with several healthcare providers, including digital health coaching companies, in order to launch six new Alexa healthcare ‘skills’. According to Rachel Jiang, head of Alexa Health & Wellness: “Every day developers are inventing with voice to build helpful and convenient experiences for their customers. These new skills are designed to help customers manage a variety of healthcare needs at home simply using voice – whether it’s booking a medical appointment, accessing hospital post-discharge instructions, checking on the status of a prescription delivery and more.”