AI can now detect COVID-19 in lung ultrasound images

March 21, 2024

Media Inquiries

Name: Roberto Molar Candanosa
Email: [email protected]
Office phone: 443-997-0258
Cell phone: 443-938-1944

Artificial intelligence can spot COVID-19 in lung ultrasound images much like facial recognition software can spot a face in a crowd, new research shows.

The findings boost AI-driven medical diagnostics and bring health care professionals closer to being able to quickly diagnose patients with COVID-19 and other pulmonary diseases with algorithms that comb through ultrasound images to identify signs of disease.

The findings, newly published in Communications Medicine, culminate an effort that started early in the pandemic when clinicians needed tools to rapidly assess legions of patients in overwhelmed emergency rooms.


"We developed this automated detection tool to help doctors in emergency settings with high caseloads of patients who need to be diagnosed quickly and accurately, such as in the earlier stages of the pandemic," said senior author Muyinatu Bell, an associate professor in the Department of Electrical and Computer Engineering in the Whiting School of Engineering at Johns Hopkins University. "Potentially, we want to have wireless devices that patients can use at home to monitor progression of COVID-19, too."

The tool also holds potential for developing wearables that track such illnesses as congestive heart failure, which can lead to fluid overload in patients' lungs, not unlike COVID-19, said co-author Tiffany Fong, an assistant professor of emergency medicine at Johns Hopkins Medicine.

"What we are doing here with AI tools is the next big frontier for point of care," Fong said. "An ideal use case would be wearable ultrasound patches that monitor fluid buildup and let patients know when they need a medication adjustment or when they need to see a doctor."

The AI analyzes ultrasound lung images to spot features known as B-lines, which appear as bright, vertical abnormalities and indicate inflammation in patients with pulmonary complications. It combines computer-generated images with real ultrasounds of patients — including some who sought care at Johns Hopkins.

"We had to model the physics of ultrasound and acoustic wave propagation well enough in order to get believable simulated images," Bell said. "Then we had to take it a step further to train our computer models to use these simulated data to reliably interpret real scans from patients with affected lungs."

Early in the pandemic, scientists struggled to use artificial intelligence to assess COVID-19 indicators in lung ultrasound images because of a lack of patient data and because they were only beginning to understand how the disease manifests in the body, Bell said.

Her team developed software that can learn from a mix of real and simulated data and then discern abnormalities in ultrasound scans that indicate a person has contracted COVID-19. The tool is a deep neural network, a type of AI designed to behave like the interconnected neurons that enable the brain to recognize patterns, understand speech, and achieve other complex tasks.
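The article does not publish the network's architecture or training code, but the idea it describes — a neural network learning to flag bright vertical streaks in noisy frames, trained on a mix of synthetic and real data — can be illustrated with a toy sketch. Everything below is invented for illustration (the 16x16 frame size, the two-layer network, and the synthetic "B-line" streak are assumptions, not the team's actual model, which is far larger and trained on genuine ultrasound physics simulations):

```python
import numpy as np

rng = np.random.default_rng(0)

def make_image(has_bline):
    """Toy 16x16 'ultrasound frame': speckle-like noise, plus a bright
    vertical streak (a stand-in for a B-line) in positive cases."""
    img = rng.normal(0.0, 0.3, size=(16, 16))
    if has_bline:
        col = rng.integers(2, 14)  # streak at a random column
        img[:, col] += 2.0
    return img.ravel()

# Mixed training set, loosely analogous to combining simulated and real frames
X = np.stack([make_image(i % 2 == 0) for i in range(400)])
y = (np.arange(400) % 2 == 0).astype(float)

# Tiny two-layer network (256 -> 32 -> 1), trained with full-batch
# gradient descent on binary cross-entropy
W1 = rng.normal(0.0, 0.1, (256, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 1));   b2 = np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

for _ in range(500):
    h = np.tanh(X @ W1 + b1)               # hidden activations
    p = sigmoid(h @ W2 + b2).ravel()       # P(frame contains a B-line)
    dz2 = (p - y)[:, None] / len(y)        # cross-entropy gradient at the output
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = (dz2 @ W2.T) * (1.0 - h**2)       # backprop through tanh
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Evaluate on fresh held-out frames
Xt = np.stack([make_image(i % 2 == 0) for i in range(200)])
yt = (np.arange(200) % 2 == 0).astype(float)
pred = sigmoid(np.tanh(Xt @ W1 + b1) @ W2 + b2).ravel() > 0.5
acc = float((pred == yt).mean())
print(f"held-out accuracy: {acc:.2f}")
```

The sketch captures only the pattern-recognition principle the article describes: the network never sees a rule like "look for bright vertical lines," yet learns to separate streaked frames from noise-only frames from labeled examples.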

"Early in the pandemic, we didn't have enough ultrasound images of COVID-19 patients to develop and test our algorithms, and as a result our deep neural networks never reached peak performance," said first author Lingyi Zhao, who developed the software while a postdoctoral fellow in Bell's lab and is now working at Novateur Research Solutions. "Now, we are proving that with computer-generated datasets we still can achieve a high degree of accuracy in evaluating and detecting these COVID-19 features."

This work was supported by NIH Trailblazer Award Supplement R21EB025621-03S1.

The team's code and data are publicly available.

Source: Johns Hopkins University
