By KIM BELLARD
I feel like I’ve been writing a lot about futures I was pretty worried about, so I’m pleased to have a couple of developments to talk about that help remind me that technology is cool and that healthcare can surely use more of it.
First up is a new AI algorithm called FaceAge, published last week in The Lancet Digital Health by researchers at Mass General Brigham. It uses photographs to estimate biological age – as opposed to chronological age. We all know that different people seem to age at different rates – I mean, honestly, how old is Paul Rudd??? – but until now the link between how people look and their health status has been intuitive at best.
Moreover, the algorithm can help predict survival outcomes for various types of cancer.
The researchers trained the algorithm on almost 59,000 photos from public databases, then tested it on photos of 6,200 cancer patients taken before the start of radiotherapy. On average, FaceAge judged the cancer patients to be about five years older than their chronological age. “We can use artificial intelligence (AI) to estimate a person’s biological age from face pictures, and our study shows that information can be clinically meaningful,” said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham.
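For the technically curious, here is a minimal sketch of the general idea – not the authors’ actual model, whose architecture and training details are in the paper. The recipe: take a pretrained image backbone, swap its classifier for a regression head that predicts age in years, and treat the gap between predicted and chronological age as the signal. The resnet50 backbone and the face_age_gap helper below are my hypothetical illustration in PyTorch, nothing more.

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

device = "cuda" if torch.cuda.is_available() else "cpu"

# ImageNet-pretrained backbone, with the classifier swapped for a
# single-output regression head that predicts age in years.
# NOTE: the head is untrained here; it would need fine-tuning on
# age-labeled face photos before its output meant anything.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 1)
model.to(device).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def predicted_age(face_image):
    """Regress a biological-age estimate from one PIL face image."""
    x = preprocess(face_image).unsqueeze(0).to(device)
    return model(x).item()

def face_age_gap(face_image, chronological_age):
    """Positive gap = the model thinks you look older than your years."""
    return predicted_age(face_image) - chronological_age
```

The clinically interesting number isn’t the raw prediction; it’s that gap – which is exactly why the cancer patients averaging five years “older” matters.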
Curiously, the algorithm doesn’t seem to care whether someone is bald or has grey hair, and may be relying on more subtle cues, such as muscle tone. It is also unclear what difference makeup, lighting, or plastic surgery makes. “So this is something that we are actively investigating and researching,” Dr. Aerts told The Washington Post. “We’re now testing in various datasets [to see] how we can make the algorithm robust against this.”
Moreover, it was trained primarily on white faces, a limitation the researchers acknowledge. “I’d be very worried about whether this tool works equally well for all populations, for example women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women and the like,” Jennifer E. Miller, the co-director of the program for biomedical ethics at Yale University, told The New York Times.
The researchers believe FaceAge can be used to better estimate survival rates for cancer patients. It turns out that when physicians try to gauge a patient’s short-term survival simply by looking, their guesses are about as accurate as a coin flip. Paired with FaceAge’s output, accuracy rises to about 80%.
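To make that jump from coin flip to 80% concrete, here is a toy sketch of the statistical idea: the physician’s eyeball estimate alone carries essentially no signal, but adding a (noisy) FaceAge-style gap as a feature lets even a simple logistic regression do far better. Everything below – the cohort, the noise levels, the variable names – is synthetic assumption on my part, not the study’s methodology.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Hidden truth: how many years "older" than chronological age each
# patient is biologically; in this toy world it partly drives survival.
true_gap = rng.normal(0.0, 5.0, n)
survived = (rng.normal(0.0, 4.0, n) - true_gap) > 0

physician_guess = rng.integers(0, 2, n)           # eyeballing: coin-flip quality
faceage_gap = true_gap + rng.normal(0.0, 2.0, n)  # noisy but informative reading

X = np.column_stack([physician_guess, faceage_gap])
X_tr, X_te, y_tr, y_te = train_test_split(X, survived, random_state=0)

clf = LogisticRegression().fit(X_tr, y_tr)
print("accuracy with FaceAge feature:", accuracy_score(y_te, clf.predict(X_te)))
```

On this made-up cohort the classifier lands in the same general territory as the reported figure, but that’s an artifact of the noise levels I chose, not evidence for the paper’s claim.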
Dr. Aerts says: “This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters—individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy.”
I’m especially thrilled about this because ten years ago I speculated about using selfies and facial recognition AI to determine if we had conditions that were prematurely aging us, or even whether we were just getting sick. It appears the Mass General Brigham researchers agree. “This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age,” said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. “As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual’s aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.”
The researchers acknowledge that much remains to be done before FaceAge is ready for commercial use, and that strong oversight will be needed to ensure, as Dr. Aerts told WaPo, that “these AI technologies are being used in the right way, really only for the benefit of the patients.” As Daniel Belsky, a Columbia University epidemiologist, told The New York Times: “There’s a long way between where we are today and actually using these tools in a clinical setting.”
The second development is even more out there. Let me break down the Caltech News headline: “3D Printing.” OK, you’ve got my attention. “In Vivo.” Color me highly intrigued. “Using Sound.” Mind. Blown.
That’s right. The Caltech team has “developed a method for 3D printing polymers at specific locations deep within living animals.”
Apparently, 3D printing has been done in vivo previously, but using infrared light. “But infrared penetration is very limited. It only reaches right below the skin,” says Wei Gao, professor of medical engineering at Caltech and corresponding author. “Our new technique reaches the deep tissue and can print a variety of materials for a broad range of applications, all while maintaining excellent biocompatibility.”
They call the technique the deep tissue in vivo sound printing (DISP) platform.
“The DISP technology offers a versatile platform for printing a wide range of functional biomaterials, unlocking applications in bioelectronics, drug delivery, tissue engineering, wound sealing, and beyond,” the team stated. “By enabling precise control over material properties and spatial resolution, DISP is ideal for creating functional structures and patterns directly within living tissues.”
The authors concluded: “DISP’s ability to print conductive, drug-loaded, cell-laden, and bioadhesive biomaterials demonstrates its versatility for diverse biomedical applications.”
I’ll spare you the details, which involve, among other things, ultrasound and low-temperature-sensitive liposomes. The key takeaway is this: “We have already shown in a small animal that we can print drug-loaded hydrogels for tumor treatment,” Dr. Gao says. “Our next stage is to try to print in a larger animal model, and hopefully, in the near future, we can evaluate this in humans…In the future, with the help of AI, we would like to be able to autonomously trigger high-precision printing within a moving organ such as a beating heart.”
Dr. Gao also points out that not only can they add bio-ink where desired, but they can also remove it if needed. Minimally invasive surgery seems crude by comparison.
“It’s quite exciting,” Yu Shrike Zhang, a biomedical engineer at Harvard Medical School and Brigham and Women’s Hospital, who was not involved in the research, told IEEE Spectrum. “This work has really expanded the scope of ultrasound-based printing and shown its translational capacity.”
First author Elham Davoodi has high hopes. “It’s quite versatile…It’s a new research direction in the field of bioprinting.”
“Quite exciting” doesn’t do it justice.
In these topsy-turvy days, we must find our solace where we can, and these are the kinds of things that make me hopeful about the future.
Kim is a former emarketing exec at a major Blues plan, editor of the late & lamented Tincture.io, and now regular THCB contributor