And Now for a Fun Future – The Health Care Blog

By Kim Bellard
I feel like I've been writing a lot about futures that worry me, so I'm delighted to have some developments to talk about that help remind us that technology is cool and that health care can surely use more of it.
The first is a new AI algorithm called FaceAge, published last week in The Lancet Digital Health by researchers from Mass General Brigham. What it does is use photographs to determine biological age – as opposed to chronological age. We all know that different people seem to age at different rates – I mean, honestly, how old is Paul Rudd??? – but until now, the link between people's appearance and their health status has been intuitive at best.
Moreover, the algorithm can help predict survival outcomes for various types of cancer.
The researchers trained the algorithm on nearly 59,000 photos from public databases, then tested it against photos of 6,200 cancer patients taken before the start of radiotherapy. The cancer patients appeared, on average, about five years older than their chronological age. “We can use artificial intelligence (AI) to estimate a person's biological age from face photos, and our study shows that information can be clinically meaningful,” said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham.
Curiously, the algorithm doesn't seem to care whether someone is bald or has gray hair, relying instead on more subtle cues, such as muscle tone. It's not known what difference makeup, lighting, or plastic surgery make. “So that's something we are studying and actively looking into,” Dr. Aerts told The Washington Post. “We are now testing in various datasets [to see] how we can make the algorithm robust against that.”
It was also trained mostly on white faces, which the researchers acknowledge as a shortcoming. “I'd be very concerned about whether this tool works equally well for all populations – for example, women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women, and others,” Jennifer E. Miller, co-director of the program for biomedical ethics at Yale University, told The New York Times.
The researchers believe FaceAge can be used to better estimate survival rates for cancer patients. It turns out that when doctors try to assess survival simply by looking at patients, their guess is essentially a coin flip. Combined with FaceAge's insights, accuracy can reach around 80%.
Dr. Aerts says: “This work shows that a photo like a simple selfie contains important information that could help inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters – individuals with FaceAges that are younger than their chronological ages do significantly better after cancer treatment.”
I am particularly delighted by this because, ten years ago, I speculated about using selfies and AI facial recognition to determine whether we had conditions that were prematurely aging us, or even making us sick. It seems the Mass General Brigham researchers agree. “This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age,” said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. “As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory. I hope we can eventually use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives.”
The researchers recognize that much must be accomplished before it can be introduced for commercial purposes, and that strong oversight will be needed to ensure, as Dr. Aerts told WaPo, that “these AI technologies are used in the right way, really only to benefit patients.” As Daniel Belsky, an epidemiologist at Columbia University, told The New York Times: “There is a long way to go between where we are today and actually using these tools in a clinical setting.”
The second development is even cooler. Let me break down the Caltech News headline: “3D printing.” OK, you have my attention. “In vivo.” Color me very intrigued. “Using sound.” Mind blown.
That's right: this team of researchers has “developed a method for 3D printing polymers at specific locations deep within living animals.”
Apparently 3D printing has been done in vivo before, but using infrared light. “But infrared penetration is very limited. It only reaches right below the skin,” says Wei Gao, professor of medical engineering at Caltech and corresponding author. “Our new technique reaches deep tissue and can print a variety of materials for a broad range of applications, all while maintaining excellent biocompatibility.”
They call the technique the deep tissue in vivo sound printing (DISP) platform.
“The DISP technology offers a versatile platform for printing a wide range of functional biomaterials, unlocking applications in bioelectronics, drug delivery, tissue engineering, wound sealing, and beyond,” the team wrote. “By enabling precise control over material properties and spatial resolution, DISP is ideal for creating functional structures and patterns directly within living tissues.”
The authors concluded: “DISP's ability to print conductive, drug-loaded, cell-laden, and bioadhesive biomaterials demonstrates its versatility for diverse biomedical applications.”
I'll spare you the details, which involve, among other things, ultrasound and low-temperature-sensitive liposomes. Here's the key point: “We have already shown in a small animal that we can print drug-loaded hydrogels for tumor treatment,” says Dr. Gao. “Our next step is to try to print in a larger animal model, and hopefully, in the near future, we can evaluate this in humans… In the future, with the help of AI, we would like to be able to trigger high-precision printing within a moving organ such as a beating heart.”
Dr. Gao also points out that not only can they add such bioinks, but they can remove them if necessary. Minimally invasive surgery seems crude by comparison.
“It's quite exciting,” Yu Shrike Zhang, a biomedical engineer at Harvard Medical School and Brigham and Women's Hospital who was not involved in the research, told IEEE Spectrum. “This work has really expanded the scope of ultrasound-based printing and shown its translational capacity.”
First author Elham Davoodi has high hopes: “It's quite versatile… It's a new research direction in the field of bioprinting.”
“Quite exciting” doesn't do it justice.
In these topsy-turvy days, we have to find our comfort where we can, and these are the kinds of things that give me hope for the future.
Kim is a former eMarketing exec at a major Blues plan, editor of Tincture.io, and now a regular THCB contributor.

