AI could transform health care. Can safety-net providers keep up?

North Country Healthcare is the “only game in town” for care delivery in some of the regions it serves, said Dr. Jennifer Cortes, a quality and population health physician at the federally qualified health center. The provider operates 13 primary care clinics and two mobile units, serving 55,000 people across rural northern Arizona.
Some of those communities are very remote, meaning patients can be forced to travel for hours to reach specialty care, and recruiting providers is a challenge, Cortes said.
It’s one area where artificial intelligence could help. Adopting an AI scribe, which typically records provider-patient conversations and drafts clinical documentation, could ease some of clinicians’ administrative work and reduce burnout, she said.
“When ChatGPT first came out, I thought, ‘Oh my gosh, this could make things so much better for those of us who work in this field,’” said Cortes. “I just want my job to not be so hard all the time. It would be incredible if it works.”
But taking on an AI project isn’t easy for a safety-net provider. The technology can be labor-intensive to implement, requiring technical expertise and oversight capabilities that many can’t easily access, according to experts.
And if the health systems with the fewest resources, which often care for the most medically complex patients, are unable to realize AI’s benefits, they could fall even further behind larger or wealthier providers.
“If you look at the kinds of health systems that are actively deploying AI right now, the ones that can afford it are the ones pursuing it most aggressively,” said Brian Anderson, CEO of the Coalition for Health AI, an industry group developing guidelines for responsible AI use in health care. “Those in rural communities, for example, that don’t have the IT staff to deploy and configure different kinds of AI tools aren’t able to do that. That’s an example of the digital divide already being reinforced in the AI space.”
‘A ton of human effort’
Adopting AI products in health systems can require specialized human and technological resources to implement safely, creating significant barriers for cash-strapped providers, according to experts.
“People tend to talk about it or conceptualize it as if you’re flipping a light switch,” said Paige Nong, assistant professor at the University of Minnesota School of Public Health. “It’s actually not that simple. These tools and systems require a ton of human effort.”
Safety-net providers tend to operate on thin margins, given their heavier reliance on Medicaid, a pressing challenge as the insurance program faces federal funding cuts, and higher demands for uncompensated care.
For example, the net margin of community health centers, which provide primary care to underserved populations, was just 1.6% in 2023, according to health policy research firm KFF, down from 4.5% in 2022, driven by inflation and the expiration of pandemic-era funding.
Many community health centers also face workforce concerns, with more than 70% reporting a shortage of primary care physicians, nurses or mental health professionals last year, according to the Commonwealth Fund. Meanwhile, labor costs are a significant expense for many providers.
And implementing AI takes a lot of work to manage. For example, health systems will need to stand up AI governance structures that can evaluate products for safety and effectiveness as well as maintain regulatory and legal compliance. Providers will also have to keep monitoring their AI tools, since the assumptions underlying a model, such as patient characteristics, can change over time, potentially degrading its performance, according to experts.
“It’s obvious when a scalpel rusts, you know you need to replace it or clean it,” said CHAI’s Anderson. “With a lot of these AI tools, we don’t necessarily know that yet. So how health systems can afford to do that kind of monitoring and management of these models over time is a real concern.”
Technological support
Providers will also need IT staff with the technical expertise to manage the work required to adopt AI tools, a particular challenge for cash-strapped and rural facilities that may struggle to attract talent, according to experts.
For example, many safety-net providers don’t have data scientists on staff, said Mark Sendak, population health and data science lead at the Duke Institute for Health Innovation. It probably doesn’t make sense for some of these care settings to employ them either, since such staff can command high salaries without generating revenue from patient care, he added.
Financial challenges can also make it hard to invest in the IT infrastructure needed to adopt AI tools, said Jennifer Stoll, head of external affairs at health IT and consulting nonprofit OCHIN.
“The combination of challenges community health organizations face means many have no choice but to rely on outdated and inefficient technology systems,” she said by email. “Not only are the systems they have access to outdated, but some aren’t even able to integrate AI tools, further exacerbating the technological divide.”
Meanwhile, AI could sit at the bottom of an IT team’s to-do list. At North Country, for example, the Wi-Fi in its clinics doesn’t always work well. And its legacy electronic health record will no longer be supported by its vendor in a few years, so the provider will have to move to a new one.
“We’re even missing the basics,” said Cortes. “Even if you’re not talking about AI, we’re behind.”
Missing out on AI
These constraints are likely already shaping how lower-resourced providers implement AI, according to experts.
For example, a study published earlier this year in Health Affairs found that only 61% of U.S. hospitals that used predictive AI models had evaluated them for accuracy using their own data, and just 44% had evaluated them locally for bias, an important process that helps health systems determine whether a tool will work well for their patient populations.
Hospitals that developed their own predictive models, reported high operating margins or were part of a health system were more likely to locally evaluate their AI products.
“You have resources, you have IT staff, you have data scientists who can design models or evaluate the models from an EHR vendor,” said Nong, one of the study’s authors. “Resources were the critical component needed to be able to conduct an evaluation and also to design tailor-made models.”
Without assistance, providers that lack the funds and technical capacity to implement AI could miss out on the technology’s potential benefits, or adopt AI without the necessary safeguards. That could widen existing disparities between higher- and lower-resourced providers, and their patients, according to experts.
For example, hiring and retaining staff is already a challenge for safety-net providers, Sendak said. When a new residency graduate is looking for a primary care job, would they rather take a position at a health system that can offer an AI documentation assistant, which could help mitigate burnout, or at a clinic that can’t?
In addition, limiting AI adoption to the best-resourced providers could reinforce biases inadvertently built into these tools, said CHAI’s Anderson. If the data used to train algorithms continues to be collected mostly from urban, highly educated communities on the coasts, AI tools will miss the information that would come from other groups.
“I think our job as a society is to make sure we make this as easy as possible, so that we have AI that can serve the individual in rural Appalachia or in a farming community in Kansas just as well as it serves people in San Francisco or Boston,” he said.
Help sought
Still, there are approaches that could help small, low-resource providers adopt AI products, including mentorship and support models that have been used for other emerging technologies, according to experts.
For example, the HITECH Act, enacted in 2009 to promote EHR adoption, included funding for regional extension centers, which provided on-the-ground technical assistance to small primary care practices, community health centers and critical access hospitals.
Similarly, the Health Resources and Services Administration funds telehealth resource centers, a group of 12 regional and two national centers that offer education and resources to providers looking to implement virtual care.
EHR vendors also have a role to play, said Nong. Almost 80% of hospitals in her Health Affairs study used predictive models obtained through their EHR developer, so that could be a high-impact point of intervention for helping providers deploy models safely, she added.
Larger health systems and academic medical centers could also help their smaller, resource-strapped counterparts. The Health AI Partnership, a coalition that includes health systems like Duke Health and Mayo Clinic, runs a practice network that works with safety-net providers to adopt AI best practices with individualized support.
North Country is one of the network’s inaugural participants. The program provides technical support to safety-net organizations, helping them work through AI procurement, evaluation and implementation, said Duke’s Sendak, who sits on the Health AI Partnership’s leadership council.
However, the practice network currently works with just five safety-net providers, while there are hundreds of federally qualified health centers across the country that may need support, he added.
“There’s a big difference between where the whole conversation about AI in health care is and where many of the people who actually deliver health care are in terms of their ability to adopt and use it safely,” said North Country’s Cortes. “When they talk about all these enormous investments in AI, great, it’s exciting, but how are they going to bring us along too?”