AIM Media House

Providence St. Joseph Hospital Lets AI Read Mammograms

Radiologists in Orange County are using FDA-cleared software to assist breast cancer screening, with humans still making the final call

At Providence St. Joseph Hospital in Orange County, radiologists are using AI to assist with routine breast cancer screening. The system analyzes mammograms alongside human readers, highlighting areas that may warrant closer attention. Patients can opt into the AI-assisted review for an additional fee, typically around $50.

The hospital has been using the software for roughly a year. According to clinicians there, the technology is designed to surface subtle findings that can be difficult to detect, particularly in dense breast tissue.

AI Moves Into the Mammography Reading Room

“They have good data showing it finds 20% more cancers and cancers two to three years earlier than without the AI program,” said Dr. Kenneth Meng, a radiologist at Providence St. Joseph Hospital, in an interview with ABC7.

Meng emphasized that the tool does not replace clinical judgment. “It does take the AI and the radiologist together to get the best, most accurate reading,” he said. In some cases, the system flags findings that turn out to be benign. In others, it draws attention to abnormalities that might otherwise be overlooked. “I’ve seen now dozens of cases where it’s made a difference,” Meng told the station.

The software used at Providence was developed by iCAD, whose ProFound AI product is cleared by the U.S. Food and Drug Administration for use with 2D and 3D mammography. The system functions as a concurrent reader, generating visual prompts and risk scores that radiologists review as part of their normal workflow. Providence has not published internal performance audits of its own program, and the hospital has not issued a formal press release detailing the deployment. The available record consists of clinician comments reported through local and trade media.

Even so, the Orange County rollout is part of a broader pattern across U.S. healthcare, where AI tools are beginning to move from research environments into everyday screening workflows.

From Research Models to Routine Screening

Large academic health systems across the United States have begun integrating AI into mammography at scale, often after years of internal validation and regulatory review.

Mount Sinai Health System has publicly reported performing more than 100,000 AI-assisted mammograms as part of routine screening, positioning AI as an adjunct to radiologist interpretation rather than a standalone reader. Mayo Clinic has licensed mammography AI software for use across its system, while Mass General Brigham has established formal governance structures to evaluate and deploy AI tools in radiology, including breast imaging.

Academic centers have also generated much of the evidence underpinning these deployments. Researchers at UCLA Health have published studies showing that AI-assisted screening can reduce interval cancers (those diagnosed between routine mammograms) by identifying high-risk cases earlier.

AI systems tend to increase cancer detection rates while maintaining, or in some cases reducing, false-positive recall rates. Several prospective studies and real-world audits report improvements in workflow efficiency and earlier identification of suspicious findings. What they do not yet show is a clear link to reduced breast cancer mortality. Long-term outcome data will take years to accumulate, and most health systems acknowledge that current evidence supports process improvements rather than definitive survival benefits.

Regulatory progress has helped accelerate adoption. The FDA has cleared multiple mammography AI products, including systems from iCAD, Lunit, ScreenPoint, and Therapixel, giving hospitals a pathway to deploy these tools in clinical practice. In some cases, research models developed by large technology companies have been commercialized through partnerships with established imaging vendors, allowing them to enter regulated healthcare settings.

The way these tools are offered to patients varies. Some hospitals incorporate AI into standard reads without separate billing. Others, like Providence St. Joseph Hospital, present it as an optional add-on paid directly by patients. Insurance coverage is inconsistent, raising questions about access and equity that health systems have yet to resolve publicly.

What unites these programs is a shared operating assumption: AI works best when embedded quietly into existing workflows, augmenting rather than displacing clinicians. Radiologists remain responsible for final interpretations. AI outputs are treated as decision-support signals, not diagnoses.

Providence’s experience fits squarely within that model. The hospital has not claimed that AI eliminates uncertainty or human error. Instead, clinicians describe it as another layer of review, one that can prompt earlier follow-up or additional imaging when warranted.

Key Takeaways

  • Providence St. Joseph Hospital uses FDA-cleared AI to enhance breast cancer screening accuracy.
  • AI reportedly detects 20% more cancers and finds them two to three years earlier, especially in dense breast tissue.
  • Radiologists maintain final diagnostic authority, using AI as an assistive tool rather than a replacement.
  • Patients can opt for AI-assisted mammogram reviews for an additional fee, typically around $50.