Healthcare AI’s Future in Underrepresented Communities Hinges on Data Privacy – Julia Komissarchik, Glendor
Lane F. Cooper, Editorial Director, BizTechReports
The intersection of artificial intelligence (AI) and healthcare holds immense promise, but achieving its full potential requires a careful balance of innovation, privacy, and regional inclusivity.
So says Julia Komissarchik, CEO of Glendor, in a recent BizTechReports vidcast in which she highlighted how her company is tackling some of the most pressing obstacles in the healthcare AI space: training AI models on sensitive medical data, addressing data silos, and ensuring diverse representation across regional healthcare datasets.
“Generative AI has made incredible strides in fields like Natural Language Processing, but applying AI in healthcare is far more complex due to the sensitive nature of the data involved,” Komissarchik explained. “Medical records are not just publicly available bits and bytes—they’re deeply personal, and their value makes them a target for bad actors.”
The Current Landscape of AI in Healthcare
AI, particularly generative AI, has demonstrated its potential in technology, finance, and entertainment. Applications like ChatGPT, for example, were trained on vast amounts of publicly available internet data, enabling advanced conversational capabilities.
However, healthcare presents unique challenges that other industries don’t face. AI models must be trained on highly sensitive data, such as medical records, imaging scans, and lab results. Unlike publicly available internet data, this information is intensely private, often siloed within hospitals and healthcare systems. Furthermore, data in healthcare is not just fragmented but regionally biased, as much of the existing AI training data comes from large, well-funded institutions in urban areas.
This disparity creates a critical challenge. AI models trained on limited or non-representative data can lead to biased algorithms that fail to address the needs of underrepresented populations, such as rural communities or marginalized groups. “If the data doesn’t reflect the real-world diversity of healthcare needs, the AI models will be biased and less effective,” Komissarchik warned.
Adding to the challenge is that healthcare data is a prime target for cybercriminals. On the black market, medical records can fetch as much as $250 each—500 times the value of a stolen Social Security number. The immutability of medical data—unlike credit card information, which can be changed—makes it especially valuable.
Barriers to Data Sharing
While the benefits of AI in healthcare are undeniable, the barriers to data sharing remain significant. Privacy concerns, regulatory restrictions, and the fragmented nature of healthcare systems all contribute to the difficulty of creating robust datasets for AI training.
“Much of the data needed to train AI models is locked away in silos, inaccessible due to privacy concerns or logistical challenges,” Komissarchik explained. “Even when organizations are willing to share data, the data itself is often incomplete or non-representative, further compounding the problem.”
The situation is particularly dire in rural areas and smaller healthcare providers, which often lack the resources to participate in AI initiatives. These organizations hold invaluable data that could improve AI’s accuracy and applicability. Still, they frequently operate under tight financial constraints and are reluctant to engage in data-sharing initiatives due to privacy and security risks.
Komissarchik described the current state of data availability as heavily skewed. “When we talk about healthcare data for AI, much of it comes from a handful of large institutions like Stanford, Harvard, or the Mayo Clinic,” she said. “This coastal and urban bias means the resulting AI models don’t reflect the realities of rural, suburban, or underrepresented communities. It’s not just a technical challenge—it’s an ethical one.”
The Importance of De-Identification
To overcome these barriers, Glendor has developed a solution that centers on de-identification—a process that removes personally identifiable information (PII) from medical records. De-identification ensures that sensitive data can be shared and used for AI training without compromising patient privacy.
“There’s often confusion about the difference between privacy and security,” Komissarchik noted. “Security is about protecting data in place and during transfer—ensuring it’s encrypted and inaccessible to unauthorized parties. Privacy, on the other hand, is about ensuring that the data shared doesn’t expose personal information, thus making the data accessible for research and AI model training.”
De-identification is a complex process. In medical imaging, for example, patient information is often “burned in” to the image itself, as can be seen on X-rays, and is also stored in metadata associated with the image. Other identifiers, such as jewelry or even dental structures visible in X-rays, could inadvertently reveal a patient’s identity. “Each type of data has its own challenges,” Komissarchik explained. “What works for one dataset—like removing burned-in text from an image—might render another dataset useless, such as removing teeth from a dental X-ray.”
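To make the metadata side of this concrete, the following is a minimal, hypothetical sketch of scrubbing identifying tags from a DICOM imaging file using the open-source pydicom library. Neither pydicom nor this tag list is mentioned in the interview; the file names and the handful of tags shown are illustrative assumptions, not a description of Glendor’s actual method, and a production tool would cover far more identifiers.

```python
# Hypothetical sketch: blank common identifying DICOM tags with pydicom.
# The tag list is deliberately partial; real de-identification profiles
# (e.g., the DICOM PS3.15 confidentiality profile) cover many more attributes.
import pydicom

IDENTIFYING_TAGS = [
    "PatientName", "PatientID", "PatientBirthDate",
    "PatientAddress", "ReferringPhysicianName", "InstitutionName",
]

ds = pydicom.dcmread("scan.dcm")  # hypothetical input file

for keyword in IDENTIFYING_TAGS:
    if keyword in ds:  # only touch tags present in this file
        # Blank the value rather than delete the element, so the
        # file keeps a structure that downstream tools expect.
        ds.data_element(keyword).value = ""

ds.remove_private_tags()  # vendor-specific tags often hide identifiers

ds.save_as("scan_deid.dcm")  # hypothetical output file
```

Even a sketch like this touches only the metadata; text burned into the pixels themselves requires separate detection and redaction at the image level, which is why, as Komissarchik notes, a step that is harmless for one dataset can destroy the value of another.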
Hospitals and healthcare providers often rely on manual processes to de-identify data, which are time-consuming, error-prone, and costly. This reliance on human intervention is a significant bottleneck.
Automated De-Identification
After five years of development, Glendor has introduced a software-based de-identification solution that automates the process, eliminating the need for human intervention. The software operates entirely within a healthcare provider’s own infrastructure, whether on-premises or in a private cloud, ensuring that sensitive data never leaves the organization’s control.
“Our solution is not a service—it’s a product,” Komissarchik emphasized. “This distinction is important because it means healthcare providers retain full control over their data before and after it’s processed. There’s no risk of a third party gaining access, and that peace of mind is crucial for providers and patients.”
This approach also addresses reputational concerns. Komissarchik cited cases where hospitals partnered with major tech companies for AI initiatives, only to face backlash from patients worried about their data being shared with third parties—even when those partnerships adhered to applicable regulations.
Monetizing De-Identified Data
In addition to enabling safer data sharing, Glendor’s solution offers a potential financial lifeline for cash-strapped healthcare providers, particularly in rural or underserved areas. By de-identifying their data, these organizations can participate in the growing AI healthcare economy, providing valuable datasets to pharmaceutical companies, AI developers, and research institutions.
“De-identified data is not just a resource for better AI—it’s also a revenue stream,” Komissarchik said. “This creates a win-win scenario where healthcare providers can improve patient outcomes while generating much-needed funding for their operations.”
Bridging the Urban-Rural Divide
One of the most critical aspects of Glendor’s mission is addressing geographic bias in healthcare AI. Komissarchik stressed the importance of including data from rural hospitals, tribal clinics, and smaller healthcare providers to ensure AI models are representative and effective across all populations.
“These organizations often hold data vital for training unbiased AI models,” she said. “If we only train AI on data from large, urban institutions, we risk perpetuating inequities in healthcare access and outcomes.”
By enabling rural and underrepresented healthcare providers to securely share their data, Glendor is working to create a more inclusive AI ecosystem that benefits patients nationwide.
Simplifying Implementation
According to Komissarchik, Glendor PHI Sanitizer can be installed and operational within minutes, minimizing the learning curve for healthcare providers.
“We know that hospitals and clinics have enough on their plates,” she said. “Our goal is to make de-identification as straightforward as possible so they can focus on what matters most—delivering quality care.”