A Revolution in Skin and Code: How Balance Is Changing the Face of Global Healthcare

Artificial intelligence was supposed to make medicine fairer, faster, and smarter. Instead, it learned to see only a narrow range of skin tones. In hospitals from Boston to Nairobi, algorithms that detect disease, measure oxygen, or diagnose skin conditions often fail one simple test: recognising darker skin. But in a quiet lab in Abuja, a small team of engineers and ethicists is rewriting that code, and perhaps the ethics of global healthcare itself. Their project, Balance, is not just a technological breakthrough. It is a moral one: a bid to teach machines what humanity has too often forgotten, that every shade of skin deserves to be seen, ThisDay writes.

In a modest laboratory in Abuja, a quiet revolution is unfolding, one that could redefine how the world’s medical systems see, diagnose, and treat the human body. The project, called Balance, isn’t just an innovation. It is a moral re-engineering of artificial intelligence itself.

Born under the auspices of HUGE Solutions, Balance is building the most comprehensive dermatological dataset focused on underrepresented skin tones, one capable of teaching algorithms to recognise every shade of humanity. The ambition is vast, the stakes profound. In an era where AI diagnoses skin disease, measures oxygen levels, and identifies cancer, those who are unseen by data are unseen by care.

The Problem No One Wanted to See

For decades, medical AI systems have trained almost exclusively on datasets representing lighter skin tones from North America and Europe. The result has been devastating. Studies published in Science Advances and Nature Medicine have confirmed that diagnostic models often misidentify or overlook skin conditions on darker skin, sometimes by margins of 20 to 40 percent.

“It’s not that the technology failed,” says Kelvin Bawa, Co-Founder and CEO of Balance. “It’s that it was never taught to see everyone.” His words carry the weight of both technical truth and social reckoning. A 2024 Stanford HAI report revealed that fewer than five percent of dermatological images used in commercial AI systems depicted patients with richly pigmented skin.

In many parts of Africa, this invisibility isn’t academic. It’s life or death. Pulse oximeters, the devices that measure oxygen saturation, often overestimate oxygen levels in people with darker skin, leading to delayed treatment and poorer outcomes, CNN reported earlier this year. Balance emerged as a response not just to a gap in data, but to a global injustice hiding in plain sight.

The Birth of Balance

The story began when Kelvin Bawa, a Nigerian software engineer working across global AI systems, noticed a troubling pattern. “We called it ‘silent bias,’” he recalls. “The models performed beautifully until the patient wasn’t white.”

Bawa co-founded Balance alongside Maria Raiwe and Gloria Osemeke, both senior leaders at HUGE Solutions. Maria, VP of Product and Co-Founder, had already led the development of breakthrough platforms such as Clamp, a global remittance engine, and NewsOS, an AI-powered publishing framework now used by leading media organisations across Africa. “Balance was different,” she reflects. “It wasn’t just about building products that work. It was about building systems that care.”

Gloria, Co-Founder and Head of Data Ethics, helped define Balance’s ethical architecture, ensuring that every image, consent form, and partnership reflected transparency and respect. “Inclusion isn’t something you add later,” she says. “It has to be written into the foundation.”

Working closely with their colleague Duza Bawa, Co-Founder and Head of Business Development & Partnerships, the founding team expanded Balance’s reach across hospitals, research institutions, and technology firms, building the alliances that would sustain its ambitious data-gathering efforts and global collaborations.

Together, they built Balance’s Ethical Data Trust, a governance model that ensures every layer of the company’s data collection and use adheres to strict ethical standards, aligning with both Europe’s AI Act and Nigeria’s National Health Research Ethics Committee.

Health Equity by Design

What makes Balance stand apart isn’t only the scale of its dataset—though it now includes thousands of catalogued dermatological images from across Africa—but its philosophy.

“Bias isn’t a glitch,” Maria explains. “It’s a design flaw. And we’re redesigning the system from the ground up.”

Balance’s data powers collaborations with hospitals and universities in Nigeria, Kenya, and the United Kingdom, allowing AI developers to validate diagnostic models on diverse skin types for the first time. Each partnership is reciprocal, ensuring that local institutions share in both the knowledge and the benefits created.

The Team Behind the Vision

The Balance team is small, brilliant, and driven by a shared moral conviction.

At the helm is Kelvin Bawa, the visionary CEO whose background in software engineering and AI systems informs the company’s technical architecture and global partnerships. Maria Raiwe, VP of Product, leads the development and design of Balance’s products, ensuring they are scalable, intuitive, and deeply human-centred. Gloria Osemeke, Head of Data Ethics, defines the principles that make the entire ecosystem compliant, transparent, and trusted. Dr. Ikechukwu Henry Michael, Chief Clinical & Technology Officer, bridges clinical medicine and computational science, ensuring that every algorithm and dataset meets the highest standards of medical precision and patient safety.

Dr. Michael, a physician and technologist with a background in computational biology, brings a decade of experience in medical imaging, bioinformatics, and digital diagnostics. His work ensures that every product emerging from Balance is both clinically accurate and technologically sound, reflecting the project’s dual commitment to innovation and care.

Together, they form what Bawa calls “a coalition of conscience,” a team proving that technology can be both precise and principled.

Towards a Fairer Future in AI

Globally, the market for medical AI data is approaching $300 billion, yet most of it rests on foundations that exclude the majority of the world’s population. For Balance, that imbalance is not only unsustainable. It is unjust.

“Our dream,” says Bawa, “is that one day, every dataset starts in Africa, not ends there.”

Their ambition doesn’t stop with dermatology. By 2030, Balance plans to expand its ethical data infrastructure into cardiology, ophthalmology, and digital biometrics—domains where racially skewed data continues to shape outcomes.

A Moment of Reckoning

There is a quiet audacity in what Balance represents: the belief that Africa, long treated as a data source for the world, can now lead the next chapter of ethical innovation. As AI regulation tightens from Brussels to Washington, global technology companies are being forced to confront the inequities embedded in their systems.

Balance, born on a continent often marginalised by technology, offers something rare: a blueprint for fairness that is both practical and profound.

As Gloria says, “For the first time, we have a chance to build medical AI that isn’t just smart, but just.”

And perhaps, at last, the world is ready to see.
