Can AI Make Lending Fairer? The Role of UX in Financial Inclusion
By Fatteh Hamid
Ademola Adepoju has spent years studying how small choices in words and information flow can decide whether people feel respected or excluded. From his days shaping content at Flutterwave to auditing Zest AI’s lending software during his graduate studies, Adepoju has seen firsthand that fairness in digital finance, especially lending, is not just about algorithms. It is about the human experience on the screen.
“At Flutterwave, I learned that trust often comes down to small but deliberate choices,” Adepoju recalled. “We placed key details early in the flow in plain language, giving people the chance to make decisions with full knowledge. We explained identity checks as protection steps rather than hurdles, which made them feel safer. Even when transactions failed, we replaced technical codes with clear explanations, next steps, and realistic timelines.”
Changes like these, he said, often show borrowers that the product is on their side, even when the answer is “no.” By writing for lower literacy and multilingual contexts, using short sentences and familiar terms, and labeling icons clearly, the projects he contributed to built bridges for underserved users often left out of financial systems.
His graduate work auditing Zest AI revealed similar lessons. The audit team examined how the company’s use of traditional and alternative data, feature engineering, and explainability tools shaped people’s ability to understand credit decisions. Adepoju found that Zest AI often relied on technical explanations that made sense to lenders but left applicants uncertain about why they were approved or denied. By contrast, he argued that translating those factors into plain, scenario-based language with clear next steps could help users see what influenced the outcome and how to improve their chances in the future.
“Risk scores alone do not mean much,” he said. “Adding context like when data was last updated or what broad categories were considered turns a bare number into useful information. A simple feedback link or appeal button can change a denial from feeling like a black box to a process that respects the borrower.”
Adepoju believes fairness in lending cannot stop at technical models. “A decision only becomes real when someone encounters it on a screen and has to make sense of it,” he explained. “If an interface hides fees, blurs the difference between prequalification and approval, or delivers a denial without a clear explanation or path forward, it feels arbitrary even if the algorithm is sound.”
For him, inclusive design is not optional. It is central to responsible AI. He warns that bias often creeps in through everyday design choices: unexplained jargon, cultural assumptions, or layouts that do not work for low-bandwidth connections or shared devices. “A loan app built for one country might assume a stable internet connection or a fixed address format that does not fit another,” he said. “Those assumptions lock out the very people fintech claims to serve.”
To balance regulatory requirements with human-centered communication, Adepoju recommends bringing compliance officers and UX writers together early. He suggests using shared glossaries, reusable components, and clear timing for disclosures so borrowers see critical information before committing. Offering choices of language or channel signals respect for different realities.
Adepoju also insists that fintech companies need cross-disciplinary teams. Engineers, designers, ethicists, and community advocates should share responsibility for outcomes. Reviews should include advocates who can flag what feels confusing or unfair, while testing should explore denials and disputes, not just smooth transactions. “Inclusive lending is not something a single team can deliver on its own,” he said. “When borrowers come away understanding what happened and feeling they were treated with clarity and respect, that is what builds trust.”
Looking ahead, Adepoju urges fintech startups to involve underserved communities directly, set measurable goals for readability and understanding, and keep explanations inside the interface rather than hidden in policy documents. He calls for clear paths to human review when automation fails, simple privacy controls, and ongoing accountability through tracking and improving user experiences.
“If startups build this way, AI-driven financial tools will not deepen exclusion,” Adepoju said. “They will bring more people into systems that for too long have kept them out.”
His message is simple but powerful: in the race to automate finance, fairness begins with the words on the screen and the respect they show to every borrower.
*Fatteh Hamid is a multimedia journalist who writes from Lagos, Nigeria.*