JAMB, WAEC and a Generation Left Waiting When Exams Glitch

Over the last eight months, two of Nigeria’s biggest examinations, the Joint Admissions and Matriculation Board’s Unified Tertiary Matriculation Examination and the West African Examinations Council’s West African Senior School Certificate Examination, have stumbled. Server errors skewed UTME scores, while WAEC papers stretched into the night by torchlight, leaving thousands of students in limbo. For some, the confusion proved fatal, casting a harsh light on the human cost when systems meant to shape futures break down. Sunday Ehigiator writes

On a day that should have promised possibilities, 19-year-old Timilehin Opesusi faced an outcome that shattered her world. Residing with her elder sister in Odogunyan, Ikorodu, Lagos State, she logged in to check her JAMB UTME result, anticipating a new chapter in her academic journey and her future. Instead, she was met with a score of 146 out of 400, a steep drop from the performance she recorded a year earlier.

In the hours that followed, overwhelmed by despair, Opesusi ingested rodenticide. She was brought to Kolak Hospital in critical condition, but tragically passed away soon after.

In a cruel twist of fate, a provisional admission offer landed in her Gmail inbox 30 minutes after her passing. The message pierced the silence left by her absence, amplifying the heartbreak of a dream deferred, perhaps irreparably.

Her father, Femi Opesusi, expressed both sorrow and anger. Although JAMB later admitted to a technical glitch affecting nearly 380,000 candidates and promised a re-sit, it was too late for Timilehin.

“I hold JAMB responsible,” her father said, lamenting that no one from the board or government reached out to his family in the wake of her death.

Timilehin’s story is far more than statistics or system failure; it is the human face of institutional collapse. It epitomises the devastating impact of opaque processes, miscommunication, and lack of support structures on the most vulnerable: the youth.

Her death underscores the urgency of examining how systemic failures intersect with fragile mental health, institutional opacity, young people’s careers, and inadequate redress mechanisms.

Human stories: not statistics but futures

Statistics, percentages, and numbers make the headlines. But behind each data point sits a person. Consider Ada (not her real name), a candidate who sat the exam at one of the affected JAMB centres. She spent a year preparing for the UTME after a disappointing first school-leaving performance that had initially kept her home for two years.

She left her small town for a city testing centre, weathered transport delays, questions that seemed oddly formatted, and then logged into an online portal to check a result that left her reeling. For days, she thought she had failed; later, the board announced a resit for her centre. Months of study and two sleepless nights were effectively devalued.

Then there is Emeka, whose WAEC centre in one south-western state ran late and forced students to write by torchlight into the night. He remembered the strain of concentrating through aches and drowsiness, the communal worry as invigilators whispered about missing materials, and later, a result that did not reflect his confident performance in class. His parents had borrowed money for school fees, valedictory fee, transport, and examination fees. They cannot afford another year of uncertainty.

These narratives repeat across regions: students whose revision cycles are disrupted by institutional errors; families who take loans for exam fees and travel; schools whose reputations hinge on their pupils’ performance; and teachers who must explain inexplicable declines in pass rates.

The mental health toll is real. Anxiety, sleeplessness, depression, and panic attacks have been reported with rising frequency among students awaiting results or who must resit exams. The suddenness of a recall or a portal collapse can be traumatic: imagine thinking your life pivot point has passed, only to be told later that the data you relied on was wrong.

Clinical psychologists consulted by media outlets and schools describe the preceding months as a period of “anticipatory stress”: constant dread about score-checking, compounded by community gossip and shame. For some students, the glitch is not only an administrative failure, but also a public humiliation that lingers.

The mental health cascade

The immediate mental health consequences for affected students are obvious: panic and acute stress. But there are also long-term effects that may play out subtly.

Academic disengagement: a student who believes the system is unfair, that effort does not translate into reliable outcomes, may be less likely to invest in further schooling or to reattempt the exams. Over time, that disengagement manifests as lower enrolment, higher dropout rates, and an erosion of trust in the public institutions that provide education.

Increased fear of testing: for students who experience trauma during high-stakes exams, future test situations become trigger points. Teachers and psychologists report that some students carry lasting anxiety into university interviews or postgraduate entrance exams.

Additionally, household economic stress can exacerbate mental strain. When families spend scarce resources on exam preparation and fees, and then face the prospect of repeating the investment due to institutional error, the financial pressure worsens mental health across the household. Adolescents who internalise parental stress can become depressed or withdraw.

Teacher morale and educator burnout: teachers who must explain inexplicable results to parents and students, and who often see the reputational harm reflected back onto their schools, experience demoralisation. Burnout among teachers further weakens the quality of schooling and long-term educational outcomes.

These consequences do not resolve with a “reissue” of a result. They require mental health services, counselling, and deliberate support programmes for students and teachers; resources that are scarce in many parts of Nigeria’s education sector.

The timeline: a short, sharp shock

The year’s examination cycle began like any other, but soon went off-script. In the case of JAMB, the board acknowledged a “systemic error” after the 2025 UTME results provoked widespread complaints.

The Registrar, Professor Ishaq Oloyede, publicly admitted problems that affected hundreds of thousands of candidates and ordered partial resits for affected centres. The scale of the disruption was staggering: JAMB stated that the glitch affected centres across several states and compromised the integrity of results for many candidates.

In the case of WAEC, its troubles arrived later, in the results-release phase. After releasing WASSCE results that showed a dismal pass rate and triggered nationwide outcry, the regional examination body temporarily suspended access to its result portal. It said it had detected technical glitches in post-processing that affected subject displays and grading for many candidates, and then directed candidates to recheck their results after corrections: a “recall” and “reissue” of results that only deepened public anxiety.

Between the JAMB admissions confusion and the WAEC portal meltdown, social media and family WhatsApp groups filled with bewilderment, anger, and fear.

Headlines asked whether the system had failed an entire cohort; op-eds demanded resignations; parents threatened legal action. Yet at the centre of the chaos were young people who had sat timed exams, sometimes under substandard conditions, and then watched their futures fall away.

The commonalities of failure

At first glance, JAMB and WAEC are distinctly different institutions: one is a national admissions and testing agency, while the other is a regional examination council with decades of experience across West Africa. Yet the 2025 debacle revealed several striking similarities.

Both institutions relied heavily on computerised systems for marking, uploading, and releasing results. Where those systems failed, the human consequences were immediate and far-reaching: incorrect grades, inaccessible portals, and widespread uncertainty.

JAMB’s registrar described the problem as systemic and technical, citing glitches that compromised grading integrity in many centres. WAEC’s team attributed the error to post-processing and subject serialisation procedures that went awry. Both are technical explanations that, for students, felt like excuses.

Additionally, reports emerged of students being required to write WAEC examinations late into the night in some locations, often with the aid of torches or candlelight due to power outages, delayed materials, or extended exam schedules that exceeded normal hours.

These harsh conditions were widely blamed for poor performance in some regions and for the eventual need to re-examine or reprocess results in others. That kind of logistical collapse is noteworthy: it compromises fairness, concentration, and the basic premise that a standardised test provides an even playing field.

Another commonality was reactive communication rather than proactive transparency: both bodies moved to explain only after the outrage had begun, rather than preventing the confusion in the first place. JAMB’s public apology, forceful, tearful, and unusual, came only after media pressure.

WAEC’s recall and update of results followed intense criticism and portal crashes. In both cases, the timing and tone of communications made the institutions appear defensive and improvised rather than open and accountable.

Inadequate redress and institutional immunity form another commonality. Perhaps the most infuriating shared feature for the public was not only that errors occurred, but that they seemed to produce few direct consequences for the institutions’ leadership. The boards acknowledged mistakes and attempted to rectify them with limited remedies (resits, reissued results), but, as widely articulated on social platforms and in op-eds, the perception was that the leaders escaped systemic accountability while students bore the cost.

Outrage, memes and court threats

In the era of social media, outrage travels fast. Students took to Twitter, Facebook, and WhatsApp to share screenshots, personal stories, and angry commentary. Hashtags trended, TV shows featured panels, and lawyers promised class-action lawsuits. Newspapers dissected the boards’ statements across front pages; opinion pieces asserted that the mistakes were symptomatic of broader governance failures.

Some commentators drew uncomfortable parallels with previous national institutions that had experienced “server glitches,” suggesting a new lexicon of failure had entered public life. Yet beyond the performative fury lies a real question: what redress do candidates have? The procedural answers (resits, reissued results, or appeals) do not fully compensate for lost time, psychological harm, additional costs, or the reputational damage that can come from an “affected” result.

Few families can afford lawyers; most feel stuck between institutional inertia and a bureaucratic process designed to minimise disruptions rather than to make amends.

Public calls for leadership change have been loud and persistent. Some commentators have asked whether the heads of WAEC and JAMB should be suspended pending an inquiry; others have suggested the establishment of independent oversight panels comprising technologists, educators, and civil society members.

But those options are politically fraught and slow-moving. For families who must now rework their budgets and life plans, speed and transparency, rather than slow bureaucratic reviews, feel like what matters most.

Beyond “technical glitches”

It is tempting to treat the crisis as a story about servers, code, and unlucky timing. But a technical glitch does not appear in a vacuum. Behind the code sits decisions about procurement, staffing, testing protocols, and contingency planning.

Underinvestment in infrastructure and capacity is one factor. Both institutions manage enormous logistical tasks: thousands of centres, millions of candidates, secure paper handling, digital marking systems, and nationwide result dissemination.

When funding, training or systems testing are inadequate, the likelihood of catastrophic failure rises. The recent events expose that an institution’s public reputation, built across decades, can be eroded quickly when investment fails to match operational complexity.

Vendor and software management: many public agencies rely on third-party vendors for digital services. Contracts are often opaque, and oversight of outsourced systems is weak. If coding errors, poor integration, or inadequate stress-testing caused the result glitches, procurement and contract-monitoring failures will need to be part of any inquiry.

Logistics and human factors: holding students late into the night points to deeper logistical breakdowns, including inadequate seat planning, failure to mobilise personnel, electricity insecurity, and poor contingency for transport delays. If invigilators are unpaid, under-trained, or ill-supported, they cannot be expected to manage extended examination sessions competently. Were they even paid for overtime?

Accountability culture is another huge deficit. The lack of swift, meaningful consequences risks signalling tolerance for failure. Institutional cultures that protect leadership or normalise cursory explanations will not incentivise rigorous systems improvement.

Educational standards and credibility at stake

The aggregate effect of repeated examination failures is not just reputational: it affects the migration of talent, the credibility of certificates, and the confidence of universities that rely on these instruments for selection.

University admissions distortions: when a significant subset of candidates is re-tested or when results are reissued after initial publication, universities face chaotic admissions windows. Admission officers must decide whether to wait for corrected results, what to do about already-offered slots, and how to handle appeals; all of which can slow the intake process and disadvantage students who are left in limbo.

Employment and credential questions: employers and foreign institutions use WAEC and UTME results as baseline credentials. When stories of “glitched grades” become public, external institutions may start adding verification steps, which can raise costs and delays for graduates seeking jobs or further study abroad. Over time, the social trust that certificates carry erodes, forcing additional bureaucratic checks that harm ordinary candidates.

A growing cohort of “anyhow” graduates: opinion pieces in the past have warned about a rise in what critics call “anyhow doctors” and degree inflation, essentially people who hold certificates but not the competencies those certificates are supposed to represent. Systemic exam failures risk accelerating that phenomenon, not because students are less able, but because the measurement instruments are unreliable.

Effective reform outlook

Based on the mix of technical, operational, and human problems exposed in 2025, a credible reform agenda would have multiple strands. It must begin with an independent technical and forensic audit.

An audit team, including software engineers, psychometricians, auditors, and civil-society observers, should review both JAMB’s and WAEC’s systems and publish findings. The audit should identify the root causes (such as software bugs, human error, and procurement failure) and provide a remediation roadmap. This is about learning, not just assigning blame.

Next is the need for robust contingency protocols. Every major public testing agency must have publicly available contingency plans detailing what happens when power fails, materials are delayed, servers crash, or results need to be rescinded. These plans should be stress-tested ahead of exam seasons and simulated under public scrutiny.

Procurement transparency and vendor verification: Any third-party vendor responsible for exam-critical systems must be subject to stronger vetting, open contracts, and performance-based penalties. Public procurement in this space must be audited.

Next is an independent oversight body. The creation of a national examination oversight commission, with representation from educators, technologists, legal experts, parents’ associations, and students, could provide continuous monitoring of major public testing agencies.

Mental health and student support programmes are another important aspect. The Ministry of Education, schools, and universities should roll out targeted counselling and mental health services to affected cohorts. Financial relief or fee waivers for resits should also be considered for disadvantaged students.

Legal and administrative accountability: A parliamentary or judicial review could ascertain whether institutional negligence occurred and propose statutory changes to prevent recurrence, including possible civil remedies for affected students.

A public communication overhaul is also important. Institutions should adopt more proactive, accessible, and empathetic public communication strategies, including regular press briefings, timely FAQs, and an open-data portal that displays the status of centres and technological checks in real time.

How others manage exam crises

No large-scale testing body is immune to failures. International examples offer lessons. When computerised transcripts were incorrectly issued in other jurisdictions, institutions ordered independent audits, temporarily suspended the affected services, and established ombudsman offices with the power to adjudicate disputes. In some countries, ministry-level interventions forced procurement reform and mandated open-source checks for exam algorithms. The lesson is that recovery requires both technical fixes and a visible chain of remediation steps that the public can follow. Otherwise, messaging is dismissed as “spin” and the crisis lingers.

Epilogue: students waiting in the wings

Schools will reopen. New cohorts will take their seats. Teachers will return to lesson plans. But for those who sat the 2025 cycles, the trauma of uncertainty will not vanish overnight.

For some, whose grades were corrected, relief will be profound. For others who must resit, toil will be redoubled. For yet another group, whose plans were contingent on a single set of scores, the ripple effects could last for years. The 2025 scandal should serve as a wake-up call. Not just to WAEC and JAMB, but to a nation that relies on examinations as key sorting mechanisms for talent and opportunity.

If those mechanisms are no longer trusted, the downstream damage to fairness, social mobility, and national capacity will be severe. The cost of inaction will be measured in lost potential: doctors who do not receive training, engineers who fail to emerge, and young people who lose faith in a system that once promised that effort and merit would be rewarded.

This crisis contains a choice. Nigeria can treat it as a transient scandal, a moment of outrage that will soon be forgotten, or as a turning point to build more resilient, transparent, and humane examination systems that centre on the student, not the institution.

The latter will require time, money, and political will. But if education is truly the foundation for the country’s future, the investment is not optional. It is an imperative. Timilehin’s death is a grim reminder that exam errors are never just “technical glitches”; they can shatter lives. By the time JAMB admitted its mistake and promised a resit, the loss was irreversible.

If Nigeria’s examination bodies continue to operate without accountability, more students risk being failed not by their performance, but by the very systems meant to measure their potential. Timilehin’s story must be the last of its kind.
