The massive pile-up of unspent funds from the 15th Finance Commission exposes tall claims about disaster management efforts
By P. SESH KUMAR
New Delhi, February 15, 2026 — The 16th Finance Commission’s (FC) disaster-management (DM) award, about ₹2.84 lakh crore for 2026–31, looks generous on paper. Yet the Commission’s own backward glance at the 15th FC period reads like a post-mortem of a system that could not digest what it was already served: unspent allocations, delayed guidelines, and “zero release” lines that scream institutional paralysis.
The Mainstream narrative by G. V. Venugopala Sarma flags many of the right symptoms; what’s still missing is a hard, operational diagnosis of why the pipeline keeps choking, and a set of enforceable fixes that change incentives, data discipline, delegation, and accountability — so disaster risk management becomes an everyday governance function, not a ceremonial budget headline.
The headline number is not the problem; the plumbing is. The 16th FC has recommended disaster-related allocations totalling ₹2,83,807 crore for 2026–31, including ₹2,04,401 crore for States and ₹79,406 crore at the national level. In other words, the country is not “under-provided”; it is under-performing.
The Mainstream piece correctly calls out the missing “end point” story: funds released but not finally spent, and money mapped to grand windows (preparedness, mitigation, response, recovery) without the unglamorous disclosure of project lists, physical outputs, and completion status. That omission matters because disaster financing fails in two very Indian ways: money arrives late, and money, when it does arrive, lands in a system that has not built the muscle to convert it into works, assets, and readiness.
The most damaging clue is buried in the Commission’s own implementation review: during the 15th FC period, national allocations existed for multiple purposes, but releases were a fraction of what was envisaged — ₹14,855 crore released for response/relief against an allocation of ₹27,385 crore; ₹819 crore released for recovery/reconstruction against ₹20,539 crore; and several “Nil” release items for specific mitigation priorities. This is not merely “COVID did it” or “guidelines were late.” Those are triggers.
The deeper disease is that India’s disaster-finance architecture has been designed like a rulebook, but operated like an exception-handling desk: cautious approvals, fragmented ownership, and an excessive fear of post-facto scrutiny that pushes administrators to not spend rather than to spend well.
What the Mainstream narrative underplays is the incentive mismatch between the four funding windows. Response-and-relief spending is comparatively easier to justify (and politically urgent). Mitigation and preparedness are where competence must precede crisis — and where files move slowest because benefits are invisible until disaster strikes. The World Bank review prepared for the 16th FC process captures this bluntly in practice: states often struggled to provide project-level breakdowns; data mismatches required repeated validation; and by 31 March 2025, only 12.09% of the total NDMF allocation cited there had been released, an extraordinary statistic for a fund meant to prevent losses rather than mourn them. If prevention money cannot exit the treasury with confidence and speed, “holistic DRM” becomes a slogan, not a system.
A second missing piece is the accountability architecture. The Mainstream note asks, fairly, why the 16th FC is not explicit about whether the 15th FC’s intended outcome framework and mid-term review happened. But the sharper question is: even if they happened, what would have compelled compliance? In our country, outcome frameworks die quietly when they are not welded to (a) release conditions, (b) public dashboards, and (c) audit trails that make non-performance reputationally costly. Absent that triad, reporting becomes ornamental: a PDF somewhere, a workshop somewhere else, and the field remains unchanged.
A third gap is the quiet confusion between “funds” and “functions.” The DM Act already requires ministries and departments to budget for activities in their disaster management plans; this obligation is explicitly stated in Section 49(1), which requires each ministry/department to provide funds in its annual budget for activities and programmes in its DM plan. (The Mainstream piece cites Section 37(2)(a); the budget provision requirement is actually in Section 49, while Section 37 is about plans.) The Commission could have used this as a lever to expose the real story: how much DRM spending occurs outside the NDRF/SDRF/NDMF ecosystem, through roads, housing, irrigation, power, urban development, forests, and health, and whether those ministries are genuinely “mainstreaming” risk reduction or merely relabelling routine works after a calamity.
Then there is the politics of formulae. The Mainstream critique of the Disaster Risk Index methodology changes (such as dropping area and replacing BPL families with per-capita income) goes to a real governance risk: when allocation logic becomes less intuitively tied to vulnerability, legitimacy suffers, and states begin to treat the award as yet another distributive contest rather than a risk-management contract. On Odisha, for instance, the narrative points to a reduction in SDRF allocation despite Odisha’s well-known improvements in cyclone preparedness. Whether one agrees with the conclusion or not, the Commission owed a clearer explanation of such counter-intuitive outcomes, because confidence in the formula shapes compliance with the system.
So what actually fixes this? Not another pious review. Not another committee whose report becomes a bookshelf artefact. The practical repair is to turn disaster financing into a performance-and-delivery pipeline with hard disclosure, time-bound delegation, and consequence management.
Should we not start with the simplest discipline we routinely avoid? Publish a live, standardised project-and-expenditure dashboard for each of the four windows at both national and state levels, showing allocation, release, expenditure, physical progress, geo-tagged assets, and completion certificates, updated quarterly and auditable. If project-level transparency feels “too much,” that is precisely the point: the discomfort is the reform. The World Bank review notes that project-level details were often unavailable and that data mismatches required repeat validation; you cannot fix that with speeches, only with enforced reporting standards and common data structures.
Next, we must separate what has been blurred: response funds are not mitigation funds, and pretending they are cousins creates lazy reporting. The Mainstream piece is right to flag that clubbing NDRF and NDMF for “meaningful analysis” is suspect; they exist for different purposes, and the accounting must prevent mitigation from being cannibalised by relief. The Commission should have mandated reporting that never merges these streams and requires an explicit narrative whenever mitigation allocations remain unreleased while disasters keep recurring.
Then should we not fix the bottleneck that administrators won’t name: approval anxiety? Many mitigation and preparedness releases stall because officers fear later objections (CVC/CAG queries, vigilance heat, media noise), especially when guidelines are late or ambiguous.
The remedy is twofold: issue stable, early guidelines at the start of the award period (not mid-way), and create a pre-audit-style “design clearance” for mitigation projects above a threshold, so that technical adequacy and eligibility are certified upfront and honest execution is protected. This is not about weakening scrutiny; it is about shifting scrutiny to the design stage, where it prevents waste rather than punishes action.
Fourth, we should build absorption capacity where it actually breaks: districts and urban local bodies. Disaster spending collapses at the last mile because engineers, procurement staff, and project managers are thin, rotated frequently, and trapped in procedures designed for normal works, not time-sensitive risk reduction. The fix is a dedicated DRM program management unit at state level and a lighter, standardised procurement-and-contracting toolkit for risk-reduction works: still compliant, but engineered for speed, standard designs, and measurable outputs. If “urban flood mitigation” gets only a fraction of its allocation released, as earlier patterns suggest, the system is telling us it cannot execute complex, cross-agency urban works without an empowered delivery structure.
Fifth, treat “good performers” not as anecdotes but as curricula. The 16th FC notes the primary responsibility rests with states; fine. Then it should also have institutionalised peer learning: a formal mechanism where consistently high-performing states mentor comparable risk-profile states, with measurable adoption targets. This is not charity; it’s replication of administrative DNA.
Sixth, let us stop pretending DRM is only MHA/NDMA business. The DM Act’s budgeting obligation for ministries (Section 49) is a sleeping clause that should be weaponised for mainstreaming: every major infrastructure ministry should publish a “Disaster Risk Reduction Statement” alongside its budget, showing what it spent on risk prevention, what standards it adopted, and which assets were climate-and-disaster screened. Without that, the nation will keep pouring money into rebuilding the same vulnerable assets — roads that wash away, embankments that fail, drains that choke — while calling it “reconstruction.”
Finally, can we not do the review, and do it like a prosecution, not a seminar? The Mainstream author is sceptical about NDMA reviewing itself, and that scepticism is healthy. A comprehensive review of the DM Act’s implementation should be independent, time-bound, and evidence-heavy, drawing on audit findings, state performance, and project-level outcomes. Why can the CAG not do it? We examine this separately below. The deliverable should not be another generic “capacity assessment.” It should be a public report that names failure modes (late guidelines, weak delegation, fractured coordination, unusable data systems) and pairs each with a mandated correction and a deadline.
Audited Often, Fixed Rarely: Disaster Money and the CAG Mirror
A question arises: has the CAG done a fresh, Union-level, institution-focused performance audit of NDMA itself in the last ten years? The honest answer is: not in the way one would expect for an apex authority. The CAG’s last major Union performance audit that squarely examined NDMA’s role, design and coordination weaknesses is the 2013 report on civil disaster preparedness, which carried a dedicated chapter on NDMA and flagged poor planning, weak coordination, and fuzzy roles at the top.
But if our question is wider — “Has the CAG audited disaster-management outcomes and effectiveness of DM spending?” — then yes, often, but largely through State-focussed performance audits and State finance/compliance reports, where the same pattern keeps surfacing: plans exist on paper, institutions limp, data systems don’t breathe, funds are hard to track to a disaster or component, and outcome measurement is more aspiration than architecture.
Here’s the uncomfortable storyline. After the 2005 Disaster Management Act created NDMA as the apex policy-and-guidance body, the country promised itself a shift from relief-centric firefighting to prevention, mitigation, preparedness and resilient recovery. Yet when auditors went looking for that “shift”, they kept finding a machine that is ceremonially well-designed but operationally underpowered.
Even in a State audit like Jammu & Kashmir (report for the year ended March 2016), the CAG’s lens on preparedness reads like a checklist of foundational gaps: the State Disaster Management Authority (SDMA) was not fully constituted with full-time members; the State Advisory Committee (SAC) wasn’t in place; plans and guidelines for departmental disaster management plans weren’t laid down; divisional disaster management authorities hadn’t been established; and district plans were either absent, not reviewed, or non-functional in practice. This is not “one State’s problem”; it is a symptom of how disaster governance often behaves when the only time it is taken seriously is when a siren is already wailing.
Fast-forward to Karnataka, where the CAG’s Performance Audit on Disaster Management (covering 2017–23) shows how the “outcomes question” collapses when the basics are missing. The State published its disaster-management policy 12 years after constituting the SDMA; annual disaster management plans were being approved during or after September of the very year they were supposed to guide; hospitals and educational institutions largely lacked the required plans; the State Emergency Operation Centre was under-equipped; and the forecasting and dissemination system was flawed: defunct or faulty equipment, deficient data, weak contract management, missing models and sensors. Then comes the line that kills outcome evaluation at birth: funds released were not “calamity-specific/component-specific”, making monitoring of utilisation difficult; alongside that, audit noticed misappropriation and payments from SDRF for purposes not related to disasters. When money cannot be traced cleanly to a purpose and a disaster window, “effectiveness” becomes a speech word, not an auditable result.
So where does that leave NDMA and the Union picture in the last decade? In practice, NDMA has appeared more as a reference point inside other audits (plans, guidelines, institutional architecture) than as the primary subject of a recent, hard-hitting, Union performance audit solely on NDMA’s/MHA’s outcomes. The CAG has certainly audited disaster-related spending streams and disaster-linked preparedness in multiple places: flood-control schemes, state disaster management performance audits, SDRF/SDMF accounting and utilisation issues in state finance reports. But the public record does not show a recent, NDMA-centred, Union performance audit comparable in stature to the earlier one that directly critiqued NDMA’s planning and coordination failures.
The bigger, sharper takeaway from what the CAG has been saying is this: disaster management in India is too often treated as a fund rather than a system. One can raise allocations; one can create “windows”; one can announce modernisation. But if operational plans are late, if institutions are half-constituted, if data platforms aren’t updated, if equipment is dead and contracts aren’t managed, and if releases aren’t tagged to a calamity or component, then audit will keep finding the same thing: money either doesn’t move, or moves in ways that can’t be convincingly tied to reduced risk, faster response, or better recovery.
Resilience Demands a Contract
If we want disaster resilience rather than disaster paperwork, we must convert the 16th FC award into a contract: money in exchange for measurable readiness, visible mitigation assets, and audited outcomes. The country has already learned the cost of delay — in lives, livelihoods, and rebuilt infrastructure that collapses again. The next five years should be judged not by how much was “allocated,” but by how much risk was actually reduced before the next cyclone, flood, heatwave, landslide, or earthquake writes its own audit note, this time in debris. And the CAG should, post-haste, plan and take up a comprehensive all-India performance audit with special focus on the role of the Union Ministry and the NDMA.
(This is an opinion piece. Views expressed are author’s own.)