AI in 2026 Federal Grants

Who Gets to Participate?

Federal AI policy is about to reshape the grant landscape in 2026. The shift will not be limited to large agencies, research universities, and health systems. Community-based organizations may feel the effects whether they opt in or not.

OMB Memorandum M-25-21 is explicit. Federal agencies are no longer being asked to explore AI cautiously. They are being told to accelerate its use, remove barriers to innovation, stand up governance structures, and pilot AI tools that improve efficiency, customer experience, and public value. That directive is likely to change what program officers are encouraged to fund and how proposals that signal “innovation” will be judged.

For large federal grantees, this moment looks familiar. Departments of transportation, universities, public health systems, and major research institutions already manage complex compliance regimes. They control large data environments. They can hire AI engineers, lawyers, and risk teams. When agencies talk about “high-impact AI,” meaning systems that affect access to benefits, civil rights, public health, or safety, these are the organizations most likely to be building and operating such systems with federal dollars.

But AI in federal grants is unlikely to stop with big institutions.

Community-based organizations will increasingly be pulled into AI-enabled systems through subawards, intermediaries, and partnerships. Anywhere AI touches eligibility screening, outreach, triage, case management, or performance measurement, CBOs are likely to be the ones interfacing with people on the ground.

HHS’s 2025 AI compliance plan makes this clear. It anticipates AI use across Medicaid, public health, and human services. It requires divisions to identify high-impact tools and apply minimum-risk practices, such as pre-deployment testing, impact assessments, independent reviews, and safe shutdowns when systems fail. As those requirements cascade through state agencies and prime grantees, CBOs may be expected to implement and explain AI-driven decisions they did not design and do not control.

That raises questions far bigger than basic compliance or readiness.

One risk is that AI quietly becomes a sorting mechanism in federal funding. Organizations with robust data stacks, technical staff, and AI-fluent language in their proposals look “ready.” Organizations without them risk looking less competitive, regardless of the quality of their work.

Another risk is harm. AI pilots in high-impact areas such as benefits eligibility, fraud detection, or risk scoring can reinforce existing inequities. When community organizations are brought in late, they have limited ability to shape design choices, governance rules, or appeals processes. They absorb the operational burden while others retain decision-making power.

The real question for the sector is not whether AI will show up in federal grants. It already has, and more detail about what that looks like in implementation will keep emerging.

If AI becomes a de facto requirement to be considered innovative, we risk a future in which only organizations with scale and infrastructure fully participate in the next generation of federal funding. Community-based organizations are left as implementers rather than co-designers, responsible for outcomes without the authority to do the work the way they think is most effective.

A healthier path looks different. It invests in shared capacity, such as the reusable AI models and data “commons” HHS has proposed. It treats governance as a participation issue, not just a technical one. And it assumes that responsible AI in public programs requires community voice at the table from the start.

The question heading into 2026 is not simply “Can we use AI in this grant?”

It is a systems question: “Who benefits, who is burdened, and who gets a meaningful say in how AI shows up in work that is meant to serve the public?”


