Trump’s AI EO vs the state patchwork
The loudest debates in American AI policy still sound like philosophy seminars: safety versus speed, bias versus neutrality, privacy versus personalization. Meanwhile, the real constraint on national power looks more like a zoning hearing. You can’t “out-innovate” a three‑year interconnection queue.
That’s why President Trump’s Executive Order 14318, “Accelerating Federal Permitting of Data Center Infrastructure,” matters even if you never read a single line of it. It treats AI as what it has become in practice: an infrastructure program with a compute-shaped appetite for land, power, steel, and time. (GovInfo PDF)
The part people are already calling “overpowering state regulation” is not a magical clause that nullifies state AI laws. An executive order can’t just wave away state statutes. What it can do is shift the battlefield onto federal terrain—federal permits, federal land, federal environmental review rules, federal financing—where Washington has leverage and where the timeline can be compressed.
In other words: EO 14318 doesn’t preempt the state patchwork. It routes around it. In a race, routing around can be the difference between winning and writing autopsies.
The race you win with concrete, not code
If you’ve been following our series on the U.S.–China AI Cold War, you already know the punchline: the competition is not a model leaderboard, it’s a full stack—energy, chips, infrastructure, models, and applications. We argued that America leads at the frontier, but China leads in build velocity and the industrial scale of deployment. That framing is central to “The AI Cold War: Six Months and Closing”.
Jensen Huang put the physical reality bluntly at CSIS: in the AI stack, “energy” is not a background variable; it’s the foundation. (CSIS conversation with Jensen Huang) That’s not a metaphor. A frontier lab is now inseparable from a small utility.
The International Energy Agency’s 2025 report “Energy and AI” turns that intuition into a curve you can’t unsee. It puts global data center electricity consumption at around 415 TWh in 2024 and projects it more than doubling to roughly 945 TWh by 2030, with wide uncertainty later in the decade. (IEA PDF) That’s not “more servers.” That’s “a new industrial sector.”
Here’s the operator-grade insight that falls out when you combine the IEA’s forecast with EO 14318’s definitions. The order defines a “Data Center Project” as anything that needs more than 100 megawatts of new load dedicated to AI inference, training, simulation, or synthetic data generation. (GovInfo PDF) One hundred megawatts running all year is about 0.876 TWh. If global data center demand grows by roughly 530 TWh by 2030 (945 minus 415), that’s on the order of ~600 “100‑MW data centers” worth of new annual electricity demand. That is the scale of the buildout the order is aiming to accelerate.
The IEA offers a translation layer that makes the “100 MW” threshold feel less abstract: it notes that a 100‑MW data center can consume as much electricity annually as 100,000 households, and that the largest facilities currently under construction could consume as much as 2 million households do. (IEA PDF) Once you see the infrastructure as “city‑scale demand,” you stop treating permitting as paperwork and start treating it as national capacity planning.
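To make the arithmetic reproducible, here’s a minimal back-of-envelope sketch of the numbers above. It assumes a 100‑MW facility draws its full rated load around the clock (real capacity factors are lower) and uses the IEA figures cited in this piece; the variable names are mine, not the order’s or the IEA’s.

```python
# Back-of-envelope check on the figures above.
# Assumes a 100 MW "Data Center Project" runs at full rated load all year,
# which overstates real capacity factors but matches the rough math in the text.

HOURS_PER_YEAR = 8_760  # 365 days * 24 hours

threshold_mw = 100  # EO 14318's "Data Center Project" load threshold
annual_twh_per_site = threshold_mw * HOURS_PER_YEAR / 1_000_000  # MWh -> TWh
print(f"One 100 MW site, flat out: {annual_twh_per_site:.3f} TWh/year")  # ~0.876

# IEA "Energy and AI" figures cited in the text
demand_2024_twh = 415
demand_2030_twh = 945
growth_twh = demand_2030_twh - demand_2024_twh  # ~530 TWh of new annual demand
equivalent_sites = growth_twh / annual_twh_per_site
print(f"Growth to 2030: {growth_twh} TWh ≈ {equivalent_sites:.0f} hundred-megawatt sites")

# The IEA's household comparison: 100 MW ≈ 100,000 households implies roughly
# this average annual consumption per household.
households = 100_000
kwh_per_household = annual_twh_per_site * 1e9 / households
print(f"Implied household consumption: {kwh_per_household:,.0f} kWh/year")  # ~8,760
```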
And the risk is not theoretical. The IEA warns that if grid constraints aren’t addressed, around 20% of planned data center projects could be at risk of delays. (IEA PDF) In a geopolitical contest measured in quarters, “delay” is a synonym for “lose.”
This is where the state patchwork enters. The U.S. doesn’t have one set of bottlenecks; it has fifty-plus, plus thousands of localities with zoning power, plus a federal layer of environmental review, plus utilities and regional grid operators. Meanwhile, state AI laws are proliferating too. Colorado’s SB24‑205 is one of the clearest signals that states are willing to regulate “high-risk” AI interactions and disclosure duties on their own timeline. (Colorado legislature bill page) California’s SB1047 is an example of how quickly the frontier-model conversation can turn into a state-level safety regime proposal. (California bill text)
So the strategic question becomes: if America cannot harmonize the rules quickly, can it at least harmonize the build? EO 14318 is the administration’s yes to that question.
What EO 14318 actually does (and what it doesn’t)
EO 14318 is best read as a toolkit for converting AI infrastructure from “hard” projects into “fast” projects. It’s not a sweeping national AI law, and it does not formally preempt state regulation of model behavior, consumer protection, or civil rights. It does, however, expand the share of the pipeline that can be sped up through federal process and federal assets.
1) It defines the target: very large AI compute loads.
The order’s “Data Center Project” threshold (>100 MW of new load for AI) is deliberately high. It is designed to cover the kind of facilities that move national capability—training clusters, inference farms, and the energy infrastructure that feeds them—not the average enterprise server room. (GovInfo PDF)
2) It defines “Qualifying Projects” and then tries to finance and fast-track them.
A “Qualifying Project” can be a Data Center Project or a Covered Component Project (transmission lines, turbines, semiconductors, networking gear, and the like) if it clears thresholds such as $500 million in committed capex or more than 100 MW of incremental load, or if it’s designated for national security. (GovInfo PDF)
Then it instructs Commerce to launch an initiative for financial support that can include loans, loan guarantees, grants, tax incentives, and offtake agreements. This is less “deregulation” than “industrial policy with a stopwatch.” (GovInfo PDF)
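To see how the definitions nest, here’s an illustrative sketch of the qualification logic as this piece summarizes it, not as the order’s text states it. The dataclass, field names, and thresholds-as-code are my shorthand; real eligibility turns on the order’s actual definitions and agency practice.

```python
from dataclasses import dataclass

@dataclass
class Project:
    # Field names are illustrative, not drawn from EO 14318's text.
    new_ai_load_mw: float            # new load dedicated to AI training/inference/etc.
    committed_capex_usd: float       # committed capital expenditure
    is_covered_component: bool       # transmission, turbines, semiconductors, networking, ...
    national_security_designation: bool

def is_qualifying_project(p: Project) -> bool:
    """Rough sketch of the 'Qualifying Project' logic as paraphrased above."""
    # Eligible project types: a Data Center Project (>100 MW of new AI load)
    # or a Covered Component Project that feeds one.
    eligible_type = p.new_ai_load_mw > 100 or p.is_covered_component
    # Thresholds the piece cites: $500M committed capex, >100 MW incremental load,
    # or a national-security designation.
    clears_threshold = (
        p.committed_capex_usd >= 500_000_000
        or p.new_ai_load_mw > 100
        or p.national_security_designation
    )
    return eligible_type and clears_threshold

# Example: a 300 MW training campus clears the load threshold on its own.
print(is_qualifying_project(Project(300, 2_000_000_000, False, False)))  # True
```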
3) It revokes an earlier infrastructure order and swaps the priorities.
The order explicitly revokes Executive Order 14141 (from January 2025), which framed domestic AI infrastructure around clean power, community support, and labor standards. (GovInfo PDF) That revocation is a policy signal: this administration is choosing speed and scale over a slower, more negotiated buildout.
If you want the broader context on the administration’s AI posture—procurement, copyright, and the industrial push—you can read our earlier breakdown of Trump’s AI orders and their strategic intent.
4) It tries to compress NEPA timelines through categorical exclusions and a narrower definition of “major Federal action.”
Within 10 days, agencies must identify existing categorical exclusions that could accelerate Qualifying Projects; CEQ must coordinate new ones. The order also asserts that certain forms of federal financial assistance—especially where an agency lacks “substantial project-specific control”—should not trigger NEPA as a “major Federal action,” and it sets a <50% federal funding presumption for that “no substantial control” stance. (GovInfo PDF)
This is the administrative equivalent of widening the on‑ramp. It doesn’t eliminate environmental review, but it tries to narrow how often a project hits the slow lane. The legal backdrop here is the modernized NEPA framework codified in 42 U.S.C. § 4336e. (Cornell Law)
Why does that matter for “state power,” even indirectly? Because the biggest projects often die by sequencing: a federal review that drags for months forces every other stakeholder to wait, renegotiate, and reopen questions. If NEPA review is narrower or more categorical, the developer can run local engagement, utility planning, and supply-chain procurement in parallel instead of serial. The states haven’t lost authority—but they’ve lost the time monopoly that made that authority decisive.
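The NEPA move in point 4 is easier to see as a decision rule. Below is a minimal sketch of the presumption as described above: federal assistance covering less than half a project’s cost, where the agency lacks substantial project-specific control, is presumed not to make the project a “major Federal action.” The function and its inputs are illustrative shorthand for the order’s language; real determinations are fact-specific and will be litigated.

```python
def nepa_presumption_applies(federal_funding_share: float,
                             substantial_project_control: bool) -> bool:
    """Illustrative sketch of EO 14318's NEPA posture as summarized above.

    federal_funding_share: federal assistance as a fraction of total project cost.
    substantial_project_control: whether the agency exercises substantial
        project-specific control over the project's outcome.

    Returns True when the order's presumption (that the assistance does not
    amount to a "major Federal action") would apply under this sketch.
    """
    return federal_funding_share < 0.5 and not substantial_project_control

# Example: a loan guarantee covering 30% of project cost, with no agency control
# over siting or design, falls within the presumption.
print(nepa_presumption_applies(0.30, substantial_project_control=False))  # True
```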
5) It puts Qualifying Projects on the FAST‑41 track—and makes the process visible.
The order empowers the Federal Permitting Improvement Steering Council (FPISC) to designate Qualifying Projects as “transparency projects” and then expedite their transition to FAST‑41 “covered projects,” including publishing them with schedules for expedited review. (GovInfo PDF)
If you’ve never heard of FAST‑41, that’s the point: it’s a bureaucratic lever for compressing cross‑agency permitting by turning an amorphous multi‑agency process into a named project with a timeline, a dashboard footprint, and escalation paths. The statutory foundation sits in 42 U.S.C. § 4370m‑1 and the surrounding FAST‑41 framework. (Cornell Law) The “overpowering” effect here is not ideological; it’s procedural. A project on a published federal schedule with named agency owners becomes harder to stall quietly.
And it matters that EO 14318 pushes projects from “transparency” status into “covered” status. A transparency project is visibility. A covered project is governance: a structured timetable and coordination obligations that can change agency behavior even when no one wants to be the villain slowing “AI leadership.” EO 14318 is trying to make that governance pathway the default for the biggest loads.
6) It directs EPA and the Army Corps to make permitting more programmatic—and more standardized.
EPA is told to help expedite permitting on federal and non-federal lands by developing or modifying regulations under the Clean Air Act, Clean Water Act, CERCLA, and TSCA. It also pushes EPA to identify Brownfield and Superfund sites for reuse and to issue guidance to expedite reviews. (GovInfo PDF)
Separately, the order instructs the Army Corps to review whether new activity-specific nationwide permits are needed to facilitate efficient permitting for Qualifying Projects. (GovInfo PDF)
7) It makes federal land part of the supply curve.
Interior and Energy are directed to offer authorizations for sites they identify, and Defense can lease land on military installations for qualifying infrastructure, subject to security and force protection. (GovInfo PDF)
That’s the quiet “state‑override” story: a developer who can build on federal land, under a federal permitting schedule, with federal agencies coordinating categorical exclusions, is simply less exposed to state and local chokepoints. Not immune—but less exposed.
The bigger move is that EO 14318 attempts to standardize the infrastructure pipeline in the same way the internet standardized network protocols. You don’t need every town to share the same zoning philosophy if the critical path shifts to a smaller set of federal review gates. The political story says “overpower the states.” The operational story is closer to “reduce the number of veto points.”
To make the map concrete, here’s the tightest way to think about the administration’s AI policy stack right now:
| Order | Lever | What it standardizes |
|---|---|---|
| EO 14318 | Permitting + land | Where and how fast compute can be built |
| EO 14319 | Procurement | What “acceptable” federal LLMs look like (GovInfo PDF) |
| EO 14320 | Statecraft | How America packages and exports the AI stack (GovInfo PDF) |
| EO 14179 | Strategy | What the federal posture is (dominance) (GovInfo PDF) |
That table also clarifies what EO 14318 does not do: it does not create a national civil liability regime for AI harms, it does not preempt state consumer protection laws, and it does not replace Congress.
What could break this bet
I’m broadly in favor of EO 14318’s direction. In an AI race, you want your bottleneck to be chip supply and engineering talent—not a permitting calendar that treats 2028 like it’s “soon.” But there are at least four ways this strategy can fail, and they matter precisely because the order leans so hard on speed.
1) The “preemption” story may be more political than legal.
Under the Supremacy Clause, federal law can preempt state law, but that power flows through statutes and constitutional authority—not a presidential preference memo. Courts take preemption seriously because it’s a federalism question, not an engineering tradeoff. (Cornell Law overview of preemption)
EO 14318 can change federal permitting posture, federal land availability, and federal agency coordination. But if the claim is “states can’t regulate AI anymore,” that’s not what’s written. The patchwork of state AI rules can still bite companies at the model and application layer, even as the federal government accelerates the compute layer.
2) Grid physics doesn’t care about executive orders.
The IEA’s warning about delay risk is fundamentally a grid story: interconnection, transmission, and local capacity are not paperwork problems; they’re steel-and-transformer problems. (IEA PDF) A 100‑MW data center is not a permitting abstraction. It’s a new small town on the load curve.
This is where the U.S.–China comparison becomes uncomfortable again. China’s advantage is not that it has “better forms.” It’s that it can build large infrastructure quickly, and it can align energy planning with industrial outcomes. If the U.S. accelerates permitting but still can’t deliver transformers, transmission, or generation at pace, EO 14318 becomes a headline without throughput.
There’s also a subtler risk: overbuilding the wrong kind of capacity. The IEA’s scenarios include a “High Efficiency” case in which data center electricity demand is materially lower later in the decade because hardware and model efficiency improves faster than expected. (IEA PDF) If you accelerate siting and financing without building flexible, grid-friendly operations (demand response, storage, curtailment plans), you can end up with stranded electrical upgrades and a backlash narrative: “we paved the desert for servers that didn’t show up.”
3) Speed increases litigation incentives.
When government narrows environmental review pathways, it doesn’t remove conflict; it often redistributes it into court. The order’s NEPA posture—especially the <50% funding presumption for “no substantial control”—is exactly the kind of thing that becomes a test case when a project is controversial. (GovInfo PDF)
Even if the administration wins those fights, time spent in litigation is still time—again, measured in quarters. The faster you try to go, the more you need your legal strategy to be industrialized too.
4) The state patchwork can still shape the product, even if Washington shapes the build.
Colorado SB24‑205 and California SB1047 are not about megawatts; they’re about duties, disclosure, and frontier-model risk. (Colorado legislature bill page) (California bill text)
That matters because the most important “AI advantage” is not training the best model in a lab; it’s deploying systems at scale in businesses, government, and defense. If the deployment layer becomes a fifty‑state compliance maze, compute abundance alone won’t convert into economic productivity or military capability.
This is the subtle tension in the administration’s approach. EO 14318 centralizes infrastructure acceleration at the federal level. But without a credible federal baseline for model and application governance, states will keep filling the vacuum—because the vacuum is where political demand flows.
That dynamic is why the administration’s broader AI posture matters here, not just the permitting order. EO 14179 explicitly directs the development of an “AI Action Plan” aimed at sustaining and enhancing American AI dominance. (GovInfo PDF) The White House later published its “America’s AI Action Plan,” which frames winning as a competitiveness and national-security imperative. (White House) If that plan doesn’t translate into a baseline governance regime that states can accept (or at least coexist with), the infrastructure acceleration and the regulatory fragmentation will keep pulling against each other.
The outlook: pro-speed, anti-chaos (operator checklist)
EO 14318 is not a perfect instrument, but it is a strategically coherent one. It takes the U.S.–China diagnosis seriously: America cannot win an infrastructure-shaped competition if its projects move at the speed of consensus. And it pulls on the few levers an executive order actually controls—federal permitting posture, federal land, federal agency coordination, and the use of existing statutory fast tracks like FAST‑41.
I’ll say the quiet part out loud: we should be biased toward speed right now. Not because safety is fake, but because falling behind is a safety risk too. In a world where the leading AI stack sets the default norms, losing the race is how you import someone else’s standards.
But “speed” only works if you pair it with two stabilizers: a coherent baseline that reduces state-by-state chaos, and a hardening strategy that assumes infrastructure becomes a target (cyber, physical, and supply chain). That’s the bet we outlined earlier when we argued hardware and energy crown AI’s kings.
At the national level, a serious “race strategy” looks like three moves executed in parallel.
First, centralize the buildout (what EO 14318 is attempting) and make grid upgrades a national priority rather than a local fight. Second, standardize the rules of deployment enough that companies don’t have to build fifty compliance stacks to ship one product. Third, push the stack outward—export packages, alliances, and standards—so American systems become the default in the places that don’t want to pick between Washington and Beijing.
EO 14318 helps with the first move; EO 14320 explicitly pushes on the third. (GovInfo PDF) The unresolved question is the second move: a credible governance baseline that keeps safety and civil legitimacy intact without letting process become the weapon that slows the build.
If you’re an operator—building, financing, siting, or selling into this wave—here’s the checklist I’d run.
Operator checklist
- Treat >100 MW as your policy threshold: if your project clears it, plan as if EO 14318’s fast-track mechanisms apply by default. (GovInfo PDF)
- Build your permitting narrative around being a “Qualifying Project”: capex commitments, national-security relevance, and covered-component dependencies are the language the order uses. (GovInfo PDF)
- Don’t assume “federal faster” means “no local politics”: community opposition still kills timelines; plan mitigation and benefits early, even if you’re on federal land.
- Treat grid and transformers as first-class risks: the IEA’s delay warning is a forecast of your critical path. (IEA PDF)
- Map state AI obligations separately from infrastructure work: Colorado SB24‑205 and California SB1047 illustrate that state model/application rules can move independently of federal infrastructure acceleration. (Colorado legislature bill page) (California bill text)
- Expect procurement to become de facto standardization: EO 14319 is a reminder that “federal buyer” is one of Washington’s strongest regulatory tools. (GovInfo PDF)
- Plan for export posture early if you’re selling infrastructure or models abroad: EO 14320 explicitly targets packaging the “American AI technology stack” for global deployment. (GovInfo PDF)
The meta-takeaway is simple: EO 14318 is an attempt to turn “AI race” rhetoric into permitting math. It won’t erase state regulation, and it won’t eliminate physics. But it does something rare in tech policy: it points at the real bottleneck, then tries to move it.
If we pair that velocity with a federal baseline that reduces state-by-state compliance chaos—and with serious infrastructure hardening—the U.S. has a plausible path to win the buildout phase of the AI Cold War. And in this cycle, winning the buildout is how you earn the right to argue about everything else.