Cisco's AI orders surge, then it cuts 4,000 jobs
The day Cisco won the AI cycle and fired the engine room
Cisco picked Wednesday evening to teach Wall Street the new shape of the AI economy. In a single earnings release, the sixty-six-thousand-employee networking incumbent posted record revenue, raised every meaningful guide on the page, more than doubled its full-year AI order forecast, and announced that it would be cutting close to four thousand jobs over the next three quarters. The stock obliged with a roughly seventeen percent after-hours rally. The layoff memo, per multiple outlets, hit employee inboxes that same evening, with the formal restructuring beginning Thursday — today. CNBC’s primary report on the print framed the two announcements as a single document: AI demand is exploding, and the labor base that built Cisco’s pre-AI franchise is being repriced inside the same press release. There is no “but” in the structure. That is the structure.
The numbers force the comparison. Cisco reported Q3 FY2026 revenue of $15.84 billion against a $15.56 billion consensus, adjusted EPS of $1.06 versus $1.04 expected, and twelve percent year-over-year top-line growth, according to the official investor-relations release. The company also said it has booked $5.3 billion in AI-infrastructure and hyperscaler orders so far in fiscal 2026 and raised its full-year AI-order expectation to $9 billion, up from a prior $5 billion target. Fiscal-2026 AI infrastructure revenue guidance rose to $4 billion from $3 billion. Converge Digest’s breakdown of the segment numbers put the underlying network mix in sharp relief: total networking product orders grew more than fifty percent year over year, data-center switching orders climbed over forty percent, and campus networking orders rose more than twenty-five percent. The networking renaissance is no longer a slide-deck thesis. It is a backlog.
The layoff was the other half of the same sentence. Fox Business reported that Cisco is cutting fewer than four thousand jobs — under five percent of its workforce — in an AI-focused restructuring, and that the company will keep hiring into “strategic areas” even as it eliminates roles elsewhere. Pre-tax charges land at roughly $1 billion total, with $450 million hitting in Q4 FY26 and the balance falling into FY27, per Cisco’s own newsroom release on the quarter. CEO Chuck Robbins explained the geometry in plain language quoted by Fox: “The companies that will win in the AI era will be those with focus, urgency, and the discipline to continuously shift investment toward the areas where demand and long-term value creation are strongest.” Translation: the AI-era P&L is not an additive layer on top of the legacy business. It is a substitution.
The stakes for anyone tracking the AI macro are immediate and uncomfortable. The bull case on hyperscaler capex — that the $725 billion of 2026 AI spending guided by Amazon, Microsoft, Alphabet, and Meta will lift a long tail of suppliers — has been validated, decisively, by Cisco’s print. So has the bear case on labor. The same press release that justified the bull case justified four thousand pink slips. The Republic World report on the layoffs captured the dissonance in its headline: record growth, then layoffs to fund the AI pivot. The AI revenue is real, the AI capex is real, and the labor displacement is not a side-effect — it is the funding mechanism. Cisco is the cleanest single-name expression of that mechanism this earnings season, and the trade is now public.
Follow the wire — and the money flowing through it
The Cisco print matters because the company is not a chip vendor and not a model lab, which makes it a uniquely clean signal on the second derivative of AI spend. Nvidia tells you that hyperscalers want GPUs. Microsoft and Alphabet tell you that hyperscalers are buying GPUs. Cisco tells you that the GPUs are now being wired together at scale — which is the part of the cycle that has to happen for any of the prior spend to actually produce inference revenue. Network World’s coverage of Cisco’s 2026 agenda emphasized this point months before the print landed: the company spent calendar 2025 retooling its catalog around AI-ready infrastructure and connectivity, betting that hyperscalers would eventually need to spend on the fabric between the racks, not just the silicon inside them. Q3 FY2026 is the quarter in which that bet paid out.
The competitive geometry behind the bet runs through Nvidia. Cisco’s first major Nvidia-co-designed switch — the N9100 series, built on Nvidia Spectrum-X silicon — pairs a 2RU, sixty-four-port, 800Gb-Ethernet form factor with the Spectrum-4 ASIC, the same chip Nvidia is selling into its own white-box ecosystem. Cisco’s pre-existing Silicon One ASIC family slots into the same fabric, which means Cisco can sell either its own silicon or Nvidia’s, depending on which the customer wants. The Silicon One expansion announcement from earlier this year made this dual-track design explicit. Cisco is no longer fighting Nvidia for control of the AI data center networking layer. It is selling the integration. That is a structurally easier business to scale.
The $725 billion of 2026 hyperscaler capex is the demand-side pool Cisco is fishing in, and the pool has roughly doubled in twelve months. The four hyperscalers — Amazon, Microsoft, Alphabet, and Meta — have collectively guided to roughly $725 billion of capital expenditure this year, up from $462 billion in 2025, per 24/7 Wall St.’s tracker. Even if networking is only ten to fifteen percent of that pool — and analysts vary on the exact share — the addressable wallet for switches, optics, routers, and data-center connectivity is somewhere between $70 and $110 billion in 2026 alone. Cisco’s $9 billion AI-order guide implies that the company captures roughly a tenth of that pool at the new run-rate, which is large enough to move the consolidated P&L but small enough that competitors — Arista, Marvell, Broadcom, the white-box ecosystem — still have plenty of room to claim share. The pie is genuinely growing faster than any one supplier can absorb.
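The wallet math above can be run explicitly. A minimal sketch, with the caveat that the ten-to-fifteen-percent networking share is the assumed analyst range discussed above, not a Cisco or hyperscaler disclosure:

```python
# Back-of-the-envelope sizing of the 2026 networking wallet.
# Capex and order figures are the ones cited in the text; the
# networking share of capex is an assumed range, not a disclosure.

HYPERSCALER_CAPEX_2026 = 725e9   # combined Amazon/Microsoft/Alphabet/Meta guide
NETWORKING_SHARE = (0.10, 0.15)  # assumed range for the networking slice
CISCO_AI_ORDER_GUIDE = 9e9       # Cisco's raised FY2026 AI-order guide

low, high = (HYPERSCALER_CAPEX_2026 * s for s in NETWORKING_SHARE)
print(f"Addressable networking wallet: ${low / 1e9:.1f}B to ${high / 1e9:.1f}B")
print(f"Cisco's implied share of that pool: "
      f"{CISCO_AI_ORDER_GUIDE / high:.0%} to {CISCO_AI_ORDER_GUIDE / low:.0%}")
```

At the midpoint of the assumed range, Cisco’s $9 billion guide works out to roughly a tenth of the pool, which is the “large enough to move the P&L, small enough to leave share for everyone else” point made above.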
The labor math is the other side of the same ledger, and it is more brutal than the headline number implies. Q1 2026 alone produced 78,557 tech layoffs — a 2.6× jump from the 29,845 cuts in Q1 2025 and the largest first-quarter wave on record, per the AI Consulting Network’s analysis of the Q1 dataset. Layoff-tracking analysis suggests roughly forty-eight percent of those Q1 cuts were explicitly tied to AI-driven workflow automation or reduced labor demand from AI tooling. The Cisco action does not introduce that pattern; it confirms it inside a company whose customers are the very hyperscalers driving the substitution.
The strongest inference from stacking these data points is a dollar-for-dollar alignment. If Cisco is raising its AI-revenue guide by $1 billion (from $3 to $4 billion) while charging $1 billion in restructuring costs, then in a literal accounting sense the company is funding one year of expected AI revenue with one year of severance. That alignment is not a coincidence — it is the mechanic Robbins described in the call. The Detroit News writeup quoted the company saying it plans to “continue hiring in strategic areas” even as it eliminates roles, which means the headcount will not necessarily fall by four thousand on a net basis. It will be reshaped. The legacy networking, legacy software, and legacy services teams shrink; the AI-systems, security, and silicon teams grow. That is the AI-era P&L in microcosm: a corporate body executing a partial transplant on itself while still posting record revenue.
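The dollar-for-dollar claim is simple enough to check by hand. A sketch using only the figures cited in this piece (the FY27 portion of the charge is derived, not separately disclosed here):

```python
# Arithmetic behind "one year of AI revenue funded by one year of severance".
# All inputs are the figures cited in the article.

ai_rev_guide_prior = 3e9     # prior FY2026 AI-infrastructure revenue guide
ai_rev_guide_new = 4e9       # raised FY2026 guide
restructuring_total = 1e9    # total pre-tax restructuring charge
charge_q4_fy26 = 450e6       # portion recognized in Q4 FY26

guide_raise = ai_rev_guide_new - ai_rev_guide_prior
charge_fy27 = restructuring_total - charge_q4_fy26  # balance falling into FY27

print(f"AI-revenue guide raise: ${guide_raise / 1e9:.1f}B")
print(f"Restructuring charge:   ${restructuring_total / 1e9:.1f}B "
      f"(${charge_q4_fy26 / 1e6:.0f}M in Q4 FY26, ${charge_fy27 / 1e6:.0f}M in FY27)")
```

The raise and the charge are the same $1 billion, which is why the restructuring reads as a funding event rather than a cost-cutting event.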
Where this Cisco bull case can break
The first hole in the thesis is that hyperscaler AI capex is not a contracted obligation — it is a guide, and guides can be cut. The 2026 $725 billion figure assumes that Amazon, Microsoft, Alphabet, and Meta sustain the spending pace they laid out in their Q1 earnings, but each of those companies has shown willingness to revise capex sharply when AI-revenue economics disappoint. The CNBC writeup on Cisco’s print noted that Cisco’s $9 billion order figure is full-fiscal-year orders, not signed contracts with cancellation penalties — and an order that is booked but not delivered can be deferred. If even one of the four hyperscalers signals a 2027 capex pause in the next two quarters, the entire networking-supplier rally that fueled the post-earnings move would compress quickly. The circular-financing geometry behind some of this capex — described in my May 11 piece on Nvidia’s $40 billion of AI equity bets — is one reason that compression risk is non-trivial.
The second hole is competitive. Cisco’s networking incumbency is real, but Arista Networks has been the consensus winner among hyperscaler-focused networking pure-plays for the past five years, and the Nvidia Spectrum-X partnership Cisco rode into this quarter is not exclusive. Nvidia has shipped reference designs, ASIC integrations, and white-box partnerships across the industry; an enterprise that bought Cisco’s N9100 today could buy a comparable Arista or white-box switch tomorrow with similar silicon underneath. The pricing power Cisco extracts from this cycle depends on enterprise customers — not hyperscalers — choosing the integrated Cisco platform over assembled alternatives. Hyperscalers are notoriously price-sensitive and have a long history of designing their own gear once a supplier reveals the recipe. The $9 billion order book is bullish, but the gross margin on that order book is the more telling number, and the company has not yet disclosed it cleanly.
The third hole is the layoff narrative itself. Cisco’s framing is that the four thousand cuts free up capital for AI-focused investment, but the alternative reading is that the cuts are conventional cost-discipline dressed in AI clothing. The job categories targeted are not yet public — the AOL excerpt of the internal memo and Cisco’s own corporate blog post on the change describe the cuts in strategic-fit language without naming specific functions. Historically, networking-vendor restructurings have hit middle-management layers, regional sales operations, and legacy hardware engineering, all of which can be cut without any AI thesis at all. If the cuts turn out to look like a 2019-style restructuring with an AI label pasted on, the market will eventually re-rate the multiple back down toward the pre-AI Cisco range. The narrative premium is conditional on the labor mix shift being real and substantive.
The fourth and largest hole is the labor-substitution premise itself. The bull case assumes AI tooling will replace enough white-collar engineering, sales, and operations work to permanently lower Cisco’s operating-expense base. The evidence for that level of substitution at a networking incumbent is thin. AI coding agents — the segment I covered in the May 9 piece comparing Claude Outcomes and OpenAI Codex Goals — are productive enough to compress some software-engineering workloads, but the savings show up as faster project velocity rather than headcount eliminated, in most enterprises that have measured it carefully. If Cisco discovers in twelve months that the four thousand cuts produced no durable labor-cost reduction — because the work simply moved to contractors, offshore vendors, or higher-cost AI-engineering hires — then the restructuring becomes a one-time accounting maneuver rather than a structural rerate.
The fifth hole is regulatory and political. The juxtaposition of record AI revenue and four thousand layoffs in a single press release is precisely the kind of optic that draws Congressional and state-level attention. TheNextWeb’s tally of 2026 tech layoffs crossing the hundred-thousand mark suggests that the political surface area for “AI is taking my job” rhetoric is widening quarter over quarter. The CAISI testing regime I covered in my May 7 piece on pre-deployment evaluations is a model-level intervention; the next legislative phase — already telegraphed by data-center moratorium bills introduced this spring — could touch hyperscaler capex itself. Cisco’s narrative depends on a regulatory environment that does not impose direct friction on the hyperscaler buildout. That assumption is plausible today and considerably less plausible in eighteen months.
The operator’s checklist for the next networking quarter
The most likely scenario over the next two quarters is that Cisco’s AI-order momentum continues, the layoffs proceed largely as guided, and the stock holds the post-earnings range while volatility migrates to Q4 EPS upside and FY27 guidance. The harder scenario to handicap is what happens to enterprise IT budgets, networking-vendor competitive intensity, and the broader labor-substitution narrative as the rest of the supplier ecosystem reports against the bar Cisco just set. The Anthropic-SpaceX compute deal I covered on May 8 and the Google–Anthropic $40 billion compute commitment I covered on April 26 both imply that AI-cluster wiring demand keeps compounding through at least mid-2027. The networking suppliers that capture share against that demand are the ones to watch; the ones that lose share will look like value traps long before the multiples reflect it.
Operators reading this should treat the Cisco print as a forcing function, not a victory lap. The cleanest takeaways:
- Treat 50% YoY networking order growth as the new baseline for AI-data-center suppliers. Any switch, optics, or routing vendor reporting sub-30% AI-segment growth in the next two quarters is losing share, not riding the cycle. Cisco’s data-center switching number — over 40% YoY — is the closer comparable for networking pure-plays like Arista and for Marvell’s and Broadcom’s networking segments.
- Watch the gross-margin disclosure on AI orders, not the order figure itself. Cisco’s $9 billion AI-order guide is meaningful only if hyperscaler-tier gross margins are at or above the corporate average. If management discloses AI-segment margins below 60% in a future call, the rerate thesis weakens substantially regardless of top-line growth.
- Map every $1 of AI revenue to the $1 of severance funding it. Cisco’s restructuring charge is roughly the same size as its raised AI-revenue guide. That is a one-time funding event. The same trick cannot be repeated in FY27, which means the FY27 AI-revenue acceleration has to fund itself out of organic operating leverage. Plan the model accordingly.
- Audit your own organization for the same arbitrage. If your company is also reporting record growth and layoffs in the same quarter, ask whether the labor cuts are funding a genuine technology pivot or simply re-labeling conventional cost discipline. The honest version of the AI-restructuring story is rare; the cosmetic version is not.
- Treat enterprise networking refresh as a leading indicator for AI-app readiness. Campus networking orders up 25% — the third of Cisco’s three growth bands — implies that enterprises are upgrading the underlying connectivity before deploying agentic workloads in production. That refresh cycle is a six-to-twelve-month lead on enterprise-AI revenue at downstream vendors. The networking supplier print is, in effect, a forward indicator on which enterprises are actually about to ship.
- Take seriously the political risk that a record-AI-orders-plus-record-layoffs print can attract. The political surface area for “tech is destroying jobs while making record profits” is wider in May 2026 than it was in January, and Cisco’s joint announcement makes it easy to weaponize. The hedge is not to deny the substitution — it is to publish the retraining-and-reskilling commitment with the same prominence as the order guide. Cisco’s corporate blog post on its restructuring did some of this, but the depth is not yet at the level a serious policy response will require.
- Reread the Q4 guide before the rest of the supplier complex reports. Cisco guided Q4 to $1.16–$1.18 adjusted EPS on $16.7–$16.9 billion revenue, well above the $1.07 / $15.82 billion consensus. The bar for Arista, Juniper, Marvell, and Broadcom is now materially higher than it was before Cisco reported. Any one of them missing against that bar in the next four weeks creates an asymmetric short setup; any one of them clearing it cleanly creates the opposite setup on the long side.
The deepest takeaway sits in the structure of the press release itself. Cisco told the market: we have the orders, we are letting people go, the two facts are the same fact. That candor is, on balance, useful. The AI cycle is not a victimless capex boom, and the suppliers most exposed to it are the suppliers most exposed to the labor substitution it makes possible. The next twelve months of earnings season will tell the rest of the corporate world how to talk about the same trade-off. Cisco picked the structure; the others will follow it.
In other news
- Anduril raises $5B at a $61B valuation. The defense-tech company closed a Series H led by Thrive Capital and Andreessen Horowitz on May 13, cementing its status as one of the most valuable private defense-technology companies and a defining player in the AI-for-defense thesis. The round implies that demand for autonomous-system contractors is now valued on growth-software multiples, not industrial ones (TechStartups).
- Fractile closes a $220M Series B for AI-inference silicon. The UK-based chip startup raised from Accel, Founders Fund, Conviction, Gigascale, Felicis, 8VC and others on May 13, betting that purpose-built inference hardware can undercut Nvidia’s economics on the deployment side of the AI stack. The round is the latest signal that “AI inference” is now a venture category separate from “AI training” — and one with much less Nvidia exposure (TechStartups).
- Anthropic ships “dreaming” for Claude managed agents. Anthropic introduced a scheduled memory-curation system that lets agents reorganize their own memory between sessions, with Harvey reporting roughly 6× task-completion gains and Wisedocs cutting document-review time in half after deployment. The feature ships under Anthropic’s Code with Claude track and pushes the autonomous-agent thesis closer to durable production workloads (VentureBeat).
- Sierra raises $950M as the enterprise-AI agent race accelerates. The Bret Taylor–led customer-experience AI company closed a $950 million round on May 4 at a valuation north of $10 billion, the latest mark on a series of enterprise-agent rounds that have collectively raised more capital this spring than the entire 2024 enterprise-SaaS vintage. The round is the cleanest pure-play wager that voice-and-chat agent infrastructure becomes the default front door for customer support (TechCrunch).
- Fortune flags that AI-automation layoffs are underperforming on ROI. Fortune published research on May 11 finding that AI-automation-driven layoffs have largely failed to generate the productivity returns companies expected, with Gartner separately predicting that half of AI-driven customer-service cuts will be reversed by 2027 as enterprises rehire. The finding is a useful counterweight to the Cisco narrative: the substitution premise behind the AI-restructuring trade is not yet empirically settled (Fortune).