- Arista Networks delivered $2.0B in Q4 2025 revenue, up 25% YoY — AI networking is the growth driver
- Dell's AI server revenue hit $9.7B for full-year 2025, up 110% YoY, though at razor-thin margins
- Super Micro remains a wild card: revenue recovery but accounting concerns and margin pressure persist
- The networking layer (Arista, Broadcom) has better margins and moats than server assembly (Dell, Super Micro)
Section 1 — The AI Infrastructure Investment Thesis
When investors talk about "picks and shovels" plays on the AI gold rush, they typically mean Nvidia. But the GPU is only one component of an AI cluster. A modern 100,000-GPU cluster requires high-speed networking, power distribution systems, cooling infrastructure, storage arrays, and management software — creating a broad ecosystem of beneficiaries that receive significantly less attention than the chip itself.
The addressable market for AI-specific infrastructure (excluding GPUs) is estimated at $180 billion annually by 2026, up from $45 billion in 2023. This includes Ethernet and InfiniBand networking ($28B), AI-optimized servers ($95B), storage systems ($22B), and cooling/power ($35B). The critical insight is that as GPU clusters scale — from 8 GPUs per rack to 64 GPUs in DGX SuperPOD configurations — the networking and cooling complexity scales faster than the GPU count itself.
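The segment figures above can be sanity-checked with quick arithmetic (the segment labels are shorthand for the categories quoted in the text):

```python
# Estimated 2026 AI infrastructure TAM by segment (USD billions),
# using the figures quoted above.
segments = {
    "networking (Ethernet + InfiniBand)": 28,
    "AI-optimized servers": 95,
    "storage systems": 22,
    "cooling and power": 35,
}

total = sum(segments.values())
print(f"2026 TAM: ${total}B")                # 28 + 95 + 22 + 35 = 180
print(f"Growth vs 2023: {total / 45:.1f}x")  # up from $45B in 2023 -> 4.0x
```

The segments do sum to the quoted $180B, a 4x expansion from the 2023 base.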
Arista Networks sits at the premium end of this ecosystem. The company specializes in cloud networking hardware and software, with its EOS (Extensible Operating System) giving network operators programmability that competitors like Cisco cannot easily match. Arista's customer concentration is notable — Microsoft and Meta together represented approximately 40% of revenue in 2025 — but this concentration reflects the strength of the relationships rather than fragility. Both customers have been Arista partners for over a decade.
Dell Technologies represents the opposite end of the margin spectrum. Dell assembles AI servers using Nvidia's GPUs, competing primarily on supply chain relationships, warranty, and enterprise service capabilities. The PowerEdge XE9680 — Dell's flagship AI server housing eight H100 or B200 GPUs — has a list price of approximately $200,000-$250,000. Dell's gross margin on these systems is roughly 8-12%, compared to Arista's 64% product gross margin. Dell is a volume play; Arista is a margin play.
Section 2 — Arista vs. Cisco: The Networking Layer Battle
The networking layer is where the most interesting competitive dynamics in AI infrastructure are playing out. As GPU clusters scale to tens of thousands of units, network fabric becomes a critical performance bottleneck. The choice between InfiniBand (dominated by Nvidia) and Ethernet (where Arista competes) is a strategic decision that hyperscalers are actively debating.
Arista's argument for Ethernet is compelling: the technology is more widely understood, supports a broader ecosystem, and avoids vendor lock-in to Nvidia's networking stack. Microsoft has standardized on Ethernet for its Azure AI clusters, which benefits Arista directly. Google uses a combination of proprietary Jupiter fabrics and Arista equipment. Meta has been an Arista customer for AI networking since 2024.
Cisco, the incumbent networking giant with $57 billion in annual revenue, has struggled to maintain relevance in AI networking. Its acquisition of Acacia Communications (optical interconnects) and investment in Silicon One custom ASICs represent genuine efforts to compete, but Arista's software-defined approach continues to win new AI infrastructure deals. Cisco's AI-related networking revenue grew 31% in its most recent quarter but from a much smaller base.
Broadcom deserves mention as another infrastructure beneficiary. The company's custom ASIC business — building AI accelerators for Google (TPU v5), Meta (MTIA), and others — generated approximately $12 billion in AI-related revenue in 2025. Broadcom's Jericho3-AI Ethernet switching chip is the foundation of many large-scale AI cluster networks. At 35x forward earnings, Broadcom offers growth comparable to Arista but with greater revenue diversity.
| Company | Fwd P/E | Gross Margin | AI Revenue Exposure |
|---|---|---|---|
| Arista Networks (ANET) | 32x | 64% | ~60% of revenue |
| Cisco Systems (CSCO) | 14x | 65% | ~15% of revenue |
| Broadcom (AVGO) | 35x | 70% | ~45% of revenue |
| Dell Technologies (DELL) | 11x | 22% blended | ~28% of revenue |
| Super Micro (SMCI) | 14x | 11% | ~85% of revenue |
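One way to read the table is to weight each name's gross margin by its AI revenue exposure, giving a rough proxy for how much high-margin AI gross profit each dollar of revenue carries. This is a crude illustrative screen of ours, not a metric used elsewhere in this note:

```python
# Figures from the comparison table:
# ticker -> (forward P/E, gross margin, AI revenue exposure).
companies = {
    "ANET": (32, 0.64, 0.60),
    "CSCO": (14, 0.65, 0.15),
    "AVGO": (35, 0.70, 0.45),
    "DELL": (11, 0.22, 0.28),
    "SMCI": (14, 0.11, 0.85),
}

# Toy quality proxy: gross margin weighted by AI exposure.
ranked = sorted(companies.items(),
                key=lambda kv: kv[1][1] * kv[1][2],
                reverse=True)
for ticker, (pe, gm, ai) in ranked:
    print(f"{ticker}: AI-weighted margin = {gm * ai:.2%} at {pe}x forward earnings")
```

On this crude measure Arista (38.4%) and Broadcom (31.5%) lead by a wide margin over the server assemblers, consistent with the margin-quality hierarchy argued below.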
Section 3 — Super Micro: High Revenue, High Risk
Super Micro Computer had the most dramatic story in AI infrastructure during 2025. Revenue grew 87% YoY to $22.3 billion, driven entirely by AI server demand. The company's ability to quickly integrate new Nvidia GPUs into shipping products — often weeks ahead of Dell — gave it a first-mover advantage that translated directly into order share. At peak, Super Micro held an estimated 10-12% of AI server market share.
Super Micro delayed filing its fiscal 2024 10-K due to an accounting review, resulting in a Nasdaq delisting notice in late 2024. While the company ultimately restated financials and retained its listing, the episode revealed governance weaknesses that warrant a significant risk premium. Investors should apply a 20-30% discount to SMCI valuations relative to clean-governance peers.
The accounting saga that began in August 2024 — when short seller Hindenburg Research published allegations of accounting irregularities — remains a risk even after the company's financial restatement. The CFO who oversaw the problematic period has departed, and a new CFO with a stronger compliance background has been installed. However, institutional investors with governance mandates remain reluctant to hold significant SMCI positions, keeping the stock's P/E multiple suppressed relative to fundamental performance.
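The 20-30% governance discount recommended above can be made concrete. The $50 peer-based fair value below is purely illustrative, not a price target from this note:

```python
# Apply the 20-30% governance discount discussed above to a
# hypothetical fair value derived from clean-governance peer multiples.
peer_based_fair_value = 50.0  # USD per share, assumed purely for illustration

for discount in (0.20, 0.30):
    adjusted = peer_based_fair_value * (1 - discount)
    print(f"{discount:.0%} governance discount -> ${adjusted:.2f}")
# 20% -> $40.00, 30% -> $35.00
```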
The more fundamental concern with Super Micro (and Dell) is commoditization. As more server manufacturers gain authorized Nvidia channel partner status, and as ODMs (original design manufacturers) in Taiwan compete aggressively, the AI server assembly business will become increasingly price-competitive. Super Micro's liquid cooling expertise — the company pioneered direct liquid cooling (DLC) technology for high-density GPU servers — represents a genuine differentiation that may help defend margins. But the 11% gross margin leaves little cushion for competitive pricing pressure.
Section 4 — Investment Framework
The investment framework for AI infrastructure stocks should be anchored to margin quality and moat sustainability. The hierarchy is clear: networking and software (high margins, strong moats) > custom silicon (high margins, customer concentration risk) > server assembly (low margins, commoditization risk).
Arista Networks at 32x forward earnings is expensive in absolute terms but reasonable given its 25%+ revenue growth rate and 64% gross margins. The stock is not cheap, but it is the highest-quality way to gain exposure to AI networking spend without betting on a single hyperscaler. The key risk is multiple compression if revenue growth decelerates to the 15-20% range that bears expect for 2027.
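The multiple-compression risk above is worth quantifying with a back-of-envelope scenario. The 24x de-rated multiple is our assumption for illustration, not a forecast from this note:

```python
# Back-of-envelope: one-year return if EPS grows but the multiple compresses.
eps_growth = 0.25   # 25% earnings growth, in line with current revenue growth
pe_now = 32         # current forward multiple
pe_later = 24       # assumed de-rated multiple if growth slows (illustrative)

total_return = (1 + eps_growth) * (pe_later / pe_now) - 1
print(f"Implied one-year return: {total_return:.1%}")  # -6.2%
```

Even with earnings up 25%, a de-rating from 32x to 24x leaves the stock slightly underwater, which is why the growth trajectory matters more than the headline multiple.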
Dell at 11x forward earnings carries a value-trap risk we take seriously. The AI server revenue surge masks declining profitability in the traditional PC and enterprise hardware business. Infrastructure Solutions Group (ISG) operating margins of 8.5% are structurally constrained by the server assembly business model. Dell is appropriate for investors seeking low-multiple exposure to AI capex, but position sizing should reflect the margin quality.
For most investors, Arista is the cleanest AI infrastructure expression at a defensible valuation. Broadcom (covered in our semiconductor supply chain piece) offers broader exposure with similar margin quality.
Verdict
AI infrastructure stocks offer genuine earnings growth with the AI buildout as a structural tailwind through at least 2027. Within the group, Arista Networks is the highest-conviction hold at current prices — premium valuation is justified by superior margins and competitive positioning. Dell offers value but low quality; Super Micro offers volume but governance and margin risk. We recommend overweighting Arista and Broadcom, holding Dell at market weight, and treating Super Micro as a speculative position with appropriate position sizing.
Data as of March 2026. Not financial advice.
— iBuidl Research Team