Nexthop AI

Nexthop AI builds custom AI-optimized networking infrastructure for hyperscalers.
Series B · $610M total raised · Founded 2024 · Santa Clara, California · 214 employees
Nexthop AI builds custom networking infrastructure purpose-built for AI workloads in hyperscale cloud data centers. The company combines custom hardware, software, and optical integration under a JDM (Joint Development Manufacturer) model to deliver tightly integrated solutions for GPU clusters. Compared with traditional networking vendors such as Cisco and Juniper, Nexthop's Disaggregated Spine architecture achieves up to 30% reductions in energy consumption and infrastructure costs while supporting extreme throughput (50-100+ terabits per second).
Problem solved
Hyperscalers need networking infrastructure tightly integrated with GPUs that traditional vendors cannot provide with the required customization, performance, and energy efficiency.
Target customer
Hyperscale cloud providers (Google, Amazon, Meta-equivalent) deploying large-scale AI training and inference clusters requiring custom networking solutions.
Founders
Anshul Sadana
Founder & CEO
17 years at Arista Networks (Chief Customer Officer, COO) and 8 years at Cisco; M.S. Computer Science from University of Illinois, MBA from Wharton.
Funding history
Series A $110M March 2025 Led by Lightspeed Venture Partners · Kleiner Perkins, WestBridge Capital, Battery Ventures, Emergent Ventures
Series B $500M March 2026 Led by Lightspeed Venture Partners · Andreessen Horowitz, Altimeter, all Series A investors
Total raised: $610M
Pricing
Not publicly available. B2B JDM model with custom solutions priced per engagement with hyperscalers.
Notable customers
Not disclosed. Company works exclusively with largest cloud providers on customized solutions.
Integrations
SONiC (open-source network OS), FBOSS, Broadcom Tomahawk 6 and Thor 800G NIC, merchant silicon partnerships
Tech stack
Webpack · WordPress (Blogs) · Google Analytics (Analytics) · PHP (Programming languages) · Google Workspace (Email) · Cloudflare (CDN) · MySQL (Databases) · Elementor (Page builders) · Hello Elementor (WordPress themes) · Contact Form 7 (WordPress plugins)
Competitors
Arista Networks
Arista focuses on standardized networking products; Nexthop provides custom hardware, software, and optical integration tailored to AI workloads.
Broadcom
Broadcom supplies merchant silicon chips; Nexthop integrates these chips into custom switching systems with software and optics optimization.
Cisco
A traditional vendor with legacy architecture and execution challenges; Nexthop is purpose-built for modern AI workloads with a disaggregated design and JDM model.
Juniper Networks
Established networking vendor; Nexthop offers tight GPU-network integration and customization that incumbents struggle to deliver.
Why this matters: Nexthop represents a new category in infrastructure: custom networking specifically architected for AI at massive scale. Founded by Arista's former COO with a team of 100+ engineers from Google, Amazon, and networking leaders, the company has raised $610M in just 12 months ($4.2B valuation) because hyperscalers face a critical gap—traditional networking vendors cannot deliver the GPU-network integration and customization required for next-generation AI workloads.
Best for: Hyperscale cloud providers building proprietary AI infrastructure who need custom-integrated networking that compresses product development cycles by 6-12 months versus building in-house.
Use cases
GPU Cluster Interconnect Optimization
Nexthop designs custom switches and optical interconnects to maximize throughput between compute nodes in large AI training clusters. Their 50-100+ terabit solutions, with 1.6 Tb/s per port, eliminate the bottlenecks that generic networking hardware creates for distributed training workloads.
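A quick back-of-envelope check of those throughput figures: aggregate switch capacity is simply port count times per-port speed. The radixes below are illustrative assumptions, not Nexthop-published specifications (though 64 ports at 1.6 Tb/s matches the 102.4 Tb/s class of current merchant silicon such as Broadcom's Tomahawk 6).

```python
# Back-of-envelope aggregate switch throughput.
# Port counts here are assumptions for illustration only.

def aggregate_tbps(ports: int, tb_per_port: float) -> float:
    """Total switching capacity in terabits per second."""
    return ports * tb_per_port

for ports in (32, 64):
    print(f"{ports} ports x 1.6 Tb/s = {aggregate_tbps(ports, 1.6):.1f} Tb/s")
```

At 64 ports the sketch lands at 102.4 Tb/s, squarely inside the "50-100+ terabit" range quoted above.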
Energy-Efficient Data Center Scaling
The Disaggregated Spine architecture reduces infrastructure energy consumption by up to 30% compared to traditional hierarchical designs. Hyperscalers deploying thousands of GPUs can achieve substantial OpEx savings through custom air-cooled and liquid-cooled switching options.
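To make the OpEx claim concrete, here is a minimal sketch of what a 30% reduction in networking energy draw is worth annually. The baseline fabric power (5 MW) and electricity price ($0.08/kWh) are assumptions chosen for illustration, not figures published by Nexthop.

```python
# Illustrative annual energy-cost savings from a 30% reduction in
# networking power draw. Baseline power and price are assumed values.

def annual_energy_cost(power_mw: float, usd_per_kwh: float) -> float:
    """Yearly electricity cost for a constant load, in USD."""
    hours_per_year = 24 * 365
    return power_mw * 1_000 * hours_per_year * usd_per_kwh

baseline = annual_energy_cost(power_mw=5.0, usd_per_kwh=0.08)
savings = baseline * 0.30
print(f"Baseline ${baseline:,.0f}/yr; 30% reduction saves ${savings:,.0f}/yr")
```

Under these assumed inputs the fabric costs about $3.5M/yr to power, so a 30% reduction saves roughly $1M/yr per 5 MW of networking load.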
Accelerated Product Development via JDM Model
Rather than building networking infrastructure entirely in-house (18-30 month cycles), hyperscalers partner with Nexthop's engineering team to co-develop custom solutions in 6-12 months, compressing time-to-market for new AI services.
Alternatives
In-house custom development: Takes 18-30 months; Nexthop's JDM model cuts this to 6-12 months by leveraging deep networking expertise.
Arista + custom integration: Uses standardized hardware requiring heavy integration work; Nexthop purpose-builds for AI from the ground up with co-optimized software and optics.
FAQ
What does Nexthop AI do?
Nexthop AI builds custom networking infrastructure for hyperscale AI data centers. The company designs switching systems, integrates open-source network operating systems (SONiC, FBOSS), and provides optical/electrical interconnect components optimized for AI training and inference workloads. All solutions are custom-built via a JDM model in partnership with cloud providers.
How much does Nexthop AI cost?
Pricing is not publicly available. Nexthop uses a B2B JDM (Joint Development Manufacturer) model where custom solutions are priced per engagement. Contact Nexthop directly for pricing based on your infrastructure requirements.
What are alternatives to Nexthop AI?
Alternatives include building networking infrastructure in-house (slower, 18-30 month cycles), using Arista Networks standardized switches with custom integration, or leveraging Broadcom merchant silicon directly. Nexthop's advantage is combining custom hardware, software, and optics in a faster co-development model.
Who uses Nexthop AI?
Nexthop serves the largest cloud providers (hyperscalers like Google, Amazon, Meta-equivalent companies) deploying large-scale AI clusters. Specific customer names are not disclosed, but the company works exclusively with tier-1 cloud infrastructure operators.
How does Nexthop AI compare to Arista?
Arista is a traditional networking vendor selling standardized products; Nexthop is purpose-built for AI with custom hardware, software, and optical integration. Nexthop's JDM model accelerates development 6-12 months faster than in-house builds and delivers 30% energy savings versus traditional architectures. Arista focuses on broad enterprise networking; Nexthop focuses exclusively on hyperscaler AI infrastructure.
Tags
networking · AI infrastructure · data center · custom hardware · GPU interconnect · hyperscalers · disaggregated architecture