Lightmatter

Lightmatter replaces electrical connections with photonic technology for faster, more efficient AI computing.
Series D · $850M total raised · Founded 2017 · Boston, Massachusetts · 139 employees
Lightmatter develops photonic processors and interconnects that dramatically reduce energy consumption and latency in AI computing by replacing electrical connections with optical ones. The company's Envise chip combines traditional electronics with integrated photonic elements for AI acceleration, while its Passage interconnect technology enables data to move 100x faster between chips. They serve hyperscale cloud providers and AI data center operators, addressing the critical bottleneck of data movement energy consumption in modern AI workloads.
Problem solved
In AI workloads, memory access and data movement consume as much energy as computation itself, or more, creating a critical bottleneck that caps overall efficiency no matter how much the compute units improve.
Target customer
Hyperscale cloud providers and AI data center operators (Amazon, Google, Meta scale), enterprise companies deploying large-scale AI infrastructure requiring extreme energy efficiency.
Founders
Nicholas Harris
CEO & Co-Founder
PhD in EECS from MIT; former R&D engineer at Micron Technology focused on DRAM and NAND circuits; 87+ patents; Optica Fellow; MIT Technology Review Innovators Under 35 (2021).
Darius Bunandar
Co-Founder
Background in photonic technology development.
Thomas Graham
Co-Founder
Lightmatter's COO; former Google employee who brings enterprise operations and scaling experience.
Funding history
Grant $100K May 9, 2017 (prize/grant funding)
Series A $33M February 11, 2019 Led by Google Ventures
Series B Amount undisclosed May 6, 2021 Led by SIP Global Partners · Lockheed Martin
Series C $154M May 31, 2023 Led by SIP Global · Fidelity Management & Research Company, Viking Global Investors, Google Ventures, HPE Pathfinder
Series C-2 $155M December 19, 2023 Led by Google Ventures · Viking Global Investors
Series D $400M October 11, 2024 Led by T. Rowe Price
Total raised: $850M
Pricing
Not publicly available. Company operates B2B hardware sales model focused on large enterprise deals with cloud and AI data center operators.
Notable customers
Not disclosed. Company is demonstrating technology with major hyperscalers; has pitched to Amazon and Google.
Integrations
Laser systems for powering photonic chips, CMOS semiconductor manufacturing processes.
Tech stack
jQuery Migrate (JavaScript libraries) · Alpine.js (JavaScript frameworks) · jQuery (JavaScript libraries) · Vimeo (Video players) · three.js (JavaScript graphics) · RSS · Open Graph · WordPress (Blogs) · Google Analytics (Analytics) · Twitter Emoji (Font scripts) · Nginx (Reverse proxies) · PHP (Programming languages) · Ubuntu (Operating systems) · Google Workspace (Email) · MySQL (Databases) · Salesforce (CRM) · The SEO Framework (SEO) · Amazon Web Services (PaaS) · Mailgun (Email) · WPForms (WordPress plugins)
Competitors
Ayar Labs
Focuses exclusively on optical I/O interconnect solutions without computing capabilities; has secured partnerships with Intel, AMD, and NVIDIA but operates with a narrower product scope.
Celestial AI
Develops optical interconnect technology with narrower focus than Lightmatter's full-stack approach combining computing and interconnects.
Luminous Computing
Develops photonic AI accelerator chips similar to Envise but is at an earlier stage, with less mature technology and a smaller funding base ($105M Series A).
Why this matters: Lightmatter represents a fundamental shift in AI infrastructure, moving beyond incremental GPU improvements to rearchitect computing around photonics. With $850M raised and a $4.4B valuation, the company is backed by major institutional investors and is courting hyperscalers facing existential power and efficiency constraints at AI scale.
Best for: Hyperscale cloud providers and AI infrastructure operators deploying massive AI training and inference workloads where energy efficiency and interconnect bandwidth are critical competitive advantages.
Use cases
Data center AI cluster interconnects
Replacing electrical connections between AI compute nodes with optical interconnects via Passage technology. Reduces data movement latency and energy consumption by up to 100x, enabling more efficient scaling of large language models and AI training clusters across multiple GPUs/TPUs.
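The data-movement argument above can be sketched as a back-of-envelope calculation. The energy-per-bit figures below are illustrative assumptions, not Lightmatter or Passage specifications; they stand in for the kind of order-of-magnitude gap optical interconnects target over electrical SerDes.

```python
# Back-of-envelope sketch with assumed (placeholder) energy costs, not
# vendor specs: energy to move data over an electrical vs an optical link.
ELECTRICAL_PJ_PER_BIT = 10.0   # assumed: off-package electrical link
OPTICAL_PJ_PER_BIT = 1.0       # assumed: integrated photonic link

def transfer_energy_joules(num_bytes, pj_per_bit):
    """Energy to move num_bytes across a link at a given pJ/bit cost."""
    return num_bytes * 8 * pj_per_bit * 1e-12

# Moving the weights of a 70B-parameter model once (2 bytes/param, fp16):
num_bytes = 70e9 * 2
electrical = transfer_energy_joules(num_bytes, ELECTRICAL_PJ_PER_BIT)
optical = transfer_energy_joules(num_bytes, OPTICAL_PJ_PER_BIT)
print(f"electrical: {electrical:.2f} J, optical: {optical:.2f} J")
```

At training-cluster scale this transfer repeats continuously across many links, which is why per-bit energy, not peak FLOPS, becomes the binding constraint.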
High-throughput neural network inference
Using Envise photonic processors to accelerate matrix multiplications in deep learning models. Combines photonic elements for optical computation with traditional CMOS for memory, reducing power consumption while maintaining inference throughput for image recognition and NLP applications.
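As a conceptual illustration of the idea above (not Lightmatter's actual architecture), a photonic tensor core can be modeled as an ideal analog matrix-vector product whose outputs pass through a finite-precision detector/ADC stage. The quantization step and bit width here are assumptions chosen for the sketch.

```python
# Conceptual model of a photonic matrix-vector product: weights set optical
# transmission, inputs are optical amplitudes, and the detected analog
# output has finite precision. All parameters here are illustrative.

def quantize(x, bits=8, full_scale=1.0):
    """Model finite detector/ADC precision on an analog optical output."""
    step = full_scale / 2 ** (bits - 1)
    q = round(x / step) * step
    return max(-full_scale, min(full_scale, q))

def photonic_matvec(weights, inputs, bits=8):
    """Ideal optical matrix-vector product followed by output quantization.

    weights: list of rows, each a list of floats in [-1, 1]
    inputs:  list of floats in [-1, 1] (optical amplitudes)
    """
    outputs = []
    for row in weights:
        acc = sum(w * x for w, x in zip(row, inputs))
        # Normalize by fan-in so the analog output stays within full scale.
        outputs.append(quantize(acc / len(row), bits=bits))
    return outputs

W = [[0.5, -0.25], [0.125, 1.0]]
x = [0.8, 0.4]
print(photonic_matvec(W, x))
```

The point of the sketch is the trade at the heart of analog photonic compute: the multiply-accumulate is effectively free in energy terms once the light is modulated, but precision is bounded by the conversion back to the electrical domain, which is why Envise pairs photonic math with conventional CMOS for memory and control.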
Energy-efficient AI compute scaling
Replacing CPU/GPU clusters with photonic computing infrastructure to reduce total power consumption of AI data centers. Critical for hyperscalers facing power budget constraints and cooling limitations as AI workloads grow exponentially.
Alternatives
Traditional GPU/CPU clusters: Electrical-only architecture with higher data-movement overhead; choose when latency and efficiency matter less than vendor ecosystem and software maturity.
Ayar Labs optical I/O: Pure interconnect play without computing capabilities; choose if you want to retain existing compute hardware while optimizing only chip-to-chip connections.
FAQ
What does Lightmatter do?
Lightmatter develops photonic chips and interconnect systems that dramatically improve energy efficiency and speed in AI computing. The Envise processor accelerates AI workloads using integrated photonics, while Passage interconnects replace electrical connections with optical ones to enable 100x faster data movement between chips in AI data centers.
How much does Lightmatter cost?
Pricing is not publicly available. Lightmatter operates a B2B enterprise sales model focused on large deals with hyperscale cloud and AI data center operators. Contact directly for pricing and custom deployment options.
What are alternatives to Lightmatter?
Ayar Labs focuses on optical I/O interconnects only; Luminous Computing and Celestial AI offer narrower photonic solutions. Traditional GPU/CPU clusters remain the dominant alternative for enterprises without extreme efficiency requirements.
Who uses Lightmatter?
Lightmatter targets hyperscale cloud and AI data center operators. Specific customers are not publicly disclosed, though the company has demonstrated its technology with Amazon, Google, and other hyperscalers.
How does Lightmatter compare to Ayar Labs?
Lightmatter offers a full-stack solution combining photonic computing (Envise) and photonic interconnects (Passage), while Ayar Labs focuses exclusively on optical I/O connections without computing capabilities. Lightmatter has higher valuation ($4.4B vs $1B) and more funding ($850M vs $155M), though Ayar Labs has secured strategic partnerships with major chip manufacturers.
Tags
photonic computing optical interconnect AI acceleration data center efficiency energy efficiency semiconductor hardware neural networks