Startups Directory

6 funded AI hardware startups, listed by funding amount. Updated weekly.

d-Matrix
Round: Series C | Amount: $275M | Date: 2025-11-12 | Industry: Artificial Intelligence (AI) | Location: United States
d-Matrix develops specialized AI inference accelerator hardware for data centers, featuring the Corsair inference accelerator card, JetStream I/O networking accelerator, and SquadRack rack-scale system. The company uses digital in-memory compute architecture to deliver 3x better cost-performance and 10x faster token generation compared to traditional GPU-based inference solutions. d-Matrix targets the massive AI inference market (where most compute costs occur after model training) by optimizing for real-time latency, energy efficiency, and cost at scale.

Lightmatter
Round: Series C | Amount: $155M | Date: 2023-12-19 | Industry: Artificial Intelligence (AI) | Location: United States
Lightmatter develops photonic processors and interconnects that dramatically reduce energy consumption and latency in AI computing by replacing electrical connections with optical ones. The company's Envise chip combines traditional electronics with integrated photonic elements for AI acceleration, while its Passage interconnect technology enables data to move 100x faster between chips. Lightmatter serves hyperscale cloud providers and AI data center operators, addressing the critical bottleneck of data movement energy consumption in modern AI workloads.

FuriosaAI
Round: Series C | Amount: $125M | Date: 2025-07-31 | Industry: Artificial Intelligence (AI) | Location: South Korea
FuriosaAI designs and manufactures high-performance, energy-efficient AI accelerators (NPUs) for data center deployments of large language models, multimodal applications, and computer vision workloads. Its flagship RNGD chip delivers 2.25x better inference performance per watt compared to traditional GPUs while consuming only 180W of power. The company serves enterprise customers and hyperscalers seeking cost-effective alternatives to Nvidia GPUs for AI inference at scale.

Hailo
Round: Series C | Amount: $120M | Date: 2024-04-02 | Industry: Artificial Intelligence (AI) | Location: Israel
Hailo develops patented silicon AI accelerators that enable efficient deep learning inference on edge devices with minimal power consumption (<3W), eliminating cloud dependency while protecting privacy. The Hailo-8 and Hailo-10H chips integrate all required memory on the processor die, delivering 26+ TOPS of performance while maintaining exceptional power efficiency. Serving the automotive, security, retail, industrial, and medical sectors, Hailo has secured over 300 customers, including HP, Dell, BMW, Sony, and Bosch.

Enfabrica Corporation
Round: Series C | Amount: $115M | Date: 2024-11-19 | Industry: Artificial Intelligence (AI) | Location: United States
Enfabrica develops specialized networking chips and systems that solve critical data movement bottlenecks for AI and machine learning workloads at hyperscale. Its Accelerated Compute Fabric (ACF) SuperNIC chip delivers multi-terabit Ethernet connectivity and memory disaggregation, enabling GPUs, CPUs, and memory to communicate 4x faster than competing solutions. Built by former Broadcom and Google infrastructure leaders, the company bridges PCIe/CXL memory semantics with RDMA networking to reduce AI processing costs and improve efficiency.

Recogni
Round: Series C | Amount: $102M | Date: 2024-02-20 | Industry: Artificial Intelligence (AI) | Location: United States
Recogni develops AI inference processors that deliver 10x higher compute density, 10x lower power consumption, and 13x lower cost per query compared to GPU-based solutions. The company applies proprietary logarithmic number system (LNS) technology to enable efficient inference at scale for generative AI and autonomous vehicle applications. Recogni targets cloud data centers and automotive OEMs seeking to reduce the power, cooling, and infrastructure costs of AI deployment.