
Inference.ai matches AI workloads with cloud GPU compute


Inference.ai, a startup building a platform that matches AI workloads with available GPU compute, has raised new venture capital.

Inference.ai uses algorithms to match companies’ workloads with GPU resources, co-founder John Yue says, aiming to take the guesswork out of choosing and acquiring infrastructure. The company claims that, thanks to its algorithmic matching tech and deals with data center operators, it can offer dramatically cheaper GPU compute with better availability than major public cloud providers. The startup recently closed a $4 million round from Cherubic Ventures, Maple VC and Fusion Fund, which Yue says is being put toward building out Inference.ai’s deployment infrastructure.
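The article does not describe how Inference.ai's matching actually works. As a rough illustration only, the sketch below shows one simple way a workload-to-GPU matcher could be structured: filter a catalog of GPU offers by memory, availability, and price constraints, then pick the cheapest feasible option. The catalog, field names, and scoring rule are assumptions for demonstration, not Inference.ai's system.

```python
# Illustrative sketch only: a toy heuristic for matching an AI workload to a
# GPU offer. All data and field names here are hypothetical; this does not
# describe Inference.ai's actual matching algorithm.
from dataclasses import dataclass
from typing import Optional


@dataclass
class GpuOffer:
    provider: str
    gpu_model: str
    vram_gb: int              # memory per GPU
    gpus_available: int
    price_per_gpu_hour: float  # USD


@dataclass
class Workload:
    min_vram_gb: int           # memory needed per GPU
    gpus_needed: int
    max_price_per_gpu_hour: float


def match_workload(workload: Workload, catalog: list[GpuOffer]) -> Optional[GpuOffer]:
    """Return the cheapest offer that satisfies the workload's requirements."""
    feasible = [
        offer for offer in catalog
        if offer.vram_gb >= workload.min_vram_gb
        and offer.gpus_available >= workload.gpus_needed
        and offer.price_per_gpu_hour <= workload.max_price_per_gpu_hour
    ]
    # Lowest-cost feasible offer, or None if nothing in the catalog qualifies.
    return min(feasible, key=lambda o: o.price_per_gpu_hour, default=None)


if __name__ == "__main__":
    catalog = [
        GpuOffer("dc-west", "A100 80GB", 80, 16, 2.10),
        GpuOffer("dc-east", "H100 80GB", 80, 8, 3.50),
        GpuOffer("dc-south", "L40S 48GB", 48, 32, 1.20),
    ]
    job = Workload(min_vram_gb=80, gpus_needed=8, max_price_per_gpu_hour=2.50)
    print(match_workload(job, catalog))  # picks the A100 offer in this toy catalog
```

A real system would also weigh factors such as interconnect, region, and reservation length, but the basic shape of constraint filtering plus cost ranking is a common starting point.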


Read the full story on TechCrunch.

Read more on: GPU, AI workloads, Inference.ai

Related news:

Entry-level GPU RAID card enables mind-bending storage speeds — 80 GB/s of throughput from eight SSDs with SupremeRAID SR-1001

Apple Vision Pro M2 Chip Said to Have 10 GPU, 8 CPU Cores

Cooler Master unveils dual-fan GPU cooler prototype