Summary
For years, the AI race has been narrated as a story about GPUs, model breakthroughs, and who can buy more silicon faster. That story is becoming obsolete. The limiting factor is shifting to electricity, and to the unglamorous machinery that moves it from the grid to the GPU without wasting it as heat.
Peak XV’s backing of Indian startup C2i, which has raised $15 million to test a grid-to-GPU approach to cutting power losses, is a signal that the industry is starting to treat energy efficiency not as a sustainability add-on but as a throughput problem. In a world where data centers are hitting power ceilings, shaving losses can translate into real compute and real revenue.
Power is the new compute
AI operators can order more accelerators, but they cannot conjure extra megawatts from a constrained local grid. In many markets, the wait is not for hardware deliveries but for interconnect approvals, substation upgrades, and long negotiations with utilities that have their own politics and backlogs. When the ceiling is power, efficiency becomes the closest thing to a cheat code.
This is why C2i’s premise matters. If you can reduce the losses between the grid intake and the GPUs actually doing work, you effectively unlock capacity inside the same power envelope. That is not an abstract green win. It is a way to run more inference requests, train longer, and keep utilization high without building a new facility or gambling on a fragile supply chain.
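The arithmetic behind "unlocking capacity inside the same power envelope" is simple to sketch. The figures below are hypothetical, chosen only to illustrate the mechanism; the source does not disclose C2i's actual loss reductions or any facility's numbers:

```python
# Illustrative sketch (all numbers hypothetical): how shaving
# grid-to-GPU power losses frees compute inside a fixed envelope.

GRID_INTAKE_MW = 30.0   # fixed power ceiling at the facility
LOSS_BEFORE = 0.12      # 12% lost between grid intake and the GPUs
LOSS_AFTER = 0.07       # 7% after efficiency improvements
GPU_DRAW_KW = 0.7       # ~700 W per accelerator (hypothetical)

def usable_mw(intake_mw: float, loss: float) -> float:
    """Power actually reaching the GPUs after conversion and distribution losses."""
    return intake_mw * (1.0 - loss)

# Capacity freed without adding a single megawatt of grid intake.
freed_mw = usable_mw(GRID_INTAKE_MW, LOSS_AFTER) - usable_mw(GRID_INTAKE_MW, LOSS_BEFORE)
extra_gpus = int(freed_mw * 1000 / GPU_DRAW_KW)

print(f"Freed capacity: {freed_mw:.2f} MW -> ~{extra_gpus} extra GPUs")
# -> Freed capacity: 1.50 MW -> ~2142 extra GPUs
```

Under these assumed numbers, a five-point loss reduction frees 1.5 MW, enough to power roughly two thousand additional accelerators without a new substation or interconnect.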
A quiet shift in venture logic
Venture capital has loved the visible layers of AI (models, apps, and chips) because they map neatly onto growth narratives. Power infrastructure is slower, messier, and tied to regulation and hardware constraints. Yet the market is being forced to admit that the real moat might be electrical engineering, not another wrapper product.
Peak XV’s bet also hints at a geographic inversion. India is often cast as a demand story for AI, but the next wave may include exportable infrastructure innovations, born from an environment where efficiency is not a virtue signal but a matter of survival. When your grid is stressed and your margins are tight, waste stops being tolerable.
The uncomfortable implication
There is a reason the industry prefers to talk about clever models rather than power losses. The former flatters our sense of progress; the latter reminds us that AI is still bound to physics, permits, and copper. If the constraint is energy, then the winners are not only the labs with the best researchers but also the operators who can extract more compute from the same watt.
C2i’s approach may succeed, or it may run into the brutal realities of integration, reliability, and conservative data center procurement. But the direction is clear. The AI boom is becoming an infrastructure story again, and the most important breakthroughs might look less like magic and more like fewer electrons wasted as heat.