Niv-AI Raises $12M to Fix GPU Power Waste in AI Data Centers
A new startup is stepping into one of artificial intelligence’s biggest hidden problems: power inefficiency. As AI systems grow more demanding, the electricity needed to run them is becoming just as critical as the chips themselves—and current infrastructure is struggling to keep up.
Niv-AI, a Tel Aviv-based startup, has officially emerged from stealth mode with $12 million in seed funding. Its mission is simple but ambitious: help data centers use GPU power more efficiently and stop wasting energy that’s already being paid for.
The issue isn’t small. During a recent keynote at Nvidia’s GTC conference, CEO Jensen Huang pointed out how much energy is being lost inside modern AI facilities. According to him, unused power directly translates into lost revenue—a problem that’s only getting worse as AI workloads scale.
Founded in 2024 by CEO Tomer Timor and CTO Edward Kizis, Niv-AI is backed by investors including Glilot Capital, Grove Ventures, Arc VC, Encoded VC, Leap Forward, and Aurora Capital Partners. The company has not disclosed its valuation.
So what exactly is going wrong inside data centers?
When thousands of GPUs work together to train or run AI models, they don’t consume power at a steady rate. Instead, they generate rapid spikes in demand—sometimes within milliseconds—as they switch between tasks and communicate with each other. These sudden surges make it difficult for data centers to draw consistent power from the grid.
To deal with this, operators often rely on temporary energy storage systems or limit GPU usage altogether. Both approaches come at a cost, reducing the overall return on expensive hardware investments.
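To make the trade-off concrete, here is a minimal, purely illustrative sketch of the problem the article describes: a rack whose draw spikes and dips on millisecond timescales, smoothed by capping what comes from the grid and letting local storage absorb the rest. All figures (40 kW bursts, a 30 kW grid cap) are invented for illustration and are not Niv-AI's measurements.

```python
import random

random.seed(0)

# Hypothetical numbers: a rack that alternates between ~40 kW compute
# bursts and ~12 kW communication lulls every few milliseconds.
BURST_KW, IDLE_KW = 40.0, 12.0

def rack_draw_kw(ms: int) -> float:
    """Return the rack's instantaneous draw: bursty, not steady."""
    in_burst = (ms // 5) % 2 == 0          # 5 ms bursts, 5 ms lulls
    jitter = random.uniform(-1.0, 1.0)
    return (BURST_KW if in_burst else IDLE_KW) + jitter

def smooth_with_buffer(samples, grid_limit_kw):
    """Cap grid draw at a fixed limit; local storage (e.g. batteries)
    covers the shortfall. Returns (grid_draw, buffer_flow) series."""
    grid, buffer_flow = [], []
    for p in samples:
        g = min(p, grid_limit_kw)          # grid supplies up to the cap
        grid.append(g)
        buffer_flow.append(p - g)          # positive = buffer discharging
    return grid, buffer_flow

samples = [rack_draw_kw(t) for t in range(20)]
grid, flow = smooth_with_buffer(samples, grid_limit_kw=30.0)
print(f"peak raw draw:  {max(samples):.1f} kW")
print(f"peak grid draw: {max(grid):.1f} kW")
```

The catch, as the article notes, is that both mitigations cost money: storage hardware is expensive, and capping power (the other common option) means throttling GPUs that were bought to run flat out.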
Niv-AI believes the solution starts with visibility. The company is currently deploying rack-level sensors that track GPU power usage in real time, down to the millisecond. By studying how different AI workloads consume energy, the startup aims to build smarter ways to balance and optimize power use.
Over time, the team plans to develop an AI-powered system that can predict and coordinate power demand across entire data centers. Think of it as a “copilot” for engineers—helping them run more GPUs without overloading the grid.
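Niv-AI has not published how its system works, but the "copilot" idea can be sketched in a few lines: forecast near-term demand from recent telemetry, then report how much headroom remains under a fixed grid allocation. The class name, numbers, and naive moving-average forecast below are all hypothetical.

```python
from collections import deque

class PowerCopilot:
    """Toy sketch of a predictive power coordinator (not Niv-AI's design)."""

    def __init__(self, grid_cap_kw: float, window: int = 8):
        self.grid_cap_kw = grid_cap_kw
        self.history = deque(maxlen=window)   # recent draw samples

    def record(self, draw_kw: float) -> None:
        self.history.append(draw_kw)

    def forecast_kw(self) -> float:
        """Naive forecast: mean of the recent window. A production
        system would presumably use a learned model instead."""
        return sum(self.history) / len(self.history)

    def headroom_kw(self) -> float:
        """Extra load that could be admitted without exceeding the cap."""
        return max(0.0, self.grid_cap_kw - self.forecast_kw())

copilot = PowerCopilot(grid_cap_kw=100.0)
for draw in [62.0, 70.0, 66.0, 68.0]:
    copilot.record(draw)
print(f"forecast: {copilot.forecast_kw():.1f} kW")
print(f"headroom: {copilot.headroom_kw():.1f} kW")
```

The point of the sketch is the shape of the loop, not the math: continuous telemetry in, a demand forecast out, and a headroom number an engineer (or scheduler) can act on.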
The company expects to roll out its first operational systems in several U.S. data centers within the next six to eight months.
The timing could work in its favor. Large cloud providers are facing increasing challenges when building new data centers, from land constraints to supply chain issues. That makes solutions that improve efficiency within existing facilities especially appealing.
Niv-AI’s founders see their technology as a missing layer between data centers and the power grid. It’s not just about boosting performance—it’s also about making energy consumption more predictable and responsible.
As Timor explained, the challenge goes both ways: helping data centers get more out of their GPUs while also ensuring they don’t overwhelm the grid at critical moments.
If Niv-AI succeeds, it could unlock significant unused capacity in today’s AI infrastructure—without requiring more power, just smarter use of it.
