INFERENCE.NET (KUZCO): technology, facts, and figures.
Inference.net doesn't create hype - it creates a product. The team operates without flashy announcements, but the numbers speak for themselves. The question is: Will you notice this opportunity in time? In this article, based on facts and figures, I will briefly explain the essence of the project and show its true value.
First things first—subscribe to my channel at https://t.me/topimzauspex. That's where I publish all early news about promising projects and will soon drop a guide on "How to connect to Inference.net."
I'll start by saying that I've been involved with Inference.net (formerly Kuzco) since early summer 2024, back when it still carried the Kuzco name; the project itself launched and began development in March 2024.
By 2024, the AI narrative had rapidly become one of the most prominent trends. Given the pace at which neural networks, AI products, and robotics are advancing, it is hard to keep up with the speed of innovation.
The core technical challenge is that every neural network and AI service needs compute to process user requests (inference), and that creates a fundamental problem: demand is immense, but processing resources are limited. Simply put, there is not enough compute to go around.
And this is exactly where Inference.net steps in with its solution to the problem of limited computing resources.
Inference.net is a platform that provides developers with fast and easy-to-use APIs for integrating advanced artificial intelligence models, such as DeepSeek R1 and Llama 3.3, into their applications.
Advantages of Inference.net compared to competitors:
- High Performance: the platform offers low latency and high throughput thanks to its GPU-optimized infrastructure.
- Cost Efficiency: Inference.net's pricing is up to 90% lower than other providers' (I'll explain why below).
- Ease of Integration: the APIs are compatible with OpenAI's SDK, so they can be dropped into existing projects quickly with minimal code changes.
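Because the APIs follow the OpenAI format, pointing an existing app at Inference.net is mostly a matter of swapping the base URL and API key. Here is a minimal sketch in Python; the base URL and model identifier below are my illustrative assumptions, so check the official docs for the exact values:

```python
# Minimal sketch of calling an OpenAI-compatible endpoint with the openai SDK.
# NOTE: the base_url and model name are illustrative assumptions, not values
# confirmed by Inference.net's documentation.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.inference.net/v1",  # assumed endpoint
    api_key="YOUR_INFERENCE_API_KEY",
)

response = client.chat.completions.create(
    model="meta-llama/llama-3.3-70b-instruct",  # assumed model identifier
    messages=[
        {"role": "user", "content": "Explain AI inference in one sentence."}
    ],
)
print(response.choices[0].message.content)
```

The point is that an application already written against OpenAI's SDK should only need the client constructor arguments changed.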
So, the project's core idea is to put unused data-center computing resources (capacity that would otherwise sit idle without being rented out) to work handling AI inference requests. This makes efficient use of available capacity and allows the platform to offer clients competitive pricing for request processing.
Let's pause here! Think about it: how much computing power worldwide is sitting idle, just waiting for any task to come along? The answer: a lot, a tremendous amount.
INFERENCE.NET has successfully completed Epoch 1 and is currently running Epoch 2. And to give you an inside scoop: Epoch 3, the final stage, is coming soon.
The essence of each test Epoch is to load as many tasks as possible onto GPU devices and assess the overall system stability under the stress of processing various AI models.
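To make the idea more concrete, here is a purely conceptual sketch of what a GPU worker in a network like this does: poll for a task, run it on the local GPU, and report the result to earn points. The coordinator URL, endpoints, fields, and helper function are invented for illustration and do not describe Inference.net's actual worker protocol.

```python
# Conceptual sketch only: how a worker node in a distributed inference network
# might operate. Endpoints, fields, and run_inference() are hypothetical and
# do NOT describe Inference.net's real worker protocol.
import time
import requests

COORDINATOR = "https://coordinator.example/api"  # hypothetical URL


def run_inference(model_name: str, prompt: str) -> str:
    """Placeholder for the actual local GPU inference call (e.g. an LLM runtime)."""
    return f"[{model_name}] response to: {prompt}"


def worker_loop(worker_id: str) -> None:
    while True:
        # Ask the coordinator for the next task assigned to this worker.
        task = requests.get(f"{COORDINATOR}/next-task", params={"worker": worker_id}).json()
        if not task:
            time.sleep(1)  # nothing to do right now; stay idle
            continue
        output = run_inference(task["model"], task["prompt"])
        # Report the result; the network credits the worker with points.
        requests.post(f"{COORDINATOR}/result", json={"task_id": task["id"], "output": output})
```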
Key network metrics are recorded daily. Below is a screenshot of the latest 24-hour data:
Here’s what these numbers mean:
- 5.2K GPU devices are connected to the Inference.net network, contributing their computing power.
- 4.4K RPM (Requests Per Minute)—the network processes 4,400 AI model requests every minute!
- 9.5B $KZO points were distributed among all participants within the last 24 hours as rewards for contributing their resources.
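A quick back-of-the-envelope calculation using the figures above (simple per-GPU averages; actual rewards will vary with GPU type and uptime, so treat this purely as illustration):

```python
# Rough averages based on the 24-hour figures quoted above.
gpus = 5_200             # connected GPU devices
rpm = 4_400              # requests per minute across the network
points_per_day = 9.5e9   # $KZO points distributed in 24 hours

requests_per_day = rpm * 60 * 24            # ≈ 6.3M requests per day
requests_per_gpu = requests_per_day / gpus  # ≈ 1,200 requests per GPU per day
points_per_gpu = points_per_day / gpus      # ≈ 1.8M points per GPU per day

print(f"{requests_per_day:,.0f} requests/day, "
      f"~{requests_per_gpu:,.0f} requests and ~{points_per_gpu:,.0f} points per GPU per day")
```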
At the beginning of 2025, I saw the network loaded up to 25K RPM, after which it crashed. As a result, the team decided to ramp up throughput gradually, starting from 1–2K RPM and scaling toward 4–5K RPM, where inference has now been running stably for a couple of weeks without failures.
So, what’s the real gem for us, ordinary users?
The real value lies in the fact that right now, we can participate in the testnet and earn KZO points, which will later be converted into the project's token.
In the screenshot, as an example of point mining, you can see that:
- 14.1K requests were processed by my workers in 24 hours.
- 20.9M $KZO points were mined by my workers in total over the past day.
- 1B $KZO points have been accumulated by me throughout Epoch2.
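The same kind of napkin math applied to my own stats (again, a single 24-hour snapshot, so only a rough estimate):

```python
# Rough estimates from my worker stats above (one 24-hour snapshot).
requests_24h = 14_100   # requests processed by my workers in a day
points_24h = 20.9e6     # $KZO points mined in the same day
epoch2_total = 1e9      # points accumulated over Epoch 2 so far

points_per_request = points_24h / requests_24h  # ≈ 1,480 points per request
days_at_this_rate = epoch2_total / points_24h   # ≈ 48 days of mining at this pace

print(f"~{points_per_request:,.0f} points per request; "
      f"the Epoch 2 total corresponds to ~{days_at_this_rate:.0f} days at the current rate")
```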
There’s plenty of confirmation that KZO points will be converted into tokens, especially from the project’s CEO, Sam Hogan. The Token Generation Event (TGE) and listings have been delayed multiple times, but given the goal of building a high-quality product, that seems reasonable. According to the CEO, TGE is now planned for “closer to summer.”
Now that we understand points = tokens, let’s move on to what everyone loves—investments.
To start, the project has officially raised only $500K in funding. Here’s a reference from Cryptorank:
🔗 Cryptorank Kuzco Funding Rounds
And this is where things start to get really interesting. Let’s dig deeper into the investment side.
Check out the TwitterScore of CEO Sam Hogan:
🔗 TwitterScore Sam Hogan
Looking at TwitterScore, one thing becomes clear: Sam Hogan has an extensive network of connections. I won't go into detail about the specific funds and their influence, but the a16z, Multicoin, Coinbase, and Paradigm connections alone speak volumes.
Next, head over to the "About" section on their website:
🔗 Inference.net About Page
And there, you’ll find a very interesting icon…
And there it is—among the investors, we see the same big names that follow the CEO:
- Multicoin Capital
- a16z CSX (Andreessen Horowitz)
- Topology
- Founders, Inc.
- Chaotic Capital
- Frictionless Capital
This confirms that Inference.net has backing from some of the most influential venture capital firms in the space.
BUT the investment amounts are still undisclosed.
Here’s a little alpha straight from a private Discord thread shared by the CEO:
"We have already announced our funding several times to our Gold workers. We raised ~$11.5M from Multicoin and a16z CSX. This is no longer a secret at this point. We just haven’t made an official announcement.
And to be honest? Announcing a funding round just for the sake of it is pretty lame.
Epoch 2 is going to be a lot of fun 🙂 Many exciting things are coming your way."
Once again—$11,500,000 USD in investments from Multicoin and a16z! 🚀🔥
Yes, it’s all exciting, but as we know, having VC backing ≠ guaranteed profitability.
The only thing I’ll add—besides a few words in a private Discord thread and some logos on the website, there’s no official funding announcement. A small detail, but one I really like.
Why? Because 95% of projects start hyping themselves up before they even secure funding. Here, it’s the opposite.
It’ll be interesting to see how this plays out—the narrative and the problem they’re solving both look extremely bullish. 🚀
Conclusion & Key Takeaways
✅ AI is a trending narrative
✅ A product that solves a real problem—idle computing power utilization
✅ A top-tier team of builders—focused on execution, not hype
✅ Strong connections in the industry, led by CEO Sam Hogan
✅ Backed by top-tier VCs—Multicoin, a16z CSX, and others
Thus, Inference.net looks highly promising, especially considering the issue of idle GPU resources.
What sets it apart is the strategic approach of the team and CEO, who deliberately avoid unnecessary hype and flashy investment announcements, focusing instead on building a product that is already delivering stable results for the second epoch in a row. Meanwhile, we are farming points now, which will later convert into tokens.
The presence of major funds like Multicoin, a16z, Coinbase, and Paradigm, along with over $11.5M in investments, signals strong market confidence in the project's future.
All of this combined makes Inference.net one of the most interesting and strongest AI projects of the past six months. Given the trend, the team, and the investors, the likelihood of success looks very high. I’m ALL-IN!
Now, it's just a matter of closely following its development. But one thing is already clear—overlooking this project would be a big mistake.
Thanks for sticking with me until the end!
And last but not least—if you want to be among the first, subscribe to my channel https://t.me/topimzauspex. That’s where I share insider info on promising projects and the latest updates! 🚀
Best regards, Marson_Kotovi4 🚀
Official Inference.net Sources:
🌐 Website: https://inference.net/
🐦 Twitter (X): https://x.com/kuzco_xyz
💬 Discord: https://discord.gg/ujMtg4J7