Nvidia, a supplier to and investor in Lambda, has now become its largest customer. Media reports indicate the company has reached agreements worth $1.5 billion with Lambda to lease servers powered by its own GPUs. Analysts say the deal continues Nvidia's strategy of backing small cloud service providers, similar to its earlier arrangement with CoreWeave, and is aimed at strengthening its competitive position against traditional cloud giants like Amazon and Google.
Media reports indicate that Lambda, a small AI cloud service provider preparing for an IPO, has recently received strong support from its most important supplier, Nvidia.
Sources familiar with the matter told the media that this summer, Nvidia agreed to lease from Lambda 10,000 GPU servers powered by Nvidia's own AI chips, in a four-year deal valued at $1.3 billion.
Nvidia also reached another $200 million deal with Lambda to lease 8,000 servers equipped with Nvidia chips, with the exact timing yet to be determined. These contracts make Nvidia Lambda's largest customer to date and lay the groundwork for the company's upcoming initial public offering (IPO).
Media reports indicate that this is the latest example of Nvidia using "circular" financial arrangements to promote its chips in the cloud market and help smaller cloud providers compete with traditional giants like Amazon and Google. This also demonstrates how the capital market in the AI field operates in an "internal cycle": Nvidia acts as a supplier, investor, and customer, supporting numerous "small cloud companies," also known as "neoclouds."
A business model similar to CoreWeave's
Lambda's business model involves leasing data center space, deploying servers equipped with Nvidia GPUs, and then leasing those servers to customers. It is unclear what Nvidia will pay to lease the GPU servers, or how it will record the transaction financially, since Nvidia is simultaneously the buyer, the seller of the underlying chips, and a Lambda shareholder. Another source familiar with the matter told the media that Nvidia's own researchers will also use the GPU servers it leases from Lambda.
In addition to Nvidia, Lambda's other major customers include Amazon and Microsoft, which together contributed half of the company's nearly $114 million in cloud revenue in the second quarter. Notably, Amazon and Microsoft use Lambda's GPU servers primarily for internal purposes, not to serve customers on their AWS or Azure platforms.
Lambda expects its cloud revenue to exceed $1 billion in 2026 and $20 billion in 2030, with the goal of securing contracts from major AI developers such as OpenAI, Google, Anthropic, and xAI.
Lambda also projects that its computing capacity will reach nearly 3 gigawatts (GW) by 2030, equivalent to nearly half the total capacity of some of today's largest cloud providers, up from just 47 megawatts (MW) in the second quarter of this year. How the company will achieve this growth remains unclear, but going public could make it easier to raise the debt needed to fund its expansion.
Lambda's business model and high customer concentration are similar to CoreWeave's. The latter is a larger GPU cloud service provider that recently went public and has also received significant backing from Nvidia.
Previously, Lambda primarily signed small, short-term GPU leases. This deal with Nvidia is its largest ever and is likely to boost its market momentum ahead of its IPO in the first half of next year.
Supporting small vendors is a consistent Nvidia strategy
Nvidia has consistently supported companies willing to adopt its chips and buy a broader range of its hardware than the traditional cloud giants do. For example, Nvidia clashed with Microsoft over GPU server rack design, while at Lambda the internal debate was over whether to adopt Nvidia's new optical networking technology.
Nvidia also facilitated the rapid rise of CoreWeave, which pivoted from crypto mining and signed a nearly identical agreement with Nvidia early in that transition. The agreement helped CoreWeave secure debt financing and expand its cloud business, taking market share from traditional cloud vendors.
Nvidia's support for small cloud service providers is intended to protect its core business over the long term. While Nvidia's largest customers remain Microsoft, Amazon, and Google, these tech giants are also developing their own AI chips in an effort to reduce their reliance on Nvidia.
Customer Concentration Risk
Although Amazon and Microsoft purchase far more GPUs for their own data centers than they lease from third parties like Lambda, both companies say their GPU servers are running at near-full capacity around the clock and that data center expansion is lagging behind demand.
Microsoft's contract with Lambda is significantly smaller than its lease agreement with CoreWeave, but Lambda executives stated that the company is in discussions with other potential customers for larger partnerships.
However, Lambda's ability to secure such deals remains questionable, and company executives acknowledged that, like other cloud service providers, Lambda faces challenges with power supply and data center space constraints.
CoreWeave also historically relied on leased data centers, but recently acquired a major power and data center company for $9 billion and plans to build its own sites to reduce costs.
Nvidia Partnership
Media reports indicate that Lambda executives said Nvidia's $1.3 billion GPU leasing agreement, codenamed "Project Comet," will support Nvidia's emerging cloud computing business, DGX Cloud. Through that platform, Nvidia leases GPUs from cloud providers and sublets them to AI developers; Nvidia's own researchers also use the platform.
Analysts suggest that Nvidia values Lambda for multiple reasons, one of which is that Lambda is attracting more customers to Nvidia GPUs. For example, Lambda recently signed a one-year partnership with image generation startup Midjourney to help the company migrate code originally running on Google's AI chips to Nvidia's next-generation Blackwell GPUs.
Lambda executives stated that converting Google AI chip users to Nvidia GPU users has earned the company higher recognition within Nvidia.
Google's TPU (Tensor Processing Unit) chips have become increasingly competitive in AI in recent years. Google has also approached GPU-focused cloud providers such as CoreWeave about deploying its chips, and one such company has already agreed to do so.