Summary:
- Nvidia’s DGX Cloud Lepton platform is currently in early access and has already gained support from major companies, which will make tens of thousands of GPUs available to customers.
- Developers can access GPU compute capacity in specific regions for on-demand and long-term computing to support AI operational requirements.
- The platform utilizes Nvidia’s AI software stack to accelerate and simplify the development and deployment of AI applications.
Article:
Nvidia Launches DGX Cloud Lepton Platform for AI Computing
Nvidia has officially launched its DGX Cloud Lepton platform. Though still in early access, it has already garnered support from major companies including CoreWeave, Crusoe, and Firmus, which will make tens of thousands of GPUs available to customers. The move aims to give developers access to GPU compute capacity in specific regions, for both on-demand and long-term computing, catering to strategic and sovereign AI operational requirements.
The platform is expected to attract additional leading cloud service providers and GPU marketplaces, expanding its reach and capabilities. DGX Cloud Lepton builds on the Nvidia AI software stack, including NIM and NeMo microservices, Nvidia Blueprints, and Nvidia Cloud Functions. These tools are designed to accelerate and simplify the development and deployment of AI applications, making it easier for developers to leverage powerful AI capabilities.
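As a concrete illustration of the software stack mentioned above: NIM microservices expose an OpenAI-compatible HTTP API, so applications can target a self-hosted or cloud-hosted NIM endpoint with a standard chat-completions request. The sketch below builds such a request payload; the endpoint URL and model name are placeholders for illustration, not values from the article, and the actual HTTP send is left as a comment.

```python
import json

# Hypothetical values -- substitute your actual NIM endpoint and model.
NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"
MODEL = "meta/llama-3.1-8b-instruct"

def build_chat_request(prompt: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-compatible chat-completions payload for a NIM microservice."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Summarize what DGX Cloud Lepton provides.")
print(json.dumps(payload, indent=2))

# To call the service, POST `payload` as JSON to NIM_ENDPOINT with an
# Authorization header (e.g. a bearer token), for example via urllib.request
# or the `openai` client pointed at the NIM base URL.
```

Because the request shape matches the OpenAI API, existing client code can typically be repointed at a NIM deployment by changing only the base URL and model name.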
In conclusion, Nvidia’s DGX Cloud Lepton platform aims to reshape AI computing by giving developers straightforward access to GPU compute capacity alongside a suite of AI software tools. With the backing of major providers and a focus on simplifying AI application development, the platform is positioned to make a significant impact in the AI industry.