NVIDIA NIM Revolutionizes Model Deployment, Now Available to Transform World’s Millions of Developers Into Generative AI Developers

  • 150+ Partners Across Every Layer of AI Ecosystem Embedding NIM Inference Microservices to Speed Enterprise AI Application Deployments From Weeks to Minutes
  • NVIDIA Developer Program Members Gain Free Access to NIM for Research, Development and Testing

TAIPEI, Taiwan, June 02, 2024 (GLOBE NEWSWIRE) -- COMPUTEX -- NVIDIA today announced that the world’s 28 million developers can now download NVIDIA NIM™ — inference microservices that provide models as optimized containers — to deploy on clouds, data centers or workstations, giving them the ability to easily build generative AI applications for copilots, chatbots and more, in minutes rather than weeks.

These new generative AI applications are becoming increasingly complex and often utilize multiple models with different capabilities for generating text, images, video, speech and more. NVIDIA NIM dramatically increases developer productivity by providing a simple, standardized way to add generative AI to their applications.

NIM also enables enterprises to maximize their infrastructure investments. For example, running Meta Llama 3-8B in a NIM produces up to 3x more generative AI tokens on accelerated infrastructure than without NIM. This lets enterprises boost efficiency and use the same amount of compute infrastructure to generate more responses.

Nearly 200 technology partners — including Cadence, Cloudera, Cohesity, DataStax, NetApp, Scale AI and Synopsys — are integrating NIM into their platforms to speed generative AI deployments for domain-specific applications, such as copilots, code assistants and digital human avatars. Hugging Face is now offering NIM — starting with Meta Llama 3.

“Every enterprise is looking to add generative AI to its operations, but not every enterprise has a dedicated team of AI researchers,” said Jensen Huang, founder and CEO of NVIDIA. “Integrated into platforms everywhere, accessible to developers everywhere, running everywhere — NVIDIA NIM is helping the technology industry put generative AI in reach for every organization.”

Enterprises can deploy AI applications in production with NIM through the NVIDIA AI Enterprise software platform. Starting next month, members of the NVIDIA Developer Program can access NIM for free for research, development and testing on their preferred infrastructure.

40+ NIM Microservices Power Gen AI Models Across Modalities
NIM containers are pre-built to speed model deployment for GPU-accelerated inference and can include NVIDIA CUDA® software, NVIDIA Triton Inference Server™ and NVIDIA TensorRT™-LLM software.
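To illustrate how an application typically talks to a deployed NIM container, the sketch below sends a chat request to a locally running Llama 3 NIM over its OpenAI-compatible HTTP API. The localhost address, port 8000 and the model identifier are assumptions for illustration, not details taken from this announcement.

    # Minimal sketch: query a locally deployed NIM container (assumed setup).
    # Assumptions: the container exposes an OpenAI-compatible API on
    # localhost:8000 and serves a model registered as "meta/llama3-8b-instruct".
    import requests

    NIM_URL = "http://localhost:8000/v1/chat/completions"  # assumed local endpoint

    payload = {
        "model": "meta/llama3-8b-instruct",  # assumed model identifier
        "messages": [
            {"role": "user", "content": "Explain what an inference microservice is."}
        ],
        "max_tokens": 128,
    }

    # Post the request to the container and print the generated reply.
    response = requests.post(NIM_URL, json=payload, timeout=60)
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])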

Over 40 NVIDIA and community models are available to experience as NIM endpoints on ai.nvidia.com, including Databricks DBRX, Google’s open model Gemma, Meta Llama 3, Microsoft Phi-3, Mistral Large, Mixtral 8x22B and Snowflake Arctic.
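These hosted endpoints can be exercised with any OpenAI-compatible client before a model is deployed locally. The sketch below assumes the hosted API base URL, the model identifier and an NVIDIA_API_KEY environment variable; none of these specifics come from this announcement.

    # Minimal sketch: call a hosted NIM endpoint with an OpenAI-compatible client.
    # The base URL, model ID and credential variable below are assumptions.
    import os
    from openai import OpenAI

    client = OpenAI(
        base_url="https://integrate.api.nvidia.com/v1",  # assumed hosted API URL
        api_key=os.environ["NVIDIA_API_KEY"],            # assumed credential variable
    )

    completion = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # assumed model identifier
        messages=[{"role": "user", "content": "What kinds of models does NIM serve?"}],
        max_tokens=128,
    )
    print(completion.choices[0].message.content)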

Developers can now access NVIDIA NIM microservices for Meta Llama 3 models from the Hugging Face AI platform. This lets developers easily access and run the Llama 3 NIM in just a few clicks using Hugging Face Inference Endpoints, powered by NVIDIA GPUs on their preferred cloud.

Enterprises can use NIM to run applications for generating text, images, video and speech, as well as digital humans. With NVIDIA BioNeMo™ NIM microservices for digital biology, researchers can build novel protein structures to accelerate drug discovery.

Dozens of healthcare companies are deploying NIM to power generative AI inference across a range of applications, including surgical planning, digital assistants, drug discovery and clinical trial optimization.

With new NVIDIA ACE NIM microservices, developers can easily build and operate interactive, lifelike digital humans in applications for customer service, telehealth, education, gaming and entertainment.

Hundreds of AI Ecosystem Partners Embedding NIM
Platform providers including Canonical, Red Hat, Nutanix and VMware (acquired by Broadcom) are supporting NIM on open-source KServe or enterprise solutions. AI application companies Hippocratic AI, Glean, Kinetica and Redis are also deploying NIM to power generative AI inference.

Leading AI tools and MLOps partners — including Amazon SageMaker, Microsoft Azure AI, Dataiku, DataRobot, deepset, Domino Data Lab, LangChain, Llama Index, Replicate, Run.ai, Saturn Cloud, Securiti AI and Weights & Biases — have also embedded NIM into their platforms to enable developers to build and deploy domain-specific generative AI applications with optimized inference.

Global system integrators and service delivery partners Accenture, Deloitte, Infosys, Latentview, Quantiphi, SoftServe, TCS and Wipro have created NIM competencies to help the world’s enterprises quickly develop and deploy production AI strategies.

Enterprises can run NIM-enabled applications virtually anywhere, including on NVIDIA-Certified Systems™ from global infrastructure manufacturers Cisco, Dell Technologies, Hewlett Packard Enterprise, Lenovo and Supermicro, as well as server manufacturers ASRock Rack, ASUS, GIGABYTE, Ingrasys, Inventec, Pegatron, QCT, Wistron and Wiwynn. NIM microservices have also been integrated into Amazon Web Services, Google Cloud, Azure and Oracle Cloud Infrastructure.

Titans of Industry Amp Up Generative AI With NIM
Industry leaders Foxconn, Pegatron, Amdocs, Lowe’s, ServiceNow and Siemens are among the businesses using NIM for generative AI applications in manufacturing, healthcare, financial services, retail, customer service and more:

  • Foxconn — the world’s largest electronics manufacturer — is using NIM in the development of domain-specific LLMs embedded into a variety of internal systems and processes in its AI factories for smart manufacturing, smart cities and smart electric vehicles.
  • Pegatron — a Taiwanese electronics manufacturing company — is leveraging NIM for Project TaME, a Taiwan Mixtral of Experts model designed to advance the development of local LLMs for industries.
  • Amdocs — a leading global provider of software and services to communications and media companies — is using NIM to run a customer billing LLM that significantly lowers the cost of tokens, improves accuracy by up to 30% and reduces latency by 80%, driving near real-time responses.
  • Lowe’s — a FORTUNE® 50 home improvement company — is using generative AI for a variety of use cases. For example, the retailer is leveraging NVIDIA NIM inference microservices to elevate experiences for associates and customers.
  • ServiceNow — the AI platform for business transformation — announced earlier this year that it was one of the first platform providers to access NIM to enable fast, scalable and more cost-effective LLM development and deployment for its customers. NIM microservices are integrated within the Now AI multimodal model and are available to customers that have ServiceNow’s generative AI experience, Now Assist, installed.
  • Siemens — a global technology company focused on industry, infrastructure, transport and healthcare — is integrating its operational technology with NIM microservices for shop floor AI workloads. It is also building an on-premises version of its Industrial Copilot for Machine Operators using NIM.
