
GeForce RTX With TensorRT-LLM brings generative AI to your PC

The landscape of generative AI has seen significant advances, with NVIDIA playing a pivotal role in driving that innovation. Generative AI is now coming to the more than 100 million Windows PCs and workstations powered by GeForce RTX and NVIDIA RTX GPUs, marking a significant shift in personal computing. These GPUs have long been instrumental in accelerating AI, and the release of TensorRT-LLM for Windows makes generative AI on PC up to 4x faster.

GeForce RTX With TensorRT-LLM

Generative AI sits at the heart of a new wave of AI pipelines and software, automatically analyzing data and generating a vast array of content. Large language models (LLMs) such as Llama 2 and Code Llama are central to these developments. TensorRT-LLM accelerates LLM inference, allowing LLMs to run up to 4x faster on RTX-powered Windows PCs. The acceleration is particularly beneficial when LLM capabilities are combined with other technologies, such as in retrieval-augmented generation (RAG), where an LLM is paired with a vector library or vector database.
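
To make the RAG idea above concrete, here is a minimal Python sketch of the pattern: documents are embedded, stored in a FAISS vector index, and the closest matches to a query are pulled into the prompt sent to the LLM. The embed() function is a toy placeholder rather than a real embedding model, and the final step simply prints the augmented prompt instead of calling a TensorRT-LLM-accelerated model; treat it as an illustration of the workflow, not NVIDIA's reference implementation.

```python
# Minimal retrieval-augmented generation (RAG) sketch: embed documents,
# store them in a FAISS vector index, retrieve the closest matches for a
# query, and build an augmented prompt for an LLM.
# embed() is a toy, deterministic placeholder for a real embedding model,
# and the prompt is printed rather than sent to an accelerated LLM.
import hashlib

import faiss
import numpy as np

DIM = 64

def embed(text: str) -> np.ndarray:
    """Toy deterministic embedding (placeholder for a real embedding model)."""
    digest = hashlib.sha256(text.encode()).digest()
    rng = np.random.default_rng(int.from_bytes(digest[:8], "little"))
    return rng.standard_normal(DIM).astype("float32")

documents = [
    "TensorRT-LLM accelerates LLM inference on RTX GPUs.",
    "RTX Video Super Resolution improves streamed video quality.",
    "Llama 2 and Code Llama are large language models.",
]

# Build the vector index and add one embedding per document.
index = faiss.IndexFlatL2(DIM)
index.add(np.stack([embed(d) for d in documents]))

# Retrieve the two nearest documents for the query and assemble the prompt.
query = "How can I speed up large language models on my PC?"
_, ids = index.search(embed(query)[None, :], k=2)
context = "\n".join(documents[i] for i in ids[0])

prompt = f"Answer using the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
print(prompt)  # in practice, this prompt would be sent to the LLM
```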

NVIDIA has released tools to help developers accelerate their LLMs, including scripts that optimize custom models with TensorRT-LLM, TensorRT-optimized open-source models, and a developer reference project. These resources make it easier for developers to implement and optimize such models, contributing to the growth of the field.
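
As a rough illustration of what running an optimized model can look like, the sketch below assumes the high-level Python LLM API that recent tensorrt_llm releases expose; the class names, parameters, and output fields follow that API and may differ between versions, and the Llama 2 model identifier is only an example.

```python
# Sketch of LLM inference with TensorRT-LLM's high-level Python API.
# Assumes a recent tensorrt_llm release that exposes LLM and SamplingParams;
# exact names and availability may vary by version and platform.
from tensorrt_llm import LLM, SamplingParams

# Example model identifier (a local checkpoint path also works).
llm = LLM(model="meta-llama/Llama-2-7b-chat-hf")

sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=128)

prompts = ["Explain in one sentence how TensorRT-LLM speeds up inference."]
for output in llm.generate(prompts, sampling):
    # Each result carries its generated completions; print the first one.
    print(output.outputs[0].text)
```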

Generative AI performance enhancements

One of the most visible applications of TensorRT acceleration is Stable Diffusion in the popular Automatic1111 Web UI distribution, where it speeds up the generative diffusion model by up to 2x over the previous fastest implementation. Stable Diffusion is a diffusion model used for image generation, and TensorRT acceleration makes it noticeably faster and more efficient.
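
The TensorRT speed-up ships as an extension inside the Automatic1111 Web UI itself. As a plain illustration of the underlying text-to-image workflow being accelerated (not the TensorRT-optimized path), here is a short sketch using the Hugging Face diffusers library with a public Stable Diffusion 1.5 checkpoint.

```python
# Illustration of Stable Diffusion image generation with the diffusers
# library; this is the standard PyTorch path, not the TensorRT-accelerated
# Automatic1111 extension described in the article.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # public SD 1.5 checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # run on an RTX GPU

# Generate a single image from a text prompt and save it.
image = pipe("a photo of a city skyline at sunset",
             num_inference_steps=30).images[0]
image.save("skyline.png")
```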

In addition to accelerating LLMs, NVIDIA has introduced RTX Video Super Resolution (VSR) version 1.5 to improve video quality. RTX VSR uses AI pixel processing to improve the quality of streamed video content by reducing or eliminating artifacts caused by video compression. The updated version further improves visual quality, de-artifacts content played at its native resolution, and adds support for RTX GPUs based on the NVIDIA Turing architecture.

NVIDIA’s software, tools, libraries, and SDKs have helped bring over 400 AI-enabled apps and games to consumers. The company has made TensorRT-optimized open-source models and the RAG demo available on platforms such as ngc.nvidia.com and GitHub. These resources are aimed at helping developers further explore and utilize the capabilities of generative AI and AI acceleration tools.

Looking ahead, NVIDIA plans to make TensorRT-LLM available for download from the NVIDIA Developer website. This move is expected to further democratize access to these advanced tools and promote the development and application of generative AI.

NVIDIA’s work on generative AI and AI acceleration, particularly through GeForce RTX and NVIDIA RTX GPUs and TensorRT-LLM, has had a significant impact on the field, accelerating AI workloads while also improving everyday PC experiences. As NVIDIA continues to innovate and release new tools, the potential applications and benefits of generative AI and AI acceleration are set to expand even further.

Source & Image: NVIDIA

Filed Under: Technology News, Top News




