Nvidia’s most affordable next-gen AI supercomputer, the Jetson Orin Nano, is here: What it is and how it works

Nvidia has launched its most affordable generative AI supercomputer yet, pairing a 6-core Arm CPU with an Ampere-architecture GPU that delivers up to 67 INT8 TOPS.


Nvidia’s generative AI supercomputer

Nvidia, a global leader in computing and artificial intelligence (AI), has unveiled its most affordable generative AI supercomputer, the Jetson Orin Nano. The compact yet powerful device aims to democratize AI development by offering cutting-edge capabilities at an entry-level price. The Nvidia Jetson Orin Nano Super Developer Kit fits in the palm of your hand and delivers advanced generative AI performance for a wide range of users, from commercial AI developers to hobbyists and students. Priced at $249 (approximately Rs 21,000), it is a step towards putting generative AI in everyone’s hands. Here, we’ll go into detail about the Jetson Orin Nano, how it works, and its potential impact.


What is the Jetson Orin Nano?

The Jetson Orin Nano is part of the Jetson family of AI computing platforms from Nvidia. Known for their compact form factor and strong computational capabilities, Jetson devices are widely used in robotics, edge AI, autonomous systems, and embedded applications. The Orin Nano, however, targets users who want high-performance AI capabilities at an entry-level price.

The Jetson Orin Nano pairs a 6-core Arm CPU with a GPU based on Nvidia’s Ampere architecture, delivering up to 67 trillion operations per second (TOPS) at INT8 precision. This level of performance allows the device to handle complex AI workloads, including generative AI models, computer vision, natural language processing (NLP) and more.
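For developers who want a quick sanity check that the kit’s GPU is exposed to an AI framework, a minimal sketch is shown below. It assumes a CUDA-enabled PyTorch build from Nvidia’s JetPack packages; the exact package names and versions are not part of this announcement.

```python
# Quick sanity check that the Orin Nano's Ampere GPU is visible to an AI
# framework. Assumes a CUDA-enabled PyTorch build from Nvidia's JetPack
# packages; this is an illustrative sketch, not an official setup script.
import torch

if torch.cuda.is_available():
    idx = torch.cuda.current_device()
    print("GPU:", torch.cuda.get_device_name(idx))
    print("Compute capability:", torch.cuda.get_device_capability(idx))
    print("CUDA runtime:", torch.version.cuda)
else:
    print("CUDA not available - check the JetPack / PyTorch installation.")
```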

How does it work?

The Jetson Orin Nano is an AI computer that combines raw computational power with Nvidia’s software tools and pre-trained models. Software updates that boost generative AI performance by up to 1.7x will also roll out to existing Jetson Orin NX and Orin Nano series modules. The company claims the new kit offers a 50 percent increase in memory bandwidth over its predecessor, up to 102 GB/s.
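The bandwidth claim is easy to sanity-check against the roughly 68 GB/s figure published for the original Orin Nano developer kit; that baseline is an assumption drawn from the earlier spec sheet, not from this announcement.

```python
# Back-of-the-envelope check of the claimed 50 percent memory bandwidth
# increase. The 68 GB/s baseline is the originally published Orin Nano
# figure and is an assumption here, not part of Nvidia's announcement.
baseline_gbps = 68.0    # original Jetson Orin Nano Developer Kit
upgraded_gbps = 102.0   # figure quoted for the Super developer kit

increase = (upgraded_gbps - baseline_gbps) / baseline_gbps
print(f"Bandwidth increase: {increase:.0%}")  # prints "Bandwidth increase: 50%"
```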

At its core, the Orin Nano takes advantage of Nvidia’s Ampere architecture, which is renowned for its ability to accelerate AI and deep learning tasks. The integrated GPU excels at parallel processing, enabling it to train and deploy AI models efficiently. The 6-core Arm CPU handles general-purpose computational tasks, while the GPU provides the horsepower for demanding AI workloads.
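As a rough illustration of that split, the sketch below runs the same matrix multiplication on the CPU and, if available, on the GPU. It again assumes a CUDA-enabled PyTorch install on JetPack; the matrix size and any resulting timings are purely illustrative.

```python
# Rough illustration of the CPU/GPU split: the same matrix multiplication
# run on the Arm CPU cores and on the Ampere GPU. Assumes a CUDA-enabled
# PyTorch install on JetPack; matrix size and timings are illustrative only.
import time
import torch

N = 2048
a = torch.randn(N, N)
b = torch.randn(N, N)

start = time.perf_counter()
a @ b                                   # runs on the 6-core Arm CPU
cpu_time = time.perf_counter() - start

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()            # ensure transfers are finished
    start = time.perf_counter()
    a_gpu @ b_gpu                       # runs on the Ampere GPU
    torch.cuda.synchronize()            # wait for the kernel to complete
    gpu_time = time.perf_counter() - start
    print(f"CPU: {cpu_time:.3f}s  GPU: {gpu_time:.3f}s")
else:
    print(f"CPU: {cpu_time:.3f}s  (no GPU available)")
```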

The latest software update also enhances generative AI performance for existing Jetson Orin Nano Developer Kit users. The company pitches the Jetson Orin Nano Super as an ideal platform for anyone wishing to explore generative AI, robotics, or computer vision. As AI evolves from task-specific models to foundation models, it offers an accessible gateway for turning innovative ideas into reality.
