The major computing cycles: From IBM to NVIDIA, CUDA and the generative AI age

Chris Zeoli
In this exploration of computing's evolutionary timeline, you'll learn about the key milestones that have shaped the AI revolution.

At the GPU Technology Conference, NVIDIA CEO Jensen Huang delivered a riveting keynote in which he described the major computing cycles of the past, present and future. It's well worth a listen!

Computing has undergone remarkable transformations over the decades, with significant milestones that have shaped the tech landscape. This evolutionary timeline, presented in Huang's keynote, tracks a fascinating journey from the era of IBM's System/360 in 1964 to the sophisticated AI models of 2022, such as OpenAI's ChatGPT.

We discuss each era below.

Content source: Jensen Huang’s keynote at NVIDIA GTC, March 2024

Mainframe to personal computing: The seeds of change

The IBM System/360 represented the first major cycle of computing. This was an era where computers filled entire rooms and only a privileged few could access them. This centralization of computing power was the norm until the late 20th century.  

By 1995, with the introduction of Windows 95 running on Intel's Pentium processors, computing became personal. PCs democratized access to computing and empowered a generation of creators, developers and businesses.

The acceleration era: GPUs and parallel processing

The rise of graphics processing units (GPUs) marked a pivotal shift in computing capabilities. NVIDIA's CUDA, introduced in 2006, was a groundbreaking innovation that allowed for parallel processing, which made it possible to handle complex calculations more efficiently.  
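
To make the idea of parallel processing concrete, here's a minimal sketch in Python using the Numba library's CUDA support, which compiles a Python function into a GPU kernel. Each GPU thread adds one pair of array elements, so the serial loop disappears into thousands of simultaneous threads. (This is an illustrative sketch, not from the keynote; it assumes an NVIDIA GPU with CUDA drivers and the numba package installed.)

```python
# A minimal sketch of CUDA-style parallelism via Numba's CUDA support.
# Assumes an NVIDIA GPU with CUDA drivers and the numba package installed.
import numpy as np
from numba import cuda

@cuda.jit
def add_kernel(x, y, out):
    # Each GPU thread computes exactly one element of the output.
    i = cuda.grid(1)
    if i < out.shape[0]:
        out[i] = x[i] + y[i]

n = 1_000_000
x = np.ones(n, dtype=np.float32)
y = np.full(n, 2.0, dtype=np.float32)
out = np.zeros(n, dtype=np.float32)

# Launch enough thread blocks to cover all n elements.
threads_per_block = 256
blocks = (n + threads_per_block - 1) // threads_per_block
add_kernel[blocks, threads_per_block](x, y, out)

print(out[:5])  # [3. 3. 3. 3. 3.]
```

The same pattern, with one lightweight thread per data element, scales up to the matrix multiplications at the heart of neural network training.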

CUDA became a landmark platform for AI developers. This technology paved the way for accelerated computing, which bolstered advancements in 3D graphics, scientific computation and, eventually, machine learning. These developments led to the current era of AI infrastructure, with powerful computing resources housed in data centers to support the growing demands of AI applications.

AI and deep learning: A new frontier

In 2012, AlexNet's victory in the ImageNet competition, dubbed "First Contact," heralded the deep learning revolution. GPUs became the engine of AI, which drove rapid progress in neural network performance. This led to the concept of AI doubling in capability every six months — a stark contrast to the more gradual pace of earlier computing advancements.

Generative AI: The birth of creative machines

Fast-forward to 2017. The introduction of transformer models like GPT and BERT ushered in a new phase in generative AI. These models showcased an AI's ability to learn from vast amounts of data and generate human-like text, opening up possibilities for applications in language translation, content creation and even code generation. NVIDIA highlights OpenAI’s ChatGPT, the first AI application to reach 100 million users.
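
As a small, hands-on illustration of transformer-based text generation, the sketch below uses Hugging Face's transformers library with GPT-2, an early open model in the GPT family. (This example is mine, not from the keynote; it assumes the transformers and torch packages are installed.)

```python
# A small sketch of transformer-based text generation.
# Assumes the transformers and torch packages are installed.
from transformers import pipeline

# GPT-2 is an early, openly available model in the GPT family.
generator = pipeline("text-generation", model="gpt2")

result = generator(
    "The major computing cycles, from mainframes to GPUs,",
    max_new_tokens=40,        # cap the length of the continuation
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```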

RAG, or retrieval-augmented generation, has been a significant development in generative AI. This hybrid approach combines the best of generation and retrieval systems, leveraging the vast knowledge embedded in large databases while utilizing the creative prowess of generative models. RAG systems first retrieve information relevant to a query from a dataset and then use this context to generate a coherent and informed response.

This technique grounds the generative model's outputs in precise, factual information, overcoming a traditional limitation of pure generative systems, which can fabricate details or miss nuances contained in actual data.
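
Here's a minimal sketch of that retrieve-then-generate loop. The tiny corpus, the keyword-overlap retrieval and the llm_generate stub are all illustrative stand-ins; a production RAG system would use embeddings, a vector database and a real language model.

```python
# A toy retrieval-augmented generation (RAG) loop.
# Retrieval here is naive keyword overlap; real systems use embeddings
# and a vector database, and llm_generate would call an actual LLM.

CORPUS = [
    "CUDA was introduced by NVIDIA in 2006 to enable parallel processing on GPUs.",
    "AlexNet won the ImageNet competition in 2012 using GPU training.",
    "ChatGPT was the first AI application to reach 100 million users.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Score each document by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(CORPUS, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:k]

def llm_generate(prompt: str) -> str:
    """Stub standing in for a call to a generative model."""
    return f"[model answer grounded in: {prompt[:60]}...]"

def rag_answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    # The retrieved passages constrain the generation, grounding the answer.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return llm_generate(prompt)

print(rag_answer("When was CUDA introduced?"))
```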

The relevance of RAG systems to the broader concept of generative AI cannot be overstated. In an era where the distinction between real and AI-generated content is increasingly blurred, the capacity for AI to draw from real-world information and generate verifiable outputs is invaluable. For instance, in industries such as journalism, research and legal analysis, where factual accuracy is paramount, RAG could represent a leap forward.

Vector databases, with Pinecone leading the charge, have emerged as a crucial component of AI infrastructure. These databases are designed to efficiently store and manage high-dimensional vector data, also known as embeddings. Vector databases are optimized for performing similarity search, which enables rapid retrieval of data points based on their spatial proximity to each other in the high-dimensional space.

This is crucial when dealing with the high volume and dimensionality of data produced by generative AI. Faster, more precise retrieval can significantly enhance the generation process. The growing adoption of vector databases signifies the evolving needs of AI systems. This trend points to a future where the infrastructure supporting AI must be as dynamic and advanced as the algorithms themselves in order to keep pace with the rapid developments in the field.
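
To show the core operation a vector database performs, here's a brute-force sketch in NumPy: store normalized embeddings, then return a query's nearest neighbors by cosine similarity. (The random vectors stand in for real embeddings; systems like Pinecone replace this full scan with approximate nearest-neighbor indexes to stay fast at scale.)

```python
# Brute-force similarity search over embeddings with NumPy.
# Random vectors stand in for real embeddings; production vector databases
# use approximate nearest-neighbor indexes instead of a full scan.
import numpy as np

rng = np.random.default_rng(0)
dim, n_items = 384, 10_000   # 384 dims is typical of small embedding models

# Normalize rows so that a dot product equals cosine similarity.
items = rng.normal(size=(n_items, dim)).astype(np.float32)
items /= np.linalg.norm(items, axis=1, keepdims=True)

query = rng.normal(size=dim).astype(np.float32)
query /= np.linalg.norm(query)

scores = items @ query                 # cosine similarity to every item
k = 5
top_k = np.argsort(-scores)[:k]        # indices of the k nearest items

print(top_k, scores[top_k])
```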

2022 and beyond: The era of large language models  

In 2022, OpenAI's ChatGPT took the world by storm and signified a maturation in AI capabilities. The sophistication of fine-tuning, alignment and prompt engineering allowed these models not just to understand and generate text but to do so in a way that could be directed and controlled with nuanced instructions. Multi-modal capabilities, where AI can interpret and generate sound, images and language, are now becoming the norm.
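
As a small illustration of directing a model with nuanced instructions, the sketch below pairs a system message with a user request via OpenAI's Python SDK. (Assumptions: the openai package, v1 or later, an OPENAI_API_KEY environment variable, and the gpt-4o-mini model name, which is illustrative; any chat model would do.)

```python
# Steering a model with a system message (prompt engineering in miniature).
# Assumes the openai package (v1+) and an OPENAI_API_KEY environment
# variable; the model name is illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        # The system message constrains tone, format and scope.
        {"role": "system",
         "content": "You are a concise technical editor. Answer in two sentences."},
        {"role": "user",
         "content": "Summarize why GPUs accelerated the deep learning revolution."},
    ],
)
print(response.choices[0].message.content)
```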

The industrialization of AI

As we look at the broader implications, it's clear that we're entering what can be described as a new industrial revolution powered by AI.  

Huang's keynote suggests a staggering potential market size of $100 trillion, with AI at the helm of future enterprises: acting as a co-pilot in businesses, fueling an AI factory model for software development and revolutionizing enterprise IT and data centers.

The road forward into new AI tech advancements

The evolution of computing, from the era of mainframes to the age of generative AI, has been a remarkable journey. Each cycle has built upon the innovations of the previous one, leading us to a present where AI is poised to transform industries on an unprecedented scale.  

As we stand at the precipice of this new industrial revolution, it's clear that the infrastructure supporting AI will be critical. To stay informed about the latest developments in data and infrastructure software, be sure to sign up for my Substack newsletter.
