A look back at the pre-GPU era

For most people, the history of graphics cards begins with Nvidia and AMD. The reality is very different.
Long before either of these two companies began to dominate the market, and long before GPUs powered modern games, we witnessed the beginnings of computer graphics. To help understand how we got to modern GPUs, we’ll have to go back to a time before the term “graphics card” existed.
What happened before graphics cards existed?
For most people, computer graphics didn’t exist yet.
For us to be able to discuss all the new GPUs coming out in 2026, someone had to invent computer graphics in the first place, and that’s not the same thing as a GPU, not even close.
Early computers didn’t display pixel-addressable images, windows, or graphics of any kind; most output was text. Electromechanical teleprinters like the 1963 Model 33 Teletype, which adopted the then-new ASCII standard, were essentially glorified typewriters, spitting out results onto paper line by line.
They were painfully slow, loud and very literal. There were no visuals to speak of other than the text you ordered the computer to print.
Next came video terminals, often called “dumb terminals”. They were essentially keyboards attached to screens, not computers in their own right: each terminal connected to a host computer over a network and displayed whatever the host sent back. The screen was divided into a fixed grid (usually 80 columns wide), and each cell could display one character from a predefined set. This let people get creative with very basic ASCII artwork, but everything on screen was built from that fixed character set.
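To make the character-grid idea concrete, here is a minimal sketch in Python. It is purely illustrative (real terminals implemented the grid in hardware, and put_text is a hypothetical helper), but it shows the core constraint: the “screen” is a fixed grid of character codes, not pixels.

```python
# Illustrative character-cell display: the "screen" is a fixed grid of
# character codes, not pixels. Real terminals did this in hardware; the
# dimensions and helper below are hypothetical.
COLS, ROWS = 80, 24

# One cell per grid position, initialized to blanks.
screen = [[" " for _ in range(COLS)] for _ in range(ROWS)]

def put_text(row: int, col: int, text: str) -> None:
    """Write characters into grid cells; anything past column 80 is lost."""
    for i, ch in enumerate(text):
        if col + i < COLS:
            screen[row][col + i] = ch

put_text(0, 0, "LOGIN:")  # the host decides what the terminal shows
put_text(2, 0, r" /\_/\  <- ASCII art fits the grid too")
print("\n".join("".join(row) for row in screen[:4]))
```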
The 1960s saw the advent of computer graphics in one form or another.
Although most people were still stuck with text, some researchers were already experimenting with interactive graphics. In 1963, Ivan Sutherland created Sketchpad, a system that allowed users to draw and manipulate line art directly on a screen using a light pen, an interaction not unlike today’s touchscreens.
Interactive computer graphics were also coming to life inside large companies. IBM began marketing graphics terminals like the 2250 in 1965.
The introduction of the PC accelerated the evolution of graphics
But that still had little to do with graphics cards.
The late 1970s saw the emergence of personal computers. Not quite as we know them today, but the lineage was clear. Before that, computers were massive machines that took up entire rooms and were used by businesses; the advent of the PC put them within everyone’s reach. And that brings us into the realm of computer graphics proper.
The beginnings of computer graphics gave us low-resolution, often monochrome screens. PCs had limited memory, which meant that programmers had to be clever to achieve anything remotely engaging in terms of graphical output.
Machines like the Radio Shack TRS-80 introduced bitmap-style graphics, but at extremely low resolutions (128 x 48).
Before graphics cards and accelerators, the processor and memory both played important roles in display outputs. Since these early PCs had kilobytes, not megabytes, of RAM, storing full-screen images was expensive and impractical. Graphics were minimal and aggressively reused as needed.
It was a time before any standard image formats, before JPGs, BMPs and PNGs. Software had to store images as raw bitmaps or custom data structures, and compress them like there was no tomorrow.
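To get a feel for why compression was non-negotiable, here is a rough sketch in Python of run-length encoding applied to a 1-bit bitmap. This is a generic illustration, not any specific period format, and rle_encode is a hypothetical name:

```python
# Hypothetical sketch: a 1-bit 640x200 bitmap takes 640 * 200 / 8 = 16,000
# bytes, a huge slice of a 64 KB machine. Run-length encoding stores each
# run of identical bits as (value, count), which shrinks mostly-blank
# images dramatically.
def rle_encode(bits: list) -> list:
    runs = []  # list of (bit_value, run_length) pairs
    for bit in bits:
        if runs and runs[-1][0] == bit:
            runs[-1] = (bit, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((bit, 1))              # start a new run
    return runs

scanlines = [0] * 1000 + [1] * 24 + [0] * 1000  # mostly-empty image data
print(len(scanlines), "bits ->", len(rle_encode(scanlines)), "runs")  # 2024 bits -> 3 runs
```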
IBM’s early display standards shaped PC graphics
Apple was also in the mix with the Macintosh.
IBM was a major player in the early days of personal computing. In 1981, IBM introduced the “PC”, followed by the PC XT in 1983. The original IBM PC, however, wasn’t built with graphics in mind. Most workloads were still text-based, so text clarity and software reliability were the two main concerns.
The IBM PC could be configured with the Monochrome Display Adapter (MDA), which was sharp for its time, and produced high-contrast text without any bitmap graphics. It was used for word processing, databases and spreadsheets.
The Color Graphics Adapter (CGA), also introduced with the IBM PC, added basic color graphics, though the color palette was tiny and everything was very low resolution. But hey, at least we had some graphics.
Apple took a different approach with the first Macintoshes, or Macs, as we know them today. Macs treated the screen as a bitmap instead of a pre-programmed character grid, giving the user far more freedom. Those choices helped Apple reshape industries like graphic design and publishing, and the Mac remains a benchmark for those workloads today.
Although IBM wasn’t very generous with its GUIs in those early days, it did something much more important: it helped standardize PC display modes across the IBM-compatible ecosystem. As developers and manufacturers built their products around IBM compatibility as the industry standard, the door was wide open for computer graphics to finally flourish.
The rise of 2D graphics was a pivotal moment in computing
However, we are still far from today’s GPUs.
In the late 1980s, bitmap graphics modes became common on IBM-compatible PCs, and the advent of Windows led to more software being written for pixel-addressable displays. Text modes didn’t disappear overnight, but basic graphics became the norm. And in 1987, IBM’s PS/2 introduced VGA, which became the standard for PCs; although VGA ports are long obsolete today, at the time they were revolutionary.
VGA eventually allowed personal computers to display semi-realistic images, games, and movies. It also greatly expanded the practical resolution and color options of PCs, introducing the popular 256-color mode.
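The arithmetic behind that 256-color mode (VGA’s famous 320×200 “Mode 13h”) shows why it hit a sweet spot. The Python sketch below models indexed color in the abstract; the grayscale palette and helper name are hypothetical:

```python
# VGA's 320x200 256-color mode stores one byte per pixel, so a full frame
# is 320 * 200 = 64,000 bytes -- it fits in a single 64 KB memory segment.
# Each byte is an index into a 256-entry palette of RGB colors.
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT)    # one palette index per pixel
palette = [(i, i, i) for i in range(256)]  # hypothetical grayscale palette

def pixel_color(x: int, y: int):
    """Resolve a pixel to RGB through its palette index, as the hardware did."""
    return palette[framebuffer[y * WIDTH + x]]

framebuffer[100 * WIDTH + 160] = 255  # point one pixel at palette entry 255
print(len(framebuffer), "bytes per frame;", pixel_color(160, 100))
```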
Although most video was still tiny, heavily compressed, and overall unimpressive by today’s standards, it was revolutionary for its time. Just as importantly, VGA became a baseline that developers could target, which accelerated the graphics revolution. SVGA arrived in the early 1990s, and PCs could finally run higher resolutions, up to 1024×768.
However, graphics were still heavily CPU-driven, and dedicated “graphics cards” as we call them today were not yet common, although display adapters like CGA did exist. That began to change in the late 1980s and early 1990s, as 2D acceleration became more widespread.
Early 2D acceleration was implemented as dedicated hardware on add-on boards (again, nobody called them graphics cards at the time). Through the 1990s, it increasingly took the form of 2D engines integrated into consumer VGA/SVGA cards.
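A staple operation these 2D engines offloaded was the “blit” (block image transfer): copying or filling rectangles of pixels so the CPU didn’t have to push every byte itself. Here is a minimal software version in Python, purely as an illustration, with hypothetical names:

```python
# Illustrative software "blit": copy a rectangle of pixels from one buffer
# to another. 2D accelerators performed this kind of rectangle copy/fill
# in hardware, freeing the CPU for other work.
def blit(src, src_w, dst, dst_w, sx, sy, dx, dy, w, h):
    """Copy a w-by-h pixel rectangle from (sx, sy) in src to (dx, dy) in dst."""
    for row in range(h):
        s = (sy + row) * src_w + sx
        d = (dy + row) * dst_w + dx
        dst[d:d + w] = src[s:s + w]

sprite = bytearray([7] * (16 * 16))  # a 16x16 sprite filled with color 7
screen = bytearray(320 * 200)        # a Mode-13h-sized framebuffer
blit(sprite, 16, screen, 320, 0, 0, 100, 50, 16, 16)
print(screen[50 * 320 + 100])        # -> 7: the sprite landed on screen
```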
These 2D accelerators did a lot to make PCs feel more responsive. More importantly, they made it clear that computer graphics was a workload that deserved its own dedicated hardware, which ultimately led to the GPUs we know today.