

Many researchers have co-opted powerful graphics processing units, or GPUs, to run climate models and other scientific programs, while tech and financial giants use large banks of these processors to train machine-learning algorithms. They all have video-game players to thank for the emergence of these workhorse processors: It was gamers who stoked the original demand for chips that could do the massive amounts of parallel number crunching required to produce rich graphics quickly enough to keep up with fast-paced action.

This demand for faster and better graphics had existed since the advent of mainstream video games in the 1970s. Of course, gamers weren’t the only people pushing graphics technology forward. By 1995, movies like Pixar’s Toy Story, the first full-length digitally animated movie, had demonstrated the potential of high-quality computer animation. But gamers drove the technology in a very specific direction: Pixar had created Toy Story’s graphics by slowly rendering each frame individually and then stitching it all together. Gamers, on the other hand, were interested only in technologies that could generate graphics in real time.

By the late 1990s, graphics chips for the games market had considerably improved from their humble beginnings, when displaying, say, a handful of colors at a resolution of 300 by 200 pixels was considered impressive. But Nvidia’s NV20, released in 2001 under the GeForce 3 brand, marked a key turning point, and one that serendipitously opened up new vistas in scientific computing and artificial intelligence.
