You may notice that I didn’t mention graphics: one of the cornerstones of most PC vs. console debates. It’s also a topic that has become increasingly popular as AMD and NVIDIA release their new GPUs, which some will certainly argue already make the next generation of gaming look slightly outdated. Maybe that’s true to a degree, but as time goes on, it’s becoming increasingly clear that the future of video game graphics is more often shaped by consoles than by PCs. Let me explain.

There’s no denying that AMD’s RX 6800 XT and NVIDIA’s RTX 3080 will offer increased performance over the GPUs featured in the PS5 and Xbox Series X. The PS5’s AMD GPU is roughly equivalent in power to an RTX 2080 Super, and the Xbox Series X’s AMD GPU is about as powerful as an RTX 2080 Ti. Given that those are custom GPUs, that’s a little like comparing apples to oranges (or, in our case, AMD cards to NVIDIA cards), but the point here is that the new AMD and NVIDIA cards will be about a generation ahead of what’s inside the Xbox Series X and PS5.

The biggest problem with that comparison, though, is that it looks at things from a purely technical point of view. It doesn’t account for variations in individual components, required applications, overclocking, or a hundred other things that separate a custom-built PC from a gaming console. Most importantly, it doesn’t account for the fact that when most studios develop a game, they typically don’t design it to push the limits of the most powerful hardware.

Just look at the biggest PC exclusives from the last several years. It’s a list that includes names like StarCraft II, World of Warcraft, Total War: Shogun 2, and DOTA 2. Those are great games, but they’re not exactly technically demanding titles that require a top-of-the-line PC to run properly. At a time when the lines that separate PC and console gaming are growing thinner, you don’t see a lot of games like Crysis that were specifically designed to test the capabilities of the PC platform.

In fact, some of the most technically demanding PC games in recent years (such as Red Dead Redemption 2 and Metro Exodus) are multi-platform titles with PC ports that were carefully designed to offer high-end graphics options the developers knew most gamers wouldn’t be able to run. Other popular benchmark games like Grand Theft Auto V typically require you to install several graphics mods before they can really be used to test the capabilities of a high-end gaming PC.

Games like those will certainly continue to exist into the next generation, but the majority of titles that launch simultaneously for PC and consoles will need to account for the limits of consoles more than the capabilities of top-of-the-line gaming PCs. We’re already seeing this happen with Cyberpunk 2077, which developer CD Projekt Red recently delayed again partly due to the extra time it takes to optimize the game for the PS4 and Xbox One while also accounting for the increased power of the PS5 and Xbox Series X.

In that sense, the PC will always be the place where the future of gaming is tested, perfected, and, at times, utilized in a way that leaves console players dreaming of what they’re missing out on. As we saw in the previous generation with the rise of SSDs and other technology, it will also always be the place people go when they want the smoothest, most fully optimized gameplay possible.
Yet, at a time when few triple-A studios are developing demanding PC exclusives with minimum requirements that closely echo the power of a next-gen console, it’s more important than ever to remember what you’re really buying into with a high-end GPU: the joy of running modern games as smoothly as possible, and the pleasure of helping test the future of gaming before that future becomes the new standard.