AMD vs. NVIDIA. The eternal hot topic giving keyboard warriors an excuse to spend long nights and a significant part of their lives fighting about the technical minutiae of both manufacturers' product output, leaving many hurt feelings and sometimes even dead bodies in its wake. In today's article, we will look at the whole issue from a broader and significantly less emotionally charged perspective, giving you a chance to form a better picture and then devote your precious time to something more meaningful, hopefully without any casualties. Strap in and join us as we compare AMD and NVIDIA graphics cards in terms of technology and performance.
AMD or NVIDIA?
It all starts and ends with how and what you use your gaming PC for. The resolution, power draw, detail requirements, genres of games played and even the specific games. Do you want orgasmic visuals or raw power? Do you want to enjoy rich in-game worlds or rank high and win gaming tournaments? And do you plan to stream?
Let's begin by explaining the morbid context of our first paragraph. Once upon a time, two friends somewhere in the faraway East were arguing over which company was better, AMD or NVIDIA? Alexander probably couldn't beat Evgeny's killer arguments, so he decided to end the debate once and for all. This is a very hot and sensitive topic, though we are not quite sure why. And if you are wondering what the point of this article is if you don't feel all that strongly about it all, read on.
For those of you who just want to learn the basics to buy the right graphics card, we shall explain what matters and what factors you should pay attention to when making your choice. And for those of you who live for the graphics card wars, we shall supply you with some good ammunition so you can argue verbally to your heart's content.
In today's article, we'll take a look at the current differences between the two manufacturers and the key factors going forward. Although their approaches to image processing seem the same at first glance, in recent years, and especially with the latest generation of graphics cards, the ways the two manufacturers render our games have started to diverge significantly.
While AMD is sticking with rasterization (traditional image processing), NVIDIA is increasingly starting to use ray tracing in combination with upscaling, using its own AI (artificial intelligence) to do this. And it's the question of ray tracing that has been adding particularly flammable fuel to the fire lately.
Graphics cards of course can do a lot more than just process pretty game pictures. For those of you who stream, the quality of the encoded output and other streaming features like green screen and chroma keying are crucial. Last but not least, the vast majority of creatives also use their graphics card when exporting the final edited video in software such as Adobe Premiere, DaVinci Resolve or Sony Vegas.
The point of this chapter is not to compile an exhaustive list of what a graphics card can do, but to emphasise the fact that NVIDIA and AMD do all of this differently. They have different efficiencies, different performance characteristics, and all for a different purchase price. And in the end, what one can do, the other may not be able to do at all. But more on that in the next chapter.
The answer to the question of how an image is actually created in a graphics card is quite crucial for today's article. In this chapter, we'll talk about rasterization, ray tracing, and upscaling, thanks to which we can already experience spectacularly realistic in-game lighting.
But we barely have the power for ray tracing, you might argue. Doesn't matter! We can already look forward to path tracing and NVIDIA PTX graphics cards, not to mention the fountain of internet memes with PTX ON and PTX OFF. Interesting times ahead.
As we've already mentioned, the most well-known and still the most used method of image processing is rasterization. Rasterization works on the principle of triangulation. To give you a more concrete example, think of Lara Croft from the Tomb Raider game series. If you've been gaming for a while, you may remember when Lara was still just a bunch of blocks vaguely resembling a female shape. In contrast, the last few installments— Shadow of the Tomb Raider in particular—have made her look like an actual person.
Unlike in the first game, Lara now resembles a real woman. The trick is the aforementioned triangulation. Triangulation works on the principle of composing an object out of many triangles, or polygons. Each of these triangles, in very simplified terms, is then broken down into pixels that can each change colour. The resulting image, in this case Lara, is composed of these pixels. So if you were to look at an early stage of the image, well before you see the final result, you would see clusters of triangles, called polygon meshes, across the whole screen. This mesh and its parts are then overlaid with further layers of the final image, which we perceive without a second thought as reflections, flares, shadows, ambient shading and other lighting work as we play. All this enormously demanding work serves to render the game scene as faithfully and as close to reality as possible.
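To make the triangles-into-pixels idea concrete, here is a toy sketch of what a GPU does for every pixel of every triangle: it tests whether the pixel centre lies inside the triangle using so-called edge functions. All coordinates and the grid size below are invented for illustration; a real GPU does this in massively parallel hardware, not a Python loop.

```python
def edge(ax, ay, bx, by, px, py):
    """Signed area test: >= 0 when point (px, py) lies to the left
    of the edge running from (ax, ay) to (bx, by)."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the set of pixel coordinates covered by one 2D triangle
    (vertices given in counter-clockwise order)."""
    covered = set()
    for y in range(height):
        for x in range(width):
            # Sample at the pixel centre, as real rasterizers do
            px, py = x + 0.5, y + 0.5
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Inside if the centre is on the same side of all three edges
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                covered.add((x, y))
    return covered

# One small triangle on a 10x10 "screen"
pixels = rasterize_triangle((1, 1), (8, 2), (4, 7), 10, 10)
print(len(pixels), "pixels filled")
```

A game model of Lara is made of thousands of such triangles, each run through exactly this kind of inside test before the colour and lighting layers are applied.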
So why was Lara a bunch of blocks 25 years ago while now she looks like a real woman? The answer is the rasterization power of graphics cards. This has increased considerably for both AMD and NVIDIA, and both manufacturers rasterize a little differently and, more importantly, with different performance results. And this is where our peace and harmony start to unravel. With increasing resolution, pure rasterization performance is no longer enough; times are changing, and ray tracing is finding its place in the gaming sphere.
We already know how a bare object arrives on our screen, and we also know that it's only part of the job we need our graphics card to perform. That leaves a huge number of tasks and a lot of hard work that someone has to do to get the desired results. The aforementioned shadows, reflections and all the lighting magic are simply mimicked during rasterization to make the whole scene look real. This doesn't just cost the graphics card computing power, but mainly the time and work of the game developers who have to manually simulate all these aspects.
How does ray tracing fit into this? Simple. With ray tracing, a huge amount of developer work becomes unnecessary. Ray tracing is simply a simulation of real photon behaviour. Light, shadows, and other aspects of the in-game scene then behave the way light really does from the perspective of the human eye. However, end users, in other words gamers, often have no idea what this means for them in real terms. In short, the game just looks more real to you overall. The light behaves naturally, the way we're used to, and our brains are not distracted by fake reflections of a kind we have never seen in the real world.
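The "simulation of real photon behaviour" can be sketched in a few lines: fire a ray from the camera, find where it hits an object, and compute brightness from the angle between the surface and the light. The sphere position, light direction and camera below are invented purely for illustration; real ray tracers fire millions of such rays per frame, with bounces.

```python
import math

def ray_sphere_hit(origin, direction, centre, radius):
    """Return the distance along the ray to the nearest intersection
    with a sphere, or None if the ray misses it."""
    ox, oy, oz = (origin[i] - centre[i] for i in range(3))
    dx, dy, dz = direction
    # Intersection reduces to a quadratic a*t^2 + b*t + c = 0
    a = dx*dx + dy*dy + dz*dz
    b = 2 * (ox*dx + oy*dy + oz*dz)
    c = ox*ox + oy*oy + oz*oz - radius*radius
    disc = b*b - 4*a*c
    if disc < 0:
        return None            # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# Camera at the origin looking down -z at a sphere 5 units away
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
hit = (0, 0, -t)               # point where the ray strikes the sphere
normal = (0, 0, 1)             # surface normal at that point
light_dir = (0, 0, 1)          # light shining from behind the camera
# Lambert's cosine law: brightness falls off with the angle to the light
brightness = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
print("hit at t =", t, "brightness =", brightness)
```

No artist has to fake the shadow or the highlight here; they simply fall out of the geometry, which is exactly why ray tracing saves developers so much manual lighting work.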
Our brains notice the irregularities of faked lighting, which is also why most people don't appreciate ray tracing at first glance: with it, things just look "normal". The Achilles' heel of ray tracing, however, is its incredible computational complexity. It was only with the latest NVIDIA series that graphics cards gained decent enough performance to let games benefit from real-time ray tracing. In addition, the latest NVIDIA cards are equipped with special computing units dedicated exclusively to ray tracing. AMD Radeon cards are also equipped with such units, but in terms of performance they are more comparable to NVIDIA's RTX 2000 series.
The difference between AMD and NVIDIA starts with raw rasterization performance and continues with ray tracing performance and capability. However, for mainstream users, the real difference lies in how it impacts developers. For an example, let's imagine a game studio has