The G200 actually mattered for quite a long time: until a few years ago, most x86 servers shipped a G200 implementation (or at least something pretending to be a G200) as part of their BMC for network KVM.
A lot of GPUs on this list are basically just the previous GPU, but faster or with more RAM. I kind of thought it was going to focus on interesting new architectural innovations.
I think pairing the RX 5700 XT with Control as the "defining game" is an interesting choice, considering that 1) AMD cards were incapable of RT at the time, and 2) Control was basically the first game with a good, comprehensive RT implementation that had a massive positive impact on the graphics.
I remember the main noticeable difference being ray-traced reflections. However, that was mostly on immovable objects in extremely simple scenes (an office building). Old techniques could've gotten 90% of the way there using cubemaps, screen-space reflections, and/or rasterized overlays for dynamic objects like player characters. Or maybe just completely rasterize the reflections, since the scenes are so simple and everything is flat surfaces with right angles anyway. It might even have looked better, because you avoid the issues that come from applying shaders written for a rasterized world to reflected objects.
Games that heavily advertise raytracing typically don't use traditional techniques properly at all, making it seem like a bigger graphical jump than it really is. You're not comparing to a real baseline.
Overall, that was pretty much the worst way to showcase the new tech. It's much more impressive in situations where traditional techniques struggle (such as reflections off irregular surfaces or geometry with no right angles).
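For anyone curious, the cubemap trick described above boils down to the standard reflection-vector formula, R = I - 2(N.I)N: you bounce the view direction off the surface normal and use the result to index a pre-rendered environment map. A minimal sketch in plain Python rather than shader code (the `reflect` helper is hypothetical, mirroring the GLSL built-in's convention where the incident vector points toward the surface):

```python
def reflect(incident, normal):
    """Reflect an incident direction about a unit surface normal.

    Implements R = I - 2 * dot(N, I) * N, the same math a shader
    uses to compute a cubemap lookup direction for reflections.
    """
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

# A view ray going straight down into an upward-facing floor
# bounces straight back up:
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # → (0.0, 1.0, 0.0)
```

The reflected vector is then used to sample the cubemap, which is why the trick works so well on flat, static surfaces (the map never has to change) and falls apart on curved or dynamic geometry.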
The other elephant in the room is the consoles: even where they're capable of RT, developers have to weigh the performance cost against the visual payoff. As I see it, the PC versions of games like Control from studios like Remedy are trailblazers. It was an early implementation (the GeForce 20 series released in 2018, Control in 2019), offered as the ultra option to shake down the tech and start iterating early so future games would benefit; the baseline, however, is non-RT.
We had the Riva TNT2 in our family computer, so that was fun to see that again, I think it was paired with an AMD K6-2 chip.
One day one of my friends from school wanted to optimize the airflow in our computer and re-did the cabling, but he managed to block the CPU fan from spinning. I'm not sure how, but we didn't realise it for a couple of months.
When I got my own PC, it had an AMD Barton chip, and it allowed me to play Half-Life 2.
Absolute nostalgia fever. About a month ago, I dug up an old desktop in the corner, took the drives out and gave away the machine. It felt like putting a racehorse to pasture: i7-4790k, 1080 Ti. It was my dream machine when I got it. Dual-boot (as we did back in the old days when Proton wasn't here) to Ubuntu, then Elementary, then Arch. By the time I gave it away it wasn't worth the power cost.
And that brought to mind my older dream machine, an 8800 GT from generations past. Before that we made do with a VIA UniChrome, which worked well enough on the OpenChrome driver that I could edit open-source software (Freespace only needed a few constants changed) so it would render. Some of the image was smeared and so on, but I could play!
I've been running the worst gaming setup I can get away with, which at the moment is a 3080 10GB with random DDR3 RAM, a budget WD 512GB SSD, and an i5 on the same socket as the i7-4790K that doesn't even support hyperthreading and can't run more than 4 threads in parallel.
It's absolutely laughable at this point, but I'm unironically looking for a deal on that CPU lmao, it would be a huge upgrade.
The title of the site should probably have "for gaming" at the end, as it doesn't consider GPUs for compute, such as the A100 or the GTX 580 3GB that AlexNet was trained on.
I have fond memories of borrowing a Voodoo 2 from a friend when I was moving from a 486 to a K6-based system, component by component. At that time I was still using my old ISA VGA card, which meant 2D performance was horrible and I couldn't really watch videos on that thing, but thanks to the Voodoo I could play Unreal Tournament without problems.
The 8800 GT is easily the most impactful GPU in my mind. The combination of that video card with Valve's Orange Box was an insane value proposition at the time.
I'd put the 5700xt at #2 for being the longest lived GPU I've owned by a very wide margin. It's still in use today.
This brings back so many memories. I remember how badly I wanted a GeForce 6800. Sadly, I was never able to justify spending that much money on a GPU. That still holds true, even today.
If I can at least tell myself that our technological achievements come with efficiency gains instead of just ramping up power draw, I can rest a little better.
Ah, I was just trying to remember the model names last week and this website pops up like magic; weird how the internet works sometimes. The 560 Ti was a dream for teenage me and most of my friends back then, but I must say my Radeon HD 4870 powered most of my favourite Team Fortress 2 years.
I see it as similar to virtual reality: it was born and grew up with gaming demands and influences, but other disciplines may prove more attractive for the mature product.
I don't think there's strong evidence of this being an ad. I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list. I think it's just that Nvidia has been the dominant force in consumer-level GPUs for a while now.
> I don't think there's strong evidence of this being an ad.
There is strong evidence. Click on the link above. It was posted by a viral marketing company. They even feature the GPU story on their website: https://sheets.works/data-viz
> I was surprised to see the Intel Arc A770, a GPU I've never heard of, included on this list.
Yes, because otherwise the ad would be too obvious.
I think it's a terrible UI: it requires three different actions to see the GPUs. You scroll vertically down to find the Era buttons (which then scroll up out of view even if you have enough vertical screen space), click an Era button, and then click the < > buttons to page through that era's GPUs.
I can't remember the last time I saw such a confused design.
At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
Released before the Voodoo 1, with GLQuake and GL support for Tomb Raider.
https://en.wikipedia.org/wiki/TMS34010
Also, the term "GPU" did not exist until 1999.
Looks like this was created for engagement.
Combined with the color scheme of this site, this might be a cleverly disguised Nvidia ad.
Edit: Clicking through to their main page [1]: yeah, that's definitely an Nvidia ad.
1: https://sheets.works/data-viz/hire