Nvidia's RTX 2060 Was Never Fast Enough for Ray Tracing

The 20 series was an exciting time. Pricing was still reasonable, Ray Tracing was the future and everyone was still high off of how awesome the 1080Ti was. Granted, even the high end cards had a rough time with RT, but it was still fun.

Last time I was that excited over computing was when the G92/8800GT came out. Now we have the 4090, which is no doubt a tech wonder, but with prices this high it's hard to get excited about any of it. Scalpers really ruined the pricing structure for all of us.

It'll be interesting to see how we look back on current gen tech in 5-6 years.
 
"Nvidia were attempting two things: one, to sell gamers on RTX features like ray tracing and DLSS instead of raw performance improvements, and two, to increase GPU prices overall and elevate the mid-range in response to gamers buying more expensive graphics cards."

You think? Let us know when they stop doing this.
 
I'm impressed it does as well as it does. What surprises me is how much people cry about RT. The first gen of hardware T&L cards was completely useless within a year, the first DX11 cards were worthless once DX11 became popular, etc.
 
I'm impressed it does as well as it does. What surprises me is how much people cry about RT. The first gen of hardware T&L cards was completely useless within a year, the first DX11 cards were worthless once DX11 became popular, etc.

I'm surprised anyone still has the courage to defend this crap.
Drawing analogies between past and present is a futile exercise that overlooks the unique nuances of each issue, reducing them to an apples-to-oranges comparison. The only impact of RT has been to inflate costs, waste energy, and spawn a generation of broken games marketed as AAA titles merely for incorporating irrelevant RT tech.

 
I'm surprised anyone still has the courage to defend this crap.
Drawing analogies between past and present is a futile exercise that overlooks the unique nuances of each issue, reducing them to an apples-to-oranges comparison. The only impact of RT has been to inflate costs, waste energy, and spawn a generation of broken games marketed as AAA titles merely for incorporating irrelevant RT tech.
No courage needed, friend, just stating the obvious.

If you don't like RT, turn it off and enjoy your games, along with being able to squeeze even more life out of your GPU.
 
The point of this card was never RT, but DLSS.

DLSS is great for longevity and 600+ games have DLSS now. I know people still using this crap card because of DLSS.
 
I had a 2080Ti, and I feel that even there RT was only sufficient to try things out (at 1440p). In my view, to really enjoy RT at 1440p, you need at least 3090-level RT performance (aside from enough VRAM). And even that bar has probably moved recently. That said, with FG on, my 4070Ti Super handles CP77 decently enough, even with path tracing. And in games like CP77, however heavy the RT burden may be, RT really makes the game world a truly different place. Global illumination alone is a game changer, in my view.
 
The 2060 did what it needed to do: it paved the way for raytracing hardware to eventually reach the majority of home gaming setups, where before there was none at all. It wasn't intended to be that good, it was just intended to exist, knowing that future models and generations would improve the tech once the foundation was in place.

 
For my first modern desktop PC I went with an RTX 2070, maybe 6-9 months before the Super refresh, which was a tad annoying. It was advertised as a great 1440p or maybe even 4K card, and without RT, at the time it probably could eke out 60fps at 4K in some titles, but it was definitely a 1440p card. With RT, you definitely could not do max settings in most games at native resolution, and even then it would still probably drop below 60. I remember trying Quake 2 RTX and even that would drop below 60. It was still a solid card for the price; I think it had an MSRP of $500, which is still a bargain for an xx70 card compared to today's prices. RT was just a bit of a novelty that you'd use for screenshots, not really real-time gameplay. DLSS was and still is a game changer, it got me a solid 5 years out of the card, and probably would have gone further if either the card or my motherboard at the time hadn't started having issues with PCIe lane width.
 
I don't have the ability to repurpose the die space wasted on dedicated RT structures into something more useful, let alone fix this generation of broken games. That's the obvious.
You don't need to, you can just play the games with raster; in raster alone, the current crop of GPUs are total overkill for most games unless you are at 4k. Freaking out over tensor cores ain't worth it. Go play some good games and enjoy your GPU! Or don't, keep complaining, much like the people who used to whine about how we couldn't have six core consumer CPUs for nearly a decade while the rest of us were buying quad core CPUs, playing great games and having a great time. The march of technology will never stop; even if you don't like RT, it's not going away.
 
I did say at the time that the 2060 was unfit for RayTracing. Didn't stop Techspot (and a lot of other tech 'journos') from praising its RT prowess in every subsequent Graphics card review.
Pity it took six years to issue a Mea Culpa.
 
The 20 series was an exciting time. Pricing was still reasonable, Ray Tracing was the future and everyone was still high off of how awesome the 1080Ti was. Granted, even the high end cards had a rough time with RT, but it was still fun.

Last time I was that excited over computing was when the G92/8800GT came out. Now we have the 4090, which is no doubt a tech wonder, but with prices this high it's hard to get excited about any of it. Scalpers really ruined the pricing structure for all of us.

It'll be interesting to see how we look back on current gen tech in 5-6 years.

Turing had zero improvement in price/performance vs Pascal at any level of the stack. For those that had a 1080Ti (about a $700 card), the ONLY upgrade was the 2080Ti at ~$1300. Turing was a bust.
 
The 20 series was an exciting time. Pricing was still reasonable, Ray Tracing was the future and everyone was still high off of how awesome the 1080Ti was. Granted, even the high end cards had a rough time with RT, but it was still fun.
Exciting? Reasonable prices? Fun?

Are we talking about the same launch here? Because I remember it very differently. The 20 series launch was horrible. They were such bad cards. Expensive. Barely any improvement in raster perf over the 1080 Ti while massively hiking prices. VRAM amount either decreased at the same price (1080 Ti vs 2080) or at best stayed the same (1080 Ti vs 2080 Ti). Not to mention that DLSS 1 was a vaseline filter. Complete joke. And RT took 3 months after launch to arrive, in a multiplayer game of all places, the last genre where you want to tank your framerate. Denoising was bad and slow. It was a total mess.
I'm impressed it does as well as it does. What surprises me is how much people cry about RT. The first gen of hardware T&L cards was completely useless within a year, the first DX11 cards were worthless once DX11 became popular, etc.
T&L and DX11 were improvements. Meant to make games run better. Better visuals were a bonus on top of that. RT is just a marketing gimmick meant to increase prices.
The 2060 did what it needed to do: it paved the way for raytracing hardware to eventually reach the majority of home gaming setups, where before there was none at all. It wasn't intended to be that good, it was just intended to exist, knowing that future models and generations would improve the tech once the foundation was in place.
But did it? Can 4060 players really enable RT in a meaningful way and enjoy a smooth experience? They can't. Everyone is quick to dunk on AMD for always being one gen behind in RT perf and not improving enough, while ignoring the fact that Nvidia has done nothing to really bring good RT to 60 class cards.
current crop of GPUs are total overkill for most games unless you are at 4k.
Tell that to UE5 players who have a hard time running these games, even at lower resolutions and without RT, at a locked 60 with proper frame pacing.
More raster performance is always beneficial. People who say that current GPUs are fast enough for raster have no idea what they're talking about.
 
I did say at the time that the 2060 was unfit for RayTracing. Didn't stop Techspot (and a lot of other tech 'journos') from praising its RT prowess in every subsequent Graphics card review.
Pity it took six years to issue a Mea Culpa.

You're absolutely wrong. TechSpot kept saying that the RTX 20 GPUs (and even mid-range 30 series and 40 series) were too slow to use ray tracing.

It's DLSS that went from crap to pretty good.
 
Turing had zero improvement in price/performance vs Pascal at any level of the stack. For those that had a 1080Ti (about a $700 card), the ONLY upgrade was the 2080Ti at ~$1300. Turing was a bust.
To be fair, upgrading every generation has always been a disappointing waste of cash. Turing was not unique in this regard.
Exciting? Reasonable prices? Fun?

Are we talking about the same launch here? Because I remember it very differently. The 20 series launch was horrible. They were such bad cards. Expensive. Barely any improvement in raster perf over the 1080 Ti while massively hiking prices. VRAM amount either decreased at the same price (1080 Ti vs 2080) or at best stayed the same (1080 Ti vs 2080 Ti). Not to mention that DLSS 1 was a vaseline filter. Complete joke. And RT took 3 months after launch to arrive, in a multiplayer game of all places, the last genre where you want to tank your framerate. Denoising was bad and slow. It was a total mess.

T&L and DX11 were improvements. Meant to make games run better. Better visuals were a bonus on top of that. RT is just a marketing gimmick meant to increase prices.

But did it? Can 4060 players really enable RT in a meaningful way and enjoy a smooth experience? They can't. Everyone is quick to dunk on AMD for always being one gen behind in RT perf and not improving enough, while ignoring the fact that Nvidia has done nothing to really bring good RT to 60 class cards.

Tell that to UE5 players who have a hard time running these games, even at lower resolutions and without RT, at a locked 60 with proper frame pacing.
More raster performance is always beneficial. People who say that current GPUs are fast enough for raster have no idea what they're talking about.
DX11 and hardware T&L were focused on major improvements to tessellation and deformation, and to rendering lights and shadows, respectively. Their primary purpose was not to make things run smoother. Nothing in their promotions said "our main goal is to get more frames than *previous tech*".

UE5 being an unoptimized mess is not the fault of GPUs, no more so than Crysis being an absolute bear to run meant that the 8800 Ultra was a slow card.

The 4060 is a low end dGPU, and low end cards can rarely make full use of their tech. Were you crying about early low end DX11 cards like the GTS 450 not being able to run DX11 rendering effectively? Probably not, because everyone wasn't so cynical back then.
 
Only one game ran at 1080p and the rest at 720p, nice.
So the 2060 is a 720p card (with RT).

"Enabling RT reflections reduces performance but still maintains a frame rate above 60 FPS. Granted, this is at 1080p"
No, 720p. Stop lying in your reviews; enabling DLSS lowers the resolution.
If you are going to use this tech, then at least say in the article that the resolution has been lowered to 720p and then upscaled.
At least let people know that Nvidia wants you to lower your res to PS3 days so you can play games at 60 (maybe) with a power hog tech called RT.
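
For reference on the 720p figure: DLSS renders internally at a fraction of the output resolution and then upscales. Below is a minimal sketch of that arithmetic, assuming the commonly cited per-axis scale factors for the DLSS presets (Quality about 2/3, Balanced about 0.58, Performance 0.5, Ultra Performance about 1/3); these values are assumptions for illustration, not figures from the review.

    # Approximate internal render resolution for DLSS presets.
    # The per-axis scale factors are the commonly cited values and are
    # assumptions here, not numbers taken from the article.
    PRESET_SCALE = {
        "Quality": 2 / 3,
        "Balanced": 0.58,
        "Performance": 0.5,
        "Ultra Performance": 1 / 3,
    }

    def internal_resolution(out_w: int, out_h: int, preset: str) -> tuple[int, int]:
        """Return the approximate internal render resolution for a given preset."""
        scale = PRESET_SCALE[preset]
        return round(out_w * scale), round(out_h * scale)

    if __name__ == "__main__":
        # At 1080p output, DLSS Quality renders at roughly 1280x720,
        # which is where the "720p" figure above comes from.
        for preset in PRESET_SCALE:
            w, h = internal_resolution(1920, 1080, preset)
            print(f"1080p output, DLSS {preset}: ~{w}x{h} internal")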
 
The good thing about RT is that I can run games at 1440p high/ultra quality by disabling it on my good old RX 6800.

As long as this technology takes up so many resources, devs have to account for those who are going to enable it, which leaves more performance headroom for those of us who leave it off.
 
Now we have the 4090, which is no doubt a tech wonder, but with prices this high it's hard to get excited about any of it. Scalpers really ruined the pricing structure for all of us.

It'll be interesting to see how we look back on current gen tech in 5-6 years.

You make that change sound as if it were just scalpers (and maybe the pandemic, as we were told by jolly uncle Jensen) and totally not intentionally planned in what was actually a really grubby fashion (again, as the aforementioned character pretty much admitted).

In 5-6 years... well, much could change. I can see it getting worse before... well, it might get better, maybe. It's fairly likely, though, that those of us who haven't been drained out of contention, downgraded, or fled to consoles could all end up Nvidia users on the graphics side, whether we want to or not by that point. On the other hand, the job market might be running at capacity, with more people taking second or third jobs to pay those kinds of prices for 4K cards they'll have less time to use...
 
To be fair the thing was as fast as the GTX1080. You would kill these days to have a midrange x060 part as fast as the previous generation x080 wouldn't you?

GTX1080 was a $599 part when new. RTX2060 was $349.

Imagine the RTX4060 was as fast as the RTX3080 for a $250 cost reduction.

Not that long ago that performance jump was normal. The RTX2060 doesn't seem too bad at all in hindsight.
 
To be fair the thing was as fast as the GTX1080. You would kill these days to have a midrange x060 part as fast as the previous generation x080 wouldn't you?

GTX1080 was a $599 part when new. RTX2060 was $349.

Imagine the RTX4060 was as fast as the RTX3080 for a $250 cost reduction.

Not that long ago that performance jump was normal. The RTX2060 doesn't seem too bad at all in hindsight.
Lol, and how much are xx80 series cards now?
 
Exciting? Reasonable prices? Fun?

Are we talking about the same launch here? Because I remember it very differently. The 20 series launch was horrible. They were such bad cards. Expensive. Barely any improvement in raster perf over the 1080 Ti while massively hiking prices. VRAM amount either decreased at the same price (1080 Ti vs 2080) or at best stayed the same (1080 Ti vs 2080 Ti). Not to mention that DLSS 1 was a vaseline filter. Complete joke. And RT took 3 months after launch to arrive, in a multiplayer game of all places, the last genre where you want to tank your framerate. Denoising was bad and slow. It was a total mess.

T&L and DX11 were improvements. Meant to make games run better. Better visuals were a bonus on top of that. RT is just a marketing gimmick meant to increase prices.

But did it? Can 4060 players really enable RT in a meaningful way and enjoy a smooth experience? They can't. Everyone is quick to dunk on AMD for always being one gen behind in RT perf and not improving enough, while ignoring the fact that Nvidia has done nothing to really bring good RT to 60 class cards.

Tell that to UE5 players who have a hard time running these games, even at lower resolutions and without RT, at a locked 60 with proper frame pacing.
More raster performance is always beneficial. People who say that current GPUs are fast enough for raster have no idea what they're talking about.
Do you hate fun?
 
Nvidia's RTX 2060 Was Never Fast Enough for Ray Tracing
And yet, as the charts above show, it does a good job of it. What was the point here?

This article seems like spin-doctoring to me.
 