Ray Tracing & DLSS with the GeForce RTX 3080

DLSS 2.0 seems nice; I'd like to use my 1440p monitor at 144 fps. Is it any better than Trixx Boost with image sharpening? Probably, since it doesn't bug out. At the very least it's a nice option for us plebs who bought a high-res monitor and a mid- or low-end card.

Pretty sure most people just want 120-plus fps, not RT. I don't hate RT, but I do have a loving affliction for fps.

P.S. Where is captain cranky? My grammar needs 'em.
 
What I'd like to see is more DLSS support for Ultra-Wide (21:9) and Super-Ultra-Wide (32:9) resolutions.
 
Rubbish update on RT performance. It needed to be a minimum of 100% better than Turing, but the RT cores are barely any better at all, and the card relies on the brute force of the 3080 and DLSS 2 to show decent increases. Each SM is roughly only two-thirds as good as Turing's SMs at RT, so it needs twice as many for an overall 33% uplift.
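A minimal sketch of that arithmetic, taking the comment's own per-SM and SM-count figures as assumptions rather than measured data:

```python
# Back-of-envelope check of the claim above.
# Assumed (not measured): each Ampere SM delivers ~2/3 of a Turing SM's RT throughput,
# but there are ~2x as many SMs, so the aggregate uplift is 2 * (2/3) ~= 1.33, i.e. ~+33%.
per_sm_ratio = 2 / 3     # claimed Ampere-vs-Turing per-SM RT throughput
sm_count_ratio = 2.0     # claimed increase in SM count
overall = per_sm_ratio * sm_count_ratio
print(f"Overall RT uplift: {overall:.2f}x (~{(overall - 1) * 100:.0f}% faster)")
# -> Overall RT uplift: 1.33x (~33% faster)
```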
 
IMO ray tracing is still trash. There's barely any visual difference, and that's when you're looking for it, and it makes games unplayable at 4K. DLSS is a good feature though; AMD needs something like this, being the "budget" option most of the time.
 
The only thing I've noticed is that my 3080 runs about 10 degrees C warmer than my 2080 Ti did.

It definitely gets higher framerates, though.
 
Is it just me, or does the image with RTX turned off look better than the one with RTX turned on? Sometimes "improvements" aren't actually an improvement...
 
It's oxymoronic to keep naming Nvidia's cards "RTX" while the feature keeps dragging their performance down.

It may be a feature, but it's not a practical one, at least not for now and probably not for a few more generations to come.

What we need is a smooth, continuous 60 fps or higher MINIMUM framerate at any given resolution.
 
Rubbish update on RT performance. It needed to be a minimum of 100% better than Turing, but the RT cores are barely any better at all, and the card relies on the brute force of the 3080 and DLSS 2 to show decent increases. Each SM is roughly only two-thirds as good as Turing's SMs at RT, so it needs twice as many for an overall 33% uplift.
What are you on about? The 3080 is nearly double the performance of the 2080. Look at pretty much the only game that's completely RT-core bound, Quake II RTX: it goes from an average of 33 fps to 65 fps...
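For reference, the averages quoted here (the commenter's figures, not retested numbers) work out to just under a 2x speedup:

```python
# Speedup implied by the quoted averages: 33 fps (2080) -> 65 fps (3080).
turing_fps, ampere_fps = 33, 65
print(f"Speedup: {ampere_fps / turing_fps:.2f}x")  # -> Speedup: 1.97x, i.e. nearly double
```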
 
The second person and half the room are too dark to see in the RTX Ultra picture!

Is this an "improvement" worth throwing (big) money at??
 
Control is still the only game where we can clearly see that RTX ON is much better than OFF. I'm not counting Q2RTX, Fortnite or Minecraft, as those are games with very primitive or obsolete graphics. So a "bonus feature" - yes, but the size of the bonus is... tiny. As an upgrade from a 1080 Ti the overall performance uplift is now OK, about 1.7-2x at 1440p/4K, but we still get a VRAM size downgrade. As an upgrade from a 2080 Ti it's not enough value - you only get +10-15% FPS if your 2080 Ti is overclocked. The 3080 has essentially zero OC headroom right now; some owners even have to remove a very small factory OC to get it stable (boosting over 2 GHz is problematic).
 
Why is Techspot using a 3950X? It's slower than an i5-10600K in games, and everybody knows PCIe 4.0 doesn't give the cards a boost. They even admit it's causing limitations at 1440p - the resolution most people are playing at if they don't have a 1080p monitor.

I guess they must be big fans of AMD.

 
Why is Techspot using a 3950X? It's slower than an i5-10600K in games, and everybody knows PCIe 4.0 doesn't give the cards a boost. They even admit it's causing limitations at 1440p - the resolution most people are playing at if they don't have a 1080p monitor.

I guess they must be big fans of AMD.
They do it to make you seethe in every AMD-related thread.
 
Yet more evidence that I'm not at all concerned with ray tracing and DLSS. Outside of Control there are zero games where the difference is noticeable. Even Nvidia's own screenshots of the likes of Fortnite show the difference: there's a tree reflection in the window now, and your FPS is halved. Yay?

All that RT technology and it looks like garbage. I remember the lighting effects of PS2 games being more impressive than RT, and GPUs without RT cores would be much cheaper to produce and smaller, meaning better yields and fewer shortages.

I'd much prefer a greater focus on rasterization performance, maybe better hardware acceleration of things like lighting effects, so you GAIN performance instead of losing it for an extra light or some junk.
 
Rubbish update on RT performance. It needed to be a minimum of 100% better than Turing, but the RT cores are barely any better at all, and the card relies on the brute force of the 3080 and DLSS 2 to show decent increases. Each SM is roughly only two-thirds as good as Turing's SMs at RT, so it needs twice as many for an overall 33% uplift.
If you look at Nvidia's deep dive on ray tracing, where they show frame-rendering timers, you'll see that a game spends only about 10% of frame time in the RT cores - the rest is pure shader processing. So even if you doubled RT core performance, you would only get about a 5% increase in overall performance. Ray tracing is incredibly heavy on pure shader work, which is why performance scales with shader compute power and not with RT cores.
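Here is a minimal Amdahl's-law style sketch of that argument, using the comment's assumed 10% RT-core share of frame time (an assumption, not profiled data):

```python
# Amdahl's-law sketch: speeding up only the RT-core portion of a frame.
def overall_speedup(rt_fraction: float, rt_speedup: float) -> float:
    """Frame-rate gain if only the RT-core share of frame time gets faster."""
    new_frame_time = (1 - rt_fraction) + rt_fraction / rt_speedup
    return 1 / new_frame_time

# Assumed: RT cores account for ~10% of frame time; doubling their speed buys only ~5%.
print(f"{overall_speedup(rt_fraction=0.10, rt_speedup=2.0):.3f}x")  # -> 1.053x
```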
 
Why is Techspot using a 3950X? It's slower than an i5-10600K in games, and everybody knows PCIe 4.0 doesn't give the cards a boost. They even admit it's causing limitations at 1440p - the resolution most people are playing at if they don't have a 1080p monitor.

I guess they must be big fans of AMD.

Because they asked their readers, and ~65K of them (some 85%) voted in favor of using an AMD CPU. Simple, isn't it?

(And yes, an overclocked 10th-gen Intel would be faster, especially at 1080p, while 1440p and beyond suffer little from the bottleneck of AMD CPUs, especially on ultra settings. Think of it this way: if you have a high-clocked 10th-gen Intel CPU, you will get even more FPS than shown here in the tests.)
 
Even Nvidia's own screenshots of the likes of Fortnite show the difference: there's a tree reflection in the window now, and your FPS is halved. Yay?

I hope this makes you appreciate reflections in the real world. Every time you see a reflection in a mirror or a window, ask yourself: how much faster would the real world run with reflections off!
 
The only thing I've noticed is that my 3080 runs about 10 degrees C warmer than my 2080 Ti did.

It definitely gets higher framerates, though.

I've had a totally different experience with my 3080. I previously had an EVGA 2080 Ti XC Ultra and now have a Founders Edition 3080, and the FE 3080 runs significantly cooler and overclocks a lot higher as well. It's the first card I've ever got 2100 MHz stable from - 2115 MHz in fact before benchmarks start to crash - and my highest temps are 69 C. The 2080 Ti I had, with a heavy overclock (but only 2070 MHz max), would easily go into the mid-70s under the same conditions.

What I will say is that my setup more than likely complements the Founders Edition's new design: I'm using a Lian Li O11 case with 9 fans, 3 of which are directly underneath the GPU, and of course the 3080 now ejects a large majority of its hot air out of the back of the case, unlike my 2080 Ti, which exhausted into the case from the side like most third-party cards. So yeah, it's probably a best-case scenario for my 3080, but I'm super impressed with it so far. I read so much about how the 3080 does not overclock, but that's just not the case for me, and temps stay very good.
 