Nvidia GeForce RTX 4080 Review: Fast, Expensive & 4K Capable All the Way

Don't know how long I've played Control; it's fun, which is all I care about. I can toggle on invincibility and just mess around with the game mechanics, be a psychic demigod.

On the other hand, I wouldn't touch Warzone, because running around being gunned down by tryhards... isn't fun. But even if I did, I'd still turn everything on, because that's the reason I burn cash on this PC of mine: the graphics.
What monitor do you have?
 
Don't know how long I've played Control; it's fun, which is all I care about. I can toggle on invincibility and just mess around with the game mechanics, be a psychic demigod.

On the other hand, I wouldn't touch Warzone, because running around being gunned down by tryhards... isn't fun. But even if I did, I'd still turn everything on, because that's the reason I burn cash on this PC of mine: the graphics.

Yeah, I know a lot of people who do not like online games...
 
No monitor for me; I use a 65" 4K TV.
I'm kinda the same, but my 4K TV is only 55". Hell, I got it brand-new at Costco 5-6 years ago for UNDER $500 when the rest of them were over $1,000. I think that's because it's only 60Hz and it's not a "smart" TV. Truth be told, I didn't want a "smart" TV because I'd heard stories of Samsung and LG "smart" TVs spying on their owners and sending information back to Korea. Since I was hooking my main PC up to it anyway, I saw no advantage to getting a "smart" TV. My TV is just a 55" 60Hz panel with a TV tuner that I never use. :laughing:
 
I'm kinda the same, but my 4K TV is only 55". Hell, I got it brand-new at Costco 5-6 years ago for UNDER $500 when the rest of them were over $1,000. I think that's because it's only 60Hz and it's not a "smart" TV. Truth be told, I didn't want a "smart" TV because I'd heard stories of Samsung and LG "smart" TVs spying on their owners and sending information back to Korea. Since I was hooking my main PC up to it anyway, I saw no advantage to getting a "smart" TV. My TV is just a 55" 60Hz panel with a TV tuner that I never use. :laughing:
Years ago, when my PS3 died, a guy at work said I should build a PC instead of buying yet another PS3, so I did. Now, years and a few builds later, here I am.

Sitting on the couch playing FF14 with an Xbone controller + chatpad can't be beat, and the PC handles all of the "smart" features I'd want.
 
Do you know how that and FSR work?
They're "cheating" by taking a lower-resolution image, upscaling it, then displaying it at what you think is native resolution.
Why? Simple: look at your own answer. To increase FPS, and then use that metric (the way RT is being abused now) to boast some weird superiority beyond what the owner can perhaps even do with it (the FPS the monitor can actually display).
I know exactly how it works. Like I said elsewhere in this thread, I only offered some (clearly unwanted) insight into why I purchased the card. The fact remains: I personally want 165 frames on games I play. I could not do that with a 3080. If you take issue with that, that's perfectly fine. Don't buy the card.
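(For anyone wondering what "taking a lower-resolution image and upscaling" amounts to in numbers, here's a minimal sketch. The scale factors below are the published FSR 2 quality-mode ratios; DLSS 2 uses similar ones, and individual games can pick their own values, so treat this as an illustration rather than a spec.)

```python
# Rough illustration of temporal upscaling: the game renders each frame at a
# reduced internal resolution, then the upscaler (DLSS/FSR) reconstructs an
# image at the display's native resolution.
# The ratios are the documented FSR 2 quality modes; actual per-game values
# can differ.

QUALITY_MODES = {
    "Quality": 1.5,            # renders at 1/1.5 of native per axis
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_resolution(native_w: int, native_h: int, mode: str) -> tuple[int, int]:
    """Approximate internal render resolution for a given output resolution."""
    ratio = QUALITY_MODES[mode]
    return round(native_w / ratio), round(native_h / ratio)

if __name__ == "__main__":
    for mode in QUALITY_MODES:
        w, h = internal_resolution(3840, 2160, mode)  # 4K output
        print(f"{mode:>17}: renders {w}x{h}, displayed as 3840x2160")
```

At 4K output, for example, "Quality" mode renders internally at roughly 2560×1440 before the upscaler reconstructs the 3840×2160 frame you actually see.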
 
Years ago, when my PS3 died, a guy at work said I should build a PC instead of buying yet another PS3, so I did. Now, years and a few builds later, here I am.

Sitting on the couch playing FF14 with an Xbone controller + chatpad can't be beat, and the PC handles all of the "smart" features I'd want.
Yup, no "smart" TV is anywhere close to as smart as an actual PC.
 
I know exactly how it works. Like I said elsewhere in this thread, I only offered some (clearly unwanted) insight into why I purchased the card. The fact remains: I personally want 165 frames on games I play. I could not do that with a 3080. If you take issue with that, that's perfectly fine. Don't buy the card.
And there you have the reason why DLSS 3's fake-frame-generated numbers are a thing to "desire". 🤷🏼‍♂️
 
I know exactly how it works. Like I said elsewhere in this thread, I only offered some (clearly unwanted) insight into why I purchased the card. The fact remains: I personally want 165 frames on games I play. I could not do that with a 3080. If you take issue with that, that's perfectly fine. Don't buy the card.
I'm still curious as to which games you play, because I'm trying to imagine what 165fps looks like. If it's a big PvP game, then I understand completely. You don't have to say (although I don't know why it'd be a secret); I was just asking.

I play single-player FPS/RPG games like Skyrim, The Witcher (2 & 3), Assassin's Creed: Odyssey and Far Cry (3, 4, 5 and 6). I don't know how much 165fps would help in those games, but in e-sports titles it would absolutely be beneficial.
 
I'm still curious as to which games you play, because I'm trying to imagine what 165fps looks like. If it's a big PvP game, then I understand completely. You don't have to say (although I don't know why it'd be a secret); I was just asking.

I play single-player FPS/RPG games like Skyrim, The Witcher (2 & 3), Assassin's Creed: Odyssey and Far Cry (3, 4, 5 and 6). I don't know how much 165fps would help in those games, but in e-sports titles it would absolutely be beneficial.
Not a secret. I went to DreamHack this weekend and didn't really check much. I primarily play Modern Warfare when looking for frames on a single monitor. Otherwise I play MS Flight Simulator and F1 on a triple-monitor setup, and I rotate RPGs in between.
 
Not a secret. I went to DreamHack this weekend and didn't really check much. I primarily play Modern Warfare when looking for frames on a single monitor. Otherwise I play MS Flight Simulator and F1 on a triple-monitor setup, and I rotate RPGs in between.
Well, if it's what you want and you can appreciate it, then I can't fault you for going after it. You're allowed to want what you want. (y) (Y)
 
Well, if it's what you want and you can appreciate it, then I can't fault you for going after it. You're allowed to want what you want. (y) (Y)
That's really all I've been trying to say. My intention was just to discuss the card and why someone might be interested in paying that price for it. Also, like I mentioned early on in the thread, I will (obviously) sell my 3080 and recoup several hundred on the "upgrade" price for the 4080. That's what I consider it. It's an upgrade for which I'll end up paying about $800. Same thing I've done since I first purchased a 2GB 1050ti. If I was starting a build, like I've said, I would absolutely not straight up pay $1,350 for a 4080.
 
I think you misunderstand. I'm saying that I would prefer to NOT use DLSS.
On that we agree, but your answer wasn't clear; it just said that you want 165 fps, period.

Now, personally, I can't see a difference above 60 fps, but I can see microstutters and jaggies. I'm not sure whether those are fixed by having a monitor and GPU that push 165 fps, fake or real.

All I see is that you're dead set on staying with Nvidia, since you already said you'll justify buying a 4080 regardless of what the 7900 XT and 7900 XTX have to offer... unless I read your other comments wrong too.
 
On that we agree, but your answer wasn't clear; it just said that you want 165 fps, period.

Now, personally, I can't see a difference above 60 fps, but I can see microstutters and jaggies. I'm not sure whether those are fixed by having a monitor and GPU that push 165 fps, fake or real.

All I see is that you're dead set on staying with Nvidia, since you already said you'll justify buying a 4080 regardless of what the 7900 XT and 7900 XTX have to offer... unless I read your other comments wrong too.
Yea, I probably could have been more clear. I have four monitors capable of 165, and I just like to shoot for that. It's a game within a game to me. I enjoy it. I enjoy tinkering with settings for hours on end to push games (and software) as far as I can. That includes overclocking. Overclocking RAM, CPU, GPU, creating various profiles and custom curves, you name it. DLSS, love it or hate it, allows that to happen in games that are particularly tough on a card. But I would certainly prefer not to use it because it does come with its own set of implications.

There is most definitely a perceptible difference (to me anyway) in most games between 60 and, say, 120 and 165. 60 and 75? A little.

Admittedly, yes, I am probably dead set on Nvidia. But that's essentially all I've ever owned. I've been very close a couple of times at picking up an AMD card, but just haven't.
 
That's really all I've been trying to say. My intention was just to discuss the card and why someone might be interested in paying that price for it. Also, like I mentioned early on in the thread, I will (obviously) sell my 3080 and recoup several hundred on the "upgrade" price for the 4080. That's what I consider it. It's an upgrade for which I'll end up paying about $800. Same thing I've done since I first purchased a 2GB 1050ti. If I was starting a build, like I've said, I would absolutely not straight up pay $1,350 for a 4080.
Well, that's fair. If you can do it, go for it. Although I would say that if you're satisfied with the RT performance of the 3080, you might consider the 7900 XTX instead. If you want the best in RT, though, there's no question that the RTX 4080 will be better than the 7900 XTX.

That said, with costs being what they are, you might consider just biting the bullet and getting the 4090, because the performance delta between it and the 4080 is like the Grand Canyon.
 
An RTX 4060 for 400 euros, maybe.
Maybe, but I still wouldn't touch nVidia right now even if I liked them. When so many people just buy their products without a second thought, it's no wonder that their prices are out in space. People need to buy something else for their next two upgrades to give nVidia the message. Otherwise, all the moaning and whining that they do about nVidia pricing is useless. It's like listening to a child who never ties their shoes complain about tripping over their shoelaces. It's just dumb.
 
Maybe, but I still wouldn't touch nVidia right now even if I liked them. When so many people just buy their products without a second thought, it's no wonder that their prices are out in space. People need to buy something else for their next two upgrades to give nVidia the message. Otherwise, all the moaning and whining that they do about nVidia pricing is useless. It's like listening to a child who never ties their shoes complain about tripping over their shoelaces. It's just dumb.
I see your point, but AMD has to release cards that challenge Nvidia on every feature, and right now they don't do that. They may be cheaper and match or outdo Nvidia on rasterization, but when other features come into play, AMD's cracks start to show. And when someone is about to spend $1,000 or more on just a GPU, that stuff starts to be considered, even if they're never going to use it.

I work for a company that builds systems, and as of now clients don't even look at AMD graphics. Maybe the new ones will start to change that; we'll see. We don't even lock up the AMD cards, no one wants to steal 'em.

But we're talking systems that cost thousands, and when cash like that is in play, people want all the bells and whistles, price be damned, and Nvidia thrives there right now.
 
I see your point, but AMD has to release cards that challenge Nvidia on every feature, and right now they don't do that. They may be cheaper and match or outdo Nvidia on rasterization, but when other features come into play, AMD's cracks start to show. And when someone is about to spend $1,000 or more on just a GPU, that stuff starts to be considered, even if they're never going to use it.

I work for a company that builds systems, and as of now clients don't even look at AMD graphics. Maybe the new ones will start to change that; we'll see. We don't even lock up the AMD cards, no one wants to steal 'em.

But we're talking systems that cost thousands, and when cash like that is in play, people want all the bells and whistles, price be damned, and Nvidia thrives there right now.
Well sure, and in that case I completely agree that people will pay whatever. BTW, where do you work? I'll happily steal all of the non-locked-up AMD cards from you! :laughing:
 
Right now the 6700 XT has reached $382 where I live. It's 50% better than the 2060 I have now and would be a nice upgrade for me, but I'd rather wait for the new cards next spring or summer.
The only thing stopping me now is the 230W label. Maybe the 7000 series will have lower power draw for the same performance.

On YouTube I've seen some guys undervolting this card to under 150W without losing any performance.

 
Right now the 6700 XT has reached $382 where I live. It's 50% better than the 2060 I have now and would be a nice upgrade for me, but I'd rather wait for the new cards next spring or summer.
The only thing stopping me now is the 230W label. Maybe the 7000 series will have lower power draw for the same performance.

On YouTube I've seen some guys undervolting this card to under 150W without losing any performance.

When upgrading, the most important question is "Do I NEED to upgrade?" If you're gaming along just fine with the RTX 2060, there is literally no point in upgrading. Being able to wait until next summer just tells me that you don't need to upgrade yet, and so you shouldn't.

If I hadn't been able to get my hands on an unbranded OEM reference RX 6800 XT, I'd still be using my RX 5700 XT THICC-III because I was gaming along just fine with it. I didn't upgrade because I needed to; I upgraded because I wanted the first reference Radeon card in decades not to come with a blower. When they all sold out, I was actually disappointed because I wanted to get one, so when I found one on StockX that was literally $500 cheaper than everywhere else, I pulled the trigger.

I also think it's beautiful, and not many cards make me think that. To me, video cards are generally "meh"-looking (and I don't care, because they go INSIDE the case), but that new reference card just looked great. I bought it almost as a work of art more than as a video card. It's actually the only time that I've bought a card for reasons other than needing it for a certain game (like when I bought two HD 7970s for ARMA III). When I think about it though, I did also want it because I needed 11+GB of VRAM for the HD textures in Far Cry 6, and the 5700 XT only had 8GB. I had already beaten the game but was curious. In retrospect, it wasn't worth it. :laughing:

The RTX 2060 is still a very capable card for 1080p gaming so if that's what you do, I'd wager that you'd still be fine over a year from now. Remember that the most common discrete gaming card in the world according to Steam is the old GTX 1060. Your card is FAR more potent than that.
 
Admittedly, yes, I am probably dead set on Nvidia. But that's essentially all I've ever owned. I've been very close a couple of times at picking up an AMD card, but just haven't.
Well, I can tell you from experience that the brand of your video card doesn't make that much difference to your experience. I've had ATi/AMD and nVidia, but I've also had card brands that you've never heard of, like Matrox, Cirrus Logic and Oak Technologies. They all do the same thing, in the way that all PC parts of a given type do the same thing.

The difference between a Radeon and a GeForce card is like the difference between a Ryzen and a Core CPU. They're not exactly alike, but you're really splitting hairs when trying to tell them apart. I saw a Daniel Owen video comparing DLSS 2.0 to FSR 2.0 in God of War, and the only difference he could find was how the mist of Kratos' breath looked as it passed over his beard. I ask you, who the hell is going to notice THAT when playing a game? When they have to look at things like that to find a difference, it means there is no difference. The two videos side by side were indistinguishable to me (and I'm sure to everyone else too), even though I had his video cranked up to 2160p on my 4K display.
Here's a little thought experiment...
1.) Ask yourself "What CPU and GPU does my phone have?"
2.) Then ask yourself "What CPU and GPU did my previous phones have?"

You probably have no idea (I know that I don't). It could be a Qualcomm CPU and Adreno GPU, or it could be a Cortex CPU with an Imagination GPU, or vice versa. The chipset could be the Snapdragon platform, or it may be an Exynos. PC parts, just like smartphone parts, are just computer parts that all do the same things. If you didn't know what you had when you bought it, you wouldn't know from using it.

Now, I'm not saying to buy one or the other; that's up to you. I'm just giving you a pearl of wisdom from someone who has been building and gaming on PCs since before nVidia even existed. Daniel Owen wanted an RTX 3080 but gave up trying to get one and "took a chance" on an RX 6800 XT. His hand was forced by the fact that the RTX 3080 was $500 more at the time. Here are his thoughts on it.
He also did a separate full review of the RX 6800 XT.
Again, this is just to explain that while the differences between the two manufacturers are hyped up by the "Tech Press", in reality, the differences are actually insignificant. Don't ever be afraid of a red box because it will work just fine. Now, for your (extremely) high-end desires, sure, only nVidia will satisfy them at the moment but "at the moment" is key. In another generation or two, Radeons will have RT performance that matches or exceeds the RTX 4090. At that point, don't be afraid of Radeon because you will get more for your money.
 
The RTX 2060 is still a very capable card for 1080p gaming so if that's what you do, I'd wager that you'd still be fine over a year from now. Remember that the most common discrete gaming card in the world according to Steam is the old GTX 1060. Your card is FAR more potent than that.
An RTX 2060 in general, yes, but not the Gigabyte RTX 2060 I have... the cooler is that bad, with only two 6 mm heatpipes designed for a 1650 or 1660 and reused on the 2060. From what I know, one 6 mm heatpipe can handle 40-50W depending on the number of fins and the radiating surface.
Putting that cooler on a 175W+ card is crazy.

https://www.gigabyte.com/Graphics-Card/GV-N2060OC-6GD-rev-20

With the original cooling the card sat at a constant 83°C, which is also the vBIOS limit.
First I removed the plastic backplate: an instant 5°C drop.
Changed the pads and paste and dropped another 6°C.
A handmade custom fan shroud and two 92×25 mm fans dropped it to 70°C in FurMark.
Set the power limit to 73% (~125W): now it tops out at 60°C with the fans at just 1,300 rpm.
With the original cooling it was 2,800 rpm and 83°C.
I also have a GTX 1060 6GB (3 fans), in my wife's rig now: dead silent and no more than 55°C.

But yeah, I can still live with it for a while since I've only been playing PUBG lately.
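(Quick sanity check on those numbers, as a sketch only: the 40-50W-per-heatpipe figure is the rule of thumb quoted above, and the ~170W board power is just inferred from the "73% ≈ 125W" figure, not an official spec.)

```python
# Back-of-the-envelope check on the cooler and power-limit figures above.
# Assumptions (not official specs): ~45 W per 6 mm heatpipe (midpoint of the
# quoted 40-50 W rule of thumb) and ~170 W board power, inferred from the
# poster's "73% ~ 125 W".

HEATPIPES = 2
WATTS_PER_HEATPIPE = 45       # midpoint of the 40-50 W estimate
BOARD_POWER_W = 170           # assumed stock power limit of this OC model
POWER_LIMIT_PCT = 73          # the limit set in the post above

cooler_capacity_w = HEATPIPES * WATTS_PER_HEATPIPE
limited_power_w = BOARD_POWER_W * POWER_LIMIT_PCT / 100

print(f"Rough cooler capacity: {cooler_capacity_w} W")                      # ~90 W
print(f"Board power at {POWER_LIMIT_PCT}% limit: {limited_power_w:.0f} W")  # ~124 W
```

Which roughly explains why the card slams into its 83°C limit at stock, but behaves once the power limit pulls it down closer to what that little cooler can actually dissipate.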
 
An RTX 2060 in general, yes, but not the Gigabyte RTX 2060 I have... the cooler is that bad, with only two 6 mm heatpipes designed for a 1650 or 1660 and reused on the 2060. From what I know, one 6 mm heatpipe can handle 40-50W depending on the number of fins and the radiating surface.
Putting that cooler on a 175W+ card is crazy.

https://www.gigabyte.com/Graphics-Card/GV-N2060OC-6GD-rev-20

With the original cooling the card sat at a constant 83°C, which is also the vBIOS limit.
First I removed the plastic backplate: an instant 5°C drop.
Changed the pads and paste and dropped another 6°C.
A handmade custom fan shroud and two 92×25 mm fans dropped it to 70°C in FurMark.
Set the power limit to 73% (~125W): now it tops out at 60°C with the fans at just 1,300 rpm.
With the original cooling it was 2,800 rpm and 83°C.
I also have a GTX 1060 6GB (3 fans), in my wife's rig now: dead silent and no more than 55°C.

But yeah, I can still live with it for a while since I've only been playing PUBG lately.
Jeez... I had no idea it was that bad (that version anyway). I'm sorry to hear that. That really sucks.
 