Nvidia GeForce RTX 3090 Ti Review: Fast and Dumb

I think you're still missing the point. It doesn't matter how much money you make per month; wasting it isn't smart, and the 3090 Ti is dumb for all the reasons already outlined.

The first-class airline ticket is yet another really bad example that's completely unrelated. You might not be able to justify the ticket cost, but you are getting a significantly better flight in business or first class as opposed to economy. Let me try to provide an example that makes sense and applies to what we're talking about here.

If first class were just an economy seat that cost 200% more but all you got was 2% more legroom, that would be dumb.
With respect, it’s you that’s missing the point. It absolutely does matter how much money you earn. Is the 3090 Ti not the fastest card available? If you’re rich enough, you’ll pay it. It doesn’t matter if it’s only a few percent faster. It’s the best.

Apparently you are blissfully unaware that there are people who can waste $2000 without batting an eyelid. For those people, having the best is worth more to them than $2000. Even if it’s just 2%. We see this all the time. To write these people off as simply stupid is in itself a stupid assumption.
 
I didn’t say that. You’re putting words in my mouth.

Look, I get it: you have an irrational jealousy of people who have more money than others, and you want to lash out and call them dumb. But here’s a fact: a rich person who earns $50k a month is spending less of their money on a 3090 Ti than a normal person who earns, say, $2000 a month and buys a Nintendo Switch. Are they dumb? They could buy an older second-hand console for half the money of a Nintendo Switch, and it still plays games!

Presumably you walk past first class on a plane and think to yourself “must only be stupid people in this compartment”?
I didn't put words into your mouth; I followed your logic and used it the way I did. Those are not your words.
I didn't call anyone dumb; a smart person can make a dumb decision sometimes. And that "irrational jealousy" of wealthy people... what's that gibberish about? It has nothing to do with what I said.
With respect, it’s you that’s missing the point. It absolutely does matter how much money you earn. Is the 3090 Ti not the fastest card available? If you’re rich enough, you’ll pay it. It doesn’t matter if it’s only a few percent faster. It’s the best.

Apparently you are blissfully unaware that there are people who can waste $2000 without batting an eyelid. For those people, having the best is worth more to them than $2000. Even if it’s just 2%. We see this all the time. To write these people off as simply stupid is in itself a stupid assumption.
Dude, everyone here understands what we're saying except you, and this isn't the first time. Even in this comment you said there are people who can >>waste<< $2000, so you agree with us without even knowing it. Jeff Bezos could burn money to light a fire and wouldn't even notice, but that doesn't make it a smart decision.
 
Does it include a "Mr. Fusion" micro reactor to power it?

In all seriousness, these new card announcements are getting ridiculous on power demand. I don't have an issue with 600 W if it's used for something like a CUDA compute card, but a gaming GPU? *****s need to get back to the performance-per-watt thinking that AMD used for the Ryzen designs.

One of the biggest issues I'm already facing is that normal daytime temps are already hitting 30+ °C / 90+ °F where I live (desert climate), and my only cooling comes from an evap cooler, which means I'm lucky to see office temps below 25 °C / 80 °F during the day.
 
It's official: graphics cards are in stock, and some are at MSRP. My Microcenter has the 3080 Ti XC3 Ultra at $1329 and in stock. Imagine paying $700 more for a 10% performance gain and 6 months of bragging rights 😳.
That said, the liquid-cooled 3080 Ti and 3090 can give you the same performance as the air-cooled 3090 Ti, but with lower power requirements and lower cost.

Update, don't blink:
The 12 GB 3080 is going for $1049 at Amazon.

GIGABYTE GeForce RTX 3080 Gaming OC 12G Graphics Card, 3X WINDFORCE Fans, 12GB 384-bit GDDR6X, GV-N3080GAMING OC-12GD Video Card
 
With respect, it’s you that’s missing the point. It absolutely does matter how much money you earn. Is the 3090 Ti not the fastest card available? If you’re rich enough, you’ll pay it. It doesn’t matter if it’s only a few percent faster. It’s the best.

Apparently you are blissfully unaware that there are people who can waste $2000 without batting an eyelid. For those people, having the best is worth more to them than $2000. Even if it’s just 2%. We see this all the time. To write these people off as simply stupid is in itself a stupid assumption.

No, this is still wrong. Not to sound like a complete wanker, but I could comfortably buy multiple 3090 Tis per week and not bat an eyelid; that doesn't mean I'd ever buy one, because I'm not a dope with my money.

"For those people Having the best is worth more to them than $2000"

You're right, some people are extremely shallow and try to find meaning in owning the best. That doesn't make those products any less silly and pointless, though.
 
Testing such a product without 8K doesn't do it justice; that's where the margin should be much higher.

 
It can't game at 8K, which should be self-evident based on the 4K numbers.
It is not evident, because performance doesn't scale linearly going from 4K to 8K. I've seen plenty of benchmarks, and some games tested at 8K on the RTX 3090. Surely the Ti model deserves to be re-tested for those. And since there are no 8K screens capable of more than 60 Hz today, there should be a lot of games for which 60 fps is achievable at 8K.

According to some initial tests...

The Ti does offer about 10% more in 4K, and about 25% more in 8K. That's quite a difference.
 
It is not evident, because performance doesn't scale linearly going from 4K to 8K. I've seen plenty of benchmarks, and some games tested at 8K on the RTX 3090. Surely the Ti model deserves to be re-tested for those. And since there are no 8K screens capable of more than 60 Hz today, there should be a lot of games for which 60 fps is achievable at 8K.
Going from 8 million pixels to 33 million at least halves the frame rate, and of course it's usually worse than that. The only time 8K is playable is either in a game that runs at 200+ fps at 4K, like Doom, or when you're using DLSS, so it's upscaling from something like 1440p.

Most games play at 30 fps with 1% lows below 20 fps. That's not playable, especially when you're spending $2000+.
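For concreteness, the pixel arithmetic behind that claim can be sketched in a few lines (the 200 fps 4K figure below is an illustrative Doom-style number, not a measurement):

```python
# Pixel counts for 4K vs 8K: 8K renders 4x the pixels per frame,
# so frame rate at least halves and in practice drops further.
pixels_4k = 3840 * 2160   # 8,294,400  (~8 million)
pixels_8k = 7680 * 4320   # 33,177,600 (~33 million)
ratio = pixels_8k / pixels_4k  # 4.0

fps_4k = 200                   # illustrative 4K frame rate (Doom-style)
naive_8k = fps_4k / ratio      # 50.0 if cost scaled purely with pixel count
print(ratio, naive_8k)
```

Real 8K results usually land below this naive scaling, since memory pressure grows as well.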

I'm also not entirely sure why you believe the margin between the 3090 and 3090 Ti would be different at 8K as opposed to 4K.

"The Ti does offer about 10% in 4K, and about 25% in 8K. That's quite a difference."

Absolute garbage, you can't honestly believe a GPU with 2.5% more cores that are clocked 10% higher with 8% more bandwidth could be up to 25% faster under any condition :S
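As a back-of-the-envelope check (simple spec arithmetic, not benchmark data), multiplying the quoted spec deltas gives a rough ceiling nowhere near 25%:

```python
# Rough upper bound on the 3090 -> 3090 Ti speedup from the quoted specs.
# Real-world gains generally land below this product, never above it.
core_gain = 1.025        # 2.5% more cores
clock_gain = 1.10        # 10% higher clocks
compute_ceiling = core_gain * clock_gain - 1   # ~0.128, i.e. ~12.8%

bandwidth_ceiling = 0.08  # 8% more memory bandwidth caps bandwidth-bound gains

print(f"compute-bound ceiling: {compute_ceiling:.1%}")
print(f"bandwidth-bound ceiling: {bandwidth_ceiling:.1%}")
```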
 
I'm also not entirely sure why you believe the margin between the 3090 and 3090 Ti would be different at 8K as opposed to 4K. Of course I'd love to see your evidence for this claim.
I've included (above) the initial video for that evidence. We should see a lot more of them in the coming days.
 
I've included (above) the initial video for that evidence. We should see a lot more of them in the coming days.
As I said "Absolute garbage, you can't honestly believe a GPU with 2.5% more cores that are clocked 10% higher with 8% more bandwidth could be up to 25% faster under any condition :S"

You're also grossly misrepresenting the data. Looking at the average frame rates shown in those comparisons, the margins are as follows.

Watch Dogs Legion 8K, 3090 Ti is 11% faster
Halo Infinite 8K, 3090 Ti is 12% faster
Red Dead Redemption 2 8K, 3090 Ti is 16% faster
Cyberpunk 2077 8K, 3090 Ti is 9% faster

That video isn't providing a 3-run average either, so the 16% margin shown in Red Dead Redemption 2 is likely more like 11-12% after a 3-run average.
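For reference, the margins above are just the Ti's average fps expressed as a percentage gain over the 3090's; the fps pair below is a hypothetical example, not the video's actual numbers:

```python
# Percentage margin of one average frame rate over another.
def margin(fps_ti: float, fps_base: float) -> float:
    """Return fps_ti's percentage gain over fps_base."""
    return (fps_ti / fps_base - 1) * 100

# Hypothetical pair: 52 fps on the 3090 vs 58 fps on the 3090 Ti.
print(round(margin(58, 52), 1))  # ~11.5
```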
 
With respect, it’s you that’s missing the point. It absolutely does matter how much money you earn. Is the 3090 Ti not the fastest card available? If you’re rich enough, you’ll pay it. It doesn’t matter if it’s only a few percent faster. It’s the best.

Apparently you are blissfully unaware that there are people who can waste $2000 without batting an eyelid. For those people, having the best is worth more to them than $2000. Even if it’s just 2%. We see this all the time. To write these people off as simply stupid is in itself a stupid assumption.

I waste $2k without batting an eye. It's a dumb purchase; I make dumb purchases all the time. Yes, I can afford it, but it's still dumb.
 
Well, an Nvidia-Intel partnership would be perfect, as both have released brain-dead products this week. Hands up, anyone stupid enough to want the 3090 Ti and 12900KS heat-pump combo for bragging rights.

Nvidia are smoking crack to release this waste of resources. As for 8K gaming, you can't make up that level of BS and keep a straight face.
 
Testing such a product without 8K doesn't do it justice; that's where the margin should be much higher.

As Steve said, 8K monitors are not thick on the ground.

If testers covered all the edge cases, the time taken would be incredible.
If someone wants to game in 8K, they will, no matter what reviews say. But no one would sensibly game in 8K over a wide range of AAA titles at high/ultra settings.

8K is for detail work and production - you need to be really close, and you're only looking at a fraction of the screen.
You know those people who master films/videos - colourists, lighting directors, etc. In my useless opinion, past a certain resolution the monitor quality and the mastering of a video game are everything; resolution may be good if you need to see details quickly, but again, 8K is overkill.
I would rather watch well-mastered 720p, or listen to a likewise well-mastered 128 kbps MP3, than 8K video or 192/96 audio that is a hot, discordant mess.
I don't do content creation - Steve does. Content creators can spend a lot of time on it, especially with lots of video shots, different sound sources, various lighting, etc.; otherwise the resulting video will be a jarring mess.
Yeah, Nvidia and AMD will push numbers, when really we have our GPUs to convey impact, emotion, etc.

There is also an elephant in the room - 8K. What game is 8K? What game has 8K assets? How big would Call of Duty be then, 2 TB?

What game do you own that you want to play in 8K? If you have an 8K TV, it will look the same as 4K from the couch.
Your eyes and ears DO NOT see reality - cartoon makers know this and use lots of tricks to make us see it as more real, e.g. blurring other planes of focus.
 
We need more efficiency, not a little more horsepower and a lot less efficiency. Surely no one wants a card that makes their leccy meter spin like a flipping rocket going off.
 
Yeah, it's a pointless cash grab, but it will sell anyway. People need to accept that some people have tons of money and can buy whatever they want. They can buy a 3090 Ti now, upgrade to a 4090 in 6 months, and throw the 3090 Ti in the trash.

For people chasing high fps on 4K/144Hz monitors in AAA games, having the best GPU at all times makes sense. Next-gen GPUs should finally be able to deliver 100+ fps at 4K/UHD. The 4090 is expected to be ~75% faster than the 3090/3090 Ti.
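The projection in that last sentence is simple arithmetic, assuming a ~60 fps 4K baseline for today's flagship in heavy AAA titles (an assumed figure, not a measured one):

```python
# Projecting next-gen 4K performance from a rumored ~75% uplift.
baseline_fps_4k = 60      # assumed 3090 Ti average in demanding AAA games
expected_uplift = 0.75    # the ~75% figure quoted above
projected = baseline_fps_4k * (1 + expected_uplift)
print(projected)  # 105.0 -> clears the 100 fps bar
```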
 
All those calling this pathetic monstrosity "not dumb" would be singing a different tune (as in "dumber than dumb!!") if it were produced by AMD and not by Ngridia.....
We'll find out when the 6950XT is released, depending on how it compares with the 6900XT and other alternatives.
 
700 W just for a GPU, and if you pair it with a 12th-gen Intel CPU you'd need a 1200 W PSU, tons of heat, and probably a much bigger case, all in exchange for a ~10% uplift in performance? Definitely not for me or normal users/gamers. IMO it's just another luxury product to show off, and that's it.
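A rough PSU-sizing sketch shows why that 1200 W figure is plausible; the CPU and platform numbers here are illustrative estimates, not measurements:

```python
# Why a 700 W GPU plus a high-end CPU lands you in 1200 W PSU territory.
gpu_w = 700        # worst-case GPU board power (from the post above)
cpu_w = 250        # high-end Intel CPU under load (estimate)
platform_w = 100   # motherboard, RAM, drives, fans (estimate)
headroom = 1.2     # ~20% margin for transient spikes and efficiency

recommended_psu = (gpu_w + cpu_w + platform_w) * headroom
print(recommended_psu)  # 1260.0 -> shop in the 1200+ W class
```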
 
We'll find out when the 6950XT is released, depending on how it compares with the 6900XT and other alternatives.

Just look at the 6900XT LC edition, which already has the memory at 18 Gbps.

The chip is the same, because the 6900XT already uses the full die; they can up the clocks and power draw slightly, though.

I expect the 6950XT to perform around 2-5% better than the 6900XT:
5% in 4K/UHD and 2% in 1440p.

This will probably put the 6950XT around 3080 Ti performance at 4K/UHD.
 
Just look at the 6900XT LC edition, which already has the memory at 18 Gbps.

The chip is the same, because the 6900XT already uses the full die; they can up the clocks and power draw slightly, though.

I expect the 6950XT to perform around 2-5% better than the 6900XT:
5% in 4K/UHD and 2% in 1440p.
It really depends on the extra performance vs price and power. If it's the same price, or maybe $50 more, no problem. If AMD wants a lot more for a slight perf bump, it's stupid.
 
It really depends on the extra performance vs price and power. If it's the same price, or maybe $50 more, no problem. If AMD wants a lot more for a slight perf bump, it's stupid.
They should put it at $999 tops and make the 6900XT go EoL, because the 3080 Ti is still a better card for most people: DLSS/DLDSR, overall better performance regardless of game (Nvidia performs much better in many early-access and lesser-known titles, for example), better ray tracing, etc.

We will see more games like Metro Exodus EE that *need* an RT-capable GPU. It is only a matter of time before more and more games use RT elements.

Maybe they should even price the 6950XT at $799 or $849, because the 4000 series is coming in less than 6 months and AMD probably won't have anything else this year.
 