> Then you lost your money.

I'll put money on the MSRP being $1,499.
Performance per watt is the best GPU metric, and this is where the RTX 4000 series outshines all other cards.
All the 5080 needs to do is deliver 4090 performance, or close to it, for less.
AMD is not competing in this segment.
Expect 1200 dollars on release.
Eventually, the price might drop to 1000 dollars.
AMD's best SKU in the Radeon 8000 series will probably barely compete with the RTX 5060.
Rumours claim the top RDNA4 SKU will get 7900 GRE/4070 performance with improved RT performance (compared to RDNA3), so let's hope the price is going to be sub-500 dollars.
There is a tiny chance of the 5080 being 999 dollars, considering Nvidia uses TSMC 5nm again. However, I don't see why Nvidia would settle, as it will sell easily at 1199 dollars.
4090 performance for 500 dollars less? Many people will take that deal.
> Then people should consider the TCO and include the power bill: most of the savings will vanish over time anyway.

For you.
Most people want the most performance for money spent.
> Funny, seeing as so many articles on this site are based on rumours.

Rumors are nice for some people to parrot and feel better about themselves.
Then people should consider the TCO and include the power bill: most of the savings will vanish over time anyway.
Everyone in the industry knows performance per watt is the single best metric for comparing GPUs.
If you save 100 bucks but spend 200 bucks more on power over the next few years, the joke is on you.
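For anyone who wants to check that claim, the math is simple. Here is a rough sketch in Python, where the extra wattage, daily hours, and electricity rate are all assumptions to plug your own numbers into:

```python
# Back-of-the-envelope extra energy cost of a hungrier GPU.
# All inputs are assumptions; adjust for your own usage and rates.
def extra_power_cost(extra_watts, hours_per_day, years, price_per_kwh):
    kwh = extra_watts / 1000 * hours_per_day * 365 * years
    return kwh * price_per_kwh

# A card drawing 100 W more, gamed 3 hours/day for 5 years at $0.30/kWh:
print(f"${extra_power_cost(100, 3, 5, 0.30):.0f}")  # -> $164
```

At those assumed numbers a $100 upfront saving really can be eaten by the power bill; at lower rates or fewer gaming hours, it isn't.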
> Good efficiency means you can drive chips harder, which means more performance.

Industry and consumer use are very different things. If consumer GPUs were designed and marketed for efficiency, they wouldn't be run at over 1.0 V, where they are well outside their efficiency zone. But all of them are, aside from those 70 W PCIe slot-powered GPUs that have no choice.
Gamers want max FPS and very few care or are even aware of power usage/efficiency.
> If you are worried about $200 in electricity over 5 years, you shouldn't be buying $1000+ GPUs, period.

Then people should consider the TCO and include the power bill: most of the savings will vanish over time anyway.
Everyone in the industry knows performance per watt is the single best metric for comparing GPUs.
If you save 100 bucks but spend 200 bucks more on power over the next few years, the joke is on you.
> I am not, but the cheapskates buying a GPU are worried about spending 100 or 200 dollars more.

If you are worried about $200 in electricity over 5 years, you shouldn't be buying $1000+ GPUs, period.
Good efficiency means you can drive chips harder, which means more performance.
Having superior performance per watt is ALWAYS BETTER!
If the 4000 series did not have good efficiency, clock speeds would be massively lower, and performance would be lower as well.
The 4000 series runs 1 GHz higher than the 3000 series due to BETTER EFFICIENCY!
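For what it's worth, performance per watt is just frame rate divided by board power. A minimal sketch with made-up numbers (not benchmark results) showing how the two rankings can disagree:

```python
# FPS per watt with purely illustrative numbers, not measured benchmarks.
cards = {
    "Card A": {"fps": 120, "watts": 300},
    "Card B": {"fps": 110, "watts": 220},
}
for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.2f} FPS/W")
# Card A: 0.40 FPS/W -> the faster card
# Card B: 0.50 FPS/W -> the more efficient card
```

Note that the faster card and the more efficient card can be different cards, which is essentially what this argument is about.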
> WTF are you trying to argue here? That simply because a GPU is in the top 50 list that automatically means it sold well? There are many AMD models in the top 50 too, does that mean AMD is selling well in your world?

Are you blind? 4000 is listed all the time in the top 50 GPU list. Fun fact: 7900XTX did not even make top 50.
> I'm not comparing the 4000 series today to the 3000 series today. I'm comparing the 4000 series 2 years after its launch (today) to the 3000 series 2 years after its launch (late 2022), and the 4000 series still loses.

The only reason the 3000 series is higher is because it's a generation older and sold for cheaper, plus the second-hand market.
> It literally isn't. That is the whole point.

4000 is better than 3000 series by far.
> No, I'm a 2000 series owner (2060 Super) who is one of the people that Nvidia is incapable of enticing to upgrade, because they have only released garbage in the $300~$400 segment for the past 4 years. The 3060 wasn't enough of an upgrade to warrant it, and the 4060 and 4060 Ti are both awful 8 GB products that I'm not touching with a 10-foot pole.

To me, you sound like a silly 3000 series owner not willing to accept that 4000 series is better.
WTF are you trying to argue here? That simply because a GPU is in the top 50 list that automatically means it sold well? There are many AMD models in the top 50 too, does that mean AMD is selling well in your world?
Do you see those numbers to the right side of the GPU models? Do you know what those numbers mean? Because it seems you don't. Those numbers are the percentages of people using those GPUs. For the 4000 series, those numbers are as small as the numbers from the 2000 series (another disappointing series) and the 3000 series (availability tarnished by the pandemic/crypto), and a fraction of the numbers the 1000 series (which was actually successful) achieved.
I don't know how to make this simpler for you.
I'm not comparing the 4000 series today to the 3000 series today. I'm comparing the 4000 series 2 years after its launch (today) to the 3000 series 2 years after its launch (late 2022), and the 4000 series still loses.
It literally isn't. That is the whole point.
The 4060 launched at the same price as the 3060, is only 15% faster, and has less VRAM.
The 4060 Ti was the same price as the 3060 Ti and offered no performance improvement whatsoever, and in a few cases even a performance reduction due to the smaller memory bus.
The 4070 was 20% faster than the 3070 but also 20% more expensive ($500 vs $600), and matched the 3080 in performance while costing just $100 less, which is better than nothing but still very disappointing after 2 years.
For perspective, the 3070 matches the 2080 Ti for less than half the price ($500 vs $1200). The GTX 1060 matched the GTX 980 for half the price ($250 vs $500) while also having 50% more VRAM. Those are the kinds of gen-on-gen upgrades that people expect, not the wet fart that was the 4000 series.
No, I'm a 2000 series owner (2060 Super) who is one of the people that Nvidia is incapable of enticing to upgrade, because they have only released garbage in the $300~$400 segment for the past 4 years. The 3060 wasn't enough of an upgrade to warrant it, and the 4060 and 4060 Ti are both awful 8 GB products that I'm not touching with a 10-foot pole.
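Those comparisons boil down to one ratio: relative performance divided by relative price. A quick check using the post's own figures (launch prices and performance deltas as stated above, not re-verified):

```python
# Gen-on-gen value change: >1.0 means more performance per dollar.
# All figures are the ones quoted in the comment above.
def value_gain(perf_ratio, new_price, old_price):
    return perf_ratio / (new_price / old_price)

print(f"{value_gain(1.15, 300, 300):.2f}")   # 4060 vs 3060 (same price): 1.15
print(f"{value_gain(1.20, 600, 500):.2f}")   # 4070 vs 3070: 1.00, no value gain
print(f"{value_gain(1.00, 500, 1200):.2f}")  # 3070 vs 2080 Ti: 2.40
print(f"{value_gain(1.00, 250, 500):.2f}")   # GTX 1060 vs GTX 980: 2.00
```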
> Buddy, this is incredible, I'm baffled. I don't understand how I have explained to you multiple times already that I'm not comparing the 4000 series to the 3000 series today, but that I'm comparing both 2 years away from their initial launch (4000 series today vs 3000 series in October 2022), and you have consistently failed to comprehend this idea every single time.

The 3000 series is built on a trash node, is cheap, and will soon be 5 years old; obviously its market share will be bigger when most PC gamers are cheapskates who buy second-hand or wait for sales. Nothing new here.
> LMAO

Maybe you should start making some money so you are not locked in the 300-400 dollar bracket.
> No, it literally doesn't.

4060 runs in circles around 2060 and you say 4060 is slow. Oh the irony.
> Google "techspot 4060 review" and your mind will be blown. Spoiler: in the 15 games tested, the 4060 averaged 91 FPS while the 2060 Super averaged 72 FPS, making the 4060 only 26% faster. I'm not paying $300 in 2024 for an 8 GB card that is only 26% faster than the 6-year-old one I already own; that is a pathetic improvement.

The 2000 series, and the 2060 especially, is so slow and dated it is not listed in most reviews anymore, so how do you know?
> Yeah it does. 4060 literally stomps on 2060 in most new games with ease.

Buddy, this is incredible, I'm baffled. I don't understand how I have explained to you multiple times already that I'm not comparing the 4000 series to the 3000 series today, but that I'm comparing both 2 years away from their initial launch (4000 series today vs 3000 series in October 2022), and you have consistently failed to comprehend this idea every single time.
LMAO
So the solution to Nvidia releasing awful value products is to just be a good consoomer and give them even more money? I'd rather save my money for other more fulfilling aspects of my life.
No, it literally doesn't.
Even if it did, I'm not buying an 8 GB GPU in 2024.
Google "techspot 4060 review" and your mind will be blown. Spoiler: in the 15 games tested, the 4060 averaged 91 FPS while the 2060 Super averaged 72 FPS, making the 4060 only 26% faster. I'm not paying $300 in 2024 for an 8 GB card that is only 26% faster than the 6-year-old one I already own; that is a pathetic improvement.
Yeah it does. 4060 literally stomps on 2060 in most new games with ease.
So in 4 years, you still make the same dime and can't afford anything other than low-tier GPUs. Impressive.
You should look into AMD cards or step a tier up to 70 series.
MSI GeForce RTX 4060 Gaming X Review (www.techpowerup.com)
The 4060 performs pretty much like an RTX 2080.
The 2060 is far behind, at RTX 3050 level.
> Why all the yelling? Relax.

And you think most PC gamers care for 4K/UHD with RT enabled, why? 99% of PC gamers use 1440p or lower and barely anyone enables RT. If they do, they enable upscaling as well in most cases.
Most people praising high amounts of VRAM are AMD users, and AMD users can't use RT to begin with.
See the irony here?
NOT A SINGLE GAME today needs more than 12GB at 4K/UHD maxed out, unless you enable RAY TRACING or PATH TRACING on top, AT NATIVE 4K that is. The only GPU capable of this is the 4090. Most RASTER ONLY games barely use 8GB at 4K ULTRA.
Link me a SINGLE GAME that uses RASTER ONLY and struggles on a 12GB GPU at native 4K. I am waiting. By struggle I want to see LOW MINIMUM FPS, not high VRAM USAGE, because many game engines today just allocate most of the VRAM; this does not mean it's needed. And this is why a 4090 can use 16-20GB in some games that run fine on a 4070 12GB anyway. SIMPLE ALLOCATION.
99% of PC gamers use 1440p or lower.
99% of PC gamers are not enabling RT at native res, if at all.
That is reality.
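On the allocation-versus-need point above: most overlay tools report how much VRAM the game has allocated, not how much it strictly requires. A minimal sketch for watching those numbers on an NVIDIA card, assuming the nvidia-ml-py package (imported as pynvml) and GPU index 0:

```python
# Log VRAM usage every 5 seconds via NVIDIA's NVML (pip install nvidia-ml-py).
# Note: NVML reports *allocated* memory, which is why a high reading
# alone does not prove a game actually needs that much VRAM.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{mem.used / 2**30:.1f} / {mem.total / 2**30:.1f} GiB allocated")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()
```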
Why all the yelling? Relax.
If you play at 1440p you don't need a 5080.
Right? Brick wall is what I would say...
It's like talking to a door.
> No, you don't. A 4070 is more than enough for 1440p, a 4070 Super maybe. A 5080 would be overkill, assuming it performs at the levels it is expected to.

If you render games at higher res using DLDSR, yes you do.
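For context on the DLDSR point: DLDSR renders at 1.78x or 2.25x the native pixel count and then downscales, so 1440p with DLDSR is not a 1440p workload. A quick check that the 2.25x factor at 1440p equals a native 4K pixel load:

```python
# DLDSR 2.25x at 1440p equals the pixel count of native 4K.
native_1440p = 2560 * 1440           # 3,686,400 pixels
dldsr_225 = native_1440p * 2.25      # 8,294,400 pixels
print(dldsr_225 == 3840 * 2160)      # True
```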
No, you don't. A 4070 is more than enough for 1440p, a 4070 Super maybe. A 5080 would be overkill, assuming it performs at the levels it is expected to.
> If I had a dime for every time a brand new account showed up here making a string of nonsensical comments in the last month or so, I'd have two dimes. Which isn't a lot, but it's funny that it happened twice.

Right? Brick wall is what I would say...
> No... your post just compared 1 high end card to another for each generation... for an article of this "depth", I'd expect average performance increase of each generation based on techspot's benchmarks over the years...

Check my post, it is all there...