> 4060 Ti Founders Edition performs 4% less than 3070 at 1440p. Techpowerup.

Mwahahaha, what a joke. I wasn't expecting a card that's really a 4050 Ti to almost beat the last-gen 3070.
With half the memory bus, half the PCIe lanes and 70% less power, impressive.
Now, if only the price were below $250 ($300 adjusted), maybe it would make some sense.
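For anyone who wants numbers behind the "half the bus, half the lanes" point, here's a rough sketch assuming the public board specs and PCIe 4.0 link rates (16 GT/s per lane with 128b/130b encoding):

```python
# Interface comparison behind the "half the bus, half the lanes" point.
# Assumes PCIe 4.0: 16 GT/s per lane with 128b/130b encoding.
PCIE4_GBS_PER_LANE = 16 * (128 / 130) / 8  # ~1.97 GB/s per lane, per direction

cards = {
    "RTX 4060 Ti": {"bus_bits": 128, "pcie_lanes": 8},
    "RTX 3070":    {"bus_bits": 256, "pcie_lanes": 16},
}
for name, c in cards.items():
    pcie = c["pcie_lanes"] * PCIE4_GBS_PER_LANE
    print(f'{name}: {c["bus_bits"]}-bit bus, PCIe x{c["pcie_lanes"]} (~{pcie:.1f} GB/s)')
# RTX 4060 Ti: 128-bit bus, PCIe x8 (~15.8 GB/s)
# RTX 3070: 256-bit bus, PCIe x16 (~31.5 GB/s)
```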
> Nvidia's x60 series always sells like hotcakes. This is nothing new. AMD barely has any GPUs present in Steam's top 20.

Only because most PC gamers are still using very old cards. The GTX 1060 is still the most popular card on the Steam survey. What gamers want is a reasonably priced replacement for much older cards that CAN allow them to play at high settings. I replaced my 1060 with an AMD 6700 XT and can run Forza Horizon 5 at 1440p with maxed-out/Ultra settings. I couldn't even play AC: Odyssey at 1080p at 30 fps without turning settings down on a 1060! The AMD 7600 only having 8 GB and higher power usage (due to node size) is disappointing.
> Only because most PC gamers are still using very old cards. The GTX 1060 is still the most popular card on the Steam survey. What gamers want is a reasonably priced replacement for much older cards that CAN allow them to play at high settings. I replaced my 1060 with an AMD 6700 XT and can run Forza Horizon 5 at 1440p with maxed-out/Ultra settings. I couldn't even play AC: Odyssey at 1080p at 30 fps without turning settings down on a 1060! The AMD 7600 only having 8 GB and higher power usage (due to node size) is disappointing.

Did you expect to play AC Odyssey maxed out at 1440p on a 7-year-old mid-range GPU tho?
> 4060 Ti Founders Edition performs 4% less than 3070 at 1440p. Techpowerup.

I wanted to say it uses 70% of the power that the 3070 uses, but I got it wrong.
It uses 25-30% less power, not 70%.
It beats 3060 Ti by 9%.
At 1080p tho, 4060 Ti performs 1% less than 3070 and 11% better than 3060 Ti.
Fixed.
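Spelled out, since "uses ~70% of the power" and "70% less power" are easy to mix up; a quick sketch assuming the usual board-power figures of 160 W vs 220 W:

```python
# Board power assumed: RTX 4060 Ti ~160 W vs RTX 3070 ~220 W.
p_4060ti, p_3070 = 160, 220
fraction_of  = p_4060ti / p_3070  # ~0.73 -> "uses ~73% of the 3070's power"
percent_less = 1 - fraction_of    # ~0.27 -> "uses ~27% less power"
print(f"uses {fraction_of:.0%} of the power, i.e. {percent_less:.0%} less")
# uses 73% of the power, i.e. 27% less
```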
The 16 GB version will change nothing and will be pointless. Might as well go 4070 and get 32-35% higher performance.
> Did you expect 4060 Ti to beat 4070?

Yeah, I was hoping we'd be surprised by the 4060 Ti's performance when it wasn't limited by VRAM, but they couldn't let it compete with the 4070. I guess the 32 MB of cache they put on it doesn't make up for the limited width of the memory bus.
Considering its performance relative to last-gen cards, the 4060 Ti really seems like a 50-series card.
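For the raw numbers behind the bus-width complaint above, a quick sketch assuming the published memory configs (18 Gbps GDDR6 on a 128-bit bus vs 14 Gbps on a 256-bit bus):

```python
# Raw DRAM bandwidth from bus width and data rate:
# GB/s = (bus_bits / 8) * data_rate_gbps
def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    return bus_bits / 8 * data_rate_gbps

bw_4060ti = bandwidth_gbs(128, 18)  # 288.0 GB/s (18 Gbps GDDR6, 128-bit)
bw_3070   = bandwidth_gbs(256, 14)  # 448.0 GB/s (14 Gbps GDDR6, 256-bit)
print(bw_4060ti, bw_3070, f"deficit: {1 - bw_4060ti / bw_3070:.0%}")
# 288.0 448.0 deficit: 36%
```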
> I wanted to say it uses 70% of the power that the 3070 uses, but I got it wrong.

It's not a GPU for me either, haha. In terms of performance per watt tho, it looks decent, especially for 1080p and 2560x1080 gamers, and it does 1440p decently as well.
It's still an impressive result from a castrated GPU.
> Steam HW survey
>
> I have more confidence in ad-hoc polls like
> https://www.techpowerup.com/forums/threads/which-generation-is-your-graphics-card.308104/

Polls like that just don't represent the average PC gamer tho: the average gamer tends to use an Nvidia GPU, while people on hardware forums are far more likely to be using AMD GPUs than the average Steam gamer.
> Polls like that just don't represent the average PC gamer tho: the average gamer tends to use an Nvidia GPU, while people on hardware forums are far more likely to be using AMD GPUs than the average Steam gamer.

I am sorry, I don't know what you are on about. My AMD GPU has been an amazing experience, no matter what game I play: no issues with software or anything, and no lack of features here either (RT included). Can we stop promoting Nvidia and insulting people's hardware choices (AMD)?
Ask any PC gamer who isn't into tech/hardware which GPU they have. Nine times out of ten, it's Nvidia, and most often they either previously had AMD with a very mixed experience or have never owned an AMD GPU. This is my experience.
I have used many AMD GPUs in the past; I think they stagnated tho - too expensive, lacking features, etc.
It's fine to focus on raster perf only and lack some features when the price reflects this.
In some games an AMD GPU is fine; in others, not so fine. With Nvidia, it's just an overall good experience with a lot of superior features. Nvidia is the big player and AMD is the small player, and the smaller player should always represent much better value if it actually intends to sell anything. However, AMD's GPU production eats into its CPU and APU production, and I think that's why AMD doesn't put much focus on desktop GPUs.
> Seems Nvidia is really trying to push what should have been a xx50-class card up a tier. I mean, 128-bit bus? PCIe x8? Targeted at 1080p? Barely any performance improvement over its predecessor without DLSS 3, plus the same amount of VRAM unless you pay $100 more? Laughable. Even the 4080, if you ask me, should have been a 4070 Ti at best. Then use a further cut-down AD102 die for the 4080.

To be fair, this started with AMD's RDNA2. The emphasis on cache over bus width basically means the mid-to-low-end cards are going to be badly gimped when attempting to run games above the target resolution. It is probably cheaper to throw more cache at the problem than to make the chip more complex with a wider bus.
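A toy model of that falloff; the hit rates below are illustrative assumptions (AMD's own RDNA2 material showed the same downward trend with resolution), not measured values:

```python
# Only cache misses touch DRAM, so a hit rate h stretches the raw DRAM
# bandwidth by a factor of 1 / (1 - h). Hit rates here are made up.
def effective_bandwidth(dram_gbs: float, hit_rate: float) -> float:
    return dram_gbs / (1.0 - hit_rate)

raw = 288.0  # RTX 4060 Ti raw DRAM bandwidth, GB/s
for res, hit in [("1080p", 0.55), ("1440p", 0.45), ("2160p", 0.30)]:
    print(f"{res}: ~{effective_bandwidth(raw, hit):.0f} GB/s effective")
# 1080p: ~640 GB/s, 1440p: ~524 GB/s, 2160p: ~411 GB/s
```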
Define "stutter less" and what it means for you, because that's definitely not true. Unless you are talking about some specific title, an exception.Sure there is. The consoles load & resume much more quickly, stutter less often, [Sony] has better exclusives, and there's less hassles re: game & driver bugs & configuration.
It annoys me that PCs fall behind that way but I stick with mine for keyboard & mouse support, and better modding & multi-tasking.
> I am sorry, I don't know what you are on about. My AMD GPU has been an amazing experience, no matter what game I play: no issues with software or anything, and no lack of features here either (RT included). Can we stop promoting Nvidia and insulting people's hardware choices (AMD)?

Ehh? Not insulting anyone, just speaking facts. AMD can't match Nvidia features, the end. If you are only playing popular titles using pure raster and don't care about awesome features that can revamp older games, then sure, AMD is fine, for the most part.
> Ehh? Not insulting anyone, just speaking facts. AMD can't match Nvidia features, the end.

You mean features like RT, which is hardly used across games, and fake-frame DLSS 3, which skipped the last-gen Nvidia cards so you have to buy the latest series just to have those gimmicks, or "features" as you call them? I just play games to enjoy them and don't buy into "features". I have both AMD and Nvidia cards; your "facts" are your opinion, and this is just mine.
> Define "stutter less" and what it means for you, because that's definitely not true, unless you are talking about some specific title, an exception.

I mean that out of the box, the console titles I play tend to immediately work and play smoothly, albeit at perhaps lower absolute performance.
> It's really unfathomable what Nvidia is doing here. Why are they pricing these cards so high?

The simple answer as I see it: corporate hubris, and being drunk on Nvidia fanboys who used to shell out as much money as they could, and far more than they should have, just for bragging rights - IMO.
"immediately work and play smoothly"I mean that out-of-the-box, the console titles I play tend to immediately work and play smoothly, albeit at perhaps lower absolute performance.
On PC, I agree that after day-1 and week-1 patches, after the driver updates, and after you've done some tweaking of the settings, you'll probably land on a configuration that works well. But I'm counting those periods, and that higher fussiness in general, against the PC experience, because it's a hassle I don't have as much on the consoles. That's why my overall impression is that my PC glitches more often than my console.
I'll admit it's not a fair fight, in that I'm demanding much more of my PC and usually, eventually, getting it. But it doesn't change my impression that, averaging it all together, I experience more stutters, glitches, crashes, and "must download patch / update now" BS on the PC.
> I wonder if GPU developers ever thought of creating a kind of VRAM slot so we could upgrade it afterwards, just like we do with RAM.

Data reads/writes from/to DRAM are by far the slowest part of rendering -- this is why GPUs have so many cores, so that they can keep working on new instruction threads while previous ones are stalled waiting for data. Adding any additional latency to the memory system is something the likes of AMD, Intel, and Nvidia actively go out of their way to avoid. Adding any kind of RAM slot to a graphics card would add some latency, albeit not a huge amount.
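A back-of-the-envelope sketch of the above, using Little's law (bytes in flight = bandwidth x latency); the 400 ns base latency and the 50 ns slot penalty are illustrative assumptions, not measured figures:

```python
# Little's law: bytes_in_flight = bandwidth * latency. To keep the memory
# system saturated, the GPU must have that much data outstanding at all
# times, which it does by interleaving thousands of threads.
def bytes_in_flight(bandwidth_gbs: float, latency_ns: float) -> float:
    return bandwidth_gbs * latency_ns  # (1e9 B/s) * (1e-9 s) = bytes

soldered = bytes_in_flight(288, 400)  # ~115 KB outstanding (assumed 400 ns)
slotted  = bytes_in_flight(288, 450)  # ~130 KB if a slot adds ~50 ns
print(soldered, slotted)  # 115200.0 129600.0
# With a fixed amount of outstanding work, the extra latency instead costs
# throughput: 400/450 means roughly 11% fewer memory requests per second.
```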