AMD may price the Radeon RX 9070 series to undercut Nvidia's mid-range GPUs

That's just not true. They solved it a few months post release.
I do not have a card to check for myself, only reviews. Since you say they fixed the issues, I looked for something newer. A 2024 review on TechPowerUp shows the 7900 GRE consuming 63 W on video playback versus 20 W for the 4080, and 100 W in a game locked at 60 fps versus the 4080's 70 W. While not stellar, that is not as bad as it was at release, but I believe it is still an area that needs addressing.
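For a rough sense of what that 63 W vs 20 W playback gap adds up to, here is a quick back-of-the-envelope sketch. Only the wattage figures come from the review quoted above; the hours of video per day and the electricity price are made-up assumptions.

```python
# Back-of-the-envelope: extra energy from the video-playback power gap.
# 63 W (7900 GRE) and 20 W (4080) are the TechPowerUp figures quoted above;
# the daily hours and price per kWh are assumed values, adjust to taste.
HOURS_PER_DAY = 3      # assumed hours of video playback per day
PRICE_PER_KWH = 0.30   # assumed electricity price

delta_w = 63 - 20      # extra draw during playback, in watts
kwh_per_year = delta_w * HOURS_PER_DAY * 365 / 1000
print(f"~{kwh_per_year:.0f} kWh extra per year, "
      f"about {kwh_per_year * PRICE_PER_KWH:.0f} in electricity at {PRICE_PER_KWH}/kWh")
```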
 
I suspect that says more about the ***** who wants to buy your card than it does about Nvidia.
Not really. 4090s are impossible to get hold of at the moment, they perform better than the 5080, and they come with 8 GB more video memory as well. 5080 prices keep increasing, with the OC models almost at the $1,600 mark, so the 4090 isn't badly priced at $1,800 when it's the "next best thing" to a card that has now passed $3,000.
It's a hard pill to swallow to get 20% more performance for double the price.
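Just to put numbers on that value argument, here is a quick performance-per-dollar comparison. The 20% delta and the prices are the figures from the post above, not benchmark data.

```python
# Performance per dollar for "20% more performance at double the price".
# Prices and the 20% uplift are taken from the post above, not measured.
base_price, base_perf = 1600, 1.0          # e.g. a 5080 OC model
flagship_price, flagship_perf = 3200, 1.2  # double the price, +20% performance

print(f"baseline: {base_perf / base_price * 1000:.2f} perf per $1000")
print(f"flagship: {flagship_perf / flagship_price * 1000:.2f} perf per $1000")
# -> the flagship ends up around 40% worse in performance per dollar
```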
 
AMD's stated intent for this generation is to gain (regain) market share. It is not going to achieve that without an aggressive, priced-to-sell strategy. Everyone will be disappointed if it does not.
 
AMD's stated intent for this generation is to gain (regain) market share. It is not going to achieve that without an aggressive, priced-to-sell strategy. Everyone will be disappointed if it does not.
I agree. Look at Intel: everybody praised the lower price, and the only people who really believe "if it's cheap, it can't be good" are those who would buy Ngreedya anyway. The prices since Ngreedya became king of the hill are absolute nonsense.

AMD needs to do what they did with Ryzen in 2017, if I remember correctly.
 
That's because AMD has usually been running at higher clock speeds than Nvidia. According to the leaked specs, that doesn't seem to have changed with the 9070.
No, it was because the last high-end GPUs from AMD used MCMs (multi-chip modules).
 
No, it was because the last high-end GPUs from AMD used MCMs.
Not really? I mean, their CPUs have been MCM chips and have a lot lower power draw than their Intel monolithic counterparts.
AMD has always been a sucker for clocking their cards much higher than Nvidia, which draws more power. Just look at what happens when people overclock Nvidia cards: just 500 MHz on the memory draws 40-50 W more power.
Well, the 50 series aside of course, where Nvidia has gone totally ballistic on power draw, but historically AMD has run 300-400 MHz (or more) higher on the memory clock speed, which partially compensates for their lack of raw GPU power.
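If anyone wants to check the memory-overclock power claim on their own card, something like this NVML polling sketch works. It assumes an Nvidia GPU and the pynvml bindings (pip install nvidia-ml-py); GPU index 0 and the 10-sample loop are arbitrary choices.

```python
# Poll board power and memory clock via NVML before/after a memory overclock.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, adjust if needed

for _ in range(10):
    power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000   # NVML reports mW
    mem_mhz = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
    print(f"memory clock: {mem_mhz} MHz, board power: {power_w:.1f} W")
    time.sleep(1)

pynvml.nvmlShutdown()
```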

Not saying it's a terrible thing or anything; they probably have more thermal headroom since the actual GPU chip doesn't get as hot. But it does require more power.

I'm super curious where the 9070 XT will place itself against the 5070 Ti. Nvidia needs some competition in the mid-range, and not just in raster, something that will just... deflate some balls in Nvidia's boardroom :p
 