AMD admits it doesn't have an RTX 4090 competitor

Status
Not open for further replies.
If the RX 7900 series is able to deliver solid performance that's better than the RTX 4080, then AMD will already have a more compelling product than Nvidia. It's not as good for running RT titles, but that's not a show-stopper either. The RTX 4090 is faster, but pointless and priced beyond most gamers' budgets.
 
The 7900 XTX and 4080 16GB haven't launched, been tested, or been reviewed yet. Going by the specs and charts, if the 7900 XTX is even equal to the 4080, then the price will make the difference for whoever is willing to give them that money. I'm waiting for reviews on both, even knowing I won't buy either.
 
Buying a high-end graphics card serves several purposes!
1. It tells lazy game devs that they can keep making poorly optimized games (when Moore's law dies soon, this won't end well).
2. It gets you 300+ fps on your 144 Hz monitor.
3. It gives Nvidia money.
4. It increases your energy bill and turns your PC into a small heater (time to buy more expensive cooling! Buy up!).
5. Bragging rights, so you can tell everyone how you wasted $1,000+ on a GPU nobody needs.
6. RTX as a bragging right (even though nobody really needs it or uses it much).
100% worth it

Just get a 4060 ti or 4070 when it's all released.
 
At most we'll see 2.8 GHz, but realistically ~2.7 GHz is what we'll get from the 7900 AIB cards (and even this might be too high). The mid-range cards are the ones that will get closer to 3 GHz.

Really...? That sounds like an opinion, not something based on anything you've heard or discussed...

We know for a fact that RDNA3 can hit 3 GHz... I already hinted at why AMD is releasing the 7900 series as a normal-sized card at $899 & $999... because AIBs will take it to a new level.
 
It's true....AMD does not have a single new GPU that requires a dedicated power plant to run.
Such losers....

Buying a high-end graphics card serves several purposes!
1. It tells lazy game devs that they can keep making poorly optimized games (when Moore's law dies soon, this won't end well).
2. It gets you 300+ fps on your 144 Hz monitor.
3. It gives Nvidia money.
4. It increases your energy bill and turns your PC into a small heater (time to buy more expensive cooling! Buy up!).
5. Bragging rights, so you can tell everyone how you wasted $1,000+ on a GPU nobody needs.
6. RTX as a bragging right (even though nobody really needs it or uses it much).
100% worth it

Just get a 4060 ti or 4070 when it's all released.
Right on the money, imo.
Devs are being handed the opportunity to do less because of the extra power.
Buyers are being handed the opportunity to spend more on electricity.
Sounds like a great deal...
 
The country is not nearly as bad as you make it sound.... Lose your job, and almost all states will now cover you with Medicaid, which pays 100% for almost everything.

Have a low income and/or can't afford a needed service?

Every single hospital network I've been to has a charity care program to help you pay for, or even completely cover, your costs for services.

I had a $70,000 surgery when I was 19 and my health insurance only covered $55k. The hospital called afterwards and asked how I would pay the difference; I told them I was jobless (as I was for about 6 months while recovering), and they called me back 20 minutes later and said it was all taken care of.

People point to the few-and-far-between exceptions who slip through the cracks and have a terrible time, but for most in this country there are plenty of ways to get taken care of.

Heck, if you're dying and go to a hospital, they MUST treat you; they cannot deny you, even if you have no coverage at all.

I spent a large portion of my life poor in the US and it was still a better life than most would experience in a lot of the rest of the world.
Well, several people I know tell me otherwise.

Health care is not only about emergencies. They tell me that most insurance plans don't pay for everything; if you need a medical treatment but it's not an emergency, most of the time the insurance doesn't cover it. A kid had an uncommon disease and the treatment cost around $1,000,000 in the US; the insurance didn't cover it. In my European country another kid had the same disease; the hospital filed a request with the health department, it was approved within a couple of weeks, and the medication was bought from the US (it's a drug that is produced in the US). That department said it cost, with transportation, around $1.5M (at no cost to the patient). Also, everyone who has no job is fully covered and receives unemployment money.

Is that similar in the US? (Honest question)

About the 4090 & 7900xtx:

Nvidia just wants to have the best product and earn the "best" badge. With it, the card will sell like hotcakes, because most gamers who are OK spending more than $1,000 care little about how much it costs: they just pay what it takes. And Nvidia knows it.

So above $1,000, Nvidia nails it. If someone spends that amount of money to play games, they care little about how much energy it burns. FPS is all that matters.

$500-$1,000: AMD may have the best product, but between marketing and word of mouth among gamers, if the 4090 is the best card then a lot of people will go Team Green for the midrange too, even if the product isn't as good for the price. So Nvidia will take a lot of the sales, if not most.

< $500: no one (at Nvidia or AMD) really cares, as they make much less money per card, and most gamers will spend above that line.
 
That doesn't mean they don't have anything faster than a 7900 XTX; all it means is that this card is a competitor for the 4080 16GB. That's a mighty big assumption based on what was said.

Well, given that the Navi 31 chip is maxed out, I'd say that's exactly the case. It's not like Ada, which still has room to grow into an RTX 4090 Ti.
 
Really...? That sounds like an opinion, not something based on anything you've heard or discussed...

We know for a fact that RDNA3 can hit 3 GHz... I already hinted at why AMD is releasing the 7900 series as a normal-sized card at $899 & $999... because AIBs will take it to a new level.
Not really. There are rumours that the preliminary OC results (with very early drivers) are around 3%, and that's why the launch was delayed. They are using the extra time for better driver optimisations, which will hopefully give AIBs more headroom.


We also have this rumour floating around:

We'll just have to wait for Steve to give us the real facts :)
 
If the halo GPU/product were the only reason for Nvidia's rabid cult following, AMD would release one.

The problem is, AMD doesn't bribe sites, writers and Tubers like Nvidia does, because even though they are a corporation, they seem to hold themselves to a higher moral code, or so you might want to believe (at least I do).

Those influencers have done so much damage to the hobby that the mindless ones now demand nothing but sky-high fps, yet if you look at the Steam hardware survey, they can only afford 3060s.

Every damned video review is the same: a giant RTX GPU in frame regardless of the video's subject, and in reviews AMD is only included or mentioned to prop up Nvidia even more.

As this article notes, AMD hasn't released a GPU in the last 3 generations that ever competed with the halo Nvidia GPU, yet the comparison always comes up.

Better yet, those AMD GPUs ended up matching, and in some cases beating, Nvidia's halo GPU in rasterization, which is still the primary technique used in over 99% of existing games.

The issue is the gimmick called RT, which is now shoved down our throats 24/7 by the same influencers at the command of their Nvidia marketing-team overlords.

In my opinion, RT will remain a useless gimmick for at least 5 more years; maybe then we might see some real benefits from it, but at an insane hardware cost.

AMD may not bribe sites that do reviews because it's pretty hard to twist the facts when it comes to performance cards, and AMD just never has those.

AMD absolutely is all about paying for guerrilla marketing on social media websites like Reddit to grow sycophants who will sing AMD's virtues no matter what.

There's no way around it: the 4090 is insane, very efficient on a per-frame basis, and actually very affordable on a dollar-per-frame basis as well, but most of us simply don't want to spend more than $800 on a card.

I am one of those people, and yeah, I did just buy a 6800 XT. AMD is doing some things right, but it took Nvidia trying to scam people with that 4080 12GB model for me to be willing to go back to AMD's bad-driver land and do without RT, which looks absolutely amazing in a lot of games when you see it in person.

Just don't be fooled or pretend that AMD doesn't have a huge marketing budget to pay bots or shills to talk up AMD and **** on Nvidia no matter what. You could possibly be one yourself, because you'd have to be psychotic to think AMD isn't absolutely shady with marketing, or that they won't gouge on price when they have the better chips, like in the previous Ryzen generation. AMD will stoop just as low when given the opportunity.
 
Not really. There are rumours that the preliminary OC results (with very early drivers) are around 3%, and that's why the launch was delayed. They are using the extra time for better driver optimisations, which will hopefully give AIBs more headroom.


We also have this rumour floating around:

We'll just have to wait for Steve to give us the real facts :)

That misinformation was already debunked, because after the press release it "slipped" out live on camera that AIBs are free to exploit what was engineered to be a 3 GHz die...

AMD's own stock cards are 2.4 GHz... and AIBs get to play with another 500+ MHz. AMD can bask in the glory (while never officially claiming they can beat, or compete with, the RTX 4090)... while AIBs match or beat a stock 4090 in several games.

The RX 7900 XTX's starting point of $999 (stock)... leaves a massive field for AIBs to chase the $1,599 RTX 4090...!


The entire RTX 4080 is dead in the water at $1,199, when the $899 RX 7900 XT beats it in 90% of games.
 
I think that this was a smart decision by AMD. It's like they deliberately kept the reference models as vanilla as possible which is actually a great thing.

Good for Users:
The lower power draw means that users don't need to incur the cost of upgrading their PSU because they only need 2 standard connectors which pretty much all PSUs made in the last 20 years already have. The lower power draw also means less heat so users don't have to incur the cost (and work) of upgrading their cases because the card doesn't need some colossal exotic cooler (unlike some other cards). This means that a maximum number of gamers can use the reference model as a quick and easy drop-in upgrade. That's a very attractive quality to most users.

Good for AIBs:
Keeping the reference card as vanilla as possible also means that for users who want more performance, the AIB partners can make models with higher power and performance levels. The AIB models will cost more but in this case, they'll actually be worth it which is a great thing for the AIBs. As was demonstrated by EVGA, a video card AIB cannot be profitable if the GPU manufacturer is competing with them directly. In this case, the AIB partners won't be limited to selling reference models in the beginning. The reference model will be there but the AIB partners can release their upscale custom models from day one. This makes AIB partners very happy and it might tempt EVGA to consider making Radeon cards.

The "lower" (than nVidia) MSRPs are more of a signal to me that AMD is going back to the normal pricing of video cards. When I say normal, I mean that the prices on video cards had been more or less stable from 1987-2018 with only $300 of MSRP "price creep" over those 31 years for the most expensive cards in the world (I skipped over the 90s because there were so many brands at the time battling it out):
1987: ATi EGA Wonder (first card with 16 colours): $400
2009: nVidia GTX 295 (dual-GPU monster): $500
2016: nVidia GTX 1080 (Pascal domination): $600
2017: nVidia GTX 1080 Ti (Prince of Pascal): $700
2018: nVidia RTX 2080 (Enter RTX): $700

Inflation was counterbalanced by the fact that tech drastically loses value over time. There is no question that not only did GeForce and Radeon survive during this period, they thrived. A thriving company is not strapped for cash which means that their profit levels were more than acceptable as they kept on advancing the technology at near breakneck speed. They were both doing just fine.

An MSRP price creep of $300 over 31 years is an average of about 19¢ per week. It has been only 4 years since the top gaming card in the world was $700, and yet here we are with a card that's $900 more expensive than the top card FOUR years ago. That's a weekly MSRP price creep of $4.33, roughly 23x greater than it had been between 1987 and 2018.
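That price-creep math is easy to sanity-check yourself; here's a quick sketch using only the MSRP figures quoted above ($400 in 1987, $700 in 2018, $1,599 in 2022):

```python
# Sanity-check the MSRP price-creep arithmetic quoted above.

WEEKS_PER_YEAR = 52

# 1987-2018: top-card MSRP went from $400 to $700 -> $300 creep over 31 years
old_creep_per_week = 300 / (31 * WEEKS_PER_YEAR)   # dollars per week

# 2018-2022: top-card MSRP went from $700 to $1,599 -> ~$900 creep over 4 years
new_creep_per_week = 900 / (4 * WEEKS_PER_YEAR)    # dollars per week

print(f"1987-2018 creep: {old_creep_per_week * 100:.1f} cents/week")  # ~18.6
print(f"2018-2022 creep: ${new_creep_per_week:.2f}/week")             # ~4.33
print(f"ratio: {new_creep_per_week / old_creep_per_week:.0f}x")       # ~23
```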

The only possible cause for any of this would be nVidia's greed and the fools who are willing to pay these insane prices. If people had self-control, they'd have looked at the price of a card like that and said "Are they out of their damn minds? I'm not going to pay that much for a f*cking video card!". The problem is that the most recent mining craze had us all fleeced. I don't believe that gamers were the target of scalpers (although I don't think that they cared either), I believe that miners were. We gamers were just caught in the crossfire.

The fact that many people actually bought the RTX 4090 makes me question the sanity of people today because their inability to control themselves hurts them and everyone else as well. Sure, during the mining craze, several of us had no choice but to pay through the nose if we wanted a new card but to allow yourself to get bent over and sodomized when cards are cheap and plentiful makes stupidity look like genius.
 
In Russia electricity is cheap and abundant. No restrictions. The heaters here use hot water sent from a plant; we don't use electricity or gas for heating. Winters are hard, of course.
The US, despite being a first-world country, has very little clue about hot-water heating and its efficiency advantage over blowing hot air over everything and expecting miracles. That's not to say that radiant flooring and boilers don't exist here, just that they're considered "too expensive," and the majority of residential heating would require substantial amounts of money to retrofit to something much more efficient. Besides, the US, being a first-world country, couldn't care less about energy efficiency and the cost savings over time, since it has all kinds of money to pi$$ away on "hot air".
 
You could possibly be one yourself as you have to be psychotic
Sadly for you, you are mistaken about the reasons why I prefer and defend AMD, which I have posted many times before.
they will not gouge on price when they have better chips like the previous Ryzen generation.
Last I remember, their prices went up because of TSMC, over which they don't have the control Intel has over its own fabs. That gives Intel the opportunity to sell cheaper than AMD. Also remember, Intel is selling CPUs based on old tech, so it's cheap for them, and they have already shown us what they do when they're on top.

AMD will stoop just as low when given the opportunity.
Funny enough, they have kind of had that opportunity, especially with consoles, yet everyone who works with them keeps coming back happily, while everyone who has dared to work with Nvidia has ended up with a giant knife in their back...
 
That doesn't mean they don't have anything faster than a 7900 XTX; all it means is that this card is a competitor for the 4080 16GB. That's a mighty big assumption based on what was said.
It's funny you say that, because Jim from AdoredTV released a new video in which he talked about that very thing. It's quite interesting and, as usual, he's 100% on point:
I would even take it a step further and say that the 4080 and the 7900 XTX are also unnecessary for non-professional use. I'm fine with them existing, but consumers shouldn't consider them unless they want to do something like develop games or leverage them part-time for compute tasks (where the time savings are worth the diminishing returns of price vs. performance).
I couldn't agree more. The RX 6900 XT was only 9% faster than the RX 6800 XT while also being 54% more expensive. I truly believe that the only reason it sold well was that at the time, people were just happy to get their hands on anything because stock was non-existent for pretty much all cards.
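That 9%-faster / 54%-more-expensive gap translates directly into a much worse cost per frame. A quick sketch (the $649 and $999 launch MSRPs are my assumption, since the post doesn't quote them, and the 9% figure is treated as the average frame-rate uplift):

```python
# Compare price-to-performance of the RX 6800 XT vs. the RX 6900 XT.
# Assumed launch MSRPs: $649 (6800 XT) and $999 (6900 XT); the 9%
# uplift figure is taken from the post above.

msrp_6800xt, msrp_6900xt = 649, 999
perf_6800xt, perf_6900xt = 1.00, 1.09  # relative average performance

price_premium = msrp_6900xt / msrp_6800xt - 1           # ~0.54 -> 54% pricier
dollars_per_perf_6800xt = msrp_6800xt / perf_6800xt
dollars_per_perf_6900xt = msrp_6900xt / perf_6900xt

print(f"price premium: {price_premium * 100:.0f}%")      # ~54%
print(f"$ per unit of performance: {dollars_per_perf_6800xt:.0f} "
      f"vs {dollars_per_perf_6900xt:.0f}")               # ~649 vs ~917
```

In other words, you pay roughly 40% more per frame for the halo card, which is exactly the "price-to-performance in the toilet" pattern discussed below.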
But for most people? In the near future you'll be able to max out a reasonably high refresh rate (1080p at 200 Hz or 1440p at 144 Hz) with the 4070- and 7800-class cards, especially with all the new features like DLSS 2 and 3 and FSR 2 and 3.
Yep, that's why ATi always referred to their level-8 cards (HD 4870, R9 380, RX 6800 XT) as "Enthusiast-Tier". The level-8s are the highest sensible tier, while the level-9s are just insane (HD 5970, HD 6990, RX 6900 XT).

The AMD FX-series CPUs were similar in that way: the FX-8350 was a decent CPU that was really nicely priced ($170 CAD), while the FX-9590 was just insanely priced ($1,000 USD) but only 13.5% faster than the FX-8350 in Cinebench R15. People don't seem to realise that halo products are meant more as a demo of a company's capabilities than as a viable product for consumers. These pie-in-the-sky products are relatively rare and their price-to-performance metric is in the toilet. As you say, if you're a professional and will use the product to make more money, then it eventually pays for itself and price becomes irrelevant (to a degree). That's why miners were willing to buy whatever was out there, including halo cards. I can guarantee you that miners wouldn't have wanted the RTX 4090, though, because its mining efficiency is terrible and it wouldn't be profitable.
Yes, you can squeeze out more performance, but you shouldn't have to raise the power limits: the true gain should be that you can now get 3090/6900 levels of performance from the 4070/7800 class of cards at lower power and a better launch price. Anything beyond that is almost always forced through with unreasonable power and cooling requirements, which push the entire stack forward. The price premium then shows up not only in the power and cooling but in every other component needed to take advantage of a super-high-end GPU: super-high-end CPUs and cooling for them, super-high-end motherboards, and super-fast RAM all become necessary. So now we're talking 100% or worse price increases for maybe 30-40% more performance in the best cases, and basically indiscernible differences in fidelity and refresh rate. Adjust settings down a bit and you get most of the way there without paying literally double once you put together the full system.
Well yeah, just like the FX-9590, a lot of these products are just high-bin silicon that has had the snot overclocked out of it.
It's really an unsustainable market, and these companies rarely admit that the market is actually sustained by midrange offerings; otherwise they'd be out of business.
I couldn't agree more. ATi's bread-and-butter has always been levels 6, 7 and 8 for gamers, while levels 3, 4 and 5 are their bread-and-butter for OEMs. People who go on about how they MUST have the halo products are either people to whom money is just paper or clueless noobs who teeter on the brink of bankruptcy for a feeling of self-worth.
I agree but also disagree with your statement. I'm very interested in the high-end tier because I don't want to have to upgrade my GPU for 6 years or so.
The thing is, even an AMD level-8 card would give you those six years. There's no way that spending 54% extra to get only 9% extra performance makes much difference to the longevity of a card. You're always better off just getting what will do what you want and upgrading every three years or so, because it's not just about performance; it's also about efficiency and the adoption of new standards. The other nice thing is that it ends up being LESS expensive than blowing all that money on a halo product.

You know, people say that AM5 motherboards are freakishly expensive, and I don't disagree because they are. However, when one considers that you could buy an RX 6800 XT plus an AM5 motherboard (and 16GB of DDR5) for the price of an RX 6900 XT alone, you really have to question the sanity of the person who chooses the RX 6900 XT.

Some people seem to have different gauges for cost and I just don't get it. They seem to think that it's perfectly acceptable to pay $350 for the performance of an ancient HD 5850, but that paying the same amount for an ASRock X670E PG Lightning with 16GB of DDR5-5600 is too much. It makes me realise just how little these people actually know about tech or about money, because to me, $350 is $350 no matter what I spend it on. I'd rather spread the money out for a balanced system than blow it all on a video card and be left with a CPU budget so low that the card gets bottlenecked and the whole system sucks as a result.

I love this video from Tech Deals for demonstrating this very thing to people because they really do need to know these things and no salesman working on commission would ever tell them:
When you look at new AAA games like Cyberpunk, the 4090 can just barely hold more than 60 fps at 4K ultra, and that's with DLSS Quality. If devs keep pushing ray tracing more and more, it will only get more demanding. When devs start using UE5, it will destroy current GPUs on ultra settings. All that said, most people don't play at 4K with ultra settings, but some of us do.
I agree with your statement, but remember that you're as rare as a hen's tooth, and halo products are theoretically not supposed to be any more common than you are. What you have to ask about the RTX 4090 is: "Since I have to use DLSS anyway, is it worth getting the most expensive card? Would it be better to get a cheaper card and save the big money for the next generation, which perhaps won't need DLSS for Cyberpunk 2077 at 4K with ultra settings?" I would choose the latter, because if I have to use DLSS anyway, what's the difference, or the point?
If 2nd fastest means my pc doesn't literally melt I'm ok with that
Hear, hear! (y) (Y)
In my opinion, RT will remain a useless gimmick for at least 5 more years; maybe then we might see some real benefits from it, but at an insane hardware cost.
Even so, I don't think that it will ever have the impact that tessellation did:
Tessellation was a REAL game-changing graphics technology because it affected everything, not just the things that gamers rarely, if ever, look at like shadows and reflections.
I agree.

The days of $500 top-end GPUs were a long time ago. That's what I paid for an ATi Radeon 9700 Pro at launch.
The cost of the top video card has gone up by $900 over the past 4 years, while in the 31 years before that it only went up by $300. So while, yeah, the days of the $500 top video card are behind us, we should still be in the days of the $700 top video card, not the $1,600 top video card. Using the historical data, the natural MSRP price creep over time should be about 18.5¢ per week.

That people believe the prices being asked are even remotely justified is just insane. The market was healthy and sustainable between 1987 and 2018 because both Radeon and GeForce grew and prospered. What nVidia is doing now is not only short-sighted, it's suicidal. The cost increase on cards over the past 4 years has moved roughly 23x faster than in the previous 31 years. No market can survive that kind of drastic change, so either it will stop or the market will collapse because not enough people will buy cards to keep it going.
 
I would even take it a step further and say that even the 4080 and the 7900 XTX are also unnecessary for non-professional use: I'm fine with keeping them but consumers shouldn't consider them unless they want to do something like develop games...

I wouldn't recommend using those cards for game development. When you develop a game to run well on the best card that exists, it runs like crap on any other hardware. Devs should use medium or even low-end cards to develop games, because if they use excellent hardware, even if they are good, they develop a bias where anything below their hardware is irrelevant. They only make the game run well on their own hardware, and everyone else gets a slow, crappy game.
 