Radeon RX 7900 XT vs. GeForce RTX 4070 Ti: 50 Game Benchmark

While the MCDs are tiny, there’s no usage (at present) for dies with any defects — for the 7900 cards, it’s the full die, at the required clock speeds, or nothing. So the yield is unlikely to be quite that high.

The same is also true of the GCD, although not to the same extent — the 7900 XTX uses a full Navi 31 and the 7900 XT only has 6 WGPs disabled (the rest of the die is fully functional). One wafer is unlikely to yield 90% for both SKUs, as any die sporting cache defects, or problems in more than 6 WGPs, can't be used.
I agree that a monolithic die like that wouldn't likely have really high yields. That also puzzles me though because with lower yields come more lower-grade GPUs but we haven't heard even a peep about lower-tier Radeons like the 7800, 7700, 7600 or 7500. I would've expected that there would be a lot of those ready to go from the resulting glut of imperfect Navi 31 dice.

I wonder what the problem is.
As already mentioned, a chip of that size would not reach 90% yield; I think it is somewhere between 70 and 80%.
Yeah, one would think... but as I said to Neeyik, lower yields should mean lots of lower-tier GPUs and we haven't seen anything. Who knows, maybe TSMC has managed to boost yields through fabrication process optimisation. After all, I would imagine that increasing yields through improvements in fabrication has to be one of TSMC's goals in their R&D departments. I don't know if this has happened for sure, I'm just postulating based on what we've seen thus far.
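For a rough sense of why a large monolithic GCD struggles to reach 90%: under a simple Poisson defect model, the defect-free share of dies falls off exponentially with die area, which is exactly why splitting the ~37 mm² MCDs away from the ~300 mm² GCD helps. A minimal sketch in Python, using an illustrative defect density (the real N5 figure isn't public):

```python
import math

def zero_defect_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Share of dies with no defects under a simple Poisson model."""
    expected_defects = (die_area_mm2 / 100.0) * defects_per_cm2
    return math.exp(-expected_defects)

d0 = 0.07  # defects per cm^2 -- illustrative assumption, not a published figure

for name, area_mm2 in (("Navi 31 GCD (~300 mm^2)", 300), ("MCD (~37 mm^2)", 37)):
    print(f"{name}: {zero_defect_yield(area_mm2, d0):.0%} defect-free")
```

With that assumed defect density the GCD comes out around 80% defect-free and each MCD in the high 90s, which fits the 70-80% ballpark above; GCDs with a few bad WGPs can still be salvaged as 7900 XTs, but anything with cache or otherwise fatal defects can't be used at all.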
 
AMD's control panel looks more modern, but the features inside don't really beat Nvidia's.

What is AMD's answer to DSR, and especially DLDSR? VSR is not it.
I've never felt the need for either of those. I just play games as they are and love it. Little frills like that don't impress me much. It's the reason that RT doesn't impress me much.
I think AMD lacks too many features in general; most of their features are really a ripoff of Nvidia's - G-Sync > FreeSync, RT perf (AMD is 2-4 years behind), DLSS > FSR, etc.
Well yeah, I mean, what can you expect when so much of the market is throwing money at nVidia, giving them an R&D budget that is several times what ATi has to work with? Parity? The only way to stabilise the market is to bring them to parity in GPU sales. Throwing money at nVidia and expecting ATi to magically come out with something is like expecting a miracle. The problem is that miracles aren't real. I can however say one thing that ATi did that proved to be amazing, and that was Mantle. Several features of DX12 came from Mantle and Vulkan essentially IS Mantle. That had a big positive impact on gaming in general, not just for owners of green cards.
And performance in less popular titles, and especially early access games, just seems far superior on Nvidia.
It's like I said earlier, there's "good" and then there's "good enough". For the vast majority of gamers, Radeon cards are easily "good enough". If I'm not getting over 100FPS in a new title but I'm still getting over 60FPS, I call that "good enough" because I'm not exactly suffering and I know that in a month or two, the newest Radeon drivers will give me a big performance uplift (that I probably won't even care about). I'll definitely agree that GeForce has more frills than Radeon but I honestly don't care about those and never have. If the card is fast and stable, that's all I truly care about. I don't feel like paying hundreds of dollars for extra frills that I won't even use.

If you use them and like them, then yeah, get a GeForce card. If you're more interested in the game itself than the hardware that runs it, then there's no reason to pay those extra hundreds of dollars. To me though, nothing is more important than speed and VRAM when I'm buying a card. I've never encountered the "unstable drivers" problem that so many people keep bleating about and I've gamed on only Radeon cards since 2008. The only thing that I've ever encountered in Radeon drivers that was the least bit annoying was sometimes getting overscan with my R9 Furies. All I had to do was toggle GPU scaling and everything was fine again. How many people do you think said "Oh, Radeon drivers are BAD!" and returned their cards because they had no idea how to deal with a little niggle like that? I'm willing to bet that it's a crap-tonne. :laughing:
 
$900 12GB KEKW

Too many suckers out there. "But but it's not meant for 4K" "But but it performs like a 90"
 
It was a sale. Don't know if they had any stock before I saw the page. Either way, the article is off on the pricing. There are 4070 Ti cards selling at MSRP; not many, but they do exist.
I only see one on Newegg, which is listed as a backorder; Amazon has none, and Micro Center has only a few in stock at MSRP, which can only be picked up in certain stores (pick-up only). Unless you live in a very specific location, you won't find one at MSRP without some luck.

Existing is not enough; they need to be readily available. Otherwise most people won't be able to get one, and it effectively is not selling at MSRP.
 
At launch, with actual in-stock prices, the 4070 Ti was the card to buy. It wasn't even a contest. Even if all you care about is raster performance, the 4070 Ti had the better price-to-performance ratio. Then you add things like DLSS + FG + RT + better power draw - and no issues with idle multi-monitor or video playback skyrocketing the power consumption - and you'd have to be silly to go for the 7900 XT.

Over the last two weeks prices have dropped for the 7900 XT, so it's starting to become an option.
 
That also puzzles me though because with lower yields come more lower-grade GPUs but we haven't heard even a peep about lower-tier Radeons like the 7800, 7700, 7600 or 7500. I would've expected that there would be a lot of those ready to go from the resulting glut of imperfect Navi 31 dice.
Here's the timeline of SKUs for Navi 21:

October 2020
Radeon RX 6800 -- 60 CU, 2.11 GHz, 16 GB
Radeon RX 6800 XT -- 72 CU, 2.25 GHz, 16 GB
Radeon RX 6900 XT -- 80 CU, 2.25 GHz, 16 GB

June 2021
Radeon PRO W6800 -- 60 CU, 2.32 GHz, 32 GB

August 2021
Radeon Pro W6800X -- 60 CU, 2.09 GHz, 32 GB
Radeon Pro W6900X -- 80 CU, 2.17 GHz, 32 GB
Radeon Pro W6800X Duo -- 2x 60 CU, 1.98 GHz, 2x 32 GB

November 2021
Radeon PRO V620 -- 72 CU, 2.2 GHz, 32 GB

May 2022
Radeon RX 6950 XT -- 80 CU, 2.31 GHz, 16 GB

The Navi 21 launch was in the WFH-must-buy-PC-stuff boom, so AMD naturally pushed as many dies as they could into the general consumer sector, rather than the low-volume professional market (hence the 8-month gap before the chips appeared in Pro SKUs). Different times now -- big decrease in desktop PC and component sales, operational costs still rising, etc. AMD is probably spreading the new product launches out to see how things change in general.

I don't think there's anything wrong, just a case that market conditions are forcing AMD's hand somewhat.
 
At launch, with actual in-stock prices, the 4070 Ti was the card to buy. It wasn't even a contest. Even if all you care about is raster performance, the 4070 Ti had the better price-to-performance ratio. Then you add things like DLSS + FG + RT + better power draw - and no issues with idle multi-monitor or video playback skyrocketing the power consumption - and you'd have to be silly to go for the 7900 XT.

Over the last two weeks prices have dropped for the 7900 XT, so it's starting to become an option.
Personally I think AMD should always target 10% more perf for 10% less money.

IE.
7900XT 5-10% faster than 4070 Ti but 5-10% cheaper.
7900XTX 5-10% faster than 4080 but 5-10% cheaper.

AMD will NOT sell many chips if they ask similar prices. Nvidia is the big player on PC desktop and AMD needs to be super competitive in terms of pricing.
 
7900XT 5-10% faster than 4070 Ti but 5-10% cheaper.
7900XTX 5-10% faster than 4080 but 5-10% cheaper.
5% isn't going to sway anyone who's on the fence about choosing AMD over Nvidia:

-- 5% reduction of $799 and $1199 is $759 and $1139
-- 5% increase in 60, 144, and 240 fps is 63, 151, and 252 fps

10% looks better but it's still nothing special:

-- 10% reduction of $799 and $1199 is $719 and $1079
-- 10% increase in 60, 144, and 240 fps is 66, 158, and 264 fps
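A quick way to double-check those figures, using the $799 and $1199 launch MSRPs:

```python
prices = [799, 1199]          # 4070 Ti and 4080 launch MSRPs
frame_rates = [60, 144, 240]

for pct in (0.05, 0.10):
    cheaper = [round(p * (1 - pct)) for p in prices]
    faster = [round(f * (1 + pct)) for f in frame_rates]
    print(f"{pct:.0%} cheaper: {cheaper} | {pct:.0%} faster: {faster}")
```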

Sales of Nvidia's 40 series have demonstrated that the market is willing to pay high costs for products it believes are worth it or superior somehow. AMD doesn't have a problem with price or performance -- it's about image. Years of selling products at lower prices than the competition just cements a reputation for being 'second best', which is why Ryzen prices are now fully 'Intel level'.

AMD needs a Radeon lineup to outperform every comparable Intel and Nvidia model, by a considerable margin, and forget about the price.
 
5% isn't going to sway anyone who's on the fence about choosing AMD over Nvidia:

-- 5% reduction of $799 and $1199 is $759 and $1139
-- 5% increase in 60, 144, and 240 fps is 63, 151, and 252 fps

10% looks better but it's still nothing special:

-- 10% reduction of $799 and $1199 is $719 and $1079
-- 10% increase in 60, 144, and 240 fps is 66, 158, and 264 fps

Sales of Nvidia's 40 series have demonstrated that the market is willing to pay high costs for products it believes are worth it or superior somehow. AMD doesn't have a problem with price or performance -- it's about image. Years of selling products at lower prices than the competition just cements a reputation for being 'second best', which is why Ryzen prices are now fully 'Intel level'.

AMD needs a Radeon lineup to outperform every comparable Intel and Nvidia model, by a considerable margin, and forget about the price.
The 7900 XT started out at 899 dollars, which is way too much. Even the 7900 XTX was better value. The higher-end card should never be the better value. Failed pricing.

AMD won't and can't beat the 4090. They don't think there are buyers for a $1500-2000 AMD GPU, and I doubt there are as well.

MCM was not the savior AMD thought it would be for the GPU market.
Intel is actually gaining market share in the CPU segment again. This just shows that AMD should not be more expensive, because Ryzen is not better overall. Ryzen still has more software issues, especially in the enterprise segment with docks, multi-monitor use and different requirements, plus slow boot times.

Intel and Nvidia still get most of the focus from developers, and this is why AMD is more wonky most of the time.

I had a 5700 XT at launch; it took AMD around 9 months to fix the black screen issues and overheating VRMs. It's experiences like this that make people go back to Nvidia.
 
5% isn't going to sway anyone who's on the fence about choosing AMD over Nvidia:

-- 5% reduction of $799 and $1199 is $759 and $1139
-- 5% increase in 60, 144, and 240 fps is 63, 151, and 252 fps

10% looks better but it's still nothing special:

-- 10% reduction of $799 and $1199 is $719 and $1079
-- 10% increase in 60, 144, and 240 fps is 66, 158, and 264 fps

Sales of Nvidia's 40 series have demonstrated that the market is willing to pay high costs for products it believes are worth it or superior somehow. AMD doesn't have a problem with price or performance -- it's about image. Years of selling products at lower prices than the competition just cements a reputation for being 'second best', which is why Ryzen prices are now fully 'Intel level'.

AMD needs a Radeon lineup to outperform every comparable Intel and Nvidia model, by a considerable margin, and forget about the price.
I disagree; $100 less, boosted by taxes, is a significant difference... twice what you think it is.
 
I disagree; $100 less, boosted by taxes, is a significant difference
$100 less is a 13% reduction against the 4070 Ti, though, before any tax or other rebate. KofeViR's suggested reduction was 5 to 10%.

The 7900 XT started out at 899 dollars, which is way too much. Even the 7900 XTX was better value. The higher-end card should never be the better value. Failed pricing.
In Steve's testing, the 7900 XT was 13% and 17% slower than the XTX, on average, at 1440p and 4K respectively. Unfortunately, it was definitely priced too close to the XTX, being just 10% cheaper, and even now it's still too expensive.
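To put a number on how close the two were at launch: at $899 against the XTX's $999, and trailing by 13-17%, the XT actually came out slightly behind on performance per dollar. A quick sketch using those figures:

```python
xtx_price, xt_price = 999, 899                # launch MSRPs
for res, deficit in (("1440p", 0.13), ("4K", 0.17)):
    xt_perf = 100 * (1 - deficit)             # XTX normalised to 100
    ratio = (xt_perf / xt_price) / (100 / xtx_price)
    print(f"{res}: the 7900 XT delivers {ratio:.0%} of the XTX's performance per dollar")
```

Roughly 97% at 1440p and 92% at 4K, i.e. the cheaper card was the worse value.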

MCM was not the savior AMD thought it would be for the GPU market.
How cost-effective the use of MCDs is remains to be seen, given that it's just in two products. Besides, AMD never said it was any kind of savior for the market -- it said that the use of chiplets brought cost benefits. For whom was never made explicitly clear.
 
Like what? For example, Asus has two motherboards that are pretty much carbon copies, except one supports Intel and the other AMD:


Differences: the Intel board supports Thunderbolt; the AMD board supports dual M.2 PCIe 5.0 and dual PCIe x8. That's pretty much it.

Prices: AMD $484 https://pcpartpicker.com/product/bG...am5-motherboard-rog-strix-x670e-e-gaming-wifi
Intel $499.99 https://pcpartpicker.com/product/2Y...1700-motherboard-rog-strix-z790-e-gaming-wifi

If AMD really charges more for chipsets, then why is the AMD option cheaper despite offering more M.2 PCIe 5.0 slots? Those are very expensive.
Sure, an LGA 1700 board can cost as much as an AM5 board but they don't necessarily have to. There are a lot of budget-friendly LGA 1700 boards that cost a good deal less than the least-expensive AM5 boards. At the bottom-end for example, an LGA 1700 motherboard costs about half of the least-expensive AM5 board. The least-expensive AM5 motherboard is the Gigabyte B650M D3SH for $150 while the least-expensive LGA 1700 motherboard is the ASRock H610M-HDV for $80.

Not having to buy new RAM and only having to pay a little over half is perhaps the best counter-punch to AMD's long-lived platforms because not only does it mean that the AM5 platform won't save you money, but you'll have to pay it all up-front. Also, that Intel board takes DDR4 which is one less cost to incur, especially when you consider that DDR4 is still perfectly fine for most people's purposes.
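To make that concrete with back-of-the-envelope numbers (the DDR5 kit price below is purely illustrative):

```python
# Hypothetical cheapest-board route on each platform.
lga1700_board = 80      # ASRock H610M-HDV
am5_board = 150         # Gigabyte B650M D3SH
ddr4_kit = 0            # reuse the DDR4 you already own
ddr5_kit = 100          # assumed price for a basic 32GB DDR5 kit

print(f"LGA 1700 entry cost: ${lga1700_board + ddr4_kit}")
print(f"AM5 entry cost:      ${am5_board + ddr5_kit}")
```

Even with a modest kit price, the AM5 route costs roughly three times as much up front in this example.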

Don't get me wrong, I'm not saying that we're comparing apples to apples here because I believe that the AM5 platform is clearly superior to LGA 1700 overall. What I'm saying is that the aspects of AM5 that make it superior to LGA 1700 are generally aspects that most people won't understand or give a rat's posterior about. Most people couldn't tell you the difference between PCI-Express versions (or USB versions for that matter). To them, a PC is either fast enough or it isn't fast enough (in whatever program they use on it). I agree with you that PCI-e5 can't be cheap to implement but the thing is, just like ray-tracing, right now it's only an expensive frill as there are few, if any, devices supporting it.

I personally don't see the problem with only using DDR5 because before LGA 1700, the only time that I'd ever seen a motherboard that could use two DDR standards was on my old LGA 775 ASRock 4CoreDual-VSTA. Now THAT motherboard was innovative as hell because it was made for easy upgrading. Not only did it support both DDR and DDR2, it also supported both AGP and PCI-Express! The innovation came from VIA because the northbridge was the VIA PT880 Ultra and the southbridge was the VIA VT8237A.

It's impressive as hell to be sure, but it's also unique as I've never seen anything else like it. The only reason that this could be done was that Intel's memory controllers have historically been part of the motherboard's northbridge while AMD's memory controllers have been on the CPU die itself. I think that the prices of the AM5 boards were exacerbated by the high prices of DDR5, something that AMD had no control over. Having said that, Intel's decision to support DDR4 on the LGA 1700 platform was a very deft and effective out-flanking of AMD.
As for an R5 7600X3D: right now the $700 CPU (7950X3D) is sold out everywhere, so there is no way AMD could produce enough hexa-core CPUs with 3D cache. Perhaps in the future, but not now.
You're making assumptions again and I've told you before that I don't deal in assumptions, only provable facts.
Assumption #1 - Nobody has any real idea of how many R9 X3D CPUs AMD has produced. I believe it far more likely that they sold out because AMD didn't make many to begin with, not because they're in high demand. These were never going to be mass-market products and so the fact that they're sold out proves nothing.

Assumption #2 - I already torpedoed this assumption in a previous thread so I don't know why you're trying to use it on me again. There is literally zero evidence that there's a shortage of 3D cache and I honestly have no idea where you got this impression. The last time you tried this on me, I simply asked you to provide evidence because I couldn't find any (If I had, I would have agreed with you). You failed to provide me with that evidence then and I still can't find any now so why are you bringing it up again?

I would love to see evidence that backs up your second assumption because that would make me less pissed-off at AMD because at least there would be a reason. If you still can't provide me with evidence of this, then please stop saying it to me.
 
The only thing I'm sure of is that the 4070's margins are much higher than the 7900 XT's: less VRAM, a single smaller chip, etc...
I honestly can't believe that anyone could be so gullible as to think that GPU prices are what they are because of "hyperinflation". The cost of these parts has literally doubled over the last five years while nothing else has and the prices of nVidia cards have risen far more sharply than the Radeons. Hyperinflation my arse!

At least you see through the BS.
 
Sure, an LGA 1700 board can cost as much as an AM5 board but they don't necessarily have to. There are a lot of budget-friendly LGA 1700 boards that cost a good deal less than the least-expensive AM5 boards. At the bottom-end for example, an LGA 1700 motherboard costs about half of the least-expensive AM5 board. The least-expensive AM5 motherboard is the Gigabyte B650M D3SH for $150 while the least-expensive LGA 1700 motherboard is the ASRock H610M-HDV for $80.
That is because the AMD board is at least somewhat useful and future-proof. If you look carefully, that low-end Intel trash has 0 (that means zero) M.2 slots. And yes, those cost money.
Not having to buy new RAM and only having to pay a little over half is perhaps the best counter-punch to AMD's long-lived platforms because not only does it mean that the AM5 platform won't save you money, but you'll have to pay it all up-front. Also, that Intel board takes DDR4 which is one less cost to incur, especially when you consider that DDR4 is still perfectly fine for most people's purposes.

Don't get me wrong, I'm not saying that we're comparing apples to apples here because I believe that the AM5 platform is clearly superior to LGA 1700 overall. What I'm saying is that the aspects of AM5 that make it superior to LGA 1700 are generally aspects that most people won't understand or give a rat's posterior about. Most people couldn't tell you the difference between PCI-Express versions (or USB versions for that matter). To them, a PC is either fast enough or it isn't fast enough (in whatever program they use on it). I agree with you that PCI-e5 can't be cheap to implement but the thing is, just like ray-tracing, right now it's only an expensive frill as there are few, if any, devices supporting it.
Older DDR versions are never worth buying. DDR5 will quite soon make DDR4 obsolete, and DDR4 spare parts (motherboards, etc.) will be hard to find. That has happened every time in the DDR era. While older DDR is "fine", it's just not worth buying, since it goes in the trash on the next upgrade or a later fix.

At this moment AMD is the premium brand; Intel sells cheap trash since that's all they can do. And premium parts always cost more than cheap ones. At least you get good value for money when buying AMD, and a premium brand just cannot easily accept trashy motherboards, because customers will think the poor features are AMD's fault.
I personally don't see the problem with only using DDR5 because before LGA 1700, the only time that I'd ever seen a motherboard that could use two DDR standards was on my old LGA 775 ASRock 4CoreDual-VSTA. Now THAT motherboard was innovative as hell because it was made for easy upgrading. Not only did it support both DDR and DDR2, it also supported both AGP and PCI-Express! The innovation came from VIA because the northbridge was the VIA PT880 Ultra and the southbridge was the VIA VT8237A.

It's impressive as hell to be sure, but it's also unique as I've never seen anything else like it. The only reason that this could be done was that Intel's memory controllers have historically been part of the motherboard's northbridge while AMD's memory controllers have been on the CPU die itself. I think that the prices of the AM5 boards were exacerbated by the high prices of DDR5, something that AMD had no control over. Having said that, Intel's decision to support DDR4 on the LGA 1700 platform was a very deft and effective out-flanking of AMD.
I remember those kinds of motherboards. There were some others that supported SDRAM (SDR) and DDR simultaneously.

AMD said they are trying to offer a long-term platform with AM5. Pairing DDR4 memory with a long-term platform rarely makes sense. Also, with AMD it's easier for the customer, since AM5 indicates the board supports only DDR5 memory. With Intel it's a huge mess, as some boards support DDR4 and some DDR5. Intel did that only because they are now the low-end brand and cannot compete with AMD's premium features.
You're making assumptions again and I've told you before that I don't deal in assumptions, only provable facts.
Assumption #1 - Nobody has any real idea of how many R9 X3D CPUs AMD has produced. I believe it far more likely that they sold out because AMD didn't make many to begin with, not because they're in high demand. These were never going to be mass-market products and so the fact that they're sold out proves nothing.
It's not only an assumption. You can check the actual 7950X3D and 7900X3D sales data at Mindfactory. The 7950X3D sold (out) more units in hours than any Intel CPU sold in 7 days (except the 13700K). So there was decent stock. Selling out that quickly, at that price, proves high demand.
Assumption #2 - I already torpedoed this assumption in a previous thread so I don't know why you're trying to use it on me again. There is literally zero evidence that there's a shortage of 3D cache and I honestly have no idea where you got this impression. The last time you tried this on me, I simply asked you to provide evidence because I couldn't find any (If I had, I would have agreed with you). You failed to provide me with that evidence then and I still can't find any now so why are you bringing it up again?

I would love to see evidence that backs up your second assumption because that would make me less pissed-off at AMD because at least there would be a reason. If you still can't provide me with evidence of this, then please stop saying it to me.
Fact 1: V-Cache chips for Zen 4 use a custom 7nm process, not the same one used on the 5800X3D. Source: https://www.techspot.com/news/97818-amd-explains-how-new-3d-v-cache-improves.html

That means AMD couldn't have been building Zen 4 V-Cache stock for the whole time the 5800X3D has been on sale.

Fact 2: the only major difference between a non-3D-cache Ryzen model and a 3D cache model is the extra V-Cache chip. There is no shortage of Ryzen IO dies. There is no shortage of Zen 4 chiplets. V-Cache chips? As usual, Epyc server chips (Genoa-X) will use V-Cache chips on every chiplet. That means a single Epyc CPU may consume 12 V-Cache chips. AMD of course gives Epyc CPUs higher priority, and those are still unreleased.

That is both a blessing and a curse. Since server chips and desktop chips use mostly the same parts, both come from the same production lines, which makes them cheap and easy to produce. But it also means there won't be dedicated production lines for desktop chips.

Fact 3: never before has a top-end desktop CPU been in short supply unless there were production issues of some sort, because it makes zero sense to limit availability of the product with the best profit margin. It also makes little sense to delay releasing the top product with the best margin unless there are supply issues.

Adding those up, it's very clear that V-Cache chips are in short supply, even without AMD saying it directly (they won't, of course).
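The Genoa-X point is easy to quantify. If each stacked CCD takes one V-Cache die (my assumption here), then a fully stacked 12-CCD Epyc consumes as many cache dies as a dozen desktop X3D parts, since the desktop chips stack cache on only one CCD:

```python
# Assumes one V-Cache die per stacked CCD.
genoa_x_stacked_ccds = 12      # V-Cache on every chiplet of a 12-CCD Epyc
desktop_x3d_stacked_ccds = 1   # 7800X3D/7900X3D/7950X3D stack cache on one CCD only

print(f"One Genoa-X CPU uses the V-Cache dies of "
      f"{genoa_x_stacked_ccds // desktop_x3d_stacked_ccds} desktop X3D CPUs")
```

So every Epyc that gets priority eats the cache-die budget of twelve desktop chips.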
 
The 7900 XT started out at 899 dollars, which is way too much. Even the 7900 XTX was better value. The higher-end card should never be the better value. Failed pricing.

AMD won't and can't beat the 4090. They don't think there are buyers for a $1500-2000 AMD GPU, and I doubt there are as well.

MCM was not the savior AMD thought it would be for the GPU market.
Intel is actually gaining market share in the CPU segment again. This just shows that AMD should not be more expensive, because Ryzen is not better overall. Ryzen still has more software issues, especially in the enterprise segment with docks, multi-monitor use and different requirements, plus slow boot times.

Intel and Nvidia still get most of the focus from developers, and this is why AMD is more wonky most of the time.

I had a 5700 XT at launch; it took AMD around 9 months to fix the black screen issues and overheating VRMs. It's experiences like this that make people go back to Nvidia.

But...
AMD wins on price/performance and doesn't have to overmarket their business hardware to gamers... it doesn't matter how much marketing NVidia tries to do, the 4070 Ti can't and never will beat the 4080... but the XT can and does..! Seems you are afraid to admit this... and are influenced by marketing.

Additionally, the 4090 isn't gaming architecture... it's just Pro hardware being sold as "gamer" so Jensen Huang doesn't have egg on his face... so he sells an ego-cake $4k dGPU for $2,100 without pro software, so NVidia isn't embarrassed by RDNA. In reality, the $849 XT comes within 25 frames of the $2k 4090 an embarrassing number of times, in reviews across 20+ review sites.... and even beats NVidia's flagship in a few.

After seeing the XT over and over and over punch above its weight.... gamers are like "what's a 4070 Ti...?"



Lastly, GAME DEVELOPERS don't care one bit about DLSS, or NVidia anything.... the industry standard is RDNA (XSX/XSS, PlayStation 5, Steam Deck, etc...).


IMO, many are living in the past and cannot accept today's reality. I have a slew of EVGA GPUs lying around; even they knew when to move on.

 
The interface is a clunky mess designed for the "generation something shiny" crowd. BLUF (bottom line up front) is always the right answer for interfaces, which is what the Nvidia control panel is: concise, everything on one screen in a row/list. Also, AMD's features are much more primitive; something like ultrawide support or DSR on Nvidia just smokes AMD's equivalent (or non-existent options). On a 3440x1440 display with Nvidia I can easily switch from 1920x3440 to 2560x1440 to 3440x1440 to 3768x1577 to 4213x1764 to 4587x1920 to 4865x2036 to 5160x2160 to 5958x2494 to 6880x2880; on the AMD side you are SOL, with only 1920x1080 and 3440x1440 being available options; try to enter anything above 3440x1440 as a custom resolution with AMD and it will be invalid on my 3440x1440 monitors.
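For reference, most of that list lines up with Nvidia's standard DSR factors (1.20x to 4.00x of the native pixel count) applied to 3440x1440; the driver rounds one or two of them slightly differently, but the derivation is just:

```python
base_w, base_h = 3440, 1440
dsr_factors = [1.20, 1.50, 1.78, 2.00, 2.25, 3.00, 4.00]  # standard DSR pixel-count multipliers

for factor in dsr_factors:
    axis_scale = factor ** 0.5    # pixel count scales by the factor, so each axis scales by its square root
    w, h = round(base_w * axis_scale), round(base_h * axis_scale)
    print(f"{factor:.2f}x -> {w}x{h}")
```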
There is objectively nothing "concise" about the UI of Nvidia's applications. Absolutely nothing.

Someone like me can navigate it, but nobody, and I repeat, nobody who isn't tech savvy can use it. It's a broken mess that needs an urgent, modern rework.
 
It's because the US doesn't put taxes in their prices
From what I've seen, prices overall are better / lower in the US. Prima facie, the only visible tax we get hit with is the sales tax in your state of residence. In my case (PA), that's 6%. However, here in "Killadelphia", the city tacks on another 2%, which I assume is punishment for either being sufficiently brave of heart, or stupid enough, to live here. Most of the suppliers ignore that, which makes me wonder if one day there'll be police helicopters hovering over my house trying to collect said 2%. Still, I have to wonder if the lower prices plus state sales tax are offsetting enough to still be below European actual selling prices, before the addition of VAT.
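As a purely illustrative comparison (the euro shelf price, VAT rate and exchange rate below are assumptions, not quotes):

```python
us_msrp = 799.00             # US list price, before tax
pa_sales_tax = 0.06 + 0.02   # PA sales tax plus Philadelphia's extra 2%

eu_shelf_price_eur = 899.00  # hypothetical European shelf price, VAT included
vat_rate = 0.19              # e.g. Germany
usd_per_eur = 1.08           # assumed exchange rate

us_out_the_door = us_msrp * (1 + pa_sales_tax)
eu_shelf_usd = eu_shelf_price_eur * usd_per_eur

print(f"US out-the-door:        ${us_out_the_door:.2f}")
print(f"EU shelf price:         ${eu_shelf_usd:.2f}")
print(f"EU price excluding VAT: ${eu_shelf_usd / (1 + vat_rate):.2f}")
```

Under those made-up numbers the US buyer still comes out ahead even after sales tax, which is consistent with the impression that US prices are lower overall.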

Our next-door neighbor state, Delaware, charges no sales tax at all. State income taxes, though, are horrific. But their government seems to have convinced residents that the higher income tax is offset by the droves of out-of-state customers coming to Delaware to beat the sales tax. The Delaware state line is only about 15 miles, or 2 gallons of gas, away and back from Philly. I used to use around $100.00 as the break-even point when deciding whether or not it was worth driving down there to shop tax free.

IDK what the prices are like over there in "Transylvania", but be thankful you don't live in Australia. A simple 12-string acoustic guitar string set which cost $8.00 in the US topped out at $24.00 down under. (Those prices were prior to the pandemic, and the nasty bout of inflation we're dealing with now.)

Taxation, in general, is at best "a shell game", or "robbing Peter to pay Paul". (That said, the customer always seems to be named "Peter".)

VAT (with my limited understanding of it) nails the end customer at every stage of transition, from manufacturer to one's front door. I wonder if that allows corporations to escape some of what should be their fair share of the tax burden by passing it on to the consumer. Although, to be fair, with our tax system, a reseller pays no tax on their merchandise in inventory, but at the time of sale the consumers get hit with the full burden.

Again, as I understand it, Apple places their corporate headquarters in Ireland, due directly to its low corporate tax rates.

In the end, the customers are going to get screwed one way or another, so it's probably best to just grab your ankles, and enjoy the ride. :poop::eek:
 
But...
AMD wins on price/performance and doesn't have to overmarket their business hardware to gamers... it doesn't matter how much marketing NVidia tries to do, the 4070 Ti can't and never will beat the 4080... but the XT can and does..! Seems you are afraid to admit this... and are influenced by marketing.

Additionally, the 4090 isn't gaming architecture... it's just Pro hardware being sold as "gamer" so Jensen Huang doesn't have egg on his face... so he sells an ego-cake $4k dGPU for $2,100 without pro software, so NVidia isn't embarrassed by RDNA. In reality, the $849 XT comes within 25 frames of the $2k 4090 an embarrassing number of times, in reviews across 20+ review sites.... and even beats NVidia's flagship in a few.

After seeing the XT over and over and over punch above its weight.... gamers are like "what's a 4070 Ti...?"



Lastly, GAME DEVELOPERS don't care one bit about DLSS, or NVidia anything.... the industry standard is RDNA (XSX/XSS, PlayStation 5, Steam Deck, etc...).


IMO, many are living in the past and cannot accept today's reality. I have a slew of EVGA GPUs lying around; even they knew when to move on.

IMO, both the 7900 XT/X will live a long life, just like the previous generation 6800/6900 series. You can throw the 6700 XT in there as well.

And the upcoming 4070 12GB/3080 12GB and 4060 Ti 8GB/3070 8GB - more gimped VRAM - are not good products. The 7800/XT - essentially a 6900 XT and 6800 XT - will have 16GB VRAM. We'll see how AMD prices them.
 
There is objectively nothing "concise" about the UI of Nvidia's applications. Absolutely nothing.

Someone like me can navigate it, but nobody, and I repeat, nobody who isn't tech savvy can use it. It's a broken mess that needs an urgent, modern rework.
Objectively...there is. It's all on one frame vs flipping all over the place. It's the less is more situation; AMD's previous drivers were actually decently organized compared to their current mess. BLUF; just because something's shiny doesn't make it better; we are talking about form over function here.

But that's the least of AMD's problems; the bigger issue is just how bad their drivers are. Great cards with terrible driver optimization cannot reach their potential. My most recent AMD irritations are being unable to uncap FPS in STALKER SoC while I can do it with no issue using an nvidia card; and Project Cars crashing instantly upon attempting to load into the game engine...meanwhile even my 3080Ti has no problems. Not to mention the amount of frametime issues AMD has vs Nvidia in multiple games.

And these issues are not single system dependent; I'm seeing the same thing on multiple systems. At this point I'm kinda hoping AMD driver optimization doesn't end up mirroring 3rd party drivers for 3dfx cards in Windows XP...at that point...choose which games you can play.
 
Objectively...there is. It's all on one frame vs flipping all over the place. It's the less is more situation; AMD's previous drivers were actually decently organized compared to their current mess. BLUF; just because something's shiny doesn't make it better; we are talking about form over function here.

But that's the least of AMD's problems; the bigger issue is just how bad their drivers are. Great cards with terrible driver optimization cannot reach their potential. My most recent AMD irritations are being unable to uncap FPS in STALKER SoC while I can do it with no issue using an nvidia card; and Project Cars crashing instantly upon attempting to load into the game engine...meanwhile even my 3080Ti has no problems. Not to mention the amount of frametime issues AMD has vs Nvidia in multiple games.

And these issues are not single system dependent; I'm seeing the same thing on multiple systems. At this point I'm kinda hoping AMD driver optimization doesn't end up mirroring 3rd party drivers for 3dfx cards in Windows XP...at that point...choose which games you can play.
Sorry dude, but the whole world disagrees with you on this one. Every UI designer and Nvidia user is crying right now at what you said.
 
Objectively...there is. It's all on one frame vs flipping all over the place. It's the less is more situation; AMD's previous drivers were actually decently organized compared to their current mess. BLUF; just because something's shiny doesn't make it better; we are talking about form over function here.

But that's the least of AMD's problems; the bigger issue is just how bad their drivers are. Great cards with terrible driver optimization cannot reach their potential. My most recent AMD irritations are being unable to uncap FPS in STALKER SoC while I can do it with no issue using an nvidia card; and Project Cars crashing instantly upon attempting to load into the game engine...meanwhile even my 3080Ti has no problems. Not to mention the amount of frametime issues AMD has vs Nvidia in multiple games.

And these issues are not single system dependent; I'm seeing the same thing on multiple systems. At this point I'm kinda hoping AMD driver optimization doesn't end up mirroring 3rd party drivers for 3dfx cards in Windows XP...at that point...choose which games you can play.

I am not even sure what you are talking about.... it sounds as if you are talking about AMD & nVidia drivers from 5 years ago... when AMD was using GCN, not RDNA.

If you run multiple systems you would KNOW for a fact, that AMD's drivers are better.


Every game has the potential to have an issue with a card, but that is what updates are for. You can't take one case without also applying it to both makers, and there are plenty of issues on nVidia's side too. AMD's drivers get better over time.... while there is nothing to tweak with CUDA.
 
I am not even sure what you are talking about.... it sounds as if you are talking about AMD & nVidia drivers from 5 years ago... when AMD was using GCN, not RDNA.

If you run multiple systems you would KNOW for a fact, that AMD's drivers are better.


Every game has the potential to have an issue with a card, but that is what updates are for. You can't take one case without also applying it to both makers, and there are plenty of issues on nVidia's side too. AMD's drivers get better over time.... while there is nothing to tweak with CUDA.
I mean exactly what I'm talking about: drivers now. Out of a run of 60 games over the last 10 years, in 90% of them the AMD driver has infinitely more issues than the Nvidia driver; if anything, AMD drivers have gotten worse. And performance isn't the only issue; things like anisotropic filtering often don't function with the AMD drivers when you're dealing with older games like Operation Flashpoint. An easy test is the cobblestones in Nogova or the paved streets in Oblivion; they are muddy past a few feet with the current AMD driver and crisp and defined in those same games with the Nvidia driver. With the Nvidia driver I have had zero situations of not being able to launch a game; with AMD the same cannot be said. I have multiple systems, from a PIII 1GHz to an i7 10700KF (though my main is an i9 10920X since I do most of my work on it); I do a lot of testing on this stuff and it is quite clear that AMD has serious quality issues on the driver side.

 
I mean exactly what I'm talking about: drivers now. Out of a run of 60 games over the last 10 years, in 90% of them the AMD driver has infinitely more issues than the Nvidia driver; if anything, AMD drivers have gotten worse. And performance isn't the only issue; things like anisotropic filtering often don't function with the AMD drivers when you're dealing with older games like Operation Flashpoint. An easy test is the cobblestones in Nogova or the paved streets in Oblivion; they are muddy past a few feet with the current AMD driver and crisp and defined in those same games with the Nvidia driver. With the Nvidia driver I have had zero situations of not being able to launch a game; with AMD the same cannot be said. I have multiple systems, from a PIII 1GHz to an i7 10700KF (though my main is an i9 10920X since I do most of my work on it); I do a lot of testing on this stuff and it is quite clear that AMD has serious quality issues on the driver side.


You do realize that AMD's RDNA driver stack is multi-threaded and modern, while Nvidia's is single-threaded and old news. You do not seem to understand that AMD re-wrote their entire driver stack from the ground up for RDNA.... and SINCE THEN, NVidia drivers have not compared.

As for a case by case situation.... both AMD and NVidia have issues with certain things, that get fixed with an update (Nothing is systemic.)



Secondly, NOBODY cares about a 15+ year old game... when its 3rd successor, ARMA 3, is out and our clan has been playing it for 9 years. Old games have a LOT of problems with lots of new hardware, because they were written for DX9... a single case is a far stretch for a valid argument.

Again, nobody is using such old hardware... it's not even logical, or even a reason to complain, when a $45 Raspberry Pi is more powerful and uses 10x less energy than a Pentium. Nothing you are saying is a logical argument, and neither is flexing an X-series core count when talking about gaming..?

As for your pic... where is the mess..?
 
You do realize that AMD's RDNA driver stack is multi-threaded and modern, while Nvidia's is single-threaded and old news. You do not seem to understand that AMD re-wrote their entire driver stack from the ground up for RDNA.... and SINCE THEN, NVidia drivers have not compared.

As for a case by case situation.... both AMD and NVidia have issues with certain things, that get fixed with an update (Nothing is systemic.)



Secondly, NOBODY cares about a 15+ year old game... when its 3rd successor, ARMA 3, is out and our clan has been playing it for 9 years. Old games have a LOT of problems with lots of new hardware, because they were written for DX9... a single case is a far stretch for a valid argument.

Again, nobody is using such old hardware... it's not even logical, or even a reason to complain, when a $45 Raspberry Pi is more powerful and uses 10x less energy than a Pentium. Nothing you are saying is a logical argument, and neither is flexing an X-series core count when talking about gaming..?

As for your pic... where is the mess..?
Well, it does matter when most older games run with no issue with the Nvidia drivers but either have issues or straight up won't run with the AMD driver; that is a quantifiable demerit for AMD. Why have a driver that prevents playing games or cannot execute basic functions like forcing AF or disabling vsync? Also, the stuttering is far more of a thing with the AMD driver in certain games than it is for Nvidia. If you can't tell the difference between the organization of the NCP, with everything in a single screen, vs the mess from team red... then I can't help you there.

At the end of the day, obviously people DO care about older games or there wouldn't be things like GOG. Why would someone buy a Raspberry Pi when they can be 99% confident the older game will work on their PC with the Nvidia driver? But the issue is not just older games; frametimes are a serious issue for AMD vs Nvidia; again, though, that's because of subpar effort on the red-side drivers, not the hardware's potential.

For the amount of money these companies charge, I expect a matching level of effort on the driver side, and here AMD has sadly been a disappointment.

And also... ArmA 3 doesn't run much better than Operation Flashpoint, and it definitely doesn't have as grand a campaign as Resistance.
 