8GB VRAM vs. 16GB VRAM: RTX 3070 vs. Radeon 6800

"ray tracing is better on Nvidia"... when it work ahahahah...

Seriously, how many of you are angry in the comments but will still buy Nvidia anyway... stop your BS and act...
 
That's the kind of crap the fanboys get into. We're not reviewing the company, we're reviewing the product. That said, we certainly push back against anti-consumer behavior.

FROM MY REVIEW

What would you like me to re-evaluate? Again, I mentioned the VRAM several times in my RTX 4070 review. I'm not sure what more you guys want from me. I've been on Nvidia's case about VRAM capacities for years now, and received a huge amount of flak for doing so, but I'm continuing to dig into it as the evidence mounts.

It would still be interesting to hear your thoughts on the '90/100' score TechSpot gave the card.

Oh, and the 3070 got a 95/100 score (the 6800 got the same score, to be fair), the same card that's now struggling to play some new games, giving gamers the choice between lowered graphics fidelity/resolution or crashes, missing textures and stuttering.

The thing is that the constant message customers get is to buy nVidia products if they want better graphics / more eye candy (RTX) and stability, yet this is exactly where the 3070 is beginning to fail in some new titles, giving the games a bad rep.

Thank you for this review, it's well done and balanced, but it would have been very useful to do it *before* the 4070 review, allowing you to point out what will potentially happen to the 4070 going forward.

This is not meant for you, but an observation on the comments - there are many 'ah, the card will be fine, just play at lowered resolutions, who cares really' comments.

Seriously, on a card customers most likely paid considerably more than $500 for? I feel customers should expect more.
 
It would still be interesting to hear your thoughts on the '90/100' score TechSpot gave the card.

Oh, and the 3070 got a 95/100 score (the 6800 got the same score, to be fair), the same card that's now struggling to play some new games, giving gamers the choice between lowered graphics fidelity/resolution or crashes, missing textures and stuttering.

The thing is that the constant message customers get is to buy nVidia products if they want better graphics / more eye candy (RTX) and stability, yet this is exactly where the 3070 is beginning to fail in some new titles, giving the games a bad rep.

Thank you for this review, it's well done and balanced, but it would have been very useful to do it *before* the 4070 review, allowing you to point out what will potentially happen to the 4070 going forward.

This is not meant for you, but an observation on the comments - there are many 'ah, the card will be fine, just play at lowered resolutions, who cares really' comments.

Seriously, on a card customers most likely paid considerably more than $500 for? I feel customers should expect more.

The 3070 has zero issues with 1440p gaming outside of broken console ports like TLOU, and running the latest and most demanding games at Ultra + RT settings makes no sense on a three-year-old mid-range GPU anyway; performance is crap even on the 6800 XT.

The games in this test seem almost cherry-picked to make the 3070 look bad.

First of all, the 6800 was never a 3070 competitor. It was supposed to be a cut-down 6800 XT that undercut the 3080 in price/performance, which it did.

Second, TLOU was and is broken on Nvidia cards; it's a rushed console port, one of the worst in years according to most reviews. Hell, even the developer confirmed it.

Once again, the 3070 still beats the 6700 XT in 99% of games at 1440p using high settings, by about 10-15%.

I have a 3080 and it's still doing very well at 1440p.

Also, the results here look funny, since A Plague Tale: Requiem runs very well on most cards (except here) - it does not use a lot of VRAM.

The 3070 very easily beats the 6700 XT at 4K using the Ultra preset; however, the framerate is too low for most people on both. The 6700 XT is unplayable, though, while the 3070 at least delivers a 35 fps average, but neither card was meant for 4K gaming. The 3070 would probably deliver ~60 fps using DLSS Quality at 4K maxed.

TPU shows that max memory usage was 6.2-6.3 GB at 4K maxed-out settings.

And last, 90% of PC gamers tend to play less demanding titles (esports titles etc.) instead of the most demanding AAA games. Look at the Steam top 100. It's funny that some people act like a 3070 is trash when it will play 99.9% of games on the market to perfection at 1440p, including a lot of early access games that tend to run MUCH BETTER on Nvidia hardware. This is an area where AMD usually lacks A LOT of performance, probably because developers know that 85% of PC gamers are using Nvidia (Steam HW Survey).
 
It would still be interesting to hear your thoughts on the '90/100' score TechSpot gave the card.

Oh, and the 3070 got a 95/100 score (the 6800 got the same score, to be fair), the same card that's now struggling to play some new games, giving gamers the choice between lowered graphics fidelity/resolution or crashes, missing textures and stuttering.

The thing is that the constant message customers get is to buy nVidia products if they want better graphics / more eye candy (RTX) and stability, yet this is exactly where the 3070 is beginning to fail in some new titles, giving the games a bad rep.

Thank you for this review, it's well done and balanced, but it would have been very useful to do it *before* the 4070 review, allowing you to point out what will potentially happen to the 4070 going forward.

This is not meant for you, but an observation on the comments - there are many 'ah, the card will be fine, just play at lowered resolutions, who cares really' comments.

Seriously, on a card customers most likely paid considerably more than $500 for? I feel customers should expect more.
Personally, I don't like review scores, so I wouldn't do them. I don't have an input here; it's a TechSpot thing, and they assign a score that they think is fair based on the review. If I had to, I would have given the RTX 4070 maybe an 80/100, since it is the best-value current- or previous-generation GPU priced above $500.
 
It would still be interesting to hear your thoughts on the '90/100' score TechSpot gave the card.

Oh, and the 3070 got a 95/100 score (the 6800 got the same score, to be fair), the same card that's now struggling to play some new games, giving gamers the choice between lowered graphics fidelity/resolution or crashes, missing textures and stuttering.

The thing is that the constant message customers get is to buy nVidia products if they want better graphics / more eye candy (RTX) and stability, yet this is exactly where the 3070 is beginning to fail in some new titles, giving the games a bad rep.

Thank you for this review, it's well done and balanced, but it would have been very useful to do it *before* the 4070 review, allowing you to point out what will potentially happen to the 4070 going forward.

This is not meant for you, but an observation on the comments - there are many 'ah, the card will be fine, just play at lowered resolutions, who cares really' comments.

Seriously, on a card customers most likely paid considerably more than $500 for? I feel customers should expect more.
I don't know if you're including my comments in that bucket, but if so, I'd like to elaborate: the point I'm trying to make is more subtle than that.

It's a little subjective, but I believe games have an "artistic intent" target - essentially, what the developers themselves were looking at, using the highest quality textures, during most of the dev/test cycle.

Where I become disappointed is when what I see is visibly less pleasing than the best that they were seeing. In essence, where I'm missing out on the meat of the experience.

Where I'm not so disappointed is when what's pushing it over the resource limit is a bunch of post-processing that may, to a certain extent, be gimmicky anyway and wasn't really a central part of the artistic intent or within the design target. It seems clear that the range of options on a full-service PC port can add a bunch of new workload that maybe no one on the central design team (probably console-based) ever even looked at once. I'm not going to cry over missing that.

At least I think that's in part what Neeyik was trying to explain to me, and it rings true on reflection. Otherwise I'd still be wondering why a developer would green-light lots of extra detail in the art budget that they knew would be undisplayable on most systems anyway.
 
Personally, I don't like review scores, so I wouldn't do them. I don't have an input here; it's a TechSpot thing, and they assign a score that they think is fair based on the review. If I had to, I would have given the RTX 4070 maybe an 80/100, since it is the best-value current- or previous-generation GPU priced above $500.
Thanks for your reply - that sounds very reasonable.
 
The 3070 has zero issues with 1440p gaming outside of broken console ports like TLOU, and running the latest and most demanding games at Ultra + RT settings makes no sense on a three-year-old mid-range GPU anyway; performance is crap even on the 6800 XT.

The games in this test seem almost cherry-picked to make the 3070 look bad.

First of all, the 6800 was never a 3070 competitor. It was supposed to be a cut-down 6800 XT that undercut the 3080 in price/performance, which it did.

Second, TLOU was and is broken on Nvidia cards; it's a rushed console port, one of the worst in years according to most reviews. Hell, even the developer confirmed it.

Once again, the 3070 still beats the 6700 XT in 99% of games at 1440p using high settings, by about 10-15%.

I have a 3080 and it's still doing very well at 1440p.

Also, the results here look funny, since A Plague Tale: Requiem runs very well on most cards (except here) - it does not use a lot of VRAM.

The 3070 very easily beats the 6700 XT at 4K using the Ultra preset; however, the framerate is too low for most people on both. The 6700 XT is unplayable, though, while the 3070 at least delivers a 35 fps average, but neither card was meant for 4K gaming. The 3070 would probably deliver ~60 fps using DLSS Quality at 4K maxed.

TPU shows that max memory usage was 6.2-6.3 GB at 4K maxed-out settings.

And last, 90% of PC gamers tend to play less demanding titles (esports titles etc.) instead of the most demanding AAA games. Look at the Steam top 100. It's funny that some people act like a 3070 is trash when it will play 99.9% of games on the market to perfection at 1440p, including a lot of early access games that tend to run MUCH BETTER on Nvidia hardware. This is an area where AMD usually lacks A LOT of performance, probably because developers know that 85% of PC gamers are using Nvidia (Steam HW Survey).
The 6700 XT was never a 3070 competitor, just like the 6500 XT isn't a 3050 competitor. We all remember MSRP vs. actual retail prices for Ampere and RDNA2.

I'd wager we'll see the same with the 7800 and 7700 XT vs. their Ampere counterparts whenever they are released.
 
The 6700 XT was never a 3070 competitor, just like the 6500 XT isn't a 3050 competitor. We all remember MSRP vs. actual retail prices for Ampere and RDNA2.

I'd wager we'll see the same with the 7800 and 7700 XT vs. their Ampere counterparts whenever they are released.
The 6700 XT launched at $479.

The 3070 launched at $499, and I know several people who got a 3070 the day after release for exactly that price.

GPU prices did not go south until later. I picked up a 3080 for $699 easily on launch day back in 2020.

The 3060 Ti and 6700 XT perform on par. The 3070 beats both by 12-15%.
 
Also, the results here look funny, since A Plague Tale: Requiem runs very well on most cards (except here) - it does not use a lot of VRAM.

Ray tracing was added to the game; we are testing with RT enabled :S
 
Ray tracing was added to the game; we are testing with RT enabled :S
Ah sure, but both cards have unplayable performance with RT enabled anyway :joy:

RT is a joke in most games, even on my 3080. Maybe I am just weird, but Nvidia's RT focus is actually the reason why I'm looking in AMD's direction now. I don't want to lose raster performance to get improved RT performance when I don't want to use it.

RT has yet to amaze me. The only game that impressed me slightly was Cyberpunk 2077 with RT reflections, but the fps hit was so high that it was pointless to enable when I actually played the game instead of just looking at the graphics while standing still. Even with DLSS on Quality/Balanced, I felt the fps was too low.

I like features like DLSS, DLAA, DLDSR and other RTX-only features, but ray tracing I could live without. Maybe it will change in 2-3 generations, but right now, nah... I have even seen the 4090 struggle hard in a lot of games with RT maxed out - what is the point, outside of pretty screenshots? Even Portal RTX can choke a 4090 down to 20-25 fps (the 7900 XTX delivers 5-6 fps).

I think 30-40 fps is unplayable. 60-80 fps can work in some games, though I tend to run my games at 100 fps or more. I feel the smoothness increases my immersion more than ray tracing and other effects that decrease the framerate like crazy. Eye candy is great as long as the framerate is still decent...

This is why I am saying that most people who own a three-year-old mid-range GPU probably won't be using Ultra settings + RT to play the newest and most demanding titles in 2023... Many people would love to have a 3070, and most on Steam probably use a lot worse.

Also, I don't get why the 3070 is being compared to a 6800 when the direct competitor at launch was the 6700 XT - or actually, the 6700 XT only delivered 3060 Ti performance overall in most games, yet the 3060 Ti was $399 and the 6700 XT was $479, with the 3070 at $499.

Why didn't you use the 3070 or 3070 Ti vs. the 6700 or 6750 XT instead? Then GPU power would be closer and it would still have been 8 vs. 12 GB.
 
The 6700 XT launched at $479.

The 3070 launched at $499, and I know several people who got a 3070 the day after release for exactly that price.

GPU prices did not go south until later. I picked up a 3080 for $699 easily on launch day back in 2020.

The 3060 Ti and 6700 XT perform on par. The 3070 beats both by 12-15%.
If you go by imaginary prices, sure. A few lucky buyers got those at launch; the large majority didn't. So I very much doubt the 'easily' part.

The 6800 was admittedly an odd one, since it wasn't widely available, if at all. It used the same die as the 6800 XT and 6900 XT, and as yields were good, those are what the dies were used for.

The important point is that the 6800 can still be used just fine with all modern games without making sacrifices, while the 3070 no longer can be in all of them. And it's probably not going to get better.

This is something that can be expected from lower-end cards, but not upper-mid-range models.
 
If you go by imaginary prices, sure. A few lucky buyers got those at launch; the large majority didn't. So I very much doubt the 'easily' part.

The 6800 was admittedly an odd one, since it wasn't widely available, if at all. It used the same die as the 6800 XT and 6900 XT, and as yields were good, those are what the dies were used for.

The important point is that the 6800 can still be used just fine with all modern games without making sacrifices, while the 3070 no longer can be in all of them. And it's probably not going to get better.

This is something that can be expected from lower-end cards, but not upper-mid-range models.
No it can't, because the 6800 is too slow to max out all new games at 1440p regardless of having more VRAM; the GPU itself is weak and dated.

It was very easy to buy a 3070 at release; as I said, several friends picked one up at MSRP. The GPU mining craze had not really started at that time. I picked up several 3080s at release and kept the best. Paid MSRP for all of them.

The 3070 beat the 6700 XT at release and still does - https://www.techpowerup.com/review/asus-geforce-rtx-4070-tuf/36.html

Those are minimum fps numbers.

The 3070 is close to the 6800, even at 4K, with an $80 lower price and 8GB of VRAM. However, AMD mostly put out 6800 XTs, that's true. The 6800 still had an MSRP of $579 and the 6700 XT $479, while the 3070 was $499.

The 6800 was never a 3070 competitor; the 6700 XT was. Same price, as in within $20 of each other, yet the 3070 was 10-15% faster in most games (and overall), plus better features (DLSS, DLAA, DLDSR, ShadowPlay, NVENC, etc.).

Let's compare the 6800 with the 3080 if prices don't matter anyway?

Check my link again: my 3080 is 1 fps behind the 6900 XT in average minimum fps, aka 1% lows, even at 4K, and you think VRAM matters a lot on three-year-old dated GPUs that can't even max out the new and demanding games anyway? Neither the 6800, 6800 XT, 3080 nor 6900 XT is considered high-end today.

I have noticed VRAM issues on my 3080 at 1440p in zero games, not even in games where I used DLDSR and downsampled from 4K. Zero stutter in Metro Exodus Enhanced Edition with forced RT.
 
Ah sure, but both cards have unplayable performance with RT enabled anyway :joy:
As shown in the article, the RX 6800 ran at 52 fps at 1080p, which was very playable. The RTX 3070 was unusable. You can argue about how much you think RT is worth it; I don't really care, it's just an opinion.
 
As shown in the article, the RX 6800 ran at 52 fps at 1080p, which was very playable. The RTX 3070 was unusable. You can argue about how much you think RT is worth it; I don't really care, it's just an opinion.
It does not change the fact that the 6800 was not a 3070 competitor; the 6700 XT was, at pretty much the same price tag.

The 3060 Ti was $399 and performed on par with the 6700 XT, for $80 less.

So instead of comparing the 3060 Ti or 3070 with the 6700 XT, you went with a GPU from a completely different bracket, the 6800 series, which is an Nvidia x80 counter.

Makes no sense, especially because the 6800 was hard to find, since AMD would rather sell the 6800 XT because of good yields = not many 6800s to find, and a lot of them were nowhere near the $579 MSRP.

AMD 6700 series = Nvidia 3060-3070 series (3060 Ti, not 3060, which is way slower)
AMD 6800 series = Nvidia 3080 series
AMD 6900 series = Nvidia 3090 series

Yeah, different-bracket GPUs. The 3070 was 15% cheaper than the 6800, much easier to find, and it sold way better.

I guess you did not have a 6700 XT lying around, but you should at least mention it.
 
Ray tracing was added to the game; we are testing with RT enabled :S
But the non-RT version looks way better than games that require three times the amount of VRAM. It doesn't help that the majority (actually, all) of the games that hog VRAM while looking average to mediocre are AMD-sponsored.
 
Glad I got in well ahead of this last desktop upgrade.

I went from a 6700K/1070 combo, first at 2560x1080 (because I knew that while it was THE 1440p card at launch, that wouldn't last), then, when that got heavy on the GPU, down to 1080p by 2020. Sure, I could handle many AAA games up to that point fine with 8GB of VRAM at 1080p... but not all of them all the time.
The new PC in 2021 was the 3440x1440 setup I initially wanted but was priced well out of in 2016... with a 6800XT to run it. There were several reasons I went for the 6800XT, among them that its (at the time) incredible VRAM capacity was more reassuring than the 3080's 10GB... never mind that for much of that year the 3080 cost 50-100% more, RT and DLSS be damned. FWIW, the max usage I've seen with the 6800XT so far has been 14GB, no mods, with around 12GB being the average... across a good few recent (up to three years old) and current AAAs.

I'll add that RAM usage for the same games has been up to the mid-20s of GB, also before modding etc., in games that generally recommend 16GB... so it's not just GPUs that are getting hit. Also, FWIW, I have a 5600X/3070 and a 6800H/3070 Ti (laptop) here. The former, my gf's PC, also runs 3440x1440, at which it's pretty punchy (20-30 fps less than the 6800XT's 80-100 at the same settings), but her use case is mainly art and retro/emu gaming, so that's fine. The latter, a full-powered laptop version, runs 1440p with tweaked settings and/or capped fps for most AAAs because I want to keep thermals and fan noise down. It still uses at least 7GB, but that's also fine for now, for a laptop.

I'd only willingly buy into 8GB now for 1080p 60 fps ultra... and even that'll come with caveats soon enough, if not already in some cases. For 1440p with some consistency and longevity (assuming my average 3-5-year upgrade cycle), 12GB is the line. Sticking with 3440x1440, which I love, only a 7900XTX or 4080 would be reassuring and have enough longevity to be worth the jump. Of those, the 7900XTX is just as good, averages £200 less for 4GB more and, again, for me personally, RT and upscaling are nothing to fuss over.
 
Thanks for these types of articles!
What I like so much about these 'long reads' is that they are not about the latest fad or hot release of the day... no, they revisit hardware, scenarios and statements from a short or long while ago, providing us with new and additional insights.
Some results/tests will be more relevant or more academic depending on readers' different use cases, but independent of which brand(s) might be preferred, as a hardware enthusiast I think this is all very interesting info!
 
Guys like Avro are very disingenuous with their analysis because they are fanboys. Just look at his comment history: he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.
What analysis? All I did was post what YOU wrote. I didn't analyze ANYTHING.

Now, I can understand why you think that I'm a fanboy, Steve, but the truth is this:

1) I do hate Intel and nVidia for their actions in the past and the present. I've never tried to hide that fact because, no matter what, I always try to be 100% honest. I have a good reputation for that, and it was well earned. I won't say anything that I can't back up with data. When I say something, it's not because of what I want to be true; it's always because of something that I see to be true.

If you look at my posts that are about you specifically, you'll find that they're overwhelmingly positive but I call a spade a spade and when I see something wrong, I say so. You and I have been around PC tech for decades and with you being far more involved than me, I really have a hard time believing that you didn't see the 8GB as a serious red flag because we both remember when AGP cards had no more than 256MB on them. I don't think that making an 8GB card today is necessarily bad in concept, but if the GPU on a card is potent enough that the 8GB will be a hindrance, that makes a card an objectively bad product if not sold for dirt-cheap (and they were the opposite of dirt-cheap).

I decried nVidia's use of 8GB because it was put on expensive cards with some objectively potent nVidia GPUs. I never said anything bad about the 8GB on the RTX 3050 or the RX 6600/50/XT because more than 8GB would be a waste on those cards and would have made them more expensive. Those cards should have no more than 8GB because the GPUs aren't potent enough to make 8GB a limiting factor. I also panned the RX 6500 XT and trolled it by always following it with "(I still don't understand why AMD gave that card the 'XT' suffix.)" - does that sound like something a fanboy would do?

2) I'm not a fanboy of AMD because I don't "love" AMD; in fact, I don't really even "like" AMD. All that AMD is to me is a way of being involved in PC tech without having to use Intel or nVidia parts where I can avoid it. Fortunately, for my uses, I am always able to avoid it. If VIA released a new x86 CPU or a new S3 GPU, I would be totally interested in them. I'm not a fanboy, I'm a hater, and my hate has been well earned by Intel and nVidia. I am not delusional, however, and I don't see AMD as a saviour. I don't want AMD to win; I want AMD to achieve parity, nothing more. Does that sound like something a fanboy would say?

3) If AMD managed to reach parity in the markets with Intel and nVidia, I would stop using exclusively their parts because the main reason that I use them would be achieved. I want parity in the markets because that's what would benefit all of us consumers. Having said that, buying nVidia or Intel won't help anyone in that quest one bit. To me, Intel and nVidia have wounded the PC market, and until they have serious competition, that wound won't heal. I don't think that AMD makes better products than Intel or nVidia, but I do think that spending money on AMD parts usually (not always) is more beneficial to the consumer doing the spending because they get more for their dollar. For people who are rich and therefore don't care about that, sure, it makes no difference to them what they spend on. I'm focused on the average Joe, and the average Joe doesn't have a lot of disposable income like we did 20 years ago. The CPU market is lopsided and the GPU market is horribly lopsided.

Do you remember back in the 90s, when the video card market actually had some semblance of parity and new cards were getting released almost monthly by companies like ATi, nVidia, 3dfx, Matrox, S3, Diamond and Orchid? THAT is what I want to see again and the only way to get it is for the duopoly that we're stuck with to be no more than 60% of the market on either side. Just imagine what it would be like if that were to occur. Then imagine what it would be like if AMD decided to just pack it up because some executive decided that the consumer market was a waste of time at that point. It would make the prices we see now seem like bargains in comparison to that reality.

The truth is that I would love to see Intel gain traction in the GPU market (really, I totally would!) but only at the expense of nVidia because replacing Radeon with Arc won't improve anything. If Intel is to advance, it has to be at nVidia's expense because they can afford to lose said market share while AMD can't. I want to see a 3-way free-for-all between GeForce, Radeon and Arc. Wouldn't that be just incredible, Steve? Isn't that something worth wanting, something worth having?

Hell, I honestly wish that VIA would re-enter the CPU market but I know that they can't afford the R&D to compete with AMD and Intel. If they DID enter the market, I would happily abandon AMD and only buy VIA CPUs because I want the maximum number of players to succeed. It helps all of us. Does that sound like a fanboy to you?

I can prove to you that I'm not a fanboy because I was absolutely LIVID at AMD for what I considered to be two extremely cynical moves: the creation of the R9 X3D CPUs and the refusal to create an R5 X3D CPU. I'm sure that some of you remember my (seemingly) unhinged rant about that, a rant that was completely on your side:
"This review vindicates everything that I've been saying about AMD's decision to produce 3D versions of R9 APUs instead of R5 APUs. It is literally the stupidest decision that I've ever seen AMD make and it's going to hurt them. I get no joy from this because their choice to make these 3D R9 APUs instead of a 3D R5 APU doesn't only hurt them, it hurts gamers and I am a gamer.
https://www.techspot.com/review/263...:~:text=This review vindicates,plate of crow.
Prosumers won't pay more for an APU that is beaten by the R9-7950X in productivity, even if they want to also game with it because the R9-7950X already matches the i9-12900K in gaming which makes it an already great gaming APU for far less money than the R9-7950X3D.
OTOH, gamers won't buy it because there's no point in paying more for an APU that games worse than one costing significantly less (as the simulated R7-7800X3D showed us). This is especially true when you're paying more money for a bunch of extra cores that will just sit idle and eat power for no reason which is EXACTLY what the R9-7950X3D will do.
The R9-7900X3D is in an even worse position because it has no hope of outperforming either the R9-7950X3D or the R7-7800X3D, since it has fewer cores with 3D cache. It will perform no better in games than the APU that should have been, the R5-7600X3D, but, again, its high price will make it one of the worst processors ever launched by AMD.
I said from the beginning that AMD was utterly insane to create X3D versions of the R9 APUs instead of the R5. When Steve tests the R9-7900X3D, he'll be able to simulate what the R5-7600X3D would have been, the APU that AMD should have made. I said that 3D versions of the R9 APUs would be DOA, and sure enough, here we are.
The R5-7600X3D would've been an APU with no chance of failure. Instead, AMD decided to produce TWO APUs that have no chance of success. Even worse, these two APUs cost them a lot more of their money and resources (like TSMC allocation) than the R5-7600X3D would have, making the consequences of this assured failure all that much worse. Steve will be able to simulate an R5-7600X3D when he gets his hands on an R9-7900X3D, and we'll see what could have been, the APU that would have made AMD the undisputed kings of gaming.
Instead, here we are, EXACTLY where I knew that we'd be. To everyone who gave me flak for saying this, enjoy your plate of crow."

I ask again, Steve: does that sound like a fanboy to you, or like an idealist who gets really pissed off at cynical actions?

Don't get the idea that I don't like or admire you, Steve, because I always have and I'm pretty sure that I always will. Your dismissal of the RTX 3060 8GB was pure genius, as was your dismissal of the RX 6500 XT (I still don't know what the "XT" is for on that card). Your comparisons of the RX 5700 XT with the RTX 2060, 2060 Super and 2070 were as fair and objective as could be. Your review of the HD 4870 is what convinced me to buy one as my first Radeon card. Seeing an article from you that was so non-objective that it seemed, from the beginning, like you had an axe to grind was just shocking to me because I knew that you're so much better than that.

Not everything I say about Radeon is positive either. I was all over them because they took the RX 7800 XT, renamed it the RX 7900 XT and jacked up the price. They tried to obfuscate it with the XTX suffix on the RX 7900 XTX, and I said that while I believe nVidia deserved the crap they received for trying to pass off the RTX 4070 Ti as the RTX 4080 12GB, I didn't think it was fair that AMD got a pass for their cynical nomenclature. They should have been shamed in exactly the same way, but I understood why that didn't really happen. No matter what crap AMD pulls, Intel and/or nVidia will never fail to do something far more egregious, which makes everyone forget about AMD's (relatively) minor shenanigans and transgressions.

I'm fully aware that AMD is not anyone's friend, but for whatever reason, it cannot be denied that they have been, by far, the least evil of the three evils. I am ethically driven to use their products, but it's not because I'm a fan of them; it's because I just hate them the least. I'm also driven to get the most for my dollars while still getting a performance level that's good, or at least good enough. It cannot be denied that, historically, AMD has been the one to provide that the most.

Hell, I got my FX-8350 for only $170 CAD and it lasted me five years. I know, based on your pretty cool (and funny) dartboard, that you hate FX for some reason (probably because AMD over-hyped it), but it served me flawlessly between 2012 and 2017. Was it good? Maybe not, but it was "good enough" and it was far cheaper than Sandy Bridge or its descendants. I was also on AMD's side in the "How many cores in an FX-8350?" question. It wasn't because I liked AMD, and as much as I DID want to stick it to Intel, being dishonest would only make Intel look good and me look bad.

Steve, I'm sure that you remember this. Before the advent of the Intel 80486DX, CPUs were just ALUs, and if you wanted an FPU, you had to buy a "math co-processor" based on Intel's x87 FPU architecture. That means the CPU core had already been defined, many years prior, as an ALU. It also means that, objectively, AMD didn't lie, and so I had to take their side. If they HAD lied, I would've been pissing all over them like I did with the R9 X3D CPUs. I had no emotional investment in it whatsoever; it was simply what I saw as being correct, nothing more. I was defending the established definition of the CPU core as being an ALU, not defending AMD.

Tell me, Steve: does that sound like a fanboy to you?
 
What would you like me to re-evaluate? Again, I mentioned the VRAM several times in my RTX 4070 review. I'm not sure what more you guys want from me. I've been on Nvidia's case about VRAM capacities for years now, and received a huge amount of flak for doing so, but I'm continuing to dig into it as the evidence mounts.
The last words of the review: "Bottom line, the GeForce RTX 4070 is a good value product packing a strong feature set and excellent performance. We're happy that the RTX 4070 has turned out to be a product that we can recommend, and we expect this GPU to sell very well if they can meet the promised $600 MSRP."

What I think you should re-evaluate is that there's nothing recommendable about a $600 midrange card, but mostly, re-evaluate the tone with which you speak about Nvidia products. A friendly tone is as important in our conversations in the comment section as a firm and sharp tone is necessary with the powerful who enjoy building a monopoly in an unethical way. I wouldn't take the time to give you this advice if I didn't care about your work.
 
No surprises here, really. I bought the 6800 because of the VRAM, and it's generally faster than the comparable Nvidia products. Great article!
 
What was his point? That he doesn't understand that if a product is 11% faster but costs 16% more, it's not particularly good value? The reality is he's attacking one of the few content pieces that clearly highlighted the 8GB VRAM buffer as an issue, and then, years later, that same media outlet is the only one to investigate it. Talk about friendly fire.
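
For anyone who wants the arithmetic behind that value point spelled out, here's a rough back-of-the-envelope sketch in Python. The 11%/16% figures are the ones quoted above; the $499 baseline price is only there for illustration, since the ratio comes out the same regardless of the absolute price:

# Rough performance-per-dollar comparison using the figures quoted above (illustrative only)
baseline_price = 499.0               # assumed baseline MSRP, purely for illustration
baseline_perf = 100.0                # normalised performance index

rival_price = baseline_price * 1.16  # "costs 16% more"
rival_perf = baseline_perf * 1.11    # "is 11% faster"

# Performance per dollar of the pricier card, relative to the baseline card
value_ratio = (rival_perf / rival_price) / (baseline_perf / baseline_price)
print(f"Relative perf-per-dollar: {value_ratio:.3f}")  # ~0.957, i.e. roughly 4% worse value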

No one in their right mind would argue that the RTX 3070 hasn't been the vastly superior choice for ray tracing over the past few years. But we are starting to see a shift now, as I accurately predicted 2 years ago.

The TechSpot article is also a summary of my opinion (edited by Julio) from the original video, and I feel the video was a bit more critical of the RTX 3070. In the video I basically said I'd buy the RX 6800 over the RTX 3070...

"I was quite impressed with the RX 6800 in my day one review and felt it would be my go to option for $600 US or less, and despite the few hiccups seen in this testing I’m mostly still leaning that way. That said there’s a lot to talk about and depending on your preferences one might be better than the other."

Guys like Avro are very disingenuous with their analysis because they are fanboys. Just look at his comment history: he argues with anyone who says anything slightly negative about an AMD product but is more than happy to **** on Nvidia/Intel.

RDNA2 had its fair share of issues when first released; there were a number of serious driver problems.

Anyway, I'm very pleased with how my conclusion has aged, and I stand by everything I said in the video:
I agree with you for the most part, and I didn't know that about him. But why did I get a 24-hour ban from this site for making that post? And then get forced to make a new password, then spend two hours proving that I'm not a bot, etc.? And I know you can check that I had to go through all this.
But anyway, my autism makes me come across much stronger in tone than I really intend, and I am a big fan of what you do; that is why I have been supporting you on Patreon for nearly two years now.
 
I don't understand many of the comments here; they show how many people are crazy fans...

8 GB or less is still the majority, but this clearly shows that in most newer titles people will start to hit limitations at 4K and higher detail settings. It is also obvious that, mostly with ports, the shareholders want cash fast and don't allow developers to optimize the game, so you will probably get a mediocre product.

Some presented faulty points saying Steve was biased, but... where is the bias? The 3070 was only a little slower than the 6800, but that speed deficit was smaller than the price difference. Besides, AMD still doesn't have as good a media engine or as good professional app support. FSR and RT performance on AMD improved a lot but took too long, and Nvidia is still the best at those. Those are big pluses and can't be taken out of the equation.

I would just be very careful buying a card in 2022/23, and I would rather buy a cheaper old-gen 16 GB card or wait a little longer.
 
I agree with you for the most part, and I didn't know that about him. But why did I get a 24-hour ban from this site for making that post? And then get forced to make a new password, then spend two hours proving that I'm not a bot, etc.? And I know you can check that I had to go through all this.
But anyway, my autism makes me come across much stronger in tone than I really intend, and I am a big fan of what you do; that is why I have been supporting you on Patreon for nearly two years now.
You didn't know that about me because it isn't true. I'm not a fanboy because I'm not a "fan" of AMD; I just despise Intel and nVidia, so I buy AMD by default because there's nothing else out there, not because I "love" AMD.

I may have taken the piss out of Steve but for years I also sang his praises. I noticed that the change in tone of his reviews coincided with nVidia trying to blacklist Hardware Unboxed. Despite the backlash that the tech press had towards nVidia, it would appear that they still successfully got their message across.

It's yet another reason why I hate nVidia.
 