Despite our fondness for the GeForce RTX 4060 Ti, we can admit that just 8GB of VRAM is not enough for a $400 GPU in mid-2023. So what does the extra VRAM bring to the table?
https://www.techspot.com/review/2714-nvidia-rtx-4060-ti-16gb/
> Thank you for the review!

How long do you expect game developers to kneecap themselves for old hardware? You can run games on 8GB, just use low settings at 720p, like the Xbox Series S.
What I do not get is this:
- on one hand, you recommend users to mostly use HIGH settings, because ultra adds little to nothing in terms of quality
- on the other hand, you go out of your way to show that 8GB is not enough for ULTRA settings
So maybe this is related to futureproofing, but as long as the market is full of 8GB cards, game developers need to make their games work on 8GB in order to address a large enough market.
> How long do you expect game developers to kneecap themselves for old hardware? You can run games on 8GB, just use low settings at 720p, like the Xbox Series S.

I expect developers to optimize games for 8GB as long as the vast majority of the target market consists of 8GB video cards. I do not really know the market composition, but from what I can see in the Steam surveys (which I doubt, but I have no alternative source), the >8GB market share is below 20%. Also keep in mind that of that 20%, 11.5 points are 12GB cards, a considerable share of which are 3060 12GB cards that are by no means suited to ULTRA settings. So while I expect game developers to bake in textures that make use of those larger buffers, they can't address only that ~15% of the market (a rough breakdown of that arithmetic is sketched below).
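To show how the ~20% figure above can shrink to roughly 15%, here is a small back-of-the-envelope sketch in Python. The 20% and 11.5% values come from the comment itself; the assumption that about half of the 12GB cards are RTX 3060 12GB models is purely hypothetical, used only to illustrate the arithmetic, not survey data.

```python
# Rough market-share arithmetic using the figures from the comment above.
# The 50% "share of 12 GB cards that are 3060 12GB" split is an assumption
# made only to show how ~20% can shrink to roughly 15%; it is not survey data.

above_8gb_share = 0.20       # claimed share of cards with more than 8 GB
twelve_gb_share = 0.115      # claimed share sitting at exactly 12 GB
assumed_3060_fraction = 0.5  # hypothetical: half of the 12 GB cards are 3060 12GB

above_12gb_share = above_8gb_share - twelve_gb_share
ultra_capable = above_12gb_share + twelve_gb_share * (1 - assumed_3060_fraction)

print(f"Share above 12 GB:        {above_12gb_share:.1%}")
print(f"'Ultra-capable' estimate: {ultra_capable:.1%}")  # lands in the ~15% ballpark
```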
At some point you have to move on. The onus is on the hardware makers: 8GB was on $250 GPUs 7 years ago. There is no reason the low end shouldn't have at least 10-12GB today.
The Nvidia card you should ignore gets 65/100. The AMD card that offers the best value at a certain price gets 20/100.
""
> I'm shocked by these benchmarks! Shocked I tell you! All the ***** fanboy experts were saying adding an extra 8GB of RAM would turn the RTX 4060 Ti into an RTX 4090. How could they be so wrong... again?

Nobody was saying that, 0/10 terrible strawman argument. What they DID say was: "An extra 8GB will fix the issues where the 4060 Ti is running out of VRAM and as a result performance is terrible." And in those situations, the 16GB DID fix the issues. Funny how that works.
> I expect developers to optimize games for 8GB as long as the vast majority of the target market consists of 8GB video cards. I do not really know the market composition, but from what I can see in the Steam surveys (which I doubt, but I have no alternative source), the >8GB market share is below 20%. Also keep in mind that of that 20%, 11.5 points are 12GB cards, a considerable share of which are 3060 12GB cards that are by no means suited to ULTRA settings. So while I expect game developers to bake in textures that make use of those larger buffers, they can't address only that ~15% of the market.

News flash: console sales make up the majority of video game sales, usually 60-70%, sometimes higher. THEY are the benchmark, not people running 5-year-old budget cards. Developers will target consoles for baseline performance; PC users with slow hardware can play at low settings and resolution if they can't even match a Series S in performance.
> Nobody was saying that, 0/10 terrible strawman argument. What they DID say was: "An extra 8GB will fix the issues where the 4060 Ti is running out of VRAM and as a result performance is terrible." And in those situations, the 16GB DID fix the issues. Funny how that works.

Don't try to walk back your claims once you got pantsed; have the courage to admit you were wrong (again) and live and learn.
> Looking at the charts, one can only be amazed at how incredibly powerful the RX 6800 is, and it's still competing VERY favorably with top-of-the-line GPUs from both companies!!

Fact. The 6800 XT should be my next upgrade in the near future.
"Ultra" settings for things like geometry, shadows, shader effects, post-processing and so on are usually inefficient in terms of how much visual improvement they bring compared to how much performance they cost.What I do not get is this:
- on one hand you recommend users to mostly use HIGH settings because ultra brings little to nothing to quality
- on the other hand, you go out of your way to show that 8GB is not enough for ULTRA settings
> So maybe this is related to futureproofing, but as long as the market is full of 8GB cards, game developers need to make their games work on 8GB in order to address a large enough market.

They do. If your card doesn't have enough VRAM, you can turn your texture settings down. The issue here is that consoles have 16 GB of shared memory, with 10+ GB of it usable as VRAM, and 8 GB cards will simply not be able to match the visual quality of the consoles (as far as textures go). It's pathetic that Nvidia is launching $300 and $400 GPUs that can't match the quality settings consoles use.
> Also, there is this statement in the article:
> "We're not sure where it originates or what the exact theory is, but the belief seems to be that there simply isn't enough memory bandwidth to effectively use a 16GB memory buffer."
> As far as I understand graphics computing, the two are not dependent on each other, but both are limiting factors. While you can have 16GB of memory for large textures, you also need a fast bus to bring them to the GPU.

Your understanding is wrong then. Texture quality settings are not bandwidth intensive, and texture filtering settings have a very modest bandwidth requirement (we got past the point where anisotropic filtering is "free" performance-wise on PC over 15 years ago). If a card has 16 GB of VRAM, it can benefit from it to use high-res textures, regardless of how narrow its bus is.
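To put rough numbers on the capacity-versus-bandwidth point, here is a quick back-of-the-envelope sketch in Python. The bus width, data rate and per-frame texture traffic are assumed, ballpark figures (roughly what a 128-bit GDDR6 card provides), not measurements from the review.

```python
# Back-of-the-envelope: VRAM capacity and memory bandwidth are separate limits.
# All figures below are assumed, ballpark values used only for illustration.

bus_width_bits = 128   # assumed narrow bus, e.g. a 4060 Ti-class card
effective_gbps = 18    # assumed GDDR6 data rate per pin, in Gbit/s

bandwidth_gb_s = bus_width_bits / 8 * effective_gbps  # peak transfer rate in GB/s
print(f"Peak bandwidth: {bandwidth_gb_s:.0f} GB/s")    # -> 288 GB/s

vram_gb = 16                # capacity limits how big the resident texture pool can be
touched_per_frame_gb = 1.0  # assumption: only a slice of that pool is sampled each frame
frames_per_second = 60

traffic_gb_s = touched_per_frame_gb * frames_per_second
print(f"Texture traffic at {frames_per_second} fps: {traffic_gb_s:.0f} GB/s "
      f"({traffic_gb_s / bandwidth_gb_s:.0%} of peak)")
```

The point of the sketch: a larger resident pool (capacity) does not by itself require proportionally more per-frame traffic (bandwidth), which is why a 16 GB card can hold bigger textures even behind a narrow bus.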
> The larger the resolution, the larger the need for memory bandwidth.

Yes, because shaders (especially pixel shaders) now have to work with even more data and even larger framebuffers. It's not because of textures.
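For a rough sense of how render-target size alone scales with resolution, here is a small illustrative sketch. The single RGBA16F colour target plus a 32-bit depth buffer is an assumed, simplified setup; real engines keep several more targets, so actual bandwidth pressure is higher.

```python
# Rough render-target sizes at common resolutions (illustrative assumptions only).
# Assumes one RGBA16F colour target (8 bytes/pixel) plus a 32-bit depth buffer.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    color_mb = w * h * 8 / 1e6  # RGBA16F colour target
    depth_mb = w * h * 4 / 1e6  # D32 depth buffer
    print(f"{name}: {color_mb + depth_mb:6.1f} MB of render targets")
```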
> Also keep in mind that of that 20%, 11.5 points are 12GB cards, a considerable share of which are 3060 12GB cards that are by no means suited to ULTRA settings.

The 3060 is perfectly capable of handling ultra settings for textures, because 1) texture settings don't affect framerate, and 2) it has enough VRAM to do it.
> As for the buffer size, the problem might be that scaling the RAM cells does not work as well on newer processes as it did in the past. So while RAM speed has increased in recent times, density did not increase that much, so it does not cover the cost increase of new processes. In order to have larger buffers we would need more memory dies, and thus a higher price.

DRAM prices right now are the lowest they have been in ages. Recent estimates put 16 Gigabit GDDR6 modules at around $6 to $7 each, meaning 8 GB worth of modules costs AIB partners less than $30.
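As a quick sanity check of that claim, the arithmetic works out as follows, using the comment's own price estimate (not a confirmed figure):

```python
# Cost of the extra 8 GB of GDDR6, using the $6-$7 per-module estimate above.
module_capacity_gbit = 16  # one GDDR6 module
extra_vram_gb = 8          # the extra 8 GB on the 16 GB card

modules_needed = extra_vram_gb * 8 // module_capacity_gbit  # 64 Gbit / 16 Gbit = 4 modules

for price in (6, 7):       # estimated $ per module, per the comment
    print(f"At ${price} per module: {modules_needed} modules = ${modules_needed * price}")
# -> $24 to $28, i.e. under $30 for the extra 8 GB
```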
> The 6500 XT was unforgivably awful. It deserved the score it got.

Even Techspot considered it the best value; here you go:
"Best value" my rear end. You had to have a current-gen platform to even get the most out of it (largely defeating the purpose of its existence), and it was severely stripped down in functionality because it was a repurposed laptop GPU. It being the "best value" was largely thanks to the GPU shortage, somehow managing to stay on shelves despite said shortage (I wonder why?). In reality, even at its $200 MSRP, it was twice as expensive as it should ever have been. It was also no faster than its predecessor!
> For those of you looking to spend as little as possible, the cheapest new graphics card is the unfortunate Radeon RX 6500 XT. Essentially, it's a bad product with numerous issues that costs too much for what it is. However, there's no alternative for under $450, so if you desperately need a new graphics card and the used market is out of the question, this is it.

You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic, basically every GPU release since 2019 sucks, because the pricing is horrible.
Even Techspot considered it the best value; here you go:
The Best GPUs 2022: New & Used Graphics Cards — www.techspot.com
You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic, basically every GPU release since 2019 sucks, because the pricing is horrible.
It's funny that AMD got hate when they released something that you could actually buy, while Nvidia got praise when they released something "better" (the 3050) that was nowhere to be found.
Those who say the 6500 XT was a bad release should understand that many people had just two choices: 1. Buy a 6500 XT. 2. Don't buy anything. Taking option 1 off the table seemed to be better for some reason.
> Availability is no indicator of quality. And if you continue past the cited paragraph of the article you linked, USED RX 570, RX 5500 XT, and 1650S cards were strongly recommended over the poor 6500 XT. It was just an awful product. Completely and unnecessarily neutered beyond its utter lack of mining performance (celebrating that lack also means celebrating the existence of a $200+ 4GB card released this decade), its real Achilles' heel was the paltry FOUR PCIe lanes it had to work with, which basically killed performance unless you had a newer platform. Like I said, for $100, maybe even $150, fine. But $200? It would have to have at least eight lanes, hardware decode, and three display outputs to even remotely justify a card that provided no generational performance boost over its predecessor! Ironically, AMD beat nVidia in setting this terrible precedent by a full generation in this instance. But I'm sure this isn't the only unfortunate time such a phenomenon has occurred (except for any GPU launches which were clearly just refreshes).

You do not seem to remember how bad the situation was. Any, and I repeat, ANY card that you can buy new and that can handle 1080p gaming is a good card. When options are limited, even crap becomes good quality.
> Ultimately, as AMD/nVidia have shown with their latest generation of GPUs, they have absolutely no issue taking a page out of Intel's 4-core-hell book from the Bulldozer years. Ironically, it now seems that Intel may be the only answer to this same phenomenon occurring in the GPU market.

To be fair, there is still some oversupply of older products, and AMD/Nvidia would have to accept big losses if they released cards that are better buys than unsold older-gen cards. Nvidia even had to pay for 5nm capacity in advance; basically, Nvidia is currently selling parts that were likely manufactured around a year ago. AMD, on the other hand, is still selling through 6700/6800 cards launched over 2.5 years ago.
> Even Techspot considered it the best value; here you go:
> The Best GPUs 2022: New & Used Graphics Cards — www.techspot.com
> You and Techspot have the same problem: you fail to see the bigger picture. No matter how "bad", "outdated" or "expensive" the 6500 XT was, it wasn't suitable for cryptomining, which meant it was widely available. And no matter how expensive it was compared to its predecessor, it still offered good value at the time it was released. By that logic, basically every GPU release since 2019 sucks, because the pricing is horrible.
> It's funny that AMD got hate when they released something that you could actually buy, while Nvidia got praise when they released something "better" (the 3050) that was nowhere to be found.
> Those who say the 6500 XT was a bad release should understand that many people had just two choices: 1. Buy a 6500 XT. 2. Don't buy anything. Taking option 1 off the table seemed to be better for some reason.

What is this consooomer mindset? It's available, thus it has value?
> You do not seem to remember how bad the situation was. Any, and I repeat, ANY card that you can buy new and that can handle 1080p gaming is a good card. When options are limited, even crap becomes good quality.

No, crap is still crap. You are convincing yourself that it doesn't smell so you can justify consooming product instead of just... not doing that.