Intel's Arc GPUs eye up the sub-$200 market, but so does... a refreshed RTX 2060?

Molematt

Why it matters: By now, it's no secret that Intel's upcoming Arc Alchemist GPUs won't match AMD or Nvidia's best cards, and will likely be entirely outclassed by next-generation products coming next year. However, it's also becoming clear that they'll be almost uncontested in the low-end market... but according to another leak, Nvidia is readying a refreshed RTX 2060 to fill that gap before Alchemist launches.

An alleged leaked slide posted on Baidu once again reiterates that Arc Alchemist intends to leave the top "Enthusiast+" class of graphics cards — the RTX 3080 and RX 6800 XT, as well as their higher-end halo-product cousins — well enough alone. Instead, the upper portion of the Alchemist lineup, still referred to here by its DG2 moniker, covers the upper-midrange market segment, while the lower end "SOC2"-derived GPUs target the clear gap in recent releases below the $300 price point.

It's rather surprising that the Alchemist product line is set to be derived from only two families of GPU silicon. By contrast, AMD's Navi lineup has three silicon variants, while Nvidia Ampere is based on four, with the latter leading to the awkward performance gap between the GA104-derived RTX 3060 Ti and the GA106-derived RTX 3060.

Competition seems to be on the way, too: according to insiders cited by VideoCardz, Nvidia is readying the RTX 2060 for another outing. The Turing-based GPU will see its VRAM capacity bumped to 12 GB, but otherwise it is reportedly the same GPU that launched back in January 2019, rather than a higher-powered variant like the Super series.

Although it would lack the glamour of being the latest-and-greatest generation of graphics cards, a re-released RTX 2060 could help fill out the low-end GPU market if priced appropriately, as in laptops it's been shown to outpace the RTX 3050 Ti. However, GDDR6 prices have exploded with the current GPU and console generation, which may pour cold water on the prospects of a GPU with 12 GB of it going for cheap.

According to the report, the card could reach market by the tail end of 2021 or January 2022, which would be just in time to pre-empt Alchemist GPUs moving into that same low-end market. It's perhaps a little optimistic for Intel to still consider the $150-199 class "Mainstream+" (that hasn't really been true since the GTX 1060 3 GB and RX 570, way back when), but more competition in that price range is always welcome.


 
Hey, after the 3050/Ti flopped in the laptop market (well, in reviews and perception; there are still plenty of laptops with it, more than any other laptop GPU, even), it wouldn't be such a bad idea if they released the 2060 as an x050 product for the desktop.

Sadly, this generation has been so bad that whoever (Intel, Nvidia, or AMD) can release a sub-$300 card that can be kept in stock will overwhelmingly win that market, which happens to be by far the most popular one anyway.

AMD can't do it because they just don't have the TSMC allocation to support it on top of all their other products, and Intel can't compete in either feature set or driver optimization, at least not out of the gate, so that leaves only the damned Nvidia. Yet they're selling enough of the more expensive cards that they won't release anything sub-$300 until *someone* puts out a competing product.
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)
 
The RTX 2060 was already relaunched, wasn't it? You can buy them easily in the UK for about £350.

Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)
Look up how memory is used on GPUs. In most cases the speed of the memory is a bigger performance barrier than how much you actually have. But the amount you have and how it is connected affects the speed. So it's probably that the 12 GB config gives the 2060 more bandwidth than the 6 GB config, which would also explain why the 3060 Ti is faster and will play more future titles than a 3060 despite having 4 GB less total memory.

It's very difficult to know how much memory games need, as they will use up more than they actually require. Seeing a card using 8 GB, for example, does not mean it needs that.
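For what it's worth, the peak-bandwidth arithmetic is easy to check yourself. A quick sketch, using the bus widths and GDDR6 data rates from publicly available spec sheets (assumed figures, not from the article above); note that capacity doesn't appear in the formula at all, only bus width and data rate:

```python
# Peak memory bandwidth = (bus width in bytes) * (effective data rate per pin)
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

# Bus widths and data rates as commonly published (assumptions to verify):
cards = {
    "RTX 2060 (192-bit, 14 Gbps GDDR6)": (192, 14.0),
    "RTX 3060 (192-bit, 15 Gbps GDDR6)": (192, 15.0),
    "RTX 3060 Ti (256-bit, 14 Gbps GDDR6)": (256, 14.0),
}
for name, (bus, rate) in cards.items():
    print(f"{name}: {peak_bandwidth_gbps(bus, rate):.0f} GB/s")
```

On those figures the 3060 Ti's wider 256-bit bus, not its capacity, is what gives it the bandwidth edge over the 3060.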
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)

Good point: the only reason I can come up with is mining performance. Sure, there's *some* small list of accelerated productivity apps that can probably also take advantage of the VRAM, but let's get real: I don't think this card will have the *wink wink* "hash limiter" enabled at all, and it will just be something else Nvidia can push for ETH mining while they can.
 
Well, an RTX 2060 sells for about 550 EUR new right now in Greece and about 450 EUR used on eBay.de.

A GTX 1080 is about the same money used.

So, say bye-bye to your $200 GPU.
 
And by sub-$200 you mean sub-$2000, don't you? It's rather pointless to talk about MSRPs when actual market prices have nothing to do with them.
Yes, and tbh if TechSpot keeps getting suckered by low "MSRPs" they will lose all the respect I have for them.
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)
Simple. Because 6 GB is no longer good enough and this is the only other configuration.

Plus marketing.
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)

Having just upgraded to a 3080 from a 2060 (6 GB model): sure, it's doubtful that it'll take full advantage of the 12 GB of VRAM, but it is limited with the base 6 GB.
Since it's a base 2060, the only option for adding more RAM is simply to double it; otherwise they'd have to use one of the GPUs from the other 2060 variants to match their memory bus/VRAM specs.
The base 2060 uses a different GPU than the 2060 Super.
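The jump straight from 6 GB to 12 GB follows from the memory bus: a 192-bit bus feeds six 32-bit GDDR6 channels, and GDDR6 chips commonly come in 8 Gb (1 GB) or 16 Gb (2 GB) densities. A minimal sketch of that constraint (assuming one chip per channel and no clamshell mode; the bus widths in the comments are the publicly listed specs):

```python
def vram_options_gb(bus_width_bits: int, chip_densities_gb=(1, 2)) -> list:
    """Possible VRAM sizes for a given bus, assuming one 32-bit GDDR6
    chip per channel and the common 1 GB / 2 GB chip densities."""
    channels = bus_width_bits // 32  # each GDDR6 chip has a 32-bit interface
    return [channels * density for density in chip_densities_gb]

print(vram_options_gb(192))  # 192-bit bus (RTX 2060 / 3060) -> [6, 12]
print(vram_options_gb(256))  # 256-bit bus (RTX 3060 Ti)     -> [8, 16]
```

So with the base 2060's 192-bit interface, 6 GB and 12 GB are the natural options without redesigning the board around a different chip, and bandwidth is unchanged because the bus width and data rate stay the same.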
 
The RTX 2060 was already relaunched, wasn't it? You can buy them easily in the UK for about £350.


Look up how memory is used on GPUs. In most cases the speed of the memory is a bigger performance barrier than how much you actually have. But the amount you have and how it is connected affects the speed. So it's probably that the 12 GB config gives the 2060 more bandwidth than the 6 GB config, which would also explain why the 3060 Ti is faster and will play more future titles than a 3060 despite having 4 GB less total memory.

It's very difficult to know how much memory games need, as they will use up more than they actually require. Seeing a card using 8 GB, for example, does not mean it needs that.

It'll be the same memory bandwidth as the base-model 2060.

Also, allocated VRAM is not dedicated VRAM, which is the amount actually being used at any given moment.
The game engine will allocate as much as it can to itself, but that doesn't mean all of it is being used.
 
Shame they'll probably be sticking with the same 180 W power limit on that 2060 refresh; that's really going to be what holds this GPU back from its full potential.

My old one was always hitting the power limit while in use.
 
The RTX 2060 (non-Super), with 6 GB or 12 GB of VRAM, is a 1080p GPU!

Make no mistake about that: it cannot push high enough resolutions to actually use all those 12 GB of VRAM, and at 1080p even today you don't need more than 6 GB.

Mining could be one reason (as Dimitriid said), along with BS PR ("look, we have 12 GB") for fools to fall into the trap, but other than that, like I said above, it's a stupid decision and it will only make the card more expensive than it needs to be, which defeats the so-called intended purpose of being the cheap option...

As per usual, Nvidia logic. Meh. 😑 But who cares as long as Jensen is sitting on a mountain of $$$. Right, Leather Jacket Man?
 
The 2060 is just a 1060 with RT, but it has **** RT and a way higher TDP, so it's a pointless GPU.

All the results and graphs show that the RTX 2060 has performance similar to a GTX 1080. It's not a pointless GPU, it's just pointless for RT; the tensor cores make it a viable option for gamers who want to play at 1440p using DLSS without having to buy a more expensive card.
 
Any GPU that actually shows up under $200 with anywhere near RTX 2060 performance wins the market, because believe it or not, there are places in the developed world where nothing is available under $250 other than a GT 1030, or an RX 550 4 GB at best.
The PS5 is available at maybe 25-30% over MSRP, so if you have to, you can buy it, but PC gaming is M.I.A.
I suppose it's bad in the short term, but it may be good in the long term, as gaming accessories are becoming dirt cheap and PC game devs will have to start optimizing their games to get any sales.
 
Well, the GTX 1050 Ti is selling for $280.00 ATM, so there's a long way to go and a lot of scalpers to suppress before you'll ever see a 2060 under a deuce (which, BTW, is where it should be).

I have a 4 GB 1050 Ti, and it cost me $140.00 after rebate. I'm certainly not gullible enough, stupid enough, or desperate enough to pay double for another.
 
The RTX 2060 is for running 1080p, lighter 1440p titles, or 1080p upscaled to 1440p with DLSS as stated, so it does not need 12 GB.
 
As per usual, Nvidia logic. Meh. 😑 But who cares as long as Jensen is sitting on a mountain of $$$. Right, Leather Jacket Man?
Mining doesn't need this amount of memory, and if that were the purpose they would just put these GPUs on CMP cards and sell them for more.

This is clearly just about giving OEMs something to market, like the 3060 12 GB. And you know what, in this market it'll sell great anyway.

It still beats the 6600 XT in RT, too.
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take advantage of its 12 GB well enough, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all. (The article just mildly grazes the issue, without calling it what it is: a stupid decision.)
I play Control at 1080p and it consumes 11.5 GB.
 
There's no way this is ever going to happen when you can only find 1660 GDDR5 cards for $500.

If it packs newer memory, it's never going to exist!
 