Intel's Arc GPUs eye up the sub-$200 market, but so does... a refreshed RTX 2060?

I play Control at 1080p and it is consuming 11.5 GB.

Again, Allocated VRAM (what the game engine sets aside to use) is not Dedicated VRAM (what's actually in use).
Control at 4K, maxed out on a 3090, didn't even use that much dedicated VRAM.

You can configure Afterburner to display both, or even check Task Manager, and see what's actually being used.

I'll even post a pic of actual VRAM usage vs Allocated in Control at 4K to verify this.
[screenshot attached]

Allocated VRAM (what's set aside by the engine to use): 7514 MB
Dedicated VRAM (what's actually being used): 5910 MB
This is with everything on high settings and RT fully enabled.
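
If you'd rather log that number than eyeball an overlay, here's a minimal sketch of polling it yourself, assuming an NVIDIA card and the nvidia-ml-py (pynvml) package. Note it reports device-wide dedicated VRAM in use, the same kind of figure Afterburner/Task Manager show, not what any one engine has allocated for itself:

```python
# Minimal sketch: poll dedicated VRAM actually in use on an NVIDIA GPU.
# Assumes nvidia-ml-py is installed (pip install nvidia-ml-py).
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        used_mb = mem.used / 1024**2
        total_mb = mem.total / 1024**2
        print(f"Dedicated VRAM in use: {used_mb:.0f} MB / {total_mb:.0f} MB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```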
 
Since a refresher (or just a lesson on this) appears to be needed, spend 7 minutes and learn how to monitor both Allocated and Dedicated VRAM.
 
I'm content with my 3060 for now, but I definitely need a new CPU. Running a 6700K at stock, I get 97-100% GPU usage in most scenarios, but four cores won't scrape by much longer.
 
Whatever. I just want to buy a card that will play what I want at a price I want. Politics be darned.
I always buy a card that will play the games I want at the price I want, but I mine when not gaming and haven't paid for a card in 4 years. It pays for itself in 8 months, all expenses accounted for, although I don't game for more than 4 hours a day.
 
Good point: the only reason I can come up with is mining performance. Sure, there's *some* small list of GPU-accelerated productivity apps that can probably also take advantage of the VRAM, but let's get real: I don't think this card will have the *wink wink* "hash limiter" enabled at all, and it will just be something else Nvidia can push for ETH mining while they can.
Mining performance is not directly linked to VRAM capacity. The DAG size for ETH is around 4 GB at the moment, so a 4 GB card will mine just as fast as a 12 GB one. But as the DAG grows over time, you'd probably need 6-8 GB in the coming years; 12 GB is overkill.
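
For a rough sense of scale, here's a back-of-the-envelope sketch; the numbers are my own approximations (DAG growing ~8 MiB per 30,000-block epoch, ~13 s blocks, a starting epoch around 455), not an exact Ethash calculation:

```python
# Rough estimate of when the growing Ethash DAG outgrows a given VRAM size.
# Approximation only: real DAG sizes are trimmed to prime-sized tables.
GIB = 1024**3
MIB = 1024**2
EPOCH_DAYS = 30_000 * 13 / 86_400   # ~4.5 days per epoch at ~13 s blocks

def approx_dag_bytes(epoch: int) -> int:
    # DAG starts near 1 GiB and grows by roughly 8 MiB every epoch.
    return GIB + 8 * MIB * epoch

def epochs_until_outgrown(vram_gib: float, start_epoch: int) -> int:
    epoch = start_epoch
    while approx_dag_bytes(epoch) < vram_gib * GIB:
        epoch += 1
    return epoch - start_epoch

START_EPOCH = 455  # assumed "current" epoch for illustration (~4.5 GiB DAG)
for vram in (6, 8, 12):
    n = epochs_until_outgrown(vram, START_EPOCH)
    print(f"{vram} GiB card outgrown after ~{n} epochs (~{n * EPOCH_DAYS / 365:.1f} years)")
```

By those rough numbers, 6-8 GB covers the next few years and 12 GB is far more than mining on this class of card will ever need.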
 
And by sub-$200 you mean sub-$2,000, don't you? It's rather pointless to talk about MSRPs when actual market prices have nothing to do with them.
By sub-$2,000 do you mean sub-$1,500, I mean sub-$1,200, I mean sub-$1,400? Should we compare in dollars, yuan, or euros?

Comparing them by real-world price is meaningless because those prices change CONSTANTLY. Using MSRP makes sense in terms of "what category should this GPU be competing with", and real-world prices are covered by Techspot fairly regularly anyway.
 
Why on earth do they need to give the RTX 2060 12 GB of VRAM if they want to make it the best low-end solution???

The RTX 3060 can't take proper advantage of its 12 GB, so this is a really stupid move for a re-released RTX 2060, which is not even a 2060 Super, mind you...

I'm amazed no one is bringing up this point at all (the article just mildly grazes the issue without calling it what it is: a stupid decision).

Because 8 GB is the bare minimum, and with new games using larger textures an 8 GB 2060 is not going to get it done. If you want anything more than bare medium settings, you need more space on the card to store textures.
 
The RTX 2060 (non-Super), with 6 GB or 12 GB of VRAM, is a 1080p GPU!

Make no mistake about that: it cannot push high enough resolutions to actually use all those 12 GB of VRAM, and at 1080p, even today, you don't need more than 6 GB.

Mining could be the reason, or at least one reason (as Dimitriid said), along with BS "look, we have 12 GB" PR for fools to fall into the trap, but other than that, like I said above, it's a stupid decision that will only make the card more expensive than it needs to be, which defeats the so-called intended purpose of being the cheap option...

Nvidia logic, as per usual. Meh. 😑 But who cares, as long as Jensen is sitting on a mountain of $$$. Right, Leather Jacket Man?
I don't know what you're smoking, but my 1060 (2018 refresh) handles 1440p with ease. Yes, it's only got 6 GB, but it now includes DX12 support in the GPU instead of in the drivers. I actually wasn't aware that it supported 1440p until I tested it, and yes, it handles it fine at high settings; I just need to cut back on shadows/reflections to maintain 60 fps, since the game is frame-limited by my monitor (Asus ProArt 27-inch). But you know what, my games are all playable, so I've got little to complain about.
 
Because 8 GB is the bare minimum, and with new games using larger textures an 8 GB 2060 is not going to get it done. If you want anything more than bare medium settings, you need more space on the card to store textures.
And yet, Resizable BAR is supposed to have solved that issue, but unless you're using the latest CPU with the latest GPU you don't get it, even though it's been part of the PCIe spec for over a decade.
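
If you want to check whether Resizable BAR is actually in effect on an NVIDIA card, here's a rough sketch, again assuming nvidia-ml-py; I'm inferring it from the BAR1 aperture size (roughly the whole VRAM pool with ReBAR on, typically only a 256 MB window with it off), and the half-of-VRAM threshold is just my own heuristic:

```python
# Heuristic check: with Resizable BAR, the BAR1 aperture the CPU can map is
# roughly the size of the whole VRAM pool; without it, it's usually ~256 MB.
# Assumes nvidia-ml-py is installed (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

vram_total = pynvml.nvmlDeviceGetMemoryInfo(handle).total
bar1_total = pynvml.nvmlDeviceGetBAR1MemoryInfo(handle).bar1Total

print(f"VRAM: {vram_total / 2**20:.0f} MiB, BAR1 aperture: {bar1_total / 2**20:.0f} MiB")
print("Resizable BAR looks", "enabled" if bar1_total >= vram_total / 2 else "disabled")

pynvml.nvmlShutdown()
```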
 
I don't know what you're smoking, but my 1060 (2018 refresh) handles 1440p with ease. Yes, it's only got 6 GB, but it now includes DX12 support in the GPU instead of in the drivers. I actually wasn't aware that it supported 1440p until I tested it, and yes, it handles it fine at high settings; I just need to cut back on shadows/reflections to maintain 60 fps, since the game is frame-limited by my monitor (Asus ProArt 27-inch). But you know what, my games are all playable, so I've got little to complain about.
I'm not smoking anything; I have a GTX 1060 and a GTX 1080 (and an RX 6700 XT, but that's not the point).

The point is the GTX 1060 is practically dead for 1080p in 2021, let alone 1440p.

So what are you smoking, or what five-year-old games are you playing, that you're still so in love with it?

My GTX 1080 has started to have issues in modern games too, quite a few of them, and no, playing on medium settings is not keeping up; that's almost like playing a 2021 game with 2015 graphics.

I got my GTX 1080 last year at the start of the mining craze (for a great deal), since prices for new GPUs were already going nuts and availability was ****, and I thought going from a 1060 to a 1080 would be enough of a jump to hold me for more than a year at 1080p. And while I liked the performance jump, it still wasn't enough to play all the games I wanted at at least high settings, 1080p, 60-75 fps. For some games the GTX 1080 was enough, for others not really.

So you tell me what I'm smoking, when I actually use all these GPUs. I don't even want to get started on how big a jump the 6700 XT is over the GTX 1080, let alone the GTX 1060.

The RTX 2060 (non-Super) has essentially the same performance as a GTX 1080 unless it can use DLSS; pure raster performance is practically identical. So I know what I'm talking about.

Here, educate yourself, GTX 1080 = RTX 2060: https://www.techpowerup.com/gpu-specs/geforce-gtx-1080.c2839

The GTX 1060 is much lower, almost in the trash.
 
If I turn off two of the cylinders in my car it will only run on four, like smaller cars... Why would I cripple its power just to use less fuel?

Well put.

Considering there are CPUs that pull nearly 170 W, I'm still trying to figure out how they think that's excessive for a GPU.

Mine pulls 380 W under load, with the option to flash it to a 450 W extreme OC BIOS.
My brother's Gigabyte 3090 hits 460 W under load on the stock BIOS.

Edit: wasn't Fermi a power hog as well?
 
Kosmoz, you are talking nothing but pure, undiluted BS. Get a better system; your GPU is just a part of it. Dead? First, look at those Steam charts: the 1060 rules. Second, I'm using a 1060 with a 5600X and it runs anything (UE4, Unity, CryEngine) at 1080p 60 fps with ease, and if I want to go "ultra" all I have to do is enable FSR Ultra Quality and it crushes games like a boss. Thanks AMD! (And the Lossless Scaling app.) AND IT DOES THIS AT 95 W.
 
I'm not smoking anything; I have a GTX 1060 and a GTX 1080 (and an RX 6700 XT, but that's not the point).

The point is the GTX 1060 is practically dead for 1080p in 2021, let alone 1440p.

So what are you smoking, or what five-year-old games are you playing, that you're still so in love with it?

My GTX 1080 has started to have issues in modern games too, quite a few of them, and no, playing on medium settings is not keeping up; that's almost like playing a 2021 game with 2015 graphics.

I got my GTX 1080 last year at the start of the mining craze (for a great deal), since prices for new GPUs were already going nuts and availability was ****, and I thought going from a 1060 to a 1080 would be enough of a jump to hold me for more than a year at 1080p. And while I liked the performance jump, it still wasn't enough to play all the games I wanted at at least high settings, 1080p, 60-75 fps. For some games the GTX 1080 was enough, for others not really.

So you tell me what I'm smoking, when I actually use all these GPUs. I don't even want to get started on how big a jump the 6700 XT is over the GTX 1080, let alone the GTX 1060.

The RTX 2060 (non-Super) has essentially the same performance as a GTX 1080 unless it can use DLSS; pure raster performance is practically identical. So I know what I'm talking about.

Here, educate yourself, GTX 1080 = RTX 2060: https://www.techpowerup.com/gpu-specs/geforce-gtx-1080.c2839

The GTX 1060 is much lower, almost in the trash.
From a gaming standpoint you are correct, but from my business standpoint you're absolutely wrong, because, just like every other business, I have to justify the cost of any upgrades, and the 1060 fills those business needs quite well. Most of the various benchmarks do not impress me, and I don't make hardware upgrade decisions based on them unless it makes sense from a business standpoint. Things like improving my render times in Blender or Photoshop do make a difference, but gaming is secondary.

One of the biggest issues I've faced since I bought my first 1060 (the 3 GB model) was cost. The only reason the 3 GB 1060 got replaced was death: it died on me, so I had to buy a new card, and the 6 GB was a bit cheaper than the 3 GB at the time and meant I didn't have to futz with drivers to get it working.

The stimulus checks offered me a great opportunity from a business standpoint, since the money wasn't coming out of my pocket. That meant I could upgrade to an R5 3600XT and a new 1440p Asus 27-inch ProArt (limited to 60 FPS), and add my Wacom One graphics tablet plus a DataColor X11 Calibration Spyder to improve my business efficiency. Right now I'm regretting spending the money on the Radeon RX 5600 XT I purchased, but I'm not going to sell it yet, as it provides a slightly better gaming experience even though I'm still using the 1060. The main issue is that the card is a triple-slot design that overlaps my second x16 slot (used for an LSI HBA controller card connected to a number of 10K/15K drives for scratch disks; they're damn near as fast as SSDs and well supported by my various graphics tools).
 
Kosmoz you are a talking nothing but pure undiluted bs. get a better system. your gpu is just a part of it. dead? first, look at those steam charts, 1060 rules, . 2 im using a 1060 w a 5600x and it runs -anything- UE4, unity, cry engine at 1080p 60fps with ease and if I want to go " ultra" all I have to do is enable FSR ultra q and it crushes games like a boss. thanks amd! ( and losses scaling app) AND IT DOES THIS AT 95W.
How's the world view through those rose-tinted glasses?

I have a Ryzen 3600 OC'd to 4.4 GHz, DDR4-3600 CL16, and SSDs, so I don't think the difference between my system and your 5600X is that big, considering you have a GTX 1060. Your CPU is not even fully utilized with that GPU, so my 3600 OC is not an issue here at all. You are GPU-bound in every game, at every resolution, at every setting...

I've seen how games run on my system with the GTX 1060, GTX 1080, and RX 6700 XT at every combination of settings, and the GTX 1060 is a dead GPU for AAA gaming in 2021 unless you play on medium settings (and even that doesn't fix the issue in all games).

My own experience plus hundreds of FPS tests on the net are proof of what I just said.

I don't have any more time to waste proving you wrong, so go ahead and believe what you want; I don't care. Keep those rose-tinted glasses on if they work for you; who am I to convince you otherwise? As long as you are happy, it's all good. Just don't present your rose-tinted vision of reality as the real world and the facts.
 