Intel rumored to unveil its discrete GPU at CES 2019

midian182


We know that Intel is working on a new discrete GPU, but what we don’t know is when it will arrive. According to Anthony Garreffa at TweakTown, however, the company’s big unveiling could take place at CES 2019.

Garreffa’s industry sources say Intel has reached the end of one of the “big steps” related to the manufacturing of the GPU and is now preparing for launch. If it is unveiled at next year’s Consumer Electronics Show, then the card could release a few months later. This launch date would be a lot sooner than expected, so it’s best to take the report with a pinch of salt.

Back in November, Raja Koduri, the head of the Radeon Technologies Group at AMD, left the company to become Intel’s GPU chief architect, where he heads its newly formed Core and Visual Computing Group. A couple of weeks ago, AMD’s former director of global product marketing, Chris Hook, also joined Intel as its “discrete graphics guy,” a role that will see him lead the company’s GPU marketing push.

Intel has said it will focus on "high-end discrete graphics solutions for a broad range of computing segments" for the PC market, which sounds like it could include gaming.

Last February saw details of Intel’s first prototype discrete GPU revealed, though this was just a proof-of-concept and not a future product.

Intel’s GPU is reportedly codenamed Arctic Sound. According to Ashraf Eassa of The Motley Fool, it was originally targeted at video streaming apps in data centers, but a gaming variant will arrive at some point.

We still have no idea just how powerful Intel’s GPUs might be, but the idea that it could go up against Nvidia and AMD in the gaming sector is an interesting one. While a reveal next January is possible, expect to learn more about the cards before then.


 
I predict it's gonna be 2 times faster than the competition, on paper, during the announcement, and slower than the competition when the product is launched, followed by a huge write-off and the stock plunging down.

So many times Intel has tried jumping on the bandwagon, always landing on its a$$ and in tears. Let's watch another remake of it.
 
You mean "discrete", not "discreet"

Right now this is pure speculation on everyone's side. Raja will not have had a chance to create a GPU design from scratch (that takes more than a few months) and considering the GPU was aimed at datacenter, I'm pretty sure this will *never* be a decent gaming card. It doesn't make sense. A microarchitecture which is aimed at datacenter has a completely different set of features than something for gaming... :-/
 
Intel will enter with low to mid-range cards. This is easy for them and will cause AMD & Nvidia to step up in those ranges. What I see happening is that the current mid-range becomes the new low end, putting pressure on Intel. The question is what will happen above that. It seems the top tier needs either a huge leap or maybe just tweaking at real cost. Gamers are the core of the top tier, but the real money is in the upper-middle and middle ranges.
 
"computing segments" - It does't sound like "gaming" to me. This might be just for compute workloads (aka servers), something that Intel has tried to do before.
AMD and Nvidia have most of the big technology patents and I doubt Intel can create something without entering in an billion dollar agreement with one of the two (mainly AMD)
 
Competition is always welcome. Even if it doesn't live up to all its promises, it will definitely put the competitors on notice that there's another alternative for buyers, and as such the consumer will be the ultimate winner...
 
I remember the flop that was Larrabee. But Intel has without question gotten a lot better at making GPUs: look at their integrated X4500 HD from 2009 and compare that to what's built into the 15-watt 7th and 8th gen chips, and you gotta give them credit. If they are simply taking that architecture and adding a lot more slices to make a mid-range or high-end card, it could be interesting. It's proven efficient, and they designed it to be scalable at the architectural level so they can offer different integrated tiers. But this is sounding more like a compute chip and not a gaming-focused or everyday-usage GPU.
 
This might reflect some things happening behind the scenes that haven't come to light yet. Intel is licensing AMD tech for higher-performance integrated graphics now. After that agreement was reached, Intel hired a couple of AMD people without AMD complaining. It's possible that AMD and Intel have a non-publicised agreement wherein they are now sharing development costs of specific technologies, with both companies benefiting.
 
So many times Intel has tried jumping on the bandwagon, always landing on its a$$ and in tears. Let's watch another remake of it.

Unless we limit it to discrete, Intel already have the lion's share of the GPU market. Most of the pieces are already in place - a brand, an architecture, a manufacturing process and a driver development team. Intel are well positioned to compete for gamer share; they just need to scale it up.

From a consumer perspective, competition should be welcomed and encouraged.
 
You mean "discrete", not "discreet"

Right now this is pure speculation on everyone's side. Raja will not have had a chance to create a GPU design from scratch (that takes more than a few months) and considering the GPU was aimed at datacenter, I'm pretty sure this will *never* be a decent gaming card. It doesn't make sense. A microarchitecture which is aimed at datacenter has a completely different set of features than something for gaming... :-/

This is what I was thinking. If true, Raja likely didn't have anything to do with the architecture. Not that it would be entirely surprising; Raja Koduri has IMO always been a better driver guy. This is still about a year earlier than I expected them to announce a card.
 
I predict it's gonna be 2 times faster than the competition, on paper, during the announcement, and slower than the competition when the product is launched, followed by a huge write-off and the stock plunging down.

So many times Intel has tried jumping on the bandwagon, always landing on its a$$ and in tears. Let's watch another remake of it.

They might make it, but only because that sell-out rat Raja spent plenty of time understanding AMD GPUs, then, once he'd collected what he could, jumped on the Intel train. Intel may do well with the hardware but not the software side. Making GPUs means constant driver work for new games, and if you're off on performance for one generation you will sink very fast in the GPU market.
 
Unless we limit it to discrete, Intel already have the lion's share of the GPU market. Most of the pieces are already in place - a brand, an architecture, a manufacturing process and a driver development team. Intel are well positioned to compete for gamer share; they just need to scale it up.

From a consumer perspective, competition should be welcomed and encouraged.

If only you had the insight to realize that it (having a third graphics competitor in Intel) is only a pipe dream, you wouldn't have bothered writing the first paragraph; but I can't comment further on it. The best I can say, in the case of Intel, is that history does indicate a trend, and it is about to be repeated. This won't be a case of "only time will tell"; it is a case of "I'll tell you [retroactively] in time". There's too much confidential information; writing a simple sentence with a fact would be like walking through a minefield.
 
I wouldn't scoff at Intel so hard. As of yet we do not know what market segments they're aiming at or what kind of GPU they'll bring to the table:

All we know is this: when it comes to iGPUs embedded in CPUs, Intel is absolutely huge in the market. I won't dare to guess how many laptops, barebone systems, HTPCs and plain office/desktop PCs are using Intel Graphics right now. Even tablets with Atom processors.

I think it's downright amazing what Intel can push from an iGPU in a 4.5-watt CPU...

Hell, even my own old Intel NUC with an i3-3217U is giving me 1080p 60fps video without breaking a sweat: 5-6% on the CPU cores and 6-8% on the GPU cores...

Intel's graphics drivers, while maybe not the most complex, are stable, relatively small compared to GeForce drivers, and do their job in the background very well.

We have a leftover HP laptop with an i5-4200U CPU. At 1366x768 it runs Heroes of Might and Magic V at 60+ frames in medium detail, and the fan remains relatively quiet. So for games found on FB, Big Fish and the like, office use, and watching video, not much can beat Intel's iGPUs in terms of power efficiency and bang-for-the-buck value, as you haven't paid a single cent for a discrete graphics unit.

My speculation, and that's what it is, pure speculation, is that Intel will not target high-end gamers. Nvidia and AMD have that market in hand. But let's say they are able to produce a graphics card able to match a 1050 Ti, but at a lower retail price. Using their existing driver lineup, I think that would be an immediate success.

Yes, I too have an Nvidia graphics card, but god how I hate their complicated driver. Why the hell must I install driver + PhysX + Vulkan + Experience and god knows what, taking up about a gigabyte on my hard drive, if Intel's preinstalled standard driver can give me respectable performance at a great price? And the low to mid-range market is where the majority of PC owners across the world are placed.

Dedicated gamers are outnumbered by ordinary everyday users, the quiet masses, so I suspect that that is where Intel wants to place itself. Maximize cash flow, create revenue.

That's my guess anyway...
 