Intel is reportedly planning to unveil its dedicated GPU at a conference in December

Polycount

TS Evangelist
Staff member

The GPU, codenamed "Arctic Sound," will be unveiled by Intel at an as-yet-unnamed conference in December, according to DigiTimes. Though the conference will also serve as an avenue for Intel to discuss its other ambitions, Arctic Sound will undoubtedly take center stage.

Not much is known about the GPU at the moment, but it will reportedly act as the company's latest concrete attempt to break into the high-end PC gaming market (outside of CPUs), while also potentially allowing it to obtain a stronger foothold in the artificial intelligence industry.

As exciting as the prospect of finally feasting our eyes on Intel's upcoming GPU sounds, there are a couple of things we should note.

First, even if DigiTimes' reporting is accurate, we don't know whether Intel's conference will be public or private. Second, none of this information has been officially confirmed by Intel yet, so it may be wise to take the report with a grain of salt for now.

Either way, with December just around the corner, we won't have to wait long to find out the truth. If Intel doesn't announce its GPU this year, don't fret too much: there's always CES 2019.


 

QuantumPhysics

TS Evangelist
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
 

BigBoomBoom

TS Booster
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
More like targeting 1060 performance, if that; good joke thinking they can match the 2080 Ti. And Intel's CPU dominance is already fading, courtesy of their failure at 10nm (now planned for 2019/2020, whereas the competition is already well into 7nm, which is roughly equivalent to Intel's 10nm).

Intel's CPUs have been standing still for a long time now: every Lake is essentially the same as Skylake, just adding more cores because of pressure from AMD. Colour me surprised when there's zero difference in single-thread performance from Skylake through to whatever Lake they're releasing.
 

Panda218

TS Evangelist
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
More like targeting 1060 performance, if that; good joke thinking they can match the 2080 Ti. And Intel's CPU dominance is already fading, courtesy of their failure at 10nm (now planned for 2019/2020, whereas the competition is already well into 7nm, which is roughly equivalent to Intel's 10nm).

Intel's CPUs have been standing still for a long time now: every Lake is essentially the same as Skylake, just adding more cores because of pressure from AMD. Colour me surprised when there's zero difference in single-thread performance from Skylake through to whatever Lake they're releasing.
I don't buy that for a second, but then again, all I see is gamers coming in to buy chips. If you're building a workstation with multicore-aware apps, I'd go with AMD, but that's about it.
 

BigBoomBoom

TS Booster
You don't buy what for a second? Even the latest Battlefield V CPU benchmark shows the i9-9900K barely better than the i7-7700K in DX12, despite double the core and thread counts, and most of the higher performance comes from higher clocks; my i7-6700K produces similar results at the same clock rate. Intel has been standing still at the Broadwell level, except Broadwell wasn't mainstream, so Skylake was the first mainstream 14nm part. Enjoy your upcoming Comet Lake, which is another 14nm.

Now AMD is actually releasing a 7nm CPU in a couple of months, and that is a huge stride forward; an optical shrink always is. Sure, Intel's 10nm single-core performance may be better, but that's been postponed forever; maybe they'll release it at the end of 2019 or in early 2020, maybe not. 7nm is already here in the form of SoCs and GPUs; 7nm is real, and at this stage 10nm is still experimental.

Intel is standing still because if a gamer already has an i7-6700K or i7-7700K, or heck, even an i7-4790K, they have zero reason to upgrade for gaming. How is that moving forward?
 

QuantumPhysics

TS Evangelist
Intel's CPUs have been standing still for a long time now,

AMD Threadrippers are good workstation CPUs, but terrible for gaming compared to the lowly 8700K and the new 9900K.

Intel is just fine. And they'll be even better later on.

And considering the 2080 Ti is having so many problems and may not even be fully supported by developers down the road, I'd say Intel could easily build a competitor.
 

QuantumPhysics

TS Evangelist
You don't buy what for a second? Even the latest Battlefield V CPU benchmark shows the i9-9900K barely better than the i7-7700K in DX12, despite double the core and thread counts, and most of the higher performance comes from higher clocks; my i7-6700K produces similar results at the same clock rate.

I have an older i7-5960X and a newer i9-7980XE.

I would be lying if I told you I see a difference in games regardless of which desktop my 2080 Ti is in.

Most games aren't optimized to make use of more than 6 cores.
 

Manrubio

TS Rookie
You don't buy what for a second? Even the latest Battlefield V CPU benchmark shows the i9-9900K barely better than the i7-7700K in DX12, despite double the core and thread counts, and most of the higher performance comes from higher clocks; my i7-6700K produces similar results at the same clock rate. Intel has been standing still at the Broadwell level, except Broadwell wasn't mainstream, so Skylake was the first mainstream 14nm part. Enjoy your upcoming Comet Lake, which is another 14nm.

Now AMD is actually releasing a 7nm CPU in a couple of months, and that is a huge stride forward; an optical shrink always is. Sure, Intel's 10nm single-core performance may be better, but that's been postponed forever; maybe they'll release it at the end of 2019 or in early 2020, maybe not. 7nm is already here in the form of SoCs and GPUs; 7nm is real, and at this stage 10nm is still experimental.

Intel is standing still because if a gamer already has an i7-6700K or i7-7700K, or heck, even an i7-4790K, they have zero reason to upgrade for gaming. How is that moving forward?

Yeah, tell 'em, BigBoomBoom!
 

BigBoomBoom

TS Booster
I have an older i7-5960X and a newer i9-7980XE.

I would be lying if I told you I see a difference in games regardless of which desktop my 2080 Ti is in.

Most games aren't optimized to make use of more than 6 cores.
News flash: they are both 14nm. Broadwell and Skylake-X are the same architecture, and single-core performance is the same. That is why Intel is standing still: while AMD is catching up on single-core performance, Intel has made no progress there since 2015. Zen 2 is guaranteed progress on single-core thanks to 7nm; Comet Lake is just another Broadwell/Skylake 14nm part.

And keep dreaming about catching up to the 2080 Ti. NVIDIA spends as much as Intel on R&D, but Intel's R&D budget includes CPUs, whereas NVIDIA only produces GPUs. Raja also leads Intel's GPU division, so Intel's GPU is likely a GCN-style architecture, and we have seen how well that has done lately: it can't even catch up to the Titan X(P) of 2016.

We have been waiting for Intel to do better on mainstream since Skylake, but unfortunately we keep waiting. Until they can make 10nm work, they can't advance.
 

NoVideoMemory

TS Rookie
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
Since when did the 2060 become low end?
 

Evernessince

地獄らしい人間動物園
Intel's CPUs have been standing still for a long time now,

AMD Threadrippers are good workstation CPUs, but terrible for gaming compared to the lowly 8700K and the new 9900K.

Intel is just fine. And they'll be even better later on.

And considering the 2080 Ti is having so many problems and may not even be fully supported by developers down the road, I'd say Intel could easily build a competitor.
The 8700K is just as good as the 9900K in every game; it's by no standard "lowly". Ditto goes for the whole Ryzen 2000 series, which is within 10% of its Intel counterparts in gaming. Not that it matters for the vast majority of people, as it requires a 1080 Ti or higher to even see the performance difference. There's no point in recommending a more expensive Intel processor to someone who is going to pair it with an RX 580 or GTX 1060.

Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
Since when did the 2060 become low end?
The xx60 model used to be the low-end/mainstream bridge card, but lately, with the prices it's been selling at, it's definitely mainstream now. $250+ definitely isn't low-end pricing, and that's assuming Nvidia doesn't increase pricing again like they did with the 2080 Ti, 2080, and 2070.
 

CortyDK

My GT 1030 works just fine in gaming... though it's just Solitaire and old DOS games in DOSBox... ;-)

Performance is weighted only by what you need yourself, not by what others think...

Some need 2080 Tis; others game just fine on Intel iGPUs...
 

Ravey

TS Addict
Bottom line: having a third competitor join the GPU market to go up against AMD and Nvidia can only be a good thing.

More competition generally means more aggressive pricing and will hopefully mean greater savings for the consumer.
 

Evernessince

地獄らしい人間動物園
I think it won't be for desktop users. I think it will be like Nvidia's Tesla series: a GPU for workstation loads.
Workstation GPUs and consumer GPUs are 97% the same thing, the difference for the most part being drivers and FP performance. You'd have to be crazy not to launch an entire product portfolio, simply because it allows you to use the lower bins and partially defective dies. The investment is minimal, as you already have your GPU designed and made.

In essence, whether Intel releases a consumer product or a workstation/professional card first, they will likely release cards for the other market segment at some point regardless.

The only way they don't release a consumer product is if they created a GPU so focused on certain workloads that it fails at gaming, at which point it isn't really a GPU and becomes more of an accelerator.
 

Kn0xx

TS Rookie
My bet is that these GPUs will take advantage of the iGPU in Intel CPUs, something like an SLI/CrossFire setup.
If that happens, Intel can have the "edge" of a GPU marketing and technical "combo" system.
Let's see what happens.
 

Darth Shiv

TS Evangelist
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
They've never been serious about GPUs before. Their graphics driver division is epically incompetent, delivering driver quality with a reputation similar to Creative Audio's. It took Apple giving them a kick up the backside to make somewhat better use of their iGPUs. I'm not holding my breath; tbh, I'm expecting it to take years for them to catch up to gaming GPUs.
 

DeanLO

TS Rookie
Will Intel pull a rabbit out of their hat and WOW us with something amazing that blows AMD/Nvidia away? Will this be Larrabee resurrected?
Hmmm... let's take a moment to remember the i740.
 

amstech

IT Overlord
You don't buy what for a second? Even the latest Battlefield V CPU benchmark shows the i9-9900K barely better than the i7-7700K in DX12,
Don't use gaming performance to make a case for or against the 9900K; it has no relevance whatsoever. While newer titles do utilize more threads and CPU power, a 6-core Intel is really all you need.
 

kira setsu

TS Maniac
For some reason, Intel jumping into the GPU market doesn't give me hope. Intel has always seemed high-end; I see them wanting to go up against Nvidia, and not in a good way price-wise. I doubt they would release a lower-priced product; it would seem beneath them.
 

Raytrace3D

TS Addict
You don't buy what for a second? Even the latest Battlefield V CPU benchmark shows the i9-9900K barely better than the i7-7700K in DX12, despite double the core and thread counts, and most of the higher performance comes from higher clocks; my i7-6700K produces similar results at the same clock rate.

I have an older i7-5960X and a newer i9-7980XE.

I would be lying if I told you I see a difference in games regardless of which desktop my 2080 Ti is in.

Most games aren't optimized to make use of more than 6 cores.
Agreed. I upgraded from an i7-5930K (6-core) to a Threadripper 2950X (16-core), and frame rates in games are still 90+ fps at 3440x1440; it's all about the GPU. If you have a relatively modern CPU, you're fine. Just get an epic GPU and you're good for gaming. Granted, the CPU isn't "awesome" for gaming, but a perfect framerate is still a perfect framerate. My thinking is that as games become more core-focused, my 2950X will become better at those tasks rather than the "bottleneck" (hopefully).
 

ZipperBoi

TS Enthusiast
Intel makes the best-selling, most powerful line of CPUs in the world. I'm hoping they can give us a three-card lineup that outperforms the 2060 (low end), 2080 (high end) and 2080 Ti (enthusiast), designed to pair perfectly with the Core i7 and Core i9.

Of course, it wouldn't be a bad thing for them to target the low-end gamer so that every computer rolling out of Walmart with an i3 or i5 is gamer-ready.
That last part would be cool. I built my own PC, but I know several people who would like to get into PC gaming and don't want to build one because they don't have the money or the knowledge. I would shift away from Nvidia if Intel had quality cards that meshed well with their i-series.
 

Evernessince

地獄らしい人間動物園
My bet is that these GPUs will take advantage of the iGPU in Intel CPUs, something like an SLI/CrossFire setup.
If that happens, Intel can have the "edge" of a GPU marketing and technical "combo" system.
Let's see what happens.
Intel has had that for a while now; it's called Lucid Virtu. The only problem is that it's not worthwhile for anything but the lowest-end graphics cards. The big problem with using both the iGPU and dGPU at the same time is the difference in speed between the two. The computer can send 3D work to both, but often the dGPU completes its work so much faster that the frames the iGPU was supposed to deliver end up arriving much later, making the entire system wait before the frames can be sent to the monitor.

In the end, the iGPU doesn't make the system faster, and it does introduce issues like micro-stuttering.
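The frame-pacing problem described above can be sketched with a toy simulation. This is a minimal illustration using made-up render times (8 ms for a fast dGPU, 40 ms for a slow iGPU), not a model of any real driver: with alternate-frame rendering, frames must still be presented in order, so the fast GPU's frames pile up behind the slow GPU's, and the gaps between displayed frames lurch between near-zero and very long.

```python
# Toy alternate-frame-rendering (AFR) sketch with hypothetical render
# times: even frames go to a fast dGPU, odd frames to a slow iGPU.

def afr_frame_intervals(n, fast_ms=8.0, slow_ms=40.0):
    fast_done = slow_done = 0.0
    finish = []
    for i in range(n):
        if i % 2 == 0:            # dGPU renders even frames back to back
            fast_done += fast_ms
            finish.append(fast_done)
        else:                     # iGPU renders odd frames back to back
            slow_done += slow_ms
            finish.append(slow_done)
    # A frame cannot be displayed before the frame that precedes it.
    shown, last = [], 0.0
    for t in finish:
        last = max(last, t)
        shown.append(last)
    # Gaps between consecutive displayed frames = perceived frame pacing.
    return [b - a for a, b in zip([0.0] + shown, shown)]

print(afr_frame_intervals(6))  # [8.0, 32.0, 0.0, 40.0, 0.0, 40.0]
```

Six frames in 120 ms looks like 50 fps on paper, but half of them arrive in bursts (the 0 ms gaps), so the perceived cadence is no smoother than the roughly 25 fps the slow GPU delivers alone: exactly the micro-stutter pattern the post describes.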
 

Dimitrios

TS Guru
Good job, you POS traitor Raja: work for AMD for a short time in their GPU department, then bounce out of the blue, and months later Intel has a working GPU.