Intel officially confirms discrete graphics card coming in 2020

Greg S

The big picture: Intel has confirmed that its first true dedicated graphics card will be launching in 2020. This could be the start of a new round of competition between AMD, Nvidia, and Intel, ultimately giving enthusiasts more options to choose from.

One of the unfortunate market conditions hardware enthusiasts face is a lack of choice for some core components. Intel and AMD now have healthy competition in the CPU market, but Nvidia very clearly dominates AMD in consumer graphics.

Intel has now confirmed that it will be joining the competition for discrete graphics solutions.

Over the past year, rumors have swirled about when Intel would bring a GPU to market. Some leaks pointed toward CES 2019. Although a reveal there could still happen, we now know that Intel is aiming for a 2020 launch of its discrete graphics products.

Although Intel is keeping details of its upcoming GPU project scarce, past prototypes suggest there could be more than just another mediocre GPU entering the fray. Then again, given that Intel has been selectively poaching from AMD's graphics department, Nvidia may not be too concerned about a third GPU maker.

According to Ashraf Eassa of The Motley Fool, Intel has split its dedicated graphics project, code-named Arctic Sound, into two products: one focused on video streaming within data centers and another aimed at gaming. Jupiter Sound is another codename Intel has used. At this time there is no information on architecture, process technology, or any other hardware details that could be used to estimate performance.

No matter where consumer loyalty lies, more competition tends to deliver better hardware sooner. Next-generation graphics cards may carry high launch prices, but a healthy round of competition could help keep sticker shock to a minimum.


 
I don't think "enthusiasts" will be interested in this card. Even AMD, with its vast experience making graphics cards, is struggling against Nvidia.

On the other hand, any good competition is always welcome, provided there is competitive pricing to go head-to-head with the current cards in all categories (low end to high end).

But if the pricing is similar across all these companies, then there is no point in introducing another card with similar performance.
 
Intel, 2019: This is our new i9 with 40 cores and 80 threads. Welcome to the future of computing.

AMD, probably the next day: 60 cores, 120 threads, Threadripper 3. Buy it.

Intel, 2020: This is our new TRX102 GPU, code name “T-Rex.” The world's first 4K, 144 Hz GPU.

Nvidia, one month later: Meet the GTX 1380. 8K. 60 FPS. Backwards compatible with 4K 240 Hz monitors and HDR. Less power, less heat, higher clocks, and real-time 4K ray tracing at 120 Hz at stock frequencies. How great is that, guys? The more you buy, the more you save.

We all know that’s how it’s going to go down.
 
Yes, Intel has former AMD people on the job, but with a much larger budget to work with, they could pull off a miracle and surprise everyone with an NVIDIA-beating GPU. I mean, AMD was trading blows at times with NVIDIA despite their financial disadvantage, so how hard could it be for a company the size of Intel, which has its own foundry?
Whatever high price they charged for it, it would be worth buying just to light a fire under NVIDIA and AMD.
 
I mean, AMD was trading blows at times with NVIDIA despite their financial disadvantage, so how hard could it be for a company the size of Intel, which has its own foundry?

And what a foundry they have too...
 
How high-level are games coded in terms of GPU processing? If Intel comes in with something that differs from pre-existing solutions, won't there most likely be compatibility issues, or is it generalised enough that Intel can try something radically different architecturally without games noticing?
Either way, Intel won't have the big back catalogue of driver support for existing games that the other two have, I would imagine.
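For what it's worth, games almost never talk to the GPU directly; they go through an API like Direct3D, Vulkan, or OpenGL, and the vendor's driver translates those calls into whatever commands its hardware understands. A minimal sketch in C of what that looks like from the game's side (assuming GLFW and OpenGL are available; nothing here is Intel-specific):

```c
/* Minimal sketch: the game issues generic API calls (OpenGL here) and the
 * vendor's driver maps them onto its hardware. Assumes GLFW is installed;
 * build with e.g. `cc demo.c -lglfw -lGL` on Linux. */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* Any vendor whose driver implements the API can service this context. */
    GLFWwindow *window = glfwCreateWindow(640, 480, "API abstraction demo", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window)) {
        /* Generic calls; the driver decides how the GPU actually does this. */
        glClearColor(0.1f, 0.2f, 0.3f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}
```

So a radically different architecture can still run existing games unmodified; the catch is that the quality of that driver layer is exactly where the missing back catalogue would bite.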
 
Kinda looking forward to this; Intel has the budget to create something pretty spectacular. Even if they don't, the extra competition will help drive prices down a bit.

It will most likely be just for compute workloads and not gaming (that is, servers and workstations).
Here's hoping they're really good at crypto-mining so we "mere mortals" can afford gaming GPUs again. :)
 
We all know that's how it's going to go down.

Haha, isn't that just how it's going to go down.

Alternatively, "INTEL IS SHUTTING DOWN ITS GPU DIVISION TO FOCUS ON CPUS AND MOBILES".
 
Didn't Intel go down this road years ago with the Intel740?
If I recall correctly, it was AGP, supposedly had great performance, and then when they actually delivered, it flopped. I think I've only ever seen one of them in real life.

Hopefully they're a little more successful in this endeavor. More competition is a good thing.
 
Didn't Intel go down this road years ago with the Intel740?

Correct. Intel hoped to popularize the AGP slot with the 740 since most graphics cards still used the PCI interface.

They also created a prototype called Larrabee that was never released. It used an architecture fundamentally different from most other video cards.
 
They also created a prototype called Larrabee that was never released. It used an architecture fundamentally different from most other video cards.
Linus did a video on it recently; it's very interesting what Intel was trying to do with it: essentially a software layer between the hardware and the OS so it could run anything.
 
And what a foundry they have too...

Right. One that can't even do 10nm right, let alone 7nm. Or perhaps that 10nm process is much more suitable for a GPU than a CPU?

What on Earth?!?

It is W I D E L Y accepted that Intel rule the planet for both IPC and GHz - whatever process node they choose to run / or name.

Just because AMD finally made it to a few finals doesn't mean they won the cup.

I shall be watching Intel's offerings with much interest. I hear they did much in software rather than hardware, and had competing coding groups writing for their last product, much to the annoyance of those involved. Not everyone hits a home run on their first attempt... and even if they don't a second time, you should WELCOME the competition in a market that has been a two-horse race for donkey's years.

Damn, too many sporting references today... but I felt I had to simplify things for you.
 
It is W I D E L Y accepted that Intel rule the planet for both IPC and GHz - whatever process node they choose to run / or name.

It's also widely accepted that Intel is in a rut and hasn't improved IPC in some years or been able to make its 10nm process work well enough to transition its main CPUs to it. It will be at a disadvantage if it has to depend on 14nm to compete with 7nm, regardless of any engineering advantage (which I'm not sure it will have).
 
I don't see this going anywhere, tbh. If Intel even delivers on its promise of a card in 2020, I have a feeling it's going to fall flat in every segment: very underwhelming and overpriced.
 
Intel haven't improved in years?

I'm running Haswell on my daily machine, and it is noticeably slower than my friends' Skylake CPUs (roughly 22% lower IPC), let alone my parents' new Coffee Lake machine, which seems very snappy indeed (though I've not benched it yet). And that is a span of about four years...
Not long ago my Haswell 4790K was in the top 3 for single-thread performance; now it has dropped to 14th place. Clock speeds: Haswell 4790K = 4.0 - 4.4 GHz, Kaby Lake 7700K = 4.2 - 4.5 GHz, Coffee Lake 8700K = 3.7 - 4.7 GHz (now the #1 CPU in single-thread). You could conclude that the slight generational boost in performance is all due to higher clocks, and you'd mostly be right. But getting basically the same arch to run faster takes process improvements, die shrinks, and other tweaks. Whatever it took to get there, Intel still holds the top spot in IPC. The largest gain was Haswell to Skylake, 22nm to 14nm. When they finally perfect their 10nm process, Intel will still be on top, even against 7nm chips (whose node names are measured differently).
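To untangle clocks from IPC, you can normalise a single-thread score by the clock it ran at. A rough back-of-the-envelope in C; the scores are hypothetical placeholders, only the boost clocks quoted above come from the spec sheets:

```c
/* Back-of-the-envelope: treat "IPC" as single-thread score per GHz.
 * The scores below are hypothetical placeholders, NOT measurements;
 * the boost clocks are the ones quoted above (4790K and 8700K). */
#include <stdio.h>

int main(void)
{
    double haswell_score = 100.0;   /* hypothetical single-thread score */
    double coffee_score  = 115.0;   /* hypothetical single-thread score */
    double haswell_clock = 4.4;     /* GHz, 4790K boost */
    double coffee_clock  = 4.7;     /* GHz, 8700K boost */

    double haswell_ipc = haswell_score / haswell_clock;
    double coffee_ipc  = coffee_score / coffee_clock;

    printf("Haswell:     %.2f points/GHz\n", haswell_ipc);
    printf("Coffee Lake: %.2f points/GHz\n", coffee_ipc);
    printf("Clock gain:  %.1f%%\n", (coffee_clock / haswell_clock - 1.0) * 100.0);
    printf("IPC gain:    %.1f%%\n", (coffee_ipc / haswell_ipc - 1.0) * 100.0);
    return 0;
}
```

With those made-up numbers, the 15% score bump splits into roughly 7% from clocks and 8% from IPC; plug in real benchmark scores and the same arithmetic tells you how much of each generation's gain is architecture versus frequency.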
 
Intel will be fine. Even now, its 14nm processors compete well with AMD's 12nm processors. I'd imagine when AMD goes to 7nm and Intel goes to 10nm CPUs, Intel will do very well, and I'm an AMD fan. I want Intel GPUs to do well also. A drink to more competition.
 
About damn time!!!! Nvidia has gone nuts with its graphics card prices. Hopefully this will bring some much-needed competition to the GPU space!
 
We all know that's how it's going to go down.

Well, no words are missing here, my friend; you have said everything and you are absolutely right.
 