Intel shows off designs for its discrete graphics card due in 2020

Greg S

Something to look forward to: Intel's new graphics card is starting to become a little more real now that designs have been presented on stage. It is now a waiting game to find out how well it will perform against rivals Nvidia and AMD.

Even though Intel is still nearly a year away from launching its own discrete graphics card, the company is already beginning to show off where it is headed. In a GDC 2019 keynote, Intel shared some early designs of the GPU slated for a 2020 release.

Taking design cues straight from Intel's own Optane SSDs, the cards feature sharp lines and fairly minimal styling. From what we can tell, Intel might be avoiding the obnoxious RGB flair present on some gaming-oriented cards.

The first design uses the typical blower fan found in many other reference designs. Compared to Intel's blue rectangle cards that have been shown in the past, this first iteration appears to be much the same, but with a housing that would look better through a glass side panel.

Turning over to the back side of the card, a full backplate is present. You can also catch a glimpse of what appears to be three full size DisplayPort outputs and one HDMI port, although this could change before final release. It should also be noted that the HDMI spec and DisplayPort version used will be more important than just the number of ports.
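
To put numbers on that: as a rough back-of-the-envelope sketch (our own arithmetic using approximate CVT-R2 reduced-blanking timings, not anything Intel has stated), driving 4K at 120 Hz with 8-bit color requires roughly

\[
\underbrace{3920 \times 2222}_{\text{total pixels per frame, incl. blanking}} \times 120\,\mathrm{Hz} \times 24\,\mathrm{bpp} \approx 25.1\ \mathrm{Gbit/s}.
\]

That exceeds the ~14.4 Gbit/s of effective payload HDMI 2.0 can carry (18 Gbit/s raw, minus 8b/10b encoding overhead), just fits DisplayPort 1.4's ~25.9 Gbit/s (32.4 Gbit/s raw over four HBR3 lanes), and is comfortable for HDMI 2.1's 48 Gbit/s FRL link. In other words, two ports of a newer spec can be worth more than four of an older one.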

Lastly, we get a look at Intel's most intricate design. Quadrilateral scales surround a cooling fan before smoothing out into a semi-matte black finish, giving it the illusion of a fan breaking out of the housing.

Unfortunately, full specifications are not yet available for Intel's upcoming graphics card, and real-world performance remains unknown. As the year goes on, there is a good chance Intel will share some numbers, given how eager the company is to make everyone aware that it has a major new product incoming.

Image Credits: Nick Pino, TechRadar


 
Looking at the size of the card, there are limits in terms of memory, power and die size.

This GPU pictured is mid-range. That doesn't mean Intel has no plans for high-end, but so far we know we should expect mid-range. Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on datacenter than consumer.
 
Total mystery, obviously not high end from the size and cooler alone.

Intel would have looked at the market and known that entering at the mainstream before launching any costly high-end models is sensible. By costly I also mean not just to the consumer, but to Intel themselves.

Being new to the discrete GPU market, designing a high-end part without validating the architecture at lower levels or even establishing a segment for the brand is a big investment and risky.

Isn't this how AMD have basically been operating for years now? Throwing out the odd high-end part, but concentrating mainly on the mainstream and aiming for success there. It's a sensible tactic and it wouldn't surprise me if Intel starts off with the same idea.
 
Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on datacenter than consumer.

The designs are certainly aimed at consumers. Data centre doesn't need great looks. It's reasonable for Intel to aim for the data centre eventually, but I think it would go for gamers first.

Frankly I'm more excited by the Gen 11 integrated graphics. That should put a dent in the MX 150 / 250 /... market.
 
Looking at the size of the card, there are limits in terms of memory, power and die size.

This GPU pictured is mid-range. That doesn't mean Intel has no plans for high-end, but so far we know we should expect mid-range. Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on datacenter than consumer.

Yeah, it was my first thought, too.
And I hope that Intel will bring some high-end stuff.
 
Hmmmm, from its size I wonder if initially they are aiming to join the mid-range market. But you never know; with 3D RAM and small-nm die sizes, we could be looking at a 2080 Ti beater with 24 GB of RAM :D
 
I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

I guess I'm an outlier because my system is hooked to televisions, but I'd really like two HDMI outputs.
 
I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

I guess I'm an outlier because my system is hooked to televisions, but I'd really like two HDMI outputs.
Converting from DisplayPort to HDMI is as easy as an adaptor. Converting from HDMI to DisplayPort isn't possible without an active adaptor.
 
I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

I guess I'm an outlier because my system is hooked to televisions, but I'd really like two HDMI outputs.
Converting from DisplayPort to HDMI is as easy as an adaptor. Converting from HDMI to DisplayPort isn't possible without an active adaptor.
That's an interesting bit of information. I've never received a graphics card with a DP to HDMI adapter, though, and I've bought MANY.
 
Converting from DisplayPort to HDMI is as easy as an adaptor. Converting from HDMI to DisplayPort isn't possible without an active adaptor.
AFAIK, DisplayPort doesn't provide sound, only HDMI does. If that's the case, as soon as you slap an adapter on your system, you'll be routing sound to the TV via multiple 3.5 mm connections, or SPDIF.

(Of course I'm thinking in terms of an HTPC)

I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

I guess I'm an outlier because my system is hooked to televisions, but I'd really like two HDMI outputs.
My lowly EVGA 1050 Ti FTW card has dual-link DVI, HDMI, & DisplayPort.

There's logic there, since not everybody is using TVs for monitors, and not every monitor has HDMI. A while back DisplayPort was being pushed since it doesn't have a licensing fee and HDMI does. (You might want to fact-check that.)

I have never used monitor speakers (they're worthless). Thus dual-link DVI will drive at least 1440p (2K), freeing up HDMI for a TV, should the mood strike.
 
Converting from DisplayPort to HDMI is as easy as an adaptor. Converting from HDMI to DisplayPort isn't possible without an active adaptor.
AFAIK, DisplayPort doesn't provide sound, only HDMI does. If that's the case, as soon as you slap an adapter on your system, you'll be routing sound to the TV via multiple 3.5 mm connections, or SPDIF.

(Of course I'm thinking in terms of an HTPC)

I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

I guess I'm an outlier because my system is hooked to televisions, but I'd really like two HDMI outputs.
My lowly EVGA 1050 Ti FTW card has dual-link DVI, HDMI, & DisplayPort.

There's logic there, since not everybody is using TVs for monitors, and not every monitor has HDMI. A while back DisplayPort was being pushed since it doesn't have a licensing fee and HDMI does. (You might want to fact-check that.)

I have never used monitor speakers (they're worthless). Thus dual-link DVI will drive at least 1440p (2K), freeing up HDMI for a TV, should the mood strike.
My gripe is that when hooked to an AV receiver, the additional adapter can cause issues.

Like in my game room, with a receiver sending signals to two TVs: the HDMI connection is flawless, but the receiver will sometimes hiccup on the adapter connection.

It just seems like three of the same connection is overkill, IMO; turn one into another HDMI. My old setup with dual GPUs worked great, but now that one card grants all the grunt I need, it would be nice to get that extra addition, especially now with TVs that can step into the realm of monitors.
 
Converting from DisplayPort to HDMI is as easy as an adaptor. Converting from HDMI to DisplayPort isn't possible without an active adaptor.
AFAIK, DisplayPort doesn't provide sound, only HDMI does. If that's the case, as soon as you slap an adapter on your system, you'll be routing sound to the TV via multiple 3.5 mm connections, or SPDIF.

(Of course I'm thinking in terms of an HTPC)
DisplayPort has audio output. I hook up to my TV through a DisplayPort to HDMI adapter on my Precision laptop, and audio has always worked.
 
Looking at the size of the card, there are limits in terms of memory, power and die size.

This GPU pictured is mid-range. That doesn't mean Intel has no plans for high-end, but so far we know we should expect mid-range. Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on datacenter than consumer.

One fan and a small cooler; I'm thinking 1660/570 performance. They are most likely going to tackle budget $150-200 cards, as those are some of the biggest sellers. Mid-range would be more the $250-$300 level, probably dual fan, but I would expect something high-end later, not sooner.
 
Total mystery, obviously not high end from the size and cooler alone.

Intel would have looked at the market and known that entering at the mainstream before launching any costly high-end models is sensible. By costly I also mean not just to the consumer, but to Intel themselves.

Being new to the discrete GPU market, designing a high-end part without validating the architecture at lower levels or even establishing a segment for the brand is a big investment and risky.

Isn't this how AMD have basically been operating for years now? Throwing out the odd high-end part, but concentrating mainly on the mainstream and aiming for success there. It's a sensible tactic and it wouldn't surprise me if Intel starts off with the same idea.

*Could be* stacked 3D RAM, or HBM style... they can fab the whole kit themselves. If they did a Fury 2.0-style card with 12/14/32 GB of stacked RAM, it could be really good and small. The backplate is what makes me think that way... it seems thick and heavy with all the plates, unless they are there to disperse heat.
 
AFAIK, DisplayPort doesn't provide sound, only HDMI does.
Not true at all, sound works just fine through DisplayPort and through adapters.
turn one into another HDMI
Agreed, and DisplayPort can daisy-chain as well, making more ports less useful.
A while back DisplayPort was being pushed since it doesn't have a licensing fee and HDMI does. (You might want to fact-check that.)
It's a good point actually, I wonder if the licensing terms dictate how many HDMI ports are on the back? As in, would the license double in price for an extra port? Who knows...
That's an interesting bit of information. I've never received a graphics card with a DP to HDMI adapter, though, and I've bought MANY.
Yeah same! Found this all out when HP decided DisplayPort was the only connector they wanted to use a few laptop generations ago. Even then HP didn't provide adapters.
 
Thank the ex-AMD workers that bailed out and jumped over to Intel, especially Raja ;-)

It would be real funny if Intel bought Raja the same leather jacket as Nvidia's CEO, but soaked it in a whole bottle of "Armor All" car interior shine and labeled it "RTX enabled."
 
My gripe is that when hooked to an AV receiver, the additional adapter can cause issues.

Like in my game room, with a receiver sending signals to two TVs: the HDMI connection is flawless, but the receiver will sometimes hiccup on the adapter connection.

It just seems like three of the same connection is overkill, IMO; turn one into another HDMI. My old setup with dual GPUs worked great, but now that one card grants all the grunt I need, it would be nice to get that extra addition, especially now with TVs that can step into the realm of monitors.
Well, Ms. @kira setsu, to be completely forthright, I have no idea where you're getting video cards with 3 DisplayPort outputs.

Granted, I am a habitual consumer of EVGA's products. At present, I have a 710, a 730, a 750 Ti, & a 1050 Ti. All of these cards, as well as stock Intel CPUs' integrated graphics, have several different connections, not multiple single-application connections.

Any of that notwithstanding, you can go directly to Amazon and buy cables with DisplayPort on one end and HDMI on the other, in several different lengths. https://www.amazon.com/dp/B015OW3P1O/?tag=httpwwwtechsp-20

That should put an end to your woes, unless of course you live in Mogadishu, the Galapagos, or some other similarly remote location.
 
Not true at all, sound works just fine through DisplayPort and through adapters.
My bad. I confused DisplayPort with DVI, which absolutely doesn't have sound.
It's a good point actually, I wonder if the licensing terms dictate how many HDMI ports are on the back? As in, would the license double in price for an extra port? Who knows...
https://www.semiconductorstore.com/blog/2014/licensing-costs-HDMI/654/
Actually, now we all can. I didn't research licensing fees for DisplayPort; I didn't want to hog all the "grunt work" for myself.

Yeah same! Found this all out when HP decided DisplayPort was the only connector they wanted to use a few laptop generations ago. Even then HP didn't provide adapters.
I have an ancient HP 724 24" 1920 x 1200 (16:10) IPS w/ CCFL backlight, which I believe has DisplayPort, but no HDMI. It's tucked away behind my current 1440p, and I'm not tearing the desk apart to get a look at the connections; you'll have to take my word for it.

(I probably would be better informed about display port, had I ever actually found a use for it) :rolleyes:
 
Not true at all, sound works just fine through DisplayPort and through adapters.
My bad. I confused DisplayPort with DVI, which absolutely doesn't have sound.
It's a good point actually, I wonder if the licensing terms dictate how many HDMI ports are on the back? As in, would the license double in price for an extra port? Who knows...
https://www.semiconductorstore.com/blog/2014/licensing-costs-HDMI/654/
Actually, now we all can. I didn't research licensing fees for DisplayPort; I didn't want to hog all the "grunt work" for myself.

Yeah same! Found this all out when HP decided DisplayPort was the only connector they wanted to use a few laptop generations ago. Even then HP didn't provide adapters.
I have an ancient HP 724 24" 1920 x 1200 (16:10) IPS w/ CCFL backlight, which I believe has DisplayPort, but no HDMI. It's tucked away behind my current 1440p, and I'm not tearing the desk apart to get a look at the connections; you'll have to take my word for it.

(I probably would be better informed about display port, had I ever actually found a use for it) :rolleyes:


There are some implementations of DVI out there which do have sound when used with a DVI to HDMI cable. I had an old 9400 GT which had that implementation, where you plugged the motherboard SPDIF into the GPU.
 
Looking at the size of the card, there are limits in terms of memory, power and die size.

This GPU pictured is mid-range. That doesn't mean Intel has no plans for high-end, but so far we know we should expect mid-range. Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on datacenter than consumer.


https://www.anandtech.com/show/13882/ces-2019-digital-storm-spark-a-miniitx-with-mxm-rtx-2080

That's an MXM RTX 2080. I don't think its size is any indication of performance, but as Intel is new to gaming GPUs, I would be surprised if it can manage 1070-level performance.
 
There are some implementations of DVI out there which do have sound when used with a DVI to HDMI cable. I had an old 9400 GT which had that implementation, where you plugged the motherboard SPDIF into the GPU.
Well Matt, my sympathies extend to you, being the proud owner of a 9400 GT... I have an 8400 s (?) & a 9500 GT. The only reason I mention them is the fact that I personally have no sense of shame, I'm not a gamer, and I've been written off as being insane for many years now. Nonetheless, I salute your bravery for mentioning it.

All of that notwithstanding, I investigated your claim of sound through DVI. The only board maker's site I visited was EVGA. In the specs, they do claim it is possible to have HDMI sound with these boards. The trouble I ran into is that my 9500 doesn't appear to have an SPDIF socket... :confused:

The rear of the card has 2 DVI sockets and a female DIN socket marked "TV". It almost looks like the card can be used as a video capture card. There is also a printed-circuit connector strip along the top edge of the card; it looks like a SATA power connection. I don't have any idea what to make of these.

Perhaps your card is from a different maker, who fully implemented the HDMI sound feature via an actual SPDIF optical connector.

As a matter of curiosity, I also found what appear to possibly be socketed EPROMs. Now there's something you don't see every day. (Especially where you should have them, for the BIOS chip.)
 
Why would you have three DP outputs in 2020? HDMI 2.1 is here, offers much more bandwidth, and is supported by a much wider range of devices (TVs and HT equipment) than DP is.
 