
Intel shows off designs for its discrete graphics card due in 2020

By Greg S · 24 replies
Mar 21, 2019
  1. Even though Intel is still nearly a year away from launching its own discrete graphics card, the company is already beginning to show off where it is headed. In a GDC 2019 keynote, Intel shared some early designs of its GPU slated for a 2020 release.

    Taking design cues straight from Intel's own Optane SSDs, the cards use sharp lines and fairly minimal styling. From what we can tell, Intel might be avoiding the obnoxious RGB flair present on some gaming-oriented cards.

    The first design uses the typical blower fan found in many other reference designs. Compared to the blue rectangular cards Intel has shown in the past, this first iteration appears to be much the same, but with a housing that would look better through a glass side panel.

    Turning the card over, a full backplate covers the rear. You can also catch a glimpse of what appear to be three full-size DisplayPort outputs and one HDMI port, although this could change before final release. It should also be noted that the HDMI spec and DisplayPort version used will be more important than just the number of ports.

    Lastly, we get a look at Intel's most intricate design. Quadrilateral scales surround a cooling fan before smoothing out into a semi-matte black finish, giving the illusion of a fan breaking out of the housing.

    Unfortunately, full specifications are not yet available for Intel's upcoming graphics card, and real-world performance is essentially unknown for now. As the year goes on, there is a good chance Intel will share some numbers, given how eager the company is to make everyone aware that it has a major new product incoming.

    Image Credits: Nick Pino, TechRadar


     
  2. redgarl

    redgarl TS Enthusiast Posts: 50   +42

    Judging by the size of the card, there are limits in terms of memory, power and die size.

    The GPU pictured is mid-range. That doesn't mean Intel has no plans for high-end, but from what we know so far we should expect mid-range. Is it server grade or consumer grade? I would really be surprised if Intel is not focusing more on the datacenter than on consumers.
     
    XtremeHammond likes this.
  3. Vulcanproject

    Vulcanproject TS Evangelist Posts: 672   +897

    Total mystery, obviously not high end from the size and cooler alone.

    Intel will have looked at the market and knows that entering at the mainstream before launching any costly high-end models is sensible. By costly I mean not just to the consumer, but to Intel themselves.

    Being new to the discrete GPU market, designing a high-end part without validating the architecture at lower levels, or even establishing a segment for the brand, is a big and risky investment.

    Isn't this how AMD have basically been operating for years now? Throwing out the odd high-end part, but concentrating mainly on the mainstream and aiming for success there. It's a sensible tactic, and it wouldn't surprise me if Intel starts off with the same idea.
     
  4. ET3D

    ET3D TechSpot Paladin Posts: 1,644   +304

    The designs are certainly aimed at consumers. Data centre doesn't need great looks. It's reasonable for Intel to aim for the data centre eventually, but I think it would go for gamers first.

    Frankly I'm more excited by the Gen 11 integrated graphics. That should put a dent in the MX 150 / 250 /... market.
     
  5. XtremeHammond

    XtremeHammond TS Enthusiast Posts: 40   +31

    Yeah, it was my first thought, too.
    And I hope that Intel will bring some high-end stuff.
     
  6. penn919

    penn919 TS Maniac Posts: 249   +127

    Will likely compete with RX 560 and GT 1030
     
    JaredTheDragon likes this.
  7. Emexrulsier

    Emexrulsier TS Evangelist Posts: 593   +76

    Hmmmm, from its size I wonder if they are initially aiming to join the mid-range market. But you never know; with 3D RAM and small-nm die sizes we could be looking at a 2080 Ti beater with 24GB of RAM :D
     
  8. kira setsu

    kira setsu TS Enthusiast Posts: 86   +60

    I always wonder why cards get so many DisplayPort outputs yet only one HDMI?

    I guess I'm an outlier because my system is hooked up to televisions, but I'd really like two HDMI outputs.
     
  9. Burty117

    Burty117 TechSpot Chancellor Posts: 3,448   +1,216

    Converting from DisplayPort to HDMI is as easy as an adapter. Converting from HDMI to DisplayPort isn't possible without an active adapter.
     
  10. yRaz

    yRaz Nigerian Prince Posts: 2,761   +1,982

    That's an interesting bit of information. I've never received a graphics card with a DP-to-HDMI adapter, though, and I've bought MANY.
     
  11. captaincranky

    captaincranky TechSpot Addict Posts: 14,599   +3,771

    AFAIK, DisplayPort doesn't provide sound, only HDMI does. If that's the case, as soon as you slap an adapter on your system, you'll be routing sound to the TV via multiple 3.5 mm connections, or SPDIF.

    (Of course I'm thinking in terms of a HTPC)

    My lowly EVGA 1050 Ti FTW card has dual-link DVI, HDMI, & DisplayPort.

    There's logic there, since not everybody is using TVs for monitors, and not every monitor has HDMI. A while back DisplayPort was being pushed since it doesn't have a licensing fee and HDMI does. (You might want to fact check that).

    I have never used monitor speakers (they're worthless). Thus dual-link DVI will drive at least 1440p (2K), freeing up HDMI for a TV, should the mood strike.
     
  12. kira setsu

    kira setsu TS Enthusiast Posts: 86   +60

    My gripe is that when hooked to an AV receiver, the additional adapter can cause issues.

    Like in my game room, with a receiver sending signals to two TVs, the HDMI connection is flawless but the receiver will sometimes hiccup on the adapter connection.

    It just seems like three of the same connection is overkill, IMO; turn one into another HDMI. My old setup with dual GPUs worked great, but now that one card grants all the grunt I need, it would be nice to get that extra addition, especially now with TVs that can step into the realm of monitors.
     
  13. GeforcerFX

    GeforcerFX TS Evangelist Posts: 846   +349

    DisplayPort has audio output. I hook up to my TV through a DisplayPort-to-HDMI adapter on my Precision laptop, and audio has always worked.
     
  14. Nocturne

    Nocturne TS Addict Posts: 142   +71

    One fan and a small cooler, so I'm thinking 1660/570 performance. They are most likely going to tackle budget $150-200 cards, as those are some of the best sellers; mid-range would be more the $250-$300 level, probably dual fan. But I would expect maybe something high-end later, not sooner.
     
  15. MaikuTech

    MaikuTech TS Evangelist Posts: 1,032   +178

  16. maxxcool

    maxxcool TS Rookie

    *Could be* stacked 3D RAM, or HBM style... they can fab the whole kit themselves. If they did a Fury 2.0-style card with 12/14/32GB of stacked RAM it could be really good and small. The backplate is what makes me think that way... it seems thick and heavy with all the plates, unless they are there to disperse heat.
     
  17. Burty117

    Burty117 TechSpot Chancellor Posts: 3,448   +1,216

    Not true at all; sound works just fine through DisplayPort and through adapters.
    Agreed, DisplayPort can daisy-chain as well, making more ports less useful.
    It's a good point actually, I wonder if the licensing terms dictate how many HDMI ports are on the back? As in, would the license double in price for an extra port? Who knows...
    Yeah same! Found this all out when HP decided DisplayPort was the only connector they wanted to use a few laptop generations ago. Even then HP didn't provide adapters.
     
  18. Dimitrios

    Dimitrios TS Maniac Posts: 345   +238

    Thank the ex-AMD workers that bailed out and jumped over to Intel, especially Raja ;-)

    It would be really funny if Intel bought Raja the same leather jacket as Nvidia's CEO, but soaked it with a whole bottle of "Armor All" car interior shine and labeled it "RTX enabled."
     
  19. captaincranky

    captaincranky TechSpot Addict Posts: 14,599   +3,771

    Well, Ms. @kira setsu, to be completely forthright, I have no idea where you're getting video cards with 3 DisplayPort outputs.

    Granted, I am a habitual consumer of EVGA's products. At present, I have a 710, a 730, a 750 Ti, & a 1050 Ti. All of these cards, as well as stock Intel CPU-integrated IGPs, have several different connections, not multiple single-application connections.

    Any of that notwithstanding, you can go directly to Amazon and buy cables with DisplayPort on one end and HDMI on the other, in several different lengths. https://www.amazon.com/dp/B015OW3P1O/?tag=httpwwwtechsp-20


    That should put an end to your woes. Unless, of course, you live in Mogadishu, the Galapagos, or some other similarly remote location.
     
    Dimitrios likes this.
  20. captaincranky

    captaincranky TechSpot Addict Posts: 14,599   +3,771

    My bad. I confused DisplayPort with DVI, which absolutely doesn't have sound.
    https://www.semiconductorstore.com/blog/2014/licensing-costs-HDMI/654/
    Actually, now we all can. I didn't research licensing fees for DisplayPort; I didn't want to hog all the "grunt work" for myself.

    I have an ancient HP 724 24" 1920 x 1200 (16:10) IPS w/ CCFL backlight, which I believe has DisplayPort but no HDMI. It's tucked away behind my current 1440p, and I'm not tearing the desk apart to get a look at the connections, so you'll have to take my word for it.

    (I probably would be better informed about DisplayPort, had I ever actually found a use for it) :rolleyes:
     
  21. mattferg

    mattferg TS Rookie


    There are some implementations of DVI out there which do have sound when used with a DVI-to-HDMI cable. I had an old 9400 GT with that implementation, where you plugged the motherboard SPDIF into the GPU.
     
  22. erickmendes

    erickmendes TS Evangelist Posts: 553   +239


    https://www.anandtech.com/show/13882/ces-2019-digital-storm-spark-a-miniitx-with-mxm-rtx-2080

    That's an MXM GTX 1080. I don't think its size is any indication of performance, but as Intel is new to gaming GPUs, I would be surprised if it can manage 1070-level performance.
     
  23. urbanman2004

    urbanman2004 TS Booster Posts: 72   +26

    Quit w/ all the damn speculation until Intel gives us an actual physical product, not some freaking paper launch.
     
    Dimitrios likes this.
  24. captaincranky

    captaincranky TechSpot Addict Posts: 14,599   +3,771

    Well Matt, my sympathies extend to you, being the proud owner of a 9400 GT. I have an 8400 s (?) & a 9500 GT. The only reason I mention them is the fact that I personally have no sense of shame, I'm not a gamer, and I've been written off as being insane for many years now. Nonetheless, I salute your bravery for mentioning it.

    All of that notwithstanding, I investigated your claim of sound through DVI. The only board maker's site I visited was EVGA. In the specs, they do claim it is possible to have HDMI sound with these boards. The trouble I ran into is that my 9500 doesn't appear to have an SPDIF socket...:confused:

    The rear of the card has 2 DVI sockets and a female DIN socket marked "TV". It almost looks like the card can be used as a video capture card. There is also a printed circuit connector strip along the top edge of the card. It looks like a SATA power connection. I don't have any idea what to make of these.

    Perhaps your card is from a different maker, who has fully implemented the HDMI sound feature via an actual SPDIF optical connector.

    As a matter of curiosity, I also found what appear to possibly be socketed EPROMs. Now there's something you don't see every day. (Especially where you should have them, for the BIOS chip.)
     
    Last edited: Mar 22, 2019
  25. gamerk2

    gamerk2 TS Addict Posts: 204   +126

    Why would you have three DP outputs in 2020? HDMI 2.1 is here, and offers much more bandwidth and is supported by a much wider range of devices (TVs and HT equipment) than DP is.
     
