
Intel officially confirms discrete graphics card coming in 2020

By Greg S · 27 replies
Jun 12, 2018
  1. One of the unfortunate market conditions that hardware enthusiasts face is the lack of choice when it comes to some of the needed core components. Intel and AMD now have healthy competition in the CPU market, but Nvidia is very clearly outranking AMD in the consumer graphics industry.

    Intel has now confirmed that it will be joining the competition for discrete graphics solutions.

    Over the past year, rumors have swirled about when Intel would bring a GPU to market. Some leaks pointed towards CES 2019. Although those rumors could still hold true, we now know that Intel is aiming for a 2020 launch of discrete graphics products.

    Although Intel is keeping details of its upcoming GPU project scarce, past prototypes suggest there could be more than just another mediocre GPU entering the fray. Then again, given that Intel has been selectively poaching AMD's graphics department, Nvidia may not be too concerned about a third GPU maker.

    According to Ashraf Eassa of The Motley Fool, Intel has split its dedicated graphics project, code-named Arctic Sound (Jupiter Sound is another codename Intel has used), into two products: one focused on video streaming within data centers, the other aimed at gaming. At this time there is no information on architecture, process technology, or any other hardware details that could be used to estimate performance.

    No matter where consumer loyalty lies, more competition tends to lead to better hardware in a shorter time span. Next-generation graphics cards may carry high launch prices, but a healthy round of competition could help keep sticker shock to a minimum.


  2. gigantor21

    gigantor21 TS Maniac Posts: 145   +196

    "AMD + Nvidia + Intel = Full RGB graphics cards?"

    ...how dare you, LOL.
    ypsylon, Clamyboy74, baskiria and 5 others like this.
  3. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 11,202   +4,864

    If you think Nvidia is expensive, wait and see what Intel has to offer. That is assuming they have anything that will compete.
    Clamyboy74, MoeJoe and Dimitrios like this.
  4. amghwk

    amghwk TS Maniac Posts: 404   +231

    I don't think "enthusiasts" will be interested in this card. Even AMD, with its vast experience making graphics cards, is struggling against Nvidia.

    On the other hand, any good competition is always welcome, provided there is competitive pricing to go head-to-head with the current cards in all categories (low end to high end).

    But if the pricing is similar across all these companies, then there is no point introducing another card with similar performance.
    Stark likes this.
  5. Intel, 2019: This is our new i9 with 40 cores and 80 threads. Welcome to the future of computing.

    AMD, probably the next day: 60 cores, 120 threads, Threadripper 3. Buy it.

    Intel, 2020: This is our new TRX102 GPU, code name “T-Rex.” The world's first 4K, 144 Hz GPU.

    Nvidia, one month later: Meet the GTX 1380. 8K. 60FPS. Backwards compatible with 4K 240Hz monitors and HDR. Less power, less heat, higher clocks, and real-time 4K ray tracing capability at 120Hz at stock frequencies. How great is that guys? The more you buy, the more you save.

    We all know that’s how it’s going to go down.
    amorl, qking, Clamyboy74 and 2 others like this.
  6. hood6558

    hood6558 TS Evangelist Posts: 353   +110

    Yes, Intel has former AMD people on the job, but with a much larger budget to work with, they could pull off a miracle and surprise everyone with an NVIDIA-beating GPU. I mean, AMD was trading blows at times with NVIDIA, despite their financial disadvantage, so how hard could it be for a company the size of Intel, who has their own foundry?
    Whatever high price they charged for it, it would be worth buying just to light a fire under NVIDIA and AMD.
    Clamyboy74, lumbeeman and Knot Schure like this.
  7. Knot Schure

    Knot Schure TS Booster Posts: 137   +49

    And what a foundry they have too...
  8. Qrox

    Qrox TS Booster Posts: 67   +27

    I was looking through the article looking for details about the RGB...
    Took me way too long to realize what they were trying to do in the title...
    Andromadus likes this.
  9. Puiu

    Puiu TS Evangelist Posts: 3,207   +1,648

    It will most likely be just for compute workloads and not gaming (aka servers and workstations).
  10. ET3D

    ET3D TechSpot Paladin Posts: 1,619   +282

    Right. One that can't even do 10nm right, let alone 7nm. Or perhaps that 10nm process is much more suitable for a GPU than a CPU?
  11. ETF Soldier

    ETF Soldier TS Evangelist Posts: 469   +138

    How high-level are games coded in terms of GPU processing? If Intel comes in with something that differs from pre-existing solutions, won't there most likely be an issue of compatibility, or is it generalized enough that Intel can try something radically different in terms of architecture without games noticing?
    Either way, Intel won't have the big back catalogue of driver support for existing games like the other 2, I would imagine.
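    In practice games target a common graphics API (Direct3D, Vulkan, OpenGL), and each vendor's driver translates those calls to its own hardware, so a radically new architecture mostly needs a solid driver rather than changes to the games themselves. A toy Python sketch of that separation (all class and method names here are hypothetical, purely to illustrate the idea):

    ```python
    from abc import ABC, abstractmethod

    class GPUDriver(ABC):
        """Vendor-supplied driver implementing a common graphics API."""
        @abstractmethod
        def draw_triangles(self, count: int) -> str: ...

    class NvidiaDriver(GPUDriver):
        def draw_triangles(self, count: int) -> str:
            return f"nvidia: rasterized {count} triangles"

    class IntelDriver(GPUDriver):
        def draw_triangles(self, count: int) -> str:
            return f"intel: rasterized {count} triangles"

    def render_frame(driver: GPUDriver) -> str:
        # The "game" only calls the shared API; it never cares
        # which vendor implements it underneath.
        return driver.draw_triangles(1000)
    ```

    A new vendor only has to supply its own `GPUDriver` subclass; `render_frame` (the game) stays untouched.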
  12. Burty117

    Burty117 TechSpot Chancellor Posts: 3,397   +1,172

    Kinda looking forward to this, Intel have the Budget to create something pretty spectacular. Even if they don't, the extra competition will help drive prices down a bit.

    Here's hoping they're really good at crypto-mining so us "mere mortals" can afford gaming GPUs again :)
  13. H3llion

    H3llion TechSpot Paladin Posts: 1,635   +404

    Haha, isn't that just how it's going to go down.

  14. CrazyDave

    CrazyDave TS Enthusiast Posts: 40   +32

    Didn't Intel go down this road years ago with the Intel740?
    If I recall correctly it was AGP, supposedly had great performance, and then when they actually delivered, it flopped. I think I've only ever seen one of them in real life.

    Hopefully they're a little more successful in this endeavor. More competition is a good thing.
  15. madboyv1

    madboyv1 TechSpot Paladin Posts: 1,524   +412

    It took me a couple extra seconds to recognize the brilliance of that statement too XD
  16. veLa

    veLa TS Evangelist Posts: 841   +278

    Correct. Intel hoped to popularize the AGP slot with the 740 since most graphics cards still used the PCI interface.

    They also created a prototype called Larrabee that was never released. It used an architecture fundamentally different from most other video cards.
    CrazyDave likes this.
  17. Burty117

    Burty117 TechSpot Chancellor Posts: 3,397   +1,172

    Linus did a video recently on it, very interesting what Intel were trying to do with it, essentially a software layer between the hardware and OS so it could run anything.
    davislane1 likes this.
  18. Knot Schure

    Knot Schure TS Booster Posts: 137   +49

    What on Earth?!?

    It is W I D E L Y accepted that Intel rule the planet for both IPC and GHz - whatever process node they choose to run / or name.

    Just because AMD finally made it to a few finals, doesn't mean they won the cup.

    I shall be watching Intel's offerings with much interest, I hear they did much in software, rather than hardware, and had competing coding groups writing for their last product, much to the annoyance of those involved. Not everyone hits a home run on their first attempt... and even if they don't a second time, you should WELCOME the competition in the market, which has been a two-horse race for donkey's years.

    Dam, too many sporting references today... but I felt I had to simplify things for you.
    hood6558 likes this.
  19. McMurdeR

    McMurdeR TS Addict Posts: 118   +87

    2020. Nvidia might replace their Pascal GeForce line too :-
  20. ET3D

    ET3D TechSpot Paladin Posts: 1,619   +282

    It's also widely accepted that Intel is in a rut and hasn't improved IPC in some years or been able to make its 10nm process work well enough to transition its main CPUs to it. It will be at a disadvantage if it has to depend on 14nm to compete with 7nm, regardless of any engineering advantage (which I'm not sure it will have).
  21. pcnthuziast

    pcnthuziast TS Guru Posts: 428   +62

    I don't see this going anywhere, tbh. If Intel even delivers on their promise of a card in 2020, I have a feeling it's going to fall flat in every segment and be very underwhelming and overpriced.
    CrazyDave likes this.
  22. hood6558

    hood6558 TS Evangelist Posts: 353   +110

    Not long ago my Haswell 4790K was in the top 3 for single-thread performance; now it's dropped to 14th place. Clock speeds: Haswell 4790K = 4.0 - 4.4 GHz, Kaby Lake 7700K = 4.2 - 4.5 GHz, Coffee Lake 8700K = 3.7 - 4.7 GHz (now the #1 CPU in single thread). You could conclude that the slight generational boost in performance is all due to higher clocks, and you'd mostly be right. But getting basically the same architecture to run faster takes process improvements, die shrinks, and other tweaks. Whatever it took to get there, Intel still has the top spot in IPC. The largest gain was Haswell to Skylake, 22nm to 14nm. When they finally perfect their 10nm process, Intel will still be on top, even against 7nm chips (which use different measurement conventions).
    Knot Schure likes this.
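    The reasoning above rests on the rough model that single-thread performance ≈ IPC × clock speed, so clock gains and IPC gains multiply. A quick sketch of that arithmetic, using the boost clocks from the comment above but a purely hypothetical IPC uplift (the IPC figures below are illustrative, not measured):

    ```python
    def single_thread_perf(ipc: float, boost_ghz: float) -> float:
        # Rough model: work/second = instructions/cycle * cycles/second.
        return ipc * boost_ghz

    # Boost clocks are from the comment above; the relative IPC
    # values are hypothetical, for illustration only.
    haswell = single_thread_perf(ipc=1.00, boost_ghz=4.4)  # 4790K
    coffee  = single_thread_perf(ipc=1.10, boost_ghz=4.7)  # 8700K

    gain = coffee / haswell - 1
    print(f"combined uplift: {gain:.1%}")
    ```

    Even a modest IPC improvement compounds with a higher clock, which is why "same arch, faster clocks" still shows a measurable generational gap.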
  23. Danny101

    Danny101 TS Guru Posts: 599   +233

    Intel will be fine. Even now its 14nm processors compete well with AMD's 12nm processors. I'd imagine when AMD goes to 7nm and Intel goes to 10nm CPUs, Intel will do very well, and I'm an AMD fan. I want Intel GPUs to do well also. A drink to more competition.
    Knot Schure and hood6558 like this.
  24. poohbear

    poohbear TS Maniac Posts: 238   +139

    About damn time!!!! Nvidia has gone nuts with its graphics cards prices. Hopefully this will bring some much needed competition to the GPU space!
  25. amorl

    amorl TS Rookie

    Well, no words are missing here my friend, you have said everything and you are absolutely right.
