JPR: Intel, AMD gain GPU share at Nvidia's expense

By Matthew
May 3, 2011
  1. Jon Peddie Research has published the graphics market's first quarter results, showing a 10.3% spike in shipments. JPR said the change was welcomed after a weak holiday quarter, especially considering…

  2. gwailo247

    gwailo247 TechSpot Chancellor Posts: 2,105   +18

    That's a pretty interesting statistic. While Team Red and Team Green are battling it out, most people are playing for Intel (Team Blue?).
  3. Mikymjr

    Mikymjr TechSpot Enthusiast Posts: 122   +7

    AMD's Fusion is still better than Intel's integrated graphics in the i3 and low-end i5. And cheaper at it, too.
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    And pretty much always have done.
    Intel's IGPs, and lately the on-die Clarkdale and on-CPU Sandy Bridge graphics, get sold in vast numbers to OEMs for entry-level to lower-mainstream systems. Between Intel's hardball (and sometimes illegal) march to market share and AMD's almost childlike approach to marketing, it's hardly surprising that Intel holds the numbers it does, even taking into account that Intel's discrete GPU market share presently sits at 0%.
    And if people bought a CPU primarily for its graphics ability, then AMD might be on top of the world.
    Couple of points to note:
    1. Fusion actually seems to be cannibalizing AMD's own low-end discrete graphics market (down 3% yoy - AMD's Q1 CC transcript >>here<<)
    2. As for Fusion being better than Intel's HD 2000... that kind of depends on what parameters you are looking at. I wouldn't say that these gaming benchmarks would enhance your argument. Zacate and Ontario might give Atom the runaround, but Zacate/Ontario vs. Core i3 is a mismatch... unless you're talking about unreleased parts (Llano / Sabine) and some crystal-ball gazing.
  5. gwailo247

    gwailo247 TechSpot Chancellor Posts: 2,105   +18

    Yep, both of my parents' Dells came with onboard Intel graphics, although I did sneak an old 8800 GTS into my mom's computer. Not that she ever noticed.
  6. Mikymjr

    Mikymjr TechSpot Enthusiast Posts: 122   +7

    That's true. I was thinking more along the lines of the 890GX and the unreleased parts =P. I really didn't take the HD 2000 or HD 3000 into account =P. Thanks
  7. stewi0001

    stewi0001 TechSpot Enthusiast Posts: 306   +18

  8. Route44

    Route44 TechSpot Ambassador Posts: 12,113   +23

    What Z said about Intel dominating with Sandy Bridge, and thus dominating with their HD 3000 graphics, is right on. Other than AMD's Zacate E-350 with its 6310 on-die graphics, nothing is challenging Intel right now, especially in the laptop industry. Sell enough Sandy Bridge chips and you are selling their new HD 3000 graphics as well.

    I am in the market for a new Lenovo laptop, but unless I pay considerably more money for the Nvidia option I'm stuck with the Intel HD 3000, which is less than thrilling.
  9. Archean

    Archean TechSpot Paladin Posts: 6,035   +70

    DBZ, I think things are playing out in the direction we were talking about a few months ago. Having used a Sandy Bridge-based notebook with discrete graphics for a (little) while now, I can tell you that the IGP isn't nearly as bad as it used to be in the old days (the best part is that battery life of about 4:45+ is phenomenal for a quad-core notebook). I think as soon as AMD brings out something competitive in the mobile arena, nVidia's presence at least at the low end of the market will diminish even more. In the longer run, however, I think nVidia will probably focus more on the other segments of mobile computing, i.e. cells/tabs etc., which is a pretty smart move IMO. Interestingly enough, even here the competition is heating up; case in point: Samsung's Exynos platform offers slightly better performance than Tegra 2 (e.g. in situations such as accelerated decoding for multiple multimedia codecs and formats), and if this trend of cell phone makers developing their own platforms continues, it will spell trouble for nVidia.
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    I think there comes a time when integrated graphics (whether by chipset or CPU) starts gaining performance relative to new discrete cards, but I wouldn't see IGP platforms eating into the mainstream (desktop) for quite some time. The low-end discrete graphics, as you say, will surely end up evaporating, but for the short/medium term the cheap Dell/HP OEM generic system is still with us, and growing in developing markets. Nvidia's groundwork and aggressive pricing keep these OEMs in close partnership... for now. 1920x1080 is now (or soon will be) the predominant screen resolution, and that represents just too many pixels to push (at good IQ) for both the integrated hardware (core speed, shader count, performance/watt) and the available frame buffer; see the rough numbers sketched after this post.
    Intel is probably on the right track with on-silicon GDDR (Ivy Bridge or Haswell) and has the process advantage (smaller node, 3D transistors, lower power requirement etc.), while AMD have the proven GPU technology but are in reality more than a full process node behind.

    In the laptop segment I think discrete graphics should all but disappear except for desktop/workstation replacements, and that probably represents less of a concern for Nvidia IF their Project Denver SoC comes to fruition.

    Nvidia seem to think that real-time ray tracing in games is 3-4 years away, at which point you could say: what do you need after photorealistic gaming? The answer is probably not much. Once gaming (assuming it ever moves away from DirectX 9) reaches this level, I think we are close to moving full circle: GPUs taking on CPU attributes, and CPUs having fully integrated graphics (their own on-die GDDR memory etc.).

    As for Nvidia in the handheld/phone market, their acquisition of Icera and their working relationship with Microsoft probably means that they are well aware of how cutthroat the market is. A failure to make inroads there probably makes Nvidia a pretty good buyout target for a company that needs a quick boost into the mobile and GPGPU markets;)
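A minimal back-of-envelope sketch of the "too many pixels" point above. The colour depth, overdraw and frame-rate figures below are illustrative assumptions, not numbers from the post:

```python
# Rough estimate of what 1920x1080 asks of an IGP's shared memory.
# Assumptions (illustrative only): 32-bit colour, ~3x overdraw, 60 fps target.

width, height = 1920, 1080
bytes_per_pixel = 4          # 32-bit colour
overdraw = 3                 # each pixel written ~3 times per frame (assumed)
target_fps = 60

framebuffer_mb = width * height * bytes_per_pixel / 2**20
colour_traffic_gbs = width * height * bytes_per_pixel * overdraw * target_fps / 1e9

# One 64-bit DDR3-1333 channel peaks at ~10.7 GB/s, and the IGP shares it with the CPU.
ddr3_channel_gbs = 1333e6 * (64 / 8) / 1e9

print(f"Single 32-bit framebuffer: {framebuffer_mb:.1f} MB")
print(f"Colour writes at {target_fps} fps, {overdraw}x overdraw: ~{colour_traffic_gbs:.2f} GB/s")
print(f"One 64-bit DDR3-1333 channel: ~{ddr3_channel_gbs:.1f} GB/s peak")
```

Colour writes alone look modest; it is the texture, Z and geometry traffic on top, all contending with the CPU for the same channel, that makes good IQ at 1080p hard work for an IGP.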
  11. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74

    Are we going to see Larrabee surface again?
     
  12. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    My guess is yes. Intel might have cancelled Larrabee 2 (or was it 3?), but as far as I'm aware the GPGPU project was never cancelled, only the retail (desktop) discrete card... and I've heard and read that the latter might not be completely set in stone either.
    Whether Intel have the ability to go toe-to-toe with Nvidia and AMD is another matter entirely... although IP and knowledgeable staff probably aren't hard to come by if you have the kind of financial resources that Intel have. A lot would likely depend upon what (if any) GPU IP sharing / cross-licensing contracts are in place between Intel and AMD (from the $1.25bn settlement in 2009), and Intel/Nvidia (from the $1.5bn settlement in Jan 2011).
  13. Archean

    Archean TechSpot Paladin Posts: 6,035   +70

    I think on-silicon GDDR is a good idea, but I am not sure how this will play out with regard to battery life, because that is very important for most users. Perhaps the power they may save from going to a smaller node / 3D transistors etc. will compensate for this, but I can't find anything about it. Anyway, if nVidia eventually goes belly up in 3-5 years' time, I wonder who will show up at the door to buy it...
  14. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74


    The last one canceled was Larrabee 3, I think, but it has been revised/renamed so many times and in so many different incarnations that it's hard to tell. It hasn't been able to keep pace with current graphics demands, by all appearances. The last 'demo' was actually something of an embarrassment for them (an old scene from Half-Life, I think), with one synchronized action of rolling waves running at not-too-impressive frame rates, although they said the whole scene was ray traced.
    Maybe some cross-licensing with AMD for their GPU multi-threading tech will help Larrabee along. I wonder if real-time ray tracing will be dropped, as it seems to be the real choking point of the project. But then, with the HD 6000 and the Fermi series, that is fast becoming the only functional difference on the GPU side, isn't it?
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    @Archean
    Not sure I follow.
    For instance, HD 2000/HD 3000 Sandy Bridge graphics use DDR3 running at a nominal 1.5-1.65V over a 64-bit memory bus.
    Samsung's (for example) GDDR5 runs at 1.35V and can be paired with 64, 128, 192, 256, 320, 384, 512-bit (and larger) memory buses.
    You might also take into consideration that for any given clock rate GDDR5 has twice the bandwidth of DDR3, and while the introduction of DDR4 @ 1.05-1.1V would lower power draw over DDR3 (and double bandwidth), the same process will likely be applied to GDDR5's successor... performance and voltage margin restored (a quick peak-bandwidth comparison is sketched at the end of this post).

    @red
    I think (personally) that rasterization in its present form is probably playing out the string. Nvidia have had ray tracing up and running for some time already, and thanks to some revenue opportunities (and risk sharers) seem to be getting through their debugging and proof-of-concept suites in fairly good order. There is also the next stage of rasterization + adaptive tessellation to consider (micropolygon rendering*) (crappy html >here<, primo pdf >here<)... so all in all, it's like Intel bringing their T-ball bat to face Nolan Ryan.

    This paper on Decoupled Sampling for Graphics Pipelines (also presented at Siggraph) is also worth a once-over (PDF link on the site).
    Both micropolygon rendering and decoupled sampling have had heavy investment from Nvidia (along with a few other studies and technologies), and I think it's a certainty that Nvidia wouldn't be pushing these advances if they weren't looking to shape (or reshape) games rendering. Nvidia are obviously working steadily towards a very heavily GPU-compute future as far as games are concerned, which ties in with their GPGPU aspirations. It will be interesting to see whether AMD tries to foot it with Nvidia or whether they take a different path. I would think that if they choose the former, then it will call for a fundamental change in design philosophy, certainly a new architecture, and some serious software development or third-party funding.
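A minimal sketch of the peak-bandwidth comparison above, using the rule of thumb from the post (per clock, GDDR5 moves roughly twice the data of DDR3). The clocks and bus widths are illustrative examples rather than any specific product:

```python
# Theoretical peak bandwidth = memory clock x transfers per clock x bus width.
# DDR3 is double-data-rate (2 transfers/clock); GDDR5 is effectively quad-pumped (4).

def peak_bandwidth_gbs(memory_clock_mhz: float, transfers_per_clock: int, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return memory_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

ddr3  = peak_bandwidth_gbs(666, 2, 64)    # DDR3-1333 on a single 64-bit channel
gddr5 = peak_bandwidth_gbs(1000, 4, 128)  # GDDR5 at a 1000 MHz memory clock, 128-bit bus

print(f"DDR3-1333, 64-bit:       ~{ddr3:.1f} GB/s")
print(f"GDDR5 1000 MHz, 128-bit: ~{gddr5:.1f} GB/s")
```

At the same memory clock and bus width the GDDR5 figure would simply be double the DDR3 one; the wider buses GDDR5 is typically paired with stretch the gap further.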
  16. Archean

    Archean TechSpot Paladin Posts: 6,035   +70

    Okay, let me re-phrase, and please correct me if I am wrong. The way I understood this is that Intel plans to place GDDR memory on-board, i.e. it will be separate from the main system memory; hence it will add to power requirements/draw, which will surely have an impact on overall battery life. However, if there are reasonable power savings from going to a 22nm / 3D transistor setup, they may save enough power to offset this drawback, and overall battery life may remain unchanged from Sandy Bridge or, in the best case, may even improve.
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,788   +639

    You could probably look at it from two viewpoints:
    The first is that having dedicated GDDR memory should allow for less aggressive timings and lower-voltage system RAM, since the GDDR will take care of most of the heavy lifting, bandwidth-wise, that the RAM would normally be used for. You could also factor in shorter traces, a simpler VRM, lower latency and less demand on system RAM.
    The second point is probably that GDDR on-die will become a necessity as more performance is required of the GPU. If core speed (voltage, amperage, heat) becomes restrictive due to the need to keep the package within a thermal limit and set TDP, then the onus falls upon shader pipeline count, memory bus width and memory speed to add the performance gains... the last two being heavily favoured by GDDR (rough numbers are sketched after this post).
    Remember that 3D "tri-gate" transistors are already in use with memory ICs, so whatever gains they bring for the CPU/APU are gains for memory also.
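A minimal sketch of the "clock is capped, so go wider" argument above, with purely illustrative shader counts, clocks and a simplified FLOPs-per-shader assumption:

```python
# If the package TDP caps the core clock, throughput gains have to come from
# width: more shaders, a wider memory bus, faster memory. Numbers are illustrative.

def shader_gflops(shader_count: int, clock_mhz: float, flops_per_shader_per_clock: int = 2) -> float:
    # 2 FLOPs/shader/clock assumes one fused multiply-add per cycle (simplification)
    return shader_count * clock_mhz * 1e6 * flops_per_shader_per_clock / 1e9

baseline = shader_gflops(80, 1100)    # a small IGP-class part (hypothetical)
wider    = shader_gflops(160, 1100)   # double the shaders at the SAME clock

print(f"Baseline:        ~{baseline:.0f} GFLOPS")
print(f"Doubled shaders: ~{wider:.0f} GFLOPS")
```

Power still rises when you add units, but widening at a fixed clock generally costs less in voltage and heat than chasing the same gain through higher clocks, which is why bus width and memory speed lean towards GDDR.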

