TechSpot

Rumor: Nvidia GeForce GTX 880 specs revealed

By Scorpus
Apr 10, 2014
  1. The latest rumor, from Tyden.cz, allegedly details the Nvidia GeForce GTX 880, which is set to be the company's flagship Maxwell-based graphics card when it launches later this year. Of course, it's always worth taking rumors with a grain of...

     
  2. GhostRyder

    GhostRyder This guy again... Posts: 2,191   +590

    Odd. These specs seem strange in general, because the Maxwell architecture is supposed to be much more efficient (and more powerful, core for core) than past generations when comparing CUDA cores. The 750 Ti has fewer CUDA cores than its 650 Ti counterpart while delivering more performance, so cramming that many cores onto the flagship card seems unnecessary; given how much better the 750 Ti's compute was than its counterparts, it could essentially blow the 780 Ti and the Titan away with ease.

    Plus, a 256-bit bus is pretty small, and a step back from the 780 Ti... so I'm getting curious about this.
     
  3. TomSEA

    TomSEA TechSpot Chancellor Posts: 2,559   +598

    Sweet! These 800 series cards are the trigger for my new rebuild. Seems like I've been waiting forever for them.
     
  4. Skidmarksdeluxe

    Skidmarksdeluxe TS Evangelist Posts: 6,509   +2,056

    I have a 7 series GPU, so I'm sure I can get away with skipping the next two or three generations. On average I replace my GFX card every four years.
     
    misor likes this.
  5. VitalyT

    VitalyT Russ-Puss Posts: 3,155   +1,431

    I'm using an Nvidia GTX 780, and my next video card will have to satisfy the following requirements:

    1. Use HDMI 2.0 and DisplayPort 1.3 for outputs;
    2. Perform superbly in 4K;
    3. Support DirectX 12 or later;
    4. Be at least as ergonomic as my current GTX 780 (size + power consumption);
    5. Use GDDR6 (expected in 2014).

    Until a product is released that satisfies all that, I won't be upgrading. I think it will take a bit longer than 1 year for such products to appear, which is fine by me :)
     
    Last edited: Apr 10, 2014
    Burty117 and Jad Chaar like this.
  6. Jesse

    Jesse TS Evangelist Posts: 359   +42

    Ergonomic?
     
  7. theBest11778

    theBest11778 TS Addict Posts: 234   +69

    The specs seem off to me as well. The core count would be right if Nvidia is truly aiming for 4K gaming; a Maxwell with 3200 CUDA cores would handle that. 4GB of VRAM sounds right too, although I bet the Titan version will have 8GB, which helps them keep their market segments segregated. 256-bit @ 7.4GHz seems stupid, though. 5GHz GDDR5 on a 512-bit bus (like AMD is using) is smarter in so many ways: the cost is equivalent, and you get more bandwidth. Jumping to 6GHz GDDR5 @ 512-bit would make perfect sense, as it's faster than AMD's solution for only a small price increase (6GHz GDDR5 is much cheaper and more common than 7GHz GDDR5, and uses less power). The only reason to use a 256-bit bus would be to reserve the 512-bit bus for the Titan version as another incentive to shell out a few hundred dollars more. Bad move, in my opinion.
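
    For what it's worth, the bandwidth arithmetic behind this argument (and GhostRyder's "step back" point above) is easy to check. Here's a minimal Python sketch; the first and last configurations are the rumored and hypothetical ones from this thread, alongside the GTX 780 Ti's and R9 290X's known setups:

        # Peak memory bandwidth = (bus width in bits / 8 bits per byte) * effective data rate.
        # The GTX 880 and 6 GT/s configs are rumored/hypothetical, not confirmed specs.
        def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
            """Peak bandwidth in GB/s for a bus width and effective data rate (GT/s)."""
            return bus_width_bits / 8 * data_rate_gtps

        configs = {
            "256-bit @ 7.4 GT/s (rumored GTX 880)": (256, 7.4),
            "384-bit @ 7.0 GT/s (GTX 780 Ti)":      (384, 7.0),
            "512-bit @ 5.0 GT/s (AMD R9 290X)":     (512, 5.0),
            "512-bit @ 6.0 GT/s (hypothetical)":    (512, 6.0),
        }
        for name, (width, rate) in configs.items():
            print(f"{name}: {bandwidth_gbs(width, rate):.1f} GB/s")
        # -> 236.8, 336.0, 320.0 and 384.0 GB/s respectively

    So the rumored 256-bit bus would indeed deliver less bandwidth than both the 780 Ti and AMD's 512-bit solution, even at 7.4GHz.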
     
  8. VitalyT

    VitalyT Russ-Puss Posts: 3,155   +1,431

    = within the same size and power consumption.
     
  9. VitalyT

    VitalyT Russ-Puss Posts: 3,155   +1,431

    I updated my list above to include GDDR6; the spec has been around for a while, but the first products using it are expected this year, so I would definitely wait.
     
  10. er·go·nom·ics (ûr′gə-nŏm′ĭks)
    n.
    1. (used with a sing. verb) The applied science of equipment design, as for the workplace, intended to maximize productivity by reducing operator fatigue and discomfort. Also called biotechnology, human engineering, human factors engineering.
    2. (used with a pl. verb) Design factors, as for the workplace, intended to maximize productivity by minimizing operator fatigue and discomfort: The ergonomics of the new office were felt to be optimal.
     
  11. VitalyT

    VitalyT Russ-Puss Posts: 3,155   +1,431

    Ergonomic refers to the efficiency of using an object, with an emphasis on maximizing that efficiency and minimizing the discomfort of using it. With reference to a video card, that implies keeping the same size and power consumption.

    P.S. Don't do that again, please...
     
    Burty117, Jesse and Jad Chaar like this.
  12. Tokido

    Tokido TS Rookie Posts: 17

    Who wants to buy my dual GTX 780 Ti Classified Hydro Coppers?
     
  13. Jad Chaar

    Jad Chaar TS Evangelist Posts: 6,477   +965

    I'll give you $1.
     
  14. cmbjive

    cmbjive TS Booster Posts: 777   +137

    I just got my 780s two months ago. No need to upgrade again (besides, my wife would kill me if I did).
     
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    If the GPU were aimed at 4K gaming, then the back end would have been beefed up. 40 ROPs servicing 3200 cores? No thanks.

    Just bear in mind that it's silly season. People realize that the current process node is just playing out the string, with new models (295X2, Titan Z) on aging architectures brute-forcing performance at the expense of common sense. People also know that a new process is inbound and are starting to get impatient, so it becomes a perfect environment for others to put their guesses and estimates up on sites for an instant page-hit deluge.

    How likely is it that a card that is in all probability still in the design stage has been given a retail price? That's aside from the fact that TSMC's CLN20SOC (20nm) process isn't, by all accounts, geared for high-power ICs, so the GM204 will need to wait for the 16nm FinFET process (a 20nm BEOL + 16nm FEOL hybrid node), which isn't slated to be ready for primetime until very late 2014/early 2015 at best.

    BTW: that die represents an 11% increase in core count plus a sizeable reduction in uncore (memory I/O, controllers, and probably cache and SFUs) compared with the GK110, on a ~30% smaller node, yet it is slated to cost the same as the 780 Ti*. New wafer prices must be exorbitant (they are rumoured to be 50% higher). As an aside, that would make the Pirate Islands GPU around the same size as the GM204 (460-470mm²), if these weird numbers from the renowned bad-guessers at WCCF are to be believed :eek:

    * The GM204 part should be an analogue in the product hierarchy of the GF104/GF114 and GK104 (second-tier performance), not the GK110, in which case the core count seems hugely optimistic.
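
    (A quick check of the figures above, assuming the GK110's known 2880 CUDA cores against the rumored 3200, and a straight 28nm-to-20nm linear scaling; a rough sketch, not authoritative:)

        # GK110 core count is the known Kepler flagship spec;
        # the GM204 count and the node shrink come from the rumor discussed above.
        gk110_cores, gm204_cores = 2880, 3200
        print(f"Core count increase: {(gm204_cores / gk110_cores - 1) * 100:.1f}%")  # ~11.1%

        # Linear shrink from TSMC 28nm to 20nm, i.e. the "~30% smaller node":
        print(f"Linear shrink: {(1 - 20 / 28) * 100:.0f}%")  # ~29%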
     
    Last edited: Apr 10, 2014
    Burty117 and Jad Chaar like this.
  16. MrBungle

    MrBungle TS Booster Posts: 151   +67

    I see your $1 and raise you 31¢!
     
  17. Now, if these figures are true, it will be interesting to see what AMD's TDP response will be to keep up in performance.
     
  18. Will this new 880 offer a significant performance improvement over the 780?

    P.S. I'm a total noob here.
     
  19. Skidmarksdeluxe

    Skidmarksdeluxe TS Evangelist Posts: 6,509   +2,056

    If all the hype surrounding Maxwell is to be believed, then its main party trick is a significant reduction in power consumption, or as they like to put it, better performance per watt.
    Personally, I feel replacing a graphics card (or any component, for that matter) just for the sake of the latest model and tech is false economy.
    Methinks your GTX 780 will be sufficient for all games for a while to come.
     
    VitalyT likes this.
  20. gobbybobby

    gobbybobby TS Guru Posts: 548   +8


    On paper, yes.

    but we will not know until the card is actually confirmed, released, and benchmarked.

    I just got an R9 270X from AMD's offerings and am quite happy with it, especially as the main game I'm playing at the moment, PlanetSide 2, was completely broken by the latest Nvidia drivers (PS2 players have to roll back for the game to run).
     
  21. It was only a joke from the author... it's not really the 880...
     
  22. Burty117

    Burty117 TechSpot Chancellor Posts: 2,920   +687

    This ^^ this is me xD

    I also have a 780 and I'm waiting for the exact same things. In the meantime, though, I'm going to get some G-Sync in my life, as the 780 tends to struggle at 1440p in a surprising number of games :)
     
  23. Most English speakers understand "ergonomic" to describe efficiency in relation to human effort, not machine effort. The semantic range of the word does not include your usage (perhaps you are starting, or are part of, a movement that is expanding the word's semantic range, but that usage has not made it into any dictionaries; language is alive and constantly changing, but we still want to be understood with as little distraction as possible, so it is typically best to avoid malapropisms). I am just a random reader who happened to see your response and felt sorry for the guy you told to "not do that again." Unfortunately, I now have a lowered opinion of this site. Perhaps you should edit your original comment, replacing "ergonomic" with "efficient," and delete the comments relating to it. That would be much better, don't you think?
     
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Works for me, but I'm guessing a true (non-tiled) 3840x2160 panel at (at least) 60Hz with 10-bit (or preferably 12-bit) colour that also supports 4:3 pixel ratios won't be overly cheap... those being my parameters for a 4K screen.
    That might have to wait for a generation of cards geared towards 8K gaming. It's the nature of the business to shoehorn in image quality and render enhancements whenever and wherever possible, to keep hardware perpetually playing catch-up with gaming software. The next major addition to image quality seems to be shaping up as global illumination (and possibly path/ray tracing). Somehow I can't see Maxwell or Pirate Islands running 4K with image quality settings that have been commonplace for years, such as transparency supersampled antialiasing (or even widespread use of 4x multisampling) and compute shader options such as depth of field, ambient occlusion, particle/water/smoke effects, etc.
    Probably a given at this stage. Some DX12 features are already available to DX11-level hardware.
    Maximum board length is set at 312mm by the ATX specification. Power consumption is pretty much governed by OEM demands, so that shouldn't change: under 75W for entry level, 75-150W for lower mainstream, 150-200W for mainstream/low-end performance, 200-250W for performance (second tier), and 250-300W for top-tier single GPUs.
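
    Those power bands are easy to encode as a simple lookup if anyone wants to play with them; a minimal sketch (the tier boundaries are the ones quoted above, not an official spec):

        # OEM board-power tiers as quoted above (upper bound in watts, tier name).
        TDP_TIERS = [
            (75, "entry level"),
            (150, "lower mainstream"),
            (200, "mainstream/low-end performance"),
            (250, "performance (second tier)"),
            (300, "top-tier single GPU"),
        ]

        def classify_tdp(watts: float) -> str:
            """Return the market tier for a given board power, per the bands above."""
            for ceiling, tier in TDP_TIERS:
                if watts <= ceiling:
                    return tier
            return "above the single-GPU envelope (dual-GPU territory)"

        print(classify_tdp(275))  # -> "top-tier single GPU"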
    There's a good chance that both Nvidia and AMD will use Hynix's High Bandwidth Memory (HBM). Stacking DRAM chips offers much more bandwidth than GDDR6, and the latter, being based on DDR4, might end up quite expensive, since production is tied to fairly limited sales - basically Haswell-E based servers and a smattering of desktop buyers.
    Yeah, I'm figuring late 2015/early 2016 at the earliest for a lot of this to come together, although gaming graphics requirements (read: IHV sponsorship) should always ensure that there is a market (need?) for multi-GPU setups.
     
    Burty117 and VitalyT like this.
  25. Patrick Proctor

    Patrick Proctor TS Rookie

    These specs are bull all the way down to the core count. Some universities have already seen engineering samples, including the Nvidia institute at Carnegie Mellon. The memory bus is 512 bits wide, there are 288 ROPs and 94 TMUs, the core count is 3584, and the base clock is 972MHz.

    The wattage is about 250W, on par with the 780 Ti. Outputs are 2 DVI-D ports, 1 DisplayPort 1.3 and 1 HDMI 2.0 port.

    This is what you have to look forward to courtesy of my blabbermouth friend at CMU. Merry Christmas in June everyone.
     
