Gigabyte's GTX 980 tri-SLI Waterforce cooler launches for $3,000

By Scorpus
Dec 16, 2014
  1. Gigabyte's crazy closed-loop water cooling solution for three high-end graphics cards, Waterforce, is now on sale with a whopping $3,000 price tag. The mammoth liquid cooling unit that sits atop your PC case does come with three Nvidia GeForce GTX...

  2. MeladT

    MeladT TS Rookie Posts: 22   +7

    Not too sure about having exposed tubes, but it does look pretty damn cool.
  3. Skidmarksdeluxe

    Skidmarksdeluxe TS Evangelist Posts: 6,335   +1,936

    I can't think of anything better to waste my money on.
  4. Nima304

    Nima304 TS Guru Posts: 365   +81

    Those exposed tubes are a deal-breaker for me. If they break during transportation, which is certainly a possibility, then you have to spend a ton of money just to get them fixed. Not to mention one or more of your GPUs are useless during that time.
  5. RustyTech

    RustyTech TS Guru Posts: 814   +382

    I'd like to get a new car; Tesla Model S perhaps?
  6. davislane1

    davislane1 TS Evangelist Posts: 3,370   +2,161

    Tesla is moving the Model S for $3,000?
  7. RustyTech

    RustyTech TS Guru Posts: 814   +382

    Lol no...I was just saying I'd love to blow my money on something like that too ;)
    If those things were going for 3k, I'd buy a few!! :D
  8. GhostRyder

    GhostRyder You know, that one guy with the PC Posts: 2,189   +589

    $3K, are they kidding with that price? A standard 980 runs about $550-700 depending on the variant, and even picking a nice variant, adding liquid blocks and building the LC setup myself, I could do it all for less than $3K.

    Sure, I guess the argument is that this comes pre-assembled with three separate coolers plus the fancy display, but weigh the practicality against the fact that it's not exactly cheap and requires a very high-end system to drive it properly, and you're looking at a weird spot for this thing to sit in.
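
    As a rough sanity check on the cost argument above, here is a back-of-envelope sketch in Python. The prices are illustrative assumptions (a mid-range GTX 980 at roughly $600, a full-cover block at about $120, and around $250 for the rest of the loop), not quotes from any retailer.

    ```python
    # Rough comparison of a DIY tri-SLI liquid-cooled setup versus the
    # $3,000 Waterforce bundle (which includes the three cards).
    # All prices are illustrative late-2014 assumptions, not quotes.
    gtx_980_price = 600       # assumed mid-range GTX 980 price ($550-700 range)
    waterblock_price = 120    # assumed full-cover GPU waterblock, each
    loop_parts = 250          # assumed pump, reservoir, radiator, fittings, tubing

    diy_total = 3 * (gtx_980_price + waterblock_price) + loop_parts
    waterforce_total = 3000

    print(f"DIY estimate:          ${diy_total}")          # ~$2,410
    print(f"Waterforce bundle:     ${waterforce_total}")
    print(f"Premium for pre-built: ${waterforce_total - diy_total}")
    ```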
  9. davislane1

    davislane1 TS Evangelist Posts: 3,370   +2,161

    This is designed for enthusiasts with money to blow. Trying to make a rational case for such an upgrade is like trying to fit a square into a triangular hole - not gonna happen. The one guy who mentioned practicality during the design phase probably gave the rest of the team a good chuckle.
  10. Skidmarksdeluxe

    Skidmarksdeluxe TS Evangelist Posts: 6,335   +1,936

    I don't take much of an interest in cars these days, and all I know about the Tesla is what I've read on this site, but if you could get a new one for the same price as this setup, you would be unwise not to do so. $3K in my country would net you a half-decent second-hand runaround, depending on where you shop.
  11. GhostRyder

    GhostRyder You know, that one guy with the PC Posts: 2,189   +589

    Even so, this just seems like a weird attempt at making something look easy that in the end is more of a pain. The price is above a custom-made loop, which these days is pretty easy to build and could be made to perform better (or heck, fold your CPU into it), while on the other side, buying three NZXT or equivalent hybrid coolers would be cheaper and work just as well (albeit I understand these are full waterblocks, I doubt the performance would be much different at stock or when overclocking).

    You're right, this is intended for enthusiasts with deep pockets who want an LC system with three GPUs without having to do much themselves. But in the end I feel the idea gets overshadowed when you look at how much of a pain routing everything through the front is, not to mention it throws the looks away with tubes coming out the front like that. It just seems to me they should have thought a bit more about the overall idea, but that's just my opinion.
  12. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Never been a big fan of external watercooling setups, and this setup looks like a more ungainly Koolance Exos 2.5 without the ability to customise/upgrade.

    Bearing in mind that a lot of enthusiast outlets now pre-fit waterblocks to cards to order for those unwilling to do so themselves, this Gigabyte system looks like a huge cash premium as a trade off for cutting and attaching some tubing.
  13. Steve

    Steve TechSpot Editor Posts: 2,184   +1,215

    Are you seriously going to carry around a full tower with that thing on top to a LAN party?

    That would just scream "I really want to show this thing off and am willing to go through an Ironman-like course to do so."
  14. While it may not look pretty, it is PWREFULL
  15. Ranger12

    Ranger12 TS Guru Posts: 620   +118

    So Techspot, when can I expect a benchmark and review of this system? :)
  16. hrowder

    hrowder TS Enthusiast Posts: 55   +9

    See... here's the thing: the cards only have 4GB of memory on them. If memory "stacked" in SLI (or, for that matter, CrossFire if using ATI cards), then you might have something viable for 4K monitors of 32" or larger widescreens, which is what we are all looking for. But... no memory stacking, so this doesn't really help. Plus, no motherboard or processor I know of will run three cards at PCIe x16; they drop to PCIe x8 the minute you put three on the board, and then you take a performance hit. Also, a lot of other peripherals now share the PCIe bus, so you will limit your options there too.

    If they somehow make it technically possible to use the memory on ALL the cards at the same time as well as the GPUs, and have a motherboard and processor (that is where the PCIe lanes are allocated) that can run the bus at PCIe x16 for fast data transfer, then OK... you've got something. Otherwise three cards are just for bragging rights, and at this price bragging is pretty... unproductive.
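
    For context on the x16 versus x8 worry above, the theoretical one-way PCIe bandwidth can be worked out from the spec figures (PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding). A minimal sketch; these are spec-sheet ceilings, not measured game traffic.

    ```python
    # Theoretical one-way PCIe 3.0 bandwidth per link width, from spec
    # figures: 8 GT/s per lane, 128b/130b line encoding.
    def pcie3_bandwidth_gb_s(lanes):
        gt_per_s = 8.0            # giga-transfers per second, per lane
        encoding = 128 / 130      # usable fraction after 128b/130b encoding
        return lanes * gt_per_s * encoding / 8   # 8 bits per transfer -> GB/s

    for lanes in (16, 8, 4):
        print(f"PCIe 3.0 x{lanes}: ~{pcie3_bandwidth_gb_s(lanes):.2f} GB/s each way")
    # x16 ~15.75 GB/s, x8 ~7.88 GB/s, x4 ~3.94 GB/s
    ```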
  17. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    A 4GB framebuffer is still adequate in the majority of gaming scenarios. You would need to seriously cherry pick benchmarks to show otherwise.
    Stacked memory is HBM/HMC. I believe the term you're after is "shared (or unified) memory pooling".
    This is both incorrect (quite a number of PEX8747-equipped boards offer 3-way and 4-way PCI-E 3.0 x16 functionality) and irrelevant. The system bus isn't constrained by eight lanes electrical, nor four lanes for that matter.
    See above
    IIRC, Tiling and SFR (Split frame rendering) can use a unified memory pool - they just aren't efficient...and in the case of tiling, can't be used in conjunction with OpenGL.
    This has pretty much always been the case, although it is demonstrably true that in the past, triple CFX/SLI alleviated microstutter considerably in the absence of effective frame pacing technology.
  18. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 8,430   +2,822

    You make it sound as if increased pixel density uses less memory. 4K is still 4K even if it is a 2" screen.
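
    To put that in numbers, here is a quick sketch of what the basic render targets at 4K cost in memory. It assumes a simplified buffer set (one 32-bit colour target, one depth/stencil target, an optional 4x MSAA colour target); real engines allocate many more intermediate surfaces, so treat it as a lower bound.

    ```python
    # Rough VRAM cost of the basic 4K render targets. Pixel count, not
    # physical screen size, is what drives the numbers. The buffer set is
    # a simplifying assumption; real engines use far more surfaces.
    width, height = 3840, 2160
    bytes_per_pixel = 4                      # 32-bit RGBA / depth-stencil

    color = width * height * bytes_per_pixel
    depth = width * height * bytes_per_pixel
    msaa_4x_color = color * 4

    to_mb = lambda b: b / (1024 ** 2)
    print(f"4K colour buffer: {to_mb(color):.1f} MB")          # ~31.6 MB
    print(f"4K depth buffer:  {to_mb(depth):.1f} MB")          # ~31.6 MB
    print(f"4x MSAA colour:   {to_mb(msaa_4x_color):.1f} MB")  # ~126.6 MB
    # The frame buffers themselves are small; the bulk of a 4GB card goes
    # to textures, geometry and the engine's intermediate targets.
    ```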
  19. RustyTech

    RustyTech TS Guru Posts: 814   +382

    Whenever I see a post from dividebyzero, I know I'll be seeing some serious facts :D
    I say TechSpot should promote this guy :p
  20. Misagt

    Misagt TS Booster Posts: 94   +40

    If I were thinking of dropping this much money, I think I'd just go ahead and do my build-a-PC-in-a-fridge idea. I know a guy who did this and it was pretty cool. Also, it's 10 years old now and still going strong.
  21. hrowder

    hrowder TS Enthusiast Posts: 55   +9

    As DivideByZero points out, with today's gaming a 4GB buffer is still adequate for monitors like 1920x1200 or whatever widescreen equivalent you may choose (25xx by 14xx or whatever). I was referring to the trend towards 4K monitors and what that will require. I still say present hardware, even three decent video cards, though better than any one card can do, will not be adequate for 4K monitors at the speeds we have come to believe are necessary for smooth rendering. It might take 12GB single cards, possibly water cooled, though I think two cards with "unified memory" (thanks for that correction, my bad!) will be the most-used solution, because the GPU power is also necessary and it avoids the need for water cooling, though we still might go there (multiple 8GB 295Xs, anyone?).

    I am not sure the present PCIe bus is suitable with three or more cards, but with new motherboard designs that fully utilize it with multiple cards it may still be possible, if you can cool the whole package reliably (think about case designs and the expense of three or four cards too, especially if water cooled!). I'm thinking it will require new hardware, possibly including new bus designs, to handle the rendering power necessary for future 4K gaming. Perhaps simply fully utilizing what is here will work, but I'm not sure about that. It is irrelevant to state that tiling and so forth can be made to use the memory in multiple cards (at what speeds?) if games as written, using OpenGL, cannot, since that is what we have and probably will have for the foreseeable future.

    Certainly I know that a screen of any size using the same number of pixels requires the same rendering power. I was merely stating that ultra-wide single large monitors of 4K or even higher resolution are what most of us want for an immersive experience with no bezel in the field of view, and today's hardware is not sufficient for that task at the speeds we need and expect to have. The monitors ARE coming; new hardware to run them adequately, hopefully at native refresh rates of at least 60Hz (120Hz would be better), will have to be forthcoming as well. No matter how we do it, we are not there yet. That is what I am trying to say.
  22. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    What you're arguing for is a pipe dream. Never has an incoming gaming screen resolution been absolutely playable with a single card in all situations since multi-GPU has been in existence - for the simple reason that if a single card could provide the rendering capability it would destroy the market for multi GPU (including dual GPU cards) graphics. When 1280x1024 debuted the only setup that could provide the necessary fillrate was 3dfx's Voodoo Graphics in SLI - and even that came at the expense of 16-bit colour.
    Nothing has changed in the intervening years.
    The 295X2 uses 4GB per GPU. The 8GB advertised memory is not additive, so I'm unsure why you'd make an exception for it. FWIW, even if it were, it could not make use of the additional framebuffer in the vast majority of cases because the GPU's ROP count and memory interface remain static. Note that the 8GB 290X only pulls away from the 4GB version when swamping the framebuffer with texture packs - the resulting difference isn't due to computing power but the ability to hold the textures in the memory buffers. This isn't the de facto situation for PC games.
    It's fine, mainly because most game code relies on internal bandwidth (GPU <-> vRAM) far more than external (PCI-E) bandwidth.
    Very unlikely. If anything, the reverse will be true. Indications are that both Nvidia and AMD will move to incorporate ARM architecture CPUs into graphics packages.
    I mentioned it solely in the interest of completeness. Tiling has greater issues IIRC - namely needing to overdraw geometry for each adjacent tile for each tile it renders - basically a doubling up of workload.
    ...and we never will be. By the time a single GPU can render 4K in every image quality scenario, Nvidia and AMD will have moved the goalposts again. Both game development programs will add more post-process compute based rasterization techniques as well as path tracing (if not ray tracing) such as voxel based global illumination...and by the time a single GPU can handle that, 4K will be the equivalent of 1080p and we'll be looking at how much GPU power will be required for 8K gaming.

    Basically, if you're waiting for some idealized scenario where 100% of gaming is playable at a new standard of screen resolution, you'll need to wait two or three graphics card generations...and by then, of course, that "new standard of screen res" is yesterday's news.
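
    The goalpost-moving point is easy to see in raw pixel counts: each resolution step roughly quadruples the per-frame shading work. A small sketch using only the standard resolution figures, nothing vendor-specific.

    ```python
    # Per-frame pixel counts for common gaming resolutions; each step up
    # is roughly a 4x jump in work, before any change in image quality.
    resolutions = {
        "1080p": (1920, 1080),
        "1440p": (2560, 1440),
        "4K":    (3840, 2160),
        "8K":    (7680, 4320),
    }
    for name, (w, h) in resolutions.items():
        mpix = w * h / 1e6
        print(f"{name:>6}: {mpix:5.1f} Mpixels/frame, {mpix * 60:7.1f} Mpixels/s at 60 fps")
    ```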
  23. hrowder

    hrowder TS Enthusiast Posts: 55   +9

    Hi again DivideByZero... I was using the 295X as another example just like the 980X3 being written about here. I trust you are more up to the minute about graphics rendering than I, so your analysis of the PCIE bus and its viability for future gaming I accept as being better than my own.

    The whole "post processing" thing is new to me since I am now some years retired from fixing computers for various manufacturers, though I try to keep up. I was sort of surprised to see an option selection for it in the Dragon Age Inquisition graphics section, let alone what it would mean to me to adjust it. I never claim to be anywhere near an "expert" on the software side of things as that is not where I worked, though I appreciate its importance for moving things forward and have "dabbled" in programming. I do understand that the hardware necessary for 4K gaming is not going to be cheap at this time. The problem is right now it would take at least 5-$7000 to do it right (or at least as "right" as is possible at this time) if you include the monitor. I do think that 4K will become common and the hardware to run it "mainstream" and less expensive in a very short time though because the demand is there and will be filled.

    I certainly agree with other things you say as well, including the part about "moving the goalposts" as time goes on. As someone who started out on vintage early-1980s "orange text" monitors and a Franklin Ace 1000 (I had to hunt through offerings at "computer shows" to get aftermarket hardware to improve the poor thing!), I look back on the various benchmarks, from the first 1MB hard drive, to minicomputers that ran whole transportation systems on a full 1MB of memory (oh the humanity!), through the advancement of CPUs, GPUs and memory, and can only imagine what even 10 more years will bring. It's been an exciting ride so far... I hope it continues and that I am around to see it.
