Rumor: Nvidia GeForce GTX 680 to arrive in February

By Jos
Jan 18, 2012
  1. Nvidia was all about Tegra 3 and mobile computing at last week's Consumer Electronics Show, but the company hasn't forgotten about hardcore gamers awaiting their next-generation desktop products. In fact,…

  2. captainawesome

    captainawesome TechSpot Booster Posts: 411   +42

    I wonder why AMD goes for the bigger framebuffer when Nvidia seems to like the 2GB max..?
  3. Muggs

    Muggs Newcomer, in training Posts: 56

    Been waiting on the 600 series to upgrade my 400 series card. Will be nice to have a single-card solution again.
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,787   +639

    @ captainawesome
    That comes down to the BoM (Bill of Materials). Because Nvidia GPUs tend to be larger than AMD's (and hence more expensive to produce), Nvidia usually doesn't endow its cards with other big-ticket components, which usually shows in the areas of voltage regulation and vRAM (both quantity and speed). For example, the 3GB of VRAM that the HD 7970 carries has a wholesale cost of around US$90, while the 1.5GB of previous-generation GDDR5 carried by the GTX 580 comes to about US$32 (a rough per-gigabyte comparison is sketched below).

    Having more vRAM onboard, for the most part, only produces tangible gains in a minority of games at standard resolutions - reference any HD 6950 1GB vs 2GB review. As gaming moves to higher resolutions, the larger frame buffer becomes more desirable.
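
    If anyone wants to sanity-check those numbers, here's a minimal back-of-the-envelope sketch in Python; the prices are the wholesale figures quoted above, and the rest is just arithmetic:

    ```python
    # Rough per-gigabyte VRAM cost comparison using the wholesale
    # figures quoted above (prices are approximate).
    cards = {
        "HD 7970 (3GB GDDR5)": (90.0, 3.0),    # (wholesale price in USD, GB of vRAM)
        "GTX 580 (1.5GB GDDR5)": (32.0, 1.5),  # previous-generation GDDR5
    }

    for name, (price_usd, capacity_gb) in cards.items():
        per_gb = price_usd / capacity_gb
        print(f"{name}: ~${price_usd:.0f} total, ~${per_gb:.2f} per GB")

    # The HD 7970's memory works out to roughly 40% more per gigabyte,
    # on top of the card carrying twice as much of it.
    ```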
  5. marinkvasina

    marinkvasina TechSpot Enthusiast Posts: 259   +9

    All I can get from this is that you're a pro user of Google.
  6. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,264   +41


    ...or that you're a troll waiting under a bridge.
  7. bandit8623

    bandit8623 Newcomer, in training Posts: 45

    AMD likes 4K resolutions :p
  8. "captainawesome said:
    I wonder why AMD goes for the bigger framebuffer when Nvidia seems to like the 2GB max..?"

    Because AMD's cards support 3 screens out of the box, and, for example, in Crysis 2 I saw tests where it consumed 2.4GB of VRAM in Eyefinity.

    That being said, don't underestimate the power of morons: many look at cards and think a 2GB card is better than a 1GB one, even if it's 2GB of GDDR3 vs 1GB of GDDR5 (a rough bandwidth comparison is sketched below).
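
    To illustrate why the memory type usually matters more than the raw capacity, here's a minimal bandwidth sketch; the data rates and bus widths are generic figures I've assumed for the example, not the specs of any particular card:

    ```python
    # Peak memory bandwidth = effective data rate (MT/s) x bus width (bits) / 8.
    # The clocks and bus widths below are illustrative assumptions only.
    def bandwidth_gb_s(effective_mt_s, bus_width_bits):
        """Return peak memory bandwidth in GB/s."""
        return effective_mt_s * 1e6 * bus_width_bits / 8 / 1e9

    configs = {
        "1GB GDDR5 (e.g. 4000 MT/s, 256-bit)": bandwidth_gb_s(4000, 256),
        "2GB GDDR3 (e.g. 1600 MT/s, 256-bit)": bandwidth_gb_s(1600, 256),
    }

    for name, bw in configs.items():
        print(f"{name}: ~{bw:.0f} GB/s")

    # With these example figures the smaller GDDR5 buffer has roughly 2.5x the
    # bandwidth, which matters far more at common resolutions than the extra capacity.
    ```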
  9. Ultraman1966

    Ultraman1966 Newcomer, in training Posts: 69

    Because bigger sounds better
  10. dividebyzero

    dividebyzero trainee n00b Posts: 4,787   +639

    Don't feel bad; an in-depth comprehension of a given subject takes time, concentration and a great deal of effort. You're bound to have a few missteps along the way.

    You could, I suppose, use Google for a quick-and-dirty synopsis of most things, but it's usually better if the technology and its wider implications are something you have a keen interest in... pretty radical concept for a tech enthusiast forum, I know.

    TL;DR
    LinkedKube on the money
  11. Sarcasm

    Sarcasm TechSpot Enthusiast Posts: 342   +20

    If you've noticed AMD's recent string of products, they are aiming toward extreme multitasking. That is also reflected in their CPUs and APUs. With their Radeon line, notice how they keep talking about multiple screens with Eyefinity? And even the FX CPUs are great for extreme multitasking (regardless of how people view them as a failure). The point is, having more VRAM will definitely be needed, especially for multi-monitor setups (a quick frame-buffer calculation is sketched below).

    If a person doesn't use any of those features, sticks to one monitor and doesn't do much multitasking, then an Nvidia card with 2GB should be plenty. Even though I say the more the merrier.

    My GTX 580 with 1.5GB is plenty for me at my resolution of 1920x1080.
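
    As a rough illustration of the multi-monitor point, here's a minimal sketch of how much memory just the colour buffers take on a single 1920x1080 screen versus a three-screen 5760x1080 Eyefinity spread; the 4 bytes per pixel and double buffering are assumptions for the example, and real games add far more on top for textures, render targets and AA:

    ```python
    # Approximate frame-buffer footprint: width * height * bytes per pixel * buffers.
    # 32-bit (4-byte) pixels and double buffering are illustrative assumptions;
    # actual games use far more VRAM for textures, render targets, AA buffers, etc.
    BYTES_PER_PIXEL = 4

    def framebuffer_mb(width, height, buffers=2):
        """Approximate size in MB of `buffers` full-screen colour buffers."""
        return width * height * BYTES_PER_PIXEL * buffers / (1024 ** 2)

    setups = {
        "Single 1920x1080": (1920, 1080),
        "Triple-screen Eyefinity 5760x1080": (5760, 1080),
    }

    for name, (w, h) in setups.items():
        print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB for double-buffered output")

    # Tripling the screen area triples everything resolution-dependent, which is
    # why Eyefinity-style setups eat into VRAM so much faster.
    ```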
     
  12. "Previous rumors suggested Nvidia was timing the release of their new graphics cards to coincide with the launch of Intel's Ivy Bridge processors."

    With little change in performance and a hefty price tag, I guess Nvidia missed the memo that Intel's Ivy Bridge built-in GPU will have 30-60% better performance and be able to support two monitors running at 4K x 4K.

    http://www.techspot.com/news/46832-intel-to-launch-22nm-ivy-bridge-processors-on-april-8.html
    [I don't have the source for the two monitors running at 4K x 4K, but it was a statement made by Intel.]
  13. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,264   +41

    Amen to that, although I'll be buying my third when the 680 drops. I'm trying to keep up with red, but my gf won't allow it.
  14. Still waiting for the GTX 780; I predict the 680 will suffer from power consumption and overheating issues :p

    It'd better come out before 21st December, or I'll just be pis...
  15. amstech

    amstech TechSpot Enthusiast Posts: 805   +201

    Hmmm, no GTX 670 treats? :)

    How has 'Techspot' not banned you yet, dividebyzero?

  16. dividebyzero

    dividebyzero trainee n00b Posts: 4,787   +639

    Taking a general observation rather personally, don't you think?

    If you think you see something worthy of a ban, feel free to use the Report Bad Post link... I'm sure the administration will give your input all the attention it deserves.
  17. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,264   +41

    Because, despite what you think, his mental real estate is still worth something on this site. I have a short list of TS regulars that I've gained respect for. He's one of them.

    I'm sorry if your post was a troll attempt - or even if it wasn't, I'm sure you'll find a comfy bridge to hide under.
  18. dividebyzero

    dividebyzero trainee n00b Posts: 4,787   +639

    Performance, like a backwoods marriage, is relative. A 30-60% increase from HD 2000/HD 3000-level graphics really isn't saying a lot in the realm of gaming. For most non-gaming applications, an IGP will do just fine, but I think you'll still find that Intel's IGP isn't at a level to rival AMD's Fusion APUs or a discrete card - which is where Nvidia's discrete graphics and Optimus switching fill the market demand.
    Aside from gaming, there are other areas where a discrete card (or two) makes perfect sense - business and workstation use spring to mind.

    @LinkedKube
    Thanks for the kind words. And Ditto.
    You can change your username...when did that happen?
  19. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74

    ...so much for asking for help to find non-ref waterblocks :haha: :wave:

    And what evidence are you basing this on? Power efficiency seems to be one of (if not the) main facets of the upcoming line.

    As soon as I can figure out the over/under on that, I'm opening the betting window. How much do you want to lose, Amstech?
  20. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,264   +41

    Ask me tomorrow when I'm not in a drunken rage and on a troll hunt.


    You want full copper, copper and nickel, or a copper base with an acrylic block? You know I got choooo! ///In my Detroit lingo\\\
  21. treetops

    treetops TechSpot Evangelist Posts: 1,653   +51

    Ah, Ctrl+F couldn't find a $ sign :(
  22. cliffordcooley

    cliffordcooley TechSpot Paladin Posts: 5,773   +1,429

    You can say that again; DBZ has my respect too.
  23. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,867   +74

    I know you do, T :)
    Does the acrylic block present any problems?
  24. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 4,264   +41

    A 1 to 2 degree temp difference at full load, but nothing detrimental to overall loop temps compared to full copper blocks with good flow.

    Sent from my DROIDX using Tapatalk
  25. Archean

    Archean TechSpot Paladin Posts: 6,035   +70

    LinkedKube = supermashbrada ................ *head scratching* ............. ??????

    By the way, amstech: DBZ is one of the few very well-informed contributors here, and I am sure many will verify that he doesn't indulge in any nonsensical outbursts. Somehow, some people always forget one thing: one has to 'gain' respect, as it is something which can't be taken as a 'given'.

