Nvidia teases Fermi successor, plans for CUDA x86

By Matthew · 15 replies
Sep 21, 2010
  1. During its GPU Technology Conference today, Nvidia offered a glimpse at its plans for upcoming graphics hardware. Based on the shared CUDA GPU roadmap, it seems the company intends to launch a new architecture every two years, but CEO Jen-Hsun Huang wouldn't commit to that outright.

  2. princeton

    princeton TS Addict Posts: 1,676

    28nm IF TSMC gets a working 28nm fabrication process, correct?
  3. Okay, with the 5000 series owning this generation, the 400 series not that much faster (even with Nvidia sweating to release it) and the 6850 coming at the beginning of next year, Nvidia couldn't put out any marketing graphic but that.

    I really don't like how fast they are releasing new GPUs. They are expensive and we buy them just to play games, not for internet, work, homework, music, movies, etc. So they aren't really worth it. And my new and expensive 5870 doesn't make me that proud with the next 6850 so damn close :(
  4. EXCellR8

    EXCellR8 The Conservative Posts: 1,835

    It doesn't matter how fast they release new tech; the game industry doesn't move quite as fast, so most new cards will probably be irrelevant to most users. Most of us are not even running DX11 yet (I myself just bought a 5870 not long ago) and the bulk of games that will utilize it are not even out yet. It's not like the HD6xxx series is going to be leaps and bounds ahead of the HD5xxx series anyway...
  5. Adhmuz

    Adhmuz TechSpot Paladin Posts: 1,828   +633

    Well, a lot of people are in the same boat: everyone buys into current-gen hardware just to see next gen months away. In this case it's Nvidia shaming those who bought Fermi cards; that's a real stab in the back. Anyone who has waited now gets to wait longer too. And here I was looking forward to Fermi revisions based on the GTX 460 die, like a 485 or whatever they would call it. Well, I'm happy with my 5870; it does what I need just fine.
  6. EXCellR8

    EXCellR8 The Conservative Posts: 1,835

    Exactly, I'm loving my 5870 too, and if I'd had to stay with my 4870 I'd probably be fine with that as well. Like I said, there's still only a handful of DX11 games on the market at the moment, with the next big batch set for early next year. That's good news for people who are waiting for the HD6xxx cards, but it's not like HD5xxx users will be left in the dust. I'm sure AMD has a similar roadmap in store for their GPUs, but it's no different than what's already going on with graphics cards...

    That's just how it is with buying high-performance PC hardware... deal with it.
  7. Why are most of the comments about Nvidia cards made by little boys who only play games? Nvidia cards and CUDA are used extensively in the geoscience, medical, mining and movie industries. What do 70% of the world's supercomputers run on? Not ATI graphics cards.
  8. Maybe because "little boys" like us are probably 90% of the users of those things?
  9. Adhmuz

    Adhmuz TechSpot Paladin Posts: 1,828   +633

    Well, excuse us for not wanting to pay $2,000 for a GPU; I know Nvidia would like to think we do. And about the comment concerning DX11 games: the hardware actually can't handle a true DX11 implementation. Try tessellating everything in any current game on a single GPU; it's not happening. Even games that implement some DX11 features have trouble keeping a decent frame rate on most hardware. Think about it: the hardware will be used when it becomes available. That's all I have to say.
  10. dustin_ds3000

    dustin_ds3000 TechSpot Chancellor Posts: 887   +19

    I'm glad I decided to wait until I get out of basic training (mid-July 2011) to buy a new video card.
  11. starfreezer

    starfreezer TS Rookie Posts: 16

    If they know what they are going to do, then, why not just do it!? :)
  12. red1776

    red1776 Omnipotent Ruler of the Universe Posts: 5,224   +164

    here is your answer
  13. MrAnderson

    MrAnderson TS Maniac Posts: 488   +10

    I think this technological increase is great. And no, you don't need to buy every new GPU unless you are a developer or you have "Tim the Toolman Taylor (TTT)" Syndrome, which some of us do. Games hardly start using the full power of current-gen cards until about midway through their next-generation brethren's lifecycle. In other words, most games work fine on last year's redesigned cards.

    I've learned to buy within the redesign lifecycle, unless I skip a generation or I see a really solid product.

    As for the 28nm process, I do hope this transition goes more smoothly for all, and I do mean all. I think it is better to watch products compete on the actual technology and not on how much time they have made it to market ahead of the competition. Manufacturing problems aside, I hope to see releases evening out this time. This will allow them to make faster corrections/redesigns, which is always a win for customers. And price battles too.

    Also, did you consider that they might also work with another manufacturing group? GlobalFoundries could be an option, right? I do hope they get over their manufacturing issues regardless. So let's hope TSMC can get their act together this round.
  14. Johny47

    Johny47 TS Rookie Posts: 157

    Sounds good, as long as the price, power requirements and heat generation aren't worse than Fermi's. And what does 'Tim the Toolman Taylor syndrome' mean? haha =/
  15. ET3D

    ET3D TechSpot Paladin Posts: 1,380   +168

    CUDA for CPUs might be a nice thing to have if it can offer optimised CPU PhysX.
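    For context, the CUDA-x86 idea from the article means compiling CUDA C kernels to run on CPU cores instead of a GPU. Conceptually, a kernel launch can be lowered to a loop over the flattened thread index. A minimal sketch of that mapping in plain C (a hypothetical illustration of the concept, not actual compiler output):

    ```c
    #include <stdio.h>

    #define N 8

    /* A CUDA-style kernel body: each "thread" handles one element
       of y[] = a * x[] + y[] (the classic SAXPY operation). */
    static void saxpy_kernel(int tid, float a, const float *x, float *y) {
        if (tid < N)
            y[tid] = a * x[tid] + y[tid];
    }

    int main(void) {
        float x[N], y[N];
        for (int i = 0; i < N; i++) { x[i] = (float)i; y[i] = 1.0f; }

        /* On a GPU this would be a launch like
           saxpy_kernel<<<blocks, threads>>>(2.0f, x, y);
           a CPU back end can lower the launch to a plain loop over the
           thread index (which it may then parallelise or vectorise). */
        for (int tid = 0; tid < N; tid++)
            saxpy_kernel(tid, 2.0f, x, y);

        for (int i = 0; i < N; i++)
            printf("%.1f ", y[i]);  /* prints 1.0 3.0 5.0 ... 15.0 */
        printf("\n");
        return 0;
    }
    ```

    The per-thread guard (`if (tid < N)`) carries over unchanged from GPU-style code, which is part of why a source-level CPU port is feasible at all.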
  16. OH THANK GOD Nvidia got some form of x86 technology. Without that they'd be dead in the water; as much as some will claim otherwise, Nvidia makes most of its money from consumer graphics cards sold to the masses, whether as discrete GPUs or in consoles and mobile devices.

    With AMD Fusion and Intel Sandy Bridge, Nvidia have nowhere to run, and are left with the small fragment of the market that is the high-end gaming segment. With AMD making so much more money from Fusion, money that Nvidia cannot gain, they wouldn't be able to compete there either. But with an x86 architecture they may be able to make some kind of competitor to Fusion and Sandy Bridge and open the market a little.

    Or, of course, they could be running off into pure supercomputing and leaving consumer graphics entirely behind them. Of course AMD also competes there, and if/when Nvidia cut themselves off from too many lifelines, AMD will just squash them there too.

    Nvidia need to be very careful at the moment; too many market niches that they need to survive are being eaten up by AMD and Intel.
