TechSpot

The graphics card version of the '59 Eldorado

By red1776
Nov 11, 2011
  1. Arris

    Arris TS Evangelist Posts: 4,608   +295

    A 560! This must be for show cases only. Could have at least put it on a 580.

    "2 Way SLI is supported". Need a big case for 2 of those...
     
  2. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    Hi Arris,
    Well, the article says that the heatsinks are add-on/detachable, but I would bet that this thing runs no cooler than, say, a Twin Frozr III.
     
  3. Arris

    Arris TS Evangelist Posts: 4,608   +295

    Been very impressed with my Twin Frozr II 5850s, think I'll be looking for Twin Frozr versions of 7xxx cards when they come out, unless Nvidia has something better.
     
  4. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    The '59 Caddy Biarritz and Coupe de Ville are design classics....this is

    [image]

    Love the acrylic paints and brush...who's the target market, preschoolers?

    Judging by the box dimensions (820 mm x 165 mm x 140 mm) and the proximity of the Christmas season, you'd have to think this would be aimed at the non-tech-savvy parent with the "bigger is better" mentality.
     
  5. LNCPapa

    LNCPapa TS Special Forces Posts: 4,210   +424

    I absolutely love that the name of the card includes "iGame". I think we all know who the target audience is based on that.
     
  6. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    :haha:, I thought the acrylic paints were more indicative than the 'i'!


    The '59 Caddy Biarritz and Coupe de Ville are design classics

    [image]

    I agree, Chef. My favorite car of all time, also known as the epitome of decadence.......here it comes... :p :wave:
    [image]
    I always have had an affinity for old Cadillacs. I had two of these (lower) in my youth (since we are the same age, I will let you figure out when that was). Nothing like cruising with an acre of bonnet, a 472 CID, 8 MPG, and Book of Dreams in the cassette deck.
     
  7. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    A friend of mine had a '68 (Series 61 conv. coupe) - it ended up gutted by fire from an electrical short. The 472 and 500 are remarkably balanced engines for mass-production units. Not suited to the roads (and parking) here - the long wheelbase usually meant sparks aplenty going over our less-than-straight/flat roads.
     
  8. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,481   +44

    3 six pin power connectors?
     
  9. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Yeah, makes no sense when the same power delivery can be gotten through an 8-pin and a 6-pin (quick arithmetic at the end of this post).

    Maybe they are going for an Asus Mars Lite..... hope there is a tube of scarlet paint in the kit, although I suspect they should have included those little tubs of paint made for finger painting rather than tubes.

    A 1010MHz core makes it a beast.... then Chinese marketing makes it a least.
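
    A quick sketch of why those connector combinations come out the same, assuming the usual PCI-E power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

    ```latex
    % Board power budget under the assumed PCI-E spec limits
    \text{slot} + 3\times\text{6-pin} = 75 + 3(75) = 300\,\text{W}
    \qquad
    \text{slot} + \text{8-pin} + \text{6-pin} = 75 + 150 + 75 = 300\,\text{W}
    ```

    Either way the card has the same 300 W ceiling on paper; the 3 x 6-pin choice only changes which adapters buyers need.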
     
  10. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    I know why they do the 3 x 6 rather than a 6 + 8. Think of who buys this... most 4-pin Molex adapters shipped are 6-pin.
     
  11. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,481   +44

    I'm wondering if the OC'ing potential is even worth the money. Think you could reach 570 benches w/o blowing it up?
     
  12. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    Sure, a Hawk 560 is within -2% to +4% of a GeForce 570, and it's running a 60MHz slower core.
     
  13. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,481   +44

    Pshh, I'm thinking of adding another 580 atm. My PC life is getting boring.
     
  14. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157


    :haha: See, now you need to let your inner 'enthusiast OCD' out like I do. My machine has turned over every major component (MB, GPUs, PSU, CPU, HS/F, case, RAM) three times in 18 months... see... easy :haha:
     
  15. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Probably a case of trying to reach as many wannabes as possible. The card is set for the Asian (read: Chinese) market - you seldom see Colorful (Nvidia) or ColorFire (AMD) cards anywhere else - and most buyers would be running crappy Chinese OEM PSUs that generally don't come equipped with 8-pin PCI-E connectors.

    Well prices are due to take a tumble - some vendors are already dropping prices trying to clear inventory.
    Scuttlebutt is that Nvidia, in addition to getting Kepler (GK1xx) series cards up and running, took the precaution of die-shrinking the GF110 to 28nm as an insurance policy... and these might not be too far away from retail. From Nvidia's conference call yesterday (Q3 earnings), it sounds like they are pushing hard to get at least a few cards out in time for the holiday season.
     
  16. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    That's interesting - a lesson from Fermi, and it adds flexibility to the lineup.

    I wonder which company has the correct timeline to SoC?
    AMD is apparently going to keep building gaming-enthusiast chips/cards, and Nvidia is moving to GPGPU parallel chips/cards on the journey to 'usable and ample power' SoCs. AMD's strategy seems to foretell a shorter timeline than Nvidia's.
     
  17. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,481   +44

    Well, getting another waterblock will probably take the longest, and I'll prob have to add another pump. FML. Water cooling is cool, but I feel like I'm building a brand-new system every time I switch out anything other than an HDD or some RAM.
     
  18. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    And that's why I didn't get to my bespoke WC'ing. It changes too often these days. Although, I have a Xigmatek Elysium sitting on the shelf, and after the 7000/600 lines are launched, I am going to let it be for a bit other than getting it wet.
     
  19. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    I wasn't aware that AMD had a timeline to SoC. They are still fully committed to x86 at this stage, and as far as I'm aware are still relying on die shrinks to get power down, but it's still a far cry from Brazos (and Cedar Trail Atom for that matter) reaching ~1 watt.
    There's also the no small matter of AMD having no baseband IP.
    Nvidia seem to be tackling the problem from the opposite end of the scale - start with an ARM base, then 1. add the Icera baseband where applicable, and 2. introduce parallelism (Project Denver). I've seen Nvidia's timeline for "Wayne" (Tegra5) and "Grey" (Tegra w/integrated wireless) which says 2013, with "Logan" and "Stark" to follow. I haven't seen anything remotely concrete from AMD as of yet.... I'm inclined to take AMD's slides and timetable with the equivalent of the Bonneville Salt Flats in any case.
    I've put together a few customer builds that don't even have that luxury - hard-drive water jackets and those stupid (and fiddly) water-piped RAM heatspreaders. I know what you mean though; over the course of a few of my own builds I've built up quite a collection of "spares" (connectors, blocks, tubing that becomes redundant every time you add/subtract components and needs to be remeasured and cut to avoid kinks and unnecessary bends) - not to mention the troubleshooting fun when one of the cards might look like it's failing.
     
  20. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    I haven't seen slides on it, so it's an assumption, but I think it's a solid bet that discrete GPUs are on a finite timetable. Maybe AMD is just missing it completely. Nvidia seems to be evolving GPU computing, AMD seems to be refining gaming chips. Just seems to be a canyon in MOs between them. I guess I assumed a plan on both sides. If the above are the 'plans' for both, AMD seems to be way behind the curve if it is still years from an on-die GPU.
     
  21. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    Hard to know what the f___ AMD are planning IMO. The HD 7000 series arch (GCN) was laid down in 2009 as far as I'm aware - which was while Dirk was still captain. I sincerely doubt that a change in CEO and BoD can influence a design that is intended to keep AMD going in the graphics business for the next 2-3 iterations. I'd assume, like VLIW5 before it, AMD are tied to GCN for at least the HD 8000 series if not the HD 9000. Like many other facets of AMD's business, I don't think their change in headman and business direction will be readily apparent until the present tech has run its course.

    AMD can't very well back out of the high-end (read: workstation/enthusiast) card market in any case - not without cutting the collective throats of devs that have committed to AMD's OpenCL SDK.

    The only points of interest, as far as I can discern, will be what kind of effort AMD are willing to put into future drivers, gaming development and open-source software. Nvidia just posted record revenue and profit numbers from their workstation/HPC graphics - so either that market is now growing, or they have taken market share away from FirePros (using Cayman GPUs) while still utilizing an old (Fermi) architecture - I suspect that it is the latter.
    Nvidia have been pilloried for their memory controller issues in recent times - a large part of which was due to moving away from the 256-bit memory bus (note that the GTX 460/560 don't suffer from memory speed issues) and using 72-bit ECC memory modules (as opposed to 64-bit), in addition to being late to the GDDR5 party (see the note below on where the 72-bit width comes from). AMD still have to demonstrate both these attributes (or convince enterprise users that EDC will suffice in lieu of ECC) to deliver in the high-ASP/high-margin workstation market, in addition to providing a suitable software ecosystem and overcoming the inertia inherent in the business sector.
    The performance/enthusiast desktop graphics market isn't that big, so I think the prime mover in its ongoing existence is sharing architecture with enterprise GPGPU - true in Nvidia's case since they have already made a case for Kepler and Maxwell.
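
    A quick aside on that 72-bit figure, assuming the conventional SECDED ECC layout used on ECC memory modules - each 64-bit data word carries 8 extra check bits:

    ```latex
    % Assumed: standard SECDED ECC layout for a 64-bit data word
    \underbrace{64}_{\text{data bits}} + \underbrace{8}_{\text{check bits}} = 72\ \text{bits per word}
    \qquad \left(\tfrac{8}{64} = 12.5\%\ \text{overhead}\right)
    ```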

    TL;DR
    I don't think either company will be exiting the high-end gaming business unless they are willing to surrender the highly lucrative workstation market.
    It's a certainty that Nvidia won't walk away from either -not with the profile that Quadro and TWIMTBP enjoy- and the fact that GPGPU factors highly in Nvidia's future plans.
    AMD... well, who knows what's going through their minds - aside from some fist pumping/chest beating from Rory and PR drivel from John Fruehe, I haven't actually heard anything from AMD. And what is anticipated are still products of the old regime.

    AMD's main problem is they are fighting a war on multiple fronts - against Nvidia in graphics (who have superior cash reserves and R&D), against Intel (who have cash, R&D, and fabs that are purpose-built for their product), and a future problem with ARM x64 in the ultra-low-voltage sector. AMD have limited resources - something has to give.

    Probably depends on whether AMD plan on moving into the ultra-portable market. If they plan on getting into smartphones and the like, then they are starting from further back than the companies that already have an ARM licence, and they have to fight smaller, more cost-efficient chips with SoCs that are burdened with x86 overhead (i.e. overly complex for what is required). If they plan on desktops/laptops/tablets/enterprise, then an APU incorporating GCN graphics would serve them just fine and Trinity should stand them in good stead... but an APU isn't an SoC. Moreover, Rory and the Board seem to be putting AMD on the Jenny Craig diet - cutting staff/payroll and divesting themselves of code writers and engineers (amongst the PR monkeys) doesn't, at least to me, sound like they are gearing up to fight on every front. To me it seems like Rory and Co. are getting ready to narrow down AMD's portfolio.
     
  22. LinkedKube

    LinkedKube TechSpot Project Baby Posts: 3,481   +44

    Well, I suppose I can see how this could happen. Thinking I'm in need of another pump, and I'd like to get some 45- and 90-degree fittings. The little things add up too. I'm trying to stick to a budget here! Story of a PC enthusiast gone wrong. lol.
     
  23. red1776

    red1776 Omnipotent Ruler of the Universe Topic Starter Posts: 5,219   +157

    Well, there you have it... I guess. I am getting the impression the interviews from both sides regarding the timeline of the demise of the discrete GPU must not be in play, or that would not even be an option for AMD. My original inquiry was about the time between now and 5 TF on-die, which is now inside 4-5 years. How is each GPU manufacturer going to deal with the transition regarding the integration of GPGPU? Given that, it seems that Nvidia is far ahead of AMD.

    I just read your edit:
    SoC was the stated goal; I was more focused on, and wondering (apparently with a faulty premise) more immediately about, the demise of the discrete GPU and it going on-die. It sounds like system RAM won't be an on-die possibility until around the time of the 8nm shrink anyway. Like you pointed out, with the gutting of AMD staff, any previously stated pursuits are now moot anyway; in short, who knows.
     
  24. dividebyzero

    dividebyzero trainee n00b Posts: 4,891   +1,258

    I'm just going by what's been published and extrapolating based on what could eventuate. Dirk Meyer's axing and AMD's seeming change in emphasis tend to show that focus and direction aren't set in stone. I'd wonder if Nvidia's future would remain as it's set out if Jen-Hsun stood down.

    AMD certainly have the expertise for forging ahead with iGPU. Llano certainly makes a strong case, as will Trinity I expect. But to my mind these are products destined for entry/mainstream-level consumer products - laptop/desktop, probably moving down to tablet.

    There seems to be a lot of talk about discrete graphics going the way of the passenger pigeon and the dodo, but I would think that as long as there is still a graphics AIB enterprise market, AMD and Nvidia will both ensure that the gaming market requires high-end discrete graphics.
    I'm not certain that a large percentage of PC gamers would, for instance, be happy playing BF3 at the resolution or IQ that Llano would offer. Even if you could somehow shoehorn HD 6870 or GTX 560 level graphics power into an APU package (die area, heat dissipation, TDP), that still leaves a considerable amount of gaming out in the cold - 3D, Eyefinity/Surround, conventional AA, HBAO and SSAO, particle effects etc. - and that's without taking into account that AMD, and Nvidia especially, would likely keep pushing graphical effects to keep the uptake of new hardware high (Crysis 2, anyone)... I'm kind of thinking that Metro: Last Light and S.T.A.L.K.E.R. 2 won't be too friendly with entry-level graphics. Add DirectX 11.1 (Win8) into the mix and you have a graphics landscape that moves just fast enough to outpace whatever the low end of the market produces.

    Even if AMD pulls the plug on Gaming Evolved (and that wouldn't surprise me), I think Nvidia still pushes gaming harder and faster - they pretty much have to; other than graphics, they don't have a lot else to offer at the present time. Given that there seem to be some rather graphically intensive game engines in development - Unreal Engine 3 now supports DX11 (with the Nvidia-sponsored "Samaritan" demo to its name) and Unreal Engine 4 is slated for 2014, for example - I wouldn't see Nvidia pushing the development if they weren't going to gain something out of it... especially if AMD end up with their GPUs in the next-gen consoles.
     
