Nvidia fires back regarding AMD's FreeSync technology demo

January 9, 2014, 11:30 AM

Hot on the heels of Nvidia’s G-Sync announcement late last year, AMD came to the Consumer Electronics Show with a variable refresh rate technology of its own. Known as FreeSync, the G-Sync alternative aims to do essentially the same thing but without requiring extra hardware.

Oh, and they want to make it free for all users. What’s not to like about that?

The gang over at The Tech Report must have thought the same, so they headed over to Nvidia’s booth at CES for a sit-down with Tom Petersen, the executive behind the development of G-Sync technology. While Petersen said he was excited to see competitors taking an interest in dynamic refresh rates, he was quick to point out that AMD’s demo was running on laptops.

As he explained, laptops have a different display architecture than desktops: they use a more direct interface between the GPU and the LCD panel, typically based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors, on the other hand, typically connect over HDMI or DisplayPort and usually have a scaler chip sitting between the GPU and the display.

Because of this, it’s nearly impossible to implement variable refresh on a desktop monitor at present.

It’s the reason why Nvidia created the G-Sync module – to replace the scaler ASIC with logic of its own creation. To the best of his knowledge, there exists no scaler ASIC with variable refresh capability. If it existed, Nvidia would know, he said.
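To put the dynamic-refresh idea in concrete terms, here is a minimal, purely illustrative sketch (the numbers and function names are our own examples, not Nvidia's or AMD's implementation) of when a rendered frame actually becomes visible on a fixed 60 Hz display versus a display that can refresh the moment the frame is ready:

# Illustrative sketch only. Compares frame display times under a classic fixed
# 60 Hz refresh (V-Sync waits for the next scanout tick) versus a variable
# refresh display that scans out as soon as the GPU finishes a frame.
# The rates, frame times, and function names are hypothetical examples.

def fixed_refresh_display_ms(render_ms, refresh_hz=60.0):
    """Frame becomes visible at the next fixed scanout tick after it finishes."""
    interval = 1000.0 / refresh_hz
    ticks = -(-render_ms // interval)   # ceiling division: wait for the next tick
    return ticks * interval

def variable_refresh_display_ms(render_ms, max_hz=144.0):
    """Frame becomes visible as soon as it is ready, down to a minimum interval."""
    return max(render_ms, 1000.0 / max_hz)

for render_ms in (12.0, 19.0, 25.0):    # roughly 83, 53 and 40 fps render times
    print(f"{render_ms:4.1f} ms frame -> fixed 60 Hz: {fixed_refresh_display_ms(render_ms):4.1f} ms, "
          f"variable: {variable_refresh_display_ms(render_ms):4.1f} ms")

Under the fixed refresh, the 19 ms frame is held back to the 33.3 ms tick – the stutter both G-Sync and FreeSync are meant to eliminate – while with variable refresh it is displayed the moment it completes.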




User Comments: 58

MilwaukeeMike said:

Ok, so AMD's free G-sync tech only works on laptops... that's still great isn't it?

Oh, and maybe someone can answer this for me. I don't really know what DisplayPort is... I haven't built a PC in a while and I still think of DVI and HDMI as being the main interfaces. If DisplayPort is new, why didn't they build in dynamic refresh like they have in laptop interfaces?

treeski treeski said:

Ok, so AMD's free G-sync tech only works on laptops... that's still great isn't it?

Oh, and maybe someone can answer this for me. I don't really know what DisplayPort is... I haven't built a PC in a while and I still think of DVI and HDMI as being the main interfaces. If DisplayPort is new, why didn't they build in dynamic refresh like they have in laptop interfaces?

The original version of DisplayPort was developed in 2006, and the current version has been around since 2009. I believe Apple and Dell have been using it the longest, but these days it's quickly replacing DVI from what I've seen.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

[link]. Each device has an input and output for creating the chain.

The new DisplayPort v1.2 daisy chainable displays have both a DisplayPort input and a DisplayPort output. The DisplayPort output connects to the next downstream display.

m4a4 m4a4 said:

Well then, no excuses, get "g-sync" drivers out for laptops Nvidia! :p

amstech amstech, TechSpot Enthusiast, said:

Nvidia is always a step ahead of AMD so I knew there would be a catch. Sorry AMD but no clock speed adjustments will save you here.

treeski treeski said:

Nvidia is always a step ahead of AMD so I knew there would be a catch. Sorry AMD but no clock speed adjustments will save you here.

Sure, but Nvidia doesn't provide any argument for using their technology on laptops.

Duskywolf50 said:

Yes, Nvidia did get ahead of AMD, but if AMD did not bring this up then Nvidia would charge you more money for this technology, so they poked Nvidia. You should thank AMD.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Could be that from nVidia's point of view, they really don't care about AMD's technology on laptops.

GhostRyder GhostRyder said:

Sounds like Nvidia is mad at AMD for making a free version and trying to spite AMD for stealing the spotlight. I can see the fanboy patrol is on the hunt above trying to make the tech sound bad so they can justify 600+ dollars for a tech AMD is offering for free.

Then again, we just have to wait for the response, its just a back and forth war at this point between the companies.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Seriously @GhostRyder, nVidia didn't come forward with an attack; someone went to their booth and asked them questions. nVidia was answering the questions that were asked of them on their own ground. nVidia has nothing to be mad at.

GhostRyder GhostRyder said:

Seriously @GhostRyder, nVidia didn't come forward with an attack; someone went to their booth and asked them questions. nVidia was answering the questions that were asked of them on their own ground. nVidia has nothing to be mad at.

Didn't say it was an attack, merely that they took a swipe at the tech shown by AMD because they wanted their tech to sound better. Both companies are playing this game right now where one says one thing and then the other smacks back with an "answer to a question" stating their tech is designed to be the real deal or better or whatever. This is just another one of those instances, and the constant harassment will go back and forth.

Just the laws of Computer Companies in this day and age.

Guest said:

Yay.... now I'm worried that each company develops their own proprietary tech and you'll either have a monitor that supports one or the other...

Won't that make switching graphics cards a pain, because you'll have to change not only the card but the display as well if you are "changing sides"?

1 person liked this | ikesmasher said:

Nvidia is always a step ahead of AMD so I knew there would be a catch. Sorry AMD but no clock speed adjustments will save you here.

So wait. AMD releases a free alternative to G-Sync, but apparently (according to the COMPETITOR, who's trying to commercialize this type of tech instead of giving it out for free) right now it only works on laptops.

Nvidia has no G-Sync for laptops. Their G-Sync desktop monitors cost hundreds of dollars more than normal monitors. This makes me think it would be cheaper to implement AMD's tech into monitors than it is to do so with G-Sync. But Nvidia is still a step ahead and vastly superior.

I can imagine it now. AMD gives out free HD 7870s that only work on Windows. Nvidia charges you at least $300 for a GTX 770. But Nvidia is clearly the better company because the product is better and it works in MOAR SITUATIONS, even though it costs a fortune and someone else is giving you a still-fantastic product for no real direct gain to them.

Hey, I'm not hating on Nvidia, I've got one of their cards in my build. But blind fanboyism scares me. AMD is trying to do something FREE and nice for computer users (and they have, and they will keep trying) and you still try to justify Nvidia's blatant attempts to make some quick green.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

If they honestly wanted to find a solution, they would implement a cable that allows better communication between the GPU and monitor. In my opinion, trying to implement GSync (because it is proprietary) or FreeSync over current cabling is a dead end.

SparkMasterB SparkMasterB said:

The edge AMD has here is that display manufacturers are more likely to incorporate the variable refresh rate standard into ALL monitors, and not just Nvidia-exclusive monitors. More potential customers, more likely to have broader support.

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

The edge AMD has here is that display manufacturers are more likely to incorporate the variable refresh rate standard into ALL monitors
nVidia plainly stated that was not possible with current cable standards.

When I first saw AMD's demonstration, I was wondering why they were using laptops. After reading nVidia's comment, it seems using laptops was the only way AMD could demonstrate their FreeSync. Neither GSync nor FreeSync will work with the majority of desktop hardware. So all in all, neither one has a leg up on the other.

1 person liked this | JC713 JC713 said:

Nvidia is always a step ahead of AMD so I knew there would be a catch. Sorry AMD but no clock speed adjustments will save you here.

I am not hating, but this is basically what you are saying xD: "OMG it doesnt have an nVidia logo on it, it must be bad!"

1 person liked this | GhostRyder GhostRyder said:

So wait. AMD releases a free alternative to G-Sync, but apparently (according to the COMPETITOR, who's trying to commercialize this type of tech instead of giving it out for free) right now it only works on laptops.

Nvidia has no G-Sync for laptops. Their G-Sync desktop monitors cost hundreds of dollars more than normal monitors. This makes me think it would be cheaper to implement AMD's tech into monitors than it is to do so with G-Sync. But Nvidia is still a step ahead and vastly superior.

I can imagine it now. AMD gives out free HD 7870s that only work on Windows. Nvidia charges you at least $300 for a GTX 770. But Nvidia is clearly the better company because the product is better and it works in MOAR SITUATIONS, even though it costs a fortune and someone else is giving you a still-fantastic product for no real direct gain to them.

Hey, I'm not hating on Nvidia, I've got one of their cards in my build. But blind fanboyism scares me. AMD is trying to do something FREE and nice for computer users (and they have, and they will keep trying) and you still try to justify Nvidia's blatant attempts to make some quick green.

@ikesmasher Pushes the like button...

nVidia plainly stated that was not possible with current cable standards.

When I first saw AMD's demonstration, I was wondering why they were using laptops. After reading nVidia's comment, it seems using laptops was the only way AMD could demonstrate their FreeSync. Neither GSync nor FreeSync will work with the majority of desktop hardware. So all in all, neither one has a leg up on the other.

They just picked up some laptops at retail to show this off on, so they could try to show this in the least biased way possible. I mean, we have to wait and see if this is true or not; it could be just an Nvidia guy saying something quickly to counter AMD.

I am not hating, but this is basically what you are saying xD: "OMG it doesnt have an nVidia logo on it, it must be bad!"

Is there anything beyond "like" I can push for this comment, @JC713?

amstech amstech, TechSpot Enthusiast, said:

Nvidia has no G-Sync for laptops. Their G-Sync desktop monitors cost hundreds of dollars more than normal monitors. This makes me think it would be cheaper to implement AMD's tech into monitors than it is to do so with G-Sync. But Nvidia is still a step ahead and vastly superior.

It's cheaper because it's not as good. It's second-rate, knee-jerk technology in response to G-Sync, which is proven to be unmatched and to work well. G-Sync monitors are more expensive because the technology is better than any competition and it's still quite new; you want the best, you gotta pay for it. Nvidia is always one step ahead and has been for years. This truth makes me a fanboy? Hahaha, so be it.

ikesmasher said:

It's cheaper because it's not as good. It's second-rate, knee-jerk technology in response to G-Sync, which is proven to be unmatched and to work well. G-Sync monitors are more expensive because the technology is better than any competition and it's still quite new. Nvidia is always one step ahead and has been for years. The truth makes me a fanboy? Lol, so be it.

There isn't any competition. This technology was announced yesterday, while G-Sync was announced MONTHS ago. No one except AMD knows the potential of it or how well it actually works, including you (and yes, me). There's a difference between the truth and trying to predict the future.

amstech amstech, TechSpot Enthusiast, said:

There isn't any competition. This technology was announced yesterday, while G-Sync was announced MONTHS ago.

So, that makes AMD several months late to the table with their knee-jerk version. In America, we call this being 'one step behind'.

I believe someone called me a fanboy for saying that? How does it feel to eat your words?

There's a difference between the truth and trying to predict the future.
The truth is that Nvidia's tech works (now, not after testing, it works NOW) and the only word you can use to describe AMD's reactionary technology is 'it has potential'.

Nvidia isn't the only company who plans to release monitors with their G-Sync technology either. Also on the list are Acer, AOC, ASUS, BenQ, Philips, and ViewSonic. So all in all, a pretty hefty corner of the computer monitor market will be boasting this new next-gen tech. G-Sync monitors will be available in 24-inch and 27-inch sizes.

Read more at [link]

I didn't need anymore information to support my claims but thanks, keep talking!

ikesmasher said:

So, that makes AMD several months late to the table with their knee-jerk version. In America, we call this being 'one step behind'.

I believe someone called me a fanboy for saying that? How does it feel to eat your words?

The truth is that Nvidia's tech works (now, not after testing, it works NOW) and the only word you can use to describe AMD's reactionary technology is 'it has potential'.

I didn't need anymore information to support my claims but thanks, keep talking!

You look at this like it's competition, when in reality it is generosity vs. greed. According to AMD, this tech works without incorporating a $200 chip into the monitor. Apparently you don't like being told that your favorite company isn't perfect, so I'm done arguing here.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

You look at this like it's competition, when in reality it is generosity vs. greed.
You have got to be joking. The only thing AMD has out has already been paid for. They just want you to think it is a free feature.

ikesmasher said:

The only thing AMD has out has already been paid for.

Please explain.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

Catalyst Control Center has already been programmed. Their cards already support controlling the signal in question. Laptop monitors have already been marketed to support the varying frame rate. The expense of R&D has already been paid for. It seems it was chalked up as a loss and nearly forgotten, but thats just me looking at how the news came out. Nearly forgotten at least until GSync was introduced. I'll bet if AMD came out with a viable alternative, it wouldn't be free either.

GhostRyder GhostRyder said:

You look at this like it's competition, when in reality it is generosity vs. greed. According to AMD, this tech works without incorporating a $200 chip into the monitor. Apparently you don't like being told that your favorite company isn't perfect, so I'm done arguing here.

I would give up that argument with the greenboy troll of the forums. If you stuck an Nvidia sticker on a brick wall, he would argue for days about why that wall was better than all the other walls.

Catalyst Control Center has already been programmed. Their cards already support controlling the signal in question. Laptop monitors have already been marketed to support the varying frame rate. The expense of R&D has already been paid for. It seems it was chalked up as a loss and nearly forgotten, but thats just me looking at how the news came out. Nearly forgotten at least until GSync was introduced. I'll bet if AMD came out with a viable alternative, it wouldn't be free either.

But this is a viable alternative; we have seen one showcase of it so far and that's it. It may have been around for a while, but there had to be reasons behind letting it sit for a bit (perhaps waiting for tech to catch up and actually need this). Either way, besides a few people's opinions and the smoke being blown from Nvidia and AMD, we have no idea what either will really do or if paying an extra 300 bucks minimum for a monitor is worth it. For all we know, both could end up being a complete flop and a waste of our time, and this so-called problem could become the next "Frame Stuttering" debate or the like.

hahahanoobs hahahanoobs said:

AMD should focus on their drivers and getting some frickin' R-series GPUs out so we can actually buy them, instead of trying to distract us with this sh*t. Even if AMD did follow through with FreeSync, it's still a couple of years away since DP 1.3 would be required. Man, AMD pisses me off sometimes.

cliffordcooley cliffordcooley, TechSpot Paladin, said:

For all we know, both could end up being a complete flop and a waste of our time, and this so-called problem could become the next "Frame Stuttering" debate or the like.
I think it is a foundation for a whole new multimedia specification: one where the display refresh rate is controlled by the media being rendered. Once they figure out how, they can then introduce a new cabling method. Just think, for this to really catch wind it will need to be used for more than just gaming.

JC713 JC713 said:

Nvidia is always one step ahead and has been for years. This truth makes me a fanboy? Hahaha, so be it.

nVidia isn't one step ahead when it comes to OpenCL and GL, is it? I didn't think so. Look, I am neutral (neither an nVidia nor an AMD fanboy), but I hate when people bash AMD because they are struggling, and I hate when people take it as an opportunity to bash them for being a "step behind". Ok, they may be a "step behind", but cut them some slack, the competition is hot and AMD is trying to gain as much advantage as they can get in a losing battle.

Darth Shiv Darth Shiv said:

[link]. Each device has an input and output for creating the chain.

The new DisplayPort v1.2 daisy chainable displays have both a DisplayPort input and a DisplayPort output. The DisplayPort output connects to the next downstream display.

*Can... whether they actually do or not is another problem. Unfortunately, I've seen plenty of devices that don't have the output...

1 person liked this | amstech amstech, TechSpot Enthusiast, said:

the competition is hot and AMD is trying to gain as much advantage as they can get in a losing battle.

I like AMD GPUs and recommend them all the time, but this is about one company getting things right, and sooner, with higher quality, most of the time. Another example is how well Nvidia's V-Sync technology worked before this even came out, and how long it took AMD to figure out its frame stutter. It's nothing personal, it's just the truth.

At this point, both companies have filled in so many holes that either choice is a damn good one.

6 people like this | dividebyzero dividebyzero, trainee n00b, said:

There isn't any competition. This technology was announced yesterday, while G-Sync was announced MONTHS ago. No one except AMD knows the potential of it or how well it actually works, including you (and yes, me). There's a difference between the truth and trying to predict the future.

AMD's GPUs have had variable VBLANK capability since the Cayman series AFAIK. Typical AMD...add features and don't bother to market them until someone else shows them what to do.

FreeSync may be a nice addition to AMD's mobile GPU arsenal, but they still face an uphill battle overcoming the negativity they engendered over mobile drivers and GPU usage (Enduro)

Seriously @GhostRyder, nVidia didn't come forward with an attack; someone went to their booth and asked them questions. nVidia was answering the questions that were asked of them on their own ground. nVidia has nothing to be mad at.

In point of fact AMD's own Raja Koduri stated the issue fairly plainly:

Koduri did admit that NVIDIA deserved credit for seeing this potential use of the variable refresh feature and bringing it to market as quickly as they did. It has raised awareness of the issue and forced AMD and the rest of the display community to take notice. But clearly AMD's goal is to make sure that it remains a proprietary feature for as little time as possible.

As it stands today, the only way to get variable refresh gaming technology on the PC is to use NVIDIA's G-Sync enabled monitors and GeForce graphics cards. It will likely take until the ratification and release of DisplayPort 1.3 monitors before AMD Radeon users will be able to enjoy what I definitely believe is one of the best new technologies for PC gaming in years.

Basically not too dissimilar to the Optimus, Vsync, frame pacing, and GeForce Experience technologies. Nvidia being a software orientated company looking at the end user experience to maintain their brand via both hardware and software, with AMD caught flat-footed due to their almost non-existent interest in the software side of the business.

I hate when people take it as an opportunity to bash them for being a "step behind". Ok they may be a "step behind", but cut them some slack.

Why? AMD have always been a hardware orientated company and have never professed to have any interest in software. Just look how long it took the company to divest itself of the high maintenance requirement for the All-In-Wonder lines (as an example).

AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.

Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.

Why cut AMD slack for being a market reactor more than a market shaper ? Ultimately, their current position is down to its own BoD's intransigence and short-sightedness.

theBest11778 theBest11778 said:

I haven't seen a G-Sync monitor yet, but the price is crazy. Companies should be focusing on higher refresh rates instead of making lower ones look better. I saw no vertical tearing on an ASUS 144Hz monitor regardless of the framerate (some games running well over 300 FPS). Considering the higher refresh rate monitors only set you back about $100 more than the 60Hz competition, I see that as the better overall solution. BTW, it works fine with today's DVI standards.
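For context, a quick back-of-the-envelope look at the refresh intervals involved (illustrative arithmetic only, not the commenter's figures):

# How long one scanout lasts at common refresh rates. A tear line persists for
# at most one refresh interval, so shorter intervals make tearing harder to see.
for hz in (60, 120, 144):
    print(f"{hz:3d} Hz -> one refresh every {1000.0 / hz:.1f} ms")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms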

amstech amstech, TechSpot Enthusiast, said:

I would give up that argument with the greenboy troll of the forums.

You have no argument. You never do. I see things for what they are, you choose not to. All trolls call other people trolls/names first. You should listen to your own sig advice, namecaller; you're once again the first one to do it.

In point of fact AMD's own Raja Koduri stated the issue fairly plainly:

Basically not too dissimilar to the Optimus, Vsync, frame pacing, and GeForce Experience technologies. Nvidia being a software orientated company looking at the end user experience to maintain their brand via both hardware and software, with AMD caught flat-footed due to their almost non-existent interest in the software side of the business.

Optimus was a pleasant surprise on my L702X with Intel HD Graphics + GT550M. Works quite nicely.

Guest said:

What AMD did was shoot down NVIDIA's plans at marketing N-Sync (more in line with their N-VIDIA name) for laptops. I understand that NVIDIA didn't mention the implementation for laptop displays, otherwise AMD wouldn't have pointed it out. Imagine them selling "G-Sync capable" laptops over the next few years, only for consumers to realize it was actually already capable without the extra cost of the hardware?

I'm going to assume NVIDIA is going to REFINE their statement on this by including their plans for eliminating screen tearing on laptops - either by insisting that G-SYNC SHOULD be added on laptops, or by conceding to AMD's solution. They'll look bad with laptop consumers if they ignore the applications on that hardware segment. Also note, though: NVIDIA has implemented their G-Sync tech on 24" and up monitors for now, where the effects are most noteworthy I suppose (and to justify the cost of it).

1 person liked this | amstech amstech, TechSpot Enthusiast, said:

, otherwise AMD wouldn't have pointed it out. Imagine them selling "G-Sync capable" laptops over the next few years, only for consumers to realize it was actually already capable without the extra cost of the hardware?

Like most cases with AMD/Nvidia software comparisons, Nvidia's version is of higher quality, more reliable, released earlier, and sometimes the only one of its kind. AMD's knee-jerk, second-rate response technology to G-Sync isn't proven to work as well; they aren't even sure if it will compete at this point. Articles I have read say it won't work as well.

Nvidia is usually one step ahead of AMD and Gsync is just another example.

It's not a knock on AMD, they've been the underdog for years now.

AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.

Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.

Why cut AMD slack for being a market reactor more than a market shaper ? Ultimately, their current position is down to its own BoD's intransigence and short-sightedness.

How dare you talk bad about AMD! You must be a troll!

I am sure Ghostcryer will have paragraphs of irrelevant Pro-AMD information loaded in his little squirt gun and pointed at you when he reads your comment.

1 person liked this | GhostRyder GhostRyder said:

I think it is a foundation for a whole new multimedia specification: one where the display refresh rate is controlled by the media being rendered. Once they figure out how, they can then introduce a new cabling method. Just think, for this to really catch wind it will need to be used for more than just gaming.

Agreed

You have no argument. You never do. I see things for what they are, you choose not to. All trolls call other people trolls/names first. You should listen to your own sig advice, namecaller; you're once again the first one to do it.

Optimus was a pleasant surprise on my L702X with Intel HD Graphics + GT550M. Works quite nicely.

You have no argument either; it's always "I see that green sticker, which means it's better". You never provide proof; you jump the gun saying a technology that's only been shown in controlled environments is better. If that were always the case, then we would be living in a world where our cars drive us, computers get 60 FPS in every game without any problems, and food always tastes like heaven.

Like most cases with AMD/Nvidia software comparisons, Nvidia's version is of higher quality, more reliable, released earlier, and sometimes the only one of its kind. AMD's knee-jerk, second-rate response technology to G-Sync isn't proven to work as well; they aren't even sure if it will compete at this point. Articles I have read say it won't work as well.

Nvidia is usually one step ahead of AMD and Gsync is just another example.

It's not a knock on AMD, they've been the underdog for years now.

How dare you talk bad about AMD! You must be a troll!

Yea, that's why those last like 6 Nvidia drivers have been so great with no complaints... Right?

In point of fact AMD's own Raja Koduri stated the issue fairly plainly:

Point proven. Either way, the monitors already contain this feature, so we're just waiting for DisplayPort 1.3. Good catch, I had not noticed that comment.

amstech amstech, TechSpot Enthusiast, said:

Agreed

You have no argument either; it's always "I see that green sticker, which means it's better". You never provide proof; you jump the gun saying a technology that's only been shown in controlled environments is better.

Well anything that has a green sticker is better. Not because I say so, because Nvidia makes it so. You pay more because you get more, of higher quality.

I don't argue with you because I would never say anything that wasn't backed heavily by truth and facts, as I have done here. Plus, you have no argument, you just don't like what I have to say. Regardless of my loyalties, everything I say is accurate and nothing more than an observation of what has happened and what is happening.

You have nothing to do except complain and practice denial. You're a fanboy in every sense of the word.

AMD is the underdog to Nvidia, period.

Yea, that's why those last like 6 Nvidia drivers have been so great with no complaints... Right?


Nvidia's software isn't perfect, it's just better than the competition. Graham took what I already know and put it into words. Words you cannot deny or argue, again.

GhostRyder GhostRyder said:

Well anything that has a green sticker is better. Not because I say so, because Nvidia makes it so. You pay more because you get more, of higher quality.

I don't argue with you because I would never say anything that wasn't backed heavily by truth and facts, as I have done here.

You have nothing to do except ***** and practice denial.

Let me just grab a quote one second...

I am not hating, but this is basically what you are saying xD: "OMG it doesnt have an nVidia logo on it, it must be bad!"

Thus ending this charade you call a debate. I'm sure you can grab a handful of stickers that say Nvidia on eBay. Try sticking some on a wall, your TV, and everything you own to see if that improves the performance. *snickers*

Back on subject, DisplayPort is finally being pushed, but in reality the problem is also that only high-end monitors have DisplayPort on them. HDMI and DVI are really the main interfaces that probably at least 75% of monitors have. So in this regard, it would seem that mobile will be first for the AMD side and desktop monitors will be first for Nvidia. Either way, we're going to either be waiting for monitors to support it, paying a lot of money for them, or be stuck with laptop support. I see a no-win scenario for anyone at this point unless you're willing to pay a lot or stick to mobile.

amstech amstech, TechSpot Enthusiast, said:

Let me just grab a quote one second....

That quote does nothing to help your case, I never said AMD was bad, just that Nvidia was one step ahead. How people interpret that is not up to me.

Thus ending this charade you call a debate.

I already knew this wasn't a debate and there was no argument, I said that several posts ago.

I'm sure you can grab a handful of stickers that say Nvidia on eBay. Try sticking some on a wall, your TV, and everything you own to see if that improves the performance. *snickers*


I wonder if they cost more than AMD stickers? Hahaha.

1 person liked this | GhostRyder GhostRyder said:

That quote does nothing to help your case. JC advocates AMD hardware and I never said AMD was bad, just that Nvidia was one step ahead. How people interpret that is not up to me.

Because @JC713 defends AMD hardware/software, that makes him biased? In most threads when he helps people, he actually recommends Nvidia cards. He debates and picks things based on facts, and if he recommends something he provides proof of his claim.

I wonder if they cost more than AMD stickers? Hahaha.

Only because Nvidia's patented glue contains 30% more glue, which in turn gives the sticker the potential to stick to all surfaces, and the special Nvidia G-series glue will come off without leaving residue. The AMD sticker contains glue that will leave a residue but costs half the price, so a user who removes the sticker has to get a damp cloth to clean it off.

There, now according to your logic this post just automatically won.

amstech amstech, TechSpot Enthusiast, said:

There, now according to your logic this post just automatically won.
There is no winner or loser here. Just facts.

Nvidia is usually one step ahead and they offer higher quality with just about everything they do compared to AMD, at a higher price of course.

This is not my opinion, its the truth, and there is no debate. Gsync is just another example. I would educate you but Graham just did a couple posts ago.

Your little jabs at me will do nothing to aid your empty claims and butthurt comments over your beloved underdog brand.

1 person liked this | GhostRyder GhostRyder said:

See, you think its one way or the other. There is no winner or loser here. Just facts, most of which have been recently provided and there is nothing you can do about it and you know it.

But there are no facts to prove it one way or the other because, as I have said, neither technology is open to the public; both have only been shown in controlled environments. G-Sync looks sick in the controlled environments set up for the public to see, but home users have not been able to see it in action for themselves beyond a couple of videos that can't easily show what we're seeing. Same with AMD's FreeSync: all the people who have seen either say they look good, and some have said G-Sync still looks better, which is fine because it sounds like having the separate module will result in it being better. However, AMD's is free and requires an AMD GPU, whereas Nvidia's requires a G-Sync capable monitor, which in turn costs an additional 300+ dollars for a 1080p screen. That's in the same realm as buying some really nice 120Hz monitors or 2560x1440 monitors (in fact more) plus an Nvidia GPU. Either way, no one is going to win, and I doubt many people will jump on the tech bandwagon for a while, because the price is not going to be justifiable for most, seeing as how a GTX 780 is 500+ and the monitor is 600+. The alternative won't be around for God knows how long from AMD, so honestly I could not care less.

I like G-Sync better and it's probably going to be better, but until we see this in home users' hands, it's still speculation.

Your little jabs at me will do nothing to aid your empty claims and butthurt comments over your beloved underdog brand.

Right back at ya.

JC713 JC713 said:

I like AMD GPUs and recommend them all the time, but this is about one company getting things right, and sooner, with higher quality, most of the time. Another example is how well Nvidia's V-Sync technology worked before this even came out, and how long it took AMD to figure out its frame stutter. It's nothing personal, it's just the truth.

At this point, both companies have filled in so many holes that either choice is a damn good one.

Eh. I guess AMD is always playing catch up but they are still a great company trying to make their customers happy.

JC713 JC713 said:

How dare you talk bad about AMD! You must be a troll! I am sure Ghostcryer will have paragraphs of irrelevant Pro-AMD information loaded in his little squirt gun and pointed at you when he reads your comment.

Eh, let's quit arguing please. I just came here to nicely defend AMD and it turned into an argument. Ok, nVidia has some nice potential and AMD has some nice potential. Now let's stop fussing about nonsense.

JC713 JC713 said:

Why? AMD have always been a hardware orientated company and have never professed to have any interest in software. Just look how long it took the company to divest itself of the high maintenance requirement for the All-In-Wonder lines (as an example).

AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.

Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.

Why cut AMD slack for being a market reactor more than a market shaper ? Ultimately, their current position is down to its own BoD's intransigence and short-sightedness.

Very true. I still pity AMD a bit though.

2 people like this | dividebyzero dividebyzero, trainee n00b, said:

Very true. I still pity AMD a bit though.

Each to their own, although I'd say if any group deserved pity it would be AMD's shareholders and engineers.

Pitying AMD as a company is akin to rewarding/excusing mediocrity. The board of directors have historically demonstrated either a singular lack of semiconductor intellect, or are the unfortunate owners of more misfiring synapses than a mental institution.

That such a talented cadre of engineers and architects have been saddled with a board of such limited strategic vision and wishy-washy decision making borders on tragedy - but it's hardly unique. I certainly don't see anyone (esp. the Sunnyvale cheer squad) pitying Intel's graphics engineers for their management's myopia over the years. That Intel still flourishes is largely due to the boneheaded (in)actions of its competitors' management - not just AMD's (Exxon, which put Zilog into a terminal nosedive, being a textbook example).

1 person liked this | JC713 JC713 said:

Pitying AMD as a company is akin to rewarding/excusing mediocrity. The board of directors have historically demonstrated either a singular lack of semiconductor intellect, or are the unfortunate owners of more misfiring synapses than a mental institution.

HAHAHA. Wow you made my day DBZ.

1 person liked this | cliffordcooley cliffordcooley, TechSpot Paladin, said:

HAHAHA. Wow you made my day DBZ.

Intel and nVidia all the way! hahaha

Seriously, I don't have anything against AMD. I have never used AMD for CPU needs, other than a few machines brought to me for repair. I once used AMD graphics and was reluctant to switch to nVidia. I only switched because of a computational project I was contributing to. Heck, during that time I was still wondering why I would want a $200 graphics card.

Moving forward and graduating from integrated graphics, AMD was my first experience. I didn't have a bad experience; like I said, I was prompted to switch because of the computational project. Since I'm loyal in nature, I will tend to stick with Intel and nVidia. Present day, I see no reason why AMD and nVidia should not be considered equals in the world of gaming. I know I'm not an enthusiast riding the edge of top-notch gear, but do we really have to be so petty as to throw rocks at the other side?

1 person liked this | GhostRyder GhostRyder said:

Intel and nVidia all the way! hahaha

Seriously, I don't have anything against AMD. I have never used AMD for CPU needs, other than a few machines brought to me for repair. I once used AMD graphics and was reluctant to switch to nVidia. I only switched because of a computational project I was contributing to. Heck, during that time I was still wondering why I would want a $200 graphics card.

Moving forward and graduating from integrated graphics, AMD was my first experience. I didn't have a bad experience; like I said, I was prompted to switch because of the computational project. Since I'm loyal in nature, I will tend to stick with Intel and nVidia. Present day, I see no reason why AMD and nVidia should not be considered equals in the world of gaming. I know I'm not an enthusiast riding the edge of top-notch gear, but do we really have to be so petty as to throw rocks at the other side?

Well, and why would you? If you are happy, then why make a change? If you are happy with your cards, CPU, or whatever, then don't change, and no one should be able to force you to do otherwise.

I find your comment the perfect comment when talking about the problems these days clifford. Well said sir!
