Nvidia fires back regarding AMD's FreeSync technology demo

Catalyst Control Center has already been programmed. Their cards already support controlling the signal in question. Laptop monitors have already been marketed to support varying frame rates. The R&D expense has already been paid. It seems it was chalked up as a loss and nearly forgotten, but that's just me looking at how the news came out. Nearly forgotten, at least, until G-Sync was introduced. I'll bet if AMD came out with a viable alternative, it wouldn't be free either.
 
You look at this like it's competition, when in reality it is generosity vs. greed. According to AMD, this tech works without incorporating a $200 chip into the monitor. Apparently you don't like being told that your favorite company isn't perfect, so I'm done arguing here.
I would give up that argument with the greenboy troll of the forums. If you stuck an Nvidia sticker on a brick wall, he would argue for days about why that wall was better than all the other walls.

Catalyst Control Center has already been programmed. Their cards already support controlling the signal in question. Laptop monitors have already been marketed to support varying frame rates. The R&D expense has already been paid. It seems it was chalked up as a loss and nearly forgotten, but that's just me looking at how the news came out. Nearly forgotten, at least, until G-Sync was introduced. I'll bet if AMD came out with a viable alternative, it wouldn't be free either.
But this is a viable alternative; we have seen one showcase of it so far, and that's it. It may have been around for a while, but there had to be reasons for letting it sit (perhaps waiting for the tech to catch up and actually need this). Either way, besides a few people's opinions and the smoke being blown from Nvidia and AMD, we have no idea what either will really do, or whether paying an extra 300 bucks minimum for a monitor is worth it. For all we know, it could end up being a complete flop and a waste of our time, and this so-called problem could become the next "Frame Stuttering" debate or the like.
 
AMD should focus on their drivers and getting some frickin' R-series GPUs out so we can actually buy them, instead of trying to distract us with this sh*t. Even if AMD did follow through with FreeSync, it's still a couple of years away, since DP 1.3 would be required. Man, AMD pisses me off sometimes.
 
For all we know, it could end up being a complete flop and a waste of our time, and this so-called problem could become the next "Frame Stuttering" debate or the like.
I think it is the foundation of a whole new multimedia specification: a specification where display refresh rates are controlled by the media being rendered. Once they figure out how, they can then introduce a new cabling method. Just think: for this to really catch on, it will need to be used for more than just gaming.
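The idea above can be made concrete with a toy calculation (purely illustrative; the function and numbers are my own, not anything AMD or Nvidia has published): with a fixed refresh clock, a finished frame has to wait for the next tick, while a content-driven refresh would update the instant the frame is ready.

```python
def leftover_wait_ms(frame_time_ms, refresh_hz=60.0):
    """With a fixed refresh clock, a finished frame waits for the next tick."""
    interval = 1000.0 / refresh_hz            # 16.67 ms between ticks at 60 Hz
    remainder = frame_time_ms % interval      # how far past the last tick we finished
    return 0.0 if remainder == 0 else interval - remainder

# Content-driven refresh: the display fires when the frame is ready, so the wait is 0.
for ft in (20.0, 25.0, 33.0):
    print(f"{ft} ms frame: fixed-clock wait {leftover_wait_ms(ft):.2f} ms, adaptive wait 0.00 ms")
```

A 20 ms frame on a 60 Hz clock sits idle for roughly 13 ms before it can be shown; a variable-refresh display would show it immediately, which is the whole pitch behind G-Sync/FreeSync.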
 
Nvidia is always one step ahead and has been for years. This truth makes me a fanboy? Hahaha, so be it.
nVidia isn't one step ahead when it comes to OpenCL and GL, is it? I didn't think so. Look, I am neutral (neither an nVidia nor an AMD fanboy), but I hate when people bash AMD because they are struggling, and I hate when people take it as an opportunity to bash them for being a "step behind". Ok, they may be a "step behind", but cut them some slack; the competition is hot and AMD is trying to gain as much advantage as they can in a losing battle.
 
the competition is hot and AMD is trying to gain as much advantage as they can in a losing battle.
I like AMD GPUs and recommend them all the time, but this is about one company getting things right sooner, and with higher quality, most of the time. Another example is how well Nvidia's Vsync technology worked before this even came out, and how long it took AMD to figure out its frame stutter. It's nothing personal, it's just the truth.
At this point, both companies have filled in so many holes that either choice is a damn good one.
 
There isn't any competition. This technology was announced yesterday, when G-Sync was announced MONTHS ago. No one except AMD knows its potential or how well it actually works, including you (and yes, me). There's a difference between the truth and trying to predict the future.
AMD's GPUs have had variable VBLANK capability since the Cayman series, AFAIK. Typical AMD: add features and don't bother to market them until someone else shows them what to do.
FreeSync may be a nice addition to AMD's mobile GPU arsenal, but they still face an uphill battle overcoming the negativity they engendered over mobile drivers and GPU usage (Enduro).
Seriously, @GhostRyder, nVidia didn't come forward with an attack; someone went to their booth and asked them questions. nVidia was answering the questions that were asked of them on their own ground. nVidia has nothing to be mad at.
In point of fact AMD's own Raja Koduri stated the issue fairly plainly:
Koduri did admit that NVIDIA deserved credit for seeing this potential use of the variable refresh feature and bringing it to market as quickly as they did. It has raised awareness of the issue and forced AMD and the rest of the display community to take notice. But clearly AMD's goal is to make sure that it remains a proprietary feature for as little time as possible.
As it stands today, the only way to get variable refresh gaming technology on the PC is to use NVIDIA's G-Sync enabled monitors and GeForce graphics cards. It will likely take until the ratification and release of DisplayPort 1.3 monitors before AMD Radeon users will be able to enjoy what I definitely believe is one of the best new technologies for PC gaming in years.
Basically not too dissimilar to the Optimus, Vsync, frame pacing, and GeForce Experience technologies: Nvidia being a software-orientated company looking at the end-user experience to maintain their brand via both hardware and software, with AMD caught flat-footed due to their almost non-existent interest in the software side of the business.
I hate when people take it as an opportunity to bash them for being a "step behind". Ok, they may be a "step behind", but cut them some slack.
Why? AMD have always been a hardware-orientated company and have never professed any interest in software. Just look how long it took the company to divest itself of the high-maintenance All-In-Wonder lines (as an example).
AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.
Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.
Why cut AMD slack for being a market reactor rather than a market shaper? Ultimately, their current position is down to their own BoD's intransigence and short-sightedness.
 
I haven't seen a G-Sync monitor yet, but the price is crazy. Companies should be focusing on higher refresh rates instead of adaptively lowering them. I saw no tearing on an Asus 144 Hz monitor regardless of the framerate (some games running well over 300 FPS). Considering the higher-refresh-rate monitors only set you back about $100 more than the 60 Hz competition, I see that as the better overall solution. BTW, it works fine with today's DVI standards :).
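For what it's worth, the arithmetic behind that tearing claim is simple: each refresh stays on screen for 1000/Hz milliseconds, so any tear artifact on a 144 Hz panel persists for well under half as long as on a 60 Hz one. A quick sanity check (Python used just for the arithmetic):

```python
def refresh_interval_ms(hz):
    """How long each refresh cycle stays on screen, in milliseconds."""
    return 1000.0 / hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {refresh_interval_ms(hz):.2f} ms per refresh")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 144 Hz -> 6.94 ms
```

That shorter persistence is a plausible reason a tear is hard to spot at 144 Hz even without any sync technology, though it reduces tearing's visibility rather than eliminating it the way G-Sync/FreeSync aim to.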
 
I would give up that argument with the greenboy troll of the forums.
You have no argument. You never do. I see things for what they are; you choose not to. All trolls call other people trolls/names first. You should listen to your own sig advice, namecaller; you're once again the first one to do it.
In point of fact AMD's own Raja Koduri stated the issue fairly plainly:

Basically not too dissimilar to the Optimus, Vsync, frame pacing, and GeForce Experience technologies: Nvidia being a software-orientated company looking at the end-user experience to maintain their brand via both hardware and software, with AMD caught flat-footed due to their almost non-existent interest in the software side of the business.
Optimus was a pleasant surprise on my L702X with Intel HD Graphics + GT550M. Works quite nicely.
 
What AMD did was shoot down NVIDIA's plans for marketing N-Sync (more in line with their N-VIDIA name) for laptops. I understand that NVIDIA didn't mention the implementation for laptop displays; otherwise AMD wouldn't have pointed it out. Imagine them selling "G-Sync capable" laptops over the next few years, only for consumers to realize the hardware was already capable without the extra cost?

I'm going to assume NVIDIA is going to REFINE their statement on this by including their plans for eliminating screen tearing on laptops: either by insisting that G-SYNC SHOULD be added to laptops, or by conceding to AMD's solution. They'll look bad with laptop consumers if they ignore the applications in that hardware segment. Also note, though: NVIDIA has implemented their G-Sync tech on 24" and up monitors for now, where the effects are most noteworthy, I suppose (and to justify the cost).
 
otherwise AMD wouldn't have pointed it out. Imagine them selling "G-Sync capable" laptops over the next few years, only for consumers to realize the hardware was already capable without the extra cost?
Like most cases with AMD/Nvidia software comparisons, Nvidia's version is of higher quality, more reliable, released earlier, and sometimes the only one of its kind. AMD's knee-jerk, second-rate response to G-Sync isn't proven to work as well; they aren't even sure if it will compete at this point. Articles I have read say it won't work as well.
Nvidia is usually one step ahead of AMD, and G-Sync is just another example.
It's not a knock on AMD; they've been the underdog for years now.

AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.
Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.
Why cut AMD slack for being a market reactor rather than a market shaper? Ultimately, their current position is down to their own BoD's intransigence and short-sightedness.
How dare you talk bad about AMD! You must be a troll!
I am sure Ghostcryer will have paragraphs of irrelevant Pro-AMD information loaded in his little squirt gun and pointed at you when he reads your comment.
 
I think it is the foundation of a whole new multimedia specification: a specification where display refresh rates are controlled by the media being rendered. Once they figure out how, they can then introduce a new cabling method. Just think: for this to really catch on, it will need to be used for more than just gaming.
Agreed
You have no argument. You never do. I see things for what they are; you choose not to. All trolls call other people trolls/names first. You should listen to your own sig advice, namecaller; you're once again the first one to do it.

Optimus was a pleasant surprise on my L702X with Intel HD Graphics + GT550M. Works quite nicely.
You have no argument either; it's always "I see that green sticker, which means it's better". You never provide proof; you jump the gun saying a technology that's only been shown in controlled environments is better. If that were always the case, we would be living in a world where our cars drive us, computers get 60 FPS in every game without any problems, and food always tastes like heaven.

Like most cases with AMD/Nvidia software comparisons, Nvidia's version is of higher quality, more reliable, released earlier, and sometimes the only one of its kind. AMD's knee-jerk, second-rate response to G-Sync isn't proven to work as well; they aren't even sure if it will compete at this point. Articles I have read say it won't work as well.
Nvidia is usually one step ahead of AMD, and G-Sync is just another example.
It's not a knock on AMD; they've been the underdog for years now.
How dare you talk bad about AMD! You must be a troll!
:D
Yeah, that's why those last like 6 Nvidia drivers have been so great, with no complaints... Right?
In point of fact AMD's own Raja Koduri stated the issue fairly plainly:
Point proven. Either way, the monitors already contain this feature, so we're just waiting for DisplayPort 1.3. Good catch; I had not noticed that comment.
 
Agreed
You have no argument either; it's always "I see that green sticker, which means it's better". You never provide proof; you jump the gun saying a technology that's only been shown in controlled environments is better.
Well, anything that has a green sticker is better. Not because I say so, but because Nvidia makes it so. You pay more because you get more, of higher quality.
I don't argue with you because I would never say anything that wasn't backed heavily by truth and facts, as I have done here. Plus, you have no argument; you just don't like what I have to say. Regardless of my loyalties, everything I say is accurate and nothing more than an observation of what has happened and what is happening.
You have nothing to do except complain and practice denial. You're a fanboy in every sense of the word.
AMD is the underdog to Nvidia, period.

Yeah, that's why those last like 6 Nvidia drivers have been so great, with no complaints... Right?
Nvidia's software isn't perfect, it's just better than the competition. Graham took what I already know and put it into words. Words you cannot deny or argue with, again.
 
Well, anything that has a green sticker is better. Not because I say so, but because Nvidia makes it so. You pay more because you get more, of higher quality.
I don't argue with you because I would never say anything that wasn't backed heavily by truth and facts, as I have done here.
You have nothing to do except ***** and practice denial.
Let me just grab a quote one second...
I am not hating, but this is basically what you are saying xD: "OMG it doesn't have an nVidia logo on it, it must be bad!"

Thus ending this charade you call a debate. I'm sure you can grab a handful of stickers that say Nvidia on eBay. Try sticking some on a wall, your TV, and everything you own to see if that improves the performance. *snickers*

Back on subject: DisplayPort is finally being pushed, but in reality the problem will also coincide with the fact that only high-end monitors have DisplayPort on them. HDMI and DVI are what probably at least 75% of monitors have. So in this regard, it would seem that mobile will come first for the AMD side and desktop monitors will come first for Nvidia. Either way, we're going to be either waiting for the monitors to support it, paying a lot of money for them, or stuck with laptop support. I see a no-win scenario for anyone at this point unless you're willing to pay a lot or stick to mobile.
 
Let me just grab a quote one second....
That quote does nothing to help your case; I never said AMD was bad, just that Nvidia was one step ahead. How people interpret that is not up to me.
Thus ending this charade you call a debate.
I already knew this wasn't a debate and there was no argument, I said that several posts ago.
I'm sure you can grab a handful of stickers that say Nvidia on eBay. Try sticking some on a wall, your TV, and everything you own to see if that improves the performance. *snickers*
I wonder if they cost more than AMD stickers? Hahaha.
 
That quote does nothing to help your case. JC advocates AMD hardware, and I never said AMD was bad, just that Nvidia was one step ahead. How people interpret that is not up to me.
Because @JC713 defends AMD hardware/software, that makes him biased? In most threads where he helps people, he actually recommends Nvidia cards. He debates and picks things based on facts, and if he recommends something he provides proof of his claim.
I wonder if they cost more than AMD stickers? Hahaha.
Only because Nvidia's patented glue contains 30% more glue, which in turn lets the sticker stick to all surfaces, and the special Nvidia G-series glue will come off without leaving residue. The AMD sticker contains glue that will leave a residue but costs half the price, so a user who removes the sticker has to get a damp cloth to clean it off.
[Image: nvidia-4-logo-primary.jpg]

There, now according to your logic this post just automatically won.
 
There, now according to your logic this post just automatically won.
There is no winner or loser here. Just facts.
Nvidia is usually one step ahead, and they offer higher quality with just about everything they do compared to AMD, at a higher price of course.
This is not my opinion, it's the truth, and there is no debate. G-Sync is just another example. I would educate you, but Graham just did a couple of posts ago.
Your little jabs at me will do nothing to aid your empty claims and butthurt comments over your beloved underdog brand.
 
See, you think it's one way or the other. There is no winner or loser here, just facts, most of which have been recently provided, and there is nothing you can do about it and you know it.
But there are no facts to prove it one way or the other, because as I have said, neither technology is open to the public; each has only been shown in a controlled environment. G-Sync looks sick in the controlled demos the public has seen, but home users have not been able to see it in action for themselves beyond a couple of videos that can't easily show what we're seeing. Same with AMD's FreeSync: everyone who has seen either says they look good, and some have said G-Sync still looks better, which is plausible, since it sounds like having the separate module will give better results. However, AMD's is free and requires an AMD GPU, whereas Nvidia's requires a G-Sync-capable monitor, which costs an additional $300+ for a 1080p screen; that is in the same realm as buying a really nice 120 Hz monitor or a 2560x1440 monitor (in fact more), plus an Nvidia GPU.

Either way, no one is going to win, and I doubt many people will jump on the tech bandwagon for a while, because the price is not going to be justifiable for most, seeing as a GTX 780 is $500+ and the monitor is $600+. The alternative won't be around for God knows how long from AMD, so honestly I could not care less.

I like G-Sync better, and it's probably going to be better, but until we see this in home users' hands, it's still speculation.

Your little jabs at me will do nothing to aid your empty claims and butthurt comments over your beloved underdog brand.
Right back at ya.
 
I like AMD GPUs and recommend them all the time, but this is about one company getting things right sooner, and with higher quality, most of the time. Another example is how well Nvidia's Vsync technology worked before this even came out, and how long it took AMD to figure out its frame stutter. It's nothing personal, it's just the truth.
At this point, both companies have filled in so many holes that either choice is a damn good one.
Eh. I guess AMD is always playing catch-up, but they are still a great company trying to make their customers happy.
 
How dare you talk bad about AMD! You must be a troll! I am sure Ghostcryer will have paragraphs of irrelevant Pro-AMD information loaded in his little squirt gun and pointed at you when he reads your comment.
Eh, let's quit arguing, please. I just came here to nicely defend AMD and it turned into an argument. Ok, nVidia has some nice potential and AMD has some nice potential. Now let's stop fussing about nonsense.
 
Why? AMD have always been a hardware-orientated company and have never professed any interest in software. Just look how long it took the company to divest itself of the high-maintenance All-In-Wonder lines (as an example).
AMD have been swimming against the tide of a unified hardware/software ecosystem using the flawed open source argument since their inception. Virtually every other company covets and acquires IP at breakneck speed, whilst AMD have a record of divesting themselves of theirs for very short term financial gain.
Hardly surprising that some people get a little peeved at AMD's misfiring board decisions diluting the brand and market position on occasion.
Why cut AMD slack for being a market reactor rather than a market shaper? Ultimately, their current position is down to their own BoD's intransigence and short-sightedness.
Very true. I still pity AMD a bit though.
 
Very true. I still pity AMD a bit though.
Each to their own, although I'd say if any group deserved pity it would be AMD's shareholders and engineers.
Pitying AMD as a company is akin to rewarding/excusing mediocrity. The board of directors have historically demonstrated either a singular lack of semiconductor intellect, or are the unfortunate owners of more misfiring synapses than a mental institution.

That such a talented cadre of engineers and architects has been saddled with a board of such limited strategic vision and wishy-washy decision-making borders on tragedy, but is hardly unique. I certainly don't see anyone (esp. the Sunnyvale cheer squad) pitying Intel's graphics engineers for their management's myopia over the years. That Intel still flourishes is largely due to the boneheaded (in)actions of its competitors' management, not just AMD's (Exxon, which put Zilog into a terminal nosedive, being a textbook example).
 
Pitying AMD as a company is akin to rewarding/excusing mediocrity. The board of directors have historically demonstrated either a singular lack of semiconductor intellect, or are the unfortunate owners of more misfiring synapses than a mental institution.
HAHAHA. Wow you made my day DBZ.
 
HAHAHA. Wow you made my day DBZ.
Intel and nVidia all the way! hahaha

Seriously, I don't have anything against AMD. I have never used AMD for CPU needs, other than a few machines brought to me for repair. I once used AMD graphics and was reluctant to switch to nVidia; I only switched because of a computational project I was contributing to. Heck, at that time I was still wondering why I would want a $200 graphics card.

Moving forward and graduating from integrated graphics, AMD was my first experience, and it wasn't a bad one; like I said, I was prompted to switch because of the computational project. Since I'm loyal by nature, I will tend to stick with Intel and nVidia. Present day, I see no reason why AMD and nVidia should not be considered equals in the world of gaming. I know I'm not an enthusiast riding the edge of top-notch gear, but do we really have to be so petty as to throw rocks at the other side?
 