Nvidia GeForce RTX 4090 is 78% faster than RTX 3090 Ti, hits 3.0 GHz in leaked benchmarks

Not here to troll, just to make a point about the human eye.
There is plenty of research on this subject from the past.
Here is just one: https://www.healthline.com/health/human-eye-fps#takeaway

Copied the wrong link the first time.

 
Not here to troll, just to make a point about the human eye.
There is plenty of research on this subject from the past.
Here is just one: https://www.healthline.com/health/human-eye-fps#takeaway
Here is one from my perspective:
The eye has muscle memory too, and just like your biceps and triceps, if you train it, it can get accustomed to slight changes or lag. So you can train your eyes to notice a difference between 60 and 120 Hz. Also, because there is more lag in what you see on a slower refresh rate monitor, an opponent on a higher refresh rate monitor will see you before you see them and have the upper hand, just like in the video I showed. Lastly, and I feel like a broken record here, cursor motion is also smoother when you compare a 30 Hz monitor to a 120 Hz one. We aren't arguing about just seeing motion; we are saying that once you saturate 60 fps (16 ms per frame) media content, going to 120 fps (8 ms per frame) or better still gives a perceived advantage.
Your article states that the eye can register images shown for as little as 13 ms, which works out to roughly 75 fps, so your own article disproves your point. And then there is this:

"As technology evolves, experts may continue to develop new ways to gauge what the eye is capable of seeing." That was based on a 2014 article, when there were only a handful of high refresh rate monitors on the market. Your turn!
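Just to lay out the frame-time arithmetic being argued over here, a quick sketch; nothing is assumed beyond the frame rates already mentioned above:

```python
# Frame time in milliseconds for a given frame rate: 1000 / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 75, 120, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")

# 60 fps -> 16.7 ms, 75 fps -> 13.3 ms, 120 fps -> 8.3 ms,
# 144 fps -> 6.9 ms, 240 fps -> 4.2 ms
```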
 
In real life, wave your hand fast in front of your eyes and let me know if you see every hand position or just pieces of the motion. The human eye is not a slow-motion CCD.
Or stick a paint mark on a PC fan blade, spin the fan starting at 1 RPM and then faster, and let me know at what RPM you stop seeing the mark and see only a circle.
If you can still follow the motion frame by frame, then you will see your 144, 240 and 360 Hz.

To me it's all marketing bullshlt and human ego about who has the fastest LCD or GPU.
lol...

What another epic fail... Again, higher frame rates and higher refresh rates in games have nothing to do with what you can see.

I already told you this... YOU ARE NOT WATCHING ANYTHING... you are playing.

People like the unfettered movement of their character, jumping over a table or through a window with precision, when gaming at higher frames and frequencies...

Again, it is utterly hilarious that you have no awareness of this... and actually think it's about your vision.
 
Again, it is utterly hilarious that you have no awareness of this... and actually think it's about your vision.
You could be right about my vision; I haven't been to a doctor in recent years.
But the article says:

"In the past, experts maintained that most people’s maximum ability to detect flicker ranged between 50 and 90 Hz, or that the maximum number of frames per second that a person could see topped out around 60."
"However, there are a few types of animals with very good visual acuity that’s even better than ours. This includes some birds of prey, who can see as many as 140 frames per second."

So you gamers @ 144 Hz must be some sort of wild animal.

 
You could be right about my vision; I haven't been to a doctor in recent years.
But the article says:

"In the past, experts maintained that most people’s maximum ability to detect flicker ranged between 50 and 90 Hz, or that the maximum number of frames per second that a person could see topped out around 60."
"However, there are a few types of animals with very good visual acuity that’s even better than ours. This includes some birds of prey, who can see as many as 140 frames per second."

So you gamers @ 144 Hz must be some sort of wild animal.

I guess you aren't familiar with the term "outliers"? Or do you think that humans are one-size-fits-all in their characteristics? Hmmm?
 
People do not buy GPUs to game at 60 Hz...
OK, so that makes absolutely no sense whatsoever. Everyone who games at 60 Hz still needs a GPU to do so, so they DO buy GPUs to game at 60 Hz. I don't know what world you live in, but it's not this one.
I guess you aren't familiar with the term "outliers"? Or do you think that humans are one-size-fits-all in their characteristics? Hmmm?
There are always outliers, but they're called outliers because they are exceedingly rare. I believe that many people can see faster than 90 Hz, but I would expect it to be less than 0.1% of the humans on Earth. Not exactly relevant in a macro-marketing discussion.
 
You could be right about my vision; I haven't been to a doctor in recent years.
But the article says:

"In the past, experts maintained that most people’s maximum ability to detect flicker ranged between 50 and 90 Hz, or that the maximum number of frames per second that a person could see topped out around 60."
"However, there are a few types of animals with very good visual acuity that’s even better than ours. This includes some birds of prey, who can see as many as 140 frames per second."

So you gamers @ 144 Hz must be some sort of wild animal.


Again... gaming at 144 Hz+ has nothing to do with PEOPLE'S VISION. It is about the movement of your character within the game world and how your character reacts to the game world.

It's about precision...


Again, you are PLAYING a movable character within a game world, not watching television. What you are talking about has NOTHING to do with gaming; it has to do with watching something on a screen (which is NOT what we are discussing).

And you still keep bringing vision up... because you are 100% unaware of character movement within a game and probably have never even played an FPS game in your life.
 
OK, so that makes absolutely no sense whatsoever. Everyone who games at 60 Hz still needs a GPU to do so, so they DO buy GPUs to game at 60 Hz. I don't know what world you live in, but it's not this one.

There are always outliers, but they're called outliers because they are exceedingly rare. I believe that many people can see faster than 90 Hz, but I would expect it to be less than 0.1% of the humans on Earth. Not exactly relevant in a macro-marketing discussion.
The enthusiast niche market lines up with that 0.1% of the world figure. I guess we outliers are the ones who push progress and innovation. When I hear about a 4K QD-OLED 240 Hz HDR display with 1,000 nits of brightness over DP 2.0, I get excited, but I guess that would trigger someone who isn't part of that niche outlier market. Some call us the PC master race, and now I guess some call us humans with animal-like eye reflexes 😜.
 
OK, so that makes absolutely no sense whatsoever. Everyone who games at 60 Hz still needs a GPU to do so, so they DO buy GPUs to game at 60 Hz. I don't know what world you live in, but it's not this one.

There are always outliers, but they're called outliers because they are exceedingly rare. I believe that many people can see faster than 90 Hz, but I would expect it to be less than 0.1% of the humans on Earth. Not exactly relevant in a macro-marketing discussion.

There is a difference between having a GPU and buying a GPU.

Cell phones have a GPU... but people don't buy a GPU for their phones... understand? (Even newer cell phones are 90 Hz or 120 Hz...)

A PlayStation 5 has a GPU, but people don't buy a GPU for their PS5... understand?


People do not go out and BUY a discrete GPU to play games at 60 Hz... they are buying the graphics card for higher refresh rates and more frames. Otherwise, if you just want 60 Hz, you'll stick to onboard graphics, with no need to BUY a GPU.
 
You can say the exact same thing about the generation before the current generation, and go on and on and on. Your point being?
If you don't get my point, then you're on the wrong forum and should be reading comics instead.
 
The enthusiast niche market lines up with that 0.1% of the world figure. I guess we outliers are the ones who push progress and innovation. When I hear about a 4K QD-OLED 240 Hz HDR display with 1,000 nits of brightness over DP 2.0, I get excited, but I guess that would trigger someone who isn't part of that niche outlier market. Some call us the PC master race, and now I guess some call us humans with animal-like eye reflexes 😜.
I'm not triggered at all. I understand that high-refresh monitors decrease input lag and in competitive games like Counter-Strike: Potato Offensive (Steve Walton's name for it... :laughing: ), it has been demonstrated (by Linus Sebastian of all people) that faster refresh monitors make you an instinctively better marksman.
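For anyone curious, the refresh-rate portion of that input lag is easy to ballpark. A minimal back-of-the-envelope sketch, assuming only that an input lands at a random point within the refresh interval (so it waits half an interval on average) and ignoring game, driver, and panel-response latency:

```python
# Rough average wait before a new frame can be shown, assuming events
# arrive at random points within the refresh interval (half the interval
# on average). Ignores render, driver, and pixel-response latency.
def avg_refresh_wait_ms(refresh_hz: float) -> float:
    return (1000.0 / refresh_hz) / 2

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz -> ~{avg_refresh_wait_ms(hz):.1f} ms average wait")

# 60 Hz -> ~8.3 ms, 120 Hz -> ~4.2 ms, 144 Hz -> ~3.5 ms, 240 Hz -> ~2.1 ms
```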

What drives me crazy is when the noobs get this stupid idea in their heads that if they don't bankrupt themselves on halo products they'll have a subpar gaming experience. I guess it's more of an altruistic thing because I hate to see people struggling just because they don't know any better. Hell, I remember 14 years ago when 40+fps on average was considered pretty good and it really was pretty good.

The most legendary "hard-on-the-hardware" game in history is Crysis. It came out in 2007 and initially, no single card could run it properly.
[Image: Crysis_02-p.webp — 2008 Crysis GPU benchmark chart]

This picture (from 2008) shows an array of the most potent video cards at the time and none of them could crack 50fps. Nevertheless, people LOVED playing Crysis, even in the 40-49fps range. Nobody whined about getting less than 60fps, we just played the game and enjoyed it. Nowadays people whine about getting less than 120-140fps and, remembering how great Crysis was on my Phenom II X4 940 and XFX Radeon HD 4870 1GB (at about 45-47fps), it really makes me wonder what the hell is wrong with them. It's like listening to children crying about having to wear Nikes instead of Jordans. It's just cringeworthy! :laughing:
 
I'm not triggered at all. I understand that high-refresh monitors decrease input lag and in competitive games like Counter-Strike: Potato Offensive (Steve Walton's name for it... :laughing: ), it has been demonstrated (by Linus Sebastian of all people) that faster refresh monitors make you an instinctively better marksman.

What drives me crazy is when the noobs get this stupid idea in their heads that if they don't bankrupt themselves on halo products they'll have a subpar gaming experience. I guess it's more of an altruistic thing because I hate to see people struggling just because they don't know any better. Hell, I remember 14 years ago when 40+fps on average was considered pretty good and it really was pretty good.

The most legendary "hard-on-the-hardware" game in history is Crysis. It came out in 2007 and initially, no single card could run it properly.
[Image: Crysis_02-p.webp — 2008 Crysis GPU benchmark chart]

This picture (from 2008) shows an array of the most potent video cards at the time and none of them could crack 50fps. Nevertheless, people LOVED playing Crysis, even in the 40-49fps range. Nobody whined about getting less than 60fps, we just played the game and enjoyed it. Nowadays people whine about getting less than 120-140fps and, remembering how great Crysis was on my Phenom II X4 940 and XFX Radeon HD 4870 1GB (at about 45-47fps), it really makes me wonder what the hell is wrong with them. It's like listening to children crying about having to wear Nikes instead of Jordans. It's just cringeworthy! :laughing:
I remember Crysis because that game made me into an enthusiast. Lol, I was a member of the incrysis forums when they were a thing; I was obsessed with Crysis and still am. When Crysis 1 came out there were no high refresh rate monitors, so everyone had a level playing field and nobody had an advantage from a faster screen. Those were the good old days, although in the end those servers got plagued with cheaters, but I still used to own those cheaters even when they cheated. Eventually the hardware caught up: I went from an 8800 Ultra to a GTX 285, then dual Fermi 480 SCs in SLI, then a 580 Classified, then a 690 Classified. My screen went from a 1920x1650 60 Hz Samsung LED screen (probably TN) to an Asus 120 Hz 3D Vision 1080p screen.

How I know that screen performance matters: I later played Crysis 3 on an Asus 3440x1440 IPS 100 Hz screen where the input lag and response times were worse than on the 120 Hz TN 1080p one, and my competitive marksmanship took a hit. I specifically recall killcams showing the enemy seeing me way before I had seen them. I was like, wow, this is an eye opener if anything. Have you ever experienced that on a killcam, where your killer sees you way before you see them? Currently I have the LG CX OLED and my marksmanship improved, because no one has the competitive advantage of seeing me before I see them. I guess it comes down to personal experience. When you experience it you can't unexperience it.
 
The most legendary "hard-on-the-hardware" game in history is Crysis. It came out in 2007 and initially, no single card could run it properly.
That game was a real challenge for my 8800 GT; I had to turn some dials down to play it on a 21" Nokia CRT at 1600x1200. But I enjoyed the game at the time anyway.
 
When you experience it you can't unexperience it.
Agreed. That's why I never want to experience it. I'm happy as things are and to me, this is one of those "If it ain't broke, don't fix it!" scenarios. I don't do any PvP so I wouldn't get much of an advantage anyway.
That game was a real challenge for my 8800 GT; I had to turn some dials down to play it on a 21" Nokia CRT at 1600x1200. But I enjoyed the game at the time anyway.
Exactly. The 8800 GT was no match for cards like the GTX 280 and HD 4870, but even though you were getting (probably) below 40 fps on average, the game was still GREAT! You had what you had, you made it work and enjoyed it. The kids these days whine about EVERYTHING. I was just glad that I was able to play it at all. I remember some guy saying that he gets seasick if he plays a game at less than 80 fps. I remember saying "Man, that must suck because you can't even watch TV!". Of course, I only said that because I figured that he was full of it and just trying to act like a snob. :laughing:
 
The 8800 GT was no match for cards like the GTX 280
The 8800 GT was $250 and the GTX 280 was $700 with double the performance, but also more heat, and the power draw was 235 W :scream: compared to 125 W.
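Putting rough numbers on that trade-off, taking the prices, the TDPs, and the "double the performance" claim above at face value (the normalization to 1.0 is just for illustration):

```python
# Relative value, with 8800 GT performance normalized to 1.0 and the
# GTX 280 assumed to be exactly 2x, as claimed above.
cards = {
    "8800 GT": {"perf": 1.0, "price_usd": 250, "tdp_w": 125},
    "GTX 280": {"perf": 2.0, "price_usd": 700, "tdp_w": 235},
}

for name, c in cards.items():
    print(f"{name}: {c['perf'] / c['price_usd']:.4f} perf/$, "
          f"{c['perf'] / c['tdp_w']:.4f} perf/W")

# 8800 GT: 0.0040 perf/$, 0.0080 perf/W
# GTX 280: 0.0029 perf/$, 0.0085 perf/W
# i.e. roughly 29% worse per dollar, only ~6% better per watt.
```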
I remember saying "Man, that must suck because you can't even watch TV!". Of course, I only said that because I figured that he was full of it and just trying to act like a snob. :laughing:
Sony made some CRTs with 100 Hz for those people.
 
The 8800 GT was $250 and the GTX 280 was $700 with double the performance, but also more heat, and the power draw was 235 W :scream: compared to 125 W.

Sony made some CRTs with 100 Hz for those people.
They must have been exceedingly rare because I'd never heard of them. I thought that CRTs were already extremely fast when it came to refresh rates.

Nevertheless, gaming at 100 fps was relegated to VERY old or simple games like Commander Keen, and NTSC broadcasts were only 29.something fps no matter what the speed of the TV was. Anyone who paid more for a faster TV when they were still only getting ~30 fps really wasted their dollars. :laughing:
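The arithmetic behind that: at roughly 29.97 fps, broadcast content just gets the same frame held across several refreshes, so a faster TV adds repetition, not new frames. A quick sketch using only the ~30 fps figure above; the refresh rates are just illustrative:

```python
# How many display refreshes each broadcast frame occupies at ~29.97 fps.
CONTENT_FPS = 29.97  # NTSC frame rate

for refresh_hz in (60, 100, 120):
    repeats = refresh_hz / CONTENT_FPS
    print(f"{refresh_hz} Hz display: each frame shown ~{repeats:.1f} times")

# 60 Hz -> ~2.0x, 100 Hz -> ~3.3x, 120 Hz -> ~4.0x: the same ~30 frames
# of new content per second either way.
```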
 
Power usage should go down, not up lol
Yeah, but as I pointed out in an earlier thread about an RTX 4090 leak, the 4090 just looks like an insanely overclocked 3090 because it showed a 78% performance improvement coupled with a 78% increase in wattage. That's why I don't find the RTX 4000-series to be all that impressive at this point.
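That's the whole argument in one line of arithmetic: if performance and power both rise by the same 78%, efficiency doesn't move at all. A sketch based purely on the leaked percentages quoted, not on any measured numbers:

```python
# Perf-per-watt change when performance and power scale by the same factor.
scale_perf = 1.78   # +78% performance, per the leak
scale_power = 1.78  # +78% power, per the leak

efficiency_change = scale_perf / scale_power
print(f"Perf/W change: {efficiency_change:.2f}x")
# Prints 1.00x -> no efficiency gain, which is what an overclock
# (rather than a new architecture) looks like.
```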

Maybe nVidia is just cursed when it comes to the number 4 because here we have the RTX 4000-series which will do double duty as a barbecue just like the GTX 400-series.
 
They must have been exceedingly rare because I'd never heard of them. I thought that CRTs were already extremely fast when it came to refresh rates.

https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays

0.01 ms response, but limited by the phosphor to about 5 ms.

It was the last CRT TV my father had. I remember we broke our backs carrying it when we got it from the shop.

Maybe nVidia is just cursed when it comes to the number 4 because here we have the RTX 4000-series which will do double duty as a barbecue just like the GTX 400-series.

I skipped the 400 series, going from the 200 series straight to the 500 series, but I remember those were also hot and loud.
 
https://en.wikipedia.org/wiki/Comparison_of_CRT,_LCD,_plasma,_and_OLED_displays

0.01 ms response, but limited by the phosphor to about 5 ms.

It was the last CRT TV my father had. I remember we broke our backs carrying it when we got it from the shop.
Oh yeah, I remember how heavy those blasted CRTs were! Absolutely insane!
I skipped the 400 series, going from the 200 series straight to the 500 series, but I remember those were also hot and loud.
The last nVidia card I ever bought for actual constant use was a Palit 8500 GT. Later on, I bought a PNY 8400 GS because I wanted a PCI card for diagnostic work.

Ever since I bought my first XFX HD 4870, I haven't purchased one damn thing from nVidia and I've been perfectly happy not doing so. My Radeons have always served me well (except for the faulty RX 5700 XT), so I have been very happy with them. I just never got the itch to pay more for less performance, and that's what GeForces essentially are.
 
Not sure what many of you are on about, but many of us old-school gamers have been gaming at 1440p @ 85 Hz since the late 90s... with a Sony GDM-F500 / GDM-900.

My first gaming flat panel was a 1440p Overlord @ 100 Hz, and that is 11 years old...

Nobody games at 60 Hz... unless you are playing solitaire at the library, and chances are that is set to 78 Hz anyways.
 
I like this!
But the previous-gen Turing RTX 8000 has half the TDP of the 3090 and exhibits half the performance. So the 3090 might in turn have been an overclocked 2080 Ti :)
 