Expect plenty of large 4K OLED gaming monitors with 240Hz refresh rates to arrive next...

midian182

Staff member
Forward-looking: OLED gaming monitors are becoming increasingly popular and, while most are still pricey, they are expected to get cheaper as time goes on. But what does the future of these displays look like? According to an industry insider, a slew of new 240Hz 32-inch and larger OLED panels will arrive next summer.

Writing in a post on Reddit, a representative for boutique monitor startup Dough gave a good explainer of the different types of OLED panels – W-OLED, RGB OLED, and QD-OLED – including their structure, advantages, and disadvantages.

The post also includes a roadmap of expected OLED panels arriving next year, though the earliest entry, a 31.5-inch 4K 144Hz RGB OLED (the only one of this kind on the map) from JOLED, has been canceled. The Japanese company, formed from the OLED businesses of Sony and Panasonic in 2015, went bankrupt in March, meaning its panel isn't coming to market. The RGB OLED subpixel structure offers more color accuracy and better text rendering, and doesn't suffer from color-fringing issues caused by the unique subpixel layout of QD-OLED and W-OLED.

Includes top OLED monitor choices: The Best Gaming Monitors - Mid 2023

The middle of next year will see a slew of new OLED panels. Three W-OLEDs are expected to arrive from LG in 31.5-inch (4K), 34-inch (WQHD), and 39-inch (WQHD) sizes, all with 240Hz refresh rates. Samsung also has 31.5-inch 4K and 34-inch WQHD panels with 240Hz refresh rates on the map, along with a 27-inch QHD panel with a 360Hz refresh.

OLED monitors remain more expensive than LCDs. Next year's QHD models are expected to be in the range of $700 to $1100 while the 4K models are predicted to fall between $1000 and $1500.

Other panels not on the map are launching next year, but the Dough rep says they only included what they think are the most exciting ones.

Another noticeable absence from the roadmap is LCD panels. The rep says LG is following Samsung, which exited the LCD business last year, by drastically cutting down on these models, mainly because the likes of BOE can now make better-quality LCDs at lower prices subsidized by the Chinese government.

In June, industry insider Bob Raikes spoke with Merck, a supplier of liquid crystal materials to companies like Samsung, claiming the conversation led him to believe that no further advancements would be made in the core display technology. Merck later disputed the claim, emphasizing that technologies such as miniLED backlights and optical components, including quantum dots, phosphors, and films, will continue to be developed for LCD TVs.

The Dough rep also assured people that LCDs were here to stay, especially in lower-to-mid-range products, with their costs falling as refresh rates and resolutions increase.


 
The real question is: What for?
What kind of GPU is able to drive a game up to 240Hz@4K? It doesn't exist. Nvidia, the de facto monopolist, is raising prices of 1080p GPUs to $400 and promoting performance-killing path tracing on their 4090, which tanks performance under 30fps@4K.
 
"Futureproofing"
Except you actually can "futureproof" a display. A good display can last a decade without really needing an upgrade. I just bought an LG monitor to hook up to my Xbox. I seriously doubt an Xbox (or frankly any PC) is going to push higher than 4K 120Hz anytime soon, especially when I can just raise quality settings to keep myself there.
 
"Futureproofing"
Sorry, no such thing. I always bought midrange graphics, from $200 in the 2000s up to $400 in 2022. In 2018 I bought a relatively cheap 4K@60Hz monitor that I had to run at 1440p to get anywhere near 60fps. Today you have to spend upwards of $750 to play the average AAA game at 4K@60fps. Lithography hits the wall at 3nm, game developers get lazy with new titles (Remnant 2, anyone?). Competitive gamers don't care about 4K, more about 240Hz. On the Steam survey the most popular GPU (meaning: the one people can afford) is the GTX 1650. Don't futureproof with a $2000 monitor, kid; Nvidia will not deliver, mark my words.
Depends entirely on the game and what settings/rendering tech it has. For example, Doom Eternal will easily run over 240fps at 4K with an RTX 4090 (and a suitable CPU).
The few-years-old Doom and the ancient CS are literally the only two games I can think of that can run 4K@240fps. Maybe a few other competitive games could do that. And that's the point: competitive gamers don't buy 4K monitors. They buy 240Hz, believing it will improve their kill ratios. 4K@240Hz is a gimmick no one really needs, and a gimmick that will cost you a few thousand dollars.
 
Last edited:
Sorry, no such thing. I always bought midrange graphics, from $200 in the 2000s up to $400 in 2022. In 2018 I bought a relatively cheap 4K@60Hz monitor that I had to run at 1440p to get anywhere near 60fps. Today you have to spend upwards of $750 to play the average AAA game at 4K@60fps. Lithography hits the wall at 3nm, game developers get lazy with new titles (Remnant 2, anyone?). Competitive gamers don't care about 4K, more about 240Hz. On the Steam survey the most popular GPU (meaning: the one people can afford) is the GTX 1650. Don't futureproof with a $2000 monitor, kid; Nvidia will not deliver, mark my words.

The few-years-old Doom and the ancient CS are literally the only two games I can think of that can run 4K@240fps. Maybe a few other competitive games could do that. And that's the point: competitive gamers don't buy 4K monitors. They buy 240Hz, believing it will improve their kill ratios. 4K@240Hz is a gimmick no one really needs, and a gimmick that will cost you a few thousand dollars.
Plus, with an OLED you don't need to use features like black frame insertion. But it should at least be a good monitor when you upgrade to a 5090 or 6090 in the future.
 
Sorry, no such thing. I always bought midrange graphics, from $200 in the 2000s up to $400 in 2022. In 2018 I bought a relatively cheap 4K@60Hz monitor that I had to run at 1440p to get anywhere near 60fps. Today you have to spend upwards of $750 to play the average AAA game at 4K@60fps. Lithography hits the wall at 3nm, game developers get lazy with new titles (Remnant 2, anyone?). Competitive gamers don't care about 4K, more about 240Hz. On the Steam survey the most popular GPU (meaning: the one people can afford) is the GTX 1650. Don't futureproof with a $2000 monitor, kid; Nvidia will not deliver, mark my words.

The few-years-old Doom and the ancient CS are literally the only two games I can think of that can run 4K@240fps. Maybe a few other competitive games could do that. And that's the point: competitive gamers don't buy 4K monitors. They buy 240Hz, believing it will improve their kill ratios. 4K@240Hz is a gimmick no one really needs, and a gimmick that will cost you a few thousand dollars.


His point was that you future-proof your monitor by buying one that is superior to your GPU. That way, in a few years when you do upgrade your dGPU, and again a few years after that, those upgrades catch up to your monitor's abilities.

You can always run your monitor at a lower frequency...
 
LOL, what GPU will run 4K@240fps? Not even a 5090, I think.

Just give me high colour gamut, mini-LED, 120Hz, HDR1400, for $1.2K max. I'm not convinced about OLED for a monitor: lots of static elements and lots of text. I'd prefer mini-LED with at least 1200 zones until micro-LED becomes a thing.
 
His point was that you future-proof your monitor by buying one that is superior to your GPU. That way, in a few years when you do upgrade your dGPU, and again a few years after that, those upgrades catch up to your monitor's abilities.

You can always run your monitor at a lower frequency...
Not with an OLED. You remember they burn in, right?
 
Except you actually can "futureproof" a display. A good display can last a decade without really needing an upgrade. I just bought an LG monitor to be hooked up to my xbox. I seriously doubt an Xbox (or frankly any PC) is going to push higher than 4K 120hz anytime soon, especially when I can just raise quality settings to keep myself there.
Except you can't actually future-proof a display that cannot be fully utilized by our current hardware, with some very minor exceptions. Unless there's a huge leap in GPU performance in the next 5-10 years, by the time you get a GPU capable enough to do so, your display will be too old.

Sorry, no such thing. I always bought midrange graphics, from $200 in the 2000s up to $400 in 2022. In 2018 I bought a relatively cheap 4K@60Hz monitor that I had to run at 1440p to get anywhere near 60fps. Today you have to spend upwards of $750 to play the average AAA game at 4K@60fps. Lithography hits the wall at 3nm, game developers get lazy with new titles (Remnant 2, anyone?). Competitive gamers don't care about 4K, more about 240Hz. On the Steam survey the most popular GPU (meaning: the one people can afford) is the GTX 1650. Don't futureproof with a $2000 monitor, kid; Nvidia will not deliver, mark my words.

The few-years-old Doom and the ancient CS are literally the only two games I can think of that can run 4K@240fps. Maybe a few other competitive games could do that. And that's the point: competitive gamers don't buy 4K monitors. They buy 240Hz, believing it will improve their kill ratios. 4K@240Hz is a gimmick no one really needs, and a gimmick that will cost you a few thousand dollars.
Hence the quotation marks.
 
If they do not come with DP 2.0+, then nobody cares. Squeezing bits out of DP 1.4 has become absurd.

I'm sure LG at least will support HDMI 2.1, which more than meets the bandwidth requirements for the display.
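For what it's worth, the bandwidth question can be sanity-checked with back-of-the-envelope arithmetic. This is only a rough sketch: the ~8% blanking overhead is an assumed approximation of reduced-blanking timings, and the effective link rates are the commonly cited post-encoding figures for each standard.

```python
# Rough check of the link bandwidth a 4K 240 Hz signal needs, versus the
# effective (post-encoding) data rates of common display interfaces.

def video_data_rate_gbps(width, height, refresh_hz, bits_per_pixel,
                         blanking_overhead=1.08):
    """Approximate uncompressed video data rate in Gbit/s.

    blanking_overhead (~8%) is an assumption roughly modeling
    reduced-blanking (CVT-R2-style) timings.
    """
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

# 4K @ 240 Hz with 10-bit RGB color (30 bits per pixel)
needed = video_data_rate_gbps(3840, 2160, 240, 30)  # ~64.5 Gbit/s

# Commonly cited effective data rates, Gbit/s
links = {
    "DP 1.4 (HBR3, 8b/10b)": 25.92,
    "HDMI 2.1 (FRL, 16b/18b)": 42.67,
    "DP 2.0 (UHBR20, 128b/132b)": 77.37,
}

for name, rate in links.items():
    verdict = "uncompressed OK" if rate >= needed else "needs DSC"
    print(f"{name}: {rate:.2f} Gbit/s -> {verdict}")
print(f"required: ~{needed:.1f} Gbit/s")
```

By this estimate, both DP 1.4 and HDMI 2.1 would need Display Stream Compression for 4K 240Hz at 10-bit color, while DP 2.0's UHBR20 mode could carry it uncompressed, which is roughly the gap the two comments above are arguing over.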
 
The real question is: What for?
What king of GPU is able to drive a game up to 240Hz@4K? It doesn't exist. nVidia, de facto monopolist, is raising prices of 1080p GPUs to $400, and promoting performance killing pathtracing on Their 4090, that also tanks performance under 30fps@4K.

A lot of people will run at 1440p for that exact reason. Even then, in situations where VRR can't be used, a higher refresh rate reduces the overall latency of the displayed image, so it certainly isn't a *bad* thing even if you can't reach those refresh rates.

Case in point, I run my LG C2 at 1440p because a 3080 Ti struggles to hit even 4K60 in most titles, and I prefer the higher framerates to the increased resolution.
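The latency point above is easy to quantify: even when the GPU renders well below the panel's native rate, a shorter refresh cycle means a finished frame waits less time to be scanned out. A minimal sketch of the arithmetic:

```python
# Why a high refresh rate helps even below its native frame rate: each refresh
# cycle is shorter, so a completed frame spends less time waiting for the next
# scanout to begin.

def refresh_interval_ms(refresh_hz: float) -> float:
    """Duration of one refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz: one refresh every {refresh_interval_ms(hz):5.2f} ms")
```

At 240Hz a frame waits at most about 4.2 ms for a refresh, versus about 16.7 ms at 60Hz, which is why the higher refresh rate still trims latency even at sub-240 framerates.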
 
Except you can't actually future-proof a display that cannot be fully utilized by our current hardware, with some very minor exceptions. Unless there's a huge leap in GPU performance in the next 5-10 years, by the time you get a GPU capable enough to do so, your display will be too old.
Why, pray tell, would my display be too old? I don't see any mention of a new HDMI standard coming out that has a physically different connector layout. I simply fail to see how the age of my display correlates to the performance of a GPU (especially one found in a console). 4K 120Hz is a struggle now, it was a struggle years ago, and it will be a struggle years from now, so long as graphical effects like ray tracing continue to improve and take their toll on performance.
 
Why, pray tell, would my display be too old? I don't see any mention of a new HDMI standard coming out that has a physically different connector layout. I simply fail to see how the age of my display correlates to the performance of a GPU (especially one found in a console). 4K 120Hz is a struggle now, it was a struggle years ago, and it will be a struggle years from now, so long as graphical effects like ray tracing continue to improve and take their toll on performance.
Simple: OLEDs and their main weakness, regardless of how much they've improved. You're getting a 4K 240Hz display that you won't fully utilize because, as you said, even 120Hz is a struggle, so why get it in the first place if you're not going to use it to its full extent in the near future? There will probably be cheaper alternatives all along until GPUs can catch up to that level of performance, and when they do, much better OLEDs (or other tech) that won't have the same problems.
 
Simple: OLEDs and their main weakness, regardless of how much they've improved. You're getting a 4K 240Hz display that you won't fully utilize because, as you said, even 120Hz is a struggle, so why get it in the first place if you're not going to use it to its full extent in the near future? There will probably be cheaper alternatives all along until GPUs can catch up to that level of performance, and when they do, much better OLEDs (or other tech) that won't have the same problems.

Nothing you just said makes sense to me... so you are saying DON'T buy a good monitor? And instead buy bad monitors because it's easier to afford a dGPU that will push them?

And you want others to be in the same boat as you?


Or are you trying to say it's a bad idea to buy the monitor you want, because you will eventually have to buy a new dGPU next year to fully push that 4K monitor?

A buying approach which, as we already said, is called "future-proofing"! Otherwise, when you buy that new GPU next year, you would have to buy ANOTHER monitor, if you didn't already buy the one you wanted that has future capabilities!

One instance....
You suggest spending money on two monitors... one now and another in a few years when you buy a video card. Knowing most people don't need two monitors, and also knowing that buying two monitors is a lot of money...

Instead of....
Buying a super monitor that you can keep for the next 4-7 years... one that will meet your needs as you upgrade your dGPU over those years.


Case in point:
My 7-year-old Acer X34 went through 3 different dGPUs... until that monitor was no longer viable against today's technology. It was $1,299 when I bought it and sells for $399 now.

My new monitor is a 42" Asus OLED, and even though my 7900 XTX can only push maximum frames in a few games, that won't stop me from buying a new dGPU next year to make full use of my new OLED monitor.

I don't know anyone who replaces their monitor often (because they are so expensive), so if you are going to buy one, future-proof it!

Gamers do not stare at their dGPU... a good monitor is a must.
 
Nothing you just said makes sense to me... so you are saying DON'T buy a good monitor? And instead buy bad monitors because it's easier to afford a dGPU that will push them?

And you want others to be in the same boat as you?


Or are you trying to say it's a bad idea to buy the monitor you want, because you will eventually have to buy a new dGPU next year to fully push that 4K monitor?

A buying approach which, as we already said, is called "future-proofing"! Otherwise, when you buy that new GPU next year, you would have to buy ANOTHER monitor, if you didn't already buy the one you wanted that has future capabilities!

One instance....
You suggest spending money on two monitors... one now and another in a few years when you buy a video card. Knowing most people don't need two monitors, and also knowing that buying two monitors is a lot of money...

Instead of....
Buying a super monitor that you can keep for the next 4-7 years... one that will meet your needs as you upgrade your dGPU over those years.


Case in point:
My 7-year-old Acer X34 went through 3 different dGPUs... until that monitor was no longer viable against today's technology. It was $1,299 when I bought it and sells for $399 now.

My new monitor is a 42" Asus OLED, and even though my 7900 XTX can only push maximum frames in a few games, that won't stop me from buying a new dGPU next year to make full use of my new OLED monitor.

I don't know anyone who replaces their monitor often (because they are so expensive), so if you are going to buy one, future-proof it!

Gamers do not stare at their dGPU... a good monitor is a must.
No wonder it doesn't make sense to you, because if that's what you interpreted from my comment, then there's not much else to explain.

TL;DR: don't get a high-end performance monitor when there is no GPU now, and won't be one in the near future, that can properly utilize it.
 
No wonder it doesn't make sense to you, because if that's what you interpreted from my comment, then there's not much else to explain.

TL;DR: don't get a high-end performance monitor when there is no GPU now, and won't be one in the near future, that can properly utilize it.
But you have NOT explained why....
I explained and showed you why one should future-proof when purchasing a monitor and how it is cheaper to do so.


All you are saying is don't do it, because at some point you have to buy a new GPU. So are you saying that YOU buy a new monitor with every new GPU?

How does that make sense?
 
Except you actually can "futureproof" a display. A good display can last a decade without really needing an upgrade. I just bought an LG monitor to hook up to my Xbox. I seriously doubt an Xbox (or frankly any PC) is going to push higher than 4K 120Hz anytime soon, especially when I can just raise quality settings to keep myself there.
I've had my 24" 1920 x 1200 LG LCD monitor for probably over a decade now. I had to manually modify its firmware because PCs recognized it as a TV instead of a monitor, which would skew the entire image off the screen. But once I did that, the monitor has worked great and still does. I would like to get an OLED as an upgrade at some point, however.
 