Steam Deck OLED shows slight burn-in at 1,500 hours, or 750 hours at max HDR brightness

Daniel Sims

Facepalm: Users approaching 1,000 hours of play in a single game with static UI elements on the Steam Deck OLED should consider checking for burn-in and taking steps to mitigate or prevent the problem. Turning up the brightness and constantly using HDR can hasten the effect.

Although manufacturers have made strides in reducing the risk of burn-in on OLED displays, the risk of image retention remains inherent to the technology. Thus, independent testers run experiments on devices like the Steam Deck OLED to advise consumers about the durability of the screen.

Preliminary findings from YouTubers such as Wulff Den and The Phawx indicate that the OLED display of Valve's handheld PC may begin to show minor image retention issues between 1,000 and 1,500 hours of use.

Although this is less than half the time it took for the Nintendo Switch OLED to show burn-in, and some heavy users may have already noticed it, most will likely take longer to suffer from the effect. In that comparison, it's also worth considering that the Switch OLED achieves only 350 nits of brightness, compared to a remarkable peak brightness of 1,000 nits for the Steam Deck OLED, or up to 600 nits for SDR content.

Burn-in, which results in overused pixels darkening and leaving ghostly images of frequently displayed graphics on the screen, occurs when a static image is left on an OLED panel for too long. A video game's heads-up display or user interface could lead to image retention issues after thousands of hours of gameplay on an OLED device.

In testing the Steam Deck OLED, the aim was to simulate a worst-case scenario in conditions that most users will likely never face. Nevertheless, these findings can provide a useful reference point for potential issues.

Wulff Den conducted a test by displaying a static screen from The Legend of Zelda: Breath of the Wild, with an added color bar, for 1,500 hours. Meanwhile, Phawx created a program to test various colors in both SDR and HDR modes. Notably, HDR at 1,000 nits demonstrated significant image retention after 750 hours, whereas SDR at 600 nits started showing slight burn-in at 1,500 hours, with blue subpixels being the most affected, followed by red.

It's important to note that on the Steam Deck OLED, the maximum physical brightness is 75 percent screen brightness. Settings above this level digitally enhance exposure, so to lessen the risk of burn-in, it's advisable to keep the brightness below 75 percent.
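As a rough illustration of that 75 percent cap, here is a sketch of how a brightness slider might split into physical luminance and digital gain. This is not Valve's actual firmware logic; the mapping, function name, and the 600-nit SDR figure are assumptions based on the numbers reported above.

```python
# Illustrative sketch (not Valve's actual firmware): how a 0-100
# brightness slider could map to physical panel luminance vs. a
# digital exposure boost, assuming 75% is the panel's physical limit.

def slider_to_output(slider_pct, max_nits=600):
    """Map a 0-100 brightness slider to (panel_nits, digital_gain)."""
    physical_cap = 75  # per the article, 75% is the physical maximum
    if slider_pct <= physical_cap:
        # Below the cap, the slider scales real panel luminance.
        nits = max_nits * slider_pct / physical_cap
        gain = 1.0
    else:
        # Above the cap, the panel stays at max brightness and the
        # image is digitally brightened instead (boosting pixel
        # values, which drives the subpixels harder and adds wear).
        nits = max_nits
        gain = 1.0 + (slider_pct - physical_cap) / (100 - physical_cap)
    return nits, gain
```

Under this model, everything above the 75 percent mark only pushes pixel values higher without making the panel itself brighter, which is why staying below that point reduces wear.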

Heavy users who play a single game extensively might face image retention issues after the Steam Deck OLED's one-year warranty period, potentially leading to problems. Fortunately, iFixit has reported that the device's screen is relatively easy to replace.


 
Good thing I don't plan on leaving it on a static screen for 1,000 hours continuously
So the big problem with burn-in is that it can occur with certain color profiles. If you have a warmer game, it can kill the red OLEDs faster and create a blueish/green image. If you display all colors equally, you will get serious dimming across the color spectrum after a certain period of time. That said, after about 3,000 hours I wouldn't mind paying $150 to replace the screen.

But people need to realize that OLEDs with extreme peak brightnesses degrade the screen faster. When they first came out, I don't know how many people remember this, you basically had to watch an OLED in a dark room because of how low the peak brightness was. Even with that low peak brightness, burn-in was an even bigger issue than it is today. Now I see the average being around 1,500 hours of screen time before burn-in starts to be an issue, which is about 60 days straight of screen-on time.
 
So the big problem with burn-in is that it can occur with certain color profiles. If you have a warmer game, it can kill the red OLEDs faster and create a blueish/green image. If you display all colors equally, you will get serious dimming across the color spectrum after a certain period of time. That said, after about 3,000 hours I wouldn't mind paying $150 to replace the screen.

But people need to realize that OLEDs with extreme peak brightnesses degrade the screen faster. When they first came out, I don't know how many people remember this, you basically had to watch an OLED in a dark room because of how low the peak brightness was. Even with that low peak brightness, burn-in was an even bigger issue than it is today. Now I see the average being around 1,500 hours of screen time before burn-in starts to be an issue, which is about 60 days straight of screen-on time.
I must be very lucky with my 3.5-year-old CX OLED. I have about 1,500 hours in Vermintide 2, playing primarily the fire wizard Sienna Unchained, who shoots bright red and orange fire, burns enemies on push/block, and ults fire around her. No signs of burn-in. Brightness is also set to maximum. 😅 Living on the edge.
 
I'll just wait for them to release a device that doesn't damage itself during prolonged use. This issue exists alongside the already painfully real Joy-Con issues. I sold my Switch because buying controllers for it was a total coin toss. Yes, even the Pro Controllers drift, some of them when brand new.

I have plenty of handhelds that won't have this problem, some by Nintendo, all with great-looking displays. Maybe they'll mitigate this somehow in the future or use a less fragile technology.
 
What a shocker. IPS does not have burn-in problems. OLED might look better, but IPS is not flawed in this way.
 
What a shocker. IPS does not have burn-in problems. OLED might look better, but IPS is not flawed in this way.
Nothing wrong with IPS when you consider they're still making new displays with TN and VA panels. IMO, IPS is the perfect middle ground of refresh rate, color accuracy, and latency. 20 years ago, IPS displays were the gold standard. Sure, OLEDs are better in many cases, but IPS panels are cheaper and they don't burn in. Get one with a 240 Hz refresh rate and full-array local dimming and you pay half what you do for OLED, without burn-in.
 
:rolleyes: As I see it, no one would, or should, leave a display on at max brightness and then wonder why they are getting burn-in. Doh!

Then again, I would never use a device like this.
 
Nothing wrong with IPS when you consider they're still making new displays with TN and VA panels. IMO, IPS is the perfect middle ground of refresh rate, color accuracy, and latency. 20 years ago, IPS displays were the gold standard. Sure, OLEDs are better in many cases, but IPS panels are cheaper and they don't burn in. Get one with a 240 Hz refresh rate and full-array local dimming and you pay half what you do for OLED, without burn-in.
True. There is a debate among some that FALD is very close to OLED in some respects. It just goes to show how manufacturers keep improving their products. Yet, IMO, FALD is a compromise; it still can't match OLED for black level.

Oh how I long for the days when micro-LED becomes affordable. Note I said "micro" not "mini".
 
Nothing wrong with IPS when you consider they're still making new displays with TN and VA panels.
That's what I mean. IPS is the better display tech for mobile devices, as no burn-in is possible. This is especially true for gaming devices like the Steam Deck. OLED was a poor choice.
 
As I see it, no one would, or should, leave a display on at max brightness and then wonder why they are getting burn-in. Doh!
I do it as a rule. I see no reason not to use full brightness on my devices. People should not have to worry about poor design choices that result in problems like burn-in.
 
Here are photos of my CX with a handful of titles at maximum brightness. Maybe LG's software burn-in mitigation is superior?

I ran a YouTube burn-in test where the whole screen changes to random solid colors. There was no image retention and the images appeared uniform.
 
The latest TVs from the main players are pretty immune to burn-in: pixel shifting, cleansing routines, dimming logos after some time, heat sinks. Algorithm/software modelling will also help in balancing overuse of, say, warm colours, as mentioned above.

I.e., they know how the panel ages and adjust for it.

Common sense goes a long way. I still have a plasma TV, no burn-in, and they are meant to be more prone. I did run a colour-changing screensaver for a few days when I first bought it, as they were meant to be prone in the early stages. Who knows? But I trained the family to kill the screen when, say, downloading a huge game on the PlayStation and in other static-screen situations. Now I'm not concerned, as the screen must be really aged.
 
How is this even a thing? Even TVs have had burn-in mitigation with pixel shifting for several years. It should be extremely easy to detect static elements/screens and then do pixel shifting to avoid image retention.

Also, nothing wrong with consumers using peak brightness (though it's very bad for the eyes). If it's so detrimental to the product's life, then they should show warning messages when selecting brightness above 90%.
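The pixel-shifting idea mentioned above can be sketched roughly like this. It is a toy illustration, not any vendor's actual implementation; the offset cycle, function name, and frame representation are all made up for the example.

```python
# Toy sketch of pixel-shift burn-in mitigation: periodically nudge the
# whole frame by a small offset so static UI elements don't sit on the
# same subpixels forever. Real implementations advance the offset
# slowly (e.g. every few minutes) so the shift is imperceptible.

# A small cycle of (dx, dy) offsets the display steps through.
SHIFT_CYCLE = [(0, 0), (1, 0), (1, 1), (0, 1),
               (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1)]

def shift_frame(frame, step):
    """Return the frame shifted by the offset for this step.

    frame: list of rows of pixel values; edges wrap here for
    simplicity (a real panel would crop or keep a hidden border).
    """
    dx, dy = SHIFT_CYCLE[step % len(SHIFT_CYCLE)]
    h, w = len(frame), len(frame[0])
    return [[frame[(y - dy) % h][(x - dx) % w] for x in range(w)]
            for y in range(h)]
```

Averaged over the whole cycle, each logical pixel gets spread across a small neighborhood of physical subpixels, which evens out wear on static elements.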
 
Are there people out there who actually play only one game, and it has a static bar that stays in the same place at all times for 1,500 hours straight? Even genres like strategy games, with lots of static bars, have different menus you go into that change the whole thing up. I just find this insanely unrealistic, and the only reason YouTubers make these videos is dumb clickbait.
 
Found a chart matching the peak brightness: SDR mode at 600 nits and HDR at 1,000 nits. The SDR figure seems higher than most OLEDs, which peak at 200 to 250 nits from memory.

https://cdn.mos.cms.futurecdn.net/KRrucSU3ZffoYms5pduc5i-1200-80.png

The SDR peaks almost as high as my CX (around 700 nits) in HDR. This might be the culprit. Yeah, I agree lowering the peak brightness will reduce the burn-in risk.
A working theory I have: does radiant heat from the CPU and GPU in close proximity to the display also worsen the risk of burn-in? Meaning the delta in surrounding temperature compared to an OLED display without warmer temperatures around it (like a direct correlation between ambient/radiant temperature and burn-in risk).
 
I'll just wait for them to release a device that doesn't damage itself during prolonged use. This issue exists alongside the already painfully real Joy-Con issues. I sold my Switch because buying controllers for it was a total coin toss. Yes, even the Pro Controllers drift, some of them when brand new.

I have plenty of handhelds that won't have this problem, some by Nintendo, all with great-looking displays. Maybe they'll mitigate this somehow in the future or use a less fragile technology.

I don't play on the Switch; consoles haven't interested me since the PS2 days, and having heard of the Joy-Con drift issues, I was hesitant to get a Switch for my son a couple years ago. This past Christmas he got a couple of new games he wanted, and when family was over visiting, we all agreed to play some games with him. He was mostly interested in playing Mario Kart 8.

Here we go!

Everyone played a few rounds, taking turns. When it was my turn to race, I was sitting on the edge of his bed, so I had a slight lean to the right (I wasn't sitting straight up and holding the controller straight). The whole race I was constantly drifting to the right, fighting hard to go left, and it bothered the hell out of me. I thought the half of the Joy-Con I was using had a drift issue.

Come to find out, Mario Kart 8 defaults to some pretty stupid control options, which my daughter showed us:
1) Smart Steering (default is on)
2) Tilt Controls (default is on)
3) Auto-accelerate (default is on)

The game damn near drives itself with all these options turned on. I had to turn them all off so I could actually play the game. Only at that point was I actually in control (no pun intended), and it reminded me more of the golden days of playing Mario Kart 64.

With all that said, I don't think the Joy-Con drift issue is that prevalent, but I do know that it exists. We just haven't had any issues with it yet.
 
:rolleyes: As I see it, no one would, or should, leave a display on at max brightness and then wonder why they are getting burn-in. Doh!

Then again, I would never use a device like this.

Maybe that's why I have rarely experienced burn-in problems on any screen in any device I've owned since the CRT days, and also no burn-in issues at all with any OLED devices. A properly calibrated screen will usually be set far from max brightness.

The only exception I remember is one of the first LCD TVs I owned, a Samsung model manufactured in 2006 or 2007, that had a very strong burn-in bias even when set far from max brightness. Playing a video game with a static HUD, or watching some TV program with a broadcaster watermark, for only around 5 or 6 hours would leave burn-in. Thankfully it never suffered permanent burn-in, and any retention disappeared completely after leaving the TV on for a couple of hours on video with no static elements.
 
Nothing wrong with IPS when you consider they're still making new displays with TN and VA panels. IMO, IPS is the perfect middle ground of refresh rate, color accuracy, and latency. 20 years ago, IPS displays were the gold standard. Sure, OLEDs are better in many cases, but IPS panels are cheaper and they don't burn in. Get one with a 240 Hz refresh rate and full-array local dimming and you pay half what you do for OLED, without burn-in.

IPS has one major drawback: contrast ratio. I do love IPS panels, but at best you're looking at 1500:1, which is very low next to something like VA.
 
Who plays one game for 1,000 hours? Every user varies their content.

Just as useless as RTINGS' longevity testing, running CNN 24/7/365 or whatever on end, never turning the TVs off and letting the pixels refresh.

IPS is terrible when it comes to image quality compared to OLED: low contrast plus IPS glow and pretty much every LCD issue you can think of, meaning blooming, mediocre viewing angles, bad HDR, and poor motion clarity because of smearing frame to frame.

VA has better image quality than IPS for sure, but other problems, like black smearing and viewing angles. VA is pretty bad for fast-paced games, even the best panels.

TN is all about speed while looking like crap and still losing to OLED in motion clarity.

LCD is kinda dated tech; self-emitting pixels are the future.

LCD should only be used in low to mid-end stuff.

OLED is for high-end stuff; just look at phones and TVs. OLED dominates the high-end for a reason.

LCD still has a place for mostly 2D work, like text and coding, but for gaming and video? No way it even comes close to OLED.

The motion clarity on 240 Hz OLED is much better than LCD running at 480 Hz. I've tried both. Even with BFI, LCD lost, and the image was dimmer than OLED when BFI was enabled.

OLED generally doesn't need BFI to deliver insane motion clarity and refreshes perfectly frame by frame, whereas LCD smears frames together unevenly. That's especially true for VA, but it's a problem on all LCD panels.
 
I can't tell for sure how badly screen burn is damaging the Steam Deck OLED's screen, but I've owned a OnePlus 8 Pro for at least 3 years, and after recently playing MTG: Arena heavily on the device, I've finally burned the HUD elements into the phone's screen. It took exclusively running the game, but it didn't take as long as the Deck does, so I ended up with a damaged screen in around 300-450 hours of this usage type. Before that I played CoD: Mobile a lot, but the static HUD of the card game did the damage.
I own a ROG Ally, but it uses an LCD screen. I also purchased a 13.3" portable OLED screen from Asus, but I don't use it for displaying the Windows desktop like I would with an LCD.
 
OLED TVs, no matter the price, run into this same issue if they are used enough.
 
OLED TVs, no matter the price, run into this same issue if they are used enough.
True, but the elephant in the room is that SDR at 600 nits. Most OLED TVs cap out at almost a third of that value. Modern TVs have the ability to refresh the pixels as well. Some modern OLED monitors also come with a 3-year burn-in warranty, and LG has a 5-year warranty on its G and Z lineups as well.
 