Alienware is bringing the first QD-OLED monitor to market for $1,300

mongeese

In a nutshell: At face value, the new Alienware AW3423DW is a great monitor. It has a 175 Hz refresh rate and an ultrawide 3440 x 1440 resolution. In typical Alienware fashion, it has a steep 1800R curve, too. But its headline feature is its Quantum Dot OLED panel, which Alienware says will change the game.

At CES 2022, Samsung and several partner brands created a buzz around QD-OLED televisions, but Alienware stood alone as the only company offering a monitor with the technology. Quantum dots might sound like a PR-driven name, but they are an impressive display innovation.

In brief, they form a conversion layer: the blue light emitted by the OLED panel underneath is converted into the red and green subpixels needed for an image, while blue passes straight through. In theory, quantum dots let more light through than traditional color filters do, and they emit light omnidirectionally, so viewing angles are better. Best of all, they don't get in the way of the usual benefits of OLED. The AW3423DW has a blazing 0.1 ms response time and the pure blacks of other OLED monitors. It should also have minimal input lag, but Alienware hasn't put a number on it yet.

Alienware's new gaming monitor has 99.3-percent DCI-P3 and 149-percent sRGB coverage, a typical brightness of 250 nits, a peak brightness of 1,000 nits, and VESA DisplayHDR True Black 400 certification. In our opinion, that's not enough for a true HDR experience, but it still makes for a good-looking picture.
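For readers wondering how a figure like "149-percent sRGB" is arrived at: gamut coverage is usually reported as a ratio of the triangle areas formed by a display's color primaries on a CIE chromaticity diagram. The sketch below is purely illustrative; it compares DCI-P3 to sRGB in CIE 1931 xy coordinates, whereas Alienware's 149% figure refers to the panel's own native gamut and may be measured in a different chromaticity space.

```python
# Illustrative: how "percent of sRGB" gamut figures are derived.
# Compare the areas of the primaries' triangles in CIE 1931 xy (shoelace formula).
def triangle_area(points):
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

SRGB = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # DCI-P3 (D65) primaries

ratio = triangle_area(P3) / triangle_area(SRGB)
print(f"DCI-P3 spans ~{ratio:.0%} of the sRGB gamut area")  # ~136% in xy coordinates
```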

Importantly, it also has a three-year warranty for the panel that covers burn-in. Alienware and Samsung have both said that QD-OLED displays are "resistant" to burn-in, but that could mean a lot of things, so don't lose the receipt.

Perhaps the best feature of the AW3423DW is its price: $1,299. It's a lot, but it's considerably less than other OLED monitors, and this is one of only a handful designed with gamers in mind.

As is often the case with flagship monitors, this one sounds impressive on paper. Still, we strongly recommend reading a few reviews before picking one up because of its use of new technology. That said, if it takes your fancy, it launches on March 29.


 
Yet another breakthrough monitor, without proper HDR support. Hard pass.
Proper HDR isn't super important to me. I just don't want an edge-lit display. 1000 nits on an OLED is nice, but I'd be worried about longevity at that point. I know burn-in isn't the issue it once was, but I've noticed OLEDs get a brown tint after a while.
 
Proper HDR isn't super important to me. I just don't want an edge-lit display. 1000 nits on an OLED is nice, but I'd be worried about longevity at that point. I know burn-in isn't the issue it once was, but I've noticed OLEDs get a brown tint after a while.

What do you mean it is not proper HDR?
 
Well, if Alienware says this new monitor is a game changer, then it must be true! Come to think of it, OLED screens have been a "game changer" for at least a decade now, with no real drop in price or a fix for the burn-in issues. Let's be realistic here: OLED will never be the next big thing like all these companies keep harping on about. They're too expensive to produce, and that pesky burn-in problem is a deal breaker for most people. Even if OLED screens were the price of VA or IPS panels, I still wouldn't want one because of the burn-in, so yeah.
 
What do you mean it is not proper HDR?
It's only HDR400 certified, meaning its HDR performance is rubbish and most likely looks worse than just using standard SDR mode.

HDR600 monitors and above actually put some effort in and make HDR content work properly.

The only reason I'm a little sceptical this time is that it's QD-OLED, meaning it might actually still work really well but just not get very bright across the overall image. Will have to wait for reviews to confirm.
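For anyone trying to keep the certification tiers straight, here's a minimal lookup of the main VESA DisplayHDR levels and roughly what each one requires. The numbers follow VESA's published requirements, but the dictionary and helper below are just an illustrative sketch, not any real API.

```python
# Rough summary of VESA DisplayHDR tiers (peak brightness and maximum black level,
# both in nits). Values follow VESA's published requirements; table trimmed for brevity.
DISPLAYHDR_TIERS = {
    "DisplayHDR 400":            {"peak_nits": 400,  "max_black_nits": 0.4},
    "DisplayHDR 600":            {"peak_nits": 600,  "max_black_nits": 0.1},
    "DisplayHDR 1000":           {"peak_nits": 1000, "max_black_nits": 0.05},
    "DisplayHDR True Black 400": {"peak_nits": 400,  "max_black_nits": 0.0005},
}

def describe(tier: str) -> str:
    """One-line summary of what a given tier guarantees."""
    t = DISPLAYHDR_TIERS[tier]
    return f"{tier}: >= {t['peak_nits']} nits peak, <= {t['max_black_nits']} nits black"

# The AW3423DW is certified True Black 400: a modest peak, but very deep blacks.
print(describe("DisplayHDR True Black 400"))
```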
 
OLED is stupid even for television. Now, as a monitor... enjoy your burnt in Start menu. We get it, the colors are beautiful. Yes. But it's not sustainable.

Unfortunately OLED will never solve this. That's like expecting TN panels to get good viewing angles. It's just not gonna happen, because it's inherent to the tech.

Give us microLED. No, not blooming miniLED, I said microLED.

... until then, there's nothing to see here. Give us some good IPS monitors and sit tight.
 
It's only HDR400 certified, meaning its HDR performance is rubbish and most likely looks worse than just using standard SDR mode.

It's very hard for an OLED panel to maintain brightness; on the other hand, an OLED panel has an effectively infinite contrast ratio, so unlike an LCD panel you don't need extreme brightness to make HDR stand out.
https://www.rtings.com/tv/reviews/lg/c1-oled
Look at the rtings C1 review: the HDR brightness is more or less in line with this monitor. In fact, I bet this monitor will actually have better HDR brightness. Rtings still rated the C1's HDR as exceptional.
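To put some back-of-the-envelope numbers on that point (purely illustrative figures, not measurements of this monitor or any specific LCD): contrast ratio is just peak luminance divided by black level, so an OLED's near-zero blacks dwarf what an LCD achieves even at the same peak brightness.

```python
# Back-of-the-envelope contrast comparison; all figures are illustrative,
# not measurements of the AW3423DW or any particular LCD.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio = peak luminance / black luminance."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

lcd  = contrast_ratio(peak_nits=1000, black_nits=0.3)     # typical edge-lit LCD black level
oled = contrast_ratio(peak_nits=1000, black_nits=0.0005)  # near-black OLED pixel

print(f"LCD  : {lcd:,.0f}:1")   # ~3,333:1
print(f"OLED : {oled:,.0f}:1")  # 2,000,000:1
```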
 
It's very hard for an OLED panel to maintain brightness; on the other hand, an OLED panel has an effectively infinite contrast ratio, so unlike an LCD panel you don't need extreme brightness to make HDR stand out.
https://www.rtings.com/tv/reviews/lg/c1-oled
Look at the rtings C1 review: the HDR brightness is more or less in line with this monitor. In fact, I bet this monitor will actually have better HDR brightness. Rtings still rated the C1's HDR as exceptional.
Regardless, it still needs high brightness because in movies you will rarely have pure blacks.

The C1 has OK HDR, nothing great:
"HDR brightness is okay. As you can see in the EOTF, the overall brightness is on-target, but it may not be bright enough to hit the brightest highlights. "

This should on paper have higher peak brightness, but it seems the sustained brightness is lower. Hopefully I'm proven wrong.
 
Proper HDR isn't super important to me. I just don't want an edge-lit display. 1000 nits on an OLED is nice, but I'd be worried about longevity at that point. I know burn-in isn't the issue it once was, but I've noticed OLEDs get a brown tint after a while.
You get up to 250 nits in SDR; the 1000-nit peak applies while watching HDR content. Calibration-wise, 250 nits is more than you need for SDR; 120 nits at a 6500 K white point is the usual target. For HDR, a 1000-nit peak fulfills the HDR10 spec. The first OLED TVs managed around 800 nits (for HDR, on a much larger surface area), and over time they could develop a brown hue that was visible when nothing was displayed.
People bought plasma screens left and right and were super happy with them, despite burn-in after a few months, the heat burning the paint off the walls behind them, heating up the room during the summer, etc.

Now y'all are just snowflakes like :D
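For context on why the 1000-nit figure matters for HDR10: HDR10 uses the SMPTE ST 2084 "PQ" transfer function, which maps a 10-bit signal onto an absolute luminance scale topping out at 10,000 nits. A minimal sketch of that EOTF (the constants come straight from the standard; the loop is just for illustration) shows roughly where 1000-nit highlights sit in the signal range.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# Constants are from the standard; the mapping is normalized signal (0..1) -> nits.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ signal value (0..1) to luminance in nits."""
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0) / (c2 - c3 * e)) ** (1 / m1)

for code in (384, 512, 640, 769, 1023):          # 10-bit code values
    print(code, round(pq_eotf(code / 1023), 1))  # ~1000 nits lands near code 769
```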
 
It's only HDR400 certified, meaning its HDR performance is rubbish and most likely looks worse than just using standard SDR mode.

HDR600 monitors and above actually put some effort in and make HDR content work properly.

The only reason I'm a little sceptical this time is that it's QD-OLED, meaning it might actually still work really well but just not get very bright across the overall image. Will have to wait for reviews to confirm.
No, it's HDR10 certified.
 
I know everyone talks about OLED burn-in, but does anyone remember when LCD panels first came out? They had image retention issues as well, and those got figured out. Maybe OLED will figure it out, too.
 
Proper HDR isn't super important to me. I just don't want an edge-lit display. 1000 nits on an OLED is nice, but I'd be worried about longevity at that point. I know burn-in isn't the issue it once was, but I've noticed OLEDs get a brown tint after a while.
Perhaps you already know this, but the tech is a bit different. As I understand it, and as it states in the article, the OLED part is not what actually emits the light you see. Don't get me wrong, the OLED does emit light; it's just that the OLED emits blue light that then excites the quantum dots, and it's the light from the QDs that forms the display output. In a way, it's similar to a tube or plasma display, where, in the tube display, a phosphor excited by an electron beam produced the light that made up the picture. Here's a link for a better explanation - https://www.tomsguide.com/news/what-is-qd-oled-samsungs-next-gen-tv-display-explained
 
I know everyone talks about OLED burn-in, but does anyone remember when LCD panels first came out? They had image retention issues as well, and those got figured out. Maybe OLED will figure it out, too.
Sure, buy one then. You be the first adopter :-D
 
Being HDR10 certified doesn't make it a true high end HDR monitor. It's still just an HDR400 display (meaning the brightness is still lacking).

All HDR certifications (400-1400, true black or not) require HDR10 support to be included.
What I'm even more curious about is PWM... AMOLEDs often have issues with low-frequency PWM. That could be a much bigger issue than falsely claiming HDR or HDR10 support.
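On the PWM point, the worry is easy to show with a couple of lines of arithmetic (the frequencies and duty cycle below are illustrative; Alienware hasn't published the AW3423DW's dimming behavior): average brightness scales with duty cycle either way, but a low PWM frequency leaves long dark gaps in every cycle, which is what some people perceive as flicker or eye strain.

```python
# Rough PWM arithmetic; the frequencies and duty cycle are illustrative,
# not published specs for this monitor.
def pwm_summary(freq_hz: float, duty: float, peak_nits: float) -> str:
    period_ms = 1000 / freq_hz       # length of one on/off cycle
    off_ms = period_ms * (1 - duty)  # dark gap within each cycle
    avg_nits = peak_nits * duty      # perceived (time-averaged) brightness
    return (f"{freq_hz:.0f} Hz at {duty:.0%} duty: "
            f"~{avg_nits:.0f} nits average, {off_ms:.2f} ms dark per cycle")

print(pwm_summary(240, 0.5, 1000))    # low-frequency PWM: long, noticeable dark gaps
print(pwm_summary(20000, 0.5, 1000))  # high-frequency PWM: same brightness, tiny gaps
```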
 
What I'm even more curious about is PWM... AMOLEDs often have issues with low-frequency PWM. That could be a much bigger issue than falsely claiming HDR or HDR10 support.
Like every other first gen TV tech: you pay a lot to get something that should be better, but is full of compromises.
 
Being HDR10 certified doesn't make it a true high end HDR monitor. It's still just an HDR400 display (meaning the brightness is still lacking).

All HDR certifications (400-1400, true black or not) require HDR10 support to be included.
Having a 1000-nit peak and 10-bit color along with HDR10 support DOES make it an HDR10 monitor. I don't know where you get HDR400 from; 400 nits would be the peak for SDR (which this monitor does not have, it's 250), and that's too much anyway.

I really don't understand, can't you read?

PS: It's not high-end HDR. That would be HDR10+, 12-bit, etc. They don't claim that it is, afaik.
 
Having a 1000-nit peak and 10-bit color along with HDR10 support DOES make it an HDR10 monitor. I don't know where you get HDR400 from; 400 nits would be the peak for SDR (which this monitor does not have, it's 250), and that's too much anyway.

I really don't understand, can't you read?

PS: It's not high-end HDR. That would be HDR10+, 12-bit, etc. They don't claim that it is, afaik.
It seems you are confused about the names of certifications. Even the cheapest HDR screen is HDR10. The link you gave clearly states: DisplayHDR 400 True Black.

And nobody said anything about them claiming anything; people are just disappointed that it is only HDR400 for that kind of money. Even with a high peak brightness, it means nothing if it's stuck at sub-500 nits when it counts.

Even worse, the SDR brightness is 250 nits. That's what low end screens have.

From what I've read in different places, it seems that the 1000 nits is for the 1% patch test and it drops significantly once the window reaches 25%. It also seems to lack DP 1.4a support (only 1.4), which could further affect HDR performance.

Like I said before: cool new tech, tons of compromises. It's best to skip the first gen stuff.
 
It seems you are confused about the names of certifications. Even the cheapest HDR screen is HDR10. The link you gave clearly states: DisplayHDR 400 True Black.

And nobody said anything about them claiming anything; people are just disappointed that it is only HDR400 for that kind of money. Even with a high peak brightness, it means nothing if it's stuck at sub-500 nits when it counts.
I was literally just replying to say exactly this: they linked us to the page that states it's DisplayHDR 400 and also states HDR10 is the default minimum spec to be considered "HDR".

I think they're just confused by the naming convention, and to be honest, I don't blame them. HDR is actually a great technology, and when done right it makes images really come to life, but it's been undermined by bad marketing, terrible naming conventions, and most screen manufacturers shoe-horning the lowest HDR spec into everything.

Don't get me wrong, I bet this screen does actually look really good. QD-OLED was praised by literally everyone who got to experience it earlier this year. I wouldn't even be worried about burn-in, since it's got a three-year warranty, and because the underlying tech of QD-OLED is different from LG's WRGB panels, there's actually a fairly decent chance Samsung has further reduced burn-in issues.
 