Intel: Ivy Bridge GPU to support 4K resolutions

Shawn Knight


Intel quietly announced at IDF this past week that Ivy Bridge will support 4K video resolutions using the integrated GPU. Intel broke the news in one of their side technical sessions at the conference, according to VR Zone.

4K resolution means the GPU (and a supporting monitor) can run a video stream at up to 4096 x 4096 pixels (known as 4Kx4K), a feat Intel claims Ivy Bridge can handle with ease thanks to its Multi-Format Codec engine, MFX.


We already knew that the GPU in Ivy Bridge would be fast, up to 60 percent faster than Sandy Bridge's in some scenarios, but being able to run 4K video is pretty significant. And if that weren't enough, Chipzilla says its next-generation platform can run multiple 4K videos simultaneously.

YouTube announced support for 4K video resolution in July 2010, and we have used this sample video played in "Original" mode as an informal measuring stick in several of our notebook reviews. The video is extremely taxing on the GPU and CPU of modern computers, and you will need a fast Internet connection unless you want to waste half your day waiting for it to buffer.

AnandTech points out that current Sandy Bridge GPUs only support resolutions of up to 2560 x 1600, and a jump to 4Kx4K means over four times the number of pixels. Furthermore, displays that support resolutions over 2560 x 1600 are extremely rare and expensive, and the bandwidth needed to push 4Kx4K video isn't feasible at a 60Hz refresh rate.
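As a rough sanity check on those figures, here is the back-of-the-envelope math (assuming uncompressed 24-bit color; a real video link would use a compressed or chroma-subsampled stream, so treat this as an upper bound, not how any actual display interface carries the signal):

```python
# Pixel-count and bandwidth arithmetic behind the claims above.
# Assumes 24-bit color (3 bytes per pixel), uncompressed frames only.

SB_MAX = 2560 * 1600          # Sandy Bridge's maximum supported resolution
IVY_4K = 4096 * 4096          # the 4Kx4K figure claimed for Ivy Bridge

ratio = IVY_4K / SB_MAX       # how many times more pixels 4Kx4K has

BYTES_PER_PIXEL = 3           # 24-bit color
REFRESH_HZ = 60

# Raw bandwidth needed to push uncompressed 4Kx4K frames at 60 Hz
gbits_per_sec = IVY_4K * BYTES_PER_PIXEL * 8 * REFRESH_HZ / 1e9

print(f"4Kx4K has {ratio:.1f}x the pixels of 2560x1600")
print(f"Uncompressed 4Kx4K at 60Hz needs ~{gbits_per_sec:.1f} Gbit/s")
```

The raw figure lands around 24 Gbit/s, well beyond what display links of the era could carry, which is why a 60Hz refresh at that resolution wasn't feasible.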
 


 
If displays that support over 2560 x 1600 are extremely rare and expensive, why is Intel making a big deal over being able to support 4Kx4K? Dual 2560 x 1600 displays?
 
mevans336 said:
If displays that support over 2560 x 1600 are extremely rare and expensive, why is Intel making a big deal over being able to support 4Kx4K? Dual 2560 x 1600 displays?

I suppose it's like car manufacturers bragging about top speed. Few people are going to take a Ferrari above 200 mph (and live to tell the tale), but it still has that capacity.
 
I run a U3011; I came from a 120Hz 23" Acer.
It's superior for everything, especially gaming, but it does take decent power to game at 2560 x 1600.

I can't imagine 4K by 4K, my 570 would crap its pants.
 
gwailo247 said:
mevans336 said:
If displays that support over 2560 x 1600 are extremely rare and expensive, why is Intel making a big deal over being able to support 4Kx4K? Dual 2560 x 1600 displays?

I suppose it's like car manufacturers bragging about top speed. Few people are going to take a Ferrari above 200 mph (and live to tell the tale), but it still has that capacity.

Nice example, but you missed a point: them saying it supports 4Kx4K is like a "benchmark," since it's the "top" resolution it can handle (and what about framerate/bitrate? :facepawns:). The more headroom Ivy has, the more easily it will handle lower-resolution videos.

It's like using a last-generation CPU to encode a video or compress some files: it will do the work extremely well, but it won't be running at 100%.
 
You guys are looking at it from the wrong perspective. It is the creation of such graphics hardware that enables monitor manufacturers to go ahead and try to create a monitor with such a resolution.

Without such cards, no monitor manufacturer would attempt a 4K resolution, because there would be no point. That is how progress works in computer graphics, not the other way around.
 
Just to make it clear, don't confuse the maximum desktop resolution (2560x1600) with the maximum texture resolution (4096x4096). 4096x4096 is the maximum texture size supported by the GPU for use in games for diffuse/bump/normal/height/specular/shadow ... maps.

Don't the Low/High quality texture options in game settings menus ring some bells? Lower texture resolutions result in blurry textured models (like the old Doom/Unreal games), while higher ones give the sharp textured models in new games (like Crysis, ...)

And as a side note, your brand new GPU already supports texture resolutions up to 8192x8192 and higher.
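The size jump between those texture limits is easy to sketch with a quick calculation (assuming uncompressed RGBA8 texels and no mipmaps; real games usually store compressed formats, so actual footprints are several times smaller):

```python
# Rough uncompressed memory footprint of the texture sizes mentioned above,
# assuming 4 bytes per texel (RGBA8), no mipmap chain, no compression.

def texture_mb(width: int, height: int, bytes_per_texel: int = 4) -> float:
    """Uncompressed size of a single texture, in mebibytes."""
    return width * height * bytes_per_texel / (1024 * 1024)

for size in (1024, 4096, 8192):
    print(f"{size}x{size} RGBA8: {texture_mb(size, size):.0f} MB")
```

A single 4096x4096 RGBA8 texture is 64 MB uncompressed, and an 8192x8192 one is 256 MB, which is why large textures are practically always stored compressed.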
 
gwailo247 said:
mevans336 said:
If displays that support over 2560 x 1600 are extremely rare and expensive, why is Intel making a big deal over being able to support 4Kx4K? Dual 2560 x 1600 displays?

I suppose it's like car manufacturers bragging about top speed. Few people are going to take a Ferrari above 200 mph (and live to tell the tale), but it still has that capacity.
Nice analogy, very on point :D
 
Wow, something tells me Ateme might be involved in the AVC hardware encoding/decoding part of this CPU. If that is the case, this would be quite good. I can just see the new "refresh" of the "late" 2011 iMacs featuring the latest Intel Ivy Bridge. The same will probably happen as with the Z68 chipset: Apple got a slice of it first, before we even knew about it. It is called BAU.
 
Maybe this has more to do with CPU flexibility than anything. If the GPU can handle such a workload, then this probably frees the rest of the processor to do other tasks.

Am I missing something?
 
This support for 4K by 4K res is actually something that can be claimed without the requirement to show it. BTW, no current video is shot or edited at 4096 by 4096. All video resolutions are near either a 4:3 or 16:9 ratio, never 1:1. This link at Wikipedia shows the 4K standards:

http://en.wikipedia.org/wiki/4K_resolution
 