4K resolution

myrmidonks

I was wondering if people could post their FPS while watching 4K video on YouTube or other 4K sites, along with the specs of their GPU.

To start:

Video: Surf NYC (4K resolution)
http://www.youtube.com/watch?v=9dgSa4wmMzk

CPU: Intel Core i7-920 (D0) @ 2.66 GHz (stock)

RAM: 8 GB DDR3 @ 1066 MHz

GPU: VisionTek Radeon HD 5870
Core clock: 850 MHz
Memory clock: 1200 MHz

Using Fraps, I got these results:

Min: 14   Max: 28   Avg: 20.992 (frames per second)

I'm curious to see what results other people get. I believe an alternative to using Fraps is right-clicking in the video window and clicking "Show video info" (YouTube). However, this does not show an average frame rate and only updates once per second.
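Since "Show video info" only gives an instantaneous reading, one workaround is to note the once-per-second values and average them yourself. A minimal Python sketch (the fps samples below are made up, not measured):

# A quick sketch for anyone without Fraps: jot down the once-per-second
# readings from YouTube's "Show video info" overlay, then compute the
# min/max/avg yourself. The sample values here are hypothetical.
samples = [14, 17, 21, 23, 28, 22, 20, 19]  # made-up fps readings

print(f"Min. {min(samples)}")
print(f"Max. {max(samples)}")
print(f"Avg. {sum(samples) / len(samples):.3f} (frames per second)")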
 
1 fps. 100% CPU usage, 5% GPU usage. Notebook GeForce 9600M GT.

-edit:

Original resolution at 4096x1706. IE8 / FF 3.6.11 / FF 4b6.
 
24 fps avg.

Phenom 9850 BE @ 2.5 GHz.
19-inch 16:10 monitor at 1440x900.
BFG GTX 275 OC edition.
 
Looks like a constant 25 fps.
24-inch monitor at 1920x1200.
HD mode and fullscreen look like 25 fps also - occasional fluctuation to 26 fps.
Did not use Fraps - just "Show video info".

This was on my i7-940 w/ a 4850.
 
Got an average of 25 FPS on both
HD/Original fullscreen at 1650x1080.
Phenom II X4 955 with a 4770.
Downloaded extremely fast too. :D
 
1080p - fullscreen
Using "Show video info", I got:
Min. 3 fps
Avg. 11 fps
Max. 25 fps

Original - fullscreen
Using "Show video info", I got:
Min. 0 fps
Avg. 0 fps
Max. 0 fps

19 in. monitor at 1024x768
Mozilla Firefox 3.6.11 (latest version)
--------------------------------
CPU usage ~98%
GPU usage unknown

CPU: Intel Pentium 4 @ 2.8 GHz (stock)
GPU: stock Dell Radeon X300 128 MB
RAM: 2.5 GB DDR2 @ 533? MHz
 

Try "Original" instead of "1080p". :D
 
It's interesting seeing similar setups getting different results. Maybe it's the browser used, the Flash version, or driver differences.
 
In response to nismo91, I updated my original post.
It now reflects both 1080p and Original quality.
I also added that I ran that set from Firefox and this set from IE (IE was actually a little faster on 1080p).
*Flash was updated in both browsers last week.
**Nice slide show on Original quality.

1080p - fullscreen
Using "Show video info", I got:
Min. 4 fps
Avg. 18 fps
Max. 27 fps

Original - fullscreen
Using "Show video info", I got:
Min. 0 fps
Avg. 0 fps
Max. 0 fps

19 in. monitor at 1024x768
Internet Explorer 8 (up to date)
--------------------------------
CPU usage ~95%
GPU usage unknown

CPU: Intel Pentium 4 @ 2.8 GHz (stock)
GPU: stock Dell Radeon X300 128 MB
RAM: 2.5 GB DDR2 @ 533 MHz
 
I've tried updating to the Adobe Flash 10.2.xx beta and used Firefox 4 beta 6, Firefox 3.6.11, and IE8 on Vista SP2. The average increased from 1 fps to 4 fps, with up to 15% GPU usage.

I even resorted to downloading the 220 MB movie and playing it in MPC:
- 300 MB video memory used
- average of 15 fps
- with GPU acceleration enabled (DXVA), 500 MB memory used and the driver stopped working :D
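
For a rough sense of where that video memory goes, here's a back-of-the-envelope sketch. The NV12 surface format (12 bits per pixel) is an assumption about what the DXVA path uses here:

# Approximate video memory per decoded 4096x1706 frame.
width, height = 4096, 1706

nv12_mb  = width * height * 1.5 / 2**20  # NV12: 12 bits/pixel (assumed)
rgb32_mb = width * height * 4   / 2**20  # RGB32: 32 bits/pixel

print(f"NV12 frame:  {nv12_mb:.1f} MB")   # ~10 MB
print(f"RGB32 frame: {rgb32_mb:.1f} MB")  # ~27 MB
# A few dozen decode/reference/presentation surfaces at ~10-27 MB each
# lands in the 300-500 MB range reported above.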
 
Forgive my ignorance, but I thought the term "4000K" referred to the average data transfer rate and had nothing to do with resolution.

"Resolution" describes how many pixels are on the screen, and in what arrangement.

Or substitute the term "bandwidth" for "data transfer rate", which everybody is so fond of nowadays. As in KB/s (kilobytes per second).
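
The two measurements are easy to separate with numbers. A quick sketch (the 4000 kbps bitrate is a hypothetical figure, not YouTube's actual rate; the pixel dimensions are this video's):

# Resolution (pixels on screen) vs. data rate (bits over the wire).
width, height = 4096, 1706            # this video's pixel dimensions
print(f"Resolution: {width}x{height} = {width * height:,} pixels")
# The "4K" name comes from the ~4000-pixel width, not from any data rate.

bitrate_kbps = 4000                   # hypothetical stream bitrate
print(f"A {bitrate_kbps} kbps stream moves {bitrate_kbps / 8:.0f} KB/s")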
 
The whole thing is a bit 'off'. The max width now on videos (this one) is 4096 pixels; the max height is 3072 pixels. The height of this one is 1706 pixels.

In the HDTV world, YouTube's format could possibly be called 3072p. But I guess 4000 sounds better than 3000 when you're talking high numbers.

Edit: I just looked on Wikipedia, and apparently this is one of the "extra high definition video modes". It threw me off a bit because it breaks what I thought was the standard: using the vertical resolution with an "i" or "p" after it to indicate interlaced or progressive scan.
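
The naming makes a bit more sense once you work out the aspect ratios. A quick check (assuming the clip is a standard 2.40:1 cinema frame):

# Why "4K" is named by width while HDTV modes (720p, 1080p) use height.
max_w, max_h = 4096, 3072        # YouTube's stated maximums (a 4:3 frame)
clip_w, clip_h = 4096, 1706      # this video's actual frame

print(f"Max frame aspect: {max_w / max_h:.2f}:1")    # 1.33:1 (4:3)
print(f"Clip aspect:      {clip_w / clip_h:.2f}:1")  # 2.40:1 (cinema 'scope')
# A 2.40:1 clip at the full 4096-pixel width can only be 4096 / 2.40 = ~1707
# pixels tall, so a height-based label would vary from clip to clip, while
# the fixed 4096 width gives the stable "4K" name.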
 
And what exactly would you show these on? The VGA card would have to dither the display down (at least) 2:1. What's the point?

We had a similar discussion to this about wallpaper. Your stand was, "wallpaper should be at the native screen resolution". In this case, I took the other side with, "no, you can use up to 2x screen res without artifacting; the video card will dither it down". Displaying wallpaper is a big difference, though, since there is no frame-to-frame differential information.

That said, 2x the screen resolution equals 4 times the pixel count, hence the difficulty of maintaining frame rate. The only way you could show this at 1:1 would be (barely) on four monitors stacked two on top of two. Does this make any sense?
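
Putting numbers on that, using the 1920x1200 monitor mentioned above:

# How far the clip overshoots a 1920x1200 panel.
src_w, src_h = 4096, 1706
panel_w, panel_h = 1920, 1200

ratio = (src_w * src_h) / (panel_w * panel_h)
print(f"Source: {src_w * src_h / 1e6:.1f} MP, panel: {panel_w * panel_h / 1e6:.1f} MP")
print(f"Total pixel ratio: {ratio:.1f}x")  # ~3.0x
# Width alone is 4096 / 1920 = ~2.1x, which is why a 2x2 grid of these
# monitors would (barely) show the clip at 1:1.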
 
Well, the first big-screen plasma TVs from Pioneer were $10,000+ when they hit the consumer market. That said, with offerings at $76,000 (Sony, 56") and Panasonic at over half a million (152"), it seems the trickle-down might take quite a bit longer on these babies.

But that doesn't address this fact: with the panels we have now, the signal has to be downsampled roughly 2:1 to be displayed.

Upsampling has a lot more practical application than taking half the resolution and throwing it away, while still requiring that much bandwidth to transmit.

I suppose it does point to the fact that video codecs have become much more compact, so that you can download it at all.
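
Some rough numbers on that last point (24 fps and 24-bit RGB are assumptions for the uncompressed figure; the 220 MB file size is from the post above):

# Uncompressed data rate vs. the 220 MB download.
width, height, fps = 4096, 1706, 24                # fps is an assumption
raw_mb_per_sec = width * height * 3 * fps / 2**20  # 24-bit RGB

print(f"Raw: ~{raw_mb_per_sec:.0f} MB/s uncompressed")  # ~480 MB/s
print(f"220 MB of raw video: ~{220 / raw_mb_per_sec:.2f} s")
# At ~480 MB/s raw, the whole download would hold under half a second
# of uncompressed video -- everything beyond that is the codec's doing.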
 