Sharp announces 70-inch 8K TV, thoroughly outdating your new 4K set

By Shawn Knight · 52 replies
Sep 1, 2017
  1. richcz3

    richcz3 TS Rookie Posts: 19   +9

    So the technical point that a TV needs to be xSize to appreciate xResolution to be viewed at 8 feet comes to mind. The pixel density to achieve 8k on a 70" TV would require you to sit how close to the TV?
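    A rough sketch of the arithmetic behind that question, assuming the common rule of thumb that 20/20 vision resolves about one arcminute per pixel (the exact acuity figure varies by source):

```python
import math

def max_useful_distance_ft(diag_in, horiz_px, aspect=16 / 9):
    """Farthest distance (in feet) at which one pixel still subtends
    one arcminute -- sit farther back and the extra resolution is
    invisible, per the one-arcminute acuity assumption."""
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # horizontal screen width
    pitch_in = width_in / horiz_px                       # width of one pixel
    return pitch_in / math.tan(math.radians(1 / 60)) / 12

print(f"70-inch 4K: sit within {max_useful_distance_ft(70, 3840):.1f} ft")  # -> 4.6 ft
print(f"70-inch 8K: sit within {max_useful_distance_ft(70, 7680):.1f} ft")  # -> 2.3 ft
```

    Under that assumption, you would need to sit within roughly 2.3 feet of a 70" 8K panel to resolve the extra detail over 4K, far closer than a typical 8-foot couch distance.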
     
  2. richcz3

    richcz3 TS Rookie Posts: 19   +9

    Ever look at the Steam stats page? You represent 0.90% of game players playing at 4K. His 4K "barely supported" carries more weight and has more merit. As for console game developers on the topic of 4K: "The majority of respondents, 41 percent, were undecided on the benefits of the new consoles, while 36 percent felt neutral on the topic. For many, the issue seems to be the extra work involved in creating games for multiple versions of the same console, and the potential for splitting up the userbase." Remember: even if you use a PC, like I do, the games you play on a PC are revisions of those designed for the console. 4K has a long way to go before it's truly mainstream. 8K is a pipe dream for now.
     
    cliffordcooley likes this.
  3. DaveBG

    DaveBG TS Addict Posts: 283   +90

    I do not have an account on Steam, so I have no idea what their stats show, but knowing that some FPS gamers care more about the frame counter than picture quality, that might be true. It still does not convince me that anything less than 4K is worth considering in late 2017, or that 8K in the very near future is a "pipe dream"... but you can have whatever fits your liking.
     
  4. DaveBG

    DaveBG TS Addict Posts: 283   +90

    Keep being ordinary then. I am tired of people who do not understand new technology and do not care to even try, no matter how simply it is explained to them.
     
  5. richcz3

    richcz3 TS Rookie Posts: 19   +9

    Steam Survey http://store.steampowered.com/hwsurvey/
    Steam has 125 million users. When it comes to this survey, I also find myself on the low percentage scales of various stats. I play at 2560 x 1440, which is 2.66% of users; multiple GeForce 1080 Ti cards, 0.54%; more than 12 GB of RAM, 25.39%. The whole idea being, what I can afford doesn't necessarily reflect what the majority of end users use.

    Producers, developers, and content creators, as a general rule, gear their large productions toward a 1-3 year release window. The biggest hurdle for 4K is the lack of network or cable adoption of the standard. Unlike the build-up to the HDTV (1080) rollout, which had wide support and even a government standards push, no such movement for wide adoption has taken place. Try streaming 4K movies on Netflix or Amazon. Bandwidth usage issues aside, we barely have the infrastructure (true full fiber) to support true HD 1080. 4K is truly a "build it and they will come" situation; the problem is, very few networks are buying into it.
     
  6. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    Dave, there's nothing I enjoy more than starting an argument that wins itself. You're a techno-snob, get over yourself.

    You seem to think people with less means and less interest in being "the first kid on the block" to have the latest and greatest techno-trash are somehow lesser human beings than yourself.

    Given the laundry list of equipment you bragged about owning, it's a conclusion nobody could avoid making.

    It's all a matter of perspective really. Buying sh!t isn't really a talent you could peddle on a resume, now is it?

    Hash tag..... #sorryaboutthat
     
    Last edited: Sep 2, 2017
    bluto 2050 likes this.
  7. DaveBG

    DaveBG TS Addict Posts: 283   +90

    Oh, sweet irony! 4K is being replaced by 8K and is technology that is many years old, yet the old chap calls it new, completely confirming what I suspected. :D

    Seems like someone has just got a 1080p monitor and is making excuses? LOL
     
    Last edited: Sep 3, 2017
  8. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    Oh Dave, you're just swell. Can I have your autograph? :p
     
  9. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    Well kidz, all this TV really is, would be four 35" panel molds, pushed together. So where's the 'innovation'? In truth, there really isn't any.

    I used to sell hi-fi gear (granted, a long, long time ago), and the Sharp brand was pretty much a non-factor. If I recall correctly, it was mostly low-end stuff, huckstered by Kmart and the like. I never saw any of it in the wild unless it was in a poor person's home, owned by someone who knew nothing about TV or audio, and it often had a few knobs missing. (Yes, this was the era of manual tuners.)

    With that said, there are any number of Asian fabs turning out excellent panels on the cheap, and I'm pretty sure if they put their minds to it, they could turn out this junk.

    So, for you gamers out there, what do you need for a quality experience @4K resolution, a GTX-1070 or 1080?

    Since this is, as I said earlier, simply four 35" 4K panels pushed together, it seems to me you'd need GTX 1070s or GTX 1080s in quad SLI to render the frame rates you're accustomed to now at 4K.
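    The pixel arithmetic behind that quad-SLI guess is just raw pixel counts (a first-order sketch only; real multi-GPU scaling is well below linear):

```python
# Common resolutions and their pixel counts relative to 4K.
resolutions = {"1080p": (1920, 1080), "4K": (3840, 2160), "8K": (7680, 4320)}

base_px = resolutions["4K"][0] * resolutions["4K"][1]
for name, (w, h) in resolutions.items():
    px = w * h
    # Shading work grows roughly with pixel count, all else being equal.
    print(f"{name}: {px:>10,} pixels = {px / base_px:.2f}x the 4K load")
```

    8K is exactly 4x the pixel count of 4K, which is where the "four cards for four panels' worth of pixels" intuition comes from.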
     
  10. EClyde

    EClyde TS Evangelist Posts: 1,105   +347

    I run an 8- or 9-year-old 34-inch Wal-Mart Philips HDTV and I can see her nipples just fine.
     
  11. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    Hush you, or the real techno-crats here will start belittling you for not keeping up with the times.
     
  12. richcz3

    richcz3 TS Rookie Posts: 19   +9

    A 1080 Ti would be the starting point, but in reality even a Titan isn't going to give you a steady 60 fps. Gaming in 4K is very dependent on the game you are playing. 4K is very demanding, and having all the right components, like a very fast SSD and a fast overclocked CPU and memory, helps. Realistically, Ultra settings in many AAA games won't be an option. To get that resolution running at appreciable frame rates, other effects will need to be toned down.

    ...and before anyone says "Xbox One X and PS4 Pro do 4K": know that the picture fidelity is compromised and will be in future titles. There will be a set number of games designed and set to run in 4K, but just as there were quality caveats for gaming systems running at 1080, there will be limitations when games run in 4K.
     
  13. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    So this TV is exactly what I said it is, four 35" 4K panels shoved together, for the sake of a dog & pony show "look at us, look what we made" press release.

    Reasonably speaking then, you'd need four 1080 Ti cards running in quad SLI @ PCIe x16 to game effectively with it. Yes? No?
     
    Last edited: Sep 4, 2017
  14. cliffordcooley

    cliffordcooley TS Guardian Fighter Posts: 9,235   +3,309

    That's the way I take it.
     
  15. knckstdy213

    knckstdy213 TS Rookie

    I'm sure that it's just pixel density over actual features. What panel does it use? Does it have Dolby Vision or true HDR? If it can't replicate true colors or black levels properly then it's overcompensating.
     
  16. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

    I have two 4K HDR wide-color 10-bit panel (real HDR) TVs that look MUCH better at any distance in my home, on any content, than the best 1080p 2014 retail TV on the planet, which I also own. Unless you can dispute that, you cannot prove anything you said and don't really know anyway. LOL, go fish.

    I have the variables here up to 4K WCG HDR: I mean 1,500-nit HDR and 1,150-nit SDR at 4K or 2K, with 7,000:1 contrast and 1B colors possible, not some pitiful 350-nit, 300:1-contrast, 16.7M-color 1080p set, or a 400-500-nit fake-HDR TV, or a 400-nit SDR 4K set. That ain't nothing like what I have, on any content.

    You may have no idea about this TV. Go fish. ☻☺

    I know these *significant* variables here, and unqualified opinions aren't facts.

    FWIW, most folks prefer the 20,000-nit laboratory TV panels at Dolby Labs.

    Do you know how many nits the 8K Sharp panel will be?

    I don't, but it will tone-map what we have today, like HD/SDR up to 4K HDR TV, well.

    Pffft. You've never seen anything properly on a 1080p or 4K SDR TV, especially any good cinema film or digital production. LOL.

    And maybe the Sharp TV will be 5,000 or 10,000 nits and make a fool of the 8K panel naysayers, when you consider that Foxconn/Sharp Display has the most advanced TV panel fabs there are, and just who Terry Gou is and his long games. LOL.

    And I'm not going to teach you.

    It doesn't matter: if enough folks have the wallet for it, like $7-$10K without a thought of worry (cash, too), like I do, they will build 8K TVs even if I don't buy one. I am waiting for maybe a 4K HDR Samsung micro-dot LED (QD-LED) now anyway, not an 8K LCD, which is still a passive light-spectrum filter/blocker for minimum black (the LCD panel), though an 8K TADF-emitter OLED may be more interesting right now anyway.
     
    Last edited: Sep 4, 2017
  17. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    @bluto 2050 You know, every time I read one of your tirades, I reach for the Xanax. I'm not sure whether it's out of tribute, or sympathy.

    I would like to know, what in God's name would you do with a panel of 10,000 nits brightness? The first thing that springs to mind is to put a Fresnel lens in front of it and use it as a searchlight to guide anti-aircraft fire during a blackout.
     
    Burty117, richcz3 and Skidmarksdeluxe like this.
  18. sac39507

    sac39507 TS Enthusiast Posts: 67   +18

    I hope this will drop the price of the few remaining 1080p 70-75" screens to $200. Jackpot!
     
  19. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

    Moderator note, shouting has been removed thank you for the message about that I will note that going forward .

    Primarily for the benefit of others here :

    LOL, 10,000 nits is less than a yellow sunflower's 11,000 nits of reflected sunlight at noon.

    We do not presently have visually, realistically accurate colors, adequate color space, or bright enough retail TV panels to be even remotely realistic.

    Electronic pixel and scan-line display panels must all be individually electronically calibrated to room variables and conforming content standards. My new QDOT/SUHD 4K HDR 1000+ Samsung flagship TV, binned for a Samsung Direct FA01 ultra-black/ultra-clear/moth-eye panel code of my choosing, and the four other TVs and the PC panels here are calibrated, and it matters.

    Out of the box, retail TVs are mostly dreadful or nearly so, including my new flagship-tier Samsung and 2015 Sony 4K HDR TVs, and are not usually accurate to conforming standards. It does matter, trust me!

    Noting that we can see substantially wider 14-bit color than any 8- or 10-bit retail TV panel has:

    Retail SDR/HD TV LCD panels only have BT.709, 16.7M-color, 8-bit native color space, up to 1B-native-color, 10-bit HDR, DCI-P3 cinema color space boundaries, with or without LCD panel [+FRC] spatial and temporal dithering to emulate a higher native color bit depth, as any LCD panel does going up from 256 native colors.

    ITU-R BT.2020 (Rec. 2020) is only 10- or 12-bit color; we can see vastly wider 14-bit color.
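    For reference, the color counts quoted in these posts follow directly from per-channel bit depth: an RGB panel with b bits per channel can address (2^b)^3 colors.

```python
# Addressable RGB colors as a function of per-channel bit depth.
for bits in (8, 10, 12):
    levels = 2 ** bits            # shades per R, G, or B channel
    colors = levels ** 3          # total addressable RGB colors
    print(f"{bits}-bit/channel: {levels:>5} levels -> {colors:,} colors")
```

    That is where the "16.7M color" figure for 8-bit panels and the "1B colors" figure for 10-bit panels come from.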


    To get the info here, of course, one needs to understand electronic pixel-emission displays, like RGB/RGBW OLED; light-spectrum RGB/RGBW LCD cell-and-backlight pixel panels; and maybe, historically, scan-line shadow-mask or screen-grid RGB phosphor CRT emission panels, along with what color bit depth represents here, [+FRC] LCD panel dither, conforming standards, ITU color spaces, and color volume, together with perceptual and actual human vision boundaries.

    Today's finest retail TV panels aren't even remotely close to human visual acuity.

    As optical black applies to all of the universe and not to philosophy, it is not an additive color but wholly an absence of light, which is impossible outside of a confined space with no reflected or direct light, or maybe a black hole in outer space, or a void in an empty head.

    An LCD panel simply blocks backlight spectrum down to its minimum native, and sometimes assisted, black levels (active edge dimming, or direct-lit FALD, zone, or frame dimming) as calibrated and operated, but never an absolute black. An OLED panel does minimum black much, much better, to a figuratively infinite black as it applies here.

    Where all this applies here, a good 8K panel should have many more visible chroma gradations than a 4K panel, just as 4K does versus a 2K panel, and it matters. But you need at least an 1,150-1,500-nit LCD panel, or perhaps less with OLED, to do this well; otherwise, go fish.

    Chroma is not color, but it may as well be. LOL.


    Chrominance or chroma, one of the two components of a television signal that supplement a brightness signal to represent a color
    -Wikiopinion-

    I have been told we can educate, but we cannot fix stupid.


    Full Stop.
     
    Last edited: Sep 10, 2017
  20. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

    You can get a 70" cheap now, like the 4K SDR Vizio E70-E3 rubbish for $1,248.00, pocket lint, but that set is a stretch at getting a good picture at 55", much less at 70". And good luck finding a 1080p 70". LOL.
     
  21. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

    I see that, outside of maybe a cheap 4K SDR TV you may have seen, you have perhaps been reading uninformed misconceptions that do not account for vastly improved 4K LCD pixel YCbCr chroma gradations, color, and near-black low-intensity details, with upscaled 720p/1080i content at 55"+ and with 4K SDR or HDR content, observed at up to 25' distance against the finest 64" 1080p plasma TV for the 1080i image here, which was the best overall TV of 2013-2014 at those annual V.E. high-end TV shootouts that matter in the industry.

    That excellent 1080p Samsung plasma was compared to my much better 2016 flagship-tier Samsung 4K HDR 1000+ QDOT/SUHD TV, binned for a Samsung Direct FA01 panel code, and to my arguably decent 2015 55" Sony Triluminos wide-color 4K HDR LCD TV. Note that this image upgrade equally applies to 720p/1080p HDTV upscaled at the TV to 2160p SDR for the panel, with a significantly upgraded 4:2:0-chroma 720p/1080i image as the result.

    The science tells us that with a good TV and TV panel, with *good contrast* and, moreover, adequate brightness and color volume, there is a lot more being upgraded than the pixel count, which is necessary for the other image upgrades provoked by competently upscaling 1080i/720p to 2160p. You won't see it in a Vizio, or outside the Samsung, Sony, LG, EU Panasonic, and Philips tier-one brands: namely, a lot more chroma information than a native 720p/1080i picture carries.

    Again: chroma is not color, but it may as well be. LOL.

    Chrominance or chroma, one of the two components of a television signal that supplement a brightness signal to represent a color
    -Wikiopinion -
     
    Last edited: Sep 10, 2017
  22. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    The essence of which is, "your TV is better than that TV".

    So basically, without "chrominance" or "chroma", you'd have gray scale. How profound.
     
  23. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

    Earth to captaincranky:

    Three of my TVs are probably better than most in the wild, but in any case you don't understand chroma on an LCD or OLED panel at all, or the related benefits I outlined above at 4K versus 2K, and probably at 8K on a good 8K panel and TV set, though probably not this particular 8K TV so much, as far as I can see.

    This TV looks like a low-end panel in a low-end trade-show test mule. If it were Samsung or LG, that would be another matter entirely. New TV panels are always lacking something until they shake out, and that ain't cheap; ask LGD about OLED TVs.


    OTOH, pixel-emission micro-dot LED (QD-LED), probably at 4K HDR, is still my new upgrade ship coming in at some point.

    FWIW, chroma is resolution and resolution is chroma here, at sub-pixel manipulations, instead of analog scan-line demodulated chroma, and neither is grey scale or color. You're clueless. LOL.

    Frankly speaking, you need some study. Pixel chroma is not grey scale, and grey scale is not pixel chroma; they are different data sets or analog demodulated signals. You don't appear to *really* understand any of this at any level that matters, so why make yourself look like a fool around all this?

    "Grayscale is a range of shades of gray without apparent color."
    -Wikiopinion-

    "Chroma subsampling is the practice of encoding images by implementing less resolution for chroma information than for luma information, taking advantage of the human visual system's lower acuity for color differences than for luminance.
    It is used in many video encoding schemes – both analog and digital – and also in JPEG encoding"
    -Wikiopinion-


    Have you ever electronically calibrated a panel correctly, or at all?
    Do you own a spectroscope or colorimeter that isn't from mid-century NTSC scan-line TV, and if so, do you know how to use it?

    TBH, I don't think so. LOL.


    Well then, Doctor of All Knowledge Captain Cranky, tell us: when was the last time you saw colored grey-scale images or ramps, not to be conflated with conforming color gradations or color JPEG images and ramps, outside of a panel's faulty calibration, vignetting, video processing in the TV, or color banding making a grey-scale ramp push a tint, band, or worse?
     
    Last edited: Sep 10, 2017
  24. captaincranky

    captaincranky TechSpot Addict Posts: 12,693   +2,393

    [image]
    Then there's this:
    [image: composite histogram]
    That's a composite histogram. Separate individual color-channel adjustments are also available.

    Really? I've been hollering up to your ivory electronic tower for what is beginning to seem like an eternity?

    No, you need to dismount that high horse of yours before you continue to sound like a pompous a**.

    Keep in mind I've been around for a long, long time, and all of this crap you're spouting goes right to what the "tint" control on those nasty old analog TVs actually did. And that was to manage the magenta-to-green ratio, hopefully providing a neutral gray as a mean.

    I have absolutely no idea where you summon the hubris to believe a person needs all that equipment to understand color intensity or balance. The entire motion picture industry has a long-running history of using color balance, lighting, and contrast ratio to establish mood and intent in practically every scene of a motion picture. And FWIW, they likely have some transient tech calibrate the panels before they begin the final editing. After which, the true art begins. Hopefully, those techs aren't as long-winded and self-absorbed as you. Otherwise, for the editing staff, it's going to be a long, miserable day.


    Laboratory calibration becomes sort of meaningless, in the face of all the signal tampering being done by TV stations, advertisers, and content providers.

    AFAIK, all that fancy equipment you're espousing being able to operate, is available for QC and calibration right on the assembly line, and could be done by robots.

    Besides, each and every time you open any of the Adobe imaging programs, you are provided with the monitor's (supposed) calibration profile.
     
  25. bluto 2050

    bluto 2050 TS Addict Posts: 271   +32

     
