"HDMI 2.0 (the name going around the industry, despite the HDMI forum's decision to do away with version numbers for all HDMI products)" Why do away with version numbers? This seems like a recipe for confusing the hell out of uninformed consumers. Any rube can figure out that 2>1, but how are they supposed to know that their HDMI-equipped receiver doesn't have the right version of HDMI? It might not be a big issue now, but when 4K gains a little more traction and theres HDMI 2.0 stuff being sold along non-4k capable HDMI equipment, it'll be a mess.
Also, why haven't they standardized a resolution for 4k?
I spent almost two hours watching The Avengers 3-D on Sony's 84-inch 4K XBR-84X900 in the Manhattan Sony Style store over Thanksgiving weekend. Oddly enough, it was stashed downstairs in a corner room with no signs saying it was there. I had to ask three people before someone even knew what I was talking about! It was a thing of beauty. I'm not a fan of 3-D in general (the flicker is horrid), but this TV sold me on passive, full-resolution 3-D. Even switching to its 2-D mode was great.
Here's the thing, though. 4K is just one of the two UHD standards being pushed and developed. 8K @ 120fps gear is already being shown at trade shows next to 4K gear and the asian manufacturers are obviously not sitting still on 4K. 8K is already part of the Rec 2020 spec and is moving forward toward broadcast by Japan's NHK. Naturally, 8K will be more cost-prohibitive than 4K, but the same was true of 1080p vs 720p. It may take 7 years for 8K to be cheap enough, but it took 7 years for 1080p to be cheap as well.
Hm, when I started looking for an LCD TV (2008) 1080p was fractionally more expensive than 720p solutions (which used 1366x768). And isn't there always "the next thing" to look out for...? When 4k gets here it will be much cheaper than 8k for quite a while and while 8k is being tested (none of these resolutions is particularly "new"), that doesn't translate to faster adoption, cheap prices or anything like you seem to imply. :)
Yeah, I fully acknowledge that we won't know for sure until 8K sets start hitting stores, but I do know for a fact that Japan is pushing 8K adoption hard - the display and the OTA transmission tech. It's not like we saw 4K 6 years before 8K - these standards have been rolled out near simultaneously with hardware to show within 1-2 years apart. Actual production isn't far behind.
It's totally my opinion - I'd like to think it's an educated one - that 4K will be a stopgap and quickly be replaced by 8K - at least in the living room/HT. I'm sure 4K will continue in computer monitors, laptops, tablets, and probably even some crazy smartphones for a good long while.
Should we see who wins the bet 5 years from now? LOL
The problem with 8K (for a home at least) is a matter of where you are going to put it? Seriously, If you sit across an average 15' room from your TV, and had it set at a 'retina' resolution for that distance then you are talking about a TV that is taller than your average 8' ceiling, and more than 20' across! That is truly massive, and there is no way I am going to be caught lifting such a thing (this is where projectors come in handy).
I think we will see 4K become the standard for in-home use. A few rich people will get 8K to have their cinema wall, or to simply say that they have it, but 4K is plenty of resolution across a room to make an 80+" TV a 'retina' display.
Personally, I am excited for the 4K revolution to hit simply because I will be able to get a ~30" 4K computer monitor with the same print-like visuals I get on my phone, and do away with things like AA in games because the pixels will be finer than the jaggies.
In my current configuration, I project 1080p at 110". It looks great starting about 6' away and I normally sit about 7-8' away. *gasp* I don't follow the rules on screen size calculators!!
However, I would rather project 8K at 175" - or maybe over 200" in another HT build if I build an addition. The great thing about 8K is that it is just about transparent to 70mm/65mm film. As more and more directors shoot IMAX or move to higher and higher resolution digital, 4K just won't cut it. (Doesn't cut it now, IMO.)
I'm not rich, but I will have 8K projection before 2020.
Last I knew Blu-ray adoption is still under DVD. So why would anyone think that 4K is what the average consumer wants? I just don't see it. Sure, it might be a better resolution, but if the average Joe can't tell the difference, there won't be much take up on the consumer side.
Not only that, but the horrible compression that cable co's put on 720p/1080i signals is already pretty bad. I don't know how much 4K with H.265 can fix that.
Blu-ray adoption has been slow, NOT because people don't want 1080p movies but because (a) they don't want to be gouged in paying for them. Until recently, Blu-ray discs cost a LOT more than DVDs. (b) the DRM on them is even more hassle and more painful than DVDs.
A more interesting data point would be to compare, on iTunes, how many people buy/rent content in SD vs HD format. I don't know if Apple has ever released those numbers.
I think playing the "consumers can't tell the difference" card is dumb. People said the same thing (go back and look) about SD vs HD. People said the same thing about retina-quality displays. 1080p on a 42" screen (what I have) is obviously not as crisp as real life, even what viewed from a few feet away. The issue is not "no-one can tell the quality difference", the issue is what content will be available, at what price, under what conditions.
The first big problem is broadcast. There is no obvious 4K broadcast plan, and given how badly the cable co's handle HD, don't rely on them as a source of 4K content. So discs? Good luck with that dream. Sony may feel they want a near-death experience all over again shipping Blu-Ray-4K (or whatever they're called) drives in PS4s, and charging $50 for the discs, but I suspect even fewer people will bite than with Blu-Ray. So it comes down to network delivery. Which means (a) how many people have a good enough internet connection? (b) who has an h.265 decoder in some sort of box?
Which in turn (IMHO) means that nothing REAL happens until Apple feels like making it happen. Apple will, one day, bless the whole enterprise, announcing 4K content in iTunes store, h.265 decoders in the newest iOS devices (and, maybe via some sort of HW + using the GPU, in macs), HDMI2 output, maybe even retina iMacs so they can display 4K content without shrinking it down.
"Unlike 3D, 4K has legitimate industry uses in the medical imaging and IP surveillance industry"
Sorry but there are legitimate industry uses in medical imaging for 3D, molecular modeling is a natural application for 3D and can be applied to a wide variety of medical disciplines.
Also, the article makes it sound as if 3D is going to be phased out completely. I imagine most if not all of these high-end 4K panels will still support 3D in some form or another. I'm excited because the 4K native resolution and higher bandwidth specs of HDMI 2.0 mean we should finally get full 2x1080p@60Hz in 3D in the HDTV space.
There is a difference between 'can be applied' and 'can be effectively applied'. The problem is that almost all 3D technologies have some drawback or the other. Things which really help molecular modeling / visualization such as Planar 3D's tech [ http://www.planar3d.com/ ] haven't been adopted by 3D TV manufacturers.
Considering '3D being phased out' is akin to 'SD is now phased out because everything is HD'. The ground reality seems to be that TV manufacturers have realized that 3D no longer attracts consumers [ http://www.theverge.com/2013/1/8/3852452/death-of-... ] and that something better needs to be done.
"Contrary to popular belief, there is really no dearth of 4K content since most professional videographing solutions have been 4K capable for a number of years. It is a simple matter of bringing that content in the right format to the end-consumer."
So... there *is* a dearth of 4K content. No one cares how much source content is out there that was produced in 4K+ resolutions. If that content is not available to the end-consumer in some form, then, as far as the end-consumer is concerned, there is a dearth of 4K content.
There is currently no delivery methods for 4K, and I don't see that changing anytime soon. Cable companies currently have trouble delivering even 720p content without compression artifacts, let alone 1080i, and I don't think anyone does full 1080p. And they're going to get up to 4K broadcasts? Right. The bandwidth for people to stream 4K (or download the files in a reasonable amount of time) isn't there, and even a few years from now it's not likely to be widely available. Discs? I personally buy Blu-rays (when I can find the movies on sale for a non rip-off price), but Blu-ray isn't widely adopted, and a 4K disc solution will probably be very niche.
All-in-all, I don't see a bright future for 4K probably for at least a decade. I work in the TV industry in people's homes all day, and I can tell you: there are still a lot of people who'd rather watch 480i on their flatscreen TVs than pay $10/mo for HD. A lot.
I never mentioned that consumers have plenty of 4K content. The truth is: there is lots of 4K content waiting to reach consumers. There is no point putting 4K content amongst consumers when there is no cost-effective 4K display available.
You can already create 4K videos from a GoPro Hero 3 Black, and it costs less than $400. Compare that with when 1080p HD was introduced. Camcorders recording HD cost more than $1000, and 1080p60 wasn't available for less than $500 until 2 years ago.
The shift from SD to HD and HD to 4K is not going to follow the same timeframes primarily because of the cost-effectiveness with which content can be created by the common man (read, GoPro Hero lineup), and the ease with which it can be distributed (Internet downloads). These two factors didn't start playing a major role in the SD to HD transition until 2008 and later.
I predict 4Kp24 / 4Kp30 content to become popular and accessible by early 2015. 4Kp60 access for the common man, on the other hand, might be around 2018 - 2020.
On a side note, linear TV will become less and less relevant. Cord cutting might or might not exist, but the fact is that people are becoming less and less reliant on their cable or satellite provider for HD content (Netflix provides 1080p streaming for many titles, and there are quite a few OTA channels in the USA broadcasting in HD - 720p60 or 1080i60)
14 Comments
Activate: AMD - Tuesday, January 8, 2013 - link
"HDMI 2.0 (the name going around the industry, despite the HDMI forum's decision to do away with version numbers for all HDMI products)"Why do away with version numbers? This seems like a recipe for confusing the hell out of uninformed consumers. Any rube can figure out that 2>1, but how are they supposed to know that their HDMI-equipped receiver doesn't have the right version of HDMI? It might not be a big issue now, but when 4K gains a little more traction and theres HDMI 2.0 stuff being sold along non-4k capable HDMI equipment, it'll be a mess.
Also, why haven't they standardized on a single resolution for 4K?
nathanddrews - Tuesday, January 8, 2013 - link
I spent almost two hours watching The Avengers 3-D on Sony's 84-inch 4K XBR-84X900 in the Manhattan Sony Style store over Thanksgiving weekend. Oddly enough, it was stashed downstairs in a corner room with no signs saying it was there. I had to ask three people before someone even knew what I was talking about! It was a thing of beauty. I'm not a fan of 3-D in general (the flicker is horrid), but this TV sold me on passive, full-resolution 3-D. Even switching to its 2-D mode was great.

Here's the thing, though. 4K is just one of the two UHD standards being pushed and developed. 8K @ 120fps gear is already being shown at trade shows next to 4K gear, and the Asian manufacturers are obviously not sitting still on 4K. 8K is already part of the Rec. 2020 spec and is moving toward broadcast by Japan's NHK. Naturally, 8K will be more cost-prohibitive than 4K, but the same was true of 1080p vs. 720p. It may take 7 years for 8K to be cheap enough, but it took 7 years for 1080p to get cheap as well.
Death666Angel - Tuesday, January 8, 2013 - link
Hm, when I started looking for an LCD TV (2008), 1080p was only fractionally more expensive than 720p solutions (which used 1366x768).

And isn't there always "the next thing" to look out for...? When 4K gets here it will be much cheaper than 8K for quite a while, and while 8K is being tested (none of these resolutions is particularly "new"), that doesn't translate into faster adoption, cheap prices, or anything like you seem to imply. :)
nathanddrews - Tuesday, January 8, 2013 - link
Yeah, I fully acknowledge that we won't know for sure until 8K sets start hitting stores, but I do know for a fact that Japan is pushing 8K adoption hard - both the displays and the OTA transmission tech. It's not like we saw 4K six years before 8K - these standards have been rolled out nearly simultaneously, with hardware to show within a year or two of each other. Actual production isn't far behind.

It's totally my opinion - I'd like to think it's an educated one - that 4K will be a stopgap and quickly be replaced by 8K - at least in the living room/HT. I'm sure 4K will continue in computer monitors, laptops, tablets, and probably even some crazy smartphones for a good long while.
Should we see who wins the bet 5 years from now? LOL
CaedenV - Tuesday, January 8, 2013 - link
The problem with 8K (for a home at least) is where you are going to put it.

Seriously, if you sit across an average 15' room from your TV and had it set at a 'retina' resolution for that distance, then you are talking about a TV that is taller than your average 8' ceiling and more than 20' across! That is truly massive, and there is no way I am going to be caught lifting such a thing (this is where projectors come in handy).
I think we will see 4K become the standard for in-home use. A few rich people will get 8K to have their cinema wall, or to simply say that they have it, but 4K is plenty of resolution across a room to make an 80+" TV a 'retina' display.
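For anyone who wants to sanity-check that, here is a rough Python sketch of the viewing-distance math. It assumes "retina" means roughly 60 pixels per degree of visual angle (about one arcminute per pixel); the 60 ppd threshold and the helper function are my own assumptions for illustration, not anything from the comment or a spec.

```python
import math

def max_retina_size_inches(h_px, v_px, distance_ft, ppd=60):
    """Largest panel (width, height, diagonal in inches) whose pixels each
    subtend no more than ~1/ppd degrees at the given viewing distance,
    i.e. the biggest screen that still looks 'retina' from that seat."""
    distance_in = distance_ft * 12.0
    pixel_angle_deg = 1.0 / ppd                       # angular budget per pixel
    pitch_in = distance_in * math.tan(math.radians(pixel_angle_deg))
    width, height = h_px * pitch_in, v_px * pitch_in
    return width, height, math.hypot(width, height)

for name, (h, v) in {"1080p": (1920, 1080),
                     "4K UHD": (3840, 2160),
                     "8K UHD": (7680, 4320)}.items():
    w, hgt, diag = max_retina_size_inches(h, v, distance_ft=15)
    print(f"{name}: up to ~{w / 12:.1f} ft wide x {hgt / 12:.1f} ft tall "
          f"(~{diag:.0f}-inch diagonal) and still 'retina' at 15 ft")
```

Run as-is, it puts the 8K limit at 15 feet around a 460-inch diagonal (roughly 33 ft wide by 19 ft tall), which lines up with the taller-than-the-ceiling point above, while the 4K limit comes out near 230 inches - comfortably beyond an 80"+ set.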
Personally, I am excited for the 4K revolution to hit simply because I will be able to get a ~30" 4K computer monitor with the same print-like visuals I get on my phone, and do away with things like AA in games because the pixels will be finer than the jaggies.
nathanddrews - Tuesday, January 8, 2013 - link
In my current configuration, I project 1080p at 110". It looks great starting about 6' away, and I normally sit about 7-8' away. *gasp* I don't follow the rules on screen size calculators!!

However, I would rather project 8K at 175" - or maybe over 200" in another HT build if I build an addition. The great thing about 8K is that it is just about transparent to 70mm/65mm film. As more and more directors shoot IMAX or move to higher and higher resolution digital, 4K just won't cut it. (Doesn't cut it now, IMO.)
I'm not rich, but I will have 8K projection before 2020.
danjw - Tuesday, January 8, 2013 - link
Last I knew, Blu-ray adoption was still below DVD's. So why would anyone think that 4K is what the average consumer wants? I just don't see it. Sure, it might be a better resolution, but if the average Joe can't tell the difference, there won't be much uptake on the consumer side.

Pneumothorax - Tuesday, January 8, 2013 - link
Not only that, but the compression that cable co's put on 720p/1080i signals is already horrible. I don't know how much 4K with H.265 can fix that.

FeelLicks - Tuesday, January 8, 2013 - link
That is because back in the DVD days you didn't have online options like iTunes/Amazon or streaming options such as Hulu/Netflix, etc.

I think it is pretty safe to say the average consumer wants 720p/1080p minimum these days. And I can definitely see that move to 4K happening, depending on prices.
name99 - Tuesday, January 8, 2013 - link
Blu-ray adoption has been slow, NOT because people don't want 1080p movies, but because:
(a) they don't want to be gouged paying for them. Until recently, Blu-ray discs cost a LOT more than DVDs.
(b) the DRM on them is even more of a hassle and more painful than on DVDs.
A more interesting data point would be to compare, on iTunes, how many people buy/rent content in SD vs HD format.
I don't know if Apple has ever released those numbers.
I think playing the "consumers can't tell the difference" card is dumb. People said the same thing (go back and look) about SD vs HD. People said the same thing about retina-quality displays. 1080p on a 42" screen (what I have) is obviously not as crisp as real life, even what viewed from a few feet away. The issue is not "no-one can tell the quality difference", the issue is what content will be available, at what price, under what conditions.
The first big problem is broadcast. There is no obvious 4K broadcast plan, and given how badly the cable co's handle HD, don't rely on them as a source of 4K content.
So discs? Good luck with that dream. Sony may feel they want a near-death experience all over again shipping Blu-Ray-4K (or whatever they're called) drives in PS4s, and charging $50 for the discs, but I suspect even fewer people will bite than with Blu-Ray.
So it comes down to network delivery. Which means
(a) how many people have a good enough internet connection?
(b) who has an h.265 decoder in some sort of box?
Which in turn (IMHO) means that nothing REAL happens until Apple feels like making it happen. Apple will, one day, bless the whole enterprise, announcing 4K content in the iTunes Store, h.265 decoders in the newest iOS devices (and, maybe via some sort of HW + using the GPU, in Macs), HDMI 2.0 output, maybe even Retina iMacs so they can display 4K content without shrinking it down.
chizow - Wednesday, January 9, 2013 - link
"Unlike 3D, 4K has legitimate industry uses in the medical imaging and IP surveillance industry"Sorry but there are legitimate industry uses in medical imaging for 3D, molecular modeling is a natural application for 3D and can be applied to a wide variety of medical disciplines.
Also, the article makes it sound as if 3D is going to be phased out completely. I imagine most if not all of these high-end 4K panels will still support 3D in some form or another. I'm excited because the 4K native resolution and higher bandwidth specs of HDMI 2.0 mean we should finally get full 2x1080p@60Hz in 3D in the HDTV space.
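Out of curiosity about the bandwidth side of that, here's a back-of-the-envelope Python sketch of the uncompressed link rates involved. The video timings (totals including blanking) and HDMI 1.4's 340 MHz / 10.2 Gbps ceiling are the standard published figures; the 18 Gbps number for HDMI 2.0 is only what has been reported ahead of a final spec, so treat that line as an assumption.

```python
# Back-of-the-envelope HDMI link budget for uncompressed 8-bit RGB (24 bpp).
# HDMI carries 10 bits on the wire for every 8 bits of data on each of its
# three TMDS channels. Timings below are the standard totals with blanking;
# HDMI 2.0's 18 Gbps is the pre-release figure, so treat it as assumed.

FORMATS = {
    # name: (total_h, total_v, refresh_hz)
    "1080p60 (2D)":            (2200, 1125, 60),
    "1080p60 frame-packed 3D": (2200, 2250, 60),  # two full frames per refresh
    "4K (3840x2160) @ 30 Hz":  (4400, 2250, 30),
    "4K (3840x2160) @ 60 Hz":  (4400, 2250, 60),
}

HDMI_1_4_PAYLOAD_GBPS = 3 * 340e6 * 8 / 1e9   # 3 ch x 340 MHz x 8 data bits = 8.16
HDMI_2_0_PAYLOAD_GBPS = 18e9 * 8 / 10 / 1e9   # assumed 18 Gbps raw -> 14.4

for name, (h, v, hz) in FORMATS.items():
    needed_gbps = h * v * hz * 24 / 1e9       # pixel clock x 24 bits per pixel
    print(f"{name}: needs {needed_gbps:.2f} Gbps "
          f"(HDMI 1.4 carries {HDMI_1_4_PAYLOAD_GBPS:.2f}, "
          f"HDMI 2.0 reportedly ~{HDMI_2_0_PAYLOAD_GBPS:.1f})")

# Fitting the raw budget isn't the whole story: the spec also has to define
# the timing. HDMI 1.4 only mandates 3D frame packing at 1080p24 and
# 720p50/60, which is part of why 1080p60-per-eye 3D hasn't shown up yet.
```

On these numbers, 4K tops out at 30 Hz within HDMI 1.4's budget, while 4K60 needs almost exactly the ~14.4 Gbps of payload that an 18 Gbps HDMI 2.0 link would provide.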
ganeshts - Wednesday, January 9, 2013 - link
There is a difference between 'can be applied' and 'can be effectively applied'. The problem is that almost all 3D technologies have some drawback or the other. Things which really help molecular modeling / visualization, such as Planar 3D's tech [ http://www.planar3d.com/ ], haven't been adopted by 3D TV manufacturers.

Saying '3D is being phased out' is akin to saying 'SD is now phased out because everything is HD'. The ground reality seems to be that TV manufacturers have realized that 3D no longer attracts consumers [ http://www.theverge.com/2013/1/8/3852452/death-of-... ] and that something better needs to be done.
kyuu - Wednesday, January 9, 2013 - link
"Contrary to popular belief, there is really no dearth of 4K content since most professional videographing solutions have been 4K capable for a number of years. It is a simple matter of bringing that content in the right format to the end-consumer."So... there *is* a dearth of 4K content. No one cares how much source content is out there that was produced in 4K+ resolutions. If that content is not available to the end-consumer in some form, then, as far as the end-consumer is concerned, there is a dearth of 4K content.
There are currently no delivery methods for 4K, and I don't see that changing anytime soon. Cable companies currently have trouble delivering even 720p content without compression artifacts, let alone 1080i, and I don't think anyone does full 1080p. And they're going to get up to 4K broadcasts? Right. The bandwidth for people to stream 4K (or download the files in a reasonable amount of time) isn't there, and even a few years from now it's not likely to be widely available. Discs? I personally buy Blu-rays (when I can find the movies on sale for a non-rip-off price), but Blu-ray isn't widely adopted, and a 4K disc solution will probably be very niche.
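To put some (admittedly rough) numbers on the bandwidth point, here's a quick Python sketch. Every input is an assumption for illustration: the 1080p bitrates are ballpark figures and the H.265 factor is a guess at the "up to 50%" savings being promised, not anything measured.

```python
# Rough 4K streaming bitrate estimate. Assumptions, not measurements:
# a well-mastered 1080p Blu-ray carries roughly 25-35 Mbps of H.264 video,
# 4K has 4x the pixels, and H.265 is assumed to hit similar quality at a
# bit over half the H.264 bitrate.

PIXEL_RATIO = (3840 * 2160) / (1920 * 1080)   # = 4.0
HEVC_FACTOR = 0.55                            # assumed H.265 vs H.264 efficiency

def scale_to_4k(bitrate_1080p_mbps):
    """Scale a 1080p H.264 bitrate to an estimated 4K H.265 bitrate."""
    return bitrate_1080p_mbps * PIXEL_RATIO * HEVC_FACTOR

lo, hi = scale_to_4k(25), scale_to_4k(35)
print(f"Blu-ray-quality 4K video: roughly {lo:.0f}-{hi:.0f} Mbps")

# Even a heavily compressed 'streaming quality' feed, scaled from an assumed
# ~6 Mbps 1080p stream, still needs sustained double-digit Mbps per TV.
print(f"Streaming-quality 4K video: ~{scale_to_4k(6):.0f} Mbps")
```

Even the optimistic streaming estimate is a sustained ~13 Mbps per set, which is more than many home connections can reliably hold, and the Blu-ray-quality range is beyond anything short of a new disc format or a very fat pipe.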
All in all, I don't see a bright future for 4K for probably at least a decade. I work in the TV industry in people's homes all day, and I can tell you: there are still a lot of people who'd rather watch 480i on their flatscreen TVs than pay $10/mo for HD. A lot.
ganeshts - Wednesday, January 9, 2013 - link
I never mentioned that consumers have plenty of 4K content. The truth is, there is lots of 4K content waiting to reach consumers. There is no point putting 4K content in front of consumers when there is no cost-effective 4K display available.

You can already create 4K videos with a GoPro Hero 3 Black, and it costs less than $400. Compare that with when 1080p HD was introduced: camcorders recording HD cost more than $1000, and 1080p60 wasn't available for less than $500 until two years ago.
The shift from SD to HD and HD to 4K is not going to follow the same timeframes primarily because of the cost-effectiveness with which content can be created by the common man (read, GoPro Hero lineup), and the ease with which it can be distributed (Internet downloads). These two factors didn't start playing a major role in the SD to HD transition until 2008 and later.
I predict 4Kp24 / 4Kp30 content will become popular and accessible by early 2015. 4Kp60 access for the common man, on the other hand, might not come until 2018 - 2020.
On a side note, linear TV will become less and less relevant. Cord cutting may or may not take off, but the fact is that people are becoming less and less reliant on their cable or satellite provider for HD content (Netflix provides 1080p streaming for many titles, and there are quite a few OTA channels in the USA broadcasting in HD - 720p60 or 1080i60).