2015 was the first year that 4K TVs and monitors began to gain serious traction, but that isn’t stopping manufacturers at the top of the market. Sharp has announced that it will launch 8K displays by the end of October, with its first 85-inch panels selling for roughly $125,000 apiece. These panels aren’t headed for the mainstream consumer market, however. Instead, they’ll be snapped up by public broadcasters like NHK, which plans to test its first 8K broadcasts next year and wants to have a regular service ready for the 2020 Tokyo Olympics.
With 8K apparently hard on the heels of 4K, could we see a rapid transition between the two standards, either in multimedia or gaming?
Probably not. Consumer sets aren’t expected to actually enter the market until 2018; these early panels are designated for business purchases and further development of the 8K standard. Beyond that, there are issues of cost, scale, and need to consider. First, there’s the fact that, at least in the United States, a significant amount of HDTV programming still relies on either 720p or 1080i. Companies like Comcast are only slowly switching over to H.264, even though the H.265 standard has been finalized and 4K Blu-ray discs encoded with H.265 are expected to hit shelves by this Christmas.
It’s always possible that cable networks could skip 1080p altogether and leap from 1080i to 4K, but frankly it seems unlikely. Without H.265, the bandwidth requirements for 4K over H.264 would be a huge increase over the MPEG-2 1080i / 720p streams most cable companies currently use. That doesn’t mean we won’t see 4K content; satellite companies and video-on-demand networks are already moving in this direction, with services like Netflix now offering 4K streaming on certain TVs. The entire content push is nascent, however, and the industry isn’t going to spend several years spinning up 4K production just to pivot to 8K in 2018, when the first commercial sets are expected to be available.
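For a sense of scale, here’s a rough back-of-envelope sketch of that bitrate math in Python. The baseline cable bitrate and the codec-efficiency factors are illustrative rule-of-thumb assumptions, not measured figures, and real encoders don’t scale perfectly with pixel count:

```python
# Back-of-envelope bitrate comparison. The numbers below are assumptions for
# illustration: a typical MPEG-2 1080i cable channel in the mid-teens of Mbps,
# H.264 needing roughly half the bitrate of MPEG-2 at similar quality, and
# H.265 roughly half of H.264 again. Bitrate is assumed to scale linearly
# with pixel count, which is only a crude approximation.

MPEG2_1080I_MBPS = 15.0            # assumed baseline for a 1080i cable channel
PIXELS_1080 = 1920 * 1080
PIXELS_4K = 3840 * 2160

EFFICIENCY = {"MPEG-2": 1.0, "H.264": 0.5, "H.265": 0.25}  # relative to MPEG-2

def estimated_bitrate(pixels: int, codec: str) -> float:
    """Scale the baseline bitrate by pixel count and codec efficiency."""
    return MPEG2_1080I_MBPS * (pixels / PIXELS_1080) * EFFICIENCY[codec]

for codec in ("MPEG-2", "H.264", "H.265"):
    print(f"4K over {codec}: ~{estimated_bitrate(PIXELS_4K, codec):.0f} Mbps")
# Roughly 60, 30, and 15 Mbps respectively -- which is why skipping H.265 makes
# 4K delivery so painful for a cable plant built around MPEG-2 1080i/720p.
```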
Shooting and producing in 8K can still be valuable for capturing fine-grained detail or for later downsampling. Start editing with 8K and you’ve got more room to trim or correct errors in the image without compromising shot quality. Expect 8K adoption in the studio long before we ever see it on consumer screens or content feeds.
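To put a number on that headroom, here’s a trivial sketch; the resolutions come from the UHD specs, while the workflow itself is hypothetical:

```python
# How much reframing room an 8K master leaves when the deliverable is 4K.
SRC_W, SRC_H = 7680, 4320      # 8K UHD capture
OUT_W, OUT_H = 3840, 2160      # 4K UHD deliverable

max_punch_in = min(SRC_W / OUT_W, SRC_H / OUT_H)
print(f"Crop in up to {max_punch_in:.0f}x and the output is still native 4K.")
# A 2x punch-in means you can reframe to any quarter of the original frame
# without upscaling a single pixel.
```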
What about gaming or other content?
The other potential homes for this kind of resolution are gaming and smartphone displays. In both cases, there are profound barriers to adoption. We recently examined how much power it takes to render each frame of our Metro Last Light Redux benchmark in our R9 Nano coverage, and that graph is worth checking out again in this context:
In Last Light Redux, it takes almost exactly 4x as much power to draw a 4K frame as it does to draw one at 1080p. That makes sense, considering that 4K panels have 4x the pixels. Nvidia has a general advantage, but the gap between the two companies isn’t that big. A hypothetical 8K display, with 4x the pixels of 4K again, would therefore require an astronomical 75W per frame if we use the GTX 980 Ti as a baseline. 30 FPS at 8K? At 75W per frame, that works out to a roughly 2.2kW power draw.
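Here’s the same arithmetic spelled out in Python. The chart’s per-frame figure is treated as energy per frame (joules), which is what lets a frame rate turn it into sustained power draw; the 1080p and 4K values below are simply the 8K estimate divided by 16 and 4, not independent measurements:

```python
# Scale the ~75 "watts per frame" 8K estimate (GTX 980 Ti baseline) by pixel count,
# then multiply by frame rate to get sustained power draw.

PIXELS = {"1080p": 1920 * 1080, "4K": 3840 * 2160, "8K": 7680 * 4320}
ENERGY_PER_FRAME_8K = 75.0    # joules per frame, per the estimate above

def sustained_power(resolution: str, fps: int) -> float:
    """Assume energy per frame scales linearly with pixel count."""
    energy = ENERGY_PER_FRAME_8K * PIXELS[resolution] / PIXELS["8K"]
    return energy * fps       # joules/frame * frames/second = watts

for res in ("1080p", "4K", "8K"):
    print(f"{res} @ 30 FPS: ~{sustained_power(res, 30):.0f} W")
# 1080p: ~141 W, 4K: ~562 W, 8K: 2250 W -- the 2.2kW figure above.
```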
Even if we assume that 14nm draws 35-50% less power than 28nm, that still puts our hypothetical 8K render-station somewhere in the 1,100-1,450W range. Furthermore, it’ll be 2016 by the time 14nm is ready for GPUs, which puts graphics cards on a four-year cadence for delivering this kind of power-consumption improvement. At that rate, the next major reduction hits in 2020, and only cuts the power required for 8K performance at 30 FPS to just under 1kW. If you want to game at the 300-400W power envelopes current cards provide, it’ll happen between 2024 and 2028, depending on how optimistic one is about the process roadmap. Of course, it’s always technically possible that we’ll invent a new type of semiconductor, or discover that pizza sauce is actually a superconductor at room temperature, thereby throwing all previous metrics out the window, but absent such radical innovations, we can take a pretty good guess at what the future looks like.
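That roadmap estimate is easy to reproduce. The sketch below starts from the ~2.25kW figure above and assumes a 35-50% power reduction per node and a new GPU node every four years from 2016; both assumptions are doing a lot of work here:

```python
# Project the power needed for 8K @ 30 FPS across hypothetical node shrinks.
BASELINE_WATTS = 2250.0             # 8K @ 30 FPS on the 28nm-class estimate above
NODE_YEARS = [2016, 2020, 2024, 2028]
SAVINGS = (0.35, 0.50)              # assumed per-node power reduction range

pessimistic = optimistic = BASELINE_WATTS
for year in NODE_YEARS:
    pessimistic *= 1 - SAVINGS[0]   # only 35% savings per node
    optimistic *= 1 - SAVINGS[1]    # a full 50% savings per node
    print(f"{year}: roughly {optimistic:.0f}-{pessimistic:.0f} W for 8K @ 30 FPS")
# The range lands around 1.1-1.5kW in 2016, dips under 1kW by 2020, and reaches
# the 300-400W envelope somewhere between 2024 and 2028, depending on which end
# of the savings range you believe.
```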
Meanwhile, other content runs into an even simpler problem: at sufficiently high resolutions, as shown in the chart above, the human eye is no longer capable of perceiving individual pixels. Any increase in pixel density past that point is wasted; you can’t see what you can’t see. Given that those invisible pixels still suck down battery power and create waste heat, there’s a good reason not to push pixel densities in smartphones and tablets beyond what even perfect human vision can resolve.
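The acuity math behind that claim is simple enough to sketch. It assumes the usual rule of thumb that 20/20 vision resolves about one arcminute of detail, and the viewing distances below are illustrative, not measured:

```python
import math

ARCMINUTE = math.radians(1 / 60)   # 20/20 vision resolves roughly one arcminute

def ppi_ceiling(viewing_distance_inches: float) -> float:
    """Pixel density beyond which a single pixel subtends less than one arcminute."""
    pixel_pitch = viewing_distance_inches * math.tan(ARCMINUTE)
    return 1 / pixel_pitch

for device, distance in [("phone", 12), ("tablet", 18), ("living-room TV", 96)]:
    print(f"{device} at {distance} inches: ~{ppi_ceiling(distance):.0f} PPI ceiling")
# Roughly 286, 191, and 36 PPI respectively; a 65-inch 4K panel already sits
# near 68 PPI, comfortably past the couch-distance ceiling.
```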
There are ways to improve visual quality that don’t rely on relentlessly pushing higher resolutions. Better color gradients and dynamic range would both qualify, as would technologies like OLED (if it can ever get off the ground). We’re all for better screens — but resolution is just one way to improve them, and 4K TV panels will qualify as “perfect” for the overwhelming majority of consumers.