It’s difficult to avoid the hype surrounding ‘4K this’ and ‘4K that’, but does the availability of entry-level 4K televisions mean the 4K standard has truly arrived? Technically, ‘4K’ is a buzzword that has given way to the proper term ‘Ultra HD’ since the Consumer Electronics Association defined the resolution as 3,840×2,160. The ‘4K’ shorthand treats full HD, 1920×1080, as a baseline of ‘1K’, making Ultra HD the mathematical next step: four times the pixels of 1K. The real issue, however, is whether the media and supplemental devices have caught up with Ultra HD, or whether buying now risks a repeat of the 3D TV debacle from a few years ago. Technologies like Ultra HD are inevitable, granted, but the same could have been said for the LaserDisc format in the late 1970s. It wasn’t until the advent of DVD in the late ’90s that optical media was viable enough to beat out analog tape completely.
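The ‘four times the pixels’ claim is easy to verify with some quick arithmetic (a minimal sketch; the marketing labels are looser than these exact figures):

```python
# Pixel math behind the "K" shorthand.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "Ultra HD (2160p)": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
ratio = pixels["Ultra HD (2160p)"] / pixels["Full HD (1080p)"]

print(pixels["Full HD (1080p)"])   # 2073600 pixels
print(pixels["Ultra HD (2160p)"])  # 8294400 pixels
print(ratio)                       # 4.0 -- four 1080p screens' worth
```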

Ultra HD Televisions

Imagine a theoretical scenario: it has been a long day at the Soylent Green factory, so a person wants to unwind in front of their new UHD home theater. What can they watch? As Techradar explains, the brand someone purchases directly affects the content viewable at this time.

Sony launched its Video Unlimited 4K service in 2013, which offers more than 70 films and TV shows for rental or purchase. It requires Sony’s 4K Ultra HD Media Player, the FMP-X1 ($350), which comes with a 2TB hard drive and is only compatible with Sony 4K TVs.

The 40-inch Samsung UN40HU6950 will serve as the TV in this example since it is the first sub-$1,000 UHD TV that supports Netflix at native resolution. Speaking of which, Netflix’s 4K service creates a higher pricing tier within the streaming service.

Worst case, imagine that this person’s internet is out and that they need to run to the nearby Redbox equivalent for a movie. At this point, the selection of native 4K Blu-rays is limited to Sony Pictures films, documentaries and the majesty of Fireplace 4K.

This is a real product. Sadly, it isn’t available right now.

Even then, the product reviews suggest that these early releases are upscales from 1K content rather than material actually shot at the UHD standard. This tends to happen at the beginning of a format: lazy ports/’remasters’ compensate for the fact that the industry is still transitioning to the new standard. The risk was perfectly demonstrated by the 3D fad a few years ago, which some companies insist is still relevant despite shifting their attention to UHD as the future. There’s always a chance that the release of the Oculus Rift in the coming months will spark a new fad that makes 4K merely another reason movie ticket prices increase once again. Until there is enough media from multiple studios and sources to bite on, the risk may not be worth the short-term gloating rights.

What about gaming on the ‘new gen’ consoles from Sony, Nintendo and Microsoft? The good news is that the PS4 will work at UHD, but only for viewing photos and watching video. Microsoft has made a lot of claims about ‘THE CLOUD’ processing in the past, but I’m going to go out on a limb and make a bold statement: the Xbox One is as incapable of UHD gaming as the PS4 due to similar hardware. Techradar ran an article discussing the technical limitations of the Xbox One and PS4 even if wunderkind programmers achieved peak optimization with the hardware.

“Both consoles sport HDMI 1.4 outputs, which limits 4K to 30Hz/frames per second. For a UHD gaming experience, this just isn’t good enough.”

This inherent limitation means that, regardless of what programmers try, the currently produced versions of these consoles would be limited to 30fps at best in 4K. The Wii U lacks the RAM necessary to render at UHD, which suggests it would always be locked into 1K mode on a UHD TV. That’s not to say upscaled 1K content on a UHD display won’t look good. The real question is why this person bought a UHD TV if there is basically nothing they can do with it.
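The HDMI 1.4 ceiling quoted above can be sanity-checked with rough bandwidth arithmetic. This is a simplified sketch counting active pixels only; real HDMI signalling also carries blanking intervals and encoding overhead, so the true requirement is even higher:

```python
# Back-of-the-envelope check of why HDMI 1.4 caps 4K at 30fps.
def pixel_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw bits per second for the active picture, in gigabits."""
    return width * height * fps * bits_per_pixel / 1e9

uhd_30 = pixel_rate_gbps(3840, 2160, 30)  # ~6.0 Gbps
uhd_60 = pixel_rate_gbps(3840, 2160, 60)  # ~11.9 Gbps

HDMI_1_4_GBPS = 10.2  # total TMDS throughput of an HDMI 1.4 link

print(uhd_30 < HDMI_1_4_GBPS)  # True  -- 4K at 30fps fits
print(uhd_60 < HDMI_1_4_GBPS)  # False -- 4K at 60fps does not
```

It took HDMI 2.0, with roughly 18 Gbps of throughput, to make 4K at 60Hz practical over a single cable.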

Ultra HD Monitors

The distinction between ‘HD’ and ‘UHD’ becomes a bit complicated in the field of PC monitors thanks to ‘QHD’ or, as it is often called, Quad HD. Quad HD, 2560×1440, is equal to four 720p screens combined, in much the same way UHD relates to 1080p. There are no TV equivalents to 1440p since, like a lumbering brontosaurus, the studios wait for a major standard shift before putting their money into new technology. My personal primary monitor is 1440p.

That said, is there any incentive to go out and buy a UHD monitor this holiday? There are some problems, summed up here: not only are UHD monitors still incredibly expensive, but you also need dongle adapters, or to change out the GPU entirely, to go beyond 30Hz. Keep a note of how quickly these costs add up on top of buying the UHD TV/monitor itself. There are two possible ways to reach the much-desired goal of 60fps on a UHD display with current GPU technology: scale back the settings, or SLI/Crossfire to infinity and beyond. In fairness, some developers are beginning to optimize titles for UHD, such as Sniper Elite 3 with post-release updates, but many games still require either the ‘play at medium’ or the ‘run up your credit card and power bill’ approach. The cost at this point depends on how much a person is willing to spend to prove a point. The R9 295X is considered the pinnacle of 4K-capable cards on the market right now, and even it stumbles.
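The ‘quad’ in Quad HD checks out the same way the UHD math did:

```python
# QHD really is four 720p panels' worth of pixels.
hd_720 = 1280 * 720   # 921,600 pixels
qhd = 2560 * 1440     # 3,686,400 pixels

print(qhd / hd_720)   # 4.0
```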

The R9 295X Running Crysis 3 on Very High

If someone out there is now thinking of a way to rig together four R9 295Xs into a Crossfire monstrosity simply to run Crysis 3 at 4K Very High/60fps, don’t forget these cards have an MSRP of $1,000 each. That’s not even including the multiple PSUs, power and operating costs for this machine. Plus, each additional card adds fewer frames than the last, a case of diminishing returns. The kicker is that, even if someone were to go out tomorrow and buy a completely NEW computer like the joke rig described above for $10,000, there is no guarantee the games would run well or look good, because developers are still mostly working at 1K. It’s akin to rushing out to buy the Xbox 360 HD DVD player and the full range of movies available in 2006. Waiting for the second generation of proper 4K GPUs from Nvidia and AMD should give the gaming industry enough time to catch up to the UHD standard. That could take anywhere from two to three years, depending entirely on how quickly the standard is adopted. If people are slow to adopt, then 1440p and its sister 1600p [16:10] may more aggressively fill the gap. As it stands, ‘Very High’ settings at 60fps are achievable in many games on a single GTX 780 Ti at 1440p, with no sign of the consoles coming close to that performance anytime soon.
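The diminishing-returns point can be sketched numerically. The per-card scaling factors and the 22fps single-card baseline below are illustrative assumptions, not measured benchmarks; real Crossfire scaling varies wildly from game to game:

```python
# Rough cost-versus-framerate sketch for the hypothetical quad-card rig.
CARD_MSRP = 1000  # R9 295X MSRP, per the article
BASE_FPS = 22     # assumed single-card framerate at 4K Very High

# Each extra card contributes less than the last (diminishing returns).
scaling = [1.0, 0.8, 0.5, 0.3]

fps = 0.0
for count, factor in enumerate(scaling, start=1):
    fps += BASE_FPS * factor
    cost = CARD_MSRP * count
    print(f"{count} card(s): ~{fps:.0f} fps for ${cost} in GPUs alone")
```

Even under these generous assumptions, the fourth $1,000 card buys only a handful of extra frames, which is the whole argument against brute-forcing 4K today.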

What about the iMac with Retina 5K Display? It offers a 5120×2880 resolution screen starting at $2,500.

iMac 5K

Mac config

The problem here is in the GPU section: AMD GPUs with an ‘M’ at the front of the model number are mobile parts that favor efficiency over raw power, intended for laptops. Not many gaming benchmarks exist for this iMac on YouTube, but this particular one shows the major drop in performance between the various levels of graphical fidelity in Battlefield 4. Battlefield 4 is a title optimized for AMD GPUs, which is another factor to consider. Additionally, the GPU is part of the logic board with no capacity for upgrading. While the allure of the 5K iMac may be strong for some, realize that the entire unit would need to be replaced to change out the graphics hardware. In a time of uncertainty like this, there is no sense in suggesting a PC user jump ship for an all-in-one intended primarily for photographers and productivity users.

The Unspoken Format Wars

Does it have a curved display or a flat display? If the industry can’t decide on a standard shape for televisions moving forward, is it really wise to spend thousands on a top-of-the-line UHD television for the small amount of content currently available while waiting for the studio system to catch up? The unspoken format war is the changing way people consume the types of media that would be expected to become 4K content for these televisions. Phones, tablets and computers are phasing out the television wherever possible for previously recorded media, thanks to services like Netflix and Hulu and alternative entertainment on sites like YouTube. What sounds more appealing: watching a movie via Netflix on a tablet or computer on a whim, or firing up the UHD behemoth, connecting to the 4K server and risking timeouts if and when the local ISP gets into a fight with Netflix again? The path of least resistance for the consumer often determines the winner of a format war or, in the case of Blu-ray, a lot of money does. The rate of technological change, especially in consumer electronics, has accelerated greatly in the last five to six years. It’s reasonable to gamble that technology may change again within the next two to three years, before UHD can take hold.


What is there to lose in waiting? The ‘Limited Edition Version’ of Fireplace 4K?

Matt M

I'm a contributor to the tech and gaming sections here on TechRaptor. I hold a B.A. in English from the University of California, Davis. It took me this long to realize just how much of a buzzkill my 'bio' makes me come across as. My hobbies include accumulating more games on Steam than I'll ever have time to play and discussing everything apart from video games on video game forums. Feel free to add other things expected in a corporate newsletter blurb. I like long walks on the beach to escape from my video game backlog.