4K TVs should have saved 3D – here's what went wrong

Kids wearing 3D glasses
(Image credit: Shutterstock)

Whatever happened to 3D TVs? One moment they were being touted as the future of television, and for a few years, every major TV manufacturer was offering 3D as standard on their high-end panels. Then they disappeared completely – just as 4K TVs were being rolled out worldwide.

But why did 4K supplant 3D in what should have been the latter’s heyday? Was this mere coincidence? Was it a matter of cost? Was it a lack of consumer interest in 3D? Or is 4K just easier to create and more convenient to watch?

The first mainstream 3D TVs were unveiled at CES 2010, with most manufacturers displaying 3D TV demos of various types at the international expo. Later that year the first 3D TVs started hitting shelves. 

As with most new technology, these cutting-edge panels were expensive, with 40-inch models starting at roughly $2,000 / £1,350 / AU$2,600. Prices soon fell, and by 2012 a Samsung 50-inch 1080p Full HD 3D TV could be picked up for around half that amount.

As expected, this increased sales, and by 2012 3D TVs made up 25.7% of global TV sales, equating to approximately 25 million units shipped. Most manufacturers and pundits expected this trend to continue, with some predicting that as many as 180 million 3D TVs would be sold in 2019 (via HDTVTest).

Suffice it to say, these predictions were wrong. Neither market share nor units sold expanded greatly beyond 2012’s levels, and sales dropped sharply from 2015 onwards. This led Samsung to drop support for 3D TVs in 2016, with all other major manufacturers doing likewise in 2017. Read on to find out why.

The rise of 3D

So, how does 3D work anyway? Essentially, 3D is an optical illusion that aims to trick the brain into perceiving a flat two-dimensional (2D) image as a three-dimensional (3D) image with depth. It achieves this via stereoscopy – feeding the left and right eyes slightly offset versions of the same image. Our brains then process these two video feeds and calculate the differences between them, which we perceive as stereoscopic vision with depth perception.
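For the technically curious, the geometry behind that illusion is simple enough to sketch in a few lines of Python. This is a rough illustration only, using the standard similar-triangles approximation for stereoscopic depth; the viewing-distance and eye-separation figures below are assumptions made for the example, not values from any TV spec.

```python
def perceived_depth(parallax_mm: float,
                    viewing_distance_mm: float = 2500.0,  # ~2.5m sofa distance (assumed)
                    eye_separation_mm: float = 63.0) -> float:
    """Estimate how far away a stereoscopic point appears to the viewer.

    parallax_mm is the horizontal offset between the left-eye and right-eye
    copies of the same point on the screen:
      0   -> the point appears on the screen plane
      > 0 -> behind the screen (approaching infinity near the eye separation)
      < 0 -> in front of the screen (the 'pop-out' effect)
    Uses the similar-triangles approximation Z = D * e / (e - p).
    """
    e, p, d = eye_separation_mm, parallax_mm, viewing_distance_mm
    if p >= e:
        return float("inf")  # rays diverge: the brain can't fuse a depth
    return d * e / (e - p)

for p in (-30.0, 0.0, 30.0):
    print(f"parallax {p:+.0f} mm -> perceived at {perceived_depth(p):.0f} mm")
```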

The 2009 3D boom was enabled by the convergence of several technologies. In cinemas, that technology was digital cinematography, which made capturing, reproducing and displaying 3D content far easier than with film. Its 3D debut came with James Cameron’s Avatar. The film’s financial success and superb use of 3D paved the way for the 3D films that followed, and it no doubt convinced the major TV manufacturers, content creators and broadcasters that 3D TV would be equally popular in the home.

Avatar

James Cameron's Avatar was the flagship 3D movie that truly started the craze (Image credit: 20th Century Fox)

Another enabling technology was the sunglasses-style polarised 3D glasses. Unlike the older anaglyph (red and green) glasses, polarised glasses did not distort the film’s colour space. Audiences could now enjoy 3D movies without compromising image quality.

Avatar worked well because it was shot natively in 3D from the start. Unfortunately, most 3D movies that came after were shot in 2D and then converted to 3D in post-production, which produced inconsistent results. Perhaps this, along with the increased cost of 3D tickets, accounted for the falling returns from 3D movies, which in turn saw studios make fewer 3D films over time. But if 3D cinema was being phased out gradually, why was support for 3D TV ended so abruptly, and seemingly all at once?

What happened to 3D TVs?

At their core, 3D TVs are the same as 2D TVs; they simply have the extra processing power to display two Full HD 1080p images at once when in 3D mode. In 2D mode they operate the same as any other comparable panel. The best TVs from 2010 onwards had this processing power anyway, since they needed it to run their smart TV features and image-altering effects such as motion smoothing.

Soon 3D was everywhere. Almost all major blockbuster movies had a 3D variant in the cinemas, and almost all high-end TVs came with 3D as standard. There were two main competing technologies in the 3D TV market: active and passive, each with their own pros and cons.

Active 3D was favoured by Samsung and Sony. Here, the TV alternated between the left and right images about 120 times per second. Battery-powered ‘Active Shutter’ glasses were used to ensure each eye received only one set of images. The ‘lenses’ were miniature LCD screens that alternately turned opaque 120 times per second, synced with the TV via Bluetooth. Because the active lenses flickered so rapidly, the illusion of depth was maintained.

The major advantage Active Shutter 3D TVs had over passive ‘cinema’ 3D TVs was that each eye received a Full HD 1080p image, so the resolutions of their 3D and 2D modes were the same.
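To make that time-multiplexing idea concrete, here’s a toy Python sketch of the shutter schedule. The roughly 120Hz alternation rate comes from the description above; the function name and timing model are purely illustrative.

```python
# Toy model of Active Shutter time-multiplexing: the panel alternates
# left/right frames at ~120Hz and each lens goes opaque in sync, so each
# eye effectively sees a Full HD stream at ~60 frames per second.
PANEL_HZ = 120

def shutter_schedule(n_frames):
    """Yield (time_ms, eye_shown, left_lens_open, right_lens_open) per panel frame."""
    frame_ms = 1000.0 / PANEL_HZ  # ~8.3 ms per frame
    for i in range(n_frames):
        eye = "L" if i % 2 == 0 else "R"
        yield i * frame_ms, eye, eye == "L", eye == "R"

for t, eye, left_open, right_open in shutter_schedule(4):
    print(f"t={t:5.2f}ms  panel shows {eye}  "
          f"left lens open={left_open}  right lens open={right_open}")
```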

Active Shutter glasses had several drawbacks, however. For a start, they were often locked to specific manufacturers – for example, a pair of Samsung Active Shutter glasses wouldn’t work with a Sony Active 3D TV, and vice versa. Secondly, they were expensive at around $100 / £100 / AU$200 per pair and battery-powered, which made them somewhat heavy and uncomfortable to wear. Worse yet, the battery life on early models was limited to a few hours, barely long enough to watch a 3D movie in one sitting. (Later models were somewhat improved, though, sporting longer battery life, reduced weight and lower cost.)

Sadly, all Active Shutter glasses suffered from both a flicker effect and ‘crosstalk’ – where each eye catches a faint ghost of the image intended for the other – both of which were distracting. Crosstalk became more pronounced the stronger the 3D effect grew. The only remedy was to reduce the strength of the 3D effect or turn it off completely.

Passive 3D used simple polarised glasses, much like those handed out in cinemas. This was LG’s preferred option, which it marketed as ‘Cinema 3D’. The glasses were lighter, more comfortable to wear, did not cause flicker, allegedly created less crosstalk, worked with all passive 3D TVs, didn’t require syncing or a built-in power source, and were far less expensive – a pair cost as little as a cup of coffee.

Passive glasses were the better option in every way but one – they cut the vertical resolution in half. On a Full HD 1080p TV this left just 540 lines per eye, roughly the same as Standard Definition (SD) TV. It also created a very noticeable ‘screen door effect’ that degraded image quality and grew more distracting as screen sizes increased.
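The arithmetic here is easy to verify. In a passive set, alternate rows of the panel are polarised in opposite directions, so splitting a frame into per-eye views amounts to taking every other line. A minimal NumPy sketch, with the 4K case included for comparison with what comes later – the frames are dummy arrays, for illustration only:

```python
import numpy as np

def passive_3d_views(frame):
    """Split a row-interleaved passive-3D frame into its two per-eye views."""
    left = frame[0::2]   # even rows (one polarisation) -> left eye
    right = frame[1::2]  # odd rows (opposite polarisation) -> right eye
    return left, right

for height, width, label in ((1080, 1920, "Full HD"), (2160, 3840, "4K UHD")):
    frame = np.zeros((height, width, 3), dtype=np.uint8)  # dummy black frame
    left, right = passive_3d_views(frame)
    print(f"{label}: {width}x{height} panel -> {left.shape[1]}x{left.shape[0]} per eye")
    # Full HD: 1920x1080 panel -> 1920x540 per eye (roughly SD)
    # 4K UHD:  3840x2160 panel -> 3840x1080 per eye (still Full HD)
```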

Movie night with 3D glasses

(Image credit: Shutterstock)

Sadly, there was no ideal 3D solution, and even at its best 3D TV suffered from some inherent problems. Two common complaints were eye strain and eye fatigue, which made prolonged viewing uncomfortable for some. Stronger 3D effects and rapid on-screen changes tended to make these worse.

Because 3D relies on stereoscopic vision, some people are unable to perceive its effects at all – including those with only one functional eye or with a lazy eye. Due to the eye strain it can cause, 3D content is also not recommended for children under six, and even pre-teens are advised to watch it only in moderation. This of course reduces 3D TV’s potential audience.

Another issue with 3D TVs was that they were, well, a hassle. To watch 3D content you would need to find and put on your 3D glasses. If they were Active Shutter glasses you would need to turn them on and sync them via Bluetooth as well, all the while hoping you had remembered to charge them. You would lose the ability to do other things while watching TV, too, like using a smartphone or pottering around the room, as you’d need to take off the 3D glasses to interact with your surroundings effectively.

For some, this hassle relegated 3D to being a movie-night treat (along with the surround sound and popcorn), not something to be used regularly.

3D TV always had great potential, though – and if it had been introduced a little later in the history of TV development, closer to the arrival of 4K than HD, things might have turned out differently.

Movie night with 3D glasses

(Image credit: Shutterstock)

Where do 4K TVs come into this?

A 4K TV is simply a TV with a resolution of 3840 x 2160 pixels (also known as Ultra HD). It packs in twice as many horizontal and vertical lines as a 1080p ‘Full HD’ 1920 x 1080 screen, resulting in four times as many pixels for the same screen size and a clearer, sharper image.
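That ‘four times’ figure falls straight out of the arithmetic – a two-line check:

```python
full_hd = 1920 * 1080  # 2,073,600 pixels
uhd_4k = 3840 * 2160   # 8,294,400 pixels
print(uhd_4k / full_hd)  # 4.0 -> four times as many pixels
```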

4K was enabled by several technologies, including high-speed HDMI 2.0, 4K Blu-ray discs, and rising internet speeds that allowed for higher-resolution streaming. These were essential because native 4K content has far larger file sizes than Full HD.

The advantage 4K has over regular HD is that images can be brighter and sharper, especially when combined with other technologies such as OLED and HDR. This is most noticeable on larger screens. A 40-inch 1080p screen may look razor sharp at normal viewing distances, but a 65-inch one may look slightly fuzzy as the pixels become spaced further apart, whereas a 65-inch 4K TV retains its razor-like clarity.

There have been some downsides to 4K, however. Initially, there was a dearth of native 4K content, which meant viewers had to rely on upscaled 1080p content. Some TVs handled this upscaling well, while on others it was shoddy and arguably looked worse than native 1080p. Another teething issue was the lack of sufficiently high-speed internet, which meant streamed 4K content was unreliable and prone to buffering – while offline 4K Blu-ray players could be costly solutions. Indeed, when 4K was first unveiled some pundits predicted these problems would hinder it, and that the only use for so many pixels would be to perfect 3D TV.

But passive 3D at 4K resolution could have been the ideal home 3D TV solution. With 2160 vertical lines to work with, even when halved to 1080 lines per eye the 3D image would still look fantastically sharp, whilst retaining the advantages of passive 3D TV. LG created a few such sets, such as the LG OLED65E6V.

Unfortunately, few people got to experience this, as most manufacturers ended support for 3D TV when phasing in support for 4K. So why was 3D suddenly dropped like a hot potato?

LG OLED65E6

2016's LG OLED E6 combined 3D images with an OLED panel with excellent results (Image credit: LG)

It all comes down to how swiftly the issues around 4K were resolved. Live broadcasts such as BT Sport Ultra HD and streaming services such as Netflix Ultra HD began offering 4K content, while internet service providers (ISPs) rolled out superfast fibre broadband to cope with 4K’s demands.

The paucity of physical 4K content was solved by Blu-ray two-packs, which included both standard and 4K Blu-ray discs. 4K Blu-ray players also found their way into many people’s homes by default, as video game consoles such as the last-gen Xbox One S and the current-gen PS5 and Xbox Series X all feature built-in 4K Blu-ray players as standard. With such easy access to 4K content, and most high-end TVs featuring 4K screens to watch it on, it was only a matter of time until 4K TVs became the new standard.

But why did 4K supplant 3D TVs instead of supplementing them?

Filming native 3D content requires complex and expensive camera setups, and 2D to 3D post-production conversion is time-consuming and inconsistent. Filming native 4K is now pretty much standard, and 4K content can usually be downscaled to 1080p quickly and easily. This makes 4K a more attractive proposition for content creators.
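As a sketch of how painless that downscaling can be, here’s a minimal example using the Pillow imaging library in Python. The filenames are placeholders, and a real mastering pipeline would of course be far more sophisticated:

```python
from PIL import Image  # Pillow imaging library

# Downscale a single 4K frame to Full HD. Because 3840x2160 is exactly
# double 1920x1080 in each dimension, the resample maps cleanly with no
# awkward scaling ratio. "frame_4k.png" is a placeholder filename.
frame = Image.open("frame_4k.png")                   # assumed 3840x2160 source
full_hd = frame.resize((1920, 1080), Image.LANCZOS)  # high-quality resample
full_hd.save("frame_1080p.png")
```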

4K is also just far more convenient than 3D for many viewers. Watching 4K requires no more effort than watching Full HD – you simply turn on, sit down, and watch, with no additional accessories or face-wear involved.

Now, in 4K’s heyday, I can’t help but think that combining 4K with passive 3D could have delivered the ultimate home viewing experience. But who knows – perhaps 3D TV will make a comeback in the future by being incorporated into some other technology? The arrival of 8K TVs has shown that consumer appetite for ever-higher resolutions has not waned, and if 4K could have perfected passive 3D TV, just imagine what 8K could do for it.