The arrival of the hotly anticipated Season 6 of Game of Thrones cements its status as the most pirated TV show of them all. Meanwhile, VR is one of the main topics at this year’s NAB, and there’s mounting concern about how much power all the new 4K HDR TV sets are going to consume.
Game of Thrones Season 6 Triggers Piracy Spike (TorrentFreak)
Unsurprisingly, despite all the various anti-piracy measures enacted by HBO, such as simultaneous transmissions, a clampdown on review copies and press previews, and even a free weekend of viewing in the US, the first episode of Game of Thrones Season 6 debuted and almost instantaneously set off a giant wave of illegal downloads.
According to data compiled by TorrentFreak, S6E01 chalked up a million downloads in 12 hours or so, and the morning after its first US transmission (and in advance of a primetime showing in many territories) 200,000 users were sharing copies of the episode on BitTorrent.
It’s not quite up to the levels reached by the final episode of Season 5, which saw 1.5m downloads in the first 8 hours and the largest torrent swarm ever reported, but the numbers are, of course, still rising as the days wear on, and they tend to peak as the series progresses anyway.
They also only cover BitTorrent downloads. Another TorrentFreak story shares data indicating that visits to torrent sites are actually declining as direct download and streaming sites take over, with the net result that overall demand is stable, albeit running at an astonishing 140bn visits per year. All of which means, in turn, that GoT piracy could be on the up despite everything.
Also increasing is the share of HD downloads. A few years ago, 720p and 1080p video accounted for only around 10% of torrent requests; now, for the first time, HD is edging closer to 50% of traffic.
NAB 2016: Turning VR into Actual R (BBC)
At Viaccess-Orca we, of course, had our own groundbreaking world’s first VR demo at NAB, but, while pioneering, we were a long way from being the only ones. Virtual Reality is now a movement that is progressing very swiftly along the path from promise to delivery.
Probably the biggest announcement of the lot was YouTube’s introduction of live streamed 360º video on its service, with some select concerts from California's Coachella festival being amongst the first to hit the servers. Essentially, it gave anyone owning a compatible 360º camera — and there were a lot of them on show at NAB, including models and rigs from such industry giants as GoPro and Kodak — and an uplink speed of 10-20Mbps the ability to broadcast 360º content live onto the web.
This is a bit of a land grab from Facebook, which can also do 360º video and live streams but not both at the same time. And, importantly, it stretches what is considered to be a minimum spec for the 360º format with its support for binaural audio and thus ‘3D’ sound.
Away from the live arena there were plenty of new tools to edit the material as well, including products from Adobe and SGO.
As we’ve written before, there is still a lot to play for, and even Facebook (and now Oculus) owner Mark Zuckerberg estimates that we could be as much as a decade away from the mass market. But with Sony aiming to start offering in-store trials of its PlayStation VR unit as early as June (and planning to log 500,000 of them by year end), things are moving apace.
The Hidden Power Cost of 4K HDR TVs (NRDC)
A report from the Natural Resources Defense Council has shone an unexpected light on an unintended consequence of the move to 4K HDR: dramatically increased power consumption.
As the Council’s Noah Horowitz explains in an interview with TV Tech Europe, the NRDC tested the energy use of UHD TVs playing current content in both HD and 4K before moving on to HDR (as well as some of the more standard energy-consumption tests we associate with TVs, such as standby power use and the resume times of smart TVs).
There is a huge amount of very granular detail in the report, whose title, The Big Picture: Ultra High-Definition Televisions Could Add $1bn to Viewers’ Annual Electric Bills, rather gives away one of its most startling conclusions. This is based on the Council’s test data, namely that 4K TVs consume an average of 30% more power than HD sets, primarily because people are buying larger sets, and their backlights need to be brighter to deliver comparable or higher luminance and more vivid colours across a larger number of pixels.
Then there’s the impact of HDR to measure. This is still a bit of an unknown, but in the one set the Council tested it increased power consumption by a massive 47%. As yet, the standard federal tests for energy efficiency don’t include HDR energy use.
“Americans’ residential energy bills could rise by more than $1 billion per year if all televisions larger than 36 inches transition to 4K at today’s average efficiency,” says the report. “This increase in energy use is equivalent to three times the annual electricity use of all the homes in San Francisco and represents an extra 4 million metric tons of CO2 emissions per year.”
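As a rough sanity check, the report’s headline figure can be reproduced with back-of-envelope arithmetic. The only input below taken from the report is the ~30% extra consumption; the set count, baseline annual usage, and electricity price are illustrative assumptions of ours, not the report’s own inputs.

```python
# Back-of-envelope sketch of the "$1 billion per year" claim.
# Only UHD_EXTRA comes from the NRDC report; everything else is an
# illustrative assumption chosen for round numbers.

HD_TV_ANNUAL_KWH = 250        # assumed annual consumption of an HD set, kWh
UHD_EXTRA = 0.30              # report: 4K sets draw ~30% more power than HD
ELECTRICITY_PRICE = 0.13      # assumed average US retail price, $/kWh
LARGE_TVS = 100_000_000       # assumed number of US TVs >36" moving to 4K

extra_kwh_per_set = HD_TV_ANNUAL_KWH * UHD_EXTRA      # ~75 kWh per set
extra_cost_total = extra_kwh_per_set * ELECTRICITY_PRICE * LARGE_TVS

print(f"Extra annual cost: ${extra_cost_total / 1e9:.2f} billion")
```

With these assumptions the total lands in the region of $1bn a year, which is consistent with the report’s claim; the real figure obviously depends on actual set counts, usage hours, and local tariffs.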
It recommends a range of measures to reduce energy consumption: enabling automatic brightness control by default; using local dimming to shut off portions of the backlight where no light is needed, which also improves image quality; using more efficient technologies, including quantum dots, high-performance LEDs, improved optical films, and more transmissive LCD panels; and designing the CPUs of smart TVs to boot up quickly and automatically shut off unnecessary features when not in use.
All this can be done; it’s just a matter of will.
“The good news is that the technology already exists to prevent much of this increased energy use and related impacts, as some of the most efficient 4K TV models on the market today use little to no more power than similar-size HD TVs,” concludes the Council.