The issue of privacy and security is one of the 21st century's biggest dilemmas: how do we balance the technological advances that give us access to a wealth of personal information with the basic right to privacy, and often with the need to protect ourselves from threats such as global terrorism?
I recently had the opportunity to hear about these issues from Isabelle Falque-Pierrotin, who chairs the group of 29 European data protection regulators as well as France's data protection authority, the CNIL.
Ms Falque-Pierrotin compared today's security-versus-privacy debate with the situation when the CNIL was established 35 years ago. Back then, the threat came from public authorities and government databases. A major concern was how to secure the identification number of every French citizen, since at the time this number enabled the sharing of various aspects of someone's personal information. The system, therefore, was built to require "a priori" approval for information sharing.
Fast forward to 2015 and to the technological advances we mostly enjoy: broadband, cloud technologies and big data. Today, you don't log on to the Internet – you live in it: anything you do in real life leaves digital footprints. The threat, therefore, comes not only from public authorities, but also from large companies – private and public – who own and share large databases. The way to regulate privacy, according to Ms Falque-Pierrotin, should hence be based on the integrity and accountability of the different players, with "a posteriori" control that imposes punitive measures, such as high fines, in case of infringement of privacy laws.
What does it mean for content services and the TV market?
There is a lot of value for end users when we, as TV operators, collect data to enable a personalized experience. But this advantage comes at a price – personal data collection – which should be approved by users a priori.
What could be the guidelines to make this service acceptable?
- Users should know exactly what data is collected by operators. We were recently reminded of this requirement when Samsung warned viewers that the voice activation feature on their Smart TVs might be collecting personal information – and sharing it.
- The use of the data should be clear: is it only used to improve the video service and personalization, or is it also used for targeted advertising? If it's sent to the user's employer to measure TV consumption, for example, that's a whole different story!
- Users should be able to access their personal data and to request its deletion.
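The three guidelines above can be sketched as a minimal per-user data store (a hypothetical illustration, not any real operator's API – all names here are invented): each record carries its category and declared purpose, collection is gated on a priori consent, and the user can inspect or delete everything on request.

```python
from dataclasses import dataclass, field

@dataclass
class DataRecord:
    category: str   # e.g. "viewing-history", "voice-commands"
    purpose: str    # e.g. "personalization", "targeted-advertising"
    value: object

@dataclass
class UserDataStore:
    """Hypothetical per-user store implementing the three guidelines."""
    consented_purposes: set = field(default_factory=set)  # a priori opt-in
    records: list = field(default_factory=list)

    def collect(self, record: DataRecord) -> bool:
        # Guideline 2: only collect for a purpose the user has approved.
        if record.purpose not in self.consented_purposes:
            return False
        self.records.append(record)
        return True

    def access(self) -> list:
        # Guidelines 1 and 3: the user can see exactly what is held, and why.
        return [(r.category, r.purpose) for r in self.records]

    def delete_all(self) -> None:
        # Guideline 3: deletion on request.
        self.records.clear()

# The user opted in to personalization only, so advertising data is refused.
store = UserDataStore(consented_purposes={"personalization"})
store.collect(DataRecord("viewing-history", "personalization", ["episode 1"]))
store.collect(DataRecord("viewing-history", "targeted-advertising", ["episode 1"]))
print(store.access())  # only the personalization record was stored
```

The point of the sketch is that consent is checked at collection time, not patched on afterwards – which is exactly what "privacy in the product design process" means in practice.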
If you follow these guidelines, your next step is to make your company "privacy-friendly".
How can a company be "privacy-friendly"? The recommendations are to train employees and raise awareness of privacy across the company; appoint a Chief Privacy Officer; and finally, include privacy in the product design process.
In her discussion, Ms Falque-Pierrotin mentioned another risk, one amplified by personalization algorithms. Obviously, it cannot be compared with the threat of cyber-attacks targeting large databases, but it is very relevant to our market: in some cases, algorithms can create "bubbles".
Think about a content recommendation algorithm that identifies that user X likes comedies but fails to spot that he also likes other genres, such as action movies. The recommendation engine will then offer only comedies, keeping the user inside a "bubble" of comedies. In other words, the challenge when using these algorithms is the lack of serendipity. As users, we expect serendipity – entertainment is also made of surprises! Any solution for content discovery should include enough random choices to preserve serendipity and prevent the risk of "bubbles".
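One simple way to inject those random choices, sketched here with an invented catalogue and an epsilon-greedy mix (this is an assumption about how one might do it, not a description of any specific product), is to fill most slots from the user's top genre but reserve a fixed fraction for titles drawn from other genres:

```python
import random

# Hypothetical catalogue, grouped by genre.
CATALOGUE = {
    "comedy": ["Comedy A", "Comedy B", "Comedy C", "Comedy D"],
    "action": ["Action A", "Action B", "Action C"],
    "drama":  ["Drama A", "Drama B"],
}

def recommend(top_genre: str, n: int, epsilon: float = 0.25, rng=random) -> list:
    """Return n titles: roughly (1 - epsilon) of them from the user's top
    genre, the rest drawn from other genres to preserve serendipity."""
    other_titles = [t for g, ts in CATALOGUE.items() if g != top_genre for t in ts]
    picks = []
    for _ in range(n):
        if other_titles and rng.random() < epsilon:
            picks.append(rng.choice(other_titles))        # serendipitous pick
        else:
            picks.append(rng.choice(CATALOGUE[top_genre]))  # "best" pick
    return picks

recs = recommend("comedy", n=8)
# Over many calls, about a quarter of the slots fall outside "comedy",
# so the user still stumbles on action or drama titles.
```

Tuning `epsilon` is the trade-off this section describes: too low and the bubble closes, too high and the recommendations stop feeling personal.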
Privacy is, and will remain, a growing concern for users and operators alike. If you want the audience to opt in, you need to address these concerns: plan your privacy strategy well ahead, and make sure to communicate that policy clearly to your users.