Increasingly, the information we need and use every day is stored, accessed and controlled online.

We have become accustomed to the convenience and efficiency of being able to access significant swathes of information about ourselves, our business and the world at the tap of a button.

Many of us accept that such convenience comes at a cost, with platforms such as Facebook, Twitter and YouTube being funded primarily by advertisers.

We understand that, for the service to remain free, we must endure some form of targeted advertising, assuming it is simply old-fashioned TV or bus stop advertising in a different guise.

Every day we volunteer information about ourselves, whether by completing an online enquiry form, subscribing to social media platforms or simply by opening an app on our phone.

But do we really know what happens to the information we share? Do we actually know what information we may be inadvertently revealing to others when we log on?

It appears not – at least according to a report by the UK Joint Committee on Human Rights: “The Right to Privacy (Article 8) and the Digital Revolution” (HC 122/HL Paper 14) (the Report).

The Report considered how private companies’ use of personal data impacted on human rights. But its findings also revealed that “vast numbers of people are not fully aware of how their data is being used” and that the consent model used as the default legal basis for processing personal data under the General Data Protection Regulation (GDPR) is “broken”.

Companies, it seems from the Report, are providing the information they are required to give under the GDPR via privacy notices that are overly complicated, inaccurate or vague.

The question posed by the Report is whether the consent we provide by ticking or checking a box is even valid under the GDPR if we do not appreciate, from the information we are provided, what is being done with our data.

Transparency

Organisations are required to process personal data only if they meet their obligations under the GDPR, which include the obligation to provide data subjects with a fair processing notice under Article 13 or Article 14.

The most common form of notice is the privacy statement, which is usually available on an organisation’s website or via a tab in a mobile app.

The fair processing notice is designed to ensure that the data subject is adequately informed of the way in which an organisation uses their data.

Organisations must also comply with the privacy principles, the first of which is the requirement that personal data must be processed lawfully, fairly and in a transparent manner.

Fairness is not defined, but the principle is understood to refer to the effect processing has on an individual’s rights and freedoms. The ICO describes fairness as handling “data in ways that people would reasonably expect and not use it in ways that have unjustified adverse effects on them.”

The principle of transparency requires that any information relating to processing (i.e. the privacy notice) be easily accessible and easy to understand, written in clear and plain language.

It appears from the Report’s findings that organisations are not being transparent in their privacy notices about how they will share users’ data, nor are they providing the notices in an easy-to-understand fashion.

The Report cited research by Doteveryone, which found that 62 per cent of the people surveyed were unaware that social media companies made money by selling data to third parties, and 45 per cent were unaware that information they enter on websites and social media can help target advertisements (page 20 of the Report).

A European Commission press release published on 13 June 2019 revealed the results of a Eurobarometer survey on data protection: whilst Europeans were “relatively aware” of the GDPR and their rights, of the 60 per cent of those surveyed who read their privacy statements, only 13 per cent read them fully, the statements being too long or too difficult to understand.

Similarly, research in the ICO’s annual report 2018/2019 found that only one in three people (34 per cent) have high trust and confidence in companies and organisations storing and using their personal information.

Unexpected sharing of data

The Report considered the trend for businesses to share data through data brokers without the data subject’s knowledge, despite the data subject having consented only to the use of their data in return for a service from a single business.

A study conducted by Dr Binns of the University of Oxford, assessing one million Android apps, found that nine out of ten apps sent data back to Google and four out of ten sent data back to Facebook (page 21 of the Report).

The practice of combining and aggregating data raised concerns about the compilation of detailed profiles of individuals without their knowledge.

Most people will not be aware that information from the GPS in mobile phones or vehicles, search histories, purchases, social media posts and cookies can, when combined, create a comprehensive profile of an individual.

The Report highlighted that these profiles are being shared within “an eco-system” comprising thousands of organisations, all competing for digital advertising space through Real Time Bidding (page 22).

Real Time Bidding is a form of advertising auction: companies bid for the chance to advertise their product to a particular demographic, eg a 21-year-old male living in Reading who works in retail and likes to buy shoes.

This practice of combining individuals’ information with other data sets and selling it on to data brokers is not being explained in privacy notices.

Consumers may well not expect that their buying history with a particular website will be packaged and sold together with their IP address, browser history, and political or religious views as expressed on social media.

The risk to privacy is heightened when special category data is being processed, eg internet searches on medical conditions or online purchases of medical/pharmaceutical products. Profiling of children and vulnerable people who are not as capable of understanding how their data is being used was also raised as a key concern.

Is the consent model broken?

In order for private companies to lawfully process personal data, they may rely on a number of bases under Article 6 of the GDPR, one of which is the consent of the data subject.

For special category data, which includes sensitive information such as health, sexual orientation, religion, political views and biometric information, an exemption must also be found under Article 9. One exemption that could apply is the “explicit consent” of the data subject.

Another common basis under Article 6 is processing for the purpose of the legitimate interests of the organisation, provided that such interests are not overridden by the interests or fundamental rights and freedoms of the data subject. Often the organisation’s legitimate interests, such as sharing data with data brokers in the example above, will be overridden by the “rights and freedoms” of the data subject, preventing the processing entirely.

Because of difficulties with relying on legitimate interests and the fact that an individual’s consent can also be used for special category data, user consent is invariably being used as the default legal basis.

The challenge when relying on consent is that, to be lawful, consent must be a “freely given, specific, informed and unambiguous indication” of the data subject’s wishes.

If a company has failed to be transparent in its privacy notice about the proposed purposes and uses of data, as required, consent can hardly be classed as informed. However, the length and complexity that may be required to adequately explain how the data will be used are creating an environment in which users either do not read or do not understand the information provided.

Consent must also be freely given, which means that consent should be capable of being withdrawn at any time. The Report found that, whilst Google allows users to opt out of personalised ads and revoke consent, the process a user must go through to change privacy settings to a more privacy-friendly option was considerably longer (page 26 of the Report).

What can be done?

Privacy proponents have long called for stricter controls and specific regulation to govern the big tech firms’ use of personal data. The Government’s Online Harms White Paper, released in April 2019, outlined plans for a new regulatory framework and oversight regime for internet companies. Its proposals included establishing a statutory duty of care, enforced by an independent regulator, and ensuring that companies’ terms and conditions are “sufficiently clear and accessible”, including to children and other vulnerable users.

The Report made a number of recommendations with the central theme of changing the current privacy framework.

The onus, the Committee argued, should not rest on the individual through reliance on their consent. Instead, the government should introduce robust regulatory standards for internet companies and rigorously enforce them (page 39 of the Report). The Committee also recommended that the government explore a simpler way for individuals to see what data is being shared about them and with whom, and to prevent some or all of their data from being shared at all.

Whilst it is understandable that the government sees the solution as increased regulation backed by stringent penalties, the systemic change needed to promote transparency will require the backing of the big tech companies and a business case for it. Ultimately, it will be consumer demand for greater respect of their privacy rights that motivates change.

Further reading

UK Joint Committee on Human Rights: “The Right to Privacy (Article 8) and the Digital Revolution” https://bit.ly/2QSMRtV

ICO: Principle (a): Lawfulness, fairness and transparency https://bit.ly/ICOprinciplea

ICO: Annual report 2018/2019 https://bit.ly/ICOannual

European Commission: Data Protection Regulation one year on https://bit.ly/2Uniazg

Chrysilla de Vere is the Commercial partner at Clarkslegal, specialising in privacy and data protection. Email cdevere@clarkslegal.com. Twitter @Clarkslegal.

Photo by geralt on Needpix.com.