All posts in “GDPR”

UK watchdog wants disclosure rules for political ads on social media

The UK’s data protection agency will push for increased transparency into how personal data flows between digital platforms to ensure people being targeted for political advertising are able to understand why and how it is happening.

Information commissioner Elizabeth Denham said visibility into ad targeting systems is needed so that people can exercise their rights — such as withdrawing consent to their personal data being processed, should they wish.

“Data protection is not a back-room, back-office issue anymore,” she said yesterday. “It is right at the centre of these debates about our democracy, the impact of social media on our lives and the need for these companies to step up and take their responsibilities seriously.”

“What I am going to suggest is that there needs to be transparency for the people who are receiving that message, so they can understand how their data was matched up and used to be the audience for the receipt of that message. That is where people are asking for more transparency,” she added.

The commissioner was giving her thoughts on how social media platforms should be regulated in an age of dis(and mis)information during an evidence session in front of a UK parliamentary committee that’s investigating fake news and the changing role of digital advertising.

Her office (the ICO) is preparing its own report this spring — likely to be published in May, she said — which will lay out its recommendations for government.

“We want more people to participate in our democratic life and democratic institutions, and social media is an important part of that, but we also do not want social media to be a chill in what needs to be the commons, what needs to be available for public debate,” she said.

“We need information that is transparent, otherwise we will push people into little filter bubbles, where they have no idea about what other people are saying and what the other side of the campaign is saying. We want to make sure that social media is used well.

“It has changed dramatically since 2008. The Obama campaign was the first time that there was a lot of use of data analytics and social media in campaigning. It is a good thing, but it needs to be made more transparent, and we need to control and regulate how political campaigning is happening on social media, and the platforms need to do more.”

Last fall UK prime minister Theresa May publicly accused Russia of weaponizing online information in an attempt to skew democratic processes in the West.

And in January the government announced it would set up a dedicated national security unit to combat state-led disinformation campaigns.

Last month May also ordered a review of the law around social media platforms, as well as announcing a code of conduct aimed at cracking down on extremist and abusive content — another Internet policy she’s prioritized.

So regulating online content has already been pushed to the top of the political agenda in the UK — as it increasingly is elsewhere in Europe.

It’s not yet clear, though, how the UK government will seek to regulate social media platforms to control political advertising.

Denham’s suggestion to the committee was for a code of conduct.

“I think the use of social media in political campaigns, referendums, elections and so on may have got ahead of where the law is,” she argued. “I think it might be time for a code of conduct so that everybody is on a level playing field and knows what the rules are.

“I think there are some politicians, some MPs, who are concerned about the use of these new tools, particularly when there are analytics and algorithms that are determining how to micro-target someone, when they might not have transparency and the law behind them.”

She added that the ICO’s incoming policy report will conclude that “transparency is important”.

“People do not understand the chain of companies involved. If they are using an app that is running off the Facebook site and there are other third parties involved, they do not know how to control their data,” she argued.

“Right now, I think we all agree that it is much too difficult and much too opaque. That is what we need to tackle. This Committee needs to tackle it, we need to tackle it at the ICO, and the companies have to get behind us, or they are going to lose the trust of users and the digital economy.”

She also spoke up generally for more education on how digital systems work — so that users of services can “take up their rights”.

“They have to take up their rights. They have to push companies. Regulators have to be on their game. I think politicians have to support new changes to the law if that is what we need,” she added.

And she described the incoming General Data Protection Regulation (GDPR) as a “game-changer” — arguing it could underpin a push for increased transparency around the data flows that are feeding and shaping public opinion. Although she conceded that regulating such data flows to achieve the sought-after accountability will require a fully joined-up effort.

“I would like to be an optimist. The point behind the General Data Protection Regulation as a step-up in the law is to try to give back control to individuals so that they have a say in how their data are processed, so that they do not just throw up their hands or put it on the ‘too difficult’ pile. I think that is really important. There is a whole suite of things and a whole village that has to work together to be able to make that happen.”

The committee recently took evidence from Cambridge Analytica — the UK-based company credited with helping Donald Trump win the US presidency by creating psychological profiles of US voters for ad targeting purposes.

Denham was asked for her response to CEO Alexander Nix’s evidence, but said she could not comment, to avoid prejudicing the ICO’s own ongoing investigation into data analytics for political purposes.

She did confirm that a data request by US voter and professor David Carroll, who has been trying to use UK data protection law to access the data held on him for political ad targeting purposes by Cambridge Analytica, is forming one of the areas of the ICO enquiry — saying it’s looking at “how an individual becomes the recipient of a certain message” and “what information is used to categorise him or her, whether psychographic technologies are used, how the categories are fixed and what kind of data has fed into that decision”.

Although she also said the ICO’s enquiry into political data analytics is ranging more widely.

“People need to know the provenance and the source of the data and information that is used to make decisions about the receipt of messages. We are really looking at — it is a data audit. That is really what we are carrying out,” she added.

Featured Image: Tero Vesalainen/Getty Images

BigID pulls in $14 million Series A to help identify private customer data across big data stores

As data privacy becomes an increasingly important concern, especially with the EU’s GDPR privacy rules coming online in May, companies need to find ways to understand their customers’ private data. BigID thinks it has a solution, and it landed a $14 million Series A investment today to help grow the idea.

Comcast Ventures, SAP, ClearSky Security Fund and one of the company’s seed round investors, BOLDstart Ventures, all participated in the investment. The deal closed last week. Today’s investment, on top of the $2.1 million seed round in 2016, brings the total raised to $16.1 million.

CEO and co-founder Dimitri Sirota says before companies can do anything with their data, they have to understand what they have. The starting point therefore is creating a catalogue of private data types you need to protect without actually moving the data.

“Think of us as Google for data. We index the information, figure out what information belongs to what entity, the data subject [and so forth], but it’s all virtual. We don’t copy it. It stays wherever it was,” Sirota explained.

The company can search across multiple big data stores, find private information for individuals across different sources, map the relationships across sources, see how data flows across different geographies and help customers comply with local data privacy regulations.

And he says the solution supports a variety of data formats out of the box whether that’s MongoDB, Cassandra, a cloud service like AWS or Azure or an enterprise software package like SAP. It also provides a generic connector that customers can customize to connect to unsupported data stores. It typically takes a couple of weeks to add a new one. BigID is also constantly working on expanding supported data types and updates the product on a regular basis, according to Sirota.
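BigID hasn’t published its internals, so the following is only an illustrative sketch of the in-place indexing idea Sirota describes: flag fields that match personal-data patterns without copying the records anywhere. The pattern names and regexes here are simplified assumptions, not BigID’s API.

```python
import re

# Illustrative detectors for a few common personal-data types.
# A real discovery product would ship many more, plus validation logic.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ip_address": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
    "phone": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def scan_record(record: dict) -> dict:
    """Return {field: [detected data types]} for one record, in place.

    Nothing is copied or moved: the scanner only reports where personal
    data appears, mirroring the 'index, don't duplicate' approach.
    """
    findings = {}
    for field, value in record.items():
        hits = [name for name, rx in PATTERNS.items() if rx.search(str(value))]
        if hits:
            findings[field] = hits
    return findings

record = {"name": "Jane Doe", "contact": "jane@example.com", "last_ip": "203.0.113.7"}
print(scan_record(record))  # {'contact': ['email'], 'last_ip': ['ip_address']}
```

Running something like this against each supported data store, and keeping only the findings map, is what keeps the catalog “virtual” in the sense Sirota describes.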

The solution is offered in a subscription model delivered as a Docker image. Most customers so far have opted to install it behind their firewall.

You can map relationships of customer information across data stores with BigID. Photo: BigID

While the company’s data identification and compliance solution maps well to GDPR compliance rules, Sirota said they didn’t set out to solve one compliance problem. They wanted to build a product to help companies be “better stewards of private customer information.”

The company hired its first engineer in March 2016, and the GDPR passed in April 2016. That bit of serendipity helped the company build something that was GDPR-compliant. “While we don’t call ourselves explicitly a GDPR solution, we do cover a lot of the requirements,” he told TechCrunch.

The company is still young, with 16 employees spread across two offices in NYC and Tel Aviv, the latter housing the company’s engineering department. It plans to expand headcount in the coming year with today’s investment.

It has around two dozen initial customers. While Sirota wouldn’t name specific ones just yet, he did say the list includes Dow 30 companies, big web scale companies and large systems integrators.

Featured Image: baranozdemir/Getty Images

Facebook’s least favorite Austrian can now press privacy suit in Vienna

A big blow for Facebook today after Europe’s top court delivered a verdict in a long-running legal challenge that opens the door for plaintiff and privacy campaigner, Max Schrems, to sue Facebook in his home city of Vienna.

The company had sought to argue that Schrems does not have consumer rights on account of his privacy campaigning activities. But in its judgement today the CJEU rejected that argument, saying Schrems’ campaigning activities do not cancel out his status as a consumer with a private Facebook account.

“After throwing dirt at me for three years and circulating that I would try to make a profit from my political activities, it’s maybe the time now for Facebook to apologize,” said Schrems in a statement on the judgement.

Facebook has previously tried to argue that Austrian courts do not have international jurisdiction over its business, which has its European HQ in Ireland. But in 2015 a local appeals court ruled Schrems can file personal claims in his local court in Vienna.

The company’s tactics have stalled the substance of the lawsuit from being heard for more than three years.

Now, with the CJEU ruling, Schrems can bring a model case against Facebook on his home turf — challenging the company over a suite of awkward privacy issues.

Such as US government surveillance programs’ access to Facebook user data; how it pervasively tracks users around the rest of the web; and the complexity and opacity of its privacy policies — and whether the company is therefore obtaining legal consent from users to process their personal data.

Truly this will be a * get popcorn * lawsuit.

“There’s a lot of stuff that Facebook will have to deal with,” said a jubilant Schrems in a video response to the judgement posted to Twitter.

Facebook does have one reason to be cheerful, though.

Back in 2014, when Schrems filed the original suit, he had tried to structure it as a privacy class action — gathering thousands of other Facebook users to join the cause and assign their claims to him, as an attempt to work around Austria’s lack of a class action law for consumers.

However today’s CJEU ruling closes off that possibility — with the judges concluding:

Article 16(1) of Regulation No 44/2001 must be interpreted as meaning that it does not apply to the proceedings brought by a consumer for the purpose of asserting, in the courts of the place where he is domiciled, not only his own claims, but also claims assigned by other consumers domiciled in the same Member State, in other Member States or in non-member countries.

Facebook said in a statement: “Today’s decision by the European Court of Justice supports the previous decisions of two courts that Mr. Schrems’s claims cannot proceed in Austrian courts as a ‘class action’ on behalf of other consumers. We were pleased to have been able to present our case to the European Court of Justice and now look forward to resolving this matter.”

Under the EU’s incoming data protection framework GDPR, which will apply from May 25, there is a provision for consumer organizations to pursue collective redress on behalf of individual consumers.

And Schrems is currently crowdfunding to get a not-for-profit off the ground for exactly that purpose — saying the aims of the organization will include bringing “privacy class actions” under a different legal regime (i.e. Article 80 of the GDPR).

So he’s clearly not going to abandon his fight for consumer class actions in the EU.

Though he also calls out the CJEU’s judgement as problematic, saying it implies a consumer only has rights against a company if they themselves entered into the original contract — so, for example, someone buying a secondhand Volkswagen wouldn’t have consumer rights against the company.

“Unfortunately the CJEU has massively limited consumer rights in this case and missed a golden opportunity to finally allow collective redress in Europe,” he said in a statement on that. “This will hit consumers in many cases where they have not signed the original contract with a company.”

“We now have the absurd situation that 71 companies that were harmed by a cartel could bring their claims jointly, only consumers cannot join forces. Equally you can sue ‘into’ a country that has a class action but not ‘out’ of such a country. As the Advocate General has already said in his opinion: there is now an urgent need to get a European solution for collective redress,” he added.

Featured Image: Bryce Durbin/TechCrunch

Facebook to roll out global privacy settings hub — thanks to GDPR

Facebook COO Sheryl Sandberg has said major privacy changes are coming to the platform later this year, as it prepares to comply with the European Union’s incoming data protection regulation.

Speaking at a Facebook event in Brussels yesterday, she said the company will be “rolling out a new privacy center globally that will put the core privacy settings for Facebook in one place and make it much easier for people to manage their data” (via Reuters).

Last year the company told us it had assembled “the largest cross functional team” in the history of its family of companies to support General Data Protection Regulation (aka: GDPR) compliance.

From May 25 this year, the updated privacy framework will apply across the 28-Member State bloc — and any multinational processing European citizens’ personal data will need to ensure it is compliant. Not least because the regulation includes beefed-up liabilities for companies that fail to meet its standards: under GDPR, penalties can scale as high as 4% of a company’s global annual turnover.

In Facebook’s case, based on its 2016 full year revenue, the new rules mean it could be facing fines that exceed a billion dollars — giving the company a rather more sizable incentive to ensure it meets the EU’s privacy standards and isn’t found to be playing fast and loose with users’ data.
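As a back-of-envelope check on that figure (taking Facebook’s 2016 full-year revenue as roughly $27.6 billion, an approximation), the 4% maximum does indeed clear a billion dollars:

```python
# Back-of-envelope: top-tier GDPR fines reach 4% of global annual turnover.
facebook_2016_revenue = 27.6e9  # ~USD 27.6B full-year 2016 (approximate)
max_fine = 0.04 * facebook_2016_revenue
print(f"${max_fine / 1e9:.2f}B")  # $1.10B
```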

Sandberg said the incoming changes will give the company “a very good foundation to meet all the requirements of the GDPR and to spur us on to continue investing in products and in educational tools to protect privacy”.

“Our apps have long been focused on giving people transparency and control,” she also remarked — a claim that any long-time Facebook user might laugh at rather long and hard.

Long history of hostility to privacy

Facebook has certainly made a lot of changes to privacy and control over the years, though its focus has rarely seemed aimed at “giving people transparency and control”.

Instead, many of its shifts and tweaks have been positioned to give the company more ways to exploit user data while simultaneously nudging people to give up more privacy (and thus hand it more options for exploiting their data).

Here, for example, is an EFF assessment of a 2009 Facebook privacy change — ostensibly, Facebook claimed at the time, to give users “greater control over their information”:

These new “privacy” changes are clearly intended to push Facebook users to publicly share even more information than before. Even worse, the changes will actually reduce the amount of control that users have over some of their personal data.

Among the changes Facebook made back then was to “recommend” preselected defaults to users that flipped their settings to share the content they post to Facebook with everyone on the Internet. (This recommendation was also pushed at users who had previously specified they wanted to limit any sharing to only their “Networks and Friends”.)

Clearly that was not a pro-privacy change. As we warned at the time it could (and did) lead to “a massive privacy fiasco” — given it encouraged Facebookers to inadvertently share more than they meant to.

A mere six months later — facing a major backlash and scrutiny from the FTC — Facebook was forced to rethink, and it put out what it claimed was a set of “drastically simplified” privacy controls.

Though it still took the company until May 2014 to change the default visibility of users’ statuses and photos to ‘friends’ — i.e. rather than the awful ‘public’ default.

Following the 2009 privacy debacle, a subsequent 2011 FTC settlement barred Facebook from making any deceptive privacy claims. The company also settled with the Irish DPA at the end of the same year — after privacy complaints had sparked an audit in Europe.

So in 2012, when Facebook decided to update its privacy policy — to give itself greater control over users’ data — it was forced to email all its users about the changes, as a consequence of those earlier regulatory settlements.

But it took direct action from EU privacy campaigner Max Schrems to force Facebook to put the proposed changes up for a worldwide vote — by mobilizing opinion online and triggering a long-standing Facebook policy governance clause (which the company couldn’t exactly ignore, even as the structure of the clause essentially made it impossible for a user vote to block the changes).

At the time Schrems was also campaigning for Facebook to implement an ‘Opt-In’ instead of an ‘Opt-Out’ system for all data use and features; and also for limits on use of users’ data for ads. So, in other words, for exactly the sorts of changes GDPR is likely to bring in — with its requirement, for instance, that data controllers obtain meaningful consent from users to process their personal data (or else find another legal basis for handling their data).

What’s crystal clear is that, time and again, it’s taken regulatory and/or privacy campaigner pressure to push Facebook away from user-hostile data practices.

And that prior to regulatory crackdown the company’s intent was to reduce users’ privacy by pushing them to make more of their data public.

But even since then the company has continued to act in a privacy-hostile way.

Another major low in Facebook’s privacy record came in 2016, when its subsidiary company, messaging giant WhatsApp, announced a privacy U-turn — saying it would begin sharing user data with Facebook for ad-targeting purposes, including users’ phone numbers and their last seen status on the app.

This hugely controversial anti-privacy move quickly attracted the ire of European privacy regulators — forcing Facebook to partially suspend data-sharing in the region. (The company remains under scrutiny in the EU over other types of WhatsApp-Facebook data-sharing which it has not paused.)

Facebook was eventually fined $122M by the European Commission, in May last year, for providing “incorrect or misleading” information to the regulators that had assessed its 2014 acquisition of WhatsApp (not a privacy fine, btw, a penalty purely for process failing).

At the time Facebook had claimed it could not automatically match user accounts between the two platforms — before going on to do just that two years later.

The company also only gave WhatsApp users a time-limited, partial opt-out for the data-sharing. Again, an approach that just wouldn’t wash under GDPR.

EU citizens who consent to their personal data being processed will also have a suite of associated rights — such as being able to ask for the data to be deleted, and the ability to withdraw their consent at any time. (Read our GDPR primer for a full overview of the changes fast incoming.)

While the full impact of the regulation will take time to shake out — the exact shape and tone of Facebook’s new global privacy settings center remains to be seen, for example — European Union lawmakers are already rightly celebrating a long overdue shift in the balance of power between platforms and consumers.

Featured Image: Bryce Durbin/TechCrunch


European Union lawmakers proposed a comprehensive update to the bloc’s data protection and privacy rules in 2012.

Their aim: To take account of seismic shifts in the handling of information wrought by the rise of the digital economy in the years since the prior regime was penned — all the way back in 1995 when Yahoo was the cutting edge of online cool and cookies were still just tasty biscuits.

Here’s the EU’s executive body, the Commission, summing up the goal:

The objective of this new set of rules is to give citizens back control over their personal data, and to simplify the regulatory environment for business. The data protection reform is a key enabler of the Digital Single Market which the Commission has prioritised. The reform will allow European citizens and businesses to fully benefit from the digital economy.

For an even shorter tl;dr the EC’s theory is that consumer trust is essential to fostering growth in the digital economy. And it thinks trust can be won by giving users of digital services more information and greater control over how their data is used. Which is — frankly speaking — a pretty refreshing idea when you consider the clandestine data brokering that pervades the tech industry. Mass surveillance isn’t just something governments do.

The General Data Protection Regulation (aka GDPR) was agreed after more than three years of negotiations between the EU’s various institutions.

It’s set to apply across the 28-Member State bloc as of May 25, 2018. That means EU countries are busy transposing it into national law via their own legislative updates (such as the UK’s new Data Protection Bill — yes, despite the fact the country is currently in the process of (br)exiting the EU, the government has committed to implementing the regulation because it needs to keep EU-UK data flowing freely in the post-brexit future). Which gives an early indication of the pulling power of GDPR.

Meanwhile businesses operating in the EU are being bombarded with ads from a freshly energized cottage industry of ‘privacy consultants’ offering to help them get ready for the new regs — in exchange for a service fee. It’s definitely a good time to be a law firm specializing in data protection.

GDPR is a significant piece of legislation whose full impact will clearly take some time to shake out. In the meantime, here’s our guide to the major changes incoming and some potential impacts.

Data protection + teeth

A major point of note right off the bat is that GDPR does not merely apply to EU businesses; any entities processing the personal data of EU citizens need to comply. Facebook, for example — a US company that handles massive amounts of Europeans’ personal data — is going to have to rework multiple business processes to comply with the new rules. Indeed, it’s been working on this for a long time already.

Last year the company told us it had assembled “the largest cross functional team” in the history of its family of companies to support GDPR compliance — specifying this included “senior executives from all product teams, designers and user experience/testing executives, policy executives, legal executives and executives from each of the Facebook family of companies”.

“Dozens of people at Facebook Ireland are working full time on this effort,” it said, noting too that the data protection team at its European HQ (in Dublin, Ireland) would be growing by 250% in 2017. It also said it was in the process of hiring a “top quality data protection officer” — a position the company appears to still be taking applications for.

The new EU rules require organizations to appoint a data protection officer if they process sensitive data on a large scale (which Facebook very clearly does) or collect information on many consumers — such as by performing online behavioral tracking. But, really, which online businesses aren’t doing that these days?

The extra-territorial scope of GDPR casts the European Union as a global pioneer in data protection — and some legal experts suggest the regulation will force privacy standards to rise outside the EU too.

Sure, some US companies might prefer to swallow the hassle and expense of fragmenting their data handling processes, treating personal data obtained from different geographies differently rather than streamlining everything under a GDPR-compliant process. But doing so means managing multiple data regimes. And at the very least it runs the risk of bad PR if you’re outed as deliberately offering a lower privacy standard to your home users vs customers abroad.

Ultimately, it may be easier (and less risky) for businesses to treat GDPR as the new ‘gold standard’ for how they handle all personal data, regardless of where it comes from.

And while not every company harvests Facebook levels of personal data, almost every company harvests some personal data. So for those with customers in the EU, GDPR cannot be ignored. At the very least, businesses will need to carry out a data audit to understand their risks and liabilities.

Privacy experts suggest that the really big change here is around enforcement. Because while the EU has long had established data protection standards and rules — and treats privacy as a fundamental right — its regulators have lacked the teeth to command compliance.

But now, under GDPR, financial penalties for data protection violations step up massively.

The maximum fine that organizations can be hit with for the most serious infringements of the regulation is 4% of their global annual turnover (or €20M, whichever is greater). Though data protection agencies will of course be able to impose smaller fines too. And, indeed, there’s a tiered system of fines — with a lower level of penalties of up to 2% of global turnover (or €10M).
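The tiered caps described above follow a simple “whichever is greater” rule, which can be sketched as a small helper function (illustrative only, not legal advice):

```python
def gdpr_fine_cap(global_turnover_eur: float, serious: bool) -> float:
    """Maximum GDPR fine under the two-tier 'whichever is greater' rule.

    serious=True  -> up to 4% of global annual turnover, or EUR 20M
    serious=False -> up to 2% of global annual turnover, or EUR 10M
    """
    if serious:
        return max(0.04 * global_turnover_eur, 20e6)
    return max(0.02 * global_turnover_eur, 10e6)

# For a company turning over EUR 100M, the flat EUR 20M amount exceeds
# 4% of turnover (EUR 4M), so the flat amount is the operative cap.
print(gdpr_fine_cap(100e6, serious=True))  # 20000000.0
```

Note how the flat amounts exist precisely so that smaller companies still face a meaningful maximum penalty.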

This really is a massive change. Because while data protection agencies (DPAs) in different EU Member States can impose financial penalties for breaches of existing data laws, these fines are relatively small — especially set against the revenues of the private sector entities being sanctioned.

In the UK, for example, the Information Commissioner’s Office (ICO) can currently impose a maximum fine of just £500,000. Compare that to the annual revenue of tech giant Google (~$90BN) and you can see why a much larger stick is needed to police data processors.

It’s not necessarily the case that individual EU Member States are getting stronger privacy laws as a consequence of GDPR (in some instances countries have arguably had higher standards in their domestic law). But the beefing up of enforcement that’s baked into the new regime means there’s a better opportunity for DPAs to start to bark and bite like proper watchdogs.

By inflating the financial risks around handling personal data, GDPR should naturally drive up standards — because privacy laws are suddenly a whole lot more costly to ignore.

More types of personal data that are hot to handle

So what is personal data under GDPR? It’s any information relating to an identified or identifiable person (in regulatorspeak people are known as ‘data subjects’).

While ‘processing’ can mean any operation performed on personal data — from storing it to structuring it to feeding it to your AI models. (GDPR also includes some provisions specifically related to decisions generated as a result of automated data processing, but more on that below.)

A new provision concerns children’s personal data — with the regulation setting a 16-year-old age limit on kids’ ability to consent to their data being processed. However individual Member States can choose (and some have) to derogate from this by writing a lower age limit into their laws.

GDPR sets a hard floor at 13 years old, below which Member States cannot go — making that the de facto minimum age for children to be able to sign up to digital services. So the impact on teens’ social media habits seems likely to be relatively limited.

The new rules generally expand the definition of personal data — so it can include information such as location data, online identifiers (such as IP addresses) and other metadata. So again, this means businesses really need to conduct an audit to identify all the types of personal data they hold. Ignorance is not compliance.

GDPR also encourages the use of pseudonymization (such as encrypting personal data and storing the encryption key separately and securely) as a pro-privacy, pro-security technique that can help minimize the risks of processing personal data. Although pseudonymized data is likely to still be considered personal data, certainly where a risk of re-identification remains. So it does not get a general pass from the requirements of the regulation.
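One common way to implement that separation, sketched here with keyed hashing (HMAC) rather than encryption purely for illustration, is to derive an opaque token from the identifier while the secret key is held elsewhere:

```python
import hashlib
import hmac

# The secret would live somewhere separate and secure (e.g. a key
# management service), never alongside the pseudonymized records.
SECRET_KEY = b"held-separately"  # illustrative placeholder only

def pseudonymize(identifier: str) -> str:
    """Replace an identifier with a deterministic keyed token.

    The same input always yields the same token, so records can still be
    joined, but linking a token back to a person requires the separately
    held key (to re-derive the mapping by recomputation).
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Store the token in place of the raw identifier.
record = {"user": pseudonymize("jane@example.com"), "purchase": "book"}
```

Because the tokens remain linkable to individuals by whoever holds the key, data handled this way is still personal data under the regulation, exactly as the paragraph above notes.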

Data has to be rendered truly anonymous to be outside the scope of the regulation. (And given how often ‘anonymized’ data-sets have been shown to be re-identifiable, relying on any anonymizing process to be robust enough to have zero risk of re-identification seems, well, risky.)

The incoming data protection rules apply to both data controllers (i.e. entities that determine the purpose and means of processing personal data) and data processors (entities that are responsible for processing data on behalf of a data controller — aka subcontractors).

Indeed, data processors have some direct compliance obligations under GDPR, and can also be held equally responsible for data violations, with individuals able to bring compensation claims directly against them, and DPAs able to hand them fines or other sanctions.

So the intent of the regulation is that there be no diminishing of responsibility down the chain of data handling subcontractors. GDPR aims to have every link in the processing chain be a robust one.

For companies that rely on a lot of subcontractors to handle data operations on their behalf there’s clearly a lot of risk assessment work to be done.

As noted above, there is a degree of leeway for EU Member States in how they implement some parts of the regulation (such as with the age of data consent for kids).

Consumer protection groups are calling for the UK government to include an optional GDPR provision on collective data redress in its DP bill, for example — a call the government has so far rebuffed.

But the wider aim is for the regulation to harmonize data protection rules as much as possible across all Member States, to reduce the regulatory burden on digital businesses trading around the bloc.

On data redress, European privacy campaigner Max Schrems — most famous for his legal challenge to US government mass surveillance practices that resulted in a 15-year-old data transfer arrangement between the EU and US being struck down in 2015 — is currently running a crowdfunding campaign to set up a not-for-profit privacy enforcement organization to take advantage of the new rules and pursue strategic litigation on commercial privacy issues.

Schrems argues it's simply not viable for individuals to take big tech giants to court to try to enforce their privacy rights, so he sees a gap in the regulatory landscape for an expert organization to work on EU citizens' behalf — not just pursuing strategic litigation in the public interest but also promoting industry best practice.

The proposed data redress body — called noyb; short for: ‘none of your business’ — is being made possible because GDPR allows for collective enforcement of individuals’ data rights. And that provision could be crucial in spinning up a centre of enforcement gravity around the law. Because despite the position and role of DPAs being strengthened by GDPR, these bodies will still inevitably have limited resources vs the scope of the oversight task at hand.

Some may also lack the appetite to take on a fully fanged watchdog role. So campaigning consumer and privacy groups could certainly help pick up any slack.

Privacy by design and privacy by default

Another major change incoming via GDPR is ‘privacy by design’ no longer being just a nice idea; privacy by design and privacy by default become firm legal requirements.

This means there’s a requirement on data controllers to minimize processing of personal data — limiting activity to only what’s necessary for a specific purpose, carrying out privacy impact assessments and maintaining up-to-date records to prove out their compliance.

Consent requirements for processing personal data are also considerably strengthened under GDPR — meaning lengthy, inscrutable, pre-ticked T&Cs are likely to be unworkable. (And we’ve sure seen a whole lot of those hellish things in tech.) The core idea is that consent should be an ongoing, actively managed process; not a one-off rights grab.

As the UK’s ICO tells it, consent under GDPR for processing personal data means offering individuals “genuine choice and control” (for sensitive personal data the law requires a higher standard still — of explicit consent).

There are other legal bases for processing personal data under GDPR — such as contractual necessity; compliance with a legal obligation under EU or Member State law; or tasks carried out in the public interest — so it is not always necessary to obtain consent to process someone's personal data. But there must always be an appropriate legal basis for each processing activity.

Transparency is another major obligation under GDPR, which expands the principle that personal data must be lawfully and fairly processed, and adds an accountability principle on top. Hence the emphasis on data controllers needing to clearly communicate with data subjects — such as by informing them of the specific purpose of the data processing.

The obligation on data handlers to maintain scrupulous records of what information they hold, what they are doing with it, and how they are legally processing it, is also about being able to demonstrate compliance with GDPR’s data processing principles.

But — on the plus side for data controllers — GDPR removes the requirement to submit notifications to local DPAs about data processing activities. Instead, organizations must maintain detailed internal records — which a supervisory authority can always ask to see.
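What such internal records might look like in practice: a hypothetical sketch of a single record-of-processing-activities entry. The field names are illustrative assumptions — GDPR's Article 30 lists the required contents of these records but prescribes no particular schema.

```python
from dataclasses import asdict, dataclass, field

@dataclass
class ProcessingRecord:
    """One entry in an internal record of processing activities (illustrative)."""
    purpose: str                # why the data is processed
    legal_basis: str            # e.g. "consent", "contract", "legal obligation"
    data_categories: list = field(default_factory=list)
    recipients: list = field(default_factory=list)
    retention: str = "unspecified"

record = ProcessingRecord(
    purpose="order fulfilment",
    legal_basis="contract",
    data_categories=["name", "shipping address"],
    recipients=["courier service"],
    retention="6 years (tax records)",
)

# asdict() yields a plain dict that could be serialized and shown to a
# supervisory authority on request.
record_for_dpa = asdict(record)
```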

It’s also worth noting that companies processing data across borders in the EU may face scrutiny from DPAs in different Member States if they have users there (and are processing their personal data). Although the GDPR sets out a so-called ‘one-stop-shop’ principle — that there should be a “lead” DPA to co-ordinate supervision between any “concerned” DPAs — this does not mean that once it applies a cross-EU-border operator like Facebook is only going to be answerable to the concerns of the Irish DPA.

Indeed, Facebook’s tactic of only claiming to be under the jurisdiction of a single EU DPA looks to be on borrowed time. And the one-stop-shop provision in the GDPR seems more about creating a co-operation mechanism to allow multiple DPAs to work together in instances where they have joint concerns. Rather than offering a way for multinationals to go ‘forum shopping’ — which the regulation does not permit (per WP29 guidance).

Another change: Privacy policies that contain vague phrases like ‘We may use your personal data to develop new services’ or ‘We may use your personal data for research purposes’ will not pass muster under the new regime. So a wholesale rewriting of vague and/or confusingly worded T&Cs is something Europeans can look forward to this year.

Add to that, any changes to privacy policies must be clearly communicated to the user on an ongoing basis. Which means no more references in the privacy statement telling users to ‘regularly check for changes or updates’ — that just won’t be workable.

The onus is firmly on the data controller to keep the data subject fully informed of what is being done with their information. (Which almost implies that good data protection practice could end up tasting a bit like spam, from a user PoV.)

The overall intent behind GDPR is to inculcate an industry-wide shift in perspective regarding who ‘owns’ user data — disabusing companies of the notion that other people’s personal information belongs to them just because it happens to be sitting on their servers.

“Organizations should acknowledge they don’t exist to process personal data but they process personal data to do business,” is how Gartner research director Bart Willemsen sums this up. “Where there is a reason to process the data, there is no problem. Where the reason ends, the processing should, too.”

The data protection officer (DPO) role that GDPR brings in as a requirement for many data handlers is intended to help them ensure compliance.

This officer, who must report to the highest level of management, is intended to operate independently within the organization, with warnings to avoid an internal appointment that could generate a conflict of interests.

Which types of organizations face the greatest liability risks under GDPR? “Those who deliberately seem to think privacy protection rights is inferior to business interest,” says Willemsen, adding: “A recent example would be Uber, regulated by the FTC and sanctioned to undergo 20 years of auditing. That may hurt perhaps similar, or even more, than a one-time financial sanction.”

“Eventually, the GDPR is like a speed limit: There not to make money off of those who speed, but to prevent people from speeding excessively as that prevents (privacy) accidents from happening,” he adds.

Another right to be forgotten

Under GDPR, people whose personal data is being processed have a suite of associated rights — including the right to access data held about them (a copy of the data must be provided to them free of charge, typically within a month of a request); the right to request rectification of incomplete or inaccurate personal data; the right to have their data deleted (another so-called ‘right to be forgotten’ — with some exemptions, such as for exercising freedom of expression and freedom of information); the right to restrict processing; the right to data portability (where relevant, a data subject’s personal data must be provided free of charge and in a structured, commonly used and machine readable form).
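On the data portability point: the regulation names no specific format, only that exports be “structured, commonly used and machine readable” — and JSON is one obvious fit. A hypothetical export helper, whose top-level keys are illustrative assumptions rather than any mandated schema:

```python
import json
from datetime import date

def export_subject_data(profile: dict, orders: list) -> str:
    """Bundle everything held on a data subject into a portable JSON document."""
    return json.dumps(
        {
            "exported_on": date.today().isoformat(),
            "profile": profile,
            "orders": orders,
        },
        indent=2,
        default=str,  # fall back to str() for dates and similar values
    )

export = export_subject_data(
    {"name": "Alice", "email": "alice@example.com"},
    [{"id": 1, "item": "book"}],
)
```

Any other common machine-readable format (CSV, XML) would equally satisfy the portability requirement.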