All posts in “Law”

Apple ad focuses on iPhone’s most marketable feature — privacy

Apple is airing a new ad spot in primetime today. Focused on privacy, the spot is entirely visual, with no dialog and a simple tagline: “Privacy. That’s iPhone.”

In a series of humorous vignettes, the message is driven home that sometimes you just want a little privacy. The spot has only one other line of text, and it’s in keeping with Apple’s messaging on privacy over both the short and long term: “If privacy matters in your life, it should matter to the phone your life is on.”

The spot will air tonight in primetime in the U.S. and extend through March Madness. It will then air in select other countries.


You’d have to be hiding under a rock not to have noticed Apple positioning privacy as a differentiating factor between itself and other companies. A few years ago, CEO Tim Cook began taking more and more public stances on what the company felt to be your “rights” to privacy on its platform and how that differed from other companies. The undercurrent is that Apple is able to take this stance because its first-party business relies on a relatively direct relationship with customers who purchase its hardware and, increasingly, its services.

This stands in contrast to the model of other tech giants like Google or Facebook, which insert a layer of monetization strategy on top of that relationship: applying personal information about you (in somewhat anonymized fashion) to sell their platforms to advertisers, who in turn can sell to you more effectively.

Turning the ethical high ground into a marketing strategy is not without its pitfalls, though, as Apple has discovered recently with a (now patched) high-profile FaceTime bug that allowed people to turn your phone into a listening device, Facebook’s manipulation of App Store permissions and the revelation that some long-overdue house cleaning was needed in its Enterprise Certificate program.

I did find it interesting that the iconography of the “Private Side” spot very, very closely associates the concepts of privacy and security. They are separate, but interrelated, obviously. This spot says these are one and the same. It’s hard to enforce privacy without security, of course, but in the mind of the public I think there is very little difference between the two.

The App Store itself, of course, still hosts apps from Google and Facebook among thousands of others that use personal data of yours in one form or another. Apple’s argument is that it protects the data you give to your phone aggressively by processing on the device, collecting minimal data, disconnecting that data from the user as much as possible and giving users as transparent a control interface as possible. All true. All far, far better efforts than the competition.

Still, there is room to run, I feel, when it comes to Apple adjudicating what should be considered a societal norm for the use of personal data on its platform. If it’s going to be the absolute arbiter of what flies on the world’s most profitable application marketplace, it might as well use that power to get a little more feisty with the bigcos (and littlecos) that make their living on our data.

I mention the issues Apple has had above not as a dig, though some might be inclined to view Apple integrating privacy with marketing as boldness bordering on hubris. I, personally, think there’s still a major difference between a company that suffers situational losses of privacy while having a systemic dedication to privacy and, well, most of the rest of the ecosystem, which exists because it operates an “invasion of privacy as a service” business.

Basically, I think stating privacy is your mission is still supportable, even if you have bugs. But attempting to ignore that you host the data platforms that thrive on it is a tasty bit of prestidigitation.

But that might be a little too verbose as a tagline.

Don’t break up big tech — regulate data access, says EU antitrust chief

Breaking up tech giants should be a measure of last resort, the European Union’s competition commissioner, Margrethe Vestager, has suggested.

“To break up a company, to break up private property would be very far reaching and you would need to have a very strong case that it would produce better results for consumers in the marketplace than what you could do with more mainstream tools,” she warned this weekend, speaking in a SXSW interview with Recode’s Kara Swisher. “We’re dealing with private property. Businesses that are built and invested in and become successful because of their innovation.”

Vestager has built a reputation as a regulator feared by tech giants, thanks to a number of major (and often expensive) interventions since she took up the Commission’s antitrust brief in 2014, with one big investigation still hanging over Google.

But while opposition politicians in many Western markets — including high-profile would-be U.S. presidential candidates — are now competing on sounding tough on tech, the European commissioner advocates taking a scalpel to data streams rather than wielding a break-up hammer to smash market-skewing tech giants.

“When it comes to the very far reaching proposal to split up companies, for us, from a European perspective, that would be a measure of last resort,” she said. “What we do now, we do the antitrust cases, misuse of dominant position, the tying of products, the self-promotion, the demotion of others, to see if that approach will correct and change the marketplace to make it a fair place where there’s no misuse of dominant position but where smaller competitors can have a fair go. Because they may be the next big one, the next one with the greatest idea for consumers.”

She also pointed to an agreement last month, between key European political institutions on regulating online platform transparency, as an example of the kind of fairness-focused intervention she believes can work to counter market imbalance.

The bread-and-butter work regulators should focus on where big tech is concerned is things like digital sector enquiries and hearings to examine in detail how markets are operating, she suggested — using careful scrutiny to inform and shape intelligent, data-led interventions.

Though ‘break up Google’ clearly makes for a punchier political soundbite.

Vestager is, however, in the final months of her term as antitrust chief — with the Commission due to turn over this year. Her time at the antitrust helm will end on November 1, she confirmed. (Though she remains, at least tentatively, on a shortlist of candidates who could be appointed the next European Commission president.)

The commissioner has spoken up before about regulating access to data as a more interesting option for controlling digital giants vs breaking them up.

And some European regulators appear to be moving in that direction already. One example is the German Federal Cartel Office (FCO), which last month announced a decision against Facebook that aims to limit how it can use data from its own services. The FCO’s move has been couched as akin to an internal break-up of the company, at the data level, without the tech giant being forced to separate and sell off business units like Instagram and WhatsApp.

It’s perhaps not surprising, therefore, that Facebook founder Mark Zuckerberg announced a massive plan to merge all three services at the technical level just last week — billing the switch to encrypted content but merged metadata as a ‘pro-privacy’ move, while clearly also intending to restructure his empire in a way that works against regulatory interventions that separate and control internal data flows at the product level.

The Commission does not have a formal probe of Facebook or the social media sector open at this point, but Vestager said her department does have its eye on how social media giants are using data.

“We’re sort of hoovering over social media, Facebook — how data’s being used in that respect,” she said, also flagging the preliminary work it’s doing looking into Amazon’s use of merchant data. (Also still not yet a formal probe.)

“The good thing is now the debate is really sort of taking off,” she added, of competition regulation generally. “When I’ve been visiting and speaking with people on The Hill previously, I’ve sensed a new sort of interest and curiosity as to what can competition achieve for you in a society. Because if you have fair competition then you have markets serving the citizen in our role as consumer and not the other way around.”

Asked whether she’s personally convinced by Facebook’s sudden ‘appreciation’ of privacy, Vestager said that if the announcement signifies a genuine change of philosophy and direction, one that leads to shifts in its business practices, it would be good news for consumers.

Though she said she’s not simply taking Zuckerberg at his word at this point. “It may be a little far-reaching to assume the best,” she said politely when pushed by Swisher on whether she believed a sincere pivot is possible from a company with such a long privacy-hostile history.

Big tech, small tax

The interview also delved into the issue of big tech and the tiny amounts it pays in tax.

Reforming the global tax system so digital businesses pay a fair share vs traditional businesses is now “urgent” work to do, said Vestager — highlighting how the lack of a consensus position among EU Member States is pushing some countries to move forward with their own measures, given resistance to Commission proposals from other corners of the bloc.

France’s push for a tax on tech giants this year is “absolutely necessary but very unfortunate”, Vestager said.

“When you do numbers that can be compared we see that digital businesses they would pay on average nine per cent [in taxes] where traditional businesses on average pay 23 per cent,” she continued. “Yet they’re in the same market for capital, for skilled employees, sometimes competing for the same customers. So obviously this is not fair.”

The Commission’s hope is that individual “pushes” from Member States frustrated by the current tax imbalance will generate momentum for “a European-wide way of doing things” — and therefore that any fragmentation of tax policies across the bloc will be short-lived.

She also said Europe is keen for the Organisation for Economic Co-operation and Development to “push forward for this” too, remarking: “Because we sense in the OECD that a number of places in the world take an interest also in the U.S. side of things.”

Is the better way to reset inequalities between big tech and society to reform the tax system, or are regulators doomed to have to keep fining them “into the next century”, wondered Swisher.

“You get a fine when you do something illegal. You pay your taxes to contribute to society where you do your business. These are two different things and we definitely need both,” responded Vestager. “But we cannot have a situation where some businesses do not contribute and the majority of businesses they do. Because it’s simply not fair in the marketplace or fair towards citizens if this continues.”

She also gave short shrift to the favored big tech lobbyist line — to loudly claim privacy regulation helps the big guys because it’s easier for them to fund compliance — by pointing out that Europe’s General Data Protection Regulation has “different brackets” and does not simply clobber big and small alike with the same requirements.

Of course small businesses “don’t have the same obligations as Google”, said Vestager.

“I’d say if they find it easy, I’d say they can do better,” she added, raising the much-complained-about consumer rights issue of consent vs inscrutable T&Cs.

“Because I still find that it’s quite tricky to understand what it is that you accept when you accept your terms and conditions. And I think it would be great if we as citizens could really say ‘oh this is what I am signing up to and I’m perfectly happy with that’.”

Though she admitted there’s still a way to go for European privacy rights to be fully functioning as intended — arguing it’s still too hard for individual consumers to exercise the rights they have in law.

“I know I own my data but I really do not know how to exercise that ownership,” she said. “How to allow for more people to have access to my data if I want to enable innovation, new market participants coming in. If that was done in large scale you could have an innovative input into the marketplace and we’re definitely not there yet,” she said.

Asked about the idea of taxing data flows as another possible means of clipping the wings of big tech, Vestager pointed to early signs of an intermediary market spinning up in Europe to help individuals extract value from what corporate entities are doing with their information. So not literally a tax on data flows, but a way for consumers to claw back some of the value that’s being stripped from them.

“It’s still nascent in Europe but since now we have the rights that establishes your ownership of your data we see there is a beginning market development of intermediaries saying should I enable you yourself to monetize your data, so it’s not just the giants who monetize your data. So that maybe you get a sum every month reflecting how your data has been passed on,” she said. “That is one opportunity.”

She also said the Commission is looking at how to make sure “huge amounts of data will not be a barrier to entry in a marketplace” — or present a barrier to innovation for newcomers. The latter being key given how tech giants’ massive data pools are translating into a meaty advantage in AI R&D.

In another interesting exchange, Vestager suggested the convenience of voice interfaces presents an acute competition challenge — given how the tech could naturally concentrate market power via preferring quick-fire Q&A style interactions which don’t support offering lots of choice options.

“One of the things that is really mindboggling for us is how to have choice if you have voice,” she said, arguing that voice assistance dynamic doesn’t lend itself to multiple suggestions being offered every time a user asks a question. “So how to have competition when you have voice search?.. How would this change the marketplace and how would we deal with such a market? So this is what we’re trying to figure out.”

Again she suggested regulators are thinking about the data flows behind the scenes as a potential route to remedying interfaces that work against choice.

“We’re trying to figure out how access to data will change the marketplace,” she added. “Can you give a different access to data because the one who holds the data, also holds the resources for innovation. And we cannot rely on the big guys to be the innovative ones.”

Asked for her worst case scenario for tech 10 years hence, she said it would be to have “all of the technology but none of the societal positive oversight and direction”.

On the flip side, the best case would be for legislators to be “willing to take sufficient steps in taxation and in regulating access to data and fairness in the marketplace”.

“We would also need to see technology develop to have new players,” she emphasized. “Because we still need to see what will happen with quantum computing, what will happen with blockchain, what other uses are there for all of that new technology. Because I still think that it holds a lot of promise. But only if our democracy will give it direction. Then you will have a positive outcome.”

LinkedIn forced to ‘pause’ its ‘mentioned in the news’ feature in Europe after complaints about ID mix-ups

LinkedIn has been forced to ‘pause’ a feature in Europe in which the platform emails members’ connections when they’ve been ‘mentioned in the news’.

The regulatory action follows a number of data protection complaints after LinkedIn’s algorithms incorrectly matched members to news articles — triggering a review of the feature and a subsequent suspension order.

The feature appears as a case study in the ‘Technology Multinationals Supervision’ section of an annual report published today by the Irish Data Protection Commission (DPC). The report does not explicitly name LinkedIn, but we’ve confirmed it is the professional social network in question.

The data watchdog’s report cites “two complaints about a feature on a professional networking platform” after LinkedIn incorrectly associated the members with media articles that were not actually about them.

“In one of the complaints, a media article that set out details of the private life and unsuccessful career of a person of the same name as the complainant was circulated to the complainant’s connections and followers by the data controller,” the DPC writes, noting the complainant initially complained to the company itself but did not receive a satisfactory response — hence taking up the matter with the regulator.

“The complainant stated that the article had been detrimental to their professional standing and had resulted in the loss of contracts for their business,” it adds.

“The second complaint involved the circulation of an article that the complainant believed could be detrimental to future career prospects, which the data controller had not vetted correctly.”

LinkedIn appears to have been matching members to news articles by simple name matching — with obvious potential for identity mix-ups between people with shared names.
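To illustrate why name-only matching breaks down, here is a minimal sketch of a naive matcher alongside one that demands a corroborating signal (such as an employer mentioned in the article) before associating a story with a member. This is not LinkedIn’s actual implementation; the names, fields and matching rule are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Member:
    name: str
    employer: str

@dataclass
class Article:
    headline: str
    person_name: str         # name extracted from the article text
    org_mentions: frozenset  # organisations mentioned in the article

def naive_match(member: Member, article: Article) -> bool:
    """Name-only matching: anyone sharing the name is treated as the subject."""
    return member.name.lower() == article.person_name.lower()

def safer_match(member: Member, article: Article) -> bool:
    """Require at least one corroborating signal beyond the name."""
    if not naive_match(member, article):
        return False
    return member.employer.lower() in {org.lower() for org in article.org_mentions}

# Two different people who happen to share a name.
alice_a = Member("Alice Smith", "Acme Robotics")
alice_b = Member("Alice Smith", "Unrelated Corp")

story = Article(
    headline="Alice Smith fined over failed venture",
    person_name="Alice Smith",
    org_mentions=frozenset({"Acme Robotics"}),
)

# Name-only matching flags both members' networks; the safer check
# only matches the member whose employer actually appears in the article.
print(naive_match(alice_a, story), naive_match(alice_b, story))  # True True
print(safer_match(alice_a, story), safer_match(alice_b, story))  # True False
```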

“It was clear from the complaints that matching by name only was insufficient, giving rise to data protection concerns, primarily the lawfulness, fairness and accuracy of the personal data processing utilised by the ‘Mentions in the news’ feature,” the DPC writes.

“As a result of these complaints and the intervention of the DPC, the data controller undertook a review of the feature. The result of this review was to suspend the feature for EU-based members, pending improvements to safeguard its members’ data.”

We reached out to LinkedIn with questions and it pointed us to this blog post where it confirms: “We are pausing our Mentioned in the News feature for our EU members while we reevaluate its effectiveness.”

LinkedIn adds that it is reviewing the accuracy of the feature, writing:

As referenced in the Irish Data Protection Commission’s report, we received useful feedback from our members about the feature and as a result are evaluating the accuracy and functionality of Mentioned in the News for all members.

The company’s blog post also points users to a page where they can find out more about the ‘mentioned in the news’ feature and get information on how to manage their LinkedIn email notification settings.

The Irish DPC’s action is not the first privacy strike against LinkedIn in Europe.

Late last year, in its earlier annual report, covering the pre-GDPR portion of 2018, the watchdog revealed it had investigated complaints about LinkedIn related to its targeting of non-users with adverts for its service.

The DPC found the company had obtained emails for 18 million people for whom it did not have consent to process their data. In that case LinkedIn agreed to cease processing the data entirely.

That complaint also led the DPC to audit LinkedIn. It then found a further privacy problem, discovering the company had been using its social graph algorithms to try to build suggested networks of compatible professional connections for non-members.

The regulator ordered LinkedIn to cease this “pre-compute processing” of non-members’ data and delete all personal data associated with it prior to GDPR coming into force.

LinkedIn said it had “voluntarily changed our practices as a result”.

Even years later, Twitter doesn’t delete your direct messages

When does “delete” really mean delete? Not always, or even at all, if you’re Twitter.

Twitter retains direct messages for years, including messages you and others have deleted, but also data sent to and from accounts that have been deactivated and suspended, according to security researcher Karan Saini.

Saini found years-old messages in a file from an archive of his data obtained through the website, including messages from accounts that were no longer on Twitter. He also filed a similar bug, found a year earlier but not disclosed until now, that allowed him to use a since-deprecated API to retrieve direct messages even after a message was deleted by both the sender and the recipient — though the bug wasn’t able to retrieve messages from suspended accounts.

Saini told TechCrunch that he had “concerns” that the data was retained by Twitter for so long.

Direct messages once let users “unsend” messages from someone else’s inbox, simply by deleting them from their own. Twitter changed this years ago, and now only allows a user to delete messages from their own account. “Others in the conversation will still be able to see direct messages or conversations that you have deleted,” Twitter says in a help page. Twitter also says in its privacy policy that anyone wanting to leave the service can have their account “deactivated and then deleted.” After a 30-day grace period, the account disappears along with its data.

But, in our tests, we could recover direct messages from years ago — including old messages that had since been lost to suspended or deleted accounts. By requesting your account’s archive, it’s possible to download all of the data Twitter stores on you.

A conversation, dated March 2016, with a suspended Twitter account was still retrievable today. (Image: TechCrunch)
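As a rough illustration of the kind of check anyone can run on their own export, here is a minimal Python sketch that scans a downloaded (and unzipped) Twitter data archive for direct messages older than a given cutoff. The data/direct-messages.js path, the window.YTD.* assignment wrapper and the JSON field names are assumptions about how archives were structured at the time, not a documented API, so adjust them to match whatever the export actually contains.

```python
import json
import re
from datetime import datetime, timezone
from pathlib import Path

# ASSUMPTION: the unzipped archive contains data/direct-messages.js,
# a JS file of the form `window.YTD.direct_messages.part0 = [ ... ]`.
DM_FILE = Path("twitter-archive") / "data" / "direct-messages.js"
CUTOFF = datetime(2017, 1, 1, tzinfo=timezone.utc)  # flag anything older than this

def load_conversations(path: Path) -> list:
    """Strip the JavaScript assignment wrapper and parse the JSON payload."""
    raw = path.read_text(encoding="utf-8")
    payload = re.sub(r"^window\.YTD\.[\w.]+\s*=\s*", "", raw, count=1)
    return json.loads(payload)

def old_messages(conversations: list, cutoff: datetime):
    """Yield (timestamp, text) for messages created before the cutoff."""
    for convo in conversations:
        # Field names below are assumed from the export format of the era.
        for event in convo.get("dmConversation", {}).get("messages", []):
            msg = event.get("messageCreate", {})
            created = msg.get("createdAt")
            if not created:
                continue
            when = datetime.fromisoformat(created.replace("Z", "+00:00"))
            if when < cutoff:
                yield when, msg.get("text", "")

if __name__ == "__main__":
    for when, text in old_messages(load_conversations(DM_FILE), CUTOFF):
        print(when.isoformat(), text[:80])
```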

Saini says this is a “functional bug” rather than a security flaw, but argued that it allows anyone a “clear bypass” of Twitter mechanisms meant to prevent access to suspended or deactivated accounts.

But it’s also a privacy matter, and a reminder that “delete” doesn’t mean delete — especially with your direct messages. That can open up users, particularly high-risk accounts like journalists and activists, to government data demands that call for data from years earlier.

That’s despite Twitter telling law enforcement that once an account has been deactivated, there is only “a very brief period in which we may be able to access account information, including tweets.”

A Twitter spokesperson said the company was “looking into this further to ensure we have considered the entire scope of the issue.”

Retaining direct messages for years may put the company in a legal grey area amid Europe’s new data protection laws, which allow users to demand that a company delete their data.

Neil Brown, a telecoms, tech and internet lawyer at U.K. law firm Decoded Legal, said there’s “no formality at all” to how a user can ask for their data to be deleted. Any request from a user to delete their data that’s directly communicated to the company “is a valid exercise” of a user’s rights, he said.

Companies can be fined up to four percent of their annual turnover for violating GDPR rules.

“A delete button is perhaps a different matter, as it is not obvious that ‘delete’ means the same as ‘exercise my right of erasure’,” said Brown. Given that there’s no case law yet under the new General Data Protection Regulation regime, it will be up to the courts to decide, he said.

When asked if Twitter thinks that consent to retain direct messages is withdrawn when a message or account is deleted, Twitter’s spokesperson had “nothing further” to add.

The UK now has a law against upskirting

A law change that comes into force in the UK today makes the highly intrusive practice of ‘upskirting’ illegal.

The government said it wants the new law to send a clear message that such behaviour is criminal and will not be tolerated.

Perpetrators in the UK face up to two years in prison under the new law if they’re convicted of taking a photograph or video underneath a person’s clothes, without that person’s knowledge or consent, for the purpose of viewing their underwear, genitals or buttocks, either for sexual gratification or to cause humiliation, distress or alarm.

There have been prosecutions for upskirting in England and Wales under an existing common law offence of outraging public decency. But following a campaign started by an upskirting victim, the government decided to legislate to plug gaps in the law and make upskirting a specific sexual offence.

The Voyeurism (Offences) (No. 2) Bill was introduced on June 21 last year and gains royal assent today.

Where the offence of upskirting is committed in order to obtain sexual gratification it can result in the most serious offenders being placed on the sex offenders register.

Under the new law victims are also entitled to automatic protection, such as from being identified in the media.

While the UK government intends the law change to send a clear message that upskirting is socially unacceptable, legislation alone can’t do that. Robust enforcement is essential to counter any problematic attitudes that might be encouraging antisocial uses of technology in the first place.

For example, in South Korea a law against upskirting carries a maximum sentence of five years in prison, yet the legislation has failed to curb an epidemic of offences fuelled by cheap access to tiny hidden spy cameras and baked-in societal sexism — the latter seemingly also influencing how police choose to uphold the law, with campaigners complaining that most perpetrators get off with small fines.