Reality Check: Was Facebook data’s value ‘literally nothing’?

There is a huge spectrum of opinion on the value of the Facebook data that Cambridge University academic Aleksandr Kogan gave to Cambridge Analytica’s parent company, SCL.

Dr Kogan told a parliamentary committee: “Given what we know now, nothing, literally nothing – the idea that this data is accurate I would say is scientifically ridiculous.”

On the other hand, there have been suggestions this sort of data will allow computers to gain a profound understanding of people and their preferences.

In a news conference on Tuesday, Cambridge Analytica’s spokesman said the company had also found Dr Kogan’s data set to be “virtually useless”.

The orthodox view among data scientists is that the use of social media data to target adverts on Facebook is in its infancy and not yet hugely effective – but Dr Kogan is going further than that, saying that it was completely without value.

Reality Check has seen Dr Kogan’s unpublished research into the value of predicted personalities for micro-targeting. We judge that he is underselling its value, although he is correct to say that the data was not accurate.

Personality test

Let’s go back to where the data came from and what it included.

Dr Kogan had a personality-testing app on Facebook, on which users would answer questions about themselves and be given scores on the Big Five personality traits used by research psychologists and advertisers: openness, conscientiousness, extraversion, agreeableness and neuroticism.

Dr Kogan says about 270,000 users took this test. Taking the test also gave the app data on all the users’ friends, which created a database of 30 million people and their predicted personality scores, according to Dr Kogan. Facebook puts the figure at up to 87 million.

These personality predictions rest on a simple idea: if people who liked particular brands of sports cars and nightclubs also turned out to be extraverts, then you might predict that other people who liked those things would be extraverts too.

You can see a similar sort of system on the website of the Psychometrics Centre at the University of Cambridge, which attempts to predict your personality test result based on your social media activity.
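As a rough illustration of how a likes-to-personality prediction can work (the data, names and method below are invented for the example; this is not Dr Kogan’s actual model):

```python
# Illustrative sketch: predicting one trait (extraversion) from page likes
# by averaging the known scores of other fans of each page.

# Toy training data: extraversion scores (0 to 1) for users who took the
# test, plus the pages each of them liked. All names and numbers invented.
trained = {
    "alice": (0.9, {"sports_cars", "nightclubs"}),
    "bob":   (0.8, {"nightclubs", "live_music"}),
    "carol": (0.2, {"chess_club", "libraries"}),
}

# Step 1: mean extraversion of the known fans of each page.
page_scores = {}
for score, pages in trained.values():
    for page in pages:
        page_scores.setdefault(page, []).append(score)
page_means = {p: sum(s) / len(s) for p, s in page_scores.items()}

def predict(likes):
    """Predict a new user's extraversion as the mean of their pages' means."""
    known = [page_means[p] for p in likes if p in page_means]
    return sum(known) / len(known) if known else None

# A friend who never took the test, but whose likes the app could see:
print(round(predict({"sports_cars", "live_music"}), 2))
```

Real systems use regression over millions of likes rather than simple averages, but the principle is the same: likes that correlate with a trait in the test-takers are used to score everyone else.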

Inaccurate predictions

Dr Kogan’s research was funded by SCL, the research and communications company that formed Cambridge Analytica. Dr Kogan passed the data, including some of the pages that users had liked, to SCL.

Dr Kogan now says that the data he gave to SCL was useless for targeting adverts on Facebook because individual predictions were too inaccurate.

But some data scientists argue that the overall quality of the personality predictions is not the most important measure.

Part of the point of targeted advertising is to reduce costs by trying to appeal to only a relatively small number of users.

So you might be more interested in people turning up at the extremes of particular personality measures rather than those coming up as being close to average, because they are the ones most likely to exhibit the traits you are targeting.

As such, the overall reliability of the data may be less important than finding groups who may be targeted.

Dr Kogan also argues that trying to assess the personality of an individual gives too large a margin of error, so the predictions are reliable only when averaged across larger groups. But looking at larger groups may still be helpful during an election, when you might be trying to decide where to buy advertising on local radio or where to hold a rally, for example.
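These two points, that the extremes matter more than overall accuracy and that group averages are more reliable than individual predictions, can be shown with a rough simulation (all numbers here are invented for illustration):

```python
import random

random.seed(0)

# Simulated population: a "true" trait score per person, plus a noisy
# prediction of it. The noise is deliberately large, as Dr Kogan describes.
true_scores = [random.gauss(0.5, 0.15) for _ in range(10_000)]
predicted = [t + random.gauss(0, 0.2) for t in true_scores]

# (a) Target only the predicted top 10% rather than everyone: even a noisy
# model concentrates genuinely high scorers in that group.
cutoff = sorted(predicted)[int(0.9 * len(predicted))]
targeted = [t for t, p in zip(true_scores, predicted) if p >= cutoff]
print(f"population mean {sum(true_scores)/len(true_scores):.2f}, "
      f"targeted-group mean {sum(targeted)/len(targeted):.2f}")

# (b) Averaging over a group of 100 cancels most of the individual error.
group = predicted[:100]
group_truth = true_scores[:100]
group_error = abs(sum(group)/len(group) - sum(group_truth)/len(group_truth))
print(f"group-average error {group_error:.3f}")
```

The targeted group scores well above the population average even though each individual prediction is unreliable, which is why inaccurate data can still be commercially useful.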

So Dr Kogan is underselling the value of his dataset. While not all of it would have been useful, parts of it could have been helpful.

Tech Tent: Questions for Zuckerberg and Cambridge

It was a two-day interrogation with dozens of questions – some of them acute, some of them rambling, a few quite bizarre.

On the Tech Tent podcast this week, we zero in on what Mark Zuckerberg failed to answer during his US congressional appearances, about just how much data Facebook collects – and the control users have over it.

We also try to find out whether something bad is going on at the University of Cambridge when it comes to academic use of Facebook data, as Mr Zuckerberg suggested.

  • Stream or download the latest Tech Tent podcast
  • Listen live every Friday at 15.00 GMT on the BBC World Service

The single most uncomfortable moment for Facebook’s founder was probably when Senator Dick Durbin asked him whether he would share with the committee the name of the hotel where he had spent the night in Washington.

After a long pause and an embarrassed grin he answered “umm…no!”

It made the point, according to Senator Durbin, that he was more cautious about his privacy than the average Facebook user who “checks in” without a thought.

The following day, he was asked by Congressman Ben Lujan about the data collected on people who had never even signed up to Facebook. Again, Mr Zuckerberg appeared uncomfortable. He had never heard of the widely used term “shadow profiles” to describe this kind of data collection.

Then the congressman took us down an Alice in Wonderland-style rabbit hole, where people who do not use Facebook are told to log in to their Facebook accounts to find out what data Facebook holds on them. “We’ve got to fix that,” he said.

Frederike Kaltheuner from Privacy International tells Tech Tent that this kind of data collection, with users unaware of what is happening, is all too common – and Facebook is far from the only culprit.

We also examine the issue raised by Mr Zuckerberg when he was asked whether he planned to sue either Dr Aleksandr Kogan or Cambridge University over the misuse of Facebook data.

‘Stronger action’

He talked of a whole programme at the university, where a number of researchers were building apps similar to the one Dr Kogan made for Cambridge Analytica.

“We do need to know whether there was something bad going on at Cambridge University overall that will require a stronger action from us,” he said.

The university fired straight back. Mr Zuckerberg should have known that perfectly respectable academic research into social media had been going on, some of it with the involvement of Facebook employees. And as for Dr Kogan, the university had written to Facebook about its allegations against him but had not received a reply.

On Wednesday morning, before Mr Zuckerberg’s remarks, I visited the Cambridge Psychometrics Centre and found some acknowledgement of the harm caused to the university’s reputation.

The Centre, which is located in the Judge Business School, was drawn into the controversy when Facebook banned Cubeyou, another firm that had developed a personality quiz in collaboration with the university’s academics.

Business development director Vesselin Popov insisted it was opt-in only and was in line with Facebook’s policies at the time, so was not at all like the app developed for Cambridge Analytica by Dr Kogan.

He told me that Dr Kogan’s work had raised issues for the university: “Even if an academic does something – quote unquote in their ‘spare time’, with their own company – they still ought to be held to professional standards as a psychologist.”

Dr Kogan and the Cambridge Psychometrics Centre are in dispute over whether a row over his personality app – and the involvement of the centre’s academics – was about ethics or money. I wrote another article about that issue on Friday.

But the two sides agree that Facebook needs to focus on what commercial businesses do with user data, rather than academics.

“It’s very clear that Cambridge Analytica and these kinds of companies are the product of an environment to which Facebook has contributed greatly,” says Mr Popov. “Although they might be making some changes today in response to public and regulatory pressure, this needs to be seen as an outcome of very permissive attitudes towards those companies.”

With an audit of thousands of Facebook apps under way, we may hear more in the coming weeks about just how cavalier some companies have been with our personal data.

Facebook ‘too slow to deal with hack’, says singer

A musician whose Facebook account was hijacked has urged the company to make it easier for people to recover control of their social media pages.

Country and gospel singer Philippa Hanna said Facebook was difficult to contact and took several days to act.

The attacker changed her contact details and username, locking Ms Hanna out of her own account, and even mocked her followers.

The company says it has a dedicated reporting channel but is investigating.

Ms Hanna has supported Lionel Richie, Leona Lewis and Little Mix on tour and has more than 5,000 friends and 6,000 followers on Facebook.

She was concerned by the hijack because she used her account to promote her music, and her account was linked to another website that stored her bank details.

Automated emails

According to Ms Hanna, South Yorkshire Police told her to look the problem up on Google.

The police force has been contacted by the BBC for comment.

Ms Hanna said she contacted Facebook as soon as she realised what had happened, but found it very difficult to get a response.

“One of the worst things was being stuck in a loop of automated emails telling me to try the same things I had already tried,” she said.

“My friends were trying to report the page, but Facebook kept coming back, saying ‘there’s nothing offensive about this account’.

“There wasn’t the option to say the page had been hijacked. There was a ‘fake account’ option, but mine was not fake. It was stolen.”

Mother ‘unfriended’

Ms Hanna admits that the email address she had used to set up her account was no longer active, so Facebook could not send her a reset link to unlock it.

But she was disappointed that one of Facebook’s automated suggestions was to delete the account.

“After 10 years of building it up, using it for my career as an independent musician, I thought that was not acceptable. It felt like a kick in the guts after 10 years of devoted data entry.”

While the attacker did not make any demands or public posts, the person, who appeared to be logging in from Turkey, did change her friends list and “unfriended” her mother.

The attacker also sent a private message full of laughing emojis to a fan who had messaged the singer about their mental health.

“That was when I got really annoyed – to me this is a public safety issue,” Ms Hanna said.

“I have vulnerable people who trust me and this hacker was mocking that, pretending to be me.”

‘Amazing’ platform

Ms Hanna put a note on Instagram explaining that she had been hacked on Facebook.

When she woke up the following day, she discovered the post had been removed and she had received an email saying somebody had been trying to change her settings.

“It was really eerie – he was censoring my Instagram to keep himself protected.”

She thinks she may have come to the attention of the hacker after a video of her singing an Ed Sheeran song went viral, attracting more than 18 million views.

“I certainly don’t hate Facebook. It’s an amazing platform,” she said.

“But it really needs to give serious thought to how to protect people.”

Dedicated reporting channels

Ms Hanna says she now has her account back.

“The lady who eventually helped me was an angel. There are amazing, clever people at Facebook – but it’s far too hard to get to them,” she said.

“There should be an emergency helpline. I would gladly have paid a premium charge to speak to someone if only it had been an option. It would have been worth doing to protect my followers.”

Facebook said it was investigating what had happened.

It said: “We want everyone to have a positive experience on Facebook which is why we have a dedicated reporting channel on our Help Centre for people to secure their account if they think it has been compromised.”

Facebook to vet UK political ads for May 2019 local elections

Facebook’s chief technology officer is to promise MPs that the social network will act to make political advertising far more transparent for UK users.

Mike Schroepfer will say that his firm will be ready to authorise ads in time for England and Northern Ireland’s May 2019 local elections.

He will make the pledge while giving evidence to a parliament committee.

Facebook had previously committed itself to similar action in the US later this year.

Mr Schroepfer is being questioned as part of the Digital, Culture, Media and Sport Select Committee’s inquiry into fake news.

But the politicians also want to know more about the leak of Facebook data to the political consultancy Cambridge Analytica.

The committee had wanted to hear from Facebook’s founder and chief executive Mark Zuckerberg.

  • Facebook sales soar ‘despite challenges’
  • Was Facebook data’s value ‘literally nothing’?
  • Facebook threw us under bus, says data firm
  • ‘Facebook in PR crisis’ on data row

However, he opted to send other executives to answer questions from politicians outside the US, having given two days of testimony in Washington earlier this month.

Advert archive

In his opening remarks, Mr Schroepfer will tell MPs that he and his boss are deeply sorry about what happened with Cambridge Analytica, which he will describe as a breach of trust.

He will also promise to deploy a new “view ads” button in the UK by June 2018, which will let members see all the adverts any page is showing to users via Facebook, Messenger and Instagram. The company first launched the facility in Canada last October.

In addition, Mr Schroepfer will promise the following will be up and running in time for the 2019 local elections:

  • political ads will only be allowed if they are submitted by authenticated accounts
  • such ads will be labelled as being “political” and it will be made clear who paid for them
  • the adverts will subsequently be placed in a searchable archive for seven years, where information will be provided about how many times they may have been seen and how much money was spent on them

But MPs are likely to have questions about the use of Facebook in past elections, notably the EU referendum, and whether there was any foreign involvement.

They will also want to drill down into the Cambridge Analytica affair and find out whether Facebook has uncovered similar cases during an audit of developer behaviour.

Mark Zuckerberg’s dreaded homework assignments

Over two days, almost 10 hours.

If you watched every moment of Mark Zuckerberg’s testimony in front of Congress this week, you’ll know he rolled out one phrase an awful lot: “I’ll have my team get back to you.”

Now some of these were bits of data Mr Zuckerberg simply didn’t have to hand – such as why a specific advertisement for a political candidate in Michigan didn’t get approved.

Other follow-ups, though, will require some hard graft from his team. What they produce could provide even more negative headlines for the company, as it is forced to divulge more of its inner workings than it has ever felt comfortable with.

Looking through the transcripts, I’ve counted more than 20 instances where Mr Zuckerberg promised to get back to representatives with more information. But these are the assignments I think could cause the company the most headaches – and provide some revealing answers.

1) Data on non-users

Set by: Congressman Ben Lujan (Democrat, New Mexico)

“You’ve said everyone controls their data, but you’re collecting data on people who are not even Facebook users who have never signed a consent, a privacy agreement.”

Dubbed “shadow profiles”, details of exactly what Facebook gathers on people who haven’t even signed up to the service have always been a bit of a mystery.

Even, apparently, to Mr Zuckerberg himself. He testified that he didn’t know the term, but acknowledged the firm did monitor non-users for “security” purposes.

Mr Zuckerberg promised to share more details on what data is gathered on people who don’t sign up for Facebook, as well as a full breakdown of how many data points it has on those who do.

In a related request, Mr Zuckerberg will provide details on how users are tracked (on all their devices) when they are logged out of Facebook.

2) Moving to opt-in, not opt-out

Set by: Congressman Frank Pallone (Democrat, New Jersey)

“I think you should make that commitment.”

Creating new regulation will be an arduous, flawed process. But one thing Facebook could do right now? Move to an opt-in model – one that requires users to actively decide to make something public, rather than having public as the default (and most popular) option for posting content, as it is now.

In a similar vein, Mr Zuckerberg was asked to get back to Congressman Frank Pallone on how the company might consider collecting less information on its users.

3) Repercussions for censorship mistakes

Set by: Congressman Steve Scalise (Republican, Louisiana)

“Was there a directive to put a bias in [the algorithms]? And, first, are you aware of this bias that many people have looked at and analysed and seen?”

One surprising admission made by Mr Zuckerberg before these hearings was that despite acknowledging the company made big mistakes, nobody has been fired over the Cambridge Analytica affair.

Representative Steve Scalise wants to take questions on accountability a step further.

In cases where Facebook reverses a decision to remove content – in effect, admitting it over-moderated – what repercussions do those responsible face? If someone created an algorithm that unfairly filtered certain political views, was there any kind of punishment?

4) Specific rules for minors

Set by: Senator Ed Markey (Democrat, Massachusetts)

“We’re leaving these children to the most rapacious commercial predators in the country who will exploit these children unless we absolutely have a law on the books.”

On Facebook the minimum age of users is 13, not counting the company’s Messenger for Kids app (which doesn’t collect the type of data Facebook’s main app does).

But for those aged 13-18, or maybe 21, what happens in those oh-so-delicate years should be governed by tighter rules, Senator Ed Markey suggested.

Mr Zuckerberg said the idea “deserved a lot of discussion”, but maybe not a new law. He promised to get his team to “flesh out the details”.

5) How many ‘like’ and ‘share’ buttons are out there?

Set by: Congresswoman Debbie Dingell (Democrat, Michigan)

“It doesn’t matter whether you have a Facebook account. Through those tools, Facebook is able to collect information from all of us.”

It seems like everywhere you look there is a button prompting you to “like” or share things on Facebook – indeed, there’s one on the page you’re reading right now.

A request to at least estimate how many of Facebook’s buttons are out there might at first seem like an abstract demand – but the response could be quite something.

The “like” buttons enable Facebook to track users on pages that are not part of Facebook itself, providing more data for advertisers.

If it’s even possible to tot up how many buttons are out there on the web, expect a number in the hundreds of millions – that’s hundreds of millions of pages with which Facebook is tracking your activity beyond its own borders.
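A toy sketch of the mechanism (site names and identifiers below are invented): each page that embeds the button makes the reader’s browser fetch it from the widget’s host, which can log who asked for it and from which page.

```python
# Illustrative sketch of cross-site tracking via an embedded widget.
# When a page embeds the button, the visitor's browser requests it from the
# widget's host, sending along a cookie (a stable visitor ID) and a Referer
# header (the page being read). Over time, that builds a browsing profile.

profiles = {}  # visitor cookie -> pages seen embedding the button

def widget_request(cookie, referer):
    """Record what the widget host learns from one button load."""
    profiles.setdefault(cookie, []).append(referer)

# One visitor browsing three unrelated sites that all embed the button:
for page in ["news.example/politics", "shop.example/trainers",
             "health.example/anxiety"]:
    widget_request(cookie="visitor-42", referer=page)

print(profiles["visitor-42"])
```

The real system involves HTTP requests rather than function calls, but the information flow is the same: the embedding page reveals itself to the button’s host whether or not the visitor ever clicks.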

Cambridge University saw ‘no issue’ with Facebook research

The academic at the centre of Facebook’s data scandal has hit back at Mark Zuckerberg’s suggestion that “something bad” might be going on at Cambridge University.

Dr Aleksandr Kogan, who collected data for Cambridge Analytica, told the BBC that Facebook should be investigating commercial uses of its data, not focusing on academic research.

He also denied that fellow academics had had any “ethical issues” with his work for Cambridge Analytica.

On Wednesday, Mark Zuckerberg said at a congressional hearing that there were a number of Cambridge academics building similar apps to Dr Kogan’s.

He said Facebook needed to know “whether there was something bad going on at Cambridge University”.

Commercial purposes

In an email to the BBC, Dr Kogan said it was true that the Cambridge Psychometrics Centre had developed a personality quiz to collect Facebook data, and that the dataset was shared with academics around the world.

However, he added: “It’s surprising that Facebook would choose to focus its investigation on academics working with other academics. There are tens of thousands of apps [which] had access to the data for commercial purposes.

“I would have thought it makes the most sense to start there.”

On Wednesday, Cambridge University said it was surprised that Mr Zuckerberg had only recently become aware of its research into social media, since it had appeared in peer-reviewed journals.

It said Facebook had not responded to its request for information about the allegations against Dr Kogan.

‘Still representing university’

Dr Kogan also defended himself against criticism by the university’s Psychometrics Centre, which said that even though he had never been connected with it, his commercial activities had reflected on the university as a whole.

Vesselin Popov, the business development director of the Psychometrics Centre, said: “Our opinion is that even if an academic does something ‘in their spare time’ with their own company, they still ought to be held to professional standards as a psychologist because, like it or not, they are still representing that body and the university in doing it.”

Dr Kogan said he was surprised by Mr Popov’s comments, as he had had discussions with academics at the centre about their participation in the project.

“In truth, the Psychometrics Centre never had an ethical issue with the project, as far as I’m aware. To the contrary, my impression was that they very much wanted to be a part of it,” he told the BBC.

He said the relationship went sour only after a dispute over how much the Psychometrics Centre would be paid for its involvement in the project, not over any ethical concerns.

The Psychometrics Centre, which is based at the university’s Judge Business School, rejected Dr Kogan’s version of events.

It said it had complained to the university authorities about his behaviour towards two of its academic staff, not about the monetary issue.

Cambridge University says it has received reassurances from Dr Kogan about his business interests but is now conducting a wide-ranging review of the case.

‘More than 600 apps had access to my iPhone data’

While Facebook desperately tightens controls over how third parties access its users’ data – trying to mend its damaged reputation – attention is focusing on the wider issue of data harvesting and the threat it poses to our personal privacy.

Data harvesting is a multibillion dollar industry and the sobering truth is that you may never know just how much data companies hold about you, or how to delete it.

That’s the startling conclusion drawn by some privacy campaigners and technology companies.

“Thousands of companies are in the business of harvesting your data and tracking your online behaviour,” says Frederike Kaltheuner, data programme lead for lobby group Privacy International.

“It’s a global business. And not just online, but offline, too, via loyalty cards and wi-fi tracking of your mobile. It’s almost impossible to know what’s happening to your data.”

The really big data brokers – firms such as Acxiom, Experian, Quantium, Corelogic, eBureau, ID Analytics – can hold as many as 3,000 data points on every consumer, says the US Federal Trade Commission.

Ms Kaltheuner says more than 600 apps have had access to her iPhone data over the last six years. So she’s taken on the onerous task of finding out exactly what these apps know about her.

“It could take a year,” she says, because it involves poring over every privacy policy then contacting the app provider to ask them. And not taking “no” for an answer.

Not only is it difficult to know what data is out there, it is also difficult to know how accurate it is.

“They got my income totally wrong, they got my marital status wrong,” says Pamela Dixon, executive director of the World Privacy Forum, another privacy rights lobby group.

She was examining her record with one of the merchants that scoop up and sell data on individuals around the globe.

She found herself listed as a computer enthusiast – “which is a bit annoying, I’m not running around buying computers every day” – and as a runner, though she’s a cyclist.

Susan Bidel, senior analyst at Forrester Research in New York, who covers data brokers, says a common belief in the industry is that only “50% of this data is accurate”.

So why does any of this matter?

Because this “ridiculous marketing data”, as Ms Dixon calls it, is now determining life chances.

Consumer data – our likes, dislikes, buying behaviour, income level, leisure pursuits, personalities and so on – certainly helps brands target their advertising dollars more effectively.

But its main use “is to reduce risk of one kind or another, not to target ads,” believes John Deighton, a professor at Harvard Business School who writes on the industry.

We’re all given credit scores these days.

If the information flatters you, your credit cards and mortgages will be much cheaper, and you will pass employment background checks more easily, says Prof Deighton.

But these scores may not only be inaccurate, they may be discriminatory, hiding information about race, marital status, and religion, says Ms Dixon.

“An individual may never realize that he or she did not receive an interview, job, discount, premium, coupon, or opportunity due to a low score,” the World Privacy Forum concludes in a report.

Collecting consumer data has been going on for as long as companies have been trying to sell us stuff.

As far back as 1841, Dun & Bradstreet collected credit information and gossip on possible credit-seekers. In the 1970s, list brokers offered magnetic tapes containing data on a bewildering array of groups: holders of fishing licences, magazine subscribers, or people likely to inherit wealth.

But nowadays, the sheer scale of online data has swamped the traditional offline census and voter registration data.

Much of this data is aggregated and anonymised, but much of it isn’t. And many of us have little or no idea how much data we’re sharing, often because we agree to online terms and conditions without reading them. Perhaps understandably.

Two researchers at Carnegie Mellon University in the US worked out that if you were to read every privacy policy you came across online, it would take you 76 days, reading eight hours a day.
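That figure is easy to sanity-check. The inputs below (policies encountered per year and minutes per policy) are illustrative assumptions chosen for the arithmetic, not the researchers’ own numbers:

```python
# Back-of-the-envelope check of the "76 days" figure.
# Both inputs are illustrative assumptions, not the study's own data.
policies_per_year = 1_460      # roughly four new sites or apps a day
minutes_per_policy = 25        # a long legal document, read carefully

total_hours = policies_per_year * minutes_per_policy / 60
workdays = total_hours / 8     # reading eight hours a day
print(f"{total_hours:.0f} hours, or about {workdays:.0f} eight-hour days")
```

Around 600 hours a year of reading, or roughly 76 working days, which is why in practice almost nobody reads the policies at all.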

And anyway, having to do this “shouldn’t be a citizen’s job”, argues Frederike Kaltheuner. “Companies should have to protect our data as a default.”

Rashmi Knowles from security firm RSA points out that it’s not just data harvesters and advertisers who are in the market for our data.

  • How to protect your Facebook data
  • Facebook suspends Brexit data firm
  • Facebook to warn users in data scandal

“Often hackers can answer your security questions – things like date of birth, mother’s maiden name, and so on – because you have shared this information in the public domain,” she says.

“You would be amazed how easy it is to piece together a fairly accurate profile from just a few snippets of information, and this information can be used for identity theft.”

So how can we take control of our data?

There are ways we can restrict the amount of data we share with third parties – changing browser settings to block cookies, for example, using ad-blocking software, browsing “incognito” or using virtual private networks.

And search engines like DuckDuckGo limit the amount of information they reveal to online tracking systems.

But StJohn Deakins, founder and chief executive of marketing firm CitizenMe, believes consumers should be given the ability to control and monetise their data.

On his app, consumers take personality tests and quizzes voluntarily, then share that data anonymously with brands looking to buy more accurate marketing data to inform their advertising campaigns.

“Your data is much more compelling and valuable if it comes from you willingly in real time. You can outcompete the data brokers,” he says.

“Some of our 80,000 users around the world are making £8 a month or donating any money earned to charities,” says Mr Deakins.

Brands – from German car makers to big retailers – are looking to source data “in an ethical way”, he says.

“We need to make the marketplace for data much more transparent.”

The man with (almost) no data trail

No Facebook account. No Twitter. No Instagram.

No smartphone. No tablet. No online banking.

Just an email account accessed at the local library and a chunky Nokia 3210 with a built-in torch.

Felix, not his real name (as you might expect…), lives without the tools and social media accounts that are woven into most of our lives.

For many of us it’s a love-hate relationship – enjoying the regular social contact with friends and family and the efficiencies but lamenting the banality of much of it and the hours it sucks up.

And most recently there’s the matter of the digital trail we leave behind us, the breadcrumbs that social media companies gather up and sell, as we lose track of who knows what about our movements, our needs, our impulses and behaviours.

Felix, a 33-year-old gardener, has been swimming against the tide for years.

In 2018, it may sound like a staunch political statement to veer away from technology and the internet, he says, but, in truth, he just never fancied it.

As new technology emerged and became mainstream, Felix wasn’t drawn to it.

“They weren’t useful to me. I got along without them, like playing the trombone,” he says.

But the world spun several times and a couple of decades later Felix now finds himself something of an anomaly.

People treat him with a sort of low-key admiration and slight bemusement, he says, and when new people see his phone for the first time “they crack up”.

“Most people think if you can live your life that way, good on you. But most wouldn’t want to live like this,” he says.

And don’t write off Felix as someone with little knowledge of the modern world – he is aware of today’s technology used by others his age.

“I would never say you should throw your Alexa in the bin,” he says.

“But it is easier to have a natural human engagement with the world and other people without layers of technology interfering with that.”

  • How to protect your Facebook data
  • If I’ve got your number, so has Facebook
  • Facebook data – are users to blame?

He does use the internet at the library, going online about twice a week for an hour at a time.

Typically he’ll work through a list of admin tasks, searching for numbers, addresses, or finding out about a new band – music is his passion. Rarely will he venture onto Facebook or Twitter.

“When Facebook came out I was interested that it was becoming so popular. So I had a look at a friend’s profile to see the shape of it – that was enough for me. I didn’t touch it for 10 years,” he says.

Now he might look up a public event advertised on it or scroll through Twitter, albeit without feeling the need to create his own account.

Asked what he does with all the time he saves by avoiding social media, he laughs and calls it a funny question.

“Social media is not a fundamental human need. I’m just not sure people were wandering round in 1995 thinking it’s a crying shame I don’t know what Kim Kardashian had for lunch.”

There are no computers at his family home in Kent, where he lives with his parents, and no tablets or Netflix, just a Freeview box and a TV.

“I don’t have a hunger for the new thing.

“If I don’t see Game of Thrones, I assume it will be around for a long time. It diffuses the immediate requirement to gobble things up,” he says.

But he is on the electoral register and his home number is in the phone book. This type of public data bothers Felix less.

“It’s older, more firmly established. It was not driven by a thirst for monetising thoughts and personality. It was more expressly an exchange,” he says.

He also carries an Oyster travelcard, meaning authorities could track his movements around the capital.

“I don’t love the fact that someone could find out where I’m travelling to in London, but my thinking isn’t fuelled by paranoia.”

Being traceable is an unpleasant symptom of life today, he says, and if he can avoid his life being publicly available, he will.

Felix says he expects to see a few more people leave Facebook and chop up their credit cards as a sort of political statement, but believes the uptake is so complete and the digital footprint so deep that most of us are welded to it – it’s how we process our world.

At times, this interview has been uncomfortable for Felix, who asked us not to share his real name.

“I am quite far away from having a digital identity,” he says. “It’s quite a big step for me to have something documented for me online.”

But he says he can understand the interest in his own non-digital stance, given the recent attention around Facebook and what it does with our data.

News that the social network’s users’ data was harvested and sold without explicit permission did not come as a surprise to Felix, but he does find it distasteful that Facebook packages itself with a friendly face as though it’s more than just a business.

Does he feel relieved that his own data was never at risk?

“There is a small part of me that thinks it’s nice I don’t have to have that on my mind,” he replies.

Felix says he has no intention of changing tack, despite everyone around him, including his older brother, having “normal” attitudes towards technology.

But, surely, there must be something he feels he is missing out on? Breaking news updates, social gossip, looking through pictures of events he has attended?

Finally, he cracks… but just a little.

“I’m indifferent to pictures of that gig, that dinner we went to, but I do have a pang about the wedding pictures I might miss seeing. I don’t get to see them because no hard copies exist of them.”

But is that enough incentive for Felix to get a Facebook account, to surrender to a world of likes and random friend requests? Still, his answer remains an emphatic “no”.

Facebook ‘ugly truth’ growth memo haunts firm

A Facebook executive’s memo that claimed the “ugly truth” was that anything it did to grow was justified has been made public, embarrassing the company.

The 2016 post said that this applied even if it meant people might die as a result of bullying or terrorism.

Both the author and the company’s chief executive, Mark Zuckerberg, have denied they actually believe the sentiment.

But it risks overshadowing the firm’s efforts to tackle an earlier scandal.

Facebook has been under intense scrutiny since it acknowledged that it had received reports that a political consultancy – Cambridge Analytica – had not destroyed data harvested from about 50 million of its users years earlier.

The memo was first made public by the Buzzfeed news site, and was written by Andrew Bosworth.

The 18 June 2016 memo:

So we connect more people.

That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack co-ordinated on our tools.

And still we connect people.

The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good. It is perhaps the only area where the metrics do tell the true story as far as we are concerned.

[…]

That’s why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.

Mr Bosworth – who co-invented Facebook’s News Feed – has held high-level posts at the social network since 2006, and is currently in charge of its virtual reality efforts.

Mr Bosworth has since tweeted that he “didn’t agree” with the post at the time he wrote it, but had shared it with staff to be “provocative”.

“Having a debate around hard topics like these is a critical part of our process and to do that effectively we have to be able to consider even bad ideas,” he added.

Mark Zuckerberg has issued his own statement.

“Boz is a talented leader who says many provocative things,” it said.

“This was one that most people at Facebook including myself disagreed with strongly. We’ve never believed the ends justify the means.”

A follow-up report by the Verge revealed that dozens of Facebook employees had subsequently used its internal chat tools to discuss concerns that such material had been leaked to the media.

Analysis

By Rory Cellan-Jones, Technology correspondent

What immediately struck me about this leaked memo was the line about “all the questionable contact importing practices”.

When I downloaded my Facebook data recently, it was the presence of thousands of my phone contacts that startled me. But the company’s attitude seemed to be that this was normal and it was up to users to switch off the function if they didn’t like it.

What we now know is that in 2016 a very senior executive thought this kind of data gathering was questionable.

So, why is it only now that the company is having a debate about this and other dubious practices?

Until now, Facebook has not been leaky. Perhaps we will soon get more insights from insiders as this adolescent business tries to grow up and come to terms with its true nature.

Fact checking

The disclosure coincided with Facebook’s latest efforts to address public and investor concerns about its management.

Its shares are trading about 14% lower than they were before the Cambridge Analytica scandal began, and several high-profile figures have advocated deleting Facebook accounts.

The company hosted a press conference on Thursday, at which it said it had:

  • begun fact-checking photos and videos posted in France, and would expand this to other countries soon
  • developed a new investigative tool to detect fake accounts and prevent harmful election-related activities
  • started work on a public archive that will make it possible for journalists and others to investigate political ads posted to its platform

In previous days it had also announced a revamp of its privacy settings, and said it would restrict the amount of data exchanged with businesses that collect information on behalf of advertisers.

The latest controversy is likely, however, to provide added ammunition for critics.

CNN reported earlier this week that Mr Zuckerberg had decided to testify before Congress “within a matter of weeks” after refusing a request to do so before UK MPs. However, the BBC has been unable to independently verify whether he will answer questions in Washington.

Facebook’s Zuckerberg fires back at Apple’s Tim Cook

Facebook’s chief executive has defended his leadership following criticism from his counterpart at Apple.

Mark Zuckerberg said it was “extremely glib” to suggest that, because the public did not pay to use Facebook, the firm did not care about them.

Last week, Apple’s Tim Cook said it was an “invasion of privacy” to traffic in users’ personal lives.

And when asked what he would do if he were Mr Zuckerberg, Mr Cook replied: “I wouldn’t be in that situation.”

Facebook has faced intense criticism after it emerged that it had known for years that Cambridge Analytica had harvested data from about 50 million of its users, but had relied on the political consultancy to self-certify that it had deleted the information.

Channel 4 News has since reported that at least some of the data in question is still in circulation despite Cambridge Analytica insisting it had destroyed the material.

Mr Zuckerberg was asked about Mr Cook’s comments during a lengthy interview given to news site Vox about the privacy scandal.

He also acknowledged that Facebook was still not transparent enough about some of the choices it had taken, and floated the idea of an independent panel being able to override some of its decisions.

‘Dire situation’

Mr Cook has spoken in public twice since Facebook’s data-mining controversy began.

On 23 March, he took part in the China Development Forum in Beijing.

“I think that this certain situation is so dire and has become so large that probably some well-crafted regulation is necessary,” news agency Bloomberg quoted him as saying in response to a question about the social network’s problems.

“The ability of anyone to know what you’ve been browsing about for years, who your contacts are, who their contacts are, things you like and dislike and every intimate detail of your life – from my own point of view it shouldn’t exist.”

Then in an interview with MSNBC and Recode on 28 March, Mr Cook said: “I think the best regulation is no regulation, is self-regulation. However, I think we’re beyond that here.”

During this second appearance – which has yet to be broadcast in full – he added: “We could make a tonne of money if we monetised our customer, if our customer was our product. We’ve elected not to do that… Privacy to us is a human right.”

Apple makes most of its profits from selling smartphones, tablets and other computers, as well as associated services such as online storage and its various media stores.

This contrasts with other tech firms whose profits are largely derived from advertising, including Google, Twitter and Facebook.

Mr Zuckerberg had previously told CNN that he was “open” to new regulations.

But he defended his business model when questioned about Mr Cook’s views, although he mentioned neither Apple nor its leader by name.

“I find that argument, that if you’re not paying that somehow we can’t care about you, to be extremely glib and not at all aligned with the truth,” he said.

“The reality here is that if you want to build a service that helps connect everyone in the world, then there are a lot of people who can’t afford to pay.”

He added: “I think it’s important that we don’t all get Stockholm syndrome and let the companies that work hard to charge you more convince you that they actually care more about you, because that sounds ridiculous to me.”

Mr Zuckerberg also defended his leadership by invoking Amazon’s chief executive.

“I make all of our decisions based on what’s going to matter to our community and focus much less on the advertising side of the business,” he said.

“I thought Jeff Bezos had an excellent saying: ‘There are companies that work hard to charge you more, and there are companies that work hard to charge you less.’”

‘Turned into a beast’

Elsewhere in the 49-minute interview, Mr Zuckerberg said he hoped to make Facebook more “democratic” by giving members a chance to challenge decisions its own review team had taken about what content to permit or ban.

Eventually, he said, he wanted something like the “Supreme Court”, in which people who did not work for the company made the ultimate call on what was acceptable speech.

Mr Zuckerberg also responded to recent criticism from a UN probe into allegations of genocide against the Rohingya Muslims in Myanmar.

Last month, one of the human rights investigators said Facebook had “turned into a beast” and had “played a determining role” in stirring up hatred against the group.

Mr Zuckerberg claimed messages had been sent “to each side of the conflict” via Facebook Messenger, attempting to make them go to the same locations to fight.

But he added that the firm had now set up systems to detect such activity.

“We stop those messages from going through,” he added.

“But this is certainly something that we’re paying a lot of attention to.”