Battle Royale: Fortnite hit by server outage

A lengthy server outage that prevented gamers playing Fortnite’s Battle Royale mode has been resolved.

The title, which was developed by US-based Epic Games, began experiencing problems on Wednesday. It has more than 45 million players worldwide.

The issue meant that some gamers were unable to play the last-player-standing challenge on Xbox, PlayStation, PC or mobile.

A statement by Epic had blamed the glitch on a database fault.

“Login and game service issues have been resolved and we’ve returned to a healthy state,” it announced shortly after 20:00 BST.

Soon after, it added that a related email notifications backlog had also been corrected.

Several of the game’s fans had voiced their displeasure on social media during the near day-long outage.

But one expert said such glitches were not unusual.

“It’s not ideal from their perspective because they have millions of players that want to play the game,” said Piers Harding-Rolls, head of games at analyst firm IHS Markit.

“But the most important thing is that people [were] given regular updates.”

‘Addictive’

Fortnite launched last June as a paid game.

However, the Battle Royale mode – which pits 100 players against each other – is free to play.

It makes money by selling costumes and other cosmetic in-game items.

These generated more than $15m (£10.5m) on iOS alone over the game’s first three weeks on Apple’s mobile platform – which it joined on 15 March – according to one analytics firm.

Some parents have raised concerns about the amount of time their children are spending within Fortnite’s cartoon-like fights.

However, one academic recently cautioned against describing the game as being “addictive”.

Fortnite has been nominated for two prizes at Thursday evening’s Bafta Games Awards.

Google loses ‘right to be forgotten’ case

A businessman fighting for the “right to be forgotten” has won a UK High Court action against Google.

The man, who has not been named due to reporting restrictions surrounding the case, wanted search results about a past crime he had committed removed from the search engine.

The judge, Mr Justice Mark Warby, ruled in his favour on Friday.

But he rejected a separate claim made by another businessman who had committed a more serious crime.

The businessman who won his case was convicted 10 years ago of conspiring to intercept communications. He spent six months in jail.

The other businessman, who lost his case, was convicted more than 10 years ago of conspiring to account falsely. He spent four years in jail.

Both had asked Google to remove search results about their convictions, including links to news articles, arguing that they were no longer relevant.

They took Google to court when it refused to remove the search results.

Google said it would accept the rulings.

“We work hard to comply with the right to be forgotten, but we take great care not to remove search results that are in the public interest,” it said in a statement.

“We are pleased that the Court recognised our efforts in this area, and we will respect the judgements they have made in this case.”

‘Legal precedent’

The right to be forgotten is a legal precedent set by the Court of Justice of the European Union in 2014, following a case brought by Spaniard Mario Costeja Gonzalez who had asked Google to remove information about his financial history.

Google says it has removed 800,000 pages from its results following so-called “right to be forgotten” requests. However, search engines can decline to remove pages if they judge them to remain in the public interest.

Explaining the decisions made on Friday, the judge said one of the men had continued to “mislead the public” while the other had “shown remorse”.

The Open Rights Group, which campaigns for internet freedoms, said the rulings set a “legal precedent”.

“The right to be forgotten is meant to apply to information that is no longer relevant but disproportionately impacts a person,” said Jim Killock, executive director.

“The Court will have to balance the public’s right to access the historical record, the precise impacts on the person, and the public interest.”

Facebook to vet UK political ads for May 2019 local elections

Facebook’s chief technology officer is to promise MPs that the social network will act to make political advertising far more transparent for UK users.

Mike Schroepfer will say that his firm will be ready to authorise ads in time for England and Northern Ireland’s May 2019 local elections.

He will make the pledge while giving evidence to a parliamentary committee.

Facebook had previously committed itself to similar action in the US later this year.

Mr Schroepfer is being questioned as part of the Digital, Culture, Media and Sport Select Committee’s inquiry into fake news.

But the politicians also want to know more about the leak of Facebook data to the political consultancy Cambridge Analytica.

The committee had wanted to hear from Facebook’s founder and chief executive Mark Zuckerberg.

However, he opted to send other executives to answer questions from politicians outside the US, having given two days of testimony in Washington earlier this month.

Advert archive

In his opening remarks, Mr Schroepfer will tell MPs that he and his boss are deeply sorry about what happened with Cambridge Analytica, which he will describe as a breach of trust.

He will also promise to deploy a new “view ads” button in the UK by June 2018, which will let members see all the adverts any page is showing to users via Facebook, Messenger and Instagram. The company first launched the facility in Canada last October.

In addition, Mr Schroepfer will promise the following will be up and running in time for the 2019 local elections:

  • political ads will only be allowed if they are submitted by authenticated accounts
  • such ads will be labelled as being “political” and it will be made clear who paid for them
  • the adverts will subsequently be placed in a searchable archive for seven years, where information will be provided about how many times they may have been seen and how much money was spent on them

But MPs are likely to have questions about the use of Facebook in past elections, notably the EU referendum, and whether there was any foreign involvement.

They will also want to drill down into the Cambridge Analytica affair and find out whether Facebook has uncovered similar cases during an audit of developer behaviour.


Mark Zuckerberg’s dreaded homework assignments

Over two days, almost 10 hours.

If you watched every moment of Mark Zuckerberg’s testimony in front of Congress this week, you’ll know he rolled out one phrase an awful lot: “I’ll have my team get back to you.”

Now some of these were bits of data Mr Zuckerberg simply didn’t have to hand – such as why a specific advertisement for a political candidate in Michigan didn’t get approved.

Other follow-ups, though, will require some hard graft from his team. What they produce could provide even more negative headlines for the company, as it is forced to divulge more of its inner workings than it has ever felt comfortable with.

Looking through the transcripts, I’ve counted more than 20 instances where Mr Zuckerberg promised to get back to representatives with more information. But these are the assignments I think could cause the company the most headaches – and provide some revealing answers.

1) Data on non-users

Set by: Congressman Ben Lujan (Democrat, New Mexico)

“You’ve said everyone controls their data, but you’re collecting data on people who are not even Facebook users who have never signed a consent, a privacy agreement.”

Dubbed “shadow” profiles, details of exactly what Facebook gathers on people who haven’t even signed up to the service have always been a bit of a mystery.

Even, apparently, to Mr Zuckerberg himself. He testified that he didn’t know the term, but acknowledged the firm did monitor non-users for “security” purposes.

Mr Zuckerberg promised to share more details on what data is gathered on people who don’t sign up for Facebook, as well as a full breakdown of how many data points it has on those who do.

In a related request, Mr Zuckerberg will provide details on how users are tracked (on all their devices) when they are logged out of Facebook.

2) Moving to opt-in, not opt-out

Set by: Congressman Frank Pallone (Democrat, New Jersey)

“I think you should make that commitment.”

Creating new regulation will be an arduous, flawed process. But one thing Facebook could do right now? Move to an opt-in model, one which requires users to actively choose to make something public, rather than treating public sharing as the default (and most popular) option for posting content, as it is now.

In a similar vein, Mr Zuckerberg was asked to get back to Congressman Frank Pallone on how the company might consider collecting less information on its users.

3) Repercussions for censorship mistakes

Set by: Congressman Steve Scalise (Republican, Louisiana)

“Was there a directive to put a bias in [the algorithms]? And, first, are you aware of this bias that many people have looked at and analysed and seen?”

One surprising admission made by Mr Zuckerberg before these hearings was that despite acknowledging the company made big mistakes, nobody has been fired over the Cambridge Analytica affair.

Representative Steve Scalise wants to take questions on accountability a step further.

In cases where Facebook reverses a decision to remove content – i.e. admitting it over-moderated – what kind of repercussions did those responsible face? If someone created an algorithm that unfairly filtered certain political views, was there any kind of punishment?

4) Specific rules for minors

Set by: Senator Ed Markey (Democrat, Massachusetts)

“We’re leaving these children to the most rapacious commercial predators in the country who will exploit these children unless we absolutely have a law on the books.”

On Facebook the minimum age of users is 13, not counting the company’s Messenger for Kids app (which doesn’t collect the type of data Facebook’s main app does).

But for those aged 13-18, or maybe 21, what happens in those oh-so-delicate years should be protected by tighter rules, Senator Ed Markey suggested.

Mr Zuckerberg said the idea “deserved a lot of discussion”, but maybe not a new law. He promised to get his team to “flesh out the details”.

5) How many ‘like’ and ‘share’ buttons are out there?

Set by: Congresswoman Debbie Dingell (Democrat, Michigan)

“It doesn’t matter whether you have a Facebook account. Through those tools, Facebook is able to collect information from all of us.”

It seems like everywhere you look there is a button prompting you to “like” or share things on Facebook – indeed, there’s one on the page you’re reading right now.

A request to at least estimate how many of Facebook’s buttons are out there might at first seem like an abstract demand – but the response could be quite something.

The “like” buttons enable Facebook to track users on pages that are not part of Facebook itself, providing more data for advertisers.

If it’s even possible to tot up how many buttons are out there on the web, expect a number in the hundreds of millions – that’s hundreds of millions of pages with which Facebook is tracking your activity beyond its own borders.


Bug hunters: The hackers earning big bucks… ethically

The term hacker is often used pejoratively, but the ability to spot weaknesses in companies’ software and cyber-security systems is in high demand. Ethical hackers are now earning big bucks and the industry is growing.

James Kettle is a bug hunter – not of the insect kind, but of software.

He scans through pages of code looking for mistakes – weaknesses that criminals could exploit to break into a company’s network and steal data.

His computer science degree was a little slow-paced for his tastes so he looked around for something else to do and came across “bug bounty” programmes run by Google and browser maker Mozilla.

These are schemes that pay cash to hackers for spotting mistakes, or bugs, in companies’ software.

“They really made you work hard for each one and it took about 50 hours per valid bug I found,” he recalls.

The payoff, apart from the cash, was that he was struck by an insatiable desire to keep finding flaws in code. And this eventually turned into a lucrative career.

And he’s very good at his job.


What you need to find bugs

  • Insatiable curiosity
  • Solid technical expertise in web and networking technologies
  • Patience and dedication
  • Puzzle-solving abilities

He’s now one of the top-earning bug finders on Hacker One, a service that matches hackers with companies and governments looking for experts to test their software.

These elite ethical or “white hat” hackers can earn more than $350,000 (£250,000) a year. Bug bounty programmes award hackers an average of $50,000 a month, with some paying out $1,000,000 a year in total, say industry insiders.

Finding a bug that has never been found before is very rare and can lead to significant payouts, perhaps in the hundreds of thousands.

Mr Kettle works for software company PortSwigger, which makes the Burp Suite tool that many hackers use to probe websites to see if they are ripe for exploitation.

“I find new ways of hacking into websites and automating that, and I use bug bounties to prove my new techniques work,” Mr Kettle tells the BBC.

“It’s fun and challenging.”

Most software contains mistakes because it’s been written by fallible humans, and criminals are constantly scanning code for these vulnerabilities, often using automated tools.

So it’s a race to find these weaknesses before the bad guys, or “black hat” hackers, do.

The problem is that, until recently, few firms had enough eyes to throw at it. So they’ve been crowdsourcing expert help from firms such as Hacker One, Bug Crowd and Synack.

These act like agents for vetted ethical hackers, managing the bug bounty programmes, verifying the work done, and ensuring confidentiality for their clients.

Hacker One, the largest of the three best-known bug bounty firms, has more than 120,000 hackers on its books and has paid out more than $26m (£18.5m) so far, says Laurie Mercer, a senior engineer at the firm.

“Bug bounty programmes offer a way for organisations to ‘outsource’ application security testing, but it comes at a cost,” says Bob Egner, vice-president at security firm Outpost24.

“You have to pay a crowdsource bug bounty vendor to introduce your application to their independent researchers, manage the programme for you, and ultimately pay for any bounties required.”

But the risk of not doing enough to find these vulnerabilities is a potential hack attack resulting in stolen data, financial loss and damaged reputation. According to a recent report by security firm Nuix, 71% of black hat hackers say they can breach the perimeter of a target within 10 hours.

Swedish bug hunter Frans Rosen is using his bounty income to fund tech start-ups.

“We use the bug bounty money as the seeding investment,” he says. “It’s a fun way to use the money.”

The cash enables the start-ups to get established and do some development of their products or apps, he says. As a former web developer, he knows what can go wrong when websites are being set up and run.

“After that we help them get the scale investment to fund them properly,” he says.

Not all hackers who find bugs work for an established security firm, however, so being represented by a company such as Hacker One or Bug Crowd gives them credibility when they want to alert companies to security vulnerabilities.

Security tester Robbie Wiggins says telling a firm that its website or apps can be hacked is always tricky.

Often there is no formal reporting structure, he says, apart from a generic admin email address. Bug bounty firms help get the error reports in front of the right people.

But the rapid growth in bug bounty programmes and the significant cash rewards have made it a crowded field, he says.

“It’s constantly changing and finding bugs is getting harder.”

So he specialises in finding firms that have made mistakes with their Amazon cloud storage accounts. So far, he’s found more than 5,000 that look like they are wrongly open to the public.
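
The article does not say how such open buckets are found, but the usual technique is straightforward: send an unauthenticated request to a bucket’s public URL and interpret the response. A minimal sketch of the interpretation step, based on S3’s documented status codes (the function name is invented, and a real audit tool would also fetch and parse the response body rather than rely on the status alone):

```python
def classify_bucket(status_code: int) -> str:
    """Interpret an unauthenticated GET on an S3 bucket URL.

    Per Amazon S3's documented behaviour: 200 means a bucket listing
    was returned (publicly readable), 403 means the bucket exists but
    denies anonymous access, 404 means no such bucket. A sketch only.
    """
    if status_code == 200:
        return "public"    # anyone can list the bucket's contents
    if status_code == 403:
        return "private"   # exists, but anonymous access is denied
    if status_code == 404:
        return "missing"
    return "unknown"

print(classify_bucket(200))  # public
print(classify_bucket(403))  # private
```

A scanner of the kind described would apply a check like this to thousands of candidate bucket names and report only the “public” ones.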

“Bug bounty hunting is now a hobby and helps every now and again when I need some extra cash for the kids,” he says.

Another advantage of such programmes is that they can keep hackers away from the dark side.

“Bug bounty programmes provide a legal alternative for tech-savvy individuals who might otherwise be inclined to the nefarious activities of actually hacking a system and selling its data illegally,” says Terry Ray, chief technology officer for data security firm Imperva.

Perhaps it’s time more hackers came in from the cold?


What Remains of Edith Finch wins Bafta’s top games award

What Remains of Edith Finch has scored an upset, winning the top prize at the Bafta Games Awards.

The first-person mystery adventure was developed by the US indie studio Giant Sparrow.

It had not won any of the other categories before taking Best Game.

Hellblade: Senua’s Sacrifice had been the favourite to win, having attracted the most nominations. It took the British Game prize and four other awards at the London event.

Giant Sparrow previously won a Bafta in 2013 when The Unfinished Swan won Debut Game.

A big well done to what remains of Edith finch. Best game award was unexpected, but now I will definitely have to have a play. #BAFTAGames

— TorvirNwaka (@TorvirN) April 12, 2018

Amazing happy surprise to see What Remains of Edith Finch win Best Game at #BAFTAGames
♡ pic.twitter.com/oab0iNqX3f

— Jodie Azhar (@JodieAzhar) April 12, 2018

What Remains of Edith Finch deserves awards. It isn’t better than Zelda. But still. Deserving winner

— Christopher Dring (@Chris_Dring) April 12, 2018

Record demand

Sales of video games, consoles, PC gaming add-ons and other related products topped £5.1bn in the UK last year, according to trade body Ukie.

That marked a record high and a 12.4% improvement on the previous year.

The launches of the Nintendo Switch and Microsoft’s Xbox One X helped drive interest.

But challenges facing the sector include the price of graphics cards – which has been inflated by demand from the crypto-currency industry – as well as slower-than-forecast sales of virtual reality equipment.

The winners by category:

  • Best Game: What Remains of Edith Finch
  • Original Property: Horizon Zero Dawn
  • Music: Cuphead
  • Game Design: Super Mario Odyssey
  • Evolving Game: Overwatch
  • Narrative: Night In The Woods
  • Game Beyond Entertainment: Hellblade: Senua’s Sacrifice
  • Debut Game: Gorogoa
  • Family: Super Mario Odyssey
  • Mobile Game: Golf Clash
  • Artistic Achievement: Hellblade: Senua’s Sacrifice
  • Multiplayer: Divinity: Original Sin 2
  • Audio Achievement: Hellblade: Senua’s Sacrifice
  • Game Innovation: The Legend of Zelda: Breath of the Wild
  • British Game: Hellblade: Senua’s Sacrifice
  • Performer: Melina Juergens as Senua in Hellblade: Senua’s Sacrifice

Cambridge University saw ‘no issue’ with Facebook research

The academic at the centre of Facebook’s data scandal has hit back at Mark Zuckerberg’s suggestion that “something bad” might be going on at Cambridge University.

Dr Aleksandr Kogan, who collected data for Cambridge Analytica, told the BBC that Facebook should be investigating commercial uses of its data, not focusing on academic research.

He also denied that fellow academics had had any “ethical issues” with his work for Cambridge Analytica.

On Wednesday, Mark Zuckerberg said at a congressional hearing that there were a number of Cambridge academics building similar apps to Dr Kogan’s.

He said Facebook needed to know “whether there was something bad going on at Cambridge University”.

Commercial purposes

In an email to the BBC, Dr Kogan said it was true that the Cambridge Psychometrics Centre had developed a personality quiz to collect Facebook data, and that the dataset was shared with academics around the world.

However, he added: “It’s surprising that Facebook would choose to focus its investigation on academics working with other academics. There are tens of thousands of apps [which] had access to the data for commercial purposes.

“I would have thought it makes the most sense to start there.”

On Wednesday, Cambridge University said it was surprised that Mr Zuckerberg had only recently become aware of its research into social media, since it had appeared in peer-reviewed journals.

It said Facebook had not responded to its request for information about the allegations against Dr Kogan.

‘Still representing university’

Dr Kogan also defended himself against criticism by the university’s Psychometrics Centre, which said that even though he had never been connected with it, his commercial activities had reflected on the university as a whole.

Vesselin Popov, the business development director of the Psychometrics Centre, said: “Our opinion is that even if an academic does something ‘in their spare time’ with their own company, they still ought to be held to professional standards as a psychologist because, like it or not, they are still representing that body and the university in doing it.”

Dr Kogan said he was surprised by Mr Popov’s comments as he had discussions with academics at the centre about their participation in the project.

“In truth, the Psychometrics Centre never had an ethical issue with the project, as far as I’m aware. To the contrary, my impression was that they very much wanted to be a part of it,” he told the BBC.

He said the relationship went sour only after a dispute over how much the Psychometrics Centre would be paid for its involvement in the project, not over any ethical concerns.

The Psychometrics Centre, which is based at the university’s Judge Business School, rejected Dr Kogan’s version of events.

It said it had complained to the university authorities about his behaviour towards two of its academic staff, not about the monetary issue.

Cambridge University says it has received reassurances from Dr Kogan about his business interests but is now conducting a wide-ranging review of the case.

TED 2018: Alphabet firm’s tools to combat extremism

The online methods used by Jihadist groups to recruit members, and the way Google is countering them, have been revealed at the TED (Technology, Entertainment and Design) conference.

The research director of Alphabet-owned Jigsaw gave a talk outlining the tools the firm is developing to counter extremist content and harassment.

Google faces scrutiny over how it deals with both.

It recently announced more human moderators to remove such content.

Yasmin Green revealed that an eight-week trial, which targeted those searching for Jihadist material on Google with adverts and videos offering alternative views, had reached 300,000 potential recruits.

Jigsaw’s Redirect Method provided links to anti-extremist content including messages of peace from clerics, videos from Isis defectors and smartphone footage from those living in Isis-controlled areas.

Such links popped up when anyone typed a search enquiry about Jihadism into Google.

Ms Green described the sophisticated ways in which groups such as Isis recruit people online, including offering propaganda videos in many different languages.

“They had a video in sign language. They took the time to make sure their message reached the hard-of-hearing,” she said.

Iranian-born Ms Green said she had spent time in Iraq meeting young people who had joined Isis and defected, to better understand what motivated them.

“I talked to a 23-year-old who had trained as a suicide bomber before he defected and I asked him if he had known everything that he now knows whether he would still have joined and he said ‘yes’.”

“He was so brainwashed that he wasn’t taking in contradictory information.”

Hate speech

She also talked about the need to develop “empathetic technology” to counter online abuse, which she said worked in a similar way to Jihadist propaganda.

“Online harassment also wants to work out what resonates with another human being but not to recruit them, rather to cause them pain.”

“It is a perverse art of working out what makes people angry or afraid and then pushing those pressure points.”

Perspective, a tool Jigsaw developed in partnership with Wikipedia and the New York Times, is an artificial intelligence system which is learning to better understand “the emotional impact of language” in order to root out abuse.

The tool has been criticised since it was launched, with many pointing out that its ability to detect hate speech is limited.

According to Quartz, the online hate-speech detector rated “garbage truck” 78% toxic, while “race war now” was only found to be 24% toxic.
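
Perspective’s real models are proprietary, but the kind of false positive Quartz found is easy to reproduce with any scorer that rates words out of context. A deliberately naive sketch with invented word weights (an illustration of the failure mode, not Perspective’s actual method):

```python
# Invented per-word "toxicity" weights (0-100), for illustration only.
TOXIC_WEIGHTS = {
    "garbage": 90,  # often appears in insults, so it scores high alone
    "war": 50,
    "truck": 10,
    "race": 10,
    "now": 0,
}

def toxicity(phrase: str) -> float:
    """Average the per-word weights -- context is ignored entirely."""
    words = phrase.lower().split()
    if not words:
        return 0.0
    return sum(TOXIC_WEIGHTS.get(w, 0) for w in words) / len(words)

# The harmless phrase outscores the threatening one, as in Quartz's test:
print(toxicity("garbage truck"))  # 50.0
print(toxicity("race war now"))   # 20.0
```

Any model that leans too heavily on individual word statistics inherits this weakness, which is one reason context-aware abuse detection remains hard.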

Despite the criticism, Ms Green remains convinced that such technology can provide a solution to “reinvigorate the spaces online that most of us have given up on”.

Tesla ‘removed’ from fatal car crash probe

Tesla has been removed from the investigation into the fatal crash of one of its semi-autonomous vehicles.

In March, a Tesla vehicle operating in Autopilot mode crashed in California, killing the driver. Tesla has suggested the driver was at fault.

On Thursday, the National Transportation Safety Board said Tesla had “violated the party agreement by releasing investigative information”.

Tesla says it decided to remove itself from the investigation.

Autopilot

Tesla’s Autopilot is a semi-autonomous mode in which the car controls its own steering and speed.

The company has always stressed that drivers must pay attention to the road and keep their hands on the steering wheel.

On 23 March, a Tesla vehicle crashed into a roadside barrier in California, killing 38-year-old Walter Huang.

Tesla said the vehicle was in Autopilot mode at the time of the crash.

“The driver had received several visual and one audible hands-on warning earlier in the drive,” it said in a statement.

“The crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”

Disservice

The NTSB has accused Tesla of releasing “incomplete information”.

It said such statements “often lead to speculation and incorrect assumptions about the probable cause of a crash, which does a disservice to the investigative process”.

Robert Sumwalt, NTSB chairman, said: “We decided to revoke Tesla’s party status and informed Mr [Elon] Musk in a phone call and via letter.

“While we understand the demand for information that parties face during an NTSB investigation, uncoordinated releases of incomplete information do not further transportation safety or serve the public interest.”

Tesla said it had decided to remove itself from the investigation.

The company has accused the NTSB of breaching its own rules by releasing statements about the crash, despite telling Tesla not to do so.

“We don’t believe this is right and we will be making an official complaint to Congress,” a spokesperson told the BBC.

Tesla added: “Last week, in a conversation with the NTSB, we were told that if we made additional statements before their 12-24 month investigative process is complete, we would no longer be a party to the investigation agreement.

“On Tuesday, we chose to withdraw from the agreement and issued a statement to correct misleading claims that had been made about Autopilot.”

TED 2018: The smart home that spied on its owner

For two months in early 2018, technology journalist Kashmir Hill let innocent household items spy on her.

She had turned her one-bedroom apartment into a “smart home” and was measuring how much data was being collected by the firms that made the devices.

Her smart toothbrush betrayed when she had not brushed her teeth, her television revealed when she had spent the day bingeing on programmes, and her smart speaker spoke to the world’s largest online retailer every day.

It was like living in a “commercial, surveillance state” with “not a single hour of digital silence”, she said.

Ms Hill, who reports for the technology news website Gizmodo, gave a TED talk describing her experience.

Her colleague Surya Mattu had built a special wi-fi router to monitor the devices listening to her life. They found that she was giving away a lot of information.

“The Amazon Echo [a smart speaker] talked to Amazon servers every three minutes and the TV was sending information about every show we watched on Hulu, which was in turn shared with data brokers.”
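
The monitoring rig itself is not described in detail, but the analysis it enables can be sketched simply: given a log of outbound connections captured at the router, count how often each device phones home. A hypothetical example with made-up records (not Gizmodo’s actual tooling):

```python
from collections import Counter

# Hypothetical router log: (minutes since start, device, destination).
log = [
    (0, "echo", "amazon.com"), (3, "echo", "amazon.com"),
    (6, "echo", "amazon.com"), (9, "echo", "amazon.com"),
    (5, "tv", "hulu.com"), (65, "tv", "tracker.example"),
]

def contacts_per_device(records):
    """Count outbound contacts per device, to spot chatty hardware."""
    return Counter(device for _, device, _ in records)

def average_interval(records, device):
    """Mean minutes between successive contacts from one device."""
    times = sorted(t for t, d, _ in records if d == device)
    gaps = [b - a for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps) if gaps else None

print(contacts_per_device(log))       # Counter({'echo': 4, 'tv': 2})
print(average_interval(log, "echo"))  # 3.0 -- every three minutes
```

Tallies like these, run over two months of real traffic, are what turn a pile of captured packets into the findings the talk describes.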

But perhaps more worrying than the data she could track was the vast amount that she could not.

“With the other data I don’t know ultimately where it was shared,” she said.

The lack of transparency about what happens to the huge amount of consumer data that is sucked out of smart devices and social networks every day has been in sharp focus in the last few weeks.

Facebook remains under intense scrutiny after it was revealed that up to 87 million Facebook users may have had their profile information accessed by marketing firm Cambridge Analytica without their knowledge.

But while some consumers are prepared to part with their data for the convenience of access to free services such as Facebook and Google, Ms Hill did not feel this was true of her smart experiment.

“My smart home was not convenient. Things didn’t work, the smart coffee was horrible, Alexa didn’t understand us and my take-away was that the privacy trade-off was not worth it.”

Facebook may currently be in the spotlight, but it is by no means the first to be caught out over the mishandling of user data.

In 2017, smart TV manufacturer Vizio agreed to pay $2.2m to settle a lawsuit brought by the US Federal Trade Commission over charges that the company installed software on 11 million of its smart TVs to collect viewing data, without informing customers or seeking their consent.

It also gathered each household’s IP address, nearby wi-fi access points and postcode, and shared that information with other companies to target advertisements at Vizio TV owners.

And in August 2016, in a particularly intimate example of data misuse, hackers at the Def Con security conference revealed that Standard Innovation’s We-Vibe smart vibrators transmitted user data – including heat level and vibration intensity – to the company in real time.

“It is interesting that the issue has coalesced around Facebook but it is a much wider issue,” said Ms Hill.

“We use platforms on our smartphones and social networks that introduce us to third-party apps and we haven’t yet come to terms with what this means, and how much responsibility the companies have to vet these apps and keep us and our data safe.”

That is all about to change in Europe with the introduction of the General Data Protection Regulation (GDPR), which promises consumers far greater control over their data.

Currently the situation in the US is very different. Citizens do not have the right to access the information that companies have stored on them.

However, California, which is home to most of the biggest tech giants, is currently considering a law that would give users access to their data and let them ask firms not to sell it.

For Ms Hill, the changes in Europe cannot come soon enough.

“I absolutely hope that GDPR has a trickle-down effect on the US,” she said.

Meanwhile, she is not willing to totally abandon her smart home experiment.

“We will keep the Echo and the smart TV. I don’t love all this stuff but it is going to stay in our home.

“What I hope is that we can make better products in future – devices with privacy protections built-in.”