And with that, Collins closes the committee, until 3:30, when it will take evidence from the information commissioner.
And finally, Charlie Angus, Canada’s representative, brings up Facebook’s inflated video metrics, overstated for two years. “I would consider that corporate fraud, on a massive scale,” he says, “and the best fix is anti-trust. The simplest form of regulation would be to break Facebook up, or treat it as a utility, so that we can all be sure that we’re counting metrics that are accurate or true. To allow you to gobble up all the competition is not good.”
Allan says, “It depends on the problem we’re trying to solve,” and Angus counters that “the problem is Facebook; everything else is just a symptom.”
Allan: “Unless you’re going to turn off the internet, I’m not sure people would be better off in doing without Facebook offering the services it’s spent 15 years perfecting how to offer.”
France’s Morin-Desailly asks how Facebook can restore trust. Is Facebook prepared to re-found its economic model? Allan says that the company really will be sharing information with academics, in an attempt to help them prove that it’s trustworthy.
He adds that, on the business model, it would be nice if the company could just charge people the cost of doing business, but that in practice people like to use things for free, and advertising is the best way to do that.
Latvia’s Inese Lībiņa-Egnere asks how Facebook can help countries like Latvia, that face specific threats from Russia.
Allan says that Facebook is partnering with the Atlantic Council, which monitors that sort of threat; and says that the company is extending its own language capabilities, because currently Facebook’s AIs don’t understand Latvian, and so can’t spot problems as they happen. “We need to move from this being an election issue to this being an all-year-round issue.”
Brazil’s Alessandro Molon makes a brief statement, asking internet and social media companies to work with governments to preserve democracy. He then asks what Facebook is doing to prevent improper manipulation of its algorithms and illegal manipulation of elections.
Allan cites a previous post by Zuckerberg, which says that Facebook is trying to stop its algorithm rewarding sensationalist content; and highlights Facebook’s partnership with third-party fact checkers, which sees content suppressed if it’s marked as false.
Molon asks further how anyone can be sure Facebook has a deeper commitment to democracy than to profit, and again brings up WhatsApp, “which was widely used to spread manipulated content”. The service, he says, banned more than 100,000 accounts in Brazil during the election.
Allan says “we are now building WhatsApp into our thinking around election integrity.” The company acquired WhatsApp four years ago.
“There are some novel challenges to look at,” Allan says, but “we don’t think that sort of manipulative behaviour is in anyone’s interest.”
Argentina’s Leopoldo Moreau puts his question in Spanish (Allan briefly responds fluently, before asking for the translator to continue for the benefit of the committee), asking why Facebook’s Argentinian office didn’t engage with the country’s parliament.
Allan apologises (in English), and says that the company has a large presence in Argentina and that they should be engaging better.
Moreau asks about WhatsApp campaigning: the company, wholly owned by Facebook, allows for encrypted communications that Facebook cannot oversee. Allan says that WhatsApp is “intended as a person to person messaging service; it should not be used for spamming people.”
The Argentinian delegation counters that WhatsApp does have business APIs that allow for bulk mailing. Allan says that “if shadowy companies are promising to circulate information on WhatsApp through lists of numbers, that should stop. We will be offering proper business communication, but that we can oversee.”
“Where we were made aware of it, we did take action. We’re building WhatsApp into those election task forces I mentioned.”
Singapore’s Pritam Singh asks if Facebook would be willing to remove a post that could be skewing an election if it were ordered to by the authorities, and Allan says it would be eager to, because it wants to work with authorities.
Lijnen asks about Facebook’s tracking of users.
“It may be that the EU decides to limit that, but it will have profound implications,” Allan says, on the ability of the publishing industry to advertise.
For the data of non-users, Allan again repeats the two categories of data the company stores: log files of non-users, and contact data uploaded by users. Lijnen says that she thinks that’s not GDPR compliant, but Allan disagrees.
Nele Lijnen from Belgium says that “sending your cat” is a Flemish expression meaning not showing up. Does that make Allan Mark Zuckerberg’s cat?
Lijnen clarifies that Allan is merely sitting next to Zuckerberg’s cat.
Julie Elliott asks how Facebook defines political advertising.
Allan: “This is one of the areas where we would really appreciate a discussion with policymakers. At the moment, in the UK, we say if you’re talking about a party or a candidate, or an issue in front of the legislature.”
Elliott asks how Facebook monitors that. Allan describes the current system, which requires people to register as political advertisers if they’re found running political adverts.
Elliott asks what percentage of Facebook’s budget is being spent on this effort; Allan says it’s a major effort, but that he can’t tell the committee the percentage.
Elliott asks “what other checks and balances” Facebook is applying to the money that is funding the advertising. Facebook gets the money from the person who is paying it, Allan says, but he thinks the best way to explore further up the chain is with regulators like the Electoral Commission.
Singapore’s Edwin Tong asks about Facebook’s policy on hate speech, and quotes from a Mark Zuckerberg statement saying that the company has always taken down such content.
Tong then brings up a post made in Sri Lanka, calling for the murder of Muslims. “It was put up at a time when there were significant tensions between Sri Lankan Muslims… that eventually resulted in a state of emergency.
“In that context, wouldn’t such a post inflame tensions?”
Allan agrees it would.
Tong asks why, then, that post is not down. Allan says it should be, and that there must have been a mistake; Tong quotes from Facebook’s response, which says that no policy has been broken, and Allan repeats that it’s a mistake.
“Would you agree that Facebook cannot be trusted to choose what goes on its platform?” Tong asks. Allan disagrees, and says “the best way to resolve this is a dictionary of hate speech terms in Sinhalese that gets surfaced to a Sinhalese reviewer.”
“We make mistakes; our job is to reduce the number of mistakes. We should be accountable for our mistake to you and your colleagues, to every parliament that’s sat round the table today.”
Sun asks if it’s possible that future elections will be interfered with through methods that will only be discovered after the fact. Allan says it’s possible, because “as long as we have an internet, it’s unreasonable to think that we’ll be able to stop all of this.”
Would more be achieved, Sun asks, if Facebook worked with relevant authorities to take down false content? Allan says he thinks it’s important to work within a judicial process: if someone claims a politician’s statement is false, he says, the best body to check is the judiciary of the country.
Sun asks if he agrees that falsehoods can cause harm to society, and Allan says he does.
Sun Xueling from Singapore asks how Facebook is policing the setting up and shutting down of fake accounts and their networks.
“The shutting down of fake accounts is an ongoing battle that we have,” Allan says. “Most fake accounts are created with commercial intent … but they’re taken down within minutes.
“Then there are people who are careful, create one or two accounts, and act as though they are a normal Facebook user. That was the issue in the US, with the Internet Research Agency.”
Allan says that “low-quality information” has reduced by over 50% on the site, according to a study from a French research institute. But, he says, those people who curate individual fake accounts are the hardest to catch.
Zimmer quotes again from the New York Times story two weeks ago: “Mr Zuckerberg and Ms Sandberg stumbled … and sought to conceal warning signs from public view.”
Allan says he doesn’t think that’s true. “Issues have come up, and been debated fully and thoroughly.”
Zimmer notes that Facebook’s quarterly profit is $13bn. “What do you say to the 400 million constituents we represent that shows you’re taking this seriously? There are other bigger issues involving election campaigns … but you’re still downplaying the role that Facebook has in this situation. That’s a huge player on the global scene, and you still don’t seem to get a grasp on how much influence you have on global election campaigns.”
Allan says: “We now have a world-leading security team, who are finding those people and taking them down. We tell you, and you ask how did they get on the site. There will be problems, but we will catch most of them, and our goal is that the Canadian elections should not be unduly influenced through online activity on our platform.”
Canada’s Bob Zimmer asks whether Allan thinks Canada’s democracy is at risk if the country doesn’t change its laws to deal with ‘surveillance capitalism’.
Allan says there are a number of vectors that are problematic: foreign interference, the ability for others to project their views into the country; but also domestic issues, allowing people inside the country to do dirty tricks campaigns.
After a brief interruption from Ireland’s Eamon Ryan, and a quip about missing his gavel from Zimmer, the Canadian asks about Zuckerberg’s dismissal of the idea that Facebook affected the US election as a “crazy idea”.
Allan concedes it was “not elegantly said”, but says that “in an election campaign there is a huge amount of legitimate activity carried out by all the parties … We did spot this activity that was wrong, shouldn’t have happened, but we think that if you look at what changed the outcome, it’s the main point.”
“They’re both problems, but if you ask me why that statement was made, I’m trying to describe to you the thinking behind it.”
The UK’s Brendan O’Hara reiterates the irritation with Facebook’s decision not to send Zuckerberg, and asks if Allan was sent to answer questions or defend the company.
“Were you sent because you, in the entire Facebook empire, are the best person to answer all these questions, or because you’re best placed to defend the company?”
Allan says he thinks it’s the former, and reminds O’Hara that Mike Schroepfer, the company’s chief technical officer, had previously come and not satisfied the committee. He says he volunteered to speak to the committee: “I said, ‘I believe that I have the knowledge that this group needs.’”
“To be precise, both for the issues that you want to raise as the UK committee, and, I now work on election issues globally… this is the stuff I work on. Our working assumption was that’s what you want to discuss.”
O’Hara complains about how many times Allan has promised to write to the committee with answers afterwards, and asks Allan what light he thinks he’s shone on the issue that has provided greater clarity than Zuckerberg could have.
“I think I’ve given you insights around the way we think about regulation–” he is cut off by Collins, who hands over to the next questioner.