Facebook’s latest privacy debacle stirs up more regulatory interest from lawmakers

Facebook’s late Friday disclosure that a data analytics company with ties to the Trump campaign improperly obtained — and then failed to destroy — the private data of 50 million users is generating more unwanted attention from politicians, some of whom were already beating the drums of regulation in the company’s direction.

On Saturday morning, Facebook dove into the semantics of its disclosure, arguing against the wording of the New York Times story it had attempted to get ahead of, which referred to the incident as a breach. Most of this happened on the Twitter account of Facebook chief security officer Alex Stamos before Stamos took down his tweets and the gist of the conversation made its way into an update to Facebook’s official post.

“People knowingly provided their information, no systems were infiltrated, and no passwords or sensitive pieces of information were stolen or hacked,” the added language argued.

I have deleted my Tweets on Cambridge Analytica, not because they were factually incorrect but because I should have done a better job weighing in.

— Alex Stamos (@alexstamos) March 17, 2018

While the language is up for debate, lawmakers don’t appear to be looking kindly on Facebook’s arguably legitimate effort to sidestep data breach notification laws that, were this a proper hack, could have required the company to disclose that it lost track of the data of 50 million users, only 270,000 of whom consented to sharing their data with the third-party app involved. (In April 2015, Facebook changed its policy, shutting down the API that let third-party Facebook apps collect friends’ data that those friends had never consented to sharing in the first place.)

While most lawmakers and politicians haven’t crafted formal statements yet (expect a landslide of those on Monday), a few are weighing in. Minnesota Senator Amy Klobuchar is calling for Facebook’s chief executive — and not just its counsel — to appear before the Senate Judiciary committee.

Facebook breach: This is a major breach that must be investigated. It’s clear these platforms can’t police themselves. I've called for more transparency & accountability for online political ads. They say “trust us.” Mark Zuckerberg needs to testify before Senate Judiciary.

— Amy Klobuchar (@amyklobuchar) March 17, 2018

Senator Mark Warner, a prominent voice on tech’s role in enabling Russian interference in the 2016 U.S. election, used the incident to call attention to a piece of bipartisan legislation called the Honest Ads Act, designed to “prevent foreign interference in future elections and improve the transparency of online political advertisements.”

“This is more evidence that the online political advertising market is essentially the Wild West,” Warner said in a statement. “Whether it’s allowing Russians to purchase political ads, or extensive micro-targeting based on ill-gotten user data, it’s clear that, left unregulated, this market will continue to be prone to deception and lacking in transparency.”

That call for transparency was echoed Saturday by Massachusetts Attorney General Maura Healey who announced that her office would be launching an investigation into the situation. “Massachusetts residents deserve answers immediately from Facebook and Cambridge Analytica,” Healey tweeted. TechCrunch has reached out to Healey’s office for additional information.

On Cambridge Analytica’s side, it looks possible that the company may have violated Federal Election Commission laws forbidding foreign participation in domestic U.S. elections. The FEC enforces a “broad prohibition on foreign national activity in connection with elections in the United States.”

“Now is a time of reckoning for all tech and internet companies to truly consider their impact on democracies worldwide,” said Nuala O’Connor, President of the Center for Democracy & Technology. “Internet users in the U.S. are left incredibly vulnerable to this sort of abuse because of the lack of comprehensive data protection and privacy laws, which leaves this data unprotected.”

Just what lawmakers intend to do about big tech’s latest privacy debacle will be more clear come Monday, but the chorus calling for regulation is likely to grow louder from here on out.

YouTube is reportedly introducing your kids to conspiracy theories, too

In a recent appearance at the South by Southwest festival, YouTube CEO Susan Wojcicki suggested that YouTube is countering the conspiracy-related videos that have been spreading like wildfire on the platform, including videos telling viewers that high school senior and Parkland, Fla. shooting survivor David Hogg is an actor.

Specifically, Wojcicki outlined YouTube’s plans to add “information cues,” including links to Wikipedia pages that debunk garbage content for viewers if they choose to learn more. (Somewhat strangely, no one at YouTube had told Wikipedia about this plan.)

Either way, the platform is going to have to do much better than that, suggests a new Business Insider report that says YouTube Kids has a huge problem with conspiracy videos, too. To wit, the three-year-old, ostensibly kid-friendly version of YouTube is showing its young viewers videos that preach the nonsensical, including “that the world is flat, that the moon landing was faked, and that the planet is ruled by reptile-human hybrids,” according to BI’s own first-hand findings.

In fact, when BI searched for “UFO” on YouTube Kids, one of the top videos to appear was a nearly five-hour-long lecture by professional conspiracy theorist David Icke, who covers everything in the clip from “reptile human bloodlines” to the Freemasons, whom he credits with building the Statue of Liberty, Las Vegas, Christianity, and Islam, among other things. (The Freemasons also killed President John Kennedy, he tells viewers.)

Business Insider says YouTube removed the videos from YouTube Kids after its editorial team contacted the company. YouTube also issued the following statement: “The YouTube Kids app is home to a wide variety of content that includes enriching and entertaining videos for families. This content is screened using human trained systems. That being said, no system is perfect and sometimes we miss the mark. When we do, we take immediate action to block the videos or, as necessary, channels from appearing in the app. We will continue to work to improve the YouTube Kids app experience.”

It’s further worth noting that parents are empowered with additional controls that allow them to block videos or channels they don’t like, at least in most of the world. (Parents in Europe, the Middle East, and Africa are still waiting on this feature.) They can also turn search on or off, depending on how much access they want to give their kids.

The company also notes that the videos cited by BI had, on average, a little more than 100 total views.

That’s not going to be good enough for many parents, who want to be able to trust YouTube Kids wholeheartedly. Hunter Walk, a venture capitalist who previously led product at YouTube and has a young daughter, may have summed it up best in a tweet that he published earlier this afternoon, writing that “when you create and market an app to kids, the level of care and custodial responsibility you need to take is 100x usual. Clean it up or shut it down pls.”

YouTube has been reluctant to tinker with its recommendation algorithm because its “main objective is to keep you consuming YouTube videos for as long as possible,” Wired noted this past week. (Crazy theories are apparently quite sticky.) Wired also reported that despite a recent uproar about all the conspiracy theory content, YouTube still doesn’t have clear rules around whether these videos violate its community guidelines, which cover bullying, hate speech, graphic violence, and sexually explicit content.

Wojcicki said during her festival appearance that “People can still watch the videos, but then they have access to additional information.”

Hopefully, as it evolves, YouTube will come up with a more sophisticated solution to the spread of misinformation, especially when it comes to its younger viewers. The scale of this particular issue may be comparatively small. But as it is, this editor doesn’t allow her kids to watch YouTube Kids without strict supervision for fear of what they might see. At this point, we’d be surprised if parents at YouTube did otherwise.