‘Meta has editorial power to use indiscriminately if content threatens their bottom line’, says Center for Countering Digital Hate

Imran Ahmed, chief executive officer (CEO) of the Center for Countering Digital Hate (CCDH), told the Standing Committee on Canadian Heritage (CHPC) yesterday that Meta’s algorithms are designed to feed the spread of disinformation and, in turn, maximize user engagement.

Ahmed appeared by videoconference as a scheduled witness before the CHPC’s hearings into “tech giants’ use of intimidation tactics”.

The passage of Bill C-18, the Online News Act, which led to Meta blocking news links for Canadians, has created a news vacuum that, he said, is being filled with hate speech and disinformation generated by the platform’s harmful algorithms.

“Platforms’ algorithms don’t give you what you want. That’s a myth. They give you what they want you to want,” said Ahmed.

Meta and other platforms, he added, do not want editorial responsibility for their content, both for liability and financial reasons and because content moderation and editorial control require significant resources.

However, by blocking news posts in Canada, Meta, he said, “is proving that they’ve always had an editorial power and will use it indiscriminately if content threatens their all-important bottom line.”

He continued, “These highly personal, highly invasive systems have resulted in the polarization of economic, democratic and social thought and are correlated with a rise in extremism across radical hate group networks.”

Ahmed explained that what we’re seeing on social media is not the “content of a billion people” or the “global discourse”. It is, he said, “discourse controlled by algorithms which are designed for commercial impact.”
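
To make that claim concrete, the sketch below contrasts a chronological feed with an engagement-ranked one. It is a purely hypothetical illustration: the signals, field names, and weights are assumptions made for this example, not a description of Meta’s actual ranking systems.

```python
# Hypothetical illustration only: a chronological feed versus an
# engagement-ranked feed. Signals and weights are assumptions for this
# sketch, not Meta's actual ranking system.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    age_hours: float
    predicted_clicks: float    # model's estimate of click-through
    predicted_comments: float  # model's estimate of comment volume
    predicted_shares: float    # model's estimate of reshares

def chronological_feed(posts: list[Post]) -> list[Post]:
    """What the people you follow posted, newest first."""
    return sorted(posts, key=lambda p: p.age_hours)

def engagement_ranked_feed(posts: list[Post]) -> list[Post]:
    """Whatever a model predicts will keep you scrolling, ranked first."""
    def score(p: Post) -> float:
        # A weighted sum of predicted reactions, decayed by age.
        # Content that provokes comments and reshares -- often the most
        # divisive content -- wins the ranking under weights like these.
        return (1.0 * p.predicted_clicks
                + 2.0 * p.predicted_comments
                + 3.0 * p.predicted_shares) / (1.0 + p.age_hours)
    return sorted(posts, key=score, reverse=True)
```

Ahmed’s argument maps onto the second function: the ordering is chosen to serve the platform’s engagement objective rather than the user’s stated preferences.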

In October, 41 states in the U.S. sued Meta for allegedly harming the health of young users. 

Now, an unredacted court document in that lawsuit, uncovered by The New York Times, has revealed that Meta allegedly and actively “coveted and pursued” users under 13 on Instagram for years, targeting young users, manipulating them into spending unhealthy amounts of time on its apps, promoting body dysmorphia, and exposing them to potentially harmful content.

The bar is low for Meta and other similar platforms, Ahmed lamented, and we have come to expect them to behave in the worst way possible, which is why regulation is valuable.

But tech giants like Meta engage in a wide range of intimidation tactics to counter regulation, said Jason Kint, CEO of Digital Content Next.

He detailed five intimidation tactics that goliaths like Meta use:

  1. Threats to whistleblowers. A whistleblower shared documents with the Wall Street Journal corroborating reports that Meta blocked news in Australia in March 2021, just as vaccines were being rolled out, and allegedly did so deliberately during the most critical week of Parliament’s deliberations. The whistleblower, however, was scared off from testifying and went underground.
  2. Threats to investments. Mark Zuckerberg allegedly threatened to pull back investment in the U.K. after he was asked to testify before its Parliament.
  3. Threats to publishers and newsrooms. In 2018, Facebook allegedly threatened to sue the Guardian a day before the paper reported on the Cambridge Analytica scandal.
  4. Record spending on lobbying, including through proxies. Google and Facebook rank among the top 10 registered lobbyists in both the EU and the U.S.
  5. Threats to consumers. Repeated claims that regulations will stifle innovation and end the free and open internet are made to drive outrage among consumers. Facebook, Kint said, often takes it a step further by suggesting it will have to charge for services, which could kill thousands of small businesses and cost millions of jobs.

Michael Geist, an internet law professor at the University of Ottawa, argued, however, that in Canada there is ample evidence of regulatory capture, with legacy cultural groups dominating meetings with officials and time with the committee, while the voices of Canadian digital creators, including those from BIPOC (Black, Indigenous, and people of color) communities, are dismissed and at times disrespected.

Meta’s decision to block news, he said, was a consequence of regulation, not a tactic to influence it, adding that Meta and Google have been consistent in their messaging from the beginning.

“It’s essential that this committee and the department ensure that it avoids regulatory capture and provides a forum for all voices. Failure to do so makes for bad policy and, I have to say, raises the risk of intimidation in which, inadvertent or not, it may be the government or this committee that can do some of the intimidating.”
