Former Facebook Australia and New Zealand CEO Stephen Scheeler on why the site should be banned for children

A former Facebook executive says social media should be banned for children – because their brains are not developed enough to deal with the “scary” algorithms of the site.

Stephen Scheeler, formerly the head of Facebook in Australia and New Zealand, said in this week’s episode of 30 With Guyon Espiner that he thought anyone under 16 was too young for social media.

“This is mentally challenging. And it’s scary. The human brain has not evolved enough, and is not developed enough even at that age, to deal with some of the challenges that the algorithms present,” said Scheeler.

“If you are a mother or a father, you will know the challenges of social media with your children. And leaving it to parents alone to figure out, I just don’t think that’s fair.”

Australia recently moved to ban children under 16 from using social media after its parliament approved the strictest laws of their kind in the world. The ban has yet to come into force, and how it will work is not yet clear.

‘Facebook failed’

Scheeler’s role at Facebook was championing the platform in the Australasian market. He joined in the early 2010s, when the platform was still relatively new. Since then, the company has been accused of polarizing modern society with its algorithms, amid several other scandals.

“It didn’t work out as well as we planned,” said Scheeler.

“I feel like I should have known more. Could I have done something? I was just a cog in the machine, but I can admit now that I didn’t realise, I didn’t know, and I should have had more awareness of the bad things.”

But Scheeler said no one was focused on Facebook’s downsides.

“I almost cringe now at how I was talking about how social media would change the world for the better. We just covered up anything that was potentially negative.”

Some of the most striking aspects of Facebook’s record on user wellbeing relate to adolescents – particularly the revelations that the company downplayed internal research showing that Instagram was toxic for teenagers. Scheeler said the reasons why are obvious.

“We are naive to think that Facebook, or any company, will work against its own interest,” he said. On that point, Scheeler added: “Facebook failed. I don’t think the moral line was drawn firmly enough within the company, and that was because the profit motive … overrode everything.”

An internal memo from 2017 revealed that Facebook had actively offered advertisers the ability to target adolescent users showing signs of low self-esteem with beauty products.

Scheeler, who resigned from his role that same year, called it “unacceptable”. But again, he said, it comes down to profit.

“Social media companies have no interest in you spending less time on their platform. They are interested in you spending more time, whatever your age.”

Ex-CEO of Facebook Australia and New Zealand Stephen Scheeler sits down with Guyon Espiner for an interview as part of ‘30 with Guyon Espiner’.
Photo: RNZ / Cole Eastham-Farrell

‘I don’t think Zuckerberg has the moral fiber to do and say the right things’

Scheeler gave a frank assessment of Facebook CEO Mark Zuckerberg and of the company’s controversial decision to roll back content moderation in light of Donald Trump’s return to the US presidency.

“My observation of Mark is that he is not a bad actor, he is not a petty person,” he said. “But I don’t think he has the moral fiber … to do and say the right things. And I think doing and saying the right thing at this moment was not to cave in to Trump.”

“With the kind of power and influence that Mark has comes great responsibility. And at this moment I feel that he has failed the test.”

‘There is a battle for your attention happening’

Scheeler said AI systems now dominate life online – systems he helped bring into the world.

“There is a battle for your attention,” he said. “Facebook is one of those fighting it, and I think one of the problems we have at the moment is that we have AI … governing your attention in a way that you don’t really understand or control.”

This kind of design, he said, “is very addictive in its nature … The question is: are you in control of your attention?”

The effects have the potential to be catastrophic – such as the United Nations’ finding that Facebook played a “determining role” in violence against Rohingya Muslims in Myanmar.

“I couldn’t agree more,” he said. “We can’t even agree on the facts … And sometimes those facts can be used in different ways … to inflame people, to make them think certain things.”

Alongside his view that personalised, AI-driven content curation can be a direct cause of real-world harm, Scheeler readily admitted there are no real practical ways to escape the algorithm’s influence over what you read, watch and hear.

“All of these systems can be tricked or influenced in different ways. But I think it’s very difficult for individuals, to be honest with you, to figure that out and do it sustainably. If you are using the internet for what you really want to use it for, living your life, doing all these things … it just won’t be sustainable.”

‘Part of my penance’

Scheeler’s new venture, Omniscience, is his attempt at redemption – a company using artificial intelligence to decode the brain and revolutionize the treatment of mental illness.

“As part of my penance for what I built at Facebook, what I want to do with my life is build AI for human good,” he said.

Omniscience is already on the market. “We already have regulatory approvals in the US and Australia … We focus on using AI to decode your brain. And things like mental illness are … essentially just flaws in your circuitry. We build tools that can find those flaws.”

But even here, he admitted, the same forces that shaped social media loom large. The risks of AI are huge – and little understood.

Asked about the growing concern among experts that AI could eventually lead to human extinction – the so-called “p-doom” [probability of doom] measure – Scheeler did not rule it out, but remains cautiously optimistic.

“You can take two of the godfathers [of AI]. One will say this is existentially threatening, and the other will say there is nothing to worry about.”

But he sees a moral imperative in trying to do better this time. “We have an ethical mission for our company,” he said. “We have external ethics advisers … We are building for the medical system, which is already covered by ethics.”
