
Thought Leaders in Artificial Intelligence: Blackbird.ai CEO Wasim Khaled (Part 3)

Posted on Wednesday, Jan 13th 2021

Sramana Mitra: If you were to recommend how these large social media platforms should be regulated, what would you suggest?

Wasim Khaled: One of the most important things is that you don’t use a sledgehammer on the entire tech industry, especially with smaller up-and-coming companies that would get squashed if you applied the same broad stroke to the entire sector. That would be bad.

Focus on the companies that have the biggest audiences and most users. Everyone including the founders of those companies is now saying, “Hey, there does need to be some form of regulation.”

I won’t say that I have the solution to it because nobody has been able to come up with a set of things that are working. There are a lot of bills in play today that have variations on what a good solution might be. One of the ideas is simply that a right to amplification should not be a given.

What I mean by that is this. You may be able to post something that isn’t harmful or potentially dangerous, but when that content gets amplified, that’s where the problem lies. Not everyone has a right to the same kind of amplification; that is something the company determines.

Unfortunately, the incentives around how these systems were built were around increasing engagement and generating ad revenue. All these companies make all their money from advertising, so their whole model was built around high engagement.

In some ways, the algorithm that does that is dumb. It focuses on one thing. It doesn’t focus on societal harm or democratic process or any of that. It looks at how it can keep you glued to the screen for another minute. 

Sramana Mitra: The more polarizing the discussion topic, the more engaged people are. The algorithms are incentivized to reward polarizing content.

Wasim Khaled: There is an old saying that goes, “A lie gets halfway around the world before the truth has a chance to put its pants on.” That is what works. That is what gets engagement. There are many studies that show that people like to read and share disinformation because it is polarizing, even when they know it’s wrong.

One of the things that we tell people is that if you read something, and it makes you angry or want to smash that share button, then you better pause and try to do some research on what you are looking at. Getting back to the question about regulation, there needs to be a real discussion on how things are amplified algorithmically.

The problem is that the entire platform and the entire system was built around this culture of how we can grow fast and how to get maximum engagement for maximum advertising dollars. In some ways, that culture is incompatible with credibility.

I wrote a piece a while back about attention versus credibility. You have this attention economy that craves engagement to drive revenue, and you have credibility on the other side that essentially becomes an ignored metric in the pursuit of growth. How do you reconcile that? It’s really hard because the incentives are financially driven.

Sramana Mitra: There is another debatable issue. Everything is debatable on these emerging issues, but I personally feel that this algorithmic focus on addictive engagement is itself a problem. Should that be regulated?

Wasim Khaled: I think in the same way that online gambling was looked at, there should be at least some form of research and limitation on being able to mathematically build systems that are designed to addict you. This is especially true when you talk about younger children.

I’m a parent, and I rarely put my child in front of a screen because I know how those things are designed. On the occasions that you do, you see that it works. The whole thing is built to engage all of your faculties and lock you in. It works on the youngest of ages all the way to the oldest.

Sramana Mitra: It’s the most popular babysitting technique around – sticking a child in front of a device. 

Wasim Khaled: It’s ironic because people in Silicon Valley tech industries who work on these products do not give these devices to their children. 

Sramana Mitra: That’s true even with Steve Jobs’ children. 

Wasim Khaled: There are entire teams at these companies that work on understanding the psychology of gambling and addiction in order to increase these products’ stickiness and addictive properties. That’s why we find ourselves in the disinformation space that we are immersed in today.

We have subscribed ourselves to all of these systems. People subscribe to propagandist channels because they put out content that is sticky and engaging. You just have to click on one, and it draws you in and starts upping the extreme nature of the content it serves you. That’s how the algorithms work.

It’s like that first taste that gets you drawn in. We have seen a lot of people radicalized through a YouTube algorithm. I think of the Christchurch shooter as an example of that. They were radicalized by algorithms. There was no one at the top making those decisions, but that is just how it ended up. 

This segment is part 3 in the series : Thought Leaders in Artificial Intelligence: Blackbird.ai CEO Wasim Khaled
