Fake news and misinformation are playing havoc with modern society. This conversation delves into the depths of the issue.
Sramana Mitra: Let’s have you introduce yourself as well as Blackbird.ai.
Wasim Khaled: I’m the CEO and Founder of Blackbird.ai. We help brands and governments automate the rapid detection of harmful disinformation campaigns that cause financial and reputational harm. We got our start in national security and are now expanding rapidly to enterprise markets.
Sramana Mitra: This conversation could not have been more timely.
Wasim Khaled: Yes, it’s been an interesting week for us for sure. In some ways, we have been building up to this week since we spun up in 2017.
Sramana Mitra: Let’s start the conversation there. We are in election week in America. The election has not been called yet at the time of recording this conversation and yet there has been plenty of disinformation.
Trump went and declared that he has won the election. Tell us about the kind of disinformation prevention your company has been involved in this week.
Wasim Khaled: Our primary focus is more on the enterprise side of disinformation where we work with government agencies. We don’t work on domestic political disinformation.
There are quite a few companies addressing that by doing a fact-checker approach. We stay away from that at present for the most part. In terms of the disinformation that has been flowing, the patterns that we see are universal in the disinformation space.
You will see a story that is designed to be polarizing pop up in the media. Typically, there will be high-influence accounts, whether they are known people or just accounts with millions of followers. They will promote that story and try to get it seen by as many people as possible.
A big success for whoever is operating such a campaign is if someone picks it up and spreads it further. The whole goal is to get seen and heard as much as possible and to cut through the clutter with your narrative. This is a bit higher level than what you are asking.
With all the information today, it’s less about something being true or false and more about who is going to win this particular story or narrative. They then use techniques to get that seen and believed as much as possible. We see a lot of that happening today. We have been seeing it grow in size and impact over the last four years.
Sramana Mitra: We are starting this conversation at the 30,000-foot level. We will get to your company and what you are doing for your specific clients, but since we are already on the broader topic, I would like to continue that conversation a little bit.
I will ask you your perspective as an industry observer or expert and not necessarily what your company does. One of the things that happened this week in the context of the US election is that Trump has been trying to portray that this election has been stolen from him.
There is a large Facebook group that propagated that narrative. The group was taken down by Facebook. Could you talk about that a little bit? What happens? How did this group become so big? What is your observation about the stance that Facebook is taking on this and other misinformation stuff?
Wasim Khaled: I believe that you are talking about the group that was promoting “boots on the ground” or violence. I’ll start with the removal itself. Facebook’s got its own policy. Primarily, it has always taken a stance that if bodily harm or violence is being encouraged by any people within a group, it will be taken down. They are following that policy as they normally would.
One thing that was interesting about this particular group is that it used to be a group that spread COVID disinformation earlier in the year. It’s something that we covered in our disinformation reports as the “reopen” narrative, which was being pushed quite a bit. It’s a narrative about reopening regardless of scientific or medical information.
You saw a lot of protests at capitol buildings all over the country. The Facebook group used to be a big proponent of that and then switched over to a more militant viewpoint. For that reason, Facebook took it down. I can’t comment on Facebook’s policy, but if there are people pushing violence in these groups, then it’s not a bad idea to do something about it.
Sramana Mitra: I hear two things in what you said. One is violence. It seems like Facebook has an explicit policy to take down groups that instigate violence.
Wasim Khaled: Correct.
This segment is part 1 in the series : Thought Leaders in Artificial Intelligence: Blackbird.ai CEO Wasim Khaled