
The Dejargonizer
A Deep-Tech Deep-Dive "HARDTalk for Startups" show. From Amir Mizroch, former EMEA Tech Editor at The Wall Street Journal
The Dejargonizer
An Honest Conversation About Misinformation, Disinformation, and Narrative War
Why does the World Economic Forum think misinformation is scarier than nukes and climate change? Why are we more polarized than ever before? How do foreign powers mess with our minds? On this episode we are joined by Daniel Ravner, co-founder and CEO of Brinker, a misinformation threat management platform. Daniel pulls back the curtain on the wild world of digital influence operations, from Iranian sleeper accounts lurking in your WhatsApp groups to AI-powered narrative warfare. This episode reveals how modern propaganda spreads and seeps into polarized societies. A mind-bending journey under the hood of the information war you didn't even know you were fighting.
Listen
Apple Podcasts, Spotify, or anywhere you get podcasts.
Connect
LinkedIn
Twitter
Newsletter
Email: amir@orangegrovecomms.com
Hello, and welcome to the Dejargonizer. The World Economic Forum recently concluded that disinformation and misinformation are the most dangerous challenge facing humanity today. That's more than nuclear war, climate change, or a pandemic. That's because, as my guest on today's episode says, if people can't even agree on some basic facts, they won't be able to fix anything else. That kind of makes sense if you think about it. It seems that anything people have opposing views on, whether it's politics, gender, genocide, you name it, there is a persistent effort to manufacture distrust, discord, conspiracy, and polarization through lies and half-truths. So what is disinformation? How is it different from misinformation? Who is behind its spread, and what are they trying to achieve? Our guide down this narrative rabbit hole is Daniel Ravner. He's the co-founder and CEO of Brinker, an Israeli misinformation defense platform that works with governments and corporations on so-called influence campaigns launched against them. Brinker just exposed a sophisticated Iranian influence operation which exploited Israeli societal divisions to recruit spies and polarize Israeli discourse. We'll get into what a misinformation defense platform is with Daniel.
Amir:B2B services... new standards for quality... technological leadership... it had a very leveraged effect... it was an ownership stake in an underlying... how big can you make... be considered a disrupting marketplace... I'm gonna tell you what it really means... system... Hi, I'm Amir Mizroch, a communications advisor and former tech editor at The Wall Street Journal. The tech industry is in an age of messaging instead of story, and it's getting worse. What is a data oracle? I speak to tech founders and put them through The Dejargonizer, a zero-jargon-zone podcast. Every time you say "security posture," a butterfly dies painfully. It's difficult, but simple. Okay.
Amir Mizroch:Daniel, thanks for joining me.
Daniel Ravner:Thank you for having me.
Amir Mizroch:The Dejargonizer really is about, you know, unpacking for a broader audience things that are deeply technical yet matter to our day-to-day lives. Quite often we'll hear about stuff like misinformation, disinformation, or defense tech or AI or computer vision or whatever it is, and we won't really understand the technical depths. Maybe we don't need to, but I think that also at times forms a barrier to our understanding of the effects and implications and impact that these technologies have on our lives. There's a barrier that I'd like to help people through, kind of a bridge to walk them over. You and I have been speaking about how we would frame an episode about disinformation, the narrative wars, and I was struggling to find a place to start. You know, we could start with the usual, tell me about yourself, tell me about your company, but I think we'll get into that. And I'm wondering if we should start by talking about where we are as countrymen and Israelis, inside a country that is in conflict with Iran. We know they're sending over ballistic missiles and we can knock them down, but they're also playing with our minds in ways that we don't always see. And that's something I haven't stopped to think about, or really worried about, so I thought maybe we'd start there. Take us through that, and then we'll get into more about you, your company, what you guys are doing, what you see and where this is all going. Does that make sense?
Daniel Ravner:Completely. You know what's the difference between misinformation and disinformation? It's an important distinction. Disinformation is when I lie on purpose: I know I'm lying, I have ill intent, and I'm doing it anyway. Misinformation is when I share something not knowing it's not true. And misinformation is where most of the damage is done. So say that you are an Iranian intelligence officer, right? Missiles are one thing, but you want to sow discord. You want to crumble the country from within. The first thing you do is try to understand the culture that you're trying to impact at a very deep level. So there is intelligence gathering involved, because misinformation is not fake news. Fake news is not that much of an issue. Misinformation or disinformation is, because it's usually 80 percent true. You have to start with something that people care about, that people have a heated debate about, and then you need to figure out your angle into it. So you're an Iranian looking into Israel, and you're trying to figure out first: what are the debates that are divisive within Israeli society? That could be the occupation, that could be Bibi or not Bibi, which in Israel is a huge one. It could be religious versus secular. It could be the economic situation. Et cetera, et cetera. Then you're trying to figure out, okay, so what are the places where those discussions happen? Who are the main influencers? So you would go to forums of right-wing and left-wing activists, and you would want to create users that look like them. Because one of the things you want to do is to create users.
Amir Mizroch:A user meaning people on Facebook, on LinkedIn, on WhatsApp,
Daniel Ravner:LinkedIn, all
Amir Mizroch:even on WhatsApp, even like in WhatsApp groups.
Daniel Ravner:WhatsApp is huge. Telegram is huge, of course. On WhatsApp, a user is just a phone number; it's not your friend. You're probably a member of a group that has 100 members. You have no idea who they are. So who's to say that one of them is not an Iranian operative?
Amir Mizroch:Wow.
Daniel Ravner:So take October 7th. You see the amount of misinformation that came out of October 7th. A lot of those users did not start on October 7th. They had been on the web for months, if not years, not doing much, just gaining credibility. Because if you look at a Facebook user, or a LinkedIn user, or a Twitter user, and you see, all right, they've been there for the past three years, and every now and then they shared a post about a recipe or anything of that sort, okay, you don't suspect them to be a foreign influence operative. But I would imagine that every country in the world has an army of users that are just gaining traction, doing minimal activity, waiting for activation day. And those users will not carry the most extreme messaging.
Amir Mizroch:Daniel, do you think it's an army unit that's involved? Like, let's just say that the Iranian army has a division for this kind of digital influence warfare, the Israeli army, the American. Is it armies, or is it more private contractors?
Daniel Ravner:It's armies, definitely.
Amir Mizroch:actual people in uniforms?
Daniel Ravner:That's the misinformation front, the influence front. That's a legitimate front in any war, and has been since, since the time of, what's his name? Genghis Khan.
Amir Mizroch:Genghis Khan was doing misinformation operations.
Daniel Ravner:No, he did influence operations.
Amir Mizroch:Influence operations. How did he do that?
Daniel Ravner:So he had people running around telling stories about how he would maim people, burn people alive, create a huge amount of damage, so that when he and his army came to a new area, people would surrender. So it's been there since the beginning of time. It's just gotten that much worse with social media. Misinformation or influence campaigns up until 20 years ago might have been throwing leaflets from a helicopter. So the Iranian operative wants to sow discord. And remember, misinformation is storytelling designed to sway public opinion. At the basic end of it, it's a story, and it's a story that's meant to incite and to engage, not to inform.
Amir Mizroch:It's rhetoric, right? I mean, that's kind of rhetoric.
Daniel Ravner:Yeah, but rhetoric is the tool. At the heart of it, it's a psychological tool. If I'm telling you somebody's stealing your money, you wouldn't think about it logically. The first thing you'll do is get upset. You're going to get hot. You want to do something about it. So this is where we're aiming. And social media is perfect for that, because social media favors the inciting over the rational every time; the algorithms are designed to do that.
Amir Mizroch:We'll get into that. We started speaking about the Iranians and how they're trying to sow discord between the right and the left, between the religious and the secular. Let's get back to that.
Daniel Ravner:Okay. So the first thing they do is map out all of the divisive discussions we already have, like the judicial system, et cetera. Then they would find out where the discussion is happening, and then they would choose what they want to focus on. I can tell you, and this is not a secret, you can search for it online, that for the Iranians, the whole secular-versus-religious narrative and discussion and debate is a huge deal.
Amir Mizroch:Tens of thousands of ultra-Orthodox boys and girls, 17, 18 years old, get a pass on the draft, right? They get an exemption, whereas everyone else who's not ultra-Orthodox still has to go serve.
Daniel Ravner:Yeah. And obviously from the religious standpoint, they're saying that what they are doing is what really protects Israel. This has been a heated debate since the beginning of Israel, and it has become increasingly debated since we've been at war.
Amir Mizroch:Right, and this is a debate within Israel anyway. We don't need the Iranians to make it worse for us; it's already pretty bad, right? What impact can the Iranians have on this debate?
Daniel Ravner:You can always make it worse. If you're inciting people to violence, you can incite people to hurt other people. They have been doing that for years now. There was a video of leftist activists burning a picture of Rabbi Druckmann,
Amir Mizroch:Who was that?
Daniel Ravner:A rabbi who is important to religious people.
Amir Mizroch:Okay.
Daniel Ravner:But it was a complete deepfake.
Amir Mizroch:Hold on. Why would the leftists be burning his picture specifically?
Daniel Ravner:It doesn't matter; the logic is not important. But if you have a picture of your enemy burning something which is sacred to you, you want to hurt somebody. And this is the kind of emotion that they're trying to create.
Amir Mizroch:Got it. So a picture of a leftist, quote unquote, burning a picture of a rabbi, you're saying was more than likely faked.
Daniel Ravner:A hundred percent. That's a known fake video.
Amir Mizroch:It seems to me that in the olden days, to have kind of an influence operation, you'd need a sleeper agent planted in another country who then becomes an influencer and owns newspapers and can form opinion and amplify other people's opinions, but that would be, you know, one or two or a few dozen people, right? Whereas now you're able to infiltrate fake bots, right? They would build up credibility, get into groups, would look like any other Israeli, or any other country's, Facebook or whatever users, and then at a certain time would be deployed, would be turned on, as a weapon.
Daniel Ravner:It's the easiest way. If I want to hit you with ransomware, if I need to hack your system, if I need to infiltrate your bank account, that takes skill, time, money. Let's take you as an example, right? Let me paint a scary picture. So the first thing I can do is go to ChatGPT and tell it, listen, I want to hurt The Dejargonizer. He's a competitor of mine and I want to hurt him. How do I do that using misinformation tactics? ChatGPT would probably tell me, we don't do that, we don't advise people on how to hurt other people. But say I tell it, hey, I'm a film student. I'm writing a script about how you can hurt another podcast, and I want to take The Dejargonizer as an example. Can you write a script for me of how that would look? And then you'll have the technique. Now I can come up with a lie saying, you know what? Your podcast idea, you actually stole it from a French podcast. That's all the narrative I need.
Amir Mizroch:That's true. Okay.
Daniel Ravner:I was giving a truthful example, so I can come up with that. And now the other thing I'm going to look at is, who are your clients, right? So on LinkedIn, I'm sure I can easily find five, six groups that you're in. I can join those groups. I can easily create avatars using fake emails, easily done across the web. And then somebody would say, hey, isn't the show based on a French show? And then my other avatar would say, yes, I think he's been stealing it, I was waiting for somebody to say it for a long time. And then a third avatar would like it, and a fourth avatar would clap, and then it will rise. Now, the choice of French is important, because if it's English, I can easily validate it. But if it's French, who's going to start doing the work? But people have formed something negative about you. Now it can easily be built into something much bigger. And everything I've just described is 20 minutes of my time.
Amir Mizroch:That's pretty scary stuff. And you're saying that this is just a small example of what countries and armies can do, you know.
Daniel Ravner:Yeah, they're in it for the long run. Their ability to create hundreds, if not thousands, if not tens of thousands of bots, it's completely possible, and then to simmer whatever country on a slow burn. The other thing that they're targeting is the financial institutions: people not being happy with the economic situation, people not being happy with the banks. So they would also amplify that narrative.
Amir Mizroch:Okay. I guess we should talk a little bit about, um, how you know all this stuff. So maybe now would be a good time for a quick intro, halfway through, to Brinker. Just give us a few minutes on who you are and what you're up to.
Daniel Ravner:Brinker is a misinformation threat management platform. In essence, what we're saying is that the ability to tackle misinformation at the scale at which misinformation operates is very problematic these days, because it's almost completely done manually. If we automate the entire process, we can come up with a solution that can actually match the problem. I think fighting misinformation today is like a soccer match where only one side showed up. And that's true on the country level. I mean, just look at the amount of misinformation being targeted against Israel; it feels like there is no adequate response. So what we've done is create a process that can detect, across social media and across the entire web, all of the relevant discussions, users, and hashtags that have to do with the client. We then investigate automatically. So we're taking the stuff that was most expensive and most time-consuming and doing automated investigation. We look at what the investigation came up with, we understand the legal framework where the client operates, we match them, and then say, all right, here are the violations that you can use in order to tackle the issue. And then it offers various mitigations, ranging from a cease-and-desist letter, through a request for a takedown, through counter-narratives if you also want to impact the discussion. Now, misinformation, that's a narrative war. So the ability to understand narrative is at the heart of how we approach the issue.
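For readers who want a concrete picture of the flow Daniel describes here (detect relevant discussions, investigate automatically, match findings against a platform's or jurisdiction's rules, then propose mitigations), below is a minimal Python sketch. It is an illustration only: the class names, the toy policy table, and the mitigation list are assumptions made for this example, not Brinker's actual data model or code.

```python
# Hypothetical sketch of a detect -> investigate -> mitigate flow.
# None of these names come from Brinker; they are illustrative stand-ins.
from dataclasses import dataclass, field

@dataclass
class Finding:
    post_url: str
    narrative: str          # short summary of the claim being pushed
    evidence: list[str]     # signals gathered by the automated investigation
    violations: list[str] = field(default_factory=list)
    mitigations: list[str] = field(default_factory=list)

# Toy "policy library": which platform rules a given signal can be matched against.
POLICY_RULES = {
    "coordinated_inauthentic_behavior": "Platform ToS 3.2 - fake or coordinated accounts",
    "impersonation": "Platform ToS 4.1 - impersonation",
    "incitement": "Platform ToS 5.3 - incitement to violence",
}

def investigate(finding: Finding) -> Finding:
    """Match investigation signals to policy clauses and suggest mitigations."""
    for signal in finding.evidence:
        if signal in POLICY_RULES:
            finding.violations.append(POLICY_RULES[signal])
    if finding.violations:
        finding.mitigations = [
            "takedown request citing the violations",
            "cease-and-desist letter (if the actor is identifiable)",
            "counter-narrative post in the same discussion",
        ]
    else:
        finding.mitigations = ["monitor only"]
    return finding

if __name__ == "__main__":
    f = Finding(
        post_url="https://example.com/post/123",
        narrative="Podcast X stole its format from a French show",
        evidence=["coordinated_inauthentic_behavior", "impersonation"],
    )
    print(investigate(f).mitigations)
```

The design point the sketch tries to capture is the one Daniel emphasizes: the expensive human step is not collecting posts, it is turning a pile of signals into a concrete, policy-grounded action.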
Amir Mizroch:There's a lot to unpack there. So we've had misinformation and disinformation on digital networks and social media for 15, 20 years now, right? And then some time in 2022, ChatGPT came on the scene. I guess your company and your work and your technology, is it largely based on AI? Is it automated?
Daniel Ravner:When people talk about, for example, antisemitism, which is an area in which we are very active, they would say, listen, there are 16 million Jews and 1.3 or 1.4 or 1.5 billion Muslims. Since all of social media is a popularity contest, we cannot win this war. I don't agree with this claim, but let's just take it, because it has some logic to it. We use AI to make an asymmetric situation symmetric, because with AI, I can find a nuance of content. I can find the 20 percent manipulation of actual knowledge or truth or information. I can find the nuance and I can do it at scale. That's what the world of AI allows me, and it carries other advantages that were not possible in the world of metadata.
Amir Mizroch:All right, we've got to dejargonize a lot of that. We're basically having a conversation about misinformation and disinformation. Take me into how your company, Brinker, works, if you can, with an example. I know you also have customers in corporate and in government, but let's focus specifically on, say, the misinformation war against Israel. Take us into how you're using AI, or how you're seeing AI being used in that, with whatever examples you can.
Daniel Ravner:Because we automate the investigation, we turn something which is manual into an automated investigation.
Amir Mizroch:And so when you gather information, how are you taking that from, say, Facebook groups?
Daniel Ravner:We look into social media, we look into the web; there is a lot of misinformation in the news as well.
Amir Mizroch:In the news. And then you're able to take this data, process it and analyze it, and look for things that signpost that there might be something behind this.
Daniel Ravner:More than 50 percent of the people in our company are R&D people, and most of the time they work on the processing of data. This is where we develop the deepest technologies that we have. Data became a commodity, and the world doesn't need another data company. Let's say that you're looking for misinformation about Gaza and Israel. Just imagine the amount of data that comes in. And then what? The ability to take all of that info and come back to you and say, listen, out of the huge multitude of stuff that people are saying about Israel and Gaza, here are the ones you can act upon, here are the ones you should worry about, here are the ones you should monitor. The processing of the data is where we have our secret sauce. Without going too much into depth, I would say it involves LLMs and the processes that we have developed. So those would be the filters. Now, you wanted an example. Let's say I'm looking at stuff that comes up about Israel and about Gaza. I would recognize a trend, right? I would recognize a trend, and the system would be able to tell me that there seems to be a lot of, uh, discussion about Israel attacking Al-Shifa Hospital without mentioning that it was used to harbor terrorists, right? So the system would be able to flush that out for me. So that is happening using LLMs, the ability to understand not only the content but the context, right? It's the context that makes the difference.
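As an illustration of the kind of context-aware check Daniel is describing, here is a short sketch of how a "narrative plus missing context" classifier might be prompted. The prompt wording, the JSON labels, and the pluggable call_llm hook are all assumptions made for this example; Brinker has not disclosed its actual prompts or models.

```python
# A minimal sketch of an LLM check that flags posts pushing a known narrative
# while omitting key context. Purely illustrative; not Brinker's system.
import json
from typing import Callable

PROMPT_TEMPLATE = """You are analysing social-media posts for narrative framing.
Known narrative: {narrative}
Context that is frequently omitted: {omitted_context}

Post: {post}

Answer in JSON with keys:
  "pushes_narrative": true or false,
  "omits_context": true or false,
  "one_line_rationale": string
"""

def classify_post(post: str, narrative: str, omitted_context: str,
                  call_llm: Callable[[str], str]) -> dict:
    """Build the prompt, call whichever LLM backend is plugged in, parse the JSON reply."""
    prompt = PROMPT_TEMPLATE.format(
        narrative=narrative, omitted_context=omitted_context, post=post
    )
    return json.loads(call_llm(prompt))

# Usage might look like (call_llm is whatever model call you wire in):
#   result = classify_post(post_text,
#                          narrative="Hospital X was attacked without cause",
#                          omitted_context="the site was reportedly used for military purposes",
#                          call_llm=my_model_call)
#   if result["pushes_narrative"] and result["omits_context"]:
#       ...flag the post for an analyst...
```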
Amir Mizroch:If we look at this persistent theme, the Gaza genocide, there are definitely two camps here. The so-called genocide: there's one side that is convinced it's a genocide, and there's another side that's convinced there's no genocide, it's war. Each one has its data, its truth, and its facts. It does seem to me, though, that on the Gaza genocide side of things there are, I guess, a lot more posts. It seems to really have caught on in all parts of the world, and it's based on certain types of NGO reports or on-the-ground reports that can be, let's say, discredited. But what can you say about how that misinformation campaign is being spread, what it looks like from under the hood?
Daniel Ravner:So first of all, let's talk about the campaigns. A few interesting facts. A lot of the users that have been used to spew a lot of the hate, the anti-Israeli hate, were registered and had some activity long before October 7th, right? They were waiting for D-Day. In the meantime, they just created the semblance of looking like an actual user. So that's one. Now, we talked about how, if somebody wants to enact a misinformation campaign, they have to look around the world to see what they can tap into. They don't have to build a new opinion. There is a lot of remorse in the West about the colonial past. The British have remorse about the colonial past, the Americans have remorse about the colonial past. Framing the whole Israeli discussion as a colonial equation, that's tapping into something that already exists. And this is why they chose those specific narratives to amplify during the war. The other thing, again, has to do with the nature of the war. There was an unbelievably horrific 7th of October, and on October 8th we started fighting back. Social media is a machine that needs to be fed new images constantly, to always stay recent. So our most meaningful moral claim, saying, see what happened, we have hostages there, is countered by an ongoing parade of horrible images from Gaza, which we're not seeing, by the way, which we're not seeing. If you look at the feed of a student in New York and the feed of a student in Israel, they inhabit completely different worlds. So even our ability to understand their position in order to tackle it, not to justify it, but in order to say, I need to understand my problem if I want to tackle it, we don't have it, because they're watching a different Israeli-Gazan war. So there are some elements that go into the creation of that. And the important thing is that they tap into identity issues of that young Western generation, they fill that with the Israeli-Gazan war, and they allow that to become part of the identity. And that's smart, that's on their end. On our end, the problem is that there is just so much going on. There is so much data. There is so much misinformation that you can't tackle everything. You have to choose. You have to figure out who your super-spreaders are out of the huge amount of data that comes in each day.
Amir Mizroch:Do you see yourself as a kind of defense tech company? Is it almost like a defense contractor?
Daniel Ravner:The short answer is yes. I see myself as influence defense, more specifically. That would be my category.
Amir Mizroch:Influence defense.
Daniel Ravner:Disinformation is considered to be the biggest problem in the world, right? Just start with the pain. There isn't any bigger pain, because it's impossible to solve anything else when everything's polarized, when truth has no real value, when everything is narrative.
Amir Mizroch:Bigger than climate change, bigger than North Korean nukes or a Chinese invasion of Taiwan. The World Economic Forum actually said that the world's greatest danger right now is misinformation and disinformation, because it makes solving everything else impossible.
Daniel Ravner:And making everything else much worse very possible. So it's the biggest problem in the world, not just because you can't solve anything else, but because everything becomes politicized and polarized much faster, and people are not willing to listen to one another. So you can't solve everything, but you can definitely make everything worse. You can make the demands of extremists on each side much stronger. You had a majority that was moderate for the most part, and then you had some extreme groups, but now the extremes are gaining much more substance; many more people belong to something that in past years you would consider extreme,
Amir Mizroch:Yeah.
Daniel Ravner:Because people keep on meeting themselves online. And if you keep on meeting yourself, that's a process that makes you more extreme, a process that makes you much less tolerant of other views, because you're saying, all right, everybody I'm talking to is saying that this is bad, and if you're saying otherwise, then you're an idiot. And that's discourse today.
Amir Mizroch:A lot of people call me an idiot, especially friends who, let's say, are in different content filter bubbles than I am. They'll take a piece of content from somewhere, send it to me privately on WhatsApp, and say, look. And then I'll say, yes, but it's clearly one-sided. But for them, it's everything.
Daniel Ravner:Take what I think is the biggest debate in Israel, right? For Bibi, against Bibi. Wherever you stand on that debate, that's not an opinion, that's your identity. If you're against Bibi and I'm telling you, listen, he's amazing, look at all of the great stuff he was able to do, et cetera, I'm not challenging your opinion. We're not having a discussion about some sort of, uh, interpretation between us. I'm challenging you. If you're a pro-Bibi person and I'm telling you Bibi is horrible, I'm offending you, and your ability to listen to whatever else I have to say is nonexistent.
Amir Mizroch:Wow. I've never actually thought of it like that. I know everyone talks about identity politics, and everyone's entitled to their own opinion but not their own facts. But what you're saying is that in the world we live in today, where there might be a lot of misinformation and disinformation that we can't tell apart from more factual stuff, or that we're just not that aware of, it's not your opinion that I'm disagreeing with; you feel it's actually your identity.
Daniel Ravner:I think the problem with the times we're living in right now is that facts became that much less important and narratives became that much more important. Narratives are everything currently. And if you're on one side of the fence, then it doesn't matter, because facts don't matter. I can tell you, listen, we are causing climate change, it's going to be horrible, and the Earth's temperature is rising every year. If you're on the other side of that debate, you've heard what I had to say a million times by now, so you're completely immune to it. Also, to my ears, science is the strongest argument, but not to yours. And we know for a fact that, you know, between liberal minds and conservative minds, although we all ask the same moral questions, we care about them in different ways. People on the conservative side, on the right, would care more about, uh, loyalty to the group and its values. People on the liberal slash leftist side would care more about whether something is fair or not. So the best example there would be somebody burning a flag. I'm generalizing like hell, but I think the point is valid. Somebody on the left side would say, I don't like watching somebody burning a flag, I think he's an idiot, but you know, it's cloth. Somebody on the right side might look at the same thing and be incensed, saying that people died for this flag. How dare you, you should be punished.
Amir Mizroch:If this is the state of the world right now, and you're in this influence defense vendor category, a misinformation threat management platform, what difference can companies like Brinker make?
Daniel Ravner:We are currently in the midst of an unbelievably asymmetric war. To put it in simple terms, the bad people are winning in a very big way, because there isn't a real objection to what they're doing. It's no contest.
Amir Mizroch:It's not a
Daniel Ravner:We're losing big time. What we're trying to do, through the power of technology, is create a response that is equal, if not greater, to the size of the problem. Where technology makes a difference is the ability to act in real time, because the way we treat misinformation right now is heavily manual. By the time you've come up with any kind of mitigation or remediation to the problem or to the actor, the damage has been done. We want to turn that whole action into something that happens in real time: from a suspicious post, through to understanding what's suspicious about it. Is it foreign? Is it bots? Is it real people? Uh, is it network activity? To get you all of those answers and then get to the next point of maybe creating a cease-and-desist letter if it's a bully, or maybe asking to remove the network. So today we're talking about something that takes days in times of crisis, weeks in regular times, and it takes thousands, if not tens of thousands, of dollars. Today I can take you from that post to that action in minutes.
Amir Mizroch:Yeah, but I guess the question over here is, once you've detected it and you've presented the possible mitigations, what generally happens on the other side? I mean, I'm assuming that, you know, the platforms will either do their best to take it down or they won't. It depends.
Daniel Ravner:For the absolute bulk of cases, if you come up with good enough evidence, they will take down the user or the content fairly fast.
Amir Mizroch:Okay, that's good to know.
Daniel Ravner:Because we're not telling them, listen, they're problematic. We're saying, according to your terms and conditions, subheading 3.2, here is evidence that that thing happened. When we come with that approach, for the most part, from our experience, the networks are fairly fast, if you come up with a good violation and evidence. But I think the networks are not doing enough to stop it in the first place; the ability to have an army of bots on social media is so ridiculously easy. As long as that problem is not solved, the ability to act in real time before damage is done, that's the differentiation of Brinker.
Amir Mizroch:Talk to me a little about this kind of automated misinformation threat management platform. Just to position Brinker, would you say you're more on the new defense tech, AI, big data software side, or more on the, let's say, AI business intelligence monitoring side?
Daniel Ravner:I would say AI in defense, narrative risk, narrative defense. That would be it.
Amir Mizroch:Narrative risk, narrative defense,
Daniel Ravner:intelligence.
Amir Mizroch:As narrative intelligence?
Daniel Ravner:That would be the bucket. I think that suits us the most.
Amir Mizroch:I love this: narrative intelligence. Vector or sector or industry, narrative intelligence is something that really kind of folds into the whole national defense thinking.
Daniel Ravner:With misinformation, regardless of whether you're working with a bank, a major multinational, or a Department of Defense anywhere, the soldiers fighting the war on misinformation are always analysts. They would be threat intelligence analysts, open-source intelligence analysts. They are the ones trying to find the needle in the haystack, making sense of the huge amount of data that they're seeing. So if you go to a very practical level, we want to make them super-analysts.
Amir Mizroch:I want to try and take the listener through a case study. You have a client, you're monitoring their narrative reputation, their narrative defense, and then you start to get a couple of signals, and you start to see that certain patterns are emerging, certain themes or, I guess, methods are being used. Take us into how you start to see something and how it might evolve, before you say, okay, we should flag this.
Daniel Ravner:What I would start seeing are narratives that are suspiciously similar, right? So people will talk about the same general narrative, maybe discuss an aggression of one country against another country, and it could be a complete fake, but we would start seeing discussions about, uh, aggression. Each would tell a different story, but we would recognize the similarity between them, because we vectorize all of the data that comes in, so we are able to understand
Amir Mizroch:What, what, what is vectorize?
Daniel Ravner:It's when you... everything that we look at in the real world has two, three dimensions, right? It has depth, size, color. AI looks at things differently. AI is a machine; it has to break things down into many more information points to understand what something is. So every narrative that comes in is vectorized by the AI.
Amir Mizroch:It's sorted into something.
Daniel Ravner:Everything is vectorized. And then based on that vectorization, based on that very nuanced and abstract understanding of the content, the machine sorts out similarities. It would say, this story is similar to that story by 89 percent. That story is similar to that story by 71 percent, or by 95 percent. Now, everything that is beyond 90 percent, the machine would highlight.
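To make the "vectorize and compare" idea concrete, here is a small sketch using off-the-shelf sentence embeddings and cosine similarity, with the 90 percent highlight threshold Daniel mentions. The embedding model, the similarity measure, and the exact threshold are illustrative choices for this example, not Brinker's disclosed stack.

```python
# A hedged sketch of "vectorize every narrative, then surface near-duplicates".
# Assumes the sentence-transformers and scikit-learn libraries are installed.
from itertools import combinations
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

posts = [
    "Country A launched an unprovoked attack on its neighbour last night.",
    "Last night country A attacked its neighbour completely without provocation.",
    "Here is my grandmother's recipe for lentil soup.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder
vectors = model.encode(posts)                     # one vector per post
scores = cosine_similarity(vectors)               # pairwise similarity matrix

THRESHOLD = 0.90  # "everything that is beyond 90 percent, the machine would highlight"
for i, j in combinations(range(len(posts)), 2):
    if scores[i, j] >= THRESHOLD:
        print(f"Posts {i} and {j} look like the same narrative ({scores[i, j]:.0%})")
```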
Amir Mizroch:Yeah.
Daniel Ravner:Because we look at narrative, as opposed to data points that are very clear, like, uh, when did the user join the platform, who are his friends, very binary information, we can see those narratives as they spread across all of social media. So I can say, listen, there was a story on Facebook and it's 95 percent similar to a story on Telegram. Everything that has a high similarity level, we can arrange to see in one place. And the moment you see that, the similarities emerge, because suddenly you see that many of the users use the same image, maybe some of them use the same story. And then you can see how it travels across time and across space. And if you see, for example, that on the 14th of December there was a huge spike in that specific story, that suggests this is something that is coordinated. So already you're getting to a point where there is a very clear understanding: you've seen this happening, here is something that looks like a coordinated attack at the very least. And when you investigate a few of the users to see where they all originated, where they came from, what they've talked about before, that investigation also happens automatically, and then you can start connecting the dots.
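The "spike on the 14th of December" signal can also be sketched simply: count how often a clustered narrative appears per day and flag days far above the usual volume. The mean-plus-three-standard-deviations cutoff below is an assumption made for illustration, not a disclosed detection rule.

```python
# A simple illustration of flagging coordinated spikes in a narrative's daily volume.
from collections import Counter
from datetime import date
from statistics import mean, pstdev

def coordinated_spikes(post_dates: list[date], sigma: float = 3.0) -> list[date]:
    """Return days whose post volume exceeds mean + sigma * std for this narrative."""
    daily = Counter(post_dates)
    counts = list(daily.values())
    if len(counts) < 2:
        return []
    cutoff = mean(counts) + sigma * pstdev(counts)
    return sorted(day for day, n in daily.items() if n > cutoff)

# Example: a narrative that trickles along, then surges on 14 December.
history = [date(2023, 12, d) for d in range(1, 14) for _ in range(2)]
history += [date(2023, 12, 14)] * 40
print(coordinated_spikes(history))  # -> [datetime.date(2023, 12, 14)]
```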
Amir Mizroch:It's a puzzle.
Daniel Ravner:It's a puzzle, but our idea is to make as much of the puzzle-solving as possible automatic, because again, you wouldn't necessarily be able to take it down, right? If Iran is doing a foreign influence campaign against Israel, you wouldn't take down Iran. But if you have, uh, an Excel file with all of the people in the network, with clear evidence that they're all fake or promoting terror speech, then your ability to take it down is much better.
Amir Mizroch:How often would you say that these kinds of influence operations are detected and then taken down? Like what percent?
Daniel Ravner:All day, every day.
Amir Mizroch:The good guys are fighting back and it's going okay.
Daniel Ravner:No, I think that the good guys have started fighting back, because of a few things. First of all, the advent of AI: it made it easier for malicious actors to attack us, but it also made it easier for us to find it. When the World Economic Forum says something like disinformation is the biggest risk factor, then automatically entrepreneurs in Israel and all across the world are trying to find a solution. So we have one solution which keeps evolving. There's also the understanding of governments that they have to tackle it. Again, just from a military perspective, we're doing fairly well on all fronts, but we are bitterly losing on the influence front. Bitterly losing. That's a disgrace.
Amir Mizroch:We got lynched. We got lynched online.
Daniel Ravner:And still are. It can come from lack of information; it can come from many different places. But the disinformation, that's like the 10 percent that causes all of the problem. The misinformation is spread by people who have a very deep belief that the narrative they are pushing for, or pushing to correct, is a moral one. They don't think they're doing anything wrong. And you have to understand that. If you look at those people and say, they're idiots, they're antisemitic, blah, blah, blah, right, some of them are antisemitic, but you have to understand the problem in all of its complexity. To understand, map out, and quantify the narrative, that's step one, and doing it fast makes a world of difference in your ability to actually tackle it.
Amir Mizroch:But it's not getting to our phones. If I get sent a WhatsApp from a friend who says, oh, look, look at this story, I mean, I can see that it's misinformation, but it's too late. That person has already, uh, formed that idea. Then I have to write to them that, no, I think there's another side to this. How do we do this quicker?
Daniel Ravner:We have a mitigation called counter-narrative. We tell our clients, you can't come to an angry mob with facts; that's useless. It's one thing to try and tackle the malicious actor through, uh, a cease-and-desist letter, for example, but the other thing to do is enter the conversation, and be smart about the way you're doing it by acknowledging the psychological forces at play. If somebody is currently very extreme, if somebody is convinced that Israel is Satan, I think it might be a waste of time to tackle them. But remember that social media really amplifies the people who are vocal to begin with. There are many other people who look into those discussions and form their opinion while looking at them. So having a say within the discussion, that's a meaningful way to curb the impact. Not necessarily to tackle the malicious actor, but to curb the impact, if you can do that at scale as well.
Amir Mizroch:So let's say the first part of your service is detection. Then you bring that file to the customer and you say, look, this is what's happening. These are in order of priority. This one is really, really important, this one less so. And here are a few things you can do about it: a cease-and-desist letter, here is something else you can do, and by the way, here's a counter-narrative you can deploy.
Daniel Ravner:That's doable in the system right now.
Amir Mizroch:How does that work? Is the counter-narrative then considered the truth, not misinformation?
Daniel Ravner:From where I'm standing, it's all about taking away the exclamation mark.
Amir Mizroch:How do you mean?
Daniel Ravner:Take the Israeli example, right? Because it's easy and because it's actually something that we deal with every day.
Amir Mizroch:Like, Israel is committing genocide in Gaza.
Daniel Ravner:Or, on the other end, Israel is a saint. The exclamation point: I want to take a sentence that is declarative and turn it into a sentence that is just a sentence, and maybe a sentence that ends with a question mark. We want to take away the extremism from the discussion. I think that's a worthy effort. I think that can make a change, because again, when everybody's saying something, you don't mind shouting it; but if it's a general opinion, or something that people are debating about, your ability to shout about it or to go to a protest about it is much lessened. What I'm planning to do is kind of put doubts into those debates. Uh, let's say that I put a tag there saying, listen, from everything we know, this looks like a foreign influence campaign, and that's it. Even that would change the entire conversation for you, as somebody just reading the dialogue in order to form an opinion.
Amir Mizroch:I like that. You know, as a former journalist, this idea that you can be sure of yourself at all times is something that, uh, I'm very, very uncomfortable with. People who put exclamation points, exclamation marks, you know, after what they say, to me it seems, okay, you're not leaving any room for doubt.
Daniel Ravner:And so that's what, again, the counter-narrative, the way we understand it, is: take down the exclamation mark. And I think if we're able to do that, that would be meaningful and huge. And technology allows us to do that at scale and fast. Again, misinformation, that's a narrative war. Whatever people are saying about Israel, that's a story. And if misinformation is successful, it's because the story was well constructed. And if you want to tackle that story, you have to tackle it with a story that is well constructed as well.
Amir Mizroch:Blessed are the storytellers, for they shall inherit the earth.
Daniel Ravner:I hope so.
Amir Mizroch:Okay, I think that's a good place to wrap it up. Thank you, Daniel. I learned a tremendous amount.
Daniel Ravner:Thank you for having me.
the-dejargonizer:Thanks for listening to The Dejargonizer for more episodes and ways to connect with me, please visit dejargonizerpod.com. That's dejargonizerpod.com