

Disinformation is the network

“Disinformation” may now be the most misused and poorly understood concept in the English language. Even the dictionaries have yet to catch up to the term’s real-world meaning.

We all understand that when someone says they were “misinformed” on a particular topic, they mean they were given incorrect information. What that does not tell us is why or how they were misinformed. Perhaps the information they received was simply out of date, for example, or the result of a miscommunication, a typo, or the like. Or perhaps it was what is often referred to as “fake news”: information that, whatever the motivation behind it, is largely or entirely made up.

However, no one says that they were “disinformed.” This is because real disinformation is not any single fact or story. As practiced by serious actors, disinformation is not a single piece of news or information, but a coordinated, sustained campaign composed of many different pieces of information, designed to influence a target audience over a period of time, such as during the course of a war or an election cycle.

Let’s talk about Freedom of Speech

Freedom of Speech is a basic right for all. Censorship has no place in a democracy. But of course free speech is often somewhat free from reality, and even more often free of any actual objective value. The Internet has not changed this. What it has done is make the free speech of many millions of people publicly and easily available. Human nature being what it is, a good chunk of this now highly visible free speech can reasonably be considered “fake news.”

Individual pieces of “fake news” can arise in a number of different ways. The majority of it is not the work of sinister forces, and in most instances it is fairly harmless to society. A delusional person posting that he saw three Martians strolling down his street is likely to have very limited impact. “Fake news” centered on a specific person is often the result of a grudge, a personal rivalry, or the like; if the target is a celebrity, there is an obvious financial motivation. Furthermore, reasonable people may disagree in many cases as to whether or not a story is fake. Many things are matters of guesswork or interpretation, for example whether someone “stole” a friend’s husband, or whether a given policy has proved good or bad. Reality is unfortunately messy, ambiguous, and subjective; apart from very clear-cut cases such as the Martians, arbitrating truth is an impossibility, and attempts to do so amount to censorship.

Such censorship is a wholly unjustifiable evil. In most cases, individual nuggets of information only begin to become harmful when they are both amplified and supported in a meaningful way by a broad network. (Amplification is simply various forms of repetition, often across different social media platforms. Support involves different online identities verifying and/or adding further details to the original information nugget; support transforms an individual piece of information into a narrative.) At the point at which this occurs, the individual piece of information starts to become something different, and more dangerous.

At Chenope, we thus focus on detecting collusion: the unnatural, inauthentic amplification or support of particular information. This approach has the merit of being mathematically objective and evidence-based, and it avoids the temptation to try to assess ground truth.

Chenope Disinformation Technology

Our technology is focused on analyzing the mathematical characteristics of the transmission networks of different stories in order to detect evidence of collusion. These transmission networks have detectable properties for a simple, common-sense reason: they are the direct result either of workers following orders or of the rules embedded in a computer program (a bot).
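To make the intuition concrete, here is a deliberately minimal sketch of one such network signal: accounts that amplify near-identical sets of stories. This is an illustrative toy, not Chenope's actual algorithm; the account names and story sets are invented, and real systems would use timing, platform, and many other features.

```python
from itertools import combinations

# Hypothetical sample data: account -> set of story identifiers it amplified.
# In a real pipeline these would come from collected platform data.
shares = {
    "acct_a": {"story1", "story2", "story3", "story4"},
    "acct_b": {"story1", "story2", "story3", "story4"},
    "acct_c": {"story9"},
}

def jaccard(s1, s2):
    """Overlap of two accounts' shared-story sets (0 = disjoint, 1 = identical)."""
    return len(s1 & s2) / len(s1 | s2)

def candidate_collusion_pairs(shares, threshold=0.8):
    """Flag account pairs whose sharing behavior overlaps suspiciously.

    Organic audiences rarely repost near-identical sets of stories;
    workers following a script or bots running the same program often do.
    """
    flagged = []
    for a, b in combinations(sorted(shares), 2):
        score = jaccard(shares[a], shares[b])
        if score >= threshold:
            flagged.append((a, b, score))
    return flagged

print(candidate_collusion_pairs(shares))
# acct_a and acct_b amplified identical story sets and are flagged together
```

The point of the sketch is that the flag rests on observable, measurable behavior of the network, not on any judgment about whether the stories themselves are true.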

However, because we know well that disinformation is often achieved through the suppression of specific pieces of information that do not serve the desired narrative, our technology also looks for evidence that specific information nuggets are being suppressed in content transmitted through a suspiciously behaving network.
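Suppression can be illustrated the same way. The toy sketch below, built on invented data (the "nugget" labels and posts are hypothetical), compares which key facts appear in baseline coverage of a story against which appear in a suspect network's retellings; facts that the network almost never mentions are candidates for deliberate omission.

```python
# Key facts ("nuggets") present in baseline coverage of a hypothetical story.
baseline_nuggets = {"casualty_count", "location", "attacker_identity", "date"}

# Nuggets actually mentioned by each post in the suspect network (toy data).
network_posts = [
    {"casualty_count", "location", "date"},
    {"location", "date"},
    {"casualty_count", "date"},
]

def suppressed_nuggets(baseline, posts, max_rate=0.1):
    """Nuggets present in baseline coverage but mentioned by at most
    max_rate of the network's posts: candidates for deliberate omission."""
    return {
        n for n in baseline
        if sum(n in p for p in posts) / len(posts) <= max_rate
    }

print(suppressed_nuggets(baseline_nuggets, network_posts))
# "attacker_identity" never appears in the network's posts
```

A real system would of course need to extract the nuggets from raw text first; the sketch only shows the comparison step.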

Our components similarly look for other types of evidence of inauthentic behavior, for example the use of automated translation technology, or content in a local forum that fails to correspond to regional linguistic patterns.
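One very crude way to operationalize "fails to correspond to regional linguistic patterns" is to compare the character-level statistics of a post against a baseline corpus from the region. The sketch below is a simplified illustration under that assumption, not Chenope's method; real linguistic fingerprinting would use far richer features than character bigrams.

```python
from collections import Counter

def char_bigrams(text):
    """Character bigram counts: a crude fingerprint of writing style."""
    t = text.lower()
    return Counter(t[i:i + 2] for i in range(len(t) - 1))

def style_distance(text, baseline_text):
    """Total-variation distance between the two bigram distributions.

    0.0 means identical style statistics; values near 1.0 mean the post
    shares almost no character patterns with the regional baseline,
    which may indicate machine translation or a non-local author.
    """
    a, b = char_bigrams(text), char_bigrams(baseline_text)
    ta, tb = sum(a.values()), sum(b.values())
    keys = set(a) | set(b)
    return 0.5 * sum(abs(a[k] / ta - b[k] / tb) for k in keys)
```

A post would be flagged for review when its distance from the baseline exceeds some empirically chosen threshold.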

Our approach is derived from the study of sophisticated disinformation campaigns around the globe. 

Ukraine Project / Disinformation

Well before the full-scale Russian invasion of 2022, Ukraine was ground zero for the testing of sophisticated Russian disinformation techniques. This is why Chenope has been operating in Kyiv since 2019. Among these techniques are:

  • The repeated execution of complex N-part narrative scripts carried out by a number of different cooperating accounts. These narratives often begin with information that is factually true, but each subsequent line of the scripted narrative diverges further from the truth and/or ignores key facts (e.g. the German occupation of parts of Ukraine)
  • Creating real-world events, such as vandalizing a Jewish cemetery in Ukraine, then both amplifying these events online and assigning blame to an innocent party
  • Deliberately polluting the Ukrainian national information space with wave after wave of mutually contradictory clickbait, each set soon replaced by the next, in the hope that most of the population will become demoralized and tune out the Internet.

For more information, please contact us at:

CVE Research in France

During the height of the Islamic State’s activity in France, thousands of “normal” French teenagers and young adults were successfully recruited by a network of highly disciplined and clever recruiters. The recruiters urged their targets either to come to Syria or to commit terrorist attacks in France. For a time, they were often successful.

Chenope collaborated with Dr. Dounia Bouzar to write a position paper on CVE for the benefit of NATO. (Dr. Bouzar’s company at the time held the French government contract to handle deradicalization efforts.) We were invited to present our work at a USASOC Futures Forum. Because Dr. Bouzar’s work was available only in French, we translated some of it into English; her innovative work, grounded in substantial field experience, was greatly appreciated in the English-speaking CT community. For access to these materials, please contact us at