The Feds Have Hired Moonshot CVE, Be Worried

By: Denise Simon | Founders Code

There was a time when CVE, short for Countering Violent Extremism, was first used by the Obama administration to describe efforts against terrorist groups such as the Haqqani Network, al Qaeda, and the Islamic State. Now it is used to classify anyone the federal government merely thinks is a terrorist, and that could mean you, based on nothing more than your various Internet searches. Your searches for patterns, reports, names, dates, and other details are captured by Google (that is, if you are still using Google, which you should not be), and then you are scored.

Sound crazy? This is a long read, so hang on through it all; this post is an effort to give you full context, or at least as much as possible.

Ready?

Let’s begin here:

From February of 2021, a little more than a month after the J6 event in DC, The Hill reported the following:

When armed insurrectionists stormed the Capitol on Jan. 6, Vidhya Ramalingam wasn’t surprised.

A day earlier, her company Moonshot CVE, which monitors and combats online extremism, set up a crisis team in response to a flood of indications that the pro-Trump rally scheduled for Washington could turn violent.

Moonshot works to pull back from the brink people who have been inculcated into white supremacist movements, conspiracy theories and radical ideologies, and it offered crisis intervention to some 270,000 high-risk users around the time of the Capitol breach.

“For organizations like ours that have been working on domestic violent extremism for many years, and in the run up to the election and the months that followed, this was not a surprise, that this attack happened,” Ramalingam said.

But even the 33-year-old Ramalingam, who has spent her entire career focused on the issue both domestically and abroad, says the widespread nature of radicalization in the U.S. is alarming.

“It’s a very scary moment in America right now. I mean, the implications are so wide-reaching,” she told The Hill in a recent interview. “There’s just the potential for so much more violence right now.”

Ramalingam got her start embedding herself with white nationalist groups in Sweden for two years as part of her graduate studies.

“It was really tough. I spent a lot of time around people saying and spouting lies about people of color and about immigrants and people like me, so there were moments that were really horrible to sit and listen to,” she said.

But the experience helped open a window into their world and how people become radicalized.

“Some of them had life experiences that had led them here. And for me, it was really important to see that in order to then start to piece together, well, how could you get someone out?” she said.

When far-right extremist Anders Behring Breivik murdered 77 people in Norway in July 2011, the European Union tapped Ramalingam to lead its first intergovernmental initiative to respond to right-wing terrorism, a job she held for three years.

She worked on deradicalization initiatives such as Exit Germany and Exit Sweden, whose efforts included setting up counseling interventions and training family members and loved ones.

She also sits on the board of Life After Hate, a U.S.-based group that provides similar interventions along with building a network of former extremists to push back on extremist content.

But Ramalingam says the problem needs larger-scale responses, something that became clear with the rise of ISIS and its use of social media to radicalize people.

“There was this sense of defeat, that the terrorists were winning and that they were just better than we were, they were able to use technology better,” she said.

The London-based Moonshot, which opened its D.C. office this week, seeks to scale up monitoring and intervention using the kinds of targeting that have become commonplace in business to build personalized responses.

The company counts a slew of governments, the United Nations and major tech companies such as Facebook and Google among its funders, and groups including the Anti-Defamation League among its partners.

“Technology can actually have the power to scale up really deeply personalized interactions the same way that every single advertisement we see is personalized towards me, my gender, my behavior online, my identity, where I live,” said Ramalingam.

“It really is literally the same thing that Coca-Cola is doing to sell us more Coke. We’re using those same tools to reach people and try and offer them safer alternatives, and either save their lives or save other people’s lives.”

Those efforts, which span widely used platforms such as Google and Facebook as well as more niche ones such as Gab and Telegram, have led to some surprising results.

While countering facts and ideological debates seldom work to engage people online, a more empathic approach seems to yield gains. In a recent round of tests, Moonshot’s target audience was 17 percent more likely to engage with posts featuring the simple message that “anger and grief can be isolating” compared to other tested messages.

Other content focused on deescalating anger and even breathing exercises also found fertile ground.

But Ramalingam says the threat is also evolving.

“We’ve seen this kind of blending and metastasization of various once-distinct ideologies, groups and movements. You know, everything from white supremacist and neo-Nazis with armed groups and anti-vaxxers and election conspiracies,” she says.

“These groups weren’t always coordinating, and now we’re suddenly seeing this mess online come together.”

There are a slew of factors at play, including what Ramalingam says is a tepid response from technology companies that have “systematically overlooked and been unwilling to respond” to the threat, though the Capitol insurrection last month could be changing that. Tech platforms, she notes, were far more aggressive when dealing with ISIS and have proven tools on issues such as suicide prevention that show how much more they could be doing.

Another major contributor to the problem has been the willingness of people in positions of power to bolster conspiracy theories and misinformation, whether through full-throated endorsements or more subtle means, such as winking claims that questions remain in actual clear-cut cases or that certain facts are unknowable.

“Political leaders and people in that level of power should absolutely not be lending any credence to conspiracy theories and disinformation. Lending even the tiniest inkling of credence to those conspiracy theories is hugely dangerous because of the position of power that they’re in,” she said.

Ramalingam is no stranger to Washington, having grown up just a few hours away and later testifying before Congress on the threat of white nationalism.

She says she has been in touch with senior members of the Biden administration on how to take a whole-of-government approach to combatting right-wing extremism, which FBI Director Christopher Wray says is the top terrorism threat the country faces.

She worries that the country will assume that the events of Jan. 6 were the apex of a movement, rather than simply the latest in a series of deadly attacks ranging from Charlottesville, Va., to Pittsburgh to El Paso, Texas.

“For those of us that have been working on this form of extremism for 10 plus years now, it would be misleading to say that this is the — kind of the crescendo and now it’s going to dissipate,” she said.

“I think there’s a risk for the U.S. government, that the response following the Jan. 6 events focuses on public statements and on Band-Aids and not on the changes and the real shifts that need to take place in the entire system to deal with domestic violent extremism,” she said.

Got it? Hold on, here comes the terrifying part…

From Fast Company:

How do you pull people out of the rabbit holes that lead to violent extremism, or keep them from falling in? If conspiracy-laced hate is another kind of pandemic pushed by online superspreaders, could we build something like a cure or a vaccine?

The deadly Capitol riot on January 6 has set off a fresh scramble to answer these questions, and prompted experts like Vidhya Ramalingam to look for new ways to reach extremists—like search ads for mindfulness.

“It’s so counterintuitive, you would just think that those audiences would be turned off by that messaging,” says Ramalingam, cofounder and CEO of Moonshot CVE, a digital counter-extremism company that counts governments like the U.S. and Canada and groups like the Anti-Defamation League (ADL) and Life After Hate among its clients. But Moonshot’s researchers recently found that Americans searching for information about armed groups online were more likely than typical audiences to click on messages that encourage calmness and mindful thinking.

“Our team tried it, and it seems to be working,” Ramalingam says. The finding echoes previous evidence suggesting that some violent extremists tend to be more receptive to messages offering mental health support. “And that’s an opening to a conversation with them.”

It’s a promising idea in a growing multimillion-dollar war—an effort that, even decades after 9/11 and especially after 1/6, is still hungry for tools to reach extremists. Old currents of violence and hate, amplified by a vicious cycle of platforms and propagandists, are straining relationships and communities, draining wallets, and putting new pressure on the U.S. government to steer its anti-terror focus toward homegrown threats. Last month, the Department of Homeland Security said it was granting at least $77 million toward ideas for stopping what the agency says now represents the biggest danger to Americans’ safety: “small groups of individuals who commit acts of violence motivated by domestic extremist ideological beliefs.”

The risk of violence is buoyed by a rising tide of conspiracy theories and extremist interest, which Ramalingam says has reached levels comparable to other “high risk” countries like Brazil, Sri Lanka, India, and Myanmar. In terms of indicators of extremism in the U.S., “the numbers are skyrocketing.”

How to reach people—and redirect them

To get those numbers, Moonshot goes to where the toxicity tends to spread, and where violent far-right groups do much of their recruiting: Twitter, YouTube, Instagram, and Facebook, but also niche platforms like MyMilitia, Zello, and Gab. But core to its strategy is the place where many of us start seeking answers—the most trafficked website of all. “We all live our lives by search engines,” Ramalingam says.

From an analysis of U.S. social media and search data by Moonshot and the ADL [Image: courtesy of Moonshot]

Social media tends to get the bulk of the attention when it comes to radicalization, but Google is also integral to the extremism on-ramp. And unlike social media, with its posts and shares and filters, a search can feel like a more private, largely unmoderated, experience. “We tell Google our deepest, darkest thoughts,” Ramalingam says. “We turn to Google and ask the things that we won’t ask our family members or partners or our brothers or sisters.”

Search can also convey to users an illusory sense of objectivity and authority in a way that social media doesn’t. “It’s important that we keep our eye on search engines as much, if not more than we do social media,” Safiya Noble, associate professor at the University of California, Los Angeles, and cofounder and codirector of the UCLA Center for Critical Internet Inquiry, recently wrote on Twitter. “The subjective nature of social media is much more obvious. With search, people truly believe they are experiencing credible, vetted information. Google is an ad platform, the end.”

Moonshot began in 2015 with a simple, insurgent strategy: Use Google’s ad platform—and the personal data it collects—to redirect people away from extremist movements and toward more constructive content. The idea, called the Redirect Method, was developed in partnership with Google, and widely touted as a way to reach people searching for jihadist content, targeting young men who were just getting into ISIS propaganda, or more radicalized people who might be Googling for information on how to sneak across the border into Syria. The idea is to steer potential extremists away—known as counterradicalization—or to help people who are deep down a rabbit hole find their way out through deradicalization. That might mean connecting them with a mentor or counselor, possibly a former extremist.
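
To make the mechanics concrete, here is a minimal sketch of the Redirect Method’s core idea: match risky search queries against a keyword list and answer them with a nonjudgmental ad that links to safer content. Every keyword, headline, and URL below is an invented placeholder; the reporting quoted here does not disclose Moonshot’s actual keyword lists, ad copy, or targeting logic, and in practice the matching happens inside Google’s ad platform rather than in code like this.

```python
from typing import Optional

# Illustrative sketch only: hypothetical risk keywords mapped to the
# kind of gentle, nonjudgmental "redirect" ads the articles describe.
RISK_KEYWORDS = {
    "jihadist_travel": ["how to get to syria", "join isis"],
    "armed_groups": ["join militia near me", "militia training"],
}

# Each category points to counter-content: testimonies from former
# extremists, crisis counseling, mindfulness resources, and so on.
REDIRECT_ADS = {
    "jihadist_travel": {
        "headline": "Looking for answers? Hear from people who went.",
        "url": "https://example.org/former-extremist-testimonies",
    },
    "armed_groups": {
        "headline": "Anger and grief can be isolating. Talk to someone.",
        "url": "https://example.org/crisis-counseling",
    },
}

def match_redirect_ad(query: str) -> Optional[dict]:
    """Return a redirect ad for the first risk category whose keywords
    appear in the query, or None if the query matches nothing."""
    q = query.lower()
    for category, keywords in RISK_KEYWORDS.items():
        if any(kw in q for kw in keywords):
            return REDIRECT_ADS[category]
    return None

if __name__ == "__main__":
    print(match_redirect_ad("Join Militia Near Me"))  # armed_groups ad
    print(match_redirect_ad("weather tomorrow"))      # None (benign)
```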


Ramalingam has seen these methods work up close. A decade ago, as part of her graduate studies, she embedded herself among neo-Nazis in Scandinavia, where a system of counseling and exit programs was helping bring people back to sanity and family. In 2015, she and another counter-extremism researcher named Ross Frenett started Moonshot to drive that approach using search ads, with a name that described their far-reaching goal. “If we knew that that worked offline,” she says, “couldn’t we test whether this would work online?”

What began with a focus on jihadism and European white supremacy is now part of an effort to track a nexus of extremism, conspiracy theories, and disinformation—from QAnon to child exploitation content—from Canada to Sri Lanka. But for Moonshot, the U.S. is a new priority. Last month, Ramalingam, who grew up in the states, returned to open the company’s second office in D.C., where it can be closer to policy makers and practitioners. The company is also dropping the acronym from its birth name, Moonshot CVE: “Countering violent extremism” has become nearly synonymous with a misguided overemphasis on Muslim communities, Ramalingam points out, and in any case, old tactics aren’t sufficient. As extremist ideas have stretched into the mainstream, Moonshot’s once tiny target audiences now number in the millions.

“We can’t rely on what we knew worked when we were dealing with the dozens and the tens of people that were really on the fringes,” she says. “We need to be testing all sorts of new messaging.”

Understanding the data

If you were among the thousands of Americans who Googled for certain extremist-related keywords in the months around the election—phrases like “Join Oath Keepers Militia,” “I want to shoot Ron Wyden,” and “How to make C4”—you may have been targeted by the Redirect Method. It could have been a vague, nonjudgmental message at the top of your search results, like “Don’t do something you’ll regret.” Click, and you could end up at a playlist of YouTube videos with violence-prevention content, like a TED Talk by a would-be school shooter or testimonies from former neo-Nazis. Or you might encounter videos promoting calmness, or a page directing you to mental health resources. Around January 6 alone, Ramalingam says more than 270,000 Americans clicked on Moonshot’s crisis-counseling ads.

To do this, Google has given Moonshot special permission to target ads against extremist keywords that are typically banned. But while Moonshot launched the Redirect Method with Google’s help, these days it typically pays the ad giant to run its campaigns, just like any other advertiser. And now, given the sheer scale of the audiences Moonshot is reaching in the U.S., “the costs are off the charts,” Ramalingam says. Regarding its recent ADL-backed campaign, she says, “We’ve never paid this much for advertising in any one country on a monthly basis.”

This ad data comes with caveats. When looking at extremist search terms, for instance, Moonshot can’t be certain it’s measuring individual people or the same person searching multiple times. It also can’t know if it’s targeting an extremist or a journalist who’s simply writing about extremism.

Sample of U.S. Google search data during the three months around Election Day [Image: courtesy of Moonshot]

Still, the company is bringing more empirical evidence and scientific rigor to a field that sorely needs it, says Colin Clarke, the director of policy and research at the Soufan Center, an independent non-profit group that studies extremism. Moonshot’s data is even more concerning, Clarke says, because of another statistic that’s not exactly captured in Google analytics.

“At a time when people have been locked in their homes and consuming disinformation, with record levels of domestic violence, anxiety, depression, substance abuse, what’s the antidote? People have bought guns and ammunition in record numbers. So they’re anxious, they’re angry, isolated, and they’re well-armed,” he says. “It’s a perfect storm.”

In a recent analysis, done in partnership with the ADL and gathered in a report titled “From Shitposting to Sedition,” Moonshot tracked tens of thousands of extremist Google searches by Americans across all 50 states during the three months around Election Day. It saw searches spike around big political events, but also along geographic and political lines. In states where pandemic stay-at-home orders lasted 9 or fewer days, white-supremacist-related searches grew by only 1%; in states where stay-at-home orders were 10 days or longer, the increase was 21%.

The politics of the pandemic fomented domestic extremist interest, but also helped unite disparate fringe movements, from militias to climate denialists to anti-maskers and anti-vaxxers. “We started to see this worrying blending and metastasization of all these different ideologies in the U.S.—far-right groups blending and reaching across the aisle to work with anti-vax movements,” Ramalingam says. And it’s during times of crisis, she notes, “when we see these actors just grasping to turn fear and anxiety in society into an opportunity for them to grow.”

But Ramalingam isn’t just concerned about the most hard-core armed believers. After the election and the events of January 6, she worries now about splintered far-right groups and disaffected conspiracy theorists who are grappling for meaning. That puts them at risk of further radicalization, or worse.

“There are a lot of people who basically just feel misled, who feel like they’ve lost a lot because they followed these conspiracy theories,” she says. QAnon channels filled up with anxiety, self-harm, and talk of suicide, “like a father saying, ‘My son won’t speak to me,’ people who have lost their jobs, people who said, ‘I lost my family because of this,’ ” Ramalingam says. “And so there’s a real moment now where we need to be thinking about the mental health needs of people who, at scale, bought into these conspiracy theories and lies.”

What to say 

To reach violent radicals or conspiracy theorists to begin with, Ramalingam urges caution with ideological arguments. Shaming, ridiculing, and fact-based arguing can prove counterproductive. In some cases, it can be more effective to use nonjudgmental and nonideological messages that don’t directly threaten people’s beliefs or tastes but that try to meet them where they are. For instance, as Frenett suggests, if someone is searching for Nazi death metal, don’t show them a lecture; instead, show them a video with a death metal score, but without the racism.

Simple reminders to be mindful, and to think about how one’s actions impact others, may help. In its recent campaign, some of Moonshot’s most effective messaging asked people to “reflect and think on their neighbors, their loved ones, the people in their immediate community around them, and just to reflect on how their actions might be harmful to their loved ones,” Ramalingam says.

People interested in armed groups were most receptive to messages of “calm” offering mindfulness exercises. For all audiences, Moonshot found particularly high traction with an ad that said, “Anger and grief can be isolating.” When people clicked through, to meditation content or mental health resources, Ramalingam notes that “they seem to be watching it, or listening to it, or engaging with it for a long time.”

To reach QAnon supporters, Moonshot found the most success with messages that seek to empathize with their need for psychological and social support. “Are you feeling angry? Learn how to escape the anger and move forward,” said one Moonshot ad directed at QAnon keywords, which saw a click-through rate around 6%, twice that of other types of redirect messages. Clicking on that took some users to r/Qult_Headquarters, a subreddit that includes posts by former adherents.
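
The click-through comparison in that paragraph is simple arithmetic. A quick sketch with invented impression and click counts, chosen only so the rates land near the roughly 6% and twice-the-baseline figures the article reports:

```python
# Hypothetical counts chosen to mirror the reported ~6% CTR for the
# empathetic QAnon-targeted ad vs. ~3% for other redirect messages.
empathy_clicks, empathy_impressions = 600, 10_000
other_clicks, other_impressions = 300, 10_000

empathy_ctr = empathy_clicks / empathy_impressions  # 0.06
other_ctr = other_clicks / other_impressions        # 0.03

print(f"empathetic ad CTR: {empathy_ctr:.1%}")           # 6.0%
print(f"other redirect ads CTR: {other_ctr:.1%}")        # 3.0%
print(f"relative lift: {empathy_ctr / other_ctr:.1f}x")  # 2.0x
```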

Preventing the spread of violent extremist ideas involves a broader set of strategies. To bolster trust and a shared reality among the general public—people who haven’t yet gone down the rabbit hole—researchers are exploring other countermeasures like “attitudinal inoculation,” alerting people to emerging conspiracy theories and warning of extremist groups’ attempts at manipulation.

Experts also urge enhancing public media literacy through education and fact-checking systems. Governments may not be trusted messengers themselves, but they could help in other ways, through a public health model for countering violent extremism. That could mean funding for local community activities that can keep people engaged, and for mental health and social programs, an idea that then-Vice President Joe Biden endorsed at a 2015 White House summit on countering violent extremism.

Speaking of the White House, Ramalingam emphasizes that extremist ideologies warrant stern condemnation from public figures. Companies should deplatform the superspreaders of racism and disinformation, and political, cultural, and religious leaders should vehemently denounce them.

“Rhetoric that’s shaming of those ideologies can be really important and powerful from people in positions of power,” Ramalingam says. That’s for an already skeptical audience “that needs to hear it reinforced, but also the audience that is in the middle and doesn’t really know or doesn’t care that much. And that audience really needs to hear, ‘This is not okay. This is not acceptable. This is not a social norm.’ ”

But when addressing more extremist-minded individuals, Ramalingam suggests a gentler approach. “If someone is coming at you with an attack, you kind of pull yourself back into a corner and stand your ground and defend it,” she says. “And so if that’s our approach with the most extreme of society, that will actually worsen the problem.”

Does this work?

In the face of the domestic terror threat, mindfulness and compassion might sound like entering a space-laser fight with a water pistol or a hug.

But to Brad Galloway, who helps people exit right-wing extremist groups, Moonshot’s messaging makes sense. In a previous life, he used chat rooms and message boards to recruit people into a neo-Nazi group. After he joined—drawn in largely by camaraderie and music—what had been a U.S.-only organization eventually grew to 12 countries, thanks largely to the internet. Now Galloway is a coordinator at the Center on Hate, Bias and Extremism at Ontario Tech University, where he often urges his mentees to be more mindful, especially online.

“I ask people to think, Do I really need to watch this video of a school shooting?” Instead he encourages “positive content” to displace the stuff that can accelerate or even provoke radicalization.

Galloway, who has worked with Moonshot, Life After Hate, the Organization for the Prevention of Violence, and other groups, says the same principle of positive content applies to real life, too: Connecting with old friends and finding fun new activities can help people leave corrosive extremist communities. “What’s positive to that user, and how do we make that more prominent to them?”

Sample of U.S. Google search data around Election Day [Image: courtesy of Moonshot]

That’s not just a rhetorical question. What content works with which audience? Who is reachable? What counts as success? And how do strategies like the Redirect Method influence extremists? 

A 2018 Rand Corp. report on digital deradicalization tactics found that extremist audiences targeted with Redirect “clicked on these ads at a rate on par with industry standards.” Still, they couldn’t say what eventual impact it had on their behavior. As new funding flows in, and as experts throw up an arsenal of counter-radicalization ideas, there’s still scant evidence of what works.

For its part, Moonshot says its data suggests that some of its target audiences have viewed less extremist content, and points to the thousands of people it has connected to exit counseling and mental health resources. Still, Ramalingam says that the company sees “greater potential for us to assess whether our digital campaigns can lead to longer-term engagement with users, and longer-term change.”

There are other serious concerns as well. The missteps of previous digital wars on terror haunt Moonshot’s work: secret and extralegal surveillance systems, big data political warfare by military counter-radicalization contractors-turned-conspiracy mongers, untold violations of privacy and other civil rights. If Moonshot is tracking what messaging influences who, what data does it collect about “at risk” users, and where does that end up, and why? And who is at risk to begin with?

Ramalingam worries about the privacy concerns; she acknowledges that thanks to ad platforms and brokers, Moonshot can tap into “actually a heck of a lot of data.” But, she stresses, Moonshot isn’t accessing people’s private messages, and its work is bound by the stricter European personal data protections of the GDPR, as well as by an ethics panel that helps evaluate impacts. In any case, she argues, Moonshot is simply taking advantage of the multibillion-dollar digital platforms that drive most of the internet, not to mention the markets.

“As long as Nike and Coca-Cola are able to use personal data to understand how best to sell us Coke and sneakers, I’m quite comfortable using personal data to make sure that I can try and convince people not to do violent things,” Ramalingam says. Should that system of influence exist at all? “I’m totally up for that debate,” she says. “But while we’re in a context where that’s happening, I think it’s perfectly reasonable for us to use that sort of data for public safety purposes.”

What about the platforms?

The tech giants have run their own redirect and counter-speech programs as part of ongoing efforts to stem the toxicity that flourishes on their platforms. Google touts its work with Moonshot battling ISIS, its research on extremism, and its efforts to remove objectionable content and reduce recommendations of “borderline” content on YouTube. In December, its Jigsaw unit published its findings on the digital spread of violent white supremacy.

Facebook tested the Redirect Method in a 2019 pilot in Australia aimed at nudging extremist users toward educational resources and off-platform support, a system that echoes its suicide-prevention efforts, which use pop-ups to redirect at-risk users to prevention hotlines. In an evaluation commissioned by Facebook last year, Moonshot called the program “broadly successful,” and recommended changes for future iterations. Facebook has also tested the program in Indonesia and Germany.

Ramalingam praises the tech platforms for their efforts, and supports their decisions to deplatform vast numbers of far-right and QAnon-related accounts, even if that’s made researching online extremism harder. Still, she says, Big Tech is doing “not nearly enough.”

Extremist content continues to slip through the platforms’ moderation filters, and then gets rewarded by the algorithms. Facebook’s own researchers have repeatedly shown how its growth-focused algorithms favor extremist content. Despite YouTube’s moderation efforts, political scientist Brendan Nyhan recently reported, the site’s design can still “reinforce exposure patterns” among extremist-curious people.

“The tech companies have an obligation to use their great privilege . . . of being a conduit of information, to get information and helpful resources to people that might be trapped in violent movements,” Ramalingam says.

As companies and lawmakers and law enforcement scramble for solutions in the wake of the events of January 6, Ramalingam also cautions against rash decisions and short-term thinking. “There’s an imperative to act now, and I have seen in the past mistakes get made by governments and by the tech companies just delivering on knee-jerk responses,” she says. “And then once the conversation dies down, they go back to essentially the way things were.”

Emotional reactions are understandable, given the shock of January 6, or of a family member who’s fallen down a rabbit hole, but they tend to be counterproductive. What works for battling violent extremism on a personal, one-on-one level, Ramalingam says, can also help fight it on a national scale: Avoid assumptions, be mindful, and consider the actual evidence.

“The way counselors and social workers do their work is they start by asking questions, by trying to understand,” she says. “It’s about asking questions so those people can reflect on themselves.”

Not finished yet… it gets worse.

From the Center for Security Policy (CSP):

The Defense Department, led by controversial diversity chief Bishop Garrison, has commissioned a study to investigate “extremism” in its ranks. But the chosen contractor may raise additional questions for a DOD that is already facing increasing Congressional scrutiny over accusations of politicization.

The U.S. Military Academy reportedly is working with a London, England-based firm, Moonshot CVE [Countering Violent Extremism], whose CEO is Vidhya Ramalingam, a former Obama Foundation leader. Ramalingam is also the author of a 2013 paper on immigration in Europe funded by a grant from George Soros’ Open Society Foundations.

Ramalingam told Defense One she spoke with Garrison personally last month about how the Pentagon could use technology developed by her company to “find and eliminate extremism in the ranks.”

Why would the Pentagon hire a U.K.-based company to study allegations of extremism in the U.S. military? Why hire a politically connected group like Ramalingam’s?

It suggests that Garrison and Secretary of Defense Lloyd Austin may be looking for a predetermined answer. A deeper dive into Moonshot CVE might help unravel what they have in mind.

Moonshot CVE co-founder Ross Frenett expressed his support for Critical Race Theory (CRT) on Twitter last month, calling the opposition “Horrifying.” Joint Chiefs Chairman Mark Milley recently faced stiff criticism from congressional Republicans over the military’s recent moves to incorporate CRT elements into their training.

Moonshot CVE’s website dismisses Antifa’s and Black Lives Matter’s Marxist leanings and claims that those who assert their Marxism have engaged in a “white supremacist disinformation” campaign “as a means of delegitimizing it.”

“These sources echo far-right extremist disinformation narratives about BLM protesters trying to overthrow the republic and harm American citizens in a Marxist coup,” Moonshot CVE wrote in a paper jointly published with the Anti-Defamation League (ADL).

Of course, Antifa and BLM groups haven’t been shy about identifying themselves as Marxists. A popular graphic that circulated on pro-Antifa websites and Telegram accounts during the so-called “George Floyd Rebellion” of June 2020 claimed, “Militant networks will defend our revolutionary communities. Liberation begins where America dies,” and the status of BLM founders as self-identified “trained Marxists” has been openly discussed in the press.

Ramalingam and her organization claim that Antifa is unorganized, ignoring evidence of significant local, regional and international Antifa networks, and substantial material support from an extensive far-left network (including, as discussed below, the Rosa Luxemburg Stiftung). An extensive social media network, including peer-to-peer encrypted apps, also exists, where BLM and Antifa activists share propaganda and techniques.

Why does Moonshot CVE fixate exclusively on “far-right” extremism, and work to minimize or deny the evidence of left-wing extremism?

One reason might be Moonshot’s apparent association with a German far-Left organization which is overtly pro-Marxist and pro-Antifa, and whose leaders have historical ties to Russian intelligence.

Ramalingam is a regular contributor to programs for an initiative at American University in Washington, D.C. called The Polarization and Extremism Research and Innovation Lab (PERIL). She participated in PERIL-sponsored seminars in October 2020, in April, and last month.

PERIL has partnered with the Rosa Luxemburg Stiftung (RLS), the think tank of the German political party Die Linke (The Left). Die Linke is the successor of the former East German communist party. The think tank is named for Rosa Luxemburg, a German Communist revolutionary whose ideas pioneered the Marxist examination of race and gender and who was killed during the 1919 German communist uprising. A 2008 report by the German Federal Office for the Protection of the Constitution calls “the memory” of Luxemburg a “traditional element of Left-wing extremism.”

This alliance could be revealing about Ramalingam’s and PERIL’s ideological orientation.

PERIL’s description of the RLS is misinformation and raises questions about what else it glosses over.

PERIL unsurprisingly omits the fact that the organization’s top leaders belonged to East Germany’s ruling party, the Socialist Unity Party (SED), and/or were employees or informants of the Soviet KGB-run STASI. Many former STASI members shifted their allegiance to the KGB after the STASI was disbanded, a defector told The Washington Post in 1990. Die Linke is a pro-Russia stalwart. RLS’s representative in Moscow is Kerstin Kaiser, a former STASI employee who provided reports that were passed to the KGB.

Kaiser belongs to the Petersburger Dialogue, along with Andre Brie, another RLS leader and former STASI employee. Vladimir Putin and former German Chancellor Gerhard Schroeder, an important figure in Russia’s controversial Nord Stream 2 pipeline, created the group in 2001 to foster closer Russian-German relations.

“It stands in the tradition of the workers’ and women’s movements, as well as anti-fascism and anti-racism,” PERIL says of the RLS on its website.

Given that the Rosa Luxemburg Stiftung was founded in 1990, after the fall of the Berlin Wall (known officially in East Germany as the “Antifascist Protection Barrier”), one might have questions about what “traditions” of antifascism the group actually stands for.

PERIL’s head Cynthia Miller-Idriss wrote a blog post on “radicalization” during COVID for the RLS’s New York office last year, and thanked RLS on Twitter for the opportunity to write for it.

Miller-Idriss and Ramalingam both participated in a conference in Jena, Germany, called “Hate Not Found,” sponsored by the Institute for Democracy and Civil Society last December, where Miller-Idriss was the keynote speaker. Rosa Luxemburg Foundation member Maik Fielitz was on a panel at the conference that discussed “deplatforming the far-Right.”

Ramalingam and Miller-Idriss both contributed articles to a journal issue on far-Right “radicalization” in November 2020.

RLS’s global head Dagmar Enkelmann belonged to the SED and the East German parliament before the wall fell. Gregor Gysi, who helped open the RLS’s New York office in 2012 and who visited last month, headed the SED when it rebranded itself as the “Party of Democratic Socialism” in December 1989. Gysi allegedly informed on his legal clients to the STASI. A bloc in the German Bundestag expelled him in 1992 for seeming to defend the STASI.

STASI informants played a key role in promoting the climate of fear that kept East German society under control. RLS hosted former East German spy chief Werner Grossmann in 2010 for a talk on his book.

East Germany’s last Premier Hans Modrow is an RLS member, and the RLS manages his foundation, the Hans Modrow Stiftung. Modrow had close KGB ties, including to KGB Chairman Vladimir Kryuchkov, who ran the Soviet spy agency during Modrow’s tenure as Dresden Communist Party boss. Modrow supervised the dismantling of the STASI together with Grossmann, and received the Order of Friendship from Vladimir Putin in 2017. He remains embittered toward Mikhail Gorbachev for allowing the collapse of the East German regime.

As a young KGB major, Putin supervised a local STASI office in Dresden, while Modrow was the local party boss.

The STASI trained the Red Army Faction (RAF), a predecessor of today’s Antifa.

RLS funded Antifa activities in Germany, and Die Linke openly supports Antifa. The Hamburg, Germany, Antifa chapter even promoted a Rosa Luxemburg Stiftung panel on its Facebook page. Friedrich Burschel, editor of “Antifaschistisches Infoblatt,” advises the Rosa Luxemburg Foundation on subjects related to right-wing extremism and fascism. “Antifaschistisches Infoblatt,” the oldest Antifa publication, first published in 1987 in East Berlin, runs its articles on the Rosa Luxemburg Stiftung-funded website Linksnet, a collaboration of far-Left magazines.

The RLS hosted two BLM founders, Alicia Garza and Opal Tometi, in 2014 and 2015 respectively. Garza attended the RLS-sponsored “Mapping Socialist Strategies” seminar in August 2014. RLS leader and former “unofficial STASI employee” Michael Brie spoke at this event. His brother Andre Brie spoke at a 1994 “Committees of Correspondence for Liberation and Socialism” conference along with Angela Davis, who has become influential in BLM. Davis worked closely with the East German regime in the 1970s, and she was a guest of honor at an event sponsored by Die Linke a decade ago. RLS’s New York office hosted BLM propagandist Shaun King in 2017.

The Southern Poverty Law Center (SPLC) is another PERIL partner with whom Ramalingam has worked. The SPLC has also received money from the Rosa Luxemburg Stiftung. The SPLC is an extremely controversial organization that has been accused by its own former employees of bias and of deliberately overinflating supposed far-right threats for fundraising. SPLC has defended Antifa. Former SPLC Intelligence Project Director Heidi Beirich and SPLC Intelligence Project Senior Analyst Evelyn Schlatter participated in a June 2017 RLS-sponsored session in New York called “Strategies Against the Far Right.” Ramalingam and Beirich are both advisory group members of a pan-European “anti-radicalization” project called The DARE Consortium. In October, Ramalingam, Beirich and Miller-Idriss collaborated on a podcast on countering extremism sponsored by the ADL.

Moonshot CVE’s alliance with RLS-backed PERIL reinforces the perception that the Biden Pentagon’s hunt for extremism is actually an excuse for classifying dissenting views as “extremist.” And the fact that the pro-Russian, ex-STASI-controlled RLS endorses the same talking points as Moonshot CVE shows that those talking points come from a far-Left extremist perspective. U.S. troops shouldn’t be subjected to ideological warfare.

The fact that Moonshot CVE equates opposing Antifa with extremism reminds us that this company doesn’t deserve taxpayer money or the Pentagon’s cooperation.

You’re already guilty just for the research you do, while so many other cases are not prosecuted at all. Take caution, reader…

Even federal contracts have gone to universities…

George Washington University School of Law’s Program on Extremism has created an online resource for tracking the hundreds of criminal cases filed by the Biden Justice Department against United States citizens for their alleged actions on January 6th. The Administration has charged people from all 50 states, and as is reflected in the “Capitol Hill Siege” project archive, every case has been filed in the District of Columbia. Read more here.
