According to the Department of Justice, the "Good Old USA" project was directly overseen by Sergei Kiriyenko, the first deputy chief of staff of the Russian presidential executive office, potentially with Putin's own knowledge. Documents revealed that the orchestrators deliberately exploited existing divisions within American society, using racist stereotypes and far-right conspiracy theories to appeal to supporters of former President Donald Trump.
As stated in their planning documents: "It makes sense for Russia to put a maximum effort to ensure that the Republican Party's point of view (first and foremost, the opinion of Trump supporters) wins over the US public opinion. This includes provisions on peace in Ukraine in exchange for territories, the need to focus on the problems of the US economy, returning troops home from all over the world, etc."
The Justice Department released documents showing how the Russian operation extensively used "cybersquatted" domains, registering names intended to mimic legitimate websites (e.g., washingtonpost.pm to mimic washingtonpost.com). These fake sites published Russian government messaging falsely presented as content from legitimate news organizations.
In addition, the operation deployed "influencers" worldwide, paid for social media advertisements (often created using AI), and created social media profiles posing as U.S. citizens to post comments on social media platforms with links to the cybersquatted domains.
Our strategy centers on developing and deploying narratives at the intersection of media, psychology, and grassroots activism. Rather than simply reacting to disinformation after it spreads, we take a proactive approach: crafting compelling, fact-based narratives that directly confront Russian manipulation techniques.
This asymmetric information response model combines the speed and reach of AI with human creativity and cultural understanding. By distributing factual content through the same channels, and with engagement techniques similar to those used by disinformation actors, we can reach vulnerable audiences before false narratives take hold. The strategy "fights fire with fire," using the power of emotional storytelling and memetic content while maintaining rigorous factual standards.
The Human Task Force (HTF) represents a dual-purpose initiative: combating disinformation while providing economic opportunities to people in underserved regions worldwide. Our assistants aren't just fighting falsehoods—they're earning meaningful income through flexible, remote work.
Each assistant receives specialized training in identifying disinformation techniques, deploying effective counter-narratives, and using our AI-enhanced tools to maximize their impact. They work as independent contractors, completing micro-tasks that collectively create a powerful network of truth advocates across social media platforms. They seek out high-engagement misinformation posts so that corrections reach the widest possible audience.
Grok AI serves as an especially powerful tool in our counter-disinformation toolkit because its creator, Elon Musk, has dubbed it "the Truth Machine." When Grok contradicts false narratives, even those originating from its own creator, it creates a uniquely persuasive effect.
Our assistants leverage this capability by engaging with misleading content on X/Twitter, asking Grok to fact-check claims in real-time, and sharing those corrections widely. This approach is particularly effective because it uses the platform's own AI tools to correct misinformation within the same ecosystem.
Disinformation narrative: "The U.S. is wasting billions on Ukraine while Americans suffer from inflation and crumbling infrastructure."
Counter-narrative: "U.S. investment in Ukraine represents less than 0.3% of our budget. The real causes of inflation are corporate price hikes (up 54%) while worker wages rose only 4%."
Disinformation narrative: "White middle-class Americans are being discriminated against while 'people of color' receive preferential treatment."
Counter-narrative: "Economic challenges are hitting all Americans. The real wealth gap is between the top 1% and everyone else. Working families across all backgrounds are struggling with similar issues."
Disinformation narrative: "Peace in Ukraine can only be achieved by accepting Russian territorial claims. The U.S. is prolonging the war by supporting Ukraine."
Counter-narrative: "History shows that appeasing territorial aggression leads to more conflict, not peace. Supporting Ukraine's sovereignty now prevents a wider, costlier war later. True peace requires justice."
"Prebunking" is a powerful approach where we educate audiences about manipulation techniques before they encounter them. This creates psychological immunity to disinformation, making people significantly less likely to believe or share false content later.
Research shows prebunking messages that follow a specific structure - warning people about the threat, explaining how the manipulation works, and showing a weakened example - can build lasting resistance to targeted manipulation techniques.
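The three-part structure described above (warn, explain, show a weakened example) can be captured as a simple content template. This is an illustrative sketch only; the class name, field names, and sample text are assumptions, not part of any existing tooling.

```python
from dataclasses import dataclass


@dataclass
class PrebunkMessage:
    """Three-part prebunking structure: warn, explain, inoculate."""
    warning: str           # alert the audience that a manipulation attempt is coming
    technique: str         # explain how the manipulation technique works
    weakened_example: str  # a low-dose example that builds resistance

    def render(self) -> str:
        # Assemble the parts in the research-backed order: warn, explain, example
        return "\n\n".join([
            f"Heads up: {self.warning}",
            f"How it works: {self.technique}",
            f"Weakened example: {self.weakened_example}",
        ])


# Hypothetical sample message following the structure
msg = PrebunkMessage(
    warning="You may soon see posts blaming U.S. inflation entirely on Ukraine aid.",
    technique="This uses false attribution: tying two unrelated trends together to manufacture a cause.",
    weakened_example="'Gas prices rose the same week an aid package passed, so the aid caused it.'",
)
print(msg.render())
```

Encoding the structure as a template helps assistants produce consistent prebunking content without each person memorizing the format.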
Our comprehensive measurement approach goes beyond traditional social media metrics to assess the true impact of our counter-narrative strategy. At scale, we will be able to evaluate not just reach and engagement, but actual attitude and belief change among target demographics.
Through partnerships with research institutions, we can conduct regular polling and sentiment analysis to measure shifts in public opinion on key issues targeted by disinformation campaigns. This data feeds back into our strategy, creating a continuously improving counter-narrative system.
Each subscription funds a full year of operations for the specified number of HTF assistants. Each assistant is expected to produce 40-80 pieces of content daily across targeted platforms, primarily focusing on X/Twitter, where disinformation is particularly prevalent.
As the number of assistants increases, we achieve significant economies of scale that allow us to pass on savings while dramatically increasing the reach and impact of the counter-narrative campaign. With 1,000 assistants, we can generate and distribute over 40,000 pieces of factual content daily.
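The volume figures quoted above follow from simple arithmetic. The daily numbers come from the text; the annual extrapolation is an illustrative assumption added here for scale.

```python
# Back-of-envelope check of the content-volume figures
assistants = 1_000
pieces_per_day_low, pieces_per_day_high = 40, 80  # per-assistant range from the text

daily_low = assistants * pieces_per_day_low    # floor of daily output
daily_high = assistants * pieces_per_day_high  # ceiling of daily output
annual_low = daily_low * 365                   # assumed year-round operation

print(f"Daily output: {daily_low:,}-{daily_high:,} pieces")
print(f"Annual floor: {annual_low:,} pieces")
```

At the low end this confirms the "over 40,000 pieces daily" claim; at the high end, 1,000 assistants could produce twice that.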
Direct campaign focus on specific topics and narratives
Choose specific demographic and geographic targets
Quarterly detailed reports of campaign performance metrics
Access to emerging narrative trends and audience insights
Create economic opportunities in underserved communities
Support global development while promoting truth and justice
Russia's Project "Good Old USA" represents an unprecedentedly sophisticated influence operation targeting American democracy.
HTF assistants leveraging AI create an agile, cost-effective response system that can rapidly deploy fact-based counter-narratives and memes at scale.
We're facing an unprecedented wave of coordinated disinformation ahead of key elections. Early intervention is crucial.
Social media companies are retreating from content moderation. We need a proactive, community-based response.
Our model has demonstrated effectiveness in initial testing. With your support, we can scale rapidly to meet the challenge.
Supporting organizations receive exclusive quarterly briefings detailing emergent misinformation patterns, trends, and the effectiveness of various counter-narrative strategies. These intelligence reports provide valuable insight into the evolving nature of information warfare that extends beyond the specific campaign.
Additionally, supporters gain the ability to direct campaign focus toward specific issue areas of concern, ensuring that resources are allocated to combating the most harmful disinformation affecting their constituencies and priority areas.
The fight against disinformation requires immediate action. Truth is the best defense. Join us in protecting democracy.
Meta, which owns Facebook and Instagram, blocked news from its apps in Canada in 2023 after a new law required the social media giant to compensate Canadian news publishers for their content. The ban applies to all news outlets irrespective of origin, including The New York Times.
Amid the news void, Canada Proud and dozens of other partisan pages are rising in popularity on Facebook and Instagram before the election. At the same time, cryptocurrency scams and ads that mimic legitimate news sources have proliferated on the platforms. Yet few voters are aware of this shift, with research showing that only one in five Canadians knows that news has been blocked on Facebook and Instagram feeds.
The result is a "continued spiral" for Canada's online ecosystem toward disinformation and division, said Aengus Bridgman, director of the Media Ecosystem Observatory, a Canadian project that has studied social media during the election.
Meta's decision has left Canadians "more vulnerable to generative A.I., fake news websites and less likely to encounter ideas and facts that challenge their worldviews," Dr. Bridgman added.
CORE INNOVATIONS © 2025
admin@coreinnovations.co