How Texty.org.ua uses AI tools for investigations
In late 2024 and early 2025, Texty.org.ua published a major investigation revealing how Russian propagandists were using AI deepfakes of well-known Ukrainian female journalists to spread panic on TikTok. The team analyzed 595 AI-generated videos that garnered millions of views, showing these weren't random pranks but a coordinated campaign to erode trust in the media.
The Challenge: The "Firehose" of Disinformation

To produce reports like "Roller Coaster" or the deepfake investigation, Texty’s small team faces a massive data-processing hurdle:
Volume: Monitoring thousands of Telegram channels and TikTok videos manually is impossible. For the deepfake story, they had to identify patterns in audio manipulation across hundreds of clips.
Pattern Recognition: Detecting a subtle shift in Kremlin narratives (e.g., blame moving from the "US" to the "UK" after the US elections) requires recalling months of propaganda context.
Technical Heavy Lifting: Visualizing the "Shahed" drone routes or economic data requires writing custom Python code for every new map.
The Solution: A Dual-AI Workflow

Texty can use Gemini to handle the code and NotebookLM to handle the narrative analysis.
1. NotebookLM as the "Propaganda Archivist"

Texty’s analysts produce weekly "Disinformation Monitor" reports. They can upload their entire archive of these reports (hundreds of PDFs) into NotebookLM.
The Workflow: When a new narrative emerges (e.g., "Peace Talks"), the analyst asks NotebookLM: "Trace the evolution of the 'Peace Talks' narrative in Russian media from January 2024 to today. When did the tone shift from aggressive to conciliatory?"
The Impact: NotebookLM instantly cites the exact weeks where the messaging changed, allowing Texty to correlate propaganda shifts with battlefield events without re-reading year-old reports.
2. Gemini as the "Forensic Coder"

For investigations like the TikTok deepfakes, Texty needs to scrape and analyze data quickly.
The Workflow: Instead of manually logging video metadata, an analyst uses Gemini 2.5 Pro (which has a large context window) to process the raw transcripts of the deepfake videos.
The Prompt: "Analyze these 500 video transcripts. Identify the top 5 recurring phrases used to incite fear. Output the result as a JSON file matching our database schema."
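The phrase-extraction step can also be approximated locally. Below is a minimal Python sketch of that kind of analysis, not Texty's actual pipeline: it assumes the transcripts are already available as plain strings, and the JSON shape is illustrative rather than their real database schema.

```python
import json
import re
from collections import Counter

def top_recurring_phrases(transcripts, n=5, phrase_len=3):
    """Count the most frequent word n-grams across a set of transcripts."""
    counts = Counter()
    for text in transcripts:
        # Crude tokenizer covering Latin and Ukrainian Cyrillic letters
        words = re.findall(r"[a-zа-яіїєґ']+", text.lower())
        for i in range(len(words) - phrase_len + 1):
            counts[" ".join(words[i:i + phrase_len])] += 1
    return [{"phrase": p, "count": c} for p, c in counts.most_common(n)]

# Toy transcripts standing in for the hundreds of deepfake videos
transcripts = [
    "leave the city now before it is too late",
    "officials admit it is too late to flee the city",
    "it is too late the city will fall",
]
result = top_recurring_phrases(transcripts, n=3)
print(json.dumps(result, ensure_ascii=False, indent=2))
```

In practice an LLM adds value over a raw n-gram count by grouping paraphrases of the same scare message, but a frequency baseline like this makes the model's output easy to sanity-check.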
The Visualization: They then ask Gemini to write the Python code (using libraries like Plotly) to visualize this network of fake accounts, speeding up the creation of their signature interactive maps.
The Outcome: Systemic Exposure

This workflow allows Texty.org.ua to move from "reporting on a fake" to "exposing the system."
Scale: They can analyze 5,000 videos instead of 50.
Speed: They publish the debunking report while the disinformation campaign is still active.
Proof: By using code and AI to process the data, their findings are statistically robust, not just anecdotal.