by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Accelerate your research analysis with this Automated Research Intelligence System! This workflow uses AI and web scraping to analyze research papers and articles, extracting key insights, validating content quality, and generating comprehensive research documents. Perfect for research teams, academics, and AI enthusiasts staying current with the latest developments in artificial intelligence and machine learning.

What This Template Does

- Triggers via form submission for on-demand research URL analysis
- Validates URL accessibility and prepares for processing
- Uses the Decodo scraper to extract research content from target URLs
- Analyzes research papers with AI for comprehensive understanding
- Validates summaries for accuracy, completeness, and relevance
- Generates key insights and actionable takeaways from research
- Creates professional Google Docs with formatted research summaries
- Evaluates research quality with an AI-powered rating system
- Saves all research to Google Sheets for historical tracking
- Sends Slack alerts for high-quality research findings (9+ rating)
Key Benefits

- Automated research analysis saves hours of manual reading time
- AI-powered insights extraction from complex research papers
- Quality validation ensures accurate and relevant summaries
- Centralized research database for team collaboration
- Real-time alerts for breakthrough research findings
- Professional documentation automatically generated

Features

- Form-based trigger for easy research submission
- URL validation and accessibility checking
- AI-powered research analysis and summarization
- Decodo web scraping for reliable content extraction
- Multi-stage validation for accuracy and relevance
- Automated Google Docs report generation
- Quality assessment with a structured rating system
- Google Sheets integration for research tracking
- Slack notifications for premium research findings
- Quality threshold filtering for optimal results

Requirements

- Decodo API credentials for research scraping
- OpenAI API credentials for AI analysis
- Google Docs OAuth2 credentials for document creation
- Google Sheets OAuth2 credentials with edit access
- Slack Bot Token with chat:write permission
- Environment variables for configuration settings

Target Audience

- AI research teams and data scientists
- Academic researchers and university labs
- Machine learning engineers and developers
- Technology innovation teams
- Research and development departments
- Content creators in the AI/ML space

Step-by-Step Setup Instructions

1. Connect Decodo API credentials for research scraping functionality
2. Set up OpenAI credentials for AI analysis and quality assessment
3. Configure Google Docs for automated research document generation
4. Add Google Sheets credentials for research tracking and history
5. Set up Slack credentials for high-quality research alerts
6. Customize quality thresholds for research rating (default: 6+ for processing, 9+ for alerts)
7. Test with sample research URLs to verify analysis and formatting
8. Deploy the form for team access to research analysis requests
9. Monitor the research database for trends and insights

Pro Tip:
Use coupon code "YARON" to get 23K requests for testing.

This workflow transforms complex research into actionable intelligence with automated analysis, quality validation, and professional documentation!
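The quality gating described above (6+ to process, 9+ to alert) could be implemented in an n8n Code node along these lines. This is a minimal sketch: the field names (`rating`, `title`) are illustrative assumptions, not the template's actual schema.

```javascript
// Hypothetical sketch: route research items by AI quality rating.
// Thresholds match the template's defaults; field names are assumptions.
const PROCESS_THRESHOLD = 6; // minimum rating to process at all
const ALERT_THRESHOLD = 9;   // minimum rating to trigger a Slack alert

function routeByRating(items) {
  const processed = items.filter((item) => item.rating >= PROCESS_THRESHOLD);
  const alerts = processed.filter((item) => item.rating >= ALERT_THRESHOLD);
  return { processed, alerts };
}

const { processed, alerts } = routeByRating([
  { title: "Paper A", rating: 9.5 },
  { title: "Paper B", rating: 7 },
  { title: "Paper C", rating: 4 },
]);
// "Paper A" and "Paper B" pass the processing gate; only "Paper A" would alert.
```

Both thresholds can be tuned in step 6 of the setup instructions.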
by Pake.AI
Overview

This workflow converts a single topic into a full blog article through a structured multi-step process. Instead of generating everything in one pass, it breaks the task into clear stages to produce cleaner structure, better SEO consistency, and more predictable output quality.

How this workflow differs from asking ChatGPT directly

It does not produce an article in one step. It separates the process into two focused stages: outline generation and paragraph expansion. This approach gives you more control over tone, SEO, structure, and keyword placement.

How it works

1. Generate outline. The workflow sends your topic to an AI Agent, which returns a structured outline based on the topic, desired depth, language, and keyword focus.
2. Expand each subtopic. The workflow loops through each outline item. Every subtopic is expanded into a detailed, SEO-friendly paragraph. Output is consistent and optimized for readability.
3. Produce final outputs. Combines all expanded sections into a clean JSON object and a Markdown version ready for blogs or CMS. The JSON includes the title, HTML content, and Markdown content. You can send this directly to REST APIs such as WordPress, Notion, or documentation platforms.

Content is validated for readability and typically scores well in tools like Yoast SEO. Uses GPT-4o Mini by default, with average token usage between 2,000 and 3,000 depending on outline size.

Use cases

- Auto-generate long-form articles for blogs or content marketing.
- Turn Instagram or short-form scripts into complete SEO articles.
- Create documentation or educational content using consistent templates.

Setup steps

1. Prepare credentials. Add your OpenAI API key inside n8n’s credential manager.
2. Adjust input parameters: topic or main idea, number of outline items, language, primary keyword, and tone or writing style (optional).
3. Customize the workflow. Switch the model if you want higher quality or lower token usage. Modify the prompt for the outline or paragraph generator to match your writing style. Add additional nodes if you want to auto-upload the final article to WordPress, Notion, or any API.
4. Run the workflow. Enter your topic, execute the workflow, and retrieve both JSON and Markdown outputs for immediate publishing.

If you need help expanding this into a full content pipeline or want to integrate it with other automation systems, feel free to customize further.
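The final assembly step described above could look like the following in an n8n Code node. This is a sketch under assumptions: the section shape (`heading`, `paragraph`) is illustrative, not the template's exact internal format.

```javascript
// Hypothetical sketch of the "Produce final outputs" step: combine the expanded
// sections into the JSON object (title, HTML content, Markdown content) described above.
function assembleArticle(title, sections) {
  const markdown =
    `# ${title}\n\n` +
    sections.map((s) => `## ${s.heading}\n\n${s.paragraph}`).join("\n\n");
  const html =
    `<h1>${title}</h1>` +
    sections.map((s) => `<h2>${s.heading}</h2><p>${s.paragraph}</p>`).join("");
  return { title, html, markdown };
}

const article = assembleArticle("Sample Topic", [
  { heading: "Intro", paragraph: "Opening paragraph." },
  { heading: "Details", paragraph: "Expanded paragraph." },
]);
// article.markdown begins with "# Sample Topic"; article.html is CMS-ready.
```

The resulting object can be POSTed as-is to a WordPress or Notion REST endpoint.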
by Rully Saputra
Sign up for Decodo — get better pricing here

Overview

This workflow automatically collects the latest AI research papers using Decodo, extracts and summarizes PDFs with AI, stores insights in Google Sheets, and notifies users via Telegram. It turns complex academic research into structured, readable knowledge with zero manual effort.

Who’s this for

This template is ideal for:

- AI researchers and ML engineers tracking new papers
- Founders and product teams monitoring AI trends
- Content writers and analysts creating research-based content
- Educators, students, and newsletter creators
- Anyone who wants automated, summarized research without reading full papers

How it works / What it does

1. A schedule trigger starts the workflow automatically
2. Decodo fetches the latest AI research listings from arXiv reliably and at scale
3. Article titles and PDF links are extracted and structured
4. Each paper PDF is downloaded and converted to text
5. An AI summarization chain generates concise, human-readable summaries
6. Results are saved to Google Sheets as a research database
7. A Telegram message notifies users when new summaries are available

How to set up

1. Add your Decodo API credentials (required)
2. Connect your OpenAI / ChatGPT-compatible model for summarization
3. Connect Google Sheets and choose your target spreadsheet
4. Add your Telegram bot credentials and chat ID
5. Adjust the schedule trigger if needed, then activate the workflow

Requirements

- n8n (self-hosted required due to community node usage)
- Decodo community node (web extraction)
- OpenAI or compatible AI model credentials
- Google Sheets account
- Telegram bot access

⚠️ Disclaimer: This workflow uses a community node and is supported on self-hosted n8n only.

How to customize the workflow

- Change the arXiv category to track different research domains
- Modify the AI prompt to adjust summary length or tone
- Replace Google Sheets with another database or knowledge base
- Disable Telegram notifications if not needed
- Extend the workflow for SEO blogs, newsletters, or RAG pipelines
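Step 3 above (extracting titles and PDF links) could be sketched in an n8n Code node like this. The real workflow relies on Decodo's extraction; the regex approach below is an illustrative stand-in that assumes arXiv IDs appear in the scraped text in the standard `arXiv:YYMM.NNNNN` form.

```javascript
// Illustrative sketch: pull arXiv IDs out of scraped listing text and build
// the corresponding PDF URLs. This is an assumption about the page format,
// not the template's actual Decodo extraction rules.
function extractPdfLinks(text) {
  const ids = [...text.matchAll(/arXiv:(\d{4}\.\d{4,5})/g)].map((m) => m[1]);
  // Deduplicate, then map each ID to arXiv's canonical PDF URL.
  return [...new Set(ids)].map((id) => `https://arxiv.org/pdf/${id}`);
}

const links = extractPdfLinks(
  "arXiv:2401.00001 Some Paper ... arXiv:2401.00002 Another Paper"
);
// → one PDF link per unique paper ID
```

Each resulting URL feeds the "download and convert to text" step that follows.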
by Gilbert Onyebuchi
This workflow leverages n8n to automate LinkedIn content creation from start to finish. Upload an image and quote through a web form, and get a professionally designed post with AI-generated captions, ready to publish in seconds.

Features

- Randomly selects from 6 professional design templates for visual variety
- Converts HTML designs to high-quality images (90-95% JPEG quality)
- Generates engaging captions using OpenAI's GPT models
- Built-in caption editor for customization before posting
- Direct publishing to LinkedIn profiles or company pages
- Auto-compresses images for optimal LinkedIn upload

Prerequisites

- n8n instance: a running n8n instance (cloud or self-hosted)
- OpenAI API: active account with API access for caption generation
- LinkedIn account: profile or company page with API access
- Image conversion API: HTML CSS to Image account
- Web hosting: platform to host the web form (Netlify, Vercel, or custom server)

Setup Instructions

1. Deploy Web Form
   - Download the provided web form template
   - Host it on your preferred platform
   - Copy both webhook URLs from your n8n workflow
   - Update the form's webhook endpoints with your n8n URLs
2. Configure Image Conversion
   - Sign up at htmlcsstoimage.com
   - Get your API credentials (User ID + API Key)
   - Add them to the HTTP Request node as Basic Auth credentials
3. Connect OpenAI API
   - Create an API key at OpenAI Platform
   - In the ChatGPT HTTP Request node, add a Header parameter with Key: Authorization and Value: Bearer YOUR_API_KEY
   - Recommended model: gpt-4 or gpt-3.5-turbo
4. Authenticate LinkedIn
   - Create a LinkedIn OAuth2 credential in n8n
   - Follow the authentication flow and grant the required permissions
   - Select the credential in the "Create a post" LinkedIn node
   - Choose the post destination (personal profile or company page)
5. Test the Workflow
   - Submit test data through the web form
   - Monitor the n8n execution panel for successful completion
   - Verify image generation, caption quality, and LinkedIn posting
   - Adjust settings as needed based on results

Notes

- Processing time averages 10-20 seconds from upload to preview
- All 6 design templates are fully responsive and LinkedIn-optimized
- The caption editor allows full customization before publishing to LinkedIn
- For questions or issues, please contact me for consulting and support: LinkedIn
- 🔗 Test with sample data first. Access Web Form Template
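The image-conversion step configured in setup step 2 can be sketched as an HTTP request built in a Code node. This follows htmlcsstoimage.com's documented POST endpoint and Basic Auth scheme, but verify the details against their current API docs before relying on it; the credential values are placeholders.

```javascript
// Hedged sketch: build the HTML-to-image request for the HTTP Request node.
// Endpoint and auth scheme per htmlcsstoimage.com's docs at time of writing.
function buildImageRequest(userId, apiKey, html, css) {
  return {
    method: "POST",
    url: "https://hcti.io/v1/image",
    headers: {
      Authorization:
        "Basic " + Buffer.from(`${userId}:${apiKey}`).toString("base64"),
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ html, css }),
  };
}

const req = buildImageRequest(
  "user-123",          // placeholder User ID
  "key-abc",           // placeholder API Key
  "<h1>Quote</h1>",    // one of the 6 design templates, rendered
  "h1{color:navy}"
);
// In n8n, the same values map onto an HTTP Request node with Basic Auth credentials.
```

The response's image URL is what the workflow compresses and hands to the LinkedIn node.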
by oka hironobu
AI Meal Nutrition Tracker with LINE and Google Sheets

Who's it for

This workflow is designed for health-conscious individuals, fitness enthusiasts, and anyone who wants to track their daily food intake without manual calorie counting. It is best suited for users who want a simple, AI-powered meal logging system that analyzes food photos one at a time and provides instant nutritional feedback via LINE.

What it does

This workflow processes a single meal photo sent via LINE, analyzes it using Google Gemini AI to identify foods and estimate nutritional content, and stores the data in Google Sheets for tracking. The workflow focuses on simplicity and encouragement: it receives a meal image, performs AI-based food recognition, estimates calories and macronutrients, calculates a health score, provides personalized advice, and replies with a detailed nutritional breakdown on LINE.

How it works

1. A single meal photo is sent to the LINE bot.
2. The workflow is triggered via a LINE webhook.
3. The image file is downloaded and sent to Google Gemini AI for food analysis.
4. The AI identifies foods and estimates nutritional values (calories, protein, carbs, fat, fiber).
5. A health score (1-10) is calculated with personalized improvement tips.
6. The data is appended to Google Sheets for meal history tracking.
7. The image is uploaded to Google Drive for reference.
8. A formatted nutritional report with advice is sent back as a LINE reply.

This workflow is intentionally designed to handle one image per execution.

Requirements

To use this workflow, you will need:

- A LINE Messaging API account
- A Google Gemini API key
- A Google account with access to Google Sheets and Google Drive
- A Google Sheets document with the following column names: Date, Time, Meal Type, Food Items, Calories, Protein (g), Carbs (g), Fat (g), Fiber (g), Health Score, Advice, Image URL

Important limitations

- This workflow does not support multiple images sent in a single message. Sending images in quick succession may trigger multiple executions and lead to unexpected results. Only the first image in an event payload is processed.
- Nutritional values are AI estimates based on visual analysis and typical serving sizes. Accuracy depends on image quality, lighting, and food visibility. This tool should not replace professional dietary advice.

These limitations are intentional to keep the workflow simple and easy to understand.

How to set up

1. Create a LINE Messaging API channel and obtain a Channel Access Token.
2. Generate a Google Gemini API key.
3. Update the Config node with your LINE token, Google Sheets ID, Google Drive folder ID, and daily calorie goal.
4. Configure credentials for LINE, Google Gemini, Google Sheets, and Google Drive.
5. Register the n8n webhook URL in your LINE channel settings.
6. Activate the workflow in n8n and test it with a single meal photo.

How to customize

- Modify the AI prompt in the "Analyze Meal with AI" node to support different languages or dietary frameworks (keto, vegan, etc.).
- Adjust the daily calorie goal in the Config node to match individual needs.
- Add additional nutritional fields such as sodium, sugar, or vitamins.
- Replace Google Sheets with a fitness app API or database.
- Integrate with other services to send daily/weekly nutrition summaries.

Note: This workflow was tested using real meal photos sent individually via the LINE Messaging API. Nutritional estimates are approximations and may vary from actual values. For accurate dietary tracking, consult a registered dietitian.
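Step 6 above (appending to Google Sheets) could be sketched as a Code node that shapes the AI analysis into the column layout listed under Requirements. The `analysis` object and its field names are assumptions about the Gemini output, not the template's exact schema.

```javascript
// Illustrative sketch: map an AI analysis result onto the Google Sheets columns
// listed in the Requirements section. Field names on `analysis` are assumptions.
function toSheetRow(analysis, imageUrl, now = new Date()) {
  const iso = now.toISOString();
  return {
    Date: iso.slice(0, 10),
    Time: iso.slice(11, 16),
    "Meal Type": analysis.mealType,
    "Food Items": analysis.foods.join(", "),
    Calories: analysis.calories,
    "Protein (g)": analysis.protein,
    "Carbs (g)": analysis.carbs,
    "Fat (g)": analysis.fat,
    "Fiber (g)": analysis.fiber,
    "Health Score": analysis.healthScore,
    Advice: analysis.advice,
    "Image URL": imageUrl,
  };
}

const row = toSheetRow(
  { mealType: "Lunch", foods: ["rice", "salmon"], calories: 620, protein: 32,
    carbs: 70, fat: 18, fiber: 4, healthScore: 7, advice: "Add vegetables." },
  "https://drive.google.com/example", // hypothetical Drive URL from step 7
  new Date("2024-05-01T12:30:00Z")
);
```

Keys must match the spreadsheet's column headers exactly for the Google Sheets append node to place values correctly.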
by Khairul Muhtadin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Who is this for?

Automation enthusiasts, content creators, or social media managers who post article-based threads to Bluesky and want to automate the process end-to-end.

What problem is this solving?

Manual content repackaging and posting is repetitive and time-consuming. This workflow automates the process from capturing article URLs (via Telegram or RSS) to scraping content, transforming it into a styled thread, and posting it on Bluesky.

What this workflow does

1. Listens on Telegram or fetches from RSS feeds (AI Trends, Machine Learning Mastery, Technology Review).
2. Extracts content from URLs using JinaAI.
3. Converts the article into a neat, scroll-stopping thread via LangChain + Gemini / OpenAI ChatGPT.
4. Splits the thread into multiple posts. The first post is published with "Create a Post", while subsequent posts are replies.
5. Adds short delays between posts to avoid rate limits.

Setup

1. Add credentials for Telegram Bot API, JinaAI, Google Gemini, and Bluesky App Password.
2. Add or customize RSS feeds if needed.
3. Test with a sample URL to validate the posting sequence.

How to customize

- Swap out RSS feeds or trigger sources.
- Modify prompt templates or thread formatting rules in the LangChain/Gemini node.
- Adjust wait times or content parsing logic.
- Replace Bluesky with another posting target if desired.

Made by: Khaisa Studio. Need custom workflows? Contact me!
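The thread-splitting step (step 4 above) could be sketched like this. The `---` delimiter is an assumption about how the LLM is prompted to separate posts; the 300-character cap reflects Bluesky's post length limit.

```javascript
// Illustrative sketch: break the AI-generated thread into individual posts.
// The "---" separator is a hypothetical prompt convention, not the template's
// confirmed format; adjust to match your LLM prompt.
function splitThread(text, limit = 300) {
  const parts = text.split("---").map((p) => p.trim()).filter(Boolean);
  return parts.map((body, i) => ({
    index: i + 1,
    isReply: i > 0, // first post uses "Create a Post"; the rest are replies
    text: body.slice(0, limit),
  }));
}

const posts = splitThread("Hook post. --- Second point. --- Closing thoughts.");
// → 3 posts; only the first is a top-level post
```

A short Wait node between each reply, as the workflow already does, keeps the sequence under Bluesky's rate limits.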
by Cheng Siong Chin
Introduction

Upload invoices via Telegram, receive structured data instantly. Perfect for accountants and finance teams.

How It Works

A Telegram bot receives invoices, downloads the files, extracts data using OpenAI, then returns the analysis.

Workflow Template

Telegram Trigger → Document Check → Get File → HTTP Download → AI Extract → Format Response → Send to Telegram

Workflow Steps

1. Telegram Trigger: listens for uploads.
2. Document Check: validates files; routes errors.
3. Get File: retrieves metadata.
4. HTTP Download: fetches content.
5. AI Extract: OpenAI parses invoice fields.
6. Format Response: structures data.
7. Send Analysis: delivers to chat.

Setup Instructions

1. Telegram Bot: create via BotFather, add credentials.
2. OpenAI Agent: add API key and extraction prompt.
3. HTTP Node: set authentication.
4. Parser: define invoice schema.
5. Error Handling: configure fallbacks.

Prerequisites

- n8n instance
- Telegram Bot Token
- OpenAI API key

Customization

- Database storage
- Accounting software integration

Benefits

- Eliminates manual entry
- Reduces errors
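The "Parser: define invoice schema" step could be sketched as a Code node that checks the AI extraction against an expected field list. The field names here are illustrative assumptions; define whatever schema your invoices actually use.

```javascript
// Illustrative sketch: validate that the AI extraction returned every expected
// invoice field. The field list is a hypothetical schema, not the template's.
const INVOICE_FIELDS = ["invoiceNumber", "vendor", "date", "total", "currency", "lineItems"];

function validateInvoice(extracted) {
  const missing = INVOICE_FIELDS.filter((f) => !(f in extracted));
  return { valid: missing.length === 0, missing };
}

const result = validateInvoice({
  invoiceNumber: "INV-001", vendor: "Acme Co", date: "2024-04-01",
  total: 1250.0, currency: "USD", lineItems: [],
});
// → { valid: true, missing: [] }
```

Invalid results can be routed to the error-handling fallback configured in step 5, instead of being sent back to the chat.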
by Manav Desai
This n8n template demonstrates how to build a weekly Hollywood film industry briefing using Tavily for real-time search and Google Gemini for summarization. It sends a concise, emoji-styled email with movie releases, box office results, industry news, and must-watch recommendations every week, automatically.

Use cases: Great for film journalists, entertainment bloggers, or movie enthusiasts who want automated weekly updates without manually checking multiple sources.

Good to know

- Free to use: Tavily provides 1,000 API credits per month on their free plan (no credit card required), so this workflow can run at zero cost.
- Real-time data: Tavily's search API is optimized for up-to-date information — perfect for weekly movie releases and box office stats.
- Google Gemini is used for summarization, and you only need basic API access (no paid tier required).

How it works

1. Trigger: scheduled every Thursday morning (configurable).
2. Search: four Tavily API calls gather movies releasing this week, last week's box office results, Hollywood industry news, and must-watch movies currently in theatres.
3. Summarization: Google Gemini turns this into Gmail-friendly HTML with emojis and bullet points.
4. Email: the formatted newsletter is sent via the Gmail node.

How to use

1. Configure Tavily API and Gmail OAuth2 credentials in n8n's credential manager.
2. (Optional) Edit the Tavily queries to focus on specific genres or add filters.
3. Adjust the schedule trigger to any day/time you prefer.

Requirements

- Tavily API account (free plan – 1,000 monthly requests)
- Google Gemini API key for summarization
- Gmail account (OAuth2 credentials for sending emails)

Want insane output quality? You can swap Gemini for OpenAI's ChatGPT models:

- GPT-3.5 Turbo – ~$0.002/run (crazy cheap)
- GPT-4o – ~$0.009/run (latte price)
- GPT-4.5 – ~$0.15/run (god-mode quality)

This upgrade gives you cleaner, richer, "did-a-human-write-this?" vibes — perfect for journalist-grade Hollywood briefings.
Just note: the OpenAI API requires a $5 minimum credit to activate usage.

Example Output (ChatGPT version)

Subject: Daily Hollywood Film Industry Briefing – August 3, 2025

Good morning,

Here's your daily Hollywood film briefing for August 3, 2025:

🎬 Releases

- The Bad Guys 2 – Released Friday, August 1, 2025
- The Naked Gun – Released Friday, August 1, 2025

These are the confirmed new wide theatrical Hollywood releases this week (Monday through Sunday of the current week). No additional new Hollywood theatrical releases found for this week.

📊 Box Office

Highest-grossing Hollywood films of 2025 (worldwide):

- Ne Zha 2 – approx. $1.90 billion (non-Hollywood Chinese animated film leads)
- Lilo & Stitch – approx. $1.02 billion
- A Minecraft Movie – approx. $955 million
- Jurassic World Rebirth – approx. $731 million
- How to Train Your Dragon – approx. $610 million

Last week's box office performance (Monday–Sunday):

- The Fantastic Four: First Steps – domestic debut $118M; global $218M, Marvel's biggest opening of 2025
- Superman – added ~$94M worldwide last week, passing $500M global total
- Jurassic World Rebirth – up $70M worldwide last week, despite a ~40% drop week-on-week
- F1: The Movie – up ~$48M last week, with growth visible both internationally and domestically
- Lilo & Stitch – added ~$10M worldwide last week, a slower tail but still a billion-plus gross

Highlights & trends: Fantastic Four's strong debut reboots Marvel success, signaling resumed audience interest; Superman continues to hold strong; Jurassic World Rebirth remains durable after its holiday surge; a box office recovery is noted across key titles. Overall box office is up ~12–15% year-on-year.

📰 Industry Buzz

- Christopher Nolan has signed to direct a massive $250 million adaptation of Homer's The Odyssey, starring Matt Damon and Tom Holland, with IMAX pre-sales at 95% capacity across major locations.
- Marvel has relaunched the Fantastic Four franchise successfully with First Steps; a positive CinemaScore and strong visuals mark a fresh start.
- DC's Superman continues strong with over $500M global, solidifying DC's summer comeback.
- Universal's Jurassic World Rebirth continues strong overseas, especially in China, contributing to $318M global in its opening holiday weekend.
- Warner Bros.–Discovery stock surges (~30%) amid the box office rebound, with Disney, IMAX, and Cinemark also seeing robust growth in 2025.
- Ne Zha 2 becomes the highest-grossing animated and non-Hollywood film ever, crossing $2 billion globally; though not Hollywood, its impact on global trends is notable.
- Mission: Impossible – The Final Reckoning quietly solidifies strong global numbers (~$562M) and continues reliable franchise performance.

🎥 Must-Watch in Theatres (Surat, India)

- The Fantastic Four: First Steps – currently showing in English/Hindi/Tamil/Telugu in Surat cinemas; hyped globally, strong visuals, action-heavy, best experienced in IMAX or premium formats if available in Surat multiplexes. Runs this week.
- F1: The Movie – available in Surat in multiple languages, with strong reviews praising adrenaline-fuelled direction and visuals and growing fan hype; ideal in standard or Dolby formats for immersive sound and the feel of speed.
- Jurassic World Rebirth – still playing in Surat, popular with family audiences; grand visuals and dinosaur action well-suited to IMAX or large-format screens.

That's all for today's briefing. Have a great theatrical weekend ahead!
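The four Tavily searches that power this briefing could be built in a Code node like the sketch below. The endpoint and body shape follow Tavily's documented search API, but verify against their current docs; the query strings are illustrative, not the template's exact queries.

```javascript
// Hedged sketch: one request config per section of the weekly briefing.
// Endpoint/body per Tavily's search API docs at time of writing; verify before use.
const QUERIES = [
  "Hollywood movies releasing this week",
  "last week box office results Hollywood",
  "Hollywood film industry news this week",
  "must-watch movies currently in theatres",
];

function buildTavilyRequests(apiKey) {
  return QUERIES.map((query) => ({
    method: "POST",
    url: "https://api.tavily.com/search",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ api_key: apiKey, query, search_depth: "basic" }),
  }));
}

const requests = buildTavilyRequests("tvly-XXXX"); // placeholder key
// → 4 request configs, one per briefing section
```

Each response then feeds Gemini's summarization prompt as one section of the final email.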
by Oneclick AI Squad
Automatically discovers trending topics in your niche and generates ready-to-use content ideas with AI.

🎯 How It Works

1. Multi-Source Trend Monitoring
   - Twitter/X trending topics and hashtags
   - Reddit hot posts from niche subreddits
   - Google Trends daily search trends
   - Runs every 2 hours for fresh opportunities
2. Smart Filtering & Scoring
   - Filters by your niche keywords
   - Removes duplicates across sources
   - Calculates a viral potential score (0-100)
   - Ranks by engagement, recency, and relevance
   - Prevents suggesting already-covered topics
3. AI Content Generation
   - Uses Claude AI to analyze each trend
   - Generates 5 unique content ideas per trend
   - Provides hooks, key points, and platform recommendations
   - Explains why each idea has viral potential
4. Comprehensive Delivery
   - Beautiful HTML email digest with all opportunities
   - Slack summary for quick review
   - Database logging for tracking
   - Research links for deeper investigation

⚙️ Configuration Guide

Step 1: Configure Your Niche

Edit the "Load Niche Config" node:

```javascript
niche: 'AI & Technology',                 // Your industry
keywords: [                               // Topics to track
  'artificial intelligence',
  'machine learning',
  'AI tools',
  // Add your keywords
],
subreddits: 'artificial+machinelearning', // Relevant subreddits
thresholds: {
  minTwitterLikes: 1000,                  // Minimum engagement
  minRedditUpvotes: 500,
  minComments: 50
}
```

Step 2: Connect Data Sources

Twitter/X API:
- Sign up for a Twitter Developer Account
- Get API credentials (OAuth 2.0)
- Add credentials to the "Fetch Twitter/X Trends" node

Reddit API:
- Create a Reddit app: https://www.reddit.com/prefs/apps
- Get OAuth credentials
- Add credentials to the "Fetch Reddit Hot Topics" node

Google Trends:
- No authentication needed (public API)
- Already configured in the workflow

Step 3: Configure AI Integration

Anthropic Claude API:
- Get an API key from: https://console.anthropic.com/
- Add credentials to the "AI - Generate Content Ideas" node
- Alternative: use OpenAI GPT-4 by modifying the node

Step 4: Setup Notifications

Email:
- Configure SMTP in the "Send Email Digest" node
- Update the recipient email address
- Customize the HTML template if desired

Slack:
- Create an incoming webhook: https://api.slack.com/messaging/webhooks
- Add the webhook URL to the "Send Slack Summary" node
- Customize the channel name

Step 5: Database (Optional)

- Create a PostgreSQL database with the schema below
- Add credentials to the "Log to Content Database" node
- Skip if you don't need database tracking

Database Schema

```sql
CREATE TABLE content.viral_opportunities (
  id SERIAL PRIMARY KEY,
  opportunity_id VARCHAR(255) UNIQUE,
  detected_at TIMESTAMP,
  topic TEXT,
  source VARCHAR(50),
  source_url TEXT,
  engagement BIGINT,
  viral_score INTEGER,
  opportunity_level VARCHAR(20),
  niche VARCHAR(100),
  content_ideas JSONB,
  research_links JSONB,
  urgency TEXT,
  status VARCHAR(50),
  created_at TIMESTAMP DEFAULT NOW()
);
```

🎨 Customization Options

Adjust Scan Frequency

Edit the "Every 2 Hours" trigger:
- More frequent: every 1 hour
- Less frequent: every 4-6 hours
- Consider API rate limits

Tune Viral Score Algorithm

Edit the "Calculate Viral Potential Score" node:
- Adjust the engagement weight (currently 40%)
- Change the recency importance (currently 30%)
- Modify the threshold in "Filter High Potential Only" (currently 40)

Customize Content Ideas

Modify the AI prompt in "AI - Generate Content Ideas":
- Change the number of ideas (currently 5)
- Add specific format requirements
- Include brand voice guidelines
- Target specific platforms

📊 Expected Results

A typical scan finds:
- 5-15 opportunities per scan (every 2 hours)
- 3-5 HIGH priority (score 75+)
- 25+ content ideas generated
- Email sent with the full digest
- Slack alert for quick review

💡 Pro Tips

1. Timing matters: create content within 24-48 hours of detection
2. High priority first: focus on opportunities scoring 75+
3. Platform match: choose platforms where your audience is active
4. Add your voice: use AI ideas as starting points, not final copy
5. Track performance: note which opportunity types perform best
6. Refine keywords: regularly update your niche keywords based on results
7. Mix formats: try different content formats for the same trend

🚨 Important Notes

⚠️ API Rate Limits:
- Twitter: monitor rate limits closely
- Reddit: 60 requests per minute
- Claude AI: tier-based limits
- Consider caching results

💰 Cost Considerations:
- Twitter API: may require a paid tier
- Reddit API: free for reasonable use
- Claude AI: ~$0.50-1.00 per scan
- Total: ~$15-30/month estimated

🎯 Best Practices:
- Start with 1-2 sources, add more later
- Test with broader keywords initially
- Review the first few reports to tune scoring
- Don't create content for every opportunity
- Quality over quantity

🔄 What Happens Next?

1. Workflow runs every 2 hours
2. Scans Twitter, Reddit, and Google Trends
3. Filters by your keywords
4. Scores viral potential
5. Generates AI content ideas
6. Sends digest to email + Slack
7. Logs to database
8. Marks topics as suggested
9. Repeat!
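The scoring weights described under "Tune Viral Score Algorithm" (40% engagement, 30% recency) could combine like the sketch below. The normalization constants are assumptions for illustration; the template's actual "Calculate Viral Potential Score" node may differ.

```javascript
// Illustrative sketch of a 0-100 viral score: 40% engagement, 30% recency,
// and the remaining 30% keyword relevance. Normalization caps are hypothetical.
function viralScore({ engagement, hoursOld, relevance }) {
  const engagementScore = Math.min(engagement / 10000, 1) * 40; // 40% weight
  const recencyScore = Math.max(0, 1 - hoursOld / 48) * 30;     // 30% weight
  const relevanceScore = relevance * 30;                         // remaining 30%
  return Math.round(engagementScore + recencyScore + relevanceScore);
}

const score = viralScore({ engagement: 8000, hoursOld: 6, relevance: 0.9 });
// Scores of 75+ would count as HIGH priority per the guide above.
```

Tuning the caps (10,000 engagement, 48-hour decay) is the practical way to adjust how aggressively the filter behaves.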
by Țugui Dragoș
How It Works

1. Story Generation – Your idea is transformed into a narrative split into scenes using the DeepSeek LLM.
2. Visuals – Each scene is illustrated with AI images via Replicate, then animated into cinematic video clips with RunwayML.
3. Voice & Music – Narration is created using ElevenLabs (text-to-speech), while Replicate audio models generate background music.
4. Final Assembly – All assets are merged into a professional video using Creatomate.
5. Delivery – Everything is orchestrated by n8n, triggered from Slack with /render, and the final video link is delivered back instantly.

Workflow in Action

1. Trigger from Slack: type your idea with /render in Slack - the workflow starts automatically.
2. Final Video Output: receive a polished cinematic video link in Slack.
3. Creatomate Template: ⚠️ Important: you must create your own template in Creatomate. This is a one-time setup - the template defines where the voiceover, music, and video clips will be placed. The more detailed and refined your template is, the better the final cinematic result.

Required APIs

To run this workflow, you need accounts and API keys from the following services:

- DeepSeek – story generation (LLM)
- Replicate – images & AI music generation
- RunwayML – image-to-video animations
- ElevenLabs – text-to-speech voiceovers
- Creatomate – video rendering and templates
- Dropbox – file storage and asset syncing
- Slack – workflow trigger and video delivery

Setup Steps

1. Import the JSON workflow into your n8n instance.
2. Add your API credentials for each service above.
3. Create a Creatomate template (only once) – define layers for visuals, voice, and music.
4. Trigger the workflow from Slack with /render Your Story Idea.
5. Receive your final cinematic video link directly in Slack.

Use Cases

- Automated YouTube Shorts / TikToks for faceless content creators.
- Scalable ad creatives and marketing videos for agencies.
- Educational explainers and onboarding videos generated from text.
- Rapid prototyping of cinematic ideas for developers & storytellers.

With this workflow, you're not just using AI tools – you're running a full AI-powered studio in n8n.
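The /render trigger above maps onto Slack's slash-command mechanism: Slack POSTs a form-encoded payload whose `text` field carries everything typed after the command. A minimal sketch of handling that payload in a Code node (field names follow Slack's slash-command documentation):

```javascript
// Hedged sketch: extract the story idea from a Slack slash-command payload.
// `command`, `text`, and `response_url` are standard slash-command fields.
function parseRenderCommand(payload) {
  if (payload.command !== "/render") return null;
  return {
    idea: payload.text.trim(),
    replyTo: payload.response_url, // where the final video link is posted back
  };
}

const job = parseRenderCommand({
  command: "/render",
  text: "A lighthouse keeper discovers a message in a bottle",
  response_url: "https://hooks.slack.com/commands/T000/123/abc", // placeholder
});
// job.idea becomes the story prompt sent to DeepSeek
```

The `response_url` gives the workflow a place to deliver the Creatomate render link once assembly finishes.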
by Hardikkumar
This workflow is the AI analysis and alerting engine for a complete social media monitoring system. It's designed to work with data scraped from X (formerly Twitter) using a tool like the Apify Tweet Scraper, which logs the data into a Google Sheet. The workflow then automatically analyzes new tweets with Google Gemini and sends tailored alerts to Slack.

How it works

This workflow automates the analysis and reporting part of your social media monitoring:

1. Tweet hunting: it finds tweets for the query entered in the Set node and passes the data to Google Sheets.
2. Fetches new tweets: it gets all new rows from your Google Sheet that haven't been processed yet (it looks for "Notmarked" in the 'action taken' column).
3. Prepares for AI: it combines the data from all new tweets into a single, clean prompt for the AI to analyze.
4. AI analysis with Gemini: it sends the compiled data to Google Gemini, asking for a full summary report and a separate, machine-readable JSON list of any urgent items.
5. Splits the response: the workflow separates the AI's text summary from the JSON data for urgent alerts.
6. Sends notifications: the high-level summary is sent to a general Slack channel (e.g., #brand-alerts), and each urgent item is sent as a separate, detailed alert to a high-priority Slack channel (e.g., #urgent).

Set up steps

It should take about 5-10 minutes to get this workflow running.

1. Prerequisite - data source: ensure you have a Google Sheet being populated with tweet data. For complete automation, you can set up a new Google Sheet with the same structure for saving the tweet data and run the Tweet Scraper on a schedule.
2. Configure credentials: make sure you have credentials set up in your n8n instance for Google Sheets, Google Gemini (PaLM) API, and Slack.
3. Google Sheets node ("Get row(s) in sheet"): select your Google Sheet containing the tweet data, choose the specific sheet name from the dropdown, and ensure your sheet has a column named "action taken" so the filter works correctly.
4. Google Gemini Chat Model node: select your Google Gemini credential from the dropdown.
5. Slack nodes ("Send a message" & "Send a message1"): in the first Slack node, choose the channel for the summary report; in the second, choose the channel for urgent alerts.
6. Save and activate: once configured, save your workflow and turn it on!
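The "splits the response" step could be sketched like this. It assumes the Gemini prompt asks the model to append the urgent-items JSON after a `===JSON===` marker, which is an illustrative convention, not necessarily the template's exact format.

```javascript
// Illustrative sketch: separate Gemini's text summary from the machine-readable
// JSON list of urgent items. The "===JSON===" delimiter is a hypothetical
// prompt convention; match it to however your prompt is worded.
function splitAiResponse(raw) {
  const [summaryPart, jsonPart] = raw.split("===JSON===");
  const urgent = jsonPart ? JSON.parse(jsonPart) : [];
  return { summary: summaryPart.trim(), urgent };
}

const { summary, urgent } = splitAiResponse(
  'Overall sentiment is positive.\n===JSON===\n[{"tweet": "Service is down!", "severity": "high"}]'
);
// summary → the #brand-alerts digest; each item in `urgent` → one #urgent alert
```

Each element of `urgent` then fans out to its own Slack message in the high-priority channel.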
by Rajeet Nair
📖 Description

🔹 How it works

This workflow uses AI (Mistral LLM + Pollinations.ai) to generate high-quality visual content for social media campaigns. It automates the process from brand/campaign input to final image upload, ensuring consistency and relevance.

1. Input brand & campaign data
   - Retrieves the brand profile and campaign goals from Google Drive.
   - Cleans and merges the data into a structured JSON format.
2. Campaign goal generation
   - AI summarizes campaign goals, audience, success metrics, and keywords.
   - Produces a clear campaign goal summary for content planning.
3. Image prompt generation
   - AI creates 5 detailed image prompts reflecting the campaign story.
   - Includes 1 caption and 4-6 relevant hashtags.
4. Image creation
   - Pollinations.ai generates images based on the AI prompts.
   - Each image is renamed systematically (photo1 → photo5).
5. Post-processing & upload
   - All images are merged into a single item.
   - The workflow uploads the final output to Google Drive for campaign use.

⚙️ Set up steps

1. Connect credentials: add Google Drive and Mistral API credentials in n8n.
2. Configure Google Drive input nodes: set the fileId for the brand profile and campaign goals.
3. Customize AI prompts: sticky notes explain the AI nodes for goal summary and image prompt generation. Optionally modify the tone, keywords, or target audience for brand-specific campaigns.
4. Check image output nodes: ensure the Pollinations.ai HTTP Request nodes are active, and verify the renaming Code nodes for the proper photo sequence.
5. Activate the workflow: test it manually to ensure images are generated and uploaded correctly.

🔹 Data Handling & Output

This workflow pulls brand profile and campaign goal data from Google Drive. Data is processed into structured JSON, including:

- Brand profile: name, mission, vision, values, services, tone, keywords, contact info.
- Campaign goal: primary goal, focus, success metrics, target audience, core message.

It supports populating multiple campaigns or brands dynamically. The JSON output can be used downstream for image prompt generation, reporting, or analytics. All processing is automated, with clear nodes for extraction, parsing, and merging.

Pollinations.ai is a free, open-source text and image generation API. No signup or API key is required, and it prioritizes privacy with zero data storage and completely anonymous usage.

⚡ Result: A fully automated AI-to-image workflow that transforms campaign goals into ready-to-use social media visuals, saving time and maintaining brand consistency.
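The image-creation step above can be sketched as a URL builder: Pollinations.ai serves images from a simple GET endpoint keyed by the URL-encoded prompt. The endpoint shape follows its public docs, but verify the current parameters before relying on them; the prompt and dimensions here are illustrative.

```javascript
// Hedged sketch: build a Pollinations.ai image URL from an AI-generated prompt.
// Endpoint per Pollinations' public docs at time of writing; parameters may change.
function pollinationsUrl(prompt, width = 1024, height = 1024) {
  return (
    "https://image.pollinations.ai/prompt/" +
    encodeURIComponent(prompt) +
    `?width=${width}&height=${height}`
  );
}

const url = pollinationsUrl("minimalist product photo, soft studio lighting");
// Each of the 5 campaign prompts becomes one such URL (photo1 → photo5),
// fetched by the workflow's HTTP Request nodes.
```

Because no API key is involved, the HTTP Request nodes need no credentials, only the generated URLs.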