by Tsushima Ryuto
This n8n workflow centralizes the management and tracking of customer inquiries received through multiple channels (email and web forms).

Who's it for?

- Customer support teams
- Marketing teams
- Sales teams
- Small to medium-sized businesses
- Individuals looking to streamline customer inquiry processes

How it works / What it does

This workflow automatically collects, processes, routes, and tracks customer inquiries from different sources.

Multi-Channel Input: The workflow listens for inquiries from both incoming emails and web form submissions.
- Email Trigger: Monitors a specific inbox for incoming emails.
- Webhook - Web Form: Listens for web form data submitted to a designated endpoint.

Data Extraction and Parsing:
- Extract Email Content: Extracts the HTML content from incoming emails to produce a clean text message.
- Parse Email Data: Extracts the relevant information from the email, such as customer name, email address, subject, message, received timestamp, source ("email"), and inquiry type (e.g., "urgent", "billing", "general") based on the subject line.
- Parse Webhook Data: Extracts customer name, email, subject, message, received timestamp, source ("webform"), and inquiry type from the web form data, based on the provided type or a default of "general".

Merge Inquiries: The parsed email and web form inquiry data are combined into a single stream for further processing.

Route by Inquiry Type: The workflow then routes inquiries based on the extracted inquiryType.
- Urgent Inquiries: Inquiries marked "urgent" are routed to a dedicated Slack channel for immediate alerts.
- General Inquiries: Inquiries marked "general" are announced in another Slack channel.
- Billing Inquiries: Inquiries marked "billing" are routed to the general inquiries channel, or can be given a separate channel if needed.
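As a rough illustration, the subject-line classification in the Parse Email Data step could look like the following sketch (the keyword lists are assumptions; adapt them to your own subject conventions in the Code node):

```javascript
// Hypothetical sketch of subject-line classification for "Parse Email Data".
// The keyword lists are assumptions, not the template's exact rules.
function detectInquiryType(subject) {
  const s = (subject || '').toLowerCase();
  if (/(urgent|asap|emergency)/.test(s)) return 'urgent';
  if (/(billing|invoice|payment)/.test(s)) return 'billing';
  return 'general'; // default when no keyword matches
}
```

Anything that matches neither list falls through to "general", mirroring the default described above.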
Save to Google Sheets: All inquiry data is logged into a Google Sheet, which serves as a central repository, including details like customer name, email, subject, message, source, received timestamp, and inquiry type.

Send Auto-Reply Email: Customers receive an automated email reply confirming that their inquiry has been received.

How to set up

Google Sheets:
- Create a new spreadsheet in your Google Drive.
- Name the first sheet "Inquiries" and create the following header row: customerName, customerEmail, subject, message, source, receivedAt, inquiryType.
- In the 'Save to Google Sheets' node, configure the Spreadsheet ID and Sheet Name, and link your Google Sheets credentials.

Email Trigger (IMAP):
- Set up the 'Email Trigger' node to connect to your IMAP email account.
- Test it to ensure it correctly listens for incoming emails before activating the workflow.

Webhook - Web Form:
- Copy the Webhook URL from the 'Webhook - Web Form' node and configure your web form to submit data to it.
- Ensure your web form sends fields like name, email, subject, message, and type in JSON format.

Slack:
- Configure your Slack credentials to connect to your Slack workspace.
- Update the relevant Slack channel IDs in both the 'Notify Urgent - Slack' and 'Notify General - Slack' nodes, which send the notifications for urgent and general inquiries.

Gmail:
- Set up your Gmail credentials and ensure the 'Send Auto-Reply Email' node is linked to your sending Gmail account.

Requirements

- An n8n instance
- A Google Sheets account
- An IMAP-enabled email account
- A Slack workspace
- A Gmail account
- A basic web form (to integrate with the Webhook node)

How to customize the workflow

- **Add more Inquiry Types**: Add more specific inquiry types (e.g., "technical support", "returns") by adding more rules in the 'Route by Inquiry Type' node.
- **Additional Notification Channels**: To integrate other notification systems (e.g., Microsoft Teams, Discord, SMS) beyond Slack, create new routing outputs and add new notification nodes for the desired service.
- **CRM Integration**: Instead of, or in addition to, saving data to Google Sheets, add new nodes to connect to CRM systems like Salesforce, HubSpot, or others.
- **Prioritization and Escalation**: Implement more complex logic to trigger escalation processes or prioritization rules based on inquiry type or keywords.
- **AI Sentiment Analysis**: Integrate an AI node to analyze the sentiment of inquiry messages and route or prioritize them accordingly.
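For reference, the JSON payload the 'Webhook - Web Form' node expects (per the setup section) can be sketched as below; the concrete values and the webhook URL are placeholders, not part of the template:

```javascript
// Hypothetical example payload for the 'Webhook - Web Form' node.
// Field names follow the setup section; the values are made up.
const payload = {
  name: 'Jane Doe',
  email: 'jane@example.com',
  subject: 'Question about my order',
  message: 'Hi, I have a question about my last order.',
  type: 'general', // optional; the workflow defaults to "general"
};

// From a web page, the form could POST it as JSON, e.g.:
// fetch('https://<your-n8n-host>/webhook/<webhook-id>', {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body: JSON.stringify(payload),
// });
```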
by Madame AI
Create curated industry trend reports from Medium to Google Docs

This workflow automates market research by generating high-quality, curated digests of Medium articles for specific topics. It scrapes recent content, uses AI to filter out spam and duplicates, categorizes the stories into readable buckets, and compiles everything into a formatted Google Doc report.

Target Audience

Content marketers, market researchers, product managers, and investors looking to track industry trends without reading through noise.

How it works

1. Schedule: The workflow runs on a defined schedule (e.g., daily or weekly) via the Schedule Trigger.
2. Define Source: A Set node defines the specific Medium tag URL to track (e.g., /tag/artificial-intelligence).
3. Scrape Content: BrowserAct visits the target Medium page and scrapes the latest article titles, authors, and summaries.
4. Analyze & Filter: An AI Agent (powered by Claude via OpenRouter) analyzes the raw feed. It removes duplicates, filters out spam and clickbait, and categorizes high-quality stories into buckets (e.g., "Must Reads," "Engineering," "Wealth").
5. Create Report: A Google Docs node creates a new document using the digest title generated by the AI.
6. Build Document: The workflow loops through the analyzed items, appending headers and body text to the Google Doc section by section.
7. Notify Team: A Slack node sends a message to your chosen channel confirming the report is ready.

How to set up

1. Configure Credentials: Connect your BrowserAct, Google Docs, Slack, and OpenRouter accounts in n8n.
2. Prepare BrowserAct: Ensure the Automated Industry Trend Scraper & Outline Creator template is saved in your BrowserAct account.
3. Set Target Topic: Open the Target Page node and replace the Target_Medium_Link with the Medium tag archive you wish to track (e.g., https://medium.com/tag/bitcoin/archive).
4. Configure Notification: Open the Send a message node (Slack) and select the channel where you want to receive alerts.
5. Activate: Turn the workflow on.

Requirements

- **BrowserAct** account with the **Automated Industry Trend Scraper & Outline Creator** template.
- **Google Docs** account.
- **Slack** account.
- **OpenRouter** account (or any compatible LLM credentials).

How to customize the workflow

- Adjust the AI Persona: Modify the system prompt in the Analyzer & Script writer node to change the categorization buckets (e.g., change "Engineering" to "Marketing Strategies").
- Change the Output Destination: Replace the Google Docs nodes with Notion or Airtable nodes if you prefer a database format over a document.
- Add Email Delivery: Add a Gmail or Outlook node at the end to email the finished Google Doc link directly to stakeholders.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video

Stop Writing Outlines! Use This AI Trend Scraper (BrowserAct + n8n + Gemini)
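The dedupe-and-bucket pass the AI Agent performs can be sketched roughly as follows; the item fields, bucket names, and the crude clickbait heuristic are assumptions for illustration, since the real filtering is done by the LLM prompt:

```javascript
// Hypothetical sketch of the AI Agent's filtering of the scraped feed:
// drop duplicate titles, drop obvious clickbait, group by bucket.
function buildDigest(items) {
  const seen = new Set();
  const buckets = {};
  for (const item of items) {
    const key = item.title.trim().toLowerCase();
    if (seen.has(key)) continue; // remove duplicates
    if (/you won't believe/i.test(item.title)) continue; // crude clickbait filter
    seen.add(key);
    (buckets[item.bucket] ||= []).push(item);
  }
  return buckets;
}
```

The returned object maps each bucket (e.g., "Must Reads") to its surviving stories, which is the shape the document-building loop then walks section by section.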
by Oneclick AI Squad
This enterprise-grade n8n workflow automates the Pharmaceutical Raw Material COA Verification & Vendor Quality Scoring System, from upload to final reporting, using AI-powered document extraction, specification matching, and dynamic vendor scoring. It processes Certificates of Analysis (COAs) to validate compliance, assign quality scores, generate approvals or CAPA requests, and notify stakeholders, ensuring regulatory adherence and vendor accountability with full audit trails and zero manual data entry.

Key Features

- **Webhook-triggered COA Upload** for seamless integration with file-sharing systems
- **AI Document Extraction** to parse test results and data from uploaded COAs
- **Automated Specification Analysis** matching against predefined quality standards
- **Weighted Vendor Scoring** based on compliance metrics and historical performance
- **Compliance Decision Engine** with approve/reject branching and CAPA flagging
- **Dynamic Certificate Generation** for approved materials, including digital signatures
- **Vendor Database Synchronization** to update scores and records in real time
- **Targeted Email Notifications** for QA, production, and executive teams
- **Executive Reporting Dashboard** with summaries, scores, and verification logs
- **Audit-Ready Logging** for all steps, deviations, and decisions

Workflow Process

| Step | Node | Description |
| ---- | ---- | ----------- |
| 1 | START: Upload COA | Webhook trigger receives the uploaded COA file for the verification process |
| 2 | EXTRACT: Parse COA | Extracts test results and data from the COA document using AI parsing |
| 3 | ANALYZE: Vendor Compliance | Compares extracted data against specifications and flags deviations |
| 4 | SCORE: Vendor Quality Rating | Calculates a weighted compliance score based on test results and history |
| 5 | DECISION: Compliance Route | Evaluates score/status and branches to the approve (green) or reject (red) path |
| 6 | APPROVED: Generate Approval Cert (Approved Path) | Creates a digital approval certificate for compliant materials |
| 7 | Update Vendor Database | Saves the verification record, score, and status to the vendor database |
| 8 | NOTIFY: Email Alert | Sends detailed notifications to QA/production teams |
| 9 | REPORT: Final Report | Generates an executive summary with COA scores and verifications |
| 10 | REJECT: Generate Rejection Report (Reject Path) | Produces a rejection report with deviation details |
| 11 | Request CAPA | Initiates the Corrective and Preventive Action (CAPA) process |
| 12 | PATH REJECTED | Terminates the rejected branch with an audit log entry |

Setup Instructions

1. Import Workflow

Open n8n → Workflows → Import from Clipboard and paste the JSON workflow.

2. Configure Credentials

| Integration | Details |
| ----------- | ------- |
| File Storage (e.g., Google Drive/AWS S3) | API key or OAuth for COA upload handling |
| AI Extraction (e.g., Claude or OCR tool) | API key for document parsing (e.g., claude-3-5-sonnet-20241022) |
| Database (e.g., PostgreSQL/Airtable) | Connection string for vendor records and specs |
| Email (SMTP/Gmail) | SMTP credentials or OAuth for notifications |

3. Update Database/Sheet IDs

Ensure your database or Google Sheets include:
- VendorDatabase for scores and history
- Specifications for quality standards

4. Set Triggers

- **Webhook:** /coa-verification (for real-time file uploads)
- **Manual/Scheduled:** for batch processing if needed

5. Run a Test

Use manual execution (with a sample COA file) to confirm:
- COA extraction and analysis
- Score calculation and branching
- Email notifications and report generation

Database/Sheets Structure

VendorDatabase

| vendorId | coaId | score | complianceStatus | lastVerified | deviations | capaRequested |
| -------- | ----- | ----- | ---------------- | ------------ | ---------- | ------------- |
| VEND-123456 | COA-789012 | 92.5 | Approved | 2025-11-04T14:30:00Z | None | No |

Specifications

| materialType | testParam | specMin | specMax | weight |
| ------------ | --------- | ------- | ------- | ------ |
| API Excipient | Purity (%) | 98.0 | 102.0 | 0.4 |

System Requirements

| Requirement | Version/Access |
| ----------- | -------------- |
| n8n | v1.50+ (AI and database integrations supported) |
| AI Parsing API | claude-3-5-sonnet-20241022 or equivalent OCR |
| Database API | SQL connection or Google Sheets API |
| Email API | https://www.googleapis.com/auth/gmail or SMTP |
| File Storage | AWS S3 or Google Drive API access |

Optional Enhancements

- Integrate ERP systems (e.g., SAP) for direct material release
- Add regulatory export to PDF/CSV for FDA audits
- Implement historical trend analysis for vendor performance dashboards
- Use multi-language support for global COA extraction
- Connect Slack/Teams for real-time alerts beyond email
- Enable batch processing for high-volume uploads
- Add AI anomaly detection for predictive non-compliance flagging
- Build custom scoring models via integrated ML tools

Result: A fully automated quality assurance pipeline that verifies COAs, scores vendors, and drives compliance decisions, ensuring pharmaceutical safety and efficiency with AI precision and complete traceability.

Explore More AI Workflows: Get in touch with us for custom n8n automation!
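A minimal sketch of the weighted SCORE step against the Specifications table is shown below; the in-spec/out-of-spec scoring rule and the 0–100 normalization are assumptions, and the actual node may also factor in vendor history:

```javascript
// Hypothetical sketch of the SCORE step: each test parameter that falls
// within [specMin, specMax] contributes its full weight; out-of-spec
// results contribute nothing. The result is normalized to 0-100.
function vendorScore(results, specs) {
  let earned = 0;
  let total = 0;
  for (const spec of specs) {
    const value = results[spec.testParam];
    total += spec.weight;
    if (value !== undefined && value >= spec.specMin && value <= spec.specMax) {
      earned += spec.weight;
    }
  }
  return total === 0 ? 0 : Math.round((earned / total) * 1000) / 10;
}
```

With the Purity spec above (98.0–102.0, weight 0.4), a purity of 99.1 earns its full weight, while a missing or out-of-range value earns nothing.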
by Amit Mehta
Streamline Your Zoom Meetings with Secure, Automated Stripe Payments

This comprehensive workflow automates the entire process of setting up a paid online event, from scheduling a Zoom meeting and creating a Stripe payment link to tracking participants and sending confirmation emails.

How it Works

This workflow has two primary, distinct branches: Event Creation and Participant Registration.

Event Creation Flow (Triggered via Form):
1. An administrator submits details (title, price, date/time) via a form.
2. The workflow creates a new Zoom meeting with a unique password.
3. It creates a Stripe Product and a Payment Link.
4. A dedicated Google Sheet tab is created for tracking participants.
5. An email is sent to the event organizer with all the details, including the Zoom link, payment link, and participant list URL.

Participant Registration Flow (Triggered via Stripe Webhook):
1. A webhook is triggered when a Stripe payment is completed (checkout.session.completed).
2. The participant's details are added to the dedicated Google Sheet tab.
3. A confirmation email is sent to the participant with the Zoom link and password.
4. A notification email is sent to the event organizer about the new registration.

Use Cases

- **Webinar Sales**: Automate setup and registration for paid webinars.
- **Consulting/Coaching Sessions**: Streamline the booking and payment process for group coaching calls.
- **Online Classes**: Handle registration, payment, and access distribution for online courses or classes.

Setup Instructions

1. Credentials: Add credentials for:
   - Zoom: for creating the meeting.
   - Google: you need both Gmail and Google Sheets credentials.
   - Stripe: for creating products and handling payment webhooks.
2. Google Sheet: Create a new, blank Google Sheet to hold meeting and participant information.
3. Config Node: Fill the Config node with:
   - currency (e.g., EUR)
   - sheet_url (the URL of the Google Sheet you created)
   - teacher_email (the organizer/host's email)
Workflow Logic

The workflow splits into two logical parts handled by an If node:

Part A: Event Creation (Triggered by Creation Form)
1. Trigger: Creation Form (Form Trigger).
2. Check: if is creation flow (If) evaluates to true.
3. Zoom: Create Zoom meeting creates the session.
4. Stripe Product: Create Stripe Product creates a product and price in Stripe.
5. Stripe Link: Create payment link generates the public payment link, embedding Zoom and sheet metadata.
6. Google Sheet: Create participant list creates a new sheet tab for the event.
7. Email Host: Send email to teacher notifies the host of the successful setup.

Part B: Participant Registration (Triggered by On payment)
1. Trigger: On payment (Stripe Trigger - checkout.session.completed).
2. Format: Format participant extracts customer details.
3. Google Sheet: Add participant to list appends the new participant's info to the event's sheet.
4. Email Participant: Send confirmation to participant sends the Zoom access details.
5. Email Host: Notify teacher sends a registration alert.

Node Descriptions

| Node Name | Description |
|-----------|-------------|
| Creation Form | A form trigger used to input the event's required details (title, price, start date/time). |
| On payment | A Stripe trigger that listens for the checkout.session.completed event, indicating a successful payment. |
| Create Zoom meeting | Creates a new Zoom meeting, calculating the start time based on the form inputs. |
| Create Stripe Product | Posts to the Stripe API to create a new product and price based on the form data. |
| Create payment link | Creates a Stripe Payment Link, embedding Zoom meeting and Google Sheet ID metadata. |
| Create participant list | Creates a new tab (named dynamically) in the configured Google Sheet for event tracking. |
| Add participant to list | Appends a new row to the event's Google Sheet tab upon payment completion. |
| Send email to teacher / Notify teacher | Sends emails to the host/organizer for creation confirmation and new participant registration, respectively. |
| Send confirmation to participant | Sends the welcome email to the paying customer with the Zoom access details retrieved from the Stripe metadata. |

Customization Tips

- **Email Content**: Adapt the email contents in the Gmail nodes to fit your branding and tone.
- **Currency**: Change the currency in the Config node.
- **Zoom Password**: The password is set to a random 4-character string; you can modify the logic in the Create Zoom meeting node.
- **Stripe Price**: The price is sent to Stripe in the smallest currency unit (e.g., cents, * 100).

Suggested Sticky Notes for Workflow

- **Setup**: "Add your credentials [Zoom, Google, Stripe]. Note: for Google, you need to add Gmail and Google Sheets. Create a new Google Sheet and keep it blank for now. Then fill the Config node."
- **Creation Form**: "Your journey to easy event management starts here. Click this node, copy the production URL, and keep it handy. It's your personal admin tool for quickly creating new meetings."
- **Customize**: "Feel free to adapt email contents to your needs."
- **Config**: "Set up your flow."

Required Files

- 2DT5BW5tOdy87AUl_Streamline_Your_Zoom_Meetings_with_Secure,_Automated_Stripe_Payments.json: the n8n workflow export file.
- A new, blank Google Sheet (URL configured in the Config node).

Testing Tips

- **Test Creation**: Run the Creation Form to trigger the Part A flow. Verify that a Zoom meeting and Stripe Payment Link are created, a new Google Sheet tab appears, and the host receives the setup email.
- **Test Registration**: Simulate a successful payment to the generated Stripe link to trigger the Part B flow. Verify that the participant is added to the Google Sheet, receives the confirmation email with Zoom details, and the host receives the notification.
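The price conversion mentioned in the customization tips can be sketched as below (a minimal sketch; it assumes a two-decimal currency such as EUR or USD, and zero-decimal currencies like JPY would need no multiplication):

```javascript
// Hypothetical sketch of the "* 100" conversion: Stripe expects amounts
// in the smallest currency unit, so a form price of 19.99 becomes 1999.
// Math.round guards against floating-point drift (19.99 * 100 !== 1999).
function toStripeAmount(price) {
  return Math.round(Number(price) * 100);
}
```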
Suggested Tags & Categories #Stripe #Zoom #Payment #E-commerce #GoogleSheets #Gmail #Automation #Webinar
by Dr. Firas
💥 AI Image → Professional Video Workflow (NanoBanana Ultra + Kling AI + Blotato)

📄 Documentation: Notion Guide

👤 Who is this for?

This workflow is designed for content creators, marketers, fashion brands, AI creators, and automation builders who want to turn a single image into a cinematic, multi-shot professional video, then automatically publish it on platforms like YouTube, without manual editing. It's especially useful if you already work with Google Sheets, n8n, and AI image/video generation APIs.

🎯 What problem does this workflow solve? (Use case)

Creating high-quality video content from images usually requires:
- Manual image editing
- Multiple AI tools
- Video stitching
- Uploading and publishing steps

This workflow fully automates that pipeline: from a single image URL, to a cinematic multi-shot video, to automatic publishing, all driven from a spreadsheet.

⚙️ What this workflow does

This workflow runs in two main phases.

Step 1 – Contact Sheet Generation (NanoBanana Ultra)
- Reads an image URL from Google Sheets
- Downloads the image from Google Drive
- Uploads it to a public host (tmpfiles)
- Uses NanoBanana Ultra (edit-ultra) to generate a 2×3 cinematic contact sheet, ensuring identity preservation, fashion and texture fidelity, and consistent lighting and style
- Saves the generated contact sheet back to Google Drive
- Updates the spreadsheet status

Step 2 – Video Creation & Publishing (Blotato)
- Splits the contact sheet into 6 keyframes
- Uploads each frame publicly
- Uses Kling AI (start–end frame i2v) to generate multiple cinematic video segments
- Merges the generated clips into a single final video (FFmpeg API)
- Uploads the final video to Google Drive
- Publishes automatically to YouTube via **Blotato**
- Updates the spreadsheet with the final video URL and status

🛠️ Setup

To use this workflow, you need:
- n8n (Cloud or Self-Hosted)
- Google Sheets: copy the provided template; each row controls one image → one video
- API keys: AtlasCloud (NanoBanana + Kling), tmpfiles.org (public hosting), Blotato (publishing)
- Connected accounts: Google Drive, Google Sheets, Blotato

🎛️ How to customize this workflow

You can easily adapt it to your needs:
- **Change video style**: Edit the Kling prompt (camera movement, mood, pacing)
- **Adjust video duration**: Modify the duration parameter in the Kling nodes
- **Use fewer or more shots**: Add or remove crop + Kling generation branches
- **Publish to other platforms**: Blotato supports multiple social networks
- **Trigger automatically**: Use the Schedule Trigger instead of manual execution
- **Different aspect ratios**: Adjust the NanoBanana aspect_ratio and the crop math

🚀 Expected outcome

From one image, you get:
- A high-end editorial contact sheet
- Multiple cinematic AI-generated video clips
- One merged professional video, automatically uploaded and published

No manual editing. No timeline work. Fully automated.

👋 Need help or want to customize this?

Contact me for consulting and support:
📩 LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
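The crop math for splitting the 2×3 contact sheet into 6 keyframes can be sketched as follows; the grid orientation (2 columns by 3 rows) is an assumption, so swap the parameters if your sheet is laid out the other way:

```javascript
// Hypothetical sketch of the contact-sheet crop math: divide the image
// into a cols x rows grid and return one crop rectangle per keyframe.
function cropRects(width, height, cols = 2, rows = 3) {
  const w = Math.floor(width / cols);
  const h = Math.floor(height / rows);
  const rects = [];
  for (let r = 0; r < rows; r++) {
    for (let c = 0; c < cols; c++) {
      rects.push({ x: c * w, y: r * h, width: w, height: h });
    }
  }
  return rects;
}
```

Each rectangle feeds one crop branch, which in turn feeds one Kling start–end frame generation.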
by Daniel Lianes
Auto-scrape Twitter accounts to WhatsApp groups

This workflow provides automated access to real-time Twitter/X content through intelligent scraping and AI processing. It keeps you at the cutting edge of breaking news, emerging trends, and industry developments by eliminating the need to manually check multiple social media accounts, delivering curated updates directly to your communication channels.

Overview

This workflow automatically handles the complete Twitter monitoring process using advanced scraping techniques and AI analysis. It manages API authentication, multi-source data collection, intelligent content filtering, and message delivery, with built-in error handling and rate limiting for reliable automation.

Core Function: Real-time social media monitoring that transforms Twitter noise into actionable intelligence, ensuring you're always first to know about the latest trends, product launches, and industry shifts that shape your field.

Key Capabilities

- **Real-time trend detection**: Catch breaking news and emerging topics as they happen on X/Twitter
- **Multi-source Twitter monitoring**: Track specific accounts AND trending keyword searches simultaneously
- **AI-powered trend analysis**: Gemini 2.5 Pro filters noise and surfaces only the latest developments that matter
- **Stay ahead of the curve**: Identify emerging technologies, viral discussions, and industry shifts before they go mainstream
- **Flexible delivery options**: Pre-configured for WhatsApp, but easily adaptable for Telegram, Slack, Discord, or even blog content generation
- **Rate limit protection**: Built-in delays and error handling using TwitterAPI.io's reliable, cost-effective infrastructure

Tools Used

- **n8n**: The automation platform orchestrating the entire workflow
- **TwitterAPI.io**: Reliable access to Twitter/X data without API complexities
- **OpenRouter**: Gateway to advanced AI models for content processing
- **Gemini 2.5 Pro**: Google's latest AI for intelligent content analysis and formatting
- **Evolution API**: WhatsApp Business API integration for message delivery
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install

IMPORTANT: Before importing this workflow, you need to install the Evolution API community node.

1. Install the Community Node First: Go to Settings > Community Nodes in your n8n instance.
2. Add Evolution API: Install the n8n-nodes-evolution-api package.
3. Restart n8n: Allow the new nodes to load properly.
4. Import the Workflow: Download the .json file and import it into your n8n instance.
5. Configure Twitter Access: Set up TwitterAPI.io credentials and add target accounts/keywords.
6. Set Up AI Processing: Add your OpenRouter API key for Gemini 2.5 Pro access.
7. Configure WhatsApp: Set up Evolution API and add your target group ID.
8. Test & Deploy: Run a test execution and schedule for daily operation.

Use Cases

- **Stay Ahead of Breaking News**: Be the first to know about industry announcements, product launches, and major developments the moment they hit X/Twitter
- **Spot Trends Before They Explode**: Identify emerging technologies, viral topics, and shifting conversations while they're still building momentum
- **Competitive Intelligence**: Monitor what industry leaders, competitors, and influencers are discussing in real time
- **Brand Surveillance**: Track mentions, discussions, and sentiment around your brand as conversations develop
- **Content Creation Pipeline**: Gather trending topics, viral discussions, and timely content ideas for blogs, newsletters, or social media strategy
- **Market Research**: Collect real-time social sentiment and emerging market signals from X/Twitter conversations
- **Multi-platform Distribution**: While configured for WhatsApp, the structured output can easily feed Telegram bots, Slack channels, Discord servers, or automated blog generation systems

Find Your WhatsApp Groups

The workflow includes a helper node to easily find your WhatsApp group IDs:

1. Use the Fetch Groups node: The workflow includes a dedicated node that fetches all your available WhatsApp groups.
2. Run the helper: Execute just that node to see a list of all groups with their IDs.
3. Copy the group ID: Find your target group in the list and copy its ID.
4. Update the delivery node: Paste the group ID into the final WhatsApp sending node.

Group ID format: Always ends with @g.us (example: 120363419788967600@g.us)

Pro tip: Test with a small private group first before deploying to your main team channels.

Connect with Me

- **LinkedIn**: https://www.linkedin.com/in/daniel-lianes/
- **Discovery Call**: https://cal.com/averis/asesoria
- **Consulting Session**: https://cal.com/averis/consultoria-personalizada

Was this helpful? Let me know! I truly hope this was helpful. Your feedback is very valuable and helps me create better resources.

Want to take automation to the next level? If you're looking to optimize your business processes or need expert help with a project, here's how I can assist you:

- Advisory (Discovery Call): Do you have a process in your business that you'd like to automate but don't know where to start? In this initial call, we'll explore your needs and see if automation is the ideal solution for you. Schedule a Discovery Call
- Personalized Consulting (Paid Session): If you already have a specific problem, an integration challenge, or need hands-on help building a custom workflow, this session is for you. Together, we'll find a powerful solution for your case. Book Your Consulting Session

Stay Up to Date

For more tricks, ideas, and news about automation and AI, let's connect on LinkedIn! Follow me on LinkedIn

#n8n #automation #twitter #whatsapp #ai #socialmedia #monitoring #intelligence #gemini #scraping #workflow #nocode #businessautomation #socialmonitoring #contentcuration #teamcommunication #brandmonitoring #trendanalysis #marketresearch #productivity
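A quick sanity check for the group ID format described above can be sketched as follows (the strict digits-only pattern is an assumption; the documented invariant is only that group IDs end with @g.us):

```javascript
// Hypothetical validator for WhatsApp group IDs as described above:
// group JIDs end with "@g.us" (e.g., 120363419788967600@g.us),
// while individual contacts use a different suffix.
function isGroupId(jid) {
  return /^\d+@g\.us$/.test(jid);
}
```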
by Anirudh Aeran
This workflow is a complete, AI-powered content engine designed to help automation experts build their personal brand on LinkedIn. It transforms a technical n8n workflow (in JSON format) into a polished, engaging LinkedIn post, complete with a custom-generated AI image and a strategic call-to-action. This system acts as your personal content co-pilot, handling the creative heavy lifting so you can focus on building, not just writing.

Who's it for?

This template is for n8n developers, automation consultants, and tech content creators who want to consistently showcase their work on LinkedIn but lack the time or desire to write marketing copy and design visuals from scratch. If you want to turn your projects into high-quality content with minimal effort, this is your solution.

How it works

This workflow is divided into two main parts that work together through Telegram.

Content Generation & Image Creation:
1. You send an n8n workflow's JSON file to your first Telegram bot.
2. The workflow sends the JSON to Google Gemini with a sophisticated prompt, instructing it to analyze the workflow and write a compelling LinkedIn post in one of two high-engagement styles ("Builder" or "Strategist").
3. Gemini also generates a detailed prompt for an AI image model, including a specific headline to be embedded in the visual.
4. This image prompt is sent to the Cloudflare Workers AI model to generate a unique, high-quality image for your post.
5. The final image and the AI-generated post text are sent back to you via Telegram for review.

Posting to LinkedIn:
1. You use a second Telegram bot for publishing. Simply reply to the image you received from the first bot with the final, polished post text.
2. The workflow triggers on your reply, grabs the image and the text, and automatically publishes them as a new post on your LinkedIn profile.

Why Two Different Workflows?

The first workflow sends you the image and the post content. You can edit the content or the image, then send the image to BOT-2. Next, copy the post content and send it to BOT-2 as a reply to the image. Both the image and the content will then be posted on LinkedIn as a single post.

How to set up

1. Create Two Telegram Bots: You need two separate bots. Use BotFather on Telegram to create them and get their API tokens.
   - Bot 1 (Generator): for submitting JSON and receiving the generated content/image.
   - Bot 2 (Publisher): for replying to the image to post on LinkedIn (after human verification).
2. Set Up Accounts & Credentials: Add credentials for Google Gemini, Cloudflare (with an API Token), Google Sheets, and LinkedIn. For Cloudflare, you will also need your Account ID.
3. Google Sheet for Tracking: Create a Google Sheet with the columns Keyword, Image Prompt, and Style Used to keep a log of your generated content.
4. Configure Nodes:
   - In all Telegram nodes, select the correct credential for each bot.
   - In the Google Gemini node, ensure your API credential is selected.
   - In the Cloudflare nodes ("Get accounts" and "Get Flux Schnell image"), select your Cloudflare credential and replace the placeholder with your Account ID in the URL.
   - In the LinkedIn node, select your credential and choose the author (your profile).
   - In the Google Sheets node, enter your Sheet ID.
5. Activate: Activate both Telegram Triggers in the workflow.

Requirements

- An n8n instance.
- Credentials for: Google Gemini, Cloudflare, LinkedIn, Google Sheets.
- Two Telegram bots with their API tokens.
- A Cloudflare Account ID.
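As a rough sketch, the Account ID placeholder mentioned in the Cloudflare node setup slots into a request URL shaped like the one below. The path and model slug are assumptions inferred from the node name "Get Flux Schnell image"; verify them against the node's actual URL field and Cloudflare's Workers AI documentation:

```javascript
// Hypothetical sketch of the Workers AI request URL the
// "Get Flux Schnell image" node would call. Path and model slug
// are assumptions; check against your node configuration.
function fluxEndpoint(accountId) {
  return `https://api.cloudflare.com/client/v4/accounts/${accountId}` +
    '/ai/run/@cf/black-forest-labs/flux-1-schnell';
}
```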
by Rahul Joshi
📊 Description

Automate your content repurposing workflow by transforming long-form articles, blogs, and newsletters into short, high-signal, AI-ready social media snippets. ✍️🤖

This workflow fetches pending content from Airtable, generates 30-word snippets, data points, and quote-style insights using GPT-4o-mini, and updates the original record with all generated fields. If Facebook is selected as a target platform, the workflow automatically posts the best snippet via the Meta Graph API and logs the result. Perfect for content, marketing, and social media teams scaling daily publishing without manual rewriting. 🚀📣

🔁 What This Template Does

1️⃣ Fetches "pending" long-form content from Airtable. 📥
2️⃣ Processes all records in batches to avoid rate limits. 🔁
3️⃣ Sends full content + metadata to GPT-4o-mini to generate structured snippets. 🤖
4️⃣ Ensures valid JSON output via the structured parser. 📐
5️⃣ Updates Airtable with:
   — 30-word snippets
   — data points
   — quote insights
   — a recommended primary snippet
   — timestamps & status
6️⃣ Checks if Facebook is selected as a posting platform. ⚙️
7️⃣ Automatically publishes the recommended snippet using the Meta Graph API. 📤
8️⃣ Updates Airtable again with the post status + response. 📝
9️⃣ Sends a success notification to Slack with full details. 💬

⭐ Key Benefits

✅ Automates creation of platform-ready social media snippets
✅ Produces AI-friendly, high-signal content that works for LLM discovery
✅ Eliminates manual rewriting for LinkedIn, Facebook, Twitter, Instagram
✅ Automatically posts to Meta if selected — hands-free publishing
✅ Maintains clean, structured content in Airtable for future reuse
✅ Saves time for marketing, growth, and social teams

🧩 Features

- Airtable integration for content fetch + update
- GPT-4o-mini AI snippet generation
- Structured JSON parser for clean, reliable AI output
- Auto-detection of selected social platforms
- Facebook Graph API publishing
- Slack notifications for success
- Scheduled automation for hands-free daily processing
- Full audit trail with timestamps

🔐 Requirements

- Airtable Personal Access Token
- OpenAI API key (GPT-4o-mini)
- Facebook Graph API credentials (for auto-posting)
- Slack API credentials
- n8n with LangChain nodes enabled

🎯 Target Audience

- Content marketing teams repurposing long-form content
- Social media managers publishing daily posts
- Growth teams optimizing content for AI search engines
- Agencies producing content at scale for multiple clients
by Artem Boiko
How it works This template helps project managers collect task updates and photo reports from field workers via Telegram and stores everything in a Google Sheet. It enables daily project reporting without paper or back-office overhead. High-level flow: Workers receive daily tasks via Telegram They respond with photo reports Bot auto-saves replies (photos + status) to a Google Sheet The system tracks task completion, adds timestamps, and maintains report history Set up steps 🕒 Estimated setup time: 15–30 min You’ll need: A Telegram bot (via BotFather) A connected Google Sheet (with specific column headers) A set of preconfigured tasks 👉 Detailed setup instructions and required table structure are documented in sticky notes inside the workflow. Consulting and Training We work with leading construction, engineering, consulting agencies and technology firms around the world to help them implement open data principles, automate CAD/BIM processing and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us. Docs & Issues: Full Readme on GitHub
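The "bot auto-saves replies (photos + status) to a Google Sheet" step can be sketched as a Code-node-style mapping from a Telegram message to a sheet row. The column names below are assumptions for illustration; the required headers are documented in the workflow's sticky notes.

```javascript
// Sketch: map an incoming Telegram message to a Google Sheet row.
// Column names (Worker, Task, PhotoFileId, Status, Timestamp) are
// hypothetical; use the headers defined in the template's sticky notes.
function toSheetRow(message, taskName) {
  const photos = message.photo || [];
  // Telegram sends several photo sizes; the last entry is the largest.
  const best = photos.length ? photos[photos.length - 1].file_id : '';
  return {
    Worker: message.from ? message.from.first_name : 'unknown',
    Task: taskName,
    PhotoFileId: best,
    Status: best ? 'done' : 'pending',
    // Telegram dates are Unix seconds; sheets are easier with ISO strings.
    Timestamp: new Date(message.date * 1000).toISOString(),
  };
}
```

This is how the workflow can track completion and keep a timestamped report history per task.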
by AppStoneLab Technologies LLP
Reddit Thread → AI-Powered X & LinkedIn Posts with Human Approval Gate Turn any Reddit thread into polished, platform-optimized social media posts for X (Twitter) and LinkedIn - in minutes, not hours. This workflow reads a Reddit thread, extracts the full discussion (including nested comment threads sorted by score), feeds everything to Google Gemini for summarization and post generation, then pauses for your review before publishing anything live. No accidental posts. No context lost. Just high-quality content, on your terms. ✨ Key Features 🔗 **Any Reddit URL supported** → Standard, mobile (m.reddit.com), and short redd.it links all work 💬 **Full thread extraction** → Recursively pulls all comments and replies, sorted by score at every depth level 🧠 **Two-stage AI pipeline** → Gemini first summarizes the thread, then generates platform-specific posts from that summary 🐦 **X-optimized post** → Max 280 characters, punchy, curiosity-driven with relevant hashtags; auto-truncated if over the limit 💼 **LinkedIn-optimized post** → 150–300 words, professional tone, structured paragraphs, engagement question, and hashtags 👤 **Human-in-the-loop approval** → A review form shows both posts before anything is published; supports manual overrides per platform 🚫 **Graceful rejection path** → If rejected, the workflow terminates cleanly with no content published 📝 What This Workflow Does This workflow solves a real content creation bottleneck: Reddit threads are goldmines of community insight, niche expertise, and trending discussions - but turning that raw discussion into polished, platform-appropriate social posts takes significant manual effort. This automation handles the entire pipeline: from a raw URL to live posts on two platforms, with you staying in full control via an approval gate.
It's ideal for content marketers, community managers, indie hackers, developers, and newsletter writers who want to repurpose Reddit content without losing quality or spending hours manually summarizing threads. ⚙️ How It Works (Step-by-Step) Submit a Reddit URL - A web form (Form Trigger) accepts any Reddit thread URL as input. Parse the URL - A Code node validates and deconstructs the URL using regex, extracting the subreddit and post ID to build the Reddit JSON API endpoint. Fetch the thread - An HTTP Request node calls Reddit's public JSON API (reddit.com/r/.../comments/id.json) with limit=100 and depth=3 to retrieve the full thread. Extract & structure content - A Code node recursively traverses the entire comment tree, sorts comments by score at every depth level, and builds a clean flat text representation of the full thread - including post metadata (title, score, upvote ratio, flair, awards) - ready for AI injection. Summarize with Gemini - The assembled thread content is passed to Google Gemini (3.1 Flash Lite), which returns a comprehensive markdown summary covering: Thread Overview, Key Topics, Notable Insights, Community Sentiment, and Actionable Takeaways. Generate social posts - A second Gemini call uses the summary to craft a platform-optimized X post (≤280 chars) and a LinkedIn post (150–300 words), returning strict JSON output. Parse & validate - A Code node safely extracts the JSON, strips any markdown fences, falls back to regex parsing if needed, and enforces the 280-character hard limit on the X post. Human Approval form - The workflow pauses and presents both posts in a review form. You can approve as-is, paste a manual override for either platform, or reject the entire run. Resolve final content - A Code node merges your overrides (if any) with the AI versions; overrides always win, AI version is the fallback. 
Route by decision - An IF node checks your approval decision: ✅ Approve & Publish → Posts simultaneously to X and LinkedIn ❌ Reject → Workflow ends cleanly; nothing is published 🚀 How to Use This Workflow Step 1 - Set up credentials Click Use template, then configure the following credentials in n8n: | Service | Credential Type | How to Get It | |---|---|---| | 🤖 Google Gemini | Google PaLM API | Get API Key → Google AI Studio | | 🐦 X (Twitter) | Twitter OAuth2 | X Developer Portal → Create App → OAuth2 | | 💼 LinkedIn | LinkedIn OAuth2 | LinkedIn Developer Portal → Create App | > Note on Reddit: No API key required. This workflow uses Reddit's public JSON API (append .json to any thread URL), which is freely accessible without authentication. Step 2 - Configure the LinkedIn node Open the Post to LinkedIn node and replace the person field value (=ID) with your LinkedIn Person URN. You can retrieve it by calling the LinkedIn API: GET https://api.linkedin.com/v2/userinfo after authenticating. Step 3 - Activate the workflow Toggle the workflow to Active in your n8n instance. This enables the Form Trigger and the Wait node's webhook to function correctly. Step 4 - Run it Open the Form Trigger URL (found in the Reddit URL Input node) Paste any Reddit thread URL Wait for the approval form to arrive (check the execution log for the form URL) Review, optionally edit, and approve or reject Done! Your posts are live 🚀 🛠️ How to Customize 🤖 **Swap the AI model** - Both Gemini nodes use gemini-3.1-flash-lite-preview. You can switch to gemini-3.1-pro-preview or claude-sonnet-4-6 for higher quality output by updating the modelId in both Gemini nodes or by adding Anthropic nodes. 📝 **Change the post format** - Edit the prompt in the **Generate Social Posts** node to adjust tone, length, hashtag count, or add support for other platforms (Instagram, Threads, Facebook).
📊 **Add more platforms** - After the **Approved?** node's true branch, connect additional posting nodes (e.g., Facebook Graph API, Buffer, Telegram) in parallel. 📋 **Log to Google Sheets** - Add a Google Sheets node after the publish nodes to track published posts, Reddit thread URLs, dates, and engagement metrics. ⏱️ **Make it scheduled** - Replace the Form Trigger with a Schedule Trigger + a list of pre-configured Reddit URLs in Google Sheets for fully automated daily publishing. ⚠️ Important Notes The Reddit public JSON API does not require authentication but is rate-limited. For high-volume use, consider adding a Reddit OAuth2 credential. The Wait node requires your n8n instance to be publicly accessible (or use n8n Cloud) so the approval form's webhook URL can be reached by your browser. LinkedIn's API requires your app to have the w_member_social permission scope to post on behalf of a user. X (Twitter) API v2 requires an approved developer account. The free tier allows a limited number of tweets per month.
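The "Parse the URL" Code node described in step 2 can be sketched as below. This is an illustrative reconstruction, not the template's exact code; the real node may use different regexes, but it normalizes standard, mobile, and redd.it links the same way.

```javascript
// Sketch: normalize any supported Reddit link into the JSON API endpoint
// used by the fetch step (limit=100, depth=3, as described above).
function toRedditJsonUrl(input) {
  // Short links: https://redd.it/<postId>
  let m = input.match(/redd\.it\/([a-z0-9]+)/i);
  if (m) return `https://www.reddit.com/comments/${m[1]}.json?limit=100&depth=3`;
  // Standard and mobile links: .../r/<subreddit>/comments/<postId>/...
  m = input.match(/reddit\.com\/r\/([^/]+)\/comments\/([a-z0-9]+)/i);
  if (!m) throw new Error('Unrecognized Reddit URL');
  return `https://www.reddit.com/r/${m[1]}/comments/${m[2]}.json?limit=100&depth=3`;
}
```

Because `m.reddit.com` still contains `reddit.com/r/...`, the same pattern covers mobile links without a separate branch.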
by Joseph
Overview This n8n workflow creates an intelligent AI agent that automates browser interactions through Airtop's browser automation platform. The agent can control real browser sessions, navigate websites, interact with web elements, and maintain detailed session records - all while providing live viewing capabilities for real-time monitoring. YouTube Tutorial: https://www.youtube.com/watch?v=XoZqFY7QFps What This Workflow Does The AI agent acts as your virtual assistant in the browser, capable of: **Session Management**: Creates, monitors, and terminates browser sessions with proper tracking **Web Navigation**: Visits websites, clicks elements, fills forms, and performs complex interactions **Multi-Window Support**: Manages multiple browser windows within sessions **Live Monitoring**: Provides real-time viewing URLs so you can watch the automation **Data Tracking**: Maintains comprehensive records of all browser activities **Profile Integration**: Uses Airtop profiles for authenticated sessions **Email Notifications**: Sends live URLs and status updates via Gmail Demo Use Case: Automated Reddit Posting The tutorial demonstrates the agent's capabilities by: Logging into Reddit using pre-configured Airtop profile credentials Navigating to a specific subreddit based on user input Creating and publishing a new post with title and content Tracking the entire process with detailed session records Providing live viewing access throughout the automation Core Workflow Components 1. Chat Interface Trigger **Node Type**: Chat Trigger **Purpose**: Accepts user commands for browser automation tasks **Input**: Natural language instructions (e.g., "Create a Reddit post in r/automation") 2.
AI Agent Processing **Node Type**: OpenAI GPT-4 **Purpose**: Interprets user requests and determines appropriate browser actions **System Message**: Contains the comprehensive agent instructions from your documentation **Capabilities**: Understands complex web interaction requests Plans multi-step browser workflows Manages session states intelligently Handles error scenarios gracefully 3. Google Sheets Data Management Multiple Google Sheets nodes manage different aspects of session tracking: Browser Sessions Sheet **Fields**: session_id, description, status, created_date **Purpose**: Tracks active browser sessions **Operations**: Create, read, update session records Window Sessions Sheet **Fields**: session_id, window_id, description, airtop_live_view_url, status **Purpose**: Tracks individual browser windows within sessions **Operations**: Create, read, update window records Airtop Profiles Sheet **Fields**: platform_name, platform_url, profile_name **Purpose**: Stores available authenticated profiles **Operations**: Read available profiles for session creation 4. Airtop Browser Automation Nodes Multiple specialized nodes for browser control: Session Management **create_session**: Creates new browser sessions with optional profile authentication **terminate_session**: Closes browser sessions and updates records **read_airtop_profiles**: Retrieves available authentication profiles Window Management **create_window**: Opens new browser windows with specified URLs **query_page**: Analyzes page content and identifies interactive elements Web Interaction **click_element**: Clicks specific page elements based on AI descriptions **type_text**: Inputs text into form fields and input elements 5. Gmail Integration **Node Type**: Gmail Send **Purpose**: Sends live viewing URLs and status updates **Recipients**: User email for real-time monitoring **Content**: Complete Airtop live view URLs for browser session observation 6.
Error Handling & Validation **Input Validation**: Ensures required parameters are present **Session State Checks**: Verifies browser session status before operations **Error Recovery**: Handles failed operations gracefully **Data Consistency**: Maintains accurate session records even during failures Technical Requirements API Credentials Needed Airtop.ai API Key Sign up at airtop.ai Generate API key from dashboard Required for all browser automation functions OpenAI API Key OpenAI account with GPT-4 access Required for AI agent intelligence and decision-making Google Sheets Access Google account with Google Sheets API access Copy the provided template and get your sheet URL Required for session and profile data management Gmail OAuth Google account with Gmail API access Required for sending live viewing URLs and notifications Airtable Base Structure Create three tables in your Airtable base: 1. Browser Details (Sessions) session_id (Single line text) description (Single line text) status (Single select: Open, Closed) created_date (Date) 2. Window Details (Windows) session_id (Single line text) window_id (Single line text) description (Single line text) airtop_live_view_url (URL) status (Single select: Open, Closed) 3.
Airtop Profiles platform_name (Single line text) platform_url (URL) profile_name (Single line text) Workflow Logic Flow User Request Processing User Input: Natural language command via chat interface AI Analysis: OpenAI processes request and determines required actions Session Check: Agent reads current browser session status Action Planning: AI creates step-by-step execution plan Browser Session Lifecycle Session Creation: Check for existing open sessions Ask user about profile usage if needed Create new Airtop session Record session details in Airtable Window Management: Create browser window with target URL Capture live viewing URL Record window details in Airtable Send live URL via Gmail Web Interactions: Query page content for element identification Execute clicks, form fills, navigation Monitor page state changes Handle dynamic content loading Session Cleanup: Terminate browser session when complete Update all related records to "Closed" status Send completion notification Data Flow Architecture User Input → AI Processing → Session Management → Browser Actions → Data Recording → User Notifications Key Features & Benefits Intelligent Automation **Natural Language Control**: Users can describe tasks in plain English **Context Awareness**: AI understands complex multi-step workflows **Adaptive Responses**: Handles unexpected page changes and errors **Profile Integration**: Seamlessly uses stored authentication credentials Real-Time Monitoring **Live View URLs**: Watch browser automation as it happens **Status Updates**: Real-time notifications of task progress **Session Tracking**: Complete audit trail of all browser activities **Multi-Window Support**: Handle complex workflows across multiple tabs Enterprise-Ready Features **Error Recovery**: Robust handling of network issues and page failures **Session Persistence**: Maintains state across workflow interruptions **Data Integrity**: Consistent record-keeping even during failures **Scalable Architecture**: Can handle multiple
concurrent automation tasks Use Cases Beyond Reddit This workflow architecture supports automation for any website: Social Media Management **Multi-platform posting**: Facebook, Twitter, LinkedIn, Instagram **Community engagement**: Responding to comments, messages **Content scheduling**: Publishing posts at optimal times **Analytics gathering**: Collecting engagement metrics Business Process Automation **CRM data entry**: Updating customer records across platforms **Support ticket management**: Creating, updating, routing tickets **E-commerce operations**: Product listings, inventory updates **Report generation**: Gathering data from multiple web sources Personal Productivity **Travel booking**: Comparing prices, making reservations **Bill management**: Paying utilities, checking statements **Job applications**: Submitting applications, tracking status **Research tasks**: Gathering information from multiple sources Advanced Configuration Options Custom Profiles Create Airtop profiles for different websites Store authentication credentials securely Switch between different user accounts Handle multi-factor authentication flows Workflow Customization Modify AI system prompts for specific use cases Add custom validation rules Implement retry logic for failed operations Create domain-specific interaction patterns Integration Extensions Connect to additional data sources Add webhook notifications Implement approval workflows Create audit logs and reporting Getting Started 📊 Copy the Google Sheets Template - Just click and make a copy!
Set up credentials for Airtop, OpenAI, and Gmail Import the workflow into your n8n instance Configure node credentials with your API keys and Google Sheets URL Test with simple commands like "Visit google.com" Expand to complex workflows as you become comfortable Best Practices Session Management Always check for existing sessions before creating new ones Properly terminate sessions to avoid resource waste Use descriptive names for sessions and windows Regularly clean up old session records Error Handling Implement timeout handling for slow-loading pages Add retry logic for network failures Validate element existence before interactions Log detailed error information for debugging Security Considerations Store sensitive credentials in Airtop profiles, not in the workflow Use webhook authentication for production deployments Implement rate limiting to avoid being blocked by websites Regularly audit browser session activities This workflow transforms n8n into a powerful browser automation platform, enabling you to automate virtually any web-based task while maintaining full visibility and control over the automation process.
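The "always check for existing sessions before creating new ones" practice can be sketched as a small Code-node helper over the session records described above (rows with session_id and status fields). The record shape is taken from the sessions table; the function names are illustrative.

```javascript
// Sketch: reuse an open browser session when one exists, otherwise
// signal that a new create_session call is needed.
// Records are assumed to look like { session_id: '...', status: 'Open' | 'Closed' }.
function findOpenSession(records) {
  return records.find((r) => r.status === 'Open') || null;
}

function shouldCreateSession(records) {
  // Only create a new session when no record is still marked "Open".
  return findOpenSession(records) === null;
}
```

A check like this before create_session avoids the resource waste the best practices above warn about.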
by Richard Nijsten
Create sprint goals using Pega Agile Studio with AI Based on the Google Sheet data, the AI retrieves the user story IDs, fetches each user story's data and the corresponding attachments, and creates sprint goals according to the defined system prompt. Who's it for Product owners and Scrum masters who use Pega Agile Studio How it works It retrieves the data for each user story using the Agile Studio API, and uses the Google APIs to gather the attachments from the corresponding documents/sheets/slides. Then the AI processes the data according to the system prompt and generates an email. How to set up Create a Google Sheet and add the user stories you want in a column named "Userstory". Set the Google Sheet ID in the node "Retrieve_Data_From_Sheet". Set up the other credentials. Publish the subworkflow for attachment handling. Execute the workflow. Requirements Access to the Pega Agile Studio OAuth2 API. An AI API. Access to Google Cloud for the Google APIs.
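The first step (reading the "Userstory" column from the Retrieve_Data_From_Sheet output) can be sketched as a Code-node helper, assuming the standard n8n item shape of `{ json: { Userstory: '...' } }`. The helper name is illustrative, not part of the template.

```javascript
// Sketch: collect non-empty user story IDs from the "Userstory" column
// of the sheet rows, ready to feed into the Agile Studio API calls.
function extractUserstoryIds(items) {
  return items
    .map((item) => String(item.json.Userstory || '').trim())
    .filter((id) => id.length > 0);
}
```

Filtering out blank cells keeps downstream Agile Studio requests from running against empty rows.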