by PhilanthropEAK Automation
**Who's it for**

Marketing teams, social media managers, content creators, and small businesses looking to maintain a consistent social media presence across multiple platforms. Perfect for organizations that want to automate content creation while maintaining quality and brand consistency.

**How it works**

This workflow creates a complete social media automation system that generates platform-specific content using AI and schedules posts across Twitter, LinkedIn, and Instagram based on your content calendar in Google Sheets.

The system runs daily at 9 AM, reading your content calendar to identify scheduled topics for the day. It uses OpenAI's GPT-4 to generate platform-optimized content that follows each platform's best practices: concise, engaging posts for Twitter, professional thought leadership for LinkedIn, and visual storytelling for Instagram. DALL-E creates accompanying images that match your brand style and topic themes.

Each piece of content is automatically formatted for optimal engagement, including appropriate hashtags, character limits, and platform-specific calls-to-action. The workflow then schedules posts through Buffer's API at optimal times and updates your spreadsheet with posting status, content previews, and generated image URLs for tracking and approval workflows.
**How to set up**

Prerequisites:

- Google account with Sheets access
- OpenAI API key with GPT-4 and DALL-E access
- Buffer account with connected social media profiles
- Slack workspace (optional, for notifications)

Setup steps:

1. Create your content calendar:
   - Copy the provided Google Sheets template
   - Set up columns: Date, Topic, Platforms, Content Type, Keywords, Status, Generated Content, Image URL
   - Fill in your content schedule with topics and target platforms
2. Configure credentials in n8n:
   - Add an OpenAI API credential with your API key
   - Set up Google Sheets OAuth2 for spreadsheet access
   - Add your Buffer API token from your Buffer dashboard
   - Add a Slack API credential for success notifications (optional)
3. Update configuration variables:
   - Set your Google Sheet ID from the spreadsheet URL
   - Define your brand voice and company messaging
   - Specify the target audience for content personalization
   - Set image style preferences for consistent visuals
4. Configure the Buffer integration:
   - Connect your social media accounts to Buffer
   - Get profile IDs for Twitter, LinkedIn, and Instagram
   - Update the Schedule Post node with your specific profile IDs
   - Set optimal posting times in Buffer settings
5. Test the workflow:
   - Add test content to tomorrow's date in your calendar
   - Run the workflow manually to verify content generation
   - Check that posts appear in Buffer's queue correctly
   - Verify that spreadsheet updates and Slack notifications work

**Requirements**

- Google Sheets with the template structure and editing permissions
- OpenAI API key with GPT-4 and DALL-E access (estimated cost: $0.10-0.30 per day for content generation)
- Buffer account (the free plan supports up to 3 social accounts; paid plans for more)
- Social media accounts connected through Buffer (Twitter, LinkedIn, Instagram)
- n8n instance (cloud subscription or self-hosted)

**How to customize the workflow**

Adjust content generation:

- Modify the AI prompts in the OpenAI node to match your industry tone and style
- Add custom content types (promotional, educational, behind-the-scenes, user-generated)
- Include seasonal or event-based content variations in your prompts
- Customize hashtag strategies per platform and content type

Enhance scheduling logic:

- Add time zone considerations for global audiences
- Implement different posting schedules for weekdays vs. weekends
- Create urgency-based posting for time-sensitive content
- Add approval workflows before scheduling sensitive content

Expand platform support:

- Add Facebook, TikTok, or YouTube Shorts using their respective APIs
- Integrate with Hootsuite or Later as alternative scheduling platforms
- Include Pinterest for visual content with optimized descriptions
- Add LinkedIn Company Page posting alongside personal profiles

Improve content intelligence:

- Integrate trending hashtag research using social media APIs
- Add competitor content analysis for inspiration and differentiation
- Include sentiment analysis to adjust tone based on current events
- Implement A/B testing for different content variations

Advanced automation features:

- Add engagement monitoring and response workflows
- Create monthly performance reports sent via email
- Implement content recycling for evergreen topics
- Build user-generated content curation from brand mentions
- Add crisis communication protocols for sensitive topics

Integration enhancements:

- Connect with your CRM to include customer success stories
- Link to email marketing for cross-channel content consistency
- Integrate with project management tools for campaign coordination
- Add analytics dashboards for content performance tracking
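The platform-specific formatting described above (hashtags, character limits) can be sketched as a small helper in the style of an n8n Code node. This is an illustrative sketch, not the template's actual node code; the function name and limits table are assumptions (the limits match commonly cited platform caps).

```javascript
// Hypothetical helper: append hashtags and enforce per-platform caption limits.
const LIMITS = { twitter: 280, linkedin: 3000, instagram: 2200 };

function formatForPlatform(text, platform, hashtags = []) {
  const limit = LIMITS[platform];
  const tags = hashtags.map(t => (t.startsWith('#') ? t : '#' + t)).join(' ');
  let body = tags ? `${text}\n\n${tags}` : text;
  if (body.length > limit) {
    // Truncate on a word boundary and add an ellipsis
    const cut = body.slice(0, limit - 1);
    const space = cut.lastIndexOf(' ');
    body = (space > 0 ? cut.slice(0, space) : cut) + '…';
  }
  return body;
}
```

A Code node using a helper like this would run once per item, reading the generated caption and target platform from the incoming JSON.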
by Omer Fayyaz
Automatically discover and extract article URLs from any website using AI to identify valid content links while filtering out navigation, category pages, and irrelevant content—perfect for building content pipelines, news aggregators, and research databases.

**What Makes This Different**

- **AI-Powered Intelligence** - Uses GPT-5-mini to understand webpage context and identify actual articles vs. navigation pages, eliminating false positives
- **Browser Spoofing** - Includes realistic User-Agent headers and request patterns to avoid bot detection on publisher sites
- **Smart URL Normalization** - Automatically strips tracking parameters (utm_, fbclid, etc.), removes duplicates, and standardizes URLs
- **Source Categorization** - AI assigns logical source names based on domain and content type for easy filtering
- **Rate Limiting Built In** - Configurable delays between requests prevent IP blocking and respect website resources
- **Deduplication on Save** - The Google Sheets append-or-update pattern ensures no duplicate URLs in your database

**Key Benefits of AI-Powered Content Discovery**

- **Save 10+ Hours Weekly** - Automate manual link hunting across dozens of publisher sites
- **Higher Quality Results** - AI filters out 95%+ of the junk links (nav pages, categories, footers) that rule-based scrapers miss
- **Scale Effortlessly** - Add new seed URLs to your sheet and the same workflow handles any website structure
- **Industry Agnostic** - Works for news, blogs, research papers, product pages—any content type
- **Always Up to Date** - Schedule daily runs to catch new content as it's published
- **Full Audit Trail** - Track discovered URLs with timestamps and sources in Google Sheets

**Who's it for**

This template is designed for content marketers, SEO professionals, researchers, media monitors, and anyone who needs to aggregate content from multiple sources.
It's perfect for organizations that need to track competitor blogs, curate industry news, build research databases, monitor brand mentions, or aggregate content for newsletters without manually checking dozens of websites daily or writing complex scraping rules for each source.

**How it works / What it does**

This workflow creates an intelligent content discovery pipeline that automatically finds and extracts article URLs from any webpage. The system:

1. **Reads Seed URLs** - Pulls a list of webpages to crawl from your Google Sheets (blog indexes, news feeds, publication homepages)
2. **Fetches with Stealth** - Downloads each webpage's HTML using browser-like headers to avoid bot detection
3. **Converts for AI** - Transforms messy HTML into clean Markdown that the AI can easily process
4. **AI Extraction** - GPT-5-mini analyzes the content and identifies valid article URLs while filtering out navigation, categories, and junk links
5. **Normalizes & Saves** - Cleans URLs (removes tracking params), deduplicates, and saves to Google Sheets with source tracking

**Key Innovation: Context-Aware Link Filtering** - Unlike traditional scrapers that rely on CSS selectors or URL patterns (which break when sites update), the AI understands the semantic difference between an article link and a navigation link. It reads the page like a human would, identifying content worth following regardless of the website's structure.

**How to set up**

1. Create Your Google Sheets Database
   - Create a new Google Spreadsheet with two sheets:
     - "Seed URLs" - add a URL column with webpages to crawl (blog homepages, news feeds, etc.)
     - "Discovered URLs" - add columns: URL, Source, Status, Discovered At
   - Add 3-5 seed URLs to start (e.g., https://abc.com/, https://news.xyz.com/)
2. Connect Your Credentials
   - **Google Sheets**: Click the "Read Seed URLs" and "Save Discovered URLs" nodes → select your Google Sheets account
   - **OpenAI**: Click the "OpenAI GPT-5-mini" node → add your OpenAI API key
   - Select your spreadsheet and sheet names in both Google Sheets nodes
3. Customize the AI Prompt (Optional)
   - Open the "AI URL Extractor" node
   - Modify the system message to add industry-specific rules:

     ```
     // Example: Add to system message for tech blogs
     For tech sites, also extract:
     - Tutorial and guide URLs
     - Product announcement pages
     - Changelog and release notes
     ```
   - Adjust source naming conventions to match your taxonomy
4. Test Your Configuration
   - Click "Test Workflow" or use the Manual Trigger
   - Check the execution to verify:
     - Seed URLs are being read correctly
     - HTML is fetched successfully (check for a 200 status)
     - The AI returns a valid JSON array of URLs
     - URLs are saved to your output sheet
   - Review the "Discovered URLs" sheet for results
5. Schedule and Monitor
   - Adjust the Schedule Trigger (default: daily at 6 AM)
   - Enable the workflow to run automatically
   - Monitor execution logs for errors:
     - Rate limiting: increase the wait time if sites block you
     - Empty results: check whether seed URLs have changed structure
     - AI errors: review the AI output in the execution data
   - Set up error notifications via email or Slack (add nodes after the Completion Summary)

**Requirements**

- **Google Sheets Account** - OAuth2 connection for reading seed URLs and saving results
- **OpenAI API Key** - For GPT-5-mini (or swap in any LangChain-compatible LLM)
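The normalize-and-deduplicate step described above can be sketched in plain JavaScript. This is an illustrative sketch of the pattern, not the template's actual node code; the function name and the exact tracking-parameter list are assumptions.

```javascript
// Hypothetical helper: strip tracking params, drop fragments, dedupe URLs.
const TRACKING = ['utm_source', 'utm_medium', 'utm_campaign', 'utm_term', 'utm_content', 'fbclid', 'gclid'];

function normalizeUrls(urls) {
  const seen = new Set();
  const out = [];
  for (const raw of urls) {
    let u;
    try { u = new URL(raw); } catch { continue; } // skip malformed URLs
    TRACKING.forEach(p => u.searchParams.delete(p));
    u.hash = '';
    // Standardize: drop a trailing slash so /a and /a/ count as one URL
    const key = u.toString().replace(/\/$/, '');
    if (!seen.has(key)) { seen.add(key); out.push(key); }
  }
  return out;
}
```

In the workflow this logic would sit between the AI extraction output and the Google Sheets save, so the append-or-update node only ever sees clean, unique URLs.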
by Dylan Watkins
⚠️ Heads up: this is satire. The "Hell Yeah!" workflow is a parody of "automate your whole life with AI agents" grindset content. The API endpoints are fictional and the function nodes are illustrative stubs. It's published as a template because the patterns underneath the joke — multi-channel social fan-out, marketplace scraping with follow-up sequences, CRM sync with chat alerts, cron-triggered daily chains — are genuinely useful starting points for n8n agencies building real lead-gen flows. Swap the jokes for your actual integrations and you've got a working skeleton.

**How it works**

- A Rise and Grind cron kicks off a parallel "daily lifestyle" routine every morning
- A gambling → flex chain runs arbitrage math against a fake sportsbook and fans the "wins" out to Instagram, TikTok, Facebook, Threads, LinkedIn, and Twitter in one blast (this is the reusable multi-channel social posting pattern)
- A Facebook Marketplace lowball engine scrapes listings, filters them, fires templated lowball offers, and runs a relentless follow-up sequence (reusable pattern: scraper → filter → outreach → nurture)
- An eBay arbitrage loop finds profitable items, relists them, and logs every transaction to Google Sheets
- A Salesforce + Telegram branch handles "deal alerts" — the standard CRM-sync-with-chat-notification pattern every agency ships
- Extra satirical branches (Tinder auto-booking, Uber Black to the casino, steak delivery, playing Alice in Chains) are there for the bit — delete or replace them

**Set up steps**

- Takes about 15-20 minutes if you're just importing to read through the patterns
- Takes 1-2 hours if you're stripping out the satirical branches and wiring the reusable patterns (social fan-out, marketplace outreach, CRM + Telegram) to your real credentials
- Replace every fictional HTTP endpoint with your actual API before activating
- Add credentials for whichever real nodes you keep: Salesforce, Telegram, Google Sheets, WhatsApp, Facebook Graph
- Detailed notes for each section live in the sticky notes inside the workflow itself
by WeblineIndia
**Automated Failed Login Detection with Jira Security Tasks and Slack Notifications**

Webhook: Failed Login Attempts → Jira Security Case → Slack Warnings

This n8n workflow monitors failed login attempts from any application, normalizes incoming data, detects repeated attempts within a configurable time window, and automatically:

- Sends detailed alerts to Slack,
- Creates Jira security tasks (single or grouped, based on repetition),
- Logs all failed login attempts into a Notion database.

It ensures fast, structured, and automated responses to potential account compromise or brute-force attempts while maintaining persistent records.

**Quick Implementation Steps**

1. Import this JSON workflow into n8n.
2. Connect your application to the failed-login webhook endpoint.
3. Add Jira Cloud API credentials.
4. Add Slack API credentials.
5. Add Notion API credentials and configure the database for storing login attempts.
6. Enable the workflow — done!

**What It Does**

- Receives Failed Login Data: Accepts POST requests containing failed login information. Normalizes the data, ensuring consistent fields: username, ip, timestamp and error.
- Validates Input: Checks for a missing username or IP. Sends a Slack alert if any required field is missing.
- Detects Multiple Attempts: Uses a sliding time window (default: 5 minutes) to detect multiple failed login attempts from the same username + IP. Single attempts → standard Jira task + Slack notification. Multiple attempts → grouped Jira task + detailed Slack notification.
- Logs Attempts in Notion: Records all failed login events into a Notion database with fields: Username, IP, Total Attempts, Attempt List, Attempt Type.
- Formats Slack Alerts: Single attempt → lightweight notification. Multiple attempts → summary including timestamps, errors, total attempts, and the Jira ticket link.

**Who's It For**

This workflow is ideal for:

- Security teams monitoring authentication logs.
- DevOps/SRE teams maintaining infrastructure access logs.
- SaaS platform teams with high login traffic.
- Organizations aiming to automate breach detection.
- Teams using Jira + Slack + Notion + n8n for incident workflows.

**Requirements**

- n8n (self-hosted or cloud).
- Your application must POST failed login data to the webhook.
- Jira Software Cloud credentials (email, API token, domain).
- Slack bot token with message-posting permissions.
- Notion API credentials with access to a database.
- Basic understanding of your login event sources.

**How It Works**

1. Webhook Trigger: The workflow starts when a failed-login event is sent to the failed-login webhook.
2. Normalization: Converts single objects or arrays into a uniform format. Ensures username, IP, timestamp, and error are present. Prepares a logMessage for the Slack and Jira nodes.
3. Validation: An IF node checks whether username and IP exist. If missing → Slack alert for missing information.
4. Multiple Attempt Detection: A Function node detects repeated login attempts within a 5-minute sliding window. Flags attempts as multiple: true or false.
5. Branching: Multiple attempts → build summary, create Jira ticket, format Slack message, store in Notion. Single attempts → create Jira ticket, format Slack message, store in Notion.
6. Slack Alerts: Single attempt → concise message. Multiple attempts → detailed summary with timestamps and the Jira ticket link.
7. Notion Logging: Stores username, IP, total attempts, attempt list, and attempt type in a dedicated database for recordkeeping.

**How To Set Up**

1. Import Workflow → Workflows → Import from File in n8n.
2. Webhook Setup → copy the URL from the Failed Login Trigger node and integrate it with your application.
3. Jira Credentials → connect your Jira account to both Jira nodes and configure the project/issue type.
4. Slack Credentials → connect your Slack bot and select the alert channel.
5. Notion Credentials → connect your Notion account and select the database for storing login attempts.
6. Test the Workflow → send sample events: missing fields, single attempts, multiple attempts.
7. Enable Workflow → turn the workflow on once testing passes.
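The 5-minute sliding-window detection can be sketched as a Function-node-style helper. This is an illustrative sketch of the pattern; the template's actual "Detect Multiple Attempts" node code, and its exact output shape, may differ.

```javascript
// Hypothetical helper: group events by username+IP, flag repeats inside the window.
const WINDOW_MS = 5 * 60 * 1000; // 5-minute sliding window

function detectMultiple(events) {
  const groups = new Map();
  for (const e of events) {
    const key = `${e.username}|${e.ip}`;
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(e);
  }
  const out = [];
  for (const [key, attempts] of groups) {
    const times = attempts.map(a => Date.parse(a.timestamp)).sort((a, b) => a - b);
    // "multiple" if any two attempts from the same username+IP fall within the window
    const multiple = times.some((t, i) => i > 0 && t - times[i - 1] <= WINDOW_MS);
    out.push({ key, total: attempts.length, multiple, attempts });
  }
  return out;
}
```

The multiple flag drives the IF branch: true goes to the grouped Jira ticket and summary Slack message, false to the single-attempt path.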
**Logic Overview**

| Step | Node & Description |
|------|--------------------|
| Normalize input | Normalize Login Event — Ensures each event has required fields and prepares a logMessage. |
| Validate fields | Check Username & IP present — IF node → alerts Slack if data is incomplete. |
| Detect repeats | Detect Multiple Attempts — Finds multiple attempts within a 5-minute window; sets the multiple flag. |
| Multiple attempts | IF - Multiple Attempts + Build Multi-Attempt Summary — Prepares a grouped summary for Slack & Jira. |
| Single attempt | Create Ticket - Single Attempt — Creates a Jira task & Slack alert for one-off events. |
| Multiple attempt ticket | Create Ticket - Multiple Attempts — Creates a detailed Jira task. |
| Slack alert formatting | Format Fields For Single/Multiple Attempt — Prepares a structured message for Slack. |
| Slack alert delivery | Slack Alert - Single/Multiple Attempts — Posts the alert in the selected Slack channel. |
| Notion logging | Login Attempts Data Store in DB — Stores structured attempt data in the Notion database. |

**Customization Options**

- **Webhook Node** → adjust the endpoint path for your application.
- **Normalization Function** → add fields such as device, OS, location, or user-agent.
- **Multiple Attempt Logic** → change the sliding window duration or repetition threshold.
- **Jira Nodes** → modify the issue type, labels, or project.
- **Slack Nodes** → adjust markdown formatting, channel routing, or severity-based channels.
- **Notion Node** → add or modify database fields to store additional context.

Optional Enhancements:

- Geo-IP lookup for country/city info.
- Automatic IP blocking via firewall or WAF.
- User notification for suspicious login attempts.
- Database logging in MySQL/Postgres/MongoDB.
- Threat intelligence enrichment (e.g., AbuseIPDB).

**Use Case Examples**

- Detect brute-force attacks targeting user accounts.
- Identify credential stuffing across multiple users.
- Monitor admin portal access failures with Jira task creation.
- Alert security teams instantly when login attempts originate from unusual locations.
- Centralize failed login monitoring across multiple applications with Notion logging.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Workflow not receiving data | Webhook misconfigured | Verify the webhook URL & POST payload format |
| Jira ticket creation fails | Invalid credentials or insufficient permissions | Update the Jira API token and project access |
| Slack alert not sent | Incorrect channel ID or missing bot scopes | Fix Slack credentials and permissions |
| Multiple attempts not detected | Sliding window logic misaligned | Adjust the Detect Multiple Attempts node code |
| Notion logging fails | Incorrect database ID or missing credentials | Update the Notion node credentials and database configuration |
| Errors in normalization | Payload format mismatch | Update the Normalize Login Event function code |

**Need Help?**

If you need help setting up, customizing, or extending this workflow, WeblineIndia can assist with full n8n development, workflow automation, security event processing, and custom integrations.
by Khairul Muhtadin
Automate your B2B lead generation by scraping Google Maps, generating AI business summaries, and extracting hidden contact emails from websites, all triggered via Telegram.

**Why Use This Workflow?**

- **Time Savings:** Reduces lead research time from 4 hours of manual searching to 5 minutes of automated execution per 50 leads.
- **Cost Reduction:** Replaces expensive monthly lead database subscriptions with a pay-as-you-go model using Apify and OpenAI.
- **Error Prevention:** Uses AI to deduplicate results and ensure company summaries are professional and consistent for your CRM.
- **Scalability:** Allows you to trigger massive scraping tasks from your mobile phone via Telegram while the backend handles the heavy lifting.

**Ideal For**

- **Sales Development Reps (SDRs):** Rapidly building targeted lists of local businesses for cold outreach or door-knocking campaigns.
- **Marketing Agencies:** Identifying new businesses in specific sectors (e.g., "Dentists in Paris") to offer SEO or advertising services.
- **Real Estate Investors:** Finding specific commercial properties or business types in a geographic area to identify investment opportunities.

**How It Works**

1. Trigger: The workflow starts when you send a message to your Telegram bot in the format Sector; Limit; MapsURL.
2. Data Collection: n8n parses these parameters and triggers an Apify Actor to scrape Google Maps for business details.
3. Processing: The workflow retrieves the results, removes duplicate entries, and begins a loop to process each business.
4. Intelligence Layer (Summary): OpenAI analyzes the business data to create a natural, human-like summary of the company.
5. Enrichment (Email Hunting): If the business has a website, Jina AI fetches the page content and OpenAI extracts the most authoritative contact email.
6. Output & Delivery: Core data is upserted into Google Sheets, and the email is updated once found. A "DONE" notification is sent to Telegram upon completion.
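Parsing the Sector; Limit; MapsURL trigger message can be sketched as a small helper. This is an illustrative sketch; the function name and validation rules are assumptions, not the template's actual node code.

```javascript
// Hypothetical helper: parse and validate the Telegram trigger message.
function parseTrigger(text) {
  const parts = text.split(';').map(s => s.trim());
  if (parts.length !== 3) throw new Error('Expected "Sector; Limit; MapsURL"');
  const [sector, limitRaw, mapsUrl] = parts;
  const limit = parseInt(limitRaw, 10);
  if (!Number.isInteger(limit) || limit <= 0) throw new Error('Limit must be a positive integer');
  // Loose sanity check that the third part is a Google Maps URL
  if (!/^https:\/\/www\.google\.[^/]+\/maps\//.test(mapsUrl)) throw new Error('Expected a Google Maps URL');
  return { sector, limit, mapsUrl };
}
```

Failing fast here means a malformed message produces an immediate Telegram error reply instead of a wasted Apify run.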
**Setup Guide**

Prerequisites:

| Requirement | Type | Purpose |
|-------------|------|---------|
| n8n instance | Essential | Workflow execution and logic |
| Apify Account | Essential | Scrapes Google Maps data |
| OpenAI API Key | Essential | Generates summaries and extracts emails |
| Google Sheets | Essential | Data storage and lead management |
| Telegram Bot | Essential | User interface for triggering searches |
| Jina AI Key | Essential | Converts websites to LLM-friendly Markdown |

Installation Steps:

1. Import the JSON file into your n8n instance.
2. Configure credentials:
   - Telegram: Create a bot via @BotFather and add your token to the Telegram nodes.
   - Apify: Provide your API token to the "Run Maps Scraper" and "Fetch Dataset" nodes.
   - OpenAI: Add your API key to the enrichment nodes.
   - Google Sheets: Connect your Google account and select your target Spreadsheet ID.
3. Set up the Sheet: Ensure your Google Sheet has headers matching the "Upsert" node (Title, Email, Category, Website, etc.).
4. Test execution: Send a message like Coffee Shops; 5; https://www.google.com/maps/search/coffee+shops+london to your bot.

**Technical Details**

Core Nodes:

| Node | Purpose | Key Configuration |
|------|---------|-------------------|
| Telegram Trigger | Captures user input | Listens for message updates |
| Apify Node | Executes the Maps Scraper | Uses the nwua9Gu5YrADL7ZDj actor ID |
| OpenAI Node | AI analysis | Configured for JSON output (summary) and extraction (email) |
| Jina AI | Web scraping | Converts HTML to Markdown for AI readability |
| Google Sheets | Database | Uses Append or Update keyed on the business Title |

**Workflow Logic**

The workflow uses a Split In Batches loop to ensure stability. It first performs a "shallow" save of business details (name, phone, address) and then attempts a "deep" enrichment only if a website URL is detected. This two-stage approach ensures you don't lose data if a website crawl fails.
**Customization Options**

Basic Adjustments:

- **Rate Limit:** Adjust the "Wait" node duration (default 2s) to comply with Google Sheets or API limits.
- **Language:** Modify the OpenAI prompt to generate summaries in your preferred language (e.g., English, French, Spanish).

Advanced Enhancements:

- **CRM Integration:** Replace Google Sheets with HubSpot or Pipedrive for direct lead injection.
- **Slack Notifications:** Send the final lead list or business summary directly to a sales channel.
- **Auto-Emailer:** Add a Gmail/Outlook node to automatically send an introductory email once a lead is found.

**Troubleshooting**

Common Issues:

| Problem | Cause | Solution |
|---------|-------|----------|
| Empty Email column | Website protected by bot-blockers | Try a different scraper or use a proxy in Jina AI |
| Apify Timeout | Scraping limit set too high | Lower the "limit" parameter in your Telegram message |
| 429 Errors | Google Sheets rate limits | Increase the duration in the "Wait Rate Limit" node |

**Use Case Examples**

Scenario 1: Local SEO Agency

- Challenge: Finding local contractors with poor reviews but high revenue potential.
- Solution: Use the workflow to scrape "Plumbers" in a city, use AI to summarize their online presence, and collect emails.
- Result: A curated list of 100 leads with contact info and a "Pitch" summary generated in minutes.

Scenario 2: SaaS Cold Outreach

- Challenge: Getting the direct email of a business owner for a new booking software.
- Solution: Trigger the scraper via Telegram while in the field. The workflow extracts the "authoritative" email (manager/owner) from their site.
- Result: Accurate, high-intent lead data delivered directly to a master spreadsheet for the sales team.

Created by: Khaisa Studio
Category: Marketing | Tags: Lead Gen, AI, Google Maps, Telegram

Need custom workflows? Contact us

Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
by Jitesh Dugar
**Schedule social media posts from local files using UploadToURL, OpenAI, and Buffer**

Marketing teams often have design files sitting locally — campaign images, product videos, event graphics — that need to be published on social media. The usual process means downloading files, switching apps, uploading to each platform separately, and writing captions by hand.

This workflow removes those steps. Send a file link or binary upload to the webhook. UploadToURL hosts it instantly and returns a clean public URL. OpenAI GPT-4.1 mini reads the filename and context to generate a platform-specific caption, hashtags, alt text, and a scroll-stopping hook. A Switch node routes to the correct Buffer profile — Twitter/X, Instagram, or LinkedIn — and the post is scheduled at the AI-suggested best time.

**What this workflow does**

- Receives a file URL or binary upload via webhook, along with platform, tone, and brand preferences
- Validates the payload — checks the platform, detects the content type from the file extension, and cleans the filename into readable words for the AI prompt
- Uploads the file to UploadToURL and retrieves a permanent public link
- Sends the link and context to OpenAI, which returns a structured JSON caption with hashtags, alt text, a hook line, and a UTC posting time
- Routes to the correct Buffer profile based on the platform field
- Schedules the post and returns a confirmation with the schedule ID, caption, hashtags, and estimated engagement

**Who this is for**

- **Marketing agencies** managing multiple brand accounts who need to go from a finished design file to a scheduled post without switching tools
- **Solo creators** who want to publish immediately after finishing a piece of content without manually uploading to each platform
- **E-commerce teams** who want to trigger social posts whenever new product photos are ready

**Setup**

1. Install the UploadToURL community node: n8n-nodes-uploadtourl
2. Add credentials for UploadToURL API, OpenAI API, and Buffer (as HTTP Header Auth with your Buffer access token)
3. Set three workflow variables: BUFFER_PROFILE_TWITTER, BUFFER_PROFILE_INSTAGRAM, BUFFER_PROFILE_LINKEDIN — find these IDs in your Buffer account under each profile's settings
4. Activate and copy the webhook URL

**Webhook payload**

```json
{
  "fileUrl": "https://cdn.example.com/summer-campaign.jpg",
  "filename": "summer-campaign.jpg",
  "platform": "instagram",
  "tone": "casual",
  "brand": "Acme Studio",
  "hashtags": true
}
```

To upload a binary file instead, send as multipart/form-data with field name file and omit fileUrl. Pass scheduleTime as an ISO 8601 string to override the AI scheduling suggestion.

**Notes**

- The OpenAI node uses gpt-4.1-mini with response_format: json_object to guarantee structured output — no post-processing of free text required
- Caption length is validated against per-platform limits before scheduling (Twitter: 280, Instagram: 2200, LinkedIn: 3000)
- To add Facebook or TikTok, add a new output on the Switch node and duplicate one of the Buffer HTTP request nodes
- The error handler returns a structured JSON 400 response so calling apps receive actionable feedback without needing to check n8n logs
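The payload-validation step (platform check, content type from extension, filename cleanup) can be sketched as a small helper. This is an illustrative sketch under assumed names; the template's actual validation node may differ in detail.

```javascript
// Hypothetical helper: validate the webhook payload and prep the AI context.
const PLATFORMS = ['twitter', 'instagram', 'linkedin'];
const VIDEO_EXT = ['mp4', 'mov', 'webm'];

function validatePayload({ filename, platform }) {
  if (!PLATFORMS.includes(platform)) throw new Error(`Unsupported platform: ${platform}`);
  const ext = (filename.split('.').pop() || '').toLowerCase();
  const contentType = VIDEO_EXT.includes(ext) ? 'video' : 'image';
  // "summer-campaign_v2.jpg" -> "summer campaign v2", readable words for the AI prompt
  const readable = filename.replace(/\.[^.]+$/, '').replace(/[-_]+/g, ' ').trim();
  return { contentType, readable };
}
```

Rejecting an unsupported platform here is what lets the error handler return a structured 400 before any upload or OpenAI call happens.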
by Dr. Firas
**Auto-Generate Social Media Videos with GPT-5 and Publish via Blotato**

> ⚠️ Disclaimer: This workflow uses Community Nodes (Blotato) and requires a self-hosted n8n instance with "Verified Community Nodes" enabled.

👥 **Who is this for?**

This workflow is perfect for:

- Content creators and influencers who post regularly on social media
- Marketing teams that want to scale branded video production
- Solo entrepreneurs looking to automate their video marketing
- Agencies managing multi-client social media publishing

💡 **What problem is this workflow solving?**

Creating high-quality video content and publishing consistently on multiple platforms is time-consuming. You often need to:

- Write compelling captions and titles
- Adapt content to fit each platform's requirements
- Publish manually or across disconnected tools

This workflow automates the entire process — from idea to publishing — so you can focus on growth and creativity, not logistics.

⚙️ **What this workflow does**

1. Receives a video idea via Telegram
2. Saves metadata to Google Sheets
3. Transcribes the video using OpenAI Whisper
4. Generates a catchy title and caption using GPT-5
5. Uploads the final media to Blotato
6. Publishes the video automatically to: TikTok, Instagram, YouTube Shorts, Facebook, X (Twitter), Threads, LinkedIn, Pinterest, and Bluesky
7. Updates the post status in Google Sheets
8. Sends confirmation via Telegram

🧰 **Setup**

Before launching the workflow, make sure to:

1. Create a Blotato Pro account and generate your API key
2. Enable Verified Community Nodes in the n8n Admin Panel
3. Install the Blotato community node in n8n
4. Create your Blotato credential using the API key
5. Make a copy of this Google Sheet template
6. Ensure your Google Drive folder with videos is shared publicly (viewable by anyone with the link)
7. Link your Telegram bot and configure the trigger node
8. Follow the sticky note instructions inside the workflow

🛠️ **How to customize this workflow**

- Modify the GPT-5 prompt to reflect your brand voice or campaign tone
- Add or remove social platforms depending on your strategy
- Include additional AI modules (e.g., for voiceover or thumbnails)
- Insert review/approval steps (via Slack, email, or Telegram)
- Connect Airtable, Notion, or your CRM to track results

This is your all-in-one AI video publishing engine, built for automation, scale, and growth across the social web.

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
by Davide
This workflow automates the full pipeline for extending short Viral UGC-style videos using AI, merging them, and finally publishing the output to cloud storage or social media platforms (TikTok, Instagram, Facebook, Linkedin, X, and YouTube). It integrates multiple external APIs (Fal.ai, Runpod/Kling 2.1, Postiz, Upload-Post, Google Sheets, Google Drive) to create a smooth end-to-end video-generation system. Key Advantages 1. ✅ Full End-to-End Automation The workflow covers the entire process: Read inputs Generate extended clips Merge them Save outputs Publish on social platforms No manual intervention required after starting the workflow. 2. ✅ AI-Powered Video Extension (Kling 2.1 or other models like Veo 3.1 or Sora 2) The system uses Kling 2.1 (Kling 2.1 or other models like Veo 3.1 or Sora 2) to extend short videos realistically, enabling: Longer UGC clips Consistent cinematic style Smooth transitions based on extracted frames Ideal for viral social media content. 3. ✅ Smart Integration with Google Sheets The spreadsheet becomes a control panel: Add new videos to extend Control merging Automatically store URLs and results This makes the system user-friendly even for non-technical operators. 4. ✅ Robust Asynchronous Job Handling Every external API includes: Status checks Waiting loops Error prevention steps This ensures reliability when working with long-running AI processes. 5. ✅ Automatic Merging and Publishing Once videos are generated, the workflow: Merges them in the correct order Uploads them to Google Drive Posts them automatically to selected social platforms This drastically reduces time required for content production and distribution. 6. ✅ Highly Scalable and Customizable Because it is built in n8n: You can add more APIs You can add editing steps You can connect custom triggers (e.g., Airtable, webhooks, Shopify, etc.) 
You can fully automate your video-production pipeline How It Works This workflow automates the process of extending and merging videos using AI-generated content, then publishing the final result to social media platforms. The process consists of five main stages: **Data Input & Frame Extraction** The workflow starts by reading video and prompt data from a Google Sheet. It extracts the last frame from the input video using Fal.ai’s FFmpeg API. **AI Video Generation** The extracted frame is sent to RunPod’s Kling 2.1 AI model to generate a new video clip based on the provided prompt and desired duration. **Video Merging** Once the AI-generated clip is ready, it is merged with the original video using Fal.ai’s FFmpeg merge functionality to create a seamless extended video. **Storage & Publishing** The final merged video is uploaded to Google Drive and simultaneously distributed to social media platforms via: YouTube (via Upload-Post) TikTok, Instagram, Facebook, X, and YouTube (via Postiz) **Progress Tracking** Throughout the process, the Google Sheet is updated with the status, video URLs, and completion markers to keep track of each step. Set Up Steps To configure this workflow, follow these steps: Prepare the Google Sheet Use the provided template or clone this sheet. Fill in the START (video URL), PROMPT (AI prompt), and DURATION (in seconds) columns. Configure Fal.ai API for Frame Extraction & Merging Create an account at fal.ai. Obtain your API key. In the nodes “Extract last frame”, “Merge Videos”, and related status nodes, set up HTTP Header Authentication with: Name: Authorization Value: Key YOUR_API_KEY Set Up RunPod API for AI Video Generation Sign up at RunPod and get your API key. In the “Generate clip” node, configure HTTP Bearer Authentication with: Value: Bearer YOUR_RUNPOD_API_KEY Configure Social Media Publishing For YouTube: Create a free account at Upload-Post and set your YOUR_USERNAME and TITLE in the “Upload to Youtube” node. 
For Multi-Platform Posting: Sign up at Postiz and configure your Channel_ID and TITLE in the “Upload to Social” node. Connect Google Services Set up Google Sheets and Google Drive OAuth2 credentials in their respective nodes to allow reading from and writing to the sheet and uploading videos to Drive. Execute the Workflow Once all credentials are set, trigger the workflow manually via the “When clicking ‘Execute workflow’” node. The process will run autonomously, updating the sheet and publishing the final video upon completion. 👉 Subscribe to my new YouTube channel. Here I’ll share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
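The status-check/wait-loop pattern highlighted under Key Advantages can be pictured as follows. This is an illustrative sketch, not the workflow's actual node code: `checkStatus`, the status strings, and the retry limits are assumptions; only the two header formats come from the setup steps above.

```javascript
// Illustrative wait loop for long-running AI jobs. checkStatus() would wrap an
// HTTP GET of the job's status endpoint, sent with "Authorization: Key <key>"
// (Fal.ai) or "Authorization: Bearer <key>" (RunPod), per the setup steps.
async function waitForJob(checkStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const job = await checkStatus();
    if (job.status === "COMPLETED") return job;                    // result is ready
    if (job.status === "FAILED") throw new Error("AI job failed"); // surface errors early
    await new Promise((resolve) => setTimeout(resolve, intervalMs)); // wait, then re-check
  }
  throw new Error(`Job still not complete after ${maxAttempts} checks`);
}
```

In the real workflow this logic is spread across dedicated status-check and Wait nodes rather than a single function, but the control flow is the same.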
by ConnectSafely
Send AI-personalized LinkedIn connection requests from Google Sheets using ConnectSafely.AI API Who's it for This workflow is built for sales professionals, recruiters, founders, and growth marketers who want to scale their LinkedIn outreach without sacrificing personalization. Perfect for anyone tired of sending generic connection requests that get ignored, or manually crafting individual messages for hundreds of prospects. If you're running ABM campaigns, building a sales pipeline, recruiting talent, or expanding your professional network, this automation handles the heavy lifting while keeping your outreach authentic and human. How it works The workflow automates personalized LinkedIn connection requests by combining Google Sheets prospect tracking with AI-powered message generation through ConnectSafely.ai's API. The process flow: Reads pending prospects from your Google Sheet Immediately marks them "IN PROGRESS" to prevent duplicate sends Fetches complete LinkedIn profile data via ConnectSafely.ai API Generates a personalized, authentic message using Google Gemini AI Sends the connection request with your custom message Updates your sheet with "DONE" status and stores the message sent Random delays between requests mimic human behavior and maintain LinkedIn compliance. 
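The "immediately marks them IN PROGRESS" step above is a claim-before-process guard against duplicate sends. A minimal sketch of the idea — the field names follow the sheet columns (Status, PENDING/IN PROGRESS), but the function itself is illustrative, not the workflow's actual node code:

```javascript
// Pick the first PENDING prospect and claim it immediately, so a second
// workflow run cannot select the same row while slow API calls are in flight.
function claimNextProspect(rows) {
  const next = rows.find((r) => r.Status === "PENDING");
  if (!next) return null;        // nothing left to send
  next.Status = "IN PROGRESS";   // claim before any profile fetch or AI call
  return next;
}
```

In the workflow, the claim is the "Mark as In Progress" sheet update running before "Fetch LinkedIn Profile".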
Watch the complete step-by-step implementation guide: Setup steps Step 1: Prepare Your Google Sheet Structure your Google Sheet with the following columns:

| Column Name | Description | Required |
|-------------|-------------|----------|
| First Name | Contact's first name | Optional |
| LinkedIn Url | LinkedIn profile URL or username | Yes |
| Tagline | Contact's headline/title | Optional |
| Status | Processing status (PENDING/IN PROGRESS/DONE) | Yes |
| Message | Stores the AI-generated message sent | Yes |

Sample Data Format: First Name: John LinkedIn Url: https://www.linkedin.com/in/johndoe Tagline: VP of Sales at TechCorp Status: PENDING Message: (left empty - will be filled by workflow) Pro Tip: Use LinkedIn Sales Navigator export or a prospecting tool to populate your sheet with qualified leads. Step 2: Configure ConnectSafely.ai API Credentials Obtain API Key Log into ConnectSafely.ai Dashboard Navigate to Settings → API Keys Generate a new API key Add Bearer Auth Credential in n8n Go to Credentials in n8n Click Add Credential → Header Auth or Bearer Auth Paste your ConnectSafely.ai API key Save the credential This credential is used by both the "Fetch LinkedIn Profile" and "Send Connection Request" HTTP nodes. 
Step 3: Configure Google Sheets Integration 3.1 Connect Google Sheets Account Go to Credentials → Add Credential → Google Sheets OAuth2 Follow the OAuth flow to connect your Google account Grant access to Google Sheets 3.2 Configure "Get Pending Prospect" Node Open the Get Pending Prospect node Select your Google Sheets credential Enter your Document ID (from the sheet URL) Select the Sheet Name Add a filter: Lookup Column: Status Lookup Value: PENDING Enable Return First Match Only under Options 3.3 Configure "Mark as In Progress" Node Open the Mark as In Progress node Select the same document and sheet Configure column mapping: Matching Column: row_number Status: IN PROGRESS 3.4 Configure "Mark as Complete" Node Open the Mark as Complete node Select the same document and sheet Configure column mapping: Matching Column: row_number Status: DONE Message: {{ $('Generate Personalized Message').item.json.message }} Step 4: Configure Google Gemini AI Get Gemini API Key Go to Google AI Studio Create or select a project Generate an API key Add Gemini Credential in n8n Go to Credentials → Add Credential → Google Gemini (PaLM) API Paste your API key Save the credential Connect to Google Gemini Node Open the Google Gemini node Select your Gemini credential Step 5: Customize the AI Prompt The Generate Personalized Message node contains the system prompt that controls how messages are written. Customize it for your personal brand: Open the Generate Personalized Message node Find the System Message in Options Replace the placeholder text: MY CONTEXT: [CUSTOMIZE THIS: Add your name, role, and what you're looking for in connections] With your actual information, for example: MY CONTEXT: I'm Sarah, founder of a B2B SaaS startup. I'm interested in connecting with other founders, VCs, and sales leaders to exchange ideas and explore potential partnerships. 
Update the sign-off instruction from "- [YOUR NAME]" to your actual name Step 6: Test the Workflow Add a test prospect to your Google Sheet with Status: PENDING Click the Manual Trigger (for testing) node Click Test Workflow Verify: Profile data is fetched correctly AI generates an appropriate message Connection request is sent Sheet updates to DONE with the message stored Customization Message Personalization Edit the system prompt in the Generate Personalized Message node to adjust: **Tone**: Formal, casual, or industry-specific language **Length**: Modify character limits (LinkedIn allows up to 300 characters) **Focus**: Emphasize mutual connections, shared interests, or achievements **Sign-off**: Change the signature format to match your brand Timing Adjustments Schedule Trigger: Currently set to run every minute. Adjust the interval in the **Run Every Minute** node Random Delay: The **Random Delay (1-5 min)** node adds 1-5 minutes of random wait time. Modify the formula {{ Math.floor(Math.random() * 4) + 1 }} to change the range Rate Limiting Best Practices Start with 10-20 connection requests per day Gradually increase over 2-3 weeks Never exceed 100 requests per day Consider pausing on weekends Use Cases **Sales Prospecting**: Connect with decision-makers at target accounts with personalized outreach **Recruiting**: Reach out to passive candidates with messages that reference their specific experience **Founder Networking**: Build relationships with fellow entrepreneurs, investors, and advisors **Event Follow-up**: Send personalized connection requests to conference attendees and speakers **Partnership Development**: Connect with potential partners by referencing their company achievements Troubleshooting Common Issues & Solutions Issue: AI generating messages over 300 characters **Solution**: Add an explicit character-count requirement in the system prompt; the current prompt specifies 200-250 characters Issue: "Profile not found" errors from ConnectSafely.ai **Solution**: Ensure 
LinkedIn URLs are complete (include https://www.linkedin.com/in/) Issue: Generic-sounding AI messages **Solution**: Enhance the system prompt with more specific context about your background and goals Issue: Duplicate connection requests sent **Solution**: Verify "Mark as In Progress" node runs before "Fetch LinkedIn Profile"; check that row_number column exists in your sheet Issue: Google Sheets not updating **Solution**: Confirm row_number column exists and the matching column is correctly configured Issue: Bearer Auth errors **Solution**: Verify your ConnectSafely.ai API key is valid and has proper permissions Documentation & Resources Official Documentation **ConnectSafely.ai Docs**: https://connectsafely.ai/docs **API Reference**: Available in ConnectSafely.ai dashboard **Google Gemini API**: https://ai.google.dev/docs Support Channels **Email Support**: support@connectsafely.ai **Documentation**: https://connectsafely.ai/docs **Custom Workflows**: Contact us for custom automation Connect With Us Stay updated with the latest automation tips, LinkedIn strategies, and platform updates: **LinkedIn**: linkedin.com/company/connectsafelyai **YouTube**: youtube.com/@ConnectSafelyAI-v2x **Instagram**: instagram.com/connectsafely.ai **Facebook**: facebook.com/connectsafelyai **X (Twitter)**: x.com/AiConnectsafely **Bluesky**: connectsafelyai.bsky.social **Mastodon**: mastodon.social/@connectsafely Need Custom Workflows? Looking to build sophisticated LinkedIn automation workflows tailored to your business needs? Contact our team for custom automation development, strategy consulting, and enterprise solutions. We specialize in: Multi-channel engagement workflows AI-powered personalization at scale Lead scoring and qualification automation CRM integration and data synchronization Custom reporting and analytics pipelines
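One recurring issue above is AI messages exceeding LinkedIn's 300-character limit. Besides tightening the prompt, a defensive trim can be added in a Code node before sending. This is a hedged sketch — the function name and word-boundary strategy are my own, not part of the workflow:

```javascript
// Trim an over-long message to the platform limit, cutting at the last
// word boundary so the result never ends mid-word.
function enforceLimit(message, max = 300) {
  if (message.length <= max) return message;
  const cut = message.slice(0, max);
  const lastSpace = cut.lastIndexOf(" ");
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut).trimEnd();
}
```

Fixing the prompt is still the better first step; this only guarantees the request is never rejected for length.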
by Vincent Nguyen
This n8n template shows how to fully automate your social media workflow from Google Sheets to X, Threads, LinkedIn, Facebook, and Instagram — with AI-generated visuals. Instead of manually writing, designing, and posting content, this workflow turns a single Google Sheet row into multi-platform posts plus a custom AI image that matches your message. You write once. n8n posts everywhere. How it works A scheduled trigger checks your Google Sheet for content marked “Ready To Post.” The post text is pulled from Sheets and sent to all text-only platforms (X and Threads). At the same time, an AI agent analyzes your post and generates a high-quality image prompt. GPT-Image creates a custom visual based on that prompt. The image is uploaded to ImgBB to create a shareable URL. That image + your caption are automatically posted to LinkedIn, Facebook, and Instagram. Finally, your Google Sheet is updated so you never repost the same content twice. How to use Add your posts to Google Sheets in the Content column. Set Status = Ready To Post when you want it published. Make sure all platform credentials are connected in n8n. Turn the workflow on and let it run automatically. What you get Hands-free posting to 5 platforms AI-generated visuals for every post Zero manual design work Built-in error handling No duplicate posts Requirements Google Sheets access X (Twitter) account Threads account LinkedIn account Facebook & Instagram Page access ImgBB account OpenAI credentials for image generation Need help? Ask in the n8n Forum or shoot me a DM on LinkedIn Happy automating 🚀
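The split described above — text-only platforms get the caption immediately, image platforms wait for the generated visual — can be sketched as a simple fan-out. The platform grouping is taken from the description; the function itself is illustrative, since the actual workflow does this with parallel branches:

```javascript
// Build one outgoing post per platform: X and Threads are text-only,
// while LinkedIn, Facebook, and Instagram also carry the AI image URL.
function fanOut(caption, imageUrl) {
  const textOnly = ["X", "Threads"].map((platform) => ({ platform, text: caption }));
  const withImage = ["LinkedIn", "Facebook", "Instagram"].map((platform) => ({
    platform,
    text: caption,
    image: imageUrl, // shareable URL produced by the ImgBB upload step
  }));
  return [...textOnly, ...withImage];
}
```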
by Tony Adijah
Who is this for This workflow is built for sales teams, agencies, and small businesses that receive inbound leads via WhatsApp and want to automate their first response, lead qualification, and CRM logging — without missing a single message. What this workflow does It listens for incoming WhatsApp messages, uses an AI agent to classify each message by intent (hot lead, warm lead, support, or needs qualification), sends a tailored auto-reply, logs every interaction to Google Sheets, and automatically books Google Calendar meetings with Meet links for qualified leads. How it works WhatsApp Trigger receives incoming messages and filters out bot/status messages to prevent loops. AI Agent (powered by Ollama or any connected LLM) classifies the message into one of four intent categories with confidence scoring. Smart Router directs each intent down a dedicated path. Hot & Warm Leads receive an instant reply, get logged to Google Sheets, have a Google Calendar meeting auto-booked, and receive the Meet link via WhatsApp. Support requests are logged and receive a ticket confirmation. Vague or incomplete messages trigger a smart follow-up question. Conversation memory ensures the AI re-classifies correctly when the user replies with more context. Setup steps Connect your WhatsApp Business API credentials (Meta Cloud API). Connect Google Sheets OAuth and set your spreadsheet ID in all three logging nodes. Connect Google Calendar OAuth and select your calendar in both booking nodes. Configure your LLM (Ollama endpoint, OpenAI, or any supported model). Update the BOT_NUMBERS array in the "Parse WhatsApp Message" node to match your WhatsApp Business phone number ID. Update the phoneNumberId in all WhatsApp Send nodes to your number. Send a test message and verify the full flow. 
Requirements WhatsApp Business API (Meta Cloud API) access Google Sheets and Google Calendar accounts with OAuth credentials An LLM endpoint (Ollama, OpenAI, or any n8n-supported model) n8n instance (cloud or self-hosted) How to customize Swap the AI model in the Ollama Chat Model node for OpenAI, Anthropic, or any supported LLM. Edit the auto-reply templates in each Reply code node to match your brand voice. Adjust meeting booking times (default: Hot = 2 hours out, Warm = 4 hours out). Add Slack or email notifications by branching from the Google Sheets logging nodes. Modify the AI classification prompt to add custom intent categories for your business.
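The Smart Router step can be pictured as a lookup from the AI agent's intent label to a handling path. A sketch covering the four categories described above — the label strings and route descriptions are assumptions, not the workflow's actual values:

```javascript
// Map a classified intent to its downstream path; anything unrecognized
// falls back to the qualification follow-up, mirroring the "vague or
// incomplete messages" branch.
function routeIntent(classification) {
  const routes = {
    hot_lead: "instant-reply + log + book-meeting",
    warm_lead: "instant-reply + log + book-meeting",
    support: "log + ticket-confirmation",
    needs_qualification: "follow-up-question",
  };
  return routes[classification.intent] ?? "follow-up-question";
}
```

In n8n this is typically a Switch node keyed on the AI output; the confidence score mentioned above could additionally gate low-confidence results into the follow-up path.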
by Oneclick AI Squad
This workflow ingests student profiles from a form submission or CRM, loads the active scholarship catalogue, uses Claude AI to score each student's eligibility against every available scholarship, filters strong matches, and automatically notifies eligible candidates with personalised application guidance. How it works Trigger — Form submission webhook or nightly scheduled batch run Load Student Profile — Fetches or normalises the student's academic and personal data Load Scholarship Catalogue — Pulls active scholarships from Airtable / Google Sheets Pair Students × Scholarships — Builds evaluation pairs for AI scoring AI Eligibility Scoring — Claude AI scores fit, flags eligibility, ranks scholarships Parse & Rank Results — Extracts structured scores, sorts by match strength Filter Qualified Matches — Keeps scholarships above a configurable match threshold Check Deadline Urgency — Flags scholarships closing within 14 days Personalise Notification — Builds a tailored email per student with top matches Send Student Email — Dispatches the personalised scholarship digest Notify Advisor on Slack — Alerts the academic advisor for high-value matches Update CRM Record — Writes matched scholarships back to the Airtable student record Log to Audit Sheet — Appends the full match report to Google Sheets Return API Response — Returns structured match results to the caller Setup Steps Import workflow into n8n Configure credentials: Anthropic API — Claude AI for eligibility scoring Airtable — Student profiles and scholarship catalogue Google Sheets — Audit and match history log Slack OAuth — Academic advisor notifications SendGrid / SMTP — Student notification emails Set your Airtable base ID and table names for students and scholarships Configure match threshold (default: 70) in the filter node Set urgency window (default: 14 days) in the deadline check node Add your Slack advisor channel ID Activate the workflow Sample Webhook Payload

    {
      "studentId": "STU-2025-4821",
      "firstName": "Priya",
      "lastName": "Sharma",
      "email": "priya.sharma@university.edu",
      "gpa": 3.8,
      "major": "Computer Science",
      "yearOfStudy": 2,
      "nationality": "Indian",
      "residencyStatus": "International",
      "financialNeed": true,
      "extracurriculars": ["Robotics Club", "Volunteer Tutor"],
      "achievements": ["Dean's List 2024", "Hackathon Winner"],
      "intendedCareer": "AI Research",
      "disabilities": false,
      "firstGenStudent": true
    }

Scholarship Criteria Evaluated by Claude AI **Academic Merit** — GPA, honours, academic awards **Field of Study** — Major/discipline alignment **Financial Need** — Demonstrated need indicators **Demographic Eligibility** — Nationality, residency, gender, Indigenous status **Year of Study** — Undergraduate, postgraduate, PhD level **Extracurricular Profile** — Leadership, community service, sports **Career Alignment** — Intended career path vs scholarship mission **Special Circumstances** — First-gen, disability support, regional background Features Batch processing of the entire student cohort nightly AI-powered multi-criteria eligibility scoring (0–100) Deadline urgency detection and priority flagging Personalised email with ranked scholarship list and tips Academic advisor Slack digest for high-value matches Full audit trail in Google Sheets Airtable CRM auto-update with matched scholarships Explore More Automation: Contact us to design AI-powered content engagement and multi-platform reply workflows tailored to your growth strategy.