by Gilbert Onyebuchi
Automatically turn your Google Calendar into a fully automated notification system with email alerts, SMS reminders, and a live performance dashboard - all powered by n8n. This automation helps you never miss an event while giving you clear visibility into which notifications were sent, when, and how reliably they ran.

## What This Automation Does

This solution is built as 4 connected workflows that run on a schedule and work together:

**1. Daily Email Summary (Morning)**
Every morning, the workflow:
- Reads today's events from Google Calendar
- Formats them into a clean email
- Sends a daily schedule summary via Mailchimp or SendGrid

**2. Daily SMS Summary**
Shortly after, it:
- Sends a concise SMS overview of today's meetings using Twilio

**3. 15-Minute Event Reminders**
Before each event:
- Sends an individual SMS reminder
- Skips all-day events automatically

**4. Weekly Schedule Preview**
Every Sunday:
- Sends a week-ahead summary so you can plan in advance

## Live Reporting & Dashboard

All workflow activity is logged automatically into Google Sheets, which powers a real-time analytics dashboard showing:
- Number of notifications sent
- Success vs. failure rates
- Daily and weekly execution stats
- Visual charts powered by Chart.js

No manual tracking needed - everything updates automatically.

## How the Workflow Is Structured

The automation is grouped into 3 clear sections:

**Section 1: Calendar Data Collection**
- Pulls events from Google Calendar
- Filters relevant meetings
- Prepares clean event data

**Section 2: Notifications & Messaging**
- Formats emails and SMS messages
- Sends reminders and summaries
- Handles scheduling logic

**Section 3: Logging & Reporting**
- Saves every execution to Google Sheets
- Updates daily stats automatically
- Feeds the live dashboard

## Support & Feedback

Questions or issues? Connect with me on LinkedIn.
Want to see it in action? Try the live report demo: Click here
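As a rough illustration of the 15-minute reminder step, here is a minimal sketch (not the template's actual Code node) of how an n8n Code node could skip all-day events and pick out events starting soon. It assumes the Google Calendar event shape, where all-day events carry `start.date` and timed events carry `start.dateTime`:

```javascript
// Sketch of the reminder filter: skip all-day events and keep events
// starting within the next `windowMinutes` minutes.
// Assumes Google Calendar's event shape: all-day events have `start.date`,
// timed events have `start.dateTime`.
function eventsDueForReminder(events, now = new Date(), windowMinutes = 15) {
  return events.filter((ev) => {
    if (!ev.start || !ev.start.dateTime) return false; // all-day events lack dateTime
    const start = new Date(ev.start.dateTime);
    const minutesUntil = (start - now) / 60000;
    return minutesUntil > 0 && minutesUntil <= windowMinutes;
  });
}
```

In the workflow, the kept events would each feed an individual Twilio SMS node.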
by Andrey
## Overview

This n8n workflow automates brand monitoring across social media platforms (Reddit, LinkedIn, X, and Instagram) using the AnySite API. It searches for posts mentioning your defined keywords, stores the results in n8n Data Tables, analyzes engagement and sentiment, and generates a detailed AI-powered social media report that is automatically sent to your email.

## Key Features

- **Multi-Platform Monitoring:** Reddit, LinkedIn, X (Twitter), and Instagram
- **Automated Post Collection:** Searches for new posts containing tracked keywords
- **Data Persistence:** Saves all posts and comments in structured Data Tables
- **AI-Powered Reporting:** Uses GPT (OpenAI API) to summarize and analyze trends, engagement, and risks
- **Automated Email Delivery:** Sends comprehensive daily/weekly reports via Gmail
- **Comment Extraction:** Collects and formats post comments for deeper sentiment analysis
- **Scheduling Support:** Can be executed manually or automatically (e.g., every night)

## How It Works

### Triggers

The workflow runs:
- Automatically (via Schedule Trigger) - e.g., once daily
- Manually (via Manual Trigger) - for testing or on-demand analysis

### Data Collection Process

1. **Keyword Loading:** Reads all keywords from the Data Table "Brand Monitoring Words"
2. **Social Media Search:** For each keyword, the workflow calls the AnySite API endpoints:
   - api/reddit/search/posts
   - api/linkedin/search/posts
   - api/twitter/search/posts (X)
   - api/instagram/search/posts
3. **Deduplication:** Before saving, checks whether a post already exists in the "Brand Monitoring Posts" table
4. **Data Storage:** Inserts new posts into the Data Table with fields like type, title, url, vote_count, comment_count, etc.
5. **Comments Enrichment:** For Reddit and LinkedIn, retrieves and formats comments into JSON strings, then updates the record
6. **AI Analysis & Report Generation:** The AI Agent (OpenAI GPT model) aggregates posts; analyzes sentiment, engagement, and risks; and generates a structured HTML email report
7. **Email Sending:** Sends the final report via Gmail using your connected account
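The deduplication step above can be pictured with a small sketch (assumed shapes, not the template's exact code): posts already stored in "Brand Monitoring Posts" are looked up by their `post_id`, and only unseen posts pass through to the insert step.

```javascript
// Sketch of the deduplication step: keep only posts whose post_id is not
// already present in the stored rows.
function filterNewPosts(fetchedPosts, existingRows) {
  const known = new Set(existingRows.map((row) => row.post_id));
  return fetchedPosts.filter((post) => !known.has(post.post_id));
}
```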
## Setup Instructions

### Requirements

- Self-hosted or cloud n8n instance
- **AnySite API key** - https://AnySite.io
- **OpenAI API key** (GPT-4o or later)
- Connected Gmail account (for report delivery)

### Installation Steps

1. **Import the workflow**
   Import the provided file: Social Media Monitoring.json
2. **Configure credentials**
   - AnySite API: Add an access-token header with your API key
   - OpenAI: Add your OpenAI API key in the "OpenAI Chat Model" node
   - Gmail: Connect your Gmail account (OAuth2) in the "Send a message in Gmail" node
3. **Create required Data Tables**

1️⃣ Brand Monitoring Words

| Field | Type | Description |
|-------|------|-------------|
| word | string | Keyword or brand name to monitor |

> Each row represents a single keyword to be tracked.

2️⃣ Brand Monitoring Posts

| Field | Type | Description |
|-------|------|-------------|
| type | string | Platform type (e.g., reddit, linkedin, x, instagram) |
| title | string | Post title or headline |
| url | string | Direct link to post |
| created_at | string | Post creation date/time |
| subreddit_id | string | (Reddit only) subreddit ID |
| subreddit_alias | string | (Reddit only) subreddit alias |
| subreddit_url | string | (Reddit only) subreddit URL |
| subreddit_description | string | (Reddit only) subreddit description |
| comment_count | number | Number of comments |
| vote_count | number | Votes, likes, or reactions count |
| subreddit_member_count | number | (Reddit only) member count |
| post_id | string | Unique post identifier |
| text | string | Post body text |
| comments | string | Serialized comments (JSON string) |
| word | string | Matched keyword that triggered capture |

## AI Reporting Logic

- Collects all posts gathered during the run
- Aggregates by keyword and platform
- Evaluates sentiment, engagement, and risk signals
- Summarizes findings with an executive summary and key metrics
- Sends the Social Media Intelligence Report to your configured email

## Customization Options

- **Schedule:** Adjust the trigger frequency (daily, hourly, etc.)
- **Keywords:** Add or remove keywords in the **Brand Monitoring Words** table
- **Report Depth:** Modify system prompts in the "AI Agent" node to customize tone and analysis focus
- **Email Recipient:** Change the target email address in the "Send a message in Gmail" node

## Troubleshooting

| Issue | Solution |
|-------|----------|
| No posts found | Check AnySite API key and keyword relevance |
| Duplicate posts | Verify Data Table deduplication setup |
| Report not sent | Confirm Gmail OAuth2 connection |
| AI Agent error | Ensure OpenAI API key and model selection are correct |

## Best Practices

- Use specific brand or product names as keywords for better precision
- Run the workflow daily to maintain fresh insights
- Periodically review and clean the Data Tables
- Adjust AI prompt parameters to refine the analytical tone
- Review AI-generated reports to ensure data quality

## Author Notes

Created for automated cross-platform brand reputation monitoring, enabling real-time insight into how your brand is discussed online.
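Since the `comments` column stores a JSON string, the comments-enrichment step can be sketched roughly as below. The `author`/`text` field names are assumptions for illustration, not the template's actual schema:

```javascript
// Sketch of comments enrichment: flatten fetched comments into a JSON
// string so they fit the single `comments` text field of the Data Table.
// Field names (author, text) are illustrative assumptions.
function serializeComments(comments) {
  return JSON.stringify(
    comments.map((c) => ({ author: c.author, text: c.text }))
  );
}
```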
by Takumi Oku
## Who's it for

This template is designed for Print-on-Demand (POD) business owners, independent artists, and e-commerce managers who want to automate the process of turning raw design files into listed products without manual data entry.

## How it works

This workflow acts as an automated merchandise factory that handles everything from image processing to marketing.

1. **Trigger:** The workflow starts when a new design file is uploaded to a specific Google Drive folder.
2. **Analyze:** OpenAI Vision analyzes the image to determine the subject, mood, and color palette, and assesses copyright risk.
3. **Process:** The image background is removed using Remove.bg, and the clean asset is uploaded to Cloudinary.
4. **Mockup:** The workflow generates realistic product mockups (e.g., T-shirts, tote bags) by overlaying the design onto base product images using Cloudinary transformations.
5. **Copywriting:** OpenAI writes an SEO-friendly product title, description, and tags based on the visual analysis.
6. **Draft:** A draft product is created in Shopify with the generated details and mockup image.
7. **Approval:** A message is sent to Slack with the product details and mockup. The workflow pauses and waits for a human to click "Approve" or "Reject".
8. **Publish & Promote:** If approved, the product is published to Shopify and automatically posted to Instagram and Pinterest. If rejected, a notification is sent to Slack.

## How to set up

1. **Base Images:** Upload your blank product images (e.g., a white t-shirt, a tote bag) to your Cloudinary account and note their Public IDs.
2. **Configuration:** Open the Workflow Configuration node and fill in all the required fields, including your API keys and the Cloudinary Public IDs for your base products.
3. **Credentials:** Configure the credentials for Google Drive, OpenAI, Shopify, Slack, Instagram, and Pinterest in their respective nodes.
4. **Folder ID:** Update the Google Drive Trigger node with the ID of the folder you want to watch.
## Requirements

- n8n (self-hosted or Cloud)
- Google Drive account
- OpenAI API key (access to the GPT-4o model recommended for Vision capabilities)
- Remove.bg API key
- Cloudinary account
- Shopify store
- Slack workspace
- Instagram Business account
- Pinterest account

## How to customize

- **Mockups:** Modify the Code - Generate Mockup URLs node to add more product types (e.g., hoodies, mugs) by adding their Cloudinary Public IDs.
- **Prompt Engineering:** Adjust the system prompt in the OpenAI - SEO Copywriting node to match your brand voice or language style.
- **Social Channels:** Add or remove nodes to support other platforms like Twitter (X) or Facebook Pages.
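For orientation on the mockup step: Cloudinary overlays are expressed as URL transformation components (the `l_` layer parameter). A minimal sketch of what the Code - Generate Mockup URLs node might build is shown below; the cloud name and Public IDs are hypothetical, and the exact transformation chain in the template may differ.

```javascript
// Sketch of building a Cloudinary mockup URL: overlay a design (l_<publicId>)
// onto a base product image, sized and centered via URL parameters.
// Cloud name and Public IDs here are placeholders for illustration.
function mockupUrl(cloudName, basePublicId, designPublicId, widthPx) {
  const transform = `l_${designPublicId},w_${widthPx},g_center`;
  return `https://res.cloudinary.com/${cloudName}/image/upload/${transform}/${basePublicId}.jpg`;
}
```

Adding a new product type then amounts to adding another base Public ID and calling this helper once more.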
by Michael Gullo
## Workflow Purpose

The workflow is designed to scan submitted URLs using urlscan.io and VirusTotal, combine the results into a single structured summary, and send the report via Telegram. I built this workflow for people who primarily work from their phones and receive a constant stream of emails throughout the day. If a user gets an email asking them to sign a document, review a report, or take any action where the link looks suspicious, they can simply open the Telegram bot and quickly check whether the URL is safe before clicking it.

## Key Components

**1. Input / Trigger**
- Accepts URLs that need to be checked.
- Initiates requests to VirusTotal and urlscan.io.

**2. VirusTotal Scan**
- Always returns results if the URL is reachable.
- Provides reputation, malicious/clean flags, and scan metadata.

**3. urlscan.io Scan**
- Returns details on how the URL behaves when loaded (domains, requests, resources, etc.).
- Sometimes fails due to blocks or restrictions.

**4. Error Handling with Code Node**
- Checks whether urlscan.io responded successfully.
- Ensures the workflow always produces a summary, even if urlscan.io fails.

**5. Summary Generation**
- If both scans succeed → summarizes the combined findings from VirusTotal + urlscan.io.
- If urlscan.io fails → states clearly in the summary: "urlscan.io scan was blocked/failed. Relying on VirusTotal results."
- Ensures the user still gets a complete security report.

**6. Telegram Output**
- The final formatted summary is delivered to a Telegram chat via the bot.
- The chat ID issue was fixed after the Code Node restructuring.

## Outcome

The workflow now guarantees a consistent, user-friendly summary regardless of urlscan.io failures. It leverages VirusTotal as the fallback source of truth. The Telegram bot provides real-time alerts with clear indications of scan success/failure.

## Prerequisites

### Telegram

1. In Telegram, start a chat with @BotFather.
2. Send /newbot, then pick a name and a unique username.
3. Copy the HTTP API token BotFather returns (store it securely).
4. Start a DM with your bot and send any message.
5. Call getUpdates and read the chat.id.

### urlscan.io

1. Create or log into your urlscan.io account.
2. Go to Settings & API → New API key and generate a key.
3. (Recommended) In Settings & API, set Default Scan Visibility to Unlisted to avoid exposing PII in public scans.
4. Save the key securely (env var or n8n Credentials).

Rate limits note: urlscan.io enforces per-minute/hour/day quotas; exceeding them returns HTTP 429. You can view your personal quotas on their dashboard/quotas endpoint.

### VirusTotal

1. Sign up or sign in to VirusTotal Community.
2. Open My API key (Profile menu) and copy your Public API key.
3. Store it securely (env var or n8n Credentials).
4. For a more reliable connection with VirusTotal and improved scanning results, enable the Header section in the node settings. Add a header parameter with a clear name (e.g., x-apikey), then paste your API key into the Value field.

Rate limits (Public API): 4 requests/minute, 500/day; not for commercial workflows. Consider Premium if you'll exceed this.

## How to Customize the Workflow

This workflow is designed to be highly customizable, allowing you to adapt it to your specific needs and use cases. For example, additional malicious-website scanners can be integrated through HTTP Request nodes; to make this work, simply update the Merge node so that all information flows correctly through the workflow. You can also connect Gmail or Outlook nodes to automatically test URLs, binary attachments, and other types of information received via email - helping you evaluate data before opening it. Finally, you can customize how you receive reports: results can be sent through Telegram (as in the default setup), Slack, or Microsoft Teams, or saved to Google Drive or a Google Sheet for recordkeeping and audit purposes.
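The error-handling Code node described above can be sketched as follows. The result shapes (`malicious`, `totalEngines`, `verdict`, `error`) are assumptions for illustration; the point is that one summary object is always produced, whether or not urlscan.io succeeded:

```javascript
// Sketch of the fallback logic: build one summary object either way, so the
// Telegram step always has something to send. Field names are illustrative.
function buildSummary(vtResult, urlscanResult) {
  const summary = {
    url: vtResult.url,
    virustotal: { malicious: vtResult.malicious, total: vtResult.totalEngines },
  };
  if (urlscanResult && !urlscanResult.error) {
    summary.urlscan = { verdict: urlscanResult.verdict };
    summary.note = 'Combined findings from VirusTotal + urlscan.io.';
  } else {
    summary.note = 'urlscan.io scan was blocked/failed. Relying on VirusTotal results.';
  }
  return summary;
}
```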
For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## WordPress to Blotato Social Publisher

**Overview:** This automation monitors your WordPress site for new posts and automatically creates platform-specific social media content using AI, then posts to Twitter, LinkedIn, and Facebook via Blotato.

**What it does:**
- Monitors the WordPress site for new posts every 30 minutes
- Filters posts published in the last hour to avoid duplicates
- Processes each new post individually
- AI generates optimized content for each social platform (Twitter, LinkedIn, Facebook)
- Extracts platform-specific content from the AI response
- Publishes to all three social media platforms via the Blotato API

**Setup Required:**

1. **WordPress Connection**
   - Configure WordPress credentials in the "Check New Posts" node
   - Enter your WordPress site URL, username, and password/app password
2. **Blotato Social Media API Setup**
   - Get your Blotato API key from your Blotato account
   - Configure API credentials in the Blotato connection node
   - Map each platform (Twitter, LinkedIn, Facebook) to the correct Blotato channel
3. **AI Configuration**
   - Set up Google Gemini API credentials
   - Connect the Gemini model to the "AI Social Content Creator" node

**Customization Options:**
- Posting Frequency: Modify the schedule trigger (default: every 30 minutes)
- Content Tone: Adjust the AI system message for different writing styles
- Post Filtering: Change the time window in the WordPress node (default: last hour)
- Platform Selection: Remove any social media platforms you don't want to use

**Testing:**
- Run the workflow manually to test connections
- Verify posts appear correctly on all platforms
- Monitor for API rate-limit issues

**Features:**
- Platform-optimized content (hashtags, character limits, professional tone)
- Duplicate prevention system
- Batch processing for multiple posts
- Featured image support
- Customizable posting frequency

**Customization:**
- Change monitoring frequency
- Adjust AI prompts for different tones
- Add/remove social platforms
- Modify hashtag strategies

**Need Help?** For n8n coaching or one-on-one consultation
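The "last hour" duplicate-avoidance filter can be sketched roughly as below, assuming the WordPress REST API's `date_gmt` field (ISO 8601 without a timezone suffix, so `Z` is appended to parse it as UTC):

```javascript
// Sketch of the post filter: keep only posts published within the last hour.
// Assumes WordPress REST's `date_gmt` field, which omits the timezone suffix.
function postsFromLastHour(posts, now = new Date()) {
  const oneHourAgo = now.getTime() - 60 * 60 * 1000;
  return posts.filter((p) => new Date(p.date_gmt + 'Z').getTime() >= oneHourAgo);
}
```

Widening the time window (per the customization options) is just a matter of changing the cutoff.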
by BluePro
This workflow monitors targeted subreddits for potential sales leads using Reddit's API, AI content analysis, Supabase, and Google Sheets. It is built specifically to discover posts from Reddit users who may benefit from a particular product or service, and it can be easily customized for any market.

## 🔍 Features

- **Targeted Subreddit Monitoring:** Searches multiple niche subreddits like smallbusiness, startup, sweatystartup, etc., using relevant keywords.
- **AI-Powered Relevance Scoring:** Uses OpenAI GPT to analyze each post and determine whether it was written by someone who might benefit from your product, returning a simple "yes" or "no."
- **Duplicate Lead Filtering with Supabase:** Ensures you don't email the same lead more than once by storing already-processed Reddit post IDs in a Supabase table.
- **Content Filtering:** Filters out posts with no body text or no upvotes so that only high-quality content is processed.
- **Lead Storage in Google Sheets:** Saves qualified leads into a connected Google Sheet with key data (URL, post content, subreddit, and timestamp).
- **Email Digest Alerts:** Compiles relevant leads and sends a daily digest of matched posts to your team's inbox for review or outreach.
- **Manual or Scheduled Trigger:** Can be triggered manually or automatically (via the built-in Schedule Trigger node).

## ⚙️ Tech Stack

- **Reddit API** - for post discovery
- **OpenAI Chat Model** - for AI-based relevance filtering
- **Supabase** - for lead de-duplication
- **Google Sheets** - for storing lead details
- **Gmail API** - for sending email alerts

## 🔧 Customization Tips

- **Adjust Audience:** Modify the subreddits and keywords in the initial Code node to match your market.
- **Change the AI Prompt:** Customize the prompt in the "Analysis Content by AI" node to describe your product or service.
- **Search Comments Instead:** To monitor comments instead of posts, change type=link to type=comment in the Reddit Search node.
- **Change Email Recipients:** Edit the Gmail node to direct leads to a different email address or format.
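The content-filtering step (drop posts with no body text or no upvotes) maps directly onto Reddit's API fields, where the post body is `selftext` and the upvote count is `ups`. A minimal sketch:

```javascript
// Sketch of the content filter: drop posts with an empty body (selftext)
// or zero upvotes before they reach the AI relevance check.
function filterQualityPosts(posts) {
  return posts.filter((p) => p.selftext && p.selftext.trim() !== '' && p.ups > 0);
}
```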
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Create and Auto-Post Viral AI Videos with VEO3 and Blotato to 9 Platforms

### Who is this for?

This template is ideal for content creators, growth marketers, e-commerce entrepreneurs, and video-first brands who want to automate the creation and multi-platform distribution of viral short-form ads using AI. If you're looking to scale video production without editing tools or manual posting, this is for you.

### What problem is this workflow solving?

Creating high-converting video content is time-consuming. You need to:
- Come up with ideas
- Write compelling scripts
- Generate visuals
- Adapt content for each platform
- Manually publish and track results

This workflow automates that entire process and turns a single idea into a ready-to-publish video campaign across 9 platforms.

### What this workflow does

1. Triggers via Telegram when a new video idea is submitted
2. Fetches parameters (style, tone, duration) from Google Sheets
3. Generates the video script using GPT-4 and a master AI prompt
4. Creates the video using Google's VEO3 video generation model
5. Downloads the final video once rendering is complete
6. Rewrites the caption with GPT-4o for platform-optimized posting
7. Logs the result in Google Sheets for tracking
8. Sends preview links to Telegram for review
9. Auto-posts the video to 9 platforms using Blotato (TikTok, YouTube, Instagram, Threads, Facebook, X, LinkedIn, Pinterest, Bluesky)

### Setup

1. Install n8n (self-hosted) with Community Nodes enabled
2. Connect your Telegram Bot Token to the trigger node
3. Add your OpenAI API key for the GPT-4 and GPT-4o models
4. Configure your VEO3 API access (Google AI Studio)
5. Set up Blotato with your platform tokens & IDs
6. Link your Google Sheets and set the expected column structure (idea, style, caption, etc.)
7. Adjust the Telegram trigger format (e.g., idea: ...) to your team's input style

### How to customize this workflow to your needs

- Edit the master prompt to match your brand voice or industry
- Replace the caption prompt to generate more marketing-style hooks
- Modify the platform list if you only publish to a few specific channels
- Add approval steps (Slack, email, Telegram) before publishing
- Integrate tracking by pushing published URLs to your analytics or CRM

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: Linkedin / Youtube
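For the Telegram trigger format, here is a minimal sketch (not the template's actual expression) of parsing an incoming `idea: ...` message so off-format messages can be ignored; the example message is hypothetical:

```javascript
// Sketch of parsing the Telegram trigger text: extract the idea from a
// message of the form "idea: <text>", or return null for other messages.
function parseIdeaMessage(text) {
  const match = /^idea:\s*(.+)$/i.exec(text.trim());
  return match ? match[1].trim() : null;
}
```

Changing the expected prefix (step 7 above) only means adjusting the regular expression.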
by Davide
This workflow automates the process of creating short video clips from a YouTube video based on specific content requested by the user. This is a complete AI-powered video clipping and distribution system, turning any YouTube video into ready-to-publish short-form content automatically.

**IMPORTANT:** This workflow is quite complex: it requires integrating multiple APIs, including YouTube Transcript, RapidAPI for video download, OpenAI GPT, Fal.run for video processing, Google Drive, FTP, and various social media platforms (TikTok, YouTube, Instagram via Postiz and Upload-Post). But despite the complexity, the core idea is brilliantly simple and elegant: you provide a YouTube video and a prompt describing what you're looking for, and the workflow automatically finds the exact moment in the video, extracts that segment, and publishes it as a short clip across all your social channels.

## Key Benefits

1. ✅ **Fully Automated Content Repurposing** - No manual editing is required; the workflow automatically extracts and creates clips from long-form content.
2. ✅ **AI-Powered Precision** - The use of an LLM ensures that the exact relevant moment in the video is identified based on meaning, not just keywords.
3. ✅ **Multi-Platform Distribution** - The workflow doesn't just create clips: it also publishes them across multiple platforms (TikTok, YouTube, Instagram), saving time and effort.
4. ✅ **Scalability** - You can process multiple videos and prompts, making it ideal for content creators, agencies, and social media automation.
5. ✅ **Time Efficiency** - What normally takes 30-60 minutes of manual editing is reduced to a fully automated flow that runs in minutes.
6. ✅ **Modular & Extendable** - Each step (transcript, AI analysis, trimming, publishing) is modular, so you can replace APIs, add subtitles, add captions or branding, or integrate with other tools.
7. ✅ **Centralized Media Management** - Files are automatically stored in Google Drive and a CDN (FTP), ensuring easy access and distribution.
8. ✅ **Error Handling & Validation** - The workflow checks whether the content is found in the transcript and avoids unnecessary processing if not.

## How it works

This workflow automates the process of creating a short video clip from a YouTube video based on a specific user prompt. It follows these steps:

1. **Input & Transcript Retrieval:** The user provides a YouTube Video ID and a search prompt. The workflow first calls the youtube-transcript.io API to fetch the video's transcript.
2. **AI-Powered Time Extraction:** The transcript and the user's prompt are sent to an OpenAI GPT model (gpt-4.1-mini) with a structured prompt. The LLM analyzes the transcript to find the exact segment matching the prompt and outputs a JSON object containing the start_time, end_time, and duration of the matching clip.
3. **Video Download & Hosting:** Simultaneously, the workflow downloads the full video file using a RapidAPI YouTube downloader. It then uploads this file to an FTP server (BunnyCDN) to generate a public, accessible URL.
4. **Clip Generation (Fal.run):** Using the video_url (from the FTP server) and the timecodes from the AI, the workflow sends a request to a Fal.run workflow utility (trim-video) to process and create the short video clip. It then polls the status endpoint until processing is completed.
5. **Clip Distribution:** Once the clip is ready, the workflow downloads the final video file and uploads it to multiple destinations:
   - Storage: Google Drive and an FTP server (BunnyCDN).
   - Social Media: Directly to TikTok, YouTube, and Instagram (via the Postiz API) using the upload-post.com and Postiz APIs.

## Set up steps

To get this workflow running, you need to configure several external API keys and credentials.

**API Keys & Credentials:**
- RapidAPI: Obtain an API key for the "Youtube Video Fast Downloader 24-7" API. Insert this key in the "Download Video" node's header parameters.
- Youtube Transcript API: Get an API key from youtube-transcript.io and configure it in the "Generate transcript" node's HTTP Header Auth.
- OpenAI: Add your OpenAI API key in the "OpenAI Chat Model" node credentials.
- Fal.run: Add your Fal.run API key in the "Video Dubbing" and "Get final video url" nodes under HTTP Header Auth.

**Storage & Hosting:**
- Google Drive: Set up OAuth2 credentials for Google Drive. The node is configured to upload to a specific folder ID (ensure this folder exists or update the ID).
- FTP BunnyCDN: Configure FTP credentials (host, username, password) in the "Upload to FTP" and "Upload to FTP server" nodes. The paths are set to /n3wstorage/test/. Update these paths and credentials to match your server.

**Social Media Integration (Optional):**
- Postiz (Instagram): Set up credentials for Postiz in the "Upload to Instagram" node and update the integrationId and content fields with your specific Instagram account details.
- Upload-Post.com (TikTok & Youtube): Obtain an API key for upload-post.com and configure it in the "Upload to TikTok" and "Upload to Youtube" nodes. You must also update the title, user, and platform[] parameters (currently placeholders like SET_TITLE and YOUR_USERNAME) with your actual account data.

**Workflow Variables:** Ensure the "Edit Fields" node contains the correct VIDEO ID and PROMPT for testing.

👉 Subscribe to my new YouTube channel. Here I'll share videos and Shorts with practical tutorials and FREE templates for n8n.

Need help customizing? Contact me for consulting and support, or add me on Linkedin.
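The validation described under "Error Handling & Validation" can be sketched roughly as below: the LLM's timecode JSON is parsed and checked before the trim request is made, so a malformed or "segment not found" answer never triggers unnecessary video processing. The exact output contract is assumed from the start_time/end_time/duration fields described above.

```javascript
// Sketch of validating the LLM's timecode output before calling the
// Fal.run trim-video utility. Returns null if the answer is unusable.
function parseClipTimes(llmOutput) {
  let clip;
  try {
    clip = JSON.parse(llmOutput);
  } catch (e) {
    return null; // not valid JSON
  }
  const { start_time, end_time } = clip;
  if (typeof start_time !== 'number' || typeof end_time !== 'number' || end_time <= start_time) {
    return null; // missing or inconsistent timecodes
  }
  return { start_time, end_time, duration: end_time - start_time };
}
```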
by Samyotech
## Social Media Posting Automation with Image and Caption

### How it works

This AI-powered workflow streamlines your social media posting process, transforming hours of manual caption writing, image uploading, and scheduling into a fully automated system. You define the topic and image once, and the workflow handles caption generation, review, approval, and posting to your selected platforms.

### Automated Flow

1. **Generate Caption** - Trigger the workflow manually and set your post topic and image URL in the Set node. The AI (GPT-4.1-mini) generates a high-quality, engaging social media caption tailored to your audience, platform, and content goals.
2. **Store in Google Sheet** - The generated caption, along with your image URL and post metadata, is automatically appended to your Google Sheet. This creates a central location to review and manage all your social media content.
3. **Review and Approve** - You review the generated caption in the sheet, make any edits if needed, and update the status to Approved. You can also select the platform(s) where you want to post.
4. **Automatic Posting** - Once the status is updated to Approved, the next workflow is triggered automatically. It posts your caption and image to the selected social media platform(s) without any further manual effort.

The result? A seamless, end-to-end social media posting process where captions are AI-generated, stored, reviewed, and posted automatically. Focus on strategy and engagement instead of repetitive manual posting.

### Setup Steps

Setup time: ~10-15 minutes. What you'll need: an OpenAI API key, a Google account, and access to your social media platform(s).

1. **Connect Your Google Account**
   - Click on the Google Sheets node in your workflow.
   - Select the Credential dropdown and choose + Create New Credential.
   - Authenticate your Google account and grant the necessary permissions.
2. **Initialize Your Spreadsheet**
   - Run the workflow once by clicking the play button on the start node.
   - This will automatically create a Google Sheet with all the required columns for caption tracking and approval.
3. **Add Your OpenAI API Key**
   - Navigate to the AI Agent node.
   - Click the Credential dropdown and select + Create New Credential.
   - Paste your OpenAI API key and save. Get your API key from platform.openai.com/api-keys.
4. **Set Post Topic and Image**
   - Update the title in the Set node with the topic you want to post.
   - Add the image URL associated with your post.
5. **Review Captions and Approve**
   - Open your Google Sheet, review the generated captions, and update the status to Approved.
   - Select the platform(s) where the post should go live.
6. **Go Live**
   - Once the status is updated, the workflow will automatically post your content to the selected social media platform(s).
   - Sit back and watch your AI-generated captions and images go live automatically.

Ready to automate your social media? Activate your workflow and start posting smarter today! 🚀💡✅
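The approval gate can be sketched roughly as below, assuming `status` and `platforms` column names (the template's actual sheet columns may differ): only rows marked Approved with at least one platform selected proceed to the posting workflow.

```javascript
// Sketch of the approval gate: keep only sheet rows that are Approved and
// have at least one target platform. Column names are assumptions.
function rowsReadyToPost(rows) {
  return rows.filter(
    (row) => row.status === 'Approved' && Array.isArray(row.platforms) && row.platforms.length > 0
  );
}
```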
by Inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How it works

The Content Strategy AI Pipeline is an intelligent, multi-stage content creation system that transforms simple user prompts into polished, ready-to-publish content. The system intelligently extracts platform requirements, audience insights, and brand tone from user requests, then develops strategic reasoning and emotional connection strategies before crafting compelling content outlines and final publication-ready posts or articles. It supports both social media platforms (Instagram, LinkedIn, X, Facebook, TikTok) and blog content.

**Key Differentiators:** Strategic thinking approach, emotional intelligence integration, platform-native optimization, zero-editing-required output, and professional content-strategist-level quality through multi-model AI orchestration.

## Technical Points

- Multi-model AI orchestration for specialized tasks
- Emotional psychology integration for audience connection
- Platform algorithm optimization built in
- Industry-standard content strategy methodology, automated
- Enterprise-grade reliability with session management and memory
- API-ready architecture for integration into existing workflows

## Test Inputs

Sample Request: "Create an Instagram post for a fitness coach targeting busy moms, tone should be motivational and relatable"

Expected Flow: Platform: Instagram → Niche: Fitness → Audience: Busy Moms → Tone: Motivational → Output: 125-150 word post with hashtags
by Meak
## LinkedIn Job-Based Cold Email System

Most outreach tools rely on generic lead lists and recycled contact data. This workflow builds a live, personalized lead engine that scrapes new LinkedIn job posts, finds company decision-maker emails, and generates custom cold emails using GPT - all fully automated through n8n.

### Benefits

- Automated daily scraping of "Marketing Manager" jobs in Belgium
- Real-time leads from companies currently hiring for marketing roles
- Filters out HR and staffing agencies to keep only real businesses
- Enriches each company with verified CEO, Sales, and Marketing emails
- Generates unique, human-like cold emails and subject lines with GPT-4o
- Saves clean data to Google Sheets and drafts personalized Gmail messages

### How It Works

1. **Schedule Trigger** runs every morning at 08:00.
2. **Apify LinkedIn Scraper** collects new "Marketing Manager" jobs in Belgium.
3. **Remove Duplicates** ensures each company appears only once.
4. **Filter Staffing** excludes recruiters, HR agencies, and interim firms.
5. **Save Useful Infos** extracts core company data - name, domain, size, description.
6. **Filter Domain & Size** keeps valid websites and companies under 100 employees.
7. **Anymailfinder API** looks up CEO, Sales, and Marketing decision-maker emails.
8. **Merge + If Node** validates email results and removes invalid entries.
9. **Split Out + Deduplicate** ensures unique, verified contacts.
10. **Extract Lead Name (Code Node)** separates first and last names.
11. **Google Sheets Node** appends all enriched lead data to your master sheet.
12. **GPT-4o (LangChain)** writes a 100-120 word personalized cold email.
13. **GPT-4o (LangChain)** creates a short, casual subject line.
14. **Gmail Draft Node** builds a ready-to-send email using both outputs.
15. **Wait Node** loops until all leads are processed.
### Who Is This For

- B2B agencies targeting Belgian SMEs
- Outbound marketers using job postings as purchase-intent signals
- Freelancers or founders running lean, automated outreach systems
- Growth teams building scalable cold email engines

### Setup

- **Apify:** use the curious_coder~linkedin-jobs-scraper actor + API token
- **Anymailfinder:** header auth with decision-maker categories (ceo, sales, marketing)
- **Google Sheets:** connect a sheet named "LinkedIn Job Scraper" and map columns
- **OpenAI (GPT-4o):** insert your API key into both LangChain nodes
- **Gmail:** OAuth2 connection; resource set to draft
- **n8n:** store all credentials securely; set HTTP nodes to continue on error

### ROI & Results

- Save 1-3 hours per day on manual research and outreach prep
- Contact active hiring companies when they need marketing help most
- Scale to multiple industries or regions by changing search URLs
- Outperform paid lead databases with fresh, verified data

### Strategy Insights

- Add funding or tech-stack data for better lead scoring
- A/B test GPT subject lines and log open rates in Sheets
- Schedule GPT follow-ups 3 and 7 days later for full automation
- Push all enriched data to your CRM for advanced segmentation
- Use hiring signals to trigger ad audiences or retargeting campaigns

### Check Out My Channel

For more advanced automation workflows that generate real client results, check out my YouTube channel, where I share the exact systems I use to automate outreach, scale agency pipelines, and close deals faster.
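The "Extract Lead Name" step (step 10 above) can be sketched as a small Code-node helper. This is an illustrative version, not the template's exact code: it treats the first word as the first name and everything after it as the last name.

```javascript
// Sketch of the Extract Lead Name step: split a full name into first and
// last name, with everything after the first word as the last name.
function splitLeadName(fullName) {
  const parts = fullName.trim().split(/\s+/);
  return {
    firstName: parts[0] || '',
    lastName: parts.slice(1).join(' '),
  };
}
```

Multi-part surnames common in Belgium (e.g., "van der Berg") stay intact under this rule.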
by Li CHEN
**AWS News Analysis and LinkedIn Automation Pipeline**

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

**Who's it for**

This template is perfect for:

- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

**How it works**

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

**Flow 1: News Collection and Analysis**

1. Scheduled RSS Monitoring: automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. AI-Powered Analysis: uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - Professional summary
   - Key themes and keywords
   - Importance rating (Low/Medium/High)
   - Business impact assessment
3. Structured Data Storage: saves analyzed news to Feishu Bitable with approval status tracking

**Flow 2: LinkedIn Content Generation**

1. Manual Approval Trigger: Feishu automation sends approved news items to the webhook
2. AI Content Creation: AWS Bedrock generates professional LinkedIn posts with:
   - Attention-grabbing headlines
   - Technical insights from a Solutions Architect perspective
   - Business impact analysis
   - Call-to-action engagement
3. Automated Publishing: posts directly to LinkedIn with relevant hashtags

**How to set up**

Prerequisites:

- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

**Detailed Configuration Steps**
**1. AWS Bedrock Setup**

**Step 1: Enable Claude 3 Sonnet Model**

1. Log into your AWS Console
2. Navigate to AWS Bedrock
3. Go to Model access in the left sidebar
4. Find Anthropic Claude 3 Sonnet and click Request model access
5. Fill out the access request form (usually approved within minutes)
6. Once approved, verify the model appears in your Model access list

**Step 2: Create IAM User and Credentials**

1. Go to the IAM Console
2. Click Users → Create user
3. Name: n8n-bedrock-user
4. Attach policy: AmazonBedrockFullAccess (or create a custom policy with minimal permissions)
5. Go to the Security credentials tab → Create access key
6. Choose Application running outside AWS
7. Download the credentials CSV file

**Step 3: Configure in n8n**

1. In n8n, go to Credentials → Add credential
2. Select the AWS credential type
3. Enter your Access Key ID and Secret Access Key
4. Set Region to your preferred AWS region (e.g., us-east-1)
5. Test the connection

Useful Links: AWS Bedrock Documentation, Claude 3 Sonnet Model Access, AWS Bedrock Pricing

**2. Feishu Bitable Configuration**

**Step 1: Create Feishu Account and App**

1. Sign up at Feishu International
2. Create a new Bitable (multi-dimensional table)
3. Go to Developer Console → Create App
4. Enable Bitable permissions in your app
5. Generate an App Token and App Secret

**Step 2: Create Bitable Structure**

Create a new Bitable with these columns:

- title (Text)
- pubDate (Date)
- summary (Long Text)
- keywords (Multi-select)
- rating (Single Select: Low, Medium, High)
- link (URL)
- approval_status (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:

- App Token: found in app settings
- Table ID: found in the Bitable URL (tbl...)
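To make the column mapping concrete, here is a small sketch (not the template’s actual code) of how an n8n Code node might shape one analyzed news item into the Bitable columns above; the input field names for the RSS item and the Bedrock analysis are assumptions:

```javascript
// Hypothetical sketch: shape one analyzed RSS item into the Bitable columns
// (title, pubDate, summary, keywords, rating, link, approval_status).
function toBitableRecord(item, analysis) {
  const allowedRatings = ['Low', 'Medium', 'High'];
  return {
    title: item.title,
    pubDate: item.pubDate,
    summary: analysis.summary,
    keywords: analysis.keywords, // Multi-select column expects an array
    rating: allowedRatings.includes(analysis.rating) ? analysis.rating : 'Low',
    link: item.link,
    approval_status: 'Pending', // every new row starts in the review queue
  };
}

const record = toBitableRecord(
  { title: 'EC2 announcement', pubDate: '2024-05-01', link: 'https://aws.amazon.com/blogs/aws/' },
  { summary: 'New instance family.', keywords: ['EC2'], rating: 'High' }
);
console.log(record.approval_status);
```

Defaulting `approval_status` to Pending is what makes the Feishu automation in the next step act as a manual approval gate.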
**Step 3: Set Up Automation**

1. In your Bitable, go to Automation → Create automation
2. Trigger: When field value changes → select the approval_status field
3. Condition: approval_status equals "Approved"
4. Action: Send HTTP request
   - Method: POST
   - URL: your n8n webhook URL (from Flow 2)
   - Headers: Content-Type: application/json
   - Body: {{record}}

**Step 4: Configure Feishu Credentials in n8n**

1. Install the Feishu Lite community node (self-hosted only)
2. Add a Feishu credential with your App Token and App Secret
3. Test the connection

Useful Links: Feishu Developer Documentation, Bitable API Reference, Feishu Automation Guide

**3. LinkedIn Company Account Setup**

**Step 1: Create LinkedIn App**

1. Go to the LinkedIn Developer Portal
2. Click Create App
3. Fill in the app details:
   - App name: AWS News Automation
   - LinkedIn Page: select your company page
   - App logo: upload your logo
   - Legal agreement: accept the terms

**Step 2: Configure OAuth2 Settings**

1. In your app, go to the Auth tab
2. Add the redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback
3. Request these scopes:
   - w_member_social (Post on behalf of members)
   - r_liteprofile (Read basic profile)
   - r_emailaddress (Read email address)

**Step 3: Get Company Page Access**

1. Go to your LinkedIn Company Page
2. Navigate to Admin tools → Manage admins
3. Ensure you have the Content admin or Super admin role
4. Note your Company Page ID (found in the page URL)

**Step 4: Configure LinkedIn Credentials in n8n**

1. Add a LinkedIn OAuth2 credential
2. Enter your Client ID and Client Secret
3. Complete the OAuth2 flow by clicking Connect my account
4. Select your company page for posting

Useful Links: LinkedIn Developer Portal, LinkedIn API Documentation, LinkedIn OAuth2 Guide
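Because the Feishu automation fires whenever approval_status changes, the webhook flow should double-check the payload before generating a LinkedIn post. A minimal guard sketch (the field names assume {{record}} serializes to a flat JSON object, which may differ in practice):

```javascript
// Hypothetical sketch: validate the Feishu webhook payload before
// handing it to the LinkedIn content-generation step.
function shouldGeneratePost(payload) {
  if (!payload || typeof payload !== 'object') return false;
  // Field names depend on how {{record}} serializes (assumed here).
  const status = payload.approval_status;
  const hasContent = Boolean(payload.title && payload.summary);
  return status === 'Approved' && hasContent;
}

console.log(shouldGeneratePost({ approval_status: 'Approved', title: 'EC2 news', summary: 'New instances.' })); // true
console.log(shouldGeneratePost({ approval_status: 'Rejected', title: 'EC2 news', summary: 'New instances.' })); // false
```

In the workflow this would sit in a Code or If node right after the Webhook trigger, so a misconfigured automation cannot publish unapproved content.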
**4. Workflow Activation**

Final Setup Steps:

1. Import the workflow JSON into n8n
2. Configure all credential connections:
   - AWS Bedrock credentials
   - Feishu credentials
   - LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

**Requirements**

Service Requirements:

- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn Company Account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes**: Feishu Lite node (self-hosted only)

Technical Requirements:

- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

Cost Considerations:

- **AWS Bedrock**: ~$0.01-0.05 per news analysis
- **Feishu**: free tier available, paid plans for advanced features
- **LinkedIn**: free API access with rate limits
- **n8n**: self-hosted (free) or cloud subscription

**How to customize the workflow**

Content Customization:

- **Modify AI prompts** in the AI Agent nodes to change tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

Integration Options:

- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM** systems to track content performance
- **Add content calendar** integration for better planning

Advanced Features:

- **Multi-language support** by modifying AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

Technical Modifications:

- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls

**Important Notes**

Service Limitations:

- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting - adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations**: each AI analysis costs approximately $0.01-0.05 USD per news item

**Troubleshooting Common Issues**

AWS Bedrock issues:

- **Model not found**: ensure Claude 3 Sonnet access is approved in your region
- **Access denied**: verify IAM permissions include Bedrock service access
- **Rate limiting**: implement retry logic or reduce analysis frequency

Feishu integration issues:

- **Authentication failed**: check that the App Token and App Secret are correct
- **Table not found**: verify the Table ID matches your Bitable URL
- **Automation not triggering**: ensure the webhook URL is accessible and returns a 200 status

LinkedIn posting issues:

- **OAuth2 errors**: re-authenticate LinkedIn credentials
- **Posting failed**: verify company page admin permissions
- **Rate limits**: LinkedIn has daily posting limits for company pages

**Security Best Practices**

- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
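The troubleshooting notes above recommend retry logic for rate-limited or failed API calls; one way to sketch it in an n8n Code node is a small exponential-backoff wrapper (the attempt count and delays are arbitrary choices, not values from the template):

```javascript
// Hypothetical sketch: retry an async API call with exponential backoff.
async function withRetry(fn, { attempts = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        // Wait 500ms, then 1000ms, then 2000ms, ... between attempts.
        await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError; // all attempts exhausted
}

// Usage: wrap a flaky call (simulated here) so transient throttling recovers.
let calls = 0;
withRetry(async () => {
  calls += 1;
  if (calls < 3) throw new Error('throttled');
  return 'ok';
}, { baseDelayMs: 10 }).then((result) => console.log(result, 'after', calls, 'attempts'));
```

The same wrapper would fit around the Bedrock analysis call and the LinkedIn posting call, the two steps most likely to hit rate limits.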