by Peyrichou Maxime
🎯 Automated B2B Cold Email Sequence with AI Personalization

Stop manually sending follow-ups. This workflow automates your entire cold email outreach with AI-powered personalization, smart scheduling, and automatic reply detection.

✨ What It Does

This workflow sends a perfectly timed 3-email sequence to B2B prospects with:

- **AI-generated personalized emails** adapted to each prospect's sector, company size, and role
- **Smart scheduling**: only sends Tuesday/Wednesday/Thursday at optimal times (~9 AM with randomization)
- **Bilingual support**: auto-detects French or English prospects
- **Reply detection**: automatically stops the sequence when someone replies
- **Spam prevention**: randomized sending times and professional intervals
- **Full tracking**: updates Google Sheets with send status and reply detection
- **Slack notifications**: get alerted immediately when a hot lead responds

📊 Expected Results

Based on real usage:

- **10-15% reply rate** (vs. 3-8% for generic cold emails)
- **2-5 hot leads** per 30 prospects contacted
- **90% time saved** vs. manual outreach
- **Professional delivery** that preserves sender reputation

🎬 How It Works

1. Add a prospect to the Google Sheet (Email, Name, Company, etc.)
2. The workflow triggers automatically on the new row
3. Time planning calculates optimal send dates (Tue/Wed/Thu only)
4. Email 1 (J0): AI generates an introduction with your value prop (80-120 words)
5. Wait 3 days
6. Email 2 (J+3): follow-up with a new angle or case study (70-90 words)
7. Wait 4 days
8. Email 3 (J+7): final soft close, respectful exit (60-100 words)
9. Reply detection: monitors Gmail → marks the prospect "hot" → Slack alert → stops the sequence

🚀 Perfect For

- Sales teams doing B2B cold outreach
- Agencies prospecting new clients
- SaaS founders building their pipeline
- Consultants/freelancers doing business development
- Anyone tired of manual follow-ups and low response rates

⚙️ Requirements

Required:

- Google Sheets (free)
- Gmail account (free)
- OpenAI API key (~$0.50 per 100 emails)

Optional:

- Slack (for notifications)

n8n version: compatible with n8n 1.0+
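The "smart scheduling" step described above can be sketched as an n8n Code node. This is a minimal illustration, not the template's actual node code: the function name and the ±30-minute randomization window are assumptions.

```javascript
// Pick the next Tuesday, Wednesday, or Thursday at ~9 AM local time,
// with a randomized minute offset to avoid predictable send patterns.
function nextSendDate(from, randomFn = Math.random) {
  const SEND_DAYS = [2, 3, 4]; // Tue, Wed, Thu (0 = Sunday)
  const date = new Date(from.getTime());
  date.setHours(9, 0, 0, 0);
  // If today's 9 AM slot has already passed, start looking from tomorrow.
  if (from.getTime() >= date.getTime()) date.setDate(date.getDate() + 1);
  while (!SEND_DAYS.includes(date.getDay())) {
    date.setDate(date.getDate() + 1);
  }
  // Randomize within a +/- 30 minute window around 9 AM (assumed window).
  const offsetMinutes = Math.floor(randomFn() * 60) - 30;
  date.setMinutes(date.getMinutes() + offsetMinutes);
  return date;
}
```

The same function, called three times with each email's delay added, would produce the full J0 / J+3 / J+7 schedule.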
by browseract
🕵️♂️ Reddit Intelligence Monitor: AI-Powered Scraping with BrowserAct

Automate your market research and competitor analysis with this "set and forget" workflow. It monitors Reddit for specific keywords and competitor subreddits, uses BrowserAct for stealth scraping, analyzes sentiment with AI, and delivers a daily intelligence digest to your Google Sheets.

💡 Key Features

- **Powered by BrowserAct**: leverages cloud browser automation to stealthily scrape Reddit data without getting blocked.
- **Dual-Track Monitoring**: simultaneously tracks brand competitors (subreddits) and topic keywords (search results).
- **AI Analysis**: summarizes the top 3 trending posts into a single concise daily report, filtering out noise.
- **Structured Archive**: automatically cleans, formats, and archives intelligence with source links in Google Sheets.

🛠️ How it Works

1. Config Read: reads a list of monitoring targets from a Google Sheet.
2. Route: splits the task into two paths (competitor vs. keyword) based on input type.
3. Scrape: BrowserAct navigates to the target Reddit pages and extracts the latest posts.
4. Process: custom Code nodes clean the data and merge the top 3 posts into a single prompt.
5. Analyze: an AI Agent generates an executive summary for each topic.
6. Archive: final reports are appended to your "Report" Google Sheet.

📋 Setup Guide

1. Google Sheets: create a sheet with two tabs:
   - Config: columns keywords (for search terms) and competitor (for subreddit names).
   - Report: columns Date, Competitor/Keyword, Summary, Link.
2. BrowserAct: connect your BrowserAct credentials and ensure you have the Reddit scraping task template ready.
3. AI Model: configure the Google Gemini Chat Model (or swap in OpenAI).
4. Schedule: enable the Schedule Trigger for daily automated runs.
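The "Process" step above could look roughly like this in an n8n Code node. The field names (title, upvotes, url) are assumptions about what the BrowserAct scraper returns, not its documented output shape:

```javascript
// Sort scraped posts by upvotes, keep the top 3, and merge them into
// one prompt string for the downstream AI Agent.
function buildDigestPrompt(posts, topic) {
  const top = [...posts]
    .sort((a, b) => b.upvotes - a.upvotes)
    .slice(0, 3);
  const lines = top.map(
    (p, i) => `${i + 1}. "${p.title}" (${p.upvotes} upvotes) - ${p.url}`
  );
  return `Summarize the top Reddit posts about "${topic}":\n${lines.join('\n')}`;
}
```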
by Gilbert Onyebuchi
Complete YouTube video automation workflow that creates ready-to-upload videos from start to finish. No manual editing required.

How it works:

This n8n automation fetches stock videos from Pixabay, generates AI-powered voiceover scripts with OpenAI, creates professional narration using ElevenLabs text-to-speech, merges all clips with beautiful transitions using Shotstack rendering, and automatically uploads your finished video to Google Drive.

What you'll achieve:

- Create 5-10 minute videos automatically
- Generate unlimited faceless YouTube content
- Save hours of manual video editing
- Build a consistent content pipeline
- Scale your YouTube channel effortlessly

Requirements:

- Pixabay API (free tier available)
- ElevenLabs API (text-to-speech)
- Shotstack API (video rendering)
- OpenAI API (script generation)
- Google Drive API credentials

Perfect for content creators, YouTube automation, educational channels, social media marketers, and faceless channel owners.

📧 Questions? Need customization? Connect with me on LinkedIn: Click here

👀 Check out my other automation workflows on my n8n creator profile for more productivity tools!
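To give a feel for the Shotstack merge step, here is a hedged sketch of building a Shotstack-style edit payload that places each stock clip on a timeline with fade transitions. The field names follow Shotstack's public edit schema, but the fixed clip length and the voiceover soundtrack handling are placeholder assumptions, not what this template necessarily does:

```javascript
// Build a render payload: clips back-to-back with fades, voiceover as soundtrack.
function buildRenderPayload(clipUrls, voiceoverUrl, clipLength = 8) {
  const clips = clipUrls.map((src, i) => ({
    asset: { type: 'video', src },
    start: i * clipLength,          // clips play sequentially
    length: clipLength,
    transition: { in: 'fade', out: 'fade' },
  }));
  return {
    timeline: {
      soundtrack: { src: voiceoverUrl, effect: 'fadeOut' },
      tracks: [{ clips }],
    },
    output: { format: 'mp4', resolution: 'hd' },
  };
}
```

In the workflow this object would be POSTed to Shotstack's render endpoint by an HTTP Request node.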
by Pixcels Themes
Who’s it for

This template is designed for recruiters, lead-generation teams, agency owners, and sales professionals who collect LinkedIn profile data and need to automate the process of finding verified company domains and email addresses. It is ideal for teams looking to eliminate manual research and streamline prospect enrichment.

What it does / How it works

This workflow reads contact records from a Google Sheet, including name, position, and description. An AI agent analyzes each profile to determine the company domain. If the domain is already identifiable from the description, it is used directly. If no domain is found, the workflow generates an intelligent search term and performs a Google Custom Search, with another AI agent extracting the most accurate domain from the real web results. Once the domain is confirmed, the workflow queries Hunter.io to find the best-matching email address for the contact. Finally, the enriched data (email and company domain) is appended back into the Google Sheet, updating each row automatically.

Requirements

- Google Sheets OAuth2 credentials
- Google Gemini (PaLM) API credentials
- Hunter.io API key
- Google Custom Search API key and CSE ID
- A Google Sheet with columns for name, position, description, and domain

How to set up

1. Connect your Google Sheets, Gemini, Hunter.io, and Google Search credentials.
2. Replace the Google Sheet ID and sheet name with your own.
3. Add your API keys to the designated nodes.
4. Ensure column names match your sheet structure.
5. Execute the workflow to begin enrichment.

How to customize the workflow

- Modify AI prompts for better domain inference
- Add additional enrichment steps (social profiles, industry tags)
- Add fallback email providers (Snov, Apollo, etc.)
- Change the update logic to support multiple sheets or batch processing
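The branching logic described above ("use the domain from the description if present, otherwise fall back to search") can be sketched as a small Code-node helper. The regex and field names are illustrative assumptions, not the template's exact expressions:

```javascript
// Return a domain found in the contact's description, or flag that the
// Google Custom Search fallback path should run instead.
function resolveDomain(contact) {
  const match = (contact.description || '').match(
    /\b([a-z0-9-]+(?:\.[a-z0-9-]+)*\.[a-z]{2,})\b/i
  );
  if (match) {
    return { domain: match[1].toLowerCase(), needsSearch: false };
  }
  // No domain found: build a search term for the Google Custom Search step.
  return {
    domain: null,
    needsSearch: true,
    searchTerm: `${contact.company || contact.name} official website`,
  };
}
```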
by deAPI Team
Who is this for?

- Teams who upload meeting recordings to YouTube (unlisted or private) and want automated notes
- Project managers who need to track action items across recurring meetings
- Remote teams who want searchable, structured meeting notes in Notion
- Content teams repurposing recorded calls into documentation

What problem does this solve?

Meeting notes are either rushed, incomplete, or never written at all. This workflow removes that bottleneck: upload a recording to YouTube and get a structured Notion page with summary, action items, decisions, and key topics, plus a Slack notification, all within minutes.

What this workflow does

1. Monitors a YouTube channel via RSS for new video uploads
2. Transcribes the video using deAPI (Whisper Large V3) directly from the YouTube URL, with no file download or size limits
3. An AI Agent analyzes the transcript and extracts a title, summary, action items, decisions, and key topics
4. Creates a structured meeting notes page in a Notion database
5. Posts the summary and action items to a Slack channel

Setup Requirements

- **n8n instance** (self-hosted or n8n Cloud)
- **deAPI account** for video transcription
- **Anthropic account** for the AI Agent
- **Notion workspace** with a meeting notes database
- **Slack workspace**

Installing the deAPI Node

- **n8n Cloud**: go to **Settings → Community Nodes** and toggle the "Verified Community Nodes" option
- **Self-hosted**: go to **Settings → Community Nodes** and install n8n-nodes-deapi

Configuration

1. Add your deAPI credentials (API key + webhook secret)
2. Add your Anthropic credentials (API key)
3. Set the Feed URL in the RSS trigger to your YouTube channel's RSS feed: https://www.youtube.com/feeds/videos.xml?channel_id=YOUR_CHANNEL_ID
4. Add your Notion credentials and set the Database ID in the Notion node
5. Add your Slack credentials and set the Channel in the Slack node
6. Ensure your n8n instance is on HTTPS

How to customize this workflow

- **Change the AI model**: swap Anthropic for OpenAI, Google Gemini, or any other LLM provider
- **Adjust the note structure**: modify the AI Agent system message to extract different fields (attendees, follow-up date, sentiment, etc.)
- **Change the trigger**: replace the RSS trigger with a Google Drive trigger or form upload for non-YouTube recordings
- **Change the output destination**: replace Notion with Google Docs, Confluence, or Airtable
- **Change the notification**: replace Slack with Microsoft Teams, email, or Discord
- **Monitor multiple channels**: duplicate the RSS trigger or use multiple feed URLs to track several YouTube channels
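The step between the AI Agent and the Notion node could be sketched as follows. The snake_case field names mirror the fields the description lists (title, summary, action items, decisions, key topics), but the exact JSON shape the agent emits is an assumption:

```javascript
// Parse the AI Agent's JSON output into the fields the Notion
// meeting-notes page expects, with safe defaults for missing fields.
function parseMeetingNotes(raw) {
  const data = JSON.parse(raw);
  return {
    title: data.title || 'Untitled meeting',
    summary: data.summary || '',
    actionItems: Array.isArray(data.action_items) ? data.action_items : [],
    decisions: Array.isArray(data.decisions) ? data.decisions : [],
    keyTopics: Array.isArray(data.key_topics) ? data.key_topics : [],
  };
}
```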
by takuma
Who's it for

This template is for home cooks, small restaurant owners, or anyone who wants to streamline their meal planning, ingredient cost tracking, leftover management, nutritional analysis, and social media promotion. It's ideal for those looking to optimize their kitchen operations, reduce food waste, maintain a healthy diet, and efficiently share their culinary creations.

How it works / What it does

This advanced workflow acts as a comprehensive culinary assistant. Triggered by a new menu item, it performs several key functions:

- **Cost and Ingredient Tracking:** A "Menu Agent" uses AI to analyze your input (e.g., a recipe or dish) and extract a detailed list of ingredients, their associated costs, unit prices, and total cost, then logs this into a Google Sheet as a "Recipe List."
- **Leftover Management:** A "Leftovers Agent" identifies any unused ingredients from your planned dish and suggests three new recipes to utilize them, helping to minimize food waste. This information is also recorded in a Google Sheet.
- **Nutritional Diary:** A "Nutritionist Agent" generates a diary-style entry with dietary advice based on the meal, highlighting key nutrients and offering personalized suggestions. This entry is appended to a "Diary" Google Sheet.
- **Social Media Promotion:** A "Post Agent" takes the nutritional diary entry and transforms it into an engaging social media post (specifically for X/Twitter in this template), which is then sent as a direct message, ready for you to share with your followers.

How to set up

1. **Webhook Trigger:** The workflow starts with a Webhook. Copy the webhook URL from the "Webhook" node; you will send your menu item input to this URL.
2. **Google Sheets Integration:** Set up a Google Sheets credential for your n8n instance. Create a Google Sheet document (e.g., "Recipe List") with three sheets:
   - "Recipe": stores your menu items, ingredients, costs, etc. Ensure it has columns for Date, Item, Ingredients, Ingredient Cost, Unit Price, Quantity, Total Cost, and Leftover Ingredients.
   - "leftovers": stores suggested recipes for leftover ingredients. Ensure it has columns for Date and Ingredients.
   - "diary": stores your nutritional diary entries. Ensure it has a column for Diary.
   In the "Append row in sheet", "Append row in sheet1", and "Append row in sheet2" nodes, replace the Document ID with the ID of your Google Sheet. For "Sheet Name," select the correct sheet (e.g., "レシピ", "diary", "leftovers") from the dropdown.
3. **OpenRouter Chat Model:** Set up your OpenRouter credentials in the "OpenRouter Chat Model" nodes. You will need your OpenRouter API key.
4. **Twitter Integration:** Set up your Twitter credentials for the "Create Direct Message" node, and specify the User (username) to whom the direct message should be sent. This is typically your own Twitter handle or a test account.

Requirements

- An n8n instance
- A Google account with Google Sheets enabled
- An OpenRouter API key
- A Twitter (X) account with developer access to send Direct Messages

How to customize the workflow

- **Input Data:** The initial input to the "Webhook" node is expected to be the name of a dish or recipe. You can modify the "Menu Agent" to accept more detailed inputs if needed.
- **Google Sheets Structure:** Adjust the column mappings in the Google Sheets nodes if your spreadsheet column headers differ.
- **AI Agent Prompts:** Customize the System Message in each AI Agent node (Menu Agent, Leftovers Agent, Nutritionist Agent, Post Agent) to refine their behavior and the kind of output they generate. For example, you could ask the Nutritionist Agent to focus on specific dietary needs.
- **Social Media Platform:** The "Create Direct Message" node is configured for Twitter. You can swap it for another social media node (e.g., Mastodon, Discord) if you prefer to post elsewhere, remembering to adjust the "Post Agent" system message accordingly.
- **Output Parser:** The "Structured Output Parser" is configured for a specific JSON structure. If you change the "Menu Agent" to output a different structure, you'll need to update this parser.
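To illustrate the cost-tracking step, here is a sketch of how a Menu Agent's structured output might be turned into a row for the "Recipe" sheet. The JSON shape (ingredients with unitPrice and quantity) is an assumption based on the sheet columns listed above:

```javascript
// Derive the "Recipe" sheet row, including total cost, from parsed agent output.
function toRecipeRow(menu) {
  const total = menu.ingredients.reduce(
    (sum, ing) => sum + ing.unitPrice * ing.quantity,
    0
  );
  return {
    Date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD
    Item: menu.item,
    Ingredients: menu.ingredients.map((i) => i.name).join(', '),
    'Total Cost': Number(total.toFixed(2)),
  };
}
```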
by Liveblocks
Analyzing uploaded Liveblocks comments attachments with AI

This example uses Liveblocks Comments, collaborative commenting components for React. When an AI assistant is mentioned in a thread (e.g. "@AI Assistant"), it will automatically leave a response. Additionally, it will analyze any PDF or image attachments in the comments and use them to help it respond.

Using webhooks, this workflow is triggered when a comment is created in a thread. If the comment mentions the agent's ID ("__AI_AGENT"), the workflow creates a response. If a PDF or image file is attached, it is analyzed by Anthropic and used as context. The response is then added to the thread, and users see it appear in their apps in real time.

Set up

This workflow requires a Comments app installed and webhooks set up in the Liveblocks dashboard. You can try it with a demo application:

1. Download the Next.js comments example, and run it with a secret key.
2. Find database.ts inside the example and uncomment the AI assistant user.
3. Insert the secret key from the project into the n8n nodes: "Get a comment", "Get a thread", "Create a comment".
4. Go to the Liveblocks dashboard, open your project, and go to "Webhooks".
5. Create a new webhook in your project using a placeholder URL, selecting "commentCreated" events.
6. Copy your webhook secret from this page and paste it into the "Liveblocks Trigger" node.
7. Expose the webhook URL from the trigger, for example with localtunnel or ngrok.
8. Copy the production URL from the "Liveblocks Trigger" and replace localhost:5678 with the new URL.

Your workflow is now set up! Tag @AI Assistant in the application and add attachments to trigger it.

Localtunnel

The easiest way to expose your webhook URL:

npx localtunnel --port 5678 --subdomain your-name-here

This creates a URL like: https://honest-months-fix.loca.lt

The URL you need for the dashboard looks like this: https://honest-months-fix.loca.lt/webhook/9cc66974-aaaf-4720-b557-1267105ca78b/webhook
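The guard described above (only respond when the agent is mentioned) can be sketched like this. The event shape is an assumption about the "commentCreated" webhook payload, not Liveblocks' exact schema:

```javascript
// Only respond when the new comment mentions the AI assistant's user ID,
// and never when the agent itself authored the comment (avoids reply loops).
const AI_AGENT_ID = '__AI_AGENT';

function shouldRespond(event) {
  const { userId, mentionedIds = [] } = event.data;
  if (userId === AI_AGENT_ID) return false; // never reply to ourselves
  return mentionedIds.includes(AI_AGENT_ID);
}
```

An IF node performing this check sits between the trigger and the Anthropic analysis step.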
by Afigo Sam
This n8n workflow automatically scrapes X (Twitter) trending topics for any country, generates a pidgin-style tweet and a matching AI image for each trend, then schedules and posts them to X via Buffer, all on a custom schedule. Perfect for social media managers, content creators, and personal brands who want to stay relevant on X without manually tracking trends or writing posts every day.

Good to know

- The workflow runs 3 times daily by default (8:30am, 12:30pm, 4:30pm). You can adjust the schedule in the Schedule Trigger node to suit your timezone.
- Trends already used in the last 24 hours are automatically skipped to avoid repetition.
- If the primary Apify scraper fails, the workflow falls back to Gemini AI search to fetch trends, so it rarely runs dry.
- FLUX.1-schnell on HuggingFace is a free-tier model but has rate limits. For higher volume, consider upgrading your HuggingFace plan.

How it works

1. A Schedule Trigger fires 3 times a day and kicks off the Apify Twitter Trending Topics Scraper for your chosen country (default: Nigeria).
2. If Apify fails or returns no data, Gemini AI searches for current trending topics as a fallback.
3. Already-used trends are fetched from an n8n Data Table and filtered out so the same trend is never posted twice within 24 hours.
4. 2 unused trends are randomly picked and passed into a loop.
5. For each trend, Gemini 2.5 Flash generates a short, engaging pidgin-English tweet (≤275 characters) grounded in live context via Google Search.
6. FLUX.1-schnell on HuggingFace generates a photorealistic image matching the trend topic.
7. The image is uploaded to Dropbox and a temporary public URL is extracted.
8. The tweet and image are posted to X via Buffer MCP on automatic scheduling.
9. The used trend is saved back to the Data Table with a timestamp to prevent reuse.

How to use

1. Change the country field in both Apify nodes to scrape trends for your own country.
2. Edit the tweet prompt in the Generate Tweet with Gemini node to match your personal tone, language, or brand voice.
3. Edit the FLUX image prompt in the Generate Image with FLUX node to change the visual style of generated images.
4. Run the Create a data table node once manually to set up the trend history table, then disable it.
5. Replace YOUR_BUFFER_CHANNEL_ID in the Post to X via Buffer node with your actual Buffer channel ID.

Requirements

- Apify account with the Twitter Trending Topics Scraper actor
- Google Gemini API credential
- HuggingFace account with an API token (HTTP Bearer Token credential)
- Dropbox account (OAuth2)
- Buffer account connected to your X profile

Customising this workflow

- Swap Nigeria for any country supported by the Apify actor to target a different audience.
- Replace the pidgin-English tweet prompt with any language or tone: formal English, Yoruba slang, Spanish, etc.
- Replace Dropbox with Google Drive or S3 if you prefer a different image storage provider.
- Increase the number of picked trends from 2 to more in the Pick Trend node to post more tweets per run.
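The 24-hour dedup step described above can be sketched as a Code-node filter over the Data Table rows. The field names (trend, postedAt) are assumptions about the table schema:

```javascript
// Drop any trend that already appears in the history with a timestamp
// from the last 24 hours.
function filterUnusedTrends(trends, history, now = Date.now()) {
  const DAY_MS = 24 * 60 * 60 * 1000;
  const recentlyUsed = new Set(
    history
      .filter((row) => now - new Date(row.postedAt).getTime() < DAY_MS)
      .map((row) => row.trend.toLowerCase())
  );
  return trends.filter((t) => !recentlyUsed.has(t.toLowerCase()));
}
```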
by AI Sales Agent HQ
Generate professional sales proposals from a simple form: AI writes the content, you deliver the document. Fill out client details, pain points, and pricing, and this workflow creates a polished proposal with calculated ROI metrics, executive summary, solution strategy, and team bios.

How It Works

1. A sales rep submits a form with the client name, industry, pain points, and pricing
2. A Code node calculates ROI, net savings, and break-even period
3. Gemini AI generates the proposal content: executive summary, key challenges, solution strategy, team bios, and call to action
4. The workflow copies your Google Doc template and replaces all placeholders with generated content
5. The final proposal is ready in Google Drive

Setup

1. Import the workflow JSON
2. Create a Google Doc template with placeholders:
   - {{client_name}}, {{executive_summary}}, {{key_challenges}}
   - {{solution_strategy}}, {{team_bios}}, {{next_steps}}
   - {{formatted_roi}}, {{formatted_net_savings}}, {{formatted_break_even}}
   - {{formatted_solution_cost}}, {{date}}
3. Add credentials:
   - Google Drive → OAuth2
   - Google Docs → OAuth2
   - Google Gemini → API key from aistudio.google.com
4. Configure the "Copy proposal template" node → point to your template document
5. Customize the AI → edit the system message in "Generate proposal content" to match your tone
6. Test → submit the form and check the generated proposal
7. Activate
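The Code node's ROI math might look something like the sketch below, producing the formatted placeholders the template expects. The exact formulas in the actual workflow may differ; this shows the general shape under the assumption of simple annualized savings:

```javascript
// Compute net savings, a simple ROI percentage, and break-even in months,
// formatted to match the {{formatted_*}} placeholders in the Doc template.
function computeRoi(annualSavings, solutionCost) {
  const netSavings = annualSavings - solutionCost;
  const roiPercent = (netSavings / solutionCost) * 100;
  const breakEvenMonths = solutionCost / (annualSavings / 12);
  return {
    formatted_roi: `${roiPercent.toFixed(0)}%`,
    formatted_net_savings: `$${netSavings.toLocaleString('en-US')}`,
    formatted_break_even: `${breakEvenMonths.toFixed(1)} months`,
  };
}
```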
by Nijan
This workflow turns Slack into your content control hub and automates the full blog creation pipeline, from sourcing trending headlines and validating topics to drafting posts and preparing content for your CMS. With one command in Slack, you can source news from RSS feeds, refine it with Gemini AI, generate high-quality blog posts, and get publish-ready output, all inside a single n8n workflow.

⚙️ How It Works

1. Trigger in Slack: Type start in a Slack channel to fetch trending headlines. Headlines are pulled from your configured RSS feeds.
2. Topic Generation (Gemini AI): Gemini rewrites RSS headlines into unique, non-duplicate topics. Slack displays these topics in a numbered list (e.g., reply with 2 to pick topic 2).
3. Content Validation: When you reply with a number, Gemini validates and slightly rewrites the topic to ensure originality. Slack confirms the selected topic back to you.
4. Content Creation: Gemini generates a LinkedIn/blog-style draft with a strong hook introduction, 3-5 bullet insights, and a closing takeaway and CTA. It optionally suggests asset ideas (e.g., image, infographic).
5. CMS-Ready Output: The final draft is structured for publishing (markdown or plain text). You can extend this workflow to automatically send the output to your CMS (WordPress, Ghost, Notion, etc.).

🛠 Setup Instructions

1. Connect your Slack Bot to n8n.
2. Configure your RSS Read nodes with feeds relevant to your niche.
3. Add your Gemini API credentials in the AI node.
4. Run the workflow: Type start in Slack → see trending topics. Reply with a number (e.g., gen 3) → get a generated blog draft in the same Slack thread.

🎛 Customization Options

- Change RSS sources to match your industry.
- Adjust Gemini prompts for tone (educational, casual, professional).
- Add moderation filters (skip sensitive or irrelevant topics).
- Connect the final output step to your CMS, Notion, or Google Docs for publishing.

✅ Why Use This Workflow?

- One-stop flow: sourcing → validation → writing → publishing.
- Hands-free control: everything happens from Slack.
- Flexible: easily switch feeds, tone, or target CMS.
- Scalable: extend to newsletters, social posts, or knowledge bases.
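The routing step that interprets a Slack reply as a topic pick can be sketched as follows. The accepted formats ("2" or "gen 3") are assumptions drawn from the examples in the description:

```javascript
// Map a Slack reply like "2" or "gen 3" onto an entry in the numbered
// topic list; return null for anything that isn't a valid pick.
function pickTopic(replyText, topics) {
  const match = replyText.trim().match(/^(?:gen\s+)?(\d+)$/i);
  if (!match) return null;
  const index = parseInt(match[1], 10) - 1; // the list is 1-based for users
  return topics[index] ?? null;
}
```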
by Summer
LinkedIn Job Search Automation

Creator: Summer Chang

Setup Instructions

This n8n workflow automatically searches for senior designer jobs on LinkedIn every day at 5am and saves them to a Notion database.

Prerequisites

- n8n instance (cloud or self-hosted)
- Notion account with API access
- A Notion database set up to receive job listings

Setup Steps

✅ 1. Create Your Notion Database

Or duplicate my template.

✅ 2. Connect Notion to n8n

- In the "Save to Notion" node, click on the Notion credentials
- Follow the authentication flow to connect your Notion account
- Select your job search database from the dropdown

✅ 3. Customize Your Search Criteria

In the "Set Search Criteria" node, modify these parameters to match your job preferences:

- search_keywords: job titles to search for (comma-separated). Default: senior product designer, product design lead, senior UX designer, AI designer
- excluded_keywords: terms to filter out (comma-separated). Default: contract, freelance
- location: where you want to work (comma-separated). Default: remote, san francisco
- f_TPR: time filter for job postings. r86400 = last 24 hours, r604800 = last week, r2592000 = last month
- sortBy: how to sort results. DD = most recent first, R = most relevant first

✅ 4. Adjust the Schedule

In the "Everyday @5am" node:

- Click on the node
- Modify the schedule to your preferred time
- You can set it to run daily, weekly, or at custom intervals

✅ 5. Set Result Limits

In the "Limit1" node:

- Default: processes 10 jobs per run
- Adjust the maxItems value to get more or fewer results

✅ 6. Configure Wait Time (Optional)

The "Wait2" node adds a 10-second delay between requests to avoid rate limiting:

- Default: 10 seconds
- Increase if you're getting blocked by LinkedIn
- Decrease for faster processing (not recommended)

How It Works

1. Trigger: runs automatically every day at 5am
2. Search: queries LinkedIn with your specified criteria
3. Parse: extracts job title, company, location, and URL from search results
4. Filter: removes any jobs with missing critical information
5. Wait: delays between requests to avoid rate limiting
6. Fetch Details: retrieves full job descriptions and poster information
7. Save: adds each job to your Notion database
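To show how the criteria above fit together, here is a sketch of assembling them into a LinkedIn jobs search URL. The f_TPR and sortBy values come from the description; the base URL and parameter handling are an illustrative assumption, not the workflow's exact request:

```javascript
// Build a LinkedIn jobs search URL from the "Set Search Criteria" fields.
function buildSearchUrl({ keywords, location, f_TPR = 'r86400', sortBy = 'DD' }) {
  const params = new URLSearchParams({
    keywords,
    location,
    f_TPR,   // r86400 = last 24 hours, r604800 = last week, r2592000 = last month
    sortBy,  // DD = most recent first, R = most relevant first
  });
  return `https://www.linkedin.com/jobs/search/?${params.toString()}`;
}
```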
by Budi SJ
Automated Brand DNA Generator Using JotForm, Google Search, AI Extraction & Notion

The Brand DNA Generator workflow automatically scans and analyzes online content to build a company’s Brand DNA profile. It starts with input from a form, then crawls the company’s website and Google search results to gather relevant information. Using AI-powered extraction, the system identifies insights such as value propositions, ideal customer profiles (ICP), pain points, proof points, brand tone, and more. All results are neatly formatted and automatically saved to a Notion database as a structured Brand DNA report, eliminating the need for manual research.

🛠️ Key Features

- Automated data capture: collects company data directly from form submissions and Google search results.
- AI-powered insight extraction: uses LLMs to extract and summarize brand-related information from website content.
- Fetches clean text from multiple web pages using HTTP requests and a content extractor.
- Merges extracted data from multiple sources into a single Brand DNA JSON structure.
- Automatically creates a new page in Notion with formatted sections (headings, paragraphs, and bullet points).
- Handles parsing failures and processes multiple pages efficiently in batches.

🔧 Requirements

- JotForm API key, to capture company data from form submissions.
- SerpAPI key, to perform automated Google searches.
- OpenRouter / LLM API, for AI-based language understanding and information extraction.
- Notion integration token & database ID, to save the final Brand DNA report to Notion.

🧩 Setup Instructions

1. Connect your JotForm account and select the form containing the fields Company Name and Company Website.
2. Add your SerpAPI key.
3. Configure the AI model using OpenRouter or another LLM provider.
4. Enter your Notion credentials and specify the databaseId in the Create a Database Page node.
5. Optionally, customize the prompt in the Information Extractor node to modify the tone or structure of the AI analysis.
6. Activate the workflow, then submit data through the JotForm to test automatic generation and Notion integration.

💡 Final Output

A complete Brand DNA Report containing:

- Company Description
- Ideal Customer Profile
- Pain Points
- Value Proposition
- Proof Points
- Brand Tone
- Suggested Keywords

All generated automatically from the company’s online presence and stored in Notion with no manual input required.
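The merge step ("merges extracted data from multiple sources into a single Brand DNA JSON structure") can be sketched like this. The field names follow the report sections listed above, but the merge rules (first non-empty description wins, list fields deduplicated) are an assumption:

```javascript
// Combine per-page AI extractions into one Brand DNA object,
// keeping the first description found and deduplicating list fields.
function mergeBrandDna(extractions) {
  const merged = {
    companyDescription: '',
    painPoints: [],
    proofPoints: [],
    suggestedKeywords: [],
  };
  for (const ex of extractions) {
    if (!merged.companyDescription && ex.companyDescription) {
      merged.companyDescription = ex.companyDescription;
    }
    for (const key of ['painPoints', 'proofPoints', 'suggestedKeywords']) {
      merged[key] = [...new Set([...merged[key], ...(ex[key] || [])])];
    }
  }
  return merged;
}
```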