by Abdul Mir
## Overview

Impress your leads with ultra-personalized "thank you" emails that look hand-written — sent automatically seconds after they submit your intake form. This workflow instantly scrapes the prospect's website, extracts meaningful copy, and uses AI to write a custom thank-you message referencing something specific from their site. It gives the impression you immediately reviewed their business and crafted a thoughtful reply — without lifting a finger.

## Who's it for

- Agencies and consultants using intake forms
- Freelancers booking discovery calls
- B2B businesses that want high-touch first impressions
- Sales teams automating initial follow-ups

## How it works

1. Triggered when a form (e.g. Tally, Typeform) is submitted
2. Scrapes the website URL provided in the form
3. Converts HTML to Markdown and extracts plain copy (see the sketch below)
4. Uses AI to write a personalized thank-you message referencing the site
5. Waits briefly to simulate a real typing delay
6. Sends the message via Gmail (or any email provider)

## Example use case

> Prospect submits a form with their website: coolstartup.ai
>
> 30 seconds later, they receive:
>
> "Thanks for reaching out! I just checked out Cool Startup's homepage — love the clean UX and mission around AI for teams. Looking forward to diving into how we might collaborate!"

## How to set up

1. Connect your form tool (e.g. Tally or Typeform)
2. Connect Gmail or another email provider
3. Customize the AI prompt to match your tone
4. Set the wait time (e.g. 30 seconds) for a realistic delay
5. Update the website scraping logic if needed

## Requirements

- Form tool with webhook support
- OpenAI (or other LLM) credentials
- Email sending integration (Gmail, Mailgun, Postmark, etc.)

## How to customize

- Edit the email tone (casual, formal, funny, etc.)
- Add a CRM integration to log the form submission and response
- Trigger additional workflows like lead scoring or Slack alerts
- Add fallback logic if the website doesn't scrape cleanly
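For reference, a minimal sketch of the "extract plain copy" step in an n8n Code node; the `html` input field is an assumption, so adjust it to whatever your scraper node actually outputs.

```javascript
// A minimal sketch of the "extract plain copy" step, assuming the scraper
// returned raw HTML in item.json.html (that field name is a guess).
// Paste into an n8n Code node (mode: Run Once for All Items), where
// `items` is the node's input.
return items.map((item) => {
  const html = item.json.html || '';
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, '') // drop scripts
    .replace(/<style[\s\S]*?<\/style>/gi, '')   // drop stylesheets
    .replace(/<[^>]+>/g, ' ')                   // strip remaining tags
    .replace(/&nbsp;/g, ' ')                    // decode a common entity
    .replace(/\s+/g, ' ')                       // collapse whitespace
    .trim()
    .slice(0, 4000); // keep the LLM prompt small
  return { json: { ...item.json, plainCopy: text } };
});
```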
by kote2
This workflow allows a LINE user to send either text or an image of food to a connected LINE bot. If text is sent, the AI agent responds directly via LINE. If an image is sent, the workflow downloads it from LINE's API, analyzes it using OpenAI's Vision model, estimates calories (only if the image contains food), and formats the result into JSON (see the sketch below). Detected dishes and calories are appended to a Google Sheet, and a confirmation message is sent back to the user via LINE.

## Key Features

- Integrates the LINE Messaging API webhook with n8n
- Uses OpenAI Vision to detect food and estimate calories
- Automatically logs results into Google Sheets
- Sends real-time feedback to the LINE user

## How to use

1. Set up a LINE Messaging API channel and get your channel access token.
2. Add your OpenAI API credentials in n8n.
3. Replace the placeholders for {channel access token}, {your id}, and the Google Sheet IDs with your own.
4. Activate the workflow and send a food image or text message to your LINE bot.
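To illustrate the JSON step, here is a hypothetical shape the Vision model could be asked to return; the template's actual field names may differ, so treat this as a sketch only.

```javascript
// A hypothetical example of the JSON the Vision step might return;
// field names are assumptions. One row per detected dish is then
// appended to the Google Sheet.
const example = {
  isFood: true, // false => skip logging and reply accordingly
  dishes: [
    { name: 'grilled salmon', estimatedCalories: 230 }, // kcal, rough estimate
    { name: 'rice bowl', estimatedCalories: 300 },
  ],
  totalCalories: 530,
};
console.log(example.totalCalories); // 530
```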
by Lakindu Siriwardana
This visual workflow is an AI-powered automated CV filtering system built with tools like n8n, Google Drive, Google Sheets, and Ollama (LLM).

## ⚙️ Key Features

- 📂 **Google Drive Integration** – Automatically searches and downloads CVs (PDF/DOCX/PPTX) from a shared folder.
- 📋 **Criteria Matching** – Reads and applies filtering rules from a Google Sheet.
- 🧠 **LLM-Based Analysis** – Uses a Large Language Model (Ollama) to assess and interpret CV content.
- 🧪 **Smart Parsing** – Includes structured and auto-fixing output parsers to ensure data accuracy.
- 📊 **Automated Results Output** – Writes matching candidates and analysis to a Google Sheet.
- 🔁 **Loop and Aggregate Logic** – Handles multiple CVs with iterative processing and aggregation.
- 🚀 **No-Code Automation with n8n** – Fully visual and modifiable without programming.

## 🛠️ How It Works

1. **Trigger**: The workflow is initiated via a webhook (from a UI "Start Workflow" button).
2. **CV Search**: Searches for CV files in a designated Google Drive folder.
3. **Loop Over Files**: Each file is downloaded and its text is extracted (from PDFs or other formats).
4. **Criteria Input**: Matching rules are fetched from a predefined Google Sheet.
5. **Merge & Aggregate**: Combines file text and criteria for unified processing.
6. **LLM Processing**: Text and criteria are sent to the Basic LLM Chain, which uses the Ollama model for language understanding; structured and auto-fixing output parsers improve reliability (see the example below).
7. **Custom Code Execution**: Optionally enriches or reformats the data.
8. **Output**: Results are appended to a shared Google Sheet (the output sheet).
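As an illustration of the parsing step, the Structured Output Parser can be given a JSON example like the one below; these fields are assumptions for illustration, not the template's actual schema.

```javascript
// A hypothetical example of the structured output the LLM chain could
// be asked to return per CV; the template's actual schema may differ.
// n8n's Structured Output Parser accepts a JSON example like this, and
// the Auto-fixing Output Parser retries when the LLM's reply fails to parse.
const exampleOutput = {
  candidateName: 'Jane Doe',
  matchesCriteria: true,
  score: 8, // 0-10 fit against the sheet's rules
  matchedRules: ['3+ years Python', 'BSc or higher'],
  failedRules: [],
  summary: 'Strong backend background; meets all mandatory criteria.',
};
console.log(JSON.stringify(exampleOutput, null, 2));
```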
by Parth Pansuriya
# Fetch Property Listings from 99Acres & MagicBricks with Apify and Google Sheets

## Who's it for

Users who want to automatically fetch and organize property listings from 99Acres and MagicBricks into Google Sheets without manual copying.

## How it works / What it does

1. Users submit search URLs via a form.
2. The workflow uses Apify scrapers to fetch listings from 99Acres & MagicBricks.
3. Data is cleaned, standardized (ID, Title, Price, Price per Sqft, URL), and deduplicated (see the sketch below).
4. Listings are automatically appended to their respective Google Sheets tabs.

## How to set up

1. Connect your Google Sheets account in all Google Sheets nodes.
2. Open the form trigger and submit valid search URLs.
3. Run the workflow or submit the form live.
4. A new spreadsheet is created and populated automatically.

## Requirements

- Google Sheets account
- Apify API key for the 99Acres & MagicBricks scrapers
- Valid property search URLs

## How to customize the workflow

- Change the sheet names or spreadsheet title in the "Create Master Spreadsheet" node.
- Adjust API parameters in the HTTP Request nodes (like max retries or proxy settings).
- Modify the Code nodes to include additional fields or filters.
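A rough sketch of what the clean/standardize/dedupe Code node can look like, using hypothetical raw field names; the actual Apify output keys may differ.

```javascript
// A sketch of the clean/standardize/dedupe step, assuming scraped items
// arrive with raw fields like id, title, price, pricePerSqft, and url
// (assumed names). `items` is the n8n Code node's input.
const seen = new Set();
const rows = [];
for (const item of items) {
  const raw = item.json;
  const id = String(raw.id || raw.url || '');
  if (!id || seen.has(id)) continue; // skip duplicates
  seen.add(id);
  rows.push({
    json: {
      ID: id,
      Title: String(raw.title || '').trim(),
      Price: raw.price ?? null,
      'Price per Sqft': raw.pricePerSqft ?? null, // assumed field name
      URL: raw.url || '',
    },
  });
}
return rows;
```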
by Robert Breen
This workflow pulls all tasks from your Monday.com board each day and logs them into a Google Sheet. It creates a daily snapshot of your project's progress and statuses for reporting, tracking, or analysis.

## ⚙️ Setup Instructions

### 1️⃣ Connect Monday.com API

1. In Monday.com → go to Admin → API
2. Copy your Personal API Token (docs: Generate Monday API Token)
3. In n8n → Credentials → New → Monday.com API → paste your token and save

### 2️⃣ Prepare Your Google Sheet

1. Copy this template to your own Google Drive: Google Sheet Template
2. Add your data in rows 2–100 and make sure each new task row starts with Added = No.
3. Connect Google Sheets in n8n: go to n8n → Credentials → New → Google Sheets (OAuth2), log in with your Google account, and grant access.
4. In the workflow, select your Spreadsheet ID and the correct Sheet Name.

## 🧠 How it works

- **Trigger**: Runs on click or via schedule (e.g., daily at 9 AM).
- **Get many items (Monday.com)**: Fetches all tasks and their current status.
- **Today's Date node**: Adds the current date for snapshot logging.
- **Map Fields**: Normalizes the task name and status (see the sketch below).
- **Google Sheets (Append)**: Saves all tasks with status and date into your sheet for historical tracking.

## 📬 Contact

Need help customizing this (e.g., filtering by status, emailing daily reports, or adding charts)?

📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
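For orientation, a sketch of the "Map Fields" step; the `status` column id is an assumption and may differ on your board.

```javascript
// A sketch of the "Map Fields" step. Monday.com items expose a name and
// column_values; the 'status' column id below is an assumption, so check
// your board's actual column id. `items` is the Code node's input.
const today = new Date().toISOString().slice(0, 10); // e.g. 2024-05-01
return items.map((item) => {
  const task = item.json;
  const statusColumn = (task.column_values || []).find(
    (c) => c.id === 'status', // assumed column id
  );
  return {
    json: {
      Date: today,
      Task: task.name || '',
      Status: (statusColumn && statusColumn.text) || 'Unknown',
    },
  };
});
```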
by automedia
# Transcribe New YouTube Videos and Save to Supabase

## Who's It For?

This workflow is for content creators, marketers, researchers, and anyone who needs to quickly get text transcripts from YouTube videos. If you analyze video content, repurpose it for blogs or social media, or want to make videos searchable, this template will save you hours of manual work.

## What It Does

This template automatically monitors multiple YouTube channels for new videos. When a new video is published, it extracts the video ID, retrieves the full transcript using the youtube-transcript.io API, and saves the structured data — including the title, author, URL, and full transcript — into a Supabase table. It intelligently filters out YouTube Shorts by default and includes error handling to ensure that only successful transcriptions are processed.

## Requirements

- A Supabase account with a table ready to receive the video data.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track. You can find this using a free online tool like TunePocket's Channel ID Finder.

## How to Set Up

1. **Add Channel IDs**: In the "Channels To Track" node, replace the example YouTube Channel IDs with your own. The workflow uses these IDs to create RSS links and find new videos (see the sketch below).
2. **Configure API Credentials**: Find the "youtube-captions" HTTP Request node. In the credentials tab, create a new "Header Auth" credential. Name it youtube-transcript-io and paste your API key into the "Value" field. The "Name" field should be x-api-key.
3. **Connect Your Supabase Account**: Navigate to the "Add to Content Queue Table" node. Create new credentials for your Supabase account using your Project URL and API key. Once connected, select your target table and map the incoming fields (title, source_url, content_snippet, etc.) to the correct columns in your table.
4. **Set Your Schedule (Optional)**: The workflow starts with a manual trigger. To run it automatically, replace the "When clicking 'Execute workflow'" node with a Schedule node and set your desired interval (e.g., once a day).
5. **Activate the Workflow**: Save your changes and toggle the workflow to Active in the top right corner.

## How to Customize

- **Transcribe YouTube Shorts**: To include Shorts in your workflow, select the **"Does url exist?"** If node and delete the second condition that checks for youtube.com/shorts.
- **Change Your Database**: Don't use Supabase? Simply replace the **"Add to Content Queue Table"** node with another database or spreadsheet node, such as **Google Sheets**, **Airtable**, or n8n's own Table.
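The RSS links mentioned in setup step 1 follow YouTube's standard public feed format; this small sketch (with helper names of my own choosing) shows the idea, including pulling the 11-character video ID out of a watch URL.

```javascript
// A sketch of turning a Channel ID into its RSS feed URL and extracting
// the video ID from a watch URL. The feed endpoint is YouTube's standard
// public RSS feed; the helper names are hypothetical.
function channelRssUrl(channelId) {
  return `https://www.youtube.com/feeds/videos.xml?channel_id=${channelId}`;
}

function extractVideoId(url) {
  // Matches youtube.com/watch?v=XXXX and youtu.be/XXXX links (11-char IDs)
  const match = url.match(/(?:v=|youtu\.be\/)([\w-]{11})/);
  return match ? match[1] : null;
}

console.log(channelRssUrl('UC_x5XG1OV2P6uZZ5FSM9Ttw'));
console.log(extractVideoId('https://www.youtube.com/watch?v=dQw4w9WgXcQ')); // "dQw4w9WgXcQ"
```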
by furuidoreandoro
# Automated TikTok Real Estate Research for Couples

This workflow automates the process of finding real estate (rental) videos on TikTok, filtering them for a specific target audience (couples in their 20s), generating an explanation of why they are recommended, and saving the results to Google Sheets and Slack.

## Who's it for

- **Real Estate Agents & Marketers**: To research trending rental properties and video styles popular on social media.
- **Content Curators**: To automatically gather and summarize niche content from TikTok.
- **House Hunters**: To automate the search for "rental" videos tailored to couples.

## How it works / What it does

1. **Trigger**: The workflow starts manually (on click).
2. **Scrape TikTok**: Connects to Apify to run a TikTok Scraper. It searches for videos with the hashtag 賃貸 (Rental) and retrieves metadata.
3. **Filter & Extract (AI Agent 1)**: An AI Agent (using OpenRouter) analyzes the retrieved video data to select properties suitable for couples in their 20s and outputs the video URL.
4. **Generate Insights (AI Agent 2)**: A second AI Agent reviews the URL/content and generates a specific reason why the property is recommended for the target audience, formatting the output with the URL and explanation.
5. **Save to Database**: The final text (URL + reason) is appended to a Google Sheet.
6. **Notify Team**: The same recommendation text is sent to a specific Slack channel to alert the user.

## Requirements

- **n8n**: Version 1.0 or later.
- **Apify account**: You need an API token and access to the clockworks/tiktok-scraper actor.
- **OpenRouter account**: An API key to use Large Language Models (LLMs) for the AI Agents.
- **Google Cloud Platform**: A project with the Google Sheets API enabled and OAuth credentials.
- **Slack workspace**: Permission to add apps/bots to a channel.

## How to set up

1. **Import the workflow**: Copy the JSON code and paste it into your n8n editor.
2. **Configure credentials**:
   - Apify: Create a new credential in n8n using your Apify API token.
   - OpenRouter: Create a new credential using your OpenRouter API key.
   - Google Sheets: Connect your Google account via OAuth2.
   - Slack: Connect your Slack account via OAuth2.
3. **Configure nodes**:
   - Google Sheets node: Select your specific Spreadsheet and Sheet from the dropdown lists (replace the placeholders YOUR_SPREADSHEET_ID etc. if they don't update automatically).
   - Slack node: Select the channel where you want to receive notifications (replace YOUR_CHANNEL_ID).
4. **Test**: Click "Execute Workflow" to run a test.

## How to customize the workflow

- **Change the search topic**: Open the Apify node and change the hashtags value in the "Custom Body" JSON (e.g., change "賃貸" to "DIY" or "Travel"); a sketch of this body appears below.
- **Adjust the persona**: Open the AI Agent nodes and modify the text prompt. You can change the target audience from "20s couples" to "students" or "families."
- **Increase volume**: In the Apify node, increase resultsPerPage or maxProfilesPerQuery to process more videos at once (note: this will consume more API credits).
- **Change the output format**: Modify the Google Sheets node to map specific fields (like video title, author, likes) into separate columns instead of just one raw output string.
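A hedged sketch of what the "Custom Body" request could look like; confirm the exact input schema against the clockworks/tiktok-scraper actor's documentation, since the keys here are taken from this description rather than the actor docs.

```javascript
// A sketch of the "Custom Body" JSON sent to the Apify actor; verify the
// input schema against clockworks/tiktok-scraper's docs. The token below
// is a placeholder.
const apifyBody = {
  hashtags: ['賃貸'], // change to 'DIY', 'Travel', etc.
  resultsPerPage: 20, // more results = more API credits
  maxProfilesPerQuery: 10,
};

const url =
  'https://api.apify.com/v2/acts/clockworks~tiktok-scraper/run-sync-get-dataset-items?token=<APIFY_TOKEN>';

fetch(url, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(apifyBody),
})
  .then((r) => r.json())
  .then((videos) => console.log(`fetched ${videos.length} videos`));
```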
by higashiyama
# Advanced Code Review Automation (AI + Lint + Slack)

## Who's it for

For software engineers, QA teams, and tech leads who want to automate intelligent code reviews with both AI-driven suggestions and rule-based linting — all managed in Google Sheets with instant Slack summaries.

## How it works

This workflow performs a two-layer review:

1. **Lint Check**: Runs a lightweight static analysis to find common issues, e.g., use of var, console.log, unbalanced braces (see the sketch below).
2. **AI Review**: Sends valid code to Gemini AI, which provides human-like review feedback with severity classification (Critical, Major, Minor) and visual highlights (red/orange tags).
3. **Formatter**: Combines lint and AI results, calculating an overall score (0–10).
4. **Aggregator**: Summarizes results for quick comparison.
5. **Google Sheets Writer**: Appends results to your review log.
6. **Slack Notification**: Posts a concise summary (e.g., number of issues and average score) to your team's channel.

## How to set up

1. Connect Google Sheets and Slack credentials in n8n.
2. Replace the placeholders (<YOUR_SPREADSHEET_ID>, <YOUR_SHEET_GID_OR_NAME>, <YOUR_SLACK_CHANNEL_ID>).
3. Adjust the AI review prompt or lint rules as needed.
4. Activate the workflow — reviews start automatically whenever new code is added to the sheet.

## Requirements

- Google Sheets and Slack integrations enabled
- A configured AI node (Gemini, OpenAI, or compatible)
- Proper permissions to write to your target Google Sheet

## How to customize

- Add more linting rules (naming conventions, spacing, forbidden APIs)
- Extend the AI prompt for project-specific guidelines
- Customize the Slack message formatting
- Export analytics to a dashboard (e.g., Notion or Data Studio)

## Why it's valuable

This workflow brings realistic, team-oriented AI-assisted code review to n8n — combining the speed of automated linting with the nuance of human-style feedback. It saves time, improves code quality, and keeps your team's review history transparent and centralized.
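A minimal sketch of the lint layer, covering only the three checks named above; the template's actual rules may go further.

```javascript
// A minimal sketch of the lint-check step, assuming the code under
// review arrives as a string (the field name depends on your sheet's
// columns). It flags the same issue classes the description mentions.
function lintSnippet(code) {
  const issues = [];
  if (/\bvar\s+\w+/.test(code)) issues.push('uses `var` (prefer let/const)');
  if (/console\.log\(/.test(code)) issues.push('leftover console.log');
  // Cheap brace-balance check: compare counts of { and }
  const opens = (code.match(/{/g) || []).length;
  const closes = (code.match(/}/g) || []).length;
  if (opens !== closes) issues.push(`unbalanced braces (${opens} vs ${closes})`);
  return issues;
}

console.log(lintSnippet('var x = 1; console.log(x); {'));
// => [ 'uses `var` (prefer let/const)', 'leftover console.log',
//      'unbalanced braces (1 vs 0)' ]
```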
by Matthew
# Instagram Hashtag Lead Generation

Automate the process of finding and qualifying Instagram leads based on hashtags. This workflow reads hashtags from Google Sheets, scrapes Instagram for posts using Apify, analyzes caption content and language, compiles unique usernames, gathers detailed user info, and filters leads based on follower count.

## How It Works

1. **Fetch Hashtags**: The workflow starts and pulls a list of hashtags from a Google Sheet.
2. **Scrape Instagram Posts**: For each hashtag, it builds Instagram explore URLs and scrapes posts using Apify.
3. **Analyze Captions**: Each caption is cleaned, hashtags and links are removed, and language/content is analyzed for English, French, or Spanish (see the sketch below).
4. **Extract & Filter Usernames**: Usernames are combined and deduplicated, and their Instagram profiles are scraped for follower counts and other details.
5. **Qualified Leads**: Only users with followers in your target range are kept as qualified leads for outreach or analysis.

## Requirements

- An n8n instance.
- Apify API key.
- Apify Instagram Scraper actor access.
- Google account with a Sheet containing hashtags (the sheet should have a column with hashtags).

## Setup Instructions

1. **Add credentials**: In n8n, add your Apify API key and Google Sheets credentials.
2. **Configure Google Sheets nodes**: Choose your credentials, spreadsheet, and sheet name in the "Get list of Hashtags" node.
3. **Configure Apify request nodes**: Enter your Apify API key and select the Instagram scraper actors.
4. **Adjust filtering**: Edit the min/max follower count in the relevant filter node to match your needs.
5. **Test & run**: Manually execute the workflow or add a trigger to run on a schedule.

## Customization Options 💡

- **Trigger**: Add a schedule or webhook to automate execution.
- **Language filtering**: Modify the keyword lists or add language-detection logic.
- **Lead output**: Extend the workflow to save leads to a CRM or send notifications.

## Details

Nodes used in the workflow: Manual Trigger, Google Sheets, Code, HTTP Request (Apify), IF (Conditional), Aggregate, Remove Duplicates, Sticky Note.
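The caption cleaning and language check from step 3 could be sketched like this; the keyword lists are illustrative stand-ins for the template's own.

```javascript
// A sketch of caption cleaning plus a naive keyword-based language check
// (English/French/Spanish). The keyword lists are assumptions; the
// template's actual lists will differ.
function cleanCaption(caption) {
  return caption
    .replace(/https?:\/\/\S+/g, '')   // strip links
    .replace(/#[\p{L}\p{N}_]+/gu, '') // strip hashtags
    .replace(/\s+/g, ' ')
    .trim();
}

function guessLanguage(text) {
  const t = ` ${text.toLowerCase()} `;
  const hits = (words) => words.filter((w) => t.includes(` ${w} `)).length;
  const scores = {
    en: hits(['the', 'and', 'with', 'for']),
    fr: hits(['le', 'la', 'et', 'pour']),
    es: hits(['el', 'una', 'y', 'para']),
  };
  const [lang, score] = Object.entries(scores).sort((a, b) => b[1] - a[1])[0];
  return score > 0 ? lang : 'unknown';
}

console.log(guessLanguage(cleanCaption('New flat for rent! #paris https://x.co')));
// => 'en'
```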
by RamK
# Phishing Lookout (Typosquatting) and Brand Domain Monitor

This workflow monitors SSL certificate logs to find and scan new domains that might be impersonating your brand.

## Background

In modern cybersecurity, brand impersonation (or "typosquatting") is common in phishing attacks. Attackers register domains that look nearly identical to a trusted brand — such as input-n8n.io or n8n.i0 instead of the legitimate n8n.io — to deceive users into revealing sensitive credentials or downloading malware.

## How it works

1. **Monitor**: Checks crt.sh every hour for new SSL certificates matching your brand keywords.
2. **Process**: Uses a Split Out node to handle multi-domain certificates and a Filter node to ignore your own legitimate domains, keeping only the most recent certificates.
3. **Scan**: Automatically sends suspicious domains to Urlscan.io for a headless browser scan and screenshot.
4. **Loop & Triage**: Implements a 30-second Wait inside the loop so each scan can finish before the results are fetched.
5. **Alert**: Sends a Slack message with the domain name, the report link, and a screenshot of the suspicious site (for example, one mimicking your login page), flagging a potential phishing case.

## Setup Steps

1. **Credentials**: Connect your Urlscan.io API key and Slack bot token.
2. **Configuration**: Update the "Poll crt.sh" node. In the URL https://crt.sh/?q=%.testdomain.com&output=json, replace %.testdomain.com with your specific brand domain (e.g., %.yourbrand.com).
3. **Whitelist**: Add your real domains to the myDomains list in the Filter & Deduplicate Code node to prevent false alerts (see the sketch below). For testing, you can instead leave your own domain off the list to check how the workflow behaves and what it outputs; in that case your domain and subdomains will, of course, also be flagged as suspicious in the Slack alerts.
4. **Looping**: Ensure the Alert Slack node output is connected back to the Split In Batches input so all found domains are processed.
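A sketch of that Filter & Deduplicate node; crt.sh's JSON output carries certificate names in a newline-separated name_value field, and everything else here is an assumption about how the template structures the check.

```javascript
// A sketch of the Filter & Deduplicate Code node. crt.sh returns one
// certificate name per line in the name_value field; `items` is the
// node's input. Put every legitimate domain in myDomains.
const myDomains = ['yourbrand.com']; // your whitelist
const seen = new Set();
const suspicious = [];
for (const item of items) {
  const names = String(item.json.name_value || '').split('\n');
  for (const raw of names) {
    const domain = raw.trim().toLowerCase().replace(/^\*\./, '');
    if (!domain || seen.has(domain)) continue; // deduplicate
    seen.add(domain);
    // Skip our own domains and their subdomains
    const isOurs = myDomains.some(
      (d) => domain === d || domain.endsWith(`.${d}`),
    );
    if (!isOurs) suspicious.push({ json: { domain } });
  }
}
return suspicious;
```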
by Robert Breen
This workflow pulls deals from Pipedrive, categorizes them by stage, and logs them into a Google Sheet for reporting and tracking.

## ⚙️ Setup Instructions

### 1️⃣ Connect Pipedrive

1. In Pipedrive → Personal preferences → API → copy your API token (URL shortcut: https://{your-company}.pipedrive.com/settings/personal/api)
2. In n8n → Credentials → New → Pipedrive API
   - Company domain: {your-company} (the subdomain in your Pipedrive URL)
   - API Token: paste the token from step 1 → Save
3. In the Pipedrive Tool node, select your Pipedrive credential and (optionally) set filters (e.g., owner, label, created time).

### 2️⃣ Prepare Your Google Sheet

1. Connect your data in Google Sheets using this format: Sample Sheet (row 1 = column names)
2. In n8n, create Google Sheets (OAuth2) credentials.
3. Log in with your Google account and select your Spreadsheet + Worksheet.

## 🧠 How it works

- **Get many deals (Pipedrive)**: Fetches all deals with stage IDs.
- **Categorize Stages**: Maps stage IDs → friendly stage names (Prospecting, Qualified, Proposal, Negotiation, Closed Won); see the sketch below.
- **Today's Date**: Adds a date stamp to each run.
- **Set Fields**: Combines stage, deal name, and date into clean columns.
- **Google Sheets (Append)**: Writes all rows to your reporting sheet.

## 📬 Contact

Need help customizing this (e.g., pulling only active deals, calculating win rates, or sending dashboards)?

📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
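The stage mapping could be sketched as follows; the numeric stage IDs are placeholders for your pipeline's real ones.

```javascript
// A sketch of the "Categorize Stages" step. The numeric stage IDs are
// placeholders; look up your pipeline's real stage IDs in Pipedrive and
// adjust the map. `items` is the Code node's input.
const stageNames = {
  1: 'Prospecting',
  2: 'Qualified',
  3: 'Proposal',
  4: 'Negotiation',
  5: 'Closed Won',
};

return items.map((item) => ({
  json: {
    deal: item.json.title, // deal name
    stage: stageNames[item.json.stage_id] || 'Unknown',
  },
}));
```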
by SOLOVIEVA ANNA
## Who this workflow is for

This template is for teams who want a lightweight "daily icebreaker" in Slack and creators who'd like to build a reusable trivia database over time. It works well for remote teams, communities, and any workspace that enjoys a quick brain teaser each day.

## What this workflow does

The workflow fetches a random multiple-choice question from the Open Trivia Database (OpenTDB), posts a nicely formatted trivia message to a Slack channel, and logs the full question and answers into a Google Sheets spreadsheet. Over time, this creates a searchable "trivia archive" you can reuse for quizzes, content, or community events.

## How it works

1. A Schedule Trigger runs once per day at a time you define.
2. A Set node randomly chooses a difficulty level (easy, medium, or hard).
3. A Switch node routes to the matching OpenTDB HTTP request (see the sketch below).
4. Each branch normalizes the API response into common fields (timestamp, date, difficulty, category, question, correct, incorrect, messageTitle, messageBody).
5. A Merge node combines the three branches into a single stream.
6. Slack posts the trivia message.
7. Google Sheets appends the same data as a new row.

## How to set up

1. Connect your Slack OAuth2 credentials and choose a target channel.
2. Connect your Google Sheets credentials and select the spreadsheet and sheet.
3. Adjust the schedule (time and frequency) to match your use case.

## How to customize

- Change the Slack message format (for example, add emojis or hints).
- Filter categories or difficulty levels instead of picking them fully at random.
- Add additional logging (e.g., user reactions, answer stats) in Sheets or another datastore.
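The OpenTDB request behind each Switch branch looks essentially like this sketch; the api.php endpoint with amount/difficulty/type parameters is OpenTDB's documented API, and the fields read below match its standard JSON response.

```javascript
// A sketch of the per-difficulty OpenTDB call that the three Switch
// branches perform via HTTP Request nodes.
const difficulty = ['easy', 'medium', 'hard'][Math.floor(Math.random() * 3)];
const url = `https://opentdb.com/api.php?amount=1&difficulty=${difficulty}&type=multiple`;

const res = await fetch(url);
const data = await res.json();
const q = data.results[0];
console.log({
  difficulty,
  category: q.category,
  question: q.question,           // HTML-encoded; decode before posting
  correct: q.correct_answer,
  incorrect: q.incorrect_answers, // array of three wrong answers
});
```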