by Frankie Wong
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n workflow template helps you automatically convert unstructured contact information—such as customer details copied from emails, web forms, or chat messages—into clean, structured JSON using an AI agent.

What It Does:
- Accepts unstructured contact data via a Webhook (as form-data under the key `prompt`)
- Uses AI to intelligently extract key fields such as: Company Name, First Name, Last Name, Address, City, Country, Phone, Fax, Email
- Parses and formats the extracted data into a valid JSON object
- Prepares the output for seamless integration into systems like Dolibarr, other ERP/CRM platforms, or any service that consumes JSON via API or webhook

Use Cases:
- Automate manual data entry from emails into your ERP system
- Clean and normalize contact data from various input sources
- Reduce human error in your customer onboarding workflows

This template saves you time and ensures consistency across your business systems. Simply connect your systems and let the automation handle the rest.
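For reference, here is a minimal sketch of what a client call and the resulting JSON might look like. The webhook path `contact-extract` and the snake_case output keys are illustrative assumptions, not fixed by the template:

```javascript
// Hypothetical client call: POST unstructured text to the n8n webhook
// as form-data under the key "prompt" (the URL path is an assumption).
const form = new FormData();
form.append(
  "prompt",
  "Hi, this is Jane Doe from Acme GmbH, Hauptstr. 5, Berlin, Germany. " +
  "Reach me at +49 30 123456 or jane.doe@acme.example."
);

const res = await fetch("https://your-n8n-host/webhook/contact-extract", {
  method: "POST",
  body: form,
});

console.log(await res.json());
// Expected shape (field names follow the template description, keys assumed):
// {
//   "company_name": "Acme GmbH",
//   "first_name": "Jane",
//   "last_name": "Doe",
//   "address": "Hauptstr. 5",
//   "city": "Berlin",
//   "country": "Germany",
//   "phone": "+49 30 123456",
//   "fax": null,
//   "email": "jane.doe@acme.example"
// }
```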
by Isight
Dental Clinic Automation: Scheduling, Availability & Patient Lookup

This workflow automates dental appointment management through a phone-based assistant. It listens for requests like booking, rescheduling, canceling, checking insurance, looking up appointments, and finding available time slots. Each request is processed through a Switch node and then routed to your Supabase database for action.

How it works
Once a request is received, the workflow uses the patient's phone number to identify them. Then, it:
- Booking: Checks for available time, creates or retrieves the patient record, and stores the appointment.
- Rescheduling: Confirms the new date, avoids double-booking, and updates the record.
- Canceling: Removes the appointment and sends a confirmation.
- Insurance: Looks up the member ID and provides a status (accepted or not).
- Availability: Finds the doctor's existing appointments and generates available 60-minute slots.
- Appointment & doctor lists: Retrieves and presents clean, structured information for the assistant.

Each action ends with a webhook response that the phone system reads back to the patient.

Setup steps
1. Add your Supabase credentials to the Supabase nodes.
2. Connect your phone/voice system to the webhook URL.
3. Ensure Supabase table and column names match the workflow.
4. Test all actions (booking, rescheduling, canceling, etc.) before going live.

Customization tips (optional)
You can update working hours, appointment durations, or add new services by modifying the availability logic or Switch node routing.
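To illustrate the availability step, here is a minimal n8n Code-node sketch of 60-minute slot generation. The 9:00–17:00 working hours and the start_time/end_time column names are assumptions, not the workflow's actual Supabase schema:

```javascript
// Hypothetical n8n Code-node sketch: generate free 60-minute slots for one day.
// Working hours (9:00-17:00 UTC) and the input field names are assumptions.
const booked = $input.all().map(i => ({
  start: new Date(i.json.start_time),
  end: new Date(i.json.end_time),
}));

const day = new Date("2025-01-15T00:00:00Z"); // the day being checked
const slots = [];
for (let hour = 9; hour < 17; hour++) {
  const start = new Date(day); start.setUTCHours(hour, 0, 0, 0);
  const end = new Date(day);   end.setUTCHours(hour + 1, 0, 0, 0);
  // keep the slot only if it overlaps no existing appointment
  const taken = booked.some(b => start < b.end && end > b.start);
  if (!taken) slots.push({ start: start.toISOString(), end: end.toISOString() });
}
return slots.map(s => ({ json: s }));
```

Adjusting the loop bounds or the one-hour step is how you would change working hours or appointment durations, per the customization tip above.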
by Gilbert Onyebuchi
This workflow automatically discovers developers on GitHub, enriches their data with email addresses, removes duplicates, and saves everything into a structured Google Sheets CRM. No manual searching, copying, or data cleaning required. It's perfect for recruiting teams, SaaS founders, agencies, and outbound marketers who need fresh developer leads every day without spending hours on GitHub.

How It Works
This automation is divided into 3 clear stages:
1. Find Developers on GitHub. The workflow runs on a schedule (daily/hourly).
2. Enrich Developer Data with Emails. The workflow checks if a developer already has an email. If not, it automatically uses Hunter.io to find a professional email address.
3. Remove Duplicates & Save to Google Sheets.

What You Get
- Automatic developer sourcing
- Email enrichment using Hunter.io
- Built-in duplicate detection
- Clean, enriched data you can use instantly for outreach

What You Need
- GitHub API access
- Hunter.io API key
- Google Sheets connection
- n8n (self-hosted or cloud)
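A hedged sketch of stage 2's fallback logic follows, assuming the developer record carries first_name, last_name, and a known company_domain (Hunter.io's Email Finder searches against a domain). All field names are illustrative:

```javascript
// Hypothetical sketch of the enrichment branch: only call Hunter.io
// when the GitHub profile has no public email. Field names are assumptions.
async function enrich(dev, apiKey) {
  if (dev.email) return dev; // already has an email, so skip the API call

  const url = new URL("https://api.hunter.io/v2/email-finder");
  url.searchParams.set("domain", dev.company_domain); // assumed to be known
  url.searchParams.set("first_name", dev.first_name);
  url.searchParams.set("last_name", dev.last_name);
  url.searchParams.set("api_key", apiKey);

  const data = (await (await fetch(url)).json()).data;
  return { ...dev, email: data?.email ?? null, email_confidence: data?.score ?? null };
}
```

Keeping the confidence score alongside the email lets you filter low-quality matches before outreach.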
by Gabriel Santos
Who's it for
Teams and project managers who want to turn meeting transcripts into actionable Trello tasks automatically, without worrying about duplicate cards.

What it does
This workflow receives a transcript file in .txt format and processes it with AI to extract clear, concise tasks. Each task includes a short title, a description, an assignee (if mentioned), and a deadline (if available). The workflow then checks Trello for duplicates across all lists, comparing both card titles (name) and descriptions (desc). If a matching card already exists, the workflow returns the existing Trello card ID. If not, it creates a new card in the predefined default list. Finally, the workflow generates a user-friendly summary: how many tasks were found, how many already existed, how many new cards were created, and how many tasks had no assignee or deadline.

Requirements
- A Trello account with API credentials configured in n8n (no hardcoded keys).
- An OpenAI (or compatible) LLM account connected in n8n.

How to customize
- Adjust similarity thresholds for title/description matching in the Trello Sub-Agent (see the sketch after this list).
- Modify the summary text to always return in your preferred language.
- Extend the Trello card creation step with labels, members, or due dates.
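Here is a minimal sketch of one way the duplicate check could score matches, using token overlap. The 0.8 threshold is an assumption standing in for whatever value you configure in the Trello Sub-Agent:

```javascript
// Hypothetical duplicate check: normalize strings, then compare with a
// simple token-overlap score against each card's name and desc fields.
const normalize = s => (s || "").toLowerCase().replace(/[^\w\s]/g, "").trim();

function similarity(a, b) {
  const ta = new Set(normalize(a).split(/\s+/));
  const tb = new Set(normalize(b).split(/\s+/));
  const overlap = [...ta].filter(t => tb.has(t)).length;
  return overlap / Math.max(ta.size, tb.size, 1);
}

function findDuplicate(task, cards) {
  // returns the first matching Trello card, or null to signal "create new"
  return cards.find(c =>
    similarity(task.title, c.name) >= 0.8 ||
    similarity(task.description, c.desc) >= 0.8
  ) || null;
}
```

Raising the threshold makes matching stricter (fewer false duplicates); lowering it catches more near-duplicates at the cost of occasional false positives.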
by Nima Salimi
Overview
This n8n workflow automatically retrieves the monthly CrUX (Chrome User Experience) Report from Google BigQuery and updates the data in NocoDB. It removes the previous month's data before inserting the new dataset, ensuring your database always contains the latest CrUX rankings for website origins. The flow is fully automated, using schedule triggers to handle both data cleanup and data insertion each month.

✅ Tasks
- ⏰ Runs automatically on a monthly schedule
- 🔢 Converts the month name to a numeric value for table selection (see the sketch after this section)
- 🧹 Deletes last month's CrUX data from NocoDB
- 🌐 Queries Google BigQuery for the latest monthly dataset
- 💾 Inserts the new CrUX rankings into NocoDB
- ⚙️ Keeps your database up to date with zero manual effort

🛠 How to Use
1️⃣ Set Up BigQuery Access
- Connect your Google BigQuery credentials.
- Make sure your project includes access to the chrome-ux-report public dataset.

2️⃣ Adjust the Query
- In the Google BigQuery node, change the LIMIT value to control how many top-ranked sites are retrieved.
- Ensure the {{ $json.table }} field correctly references the dataset for the desired month (e.g., 202509).

3️⃣ Prepare NocoDB Table
- Create a table in NocoDB with fields: origin, crux_rank, and any additional metadata you wish to track.

4️⃣ Schedule Automation
The workflow includes two Schedule Trigger nodes:
- One runs the data cleanup process (deletes last month).
- One runs the data insertion for the new month.

5️⃣ Run or Activate the Workflow
- Activate it to run automatically each month.
- You can also run it manually to refresh data on demand.

📋 Prerequisites
Before running this workflow, make sure you complete the following setup steps:

🧱 Enable BigQuery API
- Go to Google Cloud Console → APIs & Services
- Enable the BigQuery API for your project.

📊 Access the Chrome UX Report Dataset
- In BigQuery, search for "Chrome UX Report" in the Marketplace or go directly to: https://console.cloud.google.com/marketplace/product/chrome-ux-report/chrome-ux-report
- Click "View Dataset" and add it to your BigQuery project.

🔑 Connect BigQuery to n8n
- In n8n, create credentials for your Google BigQuery account using Service Account Authentication.
- Ensure the account has permission to query the chrome-ux-report dataset.

🗄️ Create a NocoDB Table
In NocoDB, create a new table to store your CrUX data with the following fields:
- origin → Short text
- crux_rank → Number

⚙️ Connect NocoDB to n8n
- Use your NocoDB API Token to connect and allow the workflow to read/write data.

What is CrUX Rank?
CrUX Rank (Chrome User Experience Rank) is a metric from Google's Chrome UX Report (CrUX) dataset that indicates a website's popularity based on real user visits. It reflects how frequently an origin (website) is loaded by Chrome users around the world.
- A lower rank number means the site is more popular (e.g., rank 1 = top site).
- The data is collected from anonymized Chrome usage statistics, aggregated monthly.
This rank helps you track site popularity trends and compare your domain's visibility over time.
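Here is a minimal Code-node sketch of the month-to-table conversion, assuming the common convention that last month's release is the latest available CrUX table (the workflow's actual expression may differ):

```javascript
// Hypothetical sketch: build the numeric BigQuery table suffix (e.g. 202509)
// for last month, which is the most recent published CrUX dataset.
const now = new Date();
const prev = new Date(now.getFullYear(), now.getMonth() - 1, 1); // handles January rollover
const table = `${prev.getFullYear()}${String(prev.getMonth() + 1).padStart(2, "0")}`;
return [{ json: { table } }]; // referenced as {{ $json.table }} in the BigQuery node
```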
by Sk developer
Automated DA PA Checker Workflow for SEO Analysis

Description
This n8n workflow collects a website URL via form submission, retrieves SEO metrics like Domain Authority (DA) and Page Authority (PA) using the Moz DA PA Checker API, and stores the results in Google Sheets for easy tracking and analysis.

Node-by-Node Explanation
1. On form submission – Captures the website input from the user to pass to the Moz DA PA Checker API.
2. DA PA API Request – Sends the website to the Moz DA PA Checker API via RapidAPI to fetch DA, PA, spam score, DR, and organic traffic.
3. If – Checks if the API request to the Moz DA PA Checker API returned a successful response.
4. Clean Output – Extracts only the useful data from the Moz DA PA Checker API response for saving (see the sketch after this section).
5. Google Sheets – Appends the cleaned SEO metrics to a Google Sheet for record-keeping.

Use Cases
- **SEO Analysis** – Quickly evaluate a website's DA/PA metrics for optimization strategies.
- **Competitor Research** – Compare domain authority and organic traffic with competitors.
- **Link Building** – Identify high-authority domains for guest posting and backlinks.
- **Domain Purchase Decisions** – Check metrics before buying expired or auctioned domains.

Benefits
- **Automated Workflow** – From input to Google Sheets without manual intervention.
- **Accurate Metrics** – Uses the trusted Moz DA PA Checker API for DA, PA, spam score, DR, and traffic.
- **Instant Insights** – Get SEO scores in seconds for faster decision-making.
- **Easy Integration** – Seamless connection between RapidAPI and Google Sheets for data storage.
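A hedged sketch of what the Clean Output step might do, assuming the RapidAPI response exposes da, pa, spam_score, dr, and organic_traffic keys (the actual payload keys may differ; check the response in the If node first):

```javascript
// Hypothetical "Clean Output" Code-node sketch: keep only the metrics worth
// storing in the sheet. All response field names are assumptions.
const r = $input.first().json;
return [{
  json: {
    website: r.url,
    domain_authority: r.da,
    page_authority: r.pa,
    spam_score: r.spam_score,
    domain_rating: r.dr,
    organic_traffic: r.organic_traffic,
    checked_at: new Date().toISOString(), // timestamp for the sheet log
  },
}];
```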
by Shelly-Ann Davy
Build authentic Reddit presence and generate qualified leads through AI-powered community engagement that provides genuine value without spam or promotion.

🎯 What This Workflow Does:
This intelligent n8n workflow monitors 9 targeted subreddits every 4 hours, uses AI to analyze posts for relevance and lead potential, generates authentic helpful responses that add value to discussions, posts comments automatically, and captures high-quality leads (70%+ potential score) directly into your CRM—all while maintaining full Reddit compliance and looking completely human.

✨ Key Features:
- 6 Daily Checks: Monitors subreddits every 4 hours for fresh content
- 9 Subreddit Coverage: Customizable list of target communities
- AI Post Analysis: Determines relevance, intent, and lead potential
- Intelligent Engagement: Only comments when you can add genuine value
- Authentic Responses: AI-generated comments that sound human, not promotional
- Lead Scoring: 0-1.0 scale identifies high-potential prospects (0.7+ captured)
- Automatic CRM Integration: High-quality leads flow directly to Supabase
- Rate Limit Protection: 60-second delays ensure Reddit API compliance
- Native Reddit Integration: Official n8n Reddit node with OAuth2
- Beginner-Friendly: 14+ detailed sticky notes explaining every component

🎯 Target Subreddits (Customizable):
Insurance & Claims:
- r/Insurance - General insurance questions
- r/ClaimAdvice - Claim filing help
- r/AutoInsurance - Auto coverage discussions
- r/FloodInsurance - Flood damage queries
- r/PropertyInsurance - Property coverage

Property & Home:
- r/homeowners - Property issues and claims
- r/RoofingContractors - Roof damage discussions

Financial & Legal:
- r/PersonalFinance - Insurance decisions
- r/legaladvice - Legal aspects of claims

🤖 AI Analysis Components:
Post Evaluation:
- Relevance score (0-100%)
- User intent detection
- Damage type identification (hail, water, fire, wind)
- Urgency level (low/medium/high)
- Lead potential score (0-1.0)
- Recommended services
- Engagement opportunity assessment

Decision Criteria (sketched in code after this section):
- Should engage? (boolean)
- Can we add value? (quality check)
- Is this promotional? (avoid spam)
- Lead worth capturing? (70%+ threshold)

Typical Engagement Rate: 5-10% of analyzed posts (67-135 comments/day)

🔧 Technical Stack:
- Trigger: Schedule (every 4 hours, 6x daily)
- Reddit API: Native n8n node with OAuth2
- AI Analysis: Supabase Edge Functions
- Response Generation: AI-powered contextual replies
- Lead Capture: Supabase CRM integration
- Rate Limiting: Wait node (60-second delays)
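A minimal sketch of the decision gate described above. The 0.7 lead threshold comes from the template; the 0.5 relevance cutoff and the field names are assumptions:

```javascript
// Hypothetical engagement/capture gate mirroring the stated decision criteria.
function decide(analysis) {
  const shouldEngage =
    analysis.relevance >= 0.5 &&   // relevant to our niche (assumed cutoff)
    analysis.canAddValue &&        // quality check from the AI analysis
    !analysis.isPromotional;       // never engage when it would look like spam

  const captureLead = analysis.leadPotential >= 0.7; // 70%+ goes to the CRM
  return { shouldEngage, captureLead };
}

console.log(decide({
  relevance: 0.9, canAddValue: true, isPromotional: false, leadPotential: 0.82,
}));
// -> { shouldEngage: true, captureLead: true }
```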
by SIENNA
Automated FTP/SFTP to MinIO Object Backup with Scheduling
→ Works with FTP/SFTP servers, such as the one hosting your WordPress website!

What this workflow does
This workflow performs automated, periodic backups of files from an FTP or SFTP server directory to a MinIO S3 bucket running locally or on a dedicated container/VM/server. It also works if the MinIO bucket is running on a remote cloud provider's infrastructure; you just need to change the URL and keys.

Who's this intended for?
Storage administrators, cloud architects, or DevOps engineers who need a simple and scalable solution for retrieving data from a remote FTP or SFTP server. It is also practical for WordPress developers who need to back up data from a server hosting a WordPress website. In that case, just specify the folder you want to back up (one from wordpress/uploads, or even the root directory).

How it works
This workflow uses commands to list and download files from a specific directory on an FTP/SFTP server, then sends them to MinIO using its version of the S3 API. The source directory can be a specific one or the entire server (the root directory).

Requirements
None, beyond a source folder/directory on an FTP/SFTP server and a destination bucket on MinIO. You'll also need MinIO running. Using Proxmox VE? Create a MinIO LXC container: https://community-scripts.github.io/ProxmoxVE/scripts?id=minio

Need a backup from another cloud storage provider?
→ Check out our templates: we've done it with AWS, Azure, and GCP, and we even have a version for FTP/SFTP servers!

These workflows can be integrated into bigger ones and modified to best suit your needs! You can, for example, replace the MinIO node with another S3 bucket from another cloud storage provider (Backblaze, Wasabi, Scaleway, OVH, ...).
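For readers who want to see the same pattern outside n8n, here is a hedged standalone sketch using the basic-ftp and @aws-sdk/client-s3 packages. The host, bucket, endpoint, and credentials are placeholders, and the node-based workflow itself does not run this code:

```javascript
// Hypothetical standalone sketch: list a remote FTP directory and push each
// file to a MinIO bucket via the S3 API.
const ftp = require("basic-ftp");
const fs = require("fs");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");

const s3 = new S3Client({
  endpoint: "http://minio.local:9000", // swap for a cloud S3 endpoint if remote
  region: "us-east-1",
  credentials: { accessKeyId: "MINIO_KEY", secretAccessKey: "MINIO_SECRET" },
  forcePathStyle: true, // required for MinIO-style path addressing
});

async function backup(remoteDir, bucket) {
  const client = new ftp.Client();
  await client.access({ host: "ftp.example.com", user: "user", password: "pass" });
  for (const entry of await client.list(remoteDir)) {
    if (!entry.isFile) continue; // skip subdirectories in this simple sketch
    const tmp = `/tmp/${entry.name}`;
    await client.downloadTo(tmp, `${remoteDir}/${entry.name}`);
    await s3.send(new PutObjectCommand({
      Bucket: bucket,
      Key: `${remoteDir}/${entry.name}`,
      Body: fs.createReadStream(tmp),
    }));
  }
  client.close();
}

backup("wordpress/uploads", "site-backups").catch(console.error);
```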
by SerpApi
Sync Google Maps Reviews to Google Sheets for Any Google Maps Query

How it works
This workflow accepts any query you might run on actual Google Maps to search for places. The search happens through SerpApi's Google Maps API. Once the workflow receives place results from Google Maps, it loops through each place fetching reviews using SerpApi's Google Maps Reviews API. By default, the workflow is limited to fetching up to 50 reviews per place. This can be customized in the 'Set Review Limit' node. The first page of reviews for a place will only return 8 reviews. All subsequent pages will return up to 20 reviews. The fetched reviews are sent to a connected Google Sheet.

How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help available here: https://n8n.io/integrations/google-sheets/
4. Create a Google Sheet with these column headers: name, iso_date, rating, snippet
5. Connect your Google Sheet in the 'Append Reviews' Google Sheets node
6. Update the Search Query in the 'Search Google Maps' node to set your own query
7. (Optional) Update the review limit from the default 50 in the 'Set Review Limit' node. Set it to a very high number (e.g. 50000) to get all possible reviews.
8. Hit 'Test Workflow' to manually trigger the workflow.

Limitations
Can only retrieve the top 20 results from Google Maps. It won't paginate to get more results. The workflow could be extended to support Google Maps pagination.

Warning
Each request to SerpApi consumes 1 search credit. Be mindful of how many search credits your account has before requesting more reviews than your account supports. As an example, if a Google Maps query returns 20 places and you fetch the default limit of 50 reviews per place, this will use up to 61 SerpApi search credits.

Documentation
- Google Maps API
- Google Maps Reviews API
- SerpApi n8n Node Intro Guide
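A small estimator for budgeting credits before a run. The stopping rule below (fetch another page only while it stays within the limit) is one plausible reading, chosen because it reproduces the 61-credit example above; the workflow's actual pagination logic may differ:

```javascript
// Hypothetical credit estimator: the first reviews page returns 8 reviews,
// later pages up to 20, and each API request costs one search credit.
function estimateCredits(places, reviewLimit) {
  let pagesPerPlace = 1; // first page: 8 reviews
  let fetched = 8;
  while (fetched + 20 <= reviewLimit) { fetched += 20; pagesPerPlace++; }
  return 1 + places * pagesPerPlace; // +1 for the initial Maps search
}

console.log(estimateCredits(20, 50)); // -> 61, matching the example above
```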
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

Introduction
This workflow automates Stripe checkout confirmations by capturing transaction data and syncing it into KlickTipp. Upon successful checkout, the contact's data is enriched with purchase details and tagged to trigger a personalized confirmation campaign in KlickTipp. Perfect for digital product sellers, course creators, and service providers seeking an end-to-end automated sales confirmation process.

Benefits
- **Instant confirmation emails**: Automatically notify customers upon successful checkout—no manual processing needed.
- **Structured contact data**: Order data (invoice link, amount, transaction ID, products) is stored in KlickTipp custom fields.
- **Smart campaign triggering**: Assign dynamic tags to start automated confirmation or fulfillment sequences.
- **Seamless digital delivery**: Ideal for pairing with tools like Memberspot or Mentortools to unlock digital products post-checkout.

Key Features
- **Stripe Webhook Trigger**: Triggers on checkout.session.completed events. Captures checkout data including product names, order number, and total amount.
- **KlickTipp Contact Sync**: Adds or updates contacts in KlickTipp. Maps Stripe data into custom fields. Assigns a tag such as Stripe Checkout to initiate a confirmation campaign.
- **Router Logic (optional)**: Branches logic based on product ID or Stripe payment link. Enables product-specific campaigns or follow-ups.

Setup Instructions

KlickTipp Preparation
Create the following custom fields in your KlickTipp account:

| Field Name | Field Type |
|---------------------|----------------|
| Stripe \| Products | Text |
| Stripe \| Total | Decimal Number |
| Stripe \| Payment ID | Text |
| Stripe \| Receipt URL | URL |

Define a tag for each product or confirmation flow, e.g., Order: Course XYZ.

Credential Configuration
- Connect your Stripe account using an API key from the Stripe Dashboard.
- Authenticate your KlickTipp connection with username/password credentials (API access required).

Field Mapping and Workflow Alignment
- Map Stripe output fields to the KlickTipp custom fields (a mapping sketch follows after this section).
- Assign the tag to trigger your post-purchase campaign.
- Ensure that required data like email and opt-in info are present for the contact to be valid.

Testing and Deployment
1. Toggle the workflow from Inactive to Active.
2. Perform a test payment using a Stripe product link.
3. Verify in KlickTipp:
   - The contact appears with email and opt-in status.
   - Custom fields for Stripe are filled.
   - The campaign tag is correctly applied and the confirmation email is sent.

⚠️ Note: Use real or test-mode API keys in Stripe depending on your testing environment. Stripe events may take a few seconds to propagate.

Campaign Expansion Ideas
- Launch targeted upsell flows based on the product tag.
- Use confirmation placeholders like: [[Stripe | Products]], [[Stripe | Total]], [[Stripe | Payment ID]]
- Route customers to different product access portals (e.g., Memberspot, Mentortools).
- Send follow-up content over multiple days using KlickTipp sequences.

Customization
You can extend the scenario using a Switch node to:
- Assign different tags per payment link used
- Branch into upsell or membership activation flows
- Chain additional automations like CRM entry, Slack notification, or invoice creation.

Resources:
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
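Here is a hedged sketch of the field mapping against a raw checkout.session.completed payload. Note that line_items is only present when expanded in the webhook/API call, and the receipt URL field name is an assumption (Stripe puts receipt_url on the charge, not the session):

```javascript
// Hypothetical mapping sketch: pull the fields this template stores from a
// checkout.session.completed event. Custom-field names follow the table above.
const session = $json.data.object; // Stripe event -> checkout session

const contact = {
  email: session.customer_details?.email,
  fields: {
    "Stripe | Products": (session.line_items?.data ?? [])
      .map(li => li.description).join(", "), // requires line_items to be expanded
    "Stripe | Total": session.amount_total / 100, // Stripe amounts are in cents
    "Stripe | Payment ID": session.payment_intent,
    "Stripe | Receipt URL": session.invoice_url, // assumed field, verify in your payload
  },
  tag: "Stripe Checkout", // starts the confirmation campaign
};
return [{ json: contact }];
```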
by Yaron Been
Generate Custom Text Content with IBM Granite 3.3 8B Instruct AI

This workflow connects to Replicate's API and uses the ibm-granite/granite-3.3-8b-instruct model to generate text.

✅ 🔵 SECTION 1: Trigger & Setup

⚙️ Nodes
1️⃣ On clicking 'execute'
- What it does: Starts the workflow manually when you hit **Execute**.
- Why it's useful: Perfect for testing text generation on-demand.

2️⃣ Set API Key
- What it does: Stores your **Replicate API key** securely.
- Why it's useful: You don't hardcode credentials into HTTP nodes — just set them once here.
- Beginner tip: Replace YOUR_REPLICATE_API_KEY with your actual API key.

💡 Beginner Benefit
✅ No coding needed to handle authentication.
✅ You can reuse the same setup for other Replicate models.

✅ 🤖 SECTION 2: Model Request & Polling

⚙️ Nodes
3️⃣ Create Prediction (HTTP Request)
- What it does: Sends a **POST request** to Replicate's API to start a text generation job.
- Parameters include: temperature, max_tokens, top_k, top_p.
- Why it's useful: Controls how creative or focused the AI text output will be.

4️⃣ Extract Prediction ID (Code)
- What it does: Pulls the **prediction ID** and builds a URL for checking status.
- Why it's useful: Replicate jobs run asynchronously, so you need the ID to track progress.

5️⃣ Wait
- What it does: Pauses for **2 seconds** before checking the prediction again.
- Why it's useful: Prevents spamming the API with too many requests.

6️⃣ Check Prediction Status (HTTP Request)
- What it does: Polls the Replicate API for the **current status** (e.g., starting, processing, succeeded).
- Why it's useful: Lets you loop until the AI finishes generating text.

7️⃣ Check If Complete (IF Condition)
- What it does: If the status is **succeeded**, it goes to "Process Result." Otherwise, it loops back to **Wait** and retries.
- Why it's useful: Creates an automated polling loop without writing complex code.

💡 Beginner Benefit
✅ No need to manually refresh or check job status.
✅ Workflow keeps retrying until text is ready.
✅ Smart looping built-in with Wait + If Condition.

✅ 🟢 SECTION 3: Process & Output

⚙️ Nodes
8️⃣ Process Result (Code)
- What it does: Collects the final **AI output**, status, metrics, and timestamps.
- Adds info like:
  - ✅ output → Generated text
  - ✅ model → ibm-granite/granite-3.3-8b-instruct
  - ✅ metrics → Performance data
- Why it's useful: Gives you a neat, structured JSON result that's easy to send to Sheets, Notion, or any app.

💡 Beginner Benefit
✅ Ready-to-use text output.
✅ Easy integration with any database or CRM.
✅ Transparent metrics (when it started, when it finished, etc.).

✨ FULL FLOW OVERVIEW

| Section | What happens |
| --- | --- |
| ⚡ Trigger & Setup | Start workflow + set Replicate API key. |
| 🤖 Model Request & Polling | Send request → get Prediction ID → loop until job completes. |
| 🟢 Process & Output | Extract clean AI-generated text + metadata for storage or further workflows. |

📌 How You Benefit Overall
✅ No coding needed — just configure your API key.
✅ Reliable polling — the workflow waits until results are ready.
✅ Flexible — you can extend output to Google Sheets, Slack, Notion, or email.
✅ Beginner-friendly — clean separation of input, process, and output.

✨ With this workflow, you've turned Replicate's IBM Granite LLM into a no-code text generator — running entirely inside n8n! ✨
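For the curious, here is the whole polling pattern condensed into one hedged JavaScript sketch. The endpoint and input parameter names are illustrative rather than copied from the workflow's HTTP nodes, so verify them against Replicate's docs:

```javascript
// Hypothetical sketch of Sections 2-3 as plain code: create a prediction,
// then poll every 2 seconds (mirroring the Wait node) until it finishes.
const headers = {
  Authorization: `Bearer ${process.env.REPLICATE_API_KEY}`,
  "Content-Type": "application/json",
};

async function generate(prompt) {
  const create = await fetch(
    "https://api.replicate.com/v1/models/ibm-granite/granite-3.3-8b-instruct/predictions",
    {
      method: "POST",
      headers,
      body: JSON.stringify({
        input: { prompt, temperature: 0.7, max_tokens: 512, top_k: 50, top_p: 0.9 },
      }),
    }
  ).then(r => r.json());

  let prediction = create;
  while (!["succeeded", "failed", "canceled"].includes(prediction.status)) {
    await new Promise(res => setTimeout(res, 2000)); // the 2-second Wait
    prediction = await fetch(create.urls.get, { headers }).then(r => r.json());
  }
  if (prediction.status !== "succeeded") throw new Error(prediction.error);
  return { output: prediction.output, metrics: prediction.metrics };
}
```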
by SerpApi
Google Play Store App Rank and Rating Monitoring

What and who this is for
This workflow will be useful for anyone looking to do SEO tracking on the Google Play Store. It automates checking Google Play Store rank positions and average ratings for a list of app titles. The SerpApi component can also be modified to use other APIs for anyone looking to do SEO tracking on any other search engine supported by SerpApi.

How it works
This workflow takes in a list of keywords and app titles to identify each app's rank in Google Play Store search results. It also grabs the average rating of the app. The search uses SerpApi's Google Play Store API. The results are then synced to two different sheets in a Google Sheet. The first is a log of all past runs; the latest results are appended to the bottom of the log. The second updates a kind of "dashboard" to show the results from the latest run.
The workflow includes a Wait node that delays 4 seconds between each app title and keyword pair to prevent hitting the default Google Sheets API per-minute rate limit. You can delete this if you have a high enough custom rate limit on the Google Sheets API.
The Schedule Trigger is configured to run at 10 AM UTC every day.

How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help available here: https://n8n.io/integrations/google-sheets/
4. Copy this Google Sheet to your own Google account: https://docs.google.com/spreadsheets/d/1DiP6Zhe17tEblzKevtbPqIygH3dpPCW-NAprxup0VqA/edit?gid=1750873622#gid=1750873622
5. Set your own list of keywords and app titles to match in the 'Latest Run' sheet. This is the source list used to run the searches and must be set.
6. Connect your Google Sheet in the 'Get Keywords and Titles to Match' Google Sheets node
7. Connect your Google Sheet in the 'Update Rank & Rating Log' Google Sheets node
8. Connect your Google Sheet again in the 'Update Latest Run' Google Sheets node
9. (Optional) Update the schedule or disable it to run the workflow manually only

Documentation
- SerpApi Google Play Store API
- SerpApi n8n Node Intro Guide
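A hedged sketch of the rank extraction step, assuming SerpApi's Google Play payload groups results as organic_results sections with items arrays (verify the exact shape against the API docs linked above):

```javascript
// Hypothetical sketch: find an app title in the search results and report
// its position and average rating. Field names are payload assumptions.
function findRank(serpapiResponse, appTitle) {
  const items = serpapiResponse.organic_results?.flatMap(s => s.items ?? []) ?? [];
  const index = items.findIndex(
    app => app.title.toLowerCase() === appTitle.toLowerCase()
  );
  if (index === -1) return { rank: null, rating: null }; // not in the top results
  return { rank: index + 1, rating: items[index].rating };
}
```

An exact-match comparison like this is strict; a fuzzier match (e.g., startsWith or token overlap) would tolerate minor title variations at the risk of matching the wrong app.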