by Nima Salimi
Overview

This n8n workflow automatically retrieves the monthly Chrome User Experience Report (CrUX) data from Google BigQuery and updates it in NocoDB. It removes the previous month's data before inserting the new dataset, ensuring your database always contains the latest CrUX rankings for website origins. The flow is fully automated, using schedule triggers to handle both data cleanup and data insertion each month.

✅ Tasks
⏰ Runs automatically on a monthly schedule
🔢 Converts the month name to a numeric value for table selection
🧹 Deletes last month's CrUX data from NocoDB
🌐 Queries Google BigQuery for the latest monthly dataset
💾 Inserts the new CrUX rankings into NocoDB
⚙️ Keeps your database up to date with zero manual effort

🛠 How to Use
1️⃣ Set Up BigQuery Access: Connect your Google BigQuery credentials. Make sure your project includes access to the chrome-ux-report public dataset.
2️⃣ Adjust the Query: In the Google BigQuery node, change the LIMIT value to control how many top-ranked sites are retrieved. Ensure the {{ $json.table }} field correctly references the dataset for the desired month (e.g., 202509).
3️⃣ Prepare NocoDB Table: Create a table in NocoDB with the fields origin, crux_rank, and any additional metadata you wish to track.
4️⃣ Schedule Automation: The workflow includes two Schedule Trigger nodes: one runs the data cleanup process (deletes last month's data), and one runs the data insertion for the new month.
5️⃣ Run or Activate the Workflow: Activate it to run automatically each month. You can also run it manually to refresh data on demand.

📋 Prerequisites
Before running this workflow, make sure you complete the following setup steps:
🧱 Enable BigQuery API: Go to Google Cloud Console → APIs & Services and enable the BigQuery API for your project.
📊 Access the Chrome UX Report Dataset: In BigQuery, search for "Chrome UX Report" in the Marketplace or go directly to https://console.cloud.google.com/marketplace/product/chrome-ux-report/chrome-ux-report, then click "View Dataset" and add it to your BigQuery project.
🔑 Connect BigQuery to n8n: In n8n, create credentials for your Google BigQuery account using Service Account Authentication. Ensure the account has permission to query the chrome-ux-report dataset.
🗄️ Create a NocoDB Table: In NocoDB, create a new table to store your CrUX data with the following fields: origin → Short text, crux_rank → Number.
⚙️ Connect NocoDB to n8n: Use your NocoDB API Token to connect and allow the workflow to read/write data.

What is CrUX Rank?
CrUX Rank (Chrome User Experience Rank) is a metric from Google's Chrome UX Report (CrUX) dataset that indicates a website's popularity based on real user visits. It reflects how frequently an origin (website) is loaded by Chrome users around the world. A lower rank number means the site is more popular (e.g., rank 1 = top site). The data is collected from anonymized Chrome usage statistics, aggregated monthly. This rank helps you track site popularity trends and compare your domain's visibility over time.
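The month-name-to-number conversion the workflow performs for table selection can be sketched in plain Python (a minimal sketch of the idea; the Code node in the template may implement it differently):

```python
import calendar

def month_to_table(month_name: str, year: int) -> str:
    """Convert a month name to the numeric YYYYMM suffix used to select
    the monthly CrUX table in BigQuery (the {{ $json.table }} value),
    e.g. ('September', 2025) -> '202509'."""
    months = {name.lower(): i for i, name in enumerate(calendar.month_name) if name}
    return f"{year}{months[month_name.lower()]:02d}"
```

The resulting string can then be interpolated into the BigQuery query's table reference for the desired month.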
by Sk developer
Automated DA PA Checker Workflow for SEO Analysis

Description
This n8n workflow collects a website URL via form submission, retrieves SEO metrics like Domain Authority (DA) and Page Authority (PA) using the Moz DA PA Checker API, and stores the results in Google Sheets for easy tracking and analysis.

Node-by-Node Explanation
On form submission: Captures the website input from the user to pass to the Moz DA PA Checker API.
DA PA API Request: Sends the website to the Moz DA PA Checker API via RapidAPI to fetch DA, PA, spam score, DR, and organic traffic.
If: Checks if the API request to the Moz DA PA Checker API returned a successful response.
Clean Output: Extracts only the useful data from the Moz DA PA Checker API response for saving.
Google Sheets: Appends the cleaned SEO metrics to a Google Sheet for record-keeping.

Use Cases
**SEO Analysis**: Quickly evaluate a website's DA/PA metrics for optimization strategies.
**Competitor Research**: Compare domain authority and organic traffic with competitors.
**Link Building**: Identify high-authority domains for guest posting and backlinks.
**Domain Purchase Decisions**: Check metrics before buying expired or auctioned domains.

Benefits
**Automated Workflow**: From input to Google Sheets without manual intervention.
**Accurate Metrics**: Uses the trusted **Moz DA PA Checker API** for DA, PA, spam score, DR, and traffic.
**Instant Insights**: Get SEO scores in seconds for faster decision-making.
**Easy Integration**: Seamless connection between RapidAPI and Google Sheets for data storage.
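The "Clean Output" step can be sketched as a small Python function; the field names here are assumptions about the API response shape, so match them to what your RapidAPI endpoint actually returns:

```python
def clean_output(api_response: dict) -> dict:
    """Keep only the SEO metrics the workflow appends to Google Sheets,
    dropping everything else in the raw API response. The key names are
    hypothetical; adjust them to the real response fields."""
    keys = ("da", "pa", "spam_score", "dr", "organic_traffic")
    return {k: api_response.get(k) for k in keys}
```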
by SIENNA
Automated FTP/SFTP to MinIO Object Backup with Scheduling

→ Can work with FTP/SFTP servers, like the one hosting your WordPress website!

What this workflow does
This workflow performs automated, periodic backups of files from an FTP or SFTP server directory to a MinIO S3 bucket running locally or on a dedicated container/VM/server. It also works if the MinIO bucket is running on a remote cloud provider's infrastructure; you just need to change the URL and keys.

Who's this intended for?
Storage administrators, cloud architects, or DevOps engineers who need a simple and scalable solution for retrieving data from a remote FTP or SFTP server. It can also be practical for WordPress developers who need to back up data from a server hosting a WordPress website. In that case, you'll just need to specify the folder you want to back up (one from wordpress/uploads, or even the root folder).

How it works
This workflow uses commands to list and download files from a specific directory on an FTP/SFTP server, then sends them to MinIO using its version of the S3 API. The source directory can be a specific one or the entire server (the root directory).

Requirements
None, just a source folder/directory on an FTP/SFTP server and a destination bucket on MinIO. You'll also need to get MinIO running. Using Proxmox VE? Create a MinIO LXC container: https://community-scripts.github.io/ProxmoxVE/scripts?id=minio

Need automated backups from another cloud storage provider?
→ Check out our templates: we've done it with AWS, Azure, and GCP, and we even have a version for FTP/SFTP servers! These workflows can be integrated into bigger ones and modified to best suit your needs. You can, for example, replace the MinIO node with another S3 bucket from another cloud storage provider (Backblaze, Wasabi, Scaleway, OVH, ...).
by SerpApi
Sync Google Maps Reviews to Google Sheets for Any Google Maps Query

How it works
This workflow accepts any query you might run on actual Google Maps to search for places. The search happens through SerpApi's Google Maps API. Once the workflow receives place results from Google Maps, it loops through each place fetching reviews using SerpApi's Google Maps Reviews API. By default, the workflow is limited to fetching up to 50 reviews per place. This can be customized in the 'Set Review Limit' node. The first page of reviews for a place will only return 8 reviews. All subsequent pages will return up to 20 reviews. The fetched reviews are sent to a connected Google Sheet.

How to use
Create a free SerpApi account here: https://serpapi.com/
Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
Connect your Google Sheets account to n8n. Help available here: https://n8n.io/integrations/google-sheets/
Create a Google Sheet with these column headers: name, iso_date, rating, snippet
Connect your Google Sheet in the 'Append Reviews' Google Sheets node
Update the Search Query in the 'Search Google Maps' node to set your own query
(Optional) Update the review limit from the default 50 in the 'Set Review Limit' node. Set it to a very high number (e.g. 50000) to get all possible reviews.
Hit 'Test Workflow' to manually trigger the workflow.

Limitations
Can only retrieve the top 20 results from Google Maps; it won't paginate to get more results. The workflow could be extended to support Google Maps pagination.

Warning
Each request to SerpApi consumes 1 search credit. Be mindful of how many search credits your account has before requesting more reviews than your account supports. As an example, if a Google Maps query returns 20 results and you fetch the default limit of 50 reviews per place, this will use up to 61 SerpApi search credits.

Documentation
Google Maps API
Google Maps Reviews API
SerpApi n8n Node Intro Guide
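A rough credit estimator, consistent with the "61 credits for 20 places at the default limit of 50" example above, can help you budget searches before running the workflow. This is an approximation: the first reviews page returns only 8 reviews, so the true count can be one request higher per place.

```python
import math

def estimate_credits(num_places: int, review_limit: int, reviews_per_page: int = 20) -> int:
    """Estimate SerpApi search credits: 1 credit for the Maps search plus
    roughly ceil(limit / 20) review-page requests per place returned."""
    pages_per_place = math.ceil(review_limit / reviews_per_page)
    return 1 + num_places * pages_per_place
```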
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

Introduction
This workflow automates Stripe checkout confirmations by capturing transaction data and syncing it into KlickTipp. Upon successful checkout, the contact's data is enriched with purchase details and tagged to trigger a personalized confirmation campaign in KlickTipp. Perfect for digital product sellers, course creators, and service providers seeking an end-to-end automated sales confirmation process.

Benefits
**Instant confirmation emails**: Automatically notify customers upon successful checkout, no manual processing needed.
**Structured contact data**: Order data (invoice link, amount, transaction ID, products) is stored in KlickTipp custom fields.
**Smart campaign triggering**: Assign dynamic tags to start automated confirmation or fulfillment sequences.
**Seamless digital delivery**: Ideal for pairing with tools like Memberspot or Mentortools to unlock digital products post-checkout.

Key Features
**Stripe Webhook Trigger**: Triggers on checkout.session.completed events. Captures checkout data including product names, order number, and total amount.
**KlickTipp Contact Sync**: Adds or updates contacts in KlickTipp, maps Stripe data into custom fields, and assigns a tag such as Stripe Checkout to initiate a confirmation campaign.
**Router Logic (optional)**: Branches logic based on product ID or Stripe payment link, enabling product-specific campaigns or follow-ups.

Setup Instructions

KlickTipp Preparation
Create the following custom fields in your KlickTipp account:

| Field Name | Field Type |
|--------------------------|------------------|
| Stripe \| Products | Text |
| Stripe \| Total | Decimal Number |
| Stripe \| Payment ID | Text |
| Stripe \| Receipt URL | URL |

Define a tag for each product or confirmation flow, e.g., Order: Course XYZ.

Credential Configuration
Connect your Stripe account using an API key from the Stripe Dashboard.
Authenticate your KlickTipp connection with username/password credentials (API access required).

Field Mapping and Workflow Alignment
Map Stripe output fields to the KlickTipp custom fields. Assign the tag to trigger your post-purchase campaign. Ensure that required data like email and opt-in info are present for the contact to be valid.

Testing and Deployment
Switch the workflow from Inactive to Active to enable the scenario. Perform a test payment using a Stripe product link. Verify in KlickTipp:
The contact appears with email and opt-in status.
Custom fields for Stripe are filled.
The campaign tag is correctly applied and the confirmation email is sent.
⚠️ Note: Use real or test-mode API keys in Stripe depending on your testing environment. Stripe events may take a few seconds to propagate.

Campaign Expansion Ideas
Launch targeted upsell flows based on the product tag.
Use confirmation placeholders like: [[Stripe | Products]], [[Stripe | Total]], [[Stripe | Payment ID]]
Route customers to different product access portals (e.g., Memberspot, Mentortools).
Send follow-up content over multiple days using KlickTipp sequences.

Customization
You can extend the scenario using a switch node to:
Assign different tags per used payment link
Branch into upsell or membership activation flows
Chain additional automations like CRM entry, Slack notification, or invoice creation.

Resources:
Use KlickTipp Community Node in n8n
Automate Workflows: KlickTipp Integration in n8n
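The field mapping from the Stripe checkout session into the KlickTipp custom fields created above can be sketched like this. Note this uses a simplified view of Stripe's session object: customer_details, amount_total, and payment_intent are real session fields, but product_names and receipt_url are stand-ins (in Stripe, line items and receipt URLs live on related objects, which the webhook node may already expand for you).

```python
def map_checkout_to_klicktipp(session: dict) -> dict:
    """Map a (simplified) checkout.session.completed payload onto the
    KlickTipp custom fields defined in the setup table."""
    return {
        "email": session["customer_details"]["email"],
        "Stripe | Products": ", ".join(session.get("product_names", [])),
        "Stripe | Total": session["amount_total"] / 100,  # Stripe amounts are in the smallest currency unit
        "Stripe | Payment ID": session["payment_intent"],
        "Stripe | Receipt URL": session.get("receipt_url", ""),
    }
```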
by Yaron Been
Generate Custom Text Content with IBM Granite 3.3 8B Instruct AI

This workflow connects to Replicate's API and uses the ibm-granite/granite-3.3-8b-instruct model to generate text.

✅ 🔵 SECTION 1: Trigger & Setup

⚙️ Nodes
1️⃣ On clicking 'execute'
**What it does:** Starts the workflow manually when you hit **Execute**.
**Why it's useful:** Perfect for testing text generation on-demand.
2️⃣ Set API Key
**What it does:** Stores your **Replicate API key** securely.
**Why it's useful:** You don't hardcode credentials into HTTP nodes; just set them once here.
**Beginner tip:** Replace YOUR_REPLICATE_API_KEY with your actual API key.

💡 Beginner Benefit
✅ No coding needed to handle authentication.
✅ You can reuse the same setup for other Replicate models.

✅ 🤖 SECTION 2: Model Request & Polling

⚙️ Nodes
3️⃣ Create Prediction (HTTP Request)
**What it does:** Sends a **POST request** to Replicate's API to start a text generation job.
**Parameters include:** temperature, max_tokens, top_k, top_p.
**Why it's useful:** Controls how creative or focused the AI text output will be.
4️⃣ Extract Prediction ID (Code)
**What it does:** Pulls the **prediction ID** and builds a URL for checking status.
**Why it's useful:** Replicate jobs run asynchronously, so you need the ID to track progress.
5️⃣ Wait
**What it does:** Pauses for **2 seconds** before checking the prediction again.
**Why it's useful:** Prevents spamming the API with too many requests.
6️⃣ Check Prediction Status (HTTP Request)
**What it does:** Polls the Replicate API for the **current status** (e.g., starting, processing, succeeded).
**Why it's useful:** Lets you loop until the AI finishes generating text.
7️⃣ Check If Complete (IF Condition)
**What it does:** If the status is **succeeded**, it goes to "Process Result." Otherwise, it loops back to **Wait** and retries.
**Why it's useful:** Creates an automated polling loop without writing complex code.

💡 Beginner Benefit
✅ No need to manually refresh or check job status.
✅ Workflow keeps retrying until text is ready.
✅ Smart looping built in with Wait + If Condition.

✅ 🟢 SECTION 3: Process & Output

⚙️ Nodes
8️⃣ Process Result (Code)
**What it does:** Collects the final **AI output**, status, metrics, and timestamps.
**Adds info like:**
✅ output → Generated text
✅ model → ibm-granite/granite-3.3-8b-instruct
✅ metrics → Performance data
**Why it's useful:** Gives you a neat, structured JSON result that's easy to send to Sheets, Notion, or any app.

💡 Beginner Benefit
✅ Ready-to-use text output.
✅ Easy integration with any database or CRM.
✅ Transparent metrics (when it started, when it finished, etc.).

✨ FULL FLOW OVERVIEW

| Section | What happens |
| --- | --- |
| ⚡ Trigger & Setup | Start workflow + set Replicate API key. |
| 🤖 Model Request & Polling | Send request → get Prediction ID → loop until job completes. |
| 🟢 Process & Output | Extract clean AI-generated text + metadata for storage or further workflows. |

📌 How You Benefit Overall
✅ No coding needed: just configure your API key.
✅ Reliable polling: the workflow waits until results are ready.
✅ Flexible: you can extend output to Google Sheets, Slack, Notion, or email.
✅ Beginner-friendly: clean separation of input, process, and output.

✨ With this workflow, you've turned Replicate's IBM Granite LLM into a no-code text generator, running entirely inside n8n! ✨
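The Wait → Check Status → If Complete loop from Section 2 can be sketched in plain Python using only the standard library. The endpoint path matches Replicate's public predictions API; verify the auth scheme against Replicate's current docs before relying on it.

```python
import json
import time
import urllib.request

API_BASE = "https://api.replicate.com/v1/predictions"

def get_prediction(api_key: str, prediction_id: str) -> dict:
    """Fetch the current state of a prediction (the 'Check Status' step)."""
    req = urllib.request.Request(
        f"{API_BASE}/{prediction_id}",
        headers={"Authorization": f"Bearer {api_key}"},  # check Replicate docs for the current auth scheme
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def is_finished(prediction: dict) -> bool:
    """A prediction is terminal once it leaves the starting/processing states."""
    return prediction.get("status") in ("succeeded", "failed", "canceled")

def wait_for_result(api_key: str, prediction_id: str, delay: float = 2.0, max_attempts: int = 150) -> dict:
    """Mirror the Wait -> Check -> If Complete loop: poll every `delay`
    seconds until the prediction reaches a terminal status."""
    for _ in range(max_attempts):
        prediction = get_prediction(api_key, prediction_id)
        if is_finished(prediction):
            return prediction
        time.sleep(delay)
    raise TimeoutError("prediction did not finish in time")
```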
by Kev
Automatically create and publish ready-to-post social media news updates — all powered by AI. This workflow turns any RSS feed into professional, branded posts, complete with visuals and captions. Use cases include automating news updates, sharing industry insights, or maintaining an active social presence without manual work. Good to know Fully automated end-to-end publishing — from RSS feed to social post Uses JsonCut for dynamic image composition (backgrounds, text overlays, logos) Publishes directly to Instagram (or other channels) via Blotato Utilizes OpenAI GPT-5 for post text and image prompt generation Polling mechanism checks job status every 3 seconds Setup time: under 10 minutes once credentials are in place How it works The RSS Trigger monitors any RSS feed for new content. OpenAI GPT-5 rewrites the headline and creates a short, social-friendly post caption. An AI image prompt is generated to match the article’s topic and mood. JsonCut combines the background, logo, and headline text into a branded image. Once the image is ready, Blotato uploads and publishes the post directly to Instagram (or other connected platforms). The process runs completely automatically — no human input required after setup. How to use Import the workflow into your n8n instance. Configure your RSS feed URL(s). Add your JsonCut, Blotato, and OpenAI credentials. Activate the workflow — it will automatically generate and post new content whenever your RSS source updates. Requirements Free account at jsoncut.com Account at blotato.com (paid service — can be replaced with any social media API or publishing platform) API keys for both services: JsonCut API Key via app.jsoncut.com Blotato API Key via www.blotato.com OpenAI credential** (GPT-5 or compatible model) RSS Feed URL** (e.g. from a news site, blog, or press page) Setup steps Sign up for a free account at app.jsoncut.com. If you use Blotato, create an account at blotato.com and generate an API key. 
In n8n, add: JsonCut API Key (HTTP Header Auth, header: x-api-key) Blotato API credential (optional — can be replaced) OpenAI credential for GPT-5 Replace the example RSS URL in the RSS Feed Trigger node with your own. Activate the workflow — it will start monitoring, generating, and posting automatically. Customising this workflow You can easily adjust: The image layout and branding (in the “Create JsonCut Job” node) The tone or length of social captions (in the “Create Instagram Text” node prompt) The publishing platform — replace Blotato with another integration (e.g. Buffer, Hootsuite, or native social API) Posting frequency via the RSS trigger interval For advanced customization, check out: JsonCut Documentation JsonCut Image Generation Examples Blotato Website n8n Documentation
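The JsonCut credential configured above is plain header auth with an x-api-key header. As a sketch, building such a request looks like this in Python; the URL and payload shape are placeholders, not the real JsonCut job schema:

```python
import json
import urllib.request

def jsoncut_request(url: str, api_key: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request carrying the x-api-key header,
    mirroring the HTTP Header Auth credential used in the workflow."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"x-api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
```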
by SerpApi
Google Play Store App Rank and Rating Monitoring

What and who this is for
This workflow will be useful for anyone looking to do SEO tracking on the Google Play Store. It automates checking Google Play Store rank positions and average ratings for a list of app titles. The SerpApi component can also be modified to use other APIs for anyone looking for SEO tracking on any other search engine supported by SerpApi.

How it works
This workflow takes in a list of keywords and app titles to identify the apps' rank in Google Play Store search results. It also grabs the average rating of each app. The search uses SerpApi's Google Play Store API. The results are then synced to two different sheets in a Google Sheet. The first is a log of all past runs; the latest results are appended to the bottom of the log. The second updates a kind of "dashboard" to show the results from the latest run. The workflow includes a Wait node that delays 4 seconds between each app title and keyword pair to avoid hitting the default Google Sheets API per-minute rate limit. You can delete this if you have a high enough custom rate limit on the Google Sheets API. The Schedule Trigger is configured to run at 10 AM UTC every day.

How to use
Create a free SerpApi account here: https://serpapi.com/
Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
Connect your Google Sheets account to n8n. Help available here: https://n8n.io/integrations/google-sheets/
Copy this Google Sheet to your own Google account: https://docs.google.com/spreadsheets/d/1DiP6Zhe17tEblzKevtbPqIygH3dpPCW-NAprxup0VqA/edit?gid=1750873622#gid=1750873622
Set your own list of keywords and app titles to match in the 'Latest Run' sheet. This is the source list used to run the searches and must be set.
Connect your Google Sheet in the 'Get Keywords and Titles to Match' Google Sheet node Connect your Google Sheet in the 'Update Rank & Rating Log' Google Sheet node Connect your Google Sheet again in the 'Update Latest Run' Google Sheet node (Optional) Update the schedule or disable the schedule to only run manually Documentation SerpApi Google Play Store API SerpApi n8n Node Intro Guide
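The rank-matching step (finding an app title's position within the ordered search results) can be sketched like this; the title and rating field names follow SerpApi-style result objects but are assumptions here:

```python
def find_rank(results, app_title):
    """Scan ordered Play Store search results for an exact title match.
    Returns (rank, rating), with rank 1 being the top result, or
    (None, None) if the app is not in the results."""
    for position, item in enumerate(results, start=1):
        if item.get("title") == app_title:
            return position, item.get("rating")
    return None, None
```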
by Francisco Rivera
What this template does
Connect a Vapi AI voice agent to Google Calendar to capture contact details and auto-book appointments. The agent asks for name, address, service type, and a preferred time. The workflow checks availability and either proposes times or books the slot, no code needed.

How it works (node map)
Webhook (Production URL = Vapi Server URL): receives tool calls from Vapi and returns results.
**CONFIGURATION (EDIT ME)**: your timezone, work hours, meeting length, buffers, and cadence.
**Route by Tool Name**: routes Vapi tool calls:
checkAvailability → calendar lookup path
bookAppointment → create event path
**Get Calendar Events (EDIT ME)**: reads events for the requested day.
Calculate Potential Slots / Filter for Available Slots: builds conflict-free options with buffers.
Respond with Available Times: returns formatted slots to Vapi.
**Book Appointment in Calendar (EDIT ME)**: creates the calendar event with details.
**Booking Confirmation**: returns success back to Vapi.

> Sticky notes in the canvas show exactly what to edit (required by n8n). No API keys are hardcoded; Google uses OAuth credentials.

Requirements
n8n (Cloud or self-hosted)
Google account with Calendar (OAuth credential in n8n)
Vapi account + one Assistant

Setup (5 minutes)
A) Vapi → n8n connection
Open the Webhook node and copy the Production URL.
In Vapi → Assistant → Messaging, set Server URL = that Production URL.
In Server Messages, enable only toolCalls.
B) Vapi tools (names must match exactly)
Create two Custom Tools in Vapi and attach them to the assistant:
Tool 1: checkAvailability
Arguments: initialSearchDateTime (string, ISO-8601 with timezone offset, e.g. 2025-09-09T09:00:00-05:00)
Tool 2: bookAppointment
Arguments: startDateTime (string, ISO-8601 with tz), endDateTime (string, ISO-8601 with tz), clientName (string), propertyAddress (string), serviceType (string)
> The Switch node routes based on the tool name.
C) Configure availability
Open 1.
CONFIGURATION (EDIT ME) and set: D) Connect Google Calendar Open 2. Get Calendar Events (EDIT ME) → Credentials: select/create Google Calendar OAuth. Then choose the calendar to check availability. Open 3. Book Appointment in Calendar (EDIT ME) → use the same credential and same calendar to book. E) Activate & test Toggle the workflow Active. Call your Vapi number (or start a session) and book a test slot. Verify the event appears with description fields (client, address, service type, call id). Customising Change summary/description format in 3. Book Appointment. Add SMS/Email confirmations, CRM sync, rescheduling, or analytics as follow-ups (see sticky note “I’m a note”). Troubleshooting No response back to Vapi** → confirm Vapi is set to send toolCalls only and the Server URL matches the Production URL. Switch doesn’t route** → tool names must be exactly checkAvailability and bookAppointment. No times returned** → ensure timezone + work hours + cadence generate at least one future slot; confirm Google credential and calendar selection. Event not created** → use the same Google credential & calendar in both nodes; check OAuth scopes/consent. Security & privacy Google uses OAuth; credentials live in n8n. No API keys hardcoded. Webhook receives only the fields needed to check times or book.
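The Calculate Potential Slots / Filter for Available Slots logic can be sketched as a pure function. The parameter names mirror the CONFIGURATION node conceptually, not exactly, and the overlap rule here (meeting plus buffer must not touch any existing event) is one reasonable interpretation of "conflict-free options with buffers":

```python
from datetime import datetime, timedelta

def potential_slots(day_start, day_end, meeting_minutes, buffer_minutes,
                    cadence_minutes, busy):
    """Walk the workday at the configured cadence and keep each start time
    whose meeting-plus-buffer window does not overlap any (start, end)
    pair in `busy`. Returns a list of candidate start datetimes."""
    slots = []
    step = timedelta(minutes=cadence_minutes)
    span = timedelta(minutes=meeting_minutes + buffer_minutes)
    current = day_start
    while current + timedelta(minutes=meeting_minutes) <= day_end:
        window_end = current + span
        if all(window_end <= s or current >= e for s, e in busy):
            slots.append(current)
        current += step
    return slots
```

For example, with a 9:00 to 12:00 workday, 30-minute meetings, no buffer, a 60-minute cadence, and one 10:00 to 11:00 event, this proposes 9:00 and 11:00.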
by Harshil Agrawal
This workflow automatically creates an event in PostHog when a request is made to a webhook URL. Prerequisites A PostHog account and credentials Nodes Webhook node triggers the workflow when a URL is accessed. PostHog node creates a new event in PostHog.
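The event the PostHog node creates corresponds to a capture payload like the one below. Building it as a dict is a sketch; the api_key, event, distinct_id, and properties fields match PostHog's public capture API, but the endpoint and extra options depend on your PostHog instance:

```python
def posthog_event(api_key, distinct_id, event, properties=None):
    """Build a minimal PostHog capture payload, as the PostHog node would
    send for a webhook request."""
    return {
        "api_key": api_key,
        "event": event,
        "distinct_id": distinct_id,
        "properties": properties or {},
    }
```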
by Ossian Madisson
This n8n template allows you, on a schedule, to list all files that have been modified since the last execution in a Google Drive folder and in all its subfolders.

While Google Drive is accessible and easy to use, file listings via API are limited to either all files in the entire Drive or all files in a specific folder. This also means that the n8n triggers for Google Drive are limited to changes to a specific file or folder. This template is built to replace the built-in trigger nodes in situations where you need to trigger on new or changed files in a folder or any of its subfolders.

Use cases
Trigger a RAG pipeline to update with new or updated documents
Push newly uploaded or updated documents into a CMS, project management tool, or other external platform
Log changes to build an audit trail
Trigger a backup job or sync process only for files that have changed since the last run, saving bandwidth and processing time
Notify a team or client about new documents
Can also be run without the scheduling part to perform a one-time iteration of all files

Good to know
Works well if you attach a loop node to the "output node" to run additional actions on the files
The workflow is designed to use a minimal amount of custom code, preferring built-in n8n nodes
Does not identify file removals

How it works
Recursively executes a subworkflow for each folder in the main folder
Each subworkflow execution sends a list of all files in the folder to an "output node" that checks if the files were created or modified since the last execution
When all subworkflows have been executed, the files in the main folder are sent to the "output node"
A persistent variable (time of trigger node activation) is set for timestamp comparison on the next execution (this is only set on non-manually triggered active workflow executions)

How to use
Set the schedule interval in the trigger node (default: every 60 minutes)
Add Google Drive credentials to the four Google Drive nodes
Define your main/root folder in the two nodes inside the red box
Connect your workflow to process the files after the node in the yellow box; please note that there will be one output per folder
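The recursive subworkflow pattern (list a folder, keep files changed since the last run, recurse into subfolders) can be sketched as follows. The list_children callable stands in for the Google Drive node; each file dict is assumed to carry a modifiedTime ISO timestamp, as the Drive API returns:

```python
from datetime import datetime, timezone

def changed_files(folder_id, list_children, last_run):
    """Return every file modified after `last_run` in this folder and all
    subfolders. `list_children(folder_id)` must return
    (files, subfolder_ids), where each file has a 'modifiedTime' string."""
    files, subfolders = list_children(folder_id)
    changed = [
        f for f in files
        if datetime.fromisoformat(f["modifiedTime"].replace("Z", "+00:00")) > last_run
    ]
    for sub in subfolders:
        changed.extend(changed_files(sub, list_children, last_run))
    return changed
```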
by David Olusola
🚀 Automated Lead Management: Google Sheets → Instantly + n8n Data Tables 📋 Overview This workflow automates lead management by syncing data from Google Sheets to Instantly email campaigns while maintaining tracking through n8n Data Tables. It processes leads in batches to avoid rate limits and ensures no duplicates are sent. ⚙️ Complete Setup Guide 1️⃣ Create Your Google Sheet Option A: Use Our Template (Recommended) Copy this template with test data: Google Sheets Template Click File → Make a copy to create your own version Populate with your lead data Option B: Create Your Own Create a Google Sheet with these required columns: Firstname - Contact's first name Email - Contact's email address Website - Company website URL Company - Company name Title - Job title/position 💡 Pro Tip: Add as many leads as you want - the workflow handles batching automatically! 2️⃣ Set Up n8n Data Table The workflow uses one Data Table to track leads and their sync status. Create the "Leads" Data Table: In your n8n workflow editor, add a Data Table node Click "Create New Data Table" Name it: Leads Add the following columns: | Column Name | Type | Purpose | |------------|------|---------| | Firstname | string | Contact's first name | | Lastname | string | Contact's last name | | email | string | Contact's email (unique identifier) | | website | string | Company website | | company | string | Company name | | title | string | Job title | | campaign | string | Sync status (e.g., "start", "added to instantly") | | focusarea | string | Enriched data from Title field | Click Save 📌 Important: The campaign field is crucial - it tracks which leads have been synced to prevent duplicates! 
3️⃣ Connect Your Google Sheets Account In the "Get row(s) in sheet" node, click "Create New Credential" Select Google Sheets OAuth2 API Follow the OAuth flow: Sign in with your Google account Grant n8n permission to access your sheets Select your spreadsheet from the dropdown Choose the correct sheet name (e.g., "instantly leads") Test the connection to verify it works 4️⃣ Connect Your Instantly Account Go to Instantly.ai and log in Navigate to Settings → API Copy your API Key Back in n8n, open the "Create a lead" node Click "Create New Credential" Select Instantly API Paste your API key Important: Update the campaign ID: Current ID: 100fa5a2-3ed0-4f12-967c-b2cc4a07c3e8 (example) Replace with your actual campaign ID from Instantly Find this in Instantly under Campaigns → Your Campaign → Settings 5️⃣ Configure the Data Table Nodes You'll need to update three Data Table nodes to point to your newly created "Leads" table: Node 1: "Get row(s)" Operation: Get Data Table: Select Leads Filter: campaign = "start" This fetches only new, unsynced leads Node 2: "Update row(s)1" (Top Flow) Operation: Update Data Table: Select Leads Filter: Match by email field Update: Set focusarea to Title value This enriches lead data Node 3: "Update row(s)" (Bottom Flow) Operation: Update Data Table: Select Leads Filter: Match by Email field Update: Set campaign = "added to instantly" This prevents duplicate sends 6️⃣ Configure the Schedule (Optional) The workflow includes a Schedule Trigger for automation: Default: Runs every hour To customize: Click the "Schedule Trigger" node Choose your interval: Every 30 minutes Every 2 hours Daily at specific time Custom cron expression 💡 For testing: Use the "When clicking 'Execute workflow'" manual trigger instead! 
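The duplicate-prevention scheme configured above (filtering on the campaign column, then looping in batches) can be sketched as a pure function:

```python
def unsynced_batches(leads, batch_size=30):
    """Mirror the Data Table filter plus the Loop Over Items node: keep only
    leads still marked 'start' in the campaign column, then split them into
    batches of `batch_size` for processing."""
    fresh = [lead for lead in leads if lead.get("campaign") == "start"]
    return [fresh[i:i + batch_size] for i in range(0, len(fresh), batch_size)]
```

After each batch is sent to Instantly, the rows are updated to campaign = "added to instantly", so they no longer match the filter on the next run.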
🔄 How It Works Flow 1: Data Transfer (Top Path) This flow moves leads from Google Sheets → n8n Data Table Manual Trigger → Get Google Sheets → Batch Split (30) → Update Data Table → Loop Step-by-step: Manual Trigger - Click to start the workflow manually Get row(s) in sheet - Fetches ALL leads from your Google Sheet Loop Over Items - Splits into batches of 30 leads Update row(s)1 - For each lead: Searches Data Table by email Updates or creates the lead record Stores Title → focusarea for enrichment Loop continues - Processes next batch until all leads transferred ⚙️ Why 30 at a time? Prevents API timeouts Respects rate limits Allows monitoring of progress Can be adjusted in the node settings Flow 2: Instantly Sync (Bottom Path) This flow syncs qualified leads from Data Table → Instantly Schedule Trigger → Get Data Table (filtered) → Individual Loop → Create in Instantly → Update Status Step-by-step: Schedule Trigger - Runs automatically (every hour by default) Get row(s) - Queries Data Table for leads where campaign = "start" Only fetches NEW, unsynced leads Ignores leads already processed Loop Over Items1 - Processes ONE lead at a time Create a lead - Sends lead to Instantly: Campaign: "Launchday 1" Maps: Email, Firstname, Company, Website Adds to email sequence Update row(s) - Updates Data Table: Sets campaign = "added to instantly" Prevents duplicate sends on next run Loop continues - Next lead until all processed 🔍 Why one at a time? 
Instantly API works best with individual requests Ensures accurate status tracking Prevents partial failures Better error handling per lead ✅ Key Features Explained Batch Processing Processes 30 Google Sheet leads at once Configurable in Loop Over Items node Prevents timeouts on large datasets Duplicate Prevention Uses campaign field as status tracker Only syncs leads where campaign = "start" Updates to "added to instantly" after sync Re-running workflow won't create duplicates Data Enrichment Stores job title in focusarea field Can be used for personalization later Extensible for additional enrichment Two-Trigger System Manual Trigger**: For testing and one-time runs Schedule Trigger**: For automated hourly syncs Both triggers use the same logic Error Tolerance Individual lead processing prevents cascade failures One failed lead won't stop the entire batch Easy to identify and fix problematic records 🧪 Testing Your Workflow Step 1: Test Data Transfer (Flow 1) Add 5 test leads to your Google Sheet Click the Manual Trigger node Click "Execute Node" Check your Leads Data Table - should see 5 new rows Verify focusarea field has data from Title column Step 2: Test Instantly Sync (Flow 2) In Data Table, ensure at least one lead has campaign = "start" Click the Schedule Trigger node Click "Execute Node" (bypasses schedule for testing) Check Instantly dashboard - should see new lead(s) Check Data Table - campaign should update to "added to instantly" Step 3: Test Duplicate Prevention Re-run the Schedule Trigger No new leads should be created in Instantly Data Table shows no changes (already marked as synced) 🚨 Troubleshooting Issue: Google Sheets not fetching data ✅ Check OAuth credentials are valid ✅ Verify spreadsheet ID in node settings ✅ Ensure sheet name matches exactly ✅ Check Google Sheet has data Issue: Data Table not updating ✅ Verify Data Table exists and is named "Leads" ✅ Check column names match exactly (case-sensitive) ✅ Ensure email field is populated (used 
for matching) Issue: Instantly not receiving leads ✅ Verify Instantly API key is correct ✅ Update campaign ID to your actual campaign ✅ Check campaign = "start" in Data Table ✅ Verify email format is valid Issue: Workflow runs but nothing happens ✅ Check if Data Table has leads with campaign = "start" ✅ Verify loop nodes aren't stuck (check execution logs) ✅ Ensure batch size isn't set to 0 💡 Pro Tips & Best Practices For Beginners: Start small - Test with 5-10 leads first Use manual trigger - Don't enable schedule until tested Check each node - Execute nodes individually to debug Monitor Data Table - Use it as your source of truth Keep backups - Export Data Table regularly For Optimization: Adjust batch size - Increase to 50-100 for large datasets Add delays - Insert "Wait" nodes if hitting rate limits Filter in Google Sheets - Only fetch new rows (use formulas) Archive old leads - Move synced leads to separate table Add error notifications - Connect Slack/email for failures For Scaling: Use multiple campaigns - Add campaign selection logic Implement retry logic - Add "IF" nodes to retry failed syncs Add data validation - Check email format before syncing Log everything - Add "Set" nodes to track execution details Monitor API usage - Track Instantly API quota 📊 Expected Results After Setup: ✅ Google Sheets connected and fetching data ✅ Data Table populated with lead information ✅ Instantly receiving leads automatically ✅ No duplicate sends occurring ✅ Campaign status updating correctly Performance Metrics: 100 leads** - Processes in ~5-10 seconds 1000 leads** - Processes in ~15-20 seconds Instantly API** - 1 lead per second typical speed Schedule runs** - Every hour by default 📬 Need Help? 
Customization Services:
Advanced filtering and segmentation
Multi-campaign management
Custom field mapping and enrichment
Webhook integrations for real-time sync
Error handling and monitoring setup
Scale to 10K+ leads per day
Contact: 📧 david@daexai.com 🎥 Watch Full Tutorial

🎓 What You'll Learn
By setting up this workflow, you'll master:
✅ n8n Data Tables - Creating, querying, and updating data
✅ Batch Processing - Handling large datasets efficiently
✅ API Integrations - Connecting Google Sheets and Instantly
✅ Workflow Logic - Building complex multi-path automations
✅ Error Prevention - Implementing duplicate checking
✅ Scheduling - Automating workflows with triggers

Happy Flowgramming! 🎉