by Ayis Saliaris Fasseas
**How It Works**

1. Starts with a Manual Trigger
2. Reads the lead list from a Google Sheet
3. Filters rows where the email wasn't sent
4. Generates a personalized email body (AI)
5. Generates an email subject line (AI)
6. Merges AI outputs with the original row data
7. Creates a Gmail draft
8. Updates the Google Sheet with the email content and date
9. Waits 3 seconds between updates to avoid API limits

**Setup Steps**

1. Connect Google Sheets, Gmail, and OpenAI credentials
2. Check sheet column names (business_name, email, contact_name, city, business_type, email_sent)
3. Run the Manual Trigger to test one row
4. Adjust the AI prompts if needed

**Customization**

- Add a unique ID column to match rows if needed
- Change the AI prompts to adjust email style
- Increase the wait time to avoid rate limits

**Use Cases**

- Draft cold emails for review before sending
- Automate lead outreach while keeping human oversight
- Generate personalized emails and subject lines quickly

**Troubleshooting Tips**

- Draft not created → check Gmail credentials and scopes
- Sheet not updating → check that a matching column exists
- AI outputs empty → increase tokens or check the response path
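The "merge AI outputs with original row data" step can be sketched as a small n8n Code-node helper. This is a sketch only: the field names `email_body` and `subject_line` are assumptions for illustration, not the template's actual output keys; match them to your own sheet columns and AI node outputs.

```javascript
// Sketch: combine AI-generated subject/body with the original sheet row.
// email_body, subject_line, and email_sent are illustrative field names —
// verify them against your actual sheet columns and AI output keys.
function mergeRowWithAi(row, aiBody, aiSubject) {
  return {
    ...row,                                // keep business_name, email, etc.
    email_body: aiBody.trim(),
    subject_line: aiSubject.trim(),
    email_sent: new Date().toISOString().slice(0, 10), // date written back to the sheet
  };
}

const row = { business_name: 'Acme Co', email: 'hi@acme.test', email_sent: '' };
const merged = mergeRowWithAi(row, ' Hello Acme team... ', ' Quick question ');
```

Because the merged object carries both the original row fields and the new AI fields, a single Google Sheets "update" node downstream can write everything back in one pass.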
by isaWOW
Automatically track domain expiry dates from Google Sheets, fetch real-time DNS expiry data via the WHOIS API, and update expiry details back to your sheet with zero manual effort.

**Automated Domain Expiry Date Tracker with Google Sheets & WHOIS API**

Automate the entire process of monitoring domain expiry dates for all your websites directly from Google Sheets. This workflow reads domain names, fetches DNS SOA expiry information using the WHOIS API, converts timestamps into readable dates, and updates expiry details back into your tracking sheet, fully automated and rate-limit safe. Perfect for SEO teams, agencies, hosting managers, and businesses managing large domain portfolios.

**What this workflow does**

This automation handles four key tasks:

1. **Reads domain data:** pulls all website domains directly from a Google Sheet
2. **Fetches expiry details:** uses the WHOIS API to retrieve DNS SOA records for each domain
3. **Processes expiry dates:** converts expiry timestamps into human-readable DD-MM-YYYY format and extracts the expiry month and year automatically
4. **Updates the tracking sheet:** writes the expiry date, month name, and year back to Google Sheets, processing domains one by one with a controlled delay to avoid API limits

**How it works**

The workflow starts manually and loads configuration values such as the Google Sheet ID and sheet name. It reads all domains listed in the Websites column and processes them in a loop. For each domain, the workflow calls the WHOIS API to fetch DNS SOA records. The expiry timestamp is extracted, converted into a readable date format, and enriched with expiry month and year values. Once processed, the workflow updates the same Google Sheet row with the new expiry information. A 30-second pause is applied before moving to the next domain to ensure API safety and stability.

**Setup requirements**

Accounts needed:

- n8n instance (self-hosted or cloud)
- Google account with Google Sheets access
- RapidAPI account with WHOIS API access

Estimated setup time: 10 minutes

**Setup steps**

1. **Import the workflow:** copy the workflow JSON, open n8n → Workflows → Import from JSON, paste and import, then verify all nodes are connected correctly.
2. **Configure Google Sheets:** create a Google Sheet with a Websites column, add a Google Sheets OAuth2 credential in n8n, and paste your Sheet ID and sheet name into the Set Sheet Configuration node.
3. **Configure the WHOIS API:** get your RapidAPI WHOIS API key, add it to the Fetch DNS Records via WHOIS API HTTP Request node, and test the API request.
4. **Verify data mapping:** ensure expiry values map correctly to Domain Expiry, Expiry Month, and Expiry Year.
5. **Run and monitor:** run the workflow manually, check the execution logs, and verify that expiry data updates correctly in Google Sheets.

**What data gets updated**

Domain data:

- Domain name
- Expiry date (DD-MM-YYYY)
- Expiry month (January–December)
- Expiry year

Sheet updates:

- Existing rows are matched using the Websites column
- No duplicate rows are created

**Use cases**

- **SEO management:** prevent domain expiries that can hurt rankings
- **Agency operations:** track client domains in one central sheet
- **Hosting monitoring:** stay ahead of renewal deadlines
- **Portfolio management:** manage hundreds of domains automatically

**Important notes**

- Replace the WHOIS API key before activating
- Google Sheets column names must match exactly
- The workflow runs sequentially, one domain at a time, to avoid rate limits
- Expiry accuracy depends on DNS SOA availability

**Support**

Need help or custom development?
📧 Email: info@isawow.com
🌐 Website: https://isawow.com/
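The date-processing step described above (expiry timestamp → DD-MM-YYYY, plus month name and year) could look like this in an n8n Code node. This is a sketch that assumes the WHOIS API returns the expiry as a Unix timestamp in seconds; adjust the input handling if your response differs.

```javascript
// Sketch: convert a Unix expiry timestamp (seconds) into the DD-MM-YYYY
// format, month name, and year that get written back to the sheet.
// Uses UTC so the result doesn't depend on the n8n server's timezone.
const MONTHS = ['January', 'February', 'March', 'April', 'May', 'June',
                'July', 'August', 'September', 'October', 'November', 'December'];

function formatExpiry(unixSeconds) {
  const d = new Date(unixSeconds * 1000);
  const pad = (n) => String(n).padStart(2, '0');
  return {
    date: `${pad(d.getUTCDate())}-${pad(d.getUTCMonth() + 1)}-${d.getUTCFullYear()}`,
    month: MONTHS[d.getUTCMonth()],
    year: d.getUTCFullYear(),
  };
}

const expiry = formatExpiry(1767139200); // 31 Dec 2025, 00:00 UTC
```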
by Sabrina Ramonov 🍄
**Description**

This fully automated AI Twin Viral News system researches the latest trending news in any niche or industry, then generates talking-head AI clone videos WITHOUT having to film or edit yourself. It can easily be tailored to any particular niche simply by updating the Perplexity research prompt. It combines ChatGPT, Perplexity, HeyGen, and Blotato to research, create, and auto-post talking-head AI avatar videos to every social media platform, every single day.

**Who Is This For?**

This is perfect for digital marketing agencies, small business owners, entrepreneurs, content creators, influencers, and social media agencies.

**How It Works**

1. **Trigger Schedule:** configured to run once daily at 10am
2. **AI Researcher:** calls Perplexity to research the top 10 latest news stories in your industry, selects the story most likely to go viral, and compiles a detailed factual report on it
3. **AI Writer:** writes the monologue script, video caption, and short title
4. **Create Avatar Video:** calls the HeyGen API (requires a paid API plan), specifying your avatar ID and voice ID; optionally pass an image/video background if you have a green-screen avatar
5. **Get Video:** waits a while, then fetches the completed avatar video and uploads it to Blotato
6. **Publish to Social Media via Blotato:** connect your Blotato account, choose your social accounts, and either post immediately or schedule for later

**Setup & Required Accounts**

- Sign up for Perplexity.ai and set up your API billing
- Generate your Perplexity API key: https://www.perplexity.ai/account/api/keys
- Sign up for Blotato.com
- Generate a Blotato API key via Settings > API > Generate API Key (paid feature only)
- Ensure "Verified Community Nodes" is enabled in the n8n Admin Panel
- Install the "Blotato" verified community node and create a Blotato credential
- Sign up for the HeyGen web plan and API plan
- Paste your HeyGen API key, HeyGen avatar ID, and HeyGen voice ID
- Complete the 2 setup steps in the BROWN sticky notes in this template

**Optional: Background Image/Video Behind Avatar**

- Ensure you have an avatar with the background removed (requires a higher-tier HeyGen plan)
- Open the SETUP HEYGEN node and set the parameter 'has_background_video' to true
- Open the SETUP HEYGEN node and replace the video URL in the parameter 'background_video_url'
- Recommendation: only enable this after the basic workflow is operational, i.e. you can make avatar videos without a background

**Tips & Tricks**

- Your Perplexity API account must have billing funded
- The HeyGen API requires a paid plan
- Make sure you copied your avatar ID correctly (not the group avatar ID)
- If your script is long, the video takes more time to finish
- While testing: enable only 1 social platform and deactivate the rest; update the AI writer prompt to return a 5-second script instead of 30 seconds to reduce processing time; go to HeyGen and check that your avatar video is being processed; after the workflow finishes, check your social media account for the final post; if successful, enable another social media node and continue testing

**📄 Documentation**

Full Tutorial

**Troubleshooting**

Check your Blotato API Dashboard to see every request, response, and error. Click on a request to see the details.

**Need Help?**

In the Blotato web app, click the orange button in the bottom right corner. This opens the Support messenger where I help answer technical questions.

Connect with me: Linkedin | Youtube
by Moe Ahad
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How it works**

- Using the chat node, ask a question about information stored in your MySQL database
- The AI Agent converts your question to a SQL query
- The AI Agent executes the SQL query and returns a result
- The AI Agent can remember the previous 5 questions

**How to set up**

1. Add your OpenAI API key in the "OpenAI Chat Model" node
2. Add your MySQL credentials in the "SQL DB - List Tables and Schema" and "Execute a SQL Query in MySQL" nodes
3. Update the database name in the "SQL DB - List Tables and Schema" node: replace "your_query_name" under the Query field with your actual database name
4. After the above steps are completed, use the "When chat message received" node to ask a question about your data in plain English
by Łukasz
**What Is This?**

This workflow monitors your Elastic Email subaccounts daily and sends a Slack alert whenever an account's email credit balance drops below a configurable threshold. It's a simple but essential guard against unexpected sending failures caused by depleted credits.

**Who Is It For?**

Any team or agency managing multiple Elastic Email subaccounts (marketing departments, email service providers, or developers running automated email campaigns) who wants proactive warnings before credits run out.

**How Does It Work?**

The workflow runs once a day on a schedule. It calls the Elastic Email REST API to retrieve all subaccount data, then filters out any accounts with a credit balance below the minimum you define in the Config node. If any low-credit accounts are found, a Slack message listing each account's email address and current credit balance is sent. API errors are caught separately and also reported to Slack.

**How To Set It Up?**

Prerequisites:

- An Elastic Email account with API access
- A Slack workspace with bot permissions

Step 1: Set the credit threshold
In the Config node, set Minimum amount of Email Credits to the value below which you want to be notified (default: 100).

Step 2: Configure Elastic Email credentials
In the Load EE Subaccounts HTTP Request node, create a new Custom Auth credential with the following JSON:

{ "headers": { "X-ElasticEmail-ApiKey": "<Your EE Auth Token>" } }

You can generate an API token directly in your Elastic Email account settings.

Step 3: Configure Slack credentials
Connect your Slack workspace using OAuth2 in both Slack nodes, and update the channel ID to your desired notification channel.

Need help? Reach out at developers@sailingbyte.com or visit sailingbyte.com. Happy hacking!
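The filtering step (keeping only subaccounts below the configured minimum) could be sketched like this in an n8n Code node. The `email` and `emailcredits` field names here are assumptions for illustration; check them against the fields your Elastic Email API response actually returns.

```javascript
// Sketch: filter Elastic Email subaccounts below a credit threshold and
// build the alert lines for Slack. Field names (email, emailcredits) are
// illustrative — verify against the actual API response before relying on them.
function lowCreditAccounts(subaccounts, minimumCredits) {
  return subaccounts
    .filter((acc) => Number(acc.emailcredits) < minimumCredits)
    .map((acc) => `${acc.email}: ${acc.emailcredits} credits left`);
}

const sample = [
  { email: 'a@example.com', emailcredits: 50 },
  { email: 'b@example.com', emailcredits: 500 },
];
const alerts = lowCreditAccounts(sample, 100);
```

If `alerts` comes back empty, the workflow can simply skip the Slack node, which is what the threshold check in the Config node achieves.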
by Oneclick AI Squad
This n8n workflow automates the collection and analysis of real-time attendee feedback and engagement data during sessions or live polls. It generates actionable insights for organizers, streamlining the process of gathering, processing, and delivering feedback to enhance event management and attendee experience.

**Key Features**

- Collects session feedback and live poll responses in real time
- Analyzes sentiment and extracts key trends for actionable insights
- Delivers summarized reports and recommendations to organizers via multiple channels
- Supports seamless integration with external tools for data logging and communication

**Workflow Process**

1. The Webhook Trigger node captures incoming feedback or poll data from attendees, initiating the workflow.
2. The Extract Feedback Data node processes the raw input to organize and prepare it for analysis.
3. The Analyze Sentiment node uses AI to evaluate feedback sentiment and identify key themes or trends.
4. The Aggregate Feedback node compiles the analyzed data into a cohesive summary.
5. The Calculate Insights node generates actionable insights and recommendations based on the aggregated data.
6. The Check Urgency node assesses the priority of the feedback for timely responses or actions.
7. The Log to Google Sheets node records the feedback and insights for future reference.
8. The Webhook Response node sends real-time updates or acknowledgments back to the source.
9. The Post to Slack node delivers summary messages to organizers via Slack.
10. The Email Report to Organizers node sends detailed reports to organizers via email.

**Setup Instructions**

1. Import the workflow into n8n and configure the Webhook Trigger with your event platform's API credentials.
2. Set up the AI service for sentiment analysis and insight generation with a suitable language model.
3. Configure the Google Sheets integration for logging feedback data.
4. Set up Slack and email credentials for notifications and reports.
5. Test the workflow by sending sample feedback or poll responses to ensure proper data flow and analysis.
6. Monitor the output and adjust AI parameters or node settings as needed for optimal performance.

**Prerequisites**

- Webhook integration with the event platform or polling system
- AI/LLM service for sentiment analysis and insight generation
- Google Sheets account for data logging
- Slack workspace and email service for notifications and reports
- Access to real-time attendee data from the event platform

**Modification Options**

- Modify the Extract Feedback Data node to include specific data fields or custom parsing rules.
- Adjust the Analyze Sentiment node to focus on particular sentiment metrics or keywords.
- Customize the Calculate Insights node to prioritize certain types of recommendations.
- Add additional notification channels (e.g., Microsoft Teams) alongside the Post to Slack or Email Report nodes.
- Configure the Check Urgency node to include custom urgency criteria based on event needs.
by Matthew
**AI-Powered Personalized Email Nurturing**

This workflow automates sending tailored onboarding and follow-up emails based on lead commitment levels. It's designed for coaches, consultants, and service-based businesses who want to personalize their outreach and nurture prospects more effectively.

**How It Works**

1. **Triggered by New Data:** The workflow automatically starts whenever a new row is added to your specified Google Sheet.
2. **Assess Commitment:** An If node checks the "Commitment" score from your Google Sheet data. If the commitment score is 8 or higher, the lead is considered "warm" and receives a detailed onboarding email. If the commitment score is below 8, the lead is considered "colder" and receives a gentle nudge email.
3. **Craft Warm Email (for high commitment):** For "warm" leads, an OpenAI node generates an aspirational and confident first-touch email. This email highlights their goals, introduces your flagship offers, and invites them to a 30-minute discovery call.
4. **Craft Colder Email (for lower commitment):** For "colder" leads, a different OpenAI node generates an encouraging, non-pushy follow-up email. This email thanks them for sharing their goals, reinspires them with the ROI of personal branding, asks a reflective question, and offers a no-pressure Q&A session.
5. **Send Emails:** The appropriate Gmail node sends the personalized email to the lead.
6. **Record Activity:** Finally, the Google Sheets node appends the lead's information and details of the sent email to a separate "Contacts" sheet, keeping a clear record of all interactions.
**Google Sheet Structure**

Your primary Google Sheet (the one triggering the workflow) must have the following exact column headers:

- Email
- First name
- Last name
- Phone number
- Industry
- Title
- Three goals you wish to achieve this year
- 1-10 how commited are you to your goals

Your secondary Google Sheet (for appending contacts) must have these exact column headers:

- name
- email
- phone
- industry
- title
- three goals
- commitment

**Setup Instructions**

1. **Add Credentials:** In n8n, add your OpenAI API key via the Credentials menu, and connect your Google account via the Credentials menu for Google Sheets and Gmail access.
2. **Configure Google Sheets Trigger:** Select the Google Sheets Trigger node. Choose your Google Sheets credential, select your spreadsheet, and pick the specific sheet name that contains your incoming lead data.
3. **Configure Commitment Logic:** Select the "Commitment ≥ 8?" If node. The condition is set to ={{ $json['1-10 how commited are you to your goals'] }} >= 8. Ensure your Google Sheet column name for commitment matches exactly, including spacing and capitalization.
4. **Configure OpenAI Nodes:** Select both the "crafting warmer email" and "crafting colder email" nodes, and choose your OpenAI credential from the dropdown. Crucially: in both nodes' Messages section, replace the placeholder information with your actual company name, founder name, and any specific details about your flagship offers and booking links (https://calendly.com/brandied/discovery).
5. **Configure Gmail Nodes:** Select both the "Send warmer email" and "Send colder email" nodes and choose your Gmail credential.
6. **Configure Append Row in Sheet:** Select the "Append row in sheet" node. Choose your Google Sheets credential, then select the spreadsheet and sheet name where you want to log sent emails (your "Contacts" sheet). Verify that the column mappings correctly align with the "Google Sheet Structure" for your secondary sheet.

**Customization Options 💡**

- **Commitment Threshold:** Adjust the number in the "Commitment ≥ 8?" If node to change what constitutes a "warm" lead (e.g., gte 7 for a commitment score of 7 or higher).
- **Email Content & Tone:** Modify the prompts in the "crafting warmer email" and "crafting colder email" OpenAI nodes to fine-tune the email tone, length, specific calls to action, or any other elements to better suit your brand voice and strategy.
- **Follow-Up Cadence:** Instead of immediate sending, you could introduce a Wait node after the email crafting to space out follow-ups, or integrate with an email sequence tool.
- **CRM Integration:** Replace the final Google Sheets node with an integration to your CRM (e.g., HubSpot, Salesforce) to automatically log these interactions and enrich lead profiles directly within your CRM.
by Baptiste Fort
**Automatically Reply to Customer Emails with Airtable, Gmail, and OpenAI**

**Introduction**

This guide walks you step by step through setting up an automated agent that:

1. Receives emails sent by your customers.
2. Analyzes the content of the email.
3. Generates an appropriate response using an AI model (OpenAI GPT).
4. Stores all information (received email, AI response, date, customer email) in Airtable.
5. Automatically replies to the customer in the same Gmail thread.

**Prerequisites**

Before you start, you'll need:

- A Gmail account connected to n8n.
- An Airtable account.
- An n8n instance (cloud or self-hosted).
- An OpenAI API key.

**Prepare the Airtable Base**

No need to build everything from scratch: here's a ready-to-use base you can copy.
👉 Open the Airtable base

It already contains the following structure:

- **Subject** (text) → email subject.
- **Date** (date/time) → date of reception.
- **Customer Email** (text) → customer's email address.
- **Message** (long text) → body of the received email.
- **AI Response** (long text) → AI-generated reply.

You can reuse it as is or duplicate it into your Airtable account.

**1. Set Up the Gmail Trigger in n8n**

Now that we have our Airtable base ready, we need to capture customer emails. That's the job of the Gmail Trigger. Basically, this node lies in wait inside your inbox, and as soon as a new message arrives... bam, your workflow fires up.

Connect Your Gmail Account:

1. In n8n, add a Gmail Trigger node.
2. Click Credential to connect with and select your Gmail account.
3. If you haven't done it yet, click Add new, connect your Google account, and allow access.

Pro tip: don't worry, it won't read your personal emails to gossip; everything stays inside your workflow.

Basic Settings:

- **Poll Times**: select Every Minute. → This way, n8n checks your inbox every minute.
- **Mode**: Message Received. → You want the flow to trigger whenever a customer writes to you.
- **Event**: Message Received. → Same logic, keep it simple.
- **Simplify**: turn it off (OFF). → Why? Because if you enable "Simplify," you only get a stripped-down version of the email. And you want it all: subject, sender, raw message... the full package.

Expected Output:

When you execute the node, you should see:

- **id**: unique identifier of the email.
- **threadId**: conversation identifier (super useful to reply in the same thread).
- **snippet**: a short preview of the email (first lines).
- **From**: your customer's email address.
- **To**: your email address.
- **Subject**: the subject of the email.
- **payload**: the full body of the email (yep, in base64, but we'll handle that later).

And that's it: your Gmail Trigger is set up. In short, the moment a customer writes "Hey, I have an issue with my account," your workflow kicks in instantly (well, almost: it checks every minute).

**2. Set Up the AI Agent in n8n**

After configuring your Gmail Trigger (which captures incoming customer emails), you now need a brain to take over, analyze the email, and draft a reply. That's where the AI Agent node comes in.

Its Role:

The AI Agent node is used to:

- Read the email content (via the Gmail Trigger).
- Understand the context and tone of the customer.
- Generate a clear, concise, and human-like response.
- Prepare a personalized reply that will later be sent back via Gmail and stored in Airtable.

In short, it's your 24/7 support colleague, but coded as a bot.

How to Configure It:

- **Source for Prompt (User Message)** → choose Define below.
- **Prompt (User Message)** → describe your business and role as if you were training an intern. Example: "You are an AI support agent for a company that sells solar panels. You respond to technical requests, quotes, and customer questions. Your replies must be short, clear, friendly, and precise."
- **Chat Model** → connect your AI model (e.g. OpenAI GPT-4.1 Mini).
- **Memory (optional but recommended)** → connect a Conversation Memory node. → This allows the AI to retain conversation history and better understand follow-ups.
Expected Output:

When you run this node, you should see in the output a field output containing the automatically generated AI reply. The text should be short, natural, and adapted to the customer's tone (casual or formal).

👉 With the Gmail Trigger you capture emails, and with the AI Agent you get a reply ready to send, as if you had written it yourself.

**3. Save Emails and Responses in Airtable**

Now that your AI Agent generates replies, you need to store them somewhere to keep a clear record of all interactions. That's where Airtable comes in.

Quick Reminder:

You've already copied my ready-to-use Airtable base:
👉 Access the base

This base contains a table Email Support Logs with the following columns: Subject, Date, Customer Email, Message, AI Response.

How to Connect Airtable in n8n:

1. Add an Airtable node right after your AI Agent.
2. Under Operation, select Create.
3. In Base → choose BASE AGENT IA EMAIL.
4. In Table → select Email Support Logs.

Map the Correct Values:

- **Subject** → {{ $('Email Received').item.json.Subject }}
- **Customer Email** → {{ $('Email Received').item.json.From }}
- **Message** → {{ $('Email Received').item.json.snippet }}
- **AI Response** → {{ $('AI Agent').item.json.output }}
- **Date** → {{ $now }}

Expected Output:

For each new email received:

1. Gmail captures the email.
2. Your AI drafts the reply.
3. All details (email, sender, subject, reply) are automatically stored in your Airtable base.

👉 You now have a fully automated customer support log.

**4. Automatically Reply to the Customer in Gmail**

Now that you're storing each interaction in Airtable, it's time to send your AI's reply directly back to the customer. This closes the loop: customer writes → AI replies → everything gets logged in Airtable.

Add the Gmail Reply Node:

1. Add a Gmail node right after your AI Agent (or after Airtable if you prefer logging before replying).
2. Under Operation, select Reply.
3. Connect your Gmail account (same credential as your Gmail Trigger).

Configure the Reply:

- **Thread ID** → {{ $('Email Received').item.json.threadId }} → ensures the reply is sent in the same conversation thread.
- **To** → {{ $('Email Received').item.json.From }} → the customer's email address.
- **Subject** → Re: {{ $('Email Received').item.json.Subject }} → the "Re:" keeps the continuity of the conversation.
- **Message Body** → {{ $('AI Agent').item.json.output }} → the text automatically generated by your AI.

Expected Output:

When a customer sends an email:

1. The Gmail Trigger captures the message.
2. The AI Agent generates a tailored reply.
3. Airtable logs the full interaction.
4. Gmail automatically sends the response in the same conversation thread.

Your customer receives a quick, personalized, and natural reply without you typing a single word.

👉 You now have a complete support agent: listen, analyze, log, reply.

Want to save hours each week? Visit Agence automatisation 0vni.
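As noted in the trigger section, Gmail delivers the message body base64-encoded, specifically in URL-safe base64. Decoding it in an n8n Code node could look like this; note that the exact path to the encoded data varies with the message's MIME structure, so treat `payload.body.data` as an assumption to verify against your own trigger output.

```javascript
// Sketch: decode a Gmail base64url-encoded body part into plain text.
// Gmail uses URL-safe base64 ('-' and '_' instead of '+' and '/').
function decodeGmailBody(data) {
  const b64 = data.replace(/-/g, '+').replace(/_/g, '/');
  return Buffer.from(b64, 'base64').toString('utf8');
}

// Round-trip example standing in for a real payload.body.data value:
const encoded = Buffer.from('Hey, I have an issue with my account', 'utf8')
  .toString('base64')
  .replace(/\+/g, '-')
  .replace(/\//g, '_');
const text = decodeGmailBody(encoded);
```

For simple use cases the `snippet` field (used in the Airtable mapping above) is often enough, so full decoding is only needed when you want the complete body.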
by Snehasish Konger
**How it works**

This template takes approved Notion pages and syncs them to a Webflow CMS collection as draft items. It reads pages marked Status = Ready for publish in a specific Notion database/project, merges JSON content stored across page blocks into a single object, then either creates a new CMS item or updates the existing one by name. On success it sets the Notion page to 5. Done; on failure it switches the page to On Hold for review.

**Step-by-step**

1. **Manual Trigger:** You start the run with When clicking 'Execute workflow'.
2. **Get Notion Pages** (Notion → Database: Tech Content Tasks): Pull all pages with Status = Ready for publish, scoped to the target Project.
3. **Loop Over Items** (Split In Batches): Process one Notion page at a time.
4. **Code (Pass-through):** Expose page fields (e.g., name, id, url, sector) for downstream nodes.
5. **Get Notion Block (children):** Fetch all blocks under the page id.
6. **Merge Content (Code):** Concatenate code-block fragments, parse them into one mergedContent JSON object, and attach the page metadata.
7. **Get Webflow Items (HTTP GET):** List items in the target Webflow collection to see whether an item with the same name already exists.
8. **Update or Create (Switch):** No match: Create Webflow Item (POST) with isDraft: true, mapping all fieldData (e.g., category titles, meta title, excerpt, hero copy/image, benefits, problem pointers, FAQ, ROI). Match: Update Webflow Item (Draft) (PATCH) for that id; keep the existing slug, write the latest fieldData, leave isDraft: true.
9. **Write Back Status (Notion):** Success path → set Status = 5. Done. Error path → set Status = On Hold.
10. **Log Submission (Code):** Log a compact object with status, notionPageId, webflowItemId, timestamp, and action.
11. **Wait → Loop:** Short pause, then continue with the next page.

**Tools integration**

- **Notion:** source database and page blocks for approved content.
- **Webflow CMS API:** destination collection; items created/updated as drafts.
- **n8n Code:** JSON merge and lightweight logging.
- **Split In Batches + Wait:** controlled, item-wise processing.

Want hands-free publishing? Add a Cron trigger before step 2 to run on a schedule.
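The Merge Content step described above (concatenating code-block fragments from Notion and parsing them into one object) can be sketched like this. The fragment shape is an assumption: it presumes each Notion code block holds a plain-string piece of one JSON document, split only because of block size limits.

```javascript
// Sketch: join JSON fragments stored across Notion code blocks into a
// single mergedContent object and attach page metadata for Webflow mapping.
// Fragment and metadata shapes are illustrative, not the template's exact ones.
function mergeContent(fragments, pageMeta) {
  const mergedContent = JSON.parse(fragments.join(''));
  return { ...pageMeta, mergedContent };
}

const fragments = [
  '{"metaTitle":"Solar ROI",',
  '"faq":[{"q":"Cost?","a":"Varies"}]}',
];
const item = mergeContent(fragments, { name: 'solar-roi', sector: 'energy' });
```

A `JSON.parse` failure here is a natural place to route the page to the On Hold error path, since it means the stored content is malformed.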
by Alex Berman
**Who is this for**

This template is for investigators, real estate professionals, recruiters, and sales teams who need to skip trace individuals: finding current addresses, phone numbers, and emails from a name, phone, or email address. It is ideal for anyone who needs to enrich a list of contacts with verified location and contact data at scale.

**How it works**

1. You configure a list of names (or phones/emails) in the setup node.
2. The workflow submits a skip trace job to the ScraperCity People Finder API and captures the run ID.
3. An async polling loop checks the job status every 60 seconds until it returns SUCCEEDED.
4. Once complete, the results are downloaded, parsed from CSV, and written row by row to Google Sheets.

**How to set up**

1. Add your ScraperCity API key as an HTTP Header Auth credential named "ScraperCity API Key".
2. Open the "Configure Search Inputs" node and replace the placeholder names with your target list.
3. Open the "Save Results to Google Sheets" node and set your Google Sheet document ID and sheet name.
4. Click Execute to run.

**Requirements**

- ScraperCity account with People Finder access (app.scrapercity.com)
- Google Sheets OAuth2 credential connected to n8n

**How to customize the workflow**

- Switch the search input from names to phone numbers or emails by editing the JSON body in "Start People Finder Scrape".
- Increase or decrease max_results in the request body to control how many matches are returned per person.
- Add a Filter node after CSV parsing to keep only results with a confirmed phone number or address.
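The polling loop described above can be expressed in plain JavaScript. This is a sketch of the pattern only: `fetchStatus` is a hypothetical stand-in for the real ScraperCity status request, and in the actual template the same behavior is built from n8n's Wait and If nodes rather than code.

```javascript
// Sketch: poll a job-status function until it reports SUCCEEDED, waiting
// between attempts. fetchStatus and its { status } return shape are
// assumptions standing in for the real ScraperCity status endpoint.
async function pollUntilSucceeded(fetchStatus, intervalMs = 60_000, maxAttempts = 60) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const { status } = await fetchStatus();
    if (status === 'SUCCEEDED') return status;
    // Not done yet: pause before the next check (60 s in the template).
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('skip trace job did not finish within the allotted attempts');
}
```

Capping `maxAttempts` matters: without it, a job stuck in a failed state would keep the loop running forever.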
by Asfandyar Malik
**Short Description**

Automatically collect and analyze your competitors' YouTube performance. This workflow extracts video titles, views, likes, and descriptions from any YouTube channel and saves the data to Google Sheets, helping creators spot viral trends and plan content that performs.

**Who's it for**

For content creators, YouTubers, and marketing teams who want to track what's working for their competitors without manually checking their channels every day.

**How it works**

This workflow automatically collects data from any YouTube channel you enter. You just write the channel name in the form; n8n fetches the channel ID, gets all recent video IDs, and extracts each video's title, views, likes, and description. Finally, all the information is saved neatly into a connected Google Sheet for analysis.

**How to set up**

1. Create a Google Sheet with columns for Title, Views, Likes, Description, and URL.
2. Connect your Google account to n8n.
3. Add your YouTube Data API key inside the HTTP Request nodes (use n8n credentials, not hardcoded keys).
4. Update your form submission or trigger node to match your input method.
5. Execute the workflow once to test and verify that data is flowing into your sheet.

**Requirements**

- YouTube Data API key
- Google Sheets account
- n8n cloud or self-hosted instance

**How to customize**

You can modify the JavaScript code node to include more metrics (like comments or publish date), filter by keywords, or change the output destination (e.g., Airtable or Notion).
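The extraction step (flattening the YouTube Data API response into sheet rows) can be sketched like this. It follows the documented `snippet`/`statistics` parts of the v3 `videos.list` response, but verify the exact paths against your own HTTP Request output before mapping to Google Sheets columns.

```javascript
// Sketch: flatten a YouTube Data API v3 videos.list response into rows
// matching the sheet columns (Title, Views, Likes, Description, URL).
// Note the API returns counts as strings, so they are cast to numbers here.
function toSheetRows(apiResponse) {
  return apiResponse.items.map((video) => ({
    title: video.snippet.title,
    views: Number(video.statistics.viewCount),
    likes: Number(video.statistics.likeCount),
    description: video.snippet.description,
    url: `https://www.youtube.com/watch?v=${video.id}`,
  }));
}

const rows = toSheetRows({
  items: [{
    id: 'abc123',
    snippet: { title: 'How we grew', description: 'Case study' },
    statistics: { viewCount: '1024', likeCount: '99' },
  }],
});
```

Adding extra metrics like `statistics.commentCount` or `snippet.publishedAt`, as the customization note suggests, is a one-line change to this mapping.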
by Vitorio Magalhães
**Overview**

This workflow automatically enriches Brazilian company data by fetching comprehensive CNPJ information from the MinhaReceita.org API and updating your Google Sheets spreadsheet. Perfect for data analysts, sales teams, and anyone working with Brazilian business databases who needs to enrich company information at scale.

**What it does**

- **Reads CNPJ numbers** from your Google Sheets spreadsheet
- **Fetches complete company data** from Brazil's Federal Revenue database via the MinhaReceita.org API
- **Updates your spreadsheet** with comprehensive business information including address, tax status, partners, and more
- **Sends Telegram notifications** when the process is complete
- **Processes data in batches** to handle large datasets efficiently

**Key Features**

- ✅ Free API: no authentication required, completely free to use
- ✅ Comprehensive Data: 47+ fields of official Brazilian company information
- ✅ Batch Processing: handles up to 100 CNPJs per batch automatically
- ✅ Smart Filtering: only processes CNPJs that don't already have data
- ✅ Real-time Updates: updates your spreadsheet as data is retrieved
- ✅ Progress Notifications: get notified via Telegram when complete

**Setup Requirements**

Google Sheets structure: your spreadsheet must contain at minimum a **cnpj** column with Brazilian CNPJ numbers (numbers only, no formatting) and a **razao_social** column (used to identify records without data).

Credentials needed:

- **Google Sheets OAuth2** credentials configured in n8n
- **Telegram Bot** credentials for notifications (optional but recommended)

**Available Data Fields**

The workflow can populate your spreadsheet with any or all of these official fields from Brazil's Federal Revenue:

- 🏢 Company Information: cnpj, razao_social, nome_fantasia, capital_social, porte, codigo_porte, natureza_juridica, codigo_natureza_juridica
- 📍 Address & Location: uf, municipio, codigo_municipio, codigo_municipio_ibge, bairro, logradouro, descricao_tipo_de_logradouro, numero, complemento, cep, pais, codigo_pais, nome_cidade_no_exterior
- 📞 Contact Information: email, ddd_telefone_1, ddd_telefone_2, ddd_fax
- 💼 Business Classification: cnae_fiscal, cnae_fiscal_descricao, cnaes_secundarios, regime_tributario, qualificacao_do_responsavel
- 📋 Registration Status: situacao_cadastral, descricao_situacao_cadastral, motivo_situacao_cadastral, descricao_motivo_situacao_cadastral, situacao_especial, identificador_matriz_filial, descricao_identificador_matriz_filial, ente_federativo_responsavel
- 📅 Important Dates: data_inicio_atividade, data_situacao_cadastral, data_situacao_especial
- 🏛️ Tax Regime Information: opcao_pelo_mei, data_opcao_pelo_mei, data_exclusao_do_mei, opcao_pelo_simples, data_opcao_pelo_simples, data_exclusao_do_simples
- 👥 Partners & Shareholders: qsa (complete structured data of company partners and shareholders)

**How to Use**

1. Prepare your spreadsheet with CNPJ numbers in the cnpj column
2. Configure your Telegram ID in the Settings node for notifications
3. Set up Google Sheets credentials in n8n
4. Add the desired data columns to your spreadsheet (any combination from the list above)
5. Run the workflow; it will automatically process all CNPJs without existing data

**Performance & Limitations**

- **API Limits:** none; the MinhaReceita.org API is completely free
- **Data Accuracy:** official data from Brazil's Federal Revenue Service
- **Batch Size:** configurable (default: 100 records per batch)

**Use Cases**

- **Lead Enrichment:** enhance prospect databases with complete company information
- **Market Research:** gather comprehensive data about Brazilian companies
- **Compliance & Due Diligence:** verify company registration status and details
- **Sales Intelligence:** access contact information and company classification data
- **Data Cleaning:** standardize and complete existing company databases

**Prerequisites**

- n8n instance with Google Sheets integration
- Google Sheets document with CNPJ data
- Basic understanding of the Brazilian CNPJ format (a 14-digit company registration number)
- Telegram bot token (optional, for notifications)

**Important Notes**

- CNPJs must be active and registered with Brazil's Federal Revenue Service
- The workflow only processes CNPJs where razao_social is empty (avoiding duplicates)
- All data comes from official government sources via MinhaReceita.org
- No rate limiting is needed; the API is designed for bulk requests
- Supports both matriz (headquarters) and filial (branch) identification
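The smart-filtering step (only processing rows whose razao_social is still empty) can be sketched as an n8n Code-node helper. The `https://minhareceita.org/{cnpj}` URL pattern follows MinhaReceita's public documentation, but treat it as an assumption to verify against the workflow's actual HTTP Request node.

```javascript
// Sketch: keep only rows still missing company data (empty razao_social)
// and build a MinhaReceita request URL for each. Assumes cnpj is stored
// as a digits-only string, as the setup section requires.
function pendingRequests(rows) {
  return rows
    .filter((row) => !row.razao_social || row.razao_social.trim() === '')
    .map((row) => `https://minhareceita.org/${row.cnpj}`);
}

const rows = [
  { cnpj: '19131243000197', razao_social: '' },          // not yet enriched
  { cnpj: '11222333000181', razao_social: 'ACME LTDA' }, // already has data
];
const urls = pendingRequests(rows);
```

This is what makes re-running the workflow safe: already-enriched rows are skipped, so only the remaining CNPJs generate API calls.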