by Robert Breen
This n8n workflow template creates an efficient data analysis system that uses Google Gemini AI to interpret user questions about spreadsheet data, then processes them through a specialized sub-workflow for optimized token usage and faster responses.

## What This Workflow Does

- **Smart Query Parsing**: Uses Gemini AI to understand natural language questions about your data
- **Efficient Processing**: Routes calculations through a dedicated sub-workflow to minimize token consumption
- **Structured Output**: Automatically identifies the column, aggregation type, and grouping levels from user queries
- **Multiple Aggregation Types**: Supports sum, average, count, count distinct, min, and max operations
- **Flexible Grouping**: Can aggregate data by single or multiple dimensions
- **Token Optimization**: Processes large datasets without overwhelming AI context limits

## Tools Used

- **Google Gemini Chat Model** - Natural language query understanding and response formatting
- **Google Sheets Tool** - Data access and column metadata extraction
- **Execute Workflow** - Sub-workflow processing for data calculations
- **Structured Output Parser** - Converts AI responses to actionable parameters
- **Memory Buffer Window** - Basic conversation context management
- **Switch Node** - Routes to the appropriate aggregation method
- **Summarize Nodes** - Perform the various data aggregations

## 📋 Main Workflow - Query Parser

The main workflow receives natural language questions from users and converts them into structured parameters that the sub-workflow can process. It uses Google Gemini AI to understand the intent and extract the necessary information.

### Prerequisites

- Google Cloud Platform account with Gemini API access
- Google account with access to Google Sheets
- n8n instance (cloud or self-hosted)

### Setup Instructions

**1. Import the Main Workflow**
- Copy the main workflow JSON provided
- In your n8n instance, go to Workflows → Import from JSON
- Paste the JSON and click Import
- Save with the name "Gemini Data Query Parser"

**2. Set Up the Google Gemini Connection**
- Go to Google AI Studio and sign in with your Google account
- Open the Get API Key section, create a new API key (or use an existing one), and copy it
- In n8n, click the Google Gemini Chat Model node, click Create New Credential, select Google PaLM API, paste your API key, and save the credential

**3. Set Up the Google Sheets Connection**
- Go to Google Cloud Console and create a new project (or select an existing one)
- Enable the Google Sheets API and create OAuth 2.0 Client ID credentials
- In n8n, click the Get Column Info node, create a Google Sheets OAuth2 API credential, and complete the OAuth flow

**4. Configure Your Data Source**
- Option A - use the sample data: the workflow is pre-configured for the Sample Marketing Data sheet; make a copy to your Google Drive
- Option B - use your own sheet: update the Get Column Info node with your Sheet ID, ensure you have a "Columns" sheet for metadata, and update sheet references as needed

**5. Set Up the Workflow Trigger**
- Configure how you want to trigger this workflow (webhook, manual, etc.)
- The workflow outputs structured JSON for the sub-workflow

## ⚙️ Sub-Workflow - Data Processor

The sub-workflow receives structured parameters from the main workflow and performs the actual data calculations. It handles fetching data, routing to the appropriate aggregation method, and formatting results.

### Setup Instructions

**1. Import the Sub-Workflow**
- Create a new workflow in n8n
- Copy the sub-workflow JSON (embedded in the Execute Workflow node) and import it as a separate workflow
- Save with the name "Data Processing Sub-Workflow"

**2. Configure the Google Sheets Connection**
- Apply the same Google Sheets OAuth2 credential you created for the main workflow
- Update the Get Data node with your Sheet ID and ensure it points to your data sheet (e.g., the "Data" sheet)

**3. Configure Google Gemini for Output Formatting**
- Apply the same Gemini API credential to the Google Gemini Chat Model1 node, which handles final result formatting

**4. Link the Workflows Together**
- In the main workflow, find the Execute Workflow - Summarize Data node and update the workflow reference to point to your sub-workflow
- Ensure the sub-workflow is set to accept execution from other workflows

### Sub-Workflow Components

- **When Executed by Another Workflow**: Trigger that receives the parameters
- **Get Data**: Fetches all data from Google Sheets
- **Type of Aggregation**: Switch node that routes based on the aggregation type
- **Multiple Summarize Nodes**: Handle the different aggregation types (sum, avg, count, etc.)
- **Bring All Data Together**: Combines results from the different aggregation paths
- **Write into Table Output**: Formats the final results using Gemini AI

## Example Usage

Once both workflows are set up, you can ask questions like:

- Overall metrics: "Show total Spend ($)", "Show total Clicks", "Show average Conversions"
- Single dimension: "Show total Spend ($) by Channel", "Show total Clicks by Campaign"
- Two dimensions: "Show total Spend ($) by Channel and Campaign", "Show average Clicks by Channel and Campaign"

## Data Flow Between Workflows

- Main workflow: User question → Gemini AI → Structured JSON output
- Sub-workflow: Receives JSON → Fetches data → Performs calculations → Returns formatted table

## Contact Information

For support, customization, or questions about this template:

- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing these workflows, want to remove limitations, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
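To make the hand-off between the two workflows concrete, here is a minimal sketch of the structured parameters the main workflow might pass to the sub-workflow. The field names (`column`, `aggregation`, `groupBy`) are assumptions for illustration, not the template's exact schema:

```javascript
// Validate the kind of structured output the Gemini parser produces before the
// sub-workflow acts on it. Field names are hypothetical placeholders.
const ALLOWED = ["sum", "average", "count", "countDistinct", "min", "max"];

function validateParams(params) {
  if (typeof params.column !== "string" || params.column.length === 0) {
    throw new Error("column must be a non-empty string");
  }
  if (!ALLOWED.includes(params.aggregation)) {
    throw new Error(`aggregation must be one of: ${ALLOWED.join(", ")}`);
  }
  // groupBy is optional; zero, one, or two dimensions are supported
  if (!Array.isArray(params.groupBy) || params.groupBy.length > 2) {
    throw new Error("groupBy must be an array of at most two column names");
  }
  return params;
}

// Example: "Show total Spend ($) by Channel and Campaign"
const parsed = validateParams({
  column: "Spend ($)",
  aggregation: "sum",
  groupBy: ["Channel", "Campaign"],
});
```

A guard like this between the trigger and the Switch node makes malformed AI output fail loudly instead of routing to the wrong aggregation path.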
by SpaGreen Creative
# Bulk WhatsApp Campaign Automation with Rapiwa API (Unofficial Integration)

## Who's it for

This n8n workflow lets you send bulk WhatsApp messages using your own number through the Rapiwa API, avoiding the high cost and limitations of the official WhatsApp API. It integrates seamlessly with Google Sheets, where you can manage your contacts and messages with ease. It is ideal for anyone who wants an easy-to-maintain bulk messaging solution using their own personal or business WhatsApp number - small businesses, marketers, or teams looking for a cost-effective way to manage WhatsApp communication at scale.

## How It Works

1. Reads data from a Google Sheet where the Status column is marked "pending".
2. Cleans each phone number (removes special characters, spaces, etc.).
3. Verifies that the number is a valid WhatsApp user via the Rapiwa API.
4. If valid: sends the message via Rapiwa, then updates Status = sent and Verification = verified.
5. If invalid: skips sending and updates Status = not sent and Verification = unverified.
6. Waits a few seconds (rate limiting), then loops to the next item.

The entire process is triggered automatically every 5 minutes.

## How to Set Up

1. **Duplicate the sample sheet**: Use this format.
2. **Fill in contacts**: Add columns like WhatsApp No, Name, Message, Image URL, and set Status = pending.
3. **Connect Google Sheets**: Authenticate and link the Google Sheets node inside n8n.
4. **Subscribe to Rapiwa**: Go to Rapiwa.com and get your API key.
5. **Paste the API key**: Use an HTTP Bearer token credential in n8n.
6. **Activate the workflow**: Let n8n take care of the automation.

## Requirements

- Google Sheets API credentials
- Configured Google Sheet (template linked above)
- WhatsApp (personal or business)
- n8n instance with credentials set up

## How to Customize the Workflow

- **Add delay between messages**: Use the Wait node to introduce pauses (e.g., 5-10 seconds).
- **Change message format**: Modify the HTTP Request node to send media or templates.
- **Personalize content**: Include dynamic fields like Name, Image URL, etc.
- **Error handling**: Add IF or Set nodes to capture failed attempts, retry, or log errors.

## Workflow Highlights

- **Triggered every 5 minutes** using the Schedule Trigger node.
- **Filters messages** with Status = pending.
- **Cleans numbers and verifies WhatsApp existence** before sending.
- **Sends WhatsApp messages** via Rapiwa (unofficial API).
- **Updates Google Sheets** to mark Status = sent or not sent and Verification = verified/unverified.
- **Wait node** prevents rapid-fire sending that could get your number flagged by WhatsApp.

## Setup in n8n

### 1. Connect Google Sheets
- Add a Google Sheets node and authenticate with your Google account
- Select the document and worksheet
- Use the filter: Status = pending

### 2. Loop Through Rows
- Use SplitInBatches or a Code node to process rows in small chunks (e.g., 5 rows)
- Add a Wait node to delay 5 seconds between messages

### 3. Send the Message via the HTTP Request Node

The "Send Message Using Rapiwa" node makes an HTTP POST request to the Rapiwa API endpoint https://app.rapiwa.com/api/send-message, using Bearer token authentication with your Rapiwa API key. When this node runs, it sends a WhatsApp message to the specified number with the given text and optional image; the Rapiwa API handles delivery using your own WhatsApp number connected to their service.

**JSON body**:

```json
{
  "number": "{{ $json['WhatsApp No'] }}",
  "message": "{{ $json['Message'] }}"
}
```

## Sample Google Sheet Structure

| SL | WhatsApp No   | Name                | Message              | Image URL                                            | Verification | Status |
|----|---------------|---------------------|----------------------|------------------------------------------------------|--------------|--------|
| 1  | 8801322827799 | SpaGreen Creative   | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |
| 2  | 8801725402187 | Abdul Mannan Zinnat | This is Test Message | https://spagreen.sgp1.cdn.digitaloceanspaces.com/... | verified     | sent   |

## Tips

- Modify the Limit node to increase/decrease messages per cycle.
- Adjust the Wait node to control how fast messages are sent (e.g., a 5-10 s delay).
- Make sure WhatsApp numbers are properly formatted (e.g., 8801XXXXXXXXX - no +, no spaces).
- Store your Rapiwa API key securely using n8n credentials.
- Use publicly accessible image URLs if sending images.
- Always mark processed messages as "sent" to avoid duplicates.
- Use n8n's Error workflow to catch failed sends for retry.
- Test with a small batch before going full-scale.
- Schedule the Trigger node for every 5 minutes to keep the automation running.

## Useful Links

- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

## Support & Community

Need help setting up or customizing the workflow? Reach out here:

- WhatsApp: Chat with Support
- Discord: Join SpaGreen Server
- Facebook Group: SpaGreen Community
- Website: SpaGreen Creative
- Envato: SpaGreen Portfolio
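The number-cleaning step described above can be sketched as an n8n Code node. The column name matches the sample sheet; the exact cleaning rules (strip non-digits, drop a leading international `00` prefix) are assumptions about what the template's node does:

```javascript
// Normalize a raw "WhatsApp No" cell before Rapiwa verification.
function cleanNumber(raw) {
  let digits = String(raw).replace(/\D/g, ""); // remove +, spaces, dashes, etc.
  if (digits.startsWith("00")) digits = digits.slice(2); // 00880... -> 880...
  return digits;
}

// Example rows shaped like the output of the Google Sheets node
const items = [
  { json: { "WhatsApp No": "+880 132-282-7799", Message: "Hi there" } },
];

const out = items.map((item) => ({
  json: { ...item.json, "WhatsApp No": cleanNumber(item.json["WhatsApp No"]) },
}));
```

In a real Code node you would `return` the mapped array so downstream nodes receive the cleaned numbers.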
by Robert Breen
This n8n workflow template creates an intelligent data analysis system that converts natural language questions into Google Sheets SQL queries using OpenAI's GPT-4o model. The system generates proper Google Sheets query URLs and executes them via HTTP requests for efficient data retrieval.

## What This Workflow Does

- **Natural Language to SQL**: Converts user questions into Google Sheets SQL syntax
- **Direct HTTP Queries**: Bypasses API limits by using Google Sheets' built-in query functionality
- **Column Letter Mapping**: Automatically maps column names to their corresponding letters (A, B, C, etc.)
- **Structured Query Generation**: Outputs properly formatted Google Sheets query URLs
- **Real-time Data Access**: Retrieves live data directly from Google Sheets
- **Memory Management**: Maintains conversation context for follow-up questions

## Tools Used

- **OpenAI Chat Model (GPT-4o)** - SQL query generation and natural language understanding
- **OpenAI Chat Model (GPT-4.1 Mini)** - Result formatting and table output
- **Google Sheets Tool** - Column metadata extraction and schema understanding
- **HTTP Request Node** - Direct data retrieval via the Google Sheets query API
- **Structured Output Parser** - Formats AI responses into executable queries
- **Memory Buffer Window** - Conversation history management
- **Chat Trigger** - Webhook-based conversation interface

## Step-by-Step Setup Instructions

### 1. Prerequisites

Before starting, ensure you have:

- An n8n instance (cloud or self-hosted)
- An OpenAI account with API access and billing set up
- A Google account with access to Google Sheets
- A target Google Sheet that is publicly accessible or shareable via link

### 2. Import the Workflow

- Copy the workflow JSON provided
- In your n8n instance, go to Workflows → Import from JSON
- Paste the JSON and click Import
- Save with a descriptive name like "Google Sheets SQL Query Generator"

### 3. Set Up the OpenAI Connections

Get an API key:

- Go to OpenAI Platform and sign in or create an account
- Navigate to the API Keys section, click Create new secret key, and copy the generated key
- Important: add billing information and credits to your OpenAI account

Configure both OpenAI nodes:

- **OpenAI Chat Model1 (GPT-4o)**: Click the node, click Create New Credential, select OpenAI API, paste your API key, and save the credential
- **OpenAI Chat Model2 (GPT-4.1 Mini)**: Apply the same OpenAI API credential; this node handles result formatting

### 4. Set Up the Google Sheets Connection

Create OAuth2 credentials:

- Go to Google Cloud Console and create a new project (or select an existing one)
- Enable the Google Sheets API
- Go to Credentials → Create Credentials → OAuth 2.0 Client IDs
- Set the application type to Web Application
- Add the authorized redirect URI (get this from the n8n credentials setup)
- Copy the Client ID and Client Secret

Configure in n8n:

- Click the Get Column Info2 node, click Create New Credential, and select Google Sheets OAuth2 API
- Enter your Client ID and Client Secret
- Complete the OAuth flow by clicking Connect my account and authorize the required permissions

### 5. Prepare Your Google Sheet

Option A - use the sample data sheet:

- Access the pre-configured sheet: Sample Marketing Data
- Make a copy to your Google Drive
- **Critical**: set sharing to "Anyone with the link can view" so the HTTP Request node can access the sheet
- Copy the Sheet ID from the URL
- Update the Get Column Info2 node with your Sheet ID and column metadata sheet

### 6. Configure Sheet References

Get Column Info2 node:

- Set Document ID to your Google Sheet ID
- Set Sheet Name to your columns metadata sheet (e.g., "Columns"); this provides the AI with the column letter mappings

HTTP Request node:

- No configuration needed - it uses the dynamic URLs produced by the AI agent
- Ensure your sheet has the proper sharing permissions

### 7. Update the System Prompt (If Using a Custom Sheet)

If you are using your own Google Sheet, update the system prompt in the AI Agent3 node:

- Replace the URL in the system message with your Google Sheet URL
- Update the GID (sheet ID) to match your data sheet
- Keep the same query structure format

## Contact Information

For support, customization, or questions about this template:

- **Email**: robert@ynteractive.com
- **LinkedIn**: Robert Breen

Need help implementing this workflow, want to add security features, or require custom modifications? Reach out for professional n8n automation services and AI integration support.
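For reference, here is a sketch of the kind of query URL the AI agent is asked to emit. The endpoint shape (`docs.google.com/spreadsheets/d/<ID>/gviz/tq`) is Google's public Visualization API query interface; the sheet ID and GID below are placeholders, and the column letters assume a mapping like A = Channel, C = Spend:

```javascript
// Build a Google Sheets query URL from a generated SQL-like query string.
function buildQueryUrl(sheetId, gid, tq) {
  return (
    `https://docs.google.com/spreadsheets/d/${sheetId}/gviz/tq` +
    `?gid=${gid}&tqx=out:csv&tq=${encodeURIComponent(tq)}`
  );
}

// "Show total Spend ($) by Channel" -> SELECT A, SUM(C) GROUP BY A
const url = buildQueryUrl(
  "SHEET_ID_PLACEHOLDER",
  "0",
  "SELECT A, SUM(C) GROUP BY A"
);
```

The `tqx=out:csv` parameter asks Google to return plain CSV, which the HTTP Request node can consume directly; the query itself must always be URL-encoded.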
by Davide
🤖🎵 This workflow automates the creation, storage, and cataloging of AI-generated music using the Eleven Music API, Google Sheets, and Google Drive.

## Key Advantages

✅ **Fully Automated Music Generation Pipeline** - Once started, the workflow automatically reads track parameters, generates music via the API, uploads the file, and updates your spreadsheet. No manual steps are needed after initialization.

✅ **Centralized Track Management** - A single Google Sheet acts as your project control center, letting you organize prompts, durations, and generated URLs. This avoids losing track of files and creates a ready-to-share catalog.

✅ **Seamless Integration with Google Services** - The workflow reads instructions from Google Sheets, saves the MP3 to Google Drive, and updates the same Sheet with the final link, so everything stays synchronized and easy to access.

✅ **Scalable and Reliable Processing** - The loop-with-delay mechanism processes tracks sequentially, prevents API overload, and ensures stable execution. This is especially helpful when generating multiple long tracks.

✅ **Easy Customization** - Because the prompts and durations come from Google Sheets, you can edit prompts at any time, add more tracks without modifying the workflow, and clone the Sheet for different projects.

✅ **Ideal for Creators and Businesses** - This workflow is perfect for content creators generating background music, agencies designing custom soundtracks, businesses needing AI-generated audio assets, and automated production pipelines.

## How It Works

1. The workflow starts manually via the "Execute workflow" trigger.
2. It retrieves a list of music track requests from a Google Sheets spreadsheet containing track titles, text prompts, and duration specifications.
3. The system processes each track request individually through a batch loop.
4. For each track, it sends the text prompt and duration to the ElevenLabs Music API to generate studio-quality music.
5. The generated MP3 file (44100 Hz, 128 kbps) is automatically uploaded to a designated Google Drive folder.
6. Once uploaded, the workflow updates the original Google Sheet with the direct URL to the generated music file.
7. A 1-minute wait between track generations prevents API rate limiting.
8. The process continues until all track requests in the spreadsheet have been processed.

## Set Up Steps

Prerequisites:

- ElevenLabs paid account with Music API access enabled
- Google Sheets spreadsheet with these columns: TITLE, PROMPT, DURATION (ms), URL
- Google Drive folder for storing the generated music files

Configuration steps:

1. **ElevenLabs API setup**: Enable Music Generation access in your ElevenLabs account, generate an API key from the ElevenLabs developer dashboard, and configure HTTP Header authentication in n8n with the name "xi-api-key" and your API key as the value.
2. **Google Sheets preparation**: Create or clone the music tracking spreadsheet with the required columns; fill in track titles, detailed text prompts, and durations in milliseconds (10,000-300,000 ms); configure Google Sheets OAuth credentials in n8n; and update the document ID in the Google Sheets nodes.
3. **Google Drive configuration**: Create a dedicated folder for music uploads, set up Google Drive OAuth credentials in n8n, and update the folder ID in the upload node.
4. **Workflow activation**: Ensure all API credentials are properly configured, test with a single track entry in the spreadsheet, verify that music generation, upload, and the spreadsheet update work correctly, then execute the workflow to process all pending track requests.

The workflow automatically names files with a timestamp prefix (song_yyyyMMdd) and handles the complete lifecycle from prompt to downloadable music file.

👉 Subscribe to my new YouTube channel, where I share videos and Shorts with practical tutorials and FREE templates for n8n.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
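The timestamp-prefixed file naming mentioned above can be sketched as follows. The exact expression the template uses is not shown, so this is an assumed equivalent of the `song_yyyyMMdd` pattern, with the track title from the sheet appended for readability:

```javascript
// Build a "song_yyyyMMdd" prefixed file name for the Google Drive upload.
function buildFileName(title, date) {
  const y = date.getFullYear();
  const m = String(date.getMonth() + 1).padStart(2, "0"); // months are 0-based
  const d = String(date.getDate()).padStart(2, "0");
  return `song_${y}${m}${d}_${title.replace(/\s+/g, "_")}.mp3`;
}

const name = buildFileName("Morning Chill", new Date(2025, 0, 15)); // Jan 15, 2025
```

A date-based prefix keeps files sortable in the Drive folder and avoids collisions when the same prompt is regenerated on different days.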
by Sayone Technologies
# ⭐ Google Review Sentiment Analysis & Slack Notification Workflow

This workflow automates the process of collecting Google Business Profile reviews 🏪, analyzing customer sentiment with Google Gemini 🤖✨, and sending structured reports to Slack 💬.

## 🔑 Key Advantages

- 📥 Fetches Google Business Profile reviews for a given business and time period
- 🧠 Runs sentiment analysis using Gemini AI
- 📊 Consolidates comments, ratings, and trends into a JSON-based summary
- 🧩 Restructures results into Slack Block Kit format for easy readability
- 🚀 Sends automated sentiment reports directly to a Slack channel

## ⚙️ Set Up Essentials You'll Need

- 🔑 Google Business Profile API access with project approval
- ✅ Enabled Google Business Profile API service
- 🔐 Gemini API credentials
- 💬 Slack workspace & channel for receiving reports

## 🚀 How to Get Started

1. 🔧 Configure your Google Business Profile API and enable access
2. 👤 Set the owner name and 📍 location to fetch reviews
3. ⏳ Define the review time period using the Set Time Period node
4. 🔗 Connect your Slack account and select a channel for notifications
5. 🕒 Deploy and let the workflow run on a schedule for automated insights
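The "restructure into Slack Block Kit" step can be sketched like this. The block layout below is valid Block Kit, but the summary field names (`averageRating`, `positive`, `negative`, `trend`) are assumptions about the shape of the Gemini summary JSON:

```javascript
// Turn a hypothetical sentiment summary object into a Slack Block Kit payload.
function toBlockKit(summary) {
  return {
    blocks: [
      {
        type: "header",
        text: { type: "plain_text", text: "⭐ Review Sentiment Report" },
      },
      {
        type: "section",
        text: {
          type: "mrkdwn",
          text:
            `*Average rating:* ${summary.averageRating}\n` +
            `*Positive:* ${summary.positive}  *Negative:* ${summary.negative}\n` +
            `*Trend:* ${summary.trend}`,
        },
      },
    ],
  };
}

const payload = toBlockKit({
  averageRating: 4.3,
  positive: 18,
  negative: 3,
  trend: "improving",
});
```

Passing a `blocks` array like this to the Slack node renders a formatted card instead of a plain-text message.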
by Yanagi Chinatsu
## Who it's for

This workflow is perfect for space enthusiasts, community managers, and content creators who want to automatically share stunning, curated space imagery with their Slack communities. It's ideal for teams that enjoy a daily dose of scientific inspiration and visually engaging content without any manual effort.

## What it does

This workflow automates the creation and posting of a daily space image gallery to Slack. Every day at a scheduled time, it fetches three distinct images from NASA's public APIs: one from the Mars Rover, one from the EPIC satellite observing Earth, and one from the extensive Image Library. For each image, the workflow uses an AI model to generate a unique and poetic caption, transforming a simple image post into a more engaging piece of content. Finally, it combines the three images and their AI-generated captions into a single, beautifully formatted message and posts it to your designated Slack channel. As a bonus, it also saves a copy of the message to a Google Drive folder for archival purposes.

## How to set up

1. **Configure variables**: In the Workflow Configuration node, enter your NASA API key in the nasaApiKey field and specify your target Slack channel name in the slackChannel field (e.g., general).
2. **Connect credentials**: Add your credentials for the OpenAI Chat Model, Post to Slack, and Google Drive nodes.
3. **Activate the workflow**: Once your credentials and variables are set, simply save and activate the workflow.

## Requirements

- A NASA API key (free to generate)
- An OpenAI account and API key
- A Slack workspace with permission to post messages
- A Google Drive account

## How to customize the workflow

- **Adjust the schedule**: Change the trigger time or frequency in the Daily 10:00 - Start Poll node.
- **Change the AI tone**: Modify the system message in the AI Agent node to alter the style, tone, or language of the generated captions.
- **Swap image sources**: Update the URLs in the Fetch nodes to pull images from different NASA APIs or use different search queries.
- **Add more channels**: Duplicate the Post to Slack node and modify it to send notifications to other services like Discord or Telegram.
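For orientation when editing the Fetch nodes, here is a sketch of the three NASA endpoints this workflow draws from. These hostnames and paths follow NASA's documented public APIs (api.nasa.gov and images-api.nasa.gov), but the exact query parameters the template's nodes use are assumptions:

```javascript
// Build the three NASA API request URLs used for the daily gallery.
function nasaUrls(apiKey, earthDate, searchQuery) {
  return {
    // Mars Rover photos for a given Earth date
    marsRover:
      `https://api.nasa.gov/mars-photos/api/v1/rovers/curiosity/photos` +
      `?earth_date=${earthDate}&api_key=${apiKey}`,
    // EPIC "natural color" images of Earth
    epic: `https://api.nasa.gov/EPIC/api/natural/images?api_key=${apiKey}`,
    // NASA Image and Video Library keyword search
    imageLibrary:
      `https://images-api.nasa.gov/search?q=${encodeURIComponent(searchQuery)}` +
      `&media_type=image`,
  };
}

const urls = nasaUrls("DEMO_KEY", "2023-06-01", "nebula");
```

NASA's `DEMO_KEY` works for light testing, but a free personal key avoids the shared rate limit.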
by Adil Khan
This workflow bridges the gap between anonymous website traffic and on-chain wallet activity. It captures wallet connections via a webhook, enriches the data with real-time USD balances from the Zerion API, and syncs the results to Google Analytics 4, BigQuery, and Discord for immediate action. This directly helps Web3 marketing and growth teams identify high-value "whales" the moment they connect to your dApp, allowing for real-time monitoring and advanced attribution analysis.

Video tutorial: https://youtu.be/2_wuTRzRpkg

## How it works

1. **Webhook Trigger**: Receives the wallet address, GA Client ID, and Session ID from your website via GTM.
2. **Zerion API Integration**: Queries the real-time USD balance and per-chain distribution for the connected wallet.
3. **Whale Filtering (Switch)**: Filters wallets against a USD threshold (e.g., >$50) to trigger high-priority alerts.
4. **Dynamic Discord Alerts**: Sends a formatted message to Discord with the total balance rounded to 2 decimals and a dynamic breakdown of assets across all active chains (Base, Ethereum, etc.).
5. **GA4 Push**: Sends wallet_usd_balance as a custom metric to GA4 via the Measurement Protocol to maintain session continuity.
6. **BigQuery Archive**: Records the wallet address, hashed ID, and USD balance into a secure table for SQL joins with raw GA4 data.

## Prerequisites

- **Zerion API key**: Required for fetching real-time balance and chain data.
- **Discord bot token**: Required to send automated whale alerts to your team server.
- **Google Cloud project**: A project with BigQuery enabled and a JSON service account key for secure data insertion.
- **GA4 Measurement Protocol API secret**: Required to push custom metrics back into active GA4 sessions.
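The GA4 push step can be sketched as below. The collect endpoint and payload shape follow Google's documented Measurement Protocol for GA4; the event name `wallet_connected`, the measurement ID, and the API secret are placeholders and assumptions about this template's GA4 configuration:

```javascript
// Build a GA4 Measurement Protocol hit carrying the wallet balance metric.
function buildGa4Event(clientId, sessionId, usdBalance) {
  return {
    url:
      "https://www.google-analytics.com/mp/collect" +
      "?measurement_id=G-XXXXXXX&api_secret=YOUR_API_SECRET", // placeholders
    body: {
      client_id: clientId, // must match the browser's GA client ID for continuity
      events: [
        {
          name: "wallet_connected", // hypothetical event name
          params: {
            session_id: sessionId, // keeps the hit in the same GA4 session
            wallet_usd_balance: Math.round(usdBalance * 100) / 100, // 2 decimals
          },
        },
      ],
    },
  };
}

const hit = buildGa4Event("123.456", "1700000000", 1234.567);
```

Reusing the browser's `client_id` and `session_id` is what makes the server-side hit stitch into the visitor's existing GA4 session rather than creating a new one.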
by InfyOm Technologies
## ✅ What problem does this workflow solve?

Missed return pickups create logistics delays, extra follow-ups, and unhappy customers for e-commerce teams. This workflow automates return pickup reminders, ensuring customers are notified on the day of pickup via WhatsApp messages and automated voice calls, without any manual effort.

## ⚙️ What does this workflow do?

- Runs automatically on a daily schedule.
- Reads return pickup data from Google Sheets.
- Identifies customers with 📅 pickup date = today and ⏳ status = Pending.
- Sends personalized WhatsApp reminders.
- Places automated voice call reminders when required.
- Updates the reminder status in Google Sheets for clear tracking.

## 🧠 How It Works - Step by Step

### 1. ⏰ Scheduled Trigger
The workflow starts at a fixed time every day (e.g., 9-10 AM) using a Schedule Trigger.

### 2. 📄 Read Pickup Data from Google Sheets
It fetches rows from Google Sheets where **Pickup Date** = today and **Status** = Pending. This ensures only relevant pickups are processed.

### 3. 🔁 Loop Through Pickups
Each matching row is processed individually to send customer-specific reminders.

### 4. ✍️ Generate Personalized Messages
Using a Code node, the workflow creates 📲 a WhatsApp text message and 📞 a voice message script. Messages include the customer name, product name, pickup address, return reason, and a pickup timing reminder.

### 5. 📲 Send WhatsApp Reminder
A personalized WhatsApp message is sent via Twilio, reminding the customer to keep the package ready.

### 6. 📞 Place Voice Call Reminder
If required, the workflow places an automated voice call using Twilio and reads out a clear pickup reminder using text-to-speech.

### 7. ✅ Update Pickup Status
Once notifications are sent, the workflow updates the Status column to "Reminder Sent", ensuring the same pickup is not notified again.

## 📊 Sample Google Sheet Columns

| Order ID | Customer Name | Phone Number | Product | Pickup Date | Address | Return Reason | Status |
|----------|---------------|--------------|---------|-------------|---------|---------------|--------|

## 🔧 Integrations Used

- **Google Sheets** - Pickup data source and tracking
- **Twilio WhatsApp API** - Message delivery
- **Twilio Voice API** - Automated call reminders
- **n8n Schedule + Logic Nodes** - Automation orchestration

## 👤 Who can use this?

Perfect for:

- 🛒 E-commerce brands
- 📦 Reverse logistics teams
- 🚚 Delivery & pickup operations
- 🧑‍💼 Customer support teams

It also works well for service visits, deliveries, appointments, and field operations.

## 💡 Key Benefits

- ✅ Fewer missed pickups
- ✅ Improved customer compliance
- ✅ Reduced manual follow-ups
- ✅ Clear tracking in Google Sheets
- ✅ Scalable and fully automated

## 🚀 Ready to Use?

Just connect:

- ✅ Google Sheets with your pickup data
- ✅ Twilio credentials (WhatsApp + Voice)
- ✅ The schedule trigger time
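The message-generation Code node (step 4 above) can be sketched as follows. The column names match the sample sheet; the exact wording the template produces is an assumption:

```javascript
// Build the WhatsApp text and the voice script from one sheet row.
function buildMessages(row) {
  const text =
    `Hi ${row["Customer Name"]}, this is a reminder that your return pickup ` +
    `for "${row["Product"]}" (reason: ${row["Return Reason"]}) is scheduled ` +
    `today at ${row["Address"]}. Please keep the package ready.`;
  // Voice scripts read better without quotes and parentheses
  const voice =
    `Hello ${row["Customer Name"]}. Your return pickup for ${row["Product"]} ` +
    `is scheduled today. Please keep the package ready at your address.`;
  return { whatsappText: text, voiceScript: voice };
}

const msgs = buildMessages({
  "Customer Name": "Asha",
  Product: "Headphones",
  "Return Reason": "Defective",
  Address: "12 Park St",
});
```

Keeping the text and voice variants separate lets the Twilio Voice node receive a script tuned for text-to-speech while WhatsApp gets the richer version.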
by Fahmi Fahreza
# Automated Multi-Bank Balance Sync to BigQuery

This workflow automatically fetches balances from multiple financial institutions (RBC, Amex, Wise, PayPal) using Plaid, maps them to QuickBooks account names, and loads structured records into Google BigQuery for analytics.

## Who's it for?

Finance teams, accountants, and data engineers managing consolidated bank reporting in Google BigQuery.

## How it works

1. The Schedule Trigger runs weekly.
2. Four Plaid API calls fetch balances from RBC, Amex, Wise, and PayPal.
3. Each response is split into individual accounts, which are mapped to QuickBooks names.
4. All accounts are merged into one dataset.
5. The workflow structures the account data, generates UUIDs, and formats SQL inserts.
6. The BigQuery node uploads the finalized records.

## How to set up

Add your Plaid and Google BigQuery credentials, replace the client IDs and secrets with variables, test each connection, and schedule the trigger for your reporting cadence.
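Step 5 above (structuring records and formatting SQL inserts) can be sketched like this. The table name, column list, and record field names are assumptions for illustration; in the actual template the BigQuery node receives the generated statement:

```javascript
// Shape one merged account record into a BigQuery INSERT statement.
function toInsertSql(account) {
  const esc = (s) => String(s).replace(/'/g, "\\'"); // naive escaping for the sketch
  return (
    "INSERT INTO `finance.bank_balances` " +
    "(id, institution, quickbooks_name, balance_usd, synced_at) VALUES (" +
    "'" + esc(account.id) + "', '" + esc(account.institution) + "', '" +
    esc(account.quickbooksName) + "', " + account.balanceUsd +
    ", CURRENT_TIMESTAMP())"
  );
}

const sql = toInsertSql({
  id: "uuid-1234", // in the workflow this would be a generated UUID
  institution: "Wise",
  quickbooksName: "Wise - Operating USD", // hypothetical mapped name
  balanceUsd: 1520.25,
});
```

For production use, BigQuery parameterized queries (or the node's field mapping) are safer than string-built SQL, but the sketch shows the record shape the workflow assembles.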
by Yusei Miyakoshi
## Who's it for

This template is for teams that want to stay updated on industry trends, tech news, or competitor mentions without manually browsing news sites. It's ideal for marketing, development, and research teams who use Slack as their central hub for automated, timely information.

## What it does / How it works

This workflow runs on a daily schedule (default 9 AM), fetches the top articles from Hacker News for a specific keyword you define (e.g., "AI"), and uses an AI agent with OpenRouter to generate a concise, 3-bullet-point summary in Japanese for each article. The final formatted summary, including the article title, is then posted to a designated Slack channel. The entire process is guided by descriptive sticky notes on the canvas explaining each configuration step.

## How to set up

1. In the Configure Your Settings node, change the default keyword AI to your topic of interest and update slack_channel to your target channel name.
2. Click the OpenRouter Chat Model node and select your OpenRouter API key from the Credentials dropdown. If you haven't connected it yet, create a new credential.
3. Click the Send Summary to Slack node and connect your Slack account using OAuth2 credentials.
4. (Optional) Adjust the schedule in the Trigger Daily at 9 AM node to change how often the workflow runs.
5. Activate the workflow.

## Requirements

- An n8n instance (cloud or self-hosted).
- A Slack account and workspace.
- An OpenRouter API key stored in your n8n credentials.
- If self-hosting, ensure the LangChain nodes are enabled.

## How to customize the workflow

- **Change the news source**: Replace the Hacker News node with an RSS Feed Read node or another news integration to pull articles from different sources.
- **Modify the AI prompt**: In the Summarize Article with AI node, edit the system message to change the summary language, length, or tone.
- **Use a different AI model**: Swap the OpenRouter node for an OpenAI, Anthropic, or any other supported chat model.
- **Track multiple keywords**: Modify the workflow to loop through a list of keywords in the Configure Your Settings node to monitor several topics at once.
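For reference, a keyword search against Hacker News can be sketched with the public Algolia HN Search API, which backs HN search (whether the n8n Hacker News node sends exactly these parameters is an assumption):

```javascript
// Build an Algolia HN Search API URL for a keyword, limited to stories.
function hnSearchUrl(keyword, hitsPerPage = 5) {
  return (
    `https://hn.algolia.com/api/v1/search?tags=story` +
    `&query=${encodeURIComponent(keyword)}&hitsPerPage=${hitsPerPage}`
  );
}

const url = hnSearchUrl("AI");
```

Looping multiple keywords through a function like this is one way to implement the "track multiple keywords" customization with an HTTP Request node.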
by Alex Berman
## Who is this for

This workflow is built for real estate investors, wholesalers, and skip tracers who need to find contact details (phone numbers, emails, and addresses) for property owners at scale. It automates the entire lookup process using the ScraperCity People Finder API and stores clean results in Airtable for follow-up.

## How it works

1. A manual trigger starts the workflow.
2. A configuration node lets you define the list of property owner names (or phones/emails) to look up.
3. The workflow submits a skip trace job to the ScraperCity People Finder API, which returns a runId for async tracking.
4. An async polling loop checks the job status every 60 seconds until the result is marked SUCCEEDED.
5. Once complete, the workflow downloads the results CSV and parses each contact record using a Code node.
6. Duplicate records are removed, and each unique contact is synced into an Airtable base as a new row with name, phone, email, and address fields.

## How to set up

1. Create a ScraperCity API credential in n8n (HTTP Header Auth, header name Authorization, value Bearer YOUR_KEY).
2. Update the Configure Search Inputs node with your target names, phones, or emails.
3. Connect your Airtable credential and set your Base ID and Table name in the Sync Contacts to Airtable node.

## Requirements

- ScraperCity account with People Finder access (scrapercity.com)
- Airtable account with a base set up to receive contact data

## How to customize the workflow

- Change max_results in Configure Search Inputs to return more contacts per person.
- Swap the Airtable node for a Google Sheets node if preferred.
- Add a Filter node after parsing to keep only records that have a verified phone number.
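The deduplication step can be sketched as collapsing parsed CSV rows on a normalized phone-plus-email key. The field names are assumptions about the ScraperCity CSV; the template's own Code node may key on different fields:

```javascript
// Remove duplicate contacts by normalized phone + lowercased email.
function dedupeContacts(records) {
  const seen = new Set();
  return records.filter((r) => {
    const phone = String(r.phone || "").replace(/\D/g, ""); // digits only
    const email = String(r.email || "").toLowerCase().trim();
    const key = `${phone}|${email}`;
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const unique = dedupeContacts([
  { name: "Jane Roe", phone: "(555) 010-2000", email: "jane@example.com" },
  { name: "Jane Roe", phone: "555-010-2000", email: "Jane@Example.com " },
  { name: "John Doe", phone: "555-010-3000", email: "john@example.com" },
]);
```

Normalizing before comparison matters because the same person often appears with differently formatted phone numbers or email capitalization across result rows.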
by Alex Berman
## Who is this for

This workflow is built for real estate investors, private investigators, recruiters, and sales teams who need to skip trace individuals (finding contact details, addresses, and phone numbers from a name, email, or phone number) and store the enriched records automatically in Notion.

## How it works

1. A user fills out an n8n form with one or more search inputs (name, email, or phone number).
2. The workflow submits that data to the ScraperCity People Finder API, which begins an async enrichment job.
3. The workflow then polls the job status every 60 seconds until it completes.
4. Once the scrape succeeds, the results are downloaded, parsed, and deduplicated, and each enriched person record is written as a new page in a Notion database.

## How to set up

1. Create a ScraperCity account at scrapercity.com and copy your API key.
2. In n8n, create an HTTP Header Auth credential named "ScraperCity API Key" with the key Authorization and value Bearer YOUR_KEY.
3. Create a Notion integration and share your target database with it, then create a Notion credential in n8n.
4. Open the Configure Search Defaults node and set your preferred max_results value.
5. Open the Save Person Record to Notion node and set your Notion Database ID.

## Requirements

- ScraperCity account (scrapercity.com) with People Finder access
- n8n instance (cloud or self-hosted)
- Notion workspace with a database for storing people records

## How to customize the workflow

- Change the form fields to accept bulk CSV input instead of a single name.
- Add a Filter node after parsing to only save records that include a valid phone number.
- Route results to Google Sheets or HubSpot instead of Notion by swapping the final node.
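The 60-second polling loop described above boils down to a small decision function. The `SUCCEEDED`/`FAILED` status values mirror the workflow description; the `maxChecks` safety cap is an added assumption, not part of the template:

```javascript
// Decide what the workflow should do after each status check.
function nextPollAction(status, checksSoFar, maxChecks = 30) {
  if (status === "SUCCEEDED") return { action: "download" };
  if (status === "FAILED") return { action: "abort" };
  if (checksSoFar >= maxChecks) return { action: "abort" }; // give up after ~30 min
  return { action: "wait", delaySeconds: 60 }; // re-check in 60 s
}

const step = nextPollAction("RUNNING", 3);
```

In n8n this maps to an IF node after the status request, with the "wait" branch routed through a Wait node back into the status check; a cap like `maxChecks` prevents a stuck job from looping forever.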