by Evilasio Ferreira
This workflow automatically creates daily backups of all n8n workflows and stores them in Google Drive, using the n8n API to export workflows and a scheduled retention policy to keep storage organized. The automation runs in two stages: backup and cleanup.

**Daily Backup Process**

- A Schedule Trigger runs at a defined time each day.
- The workflow creates a folder in Google Drive named with the current date.
- It calls the n8n API to retrieve the list of all workflows.
- Each workflow is processed individually and converted into a .json file.
- The files are uploaded to the daily backup folder in Google Drive.

This ensures that every workflow is safely stored and versioned by date.

**Automatic Cleanup**

A second scheduled process maintains storage hygiene:

- The workflow lists all backup folders in the Google Drive root directory.
- It checks each folder's creation date.
- Any folder older than the defined retention period (default 15 days) is automatically deleted.

This prevents unnecessary storage usage while keeping recent backups easily accessible.

**Key Features**

- Automated daily workflow backups
- Uses the n8n API to export workflows
- Files stored in Google Drive
- Automatic retention cleanup
- Fully documented with sticky notes inside the workflow
- Uses secure credentials (no hardcoded API keys)

**Setup**

Configuration takes only a few minutes:

- Connect a Google Drive OAuth credential
- Define the Google Drive root folder ID for backups
- Configure n8n API credentials securely
- Adjust the backup schedule and the retention period (default 15 days)

Once configured, the workflow runs automatically, creating daily backups and removing old ones without manual intervention.
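The export step can be sketched as follows. This is a minimal sketch assuming the n8n public API (`GET /api/v1/workflows` authenticated with an `X-N8N-API-KEY` header) and Node 18+ for the global `fetch`; it is not the template's exact node configuration.

```javascript
// Turn a workflow name into a safe .json file name for the Drive upload
function workflowFileName(name) {
  return `${name.replace(/[^\w-]+/g, "_")}.json`;
}

// Fetch every workflow from the n8n public API and prepare one
// JSON file per workflow, mirroring the daily backup step above.
async function exportWorkflows(baseUrl, apiKey) {
  const res = await fetch(`${baseUrl}/api/v1/workflows`, {
    headers: { "X-N8N-API-KEY": apiKey },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  const { data } = await res.json();
  return data.map((wf) => ({
    fileName: workflowFileName(wf.name),
    content: JSON.stringify(wf, null, 2),
  }));
}
```

Each returned `{ fileName, content }` pair corresponds to one file uploaded into the dated Google Drive folder.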
by Alex Berman
**Who is this for**

This template is for investigators, real estate professionals, recruiters, and sales teams who need to skip trace individuals -- finding current addresses, phone numbers, and emails from a name, phone number, or email address. It is ideal for anyone who needs to enrich a list of contacts with verified location and contact data at scale.

**How it works**

- You configure a list of names (or phones/emails) in the setup node.
- The workflow submits a skip trace job to the ScraperCity People Finder API and captures the run ID.
- An async polling loop checks the job status every 60 seconds until it returns SUCCEEDED.
- Once complete, the results are downloaded, parsed from CSV, and written row by row to Google Sheets.

**How to set up**

1. Add your ScraperCity API key as an HTTP Header Auth credential named "ScraperCity API Key".
2. Open the "Configure Search Inputs" node and replace the placeholder names with your target list.
3. Open the "Save Results to Google Sheets" node and set your Google Sheet document ID and sheet name.
4. Click Execute to run.

**Requirements**

- ScraperCity account with People Finder access (app.scrapercity.com)
- Google Sheets OAuth2 credential connected to n8n

**How to customize the workflow**

- Switch the search input from names to phone numbers or emails by editing the JSON body in "Start People Finder Scrape".
- Increase or decrease max_results in the request body to control how many matches are returned per person.
- Add a Filter node after CSV parsing to keep only results with a confirmed phone number or address.
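The 60-second polling loop can be sketched like this. The status-to-action mapping follows the description above (poll until SUCCEEDED); the `/runs/{id}` path and `Authorization` header are hypothetical placeholders, since the real ScraperCity endpoints are configured inside the workflow's HTTP Request nodes.

```javascript
// Decide what the polling loop should do with a given job status
function nextPollAction(status) {
  if (status === "SUCCEEDED") return "download"; // job done: fetch results
  if (status === "FAILED") return "abort";       // give up and surface an error
  return "wait";                                  // still running: sleep and re-poll
}

// Poll a (hypothetical) status endpoint every 60s until the job finishes
async function pollUntilDone(baseUrl, runId, apiKey, { intervalMs = 60_000 } = {}) {
  for (;;) {
    const res = await fetch(`${baseUrl}/runs/${runId}`, {
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const { status } = await res.json();
    const action = nextPollAction(status);
    if (action === "download") return status;
    if (action === "abort") throw new Error(`run ${runId} failed`);
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```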
by Sri Kolagani
Transform your lead qualification process with automated AI-powered phone calls triggered directly from Salesforce lead creation.

**What this workflow does:**

- Webhook Trigger: Receives new lead data from Salesforce
- Automated Calling: Initiates phone calls via Retell AI
- Smart Monitoring: Polls call status until completion
- AI Analysis: Uses OpenAI to analyze call transcripts
- Salesforce Integration: Creates follow-up tasks with insights

**Perfect for:**

- Sales teams wanting to qualify leads faster
- Companies using Salesforce CRM
- Organizations looking to automate initial prospect outreach
- Teams wanting AI-powered call analysis

**You'll need:**

- Salesforce org with lead creation triggers
- Retell AI account and agent setup
- OpenAI API access
- Basic n8n workflow knowledge

Setup time: ~15 minutes

Author: Sri Kolagani

Template Type: Free
by ToolMonsters
**How it works**

This workflow lets you search for leads using FullEnrich's People Search API directly from Monday.com, then auto-fills the results as new items on your board.

- A Monday.com automation sends a webhook when a new item is created on your "search criteria" board
- The workflow responds to Monday.com's challenge handshake, then calls FullEnrich's People Search API with the criteria from your Monday.com columns (job title, industry, location, company size, number of results)
- The search results are split into individual people and each one is created as a new item on your Monday.com "results" board with their name, title, company, domain, LinkedIn URL, and location

**Set up steps**

Setup takes about 15 minutes:

1. Monday.com "Search Criteria" board — Create a board with these columns: Job Title (text), Industry (text), Location (text), Company Size (text), Number of Results (number). Note down the column IDs
2. Monday.com "Results" board — Create a board with columns for: First Name, Last Name, Job Title, Company Name, Company Domain, LinkedIn URL, Location. Note down the board ID, group ID, and column IDs
3. Monday.com automation — On your search criteria board, create an automation: When item created → send webhook to the production URL from the "Monday.com Webhook" node
4. FullEnrich — Connect your FullEnrich API credentials
5. Monday.com — Connect your Monday.com API credentials
6. Column mapping — Update the column IDs in the HTTP Request body and in the Monday.com node to match your boards
7. Activate the workflow
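The challenge handshake mentioned above works like this: when Monday.com first registers a webhook URL, it POSTs a body containing a `challenge` token and expects that same JSON echoed back before it will deliver real events. A minimal sketch of the branching logic:

```javascript
// Handle an incoming Monday.com webhook body: echo the handshake
// challenge, or hand a real event on to the rest of the workflow.
function handleMondayWebhook(body) {
  if (body && body.challenge !== undefined) {
    // Handshake: respond with the same challenge so Monday.com
    // accepts the webhook URL
    return { respondWith: { challenge: body.challenge }, event: null };
  }
  // Normal delivery: acknowledge and pass the event payload along
  return { respondWith: { ok: true }, event: body.event ?? null };
}
```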
by Andri Darnius
Automatically extract stock transactions from Indonesian broker trade confirmation documents sent via Telegram using AI vision.

**How it works:**

- Send a PDF or image of your broker trade confirmation to the bot.
- The workflow downloads the file, encodes it, and sends it to OpenRouter (Gemini) for AI extraction.
- All detected transactions are displayed in a single confirmation message with ✅ Save All / ❌ Cancel buttons.
- The extracted data is stored temporarily using workflow static data, ready to be forwarded to any destination on confirm.

**Features:**

- Supports PDF and image (JPG, PNG, screenshot)
- Handles multi-transaction documents — all shown in one batch confirmation
- Indonesian market aware — quantity in lots (1 lot = 100 shares)
- Extracts: ticker, company name, type, quantity, price, fee, total, date, broker, confidence score
- Low-confidence extractions handled gracefully
- Modular — connect the confirm output to any node (HTTP Request, Google Sheets, Airtable, Notion, database, etc.)

**Required credentials:**

- Telegram account — your bot token from BotFather
- OpenRouter API — Header Auth credential (Authorization: Bearer sk-or-v1-...)

**Setup:**

1. Import the workflow
2. Add credentials
3. Expose n8n via HTTPS (Cloudflare Tunnel, ngrok, or public server)
4. Activate the workflow — the webhook registers automatically
5. Send a broker PDF to your bot

Default model: google/gemini-2.5-flash-lite via OpenRouter (free tier available)
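The Indonesian-market quantity convention the extractor relies on (1 lot = 100 shares) can be made concrete with a small sketch; the `grossValue` helper is illustrative, not a node in the template.

```javascript
// On the Indonesian exchange, broker confirmations quote quantity
// in lots, and one lot is 100 shares.
const SHARES_PER_LOT = 100;

function lotsToShares(lots) {
  return lots * SHARES_PER_LOT;
}

// Gross transaction value before fees: shares * price per share
function grossValue(lots, pricePerShare) {
  return lotsToShares(lots) * pricePerShare;
}
```

So a confirmation showing 5 lots at IDR 1,250 per share represents 500 shares and a gross value of IDR 625,000.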
by Romain Jouhannet
**📝 Release Note Helper**

Triggered by a GitLab MR webhook, this workflow automatically assists your team in writing customer-facing release notes by combining Linear issue data with Claude AI. Apply the rn-release-n8n label to any release note MR in your docs repository to trigger it.

**How it works**

- Version detection — reads your release RSS feed to find the last published version, then fetches all matching Linear version labels created since then to determine the version range automatically
- Issue collection — queries Linear for all completed issues in that version range that have Zendesk tickets, Slack links, or custom labels (Customer request, Release note public) attached
- Ticket summary — posts a structured list of all relevant issues to the MR as a comment
- AI draft — sends issue details to Claude, which generates customer-facing changelog entries grouped into ### Enhancements and ### Fixes, posted as a second MR comment
- Done label — adds rn-done to the MR when complete to prevent re-runs

**Setup**

1. Configure a GitLab webhook on your docs repo pointing to this workflow's URL (Merge Request events)
2. Create two labels on your GitLab repo: rn-release-n8n (to trigger) and rn-done (auto-applied on completion)
3. Update the RSS Read node URL to your release RSS feed
4. Replace YOUR_PROJECT_ID in all GitLab API nodes with your docs project ID
5. Replace YOUR_WORKSPACE in the Code nodes with your Linear workspace slug
6. Connect Linear API, GitLab API, and Anthropic API credentials

**Notes**

- Versioning assumes a vX.Y Linear label convention — adapt the Format labels node for your own scheme
- The AI prompt in Message a model is ready to use but can be customised to match your tone and changelog format
- Issues are filtered to those with Zendesk, Slack attachments, or your custom labels — adjust in Set Params
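The version-detection step depends on comparing vX.Y labels numerically, which a sketch like this makes explicit (note "v1.10" must sort after "v1.9", so plain string comparison would be wrong). This is a sketch of the convention described in the notes, not the template's exact Code node.

```javascript
// Parse a Linear label following the vX.Y convention; returns null
// for labels that don't match, so they can be ignored.
function parseVersionLabel(label) {
  const m = /^v(\d+)\.(\d+)$/.exec(label);
  return m ? { major: Number(m[1]), minor: Number(m[2]) } : null;
}

// True if `label` is a version strictly newer than the last
// version published in the release RSS feed.
function isNewerThan(label, lastPublished) {
  const a = parseVersionLabel(label);
  const b = parseVersionLabel(lastPublished);
  if (!a || !b) return false;
  return a.major > b.major || (a.major === b.major && a.minor > b.minor);
}
```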
by MUHAMMAD SHAHEER
**What It Does**

Build your own AI chatbot that listens, thinks, searches, and speaks — all inside n8n. This template combines Groq AI, LangChain Agent, SerpAPI, and StreamElements TTS to create a chatbot that:

- Understands natural language input
- Searches the web for real-time answers
- Remembers previous messages (context memory)
- Replies with a realistic AI voice

**How It Works**

1. Chat Trigger: The workflow activates whenever a new message is received.
2. Groq AI Agent: Processes user input, performs reasoning, and integrates with SerpAPI for live web results.
3. Memory Node: Keeps the chat context for a natural conversation flow.
4. TTS Node: Converts AI responses into realistic voice replies using the StreamElements API.

**Setup Steps**

1. Connect your Groq, SerpAPI, and StreamElements credentials (no coding required).
2. Customize the chatbot behavior directly inside n8n.
3. Deploy instantly and chat via webhook or UI widget.

**Use Cases**

- Voice-enabled customer-support bots
- AI chat widgets for websites
- Personal assistants that talk and search the web
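As an illustration of the TTS step, StreamElements exposes a public speech endpoint that takes the voice and text as query parameters; the endpoint path and the "Brian" voice name below are assumptions to verify against StreamElements' current API, not values taken from the template.

```javascript
// Build a StreamElements TTS request URL for a given reply text.
// URLSearchParams handles the percent/plus encoding of the text.
function ttsUrl(text, voice = "Brian") {
  const params = new URLSearchParams({ voice, text });
  return `https://api.streamelements.com/kappa/v2/speech?${params}`;
}
```

Fetching the returned URL yields an audio stream of the spoken reply.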
by 1Shot API
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Monetize Your Private LLM Models with x402 and Ollama**

Self-hosting custom LLMs is becoming more popular and easier with turn-key inferencing tools like Ollama. With Ollama you can host your own proprietary models for customers in a private cloud or on your own hardware. But monetizing custom-trained, proprietary models is still a challenge, requiring integrations with payment processors like Stripe, which don't support micropayments for on-demand API consumption. With this free workflow you can quickly monetize your proprietary LLM models with the x402 payment scheme in n8n with 1Shot API.

**Setup Steps:**

1. Authenticate the 1Shot API node against your 1Shot API business account.
2. Point the 1Shot API simulate and execute nodes at the x402-compatible payment token you'd like to receive as payment.
3. Configure the Ollama n8n node in the workflow (with optional authentication) to forward to your Ollama API endpoint and let users query it through an n8n webhook endpoint while paying you directly in your preferred stablecoin (like USDC).

Through x402, users and AI agents can pay per inference, with no overhead wasted on centralized payment processors.

**Walkthrough Tutorial**

Check out the YouTube tutorial for this workflow to see the full end-to-end process.
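Conceptually, the x402 scheme gates the inference endpoint behind HTTP 402 Payment Required: an unpaid request gets a 402 response describing what to pay, and a request carrying a payment is verified and forwarded to the model. The sketch below captures only that branching; the header and field names are illustrative, not the exact x402 wire format, and in the real workflow the 1Shot API nodes do the verification and settlement.

```javascript
// Gate an incoming inference request on payment, x402-style.
function gateRequest(headers, priceUsdc) {
  if (!headers["x-payment"]) {
    // No payment attached: tell the caller what is required
    return {
      status: 402,
      body: { error: "payment required", asset: "USDC", amount: priceUsdc },
    };
  }
  // Payment attached: after verification/settlement, forward the
  // prompt to the Ollama endpoint
  return { status: 200, forwardToModel: true };
}
```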
by Daniel Rosehill
This workflow provides a way to capture detailed AI prompts using a voice note transcription service and then passes them on for completion to an AI agent. To preserve outputs in a knowledge management system, the AI response and the prompt are combined into one document that is created in a Nuclino collection (note: the Nuclino step is configured manually with an HTTP Request node).

**How it works**

- A webhook receives voice note data from Voicenotes.com containing the title and transcript
- The transcript is extracted and sent to an AI Agent powered by OpenRouter's Claude Sonnet model
- The AI generates a structured response in markdown format with Summary, Prompt, and Response sections
- The original prompt and AI response are merged and prepared for multiple outputs
- A Nuclino document is created via HTTP Request with the structured content
- A Slack notification is sent with the prompt, response, and Nuclino note URL
- Both the original prompt and AI response are archived in NocoDB for future reference

**How to use**

- The webhook trigger can be configured to receive data from Voicenotes.com or any service that provides title and transcript data
- Replace the manual trigger with webhook, form, or other triggers as needed
- Customize the AI system message to change response format and behavior
- Configure Nuclino workspace and collection IDs for proper document organization

**Requirements**

- **OpenRouter account** for AI model access (Claude Sonnet)
- **Nuclino account** and API token for document creation
- **Slack workspace** with bot permissions for notifications
- **NocoDB instance** for archiving (optional)
- **Voicenotes.com account** for voice input (or alternative webhook source)

**Customising this workflow**

- **AI Models**: Switch between different OpenRouter models by changing the model parameter
- **Response Format**: Modify the AI Agent system message to change output structure
- **Documentation Platforms**: Replace the Nuclino HTTP Request with other documentation APIs
- **Notification Channels**: Add multiple Slack channels or other notification services
- **Archive Storage**: Replace NocoDB with other database solutions
- **Input Sources**: Adapt the webhook to accept data from different voice note or transcription services

**Nuclino API**

The Nuclino API is documented here.
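A sketch of what the manual Nuclino HTTP Request step assembles: one markdown document combining the prompt and the AI response, posted to Nuclino's items endpoint. The `/v0/items` path and the `workspaceId`/`title`/`content` field names are assumptions to check against the Nuclino API documentation before use.

```javascript
// Build the combined document and the request payload for the
// manual Nuclino HTTP Request node.
function buildNuclinoItem(workspaceId, title, prompt, aiResponse) {
  const content = [
    "## Prompt",
    prompt,
    "",
    "## Response",
    aiResponse,
  ].join("\n");
  return {
    url: "https://api.nuclino.com/v0/items",
    method: "POST",
    body: { workspaceId, title, content },
  };
}
```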
by oka hironobu
**Who is this for**

Legal teams, operations managers, and freelancers who review contracts regularly and want to catch risky clauses before signing. Ideal for small teams without dedicated legal counsel.

**What this workflow does**

This workflow automates contract risk analysis using AI. A user uploads a PDF contract through a web form and selects the contract type. The Code node extracts text from the PDF, then Google Gemini analyzes the full contract for risky clauses, unfavorable terms, and missing legal protections. Each clause gets a severity rating (high, medium, low) with a suggested fix. The parsed results are logged to Google Sheets for tracking, and if the overall risk score exceeds your threshold, a Slack alert fires immediately so nothing slips through.

**How to set up**

1. Get a free Google Gemini API key from Google AI Studio
2. Connect your Google Sheets account and create a spreadsheet called "Contract Reviews"
3. Connect your Slack workspace and select the channel for risk alerts
4. Activate the workflow and share the form URL with your team

**Requirements**

- Google Gemini API key (free tier available)
- Google Sheets account
- Slack workspace with a channel for alerts
- n8n instance (self-hosted or cloud)

**How to customize**

- Edit the AI prompt in the "Analyze Contract" node to focus on specific clause types like indemnification or IP assignment
- Change the risk threshold in the "Check Risk Level" node (default triggers on scores above 7)
- Add columns to the Sheets node for additional tracking fields like reviewer name or department
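The threshold check in "Check Risk Level" can be sketched as follows. The clause shape (a `severity` of high/medium/low) and the default threshold of 7 come from the description above; the severity weighting itself is illustrative, since the template computes the overall score via the AI prompt.

```javascript
// Illustrative weights for the high/medium/low severity ratings
const SEVERITY_WEIGHT = { high: 3, medium: 2, low: 1 };

// Sum clause severities into an overall risk score
function overallRiskScore(clauses) {
  return clauses.reduce((sum, c) => sum + (SEVERITY_WEIGHT[c.severity] ?? 0), 0);
}

// Fire the Slack alert when the score exceeds the threshold (default 7)
function shouldAlert(clauses, threshold = 7) {
  return overallRiskScore(clauses) > threshold;
}
```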
by Nima Salimi
**Overview**

This n8n workflow automatically fetches the Forex Factory calendar for yesterday using Rapid API, then saves the data to a connected Google Sheet and sends Telegram alerts for high and medium impact events. It runs daily on schedule, collecting key fields such as currency, time, impact, and market indicators, and organizes them for easy tracking and analysis. Perfect for forex traders and analysts who need quick access to reliable market data from the previous day's events.

**✅ Tasks**

- ⏰ Runs automatically every day
- 🌐 Fetches yesterday's Forex Factory calendar via Rapid API
- 🧾 Collects key data fields: year, date, time, currency, impact, actual, forecast, previous
- 📊 Saves all records to Google Sheets for tracking and analysis
- 🚨 Sends Telegram alerts for high and medium impact events
- ⚙️ Keeps your market data updated and organized with no manual work required

**🛠 How to Use**

1. 📄 Create a Google Spreadsheet — Create a new spreadsheet in Google Sheets and add two sheets: High Impact and Low Impact. Connect it to your Google Sheets nodes in n8n.
2. 🌐 Find the API on Rapid API — Go to Rapid API and search for Forex Factory Scraper. Subscribe to the API to get your access key.
3. 🔑 Connect Rapid API to n8n — In your HTTP Request node, add the Rapid API authentication header to your request.
4. 💬 Add Your Telegram Chat ID — In the Telegram node, paste your Chat ID to receive daily alerts for high-impact news.
5. 🕒 Activate the Workflow — Enable the Schedule Trigger to run daily. The workflow will automatically fetch yesterday's Forex Factory calendar, save it to Google Sheets, and send Telegram notifications.
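For reference, Rapid API endpoints are normally authenticated with two request headers, `x-rapidapi-key` and `x-rapidapi-host`. The sketch below shows that shape; the host value is a placeholder, and the exact header values for the Forex Factory Scraper should be copied from its page on Rapid API.

```javascript
// Build the standard Rapid API authentication headers for the
// HTTP Request node.
function rapidApiHeaders(apiKey, apiHost) {
  return {
    "x-rapidapi-key": apiKey,   // your subscription key
    "x-rapidapi-host": apiHost, // e.g. the Forex Factory Scraper host
  };
}
```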
by swathi
**The problem**

Ever attend a networking event and find yourself taking screenshots of people's LinkedIn profiles? It sounds counter-intuitive because you are connecting on LinkedIn, but you still find it hard to keep track of everyone you've met. You also don't want to miss diligently updating your CRM with details and insights.

**The solution**

There's no need for yet another app. Continue taking screenshots. Just share them on a 2-field-only Google Form: the screenshot plus your quick notes about the person. Create a shortcut to the Google Form link on your phone's home screen. Voila! You have app-like access without the need for an app. Once you submit just these 2 pieces of info, AI parses the image AND crafts a follow-up message. Within minutes! Just open your spreadsheet to have all that information consolidated, automatically, for your review. Promote yourself from do-er to manager.

**Who should use it?**

Anyone, really. If you meet people and want to be more meticulous or efficient about staying on top of follow-ups, use this.

**How to set it up**

Time: ~10 minutes end-to-end.

1. Import the provided workflow JSON in n8n.
2. Connect credentials: Google Drive (read), Google Sheets (write), OpenAI.
3. Configure key information: Google Sheets and the relevant columns.
4. Configure the OpenAI models based on your cost/efficiency requirements.
5. Confirm the column headers in your Sheet match the variables (or update the variables).
6. Test with one screenshot.

Pro tip: Add the Google Form link as a shortcut on your phone's home screen. Get app-like convenience without downloading yet another app.