by Solomon
Based on Jonathan's work. Check out his templates.

How it works
This workflow backs up your workflows to GitHub. It uses the n8n API node to export all workflows, then loops over the data and checks GitHub for an existing file named after the workflow's ID. Once checked, it will: update the file on GitHub if it exists; create a new file if it doesn't exist; ignore it if the content is unchanged.

Who is this for?
People wanting to back up their workflows outside the server, for safety purposes or to migrate to another server.

Check out my other templates 👉 https://n8n.io/creators/solomon/
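The compare-and-decide step is easy to picture as a single Code node. Below is a minimal sketch, assuming the exported workflow JSON and the GitHub file lookup arrive as inputs; the node and field names are illustrative, not the template's exact wiring.

```javascript
// Minimal sketch of the update/create/ignore decision (n8n Code node).
// Assumes the GitHub "get file" request ran with "continue on fail",
// so a missing file arrives as an error item. Names are illustrative.
const workflow = $('n8n').item.json;  // exported workflow from the n8n API node
const github = $input.item.json;      // GitHub file lookup result

let status;
if (github.error) {
  status = 'create';                  // no file for this workflow ID yet
} else {
  // GitHub returns file content base64-encoded
  const existing = Buffer.from(github.content, 'base64').toString('utf8');
  const current = JSON.stringify(workflow, null, 2);
  status = existing === current ? 'same' : 'update';
}

return { json: { id: workflow.id, name: workflow.name, status } };
```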
by David Ashby
Complete MCP server exposing all Jina AI Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Jina AI Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Jina AI Tool node with full error handling

📋 Available Operations (3 total)
Every possible Jina AI Tool operation is included:

🔧 Reader (2 operations)
• Read URL content
• Search web

🔧 Research (1 operation)
• Perform deep research

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Jina AI Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Jina AI Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
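To make the $fromAI() placeholders concrete, here is the kind of expression a pre-configured tool node parameter might carry; the parameter name and description are illustrative, not tied to a specific Jina AI operation:

```javascript
{{ $fromAI('url', 'The URL whose content should be read', 'string') }}
```

At call time the connected AI agent supplies the value, so no parameter mapping is needed by hand.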
by David Ashby
Complete MCP server exposing all Humantic AI Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Humantic AI Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Humantic AI Tool node with full error handling

📋 Available Operations (3 total)
Every possible Humantic AI Tool operation is included:

🔧 Profile (3 operations)
• Create a profile
• Get a profile
• Update a profile

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Humantic AI Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Humantic AI Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all Gong Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Gong Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Gong Tool node with full error handling

📋 Available Operations (4 total)
Every possible Gong Tool operation is included:

🔧 Call (2 operations)
• Get call
• Get many calls

👤 User (2 operations)
• Get user
• Get many users

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Gong Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Gong Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Nazmy
Based on Jonathan's and Solomon's work.

> The only addition I've made is a Set node. This node organizes workflows into subfolders within the GitHub repository based on their respective tags.

How it works
This workflow backs up your workflows to GitHub. It uses the n8n API node to export all workflows, then loops over the data and checks GitHub for an existing file named after the workflow's ID. Once checked, it will: update the file on GitHub if it exists; create a new file if it doesn't exist; ignore it if the content is unchanged.

Who is this for?
People wanting to back up their workflows outside the server, for safety purposes or to migrate to another server.
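The tag-to-subfolder logic the added Set node implements boils down to building a repository path from each workflow's tags. A minimal sketch (shown as Code-node JavaScript for clarity; field names follow the n8n API, the fallback behavior is an assumption):

```javascript
// Derive a GitHub path from the workflow's first tag, falling back
// to the repository root for untagged workflows.
const wf = $input.item.json;

const folder = wf.tags?.length ? wf.tags[0].name : '';
const path = folder ? `${folder}/${wf.name}.json` : `${wf.name}.json`;

return { json: { ...wf, path } };
```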
by Zacharia Kimotho
This workflow makes it easy to keep track of the stock market and sends you an email summarizing the daily highlights: what happened, key insights, and trends.

Setup Guide
Schedule Trigger. Define the schedule (days, times, intervals).
Stock list. Replace the sample stock data with your desired stock list (ticker, name, etc.) in JSON format.
Split Out. Split the fields to get a clean list of the stocks to monitor.
set keyword node. Extracts the stock ticker from each item and sets it to the keyword property.
Financial times scraper. Triggers the Bright Data Datasets API to scrape financial data (a fetch-based sketch of the three Bright Data calls appears after this guide). Set the node as below:
- Method: POST
- URL: https://api.brightdata.com/datasets/v3/trigger
- Query Parameters: dataset_id (replace with your Bright Data dataset ID), include_errors: true, type: discover_new, discover_by: keyword
- Headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
- Body (JSON): ={{ $('set keyword').all().map(item => item.json) }}
- Execute Once: Checked.
Get progress node. Checks whether the Bright Data scraping job is complete or still running.
- URL: https://api.brightdata.com/datasets/v3/progress/{{ $json.snapshot_id }}
- Headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
Get snapshot + data. Retrieves the scraped data from the Bright Data API.
- URL: https://api.brightdata.com/datasets/v3/snapshot/{{ $json.snapshot_id }}
- Query Parameters: format: json
- Headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY (replace with your Bright Data API key)
Aggregate. Combines the data from each stock item into a single object.
Update to sheet. Adds all items to this sheet (make a copy before you map the data).
create summary node. Generates a summary of the scraped stock data using the Google Gemini AI model and notifies you via Gmail.
- Prompt Type: define
- Text: Customize the prompt to define the AI's role, input format, tasks, output format (HTML email), and constraints.
Google Sheets. Appends the scraped data to the Google Sheet. Set this to auto-map so it adjusts to the results returned by the request.

Important Notes:
- Remember to replace placeholder values (API keys, dataset IDs, email addresses, Google Sheet IDs) with your actual values.
- Review and customize the AI prompt in the "create summary" node to achieve the desired email summary output.
- Consider adding error handling for a more robust workflow.
- Monitor API usage to avoid rate limits.
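For reference, here is a hedged sketch of the three Bright Data calls above as plain fetch() JavaScript. The endpoints and parameters come straight from the node settings; the API key, dataset ID, sample keyword, and polling interval are placeholders.

```javascript
const API_KEY = 'YOUR_BRIGHTDATA_API_KEY';
const DATASET_ID = 'YOUR_DATASET_ID';
const headers = { Authorization: `Bearer ${API_KEY}`, 'Content-Type': 'application/json' };

// 1. Trigger discovery by keyword (one object per stock ticker).
const trigger = await fetch(
  `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${DATASET_ID}&include_errors=true&type=discover_new&discover_by=keyword`,
  { method: 'POST', headers, body: JSON.stringify([{ keyword: 'AAPL' }]) }
).then(r => r.json());

// 2. Poll progress until the scraping job is no longer running.
let progress;
do {
  await new Promise(res => setTimeout(res, 30_000));
  progress = await fetch(
    `https://api.brightdata.com/datasets/v3/progress/${trigger.snapshot_id}`,
    { headers }
  ).then(r => r.json());
} while (progress.status === 'running');

// 3. Download the finished snapshot as JSON.
const rows = await fetch(
  `https://api.brightdata.com/datasets/v3/snapshot/${trigger.snapshot_id}?format=json`,
  { headers }
).then(r => r.json());
```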
by Leonard
Open Deep Research - AI-Powered Autonomous Research Workflow

Description
This workflow automates deep research by leveraging AI-driven search queries, web scraping, content analysis, and structured reporting. It enables autonomous research with iterative refinement, allowing users to collect, analyze, and summarize high-quality information efficiently.

How it works
🔹 User Input: The user submits a research topic via a chat message.
🧠 AI Query Generation: A Basic LLM Chain generates up to four refined search queries to retrieve relevant information.
🔎 SerpAPI Google Search: The workflow loops through each generated query and retrieves the top search results using the SerpAPI API.
📄 Jina AI Web Scraping: Extracts and summarizes webpage content from the URLs obtained via SerpAPI.
📊 AI-Powered Content Evaluation: An AI agent evaluates the relevance and credibility of the extracted content.
🔁 Iterative Search Refinement: If the AI finds insufficient or low-quality information, it generates new search queries to improve results.
📜 Final Report Generation: The AI compiles a structured markdown report, including sources with citations.

Set Up Instructions
🚀 Estimated setup time: ~10-15 minutes
✅ **Required API Keys:**
- SerpAPI → for Google Search results
- Jina AI → for text extraction
- OpenRouter → for AI-driven query generation and summarization
⚙️ **n8n Components Used:**
- AI agents with memory buffering for iterative research
- Loops to process multiple search queries efficiently
- HTTP Requests for direct API interactions with SerpAPI and Jina AI
📝 **Recommended Enhancements:**
- Add sticky notes in n8n to explain each step for new users
- Implement a Google Drive or Notion integration to save reports automatically

🎯 Ideal for:
✔️ Researchers & Analysts - Automate background research
✔️ Journalists - Quickly gather reliable sources
✔️ Developers - Learn how to integrate multiple AI APIs into n8n
✔️ Students - Speed up literature reviews

🔗 Completely free and open-source! 🚀
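For readers wiring the HTTP Request nodes by hand, here is a hedged sketch of one search-and-scrape pass, assuming SerpAPI's JSON search endpoint and Jina AI's Reader proxy (r.jina.ai); the key names and the three-result cutoff are illustrative, not the template's exact settings.

```javascript
const SERPAPI_KEY = 'YOUR_SERPAPI_KEY';
const JINA_KEY = 'YOUR_JINA_KEY';

async function searchAndScrape(query) {
  // Top Google results for one AI-generated query.
  const serp = await fetch(
    `https://serpapi.com/search.json?q=${encodeURIComponent(query)}&api_key=${SERPAPI_KEY}`
  ).then(r => r.json());

  const urls = (serp.organic_results ?? []).slice(0, 3).map(r => r.link);

  // Jina Reader returns cleaned, LLM-friendly text for each URL.
  const pages = [];
  for (const url of urls) {
    const text = await fetch(`https://r.jina.ai/${url}`, {
      headers: { Authorization: `Bearer ${JINA_KEY}` },
    }).then(r => r.text());
    pages.push({ url, text });
  }
  return pages;
}
```

The AI agent then scores each page and either feeds the report generator or loops back with refined queries.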
by David Ashby
Complete MCP server exposing all Cloudflare Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Cloudflare Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Cloudflare Tool node with full error handling

📋 Available Operations (4 total)
Every possible Cloudflare Tool operation is included:

🔧 Zone Certificate (4 operations)
• Delete a certificate
• Get a certificate
• Get many certificates
• Upload a certificate

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Cloudflare Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Cloudflare Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
Complete MCP server exposing all Beeminder Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Beeminder Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Beeminder Tool node with full error handling

📋 Available Operations (4 total)
Every possible Beeminder Tool operation is included:

🔧 Datapoint (4 operations)
• Create datapoint for goal
• Delete a datapoint
• Get many datapoints for a goal
• Update a datapoint

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Beeminder Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Beeminder Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Obsidi8n
How it works:
- Send notes from Obsidian via Webhook to start the audio conversion
- OpenAI converts your text to natural-sounding audio and generates episode descriptions
- Audio files are stored in Cloudinary and automatically attached to your notes in Obsidian
- A professional podcast feed is generated, compatible with all major podcast platforms (Apple, Spotify, Google)

Set up steps:
1. Install and configure the Post Webhook Plugin in Obsidian
2. Set up Custom Auth credentials in n8n for Cloudinary using the following JSON:

```json
{
  "name": "Cloudinary API",
  "type": "httpHeaderAuth",
  "authParameter": {
    "type": "header",
    "key": "Authorization",
    "value": "Basic {{Buffer.from('your_api_key:your_api_secret').toString('base64')}}"
  }
}
```

3. Configure podcast feed metadata (title, author, cover image, etc.)

Note: The second flow is a generic Podcast Feed module that can be reused in any '[...]-to-Podcast' workflow. It generates a standard RSS feed from Google Sheets data and podcast metadata, making it compatible with all major podcast platforms.
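As an illustration of what the generic Podcast Feed module produces, here is a minimal sketch of one RSS item rendered from a sheet row; the field names are assumptions, not the module's actual schema.

```javascript
// Render one podcast RSS <item> from a Google Sheets row.
// Field names (title, audioUrl, bytes, ...) are illustrative.
function rssItem(ep) {
  return `
  <item>
    <title>${ep.title}</title>
    <description>${ep.description}</description>
    <enclosure url="${ep.audioUrl}" length="${ep.bytes}" type="audio/mpeg"/>
    <guid isPermaLink="false">${ep.id}</guid>
    <pubDate>${new Date(ep.date).toUTCString()}</pubDate>
  </item>`;
}
```

The enclosure tag is what podcast apps read to fetch the audio, which is why the module stores the Cloudinary URL and file size per episode.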
by RealSimple Solutions
Who Is This For?
This workflow is designed for AI engineers, automation specialists, and content creators who need a scalable system to dynamically manage prompts stored in GitHub. It eliminates manual updates, enforces required-variable checks, and ensures that AI interactions always receive fully processed prompts.

🚀 What Problem Does This Solve?
Manually managing AI prompts can be inefficient and error-prone. This workflow:
✅ Fetches dynamic prompts from GitHub
✅ Auto-populates placeholders with values from the setVars node
✅ Ensures all required variables are present before execution
✅ Processes the formatted prompt through an AI agent

🛠 How This Workflow Works
This workflow consists of three key branches, ensuring smooth prompt retrieval, variable validation, and AI processing.

1️⃣ Retrieve the Prompt from GitHub (HTTP Request → Extract from File → SetPrompt)
The workflow starts manually or via an external trigger and fetches a text-based prompt stored in a GitHub repository. The Extract from File node retrieves the content from the GitHub file, and the SetPrompt node stores the prompt, making it accessible for processing.
📌 Note: The prompt must contain variables in n8n expression format (e.g., {{ $json.company }}) so they can be dynamically replaced.

2️⃣ Extract & Auto-Populate Variables (Check All Prompt Vars → Replace Variables)
A Code node scans the prompt for placeholders in the n8n expression format ({{ $json.variableName }}) and compares the required variables against the setVars node (a sketch of this check appears after this description):
✅ If all variables are present, the workflow proceeds to variable replacement.
❌ If any variables are missing, the workflow stops and returns an error listing them.
The Replace Variables node then replaces all placeholders with values from setVars.
📌 Example of a properly formatted GitHub prompt:
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.
This ensures seamless replacement when processed in n8n.

3️⃣ AI Processing & Output (AI Agent → Prompt Output)
The Set Completed Prompt node stores the final, processed prompt. The AI Agent node (Ollama Chat Model) processes the prompt, and the Prompt Output node returns the fully formatted response.
📌 Optional: Modify this to use OpenAI, Claude, or other AI models.

⚠️ Error Handling: Missing Variables
If a required variable is missing, the workflow stops execution and provides an error message:
⚠️ Missing Required Variables: ["launch_date"]
This ensures no incomplete prompts are sent to AI agents.

✅ Example Use Case
📜 GitHub prompt file (using n8n expressions):
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.
🔹 Variables in the setVars node:
{ "company": "PropTechPro", "features": "AI-powered Property Management", "launch_date": "March 15, 2025" }
✅ Successful output:
Hello PropTechPro, your product AI-powered Property Management launches on March 15, 2025.
🚨 Error output (if launch_date is missing):
⚠️ Missing Required Variables: ["launch_date"]

🔧 Setup Instructions
1️⃣ Connect Your GitHub Repository
Store your prompt in a public or private GitHub repo. The workflow will fetch the raw file using the GitHub API.
2️⃣ Configure the setVars Node
Define the required variables in the setVars node. Make sure the variable names match those used in the prompt.
3️⃣ Test & Run
Click Test Workflow to execute. If variables are missing, it will show an error; if everything is correct, it will output the fully formatted prompt.

⚡ How to Customize This Workflow
💡 Need CRM or Database Integration? Connect the setVars node to an Airtable, Google Sheets, or HubSpot API to pull variables dynamically.
💡 Want to Modify the AI Model? Replace the Ollama Chat Model with OpenAI, Claude, or a custom LLM endpoint.

📌 Why Use This Workflow?
✅ No Manual Updates Required – fetches prompts dynamically from GitHub.
✅ Prevents Broken Prompts – ensures required variables exist before execution.
✅ Works for Any Use Case – handles AI chat prompts, marketing messages, and chatbot scripts.
✅ Compatible with All n8n Deployments – works on Cloud, Self-Hosted, and Desktop versions.
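Here is a minimal sketch of the Check All Prompt Vars / Replace Variables logic as a single Code node, based on the behavior described above; the node references follow the description, but the exact wiring may differ.

```javascript
// Scan the prompt for {{ $json.name }} placeholders, fail fast on any
// missing variable, then substitute the values from setVars.
const prompt = $('SetPrompt').item.json.prompt;
const vars = $('setVars').item.json;

// Collect every {{ $json.variableName }} placeholder in the prompt.
const required = [...prompt.matchAll(/\{\{\s*\$json\.(\w+)\s*\}\}/g)].map(m => m[1]);
const missing = [...new Set(required)].filter(name => vars[name] === undefined);

if (missing.length) {
  throw new Error(`⚠️ Missing Required Variables: ${JSON.stringify(missing)}`);
}

// Replace each placeholder with its value.
const completed = prompt.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_, name) => vars[name]);
return { json: { completed } };
```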
by Nick Saraev
Google Maps Email Scraper System

Categories: Lead Generation, Web Scraping, Business Automation

This workflow creates a completely free Google Maps email scraping system that extracts unlimited business emails without requiring expensive third-party APIs. Built entirely in n8n using simple HTTP requests and JavaScript, this system can generate thousands of targeted leads for any industry or location while operating at a 99% free cost structure.

Benefits
**Zero API Costs** - Operates entirely through free Google Maps scraping without expensive third-party services
**Unlimited Lead Generation** - Extract emails from thousands of Google Maps listings across any industry
**Geographic Targeting** - Search by specific cities, regions, or business types for precise lead targeting
**Complete Automation** - From search query to organized email list with minimal manual intervention
**Built-in Data Cleaning** - Automatic duplicate removal, filtering, and data validation
**Scalable Processing** - Handle hundreds of businesses per search with intelligent rate limiting

How It Works
Google Maps Search Integration: Uses strategic HTTP requests to Google Maps search URLs. It processes search queries like "Calgary + dentist" to extract business listings, bypassing API restrictions through direct HTML scraping techniques.
Intelligent URL Extraction: Custom JavaScript regex patterns extract website URLs from Google Maps data, filter out irrelevant domains (Google, schema, static files), and return a clean list of actual business websites for processing.
Smart Website Processing: A loop-based architecture prevents IP blocking through intelligent batching, with built-in delays and redirect handling for reliable scraping. Each website is processed individually with error handling.
Email Pattern Recognition: Regex patterns identify email addresses within website HTML, extracting contact, info, and administrative addresses while handling multiple email formats and validation patterns (see the extraction sketch at the end of this listing).
Data Aggregation & Cleaning: Automatically removes duplicate emails across all processed websites, filters null entries and invalid email formats, and exports clean, organized email lists to Google Sheets.

Required Google Sheets Setup
Create a Google Sheet with these exact column headers:
Search Tracking Sheet: searches - contains your search queries (e.g., "Calgary dentist", "Miami lawyers")
Email Results Sheet: emails - contains extracted email addresses from all processed websites

Setup Instructions:
1. Create a Google Sheet with two tabs: "searches" and "emails"
2. Add your target search queries to the searches tab (one per row)
3. Connect Google Sheets OAuth credentials in n8n
4. Update the Google Sheets document ID in all sheet nodes
The workflow reads search queries from the first sheet and exports results to the second sheet automatically.

Business Use Cases
**Local Service Providers** - Find competitors and potential partners in specific geographic areas
**B2B Sales Teams** - Generate targeted prospect lists for cold outreach campaigns
**Marketing Agencies** - Build industry-specific lead databases for client campaigns
**Real Estate Professionals** - Identify businesses in target neighborhoods for commercial opportunities
**Franchise Development** - Research potential markets and existing competition
**Market Research** - Analyze business density and contact information across regions

Revenue Potential
This system transforms lead generation economics:
- $0 per lead vs. $2-5 per lead from paid databases
- Process 1,000+ leads daily without hitting API limits
- Sell as a service for $500-2,000 per industry/location
- Perfect for agencies offering lead generation to local businesses

Difficulty Level: Intermediate
Estimated Build Time: 1-2 hours
Monthly Operating Cost: $0 (completely free)

Watch My Complete Build Process
Want to watch me build this entire system live from scratch? I walk through every single step - including the JavaScript code, regex patterns, error handling, and all the debugging that goes into creating a bulletproof scraping system.
🎥 Watch My Live Build: "Scrape Unlimited Leads WITHOUT Paying for APIs (99% FREE)"
This comprehensive tutorial shows the real development process - including writing custom JavaScript, handling rate limits, and building systems that actually work at scale without getting blocked.

Set Up Steps
Basic Workflow Architecture: Set up a manual trigger for testing and the Google Sheets integration. Configure the initial HTTP Request node for Google Maps searches, enabling SSL ignore and response headers for reliable scraping.
URL Extraction Code Setup: Configure a JavaScript Code node with custom regex patterns, set up input data processing from the Google Maps HTML responses, and implement URL filtering logic to remove irrelevant domains.
Website Processing Pipeline: Add a Split in Batches node for intelligent loop processing, configure HTTP Request nodes with proper delays and redirect handling, and set up error handling for websites that can't be scraped.
Email Extraction System: Implement a JavaScript Code node with email-specific regex patterns, configure email validation and format checking, and set up data aggregation for multiple emails per website.
Data Cleaning & Export: Configure filtering nodes to remove null entries and duplicates, use a Split Out node to aggregate emails into a single list, and connect the Google Sheets integration for organized data export.
Testing & Optimization: Use Limit nodes during testing to prevent IP blocking, test with small batches before scaling to full searches, and consider proxy integration for high-volume usage.

Advanced Optimizations
Scale the system with:
**Multi-Page Scraping:** Extract URLs from homepages, then scrape contact pages for more emails
**Proxy Integration:** Add residential proxies for unlimited scraping without rate limits
**Industry Templates:** Create pre-configured searches for different business types
**Contact Information Expansion:** Extract phone numbers, addresses, and social media profiles
**CRM Integration:** Automatically add leads to sales pipelines and marketing sequences

Important Considerations
**Rate Limiting:** Built-in delays prevent IP blocking during normal usage
**Scalability:** For high-volume usage, consider proxy services for unlimited requests
**Compliance:** Ensure you have proper usage rights for extracted contact information
**Data Quality:** The system includes filtering, but manual verification is recommended for critical campaigns

Check Out My Channel
For more advanced automation systems and business-building strategies that generate real revenue, explore my YouTube channel, where I share proven automation techniques used by successful agencies and entrepreneurs.
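To make the email-extraction step concrete, here is a hedged sketch of the kind of Code node described above: pull addresses out of raw website HTML, drop obvious junk, and de-duplicate. The exact regex used in the video build may differ.

```javascript
// Extract candidate email addresses from the scraped page HTML.
const html = $input.item.json.data ?? '';

const matches = html.match(/[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g) ?? [];

const emails = [...new Set(matches)].filter(e =>
  // filter image filenames and asset paths that match the email pattern
  !/\.(png|jpe?g|gif|webp|svg)$/i.test(e)
);

// One n8n item per unique email, ready for the aggregation step.
return emails.map(email => ({ json: { email } }));
```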