by KPendic
How it works
This workflow exports all of your CloudFlare domains to a Google Sheet so you get a high-level overview of all of your settings. This can help with debugging, searching, and similar needs. The flow uses simple paging nodes to iterate over all your domains, because the list can be large. For each host, the DNS records and settings are merged and transformed into columns, one row per domain.

Requirements
For storing and processing the data in this flow you will need:
- A CloudFlare.com API key/token for retrieving your data (https://dash.cloudflare.com/:account/api-tokens) - it needs full access
- Google Sheets credentials connected in your n8n Credentials
- A Google Sheets template - you can copy my sheet as a starting point; copy it to your account and match the Sheet ID in the 'Export' node to your newly created sheet

Official CloudFlare API documentation
For full details and specifications, please use the API documentation at: https://developers.cloudflare.com/api/

Potential API timeouts
If you encounter CF API timeouts, I would suggest adding a simple sleep/wait node somewhere in the loop - a couple of seconds should resolve the timeouts.

Google Sheet
I've used Google Sheets' conditional formatting feature to visually distinguish the on/off toggles I was interested in, which makes it easy to get a quick overview when debugging some of the settings on my hosts - but please use your own logic or change it completely.
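For reference, here is the shape of the CloudFlare API calls the paging loop makes, written as a plain Node.js sketch rather than n8n nodes. The zone, DNS-record and settings endpoints come from the v4 API documentation linked above; the page size, the pause length and the way settings are flattened into columns are illustrative choices, not the template's exact configuration.

```javascript
// Rough sketch of the calls the workflow makes, in plain Node.js (18+, global fetch).
// Assumes CF_API_TOKEN is an API token with access to zones, DNS records and settings.
const CF_API = 'https://api.cloudflare.com/client/v4';
const headers = { Authorization: `Bearer ${process.env.CF_API_TOKEN}` };

async function cfGet(path) {
  const res = await fetch(`${CF_API}${path}`, { headers });
  const body = await res.json();
  if (!body.success) throw new Error(JSON.stringify(body.errors));
  return body;
}

async function exportZones() {
  const rows = [];
  // Page through all zones, mirroring the paging nodes in the flow.
  for (let page = 1, totalPages = 1; page <= totalPages; page++) {
    const zonesPage = await cfGet(`/zones?page=${page}&per_page=50`);
    totalPages = zonesPage.result_info.total_pages;

    for (const zone of zonesPage.result) {
      // Merge DNS records and zone settings into one flat row per domain.
      const dns = await cfGet(`/zones/${zone.id}/dns_records`);
      const settings = await cfGet(`/zones/${zone.id}/settings`);
      const row = { domain: zone.name, dns_records: dns.result.length };
      for (const s of settings.result) row[s.id] = s.value; // one column per setting
      rows.push(row);
      // If you hit API timeouts, pause briefly here (see the note above).
      await new Promise((r) => setTimeout(r, 2000));
    }
  }
  return rows; // in the workflow these rows are appended to the Google Sheet
}

exportZones().then((rows) => console.log(rows.length, 'domains exported'));
```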
by Incrementors
🛒 Google Maps Business Phone Number Scraper Using Bright Data API & Google Sheets Integration

This template requires a self-hosted n8n instance to run.

An automated workflow that extracts business information including phone numbers from Google Maps using Bright Data's API and saves the data to Google Sheets for easy access and analysis.

📋 Overview
This workflow provides an automated solution for extracting business contact information from Google Maps based on location and keyword searches. Perfect for lead generation, market research, competitor analysis, and business directory creation.

✨ Key Features
- 🎯 Form-Based Input: Easy-to-use form for location and keyword submission
- 🗺️ Google Maps Integration: Uses Bright Data's Google Maps dataset for accurate business data
- 📊 Comprehensive Data Extraction: Extracts business names, addresses, phone numbers, ratings, and more
- 📧 Automated Processing: Handles the entire scraping process automatically
- 📈 Google Sheets Storage: Automatically saves extracted data to organized spreadsheets
- 🔄 Smart Status Checking: Monitors scraping progress with automatic retry logic
- ⚡ Fast & Reliable: Professional scraping with built-in error handling
- 🎯 Customizable Output: Configurable data fields for specific business needs

🎯 What This Workflow Does

Input
- **Location:** Geographic area to search (city, state, country)
- **Keywords:** Business type or industry keywords

Processing
1. Form Submission: User submits location and keywords through the web form
2. API Request: Sends a scraping request to Bright Data's Google Maps dataset
3. Status Monitoring: Continuously checks scraping progress
4. Data Retrieval: Fetches completed business data when ready
5. Data Storage: Saves extracted information to Google Sheets
6. Error Handling: Implements retry logic for failed requests

Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Business Name | Official business name from Google Maps | "Joe's Pizza Restaurant" |
| Phone Number | Contact phone number | "+1-555-123-4567" |
| Address | Complete business address | "123 Main St, New York, NY 10001" |
| Rating | Google Maps rating score | 4.5 |
| URL | Google Maps listing URL | "https://maps.google.com/..." |

🚀 Setup Instructions

Prerequisites
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Maps dataset access
- 5-10 minutes for setup

Step 1: Import the Workflow
- Copy the JSON workflow code from the provided file
- In n8n: Workflows → + Add workflow → Import from JSON
- Paste the JSON and click Import

Step 2: Configure Bright Data
- Set up Bright Data credentials: In n8n: Credentials → + Add credential → HTTP Request Auth. Enter your Bright Data API key and test the connection.
- Configure the dataset: Ensure you have access to the Google Maps dataset (gd_m8ebnr0q2qlklc02fz) and verify dataset permissions in the Bright Data dashboard.

Step 3: Configure Google Sheets Integration
- Create a Google Sheet: Go to Google Sheets, create a new spreadsheet named "Business Data" or similar, and copy the Sheet ID from the URL: https://docs.google.com/spreadsheets/d/SHEET_ID_HERE/edit
- Set up Google Sheets credentials: In n8n: Credentials → + Add credential → Google Sheets OAuth2 API. Complete the OAuth setup and test the connection.
- Prepare your data sheet with columns: Column A: Name, Column B: Address, Column C: Rating, Column D: Phone Number, Column E: URL

Step 4: Update Workflow Settings
- Update the Google Sheets node: Open the "Save to Google Sheets" node, replace the document ID with your Sheet ID, select your Google Sheets credential, and choose the correct sheet/tab name.
- Update the Bright Data nodes: Open the HTTP Request nodes, replace BRIGHT_DATA_API_KEY with your actual API key, and verify the dataset ID matches your subscription.

Step 5: Test & Activate
Test the workflow:
- Activate the workflow (toggle switch)
- Submit a test form with location "New York" and keywords "restaurants"
- Verify data appears in the Google Sheet
- Check for proper phone number extraction

📖 Usage Guide

Submitting Search Requests
- Access the form URL provided by n8n
- Enter the desired location (city, state, or country)
- Enter relevant keywords (business type, industry, etc.)
- Submit the form and wait for processing

Understanding the Results
Your Google Sheet will populate with business data including:
- Complete business contact information
- Verified phone numbers from Google Maps
- Accurate addresses and ratings
- Direct links to Google Maps listings

🔧 Customization Options

Adding More Data Points
Edit the "Bright Data API - Request Business Data" node to capture additional fields:
- Business descriptions
- Operating hours
- Reviews count
- Website URLs
- Photos and videos

Modifying Search Parameters
Customize the search behavior:
- Adjust "limit_per_input" for more or fewer results
- Modify the search type and discovery method
- Add geographical coordinates for precise targeting

🚨 Troubleshooting

Common Issues & Solutions

1. "Bright Data connection failed"
- **Cause:** Invalid API credentials or dataset access
- **Solution:** Verify credentials in the Bright Data dashboard, check dataset permissions

2. "No business data extracted"
- **Cause:** Invalid search parameters or no results found
- **Solution:** Try broader keywords or different locations, verify dataset availability

3. "Google Sheets permission denied"
- **Cause:** Incorrect credentials or sheet permissions
- **Solution:** Re-authenticate Google Sheets, check sheet sharing settings

4. "Workflow execution timeout"
- **Cause:** Large search results or slow API response
- **Solution:** Reduce the search scope, increase timeout settings, check your internet connection

📊 Use Cases & Examples

1. Lead Generation
**Goal:** Find potential customers in specific areas
- Search for businesses by industry and location
- Extract contact information for outreach campaigns
- Build targeted prospect lists

2. Market Research
**Goal:** Analyze the local business landscape
- Study competitor density in target markets
- Identify market gaps and opportunities
- Gather business intelligence for strategic planning

3. Directory Creation
**Goal:** Build comprehensive business directories
- Create industry-specific business listings
- Maintain updated contact databases
- Support local business communities

📈 Performance & Limits

Expected Performance
- **Processing time:** 1-5 minutes per search depending on results
- **Data accuracy:** 95%+ for active Google Maps listings
- **Success rate:** 90%+ for accessible businesses
- **Concurrent requests:** Depends on Bright Data plan limits

Resource Usage
- **Memory:** ~50MB per execution
- **Storage:** Minimal (data stored in Google Sheets)
- **API calls:** 2-3 Bright Data calls + 1 Google Sheets call per search
- **Bandwidth:** ~1-2MB per search request
- **Execution time:** 2-5 minutes for typical searches

Scaling Considerations
- **Rate limiting:** Respect Bright Data API limits
- **Error handling:** Implement retry logic for failed requests
- **Data validation:** Add checks for incomplete business data
- **Cost optimization:** Monitor API usage to control expenses
- **Batch processing:** Group multiple searches for efficiency

🤝 Support & Community

Getting Help
- **n8n Community Forum:** community.n8n.io
- **Documentation:** docs.n8n.io
- **Bright Data Support:** Contact through your dashboard
- **GitHub Issues:** Report bugs and feature requests

Contributing
- Share improvements with the community
- Report issues and suggest enhancements
- Create variations for specific use cases
- Document best practices and lessons learned

🎯 Ready to Use!
This workflow provides a solid foundation for automated Google Maps business data extraction. Customize it to fit your specific needs and use cases.

Your workflow URL: https://your-n8n-instance.com/workflow/google-maps-scraper

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
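For readers who want to understand what the HTTP Request nodes are doing, the sketch below shows the trigger → poll → fetch pattern against Bright Data's dataset API in plain Node.js. The endpoint paths, query parameters and output field names are assumptions based on Bright Data's dataset API documentation - verify them against the workflow's own nodes and your dataset's schema.

```javascript
// Sketch of the trigger → poll → fetch pattern (plain Node.js 18+). Endpoint paths,
// query parameters and response fields are assumptions - check them against the
// workflow's HTTP Request nodes and the Bright Data docs.
const API = 'https://api.brightdata.com/datasets/v3';
const headers = {
  Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
  'Content-Type': 'application/json',
};

async function scrapeBusinesses(location, keyword) {
  // 1. Trigger a collection run for the Google Maps dataset.
  const trigger = await fetch(
    `${API}/trigger?dataset_id=gd_m8ebnr0q2qlklc02fz&type=discover_new&discover_by=location`,
    { method: 'POST', headers, body: JSON.stringify([{ location, keyword }]) }
  ).then((r) => r.json());
  const snapshotId = trigger.snapshot_id;

  // 2. Poll the snapshot status until it is ready (the workflow's retry loop).
  for (;;) {
    const progress = await fetch(`${API}/progress/${snapshotId}`, { headers }).then((r) => r.json());
    if (progress.status === 'ready') break;
    if (progress.status === 'failed') throw new Error('Collection failed');
    await new Promise((r) => setTimeout(r, 30_000)); // wait before checking again
  }

  // 3. Download the finished snapshot as JSON records.
  const records = await fetch(`${API}/snapshot/${snapshotId}?format=json`, { headers }).then((r) => r.json());

  // 4. Keep only the columns the Google Sheet expects (field names are assumptions).
  return records.map((b) => ({
    name: b.name,
    address: b.address,
    rating: b.rating,
    phone: b.phone_number,
    url: b.url,
  }));
}

scrapeBusinesses('New York', 'restaurants').then((rows) => console.log(rows.slice(0, 3)));
```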
by Keith Rumjahn
Who's this for?
- Anyone who wants to improve the SEO of their website
- Umami users who want insights on how to improve their site
- SEO managers who need to generate reports weekly

Case study
Watch the YouTube tutorial here. Get my SEO A.I. agent system here. You can read more about how this works here.

How it works
- This workflow calls the Umami API to get data
- It then sends the data to A.I. for analysis
- It saves the data and the analysis to Baserow

How to use this
- Input your Umami credentials
- Input your website property ID
- Input your OpenRouter.ai credentials
- Input your Baserow credentials
- You will need to create a Baserow database with the columns: Date, Summary, Top Pages, Blog (the name of your blog).

Future development
Use this as a template. There are a lot more Umami stats you can pull from the API. Change the A.I. prompt to give an even more detailed analysis.

Created by Rumjahn
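As a rough illustration of the "calls the Umami API" step, the sketch below pulls weekly stats and top pages in plain Node.js. It assumes a self-hosted Umami instance with bearer-token auth; endpoint paths and the auth scheme differ between Umami versions and Umami Cloud, so check them against your instance and the workflow's HTTP Request nodes.

```javascript
// Minimal sketch of pulling stats and top pages from the Umami API (Node.js 18+).
// Endpoint paths and auth are assumptions for a self-hosted instance - verify them
// against your Umami version's API docs.
const UMAMI_URL = process.env.UMAMI_URL;         // e.g. https://analytics.example.com
const WEBSITE_ID = process.env.UMAMI_WEBSITE_ID; // the website property ID used in the workflow
const headers = { Authorization: `Bearer ${process.env.UMAMI_TOKEN}` };

async function getWeeklyStats() {
  const endAt = Date.now();
  const startAt = endAt - 7 * 24 * 60 * 60 * 1000; // last 7 days, millisecond timestamps

  // Overall stats for the period (pageviews, visitors, etc.).
  const stats = await fetch(
    `${UMAMI_URL}/api/websites/${WEBSITE_ID}/stats?startAt=${startAt}&endAt=${endAt}`,
    { headers }
  ).then((r) => r.json());

  // Top pages: the "url" metric, which feeds the "Top Pages" column in Baserow.
  const topPages = await fetch(
    `${UMAMI_URL}/api/websites/${WEBSITE_ID}/metrics?type=url&startAt=${startAt}&endAt=${endAt}`,
    { headers }
  ).then((r) => r.json());

  // This JSON is what the workflow passes to the A.I. prompt for analysis.
  return { stats, topPages };
}

getWeeklyStats().then((data) => console.log(JSON.stringify(data, null, 2)));
```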
by David Ashby
🛠️ seven Tool MCP Server

Complete MCP server exposing all seven Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every seven Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n seven Tool node with full error handling

📋 Available Operations (2 total)
Every possible seven Tool operation is included:
🔧 Sms (1 operation)
• Send an SMS
🔧 Voice (1 operation)
• Convert text to voice

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native seven Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every seven Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
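To make the "Direct HTTP calls to MCP endpoints" option concrete, the sketch below shows the JSON-RPC 2.0 messages an MCP client exchanges with a server like this one. The transport depends on your n8n version (SSE vs. streamable HTTP), and the tool name and argument names shown are hypothetical - read the real ones from your server's tools/list response - so in practice pointing an MCP-aware client at the webhook URL is usually the easier route.

```javascript
// Illustration of the JSON-RPC messages an MCP client exchanges with this server
// (MCP is JSON-RPC 2.0 under the hood). How the messages are transported depends on
// your n8n version, so this only shows the payloads, not the transport.

// 1. Ask the server which tools it exposes (here: the two seven Tool operations).
const listTools = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/list',
  params: {},
};

// 2. Call one of them. The tool name and argument names are hypothetical - take the
//    real ones from the tools/list response of your own server.
const sendSms = {
  jsonrpc: '2.0',
  id: 2,
  method: 'tools/call',
  params: {
    name: 'send_sms', // hypothetical tool name
    arguments: { to: '+15551234567', message: 'Hello from n8n MCP' },
  },
};

console.log(JSON.stringify(listTools), JSON.stringify(sendSms));
```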
by Sira Ekabut
Facebook access tokens expire quickly, requiring regular updates for continued API access. This workflow simplifies the process of exchanging short-lived tokens for long-lived ones, saving time and reducing manual effort.

What this workflow does
- Exchanges a short-lived Facebook User Access Token for a long-lived token using the Facebook Graph API.
- Optionally retrieves a long-lived Page Access Token associated with the user.
- Outputs both the user and page tokens for further use in automation or integrations.

Setup
Prerequisites:
- A valid Facebook App ID and App Secret.
- A short-lived User Access Token from the Facebook platform.
- (Optional) The App-Scoped User ID for fetching associated page tokens.

Workflow Configuration:
- Replace the placeholder values in the "Set Parameter" node with your Facebook credentials and token.
- Run the workflow manually to generate long-lived tokens.

Documentation Reference:
Follow the official Facebook guide for more details: https://developers.facebook.com/docs/facebook-login/guides/access-tokens/get-long-lived/
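For context, these are the two Graph API calls behind the workflow, sketched in plain Node.js following the Facebook guide linked above. The Graph API version string is a placeholder, and credentials are read from environment variables here instead of the "Set Parameter" node.

```javascript
// Sketch of the two Graph API calls the workflow makes (plain Node.js 18+).
// The Graph API version is a placeholder - use a current one.
const GRAPH = 'https://graph.facebook.com/v19.0';
const { FB_APP_ID, FB_APP_SECRET, FB_SHORT_LIVED_TOKEN, FB_USER_ID } = process.env;

async function getLongLivedTokens() {
  // 1. Exchange the short-lived user token for a long-lived one.
  const params = new URLSearchParams({
    grant_type: 'fb_exchange_token',
    client_id: FB_APP_ID,
    client_secret: FB_APP_SECRET,
    fb_exchange_token: FB_SHORT_LIVED_TOKEN,
  });
  const user = await fetch(`${GRAPH}/oauth/access_token?${params}`).then((r) => r.json());
  console.log('Long-lived user token:', user.access_token, 'expires_in:', user.expires_in);

  // 2. Optional: list the user's Pages, each of which comes with its own Page token.
  if (FB_USER_ID) {
    const pages = await fetch(
      `${GRAPH}/${FB_USER_ID}/accounts?access_token=${user.access_token}`
    ).then((r) => r.json());
    for (const page of pages.data ?? []) {
      console.log('Page:', page.name, 'token:', page.access_token);
    }
  }
}

getLongLivedTokens();
```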
by David Ashby
Complete MCP server exposing all Oura Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Oura Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Oura Tool node with full error handling

📋 Available Operations (4 total)
Every possible Oura Tool operation is included:
🔧 Profile (1 operation)
• Get a profile
🔧 Summary (3 operations)
• Get activity summary
• Get readiness summary
• Get sleep summary

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Oura Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Oura Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Octoleo
Overview
This workflow automates the backup of all workflows from your n8n instance to a Git repository hosted on Gitea. It runs on a scheduled trigger, fetching, encoding, and committing workflow data to ensure seamless version control and disaster recovery.

📌 Quick Setup: Just update three global variables and configure authentication - no manual exports needed!

How It Works (Quick Glance)
1️⃣ Scheduled Execution → Runs automatically at defined intervals.
2️⃣ Fetch Workflows → Uses the API to retrieve all workflows.
3️⃣ Process Workflows → Converts workflow data into a Git-friendly format.
4️⃣ Commit & Push to Git → Saves workflows in a Gitea repository.

Setup Steps (⚡ Takes ~5 min)

1️⃣ Set Global Variables
Go to the Globals section in the workflow and update:
- **repo.url** → https://your-gitea-instance.com (replace with your actual Gitea URL)
- **repo.name** → workflows (the repository name where backups will be stored)
- **repo.owner** → octoleo (the Gitea account that owns the repository)
📌 These three variables define where the workflows are stored.

2️⃣ Configure Gitea Authentication
- Go to your Gitea account → generate a Personal Access Token.
- In the credential manager, create a new Gitea token with:
  - Name: Authorization
  - Value: Bearer YOUR_PERSONAL_ACCESS_TOKEN
📌 Ensure there is a space after Bearer before the token!

3️⃣ Link Credentials to Git Nodes
Attach the Gitea credentials to these three Git nodes:
- GetGitea → Retrieves existing repository data
- PutGitea → Updates workflows
- PostGitea → Adds new workflows

4️⃣ Link Credentials for API Requests
Add API authentication in the node that fetches all workflows.

5️⃣ Test & Activate
- Run the workflow manually to confirm backups work.
- Enable the schedule trigger for automation.
📌 The workflow automatically checks for changes before committing updates.

Why Use This Workflow?
✅ Automated Backups → No manual exports needed.
✅ Version Control → Easily track workflow changes.
✅ Simple Setup → Just configure globals & credentials.
✅ Secure → Uses token-based authentication.

Next Steps
💬 Have questions? Reach out on the forum! 🚀
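As a rough picture of what the workflow automates, the sketch below lists workflows via the n8n public API and writes them through Gitea's contents API in plain Node.js. The exact request shapes (create vs. update, the sha requirement) are assumptions to verify against the n8n and Gitea API docs; in the template itself this is handled by the GetGitea/PutGitea/PostGitea nodes.

```javascript
// Sketch of the backup logic in plain Node.js 18+: list workflows through the n8n
// public API, then create or update one JSON file per workflow in Gitea.
const N8N_URL = 'https://your-n8n-instance.com';
const GITEA = 'https://your-gitea-instance.com/api/v1';
const REPO = 'octoleo/workflows'; // repo.owner/repo.name from the Globals node
const giteaHeaders = {
  Authorization: `Bearer ${process.env.GITEA_TOKEN}`, // note the space after "Bearer"
  'Content-Type': 'application/json',
};

async function backupWorkflows() {
  // 1. Fetch all workflows from the n8n public API.
  const { data: workflows } = await fetch(`${N8N_URL}/api/v1/workflows`, {
    headers: { 'X-N8N-API-KEY': process.env.N8N_API_KEY },
  }).then((r) => r.json());

  for (const wf of workflows) {
    const path = `${encodeURIComponent(wf.name)}.json`;
    const content = Buffer.from(JSON.stringify(wf, null, 2)).toString('base64');

    // 2. Check whether the file already exists (GetGitea) to decide create vs. update.
    const existing = await fetch(`${GITEA}/repos/${REPO}/contents/${path}`, { headers: giteaHeaders });

    if (existing.ok) {
      const { sha } = await existing.json();
      // 3a. Update the existing file (PutGitea) - updates need the current sha.
      await fetch(`${GITEA}/repos/${REPO}/contents/${path}`, {
        method: 'PUT',
        headers: giteaHeaders,
        body: JSON.stringify({ content, sha, message: `Backup ${wf.name}` }),
      });
    } else {
      // 3b. Create a new file (PostGitea).
      await fetch(`${GITEA}/repos/${REPO}/contents/${path}`, {
        method: 'POST',
        headers: giteaHeaders,
        body: JSON.stringify({ content, message: `Backup ${wf.name}` }),
      });
    }
  }
}

backupWorkflows();
```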
by Jimleuk
This n8n template showcases a cool feature of n8n Forms where the form itself can be defined dynamically using the form fields schema. It may be debatable how useful this template actually is, since both Airtable and Baserow already provide form interfaces, but it's still a great exercise and demonstration if the use-case ever comes around.

How it works
- A form trigger is used to dynamically select a database/table from which to build the n8n form.
- The table's schema is imported into the workflow and, using the Code node, converted into the n8n form fields schema (a sketch of this mapping follows below). This lets us dynamically build the fields in our n8n form when we choose to define the form using the JSON option.
- Once the n8n form is submitted, we convert the values back into the table's API schema so that we can create a new row. Note that any files/attachments fields are removed, as they need to be handled separately.
- Files are processed separately because they may first need to be stored. Once complete, the reference is saved into the newly created row.

Check out the example Airtable here - https://airtable.com/appfP15Xd0aVZR9xV/shrGFgXLyQ4Jg58SU

How to use
The n8n form is autogenerated, which means you only need to provide access to the table. Using this approach, this template can be reused for any number of Airtable and/or Baserow tables.

Requirements
- You'll need either an Airtable account or a Baserow account to use this template.
- An n8n instance accessible to your users.

Customising this workflow
- Not using either Airtable or Baserow? Theoretically, any datastore which provides a fields schema can be used with this template.
- If you're feeling creative, split the table into multiple forms for a better user experience.
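Here is a minimal sketch of the schema-to-form-fields conversion described above, assuming an Airtable-style field list and n8n's "define form using JSON" field format. Both the Airtable type names and the form-field property names are assumptions to check against the respective docs and the template's actual Code node.

```javascript
// Sketch of the Code-node step: map an Airtable table schema (as returned by the
// Airtable Meta API) to the JSON the n8n Form node accepts when defined using JSON.
const typeMap = {
  singleLineText: 'text',
  multilineText: 'textarea',
  email: 'email',
  number: 'number',
  date: 'date',
  singleSelect: 'dropdown',
};

function toFormFields(airtableFields) {
  return airtableFields
    // Attachments are skipped here and handled separately later in the workflow.
    .filter((f) => f.type !== 'multipleAttachments')
    .map((f) => {
      const field = {
        fieldLabel: f.name,
        fieldType: typeMap[f.type] ?? 'text', // fall back to a plain text input
      };
      if (f.type === 'singleSelect') {
        field.fieldOptions = {
          values: (f.options?.choices ?? []).map((c) => ({ option: c.name })),
        };
      }
      return field;
    });
}

// Example: a tiny schema like the one Airtable's "get base schema" endpoint returns.
const schema = [
  { name: 'Title', type: 'singleLineText' },
  { name: 'Status', type: 'singleSelect', options: { choices: [{ name: 'Open' }, { name: 'Done' }] } },
  { name: 'Photos', type: 'multipleAttachments' },
];
console.log(JSON.stringify(toFormFields(schema), null, 2));
```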
by Agent Studio
Restore backed-up workflows from GitHub to your n8n workspace.

This workflow was inspired by this one that lets you back up your n8n workflows to GitHub. It lets you restore your backed-up workflows to your workspace without creating duplicates. If something ever goes wrong with your instance, it will save you a lot of time restoring them.

How it works
- It retrieves the workflows saved in a GitHub repository.
- It then compares these saved workflows with the ones in your n8n workspace, based on the name.
- It will only create them if they don't already exist.

Set up steps
- Open the "Global" node and set your own information (see Configuration below)
- Click on "Test workflow"
- It will run through all the workflows in the GitHub repository, check whether the name already exists in your workspace and, if not, create it.

Configuration
- repo.owner: your GitHub owner name
- repo.name: your GitHub repository name
- repo.path: the path within the GitHub repository
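A plain Node.js sketch of the compare-by-name logic is shown below: list the backed-up JSON files from GitHub, list the workspace's workflows via the n8n public API, and create only the ones whose names are missing. The fields sent to the create-workflow call are an assumption to confirm against your instance's API documentation.

```javascript
// Sketch of the restore logic (Node.js 18+). Values in REPO mirror the "Global" node.
const { GH_TOKEN, N8N_API_KEY } = process.env;
const REPO = { owner: 'your-owner', name: 'your-repo', path: 'workflows' };
const N8N_URL = 'https://your-n8n-instance.com';

async function restoreMissingWorkflows() {
  // 1. Workflows currently in the workspace, keyed by name.
  const { data: existing } = await fetch(`${N8N_URL}/api/v1/workflows`, {
    headers: { 'X-N8N-API-KEY': N8N_API_KEY },
  }).then((r) => r.json());
  const existingNames = new Set(existing.map((wf) => wf.name));

  // 2. Backed-up workflow files in the GitHub repository.
  const files = await fetch(
    `https://api.github.com/repos/${REPO.owner}/${REPO.name}/contents/${REPO.path}`,
    { headers: { Authorization: `Bearer ${GH_TOKEN}` } }
  ).then((r) => r.json());

  for (const file of files.filter((f) => f.name.endsWith('.json'))) {
    const backup = await fetch(file.download_url).then((r) => r.json());
    if (existingNames.has(backup.name)) continue; // already there - avoid duplicates

    // 3. Create the missing workflow in the workspace (field list is an assumption).
    await fetch(`${N8N_URL}/api/v1/workflows`, {
      method: 'POST',
      headers: { 'X-N8N-API-KEY': N8N_API_KEY, 'Content-Type': 'application/json' },
      body: JSON.stringify({
        name: backup.name,
        nodes: backup.nodes,
        connections: backup.connections,
        settings: backup.settings ?? {},
      }),
    });
    console.log('Restored:', backup.name);
  }
}

restoreMissingWorkflows();
```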
by Davide
This low-code automation lets any eCommerce store visitor upload a photo of themselves and virtually "try on" a garment in just a few clicks. With this workflow, WooCommerce, PrestaShop, Shopify and other merchants can offer a cutting-edge "virtual try-on" feature with minimal development effort, enhancing customer engagement and reducing product returns.

Key Advantages
- **Zero-Coding, Visual Setup:** Build end-to-end e-commerce features with drag-and-drop nodes instead of custom backend code.
- **Asynchronous, Scalable Processing:** A non-blocking "Wait" + "If" loop handles multi-second AI jobs gracefully, freeing up the workflow for other tasks.
- **Dynamic Inputs & URLs:** Query strings (e.g. ?Product=IMAGE_URL) let you embed the form on any product page and pass the garment image on the fly.
- **Seamless User Experience:** An instant pop-up within your storefront and an automatic redirect to the generated mock-up keep shoppers engaged without page reloads.
- **Easy Credential Management:** API keys, FTP credentials, and webhook IDs are all stored securely in n8n's credential manager.

How It Works
1. Form Submission: A user submits a form with their name, an image of themselves ("Me"), and a hidden product image URL ("Product"). The form is triggered via the On form submission node, which collects the input data.
2. Image Upload: The uploaded image ("Me") is sent to an FTP server for temporary storage using the FTP node. The filename includes a timestamp to ensure uniqueness.
3. Virtual Try-on Request: The Create Image node sends a POST request to the Fal.run API, providing the uploaded human image URL (from FTP) and the product image URL (from the hidden form field). This generates a virtual try-on result.
4. Result Processing: The workflow checks the status of the image generation (Get status node) in a loop (with a 10-second wait between checks) until it is marked as "COMPLETED". Once ready, the final image URL is fetched (Get Url image node) and shown to the user via a redirect (Form node).
5. User Experience: The user is redirected to the generated try-on image, completing the process.

Set Up Steps
1. API Key Setup: Create an account and obtain an API key. Configure the Create Image node with HTTP Header Authentication: Name: Authorization, Value: Key YOURAPIKEY
2. FTP/S3 Configuration: Set up an FTP server or S3 bucket to temporarily store uploaded user images. Configure the FTP node with your FTP credentials and storage path.
3. Ecommerce Integration: On your WooCommerce site, add a "Try On" button that opens the form in a pop-up and dynamically passes the product image URL as a query parameter. Example: https://URL_N8N/form/ca1c314d-46c6-4eeb-b6a5-359XXXXXX?Product=IMAGE_URL
4. Testing: Verify the workflow by submitting a test form and ensuring the virtual try-on image is generated and displayed correctly.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
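For the curious, the submit → poll → fetch loop looks roughly like the sketch below when written against fal's queue API in plain Node.js. The model ID and input field names are placeholders - take the real values from the Create Image node or the model's documentation on fal.ai - and the result shape varies per model.

```javascript
// Sketch mirroring the Create Image / Wait / Get status / Get Url image nodes
// (Node.js 18+). Model ID and input field names are placeholders.
const FAL_KEY = process.env.FAL_KEY;
const MODEL = 'fal-ai/your-tryon-model'; // placeholder model ID
const headers = { Authorization: `Key ${FAL_KEY}`, 'Content-Type': 'application/json' };

async function tryOn(humanImageUrl, garmentImageUrl) {
  // 1. Submit the job to the queue (the "Create Image" node).
  const submitted = await fetch(`https://queue.fal.run/${MODEL}`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ human_image_url: humanImageUrl, garment_image_url: garmentImageUrl }),
  }).then((r) => r.json());

  // 2. Poll the status every 10 seconds until COMPLETED (the Wait + If loop).
  for (;;) {
    const status = await fetch(
      `https://queue.fal.run/${MODEL}/requests/${submitted.request_id}/status`,
      { headers }
    ).then((r) => r.json());
    if (status.status === 'COMPLETED') break;
    await new Promise((r) => setTimeout(r, 10_000));
  }

  // 3. Fetch the finished result and return the generated image URL
  //    (the "Get Url image" node, whose URL the Form node redirects to).
  const result = await fetch(
    `https://queue.fal.run/${MODEL}/requests/${submitted.request_id}`,
    { headers }
  ).then((r) => r.json());
  return result.images?.[0]?.url ?? result.image?.url; // exact result shape varies per model
}

tryOn('https://example.com/me.jpg', 'https://example.com/garment.jpg').then(console.log);
```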
by David Ashby
Complete MCP server exposing all Marketstack Tool operations to AI agents. Zero configuration needed - all 3 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Marketstack Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n Marketstack Tool node with full error handling

📋 Available Operations (3 total)
Every possible Marketstack Tool operation is included:
🔧 Endofdaydata (1 operation)
• Get many EoD data
🔧 Exchange (1 operation)
• Get an exchange
🔧 Ticker (1 operation)
• Get a ticker

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native Marketstack Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every Marketstack Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by David Ashby
🛠️ MailerLite Tool MCP Server

Complete MCP server exposing all MailerLite Tool operations to AI agents. Zero configuration needed - all 4 operations pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every MailerLite Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Uses the official n8n MailerLite Tool node with full error handling

📋 Available Operations (4 total)
Every possible MailerLite Tool operation is included:
🔧 Subscriber (4 operations)
• Create a subscriber
• Get a subscriber
• Get many subscribers
• Update a subscriber

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response Format: Native MailerLite Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Complete Coverage: Every MailerLite Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.