by gotoHuman
Let AI classify your incoming emails and draft replies. Use gotoHuman to approve emails before they are sent out. It also lets you manually edit the draft or even ask for a retry.

## How it works
- The workflow is triggered for each new email, which gets passed to an AI classification agent. It assigns the email to one of the categories defined in the prompt (Customer Support, Sales opportunity, Promo, Personal, ...). The agent also determines whether a reply is needed and whether the email is important.
- If a reply is needed, we ask the AI Email Writer to draft a response. Even if it is missing context, it can help us draft an outline for the response.
- The email draft is sent to gotoHuman, where the reviewer can manually edit it or ask to regenerate it, with the option to even edit the prompt (retries loop back to the AI Email Writer node).
- Approved email replies are automatically sent from the workflow.

## How to set up
- Most importantly, install the gotoHuman node before importing this template! (Just add the node to a blank canvas before importing.)
- Set up your credentials for gotoHuman, OpenAI, and Gmail.
- In gotoHuman, select and create the pre-built review template "Email agent" or import the ID: v81wzxwYoFYvWpmuIBgX
- Select this template in the gotoHuman node.

## Requirements
You need accounts for:
- gotoHuman (human supervision)
- OpenAI (classification, drafting)
- Gmail

## How to customize
- Change the predefined categories in the prompt of the AI classification agent (a sketch of handling the agent's output follows this listing).
- Provide the AI Email Writer with more context to create replies. Consider adding tools that allow the agent to fetch more info about clients, your calendar, FAQs for your product, ...
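As an illustration only, here is a minimal n8n Code-node sketch of how the classification agent's structured output could be normalized before branching to the AI Email Writer. The field names (`category`, `replyNeeded`, `important`) are assumptions for this example, not the template's actual schema.

```javascript
// Hypothetical shape of the classification agent's JSON output.
const ALLOWED_CATEGORIES = ['Customer Support', 'Sales opportunity', 'Promo', 'Personal'];

return items.map((item) => {
  const result = item.json; // e.g. { category, replyNeeded, important }

  // Fall back to a catch-all category if the model returned something unexpected.
  const category = ALLOWED_CATEGORIES.includes(result.category)
    ? result.category
    : 'Other';

  return {
    json: {
      category,
      replyNeeded: Boolean(result.replyNeeded),
      important: Boolean(result.important),
    },
  };
});
```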
by Dahiana
# Send personalized pet care tips from Google Sheets with AI

Automate weekly pet wellness emails with AI-generated, location- and age-specific advice.

## Who's it for
Pet care businesses, veterinary clinics, pet subscription services, and animal shelters sending regular wellness content to pet owners.

## How it works
- Loads pet data from Google Sheets
- Filters pets that haven't received an email in 7+ days
- Calculates age from birthdate (formatted as "2 years and 3 months"; a sketch of this calculation follows this listing)
- AI generates a tip - GPT-4o-mini creates climate-aware, veterinary-aligned advice based on pet type, age, and location
- Sends the email via Gmail or SendGrid
- Updates the timestamp in the sheet to prevent duplicates
- Logs activity to a tracking sheet

## Requirements
- APIs: Google Sheets, Airtable, Typeform or similar
- OpenAI (GPT-4o-mini)
- Gmail OAuth2 OR SendGrid (you can also use Brevo, Mailchimp, or any other provider)

## Google Sheet structure

Sheet 1: Pets

| Email | Owner_Name | Pet_Name | Pet_Type | Date_of_Birth | Country (ISO) | Status | Last_Email_Sent |
|-------|------------|----------|----------|---------------|---------------|--------|-----------------|

Sheet 2: Email_Log

| Timestamp | Parent_Email | Pet_Name | Tip_Category | Status |
|-----------|--------------|----------|--------------|--------|

## How to set up
1. Create the Google Sheet with the structure above and add 2-3 test pets.
2. Import the workflow and add credentials.
3. Update nodes:
   - "Load Pet Info": set your Sheet ID
   - "Update Last_Email_Sent Date": set Sheet ID
   - "Log to Email_Log Sheet": set Sheet ID
4. Test manually with 1 active pet.
5. Enable the schedule (default: Mondays 9am).

## How to customize
Switch email provider:
- Enable the "Send via SendGrid" node
- Disable the "Send Health Tip using Gmail" node
- Update the template ID

Modify the AI prompt:
- Edit the "Generate Personalized Tip" node
- Adjust the temperature
- Add/remove categories

## Use cases beyond pets
The same workflow works for:
- **Plant care** (growth stage tips)
- **Baby milestones** (age-based parenting advice)
- **Fitness coaching** (experience-level workouts)
- **Language learning** (study streak motivation)

Just update the sheet columns and the AI prompt.

## Notes
- Choose only one mailing service.
- Country codes use ISO format (US, UK, AU, CA, etc.).
- The AI considers location for seasonal advice.
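For reference, a minimal Code-node sketch of how the age string ("2 years and 3 months") could be derived from the sheet's Date_of_Birth column; the exact logic inside the template's node may differ, and the column is assumed to hold an ISO date.

```javascript
// Compute a human-readable age such as "2 years and 3 months"
// from the Date_of_Birth column (assumed ISO YYYY-MM-DD).
return items.map((item) => {
  const birth = new Date(item.json.Date_of_Birth);
  const now = new Date();

  let years = now.getFullYear() - birth.getFullYear();
  let months = now.getMonth() - birth.getMonth();
  if (now.getDate() < birth.getDate()) months -= 1; // not a full month yet
  if (months < 0) {
    years -= 1;
    months += 12;
  }

  const parts = [];
  if (years > 0) parts.push(`${years} year${years === 1 ? '' : 's'}`);
  if (months > 0) parts.push(`${months} month${months === 1 ? '' : 's'}`);

  return { json: { ...item.json, age: parts.join(' and ') || 'less than a month' } };
});
```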
by Yusuke Yamamoto
This n8n template creates an automated alert system that checks NASA's data for near-Earth asteroids twice a day. When it finds asteroids meeting specific criteria, it sends a summary alert to Slack and creates individual events in Google Calendar for each object.

## Use cases
- **Automated Monitoring**: Keep track of potentially hazardous asteroids without manually checking websites.
- **Team or Community Alerts**: Automatically inform a team, a group of friends, or a community about significant celestial events via Slack.
- **Personalized Space Calendar**: Populate your Google Calendar with upcoming asteroid close approaches, creating a personal "what's up in space" agenda.
- **Educational Tool**: Use this as a foundation to learn about API data fetching, data processing, and multi-channel notifications in n8n.

## Good to know
- This workflow runs on a schedule (every 12 hours by default) and does not require a manual trigger.
- **A NASA API key is highly recommended**. The default DEMO_KEY has strict rate limits. Get a free key from api.nasa.gov.
- The filtering logic for what constitutes an "alert-worthy" asteroid (distance and size) is fully customizable within the "Filter and Process Asteroids" Code node.

## How it works
1. A Schedule Trigger starts the workflow every 12 hours.
2. The "Calculate Date Range" Code node generates the start and end dates for the API query (today to 14 days from now; a sketch follows this listing).
3. The NASA node uses these dates to query the Near Earth Object Web Service (NeoWs) API, retrieving a list of all asteroids that will pass by Earth in that period.
4. The "Filter and Process Asteroids" Code node iterates through the list. It filters out objects that are too small or too far away, based on thresholds defined in the code. It then formats and sorts the remaining "interesting" asteroids by their closest approach distance.
5. An If node checks if any asteroids were found after filtering. If true (asteroids were found), the flow continues to the alert steps. If false, the workflow ends quietly via a NoOp node.
6. The "Format Alert Messages" Code node compiles a single, well-formatted summary message for Slack and prepares the data for other notifications.
7. The workflow then splits into two parallel branches:
   - Slack Alert: The Slack node sends the summary message to a specified channel.
   - Calendar Events: The Split Out node separates the data so that each asteroid is processed individually. For each asteroid, the Google Calendar node creates an all-day event on its close-approach date.

## How to use
1. Configure the NASA node: Open the "Get an asteroid neo feed" (NASA) node. Create new credentials and replace the default DEMO_KEY with your own NASA API key.
2. Customize filtering (optional): Open the "Filter and Process Asteroids" Code node. Adjust the MAX_DISTANCE_KM and MIN_DIAMETER_METERS variables to make the alerts more or less sensitive.

   ```javascript
   // Example: For closer, larger objects
   const MAX_DISTANCE_KM = 7500000;   // 7.5 million km (approx. 19.5 lunar distances)
   const MIN_DIAMETER_METERS = 100;   // 100 meters
   ```

3. Configure Slack alerts: Open the "Send Slack Alert" node. Add your Slack OAuth2 credentials. Select the channel where you want to receive alerts (e.g., #asteroid-watch).
4. Configure Google Calendar events: Open the "Create an event" (Google Calendar) node. Add your Google Calendar OAuth2 credentials. Select the calendar where events should be created.
5. Activate the workflow.

## Requirements
- A free NASA API key.
- **Slack credentials** (OAuth2) and a workspace to post alerts.
- **Google Calendar credentials** (OAuth2) to create events.
## Customising this workflow
- **Add More Notification Channels**: Add nodes for Discord, Telegram, or email to send alerts to other platforms.
- **Create a Dashboard**: Instead of just sending alerts, use the processed data to populate a database (like Baserow or Postgres) to power a simple dashboard.
- **Different Data Source**: Modify the HTTP Request node to pull data from other space-related APIs, like a feed of upcoming rocket launches.
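For orientation, a minimal sketch of what the "Calculate Date Range" Code node might compute, assuming the NeoWs feed takes start and end dates in YYYY-MM-DD format (the output field names here are illustrative):

```javascript
// Build the 14-day window described in "How it works" step 2.
const start = new Date();
const end = new Date(start);
end.setDate(end.getDate() + 14);

// NeoWs expects dates as YYYY-MM-DD.
const fmt = (d) => d.toISOString().slice(0, 10);

return [{ json: { startDate: fmt(start), endDate: fmt(end) } }];
```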
by AureusR
# Live Demo Booking Form with Outlook Calendar and Zoom link

## Who's it for
This workflow is designed for SaaS companies, consultants, or sales teams that regularly run live demos. It helps automate demo scheduling, ensuring clients can only book from available time slots while instantly generating Zoom links and calendar invitations.

## How it works / What it does
1. Client fills demo request form → Collects company, contact details, and a preferred date.
2. Check Outlook calendar availability → Searches for pre-created "Online Meeting Slot" events.
3. Time slot selection → If the date has slots, the client chooses from up to 3 nearest available times (a sketch follows this listing). If not, they're asked to pick another date.
4. Create Zoom meeting → Once a date & time are confirmed, a Zoom link is automatically generated.
5. Update Outlook calendar → The chosen slot is updated with the client's details and Zoom link, marked as "Booked Live Demo" so it can't be double-booked.
6. Send confirmation → The client receives a styled confirmation screen, and both parties get the calendar invite.

## How to set up
1. Import the workflow JSON into your n8n instance.
2. Configure the following credentials:
   - Microsoft Outlook OAuth2 API (for calendar access)
   - Zoom OAuth2 API (for automatic meeting creation)
3. Pre-create "Online Meeting Slot" events in your Outlook calendar to define available demo times.
4. Publish the form via n8n's webhook URL (embed it in your website or share the link).
5. Test by submitting a request to ensure slots update correctly and Zoom links are created.

## Requirements
- n8n self-hosted or cloud account
- Microsoft Outlook account with calendar access
- Zoom account with OAuth2 credentials
- Pre-created calendar slots named "Online Meeting Slot"

## How to customize the workflow
- **Form fields**: Adjust the client details form to capture additional data (e.g., industry, product interest).
- **Email/notification**: Add an Email or Slack node to notify your sales team of new demo bookings.
- **Custom branding**: Update the CSS in the form nodes to match your company's style.
- **Capacity rules**: Modify the IF nodes to limit the number of bookings per day or adjust the slot-checking logic.
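A minimal sketch of how the "up to 3 nearest available times" selection could be done in a Code node. The event shape (`subject`, `start.dateTime`) follows the Microsoft Graph calendar format, and `preferredDate` is an assumed form field name, not necessarily what this template uses.

```javascript
// Pick up to 3 available "Online Meeting Slot" events closest to the
// client's preferred date.
const preferred = new Date($json.preferredDate); // assumed form field

const slots = items
  .filter((item) => item.json.subject === 'Online Meeting Slot')
  .sort((a, b) => {
    const da = Math.abs(new Date(a.json.start.dateTime) - preferred);
    const db = Math.abs(new Date(b.json.start.dateTime) - preferred);
    return da - db;
  })
  .slice(0, 3);

return slots;
```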
by Oneclick AI Squad
This automated workflow monitors your website's keyword rankings daily and sends instant alerts to your team when significant ranking drops occur. It fetches current ranking positions, compares them with historical data, and triggers notifications through Slack and email when keywords drop beyond your defined threshold.

## Good to know
- The workflow uses SERP API for accurate ranking data; API costs apply based on your usage volume.
- Ranking checks are performed daily to avoid overwhelming search engines with requests.
- The system tracks ranking changes over time and maintains historical data for trend analysis.
- Slack integration requires workspace permissions and proper bot configuration.
- False positives may occur due to personalized search results or data center variations.

## How it works
1. **Daily SEO Check Trigger** initiates the workflow on a scheduled basis.
2. **Get Keywords Database** retrieves your keyword list and current ranking data.
3. **Filter Active Keywords Only** processes only keywords marked as active for monitoring.
4. **Fetch Google Rankings via SERP API** gets current ranking positions for each keyword.
5. **Wait For Response** pauses until the SERP API returns the current ranking positions.
6. **Parse Rankings & Detect Changes** compares new rankings with historical data and identifies significant drops (a sketch follows this listing).
7. **Filter Significant Ranking Drops** isolates keywords that dropped beyond your threshold (e.g., 5+ positions).
8. **Send Slack Ranking Alert** notifies your team channel about ranking drops.
9. **Send Email Ranking Alert** sends detailed email reports to stakeholders.
10. **Update Rankings in Google Sheet** saves new ranking data for historical tracking.
11. **Generate SEO Monitoring Summary** creates a comprehensive report of all ranking changes.

## How to use
1. Import the workflow into n8n and configure your SERP API credentials.
2. Set up your Google Sheet with the required keyword database structure.
3. Configure the Slack webhook URL and email SMTP settings.
4. Define your ranking drop threshold (recommended: 5+ position drops).
5. Test the workflow with a small keyword set before full deployment.
6. Schedule the workflow to run daily during off-peak hours.

## Requirements
- **SERP API account** with sufficient credits for daily keyword checks
- **Google Sheets access** for keyword database and ranking storage
- **Slack workspace** with webhook permissions for team notifications
- **Email service** (SMTP or API) for stakeholder alerts
- **Keywords database** properly formatted in Google Sheets

## Database/Sheet Columns
Required Google Sheet: "Keywords Database". Create a Google Sheet with the following columns:

| Column Name | Description | Example |
|-------------|-------------|---------|
| keyword | Target keyword to monitor | "best seo tools" |
| domain | Your website domain | "yourwebsite.com" |
| current_rank | Latest ranking position | 5 |
| previous_rank | Previous day's ranking | 3 |
| status | Monitoring status | "active" |
| target_url | Expected ranking URL | "/best-seo-tools-guide" |
| search_volume | Monthly search volume | 1200 |
| difficulty | Keyword difficulty score | 65 |
| date_added | When keyword was added | "2025-01-15" |
| last_checked | Last monitoring date | "2025-07-30" |
| drop_threshold | Custom drop alert threshold | 5 |
| category | Keyword grouping | "Product Pages" |

## Customising this workflow
- **Modify ranking thresholds** in the "Filter Significant Ranking Drops" node to adjust sensitivity (e.g., 3+ positions vs 10+ positions).
- **Add competitor monitoring** by duplicating the SERP API node and tracking competitor rankings for the same keywords.
- **Customize alert messages** in the Slack and email nodes to include your brand voice and specific stakeholder information.
- **Extend to multiple search engines** by adding Bing or Yahoo ranking checks alongside Google.
- **Implement ranking improvement alerts** to celebrate when keywords move up significantly.
- **Add mobile vs desktop tracking** by configuring separate SERP API calls for different device types.
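A minimal sketch of the drop-detection logic, using the column names from the sheet structure above. This is an illustration of the comparison, not the template's exact code.

```javascript
// Compare current vs. previous rank and flag drops beyond the
// per-keyword threshold (a higher rank number means a worse position).
return items.map((item) => {
  const row = item.json;
  const threshold = Number(row.drop_threshold) || 5;
  const drop = Number(row.current_rank) - Number(row.previous_rank);

  return {
    json: {
      ...row,
      rank_change: drop,
      significant_drop: drop >= threshold,
    },
  };
});
```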
by 寳田 武
This workflow automates the entire process of running a Print-on-Demand (POD) business by combining market trend analysis with autonomous AI design and quality control. It acts as a virtual product team that researches, designs, vets, and publishes new products to your store every week.

## Who is it for?
This template is ideal for e-commerce entrepreneurs, content creators, and print-on-demand store owners who want to scale their merchandise inventory without spending hours on design and market research.

## What it does
1. Market Research: Fetches real-time search data from Google Trends and customer preference data from Typeform.
2. AI Design: Uses OpenAI (GPT-4o) to brainstorm t-shirt concepts based on the gathered trends, then generates high-quality vector-style images using Replicate (Flux/Stable Diffusion).
3. Quality Control: A "Vision AI" agent analyzes the generated image, rates it on a scale of 1-10, and filters out any design scoring below 7.
4. Dynamic Pricing & Publishing: Automatically calculates a premium price for higher-rated designs and publishes the product directly to your Printify store (a pricing sketch follows this listing).
5. Logging: Saves the product details to Airtable for your records.

## How to set up
1. Configure credentials: Open the "Workflow Configuration" node. Replace the placeholder values with your API keys for OpenAI, Replicate, Printify, and Typeform.
2. Set Printify details: In the "Workflow Configuration" node, add your Shop ID. In the "Publish to Printify" node, update the blueprint_id (the specific t-shirt model, e.g., Bella+Canvas 3001) and print_provider_id.
3. Airtable setup: Create a table with columns for Title, Description, Price, Quality Score, and Image URL, then map the IDs in the Airtable node.

## Requirements
- n8n: Cloud or self-hosted instance.
- API keys: OpenAI (with GPT-4o access), Replicate, Printify, Typeform, and Airtable.
- Printify account: A connected store (e.g., Shopify, Etsy, or Pop-up).

## How to customize
- Prompt engineering: Modify the "Chief Designer AI" system prompt to change the artistic style (e.g., from "vector" to "pixel art" or "vintage").
- Pricing logic: Adjust the JavaScript in the "Dynamic Pricing Calculator" to change your base margins or markup rules.
- Schedule: Change the "Weekly Schedule Trigger" to run daily or monthly depending on your volume needs.
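To give a feel for the kind of logic the "Dynamic Pricing Calculator" contains, here is an illustrative sketch. The base price, premium rate, and the `qualityScore` field name are placeholders, not the template's actual values.

```javascript
// Illustrative pricing: start from a base price and add a premium
// for designs the Vision AI rated above the 7-point cutoff.
const BASE_PRICE_USD = 19.99;     // placeholder base price
const PREMIUM_PER_POINT = 1.5;    // placeholder dollars per quality point above 7

return items.map((item) => {
  const score = Number(item.json.qualityScore); // assumed field name
  const premium = score > 7 ? (score - 7) * PREMIUM_PER_POINT : 0;
  const price = Math.round((BASE_PRICE_USD + premium) * 100) / 100;

  return { json: { ...item.json, price } };
});
```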
by Rahul Joshi
## Description
Turn raw marketing data into actionable insights with this n8n Source/UTM Attribution and Reporting workflow! It automatically aggregates lead submissions, calculates Cost Per Lead (CPL) per channel, and generates AI-powered weekly attribution reports—delivered straight to your inbox in a professional HTML format.

## What This Template Does
- 📅 Runs hourly to process new lead submissions
- 📊 Aggregates leads by source (Instagram, LinkedIn, Google Ads, etc.)
- 💰 Calculates key metrics like Cost Per Lead (CPL)
- 🧠 Uses AI to generate executive-ready HTML reports
- 📈 Highlights top-performing sources and growth opportunities
- 📧 Sends polished reports via Gmail automatically

## Prerequisites
- Google Sheets with lead submission data
- Google Forms (or similar) as the data input source
- n8n instance (self-hosted or cloud)
- Azure OpenAI (GPT-4o-mini) API key for AI-powered reporting
- Gmail API credentials for automated report delivery

## Step-by-Step Setup
1. Trigger the workflow hourly with the n8n Scheduler.
2. Fetch new lead submissions from Google Sheets.
3. Aggregate and group data by Source/UTM parameters.
4. Calculate CPL using spend + lead count per channel (a sketch follows this listing).
5. Standardize column names for consistent reporting.
6. Send raw + aggregated data to Azure OpenAI for report generation.
7. Format into a professional HTML report (with insights & recommendations).
8. Send the report via the Gmail node to stakeholders.

## Customization Ideas
- Replace Gmail with Slack/Teams notifications for real-time sharing.
- Add visual charts (Google Data Studio / Looker) for more analytics.
- Use additional UTM fields (campaign, adgroup, creative) for deeper granularity.
- Extend reporting to include ROI and ROAS calculations.

## Key Benefits
- ✅ Hands-free attribution tracking and analysis
- ✅ Accurate CPL metrics per channel
- ✅ AI-generated reports with actionable insights
- ✅ Saves time vs. manual data crunching
- ✅ Weekly reports ensure your marketing strategy stays optimized

## Perfect For
- Marketing teams managing multi-channel campaigns
- Agencies providing client attribution reports
- Business owners optimizing ad spend efficiency
- Growth teams tracking lead quality by source
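A minimal sketch of the per-channel CPL aggregation from setup steps 3-4. The `utm_source` and `spend` column names are assumptions for illustration; map them to your sheet's actual columns.

```javascript
// Group leads by source and compute Cost Per Lead (CPL).
const bySource = {};

for (const item of items) {
  const source = item.json.utm_source || 'unknown'; // assumed column name
  if (!bySource[source]) bySource[source] = { leads: 0, spend: 0 };
  bySource[source].leads += 1;
  bySource[source].spend += Number(item.json.spend) || 0; // assumed column name
}

return Object.entries(bySource).map(([source, { leads, spend }]) => ({
  json: { source, leads, spend, cpl: leads ? +(spend / leads).toFixed(2) : null },
}));
```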
by Ishita Virmani
This n8n template keeps track of multiple school websites for admission updates and sends an email notification.

## Good To Know
This template uses free-tier tools throughout: Gemini as the LLM and plain email for alerting.

## How it works
For each school website provided:
1. Get & clean the content through the HTTP Request node (a cleaning sketch follows this listing).
2. The Gemini model takes the HTML content and a defined prompt that instructs it on how to identify whether Pre-Nursery admissions for the year 2026-2027 have been announced yet.
3. If the LLM response confirms the announcement, an email is triggered to the configured address.

## Features
- Scheduled daily checks
- HTTP scraping
- Google Gemini text extraction for Pre-Nursery admission announcements
- Email alerts

## How to use
1. Import the workflow.
2. Provide an already created or new Gemini API key within the "Are admissions open" node.
3. Set up SMTP account credentials within the "Send Email" node, along with From-Email and To-Email.
4. Finally, update your list of schools and their admission URLs within the "Shortlisted Schools" node.

## Customizing the workflow
- It can be used for tracking school admissions for any class, including Pre-Nursery, Bal Vatika, 1st, etc., by modifying the prompt.
- It can be used for tracking any school that has details uploaded on its website that can be extracted via the HTTP Request node.
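A rough sketch of the "clean the content" step before the HTML is handed to Gemini. This regex approach is only illustrative (a real parser or n8n's HTML node is more robust), and the `data` field name is an assumption about where the HTTP Request node puts the page body.

```javascript
// Strip tags and collapse whitespace so the LLM receives plain text
// instead of raw HTML.
return items.map((item) => {
  const html = item.json.data || ''; // assumed field holding the page body
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
  return { json: { ...item.json, pageText: text } };
});
```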
by Florent
# Restore workflows & credentials from FTP - Remote Backup Solution

This n8n template provides a safe and intelligent restore solution for self-hosted n8n instances, allowing you to restore workflows and credentials from FTP remote backups. Perfect for disaster recovery or migrating between environments, this workflow automatically identifies your most recent FTP backup and provides a manual restore capability that intelligently excludes the current workflow to prevent conflicts. Works seamlessly with date-organized backup folders stored on any FTP/SFTP server.

## Good to know
- This workflow uses n8n's native import commands (n8n import:workflow and n8n import:credentials).
- Works with date-formatted backup folders (YYYY-MM-DD) stored on FTP servers.
- The restore process intelligently excludes the current workflow to prevent overwriting itself.
- Requires FTP/SFTP server access and proper Docker volume configuration.
- All downloaded files are temporarily stored server-side before import.
- Compatible with backups created by n8n's export commands and uploaded to FTP.
- Supports selective restoration: restore only credentials, only workflows, or both.

## How it works

### Restore Process (Manual)
1. Manual trigger with configurable pinned data options (credentials: true/false, worflows: true/false).
2. The Init node sets up all necessary paths, timestamps, and configuration variables using your environment settings.
3. The workflow connects to your FTP server and scans for available backup dates.
4. Automatically identifies the most recent backup folder (latest YYYY-MM-DD date).
5. Creates temporary restore folders on your local server for downloaded files.

If restoring credentials:
- Lists all credential files from the FTP backup folder
- Downloads credential files to a temporary local folder
- Writes files to disk using the "Read/Write Files from Disk" node
- Direct import using n8n's import command
- Credentials are imported with their encrypted format intact

If restoring workflows:
- Lists all workflow JSON files from the FTP backup folder
- Downloads workflow files to a temporary local folder
- Filters out the credentials subfolder to prevent importing it as a workflow
- Writes workflow files to disk
- Intelligently excludes the current restore workflow to prevent conflicts
- Imports all other workflows using n8n's import command

Optional email notifications provide detailed restore summaries with command outputs. Temporary files remain on the server for verification (manual cleanup recommended).

## How to use

### Prerequisites
- Existing n8n backups on the FTP server in a date-organized folder structure (format: /ftp-backup-folder/YYYY-MM-DD/)
- Workflow backups as JSON files in the date folder
- Credentials backups in subfolder: /ftp-backup-folder/YYYY-MM-DD/n8n-credentials/
- FTP/SFTP access credentials configured in n8n
- For new environments: N8N_ENCRYPTION_KEY from the source environment (see dedicated section below)

### Initial Setup
1. Configure your environment variables:
   - N8N_ADMIN_EMAIL: Your email for notifications (optional)
   - FTP_BACKUP_FOLDER: FTP path where backups are stored (e.g., /n8n-backups)
   - N8N_PROJECTS_DIR: Projects root directory (e.g., /files/n8n-projects-data)
   - GENERIC_TIMEZONE: Your local timezone (e.g., Europe/Paris)
   - N8N_ENCRYPTION_KEY: Required if restoring credentials to a new environment (see dedicated section below)
2. Create your FTP credential in n8n:
   - Add a new FTP/SFTP credential
   - Configure host, port, username, and password/key
   - Test the connection
3. Update the Init node:
   - (Optional) Configure your email here:

     ```javascript
     const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';
     ```

   - Set PROJECT_FOLDER_NAME to "Workflow-backups" (or your preferred name)
   - Set FTP_BACKUP_FOLDER to match your FTP backup path (default: /n8n-backups)
   - Set credentials to "n8n-credentials" (or your backup credentials folder name)
   - Set FTPName to a descriptive name for your FTP server (used in notifications)
4. Configure FTP credentials in nodes:
   - Update the FTP credential in the "List Credentials Folders" node
   - Verify all FTP nodes use the same credential
   - Test the connection by executing the "List Credentials Folders" node
5. Optional: Configure SMTP for email notifications:
   - Add an SMTP credential in n8n
   - Activate the "SUCCESS email Credentials" and "SUCCESS email Workflows" nodes
   - Or remove the email nodes if not needed

### Performing a Restore
1. Open the workflow and locate the "Start Restore" manual trigger node.
2. Edit the pinned data to choose what to restore:

   ```json
   {
     "credentials": true,
     "worflows": true
   }
   ```

   - credentials: true - Restore credentials from FTP
   - worflows: true - Restore workflows from FTP (note: typo preserved from original)
   - Set both to true to restore everything
3. Update the node's notes to reflect your choice (for documentation).
4. Click "Execute workflow" on the "Start Restore" node. The workflow will:
   - Connect to FTP and find the most recent backup
   - Download selected files to temporary local folders
   - Import credentials and/or workflows
   - Send a success email with detailed operation logs
5. Check the console logs or email for a detailed restore summary.

### Important Notes
- The workflow automatically excludes itself during restore to prevent conflicts.
- Credentials are restored with their encryption intact. If restoring to a new environment, you must configure the N8N_ENCRYPTION_KEY from the source environment (see dedicated section below).
- Existing workflows/credentials with the same names will be overwritten.
- Temporary folders are created with a date prefix (e.g., 2025-01-15-restore-credentials).
- Test in a non-production environment first if unsure.

## Critical: N8N_ENCRYPTION_KEY Configuration

Why this is critical: n8n generates an encryption key automatically on first launch and saves it in the ~/.n8n/config file. However, if this file is lost (for example, due to missing Docker volume persistence), n8n will generate a NEW key, making all previously encrypted credentials inaccessible.
When you need to configure N8N_ENCRYPTION_KEY:
- Restoring to a new n8n instance
- When your data directory is not persisted between container recreations
- Migrating from one server to another
- As a best practice to ensure key persistence across updates

How credentials encryption works:
- Credentials are encrypted with a specific key unique to each n8n instance.
- This key is auto-generated on first launch and stored in /home/node/.n8n/config.
- When you back up credentials, they remain encrypted but the key is NOT included.
- If the key file is lost or a new key is generated, restored credentials cannot be decrypted.
- Setting N8N_ENCRYPTION_KEY explicitly ensures the key remains consistent.

### Solution: Retrieve and configure the encryption key

Step 1: Get the key from your source environment

```bash
# Check if the key is defined in environment variables
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY
```

If this command returns nothing, the key is auto-generated and stored in n8n's data volume:

```bash
# Enter the container
docker-compose exec n8n sh
# Check the configuration file
cat /home/node/.n8n/config
# Exit the container
exit
```

Step 2: Configure the key in your target environment

Option A: Using a .env file (recommended for security)

```bash
# Add to your .env file
N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Then reference it in docker-compose.yml:

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
```

Option B: Directly in docker-compose.yml (less secure)

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Step 3: Restart n8n

```bash
docker-compose restart n8n
```

Step 4: Now restore your credentials

Only after configuring the encryption key, run the restore workflow with credentials: true.

Best practice for future backups:
- Always save your N8N_ENCRYPTION_KEY in a secure location alongside your backups.
- Consider storing it in a password manager or secure vault.
- Document it in your disaster recovery procedures.

## Requirements

FTP Server
- FTP or SFTP server with existing n8n backups
- Read access to the backup folder structure
- Network connectivity from the n8n instance to the FTP server

Existing Backups on FTP
- Date-organized backup folders (YYYY-MM-DD format)
- Backup files created by n8n's export commands or a compatible format
- Credentials in the subfolder structure: YYYY-MM-DD/n8n-credentials/

Environment
- Self-hosted n8n instance (Docker recommended)
- Docker volumes mounted with write access to the project folder
- Access to n8n CLI commands (n8n import:credentials and n8n import:workflow)
- Proper file system permissions for temporary folder creation

Credentials
- FTP/SFTP credential configured in n8n
- Optional: SMTP credentials for email notifications

## Technical Notes

FTP Connection and Download Process
- Uses n8n's built-in FTP node for all remote operations
- Supports both FTP and SFTP protocols
- Downloads files as binary data before writing to disk
- Temporary local storage required for the import process

Smart Workflow Exclusion
- During workflow restore, the current workflow's name is cleaned and matched against backup files (a sketch of such name cleaning follows this listing).
- This prevents the restore workflow from overwriting itself.
- The exclusion logic handles special characters and spaces in workflow names.
- A bash command removes the current workflow from the temporary restore folder before import.

Credentials Subfolder Filtering
- The "Filter out Credentials sub-folder" node checks for binary data presence.
- Only items with binary data (actual files) proceed to the disk write.
- This prevents the credentials subfolder from being imported as a workflow.

Timezone Handling
- All timestamps use UTC for technical operations.
- Display times use the local timezone for user-friendly readability.
- FTP backup folder scanning works with the YYYY-MM-DD format regardless of timezone.

Security
- FTP connections should use SFTP or FTPS for encrypted transmission.
- Credentials are imported in n8n's encrypted format (encryption preserved).
- Temporary files are stored in project-specific folders.
- Consider access controls for who can trigger restore operations.
- No sensitive credential data is logged in console output.

## Troubleshooting

Common Issues
- FTP connection fails: Verify FTP credentials are correctly configured and the server is accessible.
- No backups found: Ensure the FTP_BACKUP_FOLDER path is correct and contains date-formatted folders (YYYY-MM-DD).
- Permission errors: Ensure the Docker user has write access to N8N_PROJECTS_DIR for temporary folders.
- Path not found: Verify all volume mounts in docker-compose.yml match your project folder location.
- Import fails: Check that backup files are in valid n8n export format.
- Download errors: Verify the FTP path structure matches the expected format (date folder / credentials subfolder / files).
- Workflow conflicts: The workflow automatically excludes itself, but ensure backup files are properly named.
- Credentials not restored: Verify the FTP backup contains an n8n-credentials subfolder with credential files.
- Credentials decrypt error: Ensure N8N_ENCRYPTION_KEY matches the source environment.

Error Handling
- The "Find Last Backup" node has an error output configured to catch FTP listing issues.
- The "Download Workflow Files" node continues on error to handle the presence of the credentials subfolder.
- All critical nodes log detailed error information to the console.
- Email notifications include stdout and stderr from the import commands.

## Version Compatibility
- Tested with n8n version 1.113.3
- Compatible with Docker-based n8n installations
- Requires n8n CLI access (available in official Docker images)
- Works with any FTP/SFTP server (Synology NAS, dedicated FTP servers, cloud FTP services)

This workflow is designed for FTP/SFTP remote backup restoration. For local disk backups, see the companion workflow "n8n Restore from Disk". Works best with backups from: "Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP".
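As a rough illustration of the "Smart Workflow Exclusion" idea, a Code-node sketch of how the current workflow's name could be normalized and matched against backup filenames. The exact cleaning rules and the `fileName` field are assumptions; the template's actual logic (part of it in bash) may differ.

```javascript
// Normalize the current workflow's name the way it might appear in an
// exported filename, so it can be excluded from the restore set.
const currentName = $workflow.name; // e.g. "Restore workflows & credentials from FTP"

const cleaned = currentName
  .toLowerCase()
  .replace(/[^a-z0-9]+/g, '-') // collapse spaces/special chars to dashes
  .replace(/^-+|-+$/g, '');

// Keep only backup files that do not correspond to this workflow.
return items.filter(
  (item) => !String(item.json.fileName || '').toLowerCase().includes(cleaned)
);
```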
by WeblineIndia
# WooCommerce Product Reviews: Sentiment Analysis with Slack Summary

This workflow automatically fetches product reviews from your WooCommerce store, analyzes the sentiment of each review using AI, stores the results in Airtable, and sends a summary of positive, neutral, and negative reviews to Slack. It helps teams quickly understand overall customer feedback, track product sentiment, and stay updated without manually reading all reviews.

## Quick Start – Implementation Steps
1. Import the workflow JSON file into n8n.
2. Configure credentials:
   - WooCommerce HTTP Basic Auth (for fetching reviews)
   - OpenAI API (for sentiment analysis)
   - Airtable Personal Access Token (for storing reviews)
   - Slack API (for sending summary messages)
3. Adjust the Cron/Schedule Trigger node to your preferred interval (e.g., every 10 minutes).
4. Test the workflow with a few reviews to ensure the AI and Slack integrations work correctly.
5. Activate the workflow after confirming functionality.

## What It Does
This workflow automates sentiment analysis and team notification:
- **Schedule Trigger** – Runs the workflow automatically at defined intervals.
- **Set WooCommerce Domain** – Defines the WooCommerce store to fetch reviews from.
- **Fetch Reviews** – Retrieves all recent product reviews using WooCommerce API credentials.
- **Loop Over Items** – Processes reviews in smaller batches for efficiency.
- **Message a Model** – Sends each review to OpenAI to detect sentiment (positive, neutral, negative) and generate a short summary.
- **Merge & Code Nodes** – Combines original review data with AI results and ensures proper data alignment.
- **If Node** – Checks sentiment for further processing.
- **Create a Record (Airtable)** – Stores each review and its sentiment in Airtable.
- **Code Summary Node** – Counts positive, neutral, and negative reviews to create a summary (a counting sketch appears at the end of this listing).
- **Send a Message (Slack)** – Posts the sentiment summary to the team's Slack channel for visibility.

## Who's It For
This workflow is ideal for:
- E-commerce managers tracking WooCommerce product feedback
- Customer support teams monitoring review sentiment
- Marketing and product teams seeking automated insights
- Teams using Airtable and Slack for data tracking and collaboration

## Requirements to Use This Workflow
- An n8n instance (cloud or self-hosted)
- **WooCommerce store** with API access
- **OpenAI API key** for sentiment analysis
- **Airtable account** with base/table configured
- **Slack workspace** with API access for messaging
- Basic familiarity with APIs and workflow automation

## How It Works
1. Schedule Trigger – Executes the workflow at the defined interval.
2. Set WooCommerce Domain – Configures which store to fetch reviews from.
3. Fetch Reviews – Retrieves all recent product reviews from WooCommerce.
4. Loop Over Items – Splits reviews into manageable batches.
5. Message a Model – Sends reviews to the AI for sentiment analysis and a short summary.
6. Merge & Code Nodes – Combines AI results with original review data and prepares it for storage and summary.
7. If Node – Checks review sentiment for saving or further processing.
8. Create a Record (Airtable) – Saves each review with its sentiment and AI summary.
9. Code Summary Node – Counts the number of positive, neutral, and negative reviews.
10. Send a Message (Slack) – Sends a concise summary of review sentiment to the Slack channel.

## Setup Steps
1. Import the workflow JSON into n8n.
2. Add credentials:
   - WooCommerce HTTP Basic Auth
   - OpenAI API
   - Airtable Personal Access Token
   - Slack API
3. Configure the WooCommerce domain in the Set WooCommerce Domain node.
4. Test the workflow with sample reviews to ensure the AI outputs correctly.
5. Adjust the Schedule Trigger interval as needed.
6. Activate the workflow after confirming that data flows correctly from WooCommerce → AI → Airtable → Slack.

## How To Customize Nodes
- Schedule Trigger: Adjust the interval (minutes, hours, days) as needed.
- Set WooCommerce Domain: Replace with your store domain URL.
- Fetch Reviews: Update the endpoint or filters if needed. Ensure WooCommerce credentials are correct.
- Message a Model: Change the AI model or prompts to adjust the sentiment analysis or summary style.
- Create a Record (Airtable): Map additional fields if needed. Ensure the table has the necessary columns for sentiment, summary, rating, and product info.
- Send a Message (Slack): Customize the Slack message format. Change the channel ID to send summaries to the right team.

## Optional Enhancements
- Include historical review trends.
- Automatically trigger notifications only for negative reviews.
- Send summaries to email or other messaging apps.
- Visualize sentiment trends in Airtable or external dashboards.

## Use Case Examples
- Automated Sentiment Tracking – Understand customer feedback without manual reading.
- Team Alerts – Notify product and support teams about negative reviews quickly.
- Data Storage & Reporting – Keep historical sentiment in Airtable for trend analysis.
- Efficient Batch Processing – Process a large number of reviews without overloading the system.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|--------------------------|--------------------------------------------------|--------------------------------------------------------------|
| Reviews not fetched | Wrong WooCommerce credentials or endpoint | Check WooCommerce HTTP Basic Auth and store domain |
| AI analysis fails | OpenAI API key invalid or prompt error | Verify OpenAI credentials and prompt syntax |
| Slack message missing | Incorrect Slack channel or API token | Check Slack credentials and channel ID |
| Airtable not storing reviews | Table or field mismatch | Verify Airtable base, table, and column mapping |

## Need Help?
If you need assistance setting up the workflow, customizing AI sentiment analysis, or integrating Slack summaries, feel free to contact our n8n development team at WeblineIndia. We provide workflow automation, AI integration, and reporting solutions for WooCommerce stores.
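To round off this listing, a minimal sketch of the Code Summary node's counting logic, assuming each incoming item carries a `sentiment` field of "positive", "neutral", or "negative" (an assumption about the AI output's field name):

```javascript
// Tally sentiments for the Slack summary.
const counts = { positive: 0, neutral: 0, negative: 0 };

for (const item of items) {
  const s = String(item.json.sentiment || '').toLowerCase();
  if (s in counts) counts[s] += 1;
}

const total = items.length;
const summary =
  `Review sentiment (${total} reviews): ` +
  `${counts.positive} positive, ${counts.neutral} neutral, ${counts.negative} negative`;

return [{ json: { ...counts, total, summary } }];
```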
by Sk developer
This workflow fetches free Udemy courses hourly via the Udemy Coupons and Courses API on RapidAPI, filters them, and updates a Google Sheet. It sends alerts on errors for smooth monitoring.

## Node-by-Node Explanation
- Schedule Trigger: Runs the workflow every hour automatically.
- Fetch Udemy Coupons: Sends a POST request to the Udemy Coupons and Courses API on RapidAPI to get featured courses.
- Check API Success: Verifies whether the API response is successful and routes accordingly.
- Filter Free Courses: Selects only courses with a sale_price of zero (free courses); a sketch follows this listing.
- Send Error Notification: Emails the admin if the API fetch fails, for quick action.
- Sync Courses to Google Sheet: Appends or updates the filtered free courses in Google Sheets.

## Google Sheets Columns
id, name, price, sale_price, image, lectures, views, rating, language, category, subcategory, slug, store, sale_start

## Google Sheets Setup & Configuration Steps
1. Create Google Sheet: Create or open a Google Sheet where you want to sync courses.
2. Set headers: Add column headers matching the synced fields (id, name, price, etc.).
3. Enable Google Sheets API: Go to Google Cloud Console and enable the Google Sheets API for your project.
4. Create Service Account: In Google Cloud Console, create a Service Account with editor access.
5. Download credentials: Download the JSON credentials file from the service account.
6. Share sheet: Share your Google Sheet with the Service Account email (found in the JSON file).
7. Configure the n8n Google Sheets node: Use the service account credentials, set the operation to "Append or Update", and provide the Sheet URL and sheet name or gid.
8. Match columns: Map the course fields to your sheet columns and set id as the unique key for updates.

## How to Obtain a RapidAPI Key & Set Up the API Request
1. Sign up/Login: Visit the RapidAPI Udemy Coupons and Courses API page and create an account or log in.
2. Subscribe to the API: Subscribe to the Udemy Coupons and Courses API plan (free or paid).
3. Get the API key: Navigate to your dashboard and copy your x-rapidapi-key.
4. Configure the HTTP Request node in your workflow:
   - Set the method to POST.
   - URL: https://udemy-coupons-and-courses.p.rapidapi.com/featured.php
   - Add headers:
     - x-rapidapi-host: udemy-coupons-and-courses.p.rapidapi.com
     - x-rapidapi-key: your copied API key
   - Set the content type to multipart/form-data.
   - Add the body parameter: page=1 (or as needed).
5. Test the API: Run the node to ensure the API responds with data successfully before continuing workflow setup.

## Use Cases & Benefits
- Automates hourly updates of free Udemy courses in your sheet using the Udemy Coupons and Courses API on RapidAPI.
- Saves manual effort in tracking coupons and deals.
- Enables quick error alerts to maintain data accuracy.
- Ideal for course aggregators, affiliate marketers, or learning platforms needing fresh course data.

## Who This Workflow Is For
- Content curators and edtech platforms tracking free courses.
- Affiliate marketers promoting Udemy deals.
- Anyone needing real-time access to updated free Udemy coupons.
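A minimal sketch of the "Filter Free Courses" step, using the sale_price field from the column list above. The string coercion is a defensive assumption, since APIs often return prices as strings.

```javascript
// Keep only genuinely free courses (sale_price of 0).
const free = items.filter((item) => Number(item.json.sale_price) === 0);
return free;
```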
by Msaid Mohamed el hadi
## Overview
This workflow automates the discovery, extraction, enrichment, and storage of business information from Google Maps search queries using AI tools, scrapers, and Google Sheets.

It is ideal for:
- Lead generation agencies
- Local business researchers
- Digital marketing firms
- Automation & outreach specialists

## 🔧 Tools & APIs Used
- **Google Maps Search (via HTTP)**
- **Custom JavaScript Parsing**
- **URL Filtering & De-duplication**
- **Google Sheets (Read/Write)**
- **APIFY Actor** for business scraping
- **LangChain AI Agent** (OpenRouter - Gemini 2.5)
- **n8n Built-in Logic** (Loops, Conditions, Aggregators)

## 🧠 Workflow Summary
1. Trigger: The automation starts via a schedule (every hour).
2. Read queries from Google Sheet: Loads unprocessed keywords from a Google Sheet tab named keywords.
3. Loop through keywords: Each keyword is used to search Google Maps for relevant businesses.
4. Extract URLs: JavaScript parses the HTML to find all external website URLs in the search results.
5. Clean URLs: Filters out irrelevant domains (e.g., Google-owned, example.com, etc.) and removes duplicates (a sketch follows this listing).
6. Loop through URLs. For each URL:
   - Checks if it already exists in the Google Sheet (to prevent duplication).
   - Calls the APIFY Actor to extract full business data.
   - Optionally uses the AI Agent (Gemini) to provide detailed insight on the business, including: services, about, market position, weaknesses, AI suggestions, etc.
   - Converts the AI result (text) to a structured JSON object.
7. Save to Google Sheet: Adds all extracted and AI-enriched business information to a separate tab (Sheet1).
8. Mark queries as processed: Updates the original row in keywords to avoid reprocessing.

## 🗃️ Output Fields Saved
The following information is saved per business:
- Business Name, Website, Email, Phone
- Address, City, Postal Code, Country, Coordinates
- Category, Subcategory, Services
- About Us, Opening Hours, Social Media Links
- Legal Links (Privacy, Terms)
- Logo, Languages, Keywords
- **AI-Generated Description**
- Google Maps URL

## 📈 Use Cases
- Build a prospect database for B2B cold outreach.
- Extract local SEO insights per business.
- Feed CRMs or analytics systems with enriched business profiles.
- Automate market research for regional opportunity detection.

## 📩 Want a Similar Workflow?
If you'd like a custom AI-powered automation like this for your business or agency, feel free to contact me: 📧 msaidwolfltd@gmail.com
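A minimal sketch of the "Clean URLs" step. The blocklist and the `url` field name are illustrative assumptions, not the template's actual filter list.

```javascript
// Drop known-irrelevant domains and deduplicate by hostname.
const BLOCKED = ['google.com', 'gstatic.com', 'googleusercontent.com', 'example.com'];

const seen = new Set();
const cleaned = [];

for (const item of items) {
  let host;
  try {
    host = new URL(item.json.url).hostname.replace(/^www\./, '');
  } catch (e) {
    continue; // skip malformed URLs
  }
  if (BLOCKED.some((d) => host === d || host.endsWith('.' + d))) continue;
  if (seen.has(host)) continue; // already collected this domain
  seen.add(host);
  cleaned.push({ json: { url: item.json.url, domain: host } });
}

return cleaned;
```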