by Sk developer
Automate Text To Video Generation with Google Veo3 API and Google Drive Integration

Create CGI ads effortlessly by integrating the Google Veo3 API for video generation and uploading to Google Drive, with seamless email notifications.

Node-by-Node Explanation:
- **On form submission**: Triggers the workflow when a form is submitted with a prompt for the video.
- **Wait for API Response**: Waits 35 seconds for the API to respond.
- **API Request: Check Task Status**: Sends an HTTP request to check the task status for success or failure.
- **Condition: Task Output Status**: Checks the task's output status and routes to the appropriate action (success, processing, or failure); see the branching sketch at the end of this listing.
- **Wait for Task to Complete**: Waits another 30 seconds before rechecking the task's completion status.
- **Send Email: API Error - Task ID Missing**: Sends an email if the task ID is missing from the API response.
- **Upload File to Google Drive**: Uploads the generated video to Google Drive.
- **Set Google Drive Permissions**: Configures the permissions for the uploaded video on Google Drive.
- **Send an email: Video Link**: Sends a final email with the link to the completed video on Google Drive.
- **Download Video**: Downloads the video from the generated URL.

How to Obtain a RapidAPI Key:
1. Visit Google Veo3 API on RapidAPI.
2. Sign up or log in to your account.
3. Subscribe to a Google Veo3 API plan.
4. Copy the API key provided in your RapidAPI dashboard.

How to Configure Google Drive:
1. Go to Google Cloud Console.
2. Enable the Google Drive API.
3. Create OAuth 2.0 credentials and download the credentials file.
4. In your workflow, authenticate using these credentials to upload and manage files on Google Drive.

Use Case: This workflow is ideal for businesses looking to automate CGI video creation for advertisements using the Google Veo3 API, with seamless file management and sharing via Google Drive.

Benefits:
- **Automation**: Completely automates the CGI video creation and sharing process.
- **Error Handling**: Sends error notifications for task failures or missing task IDs.
- **File Management**: Automatically uploads and manages videos on Google Drive.
- **Easy Sharing**: Generates shareable video links and delivers them via email.

Who Is This For?
- Digital marketers looking to create ads at scale.
- Creative agencies producing CGI content.
- Developers integrating API workflows for video generation.

Link: Google Veo3 API on RapidAPI
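To make the status handling concrete, here is a minimal sketch of the routing logic as a single n8n Code node. The response field names (taskId, status, video_url) are assumptions about the RapidAPI payload, not the API's documented schema; verify them against a real response before use.

```javascript
// Hypothetical sketch of the "Condition: Task Output Status" branching.
// Field names (taskId, status, video_url) are assumptions: check them
// against an actual Google Veo3 RapidAPI response before relying on this.
const resp = $input.first().json;

if (!resp.taskId) {
  // Mirrors the "Send Email: API Error - Task ID Missing" branch
  return [{ json: { branch: 'error', message: 'Task ID missing from API response' } }];
}

switch (resp.status) {
  case 'success':
    return [{ json: { branch: 'download', videoUrl: resp.video_url } }];
  case 'processing':
    return [{ json: { branch: 'wait', taskId: resp.taskId } }]; // loops back after the 30 s wait
  default:
    return [{ json: { branch: 'error', taskId: resp.taskId, status: resp.status } }];
}
```

In the actual template this routing is done with IF/Switch nodes; the code form is shown only to make the three outcomes explicit.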
by Oneclick AI Squad
Live Airport Delay Dashboard with FlightStats & Team Alerts

Description: Automates live monitoring of airport delays using the FlightStats API. Stores and displays delay data, with Slack alerts to operations/sales teams for severe delays.

Essential Information:
- Runs on a scheduled trigger (e.g., hourly or daily).
- Fetches real-time delay data from the FlightStats API.
- Stores data in Google Sheets and alerts teams via Slack for severe delays.

System Architecture:
- **Delay Monitoring Pipeline**:
  - Set Schedule: Triggers the workflow hourly or daily via Cron.
  - FlightStats API: Retrieves live airport delay data.
- **Data Management Flow**:
  - Set Output Data: Prepares data for storage or display.
  - Merge API Data: Combines and processes delay data.
- **Alert and Display**:
  - Send Response via Slack: Alerts ops/sales teams for severe delays.
  - No Action for Minor Delays: Skips minor delays (see the severity sketch at the end of this listing).

Implementation Guide:
1. Import the workflow JSON into n8n.
2. Configure the Cron node for the desired schedule (e.g., every 1 hr).
3. Set up FlightStats API credentials and endpoint (e.g., https://api.flightstats.com).
4. Configure Google Sheets or Notion for data storage/display.
5. Test with a sample API call and verify Slack alerts.
6. Adjust delay severity thresholds as needed.

Technical Dependencies:
- Cron service for scheduling.
- FlightStats API for real-time delay data.
- Google Sheets API or Notion API for data storage/display.
- Slack API for team notifications.
- n8n for workflow automation.

Database & Sheet Structure:
- **Delay Tracking Sheet** (e.g., AirportDelays):
  - Columns: airport_code, delay_status, delay_minutes, timestamp, alert_sent
  - Example: JFK, Severe, 120, 2025-07-29T20:28:00Z, Yes

Customization Possibilities:
- Adjust the Cron schedule for different frequencies (e.g., every 30 min).
- Modify FlightStats API parameters to track specific airports.
- Customize Slack alert messages in the Send Response via Slack node.
- Integrate with a dashboard tool (e.g., Google Data Studio) for a live display.
- Add email alerts as an additional notification channel.
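As a reference for tuning thresholds, here is an illustrative severity check written as an n8n Code node. The 60- and 120-minute cutoffs are example values for demonstration, not values from the template; adjust them to your operations policy.

```javascript
// Illustrative delay-severity classification feeding the Slack branch.
// Thresholds (60 and 120 minutes) are example values, not template defaults.
const { airport_code, delay_minutes } = $input.first().json;

let delay_status = 'Minor';
if (delay_minutes >= 120) delay_status = 'Severe';
else if (delay_minutes >= 60) delay_status = 'Moderate';

// Only items flagged as severe continue to "Send Response via Slack"
return [{
  json: {
    airport_code,
    delay_minutes,
    delay_status,
    alert: delay_status === 'Severe',
  },
}];
```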
by Sk developer
Automated SEO Website Traffic Checker with Google Sheets Logging

Description: This workflow uses the Website Traffic Checker Semrush API to analyze website traffic and performance. It collects data through a user-submitted website URL and stores the results in Google Sheets for easy access and reporting. Ideal for SEO analysis and data tracking.

Node-by-Node Explanation:
1. **On form submission** - Captures the website URL submitted by the user through a form and triggers the workflow.
2. **Check webTraffic** - Sends a request to the Website Traffic Checker Semrush API to fetch real-time traffic statistics for the submitted URL.
3. **Re format output** - Extracts, cleans, and structures the raw traffic data from the API response for easy readability and reporting (see the sketch at the end of this listing).
4. **Google Sheets** - Appends the formatted traffic data to a Google Sheet for long-term tracking and analysis.

Benefits of This Flow:
- **Real-Time Data Collection**: Collects real-time website traffic data directly from the Website Traffic Checker Semrush API, ensuring up-to-date information is always available.
- **Automation**: Automatically processes and formats the website traffic data into an easily accessible Google Sheet, saving time and effort.
- **Customizable**: The workflow can be adapted to track multiple websites, and the data can be filtered and expanded as needed.
- **SEO Insights**: Get in-depth metrics like bounce rate, pages per visit, and visits per user, essential for SEO optimization.

Use Cases:
- **SEO Monitoring**: Track and analyze the traffic of competitor websites or your own website for SEO improvements. Ideal for digital marketers, SEO professionals, and website owners.
- **Automated Reporting**: Automatically generate traffic reports for multiple websites and save them in a Google Sheet for easy reference, with no manual updates or complex calculations.
- **Data-Driven Decisions**: Use data from the Website Traffic Checker Semrush API to make informed decisions that improve website performance and user experience.
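For orientation, here is a sketch of what the "Re format output" Code node might do. The response property names (url, visits, bounce_rate, pages_per_visit) are guesses at the Semrush traffic payload and must be matched to a real API response.

```javascript
// Hypothetical "Re format output" step: flatten the API response into the
// row shape the Google Sheets node appends. Property names are assumptions.
const data = $input.first().json;

return [{
  json: {
    website: data.url,
    visits: data.visits,
    bounce_rate: data.bounce_rate,
    pages_per_visit: data.pages_per_visit,
    checked_at: new Date().toISOString().slice(0, 10), // YYYY-MM-DD
  },
}];
```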
by SpaGreen Creative
Automated Web Form Data Collection and Storage to Google Sheets

Overview: This n8n workflow collects data from a web form and automatically stores it in a Google Sheet. It includes data cleanup, date stamping, optional batching, and throttling for smooth handling of single or bulk submissions.

What It Does:
- Accepts data submitted from a frontend form via HTTP POST
- Cleans and structures the incoming JSON data
- Adds the current date automatically
- Appends structured data to a predefined Google Sheet
- Supports optional batch processing and a wait/delay mechanism to control data flow

Features:
- Webhook trigger for external form submissions
- JavaScript-based data cleaning and formatting
- Looping and delay nodes to manage bulk submissions
- Direct integration with Google Sheets via OAuth2
- Fully mapped columns that match the sheet structure
- Custom date field (submitted_date) auto-generated per entry

Who's It For? This workflow is perfect for:
- Developers or marketers collecting lead data via online forms
- Small businesses tracking submissions from landing pages or contact forms
- Event organizers managing RSVP or booking forms
- Anyone needing to collect and store structured data in Google Sheets automatically

Prerequisites: Make sure the following are ready before use:
- An n8n instance (self-hosted or cloud)
- A Google account with edit access to the target Google Sheet
- **Google Sheets OAuth2 API credentials** configured in n8n
- A web form or app capable of sending POST requests with the following fields: business_name, location, whatsapp, email, name

Google Sheet Format: Ensure your Google Sheet contains the following exact column names (case-sensitive):

| Business Name | Location | WhatsApp Number | Email | Name | Date |
|---------------|------------|------------------|----------------------|----------------|------------|
| SpaGreen | Bangladesh | 8801322827753 | spagreen@gmail.com | Abdul Mannan | 2025-09-14 |
| Dev Code Journey | Bangladesh | 8801322827753 | admin@gmail.com | Shakil Ahammed | 2025-09-14 |

> Note: The "Email" column includes a trailing space — this must match exactly in both the sheet and the column mapping settings.

Setup Instructions:
1. **Configure Webhook**: Use the Webhook node with path /93a81ced-e52c-4d31-96d2-c91a20bd7453. It accepts POST requests from a frontend form or application.
2. **Clean Incoming Data**: The JavaScript (Code) node extracts the submitted fields and adds a submitted_date in YYYY-MM-DD format (see the sketch after these instructions).
3. **Loop Over Items (Optional for Batches)**: The Split In Batches node handles bulk form submissions. Single entries work without adjustment.
4. **Append to Google Sheet**: The Google Sheets node appends each submission as a new row. Mapped fields: Business Name, Location, WhatsApp Number, Email, Name, Date (auto-filled).
5. **Add Delay (Optional)**: The Wait node adds a 5-second delay per loop to throttle requests when handling large batches.

How to Use It:
1. Clone or import the workflow into your n8n instance.
2. Update the Webhook URL in your frontend form's POST action.
3. Connect your Google Sheets account in the Google Sheets node.
4. Confirm that your target sheet matches the required column structure.
5. Start sending data from your form — new entries will appear in your sheet automatically.

> This setup ensures form submissions are received, cleaned, stored efficiently, and processed in a controlled manner.
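A minimal sketch of the "Clean Incoming Data" Code node, assuming the five documented POST fields arrive in the webhook body:

```javascript
// Sketch of the cleaning step. Recent n8n versions nest the POST payload
// under `body`; the fallback covers older webhook output shapes.
const body = $input.first().json.body || $input.first().json;

return [{
  json: {
    business_name: (body.business_name || '').trim(),
    location: (body.location || '').trim(),
    whatsapp: (body.whatsapp || '').trim(),
    email: (body.email || '').trim(),
    name: (body.name || '').trim(),
    submitted_date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD
  },
}];
```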
Notes:
- Use the Sticky Notes in the workflow to understand each node's purpose.
- You can modify the delay duration or disable looping for single submissions.
- For added security, consider securing your webhook with headers or tokens.

Ideal Use Cases:
- Contact forms
- Lead capture pages
- Event signups or bookings
- Newsletter or email list opt-ins
- Surveys or feedback forms

Support:
- WhatsApp Support: Chat Now
- Discord: Join SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: spagreen
- Developer Portfolio: Codecanyon SpaGreen
by Oneclick AI Squad
This automated n8n workflow delivers daily multi-currency exchange rate updates via API to email and WhatsApp. The system fetches the latest exchange rates, formats the data, and sends alerts to designated recipients to keep users informed of currency fluctuations.

What is a Multi-Currency Exchange Update? Multi-currency exchange updates involve retrieving the latest exchange rates for multiple currencies against a base currency (INR) via an API, formatting the data, and distributing it through email and WhatsApp for real-time financial awareness.

Good to Know:
- Exchange rate accuracy depends on the reliability of the external API source.
- API rate limits should be respected to ensure system stability.
- Manual configuration of API keys and recipient lists is required.
- Real-time alerts help users stay updated on currency movements.

How It Works:
- **Daily Trigger** - Triggers the workflow daily at 7:30 AM IST to fetch and send exchange rates.
- **Set Config: API Key & Currencies** - Defines the API key and target currencies (INR, CAD, AUD, CNY, EUR, USD) for the API call.
- **Fetch Exchange Rates (CurrencyFreaks)** - Calls the exchange rate API and fetches the latest rates with INR as the base.
- **Wait for API Response** - Adds a short delay (5 s) so API rate limits are respected and system stability is maintained.
- **Set Email & WhatsApp Recipients** - Sets the list of email addresses and WhatsApp numbers that should receive the currency update.
- **Create Message Subject & Body** - Dynamically generates a subject line (e.g., "Today's Currency Exchange Rates [(Date)]") and a body containing all rates (see the sketch at the end of this listing).
- **Send Email Alert** - Sends the formatted currency rate update via email.
- **Send WhatsApp Alert** - Sends the formatted currency rate update via WhatsApp.

How to Use:
1. Import the workflow into n8n.
2. Configure the API key for the CurrencyFreaks API.
3. Set the list of target currencies and ensure INR is the base currency.
4. Configure email credentials for sending alerts.
5. Configure WhatsApp credentials or API integration for sending messages.
6. Test the workflow with sample data to verify rate fetching and alert delivery.
7. Adjust the trigger time or recipient list as needed.

Requirements:
- CurrencyFreaks API credentials
- Email service credentials (Gmail, SMTP, etc.)
- WhatsApp API or integration credentials

Customizing This Workflow:
- Modify the target currencies in the Set Config node to include additional currencies.
- Adjust the delay in the Wait node based on API rate limits.
- Customize the email and WhatsApp message formats in the Create Message node to suit user preferences.
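For illustration, the "Create Message Subject & Body" step could look like the Code node below. It assumes a CurrencyFreaks-style response shape { date, base, rates }; confirm against your actual API response before use.

```javascript
// Illustrative message builder. Assumes a CurrencyFreaks-style payload:
// { date: "2025-01-15", base: "INR", rates: { USD: "0.0117", ... } }
const { date, base, rates } = $input.first().json;
const targets = ['USD', 'EUR', 'CAD', 'AUD', 'CNY'];

const lines = targets.map((c) => `1 ${base} = ${rates[c]} ${c}`);

return [{
  json: {
    subject: `Today's Currency Exchange Rates [${date}]`,
    body: `Base currency: ${base}\n\n${lines.join('\n')}`,
  },
}];
```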
by Sk developer
Competitor Analysis & SEO Data Logging Workflow Using Competitor Analysis Semrush API

Description: This workflow automates SEO competitor analysis using the Competitor Analysis Semrush API and logs the data into Google Sheets for structured reporting. It captures the domain overview, organic competitors, organic pages, and keyword-level insights from the API, then appends them to separate sheets for easy tracking.

Node-by-Node Explanation:
- **On form submission** – Captures the website URL entered by the user.
- **Competitor Analysis** – Sends the website to the Competitor Analysis Semrush API via an HTTP POST request.
- **Re format output** – Extracts and formats the domain overview data.
- **Domain overview** – Saves organic keywords and traffic into Google Sheets.
- **Reformat** – Extracts the organic competitors list (see the sketch at the end of this listing).
- **Organic Competitor** – Logs competitor domains, relevance, and traffic into Google Sheets.
- **Reformat 2** – Extracts organic pages data.
- **Organic Pages** – Stores page-level data such as traffic and keyword counts.
- **Reformat2** – Extracts organic keyword details.
- **organic keywords** – Logs keyword data such as CPC, volume, and difficulty into Google Sheets.

Benefits:
- ✅ **Automated competitor tracking** – No manual API calls; everything is logged in Google Sheets.
- ✅ **Centralized SEO reporting** – Data stored in structured sheets for quick access.
- ✅ **Time-saving** – Streamlines research by combining multiple reports in one workflow.
- ✅ **Accurate insights** – Direct data from the Competitor Analysis Semrush API ensures reliability.

Use Cases:
- 📊 **SEO Research** – Track domain performance and competitor strategies.
- 🔍 **Competitor Monitoring** – Identify competitor domains, keywords, and traffic.
- 📝 **Content Strategy** – Find top-performing organic pages and replicate content ideas.
- 💰 **Keyword Planning** – Use CPC and difficulty data to prioritize profitable keywords.
- 📈 **Client Reporting** – Generate ready-to-use SEO competitor analysis reports in Google Sheets.
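As an example of what one "Reformat" step does, here is a sketch that pulls the organic-competitors list out of the combined API response and emits one item per competitor. The property names are assumptions about the payload, not documented fields.

```javascript
// Hypothetical "Reformat" step: one output item per competitor row,
// ready for the "Organic Competitor" Google Sheets append.
const report = $input.first().json;

return (report.organic_competitors || []).map((c) => ({
  json: {
    domain: c.domain,
    relevance: c.relevance,
    traffic: c.traffic,
  },
}));
```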
by Florent
n8n Restore Workflows & Credentials from Disk – Self-Hosted Solution

This n8n template provides a safe and intelligent restore solution for self-hosted n8n instances, allowing you to restore workflows and credentials from disk backups. Perfect for disaster recovery or migrating between environments, this workflow automatically identifies your most recent backup and provides a manual restore capability that intelligently excludes the current workflow to prevent conflicts. Works seamlessly with date-organized backup folders.

Good to know
- This workflow uses n8n's native import commands (n8n import:workflow and n8n import:credentials)
- Works with date-formatted backup folders (YYYY-MM-DD) for easy version identification
- The restore process intelligently excludes the current workflow to prevent overwriting itself
- Requires proper Docker volume configuration and file system permissions
- All operations are performed server-side with no external dependencies
- Compatible with backups created by n8n's export commands

How it works

Restore Process (Manual)
- Manual trigger with configurable pinned data options (credentials: true/false, workflows: true/false)
- The Init node sets up all necessary paths, timestamps, and configuration variables using your environment settings
- The workflow scans your backup folder and automatically identifies the most recent backup
- If restoring credentials:
  - Direct import from the latest backup folder using n8n's import command
  - Credentials are imported with their encrypted format intact
- If restoring workflows:
  - Scans the backup folder for all workflow JSON files
  - Creates a temporary folder with all workflows from the backup
  - Intelligently excludes the current restore workflow to prevent conflicts
  - Imports all other workflows using n8n's import command
  - Cleans up temporary files automatically
- Optional email notifications provide detailed restore summaries with command outputs

How to use

Prerequisites
- Existing n8n backups in a date-organized folder structure (format: /backup-folder/YYYY-MM-DD/)
- Workflow backups as JSON files in the date folder
- Credentials backups in the subfolder /backup-folder/YYYY-MM-DD/n8n-credentials/
- For new environments: the N8N_ENCRYPTION_KEY from the source environment (see the dedicated section below)

Initial Setup
1. Configure your environment variables:
   - N8N_ADMIN_EMAIL: Your email for notifications (optional)
   - N8N_BACKUP_FOLDER: Location where your backups are stored (e.g., /files/n8n-backups)
   - N8N_PROJECTS_DIR: Projects root directory
   - GENERIC_TIMEZONE: Your local timezone
   - N8N_ENCRYPTION_KEY: Required if restoring credentials to a new environment (see the dedicated section below)
2. Update the Init node:
   - (Optional) Configure your email here: const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';
   - Set PROJECT_FOLDER_NAME to "Workflow-backups" (or your preferred name)
   - Set credentials to "n8n-credentials" (or your backup credentials folder name)
   - Verify the BACKUP_FOLDER path matches where your backups are stored
3. Ensure your Docker setup has:
   - A mounted volume containing backups (e.g., /local-files:/files)
   - Access to n8n's CLI import commands
   - Proper file system permissions (read access to backup directories)

Performing a Restore
1. Open the workflow and locate the "Start Restore" manual trigger node
2. Edit the pinned data to choose what to restore:
   - credentials: true – restore credentials
   - workflows: true – restore workflows
   - Set both to true to restore everything
3. Click "Execute workflow" on the "Start Restore" node
4. The workflow will automatically find the most recent backup (latest date)
5. Check the console logs or the optional email for a detailed restore summary

Important Notes
- The workflow automatically excludes itself during restore to prevent conflicts
- Credentials are restored with their encryption intact. If restoring to a new environment, you must configure the N8N_ENCRYPTION_KEY from the source environment (see the dedicated section below)
- Existing workflows/credentials with the same names will be overwritten
- Test in a non-production environment first if unsure

⚠ Critical: N8N_ENCRYPTION_KEY Configuration

Why this is critical: n8n generates an encryption key automatically on first launch and saves it in the ~/.n8n/config file. However, if this file is lost (for example, due to missing Docker volume persistence), n8n will generate a NEW key, making all previously encrypted credentials inaccessible.

When you need to configure N8N_ENCRYPTION_KEY:
- Restoring to a new n8n instance
- When your data directory is not persisted between container recreations
- Migrating from one server to another
- As a best practice to ensure key persistence across updates

How credentials encryption works:
- Credentials are encrypted with a specific key unique to each n8n instance
- This key is auto-generated on first launch and stored in /home/node/.n8n/config
- When you back up credentials, they remain encrypted, but the key is NOT included
- If the key file is lost or a new key is generated, restored credentials cannot be decrypted
- Setting N8N_ENCRYPTION_KEY explicitly ensures the key remains consistent

Solution: Retrieve and configure the encryption key

Step 1: Get the key from your source environment

```bash
# Check if the key is defined in environment variables
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY
```

If this command returns nothing, the key is auto-generated and stored in n8n's data volume:

```bash
# Enter the container
docker-compose exec n8n sh
# Check the configuration file
cat /home/node/.n8n/config
# Exit the container
exit
```

Step 2: Configure the key in your target environment

Option A: Using a .env file (recommended for security)

```bash
# Add to your .env file
N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Then reference it in docker-compose.yml:

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
```

Option B: Directly in docker-compose.yml (less secure)

```yaml
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=your_retrieved_key_here
```

Step 3: Restart n8n

```bash
docker-compose restart n8n
```

Step 4: Now restore your credentials

Only after configuring the encryption key, run the restore workflow with credentials: true.
Best practice for future backups:
- Always save your N8N_ENCRYPTION_KEY in a secure location alongside your backups
- Consider storing it in a password manager or secure vault
- Document it in your disaster recovery procedures

Requirements

Existing Backups
- Date-organized backup folders (YYYY-MM-DD format)
- Backup files created by n8n's export commands, or a compatible format

Environment
- Self-hosted n8n instance (Docker recommended)
- Docker volumes mounted with access to the backup location
- Optional: SMTP server configured for email notifications

Credentials (Optional)
- SMTP credentials for email notifications (if using email nodes)

Technical Notes

Smart Workflow Exclusion
- During workflow restore, the current workflow's name is cleaned and matched against backup files
- This prevents the restore workflow from overwriting itself
- The exclusion logic handles special characters and spaces in workflow names
- A temporary folder is created with all workflows except the current one (see the sketch at the end of this listing)

Timezone Handling
- All timestamps use UTC for technical operations
- Display times use the local timezone for user-friendly readability
- Backup folder scanning works with the YYYY-MM-DD format regardless of timezone

Security
- Credentials are imported in n8n's encrypted format (encryption preserved)
- Ensure backup directories have appropriate read permissions
- Consider access controls for who can trigger restore operations
- No sensitive data is logged in console output

Troubleshooting

Common Issues
- No backups found: Verify the N8N_BACKUP_FOLDER path is correct and contains date-formatted folders
- Permission errors: Ensure the Docker user has read access to backup directories
- Path not found: Verify all volume mounts in docker-compose.yml match your backup location
- Import fails: Check that backup files are in valid n8n export format
- Workflow conflicts: The workflow automatically excludes itself, but ensure backup files are properly named
- Credentials not restored: Verify the backup contains an n8n-credentials folder with credential files
- Credentials decrypt error: Ensure N8N_ENCRYPTION_KEY matches the source environment

Version Compatibility
- Tested with n8n version 1.113.3
- Compatible with Docker-based n8n installations
- Requires n8n CLI access (available in official Docker images)

This workflow is designed for self-hosted server backup restoration. For FTP/SFTP remote backups, see the companion workflow "n8n Restore from FTP". Works best with backups from: "Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP"
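To illustrate the exclusion idea, here is a hedged sketch of the self-exclusion step as a Code node. The normalization rule and the assumption that backup files are named after their workflows are illustrative; the template's actual matching logic may differ, and on self-hosted n8n the fs/path builtins must be allowed via NODE_FUNCTION_ALLOW_BUILTIN.

```javascript
// Sketch only: list backup JSON files, skipping the one whose name matches
// the current restore workflow. Requires NODE_FUNCTION_ALLOW_BUILTIN=fs,path.
const fs = require('fs');
const path = require('path');

// Both values are illustrative placeholders
const backupDir = '/files/n8n-backups/2025-01-15';
const currentWorkflowName = 'n8n Restore workflows & credentials from Disk';

// Assumed normalization: lowercase, non-alphanumerics collapsed to "-"
const normalize = (s) => s.toLowerCase().replace(/[^a-z0-9]+/g, '-');

const toImport = fs
  .readdirSync(backupDir)
  .filter((f) => f.endsWith('.json'))
  .filter((f) => normalize(path.basename(f, '.json')) !== normalize(currentWorkflowName));

return [{ json: { files: toImport.map((f) => path.join(backupDir, f)) } }];
```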
by Oneclick AI Squad
Automatically detects and hides hate speech/toxic comments, alerts your team, and logs flagged content for review.

Workflow Overview:
- **Trigger**: A Schedule node runs every 15 minutes to poll for new comments (Instagram doesn't natively push notifications easily, so polling is used). You could replace this with a Webhook if you set up Instagram webhooks via the Graph API.
- **Scan Comments**: Uses the Instagram Graph API (via HTTP Request) to fetch recent posts and their comments. Assumes you have an Instagram Business Account and a valid access token (from the Facebook Developer Portal).
- **Detect Toxicity**: For each comment, sends the text to Google's Perspective API (a free toxicity detection API; sign up at https://perspectiveapi.com/ for an API key). The "toxic" threshold is a toxicity score above 0.7 (configurable); see the sketch at the end of this listing.
- **Auto-Hide Offensive Comments**: If toxic, uses the Instagram API to hide the comment.
- **Alert Team**: Sends a Slack notification (or email; configurable) with details.
- **Store Evidence**: Appends the toxic comment details (text, user, score, timestamp) to a Google Sheet for auditing.
- **Error Handling**: A basic error node notifies you if API calls fail.
- **Business Value Alignment**: Automates protection, reducing manual moderation and building trust.

Prerequisites:
- n8n installed (self-hosted or cloud).
- Instagram Graph API access token (set in n8n credentials or as an environment variable).
- Perspective API key (free tier available).
- Slack webhook or email credentials.
- Google Sheets API credentials (for storage).

How to Import:
1. In n8n, go to the workflows list.
2. Click "Import from JSON" (or paste into a new workflow).
3. Update placeholders:
   - Replace YOUR_INSTAGRAM_ACCESS_TOKEN with your token.
   - Replace YOUR_PERSPECTIVE_API_KEY with your key.
   - Set up credentials for HTTP Request (Instagram), Slack, and Google Sheets.
   - Adjust YOUR_INSTAGRAM_BUSINESS_ACCOUNT_ID and YOUR_MEDIA_ID (or make them dynamic).
4. Test and activate. If you encounter issues (e.g., API rate limits), adjust the schedule or add waits.

Notes on Customization:
- **Looping**: The "Loop Over Comments" node uses SplitInBatches to process comments one by one, avoiding API rate limits.
- **Toxicity API**: Perspective API is used because it's reliable and free for low volume. To use another service (e.g., Hugging Face), swap the HTTP Request body.
- **Instagram API**: The workflow fetches comments for the first recent post (simplified). To handle multiple posts, add another loop.
- **Alerts**: Slack is used; switch to an Email node if preferred.
- **Storage**: Google Sheets for simplicity; could be swapped for MongoDB or Airtable.
- **Sticky Notes**: Three notes explain the phases – they won't affect execution but help in the UI.
- **Testing**: Start with test data. The Instagram API requires app review for production.
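For reference, the toxicity check could look like the Code-node sketch below. The endpoint and response path follow Google's public Perspective API docs, the 0.7 threshold matches the description above, and the availability of this.helpers.httpRequest in the Code node depends on your n8n version.

```javascript
// Sketch of the "Detect Toxicity" step for a single comment item.
const comment = $input.first().json;

const res = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=YOUR_PERSPECTIVE_API_KEY',
  body: {
    comment: { text: comment.text },
    requestedAttributes: { TOXICITY: {} },
  },
  json: true,
});

// Perspective returns a 0-1 probability-like score
const score = res.attributeScores.TOXICITY.summaryScore.value;

return [{ json: { ...comment, toxicity: score, isToxic: score > 0.7 } }];
```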
by Harsh Maniya
Build an AI Research Assistant for WhatsApp with Perplexity and Claude 💡

Ever wished you could get a deep, multi-source research report on any topic, delivered directly to your WhatsApp chat in seconds? This workflow transforms your WhatsApp into a powerful, on-demand research assistant, perfect for students, professionals, and curious minds. Leveraging the deep research capabilities of Perplexity, the nuanced formatting skills of Anthropic's Claude, and the messaging power of Twilio, this workflow automates the entire process from query to polished answer. Ask it anything, and receive a well-structured, easy-to-read summary moments later.

What This Workflow Does 🚀
- 📲 **Listens for Incoming Queries**: The workflow starts when a user sends a message to your configured Twilio WhatsApp number.
- 🧠 **Performs Deep Research**: It takes the user's query and feeds it to the Perplexity node, which uses the sonar-pro model to conduct a comprehensive, multi-source analysis of the topic.
- 🎨 **Polishes the Content**: The raw, detailed research report from Perplexity is then passed to an Anthropic Claude model. This crucial step refines the text, adds engaging emojis, keeps it under the WhatsApp character limit, and formats it for mobile viewing.
- 💬 **Sends an Instant Reply**: The final, formatted summary is sent back to the user's WhatsApp chat via Twilio, completing the request.

Nodes Used 🛠️
- Webhook: Receives the initial message from Twilio and triggers the workflow (see the payload sketch at the end of this listing).
- Perplexity: Performs the AI-powered deep research on the user's query.
- Anthropic (via LangChain): Connects to the Claude model for reformatting and polishing the content.
- Twilio: Sends the final, formatted message back to the user on WhatsApp.

How to Set Up This Workflow ⚙️

This workflow requires careful setup between n8n and Twilio to function correctly. Follow these steps closely.

1. Prerequisites ✅
You will need accounts for the following services:
- n8n (Cloud or self-hosted)
- Twilio
- Perplexity AI (for an API key)
- Anthropic (for a Claude API key)

2. Configure Credentials 🔑
In your n8n instance, add your API keys for Twilio, Perplexity, and Anthropic. You can add credentials in n8n by going to the Credentials tab in the left-hand menu. Learn more about managing credentials in n8n.

3. Set Up Your Twilio WhatsApp Number 📱
- Log in to your Twilio account.
- Either purchase a WhatsApp-enabled phone number or use the free Twilio Sandbox for WhatsApp for testing.
- Follow Twilio's guide to connect your number or sandbox to the WhatsApp Business API. Read Twilio's Getting Started with WhatsApp guide.

4. Expose Your n8n Webhook URL 🌐
For Twilio to communicate with n8n, your n8n webhook URL must be publicly accessible.
- Open the Fetch Whatsapp Request (Webhook) node in the workflow. You will see two URLs: Test and Production. For this guide, we will use the Test URL.
- If you are running n8n locally or on a private server, you must expose this URL to the public internet. You can do this easily using n8n's built-in tunnel feature. Start n8n from your computer's command line with: n8n start --tunnel
- After starting, n8n will provide you with a public "tunnel URL". It will look something like https://[subdomain].hooks.n8n.cloud/. Copy this public tunnel URL. Learn more about tunneling in the n8n docs.

5. Connect Twilio to Your n8n Webhook 🔗
- In your Twilio Console, navigate to the settings for the phone number (or sandbox) you configured in Step 3.
- Scroll down to the Messaging section.
- Find the field labeled "A MESSAGE COMES IN".
- Paste your n8n Test URL (or your public tunnel URL from the previous step) into this field.
- Ensure the dropdown next to it is set to HTTP POST.
- Click Save.

6. Activate and Test Your Workflow ▶️
- Go back to your n8n canvas and click "Test workflow" in the top right corner. This puts your webhook in a "listening" state.
- Send a message from your personal WhatsApp to your configured Twilio WhatsApp number.
- You should see the workflow execute successfully in n8n and receive a formatted reply on WhatsApp!
- Once you've confirmed it works, save and activate the workflow so it runs permanently.

Further Reading 📚
- n8n Webhook Node Documentation
- n8n Twilio Node Documentation
- Perplexity API Documentation
- Anthropic API Documentation
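For reference, this is roughly what arrives at the webhook. Twilio posts the inbound WhatsApp message as form-encoded fields (Body and From are standard Twilio webhook parameters), which n8n exposes under the body key; a small Code node can pull out what the Perplexity node needs:

```javascript
// Sketch: extract the user's question from the Twilio webhook payload.
const payload = $input.first().json.body;

return [{
  json: {
    query: payload.Body, // the WhatsApp message text
    from: payload.From,  // e.g. "whatsapp:+14155551234", used for the reply
  },
}];
```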
by Davide
This workflow implements a Retrieval-Augmented Generation (RAG) system using Google Gemini's File Search API. It allows users to upload files to a dedicated search store and then ask questions about their content in a chat interface. The system automatically retrieves relevant information from the uploaded files to provide accurate, context-aware answers.

Key Advantages

1. ✅ Seamless Integration of File Upload + AI Context
The workflow automates the entire lifecycle: upload file, index file, retrieve content for AI chat. Everything happens inside one n8n automation, without manual actions.

2. ✅ Automatic Retrieval for Every User Query
The AI agent is instructed to always query the Search Store. This ensures more accurate answers, context-aware responses, and the ability to reference the exact content the user has uploaded. Perfect for knowledge bases, documentation Q&A, internal tools, and support.

3. ✅ Reusable Search Store for Multiple Sessions
Once created, the Search Store can be reused: multiple files can be imported, and many queries can leverage the same indexed data. A sustainable foundation for scalable RAG operations.

4. ✅ Visual and Modular Workflow Design
Thanks to n8n's node-based flow, each step is clearly separated, easy to debug, and easy to expand (e.g., adding authentication, connecting to a database, notifications, etc.).

5. ✅ Supports Both Form Submission and Chat Messages
The workflow is built with two entry points: a form for uploading files and a chat-triggered entry point for RAG conversations. This means the system can be embedded in multiple user interfaces.

6. ✅ Compliant and Efficient Interaction With Gemini APIs
The workflow respects the structure of Gemini's File Search API: /fileSearchStores (create store), the upload endpoint, the importFile endpoint, and generateContent with file search tools. This ensures compatibility and future expandability.

7. ✅ Memory-Aware Conversations
With the Memory Buffer node, the chat session preserves context across messages, providing a more natural and sophisticated conversational experience.

How it Works

STEP 1 – Create a new Search Store
Triggered manually via the "Execute workflow" node, this step sends a request to the Gemini API to create a File Search Store, which acts as a private vector index for your documents (see the sketch at the end of this listing). The store name is then saved using a Set node. This store is later used for file import and retrieval.

STEP 2 – Upload and import a file into the Search Store
When the form is submitted (through the Form Trigger), the workflow:
1. Accepts a file upload via the form.
2. Uploads the file to Gemini using the /upload endpoint.
3. Imports the uploaded file into the Search Store, making it searchable.
This step ensures content is stored, chunked, and indexed so the AI model can retrieve relevant sections later.

STEP 3 – RAG-enabled Chat with Google Gemini
When a chat message is received:
1. The workflow loads the Search Store identifier.
2. A LangChain Agent is used along with the Google Gemini Chat Model.
3. The model is configured to always use the SearchStore tool, so every user query is enriched by a search inside the indexed files.
4. The system retrieves relevant chunks from your documents and uses them as context for generating more accurate responses.
This creates a fully functioning RAG chatbot powered by Gemini.

Set up Steps

Before activating this workflow, you must complete the following configuration:

1. Google Gemini API Credentials: Ensure you have a valid Google AI Studio API key.
This key must be entered in all HTTP Request nodes (Create Store, Upload File, Import to Store, and SearchStore).

2. Configure the Search Store:
- Manually trigger the "Create Store" node once via the "Execute Workflow" button. This calls the Gemini API to create a new File Search Store and returns its resource name (e.g., fileSearchStores/my-store-12345).
- Copy this resource name and update the "Get Store" and "Get Store1" Set nodes. Replace the placeholder value fileSearchStores/my-store-XXX in both nodes with the actual name of your newly created store.

3. Deploy Triggers:
- For production use, activate the workflow. This generates public URLs for the "On form submission" node (for file uploads) and the "When chat message received" node (for the chat interface).
- These URLs can be embedded in your applications (e.g., a website or dashboard).

Once these steps are complete, the workflow is ready. Users can start uploading files via the form and then ask questions about them in the chat.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
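As a sketch of STEP 1, the store-creation call could be made from a Code node as below. The v1beta /fileSearchStores endpoint matches what this workflow describes; the displayName field and the response shape are assumptions based on Google's public docs at the time of writing, so verify them before use.

```javascript
// Hedged sketch: create a Gemini File Search Store and return its resource
// name for the "Get Store" Set nodes. Endpoint and fields: verify against
// the current Gemini File Search API documentation.
const res = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://generativelanguage.googleapis.com/v1beta/fileSearchStores?key=YOUR_GEMINI_API_KEY',
  body: { displayName: 'my-store' },
  json: true,
});

// Expected to look like "fileSearchStores/my-store-12345"
return [{ json: { storeName: res.name } }];
```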
by Oneclick AI Squad
Competitor Price & Feature Tracker for Real Estate Projects

Overview: This solution monitors competitor pricing and features for real estate projects by fetching data from a competitor API, parsing it, logging it to Google Sheets, and sending email alerts for significant price changes. It runs on a schedule to keep real-time track of market trends.

Operational Process:
- **Set Cron**: Triggers the workflow on a schedule (e.g., hourly).
- **Fetch Competitor Data**: Performs GET requests to retrieve competitor pricing and feature data (e.g., https://api.competitor.com).
- **Wait For Data**: Introduces a delay to ensure data is fully retrieved.
- **Parse Data**: Processes and extracts relevant pricing and feature details.
- **Log to Google Sheets**: Appends the parsed data to a Google Sheet for tracking.
- **Check Price Change**: Evaluates whether there is a significant price change (see the sketch at the end of this listing).
- **Send Alert Email**: Sends an email notification if a price change is detected.
- **No Action for Minor Changes**: Skips action if no significant price change is found.

Implementation Guide:
1. Import the workflow JSON into n8n.
2. Configure the Cron node for the desired schedule (e.g., every hour).
3. Set up the HTTP node with the competitor API URL (e.g., https://api.competitor.com).
4. Configure the Google Sheets integration and specify the log sheet.
5. Test with a sample API call and verify email alerts.
6. Adjust price change thresholds in the Check Price Change node as needed.

Requirements:
- HTTP request capability for API data retrieval.
- Google Sheets API for data logging.
- Email service (e.g., SMTP) for alerts.
- n8n for workflow automation and scheduling.

Customization Options:
- Adjust the Cron schedule for different intervals (e.g., daily).
- Modify the HTTP node to fetch additional competitor data (e.g., features, availability).
- Customize the email alert content in the Send Alert Email node.
- Extend the Google Sheets log with additional fields (e.g., timestamp, competitor name).
- Add Slack or WhatsApp notifications as additional alert channels.
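An illustrative version of the "Check Price Change" logic is below. The 5% threshold and the field names (previous_price, current_price) are assumptions for demonstration; align them with your parsed data and sheet columns.

```javascript
// Illustrative significance check that routes items to "Send Alert Email".
const { project, previous_price, current_price } = $input.first().json;

const changePct = previous_price
  ? ((current_price - previous_price) / previous_price) * 100
  : 0;

return [{
  json: {
    project,
    current_price,
    changePct: Number(changePct.toFixed(2)),
    significant: Math.abs(changePct) >= 5, // example threshold, tune as needed
  },
}];
```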
by Muntasir Mubin
Get ==Instant== Alerts When Your Website Goes Down — Using ==n8n== as a ==Website Downtime Checker Robot==

If you manage websites (your own or clients'), downtime alerts are critical. But most monitoring tools create alert fatigue — ==emails for every tiny hiccup==, even 30–60 second outages. This setup shows how to use n8n as a smart uptime monitor:

✅ No extra subscriptions
✅ No false-positive spam
✅ Alerts only for real downtime
✅ Optional instant phone notifications

Why Use n8n for Website Monitoring?
Traditional tools like Uptime Robot become limiting or expensive as you scale. With n8n, you get:
- Full control over alert logic
- Custom timing & thresholds
- No forced notification rules
- One tool for uptime and other automations
You decide when, how, and why alerts are sent.

Quick Start: Free n8n Website Monitoring Workflow
Get running in minutes:
1. Use the prebuilt n8n template
2. Sign up for n8n Cloud or self-host for free
3. Set your schedule (default: hourly)
4. Add the websites you want to monitor

Key Setting (Important)
Wait time: ==300 seconds (5 minutes)== (recommended). If a site goes down, the workflow waits before alerting; a code sketch of the check appears at the end of this listing.
➡️ Short hiccups = ignored
➡️ Real outages = ==alerted==

How to Test & Use
1. Activate the workflow: toggle it on — monitoring runs automatically.
2. Test instantly: add a fake or non-existent URL and run the workflow. After the wait period, you'll receive an alert.
3. Stay organized: alerts arrive cleanly in your inbox (tip: pair with an AI email labeling workflow for color-coded alerts).

Get Critical Alerts on Your Phone (Telegram)
Email is fine — but critical sites need instant mobile alerts. The best option is a Telegram bot: free, fast, and with no extra APIs or subscriptions.

How It Works
1. Create a Telegram bot via BotFather
2. Add the bot token & chat ID to n8n
3. Receive downtime alerts instantly on your phone
No missed notifications. No noise.

FAQ
Can I monitor unlimited sites?
> ==Yes== — just add more URLs.
What about short downtime (seconds)?
> Filtered out by the 5-minute wait.
Do I need a paid n8n plan?
> ==No.== Self-hosting is ==free==, and this works on free plans.
Why not SMS or WhatsApp?
> Telegram is ==faster, simpler, and doesn't require paid APIs.==

📩 Contact Me
If you have any questions, ideas to share, or would like to collaborate on a project, feel free to reach out. I'm always open to meaningful discussions, feedback, and new opportunities.
🔗 ==Connect with me==: Facebook | LinkedIn
💬 You're welcome to send me a message on any platform, and I'll get back to you as soon as possible.
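Finally, the code sketch promised above. It shows the shape of the check step only; in the template the 5-minute Wait node and the recheck happen in separate nodes, the URL list is illustrative, and the availability of this.helpers.httpRequest in the Code node depends on your n8n version.

```javascript
// Sketch of the uptime check: mark a site as down on request failure.
// The template then waits 300 s and rechecks before alerting.
const sites = ['https://example.com', 'https://example.org']; // illustrative

const results = [];
for (const url of sites) {
  try {
    await this.helpers.httpRequest({ method: 'GET', url, timeout: 10000 });
    results.push({ json: { url, down: false } });
  } catch (err) {
    results.push({ json: { url, down: true, error: err.message } });
  }
}
return results;
```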