by Mihai Farcas
## Chat with local LLMs using n8n and Ollama

This n8n workflow allows you to seamlessly interact with your self-hosted Large Language Models (LLMs) through a user-friendly chat interface. By connecting to Ollama, a powerful tool for managing local LLMs, you can send prompts and receive AI-generated responses directly within n8n.

### Use cases
- **Private AI Interactions**: Ideal for scenarios where data privacy and confidentiality are important.
- **Cost-Effective LLM Usage**: Avoid ongoing cloud API costs by running models on your own hardware.
- **Experimentation & Learning**: A great way to explore and experiment with different LLMs in a local, controlled environment.
- **Prototyping & Development**: Build and test AI-powered applications without relying on external services.

### How it works
1. **When chat message received**: Captures the user's input from the chat interface.
2. **Chat LLM Chain**: Sends the input to the Ollama server, receives the AI-generated response, and delivers it back to the chat interface.

### Set up steps
1. Make sure Ollama is installed and running on your machine before executing this workflow.
2. Edit the Ollama address if it differs from the default (the sketch below shows the endpoint the workflow effectively talks to).
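For orientation, here is a minimal sketch of the request the Chat LLM Chain step effectively makes against Ollama's default local address. The model name `llama3` is a placeholder assumption; use any model you have pulled.

```javascript
// Minimal sketch: what the workflow's LLM call boils down to.
// Assumes Ollama's default address; "llama3" is a placeholder model name.
async function askOllama(prompt) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",            // replace with any model you have pulled
      messages: [{ role: "user", content: prompt }],
      stream: false,              // return one complete response
    }),
  });
  const data = await res.json();
  return data.message.content;    // the AI-generated reply
}

askOllama("Hello, who are you?").then(console.log);
```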
by Ricardo Espinoza
### Use case
When you have a call with a new potential customer, one of the keys to getting the most out of it is finding out as much information as you can about them before the call. Normally this involves a lot of manual research before every call. This workflow automates this tedious work for you.

### What this workflow does
The workflow runs every time a new call is booked via your Calendly. It filters out personal email addresses (see the sketch below), then enriches the remaining emails. If the email is attached to a company, it enriches the company and upserts it into your HubSpot CRM.

### Setup
1. Add Clearbit, HubSpot, and Calendly credentials.
2. Click on Test workflow.
3. Book a meeting on Calendly so the event starts the workflow.

Be aware that you can adapt this workflow to work with your enrichment tool, CRM, and booking tool of choice.
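A minimal sketch of the "filter out personal emails" step, as it might look in an n8n Code node. The free-mail domain list is illustrative, not exhaustive.

```javascript
// Skip enrichment for free-mail addresses; only business domains go to Clearbit.
const FREE_MAIL_DOMAINS = new Set([
  "gmail.com", "yahoo.com", "outlook.com", "hotmail.com", "icloud.com",
]);

function isBusinessEmail(email) {
  const domain = email.split("@")[1]?.toLowerCase();
  return Boolean(domain) && !FREE_MAIL_DOMAINS.has(domain);
}

console.log(isBusinessEmail("jane@acme.com"));  // true  -> enrich with Clearbit
console.log(isBusinessEmail("jane@gmail.com")); // false -> skip
```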
by Dr. Firas
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Automate Content Publishing to TikTok, YouTube, Instagram, Facebook via Blotato

### 🎯 Who is this for?
This workflow is perfect for:
- Content creators who post daily to multiple platforms
- Marketing teams managing brand presence across channels
- Solo entrepreneurs and social media managers looking to scale their output
- Anyone tired of uploading content manually across apps

### 💡 What problem is this solving?
Managing content across platforms is time-consuming. You need to:
- Track posts per platform
- Upload videos manually
- Adapt captions and posting times
- Avoid repetitive mistakes

This workflow solves all of that by centralizing everything in one place (Google Sheets) and automating it via Blotato.

### ⚙️ What this workflow does
Every hour, this workflow will:
1. Check your Google Sheet for any post marked as "TO GO" (see the status-filter sketch below)
2. Select one item at a time (avoids spam and overposting)
3. Extract media from a shared Google Drive link
4. Upload the media to Blotato
5. Publish it automatically to TikTok, YouTube Shorts, Instagram, and Facebook
6. Update the post status in your Sheet to "Posted"

### 🧰 Setup
Before running this template, make sure you have:
- ✅ A Blotato account (Pro plan required for API key)
- 🔑 Generated your Blotato API key (Settings > API > Generate)
- 📦 Enabled Verified Community Nodes in the n8n Admin Panel
- 🧩 Installed the Blotato node via the community nodes list
- 🛠 Created a Blotato credential in n8n using your API key
- ☁️ Made sure your media folder in Google Drive is set to "Anyone with the link can view"
- 📌 Followed the 3 setup steps in the brown sticky notes inside the workflow

### 🛠 How to customize this workflow
- Add new platform nodes (LinkedIn, Threads, Pinterest, etc.) using Blotato
- Adjust the scheduling frequency from hourly to daily or weekly
- Add an approval layer (Slack/Telegram) before publishing
- Customize your captions dynamically using GPT or formulas in Sheets
- Use tags, categories, or campaign tracking for analytics

📄 Documentation: Notion Guide
Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
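A minimal sketch of the "pick one TO GO post" step inside an n8n Code node. The column name `Status` and the `TO GO` value mirror the sheet convention described above; adjust to your own layout.

```javascript
// Keep only rows whose Status column says "TO GO".
const pending = items.filter((item) => item.json.Status === "TO GO");

// Publish a single item per run to avoid spamming the platforms.
return pending.length > 0 ? [pending[0]] : [];
```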
by Gaurav
Automate your entire guest communication journey from booking to post-stay with personalized welcome emails, review requests, and daily operational reports. Perfect for hotels, B&Bs, and short-term rental properties looking to enhance guest experience while reducing manual work and improving operational efficiency.

### How it works
- **Pre-arrival welcome emails** - Automatically sends personalized welcome emails 1-2 days before guest check-in with reservation details, hotel amenities, and contact information (see the date-window sketch below)
- **Post-stay review requests** - Sends automated review request emails 24 hours after checkout with Google Reviews links and return-guest discount codes
- **Daily staff reports** - Generates comprehensive arrival/departure reports every morning at 6 AM for front desk, housekeeping, and management teams
- **Smart tracking** - Prevents duplicate emails by automatically updating tracking status in your Google Sheets database
- **Professional templates** - Uses responsive HTML email templates that work across all devices and email clients

### Set up steps
1. **Connect Google Sheets** - Link your hotel reservation spreadsheet (must include columns for guest details, check-in/out dates, and email tracking)
2. **Configure Gmail account** - Set up Gmail credentials for sending automated emails
3. **Customize hotel information** - Update hotel name, contact details, and branding in the "Edit Fields" nodes
4. **Set staff email addresses** - Configure recipient addresses for daily operational reports
5. **Adjust timing** - Modify schedule triggers if you want different timing for emails and reports (currently set to every 6 hours for guest emails and 6 AM daily for staff reports)

Time investment: ~30 minutes for initial setup, then fully automated operation.
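A minimal sketch of the pre-arrival selection logic, as it might look in an n8n Code node. The column names `CheckInDate` and `WelcomeEmailSent` are assumptions; match them to your reservation sheet.

```javascript
const MS_PER_DAY = 24 * 60 * 60 * 1000;
const now = Date.now();

const dueForWelcome = items.filter((item) => {
  const daysUntilCheckIn = (new Date(item.json.CheckInDate) - now) / MS_PER_DAY;
  const alreadySent = item.json.WelcomeEmailSent === "yes";
  // Send 1-2 days before check-in, and never twice for the same guest.
  return daysUntilCheckIn >= 1 && daysUntilCheckIn <= 2 && !alreadySent;
});

return dueForWelcome;
```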
by Autonomous Work
This workflow exports every table in an Airtable base as its own CSV, saves the files in a time-stamped folder in Amazon S3, pings you on Slack, and optionally prunes older copies. You get an automated weekly backup that is easy to inspect or re-import as needed. You can easily swap the S3 node for the storage provider of your choice.

**How it works**

Weekly backup:
1. Schedule trigger fires weekly
2. Sets and formats the week identifier, e.g. 2025-W12 (see the sketch below)
3. Creates a folder in the S3 bucket named for the week
4. Loops through all tables in the Airtable base, creating CSVs and uploading them to the new path
5. Sends a Slack message on completion

Monthly prune:
1. Schedule trigger fires weekly
2. Sets a cut-off date 4 weeks in the past
3. Lists folders in S3
4. Deletes all folders older than 4 weeks

**Setup steps**
1. Clone the workflow
2. Swap in your credentials for Airtable, AWS, and Slack
3. Ensure the AWS credential has an appropriate IAM policy to manage the bucket & objects
4. Set the workflow to "Active"
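A minimal sketch of the week-label step (e.g. `2025-W12`), as the workflow's Set/Code node might compute it, using the ISO-8601 week definition.

```javascript
function isoWeekLabel(date = new Date()) {
  const d = new Date(Date.UTC(date.getFullYear(), date.getMonth(), date.getDate()));
  // Shift to the Thursday of the current week; ISO weeks belong to the year
  // that contains their Thursday.
  d.setUTCDate(d.getUTCDate() + 4 - (d.getUTCDay() || 7));
  const yearStart = new Date(Date.UTC(d.getUTCFullYear(), 0, 1));
  const week = Math.ceil(((d - yearStart) / 86400000 + 1) / 7);
  return `${d.getUTCFullYear()}-W${String(week).padStart(2, "0")}`;
}

console.log(isoWeekLabel()); // e.g. "2025-W12" -> used as the S3 folder name
```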
by n8n Team
This workflow syncs Zendesk tickets to Pipedrive contact owners. It is triggered every day at 09:00: Zendesk collects all the tickets updated after the last execution timestamp and updates them according to Pipedrive contacts. It also adds Zendesk comments to the tickets as notes in Pipedrive.

**Prerequisites**
- Pipedrive account and Pipedrive credentials
- Zendesk account and Zendesk credentials

Note: The Pipedrive and the Zendesk accounts need to be created by the same person / with the same email.

**How it works**
1. The Cron node triggers the workflow every day at 09:00.
2. The Zendesk node collects all the tickets updated after the last execution timestamp (see the static-data sketch below).
3. An If node checks whether the channel in the ticket is email and, if so, continues the workflow.
4. The Item Lists node removes duplicates to make the search efficient.
5. The Pipedrive node searches persons by email.
6. A Set node renames and keeps only the needed fields (email & person id).
7. A Merge by Key node adds the Pipedrive contact id to the Zendesk tickets.
8. The HTTP Request node gets Zendesk comments for the tickets, and a Merge node adds them to the tickets.
9. A Split In Batches node processes the tickets in batches with each iteration.
10. An Item Lists node splits comments into separate items.
11. The Pipedrive node adds each comment as a note.
12. An If node checks whether the data processing is done and, if not, loops back to the Split In Batches node.
13. The Function Item node sets the new last execution timestamp.
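A minimal sketch of the last-execution-timestamp pattern this workflow relies on, using n8n's workflow static data inside a Function/Code node. The 24-hour fallback for the very first run is an assumption.

```javascript
const staticData = $getWorkflowStaticData("global");

// Fall back to "now minus 24h" on the very first run (assumption).
const lastRun = staticData.lastExecution ?? new Date(Date.now() - 86400000).toISOString();

// ...fetch only Zendesk tickets updated after `lastRun`...

// Persist the new watermark for the next daily run.
staticData.lastExecution = new Date().toISOString();

return [{ json: { lastRun } }];
```

Note that static data only persists when the workflow runs in production (active), not in manual test executions.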
by Airtop
## Monitor Competitor Facebook Ads with Airtop

### Use case
Monitor a competitor's active Facebook ads and get a weekly HTML intelligence brief by email — saving time on manual research and helping you spot messaging, offers, and creative trends quickly.

### What this automation does
- Runs weekly on a set schedule.
- Uses Airtop to visit the competitor's Facebook Ad Library page and extract up to 30 active ads.
- Summarizes each ad with key points: message, topic, CTA, duration active, language, target audience.
- Sends the compiled HTML report via Gmail.

### How it works
1. **Schedule Trigger** — Fires once a week at the configured time.
2. **Airtop Extraction** — Loads the Ad Library URL and runs a prompt to extract and format the ads into HTML.
3. **Email Delivery** — Sends the HTML report to your specified recipient using Gmail.

### Setup requirements
- **Airtop API Key** — Generate here.
- **Airtop credential in n8n** — Add your API key under "Airtop" in n8n.
- **Gmail OAuth2 credential** — Connect the Gmail account to send reports.
- **Competitor's Ad Library URL** — Replace the default view_all_page_id in the workflow with your target.

### Next steps
- Duplicate the Airtop step for multiple competitors.
- Enrich reports by visiting ad landing pages for deeper analysis.
- Send outputs to Slack or archive them in a shared workspace.

Read about ways to monitor your competitors' ads here.
by Alexey from Mingles.ai
## AI Image Generator & Editor with GPT-4 Vision - Complete Workflow Template

### Description
Transform text prompts into stunning images or edit existing visuals using OpenAI's latest GPT-4 Vision model through an intuitive web form interface. This comprehensive n8n automation provides three powerful image generation modes:

**🎨 Text-to-Image Generation**: Simply enter a descriptive prompt and generate high-quality images from scratch using OpenAI's gpt-image-1 model. Perfect for creating original artwork, concepts, or visual content.

**🖼️ Image-to-Image Editing**: Upload an existing image file and transform it based on your text prompt. The AI analyzes your input image and applies modifications while maintaining the original structure and context.

**🔗 URL-Based Image Editing**: Provide a direct URL to any online image and edit it with AI. Great for quick modifications of web images or collaborative workflows.

### Key Features

**Smart Input Processing**
- **Flexible Form Interface**: User-friendly web form with authentication
- **Multiple Input Methods**: File upload, URL input, or text-only generation
- **Quality Control**: Selectable quality levels (low, medium, high)
- **Format Support**: Accepts PNG, JPG, and JPEG formats

**Advanced AI Integration**
- **Latest GPT-4 Vision Model**: Uses gpt-image-1 for superior results
- **Intelligent Switching**: Automatically detects input type and routes accordingly
- **Context-Aware Editing**: Maintains image coherence during modifications
- **Customizable Parameters**: Control size (1024x1024), quality, and generation settings

**Dual Storage Options**
- **Google Drive Integration**: Automatic upload with public sharing permissions
- **ImgBB Hosting**: Alternative cloud storage for instant public URLs
- **File Management**: Organized storage with timestamp-based naming

**Instant Telegram Delivery**
- **Real-time Notifications**: Results sent directly to your Telegram chat
- **Rich Media Messages**: Includes the generated image with prompt details
- **Quick Access Links**: Direct links to view and download results
- **Markdown Formatting**: Clean, professional message presentation

### Technical Workflow
1. **Form Submission** → User submits a prompt and an optional image
2. **Smart Routing** → System detects the input type (text/file/URL)
3. **AI Processing** → OpenAI generates or edits the image based on the mode
4. **Binary Conversion** → Converts the base64 response to a downloadable file (see the sketch below)
5. **Cloud Upload** → Stores the result in Google Drive or ImgBB with public access
6. **Telegram Delivery** → Sends the result with viewing links and metadata

### Perfect For
- **Content Creators**: Generate unique visuals for social media and marketing
- **Designers**: Quick concept development and image variations
- **Developers**: Automated image processing for applications
- **Teams**: Collaborative image editing and sharing workflows
- **Personal Use**: Transform ideas into visual content effortlessly

### Setup Requirements
- **OpenAI API Key**: Access to the GPT-4 Vision model
- **Google Drive API** (optional): For Google Drive storage
- **ImgBB API Key** (optional): For alternative image hosting
- **Telegram Bot**: For result delivery
- **Basic Auth Credentials**: For form security

### What You Get
✅ Complete image generation and editing pipeline
✅ Secure web form with authentication
✅ Dual cloud storage options
✅ Instant Telegram notifications
✅ Professional result formatting
✅ Flexible input methods
✅ Quality control settings
✅ Automated file management

Start creating AI-powered images in minutes with this production-ready template!

Tags: #AI #ImageGeneration #OpenAI #GPT4 #ImageEditing #Telegram #GoogleDrive #Automation #ComputerVision #ContentCreation
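A minimal sketch of the Binary Conversion step in an n8n Code node: turning the base64 image returned by the API into a downloadable binary file. The `data[0].b64_json` field follows OpenAI's Images API response shape; the exact node wiring in the template may differ.

```javascript
// Read the base64 payload from the previous node's output.
const b64 = $input.first().json.data[0].b64_json;
const buffer = Buffer.from(b64, "base64");
const fileName = `image-${Date.now()}.png`; // timestamp-based naming

return [
  {
    json: { fileName },
    binary: {
      // prepareBinaryData registers the buffer as a proper n8n binary property.
      data: await this.helpers.prepareBinaryData(buffer, fileName),
    },
  },
];
```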
by Dariusz Koryto
## Google Drive to FTP Transfer Workflow - Setup Guide

### Overview
This n8n workflow automatically transfers files from Google Drive to an FTP server on a scheduled basis. It includes comprehensive logging, email notifications, and error handling.

### Features
- **Automated Scheduling**: Runs every 6 hours (customizable)
- **Manual Trigger**: Webhook endpoint for on-demand transfers
- **File Filtering**: Supports specific file types and size limits
- **Comprehensive Logging**: Detailed transfer reports saved to Google Drive
- **Email Notifications**: HTML reports sent after each run
- **Error Handling**: Graceful handling of failed transfers
- **Batch Processing**: Files processed individually to prevent rate limits

### Prerequisites
Before setting up this workflow, ensure you have:
- An n8n instance running (self-hosted or cloud)
- A Google Drive account with files to transfer
- An FTP server with upload permissions
- An email service for sending reports (SMTP)

## Step-by-Step Setup Instructions

### 1. Google Drive API Setup

#### 1.1 Create a Google Cloud Project
1. Go to the Google Cloud Console
2. Create a new project or select an existing one
3. Enable the Google Drive API:
   - Navigate to "APIs & Services" → "Library"
   - Search for "Google Drive API"
   - Click "Enable"

#### 1.2 Create OAuth2 Credentials
1. Go to "APIs & Services" → "Credentials"
2. Click "Create Credentials" → "OAuth client ID"
3. Configure the consent screen if prompted
4. Choose "Web application" as the application type
5. Add your n8n instance URL to the authorized redirect URIs: https://your-n8n-instance.com/rest/oauth2-credential/callback
6. Note down the Client ID and Client Secret

#### 1.3 Configure the n8n Credential
1. In n8n, go to "Credentials" → "Add Credential"
2. Select "Google Drive OAuth2 API"
3. Enter your Client ID and Client Secret
4. Complete the OAuth flow by clicking "Connect my account"
5. Set the credential ID as: your-google-drive-credentials-id

### 2. FTP Server Setup

#### 2.1 FTP Server Requirements
- Ensure the FTP server is accessible from your n8n instance
- Verify you have upload permissions
- Note the server details: host/IP address, port (usually 21 for FTP), username and password, destination directory path

#### 2.2 Configure the n8n FTP Credential
1. In n8n, go to "Credentials" → "Add Credential"
2. Select "FTP"
3. Enter your FTP server details:
   - Host: your-ftp-server.com
   - Port: 21 (or your custom port)
   - Username: your-ftp-username
   - Password: your-ftp-password
4. Set the credential ID as: your-ftp-credentials-id

### 3. Email Setup (SMTP)

#### 3.1 Choose an Email Provider
Configure SMTP settings for one of these providers:
- **Gmail**: smtp.gmail.com, port 587, use an App Password
- **Outlook**: smtp-mail.outlook.com, port 587
- **Custom SMTP**: Your organization's SMTP server

#### 3.2 Configure the n8n Email Credential
1. In n8n, go to "Credentials" → "Add Credential"
2. Select "SMTP"
3. Enter your SMTP details:
   - Host: smtp.gmail.com (or your provider)
   - Port: 587
   - Security: STARTTLS
   - Username: your-email@example.com
   - Password: your-app-password
4. Set the credential ID as: your-email-credentials-id
### 4. Workflow Configuration

#### 4.1 Import the Workflow
1. Copy the workflow JSON from the artifact above
2. In n8n, click "Import from JSON"
3. Paste the workflow JSON and import

#### 4.2 Update Credential References
- **Google Drive nodes**: Verify the credential ID matches your-google-drive-credentials-id
- **FTP node**: Verify the credential ID matches your-ftp-credentials-id
- **Email node**: Verify the credential ID matches your-email-credentials-id

#### 4.3 Customize Parameters

FTP server settings ("Upload to FTP" node):

```
{
  "host": "your-ftp-server.com",     // Replace with your FTP host
  "username": "your-ftp-username",   // Replace with your FTP username
  "password": "your-ftp-password",   // Replace with your FTP password
  "path": "/remote/directory/{{ $json.validFiles[$json.batchIndex].name }}", // Update destination path
  "port": 21                         // Change if using a different port
}
```

Email settings ("Send Report Email" node):

```
{
  "sendTo": "admin@yourcompany.com", // Replace with your email address
  "subject": "Google Drive to FTP File Transfer - Report"
}
```

File filter settings ("Filter & Validate Files" node). In the JavaScript code, update these settings:

```javascript
const transferNotes = {
  settings: {
    maxFileSizeMB: 50,              // Change maximum file size
    allowedExtensions: [            // Add/remove allowed file types
      '.pdf', '.doc', '.docx', '.txt', '.jpg', '.png', '.zip', '.xlsx'
    ],
    autoDeleteAfterTransfer: false, // Set to true to delete from Drive after transfer
    verifyTransfer: true            // Keep true for verification
  }
};
```

Google Drive notes storage ("Upload Notes to Drive" node):

```
{
  "parents": {
    "parentId": "your-notes-folder-id" // Replace with the actual folder ID from Google Drive
  }
}
```

### 5. Schedule Configuration

#### 5.1 Modify the Schedule Trigger
In the "Schedule Trigger" node, adjust the interval:

```
{
  "rule": {
    "interval": [
      {
        "field": "hours",
        "hoursInterval": 6 // Change to the desired interval (hours)
      }
    ]
  }
}
```

Alternative schedule options:
- **Daily**: "field": "days", "daysInterval": 1
- **Weekly**: "field": "weeks", "weeksInterval": 1
- **Custom cron**: Use a cron expression for complex schedules

#### 5.2 Webhook Configuration
The webhook trigger is available at:

POST https://your-n8n-instance.com/webhook/webhook-transfer-status

Use this for manual triggers or external integrations.

### 6. Testing and Validation

#### 6.1 Test Connections
1. Test Google Drive: run the "Get Drive Files" node manually
2. Test FTP: upload a test file using the "Upload to FTP" node
3. Test Email: send a test email using the "Send Report Email" node

#### 6.2 Run a Test Transfer
1. Activate the workflow
2. Click "Execute Workflow" to run manually
3. Monitor the execution in the workflow editor
4. Check for any error messages or failed nodes

#### 6.3 Verify Results
- **FTP Server**: Confirm files appear in the destination directory
- **Email**: Check that you receive the transfer report
- **Google Drive**: Verify transfer notes are saved to the specified folder

### 7. Monitoring and Maintenance

#### 7.1 Workflow Monitoring
- **Execution History**: Review past runs in the n8n interface
- **Error Logs**: Check failed executions for issues
- **Performance**: Monitor execution times and resource usage

#### 7.2 Regular Maintenance
- **Credential Renewal**: Google OAuth tokens may need periodic renewal
- **Storage Cleanup**: Consider archiving old transfer notes
- **Performance Tuning**: Adjust batch sizes or schedules based on usage
### 8. Troubleshooting

#### 8.1 Common Issues

Google Drive authentication errors:
- Verify the OAuth2 credentials are correctly configured
- Check that the Google Drive API is enabled
- Ensure the redirect URI matches your n8n instance URL

FTP connection failures:
- Verify the FTP server credentials and connectivity
- Check that firewall settings allow FTP connections
- Confirm the destination directory exists and has write permissions

Email delivery issues:
- Verify the SMTP credentials and server settings
- Check whether your email provider requires app-specific passwords
- Ensure the sender email is authorized

File transfer failures:
- Check the file size limits in the filter settings
- Verify the allowed file extensions include your file types
- Monitor FTP server disk space

#### 8.2 Debug Mode
Enable debug mode by:
- Adding console.log statements in code nodes
- Using "Execute Workflow" with step-by-step execution
- Checking node outputs for data validation

### 9. Advanced Customizations

#### 9.1 Additional File Filters
Add custom filtering logic in the "Filter & Validate Files" node:

```javascript
// Example: filter by modification date (last 7 days)
const isRecentFile = new Date(file.modifiedTime) > new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);

// Example: filter by folder location
const isInSpecificFolder = file.parents && file.parents.includes('specific-folder-id');
```

#### 9.2 Enhanced Reporting
Customize the email report template in the "Send Report Email" node:

```
📊 File Transfer Report Summary
Date: {{ new Date().toLocaleString('en-US') }}
Success Rate: {{ Math.round((successfulTransfers / totalFiles) * 100) }}%
```

#### 9.3 Integration with Other Services
Add nodes to integrate with:
- **Slack**: Send notifications to team channels
- **Discord**: Post updates to Discord servers
- **Webhook**: Trigger other workflows or systems
- **Database**: Log transfers to MySQL, PostgreSQL, etc.

### 10. Security Considerations

#### 10.1 Credential Security
- Use environment variables for sensitive data
- Regularly rotate FTP and email passwords
- Implement least-privilege access for service accounts

#### 10.2 Network Security
- Use SFTP instead of FTP when possible
- Use VPN connections for sensitive transfers
- Monitor network traffic for unusual patterns

#### 10.3 Data Privacy
- Ensure compliance with data protection regulations
- Implement data retention policies for transfer logs
- Consider encryption for sensitive file transfers

### Support and Resources

Documentation links:
- n8n Documentation
- Google Drive API Documentation
- n8n Community Forum

Getting help: if you encounter issues,
1. Check the troubleshooting section above
2. Review the n8n execution logs for error details
3. Search the n8n community forum for similar issues
4. Create a support ticket with detailed error information

Note: Replace all placeholder values (URLs, credentials, IDs) with your actual configuration before running the workflow.
by Daiki Takayama
### Who's it for
This workflow is perfect for content creators, international teams, and businesses that need to translate documents into multiple languages automatically. Whether you're localizing documentation, translating marketing materials, or creating multilingual content, this workflow saves hours of manual work.

### What it does
Automatically monitors a Google Drive folder for new documents (PDF, DOCX, TXT, or Markdown) and translates them into multiple languages using the DeepL API. Each translated document is saved with a language-specific filename (e.g., document_en.pdf, document_zh.pdf) in a designated folder. You receive an email notification when all translations are complete.

### How it works
1. Monitors a Google Drive folder for new files
2. Detects the file format (PDF/DOCX/TXT/Markdown) and extracts the text
3. Translates the content into your chosen languages (default: English, Chinese, Korean, Spanish, French, German); see the DeepL call sketch below
4. Saves translated files with language codes in the filename
5. Sends an email notification with a translation summary
6. Optional: records translation history in a Notion database

### Set up instructions

Requirements:
- Google Drive account (for file storage)
- DeepL API key (free tier: 500,000 characters/month)
- Gmail account (for notifications)
- Notion account (optional, for tracking translation history)

Setup steps:
1. Create Google Drive folders:
   - Create a "Source" folder for original files
   - Create a "Translated" folder for output
   - Copy the folder IDs from the URLs
2. Get a DeepL API key:
   - Sign up at DeepL API
   - Copy your API key
3. Configure the workflow:
   - Open the "Configuration (Edit Here)" node (yellow node)
   - Replace the folder IDs with your own
   - Set your notification email
   - Choose target languages
4. Set up credentials:
   - Add Google Drive OAuth2 credentials
   - Add DeepL API credentials
   - Add Gmail OAuth2 credentials
5. Activate the workflow and upload a test file!

### Customization options
- **Change target languages**: Edit the targetLanguages array in the Configuration node (supports 30+ languages)
- **Adjust polling frequency**: Change the trigger from "every minute" to hourly or daily for batch processing
- **Enable Notion tracking**: Set enableNotion to true and provide your database ID
- **Add more file formats**: Extend the Switch node to handle additional file types
- **Filter by file size**: Add conditions to skip files larger than a certain size

### Supported languages
EN (English), ZH (Chinese), KO (Korean), JA (Japanese), ES (Spanish), FR (French), DE (German), IT (Italian), PT (Portuguese), RU (Russian), and 20+ more.
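A minimal sketch of one translation call plus the language-suffixed filename, assuming the DeepL free-tier endpoint (api-free.deepl.com); Pro accounts use api.deepl.com instead.

```javascript
async function translate(text, targetLang, apiKey) {
  const res = await fetch("https://api-free.deepl.com/v2/translate", {
    method: "POST",
    headers: {
      "Authorization": `DeepL-Auth-Key ${apiKey}`,
      "Content-Type": "application/json",
    },
    // Source language is omitted so DeepL auto-detects it.
    body: JSON.stringify({ text: [text], target_lang: targetLang }),
  });
  const data = await res.json();
  return data.translations[0].text;
}

// document.txt -> document_de.txt, document_zh.txt, ...
function suffixedName(fileName, langCode) {
  const dot = fileName.lastIndexOf(".");
  return `${fileName.slice(0, dot)}_${langCode.toLowerCase()}${fileName.slice(dot)}`;
}
```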
### Performance
- **Short files** (1 page): ~30 seconds for 6 languages
- **Medium files** (10 pages): ~2 minutes for 6 languages
- **Large files** (100 pages): ~15 minutes for 6 languages

### Technical details
- **Trigger**: Google Drive folder monitoring (1-minute polling)
- **Translation**: DeepL API with automatic source language detection
- **Loop implementation**: Split Out + Aggregate pattern for parallel translation
- **Error handling**: Catches API failures and sends email alerts
- **Storage**: Original file format preserved in translated outputs

### Notes
- The DeepL free tier provides 500,000 characters/month (approximately 250 pages)
- For high-volume translation, consider upgrading to DeepL Pro
- The workflow creates new files instead of overwriting, preserving translation history
- Google Docs are automatically converted to the appropriate format before translation

### What you'll learn
This workflow demonstrates several n8n patterns:
- File format detection and routing (Switch node)
- Loop implementation with Split Out + Aggregate
- Binary data handling for file operations
- Conditional logic with IF nodes (optional features)
- Cross-node data references
- Error handling and user notifications

Perfect for learning automation best practices while solving a real business problem!
by Oneclick AI Squad
This is a production-ready, end-to-end workflow that automatically compares hotel prices across multiple booking platforms and delivers beautiful email reports to users. Unlike basic building blocks, this workflow is a complete solution ready to deploy.

### ✨ What Makes This Production-Ready

✅ Complete end-to-end automation:
- **Input**: Natural language queries via webhook
- **Processing**: Multi-platform scraping & comparison
- **Output**: Professional email reports + analytics
- **Feedback**: Real-time webhook responses

✅ Advanced features:
- 🧠 Natural language processing for flexible queries
- 🔄 Parallel scraping from multiple platforms
- 📊 Analytics tracking with Google Sheets integration
- 💌 Beautiful HTML email reports
- 🛡️ Error handling and graceful degradation
- 📱 Webhook responses for real-time feedback

✅ Business value:
- **For travel agencies**: Instant price comparison service for clients
- **For hotels**: Competitive pricing intelligence
- **For travelers**: Save time and money with automated research

### 🚀 Setup Instructions

#### Step 1: Import the Workflow
1. Copy the workflow JSON from the artifact
2. In n8n, go to Workflows → Import from File/URL
3. Paste the JSON and click Import

#### Step 2: Configure Credentials

A. SMTP email (required): Settings → Credentials → Add Credential → SMTP
- Host: smtp.gmail.com (for Gmail)
- Port: 587
- User: your-email@gmail.com
- Password: your-app-password (not your regular password!)

Gmail setup:
1. Enable 2FA on your Google Account
2. Generate an App Password: https://myaccount.google.com/apppasswords
3. Use the generated password in n8n

B. Google Sheets (optional, for analytics): Settings → Credentials → Add Credential → Google Sheets OAuth2, then follow the OAuth flow to connect your Google account.

Sheet setup:
1. Create a new Google Sheet
2. Name the first sheet "Analytics"
3. Add headers: timestamp, query, hotel, city, checkIn, checkOut, bestPrice, platform, totalResults, userEmail
4. Copy the Sheet ID from the URL and paste it into the "Save to Google Sheets" node

#### Step 3: Set Up the Scraping Service
You need to create a scraping API that the workflow calls. Here are your options:

Option A: Use your existing Python script. Create a simple Flask API wrapper (the price fields below must be parsed out of your script's output):

```python
# api_wrapper.py
from flask import Flask, request, jsonify
import subprocess

app = Flask(__name__)

@app.route('/scrape/<platform>', methods=['POST'])
def scrape(platform):
    data = request.json
    query = f"{data['checkIn']} to {data['checkOut']}, {data['hotel']}, {data['city']}"
    try:
        result = subprocess.run(
            ['python3', 'price_scrap_2.py', query, platform],
            capture_output=True, text=True, timeout=30
        )
        # Parse your script output here; extracted_price and booking_url
        # must be pulled out of `output` by your own parsing logic.
        output = result.stdout
        return jsonify({
            'price': extracted_price,
            'currency': 'USD',
            'roomType': 'Standard Room',
            'url': booking_url,
            'availability': True
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

Deploy:

```
pip install flask
python api_wrapper.py
```

Then update the n8n HTTP Request nodes:
- URL: http://your-server-ip:5000/scrape/booking
- URL: http://your-server-ip:5000/scrape/agoda
- URL: http://your-server-ip:5000/scrape/expedia

Option B: Use third-party scraping services.

Recommended services:
- **ScraperAPI** (scraperapi.com) - $49/month for 100k requests
- **Bright Data** (brightdata.com) - pay as you go
- **Apify** (apify.com) - has pre-built hotel scrapers

Example with ScraperAPI (in the HTTP Request node):
- URL: http://api.scraperapi.com
- Query parameters: api_key: YOUR_API_KEY, url: https://booking.com/search?hotel={{$json.hotelName}}...
Option C: Use the n8n SSH node (like your original). Keep your SSH approach but improve it:
1. Replace the HTTP Request nodes with SSH nodes
2. Point them at the server with the Python script
3. Ensure error handling and timeouts

SSH node configuration:
- Host: your-server-ip
- Command: python3 /path/to/price_scrap_2.py "{{$json.hotelName}}" "{{$json.city}}" "{{$json.checkInISO}}" "{{$json.checkOutISO}}" "booking"

#### Step 4: Activate the Webhook
1. Click on the "Webhook - Receive Request" node
2. Click "Listen for Test Event"
3. Copy the webhook URL (e.g., https://your-n8n.com/webhook/hotel-price-check)
4. Test with this curl command:

```
curl -X POST https://your-n8n.com/webhook/hotel-price-check \
  -H "Content-Type: application/json" \
  -d '{
    "message": "I want to check Marriott Hotel in Singapore from 15th March to 18th March",
    "email": "user@example.com",
    "name": "John Doe"
  }'
```

#### Step 5: Activate the Workflow
Toggle the workflow to Active. The webhook is now live and ready to receive requests.

### 📝 Usage Examples

Example 1: basic query

```json
{
  "message": "Hilton Hotel in Dubai from 20th December to 23rd December",
  "email": "traveler@email.com",
  "name": "Sarah"
}
```

Example 2: flexible format

```json
{
  "message": "I need prices for Taj Hotel, Mumbai. Check-in: 5th January, Check-out: 8th January",
  "email": "customer@email.com"
}
```

Example 3: short format

```json
{
  "message": "Hyatt Singapore March 10 to March 13",
  "email": "user@email.com"
}
```

### 🎨 Customization Options

1. Add more booking platforms:
   - Duplicate an existing "Scrape" node
   - Update the platform parameter
   - Connect it to "Aggregate & Compare"
   - Update the aggregation logic to include the new platform

2. Change the email template by editing the "Format Email Report" node's JavaScript:
   - Modify the HTML structure
   - Change colors (currently a purple gradient)
   - Add your company logo
   - Include terms and conditions

3. Add SMS notifications using Twilio:
   - Add a new node: Twilio → Send SMS
   - Connect it after "Aggregate & Compare"
   - Format: "Best deal: ${hotel} at ${platform} for ${price}"

4. Add Slack integration:
   - Add a Slack node after "Aggregate & Compare"
   - Send to a #travel-deals channel
   - Include quick booking links

5. Implement caching with Redis or n8n's built-in cache:

```javascript
// Before scraping, check the cache ($cache stands in for your cache client).
const cacheKey = `${hotelName}-${city}-${checkIn}-${checkOut}`;
const cached = await $cache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < 3600000) {
  return cached.data; // use a 1-hour cache
}
```

### 📊 Analytics & Monitoring

Google Sheets dashboard: the workflow automatically logs to Google Sheets. Create a dashboard with these metrics:
- Total searches per day/week
- Most searched hotels
- Most searched cities
- Average price ranges
- Platform with the best prices (frequency)
- User engagement (repeat users)

Example sheet formulas:

```
// Total searches today
=COUNTIF(A:A, TODAY())

// Most popular hotel
=INDEX(C:C, MODE(MATCH(C:C, C:C, 0)))

// Average best price
=AVERAGE(G:G)
```

Set up alerts by adding a node after "Aggregate & Compare":

```javascript
// Alert if prices are unusually high
if (bestDeal.price > avgPrice * 1.5) {
  // Send an alert to the admin
  return [{ json: { alert: true, message: `High prices detected for ${hotelName}` } }];
}
```

### 🛡️ Error Handling
The workflow includes comprehensive error handling:
1. **Missing information**: If the user doesn't provide hotel/city/dates, it responds with a helpful prompt
2. **Scraping failures**: If all platforms fail, it sends a "No results" email with suggestions
3. **Partial results**: If some platforms work, it shows the available results and notes the errors
4. **Email delivery issues**: Uses continueOnFail: true to prevent workflow crashes

### 🔒 Security Best Practices
1. Rate limiting: add rate limiting to prevent abuse:

```javascript
// In the Parse & Validate node ($cache stands in for your cache client).
const userEmail = $json.email;
const recentSearches = await $cache.get(`searches:${userEmail}`);
if (recentSearches && recentSearches.length > 10) {
  return [{ json: {
    status: 'rate_limited',
    response: 'Too many requests. Please try again in 1 hour.'
  } }];
}
```

2. Input validation: already implemented; validates hotel names, cities, and dates.

3. Email verification: add email verification before first use:

```javascript
// $sendEmail stands in for your email-sending node or helper.
const code = Math.random().toString(36).substring(7);
await $sendEmail({
  to: userEmail,
  subject: 'Verify your email',
  body: `Your code: ${code}`
});
```

4. API key protection: never expose scraping API keys in responses or logs.

### 🚀 Deployment Options

Option 1: n8n Cloud (easiest)
1. Sign up at n8n.cloud
2. Import the workflow
3. Configure credentials
4. Activate

Pros: no maintenance, automatic updates. Cons: monthly cost.

Option 2: Self-hosted (most control)

Using Docker:

```
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n
```

Using npm:

```
npm install -g n8n
n8n start
```

Pros: free, full control. Cons: you manage updates.

Option 3: Cloud platforms
- Railway.app (recommended for beginners)
- DigitalOcean App Platform
- AWS ECS
- Google Cloud Run

### 📈 Scaling Recommendations

For < 100 searches/day:
- The current setup is perfect
- Use n8n Cloud Starter or a small VPS

For 100-1000 searches/day:
- Add Redis caching (1-hour cache)
- Use a queue system for scraping
- Upgrade to n8n Cloud Pro

For 1000+ searches/day:
- Implement a job queue (Bull/Redis)
- Use a dedicated scraping service
- Load-balance multiple n8n instances
- Consider a microservices architecture

### 🐛 Troubleshooting

Issue: webhook not responding. Solution:
- Check that the workflow is Active
- Verify the webhook URL is correct
- Check the n8n logs: Settings → Log Streaming

Issue: no prices returned. Solution:
- Test the scraping endpoints individually
- Check whether the hotel name matches exactly
- Verify the dates are in the future
- Try different date ranges

Issue: emails not sending. Solution:
- Verify the SMTP credentials
- Check the "less secure apps" setting (Gmail)
- Use an App Password instead of your regular password
- Check the spam folder

Issue: slow response times. Solution:
- Enable parallel scraping (already configured)
- Add timeout limits (30 seconds recommended)
- Implement caching
- Use a faster scraping service
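For reference, a minimal sketch of the kind of extraction the Parse & Validate node performs on the free-text message, assuming queries shaped like the usage examples above. The actual node may use a different pattern or an LLM.

```javascript
// Parses queries like "Hilton Hotel in Dubai from 20th December to 23rd December".
function parseQuery(message) {
  const m = message.match(
    /(?<hotel>.+?)\s+in\s+(?<city>.+?)\s+from\s+(?<checkIn>.+?)\s+to\s+(?<checkOut>.+)/i
  );
  if (!m) return { status: "missing_info" }; // triggers the helpful prompt
  const { hotel, city, checkIn, checkOut } = m.groups;
  return { status: "ok", hotel: hotel.trim(), city: city.trim(), checkIn, checkOut };
}

console.log(parseQuery("Hilton Hotel in Dubai from 20th December to 23rd December"));
// -> { status: "ok", hotel: "Hilton Hotel", city: "Dubai", ... }
```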
by Intuz
This n8n template from Intuz provides a complete and automated solution for scaling your DevOps practices across multiple repositories.

Are you tired of the repetitive dance between git push, creating a pull request in GitHub, updating the corresponding task in JIRA, and then manually notifying your team in Slack or Notion? This template puts your entire post-commit workflow on autopilot, creating a seamless and intelligent bridge between your code and your project management. By embedding specific keywords and a JIRA issue ID in your git commit messages, this workflow automatically creates a pull request in the correct GitHub repository and updates the corresponding JIRA ticket (see the commit-parsing sketch below). This creates a complete, centralized system that keeps all your projects synchronized, providing a massive efficiency boost for teams managing a diverse portfolio of codebases.

### Who this template is for
This template is a must-have for any organization looking to streamline its software development lifecycle (SDLC). It's perfect for:
- **Development teams**: Eliminate tedious, manual tasks and enforce a consistent workflow, allowing developers to stay focused on coding.
- **DevOps engineers**: A ready-to-deploy solution that integrates key developer tools without weeks of custom scripting.
- **Engineering managers & team leads**: Gain real-time visibility into development progress and ensure processes are followed without constant check-ins.
- **Project managers**: Get accurate, automatic updates in JIRA the moment development work is completed, improving project tracking and forecasting.

### Step-by-step setup instructions
Follow these steps carefully to configure the workflow for your environment.

1. Connect your tools (credentials):
   - GitHub: Create credentials with repo scope to allow PR creation.
   - JIRA: Create an API token and connect your JIRA Cloud or Server instance.
   - Slack: Connect your Slack workspace using OAuth2.
   - Notion: Connect your Notion integration token.

2. Configure the GitHub webhook (for each repository). This workflow is triggered by a GitHub webhook; you must add it to every repository you want to automate.
   - First, save and activate the n8n workflow to ensure the webhook URL is live.
   - In the n8n workflow, copy the Production URL from the Webhook node.
   - Go to your GitHub repository and navigate to Settings > Webhooks > Add webhook.
   - In the Payload URL field, paste the n8n webhook URL.
   - Change the Content type to application/json.
   - Under "Which events would you like to trigger this webhook?", select "Just the push event."
   - Click "Add webhook."
   - Repeat this for all relevant repositories.

3. Configure the JIRA nodes (crucial step). Your JIRA project has unique IDs for its statuses, so you must update the workflow to match yours.
   - Find the two JIRA nodes named "Update task status after PR" and "Update the task status without PR."
   - In each node, go to the Status ID field.
   - Click the dropdown and select the status that corresponds to "Done" or "Development Done" in your specific JIRA project workflow. The list is fetched directly from your connected JIRA instance.

4. Configure the notification nodes to tell the workflow where to send updates.
   - For Slack: Open the two nodes named "Send message in slack..." and select your desired channel from the Channel ID dropdown.
   - For Notion: Open the two nodes named "Append a block in notion..." and paste the URL of the target Notion page or database into the Block ID field.

5. Final activation: once all configurations are complete, ensure the workflow is saved and the toggle switch is set to Active.
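For orientation, here is a minimal sketch of what the "Commit Message Breakdown" (Code) node might do, assuming JIRA issue IDs like DEV-123 embedded in the commit message. The template's actual regex and trigger keywords may differ, and the `create-pr` keyword below is a hypothetical stand-in.

```javascript
function breakdownCommit(message) {
  const issueMatch = message.match(/\b([A-Z][A-Z0-9]+-\d+)\b/); // e.g. DEV-123
  const createPr = /\bcreate-pr\b/i.test(message);              // assumed keyword
  return {
    jiraIssueId: issueMatch ? issueMatch[1] : null,
    createPr,
  };
}

console.log(breakdownCommit("DEV-123 create-pr: fix login redirect"));
// -> { jiraIssueId: "DEV-123", createPr: true }
```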
You are now ready to automate!

### Customization guidance
This template is a powerful foundation. Here's how you can adapt it to your team's specific needs.

1. Changing the PR title or body: Go to the "Request to create PR" (HTTP Request) node. In the JSON Body field, you can edit the title and body expressions. For example, you could add the committer's name ({{ $('Webhook').item.json.body.pusher.name }}) or a link back to the JIRA task.

2. Adapting to a fixed branching strategy: If your team always creates pull requests against a single branch (e.g., develop), you can simplify the workflow. In the "Request to create PR" node, change the base value in the JSON body from {{...}} to your static branch name: "base": "develop". You can then remove the base-branch logic from the "Commit Message Breakdown" (Code) node.

3. Modifying notification messages: The text sent to Slack and Notion is fully customizable. Open any of the Slack or Notion nodes and edit the text fields. You can include any data from previous nodes, such as the PR URL ({{ $('Request to create PR').item.json.body.html_url }}) or the repository name.

4. Adjusting the commit regex for different conventions: This is an advanced customization. If your team uses a different commit format (e.g., (DEV-123) instead of DEV-123), you can edit the regular expression in the "Commit Message Breakdown" (Code) node. Be sure to test your changes carefully.

5. Adding/removing notification channels: Don't use Notion? Simply delete the two Notion nodes. Want to send an email instead? Add a Gmail or SMTP node in parallel with the Slack node and configure it with the same data.

### Connect with us
- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started