by Ruslan Elishev
🤖 Telegram Bot with Dynamic Menus & Rating System

## What It Does

This n8n workflow creates an interactive Telegram bot with:

- Dynamic inline keyboards that respond to user clicks
- A 5-star rating system for collecting feedback
- Personalized responses using the user's actual name
- Multi-level menu navigation (Main → Settings → Profile, etc.)
- Real-time message updates when buttons are clicked

## How It Works

1. Receives messages via the Telegram webhook trigger node
2. Extracts user data (name, ID, message type)
3. Builds dynamic menus based on user actions
4. Sends/updates messages with inline keyboards
5. Handles button clicks without a page refresh

## 🚀 Setup Instructions

1. **Get your bot credentials.**
2. **Configure the workflow.** Open the "Set Bot Token" node, replace the token with yours, then save and activate the workflow (Active).
3. **Test your bot.** Message your bot on Telegram, click the buttons to navigate the menus, and try the rating system on Feature 1.

## 🎨 Customization Guide

### Add New Menu Items

In the "Prepare Response" Function node, add new cases:

```javascript
case 'your_feature':
  responseText = 'Your feature description';
  keyboard = [
    [{ text: '🎯 Button 1', callback_data: 'action1' }],
    [{ text: '🔙 Back', callback_data: 'main' }]
  ];
  break;
```

### Modify Rating Options

Change the star buttons to numbers or emojis:

```javascript
// Current: ⭐⭐⭐
// Alternative: 1️⃣ 2️⃣ 3️⃣ or 👎 👍
```

### Change Bot Responses

- Edit `responseText` for message content
- Modify the `keyboard` arrays for button layout
- Add HTML formatting: `<b>bold</b>`, `<i>italic</i>`

## 💡 Key Features Demonstrated

- HTTP Request workaround for dynamic keyboards (an n8n Telegram node limitation)
- Callback query handling to prevent loading animations
- Message editing vs. sending new messages
- User data extraction from the Telegram API
- Switch-case menu routing for scalable navigation

## ⚠️ Important Notes

- **Limitation:** n8n's native Telegram node doesn't support dynamic inline keyboards, which is why this workflow uses HTTP Request nodes.
- **Solution demonstrated:** call the Telegram Bot API directly from an HTTP Request node.
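The HTTP Request workaround above boils down to POSTing the right JSON body to Telegram's documented `sendMessage` endpoint. A minimal sketch of building that body (the chat ID and button rows below are placeholders; your real values come from the trigger data and the "Set Bot Token" node):

```javascript
// Build the JSON body that the HTTP Request node POSTs to
// https://api.telegram.org/bot<TOKEN>/sendMessage.
// The keyboard argument is an array of button rows, matching the
// shape used in the "Prepare Response" Function node.
function buildSendMessagePayload(chatId, responseText, keyboard) {
  return {
    chat_id: chatId,
    text: responseText,
    parse_mode: 'HTML',                          // enables <b>bold</b>, <i>italic</i>
    reply_markup: { inline_keyboard: keyboard }, // the dynamic keyboard
  };
}

// Example: a two-row main menu (placeholder chat ID).
const payload = buildSendMessagePayload(123456789, 'Main menu', [
  [{ text: '⚙️ Settings', callback_data: 'settings' }],
  [{ text: '⭐ Rate Feature 1', callback_data: 'rate_1' }],
]);
```

The same body shape works for `editMessageText` when updating an existing message after a button click; add the `message_id` taken from the callback query.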
by Nima Salimi
🚀 Automated Daily SERP Rank Tracker for SEO Specialists (Google Sheets + DataForSEO)

## Overview 🌐

This workflow automates your daily keyword rank tracking 🔍 using DataForSEO ⚙️ and Google Sheets 📊. It pulls live Google Search results for each keyword in your list, extracts key details (query, rank, domain, date), and appends them to your Google Sheet automatically. 📆 You'll have a complete daily snapshot of your keyword positions — no manual checks needed.

Built for SEO professionals, digital marketers, and agencies, this workflow helps you centralize ranking data, build trend dashboards, and automate reporting workflows.

## 👤 Who's it for?

- 🧠 SEO specialists tracking daily keyword performance
- 📈 Marketing teams managing multiple websites
- 💼 Agencies providing automated ranking reports for clients
- 💻 Growth teams who want rank tracking data for dashboards or AI tools

## ⚙️ How to Set Up

1. **Connect your Google Sheet.** Use this template sheet 👉 Google Sheet Example. Make sure it has a `query` column containing your target keywords.
2. **Set up DataForSEO credentials.** Create an account at dataforseo.com and add your API credentials to the HTTP Request node or DataForSEO node.
3. **Customize location & language.** In the "Fetch SERP Data (DataForSEO API)" node:
   - `location_code: 2840` → 🇺🇸 United States (changeable)
   - `language_code: en` → 🇬🇧 English (changeable)
4. **Format the date.** The "Add Timestamp & Prepare Output" node converts timestamps into `YYYY-MM-DD` format automatically 🗓️
5. **Run or schedule.** Trigger manually for tests, or enable the daily Schedule Trigger to automate it.
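A sketch of what the "Add Timestamp & Prepare Output" step can look like in a Code node. The response shape (`tasks` → `result` → `items`, with `rank_absolute` and `domain` fields) is assumed from DataForSEO's live organic SERP endpoint; verify it against your actual payload before relying on it:

```javascript
// Flatten a DataForSEO SERP response into { query, rank, domain, date }
// rows ready for the Google Sheets append node.
function toSheetRows(apiResponse, runDate = new Date()) {
  const date = runDate.toISOString().slice(0, 10); // YYYY-MM-DD
  const rows = [];
  for (const task of apiResponse.tasks || []) {
    for (const result of task.result || []) {
      for (const item of result.items || []) {
        if (item.type !== 'organic') continue; // skip ads, maps, snippets
        rows.push({
          query: result.keyword,
          rank: item.rank_absolute,
          domain: item.domain,
          date,
        });
      }
    }
  }
  return rows;
}
```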
Results append directly to your Google Sheet after each run ✅

## 📊 Example Output

| query | rank | domain | date |
|--------|-------|-----------------------|------------|
| cloud host | 1 | cloudhost.one | 2025-10-24 |
| cloud host | 2 | cloud.google.com | 2025-10-24 |
| cloud hosting | 1 | cloud.google.com | 2025-10-24 |
| cloud hosting | 2 | aws.amazon.com | 2025-10-24 |
| cloud hosting | 3 | www.hostinger.com | 2025-10-24 |

📈 Use this data to build trend charts, compare historical performance, or connect to Looker Studio for automated dashboards.

## 🧩 Workflow Highlights

- 🕒 **Automated Daily Runs** – via Schedule Trigger
- 🔍 **Accurate SERP Data** – powered by the DataForSEO API
- 📄 **Dynamic Keyword Input** – read directly from Google Sheets
- 📊 **Historical Tracking** – appends new data each day
- 🌎 **Regional Customization** – change language and location easily
- 🧠 **AI-Ready** – integrate GPT or AI nodes for insights or summaries

## 💡 Pro Tips

- Add a Slack or Gmail alert node for position drops or gains 📬
- Combine with NocoDB or Airtable for more advanced data storage
- Expand with DataForSEO Labs endpoints for keyword difficulty, CPC, or SERP features

## 📺 Check Out My Channel

💬 Learn more about SEO Automation & n8n Workflows

👉 Connect with me on LinkedIn: linkedin.com/in/nima-salimi-a655a6231

Follow for more workflow templates, AI integrations, and SEO automation tutorials 💥
by WeblineIndia
📝 Compliance Report Collector (Google Form → Drive + MySQL)

This n8n workflow automates the collection and archival of compliance reports submitted via Google Forms. Uploaded documents (PDF, DOCX, etc.) are archived in Google Drive, and submission metadata is logged to a MySQL database. It ensures compliance documentation is properly stored, searchable, and auditable without manual effort.

## ⚡ Quick Implementation Steps

1. Import the JSON file into n8n.
2. Set up a Google Form to POST the file + metadata (reporter, category, etc.) to the `/submit-report` webhook.
3. Update the Set Config node with your:
   - MySQL connection details
   - Google Drive folder ID
4. Deploy and test a form submission with a file upload.

Each report is stored in Drive and logged to your DB.

## 🎯 Who's It For

- Compliance officers handling environmental or safety reports.
- Admins managing documentation for inspections.
- Renewable energy companies required to maintain audit-ready records.
- Any org needing structured report archival & metadata logging.

## 🛠 Requirements

| Tool | Purpose |
|------|---------|
| n8n Instance | Workflow automation |
| Google Drive | To archive uploaded reports |
| MySQL Database | To log submission metadata |
| Google Forms / HTML Form | Report submission source |

## 🧠 What It Does

1. Listens for incoming POST requests with a file and metadata.
2. Uploads the file to a specified Google Drive folder.
3. Extracts metadata such as:
   - Reporter name
   - Category/type
   - Timestamp
   - File name, MIME type
4. Logs that metadata into a MySQL table for auditing or reporting.

## 🧾 Sample MySQL Table Schema

```sql
CREATE TABLE report_logs (
  id INT AUTO_INCREMENT PRIMARY KEY,
  reporter VARCHAR(100),
  category VARCHAR(100),
  timestamp DATETIME,
  file_name VARCHAR(255),
  mime_type VARCHAR(50),
  folder_id VARCHAR(100)
);
```

## 🔧 How To Set Up – Step-by-Step

1. Import the JSON into n8n.
2. Configure the following in the Set Config node:
   - MySQL: `dbHost`, `dbUser`, `dbPassword`, `dbName`, `dbTable`
   - Google Drive: `driveFolderId`
3. Update the webhook URL in your Google Form (via Apps Script or middleware).
4. Test a submission with a file upload. Confirm:
   - The file lands in your Drive folder
   - A log entry appears in your database

## ✨ How To Customize

| Customization | How |
|---------------|-----|
| Add more form fields | Extend the metadata mapping in the Function node |
| Rename files before upload | Modify the filename in the Google Drive node |
| Add email confirmation | Add an Email Send node after the DB insert |
| Filter file types | Add an IF node before upload to validate the MIME type |

## ➕ Add-ons (Optional Extensions)

| Add-on | Description |
|--------|-------------|
| 📤 Email Acknowledgment | Email the sender a confirmation with the Drive link |
| 🧾 PDF Parser | Auto-parse content using PDF.co or OpenAI |
| 📊 Admin Dashboard | Display logs in Supabase or Metabase |
| 🗃 File Backup | Copy files to Dropbox or S3 after the Drive upload |

## 📈 Use Case Examples

- Collect monthly safety audits from plant staff into a Drive archive.
- Accept vendor compliance declarations via Google Form and auto-log them to the DB.
- Capture field inspection reports and tag them by category for audit.
- Store weekly environmental reports for long-term access.

## 🧯 Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| File not uploaded | Drive ID is invalid | Check permissions and folder ID |
| DB not logging | Connection or table issue | Verify DB credentials and schema |
| Webhook not triggered | Form not integrated correctly | Ensure the form POSTs the file to the n8n webhook |
| Wrong file type | MIME mismatch | Validate acceptable types via a Function/IF node |

## 📞 Need Help?

Want to integrate this with audit dashboards or add Google Sheets exports? 👉 Contact WeblineIndia — experts in compliance automation and renewable energy workflows.
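To illustrate the metadata extraction described above, here is a hypothetical sketch of a Function-node mapping from an incoming `/submit-report` webhook item to the columns of `report_logs`. The field names (`reporter`, `category`) and the binary property name (`file`) are assumptions; match them to your actual form payload:

```javascript
// Map one webhook item to a row for the MySQL insert node.
function extractMetadata(webhookItem, driveFolderId) {
  const body = webhookItem.body || {};
  const file = (webhookItem.binary || {}).file || {};
  return {
    reporter: body.reporter || 'unknown',
    category: body.category || 'uncategorized',
    // MySQL DATETIME format: "YYYY-MM-DD HH:MM:SS"
    timestamp: new Date().toISOString().slice(0, 19).replace('T', ' '),
    file_name: file.fileName || 'upload.bin',
    mime_type: file.mimeType || 'application/octet-stream',
    folder_id: driveFolderId,
  };
}
```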
by Dominic Spatz
## 🔄 Purpose of the Workflow

The Update-N8N workflow automatically triggers a (Portainer) webhook to update an n8n container, but only if a new version of n8n is available.

## ⚙️ Detailed Workflow Steps

1. 🕒 **Scheduled Trigger**: The workflow runs every 16 hours at minute 8 using a Schedule Trigger node.
2. 🌐 **Fetch Latest n8n Version**: It sends an HTTP GET request to `https://registry.npmjs.org/n8n/latest` to retrieve the latest published n8n version from the npm registry.
3. 📈 **Get Currently Running Local Version**: Another HTTP GET request is sent to `https://127.0.0.1/metrics` (likely the Prometheus metrics endpoint of the local n8n instance) to extract the currently installed n8n version.
4. 🧠 **Version Comparison**: The workflow compares the local version (parsed from the metrics) with the latest available version.
5. 📬 **Trigger Portainer Webhook**: If the versions do not match (i.e., an update is available), a POST request is sent to a webhook URL, which might be a Portainer webhook that redeploys or updates the n8n container/stack.

## ✅ Key Benefits

- No manual checks or updates needed.
- Triggers only when a new version is available.
- Integrates seamlessly with Portainer via webhook.
- Secure configuration, e.g., rejecting unauthorized TLS certificates for external requests.
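The version-comparison step can be as simple as parsing the version label out of the metrics text and checking for inequality. A sketch, assuming the instance exposes an `n8n_version_info`-style metric with a `version` label (verify the metric name on your own `/metrics` output):

```javascript
// Pull the installed version out of the Prometheus metrics text, e.g.
//   n8n_version_info{version="1.64.0",...} 1
function parseLocalVersion(metricsText) {
  const match = metricsText.match(/version="v?([\d.]+)"/);
  return match ? match[1] : null;
}

// Fire the Portainer webhook only when the versions differ.
function updateAvailable(localVersion, latestVersion) {
  return localVersion !== null && localVersion !== latestVersion;
}
```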
by Zane
This workflow automates a batch upload of multiple videos to YouTube, spacing each upload 12 hours apart in Japan Standard Time (UTC+9) and automatically adding them to a playlist.

## ⚙️ Workflow Logic

1. **Manual Trigger**: Starts the workflow manually.
2. **List Video Files**: Uses a shell command to find all `.mp4` files under the specified directory (`/opt/downloads/单词卡/A1-A2`).
3. **Sort and Generate Items**: Sorts videos by day number (`dayXX`) extracted from filenames and assigns a sequential `order` value.
4. **Calculate Publish Schedule (+12h Interval)**: Computes the next rounded JST hour plus a configurable buffer (default 30 min), staggers each video's scheduled time by `order × 12` hours, and converts JST back to UTC for YouTube's `publishAt` field.
5. **Split in Batches (1 per video)**: Iterates over each video item.
6. **Read Video File**: Loads the corresponding video from disk.
7. **Upload to YouTube (Scheduled)**: Uploads the video privately with the computed `publishAtUtc`.
8. **Add to Playlist**: Adds the newly uploaded video to the target playlist.

## 🕒 Highlights

- **Timezone-safe:** Pure UTC ↔ JST conversion avoids double-offset errors.
- **Sequential scheduling:** Ensures each upload is 12 hours apart to prevent clustering.
- **Customizable:** Change `SPAN_HOURS`, `BUFFER_MIN`, or directory paths easily.
- **Retry-ready:** Each upload and playlist step has retry logic to handle transient errors.

## 💡 Typical Use Cases

- Multi-part educational video series (e.g., A1–A2 English learning).
- Regular content release cadence without manual scheduling.
- Automated YouTube publishing pipelines for pre-produced content.

Author: Zane
Category: Automation / YouTube / Scheduler
Timezone: JST (UTC+09:00)
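The schedule computation described above can be sketched as follows. This is a minimal reconstruction of the idea (treat JST as a fixed UTC+9 offset, round up to the next full JST hour, add the buffer, stagger by `order × SPAN_HOURS`, convert back to UTC), not the template's exact Code node:

```javascript
const SPAN_HOURS = 12;   // spacing between uploads
const BUFFER_MIN = 30;   // buffer past the next full hour
const JST_OFFSET_MS = 9 * 60 * 60 * 1000; // JST is UTC+9, no DST

function publishAtUtc(order, now = new Date()) {
  // Shift "now" into JST, then round up to the next full hour.
  const jst = new Date(now.getTime() + JST_OFFSET_MS);
  jst.setUTCMinutes(0, 0, 0);
  jst.setUTCHours(jst.getUTCHours() + 1);
  // Add the buffer and stagger by order × 12 h.
  const scheduledJst = jst.getTime() + BUFFER_MIN * 60 * 1000
    + order * SPAN_HOURS * 60 * 60 * 1000;
  // Convert the JST wall-clock time back to UTC for YouTube's publishAt.
  return new Date(scheduledJst - JST_OFFSET_MS).toISOString();
}
```

Doing all arithmetic in epoch milliseconds with an explicit fixed offset is what makes the conversion "timezone-safe": the server's local timezone never enters the calculation.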
by Yaron Been
This workflow provides automated access to the Fofr Any ComfyUI Workflow AI model through the Replicate API. It saves you time by eliminating the need to interact with the AI model manually and provides seamless integration of generation tasks into your n8n automation workflows.

## Overview

This workflow automatically handles the complete generation process using the Fofr Any ComfyUI Workflow model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

**Model Description:** Run any ComfyUI workflow. Guide: https://github.com/replicate/cog-comfyui

## Key Capabilities

- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

## Tools Used

- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the fofr/any-comfyui-workflow AI model
- **Fofr Any ComfyUI Workflow**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

## How to Install

1. **Import the Workflow**: Download the `.json` file and import it into your n8n instance
2. **Configure the Replicate API**: Add your Replicate API token to the 'Set API Token' node
3. **Customize Parameters**: Adjust the model parameters in the 'Set Other Parameters' node
4. **Test the Workflow**: Run the workflow with your desired inputs
5. **Integrate**: Connect this workflow to your existing automation pipelines

## Use Cases

- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

## Connect with Me

- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
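Under the hood, the workflow's HTTP calls follow Replicate's public predictions API. A rough sketch of the request the 'Set API Token' and parameter nodes feed into (the version hash and the `workflow_json` input name are placeholders; take the real values from the model's page on Replicate and your own ComfyUI export):

```javascript
// Build the POST request for https://api.replicate.com/v1/predictions.
function buildPredictionRequest(apiToken, versionHash, workflowJson) {
  return {
    url: 'https://api.replicate.com/v1/predictions',
    method: 'POST',
    headers: {
      Authorization: `Token ${apiToken}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      version: versionHash,                  // model version from its Replicate page
      input: { workflow_json: workflowJson }, // assumed input name; check the model docs
    }),
  };
}

// Predictions run asynchronously: the workflow polls until a terminal state.
const TERMINAL_STATES = new Set(['succeeded', 'failed', 'canceled']);
const isFinished = (prediction) => TERMINAL_STATES.has(prediction.status);
```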
by Cameron Booth
This template demonstrates how to combine n8n, OpenAI agents, and the new Xano Node to build an intelligent support-ticket routing system — without writing a single API call. Start your Xano journey with the downloadable snippet here!

When a ticket arrives, the workflow:

1. Receives the ticket via Webhook
2. Classifies the issue using an n8n Agent with an OpenAI model
3. Searches Xano to check whether the user already exists
4. Creates or updates records using the native Xano Node (no headers or manual HTTP setup)
5. Triggers backend logic in Xano, where escalation rules and agent workflows process the ticket
6. Returns a structured response to n8n for further routing (Slack, CRM, inbox, etc.)

This template highlights how Xano can act as your backend intelligence layer while n8n orchestrates everything else — making it easy to automate support operations, apply escalation policies, and unify your data across tools.

Use this as a foundation to build more advanced automation: customer enrichment, billing checks, account risk detection, SLA enforcement, and more.

Happy building! 🚀
by InfraNodus
Build an embeddable AI chatbot with access to a knowledge base

This is an example of a simple AI chatbot that has access to external knowledge to augment its responses. The knowledge can be added manually or imported from multiple sources (text and PDF files, websites, CSVs, Google search results, AI-generated content, YouTube search results, RSS feeds, etc.) using InfraNodus.

- No OpenAI account needed
- No vector store needed
- Easy data import: PDF, text, CSV, Google / YouTube results, RSS feeds, websites, or AI-generated content

## How it works

First, you add your data to your InfraNodus graph — this will be your knowledge base. You can import this data from multiple sources or add it manually. A visual interface shows the main concepts and topics in your knowledge base, so you can get an overview of its structure and know how to improve it, if necessary. Your data is represented as a knowledge graph that contains information about the relations and topical clusters in your data, making the LLM responses much more precise.

## How to use

1. Copy the template
2. Add your InfraNodus API key to the HTTP AI response node
3. Create a new graph in InfraNodus with your data (or import it from an external source)
4. Add the name of this graph to the name field of the AI response HTTP node

That's it! You can query it using the embeddable web form available via a URL.

## Requirements

You only need an InfraNodus account to set this workflow up. Free 14-day trials are available.
by Robert Breen
Eventbrite → Pipedrive Lead-Sync

Bring your Eventbrite attendee data into Pipedrive automatically — no spreadsheets, CSVs, or manual uploads.

## 🚀 What the Workflow Does

- **Polls Eventbrite** on a schedule (default 30 min) for **new registrations**.
- **Creates or updates** matching **Person** and **Deal** records in Pipedrive.
- **Deduplicates** by email and stores a timestamp so each attendee is processed only once.
- **Easily configurable** field mapping lets you decide exactly which attendee data lands in Pipedrive.

## 📋 Key Features

| Feature | Benefit |
|---------|---------|
| Incremental Sync | Processes only registrations created since the last run. |
| Person + Deal Linking | Keeps contacts and sales opportunities in one place. |
| No Community Nodes | 100% official n8n nodes—simple to import and run. |
| Fully Editable Code Node | Swap your Eventbrite token, organization ID, and field mappings in seconds. |

## 🔑 Prerequisites

- **Eventbrite Personal OAuth Token**
- **Eventbrite Organization ID**
- **Pipedrive API Token**
- n8n 1.25 or later

## 🛠 Quick Start

1. Import the workflow JSON.
2. Open the Code node → paste your Eventbrite token and organization ID.
3. Add your Eventbrite and Pipedrive credentials in their respective nodes.
4. Activate the workflow and watch new registrants appear in Pipedrive within minutes.

## Contact

- **Email:** rbreen@ynteractive.com
- **Website:** https://ynteractive.com
- **YouTube:** https://www.youtube.com/@ynteractivetraining
- **LinkedIn:** https://www.linkedin.com/in/robertbreen
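The incremental sync and email deduplication can be sketched roughly like this. It is a simplified model, not the template's actual Code node: the attendee fields (`created`, `profile.email`) mirror Eventbrite's attendee objects, and persisting state between runs would use n8n's `getWorkflowStaticData()` in place of the plain `state` object shown here:

```javascript
// Keep only attendees created since the last run whose email hasn't
// been processed before; update the state for the next run.
function filterNewAttendees(attendees, state) {
  const lastRun = state.lastRun ? new Date(state.lastRun) : new Date(0);
  const seen = new Set(state.seenEmails || []);
  const fresh = attendees.filter((a) => {
    const created = new Date(a.created);
    const email = ((a.profile && a.profile.email) || '').toLowerCase();
    if (created <= lastRun || !email || seen.has(email)) return false;
    seen.add(email); // also dedupes within the current batch
    return true;
  });
  state.lastRun = new Date().toISOString();
  state.seenEmails = [...seen];
  return fresh;
}
```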
by Zakwan
This workflow is a user-friendly tool that automates the creation of high-quality advertising images for products. It takes a simple product image uploaded by a user and uses AI to transform it into a professional, photorealistic advertisement featuring a fashion model actively using the product. The final image is then made available for the user to download.

## Step-by-Step Breakdown

Here is a breakdown of the automated process:

1. **Form Submission:** The workflow is triggered by a public form. The user uploads a product image and selects a character model (male or female) from a dropdown menu.
2. **Image Processing:** The uploaded image file is extracted and prepared for the AI. This includes converting the binary file data into a format that the AI model can understand.
3. **AI Image Generation:** An HTTP request is sent to a large language model (Google's Gemini via OpenRouter). The request includes a prompt that combines the user's selected character model and the uploaded product image. The AI is instructed to generate a new, photorealistic image of the model using the product.
4. **Data Conversion:** The AI's output, a base64-encoded image string, is then processed. The workflow separates the image data from its metadata.
5. **Final Image Delivery:** The base64 data is converted back into a binary file, which is then provided to the user for automatic download via a completion form.
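The data-conversion step (splitting the image data from its metadata and turning base64 back into binary) can be sketched as below. This assumes the model returns a data-URL style string; some models return bare base64 instead, which the fallback branch handles:

```javascript
// Split a "data:image/png;base64,...." string into its MIME type and
// raw bytes, ready to attach as an n8n binary property for download.
function base64ToBinary(dataUrl) {
  const [meta, b64] = dataUrl.includes(',')
    ? dataUrl.split(',', 2)   // data-URL: metadata before the comma
    : ['', dataUrl];          // bare base64: no metadata present
  const mimeMatch = meta.match(/^data:([^;]+);base64$/);
  return {
    mimeType: mimeMatch ? mimeMatch[1] : 'image/png', // assumed default
    buffer: Buffer.from(b64, 'base64'),
  };
}
```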
by Arjan ter Heegde
n8n Placeholdarr for Plex (BETA)

This flow creates dummy files for every item added in your *Arrs (Radarr/Sonarr) with the tag `unprocessed-dummy`. It's useful for maintaining a large Plex library without needing the actual movies or shows to be present on your Debrid provider.

## How It Works

- When a dummy file is played, the corresponding item is automatically monitored in *Arr and added to the download queue. This ensures that the content becomes available for playback within ~3 minutes.
- If the content finishes downloading while the dummy is still being played, Tautulli triggers a webhook that stops the stream and notifies the user.

## Requirements

- Each n8n node must have the correct URL and authorization headers configured.
- The SSH host (used to create the dummy files) must have FFmpeg installed.
- A Trakt.TV API key is required if you're using Trakt collections.

## Warning

> ⚠️ This flow is currently in BETA and under active development.
> It is not recommended for users without technical experience.
> Keep an eye on the GitHub repository for updates: https://github.com/arjanterheegde/n8n-workflows-for-plex
by Khairul Muhtadin
Automatically monitor your keyword positions on Google every day. This workflow uses Decodo to pull live search results and logs them straight into Google Sheets, no manual checking needed.

## Why This Workflow?

- **Save Time:** No more spending an hour every morning checking rankings by hand. Everything runs on autopilot.
- **Save Money:** Skip expensive SEO tools that charge per keyword for basic rank tracking.
- **Accurate Data:** Every result — including meta descriptions and URLs — gets recorded without copy-paste mistakes.
- **Easy to Scale:** Want to add more keywords, switch countries, or change devices? Just tweak one config node.

## Who Is This For?

- **SEO Specialists:** Keep tabs on daily keyword movements for clients or internal projects.
- **Content Marketers:** See how your latest content stacks up against competitors for target keywords.
- **Digital Agencies:** Deliver automated ranking reports to clients without touching a spreadsheet manually.

## How It Works

1. **Scheduled Start:** The workflow kicks off every morning at 9:00 AM using the Schedule Trigger.
2. **Set Parameters:** The keyword, country code (e.g., "id"), language ("en"), and device type are defined upfront.
3. **Fetch Results:** The Decodo node runs a live Google search based on your settings.
4. **Filter & Validate:** The workflow strips out ads, maps, and snippets — keeping only organic results with valid URLs.
5. **Store Rankings:** The top results (e.g., Top 5) get appended to a "SERP_Results" tab in Google Sheets.
6. **Log Issues:** If something goes wrong (API timeout, empty results), errors are recorded in a separate "SERP_Errors" tab.

## What You Need

| Requirement | Purpose |
|-------------|---------|
| n8n instance | Platform to run the workflow |
| Decodo API | Pulls live Google Search data |
| Google Sheets | Stores ranking data and error logs |

## Setup Steps

1. Import the JSON file into your n8n instance.
2. Add your credentials:
   - **Decodo API:** Sign up at Decodo, grab your API key, and paste it into the Decodo Search node.
   - **Google Sheets:** Connect your Google account via OAuth2. Make sure your spreadsheet has two tabs: `SERP_Results` and `SERP_Errors`.
3. Select your spreadsheet: in the Write SERP Results and Write SERP Errors nodes, pick the spreadsheet you prepared.
4. Adjust your settings: open the Set Search Input node to change the keyword, country code, and how many results to track (`top_n`).
5. Run a test: click "Test Workflow" and confirm the data shows up in your Google Sheet.

## How the Logic Works

The workflow uses a Split Out + Extract Organic approach. Even when Google returns mixed results like ads, map packs, or featured snippets, this setup specifically targets organic blue links. The "Valid Row" check keeps your spreadsheet clean by only saving results that fall within your Top N threshold.

## Customization

**Quick Changes:**

- **Run Frequency:** Switch the Schedule Trigger to hourly, weekly, or whatever fits your needs.
- **Rank Depth:** Change the `top_n` value in the input node to track the Top 10, 20, or 50.

**Going Further:**

- **Slack Alerts:** Add a Slack node to notify your team when a keyword drops out of the Top 3.
- **Multiple Keywords:** Swap the "Set" node with a Google Sheets "Read" node to loop through hundreds of keywords at once.

## Real-World Examples

**Competitor Tracking for E-commerce**

- Problem: A retail brand needs to know who's ranking for "best running shoes" every day.
- Solution: This workflow checks the Top 5 results each morning, making it easy to spot when a competitor's blog post overtakes their product page.
- Outcome: The marketing team catches ranking drops within 24 hours and updates content right away.

**Local SEO Monitoring**

- Problem: A service business wants to track rankings specifically in Indonesia ("id") using English ("en") search settings.
- Solution: By setting the country and language parameters in the Set node, the workflow captures localized search data accurately.
- Outcome: The business gets a clear view of their local visibility without needing a VPN.
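The Extract Organic + Valid Row logic described above can be sketched roughly as follows. The result-item field names (`type`, `rank`, `url`, `title`, `description`) are assumptions about Decodo's response shape; adjust them to the real payload:

```javascript
// Keep only organic results with a usable URL, ranked within top_n,
// shaped as rows for the "Write SERP Results" sheet node.
function extractOrganicTopN(results, topN) {
  return results
    .filter((r) => r.type === 'organic'
      && typeof r.url === 'string'
      && r.url.startsWith('http'))          // drop ads, maps, snippets
    .filter((r) => r.rank >= 1 && r.rank <= topN) // the "Valid Row" check
    .map((r) => ({
      rank: r.rank,
      url: r.url,
      title: r.title || '',
      description: r.description || '',
    }));
}
```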
Created by: Khairul Muhtadin
Category: Marketing | Tags: SEO, SERP, Google Sheets, Decodo

Need custom workflows? Contact us.
Connect with the creator: Portfolio • Store • LinkedIn • Medium • Threads