by automedia
Transcribe New YouTube Videos and Save to Supabase

## Who's It For?

This workflow is for content creators, marketers, researchers, and anyone who needs to quickly get text transcripts from YouTube videos. If you analyze video content, repurpose it for blogs or social media, or want to make videos searchable, this template will save you hours of manual work.

## What It Does

This template automatically monitors multiple YouTube channels for new videos. When a new video is published, it extracts the video ID, retrieves the full transcript using the youtube-transcript.io API, and saves the structured data—including the title, author, URL, and full transcript—into a Supabase table. It filters out YouTube Shorts by default and includes error handling to ensure that only successful transcriptions are processed.

## Requirements

- A Supabase account with a table ready to receive the video data.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track. You can find this using a free online tool like TunePocket's Channel ID Finder.

## How to Set Up

1. **Add Channel IDs:** In the "Channels To Track" node, replace the example YouTube Channel IDs with your own. The workflow uses these IDs to build RSS feed links and find new videos.
2. **Configure API Credentials:** Find the "youtube-captions" HTTP Request node. In the credentials tab, create a new "Header Auth" credential named youtube-transcript-io. Set the "Name" field to x-api-key and paste your API key into the "Value" field.
3. **Connect Your Supabase Account:** Navigate to the "Add to Content Queue Table" node and create new credentials for your Supabase account using your Project URL and API key. Once connected, select your target table and map the incoming fields (title, source_url, content_snippet, etc.) to the correct columns in your table.
4. **Set Your Schedule (Optional):** The workflow starts with a manual trigger. To run it automatically, replace the "When clicking 'Execute workflow'" node with a Schedule Trigger node and set your desired interval (e.g., once a day).
5. **Activate the Workflow:** Save your changes and toggle the workflow to Active in the top-right corner.

## How to Customize

- **Transcribe YouTube Shorts:** To include Shorts in your workflow, select the "Does url exist?" IF node and delete the second condition, which checks for youtube.com/shorts.
- **Change Your Database:** Don't use Supabase? Simply replace the "Add to Content Queue Table" node with another database or spreadsheet node, such as Google Sheets, Airtable, or n8n's own Data Table.
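The video-ID extraction and Shorts filtering described above come down to simple URL pattern checks. Here is a minimal sketch of the kind of logic an n8n Code node could use for this; the function name and URL patterns are illustrative assumptions, not the template's actual code:

```javascript
// Hypothetical helper: pull a YouTube video ID out of an RSS entry link
// and flag Shorts so they can be filtered out downstream.
function parseVideoLink(link) {
  const isShort = link.includes('youtube.com/shorts/');
  let id = null;
  const watchMatch = link.match(/[?&]v=([\w-]{11})/);   // standard watch URL
  const shortMatch = link.match(/shorts\/([\w-]{11})/); // Shorts URL
  if (watchMatch) id = watchMatch[1];
  else if (shortMatch) id = shortMatch[1];
  return { id, isShort };
}
```

With a filter like this, Shorts can be dropped (or kept) by checking the `isShort` flag before calling the transcript API.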
by higashiyama
Advanced Code Review Automation (AI + Lint + Slack)

## Who's it for

Software engineers, QA teams, and tech leads who want to automate intelligent code reviews with both AI-driven suggestions and rule-based linting — all managed in Google Sheets with instant Slack summaries.

## How it works

This workflow performs a two-layer review:

1. **Lint Check:** Runs a lightweight static analysis to find common issues (e.g., use of var, console.log, unbalanced braces).
2. **AI Review:** Sends valid code to Gemini AI, which provides human-like review feedback with severity classification (Critical, Major, Minor) and visual highlights (red/orange tags).
3. **Formatter:** Combines lint and AI results, calculating an overall score (0–10).
4. **Aggregator:** Summarizes results for quick comparison.
5. **Google Sheets Writer:** Appends results to your review log.
6. **Slack Notification:** Posts a concise summary (e.g., number of issues and average score) to your team's channel.

## How to set up

1. Connect Google Sheets and Slack credentials in n8n.
2. Replace the placeholders (<YOUR_SPREADSHEET_ID>, <YOUR_SHEET_GID_OR_NAME>, <YOUR_SLACK_CHANNEL_ID>).
3. Adjust the AI review prompt or lint rules as needed.
4. Activate the workflow — reviews start automatically whenever new code is added to the sheet.

## Requirements

- Google Sheets and Slack integrations enabled
- A configured AI node (Gemini, OpenAI, or compatible)
- Proper permissions to write to your target Google Sheet

## How to customize

- Add more linting rules (naming conventions, spacing, forbidden APIs)
- Extend the AI prompt for project-specific guidelines
- Customize the Slack message formatting
- Export analytics to a dashboard (e.g., Notion or Data Studio)

## Why it's valuable

This workflow brings realistic, team-oriented, AI-assisted code review to n8n — combining the speed of automated linting with the nuance of human-style feedback. It saves time, improves code quality, and keeps your team's review history transparent and centralized.
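The lint layer described above can be pictured as a small rule-based function. This is an illustrative sketch of the three example checks, not the template's actual Code node:

```javascript
// Sketch of a lightweight lint pass: flags `var`, leftover `console.log`
// calls, and unbalanced curly braces (rules are examples, easily extended).
function lintCode(source) {
  const issues = [];
  if (/\bvar\s+\w/.test(source)) issues.push('use of var');
  if (/console\.log\s*\(/.test(source)) issues.push('console.log left in code');
  const opens = (source.match(/{/g) || []).length;
  const closes = (source.match(/}/g) || []).length;
  if (opens !== closes) issues.push('unbalanced braces');
  return issues;
}
```

Adding a rule is just another regex-plus-message pair, which is what the "Add more linting rules" customization amounts to.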
by BizThrive.ai
Turn your Telegram bot into a real-time research assistant with this intelligent n8n workflow. Designed for founders, analysts, and knowledge workers, this automation uses Perplexity Sonar and Sonar Pro to deliver concise, citation-rich answers to complex queries — directly inside Telegram.

## 🔍 What It Does

- ✅ **Smart Query Routing:** Automatically selects the right tool based on query complexity: Sonar for fast lookups, Sonar Pro for multi-source synthesis.
- 📚 **Cited Research Summaries:** Includes clickable URLs from Perplexity's source data for transparency and auditability.
- 🧠 **Session Memory:** Maintains chat context using the Telegram chat ID for follow-up questions and threaded insight.
- 🔐 **Secure Access Filter:** Restricts bot usage to authorized Telegram users.
- ⚙️ **Customizable Agent Behavior:** Easily adjust tone, tool preferences, and citation style via the system message.

## 🚀 Use Cases

- Market research & competitor analysis
- Academic and scientific deep-dives
- Legal and transcript summarization
- Podcast, video, and trend monitoring
- Personal AI assistant for founders and consultants

## 🛠 Setup Instructions

1. Create a Telegram bot via @BotFather and add your token.
2. Add your OpenAI and Perplexity API keys.
3. Update the filter node with your Telegram user ID.
4. Deploy and start chatting — responses appear in Telegram.
by Yehor EGMS
🔐 n8n Workflow: Access Control for Internal Chats or Chatbots

This n8n workflow helps you restrict access to your internal chats or chatbots so that only authorized team members can interact with them. It's perfect for setups using Telegram, Slack, or other corporate messengers, where you need to prevent unauthorized users from triggering internal automations.

## 📌 Section 1: Trigger & Input

⚡ **Receive Message (Telegram Trigger)**

- **Purpose:** Captures every incoming message from a user interacting with your Telegram bot (or another messenger).
- **How it works:** When a user sends a message, it instantly triggers the workflow and passes their username or ID as input data.
- **Benefit:** Acts as the entry point for verifying whether a user is allowed to proceed.

## 📌 Section 2: Access Table Lookup

📋 **User Access Table (Data Node / Spreadsheet / DB Query)**

- **Purpose:** Stores all your team members and their current access status.
- **Structure example:**

| Username | Access Status |
|----------|---------------|
| user1    | granted       |
| user2    | denied        |
| user3    | granted       |

- **Benefit:** Centralized access control — you can easily update user permissions without editing the workflow.

## 📌 Section 3: Permission Check

🧩 **Check Access (IF Node)**

- **Purpose:** Compares the incoming user's name or ID against the access table.
- **Logic:**
  - If status = granted → allow the message to continue.
  - If status = denied → stop workflow execution.
- **Benefit:** Ensures only approved users can interact with your automations or receive responses.

## 📌 Section 4: Response Handling

💬 **Send Reply (Telegram Node)**

- **Purpose:** Sends a message back to the user depending on their access level.
- **Paths:**
  - ✅ Granted: Sends the normal bot response or triggers the main process.
  - ❌ Denied: Sends no reply (or an optional "Access denied" message).
- **Benefit:** Prevents unauthorized access while maintaining a seamless experience for approved users.

## 📊 Workflow Overview Table

| Section | Node Name | Purpose |
|---------|-----------|---------|
| 1. Trigger | Receive Message | Captures incoming messages |
| 2. Access Table | User Access Table | Stores usernames + permissions |
| 3. Check | Check Access | Verifies if user has permission |
| 4. Response | Send Reply | Sends or blocks response based on status |

## 🎯 Key Benefits

- 🔐 **Secure access control:** Only trusted users can trigger your internal automations.
- ⚙️ **Dynamic management:** Easily update user permissions from a table or database.
- 🧠 **Lightweight setup:** Just three nodes create a fully functional access gate.
- 🚀 **Scalable foundation:** Extend it with role-based access or activity logging later.
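The lookup-and-check logic of Sections 2–3 can be sketched in a few lines. The row shape below mirrors the example table; the function name is a hypothetical illustration, not the template's exact implementation:

```javascript
// Example access table, as it might arrive from a spreadsheet or DB query.
const accessTable = [
  { username: 'user1', status: 'granted' },
  { username: 'user2', status: 'denied' },
  { username: 'user3', status: 'granted' },
];

// Sketch of the IF-node check: look the user up and allow only
// an explicit "granted" status. Unknown users are denied by default.
function hasAccess(table, username) {
  const row = table.find(r => r.username === username);
  return !!row && row.status === 'granted';
}
```

Denying unknown users by default is the safer posture for an internal gate: a missing row behaves the same as an explicit "denied".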
by DIGITAL BIZ TECH
SharePoint → Supabase → Google Drive Sync Workflow

## Overview

This workflow is a multi-system document synchronization pipeline built in n8n, designed to automatically sync and back up files between Microsoft SharePoint, Supabase/Postgres, and Google Drive. It runs on a scheduled trigger, compares SharePoint file metadata against your Supabase table, downloads new or updated files, uploads them to Google Drive, and marks records as completed — keeping your databases and storage systems in sync.

## Workflow Structure

- **Data Source:** SharePoint REST API for recursive folder and file discovery.
- **Processing Layer:** n8n logic for filtering, comparison, and metadata normalization.
- **Destination Systems:** Supabase/Postgres for metadata, Google Drive for file backup.

## SharePoint Sync Flow (Frontend Flow)

- **Trigger (Schedule Trigger):** Runs at fixed intervals (customizable) to start synchronization.
- **Fetch Files (Microsoft SharePoint HTTP Request):** Recursively retrieves folders and files using SharePoint's REST API: /GetFolderByServerRelativeUrl(...)?$expand=Files,Folders,Folders/Files,Folders/Folders/Folders/Files
- **Filter Files (filter files):** A Code node that flattens nested folders and filters unwanted file types. It excludes system or temporary files (~$) and the extensions .db, .msg, .xlsx, .xlsm, .pptx.
- **Normalize Metadata (normalize last modified date):** Ensures a consistent Last_modified_date format for accurate comparison.
- **Fetch Existing Records (Supabase Get):** Retrieves current entries from n8n_metadata to compare against SharePoint files.
- **Compare Datasets (Compare Datasets):** Detects new or modified files based on UniqueId, Last_modified_date, and Exists, and routes only changed entries forward for processing.

## File Processing Engine (Backend Flow)

- **Loop (Loop Over Items2):** Iterates through each new or updated file detected.
- **Build Metadata (get metadata and Set metadata):** Constructs the final metadata fields: file_id, file_title, file_url, file_type, foldername, last_modified_date. Generates fileUrl from UniqueId and ServerRelativeUrl if missing.
- **Upsert Metadata (Insert Document Metadata):** Inserts or updates file records in Supabase/Postgres (n8n_metadata table). Operation: upsert with id as the primary matching key.
- **Download File (Microsoft SharePoint HTTP Request1):** Fetches the binary file directly from SharePoint using its ServerRelativeUrl.
- **Rename File (rename files):** Renames each downloaded binary file to its original file_title before upload.
- **Upload File (Upload file):** Uploads the renamed file to Google Drive (My Drive → root folder).
- **Mark Complete (Postgres):** Updates the Supabase/Postgres record, setting Loading Done = true.
- **Optional Cleanup (Supabase1):** Deletes obsolete or invalid metadata entries when required.

## Integrations Used

| Service | Purpose | Credential |
|---------|---------|------------|
| Microsoft SharePoint | File retrieval and download | microsoftSharePointOAuth2Api |
| Supabase / Postgres | Metadata storage and synchronization | Supabase account 6 ayan |
| Google Drive | File backup and redundancy | Google Drive account 6 rn dbt |
| n8n Core | Flow control, dataset comparison, batch looping | Native |

## System Prompt Summary

> "You are a SharePoint document synchronization workflow. Fetch all files, compare them to database entries, and only process new or modified files. Download files, rename correctly, upload to Google Drive, and mark as completed in Supabase."

Workflow rule summary:

> "Maintain data integrity, prevent duplicates, handle retries gracefully, and continue on errors. Skip excluded file types and ensure reliable backups between all connected systems."

## Key Features

- Scheduled automatic sync across SharePoint, Supabase, and Google Drive
- Intelligent comparison to detect only new or modified files
- Idempotent upsert for consistent metadata updates
- Configurable file exclusion filters
- Safe rename + upload pipeline for clean backups
- Error-tolerant and fully automated operation

## Summary

> A reliable SharePoint-to-Google Drive synchronization workflow built with n8n, integrating Supabase/Postgres for metadata management. It automates file fetching, filtering, downloading, uploading, and marking as completed — ensuring your data stays mirrored across platforms. Perfect for enterprises managing document automation, backup systems, or cross-cloud data synchronization.

## Need Help or More Workflows?

Want to customize this workflow for your organization? Our team at Digital Biz Tech can extend it for enterprise-scale document automation, RAG, and social media automation. We can help you set it up for free — from connecting credentials to deploying it live.

- Contact: shilpa.raju@digitalbiz.tech
- Website: https://www.digitalbiz.tech
- LinkedIn: https://www.linkedin.com/company/digital-biz-tech/

You can also DM us on LinkedIn for any help.
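As a rough illustration, the "filter files" step's flatten-and-exclude logic could look like the sketch below. The Files/Folders/Name property names follow the shape of SharePoint's expanded REST response; treat this as an approximation, not the template's exact Code node:

```javascript
// Extensions the workflow is described as excluding.
const EXCLUDED = ['.db', '.msg', '.xlsx', '.xlsm', '.pptx'];

// Recursively flatten a SharePoint folder tree into a flat file list,
// skipping temporary files (~$ prefix) and excluded extensions.
function flattenFiles(folder, out = []) {
  for (const f of folder.Files || []) {
    const name = f.Name || '';
    const ext = name.slice(name.lastIndexOf('.')).toLowerCase();
    if (name.startsWith('~$') || EXCLUDED.includes(ext)) continue;
    out.push(f);
  }
  for (const sub of folder.Folders || []) flattenFiles(sub, out);
  return out;
}
```

Because the recursion walks every Folders array, the $expand depth in the REST query determines how many nesting levels actually arrive to be flattened.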
by Matthew
Instagram Hashtag Lead Generation

Automate the process of finding and qualifying Instagram leads based on hashtags. This workflow reads hashtags from Google Sheets, scrapes Instagram for posts using Apify, analyzes caption content and language, compiles unique usernames, gathers detailed user info, and filters leads based on follower count.

## How It Works

1. **Fetch Hashtags:** The workflow starts and pulls a list of hashtags from a Google Sheet.
2. **Scrape Instagram Posts:** For each hashtag, it builds Instagram explore URLs and scrapes posts using Apify.
3. **Analyze Captions:** Each caption is cleaned (hashtags and links removed) and its language/content analyzed (English/French/Spanish).
4. **Extract & Filter Usernames:** Usernames are combined and deduplicated, then their Instagram profiles are scraped for follower counts and other details.
5. **Qualified Leads:** Only users with follower counts in your target range are kept as qualified leads for outreach or analysis.

## Requirements

- An n8n instance.
- Apify API key.
- Google account with a Sheet containing hashtags (the Sheet should have a column with hashtags).
- Apify Instagram Scraper actor access.

## Setup Instructions

1. **Add Credentials:** In n8n, add your Apify API key and Google Sheets credentials.
2. **Configure Google Sheets Nodes:** Choose your credentials, spreadsheet, and sheet name in the "Get list of Hashtags" node.
3. **Configure Apify Request Nodes:** Enter your Apify API key and select the Instagram scraper actors.
4. **Adjust Filtering:** Edit the min/max follower count in the relevant filter node to match your needs.
5. **Test & Run:** Manually execute the workflow or add a trigger to run on a schedule.

## Customization Options 💡

- **Trigger:** Add a schedule or webhook to automate execution.
- **Language Filtering:** Modify keyword lists or add language detection logic.
- **Lead Output:** Extend the workflow to save leads to a CRM or send notifications.

## Details

Nodes used in the workflow:

- Manual Trigger
- Google Sheets
- Code
- HTTP Request (Apify)
- IF (Conditional)
- Aggregate
- Remove Duplicates
- Sticky Note
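The caption-cleaning step above (stripping hashtags and links before language analysis) can be sketched as a small normalization function. The regexes are illustrative assumptions, not the template's exact Code node:

```javascript
// Strip links and hashtags from a caption, then collapse whitespace,
// leaving only the prose to feed into language/content analysis.
function cleanCaption(caption) {
  return caption
    .replace(/https?:\/\/\S+/g, '')    // remove links
    .replace(/#[\p{L}\p{N}_]+/gu, '')  // remove hashtags (Unicode-aware)
    .replace(/\s+/g, ' ')
    .trim();
}
```

Using the Unicode property classes (\p{L}, \p{N}) matters here because captions in French or Spanish can contain accented hashtags that `\w` alone would split in half.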
by RamK
Phishing Lookout (Typosquatting) and Brand Domain Monitor

This workflow monitors SSL certificate logs to find and scan new domains that might be impersonating your brand.

## Background

In modern cybersecurity, brand impersonation (or "typosquatting") is common in phishing attacks. Attackers register domains that look nearly identical to a trusted brand — such as input-n8n.io or n8n.i0 instead of the legitimate n8n.io — to deceive users into revealing sensitive credentials or downloading malware.

## How it works

1. **Monitor:** Checks crt.sh every hour for new SSL certificates matching your brand keywords.
2. **Process:** Uses a Split Out node to handle multi-domain certificates and a Filter node to ignore your own legitimate domains, passing only the most recent certificates forward.
3. **Scan:** Automatically sends suspicious domains to Urlscan.io for a headless browser scan and screenshot.
4. **Loop & Triage:** Implements a 30-second Wait inside the loop to allow each scan to finish before fetching results.
5. **Alert:** Sends a Slack message with the domain name, the report link, and a screenshot of the suspicious site (for example, one mimicking your login page), alerting you to a potential phishing case.

## Setup Steps

1. **Credentials:** Connect your Urlscan.io API key and Slack bot token.
2. **Configuration:** Update the "Poll crt.sh" node. In the URL https://crt.sh/?q=%.testdomain.com&output=json, use your specific brand name (e.g., %.yourbrand.com or %.yourdomain.com) instead of %.testdomain.com.
3. **Whitelist:** Add your real domains to the myDomains list in the Filter & Deduplicate code node to prevent false alerts. Alternatively, you can leave your own domain out of the list while testing, to see how the workflow behaves; in that case your domain and subdomains will, of course, also be flagged as suspicious in the Slack alerts.
4. **Looping:** Ensure the Alert Slack node output is connected back to the Split In Batches input so that all found domains are processed.
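The whitelist-and-deduplicate step can be sketched as below. It assumes each crt.sh entry exposes a common_name field (crt.sh's JSON output uses that name, but verify against your actual response); the myDomains values are placeholders for your own domains:

```javascript
// Example whitelist: replace with your real domains (the myDomains list
// mentioned in the setup steps).
const myDomains = ['yourbrand.com'];

// Drop your own domains/subdomains and deduplicate certificate entries,
// returning only unique suspicious domains.
function suspiciousDomains(certEntries) {
  const seen = new Set();
  const out = [];
  for (const entry of certEntries) {
    // Normalize: lowercase and strip a leading wildcard label.
    const domain = entry.common_name.toLowerCase().replace(/^\*\./, '');
    const isMine = myDomains.some(d => domain === d || domain.endsWith('.' + d));
    if (!isMine && !seen.has(domain)) {
      seen.add(domain);
      out.push(domain);
    }
  }
  return out;
}
```

The suffix check (`endsWith('.' + d)`) is what keeps login.yourbrand.com whitelisted while still flagging yourbrand-login.com, which only resembles your domain.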
by Anton Bezman
Add Upcoming Movies from TMDB to Your Google Calendar via Telegram

This n8n template demonstrates how to automatically fetch upcoming movie releases from TMDB and let users add selected movies to their Google Calendar directly from Telegram.

## How it works

1. On a daily schedule, the workflow queries the TMDB API for upcoming movie releases.
2. Each movie is checked against an n8n data table to avoid duplicates.
3. New movies are stored in the data table for tracking.
4. Movie details are sent to a Telegram chat with an inline "Add to calendar" button.
5. When the button is pressed, the workflow retrieves the movie data from the table.
6. A calendar event is created in Google Calendar using the movie's release date.

## Use cases

- Tracking upcoming movie releases
- Personal or shared release calendars
- Telegram-based reminders for entertainment events
- Automating calendar updates from public APIs

## Requirements

- TMDB API access token
- Telegram bot token and user ID
- Google Calendar OAuth credentials
- One n8n data table for storing movie metadata
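The duplicate check in step 2 amounts to a set-membership test against the data table. A minimal sketch, assuming TMDB results carry a numeric id and the data table stores the ids already seen (field names are assumptions):

```javascript
// Keep only TMDB results whose id is not already in the data table.
function newMovies(tmdbResults, storedIds) {
  const known = new Set(storedIds);
  return tmdbResults.filter(movie => !known.has(movie.id));
}
```

Building a Set once makes each lookup O(1), which keeps the check cheap even as the data table grows over months of daily runs.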
by Rahul Joshi
Description: Keep your customer knowledge base up to date with this n8n automation template. The workflow connects Zendesk with Google Sheets, automatically fetching tickets tagged "howto," enriching them with requester details, and saving them into a structured spreadsheet. This ensures your internal or public knowledge base reflects the latest customer how-to queries—without manual copy-pasting. Perfect for customer support teams, SaaS companies, and service providers who want to streamline documentation workflows.

## What This Template Does (Step-by-Step)

1. ⚡ **Manual Trigger or Scheduling:** Run the workflow manually for testing/troubleshooting, or configure a schedule trigger for daily/weekly updates.
2. 📥 **Fetch All Zendesk Tickets:** Connects to your Zendesk account and retrieves all available tickets.
3. 🔍 **Filter for "howto" Tickets Only:** Processes only tickets that contain the "howto" tag, ensuring relevance.
4. 👤 **Enrich User Data:** Fetches requester details (name, email, profile info) to provide context.
5. 📊 **Update Google Sheets Knowledge Base:** Saves ticket data, including Ticket No., Description, Status, Tag, Owner Name, and Email. A smart update prevents duplicates by matching on description.
6. 🔁 **Continuous Sync:** Each new or updated "howto" ticket is synced automatically into your knowledge base sheet.

## Key Features

- 🔍 Tag-based filtering for precise categorization
- 📊 Smart append-or-update logic in Google Sheets
- ⚡ Zendesk + Google Sheets integration with OAuth2
- ♻️ Keeps the knowledge base fresh without manual effort
- 🔐 Secure API credential handling

## Use Cases

- 📖 Maintain a live "how-to" guide from real customer queries
- 🎓 Build self-service documentation for support teams
- 📩 Monitor and track recurring help topics
- 💼 Equip knowledge managers with a ready-to-export dataset

## Required Integrations

- Zendesk API (for ticket fetch + user info)
- Google Sheets (for storing/updating records)

## Why Use This Template?

✅ Automates repetitive data entry
✅ Ensures knowledge base accuracy & freshness
✅ Reduces support team workload
✅ Easy to extend with more tags, filters, or sheet logic
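The tag filter at the heart of this template is a one-line check. A simplified sketch (Zendesk tickets expose a tags array; the ticket objects here are reduced to the fields the check needs):

```javascript
// Keep only tickets carrying the "howto" tag; tolerate tickets
// with no tags array at all.
function howtoTickets(tickets) {
  return tickets.filter(t => Array.isArray(t.tags) && t.tags.includes('howto'));
}
```

Extending the template "with more tags" means swapping the single includes() for a check against a list of wanted tags.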
by Madame AI
Track new box office releases from BrowserAct to Google Sheets & Telegram

This workflow acts as an automated movie tracker that monitors box office data, filters out movies you have already seen or tracked, and sends formatted updates to your Telegram. It leverages BrowserAct for scraping and an AI Agent to deduplicate entries against your database and format the content for delivery.

## Target Audience

Movie enthusiasts, cinema news channel administrators, and data analysts tracking entertainment trends.

## How it works

1. **Fetch Data:** The workflow runs on a schedule (e.g., every 15 minutes) to fetch the latest movie data using BrowserAct.
2. **Load Context:** It retrieves your existing movie history from Google Sheets to identify which titles are already tracked.
3. **AI Processing:** An AI Agent (powered by OpenRouter) compares the new list against the existing database to remove duplicates. It then formats the valid new entries, extracting stats like "Opening Weekend" and generating an HTML-formatted Telegram post.
4. **Update Database:** The workflow appends the new movie details (Budget, Cast, Links) to Google Sheets.
5. **Notify:** It sends the pre-formatted HTML message directly to your Telegram chat.

## How to set up

1. **Configure Credentials:** Connect your BrowserAct, Google Sheets, OpenRouter, and Telegram accounts in n8n.
2. **Prepare BrowserAct:** Ensure the Box Office Trifecta template is saved in your BrowserAct account.
3. **Set up the Google Sheet:** Create a new Google Sheet with the required headers (listed below).
4. **Select the Spreadsheet:** Open the "Get row(s) in sheet" and "Append row in sheet" nodes to select your specific spreadsheet.
5. **Configure Notifications:** Open the "Send a text message" node and enter your Telegram Chat ID (e.g., @channelname or a numeric ID).

## Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers: Name, Budget, Opening_Weekend, Gross_Worldwide, Cast, Link, Summary.

## Requirements

- **BrowserAct** account with the **Box Office Trifecta** template.
- **Google Sheets** account.
- **OpenRouter** account (or credentials for a compatible LLM like Gemini or Claude).
- **Telegram** Bot Token.

## How to customize the workflow

- **Adjust Filtering Logic:** Modify the system prompt in the Scriptwriter node to change how movies are filtered (e.g., only track movies with a budget over $100M).
- **Change the Output Channel:** Replace the Telegram node with a Discord or Slack node if you prefer those platforms.
- **Enrich Data:** Add an HTTP Request node to fetch the movie poster image and send it as a photo message instead of just text.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video

Automated Box Office Movie Channel: n8n, IMDb & Telegram 🎬
by Madame AI
Find specific jobs on Indeed using Telegram, BrowserAct & Gemini

This workflow transforms your Telegram bot into an intelligent job hunting assistant. Simply tell the bot what you are looking for (e.g., "Marketing Manager in Austin"), and it will automatically scrape real-time job listings from Indeed, format them into a clean, easy-to-read list, and send them directly to your chat.

## Target Audience

Job seekers, recruiters, and career coaches who want to automate the job search process.

## How it works

1. **Receive Request:** You send a message to your Telegram bot describing the job you want (e.g., "I need a Python developer job in London").
2. **Extract Parameters:** An AI Agent analyzes your message to extract the key details: job role and location. If you don't specify a location, it defaults to a preset value (e.g., "Brooklyn").
3. **Scrape Indeed:** BrowserAct executes a background task to search Indeed.com using your specific criteria and extracts relevant job data (Title, Company, Pay, Link).
4. **Format Data:** A second AI Agent processes the raw job list. It cleans up the text, adds emojis for readability, and formats the output into Telegram-friendly HTML.
5. **Deliver Alerts:** The workflow splits the formatted list into individual messages (to respect character limits) and sends them to your Telegram chat.

## How to set up

1. **Configure Credentials:** Connect your Telegram, BrowserAct, Google Gemini, and OpenRouter accounts in n8n.
2. **Prepare BrowserAct:** Ensure the Indeed Smart Job Scout template is saved in your BrowserAct account.
3. **Configure Telegram:** Create a bot via BotFather and add the API token to your Telegram credentials.
4. **Activate:** Turn on the workflow.
5. **Test:** Send a message like "Find marketing jobs in Chicago" to your bot.

## Requirements

- **BrowserAct** account with the **Indeed Smart Job Scout** template.
- **Telegram** account (Bot Token).
- **Google Gemini** account.
- **OpenRouter** account (or compatible LLM credentials).

## How to customize the workflow

- **Change the Default Location:** Edit the system prompt in the "Analyze user Input" agent to change the fallback location from "Brooklyn" to your preferred city.
- **Filter Results:** Add logic to the "Generate response" agent to filter out jobs with low ratings or missing salary info.
- **Add More Boards:** Update the BrowserAct template to scrape LinkedIn or Glassdoor in addition to Indeed.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video

Indeed Smart Job Scout: The Ultimate n8n Workflow for Job Channels
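The "split into individual messages to respect character limits" step (Telegram caps messages at 4096 characters) can be sketched as a chunker that breaks on line boundaries. This is an illustrative sketch, not the template's exact node logic; a single line longer than the limit would still need further handling:

```javascript
// Split a long formatted list into parts that each fit under the
// given character limit, breaking only at newlines.
function splitMessage(text, limit = 4096) {
  const parts = [];
  let current = '';
  for (const line of text.split('\n')) {
    if (current && current.length + line.length + 1 > limit) {
      parts.push(current);   // current part is full; start a new one
      current = line;
    } else {
      current = current ? current + '\n' + line : line;
    }
  }
  if (current) parts.push(current);
  return parts;
}
```

Breaking at newlines rather than at exactly 4096 characters keeps each job listing intact instead of cutting one in half mid-message.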
by Madame AI
Curate daily tech news for Slack and Telegram using BrowserAct & OpenRouter

This workflow automates the creation and delivery of a professional morning tech briefing. It scrapes headlines from major sources like The Verge and Product Hunt, uses parallel AI agents to format the content specifically for Telegram (HTML) and Slack (Markdown), and broadcasts the updates to your team or community channels.

## Target Audience

Product managers, development teams, community managers, and tech enthusiasts who want a consolidated daily news digest without manual curation.

## How it works

1. **Daily Trigger:** A Schedule Trigger starts the workflow every morning at 10:00 AM.
2. **Fetch News:** A BrowserAct node executes a background task to scrape the latest headlines and links from the defined sources (default: The Verge and Product Hunt).
3. **Parallel Formatting:** The data splits into two paths:
   - **Telegram path:** The "Telegram Master" AI Agent formats the news into clean HTML, adding emojis and ensuring links work within Telegram's API constraints.
   - **Slack path:** The "Slack Master" AI Agent formats the same news into Slack-compatible Markdown, prioritizing developer tools and using Slack-specific syntax.
4. **Smart Splitting:** Both AI agents automatically split the content into multiple message parts if the text exceeds the character limits of the respective platforms.
5. **Broadcast:** The workflow iterates through the generated message parts and sends them sequentially to Telegram and Slack.

## How to set up

1. **Configure Credentials:** Connect your BrowserAct, OpenRouter (for AI models), Telegram, and Slack accounts in n8n.
2. **Prepare BrowserAct:** Ensure the Automated Multi-Site Morning Brief template is saved in your BrowserAct account.
3. **Configure Channels:** Open the "Send a text message" (Telegram) node and enter your Chat ID, then open the "Send a message" (Slack) node and select your target channel.
4. **Activate:** Turn on the workflow to start receiving daily briefs.

## Requirements

- **BrowserAct** account with the **Automated Multi-Site Morning Brief** template.
- **OpenRouter** account (or credentials for a specific LLM like Claude or Gemini).
- **Telegram** Bot Token.
- **Slack** account.

## How to customize the workflow

- **Add News Sources:** Update the BrowserAct node inputs to scrape different websites (e.g., Hacker News, TechCrunch).
- **Change the AI Persona:** Modify the system prompts in the Telegram Master or Slack Master nodes to change the tone from "Professional" to "Casual" or "Sarcastic."
- **Add More Platforms:** Duplicate one of the formatting branches to create a version for Discord or Microsoft Teams.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video

Keep Your Team Updated Automatically 🌅 n8n Morning Brief Workflow