by furuidoreandoro
Automated TikTok Real Estate Research for Couples

This workflow automates the process of finding real estate (rental) videos on TikTok, filtering them for a specific target audience (couples in their 20s), generating an explanation of why they are recommended, and saving the results to Google Sheets and Slack.

Who's it for

- **Real Estate Agents & Marketers:** Research trending rental properties and video styles popular on social media.
- **Content Curators:** Automatically gather and summarize niche content from TikTok.
- **House Hunters:** Automate the search for "rental" videos tailored to couples.

How it works / What it does

1. Trigger: The workflow starts manually (on click).
2. Scrape TikTok: It connects to Apify to run a "TikTok Scraper". It searches for videos with the hashtag 賃貸 (Rental) and retrieves metadata.
3. Filter & Extract (AI Agent 1): An AI Agent (using OpenRouter) analyzes the retrieved video data to select properties suitable for "couples in their 20s" and outputs the video URL.
4. Generate Insights (AI Agent 2): A second AI Agent reviews the URL/content and generates a specific reason why the property is recommended for the target audience, formatting the output with the URL and explanation.
5. Save to Database: The final text (URL + Reason) is appended to a Google Sheet.
6. Notify Team: The same recommendation text is sent to a specific Slack channel to alert the user.

Requirements

- **n8n:** Version 1.0 or later.
- **Apify Account:** An API token and access to the clockworks/tiktok-scraper actor.
- **OpenRouter Account:** An API key to use Large Language Models (LLMs) for the AI Agents.
- **Google Cloud Platform:** A project with the Google Sheets API enabled and OAuth credentials.
- **Slack Workspace:** Permission to add apps/bots to a channel.

How to set up

1. Import the Workflow: Copy the JSON code and paste it into your n8n editor.
2. Configure Credentials:
   - Apify: Create a new credential in n8n using your Apify API token.
   - OpenRouter: Create a new credential using your OpenRouter API key.
   - Google Sheets: Connect your Google account via OAuth2.
   - Slack: Connect your Slack account via OAuth2.
3. Configure Nodes:
   - Google Sheets Node: Select your specific spreadsheet and sheet from the dropdown lists (replace placeholders such as YOUR_SPREADSHEET_ID if they don't update automatically).
   - Slack Node: Select the channel where you want to receive notifications (replace YOUR_CHANNEL_ID).
4. Test: Click "Execute Workflow" to run a test.

How to customize the workflow

- **Change the Search Topic:** Open the **Apify** node and change the hashtags value in the "Custom Body" JSON (e.g., change "賃貸" to "DIY" or "Travel").
- **Adjust the Persona:** Open the **AI Agent** nodes and modify the text prompt. You can change the target audience from "20s couples" to "students" or "families."
- **Increase Volume:** In the **Apify** node, increase resultsPerPage or maxProfilesPerQuery to process more videos at once (note: this will consume more API credits).
- **Change Output Format:** Modify the **Google Sheets** node to map specific fields (like Video Title, Author, Likes) into separate columns instead of just one raw output string.
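As a rough sketch, the Apify "Custom Body" that drives the search can be thought of as a small JSON object; the field names below are assumptions based on common TikTok scraper inputs, so check the clockworks/tiktok-scraper input schema before relying on them:

```javascript
// Hypothetical "Custom Body" for the Apify TikTok Scraper run.
// Swapping the hashtags value here retargets the whole workflow.
const customBody = {
  hashtags: ["賃貸"],       // change to ["DIY"] or ["Travel"] to switch topics
  resultsPerPage: 20,       // raise to process more videos (uses more API credits)
  proxyConfiguration: { useApifyProxy: true },
};

// Small helper to retarget the topic without editing the object by hand.
function withHashtags(body, hashtags) {
  return { ...body, hashtags };
}
```

Passing `withHashtags(customBody, ["DIY"])` to the Apify node body is one way to switch topics per run while keeping the other settings fixed.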
by Angel Menendez
Who it's for

This workflow is for content creators and marketers who write short scripts in Google Sheets and want to automatically turn each line into an AI-generated avatar video stored in Google Drive, with links written back to the sheet.

How it works

1. A Manual Trigger starts the workflow.
2. Get Avatar Description (Google Sheets) reads avatar details from a dedicated "Gaia" sheet.
3. The Global Variables node sets the working script page (for example, "Draft 5") and exposes the avatar description.
4. Get Script reads all rows from the selected sheet.
5. Loop Over Items iterates through each row, while Set Loop Inputs prepares the variables: avatar description, speech, and framing.
6. For every row, Generate a video with Veo (Google Gemini video model) creates an 8-second 16:9 clip.
7. Upload video file saves it to a chosen Google Drive folder, and Update row in sheet with link to video writes the Drive link back into the same row, then the loop moves to the next snippet.

Yellow sticky notes explain each phase, with the large one summarizing the end-to-end snippet generation loop.

How to set up

1. Connect your Google Sheets and Google Drive credentials.
2. Update the spreadsheet IDs, sheet names, and Drive folder to match your own.
3. Configure the Gemini/Veo model credentials.
4. Adjust the default script page name in Global Variables.

Requirements

- n8n instance
- Google Sheets and Google Drive accounts
- Google Gemini / Veo API access

No API keys or personal identifiers are hardcoded; always store credentials securely in n8n and avoid real PII in test data.

How to customize

- Change the page value in Global Variables to target different script tabs.
- Edit the Veo prompt to alter background, camera framing, or speaking style.
- Modify video duration, aspect ratio, or output folder in the Gemini and Drive nodes.
- Extend the loop to add more post-processing steps (e.g., thumbnail generation, analytics tracking).
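Conceptually, the Set Loop Inputs step hands the Veo node one small object per sheet row. The field names below are illustrative, not the workflow's actual parameter names:

```javascript
// Sketch of the per-row input that Set Loop Inputs could prepare:
// avatar description from Global Variables plus the row's own
// speech and framing (field names are assumptions).
function buildVeoInput(globals, row) {
  return {
    avatarDescription: globals.avatarDescription,
    speech: row.speech,
    framing: row.framing || "medium shot", // fall back when the row leaves framing blank
    durationSeconds: 8,                    // the workflow generates 8-second clips
    aspectRatio: "16:9",
  };
}
```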
by Anton Bezman
Add Upcoming Movies from TMDB to Your Google Calendar via Telegram

This n8n template demonstrates how to automatically fetch upcoming movie releases from TMDB and let users add selected movies to their Google Calendar directly from Telegram.

How it works

1. On a daily schedule, the workflow queries the TMDB API for upcoming movie releases.
2. Each movie is checked against an n8n data table to avoid duplicates.
3. New movies are stored in the data table for tracking.
4. Movie details are sent to a Telegram chat with an inline "Add to calendar" button.
5. When the button is pressed, the workflow retrieves the movie data from the table.
6. A calendar event is created in Google Calendar using the movie's release date.

Use cases

- Tracking upcoming movie releases
- Personal or shared release calendars
- Telegram-based reminders for entertainment events
- Automating calendar updates from public APIs

Requirements

- TMDB API access token
- Telegram bot token and user ID
- Google Calendar OAuth credentials
- One n8n data table for storing movie metadata
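The duplicate check in step 2 boils down to comparing TMDB results against the IDs already in the data table. As a minimal sketch (names and shapes are illustrative; TMDB's `upcoming` endpoint does return `id`, `title`, and `release_date` fields):

```javascript
// Keep only movies whose TMDB id is not yet in the data table.
function filterNewMovies(tmdbResults, trackedIds) {
  const seen = new Set(trackedIds);
  return tmdbResults.filter((movie) => !seen.has(movie.id));
}

const upcoming = [
  { id: 101, title: "Movie A", release_date: "2025-07-04" },
  { id: 102, title: "Movie B", release_date: "2025-08-15" },
];
const alreadyTracked = [101]; // ids already stored in the n8n data table

// Only Movie B is new, so only it gets a Telegram message.
const fresh = filterNewMovies(upcoming, alreadyTracked);
```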
by automedia
Transcribe New YouTube Videos and Save to Supabase

Who's It For?

This workflow is for content creators, marketers, researchers, and anyone who needs to quickly get text transcripts from YouTube videos. If you analyze video content, repurpose it for blogs or social media, or want to make videos searchable, this template will save you hours of manual work.

What It Does

This template automatically monitors multiple YouTube channels for new videos. When a new video is published, it extracts the video ID, retrieves the full transcript using the youtube-transcript.io API, and saves the structured data—including the title, author, URL, and full transcript—into a Supabase table. It intelligently filters out YouTube Shorts by default and includes error handling to ensure that only successful transcriptions are processed.

Requirements

- A Supabase account with a table ready to receive the video data.
- An API key from youtube-transcript.io (offers a free tier).
- The Channel ID for each YouTube channel you want to track. You can find this using a free online tool like TunePocket's Channel ID Finder.

How to Set Up

1. Add Channel IDs: In the "Channels To Track" node, replace the example YouTube Channel IDs with your own. The workflow uses these IDs to create RSS links and find new videos.
2. Configure API Credentials: Find the "youtube-captions" HTTP Request node. In the credentials tab, create a new "Header Auth" credential. Name it youtube-transcript-io and paste your API key into the "Value" field. The "Name" field should be x-api-key.
3. Connect Your Supabase Account: Navigate to the "Add to Content Queue Table" node. Create new credentials for your Supabase account using your Project URL and API key. Once connected, select your target table and map the incoming fields (title, source_url, content_snippet, etc.) to the correct columns in your table.
4. Set Your Schedule (Optional): The workflow starts with a manual trigger. To run it automatically, replace the "When clicking ‘Execute workflow’" node with a Schedule node and set your desired interval (e.g., once a day).
5. Activate the Workflow: Save your changes and toggle the workflow to Active in the top right corner.

How to Customize

- **Transcribe YouTube Shorts:** To include Shorts in your workflow, select the **"Does url exist?"** If node and delete the second condition that checks for youtube.com/shorts.
- **Change Your Database:** Don't use Supabase? Simply replace the **"Add to Content Queue Table"** node with another database or spreadsheet node, such as **Google Sheets**, **Airtable**, or n8n's own Table.
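The video-ID extraction and the Shorts filter described above can be sketched in a few lines; the regex assumes standard 11-character YouTube IDs, which is the common case but not guaranteed by any spec:

```javascript
// Pull the 11-character video ID out of a watch or Shorts URL.
function extractVideoId(url) {
  const match = url.match(/(?:watch\?v=|shorts\/)([A-Za-z0-9_-]{11})/);
  return match ? match[1] : null;
}

// The default filter drops Shorts; deleting this check keeps them.
function isShort(url) {
  return url.includes("youtube.com/shorts");
}
```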
by higashiyama
Advanced Code Review Automation (AI + Lint + Slack)

Who's it for

For software engineers, QA teams, and tech leads who want to automate intelligent code reviews with both AI-driven suggestions and rule-based linting — all managed in Google Sheets with instant Slack summaries.

How it works

This workflow performs a two-layer review:

1. Lint Check: Runs a lightweight static analysis to find common issues (e.g., use of var, console.log, unbalanced braces).
2. AI Review: Sends valid code to Gemini AI, which provides human-like review feedback with severity classification (Critical, Major, Minor) and visual highlights (red/orange tags).
3. Formatter: Combines lint and AI results, calculating an overall score (0–10).
4. Aggregator: Summarizes results for quick comparison.
5. Google Sheets Writer: Appends results to your review log.
6. Slack Notification: Posts a concise summary (e.g., number of issues and average score) to your team's channel.

How to set up

1. Connect Google Sheets and Slack credentials in n8n.
2. Replace placeholders (<YOUR_SPREADSHEET_ID>, <YOUR_SHEET_GID_OR_NAME>, <YOUR_SLACK_CHANNEL_ID>).
3. Adjust the AI review prompt or lint rules as needed.
4. Activate the workflow — reviews will start automatically whenever new code is added to the sheet.

Requirements

- Google Sheets and Slack integrations enabled
- A configured AI node (Gemini, OpenAI, or compatible)
- Proper permissions to write to your target Google Sheet

How to customize

- Add more linting rules (naming conventions, spacing, forbidden APIs)
- Extend the AI prompt for project-specific guidelines
- Customize the Slack message formatting
- Export analytics to a dashboard (e.g., Notion or Data Studio)

Why it's valuable

This workflow brings realistic, team-oriented AI-assisted code review to n8n — combining the speed of automated linting with the nuance of human-style feedback. It saves time, improves code quality, and keeps your team's review history transparent and centralized.
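The lint layer above can be sketched directly: flag `var`, `console.log`, and unbalanced braces, then derive a score. The exact scoring formula here is an assumption (the workflow's Formatter blends lint and AI results); only the three checks come from the description:

```javascript
// Minimal sketch of the lint check layer.
function lintCheck(code) {
  const issues = [];
  if (/\bvar\b/.test(code)) issues.push("use of var");
  if (/console\.log/.test(code)) issues.push("console.log left in code");
  const opens = (code.match(/{/g) || []).length;
  const closes = (code.match(/}/g) || []).length;
  if (opens !== closes) issues.push("unbalanced braces");
  // Illustrative 0–10 score: two points off per issue, floored at 0.
  return { issues, score: Math.max(0, 10 - issues.length * 2) };
}
```

Adding a new rule (forbidden APIs, naming conventions) is just another regex test pushed onto `issues`.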
by Cahya
This workflow allows you to generate QR codes in bulk from a Google Sheets file and store the generated QR images automatically in Google Drive. Each QR code contains a unique identifier (in this template, an email address) and is connected to a validation webhook. When a QR code is scanned, the webhook checks whether the ID exists and verifies its status. The system updates the spreadsheet to reflect whether the QR code has been generated and whether it has already been used.

This template demonstrates a simple but practical QR lifecycle system:

- Generate QR codes in bulk from spreadsheet data
- Store QR images in Google Drive
- Validate QR scans through a webhook
- Track generation and usage status directly in the spreadsheet

Example use cases include event check-in systems, access control, membership validation, digital ticketing, or controlled asset distribution. The workflow is designed to be modular, easy to customize, and suitable for real-world implementation.

How it works

1. Set Up Google Sheets
   - Create a Google Sheets file with at least the following columns: email and status_qr.
   - The email column acts as the unique ID.
   - The status_qr column will automatically update to indicate QR_GENERATED or QR_USED.
   - Make sure there are no duplicate email values.
2. Set Up Credentials
   - Create a Google Sheets credential and a Google Drive credential, and connect your Google account.
   - Copy the Spreadsheet ID and Folder ID and configure them in the corresponding nodes.
3. Configure the QR Generator Workflow
   - Confirm the spreadsheet and sheet name are correctly selected.
   - Ensure the workflow checks the status_qr column before generating a QR code; only rows with an empty status_qr will generate new QR codes.
   - Generated QR images are uploaded automatically to the configured Google Drive folder, and the status_qr column is updated to indicate that the QR has been generated.
   - Run the workflow to generate QR codes in bulk.
4. Configure the Validation Webhook
   - Activate the workflow.
   - Ensure the webhook URL matches the base URL embedded in the QR codes.
   - When a QR code is scanned, the system extracts the email (ID), checks whether it exists in the spreadsheet, verifies whether the QR has already been used, and, if valid, updates status_qr to reflect usage.

Need Help? Contact me on LinkedIn!
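The validation webhook's decision logic can be sketched as a single lookup over the sheet rows. The column names match the template (email, status_qr); the `rows` array below just stands in for the Google Sheets read:

```javascript
// In-memory stand-in for the spreadsheet rows read by the webhook.
const rows = [
  { email: "a@example.com", status_qr: "QR_GENERATED" },
  { email: "b@example.com", status_qr: "QR_USED" },
];

// Validate one scan: unknown IDs and already-used codes are rejected;
// a valid first scan flips status_qr to QR_USED.
function validateScan(rows, email) {
  const row = rows.find((r) => r.email === email);
  if (!row) return { valid: false, reason: "unknown ID" };
  if (row.status_qr === "QR_USED") return { valid: false, reason: "already used" };
  row.status_qr = "QR_USED"; // mark as used so a second scan is rejected
  return { valid: true };
}
```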
by BizThrive.ai
Turn your Telegram bot into a real-time research assistant with this intelligent n8n workflow. Designed for founders, analysts, and knowledge workers, this automation uses Perplexity Sonar and Sonar Pro to deliver concise, citation-rich answers to complex queries — directly inside Telegram.

🔍 What It Does

- ✅ **Smart Query Routing:** Automatically selects the right tool based on query complexity — Sonar for fast lookups, Sonar Pro for multi-source synthesis.
- 📚 **Cited Research Summaries:** Includes clickable URLs from Perplexity's source data for transparency and auditability.
- 🧠 **Session Memory:** Maintains chat context using the Telegram chat ID for follow-up questions and threaded insight.
- 🔐 **Secure Access Filter:** Restricts bot usage to authorized Telegram users.
- ⚙️ **Customizable Agent Behavior:** Easily adjust tone, tool preferences, and citation style via the system message.

🚀 Use Cases

- Market research & competitor analysis
- Academic and scientific deep-dives
- Legal and transcript summarization
- Podcast, video, and trend monitoring
- Personal AI assistant for founders and consultants

🛠 Setup Instructions

1. Create a Telegram bot via @BotFather and add your token.
2. Add your OpenAI and Perplexity API keys.
3. Update the filter node with your Telegram user ID.
4. Deploy and start chatting — responses appear in Telegram.
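In the workflow the agent's LLM decides which tool to call, but the routing idea can be illustrated with a simple heuristic: short factual lookups go to Sonar, longer or multi-part questions to Sonar Pro. This sketch is purely illustrative, not the agent's actual logic:

```javascript
// Toy routing heuristic: length and comparison/synthesis keywords
// push a query toward the heavier "sonar-pro" tool.
function pickTool(query) {
  const wordCount = query.trim().split(/\s+/).length;
  const multiPart = /\b(compare|versus|vs\.?|summarize)\b/i.test(query);
  return wordCount > 15 || multiPart ? "sonar-pro" : "sonar";
}
```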
by Yehor EGMS
🔐 n8n Workflow: Access Control for Internal Chats or Chatbots

This n8n workflow helps you restrict access to your internal chats or chatbots so that only authorized team members can interact with them. It's perfect for setups using Telegram, Slack, or other corporate messengers, where you need to prevent unauthorized users from triggering internal automations.

📌 Section 1: Trigger & Input

⚡ Receive Message (Telegram Trigger)
- Purpose: Captures every incoming message from a user interacting with your Telegram bot (or another messenger).
- How it works: When a user sends a message, it instantly triggers the workflow and passes their username or ID as input data.
- Benefit: Acts as the entry point for verifying whether a user is allowed to proceed.

📌 Section 2: Access Table Lookup

📋 User Access Table (Data Node / Spreadsheet / DB Query)
- Purpose: Stores all your team members and their current access status.
- Structure example:

| Username | Access Status |
|----------|---------------|
| user1    | granted       |
| user2    | denied        |
| user3    | granted       |

- Benefit: Centralized access control — you can easily update user permissions without editing the workflow.

📌 Section 3: Permission Check

🧩 Check Access (IF Node)
- Purpose: Compares the incoming user's name or ID against the access table.
- Logic:
  - If status = granted → allow the message to continue
  - If status = denied → stop workflow execution
- Benefit: Ensures only approved users can interact with your automations or receive responses.

📌 Section 4: Response Handling

💬 Send Reply (Telegram Node)
- Purpose: Sends a message back to the user depending on their access level.
- Paths:
  - ✅ Granted: Sends the normal bot response or triggers the main process.
  - ❌ Denied: Sends no reply (or an optional "Access denied" message).
- Benefit: Prevents unauthorized access while maintaining a seamless experience for approved users.

📊 Workflow Overview Table

| Section         | Node Name         | Purpose                                  |
|-----------------|-------------------|------------------------------------------|
| 1. Trigger      | Receive Message   | Captures incoming messages               |
| 2. Access Table | User Access Table | Stores usernames + permissions           |
| 3. Check        | Check Access      | Verifies if user has permission          |
| 4. Response     | Send Reply        | Sends or blocks response based on status |

🎯 Key Benefits

- 🔐 Secure access control: Only trusted users can trigger your internal automations.
- ⚙️ Dynamic management: Easily update user permissions from a table or database.
- 🧠 Lightweight setup: Just three nodes create a fully functional access gate.
- 🚀 Scalable foundation: Extend it with role-based access or activity logging later.
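The whole gate reduces to one lookup, which a Code node could express like this (the table rows mirror the structure example above; unknown users fall through to denied):

```javascript
// Access table as it would come back from the spreadsheet/DB lookup.
const accessTable = [
  { username: "user1", status: "granted" },
  { username: "user2", status: "denied" },
  { username: "user3", status: "granted" },
];

// The IF node's condition: continue only when status is "granted".
// Users missing from the table are treated as denied.
function hasAccess(table, username) {
  const row = table.find((r) => r.username === username);
  return !!row && row.status === "granted";
}
```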
by Madame AI
Generate SEO articles from search queries to WordPress with BrowserAct

This workflow automates a programmatic SEO pipeline by turning a list of search queries into fully researched, authoritative blog posts. It scrapes search results (focusing on community insights like Reddit) for real-world data, uses AI to draft comprehensive guides, and publishes them directly to your WordPress site.

Target Audience

SEO specialists, content marketers, niche site builders, and editorial teams looking to scale content production with high-quality, researched articles.

How it works

1. Define Topics: The workflow begins by defining a list of target keywords or questions in a Set node (e.g., "Best automation tools").
2. Research: It iterates through each query using a Loop node. For each item, BrowserAct scrapes search engine results to gather raw insights, discussions, and market consensus.
3. Draft Content: An AI Agent (acting as a "Senior Technical Editor") analyzes the raw data. It synthesizes the information into a structured, HTML-formatted article with tables, headers, and actionable advice.
4. Publish: The generated content is sent to WordPress to create a new post.
5. Notify: Once the entire batch is processed, a Slack message is sent to notify the team.

How to set up

1. Configure Credentials: Connect your BrowserAct, OpenRouter, WordPress, and Slack accounts in n8n.
2. Prepare BrowserAct: Ensure the Programmatic SEO Data Pipeline template is saved in your BrowserAct account.
3. Set Keywords: Open the Set queries node and update the Queries array with the list of topics you want to write about.
4. Configure WordPress: Open the Create a post node and ensure it is connected to your WordPress site.
5. Configure Notification: Open the Send completion notification node and select the Slack channel where you want to receive alerts.

Requirements

- **BrowserAct** account with the **Programmatic SEO Data Pipeline** template.
- **OpenRouter** account (or credentials for a specific LLM like GPT-4o or GPT-5).
- **WordPress** account.
- **Slack** account.

How to customize the workflow

- Adjust the Persona: Modify the system prompt in the AI Agent node to change the writing style (e.g., from "Technical Editor" to "Casual Blogger" or "Sales Copywriter").
- Add Visuals: Insert an image generation node (like DALL-E or Stable Diffusion) before the WordPress node to create a unique featured image based on the article title.
- Review Loop: Instead of publishing directly, change the final step to add the draft to Google Docs or Notion for human approval.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video

Automated Content Factory: From Reddit Data to SEO Blog Posts with n8n
by Madame AI
Track new box office releases from BrowserAct to Google Sheets & Telegram

This workflow acts as an automated movie tracker that monitors box office data, filters out movies you have already seen or tracked, and sends formatted updates to your Telegram. It leverages BrowserAct for scraping and an AI Agent to deduplicate entries against your database and format the content for delivery.

Target Audience

Movie enthusiasts, cinema news channel administrators, and data analysts tracking entertainment trends.

How it works

1. Fetch Data: The workflow runs on a schedule (e.g., every 15 minutes) to fetch the latest movie data using BrowserAct.
2. Load Context: It retrieves your existing movie history from Google Sheets to identify which titles are already tracked.
3. AI Processing: An AI Agent (powered by OpenRouter) compares the new list against the existing database to remove duplicates. It then formats the valid new entries, extracting stats like "Opening Weekend" and generating an HTML-formatted Telegram post.
4. Update Database: The workflow appends the new movie details (Budget, Cast, Links) to Google Sheets.
5. Notify: It sends the pre-formatted HTML message directly to your Telegram chat.

How to set up

1. Configure Credentials: Connect your BrowserAct, Google Sheets, OpenRouter, and Telegram accounts in n8n.
2. Prepare BrowserAct: Ensure the Box Office Trifecta template is saved in your BrowserAct account.
3. Set Up the Google Sheet: Create a new Google Sheet with the required headers (listed below).
4. Select the Spreadsheet: Open the Get row(s) in sheet and Append row in sheet nodes to select your specific spreadsheet.
5. Configure Notification: Open the Send a text message node and enter your Telegram Chat ID (e.g., @channelname or a numeric ID).

Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers:

- Name
- Budget
- Opening_Weekend
- Gross_Worldwide
- Cast
- Link
- Summary

Requirements

- **BrowserAct** account with the **Box Office Trifecta** template.
- **Google Sheets** account.
- **OpenRouter** account (or credentials for a compatible LLM like Gemini or Claude).
- **Telegram** Bot Token.

How to customize the workflow

- Adjust Filtering Logic: Modify the system prompt in the Scriptwriter node to change how movies are filtered (e.g., only track movies with a budget over $100M).
- Change Output Channel: Replace the Telegram node with a Discord or Slack node if you prefer those platforms.
- Enrich Data: Add an HTTP Request node to fetch the movie poster image and send it as a photo message instead of just text.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video

Automated Box Office Movie Channel: n8n, IMDb & Telegram 🎬
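The formatting step in "AI Processing" produces Telegram HTML (Telegram's `parse_mode: "HTML"` supports a small tag subset including `<b>` and `<a>`). A hand-rolled sketch of that output, using the sheet's column names, might look like:

```javascript
// Turn one new movie row (keyed by the Google Sheet headers) into
// an HTML-formatted Telegram post. Layout is illustrative; the real
// formatting is done by the AI Agent's prompt.
function formatTelegramPost(movie) {
  return [
    `<b>${movie.Name}</b>`,
    `Opening weekend: ${movie.Opening_Weekend}`,
    `Budget: ${movie.Budget}`,
    `<a href="${movie.Link}">Details</a>`,
  ].join("\n");
}
```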
by Vonn
This workflow automatically searches the YouTube Data API for videos related to specific keywords, extracts channel data, filters channels based on performance metrics, and saves the results into Google Sheets. Instead of manually searching YouTube and copying channel information, this workflow continuously discovers creators and builds a structured lead database. It is designed for marketers, researchers, agencies, and teams that need a reliable way to identify relevant YouTube channels in specific niches.

Who this workflow is for

This workflow is ideal for:

- influencer marketing teams discovering creators in specific niches
- agencies building creator outreach lists
- researchers analyzing YouTube channel growth and trends
- startups building creator marketplaces
- marketers identifying potential partners or competitors

If you regularly search YouTube for channels related to certain topics, this workflow automates that process.

How it works

The workflow runs on a schedule in n8n and processes keywords one at a time to avoid API limits. The automation performs the following steps:

1. A Schedule Trigger starts the workflow automatically.
2. Keywords are retrieved from a keyword list.
3. The workflow selects one keyword to process.
4. The YouTube Data API searches for videos related to that keyword.
5. Channel IDs are extracted from the video results.
6. Channel statistics such as subscriber count, total views, and video count are retrieved.
7. Channels are filtered using configurable thresholds.
8. Qualified channels are saved or updated in Google Sheets.

The result is an automatically growing database of YouTube channels relevant to your target topics.

Requirements

To use this workflow you will need:

- An instance of n8n
- A YouTube Data API credential
- A Google Sheets account
- A spreadsheet where discovered channels will be stored

How to set up

1. Create credentials for the YouTube Data API in n8n.
2. Connect your Google Sheets account.
3. Add your target keywords to your keyword source.
4. Configure filtering rules such as minimum subscribers or views.
5. Set the schedule trigger interval.

Once configured, the workflow will run automatically and populate your spreadsheet with newly discovered YouTube channels.

How to customize the workflow

You can adapt the workflow depending on your use case. Common customizations include:

- adjusting subscriber or view thresholds
- discovering smaller emerging creators by lowering filters
- adding additional data fields such as video titles or upload dates
- integrating notification tools like Slack when new channels are discovered
- exporting results to databases or CRM systems

Because the workflow is built in n8n, you can easily extend it to support more advanced creator discovery or automation pipelines.
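The threshold filter at the heart of the workflow can be sketched in a few lines. The field names mirror the YouTube Data API's `channels.list` statistics (`subscriberCount`, `viewCount`, returned as strings by the API but shown as numbers here for simplicity); the default thresholds are examples, not the template's values:

```javascript
// Keep only channels that clear the configurable thresholds.
function filterChannels(channels, { minSubscribers = 1000, minViews = 100000 } = {}) {
  return channels.filter(
    (c) => c.subscriberCount >= minSubscribers && c.viewCount >= minViews
  );
}

const channels = [
  { id: "UC_example1", subscriberCount: 5000, viewCount: 2000000 },
  { id: "UC_example2", subscriberCount: 200, viewCount: 50000 },
];
```

Lowering the thresholds (e.g., `{ minSubscribers: 100, minViews: 10000 }`) is how you surface smaller emerging creators.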
by Jemee
This workflow automates the extraction of SEO metadata (URL, page title, and meta description) from every page listed in your website's sitemap and exports it to Google Sheets. Ideal for SEO audits, content inventories, and tracking on-page elements.

Prerequisites

Before using this workflow:

- A publicly accessible sitemap.xml URL
- A Google Sheets spreadsheet with columns: URL, Title, and meta description
- Google Sheets API access via OAuth2

Setup Instructions

1. Configure Sitemap Source
   - In the "Get Sitemap XML" node, replace the default URL with your actual sitemap URL.
2. Connect Google Sheets
   - Open the "Append or update row in sheet" node.
   - Configure Google Sheets credentials.
   - Set the Document ID and Sheet Name.
   - Verify column mappings match your spreadsheet.
3. Adjust Rate Limiting (Optional)
   - Modify the Wait nodes if you encounter 429 errors.
   - Increase the delay between requests if needed.

How It Works

1. Trigger: Manual workflow execution
2. Sitemap Fetch: Retrieve the sitemap.xml file
3. URL Parsing: Extract all URLs from the sitemap
4. Batch Processing: Process URLs in manageable batches
5. Data Extraction: Scrape the title and meta description from each page
6. Data Merge: Combine each URL with its extracted metadata
7. Sheet Update: Append or update rows in Google Sheets using the URL as a unique key

Features

- **Duplicate Prevention:** Uses appendOrUpdate with URL matching
- **Rate Limiting:** Built-in delays between requests
- **Flexible Processing:** Handles sitemaps of various sizes
- **Easy Customization:** Modify code nodes for additional data extraction

Use Cases

- SEO audits of title and description tags
- Content migration planning
- Website content inventory management
- Ongoing SEO monitoring and reporting
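The two extraction steps (URL parsing and data extraction) can be sketched with regexes. The sitemap protocol wraps each URL in a `<loc>` element; the page-side regexes are a simplification that assumes a plain `<title>` tag and a `name="description"` meta tag with double-quoted attributes in that order:

```javascript
// Pull every <loc> URL out of a sitemap.xml body.
function extractSitemapUrls(xml) {
  return [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]);
}

// Scrape the title and meta description from one page's HTML.
function extractMetadata(html) {
  const title = (html.match(/<title>(.*?)<\/title>/i) || [])[1] || "";
  const description =
    (html.match(/<meta\s+name="description"\s+content="(.*?)"/i) || [])[1] || "";
  return { title, description };
}
```

Real-world HTML varies (attribute order, single quotes), which is why the workflow's code nodes are the place to harden this parsing for your site.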