by Swot.AI
This workflow automates document summarization directly from Google Drive, processes the content using Mistral AI, and delivers a clean, styled summary via Gmail. It's ideal for professionals who need quick insights from lengthy documents without manually reading through them.

Key Features:
- **Google Drive Integration:** Fetches a file (PDF/DOCX) from your Drive.
- **AI Summarization:** Uses Mistral AI to extract key points efficiently.
- **Styled Email Output:** Delivers a formatted, easy-to-read summary to your inbox with a timestamp.
- **Error Handling:** Built to skip corrupted files or missing credentials.

Nodes Breakdown:
1. **Manual Trigger** - Starts the workflow manually for easy testing.
2. **Google Drive Node** - Downloads a specified file from Google Drive (supports PDF/DOCX).
3. **Mistral Cloud Chat Model Node** - Connects to Mistral AI for summarization.
4. **Summarization Chain Node** - Breaks the file into chunks, processes the content, and generates a concise summary.
5. **Gmail Node** - Sends the styled summary directly to the user's inbox, with custom formatting and the current time in the Lagos timezone.

Extra Features:
- **Dynamic Time Formatting:** Supports the Lagos timezone (easily adjustable).
- **HTML Styling:** Clean email formatting with headers, icons, and line breaks for clarity.
- **Custom Email Sender Name:** Branded output (e.g., "Swot.AI").
- **Future Expansion:** Can be extended to WhatsApp or Slack with minor tweaks.

Use Cases:
- Legal teams summarizing contracts.
- Content creators extracting highlights from research papers.
- Business analysts getting insights from reports on the go.

Customization Tips:
- Change the timezone (Africa/Lagos) to match your preferred location.
- Add error-handling nodes for missing files or API failures.
- Swap Mistral AI for OpenAI for different summarization behavior.
- Change the "Send To" address (the email that receives the summarized text) to your preferred address.
- Change the "Sender Name" from Swot.AI to your preferred sender name.

Why Use This Workflow?
This automation saves hours of manual reading. It's perfect for personal productivity, legal analysis, content creation, or business reporting. With clean formatting and a professional email summary, your team will get instant insights in seconds!

I can make this much better and build others. If interested: *Swot.ai25@gmail.com*
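The "current time in the Lagos timezone" mentioned above can be produced in an n8n Code node with the built-in `Intl` API. This is a minimal sketch; the exact field names used by the workflow are not shown in the template, so the HTML snippet below is illustrative:

```javascript
// Format the current time for Africa/Lagos using the built-in Intl API.
// No external libraries are needed; this runs in any modern JS runtime.
function lagosTimestamp(date = new Date()) {
  return new Intl.DateTimeFormat('en-GB', {
    timeZone: 'Africa/Lagos',
    dateStyle: 'medium',
    timeStyle: 'short',
  }).format(date);
}

// Example: embed the timestamp in the styled HTML email body.
const html = `<p>Summary generated at <b>${lagosTimestamp()}</b></p>`;
console.log(html);
```

To adjust the timezone, change `Africa/Lagos` to any IANA timezone name (e.g., `Europe/Berlin`).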
by Kalyxi Ai
Automate News Discovery & Publishing with GPT-4, Google Search API & Slack

Overview
Automated content publishing system that discovers industry news, transforms it into original articles using GPT-4, and publishes across multiple channels with SEO optimization and intelligent duplicate prevention.

Key Features
- **Smart Query Generation** - AI agent generates unique search queries while checking Google Sheets to avoid duplicates
- **News Discovery** - Uses the Google Custom Search API to find recent articles (last 7 days)
- **Content Intelligence** - Processes search results and automatically skips anti-bot-protected sites
- **GPT-4 Article Generation** - Creates professional, SEO-optimized news articles in Reuters/Bloomberg style
- **Multi-Channel Publishing** - Publishes to a CMS with automatic Slack notifications
- **Comprehensive Tracking** - Logs all activity to Google Sheets for analytics and duplicate prevention

How It Works
1. **Scheduled Trigger** runs every 8 hours to maintain a consistent content flow
2. **AI Agent** generates targeted search queries for your niche while checking historical data
3. **Google Search** finds recent articles and extracts metadata (title, snippet, source)
4. **Smart Content Handler** bypasses sites with anti-bot protection, using search snippets instead
5. **GPT-4 Processing** transforms snippets into comprehensive 2000+ word articles with proper formatting
6. **Publishing Pipeline** formats content for the CMS with SEO metadata and publishes automatically
7. **Notification System** sends detailed Slack updates with article metrics
8. **Activity Logging** tracks all published content to prevent future duplicates

Setup Requirements

Prerequisites
- Google Custom Search API key and Search Engine ID
- OpenAI GPT-4 API access
- Google account for the tracking spreadsheet
- Slack workspace for notifications
- CMS or website with an API endpoint for publishing

Step-by-Step Setup

Step 1: Google Custom Search Configuration
1. Go to Google Custom Search Engine
2. Create a new search engine
3. Configure it to search the entire web
4. Copy your Search Engine ID (cx parameter)
5. Get your API key from the Google Cloud Console

Step 2: Google Sheets Template Setup
Create a Google Sheet with these required columns:
- **Column A:** timestamp - ISO date format (YYYY-MM-DD HH:MM:SS)
- **Column B:** query - The search query used
- **Column C:** title - Published article title
- **Column D:** url - Published article URL
- **Column E:** status - Publication status (success/failed)
- **Column F:** word_count - Final article word count

Template URL: Copy this Google Sheets template

Step 3: Credential Configuration
Set up the following credentials in n8n:
- **Google Sheets API** - OAuth2 connection to your Google account
- **OpenAI API** - Your GPT-4 API key
- **Slack Webhook** - Webhook URL for your notification channel
- **Custom Search API** - Your Google Custom Search API key

Step 4: Workflow Customization
Modify these key parameters to fit your needs:
- **Search Topic:** Edit the AI agent prompt to focus on your industry
- **Publishing Schedule:** Adjust the cron trigger (default: every 8 hours)
- **Article Length:** Modify the GPT-4 prompt for different word counts
- **CMS Endpoint:** Update the publishing node with your website's API

Customization Options

Content Targeting
- Modify the AI agent's search query generation to focus on specific industries
- Adjust date restrictions (currently set to the last 7 days)
- Change the number of search results processed per run

Article Style
- Customize GPT-4 prompts for different writing styles (formal, casual, technical)
- Adjust article length requirements
- Modify SEO optimization parameters

Publishing Channels
- Add additional CMS endpoints for multi-site publishing
- Configure different notification channels (Discord, Teams, etc.)
- Set up social media auto-posting integration

Use Cases
- Automated news websites
- Industry blog content generation
- SEO content pipeline automation
- News aggregation and republishing
- Content marketing automation

Technical Notes
- Includes error handling for anti-bot protection
- Duplicate prevention through Google Sheets tracking
- Rate-limiting considerations for API usage
- Automatic retry logic for failed requests

Support
For setup assistance or customization help, refer to the workflow's internal documentation nodes or contact the template creator.
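The duplicate-prevention step described above boils down to comparing a candidate query against the queries already logged in the tracking sheet. A hedged sketch, assuming the column names from the Sheets template (the row-object shape n8n returns depends on your Google Sheets node configuration):

```javascript
// Rows as returned from the Google Sheets node, one object per row.
// Column B of the tracking sheet is "query".
function isDuplicateQuery(candidate, rows) {
  const seen = new Set(
    rows.map(r => (r.query || '').trim().toLowerCase())
  );
  return seen.has(candidate.trim().toLowerCase());
}

const history = [
  { query: 'AI regulation news 2024' },
  { query: 'fintech funding rounds' },
];

console.log(isDuplicateQuery('Fintech Funding Rounds', history)); // true
console.log(isDuplicateQuery('quantum computing startups', history)); // false
```

Normalizing case and whitespace before comparing avoids near-duplicate queries slipping through.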
by Michael Gullo
A Customizable n8n Automation That Turns Your Inbox Into a Daily Digest.

The goal of this workflow is to offer a highly customizable foundation that users can tailor to fit their specific platform and setup. While the current version uses Gmail, it can easily be adapted to work with other providers by replacing the email node with alternatives such as IMAP Email Trigger, Microsoft Outlook, or any compatible email node. This workflow can also be extended to work with platforms like Telegram, WhatsApp, or any service that supports bots and n8n integration. The core objective is to generate scheduled email summaries, whether that's the most recent email, emails from a specific sender, or all emails received within a day.

I built this workflow as a flexible building block for anyone looking to develop a more advanced email agent. It's designed to reduce the mental load of reviewing emails each day by automatically delivering a summarized version of your inbox. Currently, the summary is saved to Google Docs, chosen for its simplicity and accessibility. However, users can easily modify this to integrate with other document management systems or destinations. I plan to continue updating and expanding this workflow to better serve the needs of users. If you have suggestions, ideas, or feedback, I'd love to hear them; your input helps make this tool even more useful.

Workflow Components
- **Schedule Node** - Triggers the workflow daily at a specified time.
- **Gmail: Get Messages Node** - Retrieves the latest email. Can be changed to fetch any number of emails.
- **Limit Node** - Ensures only the configured number of emails is processed at a time.
- **If Node** - Checks whether any emails were retrieved.
- **Code Node** - Cleans and formats the email content.
- **Code Node** - Provides a fallback message if no emails are found.
- **OpenAI Summary Node** - Summarizes the email using ChatGPT.
- **Create Google Doc Node** - Creates a new Google Document for the summary.
- **Update Google Doc Node** - Inserts the summarized content into the document.
Expanding the Workflow
This workflow is fully modular and easy to extend. To send summaries via Telegram, Slack, or any other channel, simply add the respective node after the summary is generated and connect your bot or webhook credentials. To use Outlook instead of Gmail, just swap the email input node for the Microsoft Outlook node or an IMAP Email Trigger, depending on your preferred setup.

Need Help? Have Questions?
For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.
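The "Code Node - Cleans and formats the email content" step is not spelled out in the description. A minimal sketch of what such a node might do (strip HTML tags, collapse whitespace, truncate before handing the text to the LLM); the input shape here is an assumption, not the Gmail node's exact output format:

```javascript
// Minimal cleanup before sending an email body to the summarizer.
function cleanEmailBody(raw, maxChars = 4000) {
  return raw
    .replace(/<style[\s\S]*?<\/style>/gi, ' ') // drop inline CSS blocks
    .replace(/<[^>]+>/g, ' ')                  // strip remaining HTML tags
    .replace(/&nbsp;/gi, ' ')                  // common HTML entity
    .replace(/\s+/g, ' ')                      // collapse whitespace
    .trim()
    .slice(0, maxChars);                       // keep the prompt small
}

const raw = '<div><style>p{color:red}</style><p>Hello&nbsp;team,</p><p>Meeting at 10.</p></div>';
console.log(cleanEmailBody(raw)); // "Hello team, Meeting at 10."
```

Truncation keeps token costs predictable when summarizing long newsletters.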
by Dhrumil Patel
Stay ahead in your trading game with this powerful n8n automation workflow. Designed for real-time efficiency, this setup continuously scans your Gmail inbox for trading alerts from TradingView and ensures you never miss a signal. Every minute, this workflow will:
- Check Gmail for new messages using a trigger.
- Identify emails coming specifically from TradingView.
- Extract key trading signals like BUY or SELL along with the company name.
- Capture the exact date and time of the alert.
- Log all extracted signals neatly into a Google Sheet for easy tracking and analysis.
- Instantly notify you on Telegram with a custom message so you can act fast.

Perfect for traders who want an automated assistant to handle alerts, record them for analysis, and provide timely updates, all without lifting a finger.
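The signal-extraction step can be done in a Code node with a simple pattern match. This is an illustrative sketch only: the subject-line format below is an assumption, since the actual text depends entirely on how you configure your TradingView alerts.

```javascript
// Parse a TradingView-style alert subject such as
// "TradingView Alert: BUY signal on AAPL".
// The exact format is an assumption; adjust the regex to your alerts.
function parseSignal(subject) {
  const m = subject.match(/\b(BUY|SELL)\b.*?\bon\s+([A-Z.]+)/i);
  if (!m) return null;
  return {
    action: m[1].toUpperCase(),
    symbol: m[2].toUpperCase(),
    receivedAt: new Date().toISOString(),
  };
}

console.log(parseSignal('TradingView Alert: BUY signal on AAPL'));
console.log(parseSignal('Weekly newsletter')); // null: not a trading alert
```

Returning `null` for non-matching emails lets a downstream If node filter out unrelated messages.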
by Artem Boiko
Estimate embodied carbon (CO2e) for grouped BIM/CAD elements. The workflow accepts an existing XLSX (grouped element data) or, if it is missing, can trigger a local RvtExporter.exe to generate one. It detects category fields, filters out non-building elements, infers aggregation rules with AI, computes CO2 using densities and emission factors, and exports a multi-sheet Excel file plus a clean HTML report.

What it does
- **Reads or builds XLSX** (from your model via RvtExporter.exe when needed).
- **Finds category/volumetric fields**; separates building vs. annotation elements.
- Uses AI to infer aggregation rules (sum/mean/first) per header.
- **Groups** rows by your group_by field and aggregates totals.
- Prepares enhanced prompts and calls your LLM to classify materials and estimate CO2 (A1-A3 minimum).
- Computes **project totals** and generates a **multi-sheet XLSX** plus **HTML** report with charts and hotspots.

Prerequisites
- **LLM credentials** for one provider (e.g., OpenAI, Anthropic, Gemini, Grok/OpenRouter). Enable one chat node and connect credentials.
- **Windows host** only if you want to auto-extract from .rvt/.ifc via RvtExporter.exe. If you already have an XLSX, Windows is **not required**.
- Optional: mapping/classifier files (XLSX/CSV/PDF) to improve material classification.

How to use
1. Import this JSON into n8n.
2. Open the Setup/Parameters node(s) and set:
   - project_file: path to your .rvt/.ifc or to an existing grouped *_rvt.xlsx
   - path_to_converter: C:\\DDC_Converter_Revit\\datadrivenlibs\\RvtExporter.exe (optional)
   - group_by: e.g., Type Name / Category / IfcType
   - sheet_name: default Summary (if reading from XLSX)
3. Enable one LLM node and attach credentials; keep the others disabled.
4. Execute (Manual Trigger). The workflow detects/builds the XLSX, analyzes, classifies, estimates CO2, then writes the Excel file and opens the HTML report.
Outputs
- **Excel** (CO2_Analysis_Report_YYYY-MM-DD.xlsx, ~8 sheets): Executive Summary, All Elements, Material Summary, Category Analysis, Impact Analysis, Top 20 Hotspots, Data Quality, Recommendations.
- **HTML**: executive report with key KPIs and charts.
- Per-group fields include: Material (EU/DE/US), Quantity & Unit, Density, Mass, CO2 Factor, Total CO2 (kg/tonnes), CO2 %, Confidence, Assumptions.

Notes & tips
- Input quantities (volumes/areas) are already aggregated per group; do not multiply by element count.
- Use -no-collada upstream if you only need XLSX in extraction.
- Prefer ASCII-safe paths and ensure write permissions to the output folder.

Categories: Data Extraction · Files & Storage · ETL · CAD/BIM · Carbon/ESG
Tags: cad-bim, co2, carbon, embodied-carbon, lca, revit, ifc, xlsx, html-report, llm

Author
DataDrivenConstruction.io
info@datadrivenconstruction.io

Consulting and Training
We work with leading construction, engineering, and consulting agencies and technology firms around the world to help them implement open data principles, automate CAD/BIM processing, and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Docs & Issues: Full Readme on GitHub
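The per-group CO2 arithmetic follows the usual embodied-carbon formula: mass = volume × density, then CO2 = mass × emission factor. A sketch with illustrative numbers; the density and emission-factor values below are placeholders for demonstration, not values shipped with the workflow:

```javascript
// Embodied carbon for one element group (A1-A3 stage).
// volume is already aggregated per group, so do NOT multiply by count.
function groupCO2(volumeM3, densityKgM3, factorKgCO2PerKg) {
  const massKg = volumeM3 * densityKgM3;
  const co2Kg = massKg * factorKgCO2PerKg;
  return { massKg, co2Kg, co2Tonnes: co2Kg / 1000 };
}

// Illustrative: 12.5 m3 of concrete, density 2400 kg/m3, factor 0.12 kgCO2e/kg.
console.log(groupCO2(12.5, 2400, 0.12));
// { massKg: 30000, co2Kg: 3600, co2Tonnes: 3.6 }
```

Project totals are then just the sum of `co2Kg` over all groups, which is what the hotspot ranking sorts on.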
by n8n Team
This n8n workflow serves as an incident response and notification system for handling potentially malicious emails flagged by Sublime Security. It begins with a Webhook trigger that Sublime Security uses to initiate the workflow by POSTing an alert. The workflow then extracts message details from Sublime Security using an HTTP Request node, based on the provided messageId, and subsequently splits into two parallel paths. In the first path, the workflow looks up a Slack user by email, aiming to find the recipient of the email that triggered the alert. If a user is found in Slack, a notification is sent to them, explaining that they have received a potentially malicious email that has been quarantined and is under investigation. This notification includes details such as the email's subject and sender. The second path checks whether the flagged email has been opened by inspecting the read_at value from Sublime Security. If the email was opened, the workflow prepares a table summarizing the flagged rules and creates a corresponding issue in Jira Software. The Jira issue contains information about the email, including its subject, sender, and recipient, along with the flagged rules. Issues that someone might encounter when setting up this workflow for the first time include potential problems with the Slack user lookup if the user information is not available or if Slack API integration is not configured correctly. Additionally, the issue creation in Jira Software may not work as expected, as indicated by the note that mentions a need for possible node replacement. Thorough testing and validation with sample data from Sublime Security alerts can help identify and resolve any potential issues during setup.
by Friedemann Schuetz
Welcome to my Airbnb Telegram Agent Workflow! This workflow creates an intelligent Telegram bot that helps users search and find Airbnb accommodations using natural language queries and voice messages.

DISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-mcp-client Community Node!

What this workflow does
This workflow processes incoming Telegram messages (text or voice) and provides personalized Airbnb accommodation recommendations. The AI agent understands natural language queries, searches through Airbnb data using MCP tools, and returns mobile-optimized results with clickable links, prices, and key details.

Key Features:
- Voice message support (speech-to-text and text-to-speech)
- Conversation memory for context-aware responses
- Mobile-optimized formatting for Telegram
- Real-time Airbnb data access via MCP integration

This workflow has the following sequence:
1. Telegram Trigger - Receives incoming messages from users
2. Text or Voice Switch - Routes based on message type
3. Voice Processing (if applicable) - Downloads and transcribes voice messages
4. Text Preparation - Formats text input for the AI agent
5. Airbnb AI Agent - Core logic that:
   - Lists available MCP tools for Airbnb data
   - Executes searches with parsed parameters
   - Formats results for mobile display
6. Response Generation - Sends the formatted text response
7. Voice Response (optional) - Creates and sends an audio summary

Requirements:
- **Telegram Bot API**: Documentation
  - Create a bot via @BotFather on Telegram
  - Get the bot token and configure the webhook
- **OpenAI API**: Documentation
  - Used for speech transcription (Whisper)
  - Used for chat completion (GPT-4)
  - Used for text-to-speech generation
- **MCP Community Client Node**: Documentation
  - Custom integration for Airbnb data
  - Requires an MCP server setup with an Airbnb/Airtable connection
  - Provides tools for accommodation search and details

Important: You need to set up an MCP server with Airbnb data access.
The workflow uses MCP tools to retrieve real accommodation data, so ensure your MCP server is properly configured with the Airtable/Airbnb integration.

Configuration Notes:
- Update the Telegram chat ID in the trigger for your specific bot
- Modify the system prompt in the Airbnb Agent for different use cases
- The workflow supports individual users and can be extended for group chats

Feel free to contact me via LinkedIn if you have any questions!
by iamvaar
IMPORTANT: RUN THE FIRST WORKFLOW ONLY ONCE (it converts your content into embedding format, saves it in the DB, and is then ready for RAG chat).

FIRST WORKFLOW EXPLANATION:

Telegram Trigger
- **Type:** telegramTrigger
- **Purpose:** Waits for new Telegram messages to trigger the workflow.
- **Note:** Currently disabled.

Content for the Training
- **Type:** googleDocs
- **Purpose:** Fetches document content from Google Docs using its URL.
- **Details:** Uses Service Account authentication.

Splitting into Chunks
- **Type:** code
- **Purpose:** Splits the fetched document text into smaller chunks (1000 chars each) for processing.
- **Logic:** Loops over the text and slices it.

Embedding Uploaded Document
- **Type:** httpRequest
- **Purpose:** Calls the Together AI embedding API to get vector embeddings for each text chunk.
- **Details:** Sends JSON with the model name and chunk as input.

Save the Embedding in DB
- **Type:** supabase
- **Purpose:** Saves each text chunk and its embedding vector into the Supabase embed table.

SECOND WORKFLOW EXPLANATION:

When Chat Message Received
- **Type:** chatTrigger
- **Purpose:** Starts the workflow when a user sends a chat message.
- **Details:** Sends an initial greeting message to the user.

Embed User Message
- **Type:** httpRequest
- **Purpose:** Generates an embedding for the user's input message.
- **Details:** Calls the Together AI embeddings API.

Search Embeddings
- **Type:** httpRequest
- **Purpose:** Searches the Supabase DB for the top 5 most similar text chunks based on the generated embedding.
- **Details:** Calls the Supabase RPC function matchembeddings1.

Aggregate
- **Type:** aggregate
- **Purpose:** Combines all retrieved text chunks into a single aggregated context for the LLM.

Basic LLM Chain
- **Type:** chainLlm
- **Purpose:** Passes the user's question plus the aggregated context to the LLM to generate a detailed answer.
- **Details:** Contains a prompt instructing the LLM to answer only based on the context.

OpenRouter Chat Model
- **Type:** lmChatOpenRouter
- **Purpose:** Provides the actual AI language model that processes the prompt.
- **Details:** Uses the qwen/qwen3-8b:free model via OpenRouter; you can substitute any model of your choice.
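The "Splitting into Chunks" Code node described above can be as simple as slicing the text at fixed offsets. A minimal sketch; the 1000-character chunk size matches the description, while the overlap parameter is an optional addition you may want for better retrieval quality:

```javascript
// Split a document into fixed-size chunks for embedding.
function splitIntoChunks(text, size = 1000, overlap = 0) {
  const chunks = [];
  const step = size - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + size));
  }
  return chunks;
}

const doc = 'a'.repeat(2500);
const chunks = splitIntoChunks(doc);
console.log(chunks.length);    // 3
console.log(chunks[0].length); // 1000
console.log(chunks[2].length); // 500
```

A small overlap (e.g., 100 characters) keeps sentences that straddle a chunk boundary retrievable from both sides.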
by Davide
This workflow automates the process of converting images from JPG/PNG format to WEBP using the APYHub API. It retrieves image URLs from a Google Sheet, converts the images, and uploads the converted files to Google Drive. This workflow is a powerful tool for automating image conversion tasks, saving time and ensuring that images are efficiently converted and stored in the desired format.

Using WebP images on a website provides several SEO benefits:
- **Faster Loading Speed** - WebP files are smaller than JPG and PNG, reducing page load times and improving user experience.
- **Better Core Web Vitals** - Google prioritizes websites with good performance metrics like LCP (Largest Contentful Paint).
- **Improved Mobile Performance** - Smaller images consume less bandwidth, enhancing mobile usability.
- **Higher Search Rankings** - Faster sites tend to rank better on Google due to improved user experience.
- **Reduced Server Load** - Lighter images lower hosting and CDN costs while improving site efficiency.

Below is a breakdown of the workflow:

1. How It Works
The workflow is designed to convert images from JPG/PNG to WEBP format and manage the converted files. Here's how it works:
- **Manual Trigger:** The workflow starts with a Manual Trigger node, which initiates the process when the user clicks "Test workflow."
- **Set API Key:** The Set API KEY node defines the API key required to access the APYHub API.
- **Get Images:** The Get Images node retrieves a list of image URLs from a Google Sheet. The sheet contains columns for the original image URL (FROM), the converted image URL (TO), and a status flag (DONE).
- **Get Extension:** The Get Extension node extracts the file extension (JPG, JPEG, or PNG) from the image URL and adds it to the JSON data.
- **Determine Image Type:** The JPG or PNG? node checks the file extension and routes the workflow to the appropriate conversion node:
  - JPG/JPEG: routes to the From JPG to WEBP node.
  - PNG: routes to the PNG to WEBP node.
- **Convert Image:** The From JPG to WEBP and PNG to WEBP nodes send POST requests to the APYHub API to convert the images to WEBP format. The API returns the URL of the converted image.
- **Update Google Sheet:** The Update Sheet node updates the Google Sheet with the URL of the converted image and marks the row as done (DONE).
- **Get Converted Image:** The Get File Image node downloads the converted WEBP image from the URL provided by the APYHub API.
- **Upload to Google Drive:** The Upload Image node uploads the converted WEBP image to a specified folder in Google Drive.

2. Set Up Steps
To set up and use this workflow in n8n, follow these steps:
- **APYHub API Key:** Obtain an API key from APYHub and define it in the Set API KEY node.
- **Google Sheets Integration:** Set up Google Sheets credentials in n8n for the Get Images and Update Sheet nodes. Create a Google Sheet with columns for FROM (original image URL), TO (converted image URL), and DONE (status flag). Provide the Document ID and Sheet Name in the Get Images node.
- **Google Drive Integration:** Set up Google Drive credentials in n8n for the Upload Image node. Specify the folder ID in Google Drive where the converted images will be uploaded.
- **Test the Workflow:** Click the "Test workflow" button in n8n. The workflow will retrieve image URLs from the Google Sheet, convert the images to WEBP format using the APYHub API, update the Google Sheet with the converted image URLs, and upload the converted images to Google Drive.
- **Optional Customization:** Modify the workflow to include additional features, such as adding more image formats for conversion, sending notifications when the conversion is complete, or integrating with other storage services (e.g., Dropbox, AWS S3).
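The Get Extension step and the JPG or PNG? routing decision can be sketched in a few lines of Code-node JavaScript. The function name and route labels below are illustrative, not the workflow's actual node outputs:

```javascript
// Extract the file extension from an image URL and decide the route.
function getRoute(url) {
  const ext = (url.split('?')[0].split('.').pop() || '').toLowerCase();
  if (ext === 'jpg' || ext === 'jpeg') return 'jpg-to-webp';
  if (ext === 'png') return 'png-to-webp';
  return 'skip'; // unsupported format
}

console.log(getRoute('https://example.com/images/photo.JPG')); // jpg-to-webp
console.log(getRoute('https://example.com/logo.png?v=2'));     // png-to-webp
console.log(getRoute('https://example.com/animation.gif'));    // skip
```

Stripping the query string before taking the extension avoids misreading URLs like `logo.png?v=2`.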
by AI Native
This workflow automates the process of retrieving Hugging Face paper summaries, analyzing them with OpenAI, and storing the results in Notion. Here's a breakdown of how it works:
- **Scheduled Trigger:** The flow is set to run automatically at 8 AM on weekdays.
- **Fetching Paper Data:** It fetches Hugging Face paper summaries using their API.
- **Data Check:** Before processing, the workflow checks whether the paper already exists in Notion to avoid duplicates.
- **Content Analysis with OpenAI:** If the paper is new, it extracts the summary and uses OpenAI to analyze the content.
- **Store Results in Notion:** After analysis, the summarized data is saved in Notion for easy reference.

Set Up Steps for Automation
Follow these steps to set up this automated workflow with Hugging Face, OpenAI, and Notion integration:
- **Obtain API Tokens:** You'll need the Notion and OpenAI API tokens to authenticate and connect these services with n8n.
- **Integration in n8n:** Link Hugging Face, OpenAI, and Notion by configuring the appropriate nodes in n8n.
- **Configure Workflow Logic:**
  - Set up a cron trigger for automatic execution at 8 AM on weekdays.
  - Use an HTTP Request node to fetch Hugging Face paper data.
  - Add logic to check whether the data already exists in Notion.
  - Set up the OpenAI integration to analyze the paper's summary.
  - Store the results in Notion for easy access and reference.
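The Data Check step amounts to filtering the fetched papers against IDs already stored in Notion. A hedged sketch; the `id` field and the shape of the existing-ID list are assumptions about how you map the Notion query results in your own setup:

```javascript
// Keep only papers whose IDs are not already in the Notion database.
function filterNewPapers(papers, existingIds) {
  const seen = new Set(existingIds);
  return papers.filter(p => !seen.has(p.id));
}

const fetched = [{ id: '2401.01234' }, { id: '2401.05678' }];
const inNotion = ['2401.01234'];
console.log(filterNewPapers(fetched, inNotion)); // [{ id: '2401.05678' }]
```

Only the papers that survive this filter are sent on to the OpenAI analysis step, which keeps API usage down.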
by Yaron Been
This workflow automatically analyzes competitor content performance across various platforms to understand what content resonates with their audience. It saves you time by eliminating the need to manually track competitor content and provides insights into successful content strategies and engagement patterns.

Overview
This workflow automatically scrapes competitor websites, blogs, and social media to analyze content performance metrics including engagement rates, shares, comments, and audience response. It uses Bright Data to access competitor content without restrictions and AI to intelligently analyze performance data and extract actionable insights.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping competitor content platforms without being blocked
- **OpenAI**: AI agent for intelligent content performance analysis
- **Google Sheets**: For storing competitor content analysis and performance metrics

How to Install
1. **Import the Workflow:** Download the .json file and import it into your n8n instance
2. **Configure Bright Data:** Add your Bright Data credentials to the MCP Client node
3. **Set Up OpenAI:** Configure your OpenAI API credentials
4. **Configure Google Sheets:** Connect your Google Sheets account and set up your content analysis spreadsheet
5. **Customize:** Define competitor URLs and content performance tracking parameters

Use Cases
- **Content Strategy**: Learn from high-performing competitor content to improve your own strategy
- **Competitive Analysis**: Track competitor content trends and audience engagement patterns
- **Content Optimization**: Identify content types and topics that drive the most engagement
- **Market Research**: Understand what content resonates with your target audience

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

#n8n #automation #competitoranalysis #contentperformance #brightdata #webscraping #contentmarketing #n8nworkflow #workflow #nocode #contentanalysis #competitormonitoring #contentresearch #engagementanalysis #marketresearch #contentintelligence #competitiveintelligence #contentoptimization #performancetracking #contentmetrics #marketanalysis #contentaudit #brandanalysis #contentstrategy #digitalmarketing #contentinsights #socialmediaanalysis #contentmonitoring #performanceanalysis #competitorresearch
by Taiki
Disclaimer: This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Quick Summary
This template automates the creation of YouTube Shorts scripts. Simply send a YouTube video URL to a Telegram bot, and this workflow will use AI (Google Gemini) to generate a script optimized for Shorts and send it back to you. It streamlines your content creation process by eliminating the manual work of transcribing and scriptwriting.

Target Audience & Problem Solved
This template is ideal for:
- YouTube content creators
- Social media marketers
- Anyone looking to efficiently repurpose video content

It solves the following problems:
- Quickly creating engaging short-form video content from existing long-form videos.
- Drastically reducing the time spent on video transcription, summarization, and scriptwriting.
- Enabling content idea generation on the go, even from a smartphone without access to a PC.

Workflow Overview
This workflow automatically generates a script for YouTube Shorts using AI (Google Gemini) from a YouTube video URL received via Telegram, and then sends the result back to Telegram. The process is as follows:
1. Receive URL: A Telegram bot receives a YouTube video URL from a user.
2. Get Transcript: It uses the community node n8n-nodes-supadata to retrieve the video's transcript via the Supadata service.
3. Generate Script: Based on the retrieved transcript, Google Gemini creates a script (title and body) optimized for a YouTube Shorts video.
4. Send Result: The generated script is sent back via Telegram to the user who sent the original URL.

How to Use
Prerequisites:
- Install the n8n-nodes-supadata community node as described below.
- Register your API keys for Telegram, Supadata, and Google Gemini as credentials in n8n.
- Set the appropriate credentials in each node within the workflow.
- For Telegram, create your bot with BotFather.

Execution:
- Activate the workflow.
- Send a YouTube video URL to your configured Telegram bot.
Confirmation: After a short while, the workflow will run automatically, and you will receive the completed script from the Telegram bot.

Customization Guide
You can tailor this template to your specific needs:
- Modify the Prompt: Edit the prompt in the Google Gemini node to change the style of the generated script (e.g., make it more casual, use a specific tone, include emojis).
- Change the AI Model: You can switch to other AI models like OpenAI. Simply add or replace the corresponding n8n node.
- Change the Notification Channel: Replace the Telegram node with a Discord or Slack node to send the results to a different platform.

Community Node Installation
This workflow requires the n8n-nodes-supadata community node. Please install it using the following steps.

Installation from the n8n UI (Recommended):
1. Navigate to Settings > Community Nodes in your n8n interface.
2. Click Install.
3. Enter n8n-nodes-supadata in the input field and click the Install button.