by Luciano Gutierrez
Instagram Auto-Comment Responder with AI Agent Integration

Version: 1.1.0 ‧ n8n Version: 1.88.0+ ‧ License: MIT

A fully automated workflow for managing and responding to Instagram comments using AI agents. Designed to improve engagement and save time, this system listens for new Instagram comments, verifies and filters them, fetches relevant post data, processes valid messages with a natural-language AI, and posts context-aware replies directly on the original post.

Key Features

- 💬 **AI-Driven Engagement**: Intelligent responses to comments via a GPT-powered agent.
- ✅ **Webhook Verification**: Handles the Instagram webhook handshake to ensure secure integration.
- 📦 **Data Extraction**: Maps incoming payload fields (user ID, username, message text, media ID) for processing.
- 🚫 **Self-Comment Filtering**: Automatically skips comments made by the account owner to prevent loops.
- 📡 **Post Data Retrieval**: Fetches the media's id and caption from the Graph API (v22.0) before generating a reply.
- 🧠 **Natural Language Processing**: Uses a custom system prompt to maintain brand tone and context.
- 🔁 **Automated Replies**: Posts the AI-generated message back to the comment thread using Instagram's API.
- 🧩 **Modular Architecture**: Clear separation of steps via sticky notes and dedicated HTTP Request and Agent nodes.

Use Cases

- **Social Media Automation**: Keep followers engaged 24/7 with instant, relevant replies.
- **Community Building**: Maintain a consistent voice and tone across all interactions.
- **Brand Reputation Management**: Ensure no valid comment goes unanswered.
- **AI Customer Support**: Triage simple questions and direct followers to resources or support.

Technical Implementation

Webhook Verification
Node: Webhook + Respond to Webhook
Echoes hub.challenge to confirm the subscription and secure incoming events.

Data Extraction
Node: Set
Maps payload fields into structured variables: conta.id, usuario.id, usuario.name, usuario.message.id, usuario.message.text, usuario.media.id, endpoint (see the sample payload at the end of this section).

User Validation
Node: Filter
Skips processing if conta.id equals usuario.id (self-comments).

Post Data Retrieval
Node: HTTP Request (Get post data)
GET https://graph.instagram.com/v22.0/{{ $json.usuario.media.id }}?fields=id,caption&access_token={{ credentials }}
Captures the media's caption for richer context in replies.

AI Response Generation
Nodes: AI Agent + OpenRouter Chat Model
Uses a detailed system prompt with:
- Profile persona (expert in AI & automations, friendly tone).
- Input data (username, comment text, post caption).
- Filtering logic (spam, praise, questions, vague comments).
Returns either the reply text or [IGNORE] for irrelevant content.

Posting the Reply
Node: HTTP Request (Post comment)
POST {{ $json.endpoint }}/{{ $json.usuario.message.id }}/replies with message={{ $json.output }}
Sends the AI answer back under the original comment.

Instructions for Setup

1. **Import Workflow**: In n8n > Workflows > Import from File, upload the provided .json template.
2. **Configure Credentials**: Instagram Graph API (Header Auth or FacebookGraphApi) with the instagram_basic and instagram_manage_comments scopes; an OpenRouter/OpenAI API key for the AI agent.
3. **Customize System Prompt**: Edit the AI Agent's prompt to adjust brand tone, language (Brazilian Portuguese), length, or emoji usage.
4. **Test & Activate**: Publish a test comment on an Instagram post. Verify each node's execution, ensuring the webhook, filter, data extraction, HTTP requests, and AI Agent respond as expected.
5. **Extend & Monitor**: Add sentiment analysis or lead capture nodes as needed. Monitor execution logs for errors or rate-limit events.
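For reference when mapping fields in the Set node, here is a trimmed sketch of a comment webhook event. The shape follows Meta's documented comments webhook field, but treat the exact structure and all IDs as illustrative:

```json
{
  "object": "instagram",
  "entry": [
    {
      "id": "17841400000000000",
      "time": 1714000000,
      "changes": [
        {
          "field": "comments",
          "value": {
            "id": "17900000000000000",
            "text": "Love this post! How did you build it?",
            "from": { "id": "8999000000000000", "username": "example_user" },
            "media": { "id": "17850000000000000" }
          }
        }
      ]
    }
  ]
}
```

In this shape, conta.id would presumably come from entry[0].id (the account owning the webhook) while the usuario.* fields come from changes[0].value, which is what lets the Filter node compare the two and drop self-comments.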
Tags: Social Media • Instagram Automation • Webhook Verification • AI Agent • HTTP Request • Auto Reply • Community Management
by Ranjan Dailata
Who this is for

Extract & Summarize Indeed Company Info is an automated workflow that extracts Indeed company profile information using Bright Data Web Unlocker, transforms it using Google Gemini's LLM, and forwards the transformed response, together with the summary, to a specified webhook for downstream use.

This workflow is tailored for:
- Recruiters and HR teams looking to assess companies quickly during talent sourcing.
- Job seekers researching potential employers and needing summarized company insights.
- Market researchers and analysts monitoring competitors or industry players.

What problem is this workflow solving?

Searching and evaluating company profiles on Indeed manually is time-consuming and inefficient, especially when dealing with large volumes of companies. Manually browsing, copying, and summarizing company descriptions, reviews, and ratings from Indeed hinders productivity and limits real-time insights.

This workflow solves this by:
- Automating the extraction of company details from Indeed using Bright Data Web Unlocker.
- Summarizing the raw data using Google Gemini's language model for a quick, human-readable overview.
- Sending the transformed response with the summary to a chosen endpoint, such as Slack, Notion, Airtable, or a custom webhook.

What this workflow does

This automated pipeline does the following:
1. Scrapes Indeed company profile pages (e.g., ratings, description, reviews) using Bright Data's Web Unlocker.
2. Transforms the scraped content into structured JSON using n8n's built-in tools.
3. Summarizes and extracts meaningful insights using Google Gemini's large language model.
4. Forwards the summarized, formatted response to a specified webhook or app for real-time access, storage, or analysis.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication); a sketch of the underlying request appears at the end of this section.
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the search query and Bright Data zone by navigating to the Set Indeed Search Query node.
6. Update the Webhook Notifier with the webhook endpoint of your choice.

How to customize this workflow to your needs

This workflow is built to be flexible, whether you're a recruiter, market researcher, entrepreneur, or data analyst. Here's how you can adapt it to fit your specific use case:
- **Changing the data source**: Replace the Indeed search input with other job or business listing platforms if needed (e.g., Glassdoor, Crunchbase).
- **Refining the LLM prompt**: Tailor the Gemini prompt to transform or summarize the Indeed company information in a specific format.
- **Routing the output to different destinations**: Send summaries or the transformed response to Google Sheets, Airtable, or CRMs like HubSpot or Salesforce.
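For reference, the scraping step typically goes through Bright Data's request API. A minimal sketch of the HTTP Request node's JSON body, assuming the documented zone/url/format fields (the zone name and target URL are placeholders you set in the Set Indeed Search Query node, and the Header Auth credential supplies the Authorization header):

```json
{
  "zone": "web_unlocker1",
  "url": "https://www.indeed.com/cmp/Example-Company",
  "format": "raw"
}
```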
by Yang
👥 Who is this for?

This workflow is ideal for virtual assistants, researchers, developers, automation specialists, and data analysts who need to regularly extract and organize structured product information (like books) from a website. It's especially useful for those working with catalog-based websites who want to automate the extraction and delivery of clean, sorted data.

🧩 What problem is this solving?

Manually copying product listings like book titles and prices from a website into a spreadsheet is slow and repetitive. This automation solves that problem by scraping content using Dumpling AI, extracting the right data using CSS selectors, and formatting it into a clean CSV file that is sent to your email, all triggered automatically when a new URL is added to Google Sheets.

⚙️ What this workflow does

This template automates an entire content scraping and delivery process:
1. Watches a Google Sheet for new URLs
2. Scrapes the HTML content of the given webpage using Dumpling AI
3. Uses CSS selectors in the HTML node to extract each book from the page
4. Splits the HTML array into individual items
5. Extracts the book title and price from each HTML block
6. Sorts the books in descending order based on price
7. Converts the sorted data to a CSV file
8. Sends the CSV via email using Gmail

🛠️ Setup

Google Sheets
- Create a sheet titled something like URLs
- Add your product listing URLs (e.g., http://books.toscrape.com)
- Connect the Google Sheets trigger node to your sheet
- Ensure you have proper credentials connected

Dumpling AI
- Create an account at Dumpling AI
- Generate your API key
- Set the HTTP method to POST and pass the URL dynamically from the Google Sheet
- Use Header Auth to include your API key in the request header
- Make sure "cleaned": "True" is included in the body for optimized HTML output (a sample request body is sketched at the end of this section)

HTML Node
- The first HTML node extracts the main book container blocks using: .row > li
- The second HTML node parses out the individual fields:
  - title: h3 > a (via the title attribute)
  - price: .price_color

Sort Node
- Sorts books by price in descending order
- Note: the price is extracted as a string; make sure it is parsable if you plan to apply numeric filtering later

Convert to CSV
- The JSON data is passed into a Convert node and transformed into a CSV file

Gmail
- Sends the CSV as an attachment to a designated email address

🔄 How to customize this workflow

- **Extract more data**: Add more CSS selectors in the second HTML node to pull fields like author, availability, or product links
- **Switch destinations**: Replace Gmail with Slack, Google Drive, Dropbox, or another platform
- **Adjust sorting**: Sort alphabetically or based on another extracted value
- **Use a different source**: As long as the site structure is consistent, this can scrape any listing-like page
- **Trigger differently**: Use a webhook, form submission, or schedule trigger instead of Google Sheets

⚠️ Dependencies and Notes

- This workflow uses Dumpling AI to perform the web scraping. This requires an API key and uses credits per request.
- The HTML node depends on valid CSS selectors. If the site layout changes, the selectors may need to be updated.
- Ensure you're not scraping content from websites that prohibit automated scraping.
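A minimal sketch of the Dumpling AI request body, assembled only from the settings described above; the exact endpoint path and any additional fields are not specified here, so check Dumpling AI's API documentation before relying on this shape:

```json
{
  "url": "{{ $json.URL }}",
  "cleaned": "True"
}
```

Here {{ $json.URL }} is the n8n expression pulling the row value from the Google Sheets trigger, and the API key travels in the request header via the Header Auth credential rather than in the body.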
by GYANENDRA DWIVEDI
🚀 WhatsApp Automation Template
Designed & Developed by Infridet Solutions Private Limited

🔧 Objective: Automate your lead nurturing and sales process from YouTube/Instagram → Landing Page → CRM → Email → WhatsApp → Sales → Deal Closure using tools like:

- 🌐 WordPress (Landing Page + Fluent Forms)
- 🧾 Google Sheets (Backup Log)
- 📩 FluentCRM (Lead Tagging + Email Sequences)
- 💬 Whinta.com (WhatsApp Messaging API)
- ⚙️ N8N (Workflow Automation Engine)

🧩 System Flow Overview:

1. Lead Source: YouTube or Instagram CTA
2. Landing Page: Built on WordPress with a story-driven design
3. Form Capture: Fluent Forms with dynamic input fields
4. Data Sync: Backup to Google Sheets; push the lead to FluentCRM and tag as New Lead
5. Email Sequence: Warm-up emails (1 to 5) introduce the offer or service
6. WhatsApp Outreach: Send a personalized message via Whinta, triggered 1 hour after form fill or the last email
7. Sales Follow-Up: Sales team handles replies manually; CRM tag updated to Customer upon closing

📁 Folder Structure (Optional Git/Zip File):

```
📦 WhatsApp-Automation-Infridet/
│
├── whatsapp-automation-n8n.json   # N8N Flowchart Import File
├── email-templates.docx           # Warm-up Email Scripts
├── whinta-api-integration.pdf     # API Documentation
├── crm-tagging-notes.txt          # CRM Tag Setup Details
└── readme.md                      # This Instruction File
```

🛠️ Required Integrations & Setup

✅ Fluent Forms (WordPress)
- Embed a form with Name, Email, Phone
- Enable a webhook to N8N: /lead-capture

✅ Google Sheets
- Use the n8n-nodes-base.googleSheets node
- Capture name, email, phone, source, timestamp

✅ FluentCRM
- REST API enabled
- Push the contact and assign the tag New Lead
- Set up email automation via a tag trigger

✅ SMTP Email (Optional)
- Use Gmail SMTP or Brevo
- Trigger the email on form submission

✅ Whinta.com (WhatsApp API)
- Send a POST request; the payload includes phone, message, sender_id (a sketch follows at the end of this section)
- Customize the message with personalization

💬 Sample WhatsApp Message:

> Hey {{name}}, Gyan here from Account Craft 👋
> I saw your form submission – would you like help in starting your YouTube journey this week?
> Let me know. I'm just one text away. ✅

📧 Sample Email (Warmup Day 1):

> Subject: Welcome to Account Craft 🚀
>
> Hi {{name}},
>
> I'm Gyan from Account Craft. Thanks for joining us!
> Here's what's coming next: exclusive videos, personalized tips, and real support to get your YouTube channel earning.
>
> Let's go!
> – Gyan

🔁 CRM Tag Updates:

| Action            | Tag Assigned |
|-------------------|--------------|
| On form fill      | New Lead     |
| After WhatsApp    | Engaged      |
| After sale closed | Customer     |

📌 Final Output:

Once completed, the system will:
- Log all leads into a database
- Automatically send emails and WhatsApp messages
- Notify your sales team
- Update lead status without manual entry

> Automation Template Designed & Deployed by
> Infridet Solutions Private Limited
> Smart Integrations. Seamless Business.
> 🌐 www.infridetsolutions.com | 📞 +91-8853354829
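A minimal sketch of the Whinta POST body, built only from the three fields named above; the endpoint URL, authentication headers, and exact field names come from Whinta's API documentation, so treat everything here as illustrative:

```json
{
  "phone": "+918800000000",
  "message": "Hey Rahul, Gyan here from Account Craft 👋 Would you like help in starting your YouTube journey this week?",
  "sender_id": "accountcraft-main"
}
```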
by Corentin Ribeyre
This template can be used to search for an email address with Icypeas. Be sure to have an active account to use this template.

How it works

This workflow can be divided into three steps:
1. The workflow initiates with a manual trigger (On clicking 'execute').
2. It connects to your Icypeas account.
3. It performs an HTTP request to search for an email address.

Set up steps

You will need a working Icypeas account to run the workflow and obtain your API Key, API Secret, and User ID. You will also need a person's first name, last name, and domain/company name to perform the search.
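As an illustration of the search call, a minimal request body built from the three inputs named above. The field names here are an assumption based on this description, not confirmed from Icypeas' docs; check their API reference for the exact schema and for the signed-request headers derived from your API Key and Secret:

```json
{
  "firstname": "Jane",
  "lastname": "Doe",
  "domainOrCompany": "example.com"
}
```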
by Paulo Ramirez
Upload your CRM contacts to telli and schedule AI voice-agent calls

Introduction to telli and AI Voice-Agent Calls

telli is an innovative platform that provides AI-powered voice agents capable of making calls and performing tasks tailored to specific customer use cases. These AI voice-agents can handle a wide range of communication tasks, from appointment scheduling to customer support, with remarkable efficiency and natural conversation flow.

This template is designed for businesses and organizations looking to automate their outbound calling processes using telli's AI voice-agents in conjunction with Airtable as their CRM. It solves the problem of manual call scheduling and data transfer between your CRM and calling system, saving time and reducing human error.

Prerequisites

- telli account
- Airtable base with contact information
- n8n instance

Step-by-Step Setup Guide

1. n8n Setup: Create a new workflow in n8n and add the Airtable node to connect to your CRM table.
2. telli API Configuration: Log in to your telli dashboard, then locate and copy your API key under telli - Settings - API/Webhooks.
3. Workflow Configuration: Add two HTTP Request nodes to your n8n workflow. Set the "Authorization" header in both POST requests, replacing the value with your telli API key. Configure the first request to use the /add-contact endpoint and the second to use the /schedule-call endpoint.
4. Data Mapping: Map the relevant fields from your Airtable node to the telli API requests.
5. Testing and Activation: Run a test execution of your workflow. Once satisfied with the results, activate the workflow.

API Endpoint Details

Add Contact Endpoint
- **URL**: https://api.telli.com/v1/add-contact
- **Method**: POST
- **Headers**: Authorization: YOUR-API-KEY, Content-Type: application/json
- **Payload**:

```json
{
  "external_contact_id": "string",
  "salutation": "string",
  "first_name": "string",
  "last_name": "string",
  "phone_number": "string",
  "email": "jsmith@example.com",
  "contact_details": {},
  "timezone": "string"
}
```

Schedule Call Endpoint
- **URL**: https://api.telli.com/v1/schedule-call
- **Method**: POST
- **Headers**: Authorization: YOUR-API-KEY, Content-Type: application/json
- **Payload**:

```json
{
  "contact_id": TELLI-CONTACT-ID,
  "agent_id": "string",
  "max_retry_days": 123,
  "call_details": {
    "message": "Hello, this is your friendly reminder!",
    "questions": [
      {
        "fieldName": "email",
        "neededInformation": "email of the customer",
        "exampleQuestion": "What is your email address?",
        "responseFormat": "email string"
      }
    ]
  },
  "override_from_number": "string"
}
```

Use Cases

This template is versatile and can be applied to various scenarios, including:
- **Lead Qualification**: Automatically schedule calls to new leads entered in your CRM.
- **Appointment Reminders**: Set up calls to remind clients of upcoming appointments.
- **Customer Feedback**: Schedule follow-up calls after product deliveries or service completions.

Uploading Multiple Contacts

For bulk operations, you have two options:
1. Loop Node: Include a Loop node in your n8n workflow to process multiple contacts sequentially.
2. Batch Endpoints: Instead of /add-contact and /schedule-call, use telli's batch endpoints:
   - /add-contacts-batch: Add multiple contacts within an array.
   - /schedule-calls-batch: Schedule multiple calls at once.
Example of batch endpoint usage:

```json
{
  "contacts": [
    { "name": "John Doe", "phone": "+1234567890" },
    { "name": "Jane Smith", "phone": "+1987654321" }
  ]
}
```

By leveraging this template, you can seamlessly integrate your Airtable CRM with telli's powerful AI voice-agents, automating your outbound calling process and enhancing your customer communication strategy.
by Ranjan Dailata
Who this is for

The Automate Etsy Data Mining with Bright Data Scrape & Google Gemini workflow is designed for eCommerce analysts, product researchers, and AI developers seeking to extract actionable insights from Etsy listings at scale. It is ideal for:
- **eCommerce Entrepreneurs**: Researching product demand and competition.
- **Market Analysts**: Tracking pricing, reviews, and trends across Etsy categories.
- **Product Managers**: Identifying niche opportunities and design inspirations.
- **Data Scientists & AI Engineers**: Automating product intelligence pipelines.
- **Growth Hackers**: Leveraging Etsy insights to refine product-market fit.

What problem is this workflow solving?

Manually browsing Etsy to analyze product listings, pricing, reviews, and seller activity is slow, inconsistent, and unscalable. Scraping Etsy requires unlocking JavaScript-heavy content and structuring noisy data for analysis. This workflow solves that by:
- Automating scalable scraping of Etsy product listings using Bright Data's infrastructure.
- Extracting fully paginated, structured Etsy product data via the Google Gemini LLM.
- Enabling faster decision-making for product research and competitive analysis through fully automated, paginated data extraction.

What this workflow does

1. Receives input: sets the Etsy URL for the data extraction and analysis.
2. Uses Bright Data's Web Unlocker to extract content from the relevant pages.
3. Cleans and preprocesses the scraped content for readability.
4. Sends the content to Google Gemini for structured, enriched extraction.
5. Persists the enriched results to disk.
6. Sends the response to a target system via webhook notification.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
4. Add a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set Etsy Search Query node with the brand content URL and the Bright Data zone name.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs

- **Input Sources**: Replace the static URL with dynamic input from Google Sheets, Webhook, or Airtable to research multiple niches.
- **Prompt Customization**: Adjust the Gemini prompts to extract specific insights, for example: list key features of the product, or summarize the review themes. (A sketch of a structured-output schema follows below.)
- **Data Output Options**: Update the webhook notification to save data to Google Sheets, Notion or Airtable, SQL/NoSQL stores, or Slack/Email.
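If you want Gemini to return machine-readable fields rather than free text, one option is to instruct it to emit JSON matching a small schema. The field names below are illustrative, not part of the original template; adapt them to whatever insights you ask the prompt to extract:

```json
{
  "product_title": "string",
  "price": "string",
  "rating": "number",
  "reviews_count": "number",
  "key_features": ["string"],
  "review_theme_summary": "string"
}
```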
by mariskarthick
QuantumDefender AI is a next-generation intelligent cybersecurity assistant designed to harness the symbolic strength of quantum computing's promise alongside cutting-edge AI capabilities. This sophisticated agent empowers SOC analysts, red teamers, and security researchers with rapid threat investigation, operational automation, and intelligent command execution, all driven by GPT-4 and integrated tools, accessible through Telegram or any medium.

🔑 Key Features:

- Expert-Level Cybersecurity Research & Analysis: Leverages powerful AI models to deliver clean, detailed, domain-specific insights across detection, remediation, and offensive security.
- Command & Control: Executes Linux shell commands, autonomous scripts, and system operations securely in isolated environments.
- Real-Time Web Intelligence: Uses the integrated Langsearch API to provide timely internet research with contextual relevance.
- Calendar & Scheduling Automation: Manages Google Calendar events, or any similar application, dynamically from chat (create, update, delete, retrieve).
- Multi-Tool Orchestration: Combines calculator functions, internet searches, command execution, and messaging for comprehensive operational support.
- Telegram-Native Chatbot: Delivers an adaptive, memory-informed, and interactive conversational experience with immediate typing indicators and high responsiveness.
- Conversation & Session Management: Maintains context-aware, session-based memory to enable smooth, multi-turn dialogues with individual users; sends "typing…" indicators during processing for an interactive, user-friendly chat experience; operates exclusively within Telegram, delivering rich, timely responses and leveraging all Telegram bot capabilities.
- Execution Intelligence & Safety: Fully autonomous in deciding which tools to invoke, how frequently, and in what sequence to fulfill user requests comprehensively and responsibly. All command executions are contained within a secure temporary folder environment to avoid persistent or harmful side effects, and strict safety protocols prevent running malicious or destructive commands, maintaining ethical standards and compliance.

Use Cases:

- Cybersecurity researchers and operators seeking an intelligent assistant to accelerate investigations and automate routine tasks.
- Red team professionals requiring on-the-fly command execution and information gathering integrated with tactical chat interactions.
- SOC teams aiming to augment their alert triage and incident handling workflows with AI-powered analysis and action.
- Anyone looking for a robust multi-tool AI chatbot integrated with real-world operational capabilities.

Setup Requirements:

- OpenAI API key for GPT-4.1-nano language processing.
- Telegram Bot API credentials with a proper webhook set up to receive and respond to messages.
- Google OAuth credentials for Calendar integration, if calendar features are used.
- SSH access credentials for executing commands on remote hosts, if remote execution is enabled.
- Internet connectivity for the Langsearch web search API.

Customization & Extensibility:

The workflow is built modularly with n8n's flexible node system. Users can extend it by adding more tools, integrating other services (ticketing, threat intel, scanning tools), or modifying the interaction logic to suit specialized operational needs and environments.

Created by Mariskarthick M
Senior Security Analyst | Detection Engineer | Threat Hunter | Open-Source Enthusiast
by Krishna Kumar Eswaran
🧠 Problem This Solves

For developers and creators, consistently posting quality content on LinkedIn can be time-consuming. This workflow automates the process by:
- Fetching the latest Dev.to articles
- Posting them to LinkedIn twice daily
- Preventing duplicates using Airtable
- Sending success alerts to Telegram

This ensures you're always active on LinkedIn, with zero manual effort.

👥 Who This Template Is For

- Developers who want to build their presence on LinkedIn
- Tech creators or solo founders looking to grow an audience
- Community/page managers who want regular, curated content
- Busy professionals aiming for consistent LinkedIn engagement without doing it manually

⚙️ Workflow Breakdown

This automation runs twice a day (9:00 AM and 7:00 PM) and performs the following steps:
1. Fetches Dev.to articles based on a tag
2. Checks Airtable to avoid reposting the same article
3. Posts to LinkedIn if it's new
4. Sends a Telegram message after posting successfully

🧩 Step-by-Step Setup Instructions

✅ 1. Airtable Configuration
Create a new base in Airtable with just one table and one column:
- Table Name: PostedArticles
- Column: ArticleID (Single line text – stores the unique ID of each Dev.to article posted)

This column is used to track posted articles and prevent duplicates.

🔗 2. Dev.to API Setup
Use the following endpoint in the HTTP Request node:

https://dev.to/api/articles?tag=YOUR_TAG_HERE&per_page=10

Replace YOUR_TAG_HERE with a tag like android, webdev, ai, etc. (A sample of the response this returns is sketched at the end of this section.)

💬 3. Telegram Bot Setup
- Open @BotFather in Telegram and create a new bot
- Save the bot token
- Get your chat ID using @userinfobot or via the Telegram API
- Add a Telegram node in n8n using this token and chat ID

This will notify you when a post is successfully published.

🧾 4. LinkedIn Setup
- Create a LinkedIn Developer App
- Use OAuth2 to connect it in n8n
- Choose to post on either a user profile or a company page

🧱 5. n8n Workflow Structure
Here's the basic structure of the workflow:
1. Cron Node – Triggers at 9:00 AM and 7:00 PM daily
2. HTTP Request – Fetches the latest articles from Dev.to
3. Airtable Search – Checks if the ArticleID already exists
4. IF Node – Filters new vs. already-posted articles
5. LinkedIn Post – Publishes the new article
6. Airtable Create – Saves the new ArticleID
7. Telegram Message – Sends success confirmation

🛠️ Customization Tips

- Change the Dev.to tag in the API URL
- Modify the LinkedIn post format (add hashtags, emojis, personal notes)
- Adjust posting times in the Cron node
- Use additional filters (e.g., only post articles with a cover image or a certain word count)
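For reference when mapping fields, the Dev.to endpoint above returns a JSON array of article objects. A trimmed example with illustrative values, showing the fields this workflow relies on, notably id for the Airtable duplicate check:

```json
[
  {
    "id": 1523012,
    "title": "Getting Started with Jetpack Compose",
    "description": "A practical introduction to declarative UI on Android.",
    "url": "https://dev.to/example/getting-started-with-jetpack-compose-abc1",
    "cover_image": "https://media.dev.to/images/cover.png",
    "published_at": "2024-04-20T09:15:00Z",
    "tag_list": ["android", "kotlin"]
  }
]
```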
by JPres
👥 Who Is This For?

Sales and marketing teams seeking efficient, hands-free generation of personalized slide decks for each prospect from CSV lead lists.

🛠 What Problem Does This Solve?

Manually editing presentation decks for large lead lists is slow and error-prone. This workflow fully automates:
- Importing and parsing CSV lead data
- Logging leads and outputs in Google Sheets
- Duplicating a master Slides template per lead
- Injecting lead-specific variables into slides

🔄 Node-by-Node Breakdown

| Step | Node                                      | Purpose                                                   |
| ---- | ----------------------------------------- | --------------------------------------------------------- |
| 1    | New Leads Arrived                         | Detect new CSV uploads in Drive                            |
| 2    | File Type?                                | Filter for .csv files only                                 |
| 3    | Download by ID                            | Download the CSV content                                   |
| 4    | Create new Sheet                          | Create a Google Sheet to record lead data                  |
| 5    | Combine Empty New Document with CSV Data  | Structure each lead record for slide creation              |
| 6    | Merge Data for new Lead Document          | Map template placeholders to lead values                   |
| 7    | Get all Leads                             | Retrieve sheet rows to iterate through each lead           |
| 8    | MoveToLeadListFolder                      | Move the processed CSV to an archive folder                |
| 9    | Copy Slides Template                      | Make a copy of the master Slides deck                      |
| 10   | Create Custom Presentation                | Replace placeholders in the copied deck with lead data     |
| 11   | Add Presentation ID to Lead               | Write the generated presentation URL back into the Sheet   |

⚙️ Pre-conditions / Requirements

- n8n with Google Drive, Sheets, and Slides credentials
- A master Google Slides deck with placeholder tokens (e.g. {{Name}}, {{Company}})
- A Drive folder for incoming CSV lead files

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure the New Leads Arrived node to watch your CSV folder.
3. Enter your Google credentials in the Drive, Sheets, and Slides nodes.
4. Specify the master Slides template ID in the Copy Slides Template node.
5. In Create Custom Presentation, map slide tokens to sheet column names (see the sketch after this section).
6. Disable "Keep Binary Data" in Copy Slides Template to conserve memory.
7. Upload a sample CSV (with headers like Name, Company, Metric) to test.

🎨 How to Customize

- Add or remove variables by editing the CSV headers and updating the mapping in Merge Data for new Lead Document.
- Insert an AI/natural-language node before slide creation to generate more advanced and personalized text blocks.
- Use SplitInBatches to throttle API calls and avoid rate-limit errors.
- Add error-handling branches to capture and log failed operations.

🔐 Security and Privacy

- The workflow uses placeholder variables for file and folder IDs, so no actual IDs are exposed in the template.
- Ensure OAuth scopes are limited to only the required Google APIs.
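Under the hood, token replacement in Google Slides maps onto the Slides API's batchUpdate method with replaceAllText requests. A minimal sketch of what the Create Custom Presentation step amounts to for one lead (the tokens and replacement values are examples matching the placeholders above):

```json
{
  "requests": [
    {
      "replaceAllText": {
        "containsText": { "text": "{{Name}}", "matchCase": true },
        "replaceText": "Jane Doe"
      }
    },
    {
      "replaceAllText": {
        "containsText": { "text": "{{Company}}", "matchCase": true },
        "replaceText": "Acme Corp"
      }
    }
  ]
}
```

This body goes to POST https://slides.googleapis.com/v1/presentations/{presentationId}:batchUpdate; n8n's Google Slides node wraps the same operation, so you normally only fill in the token/value pairs.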
by JPres
👥 Who Is This For?

Content creators, marketing teams, and channel managers who want a simple, hands-off solution to upload videos and automatically generate optimized metadata from video transcripts.

🛠 What Problem Does This Solve?

Manually uploading videos and creating proper metadata is time-consuming and repetitive. This workflow fully automates:
- Monitoring a specific Google Drive folder for new video uploads
- Seamless YouTube upload processing
- Transcript extraction for context understanding
- AI-powered generation of titles, descriptions, and tags
- Metadata application to uploaded videos without manual intervention

🔄 Node-by-Node Breakdown

| Step | Node Purpose                                                           |
| ---- | ---------------------------------------------------------------------- |
| 1    | New Video? (Trigger) – Monitors the specified Google Drive folder      |
| 2    | Download New Video – Retrieves the video file from Google Drive        |
| 3    | Upload to YouTube – Uploads the video to YouTube with initial settings |
| 4    | Get Transcript – Extracts the transcript from the uploaded video       |
| 5    | Adjust Transcript Format – Formats the raw transcript for processing   |
| 6    | Create Description – Generates an SEO-optimized description            |
| 7    | YT Tags (Message Model) – Creates relevant tags based on content       |
| 8    | YT Title (Message Model) – Generates a compelling title                |
| 9    | Define File Path Upload Format (Optional) – Structures data paths      |
| 10   | Update Video's Metadata – Applies the generated title, description, tags |

⚙️ Pre-conditions / Requirements

- n8n with Google Drive and YouTube API credentials configured (stored as n8n credentials/variables; no hard-coded IDs)
- Dedicated Google Drive folder for video uploads
- YouTube channel with proper upload permissions
- AI service access for transcript processing and metadata generation
- Sufficient storage for temporary video handling

⚙️ Setup Instructions

1. Import this workflow into your n8n instance.
2. Configure Google Drive credentials; reference the folder ID via an n8n variable (do not hard-code it).
3. Set up YouTube API credentials with upload and edit permissions.
4. Specify the target Google Drive folder ID in the New Video? trigger node (via variable).
5. Configure AI service credentials for transcript and metadata generation.
6. Adjust the message templates for title, description, and tag creation (the final metadata call is sketched at the end of this section).
7. Test with a small video file before production use.

🎨 How to Customize

- Modify AI prompts to match your channel's tone and style.
- Add conditional logic based on video categories or naming conventions.
- Implement notification systems to alert when uploads complete.
- Create custom metadata templates for different content types.
- Include timestamps or chapter markers based on transcript analysis.
- Add social media sharing nodes to announce new uploads.

⚠️ Important Notes

- Video quality is preserved through the upload process.
- Consider YouTube API quotas when handling multiple uploads.
- Transcript quality affects metadata generation results.
- Videos are initially uploaded without visibility adjustments.
- Processing time depends on video length and transcript complexity.

🔐 Security and Privacy

- Store API credentials and folder IDs as n8n Credentials/Variables; remove any hard-coded tokens or IDs.
- Video files are processed temporarily and not stored permanently.
- Limit Google Drive folder access to authorized users only.
- Manage YouTube upload permissions carefully (use OAuth/service accounts).
- Ensure compliance with organizational data-handling policies.
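For orientation, the final metadata step corresponds to the YouTube Data API's videos.update call with a snippet body along these lines (all values illustrative; note that the API requires categoryId alongside title whenever the snippet is updated):

```json
{
  "id": "VIDEO_ID",
  "snippet": {
    "title": "How We Automated Our Upload Pipeline with n8n",
    "description": "A walkthrough of the automated upload pipeline, generated from the video transcript.",
    "tags": ["n8n", "automation", "youtube api"],
    "categoryId": "28"
  }
}
```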
by Krishna Kumar Eswaran
🧠 Problem This Solves

Manually sharing Medium articles to LinkedIn daily can be repetitive and time-consuming. This automation:
- Fetches the latest Medium articles based on a tag (e.g., android)
- Posts them on LinkedIn twice daily
- Uses Airtable to prevent duplicates
- Sends a confirmation to Telegram once posted

Stay consistently active on LinkedIn without lifting a finger.

👥 Who This Template Is For

- Developers who write or follow Medium content
- Tech creators or founders looking to grow an audience
- Community or page managers needing regular curated posts
- Busy professionals who want hands-free LinkedIn engagement

⚙️ Workflow Breakdown

This automation runs at 9:00 AM and 7:00 PM daily and performs these steps:
1. Fetch articles from MediumAPI.com by tag
2. Check Airtable to prevent reposting the same article
3. Post on LinkedIn if it's new
4. Store the article ID in Airtable
5. Send a Telegram message after successful posting

🧾 Step-by-Step Setup Instructions

✅ 1. Airtable Configuration
Create a base with:
- Table Name: PostedArticles
- Column: ArticleID (Single line text – to track posted articles)

🔗 2. MediumAPI Setup
- Go to https://mediumapi.com
- Sign up and generate your API key from the dashboard
- Use this API endpoint in an HTTP node:

GET https://mediumapi.com/api/tag/YOUR_TAG/latest
Headers: Authorization: Bearer YOUR_API_KEY

Replace YOUR_TAG with a topic like android, ai, webdev, etc.

💬 3. Telegram Bot Setup
- Go to @BotFather and create a new bot
- Save the bot token
- Use @userinfobot to get your Telegram chat ID
- Add a Telegram node in n8n with the token + chat ID

🔗 4. LinkedIn Setup
- Create a LinkedIn Developer App
- Connect it via OAuth2 in n8n
- Choose to post on your profile or company page
- (A sketch of the underlying LinkedIn share payload follows at the end of this section.)

🧱 5. n8n Workflow Structure

| Node Type       | Description                             |
|-----------------|------------------------------------------|
| Cron            | Triggers the flow twice a day            |
| HTTP Request    | Fetches articles from MediumAPI.com      |
| Airtable Search | Checks if the article ID already exists  |
| IF Node         | Skips duplicates                         |
| LinkedIn Post   | Publishes to your LinkedIn profile/page  |
| Airtable Create | Stores the posted article ID             |
| Telegram Node   | Sends a success notification             |

🛠️ Customization Tips

- Change the tag in the API URL to match your niche
- Add hashtags or personal comments to the LinkedIn message
- Schedule different posting times in the Cron node
- Filter Medium posts based on length or title keywords (optional)
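If you ever build the LinkedIn post with a raw HTTP Request instead of the LinkedIn node, LinkedIn's v2 UGC Posts API (POST https://api.linkedin.com/v2/ugcPosts) expects a body along these lines; the author URN, commentary, and article URL are placeholders:

```json
{
  "author": "urn:li:person:PERSON_ID",
  "lifecycleState": "PUBLISHED",
  "specificContent": {
    "com.linkedin.ugc.ShareContent": {
      "shareCommentary": { "text": "Worth a read: a new Medium article on Android 👇" },
      "shareMediaCategory": "ARTICLE",
      "media": [
        { "status": "READY", "originalUrl": "https://medium.com/@author/example-article" }
      ]
    }
  },
  "visibility": {
    "com.linkedin.ugc.MemberNetworkVisibility": "PUBLIC"
  }
}
```

The built-in LinkedIn node hides this payload behind its fields, so for most setups you only need it when debugging or extending the post format.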