by Lucas Walter
Reverse engineer short-form videos from Instagram and TikTok using Gemini AI

## Who's it for

Content creators, AI video enthusiasts, and digital marketers who want to analyze successful short-form videos and understand their production techniques. Perfect for anyone looking to reverse-engineer viral content or create detailed prompts for AI video generation tools like Google Veo or Sora.

## How it works

This automation takes any Instagram Reel or TikTok URL and performs a forensic analysis of the video content. The workflow downloads the video, converts it to base64, and uses Google's Gemini 2.5 Pro vision API to generate an extremely detailed "Generative Manifest": a comprehensive prompt that could be used to recreate the video with AI tools (see the sketch after this section).

The analysis includes:

- Visual medium identification (film stock, camera sensor, lens characteristics)
- Color grading and lighting breakdown
- Shot-by-shot deconstruction with precise timing
- Camera movement and framing details
- Subject description and action choreography
- Environmental and atmospheric details

## How to set up

1. **Configure API credentials:** Add your Apify API key for video scraping and set up Google Gemini API authentication.
2. **Set up Slack integration (optional):** Configure Slack OAuth for result sharing and update the channel ID where results should be posted.
3. **Access the form:** The workflow creates a web form where you can input video URLs. The form accepts both Instagram Reel and TikTok URLs.

## Requirements

- **Apify account** with API access for video scraping
- **Google Cloud account** with Gemini API enabled
- **Slack workspace** (optional, for sharing results)
- Videos must be publicly accessible (no private accounts)

## How to customize the workflow

- **Modify the analysis prompt:** Edit the "set_base_prompt" node to adjust the depth and focus of the video analysis.
- **Add different platforms:** Extend the switch node to handle other video platforms.
- **Integrate with other tools:** Replace Slack with email, Discord, or other notification systems.
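To make the core step concrete, here is a minimal sketch of the download-to-base64-to-Gemini call in plain Node.js; the workflow itself does this with n8n nodes, so this is not the template's exact configuration. The request shape follows Google's public generateContent API, the prompt and video URL are assumed to arrive from earlier steps, and clips larger than about 20 MB would need Google's Files API rather than inline data.

```javascript
// Minimal sketch, not the template's exact node configuration.
const GEMINI_API_KEY = process.env.GEMINI_API_KEY;

async function analyzeVideo(videoUrl, prompt) {
  // Download the video and convert it to base64, mirroring the workflow.
  const videoBuffer = Buffer.from(await (await fetch(videoUrl)).arrayBuffer());
  const base64Video = videoBuffer.toString("base64");

  // Send the analysis prompt plus inline video data to Gemini 2.5 Pro.
  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-pro:generateContent?key=${GEMINI_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [{
          parts: [
            { text: prompt },
            { inline_data: { mime_type: "video/mp4", data: base64Video } },
          ],
        }],
      }),
    }
  );
  const data = await res.json();
  return data.candidates?.[0]?.content?.parts?.[0]?.text; // the "Generative Manifest"
}
```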
by algopi.io
## Who is this template for?

This workflow template is designed for Marketing and pre-Sales teams who want to capture prospects from a form tool like Tally, push the data into the open-source CRM SuiteCRM, synchronize the contact in Brevo while linking it to the CRM id, and then get notified in Nextcloud. Bonus: emails are validated with CaptainVerify, and the Nextcloud notification depends on the validation result.

## How it works

1. Each form submission triggers a webhook.
2. The email address is checked with CaptainVerify (sketched below).
3. If the check passes, a Lead is created in SuiteCRM; otherwise, a message is sent to your selected discussion.
4. Once the lead is created, a contact is created in Brevo (for future campaigns) and linked to the lead_id from the CRM via a dedicated field.
5. Finally, a message in your selected Nextcloud discussion informs you about the lead.

## Set up instructions

- Complete the Set up credentials step when you first open the workflow. You'll need a CaptainVerify account (API key), a dedicated SuiteCRM user with OAuth, a Brevo account (API key), and a Nextcloud account.
- Set up the webhook in the form tool of your choice (why not Tally?).
- Configure each node following the explanations in the sticky notes.
- Enjoy!

Template was created in n8n v1.44.1.
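For illustration, the email-check branch could look like the sketch below. The CaptainVerify endpoint path and parameter names are assumptions made for this example, so check the CaptainVerify API documentation for the exact contract.

```javascript
// Hedged sketch: the endpoint path and parameter names are assumptions.
async function isEmailValid(email, apiKey) {
  const url = `https://api.captainverify.com/v2/verify?apikey=${apiKey}&email=${encodeURIComponent(email)}`;
  const res = await fetch(url);
  const data = await res.json();
  // The workflow branches here: a valid email continues to SuiteCRM lead
  // creation, anything else sends a message to the Nextcloud discussion.
  return data.result === "valid";
}
```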
by Jimleuk
Ever wanted to build your own RAG search over Youtube videos? Well, now you can! This n8n template shows how you can build a very capable Youtube search engine powered by Apify, Qdrant and your LLM of choice to quickly and efficiently browse over many videos for research.

I originally started this template to ask questions on the "n8n @ scale office-hours" livestream videos but then extended it to include the latest videos on the official channel. Check out a demo here: https://jimleuk.app.n8n.cloud/webhook/n8n_videos

## How it works

Stage 1 is to collect the Youtube video transcripts and push them into a vector database. For this, I've used Apify to scrape Youtube and Qdrant to store the embeddings. Transcripts are broken down into smaller chunks and carefully tagged with metadata to assist in later search and filtering.

Stage 2 is to build a web frontend for the user to query the vectorised transcripts. I'm using a webhook to serve a simple web app and API to dynamically fetch the results. When searching for a video, I've opted to use Qdrant's search groups API which, in this use-case, performs better as it returns a wider range of video results (see the sketch after this section). In the web frontend, when the user clicks on a result, the matching Youtube video plays in an embedded video player.

## How to use

- Once credentials are all set, first run steps 1 - 3 to populate your vector store.
- Next, set the workflow to active to expose the web frontend.
- Visit the webhook URL in your browser to use it.
- If only for personal use, you may want to remove the rate limiting mechanism in step 4.

## Requirements

- Apify for Youtube channel and video scraping
- Qdrant for the vector store
- OpenAI for LLM and embeddings

## Customising the template

- Not interested in official n8n videos? Swap to a different channel - this template will work on many as long as videos are not private or set to prevent embeds.
- Technically any vector store should work but may not have the same grouping API. Use the simple vector store node and revert back to basic searching instead.
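As a reference, here is roughly what the search-groups request looks like against Qdrant's REST API. The collection name and the group_by payload field are assumptions for this sketch; the template's actual names may differ.

```javascript
// Sketch of Qdrant's search-groups call; collection and payload field
// names are illustrative assumptions.
async function searchVideos(queryVector) {
  const res = await fetch(
    "http://localhost:6333/collections/youtube_transcripts/points/search/groups",
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        vector: queryVector,           // embedding of the user's search query
        group_by: "metadata.video_id", // one group per video, so more distinct videos surface
        limit: 6,                      // number of video groups to return
        group_size: 2,                 // best-matching transcript chunks per video
        with_payload: true,            // include metadata for the frontend player
      }),
    }
  );
  return (await res.json()).result.groups;
}
```

Grouping by video id is the design choice that makes results feel like a video search engine rather than a chunk search: without it, one long video can monopolize the top results.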
by Marvin Wu
## Who is this for?

This workflow is designed for n8n users and developers who need to automate the documentation process of their n8n workflows. It's particularly useful for teams looking to streamline their documentation efforts and ensure consistency across their workflow documentation.

## What problem is this workflow solving? / Use case

The primary problem this workflow addresses is the manual and time-consuming process of creating documentation for n8n workflows. It automates the generation of concise, clear, and comprehensive documentation directly from the workflow's JSON, making it easier for both technical and non-technical users to understand what the workflow does and how it operates.

## What this workflow does

Upon receiving a form submission with the workflow title and JSON, this workflow automatically generates documentation that includes:

- A brief introduction to the workflow.
- The trigger mechanism (webhook URLs for test and production environments, or cron schedules).
- Setup requirements, including necessary credentials and external dependencies.

## Setup

- **Credentials setup:** Ensure you have OpenAI API credentials configured in n8n to use the GPT model for generating documentation text.
- **Form submission:** Users must submit the form with the workflow title and JSON. The form is accessible via:
  - Test URL: domain/form-test/{webhookId}
  - Production URL: domain/form/{webhookId}

## How to customize this workflow to your needs

- **Modify trigger URLs:** Adjust the webhook or form URLs based on your domain and specific n8n setup.
- **Customize documentation template:** Edit the OpenAI node's prompt to change the structure or details of the generated documentation.
- **Extend functionality:** Add nodes to integrate with other systems (e.g., automatically publishing the documentation to a wiki or sending it via email).

This workflow simplifies the documentation process, making it accessible and manageable for teams of all sizes and technical abilities. By automating documentation, it ensures that all workflows are properly documented, enhancing understanding and efficiency within teams.
by Trung Tran
🎧 IT Voice Support Automation Bot – Telegram Voice Message to JIRA Ticket with OpenAI Whisper

> Automatically process IT support requests submitted via Telegram voice messages by transcribing, extracting structured data, creating a JIRA ticket, and notifying relevant parties.

🧑💼 Who's it for

- Internal teams that handle IT support but want to streamline voice-based requests.
- Employees who prefer using mobile/voice to report incidents or ask for support.
- Organizations aiming to integrate conversational AI into existing support workflows.

⚙️ How it works / What it does

1. A user sends a voice message to a Telegram bot.
2. The system checks whether it's an audio message.
3. If valid, the audio is downloaded, transcribed via OpenAI Whisper, and backed up to Google Drive.
4. The transcription and file metadata are merged.
5. The merged content is processed through an AI Agent (GPT) to extract structured request info.
6. A JIRA ticket is created using the extracted data.
7. The IT team is notified via Slack (or other channels).
8. The requester receives a Telegram confirmation message with the JIRA ticket link.
9. If the input is not audio, a polite rejection message is sent.

📌 Key Features

- Supports voice-based ticket creation
- Accurate transcription using Whisper
- Context-aware request parsing using GPT-4.1 mini
- Fully automated ticket creation in JIRA
- Notifies both IT and the original requester
- Cloud backup of original voice messages (Google Drive)

🛠️ Setup Instructions

Prerequisites

| Component | Required |
|----------|----------|
| Telegram Bot & API Key | ✅ |
| OpenAI Whisper / Transcription Model | ✅ |
| Google Drive Credentials (OAuth2) | ✅ |
| Google Sheets or other storage (optional) | ⬜ |
| JIRA Cloud API Access | ✅ |
| Slack Bot or Webhook | ✅ |

Workflow Steps

1. **Telegram Voice Message Trigger:** Starts the flow when a user sends a voice message.
2. **Is Audio Message?:** If false, reply that only voice messages are supported.
3. **Download Audio:** Download the .oga file from Telegram.
4. **Transcribe Audio:** Use OpenAI Whisper to get a text transcript (see the sketch after this section).
5. **Backup to Google Drive:** Upload the original voice file with metadata.
6. **Merge Results:** Combine the transcript and metadata.
7. **Pre-process Output:** Clean formatting before AI extraction.
8. **Transcript Processing Agent:** A GPT-based agent extracts the requester's name and department, the request title and description, and the priority and request type.
9. **Submit JIRA Request Ticket:** Create a ticket from the AI-extracted data.
10. **Setup Slack / Email / Manual Steps:** Optional internal routing or approvals.
11. **Inform Reporter via Telegram:** Sends a confirmation message with the JIRA ticket link.

🔧 How to Customize

- Replace JIRA with Zendesk, GitHub Issues, or other ticketing tools.
- Change Slack to Microsoft Teams or Email.
- Add Notion/Airtable logging.
- Enhance the agent to extract the department from the user ID or metadata.

📦 Requirements

| Integration | Notes |
|-------------|-------|
| Telegram Bot | Used for input/output |
| Google Drive | Audio backup |
| OpenAI GPT + Whisper | Transcript & extraction |
| JIRA | Ticketing platform |
| Slack | Team notification |

Built with ❤️ using n8n
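As a point of reference, the transcription step maps onto OpenAI's public audio API roughly as below. The workflow uses n8n's built-in OpenAI node, so this is a sketch of the equivalent call rather than the node's literal configuration.

```javascript
// Sketch of the Whisper transcription step using the official openai package.
import fs from "node:fs";
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

async function transcribeVoiceMessage(filePath) {
  // Telegram voice notes arrive as .oga (Ogg/Opus), a format Whisper accepts.
  const transcription = await openai.audio.transcriptions.create({
    model: "whisper-1",
    file: fs.createReadStream(filePath),
  });
  return transcription.text; // handed to the GPT extraction agent next
}
```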
by Nima Salimi
🔍 Description

This n8n workflow is a complete marketing automation system that connects to your CDP (Customer Data Platform), selects which flows to send, and delivers personalized emails using Brevo. It's modular and extensible: you can also add SMS, push notifications, Telegram messages, or other channels.

To build a full marketing automation system, you need four key components:

1. Workflow Automation – using n8n (this workflow)
2. CDP – store and manage user data (e.g., NocoDB, Metabase, Power BI, etc.)
3. Database – track transactions, templates, and send statuses (e.g., NocoDB)
4. BI / Analytics – monitor performance by flows, journeys, and sent events

This workflow represents the Workflow Automation layer. You can connect it to your own data stack or use the included example databases (cdp-ecrm, n8n-templates-ecrm, and n8n-transaction-ecrm) to get started quickly.

👤 Who's it for?

- Growth & CRM teams managing user engagement flows
- Ecommerce marketers running time-sensitive email journeys
- Marketing automation pros using low-code CRM stacks
- Data teams building custom campaign triggers from CDPs

✅ Features

- 🔁 Two modular flows: "Insert user_id" and "Sending Email"
- 🧠 Select a flow using flow_id from templates in NocoDB
- ✏️ Insert user data into n8n-transaction-ecrm with processing status
- 🔍 Filter duplicate users by user_id to avoid over-sending
- 📧 Validate email fields and flag disposables
- 📨 Send personalized emails using Brevo template parameters
- 📊 Track delivery with sent_result, sent_at, and status updates
- 🕒 Runs every 30 minutes via a schedule trigger

🛠 How to Use

1. **Set your flow.** In the Setup Flow node, change the flow_id to match a row in your n8n-templates-ecrm table.
2. **Prepare your tables in NocoDB.**
   - cdp-ecrm: contains users (user_id, email, first_name, phone_number)
   - n8n-templates-ecrm: contains flows with metadata
   - n8n-transaction-ecrm: stores and updates user send status
3. **Configure credentials.** NocoDB API token and Brevo (Sendinblue) API key.
4. **Trigger the flows.** Run "Insert user_id" manually or on a schedule to prepare users; "Sending Email" runs automatically every 30 minutes.

📌 Notes

- Disposable email domains are filtered using regex (see the sketch after this section).
- Status values:
  - 0-processing → just inserted
  - 1-sending → ready to send
  - 2-sent → email sent successfully
  - 3-no-email → missing email address
  - 4-disposal-email → disposable or banned email
- Easily duplicate the "Insert user_id" flow to add more campaigns.
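A minimal sketch of that validation logic is below; the domain list is a tiny illustrative sample, not the regex shipped with the workflow.

```javascript
// Illustrative only: the real workflow's disposable-domain regex is larger.
const DISPOSABLE_DOMAINS = /@(mailinator\.com|10minutemail\.com|guerrillamail\.com|yopmail\.com)$/i;

function classifyEmail(email) {
  // Map straight onto the workflow's status values.
  if (!email || !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) return "3-no-email";
  if (DISPOSABLE_DOMAINS.test(email)) return "4-disposal-email";
  return "1-sending"; // passes validation, ready to send via Brevo
}
```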
by Sagar
This template streamlines your AI avatar video automation workflow by connecting Google Sheets for storing the voice text and the AI avatar video link, and using HTTP nodes to call the Heygen API with your avatar/voice IDs for automated video generation.

## Pre-requisites

Before setting up this workflow, ensure you have:

- A Google account with access to Google Sheets
- A Heygen account with API access enabled in your account settings
- An n8n.io account with workflow access

## Setup Instructions

1. **Configure the data source.** Create a Google Sheet with the following columns: Script/Voice Text and Final AI Avatar Video Link.
2. **Connect Google Sheets.** Add your Google Sheets credentials in the "Google Sheet" node, point it at the spreadsheet where your columns live, and configure the node to retrieve the rows to process.
3. **Set up HTTP Node 1 with Heygen API credentials.** Configure the node to generate an AI video from the Script/Voice Text.
4. **Configure HTTP Node 2.** Connect your Heygen API credentials and set up the node to fetch the AI avatar video link (both calls are sketched after this section). Finally, set up the second Google Sheets node to upload the final video link into the "Final AI Avatar Video Link" column.

## Workflow Automation Setup

- Configure the scheduler node to run at your preferred frequency.
- Set up error handling to notify you of any posting failures.

## Execution Instructions

- After completing all connections, test the workflow.
- Monitor the execution in the n8n dashboard to ensure proper functioning.
- View the "Executions" tab to track successful runs and troubleshoot any errors.

This template saves hours of manual AI avatar video creation; use it to skip the daily manual effort.

## Details

Nodes used in the workflow:

- Manual Trigger node
- Google Sheets node 1
- HTTP node 1
- HTTP node 2
- Google Sheets node 2
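For orientation, the two HTTP nodes map onto HeyGen's public API roughly as sketched below. The field names follow HeyGen's v2 generate and v1 status endpoints at the time of writing; the avatar and voice IDs are placeholders, so verify the request shape against the current HeyGen docs.

```javascript
// Hedged sketch following HeyGen's public API; verify against current docs.
const HEYGEN_KEY = process.env.HEYGEN_API_KEY;

// HTTP Node 1: submit the script text for avatar video generation.
async function generateVideo(scriptText) {
  const res = await fetch("https://api.heygen.com/v2/video/generate", {
    method: "POST",
    headers: { "X-Api-Key": HEYGEN_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({
      video_inputs: [{
        character: { type: "avatar", avatar_id: "YOUR_AVATAR_ID" },               // placeholder
        voice: { type: "text", input_text: scriptText, voice_id: "YOUR_VOICE_ID" }, // placeholder
      }],
      dimension: { width: 1280, height: 720 },
    }),
  });
  return (await res.json()).data.video_id;
}

// HTTP Node 2: poll for the finished video link to write back to the sheet.
async function getVideoLink(videoId) {
  const res = await fetch(
    `https://api.heygen.com/v1/video_status.get?video_id=${videoId}`,
    { headers: { "X-Api-Key": HEYGEN_KEY } }
  );
  const data = (await res.json()).data;
  return data.status === "completed" ? data.video_url : null; // null → try again later
}
```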
by vinci-king-01
Smart Supplier Health Monitor with ScrapeGraphAI Risk Detection and Multi-Channel Alerts

🎯 Target Audience

- Procurement managers and directors
- Supply chain risk analysts
- CFOs and financial controllers
- Vendor management teams
- Enterprise risk managers
- Operations managers
- Contract administrators
- Business continuity planners

🚀 Problem Statement

Manual supplier monitoring is reactive and time-consuming, often missing early warning signs of financial distress that could disrupt your supply chain. This template solves the challenge of proactive supplier health surveillance by automatically monitoring financial indicators, news sentiment, and market conditions to predict supplier risks before they impact your business operations.

🔧 How it Works

This workflow automatically monitors your critical suppliers' financial health using AI-powered web scraping, analyzes multiple risk factors, identifies alternative suppliers when needed, and sends intelligent alerts through multiple channels to ensure your procurement team can act quickly on emerging risks.

Key Components

- **Weekly Health Check Scheduler** – Automated trigger based on supplier criticality levels
- **Supplier Database Loader** – Dynamic supplier portfolio management with risk-based monitoring frequency
- **ScrapeGraphAI Website Analyzer** – AI-powered extraction of financial health indicators from company websites
- **Financial News Scraper** – Intelligent monitoring of financial news and sentiment analysis
- **Advanced Risk Scorer** – Industry-adjusted risk calculation with failure probability modeling (see the sketch at the end of this section)
- **Alternative Supplier Finder** – Automated identification and ranking of backup suppliers
- **Multi-Channel Alert System** – Email, Slack, and API notifications with escalation rules

📊 Risk Analysis Specifications

The template performs comprehensive financial health analysis with the following parameters:

| Risk Factor | Weight | Score Impact | Description |
|-------------|--------|--------------|-------------|
| Financial Issues | 40% | +0-24 points | Revenue decline, debt levels, cash flow problems |
| Operational Risks | 30% | +0-18 points | Management changes, restructuring, capacity issues |
| Market Risks | 20% | +0-12 points | Industry disruption, regulatory changes, competition |
| Reputational Risks | 10% | +0-6 points | Negative news, legal issues, public sentiment |

Industry Risk Multipliers:

- Technology: 1.1x (higher volatility)
- Manufacturing: 1.0x (baseline)
- Energy: 1.2x (regulatory risks)
- Financial: 1.3x (market sensitivity)
- Logistics: 0.9x (generally stable)

Risk Levels & Actions:

- **Critical Risk**: Score ≥ 75 (CEO/CFO escalation, immediate transition planning)
- **High Risk**: Score ≥ 55 (procurement director escalation, backup activation)
- **Medium Risk**: Score ≥ 35 (manager review, increased monitoring)
- **Low Risk**: Score < 35 (standard monitoring)

🏢 Supplier Management Features

| Feature | Critical Suppliers | High Priority | Medium Priority |
|---------|-------------------|---------------|-----------------|
| Monitoring Frequency | Weekly | Bi-weekly | Monthly |
| Risk Threshold | 35+ points | 40+ points | 50+ points |
| Alert Recipients | C-Level + Directors | Directors + Managers | Managers only |
| Alternative Suppliers | 3+ pre-qualified | 2+ identified | 1+ researched |
| Transition Timeline | 24-48 hours | 1-2 weeks | 1-3 months |

🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Gmail account for email alerts (or an alternative email service)
- Slack workspace with webhook or bot token
- Supplier database or CRM system API access
- Basic understanding of procurement processes

Step-by-Step Configuration

1. **Configure ScrapeGraphAI credentials.** Sign up for a ScrapeGraphAI API account, navigate to Credentials in your n8n instance, add new ScrapeGraphAI API credentials with your API key, and test the connection to ensure proper functionality.
2. **Set up email integration.** Add Gmail OAuth2 credentials in n8n, configure the sender email and authentication, test email delivery with a sample message, and set up email templates for different risk levels.
3. **Configure Slack integration.** Create a Slack webhook URL or bot token, add Slack credentials to n8n, configure target channels for different alert types, and customize Slack message formatting and buttons.
4. **Load the supplier database.** Update the "Supplier Database Loader" node with your supplier data; configure supplier categories, contract values, and criticality levels; set monitoring frequencies based on supplier importance; and add supplier website URLs and contact information.
5. **Customize risk parameters.** Adjust industry risk multipliers for your business context, modify risk scoring thresholds based on risk tolerance, configure economic factor adjustments, and set failure probability calculation parameters.
6. **Configure the alternative supplier database.** Populate the alternative supplier database in the "Alternative Supplier Finder" node; add supplier ratings, capacities, and specialties; configure geographic coverage and certification requirements; and set suitability scoring parameters.
7. **Set up procurement system integration.** Configure the procurement system webhook endpoint, add API authentication credentials, test webhook payload delivery, and set up automated data synchronization.
8. **Test and validate.** Run test scenarios with sample supplier data, verify ScrapeGraphAI extraction accuracy, check risk scoring calculations and thresholds, confirm all alert channels are working properly, and test alternative supplier recommendations.

🔄 Workflow Customization Options

Modify risk analysis:

- Add custom risk indicators specific to your industry
- Implement sector-specific economic adjustments
- Configure contract-specific risk factors
- Add ESG (Environmental, Social, Governance) scoring

Extend data sources:

- Integrate credit rating agency APIs (Dun & Bradstreet, Experian)
- Add financial database connections (Bloomberg, Reuters)
- Include social media sentiment analysis
- Connect to government regulatory databases

Enhance alternative supplier management:

- Add automated supplier qualification workflows
- Implement dynamic pricing comparison
- Create supplier performance scorecards
- Add geographic risk assessment

Advanced analytics:

- Implement predictive failure modeling
- Add supplier portfolio optimization
- Create supply chain risk heatmaps
- Generate automated compliance reports

📈 Use Cases

- **Supply Chain Risk Management**: Proactive monitoring of supplier financial stability
- **Procurement Optimization**: Data-driven supplier selection and management
- **Business Continuity Planning**: Automated backup supplier identification
- **Financial Risk Assessment**: Early warning system for supplier defaults
- **Contract Management**: Risk-based contract renewal and negotiation
- **Vendor Diversification**: Strategic supplier portfolio management

🚨 Important Notes

- Respect ScrapeGraphAI API rate limits and terms of service
- Implement appropriate delays between supplier assessments
- Keep all API credentials secure and rotate them regularly
- Monitor API usage to manage costs effectively
- Ensure compliance with data privacy regulations (GDPR, CCPA)
- Regularly update supplier databases and contact information
- Review and adjust risk parameters based on market conditions
- Maintain confidentiality of supplier financial information

🔧 Troubleshooting

Common issues:

- **ScrapeGraphAI extraction errors:** Check API key validity and rate limits
- **Email delivery failures:** Verify Gmail credentials and permissions
- **Slack notification failures:** Check webhook URL and channel permissions
- **False positive alerts:** Adjust risk scoring thresholds and industry multipliers
- **Missing supplier data:** Verify website URLs and accessibility
- **Alternative supplier errors:** Check supplier database completeness

Monitoring best practices:

- Set up workflow execution monitoring and error alerts
- Regularly review and update supplier information
- Monitor API usage and costs across all integrations
- Validate risk scoring accuracy with historical data
- Test disaster recovery and backup procedures

Support resources:

- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Procurement best practices and industry standards
- Financial risk assessment methodologies
- Supply chain management resources and tools
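To make the scoring concrete, here is a hedged sketch of how the "Advanced Risk Scorer" could combine the weights, multipliers, and thresholds from the tables above. The 0-1 factor sub-scores are assumed to come from the upstream analysis nodes, and the economic-factor adjustments mentioned in step 5 are omitted.

```javascript
// Hedged sketch of the scoring logic implied by the tables above; not the
// node's literal code. Factor sub-scores are assumed to be normalized 0-1.
const INDUSTRY_MULTIPLIERS = {
  technology: 1.1,
  manufacturing: 1.0,
  energy: 1.2,
  financial: 1.3,
  logistics: 0.9,
};

function riskScore(factors, industry) {
  // Maximum contributions per the weights table:
  // financial 24, operational 18, market 12, reputational 6.
  const base =
    factors.financial * 24 +
    factors.operational * 18 +
    factors.market * 12 +
    factors.reputational * 6;
  const score = base * (INDUSTRY_MULTIPLIERS[industry] ?? 1.0);

  if (score >= 75) return { score, level: "critical" }; // CEO/CFO escalation
  if (score >= 55) return { score, level: "high" };     // director escalation
  if (score >= 35) return { score, level: "medium" };   // increased monitoring
  return { score, level: "low" };                       // standard monitoring
}
```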
by Aditya Gaur
## Who is this template for?

This template is designed for teams who need to automate data retrieval from SharePoint lists using n8n. It is ideal for users who want to authenticate via OAuth and then use the token to access SharePoint API endpoints, pulling list data directly into n8n.

## How it works

The template first generates an OAuth token using the Microsoft OAuth API. This token is then used to authenticate requests to the SharePoint list API, allowing the workflow to fetch data from a specified SharePoint list (both calls are sketched after this section). By following the n8n workflow, the user can configure the necessary credentials and endpoints to automate SharePoint data access securely.

## Setup steps

1. Replace {tenant_id}, {client_id}, and {client_secret} with your Azure AD details for OAuth authentication.
2. Specify the SharePoint list API endpoint in the template (under the "SharePoint List Fetch" node).
3. Configure the SharePoint list URL and make adjustments for specific data fields if necessary.
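The two requests the template chains together look roughly like this sketch of the client-credentials flow. The tenant, site, and list names are placeholders, and your Azure AD app must be granted the corresponding SharePoint application permissions for the token to work.

```javascript
// Sketch of the token + list fetch; tenant/site/list values are placeholders.
async function fetchSharePointList() {
  // Step 1: obtain an OAuth token from Azure AD (client-credentials grant).
  const tokenRes = await fetch(
    "https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
    {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        grant_type: "client_credentials",
        client_id: "{client_id}",
        client_secret: "{client_secret}",
        scope: "https://yourtenant.sharepoint.com/.default",
      }),
    }
  );
  const { access_token } = await tokenRes.json();

  // Step 2: call the SharePoint list REST endpoint with the bearer token.
  const listRes = await fetch(
    "https://yourtenant.sharepoint.com/sites/yoursite/_api/web/lists/getbytitle('YourList')/items",
    {
      headers: {
        Authorization: `Bearer ${access_token}`,
        Accept: "application/json;odata=nometadata",
      },
    }
  );
  return (await listRes.json()).value;
}
```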
by Jay Hartley
## What this workflow does

- Downloads the daily top podcasts of a selected genre
- Summarizes the content of each podcast in a few paragraphs
- Sends the summaries and the direct link to each podcast in a formatted email

## Setup

1. Create a free API key on Taddy here: https://taddy.org/signup/developers
2. Input your user number and API key into the TaddyTopDaily node in the header parameters X-USER-ID and X-API-KEY respectively (see the sketch after this section).
3. Create access credentials for your Gmail as described here: https://developers.google.com/workspace/guides/create-credentials. Use the credentials from your client_secret.json in the Gmail node.
4. In the Genre node, set the genre of podcasts you want a summary for. Valid values are: TECHNOLOGY, NEWS, ARTS, COMEDY, SPORTS, FICTION, etc. Look at api.taddy.org for the full list (they will be displayed in the help docs as PODCASTSERIES_TECHNOLOGY, PODCASTSERIES_NEWS, etc.).
5. Enter your email address in the Gmail node.
6. Change the schedule time for sending email from Schedule to whichever time you want to receive the email.

## Test

Hit Test Workflow, then check your email for the results. That's it! It should take less than 5 minutes total.
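For reference, the TaddyTopDaily node boils down to a POST against Taddy's GraphQL endpoint with those two headers. The query body below is illustrative only, including the operation and field names; consult api.taddy.org for the exact schema.

```javascript
// Hedged sketch: the header names come from the setup steps above, but the
// GraphQL query body is an assumption; see api.taddy.org for the schema.
async function fetchTopPodcasts(userId, apiKey) {
  const res = await fetch("https://api.taddy.org", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-USER-ID": userId,
      "X-API-KEY": apiKey,
    },
    body: JSON.stringify({
      query: `{
        getTopChartsByGenres(taxonomyType: PODCASTEPISODE, genres: [PODCASTSERIES_TECHNOLOGY]) {
          podcastEpisodes { name description audioUrl }
        }
      }`,
    }),
  });
  return (await res.json()).data;
}
```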
by David Ashby
Complete MCP server exposing 2 IP Geolocation API operations to AI agents.

⚡ Quick Setup

1. Import this workflow into your n8n instance
2. Add IP Geolocation API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

🔧 How it Works

This workflow converts the IP Geolocation API into an MCP-compatible interface for AI agents.

- **MCP Trigger:** Serves as your server endpoint for AI agent requests
- **HTTP Request Nodes:** Handle API calls to https://api.bigdatacloud.net
- **AI Expressions:** Automatically populate parameters via $fromAI() placeholders (see the sketch after this section)
- **Native Integration:** Returns responses directly to the AI agent

📋 Available Operations (2 total)

🔧 Data (2 endpoints)

- GET /data/ip-geolocation-full: IP Geolocation with Confidence Area and Hazard Report API
- GET /data/ip-geolocation-with-confidence: IP Geolocation with Confidence Area API

🤖 AI Integration

Parameter handling: AI agents automatically provide values for:

- Path parameters and identifiers
- Query parameters and filters
- Request body data
- Headers and authentication

Response format: Native IP Geolocation API responses with full data structure

Error handling: Built-in n8n HTTP request error management

💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

- **Claude Desktop:** Add the MCP server URL to your configuration
- **Cursor:** Add the MCP server SSE URL to your configuration
- **Custom AI Apps:** Use the MCP URL as a tool endpoint
- **API Integration:** Direct HTTP calls to MCP endpoints

✨ Benefits

- **Zero Setup:** No parameter mapping or configuration needed
- **AI-Ready:** Built-in $fromAI() expressions for all parameters
- **Production Ready:** Native n8n HTTP request handling and logging
- **Extensible:** Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
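To show what the parameter wiring looks like, here is a hedged sketch of how one HTTP Request node's query parameters can reference $fromAI() placeholders. The parameter names are examples rather than the workflow's exact field list.

```javascript
// Hedged sketch: how an HTTP Request node's query parameters can be wired so
// the AI agent supplies values at call time. Parameter names are examples.
const queryParameters = {
  parameters: [
    {
      name: "ip",
      value: "={{ $fromAI('ip', 'IPv4 or IPv6 address to geolocate', 'string') }}",
    },
    {
      name: "localityLanguage",
      value: "={{ $fromAI('localityLanguage', 'Preferred language for locality names', 'string') }}",
    },
  ],
};
// At runtime this resolves to a call such as:
// GET https://api.bigdatacloud.net/data/ip-geolocation-full?ip=...&localityLanguage=...
```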
by Aitor | 1Node
Stay ahead of the curve and keep your followers informed, automatically. This n8n workflow uses Perplexity AI to generate insightful answers to scheduled queries, then auto-posts the responses directly to X (Twitter).

⚙️ What this workflow does

1. **Scheduled Trigger** – Runs at set times (daily, hourly, etc.).
2. **searchQuery** – Define what kind of trending or relevant insight you want (e.g. "latest AI trends").
3. **set API Key** – Securely insert your Perplexity API key.
4. **Perplexity API Call** – Fetches a short, insightful response to your query (see the sketch after this section).
5. **Post to X** – Automatically publishes the result as a tweet.

🧩 Requirements

- An n8n account (self-hosted or cloud)
- A Perplexity API key
- A connected X (Twitter) account via n8n's credentials

✅ Setup Steps

1. Add this workflow into your n8n account.
2. Edit the searchQuery node with a topic (e.g. "What's new in ecommerce automation?").
3. Paste your Perplexity API key into the set API key node.
4. Connect your X (Twitter) account in the final node.
5. Adjust the schedule timing to suit your content frequency.

💡 Ideas to Improve

- 💬 Add a formatting step to shorten or hashtag the response.
- 📊 Pull multiple trending questions and auto-schedule posts.
- 🔁 Loop responses to queue a full week of content.
- 🌐 Translate content before posting to reach a global audience.

🆘 Need help? Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
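For context, the Perplexity call the workflow makes follows the standard chat-completions shape shown below. The model choice, system prompt, and 280-character trim are illustrative decisions for this sketch, not fixed by the template.

```javascript
// Hedged sketch of the Perplexity request behind the "Perplexity API Call" step.
async function fetchInsight(searchQuery, apiKey) {
  const res = await fetch("https://api.perplexity.ai/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "sonar", // illustrative model choice
      messages: [
        { role: "system", content: "Answer in one short, tweetable insight." },
        { role: "user", content: searchQuery },
      ],
    }),
  });
  const data = await res.json();
  // Keep within X's 280-character limit before posting.
  return data.choices[0].message.content.slice(0, 280);
}
```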