by Nalin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Complete account-based outreach automation with Octave context engine

**Who is this for?**

Revenue teams, account-based marketing professionals, and growth operators who want a complete, automated pipeline from account identification to contextualized outreach. Built for teams ready to move beyond fragmented point solutions to an integrated, context-aware GTM engine.

**What problem does this solve?**

Most GTM teams are flying blind with disconnected tools that can't talk to each other. You qualify accounts in one system, find contacts in another, research context manually, then hope your email sequences land. Each step loses context, and by the time you're writing outreach, you've forgotten why the account was qualified in the first place. Octave centralizes all this typically fragmented context - your ICP definitions, personas, value propositions, and business logic - so every agent operation can act on the same unified understanding of your market. This workflow demonstrates how Octave's agents work together seamlessly because they all share the same context foundation.

**What this workflow does**

Complete Account-to-Outreach Pipeline: This workflow demonstrates the full power of Octave's context engine by connecting multiple agent operations in a seamless flow. Unlike traditional tools that lose context at each handoff, Octave centralizes your business context - ICP definitions, personas, value propositions, competitive positioning - so every agent operates from the same unified understanding of your market.

External Context Research:
- Gathers real-time external data about target accounts (job postings, news, funding, etc.)
- Processes this information to create runtime context for later use in outreach
- Establishes the "why reach out now" foundation for the entire workflow

Company-Level Qualification:
- Uses Octave's company qualification to assess account fit against your specific offering
- Leverages Product- and Segment-level fit criteria defined in your Library
- Filters out accounts that don't meet your qualification thresholds
- Ensures only high-potential accounts proceed through the workflow

Intelligent Contact Discovery:
- Runs Octave's prospector agent on qualified accounts
- Finds relevant stakeholders based on responsibilities and business context, not just job titles
- Discovers multiple contacts per account for comprehensive coverage
- Maintains qualification context when identifying the right people

Runtime Context Integration:
- Takes the external context gathered at the beginning and injects it into sequence generation
- Creates truly dynamic, timely outreach that references current company events
- Generates sequences that feel impossibly relevant and well-researched

Multi-Contact Sequence Generation:
- Splits discovered contacts into individual records for processing
- Generates contextualized email sequences for each contact
- Maintains account-level context while creating contact-specific messaging
- Produces sequences (1-7 emails) that feel unmistakably meant for each person

Automated Campaign Deployment:
- Automatically adds all qualified contacts with their contextualized sequences to email campaigns
- Maps dynamic content to campaign variables for seamless execution
- Maintains the context chain from qualification through delivery

**Setup**

Required Credentials:
- Octave API key and workspace access
- External data source API (job boards, news APIs, enrichment services, etc.)
- Email platform API key (Instantly.ai configured, easily adaptable)
- Optional: LLM credentials if using the example external research agent

Step-by-Step Configuration:

1. Set up Account Input Source:
   - Replace your-webhook-path-here with a unique, secure path
   - Configure your account source (CRM, website visitors, target lists) to send company data
   - Ensure account data includes company name and domain for processing
2. Configure External Context Research:
   - Replace the example AI agent with your preferred external data source
   - Set up connections to job boards, news APIs, or enrichment services
   - Configure context gathering to find timely, relevant information about target accounts
3. Set up Company Qualification Agent:
   - Add your Octave API credentials
   - Replace your-octave-company-qualification-agent-id with your actual agent ID
   - Configure qualification criteria at Product and Segment levels in your Octave Library
4. Configure Prospector Agent:
   - Replace your-octave-prospector-agent-id with your actual prospector agent ID
   - Define target personas and stakeholder types in your Octave Library
   - Set contact discovery parameters for optimal coverage
5. Set up Sequence Agent:
   - Replace your-octave-sequence-agent-id with your actual sequence agent ID
   - Configure runtime context integration for dynamic content
   - Test sequence quality with the external context integration
6. Configure Email Campaign Platform:
   - Add your email platform API credentials
   - Replace your-campaign-id-here with your actual campaign ID
   - Ensure campaign supports custom variables for dynamic content

Required Webhook Payload Format (a minimal trigger sketch appears after the use-case list below):

```json
{
  "body": {
    "companyName": "InnovateTech Solutions",
    "companyDomain": "innovatetech.com"
  }
}
```

**How to customize**

External Context Sources - replace the example research with your data sources:
- **Job Board APIs:** Reference current hiring and team expansion
- **News APIs:** Mention funding, product launches, or market expansion
- **Enrichment Services:** Pull technology adoption, market changes, or competitive moves
- **Social Monitoring:** Reference recent company posts or industry discussions

Company Qualification - configure qualification in your Octave company qualification agent:
- **Product Level:** Define "good fit" and "bad fit" questions for your core offering
- **Segment Level:** Set criteria for different market segments or use cases
- **Qualification Thresholds:** Adjust the filter score based on your standards

Contact Discovery - customize prospecting in your Octave prospector agent:
- **Target Personas:** Define which Library personas to prioritize
- **Organizational Levels:** Focus on specific seniority levels or decision-making authority
- **Contact Volume:** Adjust how many contacts to discover per qualified account

Runtime Context Integration - configure dynamic content injection:
- **Context Definition:** Specify what external data represents in your sequences
- **Usage Instructions:** Define how to incorporate context into messaging
- **Email-Level Control:** Apply different context to different emails in sequences

Sequence Generation - customize email creation:
- **Core Context (Library):** Define personas, use cases, and offering definitions
- **Strategy (Playbooks):** Configure messaging frameworks and value propositions
- **Writing Style (Agent):** Adjust tone, voice, and communication approach

Campaign Integration - adapt for different email platforms:
- Update API endpoints and authentication for your preferred platform
- Modify variable mapping for platform-specific requirements
- Adjust sequence formatting and length based on platform capabilities

**Use Cases**
- Complete inbound lead processing from website visitor to qualified outreach
- Event-triggered account processing for funding announcements or hiring spikes
- Competitive displacement campaigns with account qualification and contact discovery
- Market expansion automation for entering new territories or segments
- Product launch outreach with context-aware targeting and messaging
- Customer expansion workflows for upselling within existing account bases
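For reference, here is a minimal sketch of triggering the workflow from an account source. The URL and path are placeholders, not values from the template; note that n8n's Webhook node nests the posted JSON under a `body` key, which is why the documented payload format shows that wrapper while the sender posts only the inner object.

```javascript
// Minimal trigger sketch: POST an account into the workflow's webhook.
// URL and path below are placeholders - substitute your n8n instance and
// the unique path you configured in the Account Input Source step.
const payload = {
  companyName: "InnovateTech Solutions",
  companyDomain: "innovatetech.com",
};

const res = await fetch("https://your-n8n-instance/webhook/your-webhook-path-here", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload), // arrives in n8n as $json.body
});

console.log(res.status); // expect 200 when the workflow accepts the account
```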
by Belen
This n8n template automatically transcribes GoHighLevel (GHL) call recordings and creates an AI-generated summary that is added as a note directly to the related contact in your GHL CRM. It's designed for real estate investors, agencies, and sales teams that handle a large volume of client calls and want to keep detailed, searchable notes without spending hours on manual transcription.

**Who's it for**
- Sales and acquisitions teams that want instant call notes in their CRM
- Real estate wholesalers or agencies using GoHighLevel for deal flow
- Support and QA teams that need summarized transcripts for review
- Any business owner who wants to automatically document client conversations

**How it works**
1. A HighLevel automation workflow triggers when a call is marked "Completed" and automatically sends a webhook to n8n.
2. The n8n workflow receives this webhook and waits briefly to ensure the call recording is ready.
3. It retrieves the conversation and message IDs from the webhook payload.
4. The call recording is fetched from GHL's API.
5. An AI transcription node converts the audio to text.
6. A summarization node condenses the transcript into bullet points or a concise paragraph.
7. A Code node formats the AI output into proper JSON for GHL's "Create Note" endpoint (see the sketch below).
8. Finally, an HTTP Request node posts the summary to the contact's record in GHL.

**How to set up**
1. Add your GoHighLevel OAuth credential and connect your agency account.
2. Add your AI credential (e.g., OpenAI, Anthropic, or Gemini).
3. Replace the sample webhook URL with your n8n endpoint.
4. Test with a recent call and confirm the summary appears in the contact timeline.

**Requirements**
- GoHighLevel account with API and OAuth access
- AI service for transcription and summarization (e.g., OpenAI Whisper + GPT)

**Customizing this workflow**
You can tailor this automation for your specific team or workflow:
- Add sentiment analysis or keyword extraction to the summary.
- Change the AI prompt to focus on "action items," "objections," or "next steps."
- Send summaries to Slack, Notion, or Google Sheets for reporting.
- Trigger follow-up tasks automatically in your CRM based on keywords.

**Good to know**
- AI transcription and summarization costs vary by provider — check your LLM's pricing.
- GoHighLevel's recording availability may take up to 1 minute after the call ends; adjust the delay accordingly.
- For OAuth setup help, refer to GHL's OAuth documentation.

Happy automating! ⚙️
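A minimal sketch of that formatting Code node, assuming the summary arrives on the incoming item as `$json.summary` and the contact ID was captured earlier from the webhook; the exact field names GHL expects are assumptions, so verify them against the current Create Note API docs:

```javascript
// n8n Code node: shape the AI summary into the JSON the downstream
// "Create Note" HTTP Request node will send to GoHighLevel.
// Field names (summary, contactId, note body) are assumptions - confirm
// them in GHL's API reference before relying on this.
const summary = $json.summary ?? "";
const contactId = $json.contactId; // captured earlier from the webhook payload

return [{
  json: {
    contactId,
    note: { body: String(summary).trim() }, // plain-text note content
  },
}];
```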
by Rapiwa
Automatically Send WhatsApp Discount Codes to Shopify Customers Using Rapiwa

**Who is this for?**
This n8n workflow automatically sends WhatsApp promotional messages to top customers whenever a new discount code is created in Shopify. It's perfect for store owners, marketers, sales teams, or support agents who want to engage their best customers effortlessly. The workflow fetches customer data, filters high-spending customers, verifies their WhatsApp numbers using the Rapiwa API, sends discount messages to verified contacts, and logs all activity in Google Sheets. Designed for non-technical users who don't use the official WhatsApp Business API, this automation simplifies customer outreach and tracking without any manual work.

**What this workflow does**
This n8n workflow connects with a Google Sheet that contains a list of contacts. It reads rows marked for processing, cleans the phone numbers (see the sketch after this section), checks their validity using Rapiwa's WhatsApp validation API, sends WhatsApp messages to valid numbers, and updates the status of each row accordingly.

**Key Features**
- **Runs Every 5 Minutes:** Automatically triggers the workflow
- **Google Sheets Integration:** Reads and writes data from a specific sheet
- **Phone Number Validation:** Confirms whether a WhatsApp number is active via the Rapiwa API
- **Message Sending:** Sends a message using Rapiwa's /send-message endpoint
- **Status Update:** Sheet is updated with a success or failure status
- **Safe API Usage:** Delays added between requests to prevent rate limits
- **Batch Limit:** Processes a maximum of 60 rows per cycle
- **Conditional Checks:** Skips rows without a "check" value

**Requirements**
- A Google Sheet with the necessary columns
- **Rapiwa account** with an active subscription (200 free messages included)
- Your WhatsApp number connected to Rapiwa
- Valid Bearer Token
- **n8n instance** (self-hosted or cloud)
- Google Sheets node configured
- HTTP Request node access

**How to Use: Step-by-Step Setup**
1. Webhook: receives the Shopify webhook (discount creation) via HTTP POST request. This is triggered when a discount is created in your Shopify store.
2. Configure Google Sheets in n8n: use the Google Sheets node with OAuth2 access.
3. Get your Rapiwa API token: create an account on Rapiwa, connect your WhatsApp number, and copy your Bearer Token from the Rapiwa dashboard.
4. Set up the HTTP Request nodes:
   - Validate numbers via https://app.rapiwa.com/api/verify-whatsapp
   - Send messages via https://app.rapiwa.com/api/send-message
   - Add your Bearer Token to the headers

**Google Sheet Column Structure**
A Google Sheet formatted like this sample:

| discount_code | created_at | shop_domain | name | number | verify | status |
| --- | --- | --- | --- | --- | --- | --- |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827798 | unverified | not sent |
| V8ZGVRDFP5TB | 2025-09-25T05:26:40-04:00 | your_shop_domain | Abdul Mannan | 8801322827799 | verified | sent |

**Support & Help**
- **Rapiwa Website:** https://rapiwa.com
- **WhatsApp:** Chat on WhatsApp
- **Discord:** SpaGreen Community
- **Facebook Group:** SpaGreen Support
- **Website:** https://spagreen.net
- **Developer Portfolio:** Codecanyon SpaGreen
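A minimal sketch of the phone-cleaning step, assuming numbers arrive in a `number` column and should be reduced to digits with a country code; the column name and the 880 prefix are illustrative, not fixed by the template:

```javascript
// n8n Code node: normalize phone numbers before Rapiwa verification.
// Column name ("number") and the default country code are assumptions.
return $input.all().map((item) => {
  const raw = String(item.json.number ?? "");
  let digits = raw.replace(/\D/g, ""); // keep digits only
  if (digits.startsWith("0")) {
    digits = "880" + digits.slice(1); // example: Bangladesh country code
  }
  return { json: { ...item.json, number: digits } };
});
```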
by Julian Kaiser
Scan Any Workout Plan into the Hevy App with AI

This workflow automates the creation of workout routines in the Hevy app by extracting exercise information from an uploaded PDF or image using AI.

**What problem does this solve?**
Tired of manually typing workout plans into the Hevy app? Whether your coach sends them as Google Docs, PDFs, or you have a screenshot of a routine, entering every single exercise, set, and rep is a tedious chore. This workflow ends the madness. It uses AI to instantly scan your workout plan from any file, intelligently extract the exercises, and automatically create the routine in your Hevy account. What used to take 15 minutes of mind-numbing typing now happens in seconds.

**How it works**
1. Trigger: The workflow starts when a PDF file is submitted through an n8n form.
2. Data Extraction: The PDF is converted to a Base64 string and sent to an AI model to extract the raw text of the workout plan (see the sketch below).
3. Context Gathering: The workflow fetches a complete list of available exercises directly from the Hevy API. This list is then consolidated.
4. AI Processing: A Google Gemini model analyzes the extracted text, compares it against the official Hevy exercise list, and transforms the raw text into a structured JSON format that matches the Hevy API requirements.
5. Routine Creation: The final structured data is sent to the Hevy API to create the new workout routine in your account.

**Set up steps**
**Estimated set up time:** 15 minutes.
1. Configure the On form submission trigger or replace it with your preferred trigger (e.g., Webhook). Ensure it's set up to receive a file upload.
2. Add your API credentials for the AI service (in this case, OpenRouter.ai) and the Hevy app. You will need to create 'Hevy API' and OpenRouter API credentials in your n8n instance.
3. In the Structured Data Extraction node, review the prompt and the JSON schema in the Structured Output Parser. You may need to adjust the prompt to better suit the types of files you are uploading.
4. Activate the workflow. Test it by uploading a sample workout plan document.
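A minimal sketch of the Base64 step in an n8n Code node, assuming the form upload lands in a binary property named `data` (the property name depends on your form field) and that n8n is running in its default binary mode, where binary payloads are stored Base64-encoded:

```javascript
// n8n Code node: expose the uploaded PDF as a Base64 string for the AI call.
// In n8n's default (in-memory) binary mode, item.binary.<prop>.data is
// already the Base64 content, so this is effectively a field copy.
const item = $input.item;
const binaryProp = "data"; // assumption: matches the form's file field name

if (!item.binary?.[binaryProp]) {
  throw new Error(`No binary data found under "${binaryProp}"`);
}

return {
  json: {
    fileName: item.binary[binaryProp].fileName,
    base64: item.binary[binaryProp].data, // Base64 content of the PDF
  },
};
```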
by Evoort Solutions
🚀 Automated YouTube-style heading removed: Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the powerful SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

⚙️ Node-by-Node Workflow Breakdown

1. 🟢 On Form Submission
- **Trigger:** The workflow is initiated when a user submits a website URL via a form. This serves as the input for further processing.
- **Use Case:** When you want to track multiple websites and monitor their performance over time.

2. 🌐 Website Traffic Checker API
- **Request:** The workflow makes a POST request to the **SEMrush Website Traffic Checker API** via RapidAPI using the website URL that was submitted.
- **API Data:** The API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

3. 🔄 Reformat
- **Parsing:** The raw API response is parsed to extract the relevant data under trafficSummary.
- **Data Structure:** The workflow creates a clean dataset of traffic data, making it easy to store in Google Sheets.

4. 📄 Google Sheets Logging
- **Data:** The traffic data is appended as a new row in your Google Sheet.
- **Google Sheet Setup:** The data is organized and updated in a structured format, allowing you to track website performance over time.

💡 Use Cases
- 📊 **SEO & Digital Marketing Agencies:** Automate client website audits by pulling live traffic data into reports.
- 🌐 **Website Owners & Bloggers:** Monitor traffic growth and analyze content performance automatically.
- 📈 **Data Analysts & Reporting Teams:** Feed traffic data into dashboards and integrate with other KPIs for deeper analysis.
- 🕵️ **Competitor Tracking:** Regularly log competitor site metrics for comparative benchmarking.

🎯 Key Benefits
- ✅ **Automated Traffic Monitoring** — Run reports automatically on-demand or on a scheduled basis.
- ✅ **Real-Time Google Sheets Logging** — Easily centralize and structure traffic data for easy sharing and visualization.
- ✅ **Zero Code Required** — Powered by n8n's visual builder, set up workflows quickly without writing a single line of code.
- ✅ **Scalable & Flexible** — Extend the workflow to include alerts, additional API integrations, or other automated tasks.

🔐 How to Get Your SEMrush API Key via RapidAPI
1. Visit the API listing 👉 SEMrush Website Traffic Checker API.
2. Sign in to RapidAPI or sign up for a free account.
3. Subscribe to the API: choose the appropriate pricing plan and click Subscribe.
4. Access your API key: go to the Endpoints tab; your API key is located under the X-RapidAPI-Key header.
5. Secure and use the key: add your API key to the request headers in your workflow. Never expose the key publicly.

🔧 Step-by-Step Setup Instructions

1. Create the form to capture the URL
- In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
- Configure the webhook to accept URL submissions from your form.
- Add a form to your website or app that triggers the webhook when a URL is submitted.

2. Configure the SEMrush API Request node (a request sketch appears after these instructions)
- Add an HTTP Request node after the webhook.
- Set the method to POST and the URL to the SEMrush API endpoint.
- Add the necessary headers:
  - X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com
  - X-RapidAPI-Key: [Your API Key]
- Pass the captured website URL from the webhook as a parameter in the request body.

3. Reformat the API response
- Add a Set node to parse and structure the API response.
- Extract only the necessary data, such as:
  - trafficSummary.visits
  - trafficSummary.bounceRate
  - trafficSummary.pageViews
  - trafficSummary.sessions
- Format the response to be clean and suitable for Google Sheets.

4. Store data in Google Sheets
- Add the Google Sheets node to your workflow.
- Authenticate with your Google account.
- Select the spreadsheet and worksheet where you want to store the traffic data.
- Configure the node to append new rows with the extracted traffic data.

Google Sheets columns setup:
- **A:** Website URL
- **B:** Visits
- **C:** Bounce Rate
- **D:** Page Views
- **E:** Sessions
- **F:** Date/Time (optional; you can use a timestamp)

5. Test and deploy
- Run a test submission through your form to ensure the workflow works as expected.
- Check the Google Sheets document to verify that the data is being logged correctly.
- Set up scheduling or additional workflows as needed (e.g., periodic updates).

📈 Customizing the Template
You can modify the workflow to suit your specific needs:
- **Add more data points:** Customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords, etc.).
- **Create separate sheets:** If you're tracking multiple websites, create a different sheet for each website or group websites by category.
- **Add alerts:** Set up email or Slack notifications if specific traffic conditions (like sudden drops) are met.
- **Visualize data:** Integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

🚀 Start Automating in Minutes
Build your automated website traffic dashboard with n8n today — no coding required.
👉 Start with n8n for Free
Save time, improve accuracy, and supercharge your traffic insights workflow!
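For reference, a minimal sketch of the RapidAPI call; the endpoint path and request-body field shown here are assumptions, so confirm them against the API's playground on RapidAPI:

```javascript
// Sketch of the SEMrush Website Traffic Checker request via RapidAPI.
// The exact endpoint path ("/traffic") and body shape are assumptions -
// check the API's RapidAPI page for the current contract.
const res = await fetch("https://semrush-website-traffic-checker.p.rapidapi.com/traffic", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-RapidAPI-Host": "semrush-website-traffic-checker.p.rapidapi.com",
    "X-RapidAPI-Key": "YOUR_API_KEY",
  },
  body: JSON.stringify({ url: "https://example.com" }),
});

const data = await res.json();
// The workflow reads its metrics from data.trafficSummary, e.g.:
const { visits, bounceRate, pageViews, sessions } = data.trafficSummary ?? {};
console.log({ visits, bounceRate, pageViews, sessions });
```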
by Amit Mehta
This workflow performs structured data extraction and data mining from a web page by combining the capabilities of Bright Data and Google Gemini.

**How it Works**
This workflow focuses on extracting structured data from a web page using Bright Data's Web Unlocker product. It then uses n8n's AI capabilities, specifically Google Gemini Flash Exp, for information extraction and custom sentiment analysis. The results are sent to webhooks and saved as local files.

**Use Cases**
- **Data Mining:** Automating the process of extracting and analyzing data from websites.
- **Web Scraping:** Gathering structured data for market research, competitive analysis, or content aggregation.
- **Sentiment Analysis:** Performing custom sentiment analysis on unstructured text.

**Setup Instructions**
1. Bright Data credentials: You need an account and a Web Unlocker zone with Bright Data. Update the Header Auth account credentials in the Perform Bright Data Web Request node.
2. Google Gemini credentials: Provide your Google Gemini (PaLM) API account credentials for the AI-related nodes.
3. Configure URL and zone: In the Set URL and Bright Data Zone node, set the web URL you want to scrape and your Bright Data zone.
4. Update webhook: Update the Webhook Notification URL in the relevant HTTP Request nodes.

**Workflow Logic**
1. Trigger: The workflow is triggered manually.
2. Set parameters: It sets the target URL and the Bright Data zone.
3. Web request: The workflow performs a web request to the specified URL using Bright Data's Web Unlocker. The output is formatted as markdown.
4. Data extraction and analysis: The markdown content is then processed by multiple AI nodes to:
   - Extract textual data from the markdown.
   - Perform topic analysis with a structured response.
   - Analyze trends by location and category with a structured response.
5. Output: The extracted data and analysis are sent to webhooks and saved as JSON files on disk (see the file-writing sketch after this section).

**Node Descriptions**

| Node Name | Description |
| --- | --- |
| When clicking 'Test workflow' | A manual trigger node to start the workflow. |
| Set URL and Bright Data Zone | A Set node to define the URL to be scraped and the Bright Data zone to be used. |
| Perform Bright Data Web Request | An httpRequest node that performs the web request to Bright Data's API to retrieve the content. |
| Markdown to Textual Data Extractor | An AI node that uses Google Gemini to convert markdown content into plain text. |
| Google Gemini Chat Model | A node representing the Google Gemini model used for the data extraction. |
| Topic Extractor with the structured response | An AI node that performs topic analysis and outputs the results in a structured JSON format. |
| Trends by location and category with the structured response | An AI node that analyzes and clusters emerging trends by location and category, outputting structured JSON. |
| Initiate a Webhook Notification... | These nodes send the output of the AI analysis to a webhook. |
| Create a binary file... | Function nodes that convert the JSON output into binary format for writing to a file. |
| Write the topics/trends file to disk | readWriteFile nodes that save the binary data to a local file (d:\topics.json and d:\trends.json). |

**Customization Tips**
- Change the web URL in the Set URL and Bright Data Zone node to scrape different websites.
- Modify the AI prompts in the AI nodes to customize the analysis (e.g., change the sentiment analysis criteria).
- Adjust the output path in the readWriteFile nodes to save the files to a different location.

**Suggested Sticky Notes for Workflow**
- **Note:** "This workflow deals with the structured data extraction by utilizing Bright Data Web Unlocker Product... Please make sure to set the web URL of your interest within the 'Set URL and Bright Data Zone' node and update the Webhook Notification URL."
- **LLM Usages:** "Google Gemini Flash Exp model is being used... Information Extraction is being used for handling the custom sentiment analysis with the structured response."

**Required Files**
- 1GOrjyc9mtZCMvCr_Structured_Data_Extract,Data_Mining_with_Bright_Data&_Google_Gemini.json: the main n8n workflow export for this automation.

**Testing Tips**
- Run the workflow and check the webhook to verify that the extracted data is being sent correctly.
- Confirm that the d:\topics.json and d:\trends.json files are created on your disk with the expected structured data.

**Suggested Tags & Categories**
- Engineering
- AI
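A minimal sketch of the "Create a binary file" step, assuming the structured analysis JSON arrives on the incoming item; in n8n's default mode, binary payloads are Base64-encoded strings, which is what the Read/Write File node consumes:

```javascript
// n8n Code/Function node: turn the structured JSON output into a binary
// property so a downstream readWriteFile node can save it to disk.
const payload = JSON.stringify($json, null, 2);

return [{
  json: {},
  binary: {
    data: {
      data: Buffer.from(payload, "utf-8").toString("base64"),
      mimeType: "application/json",
      fileName: "topics.json", // matched to d:\topics.json in the file node
    },
  },
}];
```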
by Nalin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Generate dynamic email sequences with runtime context and external data

**Who is this for?**

Growth teams, sales development reps, and outbound marketers who want to reference specific, real-time information about prospects in their email sequences. Built for teams that have access to external data sources and want to create truly contextualized outreach that feels impossibly relevant.

**What problem does this solve?**

Most outbound sequences are static - they use the same messaging for everyone regardless of what's actually happening at the prospect's company right now. You might know they're hiring, launched a product, got funding, or expanded to new markets, but your email sequences can't dynamically reference these timely events. This workflow shows how to inject real-time external context into Octave's sequence generation, creating outreach that feels like you're personally monitoring each prospect's company.

**What this workflow does**

Lead Data & Context Collection:
- Receives lead information via webhook (firstName, companyName, companyDomain, profileURL, jobTitle)
- Uses external data sources to gather timely context about the prospect's company
- Example: an AI agent researches current job postings to find roles they're actively hiring for
- Processes this context into structured data for sequence generation

Runtime Context Integration:
- Feeds external context into Octave's sequence generation as "runtime context"
- Defines both WHAT the context is ("they are hiring a software engineer") and HOW to use it ("mention the role in the opening")
- Allows Octave to weave timely, relevant details into each email naturally
- Creates sequences that feel like personal research rather than mass outreach

Dynamic Sequence Generation:
- Leverages Octave's context engine plus runtime data to create hyper-relevant sequences (1-7 emails)
- Generates subject lines and email content that reference specific, current company context
- Maintains your positioning and value prop while incorporating timely relevance
- Creates messaging that feels unmistakably meant for that specific moment in the prospect's business

Campaign Integration:
- Automatically adds leads with contextualized sequences to your email platform
- Maps generated content to campaign variables for automated sending
- Supports multiple email platforms with easy customization

**Setup**

Required Credentials:
- Octave API key and workspace access
- External data source API (job boards, news APIs, enrichment services, etc.)
- Email platform API key (Instantly.ai configured, easily adaptable)
- Optional: LLM credentials if using the example AI agent for testing

Step-by-Step Configuration:

1. Set up External Data Source:
   - Replace the AI Agent with your preferred data source (job board APIs, news APIs, company databases)
   - Configure data collection to find relevant, timely information about prospects
   - Structure the output to provide clean context for sequence generation (see the sketch after this section)
2. Set up Octave Sequence Agent:
   - Add your Octave API credentials in n8n
   - Replace your-octave-sequence-agent-id with your actual sequence agent ID
   - Configure runtime context parameters:
     - Runtime Context: Define WHAT the external data represents
     - Runtime Instructions: Specify HOW to use it in the messaging
3. Configure Email Platform:
   - Add your email platform API credentials
   - Replace your-campaign-id-here with your actual campaign ID
   - Ensure campaign supports custom variables for dynamic content
4. Set up Lead Source:
   - Replace your-webhook-path-here with a unique, secure path
   - Configure your lead source to send prospect data to the webhook
   - Test the end-to-end flow with sample leads

Required Webhook Payload Format:

```json
{
  "body": {
    "firstName": "Alex",
    "lastName": "Chen",
    "companyName": "InnovateTech",
    "companyDomain": "innovatetech.com",
    "profileURL": "https://linkedin.com/in/alexchen",
    "email": "alex@innovatetech.com",
    "jobTitle": "VP of Engineering"
  }
}
```

**How to customize**

External Data Sources - replace the AI agent with your preferred context collection method:
- **Job Board APIs:** Reference current hiring needs and team expansion
- **News APIs:** Mention recent company announcements, funding, or product launches
- **Social Media Monitoring:** Reference recent LinkedIn posts, company updates, or industry discussions
- **Enrichment Services:** Pull real-time company data, technology stack changes, or market expansion

Runtime Context Configuration - customize how external data integrates with sequences:
- **Context Definition:** Specify what the external data represents ("they just raised Series B funding")
- **Usage Instructions:** Define how to incorporate it ("mention the funding in the opening and tie it to growth challenges")
- **Email-Level Control:** Configure different context usage for different emails in the sequence
- **Global vs. Specific:** Apply context to all emails or target specific messages

Data Processing - replace the example AI agent with your external data processing:
- Modify data source connections to pull relevant context
- Ensure consistent output formatting for runtime context integration
- Add error handling for cases where external data isn't available
- Implement fallback context for prospects without relevant external data

Sequence Customization - configure Octave sequence generation:
- **Core Context (Library):** Define your personas, use cases, and offering definitions
- **Strategy (Playbooks):** Configure messaging frameworks and value proposition delivery
- **Writing Style (Agent):** Adjust tone, voice, and communication style

Email Platform Integration - adapt for different email sequencing platforms:
- Update API endpoints and authentication for your preferred platform
- Modify variable mapping for platform-specific custom fields
- Adjust sequence length and formatting requirements

**Use Cases**
- Job posting-triggered outreach for hiring managers and HR teams
- Funding announcement follow-ups for growth-stage companies
- Product launch congratulations with relevant use case discussions
- Market expansion outreach when companies enter new territories
- Technology adoption sequences based on recent stack additions
- Event attendance follow-ups with session-specific references
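A minimal sketch of shaping the research output into runtime context for the sequence agent; the field names (`runtimeContext`, `runtimeInstructions`), the node name `Webhook`, and the `openRole` research field are all illustrative assumptions that should be aligned with your actual nodes:

```javascript
// n8n Code node: structure external research into runtime context.
// Field names and the referenced node name are assumptions - align them
// with your Octave sequence node's inputs and your workflow's node names.
const research = $json; // e.g., output of the job-postings AI agent
const role = research.openRole ?? null; // hypothetical field from research

return [{
  json: {
    ...$('Webhook').item.json.body, // pass the lead data through
    runtimeContext: role
      ? `They are hiring a ${role}.`
      : "No timely hiring signal found.", // fallback when data is missing
    runtimeInstructions: role
      ? "Mention the open role in the opening line."
      : "Lead with the standard value proposition.",
  },
}];
```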
by Daniel
Generate stunning 10-second AI-crafted nature stock videos on autopilot and deliver them straight to your Telegram chat—perfect for content creators seeking effortless inspiration without the hassle of manual prompting or editing.

📋 What This Template Does
This workflow automates the creation and delivery of high-quality, 10-second nature-themed videos using AI generation tools. Triggered on a schedule, it leverages Google Gemini to craft precise video prompts, submits them to the Kie AI API for video synthesis, polls for completion (see the polling sketch below), downloads the result, and sends it via Telegram.
- Dynamically generates varied nature scenes (e.g., misty forests, ocean sunsets) with professional cinematography specs.
- Handles asynchronous video processing with webhook callbacks for efficiency.
- Ensures commercial-ready outputs: watermark-free, portrait aspect, natural ambient audio.
- Customizable schedule for daily/weekly bursts of creative B-roll footage.

🔧 Prerequisites
- n8n instance with HTTP Request and LangChain nodes enabled.
- Google Gemini API access for prompt generation.
- Kie AI API account for video creation (supports Sora-like text-to-video models).
- Telegram Bot setup for message delivery.

🔑 Required Credentials

Google Gemini API Setup
1. Go to aistudio.google.com → Create API key.
2. Ensure the key has access to Gemini 1.5 Flash or Pro models.
3. Add to n8n as a "Google Gemini API" credential type.

Kie AI API Setup
1. Sign up at kie.ai → Dashboard → API Keys.
2. Generate a new API key with video generation permissions (sora-2-text-to-video model).
3. Add to n8n as an "HTTP Header Auth" credential (header: Authorization, value: Bearer [Your API Key]).

Telegram Bot API Setup
1. Create a bot via @BotFather on Telegram → get the API token.
2. Note your target chat ID (use @userinfobot for personal chats).
3. Add to n8n as a "Telegram API" credential type.

⚙️ Configuration Steps
1. Import the workflow JSON into your n8n instance.
2. Assign the required credentials to the Gemini, Kie AI, and Telegram nodes.
3. Update the Telegram node's chat ID with your target chat (e.g., personal or group).
4. Adjust the Schedule Trigger interval (e.g., daily at 9 AM) via node settings.
5. Activate the workflow and monitor the first execution for video delivery.

🎯 Use Cases
- Content creators automating daily social media B-roll: generate fresh nature clips for Instagram Reels or YouTube intros without filming.
- Marketing teams sourcing versatile stock footage: quickly produce themed videos for campaigns, like serene landscapes for wellness brands.
- Educational bots for classrooms: deliver randomized nature videos to Telegram groups for biology lessons on ecosystems and wildlife.
- Personal productivity: schedule motivational nature escapes to your chat for remote workers needing quick digital breaks.

⚠️ Troubleshooting
- Video generation fails with a quota error: check the Kie AI dashboard for usage limits and upgrade your plan if needed.
- Prompt output too generic: tweak the Video Prompting Agent's system prompt for more specificity (e.g., add seasonal themes).
- Telegram send error: verify the bot token and chat ID; test with a simple message node first.
- Webhook callback timeout: ensure your n8n production URL is publicly accessible; use ngrok for local testing.
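A generic polling sketch for the "wait until the video is ready" step; the endpoint URL, task-ID field, and status values below are hypothetical placeholders, since Kie AI's actual contract should be taken from its API documentation:

```javascript
// Generic status-polling sketch (hypothetical endpoint and fields).
// Replace the URL, task fields, and status values with Kie AI's real API.
async function waitForVideo(taskId, apiKey, { intervalMs = 10000, maxTries = 30 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const res = await fetch(`https://api.kie.ai/v1/tasks/${taskId}`, { // placeholder URL
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const task = await res.json();
    if (task.status === "completed") return task.videoUrl; // placeholder fields
    if (task.status === "failed") throw new Error("Video generation failed");
    await new Promise((r) => setTimeout(r, intervalMs)); // wait between polls
  }
  throw new Error("Timed out waiting for the video");
}
```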
by David Olusola
🧹 Auto-Clean CSV Uploads Before Import

This workflow automatically cleans, validates, and standardizes any CSV file you upload. Perfect for preparing customer lists, sales leads, product catalogs, or any messy datasets before pushing them into Google Sheets, Google Drive, or other systems.

⚙️ How It Works
1. CSV Upload (Webhook)
   - Upload your CSV via webhook (supports form-data, Base64, or binary file upload).
   - Handles files up to ~10 MB comfortably.
2. Extract & Parse
   - Reads raw CSV content.
   - Validates file structure and headers.
   - Detects and normalizes column names (e.g. First Name → first_name).
3. Clean & Standardize Data (see the sketch after this section)
   - Removes duplicate rows (based on email or all fields).
   - Deletes empty rows.
   - Standardizes fields:
     - Emails → lowercased, validated format.
     - Phone numbers → normalized (xxx) xxx-xxxx or +1 format.
     - Names → capitalized (John Smith).
     - Text → trims spaces and fixes inconsistent spacing.
   - Assigns each row a data quality score so you know how "clean" it is.
4. Generate Cleaned CSV
   - Produces a cleaned CSV file with the same headers.
   - Saves to Google Drive (optional).
   - Ready for immediate import into Sheets or any app.
5. Google Sheets Integration (Optional)
   - Clears out an existing sheet.
   - Re-imports the cleaned rows.
   - Perfect for always keeping your "master sheet" clean.
6. Final Report
   - Logs a processing summary: rows before and after cleaning, duplicates removed, low-quality rows removed, average data quality score.
   - Outputs a neat summary for auditing.

🛠️ Setup Steps
1. Upload method: use the webhook endpoint generated by the CSV Upload Webhook node. Send the CSV via binary upload, Base64 encoding, or a JSON payload with csv_content.
2. Google Drive (optional): connect your Drive OAuth credentials and replace YOUR_DRIVE_FOLDER_ID with your target folder.
3. Google Sheets (optional): connect Google Sheets OAuth and replace YOUR_GOOGLE_SHEET_ID with your target sheet ID.
4. Customize cleaning rules: adjust the Clean & Standardize Data code node if you want different cleaning thresholds (default = 30% minimum data quality).

📊 Example Cleaning Report
- Input file: raw_leads.csv
- Rows before: 2,450
- Rows after cleaning: 1,982
- Duplicates removed: 210
- Low-quality rows removed: 258
- Avg. data quality: 87%
- ✅ Clean CSV saved to Drive
- ✅ Clean data imported into Google Sheets
- ✅ Full processing report generated

🎯 Why Use This?
- Stop wasting time manually cleaning CSVs.
- Ensure high-quality, import-ready data every time.
- Works with any dataset: leads, contacts, e-commerce exports, logs, surveys.
- Completely free - a must-have utility in your automation toolbox.

⚡ Upload a dirty CSV → get clean, validated, standardized data instantly!
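A minimal sketch of the field-level rules in the Clean & Standardize Data node, assuming rows arrive as objects with `email`, `phone`, and `first_name` keys; the exact column names depend on your file and the header normalization step:

```javascript
// n8n Code node: per-field standardization rules (illustrative subset).
// Column names (email, phone, first_name) are assumptions.
function cleanRow(row) {
  const out = { ...row };

  // Emails: lowercase and validate with a simple pattern.
  if (out.email) {
    out.email = String(out.email).trim().toLowerCase();
    if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(out.email)) out.email = "";
  }

  // Phones: keep digits; format 10-digit numbers as (xxx) xxx-xxxx.
  if (out.phone) {
    const d = String(out.phone).replace(/\D/g, "");
    out.phone = d.length === 10
      ? `(${d.slice(0, 3)}) ${d.slice(3, 6)}-${d.slice(6)}`
      : d ? `+${d}` : "";
  }

  // Names: capitalize each word; collapse repeated whitespace.
  if (out.first_name) {
    out.first_name = String(out.first_name).trim().replace(/\s+/g, " ")
      .replace(/\b\w/g, (c) => c.toUpperCase());
  }
  return out;
}

return $input.all().map((item) => ({ json: cleanRow(item.json) }));
```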
by Evoort Solutions
📺 Automated YouTube Video Metadata Extraction Workflow

Description: This workflow leverages the YouTube Metadata API to automatically extract detailed video information from any YouTube URL. It uses n8n to automate the entire process and stores the metadata in a neatly formatted Google Docs document. Perfect for content creators, marketers, and researchers who need quick, organized YouTube video insights at scale.

⚙️ Node-by-Node Explanation

1. ✅ On Form Submission
This node acts as the trigger. When a user submits a form containing a YouTube video URL, the workflow is activated.
- Input: YouTube video URL
- Platform: Webhook or n8n Form Trigger

2. 🌐 YouTube Metadata API (HTTP Request)
This node sends the video URL to the YouTube Metadata API via an HTTP GET request.
- Headers:
  - X-RapidAPI-Key: YOUR_API_KEY
  - X-RapidAPI-Host: youtube-metadata1.p.rapidapi.com
- Endpoint example: https://youtube-metadata1.p.rapidapi.com/video?url=YOUTUBE_VIDEO_URL
- Output: JSON with metadata such as title, description, views, likes, comments, duration, upload date, channel info, and thumbnails.

3. 🧠 Reformat Metadata (Code Node)
This node reformats the raw metadata into a clean, human-readable text block (see the sketch after this section). Example output format:

🎬 Title: How to Build Workflows with n8n
🧾 Description: This tutorial explains how to build...
👤 Channel: n8n Tutorials
📅 Published On: 2023-05-10
⏱️ Duration: 10 minutes, 30 seconds
👁️ Views: 45,678
👍 Likes: 1,234
💬 Comments: 210
🔗 URL: https://youtube.com/watch?v=abc123

4. 📝 Append to Google Docs
This node connects to your Google Docs and appends the formatted metadata into a selected document. Document format example:

📌 Video Entry – [Date]
🎬 Title:
🧾 Description:
👤 Channel:
📅 Published On:
⏱️ Duration:
👁️ Views:
👍 Likes:
💬 Comments:
🔗 URL:
---

📄 Use Cases
- **Content Creators:** Quickly analyze competitor content or inspirations.
- **Marketers:** Collect campaign video performance data.
- **Researchers:** Compile structured metadata across videos.
- **Social Media Managers:** Create content briefs effortlessly.

✅ Benefits
- 🚀 Time-saving: automates manual video data extraction
- 📊 Accurate: uses a reliable, updated YouTube API
- 📁 Organized: formats and stores data in Google Docs
- 🔁 Scalable: handles unlimited YouTube URLs
- 🎯 User-friendly: simple setup and clean output

🔑 How to Get Your API Key for the YouTube Metadata API
1. Go to the YouTube Metadata API on RapidAPI.
2. Sign up or log in to your RapidAPI account.
3. Click Subscribe to Test and choose a pricing plan (free or paid).
4. Copy your API key shown in the "X-RapidAPI-Key" section.
5. Use it in your HTTP request headers.

🧰 Google Docs Integration – Full Setup Instructions

🔐 Step 1: Enable the Google Docs API
1. Go to the Google Cloud Console.
2. Create a new project or select an existing one.
3. Navigate to APIs & Services > Library.
4. Search for Google Docs API and click Enable.
5. Also enable the Google Drive API (for document access).

🛠 Step 2: Create OAuth Credentials
1. Go to APIs & Services > Credentials.
2. Click Create Credentials > OAuth Client ID.
3. Select Web Application or Desktop App.
4. Add authorized redirect URIs if needed (e.g., for n8n OAuth).
5. Save your Client ID and Client Secret.

🔗 Step 3: Connect n8n to Google Docs
1. In n8n, go to Credentials > Google Docs API.
2. Add new credentials using the Client ID and Secret from above.
3. Authenticate with your Google account and allow access.

📘 Step 4: Create and Format Your Google Document
1. Go to Google Docs and create a new document.
2. Name it (e.g., YouTube Metadata Report).
3. Optionally, add a title or table of contents.
4. Copy the Document ID from the URL: https://docs.google.com/document/d/DOCUMENT_ID/edit

🔄 Step 5: Use the Append Content to Document Node in n8n
Use the Google Docs node in n8n with:
- Operation: Append Content
- Document ID: your copied Google Doc ID
- Content: the formatted video summary string

🎨 Customization Options
- 💡 Add tags: insert hashtags or categories based on video topics.
- 📆 Organize by date: create headers for each day's or week's entries.
- 📸 Embed thumbnails: use thumbnail_url to embed preview images.
- 📊 Spreadsheet export: use Google Sheets instead of Docs if preferred.

🛠 Troubleshooting Tips

| Issue | Solution |
| --- | --- |
| ❌ Auth error (Google Docs) | Ensure the correct OAuth redirect URI and permissions. |
| ❌ API request fails | Check the API key and request structure; test on RapidAPI's playground. |
| 📄 Doc not updating | Verify the Document ID and sharing permissions. |
| 🧾 Bad formatting | Debug the code node output using logging or the console in n8n. |
| 🌐 n8n timeout | Consider using Wait or Split In Batches for large submissions. |

🚀 Ready to Launch?
You can deploy this workflow in just minutes using n8n.
👉 Start Automating with n8n
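A minimal sketch of the Reformat Metadata Code node, assuming the API returns fields like `title`, `channel`, and `viewCount` at the top level; the actual field names may differ, so inspect a sample response first:

```javascript
// n8n Code node: turn raw API metadata into the readable block above.
// Field names (title, description, channel, etc.) are assumptions -
// check the API's real response shape and adjust accordingly.
const m = $json;

const summary = [
  `🎬 Title: ${m.title ?? ""}`,
  `🧾 Description: ${(m.description ?? "").slice(0, 200)}`,
  `👤 Channel: ${m.channel ?? ""}`,
  `📅 Published On: ${m.publishedDate ?? ""}`,
  `⏱️ Duration: ${m.duration ?? ""}`,
  `👁️ Views: ${Number(m.viewCount ?? 0).toLocaleString()}`,
  `👍 Likes: ${Number(m.likeCount ?? 0).toLocaleString()}`,
  `💬 Comments: ${Number(m.commentCount ?? 0).toLocaleString()}`,
  `🔗 URL: ${m.url ?? ""}`,
].join("\n");

return [{ json: { summary } }];
```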
by WeblineIndia
Fill iOS localization gaps from .strings → Google Sheets and PR with placeholders (GitHub)

This n8n workflow automatically identifies missing translations in .strings files across iOS localizations (e.g., Base.lproj vs fr.lproj) and generates a report in Google Sheets. Optionally, it creates a GitHub PR to insert placeholder strings ("TODO_TRANSLATE") so builds don't fail. Supports DRY_RUN mode.

**Who's it for**
- iOS teams who want fast feedback on missing translations.
- Localization managers who want a shared sheet to assign work to translators.

**How it works**
1. A GitHub webhook triggers on push or pull request.
2. The iOS repo is scanned for .strings files under Base.lproj or en.lproj and their target-language counterparts.
3. It compares keys and identifies what's missing (see the diff sketch after this section).
4. A new or existing Google Sheet tab (e.g., fr) is updated with the missing entries.
5. If enabled, it creates a GitHub PR with placeholder keys (e.g., "TODO_TRANSLATE").

**How to set up**
1. Import the workflow JSON into your n8n instance.
2. Set the Config node values, for example:

```json
{
  "GITHUB_OWNER": "your-github-user-name",
  "GITHUB_REPO": "your-iOS-repo-name",
  "BASE_BRANCH": "develop",
  "SHEET_ID": "<YOUR_GOOGLE_SHEET_ID>",
  "ENABLE_PR": "true",
  "IOS_SOURCE_GLOB": "/Base.lproj/*.strings,/en.lproj/*.strings",
  "IOS_TARGET_GLOB": "*/.lproj/*.strings",
  "PLACEHOLDER_VALUE": "TODO_TRANSLATE",
  "BRANCH_TEMPLATE": "chore/l10n-gap-{{YYYYMMDD}}"
}
```

3. Create the GitHub webhook:
   - URL: https://your-n8n-instance/webhook/l10n-gap-ios
   - Content-Type: application/json
   - Trigger on: Push, Pull Request
4. Connect credentials:
   - GitHub token with repo scope
   - Google Sheets API
   - (Optional) Slack OAuth + SMTP

**Requirements**

| Tool | Needed For | Notes |
| --- | --- | --- |
| GitHub Repo | Webhook, API for PRs | repo token or App |
| Google Sheets | Sheet output | Needs a valid SHEET_ID or create-per-run |
| Slack (optional) | Notifications | chat:write scope |
| SMTP (optional) | Email fallback | Standard SMTP creds |

**How to customize**
- **Multiple Locales:** Add comma-separated values to TARGET_LANGS_CSV (e.g., fr,de,es).
- **Globs:** Adjust IOS_SOURCE_GLOB and IOS_TARGET_GLOB to scan only certain modules or file patterns.
- **Ignore Rules:** Add IGNORE_KEY_PREFIXES_CSV to skip certain internal/debug strings.
- **Placeholder Value:** Change PLACEHOLDER_VALUE to something meaningful like "@@@".
- **Slack/Email:** Set SLACK_CHANNEL and EMAIL_FALLBACK_TO_CSV appropriately.
- **DRY_RUN:** Set to true to skip GitHub PR creation but still update the sheet.

**Add-ons**
- **Android support:** Add a second path for strings.xml (values → values-<lang>), same diff → Sheets → placeholder PR.
- **Multiple languages at once:** Expand TARGET_LANGS_CSV and loop tabs + placeholder commits per locale.
- **.stringsdict handling:** Validate plural/format entries and open a precise PR.
- **Translator DMs:** Provide a LANG → Slack handle/email map to DM translators with their specific file/key counts.
- **GitLab/Bitbucket variants:** Replace GitHub API calls with GitLab/Bitbucket equivalents to open Merge Requests.

**Use Case Examples**
- Before a test build, ensure fr has all keys present - placeholders keep the app compiling.
- A weekly run creates a single sheet for translators and a PR with placeholders, avoiding last-minute breakages.
- A new screen adds 12 strings; the bot flags and pre-fills them across locales.

**Common troubleshooting**

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| No source files found | Glob doesn't match Base.lproj or en.lproj | Adjust IOS_SOURCE_GLOB |
| Target file missing | fr.lproj doesn't exist yet | Will be created in the placeholder PR |
| Parsing skips entries | Non-standard string format in file | Ensure the proper .strings format "key" = "value"; |
| Sheet not updating | SHEET_ID missing or insufficient permission | Add a valid ID or allow write access |
| PR not created | ENABLE_PR=false or no missing keys | Enable PR and ensure at least one key is missing |
| Slack/Email not received | Missing credentials or config | Configure Slack/SMTP properly and set recipient fields |

**Need Help?**
Want to expand this for Android? Loop through 5+ locales at once? Or replace GitHub with GitLab? Contact our n8n team at WeblineIndia with your repo and locale setup and we'll help tailor it to your translation workflow!
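A minimal sketch of the key-diff step, assuming each file's raw text is available as a string. It parses the standard `"key" = "value";` format the troubleshooting table references and reports keys present in the base file but absent from the target:

```javascript
// Parse a .strings file into a key -> value map.
// Handles only the standard "key" = "value"; line format.
function parseStrings(text) {
  const entries = {};
  const re = /"((?:[^"\\]|\\.)+)"\s*=\s*"((?:[^"\\]|\\.)*)"\s*;/g;
  let m;
  while ((m = re.exec(text)) !== null) {
    entries[m[1]] = m[2];
  }
  return entries;
}

// Diff: keys in the base locale that the target locale is missing.
function missingKeys(baseText, targetText) {
  const base = parseStrings(baseText);
  const target = parseStrings(targetText);
  return Object.keys(base).filter((k) => !(k in target));
}

// Example: missingKeys(baseLproj, frLproj) -> ["welcome_title", ...]
// Each missing key would get PLACEHOLDER_VALUE ("TODO_TRANSLATE") in the PR.
```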
by AFK Crypto
Try It Out!

🚀 Reddit Crypto Intelligence & Market Spike Detector

🧠 Workflow Description
Reddit Crypto Intelligence & Market Spike Detector is an automated market sentiment and price-monitoring workflow that connects social chatter with real-time crypto price analytics. It continuously scans new posts from r/CryptoCurrency, extracts recently mentioned coins, checks live price movements via CoinGecko, and alerts you on Discord when a significant spike or drop occurs. This automation empowers traders, analysts, and communities to spot early market trends before they become mainstream — all using free APIs and open data.

⚙️ How It Works
1. Monitor Reddit Activity
   ◦ Automatically fetches the latest posts from r/CryptoCurrency using Reddit's free RSS feed.
   ◦ Captures trending titles, post timestamps, and mentions of coins or tokens (e.g., $BTC, $ETH, $SOL, $PEPE).
2. Extract Coin Mentions
   ◦ A Code node parses the feed using the regex \$[A-Za-z0-9]{2,10} to identify any symbols or tickers discussed (see the sketch at the end of this description).
   ◦ Removes duplicates and normalizes all results for accurate data mapping.
3. Fetch Market Data
   ◦ Each detected coin symbol is matched against CoinGecko's public API to fetch live market data, including current price, market rank, and 24-hour price change.
   ◦ No API key required — a completely free and reliable source.
4. Detect Market Movement
   ◦ A second Code node filters the fetched data to identify price movements greater than ±5% within the last 24 hours.
   ◦ This helps isolate meaningful market action from routine fluctuations.
5. Generate and Send Alerts
   ◦ When a spike or dip is detected, the workflow composes a rich alert message including:
     ▪ 💎 Coin name and symbol
     ▪ 💰 Current price
     ▪ 📈 24h percentage change
     ▪ 🕒 Timestamp of detection
   ◦ The message is sent automatically to your Discord channel using a preconfigured webhook.

💬 Example Output

🚨 Crypto Reddit Mention & Price Spike Alert! 🚨
💎 ETHEREUM (ETH)
💰 $3,945.23
📈 Change: +6.12%

💎 SOLANA (SOL)
💰 $145.88
📈 Change: +8.47%

🕒 Checked at: 2025-10-31T15:00:00Z

If no coins cross the ±5% threshold: "No price spikes detected in the latest Reddit check."
🔔 #MarketIntel #CryptoSentiment #PriceAlert

🪄 Key Features
• 🧠 Social + Market Intelligence – Combines Reddit sentiment with live market data to detect potential early signals.
• 🔎 Automated Coin Detection – Dynamically identifies newly discussed tokens from live posts.
• 📊 Smart Spike Filtering – Highlights only meaningful movements above configurable thresholds.
• 💬 Discord Alerts – Delivers clear, structured, and timestamped alerts to your community automatically.
• ⚙️ Fully No-Cost Stack – Uses the free Reddit and CoinGecko APIs with no authentication required.

🧩 Use Cases
• Crypto traders: detect early hype or momentum shifts driven by social chatter.
• Analysts: automate social sentiment tracking tied directly to live market metrics.
• Community managers: keep members informed about trending coins automatically.
• Bots & AI assistants: integrate this logic to enhance automated trading signals or alpha alerts.

🧰 Required Setup
• Discord webhook URL – for automatic alert posting.
• (Optional) CoinGecko API endpoint (no API key required).
• n8n instance – self-hosted or Cloud; the free tier is sufficient.
• Workflow schedule – recommended: hourly (Cron node interval = 1 hour).

AFK Crypto Website: afkcrypto.com
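A minimal sketch of the coin-extraction Code node described above, using the documented regex and deduplicating the matches; the `title` field name assumes the RSS items expose post titles there:

```javascript
// n8n Code node: extract ticker mentions like $BTC or $PEPE from post titles.
// Uses the regex documented above; the "title" field name is an assumption.
const tickers = new Set();

for (const item of $input.all()) {
  const text = String(item.json.title ?? "");
  const matches = text.match(/\$[A-Za-z0-9]{2,10}/g) ?? [];
  for (const m of matches) {
    tickers.add(m.slice(1).toUpperCase()); // drop "$", normalize case
  }
}

// One item per unique symbol, ready for the CoinGecko lookup.
return [...tickers].map((symbol) => ({ json: { symbol } }));
```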