by Nalin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Generate dynamic email sequences with runtime context and external data Who is this for? Growth teams, sales development reps, and outbound marketers who want to reference specific, real-time information about prospects in their email sequences. Built for teams that have access to external data sources and want to create truly contextualized outreach that feels impossibly relevant. What problem does this solve? Most outbound sequences are static - they use the same messaging for everyone regardless of what's actually happening at the prospect's company right now. You might know they're hiring, launched a product, got funding, or expanded to new markets, but your email sequences can't dynamically reference these timely events. This workflow shows how to inject real-time external context into Octave's sequence generation, creating outreach that feels like you're personally monitoring each prospect's company. What this workflow does Lead Data & Context Collection: Receives lead information via webhook (firstName, companyName, companyDomain, profileURL, jobTitle) Uses external data sources to gather timely context about the prospect's company Example: AI agent researches current job postings to find roles they're actively hiring for Processes this context into structured data for sequence generation Runtime Context Integration: Feeds external context into Octave's sequence generation as "runtime context" Defines both WHAT the context is ("they are hiring a software engineer") and HOW to use it ("mention the role in the opening") Allows Octave to weave timely, relevant details into each email naturally Creates sequences that feel like personal research rather than mass outreach Dynamic Sequence Generation: Leverages Octave's context engine plus runtime data to create hyper-relevant sequences (1-7 emails) Generates subject lines and email content that reference specific, current company context Maintains your positioning and value prop while incorporating timely relevance Creates messaging that feels unmistakably meant for that specific moment in the prospect's business Campaign Integration: Automatically adds leads with contextualized sequences to your email platform Maps generated content to campaign variables for automated sending Supports multiple email platforms with easy customization Setup Required Credentials: Octave API key and workspace access External data source API (job boards, news APIs, enrichment services, etc.) 
Email platform API key (Instantly.ai configured, easily adaptable) Optional: LLM credentials if using the example AI agent for testing Step-by-Step Configuration: Set up External Data Source: Replace the AI Agent with your preferred data source (job board APIs, news APIs, company databases) Configure data collection to find relevant, timely information about prospects Structure the output to provide clean context for sequence generation Set up Octave Sequence Agent: Add your Octave API credentials in n8n Replace your-octave-sequence-agent-id with your actual sequence agent ID Configure runtime context parameters: Runtime Context: Define WHAT the external data represents Runtime Instructions: Specify HOW to use it in the messaging Configure Email Platform: Add your email platform API credentials Replace your-campaign-id-here with your actual campaign ID Ensure campaign supports custom variables for dynamic content Set up Lead Source: Replace your-webhook-path-here with a unique, secure path Configure your lead source to send prospect data to the webhook Test end-to-end flow with sample leads Required Webhook Payload Format: { "body": { "firstName": "Alex", "lastName": "Chen", "companyName": "InnovateTech", "companyDomain": "innovatetech.com", "profileURL": "https://linkedin.com/in/alexchen", "email": "alex@innovatetech.com", "jobTitle": "VP of Engineering" } }
How to customize External Data Sources: Replace the AI agent with your preferred context collection method: Job Board APIs: Reference current hiring needs and team expansion News APIs: Mention recent company announcements, funding, or product launches Social Media Monitoring: Reference recent LinkedIn posts, company updates, or industry discussions Enrichment Services: Pull real-time company data, technology stack changes, or market expansion Runtime Context Configuration: Customize how external data integrates with sequences: Context Definition: Specify what the external data represents ("they just raised Series B funding") Usage Instructions: Define how to incorporate it ("mention the funding in the opening and tie it to growth challenges") Email-Level Control: Configure different context usage for different emails in the sequence Global vs. Specific: Apply context to all emails or target specific messages Data Processing: Replace the example AI agent with your external data processing: Modify data source connections to pull relevant context Ensure consistent output formatting for runtime context integration Add error handling for cases where external data isn't available Implement fallback context for prospects without relevant external data Sequence Customization: Configure Octave sequence generation: Core Context (Library): Define your personas, use cases, and offering definitions Strategy (Playbooks): Configure messaging frameworks and value proposition delivery Writing Style (Agent): Adjust tone, voice, and communication style Email Platform Integration: Adapt for different email sequencing platforms: Update API endpoints and authentication for your preferred platform Modify variable mapping for platform-specific custom fields Adjust sequence length and formatting requirements Use Cases Job posting-triggered outreach for hiring managers and HR teams Funding announcement follow-ups for growth-stage companies Product launch congratulations with relevant use case discussions Market expansion outreach when companies enter new territories Technology adoption sequences based on recent stack additions Event attendance follow-ups with session-specific references
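To make the runtime context hand-off concrete, here is a minimal Code-node sketch of how the external research output could be shaped into the two fields described above. The node name "Webhook", the incoming jobPosting object, and the runtimeContext/runtimeInstructions field names are illustrative assumptions, not Octave's actual parameter names; map them to whatever your sequence agent node expects.

```javascript
// Minimal sketch: turn external research into runtime context for the Octave node.
// Assumptions: the trigger node is named "Webhook"; the previous node returns a
// `jobPosting` object; `runtimeContext`/`runtimeInstructions` are placeholder names.
const lead = $('Webhook').first().json.body;   // firstName, companyName, companyDomain, ...
const research = $json.jobPosting || null;     // e.g. { title: "Senior Software Engineer", postedAt: "2024-05-01" }

const runtimeContext = research
  ? `${lead.companyName} is currently hiring a ${research.title} (posted ${research.postedAt}).`
  : `${lead.companyName} has no relevant open roles right now.`;

const runtimeInstructions = research
  ? 'Mention the open role in the opening line and tie it to the pain of scaling the team.'
  : 'Skip hiring references; lead with the prospect\'s industry instead.';

return [{ json: { ...lead, runtimeContext, runtimeInstructions } }];
```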
by Growth AI
Google Ads automated reporting to spreadsheets with Airtable Who's it for Digital marketing agencies, PPC managers, and marketing teams who manage multiple Google Ads accounts and need automated monthly performance reporting organized by campaign types and conversion metrics. What it does This workflow automatically retrieves Google Ads performance data from multiple client accounts and populates organized spreadsheets with campaign metrics. It differentiates between e-commerce (conversion value) and lead generation (conversion count) campaigns, then organizes data by advertising channel (Performance Max, Search, Display, etc.) with monthly tracking for budget and performance analysis. How it works The workflow follows an automated data collection and reporting process: Account Retrieval: Fetches client information from Airtable (project names, Google Ads IDs, campaign types) Active Filter: Processes only accounts marked as "Actif" for budget reporting Campaign Classification: Routes accounts through e-commerce or lead generation workflows based on "Typologie ADS" Google Ads Queries: Executes different API calls depending on campaign type (conversion value vs. conversion count) Data Processing: Organizes metrics by advertising channel (Performance Max, Search, Display, Video, Shopping, Demand Gen) Dynamic Spreadsheet Updates: Automatically fills the correct monthly column in client spreadsheets Sequential Processing: Handles multiple accounts with wait periods to avoid API rate limits Requirements Airtable account with client database Google Ads API access with developer token Google Sheets API access Client-specific spreadsheet templates (provided) How to set up Step 1: Prepare your reporting template Copy the Google Sheets reporting template Create individual copies for each client Ensure proper column structure (months B-M for January-December) Link template URLs in your Airtable database Step 2: Configure your Airtable database Set up the following fields in your Airtable: Project names: Client project identifiers ID GADS: Google Ads customer IDs Typologie ADS: Campaign classification ("Ecommerce" or "Lead") Status - Prévisionnel budgétaire: Account status ("Actif" for active accounts) Automation budget: URLs to client-specific reporting spreadsheets Step 3: Set up API credentials Configure the following authentication: Airtable Personal Access Token: For client database access Google Ads OAuth2: For advertising data retrieval Google Sheets OAuth2: For spreadsheet updates Developer Token: Required for Google Ads API access Login Customer ID: Manager account identifier Step 4: Configure Google Ads API settings Update the HTTP request nodes with your credentials: Developer Token: Replace "[Your token]" with your actual developer token Login Customer ID: Replace "[Your customer id]" with your manager account ID API Version: Currently using v18 (update as needed) Step 5: Set up scheduling Default schedule: Runs on the 3rd of each month at 5 AM Cron expression: 0 5 3 * * Recommended timing: Early month execution for complete previous month data Processing delay: 1-minute waits between accounts to respect API limits How to customize the workflow Campaign type customization E-commerce campaigns: Tracks: Cost and conversion value metrics Query: metrics.conversions_value for revenue tracking Use case: Online stores, retail businesses Lead generation campaigns: Tracks: Cost and conversion count metrics Query: metrics.conversions for lead quantity Use case: Service businesses, B2B companies 
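For reference, the two query variants differ only in the conversion metric they select. Below is a hedged sketch of how the request for an e-commerce account might be assembled in a Code node before the HTTP Request node calls the Google Ads searchStream endpoint; the GAQL fields shown are standard, but verify them against the API version you configured (v18 in this template), and swap metrics.conversions_value for metrics.conversions on the lead-generation branch.

```javascript
// Sketch of the searchStream request for an e-commerce account.
const customerId = $json['ID GADS'];   // Google Ads customer ID pulled from Airtable

const query = `
  SELECT campaign.advertising_channel_type,
         metrics.cost_micros,
         metrics.conversions_value
  FROM campaign
  WHERE segments.date DURING LAST_MONTH`;

return [{
  json: {
    url: `https://googleads.googleapis.com/v18/customers/${customerId}/googleAds:searchStream`,
    headers: {
      'developer-token': '[Your token]',
      'login-customer-id': '[Your customer id]',
    },
    body: { query },
  },
}];
```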
Advertising channel expansion Current channels tracked: Performance Max: Automated campaign type Search: Text ads on search results Display: Visual ads on partner sites Video: YouTube and video partner ads Shopping: Product listing ads Demand Gen: Audience-focused campaigns Add new channels by modifying the data processing code nodes. Reporting period adjustment Current setting: Last month data (DURING LAST_MONTH) Alternative periods: Last 30 days, specific date ranges, quarterly reports Custom timeframes: Modify the Google Ads query date parameters Multi-account management Sequential processing: Handles multiple accounts automatically Error handling: Continues processing if individual accounts fail Rate limiting: Built-in waits prevent API quota issues Batch size: No limit on number of accounts processed Data organization features Dynamic monthly columns Automatic detection: Determines previous month column (B-M) Column mapping: January=B, February=C, ..., December=M Data placement: Updates correct month automatically Multi-year support: Handles year transitions seamlessly Campaign performance breakdown Each account populates 10 rows of data: Performance Max Cost (Row 2) Performance Max Conversions/Value (Row 3) Demand Gen Cost (Row 4) Demand Gen Conversions/Value (Row 5) Search Cost (Row 6) Search Conversions/Value (Row 7) Video Cost (Row 8) Video Conversions/Value (Row 9) Shopping Cost (Row 10) Shopping Conversions/Value (Row 11) Data processing logic Cost conversion: Automatically converts micros to euros (÷1,000,000) Precision rounding: Rounds to 2 decimal places for clean presentation Zero handling: Shows 0 for campaign types with no activity Data validation: Handles missing or null values gracefully Results interpretation Monthly performance tracking Historical data: Year-over-year comparison across all channels Channel performance: Identify best-performing advertising types Budget allocation: Data-driven decisions for campaign investments Trend analysis: Month-over-month growth or decline patterns Account-level insights Multi-client view: Consolidated reporting across all managed accounts Campaign diversity: Understanding which channels clients use most Performance benchmarks: Compare similar account types and industries Resource allocation: Focus on high-performing accounts and channels Use cases Agency reporting automation Client dashboards: Automated population of monthly performance reports Budget planning: Historical data for next month's budget recommendations Performance reviews: Ready-to-present data for client meetings Trend identification: Spot patterns across multiple client accounts Internal performance tracking Team productivity: Track account management efficiency Campaign optimization: Identify underperforming channels for improvement Growth analysis: Monitor client account growth and expansion Forecasting: Use historical data for future performance predictions Strategic planning Budget allocation: Data-driven distribution across advertising channels Channel strategy: Determine which campaign types to emphasize Client retention: Proactive identification of declining accounts New business: Performance data to support proposals and pitches Workflow limitations Monthly execution: Designed for monthly reporting (not real-time) API dependencies: Requires stable Google Ads and Sheets API access Rate limiting: Sequential processing prevents parallel account handling Template dependency: Requires specific spreadsheet structure for proper data placement Previous month focus: 
Optimized for completed month data (run early in new month) Manual credential setup: Requires individual configuration of API tokens and customer IDs
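The micros-to-euros conversion and the dynamic month-column selection described under "Data organization features" reduce to a few lines of Code-node JavaScript. A minimal sketch; the metric field name assumes the raw Google Ads response shape, so adjust it to match your parsed data.

```javascript
// Convert cost from micros to euros and round to 2 decimal places.
const costEuros = Math.round((($json.metrics?.cost_micros || 0) / 1_000_000) * 100) / 100;

// Pick the spreadsheet column for the previous month: January=B ... December=M.
const columns = ['B', 'C', 'D', 'E', 'F', 'G', 'H', 'I', 'J', 'K', 'L', 'M'];
const prevMonth = (new Date().getMonth() + 11) % 12;   // 0-based index of last month; January rolls back to December
const targetColumn = columns[prevMonth];

return [{ json: { costEuros, targetColumn } }];
```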
by Nansen
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. How it works This workflow listens for an incoming chat message and routes it to an AI Agent. The agent is powered by your preferred Chat Model (such as OpenAI or Anthropic) and extended with the Nansen MCP tool, which enables it to retrieve onchain wallet data, token movements, and address-level insights in real time. The Nansen MCP tool uses HTTP Streamable transport and requires API Key authentication via Header Auth. Read the Documentation: https://docs.nansen.ai/nansen-mcp/overview Set up steps Get your Nansen MCP API key Visit: https://app.nansen.ai/account?tab=api Generate and copy your personal API key. Create a credential for authentication From the homepage, click the dropdown next to "Create Workflow" → "Create Credential". Select Header Auth as the method. Set the Header Name to: NANSEN-API-KEY Paste your API key into the Value field. Save the credential (e.g., Nansen MCP Credentials). Configure the Nansen MCP tool Endpoint: https://mcp.nansen.ai/ra/mcp/ Server Transport: HTTP Streamable Authentication: Header Auth Credential: Select Nansen MCP Credentials Tools to Include: Leave as All (or restrict as needed) Configure the AI Agent Connect your preferred Chat Model (e.g., OpenAI, Anthropic) to the Chat Model input. Connect the Nansen MCP tool to the Tool input. (Optional) Add a Memory block to preserve conversational context. Set up the chat trigger Use the "When chat message received" node to start the flow when a message is received. Test your setup Try sending prompts like: What tokens are being swapped by 0xabc...123? Get recent wallet activity for this address. Show top holders of token XYZ.
by SpaGreen Creative
Send Automatic WhatsApp Order Confirmations from Shopify with Rapiwa API Who’s it for This n8n workflow helps Shopify store owners and teams automatically confirm orders via WhatsApp. It checks if the customer's number is valid using Rapiwa API, sends a personalized message, and logs every attempt in Google Sheets—saving time and reducing manual work. Whether you're a solo entrepreneur or managing a small team, this solution gives you a low-cost alternative to the official WhatsApp Business API, without losing control or personalization. Features Receives new order details via webhook upon order creation or update. Iterates over incoming data in manageable batches for smoother processing. Extracts and formats customer and order details from the Shopify webhook payload. Strips non-numeric characters from WhatsApp numbers for consistent formatting. Uses Rapiwa API to check if the WhatsApp number is valid and active. Branches the workflow based on number validity — separates verified from unverified. Sends a custom WhatsApp confirmation message to verified customers using Rapiwa. Updates Google Sheet rows with status and validity How it Works / What It Does Triggered by a Shopify webhook or by reading rows from a Google Sheet. Normalizes and cleans the order payload. Extracts details like customer name, phone, items, shipping, and payment info. Cleans phone numbers (removes special characters). Verifies if the number is registered on WhatsApp via Rapiwa API. If valid: Sends a templated WhatsApp message. Updates Google Sheet with validity = verified and status = sent. If invalid: Skips sending. Updates sheet with validity = unverified and status = not sent. Adds wait/delay between sends to prevent rate limits. Keeps an audit trail in the connected Google Sheet. How to Set Up Set up a Shopify webhook for new orders (or connect a Google Sheet). Create a Google Sheet with columns: name, number, order id, item name, total price, validity, status Create and configure a Rapiwa Bearer token in n8n. Add Google Sheets OAuth2 credential in n8n. Import the workflow in n8n and configure these nodes: Webhook or Sheet Trigger Loop Over Items (SplitInBatches) Normalize Payload (Code) Clean WhatsApp Number (Code) Rapiwa WhatsApp Check (HTTP Request) Conditional Branch (If) Send WhatsApp Message (HTTP Request) Update Google Sheet (Google Sheets) Wait Node (delay per send) Requirements Shopify store with order webhook enabled (or order list in Google Sheet) A verified Rapiwa API token A working n8n instance with HTTP and Google Sheets nodes enabled A Google Sheet with required structure and valid OAuth credentials in n8n How to Customize the Workflow Modify the message template with your own brand tone or emojis. Add country-code logic in the Clean Number node if needed. Use a unique order id in your Google Sheet to prevent mismatches. Increase or decrease delay in the Wait node (e.g., 5–10 seconds). Use additional logic in Code nodes to handle discounts, promotions, or more line items. Workflow Highlights Triggered by a Shopify webhook update. Receives new order data from Shopify via webhook. Cleans and extracts order data from the raw payload. Normalizes and validates the customer’s WhatsApp number using the Rapiwa API. Verifies the WhatsApp number using Rapiwa's verify-whatsapp endpoint. Sends order confirmation via Rapiwa's send-message endpoint. Logs every result into Google Sheets (verified/unverified + sent/not sent). Setup in n8n
1. Check WhatsApp Registration Use an HTTP Request node: URL: https://app.rapiwa.com/api/verify-whatsapp Method: POST Auth: httpBearerAuth using your Rapiwa token Body: { "number": "cleaned_number" } 2. Branch Based on Validity Use an If node: Condition: {{ $json.data.exists }} == true (or "true" if string) 3. Send Message via Rapiwa Endpoint: https://app.rapiwa.com/api/send-message Method: POST Body: Hi {{ $json.customer_full_name }}, Thank you for shopping with SpaGreen Creative! We're happy to confirm that your order has been successfully placed. 🧾 Order Details • Product: {{ $json.line_item.title }} • SKU: {{ $json.line_item.sku }} • Quantity: {{ $json.line_item.quantity }} • Vendor: {{ $json.line_item.vendor }} • Order ID: {{ $json.name }} • Product ID: {{ $json.line_item.product_id }} 📦 Shipping Information {{ $json.shipping_address.address1 }} {{ $json.shipping_address.address2 }} {{ $json.shipping_address.city }}, {{ $json.shipping_address.country }} - {{ $json.shipping_address.zip }} 💳 Payment Summary • Subtotal: {{ $json.subtotal_price }} BDT • Tax (VAT): {{ $json.total_tax_amount }} BDT • Shipping: {{ $json.total_shipping_amount }} BDT • Discount: {{ $json.total_discount_amount }} BDT • Total Paid: {{ $json.total_price }} BDT Order Date: {{ $json.created_date }} Warm wishes, Team SpaGreen Creative Sample Google Sheet Structure A Google Sheet formatted like this ➤ Sample
| name | number | order id | item name | total price | validity | status |
| --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | 8986469695806 | Iphone 10 | 1150 | verified | sent |
| Abdul Mannan | 8801322827799 | 8986469695806 | S25 UltraXXXXeen Android Phone | 23000 | verified | sent |
Tips Always ensure phone numbers have a country code (e.g., 880 for BD). Clean numbers with regex: replace(/\D/g, '') Adjust Rapiwa API response parsing depending on actual structure (true vs "true"). Use row_number for sheet updates, or unique order id for better targeting. Use the Wait node to add 3–10 seconds between sends. Important Notes Avoid reordering sheet rows—updates rely on consistent row_number. shopify-app-auth is the credential name used in the export—make sure it's your Rapiwa token. Use a test sheet before going live. Rapiwa has request limits—avoid rapid sending. Add media/image logic later using message_type: media. Future Enhancements (Ideas) Add Telegram/Slack alert once the batch finishes. Include media (e.g., product image, invoice) in the message. Detect and resend failed messages. Integrate with Shopify’s GraphQL API for additional data. Auto-mark fulfillment status based on WhatsApp confirmation. Support & Community WhatsApp: 8801322827799 Discord: discord Facebook Group: facebook group Website: https://spagreen.net Envato/Codecanyon: codecanyon portfolio
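As a concrete example of the number-cleaning step referenced in the Tips above, here is a minimal Code-node sketch that strips non-digits and prepends the Bangladesh country code when it is missing. The input field names and the 880 prefix rule are assumptions drawn from the sample data; adjust both for your own market.

```javascript
// Clean the WhatsApp number: keep digits only, then ensure a country code.
const raw = $json.shipping_address?.phone || $json.phone || '';
let number = String(raw).replace(/\D/g, '');   // e.g. "+880 1322-827799" -> "8801322827799"

// Assumption: local numbers start with "01" and need the 880 country code prepended.
if (number.startsWith('01')) {
  number = '880' + number.slice(1);
}

return [{ json: { ...$json, cleaned_number: number } }];
```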
by Chris Pryce
Overview This workflow streamlines the process of setting up a chat-bot using the Signal Messenger API. What this is for Chat-bot applications have become very popular on WhatsApp and Telegram. However, security conscious people may be hesitant to connect their AI agents to these applications. Compared to WhatsApp and Telegram, the Signal messaging app is more secure and end-to-end encrypted by default. In part because of this, it is more difficult to create a chat-bot application in this app. However, this is still possible to do if you host your own Signal API endpoint. This workflow requires the installation of a community-node package. Some additional setup for the locally hosted Signal API endpoint is also necessary. As such, it will only work with self-hosted instances of n8n. You may use any AI model you wish for this chat-bot, and connect different tools and APIs depending on your use-case. How to setup Step 1: Set up the REST API Before implementing this workflow, you must set up a local Signal CLI REST API. This can be done using a docker container based on this project: bbernhard/signal-cli-rest-api.

```yaml
version: "3"
services:
  signal-cli-rest-api:
    image: bbernhard/signal-cli-rest-api:latest
    environment:
      - MODE=normal #supported modes: json-rpc, native, normal
      #- AUTO_RECEIVE_SCHEDULE=0 22 * * * #enable this parameter on demand (see description below)
    ports:
      - "8080:8080" #map docker port 8080 to host port 8080
    volumes:
      - "./signal-cli-config:/home/.local/share/signal-cli" #map "signal-cli-config" folder on host system into docker container. the folder contains the password and cryptographic keys when a new number is registered
```

After starting the docker container, you will be able to interact with a local Signal client over a REST API at http://localhost:8080 (by default; this setting can be modified in the docker-compose file). Step 2: Install Node Package This workflow requires the community-node package developed by ZBlaZe: n8n-nodes-signal-cli-rest-api. Navigate to your-n8n-server-address/settings/community-nodes, click the 'Install' button, and paste in the community-node package name 'n8n-nodes-signal-cli-rest-api' to install this community node. Step 3: Register and Verify Account This step requires a phone number. You may use your own phone number, a pre-paid SIM card, or (if you are a US resident) a free Google Voice digital phone number. An n8n web-form has been created in this workflow to make headless setup easier. In the Form nodes, replace the URL with the address of your local Signal REST API endpoint. Open the web form and enter the phone number you are using to register your bot's Signal account. Signal needs to verify you are human before registering an account. Visit this page to complete the captcha challenge. Then right-click the 'Open Signal' button and copy the link address. Paste this into the second form field and hit 'Submit'. At this point you should receive a verification token as an SMS message to the phone number you are using. Copy this and paste it into the second web form. Your bot's Signal account should be set up now. To use this account in n8n, add the REST API address and account number (phone number) as a new n8n credential. Step 4: Optional For extra security it is recommended to restrict communication with this chat-bot. In the 'If' workflow node, enter your own Signal account phone number. You may also provide a UUID. This is an identifier unique to your mobile device.
You can find this by sending a test message to your bot's signal account and copying it from the workflow execution data.
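If you prefer to script the registration instead of using the web-form, the same calls can go straight to the local REST API. The sketch below assumes the container is reachable at http://localhost:8080; the endpoint paths and payloads follow my reading of the bbernhard/signal-cli-rest-api documentation, so verify them against your image version, and note that fetch availability inside a Code node depends on your n8n/Node.js version (an HTTP Request node works just as well).

```javascript
// Headless registration sketch against a local signal-cli-rest-api container.
// Endpoint paths, payload shapes, and the example values below are assumptions to verify.
const api = 'http://localhost:8080';
const number = '+15551234567';   // the bot's phone number (example)

// 1. Request registration, passing the solved captcha token from the Signal captcha page.
await fetch(`${api}/v1/register/${encodeURIComponent(number)}`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ captcha: 'signalcaptcha://...', use_voice: false }),
});

// 2. Confirm with the verification token received by SMS (example token shown).
await fetch(`${api}/v1/register/${encodeURIComponent(number)}/verify/123-456`, { method: 'POST' });

return [{ json: { registered: true } }];
```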
by Evoort Solutions
📺 Automated YouTube Video Metadata Extraction Workflow Description: This workflow leverages the YouTube Metadata API to automatically extract detailed video information from any YouTube URL. It uses n8n to automate the entire process and stores the metadata in a neatly formatted Google Docs document. Perfect for content creators, marketers, and researchers who need quick, organized YouTube video insights at scale. ⚙️ Node-by-Node Explanation 1. ✅ On Form Submission This node acts as the trigger. When a user submits a form containing a YouTube video URL, the workflow is activated. Input: YouTube Video URL Platform: Webhook or n8n Form Trigger 2. 🌐 YouTube Metadata API (HTTP Request) This node sends the video URL to the YouTube Metadata API via HTTP request. Action: GET request Headers: -H "X-RapidAPI-Key: YOUR_API_KEY" -H "X-RapidAPI-Host: youtube-metadata1.p.rapidapi.com" Endpoint Example: https://youtube-metadata1.p.rapidapi.com/video?url=YOUTUBE_VIDEO_URL Output: JSON with metadata like: Title Description Views, Likes, Comments Duration Upload Date Channel Info Thumbnails 3. 🧠 Reformat Metadata (Code Node) This node reformats the raw metadata into a clean, human-readable text block. Example Output Format: 🎬 Title: How to Build Workflows with n8n 🧾 Description: This tutorial explains how to build... 👤 Channel: n8n Tutorials 📅 Published On: 2023-05-10 ⏱️ Duration: 10 minutes, 30 seconds 👁️ Views: 45,678 👍 Likes: 1,234 💬 Comments: 210 🔗 URL: https://youtube.com/watch?v=abc123 4. 📝 Append to Google Docs This node connects to your Google Docs and appends the formatted metadata into a selected document. Document Format Example:** 📌 Video Entry – [Date] 🎬 Title: 🧾 Description: 👤 Channel: 📅 Published On: ⏱️ Duration: 👁️ Views: 👍 Likes: 💬 Comments: 🔗 URL: --- 📄 Use Cases Content Creators**: Quickly analyze competitor content or inspirations. Marketers**: Collect campaign video performance data. Researchers**: Compile structured metadata across videos. Social Media Managers**: Create content briefs effortlessly. ✅ Benefits 🚀 Time-saving: Automates manual video data extraction 📊 Accurate: Uses reliable, updated YouTube API 📁 Organized: Formats and stores data in Google Docs 🔁 Scalable: Handles unlimited YouTube URLs 🎯 User-friendly: Simple setup and clean output 🔑 How to Get Your API Key for YouTube Metadata API Go to the YouTube Metadata API on RapidAPI. Sign up or log in to your RapidAPI account. Click Subscribe to Test and choose a pricing plan (free or paid). Copy your API Key shown in the "X-RapidAPI-Key" section. Use it in your HTTP request headers. 🧰 Google Docs Integration – Full Setup Instructions 🔐 Step 1: Enable Google Docs API Go to the Google Cloud Console. Create a new project or select an existing one. Navigate to APIs & Services > Library. Search for Google Docs API and click Enable. Also enable Google Drive API (for document access). 🛠 Step 2: Create OAuth Credentials Go to APIs & Services > Credentials. Click Create Credentials > OAuth Client ID. Select Web Application or Desktop App. Add authorized redirect URIs if needed (e.g., for n8n OAuth). Save your Client ID and Client Secret. 🔗 Step 3: Connect n8n to Google Docs In n8n, go to Credentials > Google Docs API. Add new credentials using the Client ID and Secret from above. Authenticate with your Google account and allow access. 📘 Step 4: Create and Format Your Google Document Go to Google Docs and create a new document. Name it (e.g., YouTube Metadata Report). Optionally, add a title or table of contents. 
Copy the Document ID from the URL: https://docs.google.com/document/d/DOCUMENT_ID/edit 🔄 Step 5: Use Append Content to Document Node in n8n Use the Google Docs node in n8n with: Operation: Append Content Document ID: Your copied Google Doc ID Content: The formatted video summary string 🎨 Customization Options 💡 Add Tags: Insert hashtags or categories based on video topics. 📆 Organize by Date: Create headers for each day or week’s entries. 📸 Embed Thumbnails: Use thumbnail_url to embed preview images. 📊 Spreadsheet Export: Use Google Sheets instead of Docs if preferred. 🛠 Troubleshooting Tips
| Issue | Solution |
| --- | --- |
| ❌ Auth Error (Google Docs) | Ensure correct OAuth redirect URI and permissions. |
| ❌ API Request Fails | Check API key and request structure; test on RapidAPI's playground. |
| 📄 Doc Not Updating | Verify Document ID and sharing permissions. |
| 🧾 Bad Formatting | Debug the code node output using logging or console in n8n. |
| 🌐 n8n Timeout | Consider using Wait or Split In Batches for large submissions. |
🚀 Ready to Launch? You can deploy this workflow in just minutes using n8n. 👉 Start Automating with n8n
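The "Reformat Metadata" step can be as simple as a template literal in a Code node. A sketch follows; the property names on the RapidAPI response (title, channelTitle, viewCount, and so on) are assumptions, so inspect the actual JSON your endpoint returns and adjust the paths.

```javascript
// Sketch of the "Reformat Metadata" Code node.
// Property names are assumptions about the RapidAPI response shape; adjust to the real payload.
const v = $json;

const summary = [
  `🎬 Title: ${v.title}`,
  `🧾 Description: ${(v.description || '').slice(0, 200)}`,
  `👤 Channel: ${v.channelTitle}`,
  `📅 Published On: ${v.publishedAt}`,
  `⏱️ Duration: ${v.duration}`,
  `👁️ Views: ${Number(v.viewCount || 0).toLocaleString()}`,
  `👍 Likes: ${Number(v.likeCount || 0).toLocaleString()}`,
  `💬 Comments: ${Number(v.commentCount || 0).toLocaleString()}`,
  `🔗 URL: ${v.url}`,
].join('\n');

return [{ json: { summary } }];
```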
by Rakin Jakaria
Use Cases Analyze e-commerce product pages for conversion optimization, audit SaaS landing pages for signup improvements, or evaluate marketing campaign pages for better lead generation. Good to know At time of writing, Google Gemini API calls have usage costs. See Google AI Pricing for current rates. The workflow analyzes publicly accessible pages only - pages behind login walls or with restricted access won't work. Analysis quality depends on page content structure - heavily image-based pages may receive limited text-based recommendations. How it works User submits a landing page URL through the form trigger interface. The HTTP Request node fetches the complete HTML content from the target landing page. Content is converted from HTML to markdown format for cleaner AI processing and better text extraction. Google Gemini 2.5 Flash analyzes the page using expert CRO knowledge and 2024 conversion best practices. The AI generates specific, actionable recommendations based on actual page content rather than generic advice. Information Extractor processes the analysis into 5 prioritized improvement tips with relevant visual indicators. Results are delivered through a completion form showing concrete steps to improve conversion rates. How to use The form trigger is configured for direct URL submission but can be replaced with webhook triggers for integration into existing websites or apps. Multiple pages can be analyzed sequentially, though each requires a separate workflow execution. Recommendations focus on high-impact changes that don't require heavy development work. Requirements Google Gemini (PaLM) API account for AI-powered analysis Publicly accessible landing pages for analysis N8N instance with proper webhook configuration Customizing this workflow CRO analysis can be tailored for specific industries by modifying the AI system prompt - try focusing on e-commerce checkout flows, SaaS trial conversions, or local business lead capture forms. Add competitive analysis by incorporating multiple URL inputs and comparative recommendations.
by Growth AI
French Public Procurement Tender Monitoring Workflow Overview This n8n workflow automates the monitoring and filtering of French public procurement tenders (BOAMP - Bulletin Officiel des Annonces des Marchés Publics). It retrieves tenders based on your preferences, filters them by market type, and identifies relevant opportunities using keyword matching. Who is this for? Companies seeking French public procurement opportunities Consultants monitoring specific market sectors Organizations tracking government contracts in France What it does The workflow operates in two main phases: Phase 1: Automated Tender Collection Retrieves all tenders from the BOAMP API based on your configuration Filters by market type (Works, Services, Supplies) Stores complete tender data in Google Sheets Handles pagination automatically for large datasets Phase 2: Intelligent Keyword Filtering Downloads and extracts text from tender PDF documents Searches for your specified keywords within tender content Saves matching tenders to a separate "Target" sheet for easy review Tracks processing status to avoid duplicates Requirements n8n instance (self-hosted or cloud) Google account with Google Sheets access Google Sheets API credentials configured in n8n Setup Instructions Step 1: Duplicate the Configuration Spreadsheet Access the template spreadsheet: Configuration Template Click File → Make a copy Save to your Google Drive Note the URL of your new spreadsheet Step 2: Configure Your Preferences Open your copied spreadsheet and configure the Config tab: Market Types - Check the categories you want to monitor: Travaux (Works/Construction) Services Fournitures (Supplies) Search Period - Enter the number of days to look back (e.g., "30" for the last 30 days) Keywords - Enter your search terms as a comma-separated list (e.g., "informatique, cloud, cybersécurité") Step 3: Import the Workflow Copy the workflow JSON from this template In n8n, click Workflows → Import from File/URL Paste the JSON and import Step 4: Update Google Sheets Connections Replace all Google Sheets node URLs with your spreadsheet URL: Nodes to update: Get config (2 instances) Get keyword Get Offset Get All Append row in sheet Update offset Reset Offset Ok Target offre For each node: Open the node settings Update the Document ID field with your spreadsheet URL Verify the Sheet Name matches your spreadsheet tabs Step 5: Configure Schedule Triggers The workflow has two schedule triggers: Schedule Trigger1 (Phase 1 - Tender Collection) Default: 0 8 1 * * (1st day of month at 8:00 AM) Adjust based on how frequently you want to collect tenders Schedule Trigger (Phase 2 - Keyword Filtering) Default: 0 10 1 * * (1st day of month at 10:00 AM) Should run after Phase 1 completes To modify: Open the Schedule Trigger node Click Cron Expression Adjust timing as needed Step 6: Test the Workflow Manually execute Phase 1 by clicking the Schedule Trigger1 node and selecting Execute Node Verify tenders appear in your "All" sheet Execute Phase 2 by triggering the Schedule Trigger node Check the "Target" sheet for matching tenders How the Workflow Works Phase 1: Tender Collection Process Configuration Loading - Reads your preferences from Google Sheets Offset Management - Tracks pagination position for API calls API Request - Fetches up to 100 tenders per batch from BOAMP Market Type Filtering - Keeps only selected market categories Data Storage - Formats and saves tenders to the "All" sheet Pagination Loop - Continues until all tenders are retrieved Offset Reset - 
Prepares for next execution Phase 2: Keyword Matching Process Keyword Loading - Retrieves search terms from configuration Tender Retrieval - Gets unprocessed tenders from "All" sheet Sequential Processing - Loops through each tender individually PDF Extraction - Downloads and extracts text from tender documents Keyword Analysis - Searches for matches with accent/case normalization Status Update - Marks tender as processed Match Evaluation - Determines if keywords were found Target Storage - Saves relevant tenders with match details Customization Options Adjust API Parameters In the HTTP Request node, you can modify: limit: Number of records per batch (default: 100) Additional filters in the where parameter Modify Keyword Matching Logic Edit the Get query node to adjust: Text normalization (accent removal, case sensitivity) Match proximity requirements Context length around matches Change Data Format Update the Format Results node to modify: Date formatting PDF URL generation Field mappings Spreadsheet Structure Your Google Sheets should contain these tabs: Config** - Your configuration settings Offset** - Pagination tracking (managed automatically) All** - Complete tender database Target** - Filtered tenders matching your keywords Troubleshooting No tenders appearing in "All" sheet: Verify your configuration period isn't too restrictive Check that at least one market type is selected Ensure API is accessible (test the HTTP Request node) PDF extraction errors: Some PDFs may be malformed or protected Check the URL generation in Format Results node Verify PDF URLs are accessible in a browser Duplicate tenders in Target sheet: Ensure the "Ok" status is being written correctly Check the Filter node is excluding processed tenders Verify row_number matching in update operations Keywords not matching: Keywords are case-insensitive and accent-insensitive Verify your keywords are spelled correctly Check the extracted text contains your terms Performance Considerations Phase 1 processes 100 tenders per iteration with a 10-second wait between batches Phase 2 processes tenders sequentially to avoid overloading PDF extraction Large datasets (1000+ tenders) may take significant time to process Consider running Phase 1 less frequently if tender volume is manageable Data Privacy All data is stored in your Google Sheets No external databases or third-party storage BOAMP API is publicly accessible (no authentication required) Ensure your Google Sheets permissions are properly configured Support and Updates This workflow retrieves data from the BOAMP public API. If API structure changes, nodes may require updates. Monitor the workflow execution logs for errors and adjust accordingly.
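The accent- and case-insensitive keyword matching mentioned above can be implemented with Unicode normalization. Below is a minimal sketch of the comparison logic inside the keyword-analysis Code node; the pdfText and keywords input fields are assumptions about how the preceding nodes name their outputs.

```javascript
// Accent- and case-insensitive keyword matching for the extracted tender text.
const normalize = (s) =>
  s.normalize('NFD').replace(/[\u0300-\u036f]/g, '').toLowerCase();

const pdfText = normalize($json.pdfText || '');   // text extracted from the tender PDF
const keywords = ($json.keywords || 'informatique, cloud, cybersécurité')
  .split(',')
  .map((k) => normalize(k.trim()))
  .filter(Boolean);

const matches = keywords.filter((k) => pdfText.includes(k));

return [{ json: { hasMatch: matches.length > 0, matches } }];
```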
by gotoHuman
Collaborate with an AI Agent on a joint document, e.g. for creating your content marketing strategy, a sales plan, project status updates, or market analysis. The AI Agent generates markdown text that you can review and edit in gotoHuman, and only then is the existing Google Doc updated. In this example we use AI to update our company's content strategy for the next quarter. How It Works The AI Agent has access to other documents that provide enough context to write the content strategy. We ask it to generate the text in markdown format. To ensure our strategy document is not changed without our approval, we request a human review using gotoHuman. There the markdown content can be edited and properly previewed. Our workflow resumes once the review is completed. We check if the content was approved and then write the (potentially edited) markdown to our Google Docs file via the Google Drive node. How to set up Most importantly, install the verified gotoHuman node before importing this template! (Just add the node to a blank canvas before importing. Works with n8n cloud and self-hosted) Set up your credentials for gotoHuman, OpenAI, and Google Docs/Drive In gotoHuman, select and create the pre-built review template "Strategy agent" or import the ID: F4sbcPEpyhNKBKbG9C1d Select this template in the gotoHuman node Requirements You need accounts for gotoHuman (human supervision) OpenAI (Doc writing) Google Docs/Drive How to customize Let the workflow run on a schedule, or create and connect a manual trigger in gotoHuman that lets you capture additional human input to feed your agent Provide the agent with more context to write the content strategy Use the gotoHuman response (or a Google Drive file change trigger) to run additional AI agents that can execute on the new strategy
by Atta
This workflow automatically turns any YouTube video into a structured blog post with Gemini AI. By sending a simple POST request with a YouTube URL to a webhook, it downloads the video’s audio, transcribes the content, and generates a blog-ready article with a title, description, tags, and category. The final result, along with the full transcript and original video URL, is delivered to your chosen webhook or CMS. How it works: The workflow handles the entire process of transforming YouTube videos into complete blog posts using Gemini AI transcription and structured text generation. Once triggered, it: Downloads the video’s audio Transcribes the spoken content into text Generates a blog post in the same language as the video Creates: A clear and engaging title A short description Suggested category and tags The full transcript of the video The original YouTube video URL This makes it easy to repurpose video content into publish-ready articles in minutes. This template is ideal for content creators, marketers, educators, and bloggers who want to quickly turn video content into written posts without manual transcription or editing. Setup Instructions Install yt-dlp on your local machine or server where n8n runs. This is required to download YouTube audio. Get a Google Gemini API key and configure it in your AI nodes. Webhook Input Configuration: Endpoint: The workflow starts with a Webhook Trigger. Method: POST Example Request Body: { "videoUrl": "https://www.youtube.com/watch?v=lW5xEm7iSXk" } Configure Output Webhook: Add your target endpoint in the last node where the blog post JSON is sent. This could be your CMS, a Notion database, or another integration. Customization Guidance Writing Style: Update the AI Agent’s prompt to adjust tone (e.g., casual, professional, SEO-optimized). Metadata: Modify how categories and tags are generated to fit your website’s taxonomy. Integration: Swap the final webhook with WordPress, Ghost, Notion, or Slack to fit your publishing workflow. Transcript Handling: Save the full transcript separately if you also want searchable video archives.
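For the audio-download step, an Execute Command node only needs a yt-dlp invocation built from the webhook payload. A minimal sketch that prepares the command string, assuming yt-dlp is installed on the host running n8n and /tmp is writable:

```javascript
// Build the yt-dlp command for an Execute Command node.
// -x extracts audio, --audio-format mp3 converts it, -o sets the output path template.
const videoUrl = $json.body.videoUrl;
const command = `yt-dlp -x --audio-format mp3 -o "/tmp/%(id)s.%(ext)s" "${videoUrl}"`;

return [{ json: { command, videoUrl } }];
```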
by WeblineIndia
Fill iOS localization gaps from .strings → Google Sheets and PR with placeholders (GitHub) This n8n workflow automatically identifies missing translations in .strings files across iOS localizations (e.g., Base.lproj vs fr.lproj) and generates a report in Google Sheets. Optionally, it creates a GitHub PR to insert placeholder strings ("TODO_TRANSLATE") so builds don't fail. Supports DRY_RUN mode. Who’s it for iOS teams who want fast feedback on missing translations. Localization managers who want a shared sheet to assign work to translators. How it works A GitHub Webhook triggers on push or pull request. The iOS repo is scanned for .strings files under Base.lproj or en.lproj and their target-language counterparts. It compares keys and identifies what’s missing (a minimal sketch of this comparison appears after the use case examples below). A new or existing Google Sheet tab (e.g., fr) is updated with missing entries. If enabled, it creates a GitHub PR with placeholder keys (e.g., "TODO_TRANSLATE"). How to set up Import the Workflow JSON into your n8n instance. Set Config Node values like:

```json
{
  "GITHUB_OWNER": "your-github-user-name",
  "GITHUB_REPO": "your-iOS-repo-name",
  "BASE_BRANCH": "develop",
  "SHEET_ID": "<YOUR_GOOGLE_SHEET_ID>",
  "ENABLE_PR": "true",
  "IOS_SOURCE_GLOB": "/Base.lproj/*.strings,/en.lproj/*.strings",
  "IOS_TARGET_GLOB": "*/.lproj/*.strings",
  "PLACEHOLDER_VALUE": "TODO_TRANSLATE",
  "BRANCH_TEMPLATE": "chore/l10n-gap-{{YYYYMMDD}}"
}
```

Create GitHub Webhook URL: https://your-n8n-instance/webhook/l10n-gap-ios Content-Type: application/json Trigger on: Push, Pull Request Connect credentials GitHub token with repo scope Google Sheets API (Optional) Slack OAuth + SMTP Requirements
| Tool | Needed For | Notes |
| --- | --- | --- |
| GitHub Repo | Webhook, API for PRs | repo token or App |
| Google Sheets | Sheet output | Needs valid SHEET_ID or create-per-run |
| Slack (optional) | Notifications | chat:write scope |
| SMTP (optional) | Email fallback | Standard SMTP creds |
How to customize Multiple Locales: Add comma-separated values to TARGET_LANGS_CSV (e.g., fr,de,es). Globs: Adjust IOS_SOURCE_GLOB and IOS_TARGET_GLOB to scan only certain modules or file patterns. Ignore Rules: Add IGNORE_KEY_PREFIXES_CSV to skip certain internal/debug strings. Placeholder Value: Change PLACEHOLDER_VALUE to something meaningful like "@@@". Slack/Email: Set SLACK_CHANNEL and EMAIL_FALLBACK_TO_CSV appropriately. DRY_RUN: Set to true to skip GitHub PR creation but still update the sheet. Add-ons Android support: Add a second path for strings.xml (values → values-<lang>), same diff → Sheets → placeholder PR. Multiple languages at once: Expand TARGET_LANGS_CSV and loop tabs + placeholder commits per locale. .stringsdict handling: Validate plural/format entries and open a precise PR. Translator DMs: Provide a LANG → Slack handle/email map to DM translators with their specific file/key counts. GitLab/Bitbucket variants: Replace GitHub API calls with GitLab/Bitbucket equivalents to open Merge Requests. Use Case Examples Before a test build, ensure fr has all keys present—placeholders keep the app compiling. Weekly run creates a single sheet for translators and a PR with placeholders, avoiding last-minute breakages. A new screen adds 12 strings; the bot flags and pre-fills them across locales.
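The key-comparison step relies on the standard "key" = "value"; format of .strings files. Below is a minimal sketch of the parsing and diff logic; the baseFileContent/targetFileContent field names are assumptions, and a production version would also need to handle comments and multi-line values.

```javascript
// Parse "key" = "value"; pairs from a .strings file and list keys missing in the target locale.
const parseStrings = (content) => {
  const keys = new Set();
  const re = /^\s*"((?:\\.|[^"\\])+)"\s*=\s*".*";/gm;
  let m;
  while ((m = re.exec(content || '')) !== null) keys.add(m[1]);
  return keys;
};

const baseKeys = parseStrings($json.baseFileContent);     // e.g. Base.lproj/Localizable.strings
const targetKeys = parseStrings($json.targetFileContent); // e.g. fr.lproj/Localizable.strings (may be empty)

const missing = [...baseKeys].filter((k) => !targetKeys.has(k));
const placeholders = missing.map((k) => `"${k}" = "TODO_TRANSLATE";`).join('\n');

return [{ json: { missing, placeholders } }];
```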
Common troubleshooting
| Issue | Possible Cause | Solution |
| --- | --- | --- |
| No source files found | Glob doesn't match Base.lproj or en.lproj | Adjust IOS_SOURCE_GLOB |
| Target file missing | fr.lproj doesn’t exist yet | Will be created in placeholder PR |
| Parsing skips entries | Non-standard string format in file | Ensure proper .strings format "key" = "value"; |
| Sheet not updating | SHEET_ID missing or insufficient permission | Add valid ID or allow write access |
| PR not created | ENABLE_PR=false or no missing keys | Enable PR and ensure at least one key is missing |
| Slack/Email not received | Missing credentials or config | Configure Slack/SMTP properly and set recipient fields |
Need Help? Want to expand this for Android? Loop through 5+ locales at once? Or replace GitHub with GitLab? Contact our n8n Team at WeblineIndia with your repo & locale setup and we’ll help tailor it to your translation workflow!
by AFK Crypto
Try It Out! 🚀 Reddit Crypto Intelligence & Market Spike Detector ⸻ 🧠 Workflow Description Reddit Crypto Intelligence & Market Spike Detector is an automated market sentiment and price-monitoring workflow that connects social chatter with real-time crypto price analytics. It continuously scans new posts from r/CryptoCurrency, extracts recently mentioned coins, checks live price movements via CoinGecko, and alerts you on Discord when a significant spike or drop occurs. This automation empowers traders, analysts, and communities to spot early market trends before they become mainstream — all using free APIs and open data. ⸻ ⚙️ How It Works Monitor Reddit Activity ◦ Automatically fetches the latest posts from r/CryptoCurrency using Reddit’s free RSS feed. ◦ Captures trending titles, post timestamps, and mentions of coins or tokens (e.g., $BTC, $ETH, $SOL, $PEPE). Extract Coin Mentions ◦ A Code Node parses the feed using regex (\$[A-Za-z0-9]{2,10}) to identify any symbols or tickers discussed. ◦ Removes duplicates and normalizes all results for accurate data mapping. Fetch Market Data ◦ Each detected coin symbol is matched with CoinGecko’s public API to fetch live market data, including current price, market rank, and 24-hour price change. ◦ No API key required — completely free and reliable source. Detect Market Movement ◦ A second Code Node filters the fetched data to identify price movements greater than ±5% within the last 24 hours. ◦ This helps isolate meaningful market action from routine fluctuations. Generate and Send Alerts ◦ When a spike or dip is detected, the workflow composes a rich alert message including: ▪ 💎 Coin name and symbol ▪ 💰 Current price ▪ 📈 24h percentage change ▪ 🕒 Timestamp of detection ◦ The message is sent automatically to your Discord channel using a preconfigured webhook. ⸻ 💬 Example Output 🚨 Crypto Reddit Mention & Price Spike Alert! 🚨 💎 ETHEREUM (ETH) 💰 $3,945.23 📈 Change: +6.12% 💎 SOLANA (SOL) 💰 $145.88 📈 Change: +8.47% 🕒 Checked at: 2025-10-31T15:00:00Z If no coins cross the ±5% threshold: “No price spikes detected in the latest Reddit check.” 🔔 #MarketIntel #CryptoSentiment #PriceAlert ⸻ 🪄 Key Features • 🧠 Social + Market Intelligence – Combines Reddit sentiment with live market data to detect potential early signals. • 🔎 Automated Coin Detection – Dynamically identifies newly discussed tokens from live posts. • 📊 Smart Spike Filtering – Highlights only meaningful movements above configurable thresholds. • 💬 Discord Alerts – Delivers clear, structured, and timestamped alerts to your community automatically. • ⚙️ Fully No-Cost Stack – Utilizes free Reddit and CoinGecko APIs with no authentication required. ⸻ 🧩 Use Cases • Crypto Traders: Detect early hype or momentum shifts driven by social chatter. • Analysts: Automate social sentiment tracking tied directly to live market metrics. • Community Managers: Keep members informed about trending coins automatically. • Bots & AI Assistants: Integrate this logic to enhance automated trading signals or alpha alerts. ⸻ 🧰 Required Setup • Discord Webhook URL – For automatic alert posting. • (Optional) CoinGecko API endpoint (no API key required). • n8n Instance – Self-hosted or Cloud; free tier is sufficient. • Workflow Schedule – Recommended: hourly (Cron Node interval = 1 hour). ⸻ AFK Crypto Website: afkcrypto.com
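Both Code nodes described above come down to a few lines. Here is a sketch of the ticker extraction (using the regex quoted above) and the ±5% spike filter; the CoinGecko field names follow its /coins/markets response, and the threshold constant is the configurable value mentioned earlier.

```javascript
// Sketch of the two Code nodes; in the workflow they run as separate nodes.

// Node 1: extract and deduplicate $TICKER mentions from the Reddit RSS items.
function extractTickers(items) {
  const titles = items.map((item) => item.json.title || '');
  return [...new Set(
    titles.flatMap((t) => t.match(/\$[A-Za-z0-9]{2,10}/g) || [])
          .map((t) => t.slice(1).toUpperCase())
  )];
}

// Node 2: keep only coins whose 24h move crosses the +/-5% threshold.
// Assumes CoinGecko /coins/markets items (current_price, price_change_percentage_24h).
function filterSpikes(coins, threshold = 5) {
  return coins.filter((c) => Math.abs(c.price_change_percentage_24h || 0) >= threshold);
}

// In the first Code node:
return extractTickers($input.all()).map((symbol) => ({ json: { symbol } }));
```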