by Rapiwa
**Who is this for?**

This workflow is for online store owners, support teams, and marketing staff who want to automatically verify WhatsApp numbers and send order invoice links or personalized order updates to customers. It is built against WooCommerce order webhooks but can be adapted to Shopify or other e-commerce platforms that provide `billing` and `line_items` data.

**What this Workflow Does**

- Receives order events (Webhook / WooCommerce `order.updated`).
- Normalizes the payload into a compact object, `{ data: { customer, products, invoice_link } }`, via a Code node.
- Iterates items in batches (SplitInBatches) to control throughput.
- Cleans phone numbers (removes non-digits) and verifies WhatsApp registration using Rapiwa (`/api/verify-whatsapp`).
- Sends templated WhatsApp messages through Rapiwa (`/api/send-message`) for verified numbers.
- Logs every attempt into Google Sheets: one sheet for verified & sent rows, another for unverified & not sent rows.
- Uses a Wait node to throttle and loop back into the batch processor.

**Key Features**

- Trigger-based automation (Webhook or WooCommerce trigger).
- Payload normalization and mapping via JavaScript Code nodes.
- Controlled batching (SplitInBatches) to avoid rate limits.
- Pre-send verification of WhatsApp numbers using Rapiwa.
- Conditional branching with the IF node to separate verified vs unverified flows.
- Personalized message templates that pull customer and product fields from the mapped data.
- Logging and audit trail stored in Google Sheets (two separate append flows).

**How to Use — Step-by-step Setup**

1. Add credentials in n8n:
   - Rapiwa: create an HTTP Bearer credential and paste your Bearer token (example name used in the flow: "Rapiwa Bearer Auth").
   - Google Sheets: create an OAuth2 credential (example: "Google Sheets").
   - WooCommerce: add WooCommerce API credentials for the trigger (or configure Shopify credentials if adapting).
2. Import / configure nodes in n8n:
   - Webhook (or WooCommerce Trigger): receives order payloads. An example webhook path is present in the exported flow.
   - Code node "Format Webhook Response Data": maps `body.billing`, `body.line_items`, and `body.payment_url` into `{ data: { customer, products, invoice_link } }`.
   - Code node "Clean WhatsApp Number": ensures the phone number is a string and strips non-digits: `String(rawNumber).replace(/\D/g, "")` (see the Code node sketch after this section).
   - HTTP Request "Check valid whatsapp number Using Rapiwa": POST to `https://app.rapiwa.com/api/verify-whatsapp` with `{ number }`. Use the Rapiwa Bearer credential.
   - IF "If": checks the verification result. The exported flow compares `{{$json.data.exists}}` to `"true"`; normalize types if your API returns booleans.
   - HTTP Request "Rapiwa Sender": POST to `https://app.rapiwa.com/api/send-message` with `number`, `message_type: 'text'`, and a templated message (see the message template in the flow).
   - Google Sheets "Store State of Rows in Verified & Sent" and "Store State of Rows in Unverified & Not Sent": append the logging rows.

**Google Sheet Column Structure**

Create these columns exactly (the Google Sheets nodes in the flow expect these names). A Google Sheet formatted like this ➤ sample:

| Name | Number | Email | Address | Product Title | Product ID | Size | Quantity | Total Price | Product Image | Invoice Link | Product Status | Validity | Status |
|------|--------|-------|---------|---------------|------------|------|----------|-------------|---------------|--------------|----------------|----------|--------|
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | Air Force 1 Reigning Champ Dark Grey 1:1 - 40 | 251 | 40 | 1 | BDT 5800.00 | | Invoice | on-hold | verified | sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | Air Force 1 Reigning Champ Dark Grey 1:1 - 40 | 251 | 40 | 1 | BDT 5800.00 | | Invoice | on-hold | unverified | not sent |

**Customization Ideas**

- Adapt the Code mapping node for Shopify payloads or other marketplaces.
- Iterate and include multiple products in the message instead of using `products[0]`.
- Add filters in the Code node (e.g., only process orders with total > 5000).
- Add fallback channels (SMS or email) for unverified numbers.
- Persist logs into a database for analytics and retention beyond Google Sheets.
- Add admin notifications (Slack, email) at the end of each run.

**Useful Links**

- **Dashboard**: https://app.rapiwa.com
- **Official Website**: https://rapiwa.com
- **Documentation**: https://docs.rapiwa.com

**Support & Help**

- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
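The two Code nodes are only described in prose above. Below is a minimal JavaScript sketch of what they could look like; the WooCommerce field names (`billing.first_name`, `line_items`, `payment_url`) follow the standard order payload described above, but the exact output shape is an assumption for illustration, not the template's actual code.

```javascript
// "Format Webhook Response Data" — minimal sketch for a WooCommerce order payload.
// Maps body.billing, body.line_items and body.payment_url into { data: { customer, products, invoice_link } }.
const body = $input.first().json.body || $input.first().json;

const customer = {
  name: `${body.billing?.first_name || ''} ${body.billing?.last_name || ''}`.trim(),
  phone: body.billing?.phone || '',
  email: body.billing?.email || '',
  address: body.billing?.address_1 || '',
};

const products = (body.line_items || []).map(item => ({
  title: item.name,
  id: item.product_id,
  quantity: item.quantity,
  total: item.total,
}));

return [{ json: { data: { customer, products, invoice_link: body.payment_url || '' } } }];
```

```javascript
// "Clean WhatsApp Number" — strip every non-digit character before verification.
const data = $input.first().json.data;
const cleaned = String(data.customer.phone).replace(/\D/g, '');
return [{ json: { data, cleanedNumber: cleaned } }];
```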
by Daniel Shashko
**How it Works**

This workflow automatically monitors your Google Ads campaigns every day, analyzing performance with AI-powered scoring to identify scaling opportunities and catch issues before they drain your budget.

Each morning at 9 AM, it fetches all active campaign data including clicks, impressions, conversions, costs, and conversion rates from your Google Ads account. The AI analysis engine evaluates four critical dimensions: CTR (click-through rate) to measure ad relevance, conversion rate to assess landing page effectiveness, cost per conversion to evaluate profitability, and traffic volume to identify scale-readiness. Each campaign receives a performance score (0-100 points) and is automatically categorized as Excellent (75+), Good (55-74), Fair (35-54), or Underperforming (0-34).

High-performing campaigns trigger instant Slack alerts to your PPC team with detailed scaling recommendations and projected ROI improvements, while underperforming campaigns generate urgent alerts with specific optimization actions. Every campaign is logged to your Google Sheets dashboard with daily metrics, and the system generates personalized email reports: action-oriented scaling plans for top performers and troubleshooting guides for campaigns needing attention.

The entire analysis takes minutes, providing your team with daily intelligence reports that would otherwise require hours of manual spreadsheet work and data analysis.

**Who is this for?**

- PPC managers and paid media specialists drowning in campaign data and manual reporting
- Marketing agencies managing multiple client accounts needing automated performance monitoring
- E-commerce brands running high-spend campaigns who can't afford budget waste
- Growth teams looking to scale winners faster and pause losers immediately
- Anyone spending $5K+ monthly on Google Ads who needs data-driven optimization decisions

**Setup Steps**

- **Setup time**: approx. 15-25 minutes (credential configuration, dashboard setup, alert customization)
- **Requirements**: Google Ads account with active campaigns, Google account with a tracking spreadsheet, Slack workspace, SMTP email provider (Gmail, SendGrid, etc.)

1. Create a Google Sheets dashboard with two tabs, "Daily Performance" and "Campaign Log", with appropriate column headers.
2. Set up these nodes:
   - Schedule Daily Check: pre-configured to run at 9 AM daily (adjust timing if needed).
   - Fetch Google Ads Data: connect your Google Ads account and authorize API access.
   - AI Performance Analysis: review scoring thresholds (CTR, conversion rate, cost benchmarks).
   - Route by Performance: automatically splits campaigns into high performers vs. issues.
   - Update Campaign Dashboard: connect Google Sheets and select your "Daily Performance" tab.
   - Log All Campaigns: select your "Campaign Log" tab for historical tracking.
   - Slack Alerts: connect your workspace and configure separate channels for scaling opportunities and performance issues.
   - Generate Action Plan: customize email templates with your brand voice and action items.
   - Email Performance Report: configure SMTP and set recipient email addresses.
3. Credentials must be entered into their respective nodes for successful execution.

**Customization Guidance**

- **Scoring Weights**: Adjust point values for CTR (30), conversion rate (35), cost efficiency (25), and volume (10) in the AI Performance Analysis node based on your business priorities (a scoring sketch follows at the end of this section).
- **Performance Thresholds**: Modify the 75-point Excellent threshold and 55-point Good threshold to match your campaign quality distribution and industry benchmarks.
- **Benchmark Values**: Update CTR benchmarks (5% excellent, 3% good, 1.5% average) and conversion rate targets (10%, 5%, 2%) for your industry.
- **Alert Channels**: Create separate Slack channels for different alert types, or route critical alerts to Microsoft Teams, Discord, or SMS via Twilio.
- **Email Recipients**: Configure different recipient lists for scaling alerts (executives, growth team) vs. optimization alerts (campaign managers).
- **Schedule Frequency**: Change from daily to hourly monitoring for high-spend campaigns, or weekly for smaller accounts.
- **Additional Platforms**: Duplicate the workflow structure for Facebook Ads, Microsoft Ads, or LinkedIn Ads with platform-specific nodes.
- **Budget Controls**: Add nodes to automatically pause campaigns exceeding cost thresholds or adjust bids based on performance scores.

Once configured, this workflow will continuously monitor your ad spend, identify opportunities worth thousands in additional revenue, and alert you to issues before they waste your budget, transforming manual reporting into automated intelligence.

Built by Daniel Shashko. Connect on LinkedIn.
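Whether the template implements its analysis in a Code node or an AI prompt, the sketch below only illustrates how the documented weights (30/35/25/10), benchmarks, and 75/55/35 thresholds could combine into a single score. The partial point values, cost-per-conversion cutoffs, and field names are illustrative assumptions, not the template's actual logic.

```javascript
// Hypothetical scoring sketch: 0-100 points from CTR (30), conversion rate (35),
// cost efficiency (25) and volume (10), then a category from the 75/55/35 thresholds.
function scoreCampaign(c) {
  let score = 0;
  // CTR benchmarks from the description: 5% excellent, 3% good, 1.5% average.
  if (c.ctr >= 5) score += 30; else if (c.ctr >= 3) score += 20; else if (c.ctr >= 1.5) score += 10;
  // Conversion rate targets: 10% / 5% / 2%.
  if (c.conversionRate >= 10) score += 35; else if (c.conversionRate >= 5) score += 22; else if (c.conversionRate >= 2) score += 10;
  // Cost per conversion: lower is better (cutoffs are illustrative assumptions).
  if (c.costPerConversion <= 20) score += 25; else if (c.costPerConversion <= 50) score += 15; else if (c.costPerConversion <= 100) score += 5;
  // Traffic volume: enough clicks to be scale-ready (illustrative cutoffs).
  if (c.clicks >= 1000) score += 10; else if (c.clicks >= 200) score += 5;

  const category = score >= 75 ? 'Excellent' : score >= 55 ? 'Good' : score >= 35 ? 'Fair' : 'Underperforming';
  return { ...c, score, category };
}

// n8n Code node ("Run Once for All Items"): score every campaign item.
return $input.all().map(item => ({ json: scoreCampaign(item.json) }));
```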
by vinci-king-01
**How it works**

This workflow automatically processes bank statements from various formats and extracts structured transaction data with intelligent categorization using AI.

**Key Steps**

1. **File Upload** - Accepts bank statements via webhook upload (PDF, Excel, CSV formats).
2. **Smart Format Detection** - Automatically routes files to the appropriate processor (PDF text extraction or spreadsheet parsing).
3. **AI-Powered Extraction** - Uses GPT-4 to extract account details, transactions, and balances from statement data.
4. **Data Processing & Categorization** - Cleans, validates, and automatically categorizes transactions into expense categories (see the sketch after the feature list below).
5. **Database Storage** - Saves processed data to a PostgreSQL database for analysis and reporting.
6. **API Response** - Returns a structured summary with transaction counts, expense totals, and category breakdowns.

**Set up steps**

Setup time: 8-12 minutes

1. **Configure OpenAI credentials** - Add your OpenAI API key for AI-powered data extraction.
2. **Set up PostgreSQL database** - Connect your PostgreSQL database and create the required table structure.
3. **Configure webhook endpoint** - The workflow provides an `/upload-statement` endpoint for file uploads.
4. **Customize transaction categories** - Modify the AI prompt to include your preferred expense categories.
5. **Test the workflow** - Upload a sample bank statement to verify the extraction and categorization process.
6. **Set up database table** - Ensure your PostgreSQL database has a `bank_statements` table with appropriate columns.

**Features**

- **Multi-format support**: PDF, Excel, CSV bank statements
- **AI-powered extraction**: GPT-4 extracts account details and transactions
- **Automatic categorization**: Expenses categorized as groceries, dining, gas, shopping, utilities, healthcare, entertainment, income, fees, or other
- **Data validation**: Cleans and validates transaction data with error handling
- **Database storage**: PostgreSQL integration for data persistence
- **API responses**: Clean JSON responses with transaction summaries and category breakdowns
- **Smart routing**: Automatic format detection and appropriate processing paths
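The cleaning and validation step is only described above. A minimal JavaScript sketch of what such a Code node could do follows; the field names (`date`, `description`, `amount`, `category`) and the fallback to `other` are assumptions based on the category list above, not the template's actual schema.

```javascript
// Hypothetical validation/categorization sketch for transactions extracted by the AI step.
const ALLOWED = ['groceries', 'dining', 'gas', 'shopping', 'utilities',
                 'healthcare', 'entertainment', 'income', 'fees', 'other'];

const transactions = $input.first().json.transactions || [];

const cleaned = transactions
  .map(t => ({
    date: String(t.date || '').trim(),
    description: String(t.description || '').replace(/\s+/g, ' ').trim(),
    amount: Number.parseFloat(t.amount),
    // Force the AI-suggested category into the allowed set, defaulting to "other".
    category: ALLOWED.includes(String(t.category || '').toLowerCase())
      ? String(t.category).toLowerCase()
      : 'other',
  }))
  // Drop rows with no usable amount.
  .filter(t => Number.isFinite(t.amount));

return [{ json: { transactions: cleaned, count: cleaned.length } }];
```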
by Intuz
This n8n template from Intuz delivers a complete, automated solution to streamline your development workflow for a single repository. By embedding specific keywords and a JIRA issue ID in your git commit messages, the workflow automatically creates a pull request in GitHub and simultaneously updates the corresponding JIRA ticket. This seamless integration eliminates manual steps and keeps your project management in sync with your codebase.

**How it works**

This workflow acts as a bridge between your Git repository and your project management tools, driven entirely by the structure of your commit messages.

1. **GitHub Webhook Trigger**: The workflow starts when a developer pushes a new commit to a specified repository in GitHub.
2. **Parse Commit Message**: A Code node extracts key information from the commit message (see the parsing sketch after this section):
   - The JIRA issue key (e.g., FF-1196).
   - The base branch for the PR (e.g., development).
   - Action commands like `[auto-pr]` and `[taskcompleted]`.
3. **Conditional PR Creation**: An IF node checks whether the `[auto-pr]` command is present.
   - If yes, it uses the GitHub node to automatically create a pull request from the developer's branch to the specified base branch.
   - If no, this step is skipped, allowing multiple commits before a PR is made.
4. **Conditional JIRA Update**: Another IF node checks for the `[taskcompleted]` command.
   - If yes, it uses the JIRA node to transition the corresponding issue to your "Done" status (e.g., "Task Completed" or "In Review").
   - If no, the JIRA issue remains in its current state, which is ideal for work-in-progress commits.

**How to Use: Quick Start Guide**

1. Click the "Use Template" button to import this workflow into your n8n instance.
2. Configure the GitHub Trigger:
   - Open the "GitHub Push Trigger" node. It will display a unique webhook URL. Copy this URL.
   - In your GitHub repository, go to Settings > Webhooks > Add webhook.
   - Paste the URL into the Payload URL field.
   - Set the Content type to `application/json`.
   - Under "Which events would you like to trigger this webhook?", select "Just the push event".
   - Click Add webhook.
3. Connect your accounts:
   - GitHub: select your GitHub API credential in the "Create Pull Request" node.
   - JIRA: select your JIRA API credential in the "Update JIRA Issue Status" node.
4. Customize the JIRA transition (important):
   - Open the "Update JIRA Issue Status" node.
   - In the Transition parameter, set the specific status you want to move the issue to (e.g., 'Done', 'Completed', 'In Review'). You can use the ID or the exact name of the transition from your JIRA project's workflow.
5. Activate the workflow: save your changes and activate the workflow. You're ready to automate!

**Example Commit Message**

`git commit -m "FF-1196 Implement OAuth login [auto-pr,development,taskcompleted]"`

**Key Requirements to Use Template**

- An active n8n instance.
- A GitHub account with repository admin permissions to create webhooks.
- A JIRA Cloud account with permissions to update issues.
- Developers who can follow the specified git commit message format.

**Connect with us**

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz
- For custom workflow automation, click here: Get Started
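The description explains what the Parse Commit Message Code node extracts but not how. Below is a minimal JavaScript sketch of one way to parse the documented format (`FF-1196 ... [auto-pr,development,taskcompleted]`). The regexes and output field names (`issueKey`, `baseBranch`, `autoPr`, `taskCompleted`) are assumptions for illustration, not the template's actual code; the commit message location (`head_commit.message`) follows GitHub's push webhook payload.

```javascript
// Hypothetical parser for commit messages shaped like:
//   "FF-1196 Implement OAuth login [auto-pr,development,taskcompleted]"
const message = $input.first().json.body?.head_commit?.message || '';

// JIRA issue keys look like PROJECT-123.
const issueKey = (message.match(/\b[A-Z][A-Z0-9]+-\d+\b/) || [null])[0];

// The bracketed block carries the action commands and the base branch.
const bracket = (message.match(/\[([^\]]+)\]/) || [null, ''])[1];
const tokens = bracket.split(',').map(t => t.trim()).filter(Boolean);

const autoPr = tokens.includes('auto-pr');
const taskCompleted = tokens.includes('taskcompleted');
// Anything that is not a known command is treated as the base branch name.
const baseBranch = tokens.find(t => !['auto-pr', 'taskcompleted'].includes(t)) || 'development';

return [{ json: { issueKey, baseBranch, autoPr, taskCompleted, message } }];
```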
by PDF Vector
**Overview**

Transform your contract management process with this enterprise-grade workflow that handles the complete contract lifecycle, from initial intake through execution, monitoring, and renewal. This comprehensive solution combines AI-powered contract analysis with automated risk scoring, clause comparison, obligation tracking, and proactive alerts. It integrates with multiple data sources, including email, SharePoint, and contract CLM systems, and creates a centralized contract intelligence hub that prevents revenue leakage, ensures compliance, and accelerates deal velocity.

**What You Can Do**

This advanced workflow orchestrates a complete contract management ecosystem. It monitors multiple channels (email, Google Drive, SharePoint, APIs) for new contracts and amendments, extracts and analyzes over 50 contract data points using AI, performs multi-dimensional risk assessment across legal, financial, and operational factors, compares clauses against your approved template library, tracks all obligations and key dates with automated reminders, integrates with Salesforce/CRM for deal alignment, routes contracts through dynamic approval workflows based on risk scores, generates executive dashboards with contract analytics, and maintains a searchable repository with version control. The system handles complex scenarios including multi-party agreements, framework contracts with statements of work, international contracts requiring jurisdiction analysis, and M&A due diligence requiring bulk contract review.

**Who It's For**

Designed for enterprise legal operations teams managing thousands of contracts annually, procurement departments negotiating complex vendor agreements, contract managers overseeing multi-million-dollar portfolios, compliance teams ensuring regulatory adherence across jurisdictions, sales operations needing faster contract turnaround, and C-suite executives requiring contract intelligence for strategic decisions. It is essential for organizations in regulated industries (healthcare, finance, government) and companies undergoing digital transformation of their legal operations.

**The Problem It Solves**

Manual contract management creates massive operational risks and inefficiencies. Organizations typically have contracts scattered across emails, shared drives, and filing cabinets with no central visibility. This leads to missed renewal deadlines costing 5-10% of contract value, unauthorized contract variations creating compliance risks, obligation failures resulting in penalties and damaged relationships, and an inability to leverage favorable terms across similar contracts. Studies show that inefficient contract management costs organizations up to 9% of annual revenue. This workflow creates a single source of truth for all contracts, automates tracking and compliance, and provides predictive insights to prevent issues before they occur.
**Setup Instructions**

1. **Multi-Channel Integration**: Configure connectors for email (Office 365/Gmail), Google Drive, SharePoint, and contract management systems
2. **PDF Vector Setup**: Install the PDF Vector node and configure the API with enterprise rate limits
3. **Database Configuration**: Set up PostgreSQL/MySQL for the contract repository with proper indexing
4. **Template Library**: Upload your standard contract templates and approved clause library
5. **Risk Framework**: Configure the risk scoring matrix for your industry (legal, financial, operational risks); see the risk scoring sketch after this section
6. **Approval Matrix**: Define approval routing based on contract value, type, and risk score
7. **CRM Integration**: Connect to Salesforce/HubSpot for opportunity and account alignment
8. **Notification Setup**: Configure Slack/Teams channels and email distribution lists
9. **Dashboard Creation**: Set up Tableau/PowerBI connectors for executive reporting
10. **Security Configuration**: Enable encryption, audit logging, and role-based access controls

**Key Features**

- **Intelligent Intake System**: Monitor email attachments, shared folders, CRM uploads, and API submissions
- **Advanced AI Extraction**: Extract 50+ data points including nested obligations and conditional terms
- **Multi-Dimensional Risk Scoring**: Analyze legal, financial, operational, and reputational risks
- **Clause Library Comparison**: Compare against approved templates and flag deviations
- **Obligation Management**: Track deliverables, milestones, and SLAs with automated alerts
- **Dynamic Approval Routing**: Route based on AI risk score, contract value, and deviation analysis
- **Version Control & Redlining**: Track all changes and maintain a complete audit trail
- **Salesforce Integration**: Sync contract data with opportunities and accounts
- **Predictive Analytics**: Forecast renewal likelihood and negotiation outcomes
- **Bulk Processing**: Handle M&A due diligence with parallel processing of hundreds of contracts
- **Multi-Language Support**: Process contracts in 15+ languages with automatic translation
- **Executive Dashboards**: Real-time visibility into contract portfolio and risk exposure

**Customization Options**

Implement industry-specific modules for healthcare (BAAs, DPAs), financial services (ISDAs, loan agreements), technology (SaaS, licensing), or government contracting. Add AI models trained on your historical contracts for better extraction accuracy. Create custom risk factors for emerging regulations like AI governance or ESG compliance. Build integration with specific CLM systems (Ironclad, Docusign CLM, Icertis). Implement advanced analytics including contract similarity scoring, win-rate analysis by clause variations, and automatic playbook generation. Add blockchain integration for smart contract execution, and configure automated contract assembly for standard agreements.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
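The risk framework step above configures a scoring matrix but does not show one. Below is a minimal, hypothetical JavaScript sketch of how weighted legal/financial/operational/reputational scores could be combined into a single risk score and an approval route; the weights, thresholds, route names, and field names are illustrative assumptions, not the template's actual configuration.

```javascript
// Hypothetical multi-dimensional risk scoring and approval routing sketch.
const WEIGHTS = { legal: 0.35, financial: 0.30, operational: 0.20, reputational: 0.15 };

const contract = $input.first().json;          // expects per-dimension scores of 0-100
const dims = contract.riskScores || {};        // e.g. { legal: 40, financial: 70, ... }

// Weighted average across dimensions (missing dimensions count as 0).
const riskScore = Object.entries(WEIGHTS)
  .reduce((sum, [dim, w]) => sum + w * (dims[dim] || 0), 0);

// Route by risk and contract value (illustrative thresholds).
let approvalRoute = 'standard';
if (riskScore >= 70 || contract.value >= 1_000_000) approvalRoute = 'legal-and-executive';
else if (riskScore >= 40 || contract.value >= 100_000) approvalRoute = 'legal-review';

return [{ json: { ...contract, riskScore: Math.round(riskScore), approvalRoute } }];
```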
by Risper
**🤖 AI-Powered Appointment Scheduling with Google Calendar & Sheets Virtual Receptionist**

Automate customer conversations with an AI-powered virtual receptionist. This workflow can chat naturally with clients, answer general business questions (like services, location, and hours), check availability in Google Calendar, book appointments, and save customer details in Google Sheets. Fully customizable for any business type — salons, clinics, agencies, consultants, and more.

**📖 How It Works**

1. **Welcome the customer** - When the customer says hi, the AI greets warmly: "Hello! I'm [AI name] from [Business name]."
2. **Answer general questions** - Provides instant replies about services, pricing, business location, hours, and availability.
3. **Understand their need** - Identifies the service requested and preferred time.
4. **Check availability** - Queries Google Calendar for open slots (a free-slot sketch is included at the end of this template description).
5. **Gather customer details** - Collects name, phone, and email (optional).
6. **Confirm booking** - Creates the appointment in Google Calendar.
7. **Save records** - Logs the booking and customer info into Google Sheets.

**⚙️ Setup Steps (Quick)**

1. Connect your Google Calendar and Google Sheets accounts.
2. Add your business details (name, type, services, hours, policies) to the Business Info Sheet.
3. Configure your OpenAI API key (or use n8n free credits).
4. Optional: Connect Twilio WhatsApp for direct chat responses.

**🏢 Example Business Info (Google Sheet)**

| business_id | business_name | business_type | location | phone | email | services | calendar_id | timezone | currency | working_hours | ai_name | ai_personality | ai_role | emergency_available | booking_advance_days | cancellation_hours |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 001 | Luxe Hair Studio | Hair & Beauty Salon | 123 Main Street, New York, NY 10001 | 1 (XXX) XXX-XXXX | yourbusiness@email.com | "Haircut & Styling (60 minutes, $3500…) Hair Coloring (120 minutes, $8000…) …" | calendar-id-here | GMT -3 | USD | Mon–Sat: 9:00 AM – 7:00 PM, Sun: Closed | bella | Friendly, Stylish, Professional | Manages bookings, answers FAQs, recommends services, gives beauty tips, sends reminders, etc. | no | 10 | 24 |

✅ Purpose: Supplies context (services, pricing, hours, AI personality, booking policies).

💡 The AI uses this sheet to answer general business questions (e.g., "Where are you located?", "Do you do hair colouring?", "What are your working hours?").

**📊 Appointments Sheet Example**

| client_number | client_name | event_id | summary | services |
|---|---|---|---|---|
| 001 | Sarah Lee | evt-10293 | Appointment with Sarah Lee – Haircut & Styling | Haircut & Styling |
| 002 | John Smith | evt-10294 | Appointment with John Smith – Highlights | Highlights |

✅ Purpose: Logs confirmed bookings with service details and links back to Google Calendar.
**💡 Features**

- ✅ AI receptionist with conversation memory
- ✅ Answers FAQs – location, services, hours, pricing
- ✅ Google Calendar integration for real-time availability
- ✅ Google Sheets integration for customer records & reporting
- ✅ Customizable AI name, role, and personality

**🔑 Who It's For**

- **Salons & Spas** – Manage bookings and FAQs
- **Clinics & Health Services** – Automated scheduling + patient info
- **Agencies & Consultants** – Answer inquiries + schedule meetings
- **Any Service Business** – Save time, improve customer experience
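The "Check availability" step queries Google Calendar for open slots, but the slot logic itself is not shown above. Below is a minimal JavaScript sketch of how a Code node could turn a list of busy intervals (as returned by a calendar lookup) plus the working hours from the Business Info sheet into bookable slots. The input field names (`busy`, `workStart`, `workEnd`, `slotMinutes`) are assumptions for illustration, not the template's actual data shape.

```javascript
// Hypothetical free-slot calculation: given busy intervals for one day and working hours,
// return the open slots of a fixed length (e.g. 60-minute appointments).
const { busy = [], workStart, workEnd, slotMinutes = 60 } = $input.first().json;
// busy: [{ start: "2024-05-02T10:00:00Z", end: "2024-05-02T11:00:00Z" }, ...]

const slotMs = slotMinutes * 60 * 1000;
const dayStart = new Date(workStart).getTime();
const dayEnd = new Date(workEnd).getTime();
const busyRanges = busy.map(b => [new Date(b.start).getTime(), new Date(b.end).getTime()]);

const slots = [];
for (let t = dayStart; t + slotMs <= dayEnd; t += slotMs) {
  // A slot is open when it does not overlap any busy interval.
  const overlaps = busyRanges.some(([s, e]) => t < e && t + slotMs > s);
  if (!overlaps) {
    slots.push({ start: new Date(t).toISOString(), end: new Date(t + slotMs).toISOString() });
  }
}

return [{ json: { openSlots: slots } }];
```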
by Wan Dinie
**AI Website Analyzer to Product Ideas with FireCrawl and GPT-4.1**

This n8n template demonstrates how to use AI to analyze any website and generate product ideas or summaries based on the website's content and purpose.

Use cases are many: try analyzing competitor websites, discovering product opportunities, understanding business models, or generating insights from landing pages!

**Good to know**

- At time of writing, Firecrawl offers up to 500 free API calls. See Firecrawl Pricing for updated info.
- OpenAI API costs vary by model. GPT-3.5 is cheaper, while GPT-4 and above offer deeper analysis but cost more per request.

**How it works**

1. We collect a website URL via a manual form trigger.
2. The URL is sent to the Firecrawl API, which deeply crawls and analyzes the website content (a sketch of this call is shown after this section).
3. Firecrawl returns the scraped data, including page structure, content, and metadata.
4. The scraped data is then sent to OpenAI's API with a custom prompt.
5. OpenAI generates an AI-powered summary analyzing what the website is doing, its purpose, and potential product ideas.
6. The final output is displayed or can be stored for further use.

**How to use**

- The manual trigger node is used as an example, but feel free to replace it with other triggers such as a webhook or even a form.
- You can analyze multiple URLs by looping through a list, but of course the processing will take longer and cost more.

**Requirements**

- Firecrawl API key (get 500 free calls at https://firecrawl.dev)
- OpenAI API key for AI analysis
- Valid website URLs to analyze

**Customizing this workflow**

- Change the output format from HTML to JSON, Markdown, or plain text by editing the Firecrawl parameters.
- Modify the AI prompt to focus on specific aspects like pricing strategy, target audience, or UX analysis.
- Upgrade to GPT-4.1, GPT-5.1, or GPT-5.2 for more advanced and detailed analysis.
- Add a webhook trigger to analyze websites automatically from other apps or services.
- Store results in a database like Supabase or Google Sheets for tracking competitor analysis over time.
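For readers who want to see the shape of the scrape call, here is a minimal JavaScript sketch against Firecrawl's v1 scrape endpoint. The endpoint path, request body, and response shape reflect Firecrawl's public v1 API at the time of writing, so verify them against the current Firecrawl docs; the field names used here are assumptions, and in the template this call is made by an n8n HTTP Request node rather than code.

```javascript
// Hypothetical sketch of calling Firecrawl's v1 scrape endpoint from an n8n Code node.
// Requires Node 18+ (global fetch) and a FIRECRAWL_API_KEY environment variable.
const url = $input.first().json.websiteUrl;   // URL collected by the form trigger

const res = await fetch('https://api.firecrawl.dev/v1/scrape', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ url, formats: ['markdown'] }),
});

const payload = await res.json();
if (!res.ok || !payload.success) {
  throw new Error(`Firecrawl scrape failed: ${res.status}`);
}

// payload.data.markdown holds the scraped page content; metadata carries title, description, etc.
return [{ json: { markdown: payload.data.markdown, metadata: payload.data.metadata } }];
```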
by Kevin Meneses
**What this workflow does**

This template extracts high-intent SEO keywords from any web page and turns them into a ranked keyword list you can use for content planning, landing pages, and SEO strategy.

It runs in 3 phases:

1. **Scrape the target URL** with Decodo (Decodo – Web Scraper for n8n)
2. **Use AI to extract seed keywords** and understand the page topic
3. **Enrich each seed keyword with real Google SERP data via SerpApi** (related searches + questions + competitors), then apply a JavaScript scoring system to rank the best opportunities (see the scoring sketch after this section)

The final output is saved to Google Sheets as a clean table of ranked keywords.

**Who this workflow is for**

- SEO consultants and agencies
- SaaS marketers and growth teams
- Founders validating positioning and messaging
- Content teams looking for "what people actually search for"

This workflow is especially useful when you want keywords with commercial / solution intent, not generic single-word terms.

**Workflow overview**

Phase 1 — Scrape & clean page content
- Reads the URL from Google Sheets
- Scrapes the page via Decodo
- Cleans HTML into plain text (token-friendly)

Phase 2 — AI keyword extraction
- AI returns a structured JSON with:
  - brand / topic
  - 5–10 mid-tail seed keywords
  - intent + audience hints

Phase 3 — SERP enrichment + scoring
- Uses SerpApi to fetch:
  - related searches
  - People Also Ask questions
  - competitor domains
- Scores and ranks keywords based on:
  - source type (related searches / PAA / organic)
  - frequency across seeds
  - modifiers (pricing, best, free, docs, etc.)
  - mid-tail length preference

**Setup (step by step)**

1. **Google Sheets (input)**: Create a sheet with a column named `urls`, one URL per row.
2. **Google Sheets (output)**: Create an output sheet with columns like `keyword`, `score`, `intent_hint`, `source_type`. Tip: clear the output sheet before each run if you want a clean export.
3. **Decodo**: Add your Decodo credentials. The URL is taken automatically from Google Sheets (Decodo – Web Scraper for n8n).
4. **SerpApi**: Add your SerpApi key in the SerpApi node.
5. **AI Model**: Connect your preferred AI model (Gemini / OpenAI). The prompt is optimized to output valid JSON only.

**Self-hosted disclaimer**

This is a community template. You must configure your own credentials (Google Sheets, Decodo, SerpApi, AI). Results depend on page accessibility and page content quality.
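The JavaScript scoring system mentioned above is part of the template but not shown in this description. Here is a minimal sketch of how such a ranking could work based on the listed criteria (source type, frequency across seeds, modifiers, mid-tail length). The field names and point values are illustrative assumptions, not the template's actual code.

```javascript
// Hypothetical keyword scoring sketch based on the criteria listed above.
const MODIFIERS = ['pricing', 'best', 'free', 'docs', 'vs', 'alternative', 'tool'];
const SOURCE_POINTS = { related: 3, paa: 2, organic: 1 };  // related searches > PAA > organic

// candidates: [{ keyword, source_type, seed_count }, ...] collected from the SerpApi calls
const candidates = $input.all().map(item => item.json);

const scored = candidates.map(c => {
  const words = c.keyword.toLowerCase().split(/\s+/);
  let score = SOURCE_POINTS[c.source_type] || 0;
  score += (c.seed_count || 1) * 2;                         // appears across several seed keywords
  score += MODIFIERS.some(m => words.includes(m)) ? 3 : 0;  // commercial / solution intent modifiers
  score += words.length >= 3 && words.length <= 5 ? 2 : 0;  // prefer mid-tail phrases
  return { ...c, score };
});

// Highest-scoring keywords first, ready to append to the output sheet.
scored.sort((a, b) => b.score - a.score);
return scored.map(k => ({ json: k }));
```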
by Ian Kerins
**Overview**

This n8n template automates the process of researching niche topics. It searches for a topic on Wikipedia, scrapes the relevant page using ScrapeOps, extracts the history or background section, and uses AI to generate a concise summary and timeline. The results are automatically saved to Google Sheets for easy content planning.

**Who is this for?**

- **Content Creators**: Quickly gather background info for videos or articles.
- **Marketers**: Research niche markets and product histories.
- **Educators/Students**: Generate timelines and summaries for study topics.
- **Researchers**: Automate the initial data gathering phase.

**What problems it solves**

- **Time Consumption**: Manually reading and summarizing Wikipedia pages takes time.
- **Blocking**: Scraping Wikipedia directly can sometimes lead to IP blocks; ScrapeOps handles this.
- **Unstructured Data**: Raw HTML is hard to use; this workflow converts it into a clean, structured format (JSON/CSV).

**How it works**

1. **Define Topic**: You set a keyword in the workflow.
2. **Locate Page**: The workflow queries the Wikipedia API to find the correct page URL.
3. **Smart Scraping**: It uses the ScrapeOps Proxy API to fetch the page content reliably.
4. **Extraction**: A Code node intelligently parses the HTML to find "History", "Origins", or "Background" sections (see the extraction sketch after this section).
5. **AI Processing**: GPT-4o-mini summarizes the text and extracts key dates for a timeline.
6. **Storage**: The structured data is appended to a Google Sheet.

**Setup steps (~5-10 minutes)**

1. **ScrapeOps Account**: Register for a free API key at ScrapeOps. Configure the ScrapeOps Scraper node with your API key.
2. **OpenAI Account**: Add your OpenAI credentials to the "Message a model" node.
3. **Google Sheets**: Create a Google Sheet. You can duplicate this Template Sheet (copy the headers). Connect your Google account to the "Append row in sheet" node and select your new sheet.

**Pre-conditions**

- An active ScrapeOps account.
- An OpenAI API key (or another LLM credential).
- A Google account for Sheets access.

**Disclaimer**

This template uses ScrapeOps as a community node. You are responsible for complying with Wikipedia's Terms of Use, robots directives, and applicable laws in your jurisdiction. Scraping targets may change at any time; adjust render/scroll/wait settings and parsers as needed. Use responsibly for legitimate business purposes.
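The extraction step is only described in prose above. A minimal JavaScript sketch of how a Code node could pull a "History"/"Origins"/"Background" section out of heading-structured HTML is shown below; the heading-and-regex approach and the input field name (`html`) are assumptions for illustration, and the template's real parser may differ.

```javascript
// Hypothetical sketch: extract the text under a "History", "Origins" or "Background"
// <h2> heading from scraped HTML, without an external HTML parser.
const html = $input.first().json.html || '';
const SECTION_NAMES = ['History', 'Origins', 'Background'];

let sectionText = '';
for (const name of SECTION_NAMES) {
  // Match an <h2> whose contents mention the section name (without crossing </h2>),
  // then capture everything up to the next <h2> or the end of the document.
  const pattern = new RegExp(
    `<h2\\b(?:(?!</h2>)[\\s\\S])*?${name}(?:(?!</h2>)[\\s\\S])*?</h2>([\\s\\S]*?)(?=<h2\\b|$)`,
    'i'
  );
  const match = html.match(pattern);
  if (match) {
    // Strip tags and collapse whitespace to get plain text for the AI step.
    sectionText = match[1].replace(/<[^>]+>/g, ' ').replace(/\s+/g, ' ').trim();
    break;
  }
}

return [{ json: { section: sectionText || 'No history/background section found' } }];
```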
by Alexandru Burca
**Daily News Digest Video Generator for YouTube Shorts**

- Installation Instructions
- YouTube Installation Instructions

**Overview**

This workflow automatically creates and publishes daily news digest videos from WordPress articles to YouTube. It runs every evening at 7 PM, compiling the day's top stories from a news portal into a professionally formatted vertical video (1080x1920 px) optimized for social media platforms like YouTube Shorts.

**What It Does**

1. **🕐 Scheduled Trigger** - Runs automatically every day at 19:00 (7 PM).
2. **📰 Fetches Today's Articles** - Retrieves all published WordPress posts from the current day.
3. **✅ Validates Content** - Ensures there are at least 3 articles before proceeding.
4. **🎬 Video Detection**
   - Scans article content HTML for embedded videos
   - Extracts MP4 URLs from WordPress video players
   - Parses `wp-playlist-script` JSON data
   - Falls back to `<video>` and `<source>` tag detection
5. **🧹 Data Processing**
   - **Extracts** article titles, links, and featured media IDs
   - **Decodes HTML entities**: converts `&#8211;` to –, `&quot;` to ", etc.
   - **Fetches featured images** from the WordPress Media API
   - **Assigns default images** for articles without featured media
   - **Calculates reading time** per article (3-7 seconds based on word count)
   - **Cleans text**: removes HTML tags and normalizes whitespace
6. **🎥 Video Generation (via Shotstack API)**
   - Intro slide (3 seconds): black background, large logo (centered), title in the center, current date in DD-MM-YYYY format
   - News slides (3-7 seconds each), each article displayed with:
     - **Background**: video (if available) or featured image, cropped to fit
     - **Dark overlay**: 40% opacity black layer for text readability
     - **Article headline**: large white text at the top
     - **Small logo**: top-right corner
     - **Pagination counter**: bottom-right white badge (e.g., "1 / 22")
     - **CTA button**: centered CTA
     - **Background music**: subtle looped audio track
     - **Transitions**: smooth fade in/out between slides
   - Outro slide (3 seconds): identical to the intro slide, provides a clean ending to the video
7. **⏳ Processing Wait** - Waits 30 seconds for Shotstack to render the video, then polls the Shotstack API to verify video completion.
8. **📥 Download Video** - Retrieves the finished MP4 file from Shotstack and downloads the video data for the YouTube upload.
9. **📤 YouTube Upload** - Automatically uploads to YouTube with:
   - **Title**: "Daily Digest - [Day] [Weekday], [Year]"
   - **Description**: same as the title
   - **Category**: News & Politics
   - **Made for kids**: yes
   - **Tags**: dailydigest

**✨ Key Features**

Intelligent content handling:
- ✅ Automatic video/image detection and intelligent media selection
- ✅ Dynamic reading time calculation for optimal viewer engagement
- ✅ HTML entity cleaning for proper text display (WordPress compatibility)
- ✅ Fallback default images for articles without media
- ✅ Video background support with automatic muting

Professional video production:
- ✅ Vertical format optimized for mobile viewing (1080x1920 px)
- ✅ Professional branding with logos and consistent styling
- ✅ Smooth fade transitions between slides
- ✅ Background music with looping support
- ✅ Dynamic pagination counters
- ✅ Call-to-action buttons for engagement

Customization:
- ✅ Centralized variables for easy branding updates
- ✅ Configurable logos, colors, and text
- ✅ Adjustable reading time calculation
- ✅ Flexible date formatting
- ✅ Customizable audio track

**🎯 Use Cases**

Perfect for:
- 📰 News websites wanting to repurpose daily articles
- 📱 Media outlets creating social media content
- 🎥 Content creators automating video production
- 🔄 Publishers maximizing content distribution
- 📊 Marketing teams driving traffic from social platforms

**🔧 Customization Options**

Easy changes:
- Update logos by changing the `logo_big` and `logo_small` URLs
- Modify branding colors via the `button_bg_color` variable
- Adjust button text with the `button_text` variable
- Change the video title with the `daily_digest_text` variable
- Update background music by replacing the audio URL

Advanced customization:
- Adjust the reading time formula in the `calculateReadingTime()` function (see the sketch after this section)
- Modify the date format in the `getRomanianDate()` function
- Change video dimensions (currently 1080x1920)
- Update font family and sizes
- Adjust overlay opacity and colors
- Modify transition effects

**📋 Prerequisites**

Required credentials:
1. **WordPress API** - Access to your WordPress site
2. **Shotstack API** - API key for video rendering (Stage environment)
3. **YouTube OAuth2** - Authenticated YouTube account for uploads
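The data-processing step names a `calculateReadingTime()` helper and an HTML-entity decoding pass. Below is a minimal JavaScript sketch of what these could look like; the words-per-second ratio, entity list, and WordPress field paths (`title.rendered`, `content.rendered`) are assumptions matching the 3-7 second range described above, not the template's actual implementation.

```javascript
// Hypothetical helpers matching the data-processing step described above.

// Map an article's word count onto a 3-7 second slide duration.
function calculateReadingTime(text) {
  const words = text.trim().split(/\s+/).filter(Boolean).length;
  // Roughly 3 words per second, clamped to the 3-7 second range (illustrative ratio).
  return Math.min(7, Math.max(3, Math.round(words / 3)));
}

// Decode common WordPress HTML entities and strip remaining tags.
function cleanTitle(html) {
  const entities = { '&#8211;': '–', '&#8217;': '\u2019', '&quot;': '"', '&amp;': '&', '&nbsp;': ' ' };
  let text = html.replace(/<[^>]+>/g, ' ');
  for (const [entity, char] of Object.entries(entities)) {
    text = text.split(entity).join(char);
  }
  return text.replace(/\s+/g, ' ').trim();
}

// n8n Code node usage: annotate each WordPress post item.
return $input.all().map(item => ({
  json: {
    ...item.json,
    cleanTitle: cleanTitle(item.json.title?.rendered || ''),
    readingTime: calculateReadingTime(item.json.content?.rendered || ''),
  },
}));
```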
by WeblineIndia
**Smart Contract Event Monitor (Web3)**

This workflow automatically monitors the Ethereum blockchain, extracts USDT transfer events, filters large-value transactions, stores them in Airtable, and sends a clean daily summary alert to Slack.

It checks the latest Ethereum block every day and identifies high-value USDT transfers. It fetches on-chain logs using Alchemy, extracts sender/receiver/value details, filters transactions above a threshold, stores them in Airtable, and finally sends a single clear summary alert to Slack.

You receive:
- Daily blockchain check (automated)
- Airtable tracking of all high-value USDT transfers
- A Slack alert summarizing the count + the largest transfer

Ideal for teams wanting simple, automated visibility of suspicious or large crypto movements without manually scanning the blockchain.

**Quick Start – Implementation Steps**

1. Add your Alchemy Ethereum Mainnet API URL in both HTTP nodes.
2. Connect and configure your Airtable base & table.
3. Connect your Slack credentials and set the channel for alerts.
4. Adjust the value threshold in the IF node (default: 1,000,000,000).
5. Activate the workflow — daily monitoring begins instantly.

**What It Does**

This workflow automates detection of high-value USDT transfers on Ethereum:

1. Fetches the latest block number using Alchemy.
2. Retrieves all USDT Transfer logs from that block.
3. Extracts structured data: sender, receiver, amount, contract, block number, transaction hash.
4. Filters only transactions above a configurable threshold.
5. Saves each high-value transaction into Airtable for record-keeping.
6. Generates a summary including the total number of high-value transfers and the single largest transfer.
7. Sends one clean alert message to Slack.

This ensures visibility of suspicious or large fund movements with no repeated alerts.

**Who It's For**

This workflow is ideal for:
- Crypto analytics teams
- Blockchain monitoring platforms
- Compliance teams tracking high-value activity
- Web3 product teams
- Developers needing automated USDT transfer tracking
- Anyone monitoring whale movements / suspicious transactions

**Requirements to Use This Workflow**

To run this workflow, you need:
- **n8n instance** (cloud or self-hosted)
- **Alchemy API URL** (Ethereum Mainnet)
- **Airtable base** + Personal Access Token
- **Slack workspace** with API permissions
- Basic understanding of Ethereum logs, hex values & JSON data

**How It Works**

1. **Daily Check** – The workflow runs automatically at your set time.
2. **Get Latest Block Number** – Fetches the newest Ethereum block from Alchemy.
3. **Fetch USDT Logs** – Queries all Transfer events (ERC-20 standard).
4. **Extract Transaction Details** – Converts hex → readable data (see the decoding sketch after the setup steps).
5. **Filter High-Value Transactions** – Keeps only large-value transfers.
6. **Save to Airtable** – Adds each transfer record to your database.
7. **Generate Summary** – Finds the largest transfer & total count.
8. **Send Slack Alert** – Notifies your team with one clean summary.

**Setup Steps**

1. Import the provided n8n JSON file.
2. Open the Get Latest Block and Fetch Logs HTTP nodes → add your Alchemy API URL.
3. Ensure the USDT contract address (already included): `0xdAC17F958D2ee523a2206206994597C13D831ec7`
4. Connect your Airtable account and map: Contract, From Address, To Address, Value, Block Number, txHash.
5. Connect Slack API credentials and choose your channel.
6. Change the threshold limit in the IF node if needed (default: 1B).
7. Activate the workflow — done!
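The "Extract Transaction Details" step converts hex log fields into readable data. Below is a minimal JavaScript sketch of decoding a standard ERC-20 Transfer log as returned by `eth_getLogs`: the sender is in `topics[1]`, the receiver in `topics[2]`, and the value in `data`, with USDT using 6 decimals. This follows the ERC-20 Transfer event convention; the variable names are illustrative assumptions rather than the template's exact code, and the threshold filtering itself stays in the IF node.

```javascript
// Hypothetical decoder for ERC-20 Transfer logs from an Alchemy eth_getLogs response.
// Transfer(address indexed from, address indexed to, uint256 value):
//   topics[0] = event signature, topics[1] = from, topics[2] = to, data = value.
const logs = $input.first().json.result || [];

const decoded = logs.map(log => ({
  contract: log.address,
  // Indexed addresses are left-padded to 32 bytes; keep the last 20 bytes (40 hex chars).
  from: '0x' + log.topics[1].slice(-40),
  to: '0x' + log.topics[2].slice(-40),
  // USDT uses 6 decimals; Number() may lose precision for extreme values (fine for a sketch).
  value: Number(BigInt(log.data)) / 1e6,
  blockNumber: parseInt(log.blockNumber, 16),
  txHash: log.transactionHash,
}));

// One item per decoded transfer; the downstream IF node applies the value threshold.
return decoded.map(t => ({ json: t }));
```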
**How To Customize Nodes**

Customize the value threshold (IF node):
- Increase or decrease the minimum transfer value
- Change the logic to track smaller or larger whales

Customize Airtable storage. You can add fields like:
- Timestamp
- Token symbol
- USD price (using a price API)
- Transaction status
- Risk classification

Customize Slack alerts. You may add:
- Emojis
- Mentions (@channel, @team)
- Links to Etherscan
- Highlighted blocks for critical transfers

Customize the Web3 provider. Replace Alchemy with:
- Infura
- QuickNode
- Public RPC (not recommended for reliability)

**Add-Ons (Optional Enhancements)**

You can extend this workflow to:
- Track multiple ERC-20 tokens
- Process several blocks instead of just the latest
- Add price conversion (USDT → USD value)
- Detect transfers to suspicious wallets
- Generate daily or weekly summary reports in Slack
- Create a dashboard using Airtable Interfaces
- Add OpenAI-based insights (large spike, suspicious pattern, etc.)

**Use Case Examples**

1. **Whale Tracking** - Detect large USDT movements (>1M or >5M).
2. **Compliance Monitoring** - Log high-value transfers in Airtable for audits.
3. **Real-Time Alerts** - Slack alerts tell your team instantly about big movements.
4. **On-Chain Analytics** - Automate structured extraction of Ethereum logs.
5. **Exchange Monitoring** - Detect large inflows/outflows to known addresses.

**Troubleshooting Guide**

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No data in Airtable | Logs returned empty | Ensure USDT transfer events exist in that block |
| Values are "zero" | Hex parsing failed | Check the extract-code logic |
| Slack alert not sent | Invalid credentials | Update the Slack API key |
| Airtable error | Wrong field names | Match Airtable column names exactly |
| HTTP request fails | Wrong RPC URL | Re-check the Alchemy API key |
| Workflow not running | Schedule disabled | Enable the "Daily Check" node |

**Need Help?**

If you need help customizing or extending this workflow — adding multi-token monitoring, setting up dashboards, improving alerts or scaling this for production — our n8n workflow developers at WeblineIndia can assist you with advanced automation.
by Cheng Siong Chin
**How It Works**

The workflow starts with a scheduled trigger that activates at set intervals. Behavioral data from multiple sources is parsed and sent to the MCDN routing engine, which intelligently assigns leads to the right teams based on predefined rules. AI-powered scoring evaluates each prospect's potential, ensuring high-quality leads are prioritized. The results are synced to the CRM, and updates are reflected on an analytics dashboard for real-time visibility. A minimal sketch of the rule-based routing step follows this section.

**Setup Steps**

1. **Trigger**: Define the schedule frequency.
2. **Data Fetch**: Configure APIs for all behavioral data sources.
3. **MCDN Router**: Set routing rules, thresholds, and team assignments.
4. **AI Models**: Connect OpenAI/NVIDIA APIs and configure scoring prompts.
5. **CRM Integration**: Enter credentials for Salesforce, HubSpot, or other CRMs.
6. **Dashboard**: Link to analytics tools like Tableau or Google Sheets for reporting.

**Prerequisites**

- API credentials: NVIDIA AI, OpenAI, CRM platform
- Data sources
- Spreadsheet/analytics access

**Use Cases**

- Lead prioritization for sales teams
- Customer segmentation
- Automated routing

**Customization**

Adjust routing rules, add custom scoring models, modify team assignments, expand data sources, and integrate additional AI providers.

**Benefits**

- Reduces manual lead routing by 90%
- Improves scoring accuracy
- Accelerates the sales cycle
- Enables data-driven team assignments
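The routing rules and thresholds are only described in prose. Below is a minimal, hypothetical JavaScript sketch of threshold-based lead routing; the score bands, team names, and field names are illustrative assumptions, not the template's actual MCDN configuration.

```javascript
// Hypothetical rule-based lead routing by AI score and region (illustrative thresholds).
const RULES = [
  { test: lead => lead.score >= 80, team: 'enterprise-sales' },
  { test: lead => lead.score >= 50, team: 'inbound-sales' },
  { test: lead => lead.region === 'EMEA', team: 'emea-sdr' },
];
const DEFAULT_TEAM = 'nurture';

// n8n Code node: attach an assignedTeam to every scored lead before the CRM sync.
return $input.all().map(item => {
  const lead = item.json;
  const rule = RULES.find(r => r.test(lead));
  return { json: { ...lead, assignedTeam: rule ? rule.team : DEFAULT_TEAM } };
});
```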