by vinci-king-01
**Product Price Monitor with Mailchimp and Baserow**

⚠️ **COMMUNITY TEMPLATE DISCLAIMER:** This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure the ScrapeGraphAI community node is installed in your n8n instance before using this template.

This workflow scrapes multiple e-commerce sites for product pricing data, stores the historical prices in Baserow, analyzes weekly trends, and emails a neatly formatted seasonal report to your Mailchimp audience. It is designed for retailers who need to stay on top of seasonal pricing patterns to make informed inventory and pricing decisions.

**Pre-conditions/Requirements**

Prerequisites:
- Running n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Mailchimp account with at least one audience list
- Baserow workspace with edit rights
- Product URLs or SKU list from target e-commerce platforms

Required Credentials:

| Credential | Used By | Scope |
|------------|---------|-------|
| ScrapeGraphAI API Key | ScrapeGraphAI node | Web scraping |
| Mailchimp API Key & Server Prefix | Mailchimp node | Sending emails |
| Baserow API Token | Baserow node | Reading & writing records |

Baserow Table Setup — create a table named `price_tracker` with the following fields:

| Field Name | Type | Example |
|------------|------|---------|
| product_name | Text | "Winter Jacket" |
| product_url | URL | https://example.com/winter-jacket |
| current_price | Number | 59.99 |
| scrape_date | DateTime | 2023-11-15T08:21:00Z |

**How it works**

Key Steps:
1. **Schedule Trigger**: Fires every week (or a custom CRON expression) to start the monitoring cycle.
2. **Code (Prepare URLs)**: Loads or constructs the list of product URLs to monitor.
3. **SplitInBatches**: Processes product URLs in manageable batches to avoid rate-limit issues.
4. **ScrapeGraphAI**: Scrapes each product page and extracts the current price and name.
5. **If (Price Found?)**: Continues only if scraping returns a valid price.
6. **Baserow**: Upserts the scraped data into the `price_tracker` table.
7. **Code (Trend Analysis)**: Aggregates weekly data to detect price increases, decreases, or stable trends (see the sketch at the end of this template description).
8. **Set (Mail Content)**: Formats the trend summary into an HTML email body.
9. **Mailchimp**: Sends the seasonal price-trend report to the selected audience segment.
10. **Sticky Note**: Documentation node explaining the business logic in-workflow.

**Set up steps**

Setup Time: 10-15 minutes

1. Clone the template: Import the workflow JSON into your n8n instance.
2. Install ScrapeGraphAI: Add `n8n-nodes-scrapegraphai` via the Community Nodes panel.
3. Add credentials: a. ScrapeGraphAI API Key, b. Mailchimp API Key & Server Prefix, c. Baserow API Token.
4. Configure the Baserow node: Point it to your `price_tracker` table.
5. Edit the product list: In the "Prepare URLs" Code node, replace the sample URLs with your own.
6. Adjust the schedule: Modify the Schedule Trigger CRON expression if weekly isn't suitable.
7. Test run: Execute the workflow manually once to verify credentials and data flow.
8. Activate: Turn on the workflow for automatic weekly monitoring.

**Node Descriptions**

Core Workflow Nodes:
- **Schedule Trigger** – Initiates the workflow on a weekly CRON schedule.
- **Code (Prepare URLs)** – Generates an array of product URLs/SKUs to scrape.
- **SplitInBatches** – Splits the array into chunks of 5 URLs to stay within request limits.
- **ScrapeGraphAI** – Scrapes each URL, using XPath/CSS selectors to pull price & title.
- **If (Price Found?)** – Filters out failed or empty scrape results.
- **Baserow** – Inserts or updates the price record in the database.
- **Code (Trend Analysis)** – Calculates week-over-week price changes and flags anomalies.
- **Set (Mail Content)** – Creates an HTML table with product, current price, and trend arrow.
- **Mailchimp** – Sends or schedules the email campaign.
- **Sticky Note** – Provides inline documentation and edit hints.

Data Flow:
Schedule Trigger → Code (Prepare URLs) → SplitInBatches
SplitInBatches → ScrapeGraphAI → If (Price Found?) → Baserow
Baserow → Code (Trend Analysis) → Set (Mail Content) → Mailchimp

**Customization Examples**

Change scraping frequency:

```
// Schedule Trigger CRON for daily at 07:00 UTC
0 7 * * *
```

Add a competitor comparison column:

```javascript
// Code (Trend Analysis)
item.competitor_price_diff = item.current_price - item.competitor_price;
return item;
```

**Data Output Format**

The workflow outputs structured JSON data:

```json
{
  "product_name": "Winter Jacket",
  "product_url": "https://example.com/winter-jacket",
  "current_price": 59.99,
  "scrape_date": "2023-11-15T08:21:00Z",
  "weekly_trend": "decrease"
}
```

**Troubleshooting**

Common Issues:
- Invalid ScrapeGraphAI key – Verify the API key and ensure your subscription is active.
- Mailchimp "Invalid Audience" error – Double-check the audience ID and that the API key has the correct permissions.
- Baserow "Field mismatch" – Confirm your table fields match the names and types used in the workflow.

Performance Tips:
- Limit each SplitInBatches run to ≤10 URLs to reduce scraping timeouts.
- Enable caching in ScrapeGraphAI to avoid repeated requests to the same URL within short intervals.

Pro Tips:
- Use environment variables for all API keys to avoid hard-coding secrets.
- Add an extra If node to alert you if a product's price drops below a target threshold.
- Combine with n8n's Slack node for real-time alerts in addition to Mailchimp summaries.
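The trend-analysis step is only described in prose above, so here is a minimal sketch of the kind of logic such a Code node could contain. It assumes each record carries this week's and last week's price; the field names follow the data-output example, and the 2% "stable" band is an illustrative threshold, not part of the template.

```javascript
// Hypothetical sketch of the "Code (Trend Analysis)" node.
// Assumes each item carries this week's and last week's price;
// the 2% "stable" band is an illustrative threshold, not the template's actual value.
function classifyTrend(currentPrice, previousPrice, stableBand = 0.02) {
  if (previousPrice == null) return 'new';
  const change = (currentPrice - previousPrice) / previousPrice;
  if (change > stableBand) return 'increase';
  if (change < -stableBand) return 'decrease';
  return 'stable';
}

// Example shaped like the workflow's output records
const items = [
  { product_name: 'Winter Jacket', current_price: 59.99, previous_price: 64.99 },
];

const analyzed = items.map((item) => ({
  ...item,
  weekly_trend: classifyTrend(item.current_price, item.previous_price),
}));

console.log(analyzed); // [{ ..., weekly_trend: 'decrease' }]
```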
by Khairul Muhtadin
Decodo Amazon Product Recommender delivers instant, AI-powered shopping recommendations directly through Telegram. Send any product name and receive Amazon product analysis featuring price comparisons, ratings, sales data, and categorized recommendations (budget, premium, best value) in under 40 seconds—eliminating hours of manual research. Why Use This Workflow? Time Savings: Reduce product research from 45+ minutes to under 30 seconds Decision Quality: Compare 20+ products automatically with AI-curated recommendations Zero Manual Work: Complete automation from message input to formatted recommendations Ideal For E-commerce Entrepreneurs:** Quickly research competitor products, pricing strategies, and market trends for inventory decisions Smart Shoppers & Deal Hunters:** Get instant product comparisons with sales volume data and discount tracking before purchasing Product Managers & Researchers:** Analyze Amazon marketplace positioning, customer sentiment, and pricing ranges for competitive intelligence How It Works Trigger: User sends product name via Telegram (e.g., "iPhone 15 Pro Max case") AI Validation: Gemini 2.5 Flash extracts core product keywords and validates input authenticity Data Collection: Decodo API scrapes Amazon search results, extracting prices, ratings, reviews, sales volume, and product URLs Processing: JavaScript node cleans data, removes duplicates, calculates value scores, and categorizes products (top picks, budget, premium, best value, most popular) Intelligence Layer: AI generates personalized recommendations with Telegram-optimized markdown formatting, shortened product names, and clean Amazon URLs Output & Delivery: Formatted recommendations sent to user with categorized options and direct purchase links Error Handling: Admin notifications via separate Telegram channel for workflow monitoring Setup Guide Prerequisites | Requirement | Type | Purpose | |-------------|------|---------| | n8n instance | Essential | Workflow execution platform | | Decodo Account | Essential | Amazon product data scraping | | Telegram Bot Token | Essential | Chat interface for user interactions | | Google Gemini API | Essential | AI-powered product validation and recommendations | | Telegram Account | Optional | Admin error notifications | Installation Steps Import the JSON file to your n8n instance Configure credentials: Decodo API: Sign up at decodo.com → Dashboard → Scraping APIs → Web Advanced → Copy BASIC AUTH TOKEN Telegram Bot: Message @BotFather on Telegram → /newbot → Copy HTTP API token (format: 123456789:ABCdefGHI...) 
Google Gemini: Obtain API key from Google AI Studio for Gemini 2.5 Flash model Update environment-specific values: Replace YOUR-CHAT-ID in "Notify Admin" node with your Telegram chat ID for error notifications Verify Telegram webhook IDs are properly configured Customize settings: Adjust AI prompt in "Generate Recommendations" node for different output formats Set character limits (default: 2500) for Telegram message length Test execution: Send test message to your Telegram bot: "iPhone 15 Pro" Verify processing status messages appear Confirm recommendations arrive with properly formatted links Customization Options Basic Adjustments: Character Limit**: Modify 2500 in AI prompt to adjust response length (Telegram max: 4096) Advanced Enhancements: Multi-language Support**: Add language detection and translation nodes for international users Price Tracking**: Integrate Google Sheets to log historical prices and trigger alerts on drops Image Support**: Enable Telegram photo messages with product images from scraping results Troubleshooting Common Issues: | Problem | Cause | Solution | |---------|-------|----------| | "No product detected" for valid inputs | AI validation too strict or ambiguous query | Add specific product details (model number, brand) in user input | | Empty recommendations returned | Decodo API rate limit or Amazon blocking | Wait 60 seconds between requests; verify Decodo account status | | Telegram message formatting broken | Special characters in product names | Ensure Telegram markdown mode is set to "Markdown" (legacy) not "MarkdownV2" | Use Case Examples Scenario 1: E-commerce Store Owner Challenge: Needs to quickly assess competitor pricing and product positioning for new inventory decisions without spending hours browsing Amazon Solution: Sends "wireless earbuds" to bot, receives categorized analysis of 20+ products with price ranges ($15-$250), top sellers, and discount opportunities Result: Identifies $35-$50 price gap in market, sources comparable product, achieves 40% profit margin Scenario 2: Smart Shopping Enthusiast Challenge: Wants to buy a laptop backpack but overwhelmed by 200+ Amazon options with varying prices and unclear value propositions Solution: Messages "laptop backpack" to bot, gets AI recommendations sorted by budget ($30), premium ($50+), best value (highest discount + good ratings), and most popular (by sales volume) Result: Purchases "Best Value" recommendation with 35% discount, saves $18 and 45 minutes of research time Created by: Khaisa Studio Category: AI | Productivity | E-commerce | Tags: amazon, telegram, ai, product-research, shopping, automation, gemini Need custom workflows? Contact us Connect with the creator: Portfolio • Workflows • LinkedIn • Medium • Threads
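The JavaScript processing step described above (clean, de-duplicate, score, categorize) is only outlined in prose. The sketch below shows one plausible shape for that logic; the scoring weights, category cut-offs, and field names are assumptions for illustration, not the template's actual code.

```javascript
// Illustrative sketch of the JavaScript processing node: de-duplicate, score, categorize.
// Weights and price cut-offs below are assumptions for demonstration only.
function valueScore({ price, rating, reviewCount, discountPercent = 0 }) {
  // Higher rating, more reviews and bigger discounts raise the score; price lowers it.
  return rating * 20 + Math.log10(reviewCount + 1) * 10 + discountPercent - price * 0.1;
}

function categorize(products) {
  const deduped = [...new Map(products.map((p) => [p.url, p])).values()]; // drop duplicate URLs
  const scored = deduped
    .map((p) => ({ ...p, score: valueScore(p) }))
    .sort((a, b) => b.score - a.score);
  return {
    topPicks: scored.slice(0, 3),
    budget: scored.filter((p) => p.price <= 30),
    premium: scored.filter((p) => p.price >= 100),
    bestValue: scored[0],
    mostPopular: [...scored].sort((a, b) => b.salesVolume - a.salesVolume)[0],
  };
}

const sample = [
  { url: 'https://amazon.com/a', price: 25, rating: 4.5, reviewCount: 1200, discountPercent: 30, salesVolume: 5000 },
  { url: 'https://amazon.com/b', price: 120, rating: 4.8, reviewCount: 300, discountPercent: 0, salesVolume: 800 },
];
console.log(categorize(sample));
```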
by vinci-king-01
**Medical Research Tracker with Matrix and Pipedrive**

⚠️ **COMMUNITY TEMPLATE DISCLAIMER:** This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure the ScrapeGraphAI community node is installed in your n8n instance before using this template.

This workflow automatically monitors selected government and healthcare-policy websites, extracts newly published or updated policy documents, logs them as deals in a Pipedrive pipeline, and announces critical changes in a Matrix room. It gives healthcare administrators and policy analysts a near real-time view of policy developments without manual web checks.

**Pre-conditions/Requirements**

Prerequisites:
- n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Active Pipedrive account with at least one pipeline
- Matrix account & accessible room for notifications
- Basic knowledge of n8n credential setup

Required Credentials:
- **ScrapeGraphAI API Key** – Enables the scraping engine
- **Pipedrive OAuth2 / API Token** – Creates & updates deals
- **Matrix Credentials** – Homeserver URL, user, access token (or password)

Specific Setup Requirements:

| Variable | Description | Example |
|----------|-------------|---------|
| POLICY_SITES | Comma-separated list of URLs to scrape | https://health.gov/policies,https://who.int/proposals |
| PD_PIPELINE_ID | Pipedrive pipeline where deals are created | 5 |
| PD_STAGE_ID_ALERT | Stage ID for "Review Needed" | 17 |
| MATRIX_ROOM_ID | Room to send alerts (incl. leading !) | !policy:matrix.org |

Edit the initial Set node to provide these values before running.

**How it works**

Key Steps:
1. **Scheduled Trigger**: Runs every 6 hours (configurable) to start the monitoring cycle.
2. **Code (URL List Builder)**: Generates an array from POLICY_SITES for downstream batching.
3. **SplitInBatches**: Iterates through each policy URL individually.
4. **ScrapeGraphAI**: Scrapes page titles, publication dates, and summary paragraphs.
5. **If (New vs Existing)**: Compares the scraped hash with the last run; continues only for fresh content (see the sketch at the end of this template description).
6. **Merge (Aggregate Results)**: Collects all "new" policies into a single payload.
7. **Set (Deal Formatter)**: Maps scraped data to Pipedrive deal fields.
8. **Pipedrive Node**: Creates or updates a deal per policy item.
9. **Matrix Node**: Posts a formatted alert message in the specified Matrix room.

**Set up steps**

Setup Time: 15-20 minutes

1. Install Community Node – In n8n, go to Settings → Community Nodes → Install and search for ScrapeGraphAI.
2. Add Credentials – Create new credentials for ScrapeGraphAI, Pipedrive, and Matrix under Credentials.
3. Configure Environment Variables – Open the Set (Initial Config) node and replace the placeholders (POLICY_SITES, PD_PIPELINE_ID, etc.) with your values.
4. Review Schedule – Double-click the Schedule Trigger node to adjust the interval if needed.
5. Activate Workflow – Click Activate. The workflow will run at the next scheduled interval.
6. Verify Outputs – Check Pipedrive for new deals and the Matrix room for alert messages after the first run.

**Node Descriptions**

Core Workflow Nodes:
- **stickyNote** – Provides an at-a-glance description of the workflow logic directly on the canvas.
- **scheduleTrigger** – Fires the workflow periodically (default every 6 hours).
- **code (URL List Builder)** – Splits the POLICY_SITES variable into an array.
- **splitInBatches** – Ensures each URL is processed individually to avoid timeouts.
- **scrapegraphAi** – Parses HTML and extracts policy metadata using XPath/CSS selectors.
- **if (New vs Existing)** – Uses hashing to ignore unchanged pages.
- **merge** – Combines all new items so they can be processed in bulk.
- **set (Deal Formatter)** – Maps scraped fields to Pipedrive deal properties.
- **matrix** – Sends formatted messages to a Matrix room for team visibility.
- **pipedrive** – Creates or updates deals representing each policy update.

Data Flow:
scheduleTrigger → code → splitInBatches → scrapegraphAi → if → merge → set → pipedrive → matrix

**Customization Examples**

1. Add another data field (e.g., policy author):

```
// Inside ScrapeGraphAI node → Selectors
{
  "title": "//h1/text()",
  "date": "//time/@datetime",
  "summary": "//p[1]/text()",
  "author": "//span[@class='author']/text()"  // new line
}
```

2. Switch notifications from Matrix to email:

```
// Replace the Matrix node with "Send Email"
{
  "to": "policy-team@example.com",
  "subject": "New Healthcare Policy Detected: {{$json.title}}",
  "text": "Summary:\n{{$json.summary}}\n\nRead more at {{$json.url}}"
}
```

**Data Output Format**

The workflow outputs structured JSON data for each new policy article:

```json
{
  "title": "Affordable Care Expansion Act – 2024",
  "url": "https://health.gov/policies/acea-2024",
  "date": "2024-06-14T09:00:00Z",
  "summary": "Proposes expansion of coverage to rural areas...",
  "source": "health.gov",
  "hash": "2d6f1c8e3b..."
}
```

**Troubleshooting**

Common Issues:
- ScrapeGraphAI returns empty objects – Verify that the selectors match the current HTML structure; inspect the site with developer tools and update the node configuration.
- Duplicate deals appear in Pipedrive – Ensure the "Find or Create" option is enabled in the Pipedrive node, using the page hash or URL as a unique key.

Performance Tips:
- Limit POLICY_SITES to under 50 URLs per run to avoid hitting rate limits.
- Increase the Schedule Trigger interval if you notice ScrapeGraphAI rate-limiting.

Pro Tips:
- Store historical scraped data in a database node for long-term audit trails.
- Use the n8n Workflow Executions page to replay failed runs without waiting for the next schedule.
- Add an Error Trigger node to emit alerts if scraping or API calls fail.
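The "If (New vs Existing)" comparison relies on hashing scraped content, as noted above. Below is a minimal sketch of that idea, assuming the previous run's hashes are available from a data store; the field names are illustrative rather than taken from the workflow itself.

```javascript
// Minimal sketch of the hash comparison behind the "If (New vs Existing)" step.
// Assumes the previous run's hashes are loaded from storage; field names are illustrative.
const crypto = require('crypto');

function contentHash(policy) {
  // Hash only the fields that matter for change detection
  return crypto
    .createHash('sha256')
    .update(`${policy.title}|${policy.date}|${policy.summary}`)
    .digest('hex');
}

const previousHashes = new Set([]); // hashes recorded on the last run

const scraped = [
  {
    title: 'Affordable Care Expansion Act – 2024',
    date: '2024-06-14T09:00:00Z',
    summary: 'Proposes expansion of coverage to rural areas...',
  },
];

const fresh = scraped
  .map((p) => ({ ...p, hash: contentHash(p) }))
  .filter((p) => !previousHashes.has(p.hash)); // only new or changed pages continue

console.log(fresh);
```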
by Jitesh Dugar
Revolutionize your recruitment process with intelligent AI-driven candidate screening that evaluates resumes, scores applicants, and automatically routes them based on fit - saving 10-15 hours per week on initial screening. 🎯 What This Workflow Does Transforms your hiring pipeline from manual resume review to intelligent automation: 📝 Captures Applications - Jotform intake with resume upload 🤖 AI Resume Analysis - OpenAI parses skills, experience, education, and red flags 🎯 Intelligent Scoring - Evaluates candidates against job requirements with structured scoring (0-100) 🚦 Smart Routing - Automatically routes based on AI recommendation: Strong Yes (85-100): Instant Slack alert → Interview invitation Maybe/Yes (60-84): Manager review → Approval workflow No (<60): Polite rejection email 📊 Analytics Tracking - All data logged to Google Sheets for hiring insights ✨ Key Features AI Resume Parsing**: Extracts structured data from any resume format Intelligent Scoring System**: Multi-dimensional evaluation (skills match, experience quality, cultural fit) Structured Output**: Consistent JSON schema ensures reliable data for decision-making Automated Communication**: Personalized emails for every candidate outcome Human-in-the-Loop**: Manager approval for borderline candidates Comprehensive Analytics**: Track conversion rates, average scores, and hiring metrics Customizable Job Requirements**: Easy prompt editing to match any role 💼 Perfect For Startups & Scale-ups**: Processing 50+ applications per week HR Teams**: Wanting to reduce time-to-hire by 40-60% Technical Recruiters**: Screening engineering, product, or design roles Growing Companies**: Scaling hiring without scaling headcount 🔧 What You'll Need Required Integrations Jotform** - Application intake form (free tier works) Create your form for free on Jotform using this link OpenAI API** - GPT-4o-mini for cost-effective AI analysis (~$0.15 per candidate) Gmail** - Automated candidate communication Google Sheets** - Hiring database and analytics Optional Integrations Slack** - Instant alerts for hot candidates Linear/Asana** - Task creation for interview scheduling Calendar APIs** - Automated interview booking 🚀 Quick Start Import Template - Copy JSON and import into n8n Create Jotform - Use provided field structure (name, email, resume upload, etc.) 
Add API Keys - OpenAI, Jotform, Gmail, Google Sheets Customize Job Requirements - Edit AI screening prompt with your role details Personalize Emails - Update templates with your company branding Test & Deploy - Submit test application and verify all nodes 🎨 Customization Options Adjust Scoring Thresholds**: Change routing logic based on your needs Multiple Positions**: Clone workflow for different roles with unique requirements Add Technical Assessments**: Integrate HackerRank, CodeSignal, or custom tests Interview Scheduling**: Connect Calendly or Google Calendar for auto-booking ATS Integration**: Push data to Lever, Greenhouse, or BambooHR Diversity Tracking**: Add demographic fields and analytics Reference Checking**: Automate reference request emails 📈 Expected Results 90% reduction** in manual resume review time 24-hour response time** to all candidates Zero missed applications** - every candidate gets feedback Data-driven hiring** - track what works with comprehensive analytics Better candidate experience** - fast, professional communication Consistent evaluation** - eliminate unconscious bias with structured AI scoring 🏆 Use Cases Technology Companies Screen 100+ engineering applications per week, identify top 10% instantly, schedule interviews same-day. Agencies & Consultancies Evaluate consultant candidates across multiple skill dimensions, route to appropriate practice areas. High-Volume Hiring Process retail, customer service, or sales applications at scale with consistent quality. Remote-First Teams Evaluate global candidates 24/7, respond instantly regardless of timezone. 💡 Pro Tips Train Your AI**: After 50+ applications, refine prompts based on false positives/negatives A/B Test Thresholds**: Experiment with score cutoffs to optimize for your needs Build Talent Pipeline**: Keep "maybe" candidates in CRM for future roles Track Source Effectiveness**: Add UTM parameters to measure which job boards deliver best candidates Continuous Improvement**: Weekly review of AI assessments to calibrate accuracy 🎓 Learning Resources This workflow demonstrates: AI Agents with structured output Multi-stage conditional routing Human-in-the-loop automation Binary data processing (resume files) Email automation with HTML templates Real-time notifications Analytics and data logging Perfect for learning advanced n8n automation patterns! Ready to transform your hiring process? Import this template and start screening candidates intelligently in under 30 minutes. Questions or customization needs? The workflow includes detailed sticky notes explaining each section.
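The routing thresholds above (85-100 strong yes, 60-84 manager review, below 60 rejection) translate into a simple branch. The sketch below assumes a candidate object shaped like the AI node's structured output; the exact schema in the template may differ.

```javascript
// Sketch of the routing logic described above, using the template's stated thresholds.
// The candidate object shape is an assumption about the AI node's structured output.
function routeCandidate(assessment) {
  const { score } = assessment;
  if (score >= 85) return 'strong_yes';     // Slack alert + interview invitation
  if (score >= 60) return 'manager_review'; // human-in-the-loop approval
  return 'reject';                          // polite rejection email
}

const example = { name: 'Jane Doe', score: 88, skillsMatch: 0.9, redFlags: [] };
console.log(routeCandidate(example)); // 'strong_yes'
```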
by Rapiwa
Who Is This For? This n8n workflow enables automated cross-selling by identifying each WooCommerce customer's most frequently purchased product, finding a related product to recommend, and sending a personalized WhatsApp message using the Rapiwa API. It also verifies whether the user's number is WhatsApp-enabled before sending, and logs both successful and unsuccessful attempts to Google Sheets for tracking. What This Workflow Does Retrieves all paying customers from your WooCommerce store Identifies each customer's most purchased product Finds the latest product in the same category as their most purchased item Cleans and verifies customer phone numbers for WhatsApp compatibility Sends personalized WhatsApp messages with product recommendations Logs all activities to Google Sheets for tracking and analysis Handles both verified and unverified numbers appropriately Key Features Customer Segmentation:** Automatically identifies paying customers from your WooCommerce store Product Analysis:** Determines each customer's most purchased product Smart Recommendations:** Finds the latest products in the same category as customer favorites WhatsApp Integration:** Uses Rapiwa API for message delivery Phone Number Validation:** Verifies WhatsApp numbers before sending messages Dual Logging System:** Tracks both successful and failed message attempts in Google Sheets Rate Limiting:** Uses batching and wait nodes to prevent API overload Personalized Messaging:** Includes customer name and product details in messages Requirements WooCommerce store with API access Rapiwa account with API access for WhatsApp verification and messaging Google account with Sheets access Customer phone numbers in WooCommerce (stored in billing.phone field) How to Use — Step-by-Step Setup 1. Credentials Setup WooCommerce API: Configure WooCommerce API credentials in n8n (e.g., "WooCommerce (get customer)" and "WooCommerce (get customer data)") Rapiwa Bearer Auth: Create an HTTP Bearer credential with your Rapiwa API token Google Sheets OAuth2: Set up OAuth2 credentials for Google Sheets access 2. Configure Google Sheets Ensure your sheet has the required columns as specified in the Google Sheet Column Structure section 3. Verify Code Nodes Code (get paying_customer): Filters customers to include only those who have made purchases Get most buy product id & Clear Number: Identifies the most purchased product and cleans phone numbers 4. 
Configure HTTP Request Nodes
- Get customer data: Verify the WooCommerce API endpoint for retrieving customer orders
- Get specific product data: Verify the WooCommerce API endpoint for product details
- Get specific product recommend latest product: Verify the WooCommerce API endpoint for finding the latest products by category
- Check valid WhatsApp number Using Rapiwa: Verify the Rapiwa endpoint for WhatsApp number validation
- Rapiwa Sender: Verify the Rapiwa endpoint for sending messages

Google Sheet Required Columns
You'll need two Google Sheets (or two tabs in one spreadsheet), formatted like this ➤ sample. The workflow uses them to log message delivery, and both must have the following headers (match exactly):

| name | number | email | address1 | price | suk | title | product link | validity | staus |
|------|--------|-------|----------|-------|-----|-------|--------------|----------|-------|
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://your_shop_domain/p-img-nike | verified | sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://your_shop_domain/p-img-nike | unverified | not sent |

Important Notes
- **Phone Number Format:** The workflow cleans phone numbers by removing all non-digit characters (see the sketch below). Ensure your WooCommerce phone numbers are in a compatible format.
- **API Rate Limits:** Rapiwa and WooCommerce APIs have rate limits. Adjust batch sizes and wait times accordingly.
- **Data Privacy:** Ensure compliance with data protection regulations when sending marketing messages.
- **Error Handling:** The workflow logs unverified numbers but doesn't have extensive error handling. Consider adding error notifications for failed API calls.
- **Product Availability:** The workflow recommends the latest product in a category but doesn't check whether it is in stock. Consider adding stock-status verification.
- **Testing:** Always test with a small batch before running the workflow on your entire customer list.

Useful Links
- **Dashboard:** https://app.rapiwa.com
- **Official Website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

Support & Help
- **WhatsApp:** Chat on WhatsApp
- **Discord:** SpaGreen Community
- **Facebook Group:** SpaGreen Support
- **Website:** https://spagreen.net
- **Developer Portfolio:** Codecanyon SpaGreen
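The "Clear Number" step is documented as stripping all non-digit characters. A minimal sketch of that cleaning logic follows; the optional default-country-code handling is an assumption added for illustration, not part of the template.

```javascript
// Sketch of the phone-number cleaning step ("Clear Number") described above.
// The template strips non-digit characters; the default-country-code fallback
// below is a hypothetical addition, not the template's documented behaviour.
function cleanPhoneNumber(raw, defaultCountryCode = '880') {
  const digits = String(raw).replace(/\D/g, ''); // remove every non-digit character
  return digits.startsWith(defaultCountryCode)
    ? digits
    : defaultCountryCode + digits.replace(/^0+/, '');
}

console.log(cleanPhoneNumber('+880 1322-827799')); // '8801322827799'
console.log(cleanPhoneNumber('01322827799'));      // '8801322827799'
```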
by Intuz
This n8n template from Intuz provides a complete and automated solution for creating and distributing sophisticated release notes. It connects to GitHub and JIRA to gather data from recent commits and completed tickets, using specific keywords or labels to identify key features for inclusion. This information is then processed by Google Gemini to automatically generate well-written, human-like release notes, which are distributed via email to stakeholders, creating a complete, end-to-end communication pipeline for every new software release. This template is perfect for development teams looking to streamline their release process, ensure consistent communication, and eliminate the manual effort of writing release notes.

How to use
1. Set up credentials: GitHub, JIRA (Software Cloud API), Google Gemini (or another PaLM/LLM provider), and your SMTP email server.
2. Configure the GitHub Trigger: Select the GitHub Trigger node. In the Repository Owner field, enter your GitHub username or organization name. In the Repository Name field, select the repository you want to monitor.
3. Verify the JIRA integration: **Important:** This workflow assumes your commit messages contain a JIRA key (e.g., "PROJ-123: Fix login bug"). Select the first Code node; it uses a regular expression like /([A-Z]+-\d+)/i to find JIRA keys (see the sketch below). Adjust this expression if your team uses a different format. Then select the Get an issue node and ensure your JIRA credentials are correctly configured.
4. Customize the AI prompt: Select the Basic LLM Chain node. You can edit the prompt to change the tone, style, or structure of the generated HTML release note to match your company's standards.
5. Configure email notifications: Select the Send email node. Update the To Email field with the recipient's email address (e.g., a team distribution list or a stakeholder's email). Customize the From Email and Subject line as needed.
6. Activate the workflow: Save your changes and activate the workflow. Now, every push to your configured repository will trigger the automated generation and sending of release notes.

Required Tools
- GitHub: To trigger the workflow on code pushes.
- JIRA: To fetch details about the tasks and bugs included in the release.
- Google Gemini: To intelligently generate the release-note content. (You can swap this for another LLM supported by n8n.)
- SMTP Provider: To send the final release note via email.

Connect with us
Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz
For custom workflow automation, click here – Get Started
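The JIRA-key extraction mentioned in step 3 can be sketched in a few lines. The pattern mirrors the one referenced above; adjust it if your team uses a different key format.

```javascript
// Sketch of the JIRA-key extraction performed by the first Code node.
// The pattern mirrors the one referenced in the setup notes above.
function extractJiraKeys(commitMessages) {
  const keyPattern = /([A-Z]+-\d+)/gi;
  const keys = new Set();
  for (const message of commitMessages) {
    for (const match of message.matchAll(keyPattern)) {
      keys.add(match[1].toUpperCase());
    }
  }
  return [...keys];
}

const commits = ['PROJ-123: Fix login bug', 'proj-124 / PROJ-123: follow-up'];
console.log(extractJiraKeys(commits)); // ['PROJ-123', 'PROJ-124']
```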
by Dr. Firas
💥 Generate UGC Promo Videos with Blotato and Sora 2 for eCommerce 🧩 Who is this for? This workflow is perfect for eCommerce brands, content creators, and marketing teams who want to automatically generate short, eye-catching videos from their product images — without editing software or manual work. 🚀 What problem does this workflow solve? Creating engaging promotional videos manually can be time-consuming and expensive. This automation eliminates that friction by combining Blotato, Sora 2, and AI scripting to turn static product images into dynamic UGC-style videos ready for TikTok, Instagram Reels, and YouTube Shorts. ⚙️ What this workflow does This workflow: Receives a product image directly from Telegram or another input source. Analyzes the image with OpenAI Vision to understand the product’s features and audience. Generates a natural, short UGC-style script using GPT-based AI. Sends the image and script to Sora 2 via the Fal API to generate a vertical promotional video. Monitors the video status every 15 seconds until completion. Downloads or automatically publishes the final video to your social platforms. 🧠 Setup Create a Fal.ai API key and set it in your n8n credentials (Authorization: Key YOUR_FAL_KEY). Connect your Telegram, OpenAI, and HTTP Request nodes as shown in the workflow. Make sure the Build Public Image URL node outputs a valid, public image link. In the HTTP Request node for Sora 2, set: Method: POST URL: https://fal.run/fal-ai/sora-2/image-to-video Headers: Authorization: Key YOUR_FAL_KEY Content-Type: application/json Body: Raw JSON with parameters like prompt, image_url, duration, and aspect_ratio. Run the workflow and monitor the execution logs for your video URL. Blotato → API key for social media publishing 🎨 How to customize this workflow to your needs 🧾 Change the video tone: Edit the OpenAI prompt to produce educational, emotional, or luxury-style scripts. 🎬 Adjust duration or format: Use Sora 2’s supported durations (4, 8, or 12 seconds) and aspect ratios (e.g., 9:16 for social media). 📲 Auto-publish your videos: Connect the TikTok, Instagram, or YouTube upload nodes for full automation. ✨ Add branding: Include overlays, logos, or end screens via CapCut or an external API integration. 🎥 Watch This Tutorial 👋 Need help or want to customize this? 📩 Contact: LinkedIn 📺 YouTube: @DRFIRASS 🚀 Workshops: Mes Ateliers n8n 📄 Documentation: Notion Guide Need help customizing? Contact me for consulting and support : Linkedin / Youtube / 🚀 Mes Ateliers n8n
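The Sora 2 call described in the setup section boils down to a single HTTP request. The sketch below shows one plausible shape for the Raw JSON body; the parameter names come from the notes above (prompt, image_url, duration, aspect_ratio), while the example values and exact accepted formats are assumptions that should be checked against the Fal API documentation.

```javascript
// Illustrative shape of the HTTP Request node's configuration for the Sora 2 call.
// Parameter names follow the setup notes; values are placeholders, not verified defaults.
const soraRequest = {
  method: 'POST',
  url: 'https://fal.run/fal-ai/sora-2/image-to-video',
  headers: {
    Authorization: 'Key YOUR_FAL_KEY', // from your Fal.ai credentials
    'Content-Type': 'application/json',
  },
  body: {
    prompt: 'A short, upbeat UGC-style pitch generated by the OpenAI node',
    image_url: 'https://example.com/public/product.jpg', // output of "Build Public Image URL"
    duration: 8,          // supported durations per the notes above: 4, 8, or 12 seconds
    aspect_ratio: '9:16', // vertical format for TikTok / Reels / Shorts
  },
};

console.log(JSON.stringify(soraRequest.body, null, 2));
```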
by Samir Saci
Tags: ESL, English Learning, Podcasts, RSS, AI Exercises, ElevenLabs

Context
Hi! I'm Samir, Supply Chain Engineer and Data Scientist based in Paris, and founder of the startup LogiGreen. I created this workflow for my mother, who is currently learning English, to turn the BBC 6 Minute English podcast into ready-to-use English lessons. Each lesson includes vocabulary, exercises and discussion questions, along with links to access the podcast content (audio and transcript).

> Use this assistant to automatically share English lessons from a renowned podcast.

📬 For business inquiries, you can find me on LinkedIn

Who is this template for?
This template is designed for:
- **ESL teachers** who want a fresh, structured lesson every week from real-life audio
- **Independent learners** who want a guided way to study English with podcasts
- **Language schools or content creators** who send regular English lessons by email

What does this workflow do?
This workflow acts as an AI-powered English lesson generator built from podcast episodes. It:
- Runs every Sunday at 20:00 using a Schedule Trigger and reads the BBC 6 Minute English RSS feed
- Checks a Data Table of archived episodes and filters out those already sent, using their guid (see the sketch below)
- Keeps the latest unsent episode and loads its web page content via HTTP
- Parses the HTML in a Code node to extract the episode description, full transcript and BBC vocabulary list
- Calls three AI nodes (OpenAI) to generate: a motivational email hook message, fill-in-the-blank vocabulary exercises, and discussion questions related to the topic
- Combines all vocabulary words and sends them to ElevenLabs to generate a slow-paced audio track for listening practice
- Builds a polished HTML email that includes: title, description, hook, vocabulary list, exercises, discussion questions and resource links
- Sends the final lesson by email via the Gmail node, with the vocabulary audio attached

For example, this is the latest email generated by the workflow:

P.S.: You can customise the footer to your school or company identity.

🎥 Tutorial
I advise you to check the tutorial on my YouTube channel for the details on how to set up the nodes and customise the content.

Next Steps
Follow the sticky notes to set up all the nodes:
- Replace the Data Table reference with your own (storing at least guid, title, link, processed_date)
- Set up your OpenAI credentials in the three Model nodes
- Set up your ElevenLabs credentials and choose a voice in the audio node
- Configure your Gmail credentials and recipient email address in the Send Email node
- Adapt the RSS feed URL if you want to track another podcast or source
- Customise the HTML email (colours, logo, footer text) in the Prepare Email Code node
- Adjust the schedule (time or frequency) if you prefer another cadence

Submitted: 18 November 2025
Template designed with n8n version 1.116.2
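The guid-based de-duplication step above can be expressed in a few lines. The sketch assumes the Data Table rows expose a guid column and the RSS items a guid and pubDate field; the exact field names in the template may differ.

```javascript
// Sketch of the "skip episodes already sent" step described above.
// Assumes archived rows have a `guid` column and RSS items have `guid` and `pubDate`.
function pickLatestUnsent(rssItems, archivedRows) {
  const sentGuids = new Set(archivedRows.map((row) => row.guid));
  const unsent = rssItems.filter((item) => !sentGuids.has(item.guid));
  // Newest episode first; keep only one per run
  unsent.sort((a, b) => new Date(b.pubDate) - new Date(a.pubDate));
  return unsent[0] ?? null;
}

const feed = [
  { guid: 'ep-101', title: 'Space tourism', pubDate: '2025-11-13' },
  { guid: 'ep-100', title: 'Office small talk', pubDate: '2025-11-06' },
];
const archive = [{ guid: 'ep-100' }];
console.log(pickLatestUnsent(feed, archive)); // the 'Space tourism' episode
```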
by Yang
🛍️ Pick Best-Value Products from Any Website Using Dumpling AI, GPT-4o, and Google Sheets Who’s it for This workflow is for eCommerce researchers, affiliate marketers, and anyone who needs to compare product listings across sites like Amazon. It’s perfect for quickly identifying top product picks based on delivery speed, free shipping, and price. What it does Just submit a product listing URL. The workflow will crawl it using Dumpling AI, take screenshots of the pages, and pass them to GPT-4o to extract up to 3 best-value picks. It analyzes screenshots visually—no HTML scraping needed. Each result includes: product name price review count free delivery date (if available) How it works 📝 Receives a URL through a web form 🧠 Uses Dumpling AI to crawl the website 📸 Takes screenshots of each product listing 🔍 GPT-4o analyzes each image to pick top products 🔧 A code node parses and flattens the output 📊 Google Sheets stores the result 📧 Sends the spreadsheet link via email Requirements Dumpling AI token** OpenAI key** (GPT-4o) Google Sheet** with columns: product name, price, reviews no., free_delivery_date > You can customize the AI prompt to extract other visual insights (e.g., ratings, specs).
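The code node that "parses and flattens the output" is only named above; below is a minimal sketch of what it might do, assuming GPT-4o returns a JSON string with a picks array. The output keys match the Google Sheet columns listed in the requirements; the actual response schema in the template may differ.

```javascript
// Sketch of the parse-and-flatten code node described above.
// The AI response shape (a JSON string with a `picks` array) is an assumption.
function flattenPicks(aiResponse) {
  const parsed = typeof aiResponse === 'string' ? JSON.parse(aiResponse) : aiResponse;
  return (parsed.picks || []).slice(0, 3).map((pick) => ({
    'product name': pick.name,
    price: pick.price,
    'reviews no.': pick.reviewCount,
    free_delivery_date: pick.freeDeliveryDate ?? '',
  }));
}

const response =
  '{"picks":[{"name":"Wireless Mouse","price":"$24.99","reviewCount":1843,"freeDeliveryDate":"Fri, Jun 21"}]}';
console.log(flattenPicks(response));
```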
by Colton Randolph
This n8n workflow automatically scrapes TechCrunch articles, filters for AI-related content using OpenAI, and delivers curated summaries to your Slack channels. Perfect for individuals or teams who need to stay current on artificial intelligence developments without manually browsing tech news sites.

Who's it for
- AI product teams tracking industry developments and competitive moves
- Tech investors monitoring AI startup coverage and funding announcements
- Marketing teams following AI trends for content and positioning strategies
- Executives needing daily AI industry briefings without manual research overhead
- Development teams staying current on AI tools, frameworks, and breakthrough technologies

How it works
The workflow runs on a daily schedule, crawling a specified number of TechCrunch articles from the current year. Firecrawl extracts clean markdown content while bypassing anti-bot measures and handling JavaScript rendering automatically. Each article is analyzed by an AI research assistant that determines whether the content relates to artificial intelligence, machine learning, AI companies, or AI technology. Articles marked as "NOT_AI_RELATED" are filtered out automatically (see the sketch below). For AI-relevant articles, OpenAI generates focused 3-bullet-point summaries that capture key insights. These summaries are delivered to your specified Slack channel with the original TechCrunch article title and source link for deeper reading.

How to set up
1. Configure Firecrawl: Add your Firecrawl API key to the HTTP Request node
2. Set OpenAI credentials: Add your OpenAI API key to the AI Agent node
3. Connect Slack: Configure your Slack webhook URL and target channel
4. Adjust scheduling: Set your preferred trigger frequency (daily recommended)
5. Test the workflow: Run manually to verify article extraction and Slack delivery

Requirements
- **Firecrawl account** with API access for TechCrunch web scraping
- **OpenAI API key** for AI content analysis and summarization
- **Slack workspace** with webhook permissions for message delivery
- **n8n instance** (cloud or self-hosted) for workflow execution

How to customize the workflow
- Source expansion: Modify the HTTP node URL to target additional tech publications beyond TechCrunch, or adjust the article limit and date filtering for different coverage needs.
- AI focus refinement: Update the OpenAI prompt to focus on specific AI verticals like generative AI, robotics, or ML infrastructure. Add company names or technology terms to the relevance-filtering logic.
- Summary formats: Change from 3-bullet summaries to executive briefs, technical analyses, or competitive intelligence reports by modifying the OpenAI summarization prompt.
- Multi-channel delivery: Extend beyond Slack to email notifications, Microsoft Teams, or database storage for historical trend analysis and executive dashboards.
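The "NOT_AI_RELATED" filtering described above amounts to a simple predicate. The sketch below assumes each crawled article carries the AI assistant's verdict in an aiAssessment field; the actual field name in the workflow may differ.

```javascript
// Sketch of the relevance filter: drop articles the AI assistant marked NOT_AI_RELATED.
// The `aiAssessment` field name is an assumption for illustration.
function keepAiArticles(articles) {
  return articles.filter(
    (article) => article.aiAssessment && article.aiAssessment.trim() !== 'NOT_AI_RELATED'
  );
}

const crawled = [
  { title: 'Startup raises $40M for LLM agents', aiAssessment: 'AI_RELATED' },
  { title: 'New EV charging network announced', aiAssessment: 'NOT_AI_RELATED' },
];
console.log(keepAiArticles(crawled).map((a) => a.title)); // only the LLM-agent story remains
```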
by Onur
Automated B2B Lead Generation: Google Places, Scrape.do & AI Enrichment This workflow is a powerful, fully automated B2B lead generation engine. It starts by finding businesses on Google Maps based on your criteria (e.g., "dentists" in "Istanbul"), assigns a quality score to each, and then uses Scrape.do to reliably access their websites. Finally, it leverages an AI agent to extract valuable contact information like emails and social media profiles. The final, enriched data is then neatly organized and saved directly into a Google Sheet. This template is built for reliability, using Scrape.do to handle the complexities of web scraping, ensuring you can consistently gather data without getting blocked. 🚀 What does this workflow do? Automatically finds businesses using the Google Places API based on a category and location you define. Calculates a leadScore for each business based on its rating, website presence, and operational status to prioritize high-quality leads. Filters out low-quality leads** to ensure you only focus on the most promising prospects. Reliably scrapes the website of each high-quality lead using Scrape.do to bypass common blocking issues and retrieve the raw HTML. Uses an AI Agent (OpenAI) to intelligently parse the website's HTML and extract hard-to-find contact details (emails, social media links, phone numbers). Saves all enriched lead data** to a Google Sheet, creating a clean, actionable list for your sales or marketing team. Runs on a schedule**, continuously finding new leads without any manual effort. 🎯 Who is this for? Sales & Business Development Teams:** Automate prospecting and build targeted lead lists. Marketing Agencies:** Generate leads for clients in specific industries and locations. Freelancers & Consultants:** Quickly find potential clients for your services. Startups & Small Businesses:** Build a customer database without spending hours on manual research. ✨ Benefits Full Automation:** Set it up once and let it run on a schedule to continuously fill your pipeline. AI-Powered Enrichment:** Go beyond basic business info. Get actual emails and social profiles that aren't available on Google Maps. Reliable Website Access:* Leverages *Scrape.do** to handle proxies and prevent IP blocks, ensuring consistent data gathering from target websites. High-Quality Leads:** The built-in scoring and filtering system ensures you don't waste time on irrelevant or incomplete listings. Centralized Database:** All your leads are automatically organized in a single, easy-to-access Google Sheet. ⚙️ How it Works Schedule Trigger: The workflow starts automatically at your chosen interval (e.g., daily). Set Parameters: You define the business type (searchCategory) and location (locationName) in a central Set node. Find Businesses: It calls the Google Places API to get a list of businesses matching your criteria. Score & Filter: A custom Function node scores each lead. An IF node then separates high-quality leads from low-quality ones. Loop & Enrich: The workflow processes each high-quality lead one by one. It uses a scraping service (Scrape.do) to reliably fetch the lead's website content. An AI Agent (OpenAI) analyzes the website's footer to find contact and social media links. Save Data: The final, enriched lead information is appended as a new row in your Google Sheet. 
📋 n8n Nodes Used
- Schedule Trigger
- Set
- HTTP Request (for Google Places & Scrape.do)
- Function
- If
- Split in Batches (Loop Over Items)
- HTML
- Langchain Agent (with OpenAI Chat Model & Structured Output Parser)
- Google Sheets

🔑 Prerequisites
- An active n8n instance.
- **Google Cloud Project** with the **Places API** enabled.
- **Google Places API Key**, stored in n8n's Header Auth credentials.
- **A Scrape.do Account and API Token**. This is essential for reliably scraping websites without your n8n server's IP getting blocked.
- **OpenAI Account & API Key** for the AI-powered data extraction.
- **Google Account** with access to Google Sheets.
- **Google Sheets API Credentials (OAuth2)** configured in n8n.
- **A Google Sheet** prepared with columns to store the lead data (e.g., BusinessName, Address, Phone, Website, Email, Facebook, etc.).

🛠️ Setup
1. Import the workflow into your n8n instance.
2. Configure credentials. Create and/or select your credentials for:
   - Google Places API: In the "2. Find Businesses (Google Places)" node, select your Header Auth credential containing your API key.
   - Scrape.do: In the "6a. Scrape Website HTML" node, configure credentials for your Scrape.do account.
   - OpenAI: In the OpenAI Chat Model node, select your OpenAI credentials.
   - Google Sheets: In the "7. Save to Google Sheets" node, select your Google Sheets OAuth2 credentials.
3. Define your search: In the "1. Set Search Parameters" node, update the searchCategory and locationName values to match your target market.
4. Link your Google Sheet: In the "7. Save to Google Sheets" node, select your Spreadsheet and Sheet Name from the dropdown lists, and map the incoming data to the correct columns in your sheet.
5. Set your schedule: Adjust the Schedule Trigger to run as often as you like (e.g., once a day).
6. Activate the workflow! Your automated lead generation will begin on the next scheduled run.
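The leadScore calculation is described above only in terms of its inputs (rating, website presence, operational status). The sketch below is one illustrative way a Function node could combine them; the weights and the implied quality threshold are assumptions, not the template's actual formula.

```javascript
// Illustrative sketch of the lead-scoring Function node described above.
// Weights are assumptions for demonstration; the template's real formula may differ.
function leadScore(place) {
  let score = 0;
  if (place.rating) score += place.rating * 10;                      // up to 50 for a 5.0 rating
  if (place.userRatingsTotal) score += Math.min(place.userRatingsTotal, 100) * 0.2;
  if (place.website) score += 30;                                    // a website is needed for enrichment
  if (place.businessStatus === 'OPERATIONAL') score += 20;
  return Math.round(score);
}

const lead = {
  name: 'Smile Dental',
  rating: 4.6,
  userRatingsTotal: 210,
  website: 'https://example-dentist.com', // hypothetical lead data
  businessStatus: 'OPERATIONAL',
};
console.log(leadScore(lead)); // e.g. 116; high scores pass the If node's quality filter
```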
by Gilbert Onyebuchi
Automatically turn your Google Calendar into a fully-automated notification system with email alerts, SMS reminders, and a live performance dashboard - all powered by n8n. This automation helps you never miss an event, while giving you clear visibility into what notifications were sent, when, and how reliably they ran. What This Automation Does This solution is built as 4 connected workflows that run on a schedule and work together: 1. Daily Email Summary (Morning) Every morning, the workflow: Reads today’s events from Google Calendar Formats them into a clean email Sends a daily schedule summary via Mailchimp or SendGrid 2. Daily SMS Summary Shortly after, it: Sends a concise SMS overview of today’s meetings using Twilio 3. 15-Minute Event Reminders Before each event: Sends an individual SMS reminder Skips all-day events automatically 4. Weekly Schedule Preview Every Sunday: Sends a week-ahead summary so you can plan in advance Live Reporting & Dashboard All workflow activity is logged automatically into Google Sheets, which powers a real-time analytics dashboard showing: Number of notifications sent Success vs failure rates Daily and weekly execution stats Visual charts powered by Chart.js No manual tracking needed, everything updates automatically. How the Workflow Is Structured The automation is grouped into 3 clear sections: Section 1: Calendar Data Collection Pulls events from Google Calendar Filters relevant meetings Prepares clean event data Section 2: Notifications & Messaging Formats emails and SMS messages Sends reminders and summaries Handles scheduling logic Section 3: Logging & Reporting Saves every execution to Google Sheets Updates daily stats automatically Feeds the live dashboard SUPPORT & FEEDBACK Questions or issues? Connect with me on LinkedIn Want to see it in action? Try the live report demo: Click here
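The 15-minute reminder step skips all-day events, as noted above. A minimal sketch of that filter is shown below; it relies on the fact that Google Calendar represents all-day events with a start.date rather than a start.dateTime, and uses the 15-minute window from the description.

```javascript
// Sketch of the 15-minute reminder filter described above.
// All-day events expose start.date (no start.dateTime) and are skipped.
function eventsNeedingReminder(events, now = new Date(), windowMinutes = 15) {
  return events.filter((event) => {
    if (!event.start || !event.start.dateTime) return false; // skip all-day events
    const minutesUntilStart = (new Date(event.start.dateTime) - now) / 60000;
    return minutesUntilStart > 0 && minutesUntilStart <= windowMinutes;
  });
}

const events = [
  { summary: 'Team stand-up', start: { dateTime: '2025-01-06T09:10:00Z' } },
  { summary: 'Public holiday', start: { date: '2025-01-06' } }, // all-day, skipped
];
console.log(eventsNeedingReminder(events, new Date('2025-01-06T09:00:00Z')));
```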