by PollupAI
This workflow automates the prioritization and escalation of customer support tickets. It acts as an intelligent triage system that identifies high-value customers and potential churn risks in HubSpot, syncs them to Jira, and enforces response times via Slack alerts.

Who's it for

This workflow is ideal for Customer Success (CS) teams, Support Leads, and Account Managers who need to ensure VIP clients and critical issues receive immediate engineering or support attention without manual monitoring.

How it works

The workflow runs on a schedule to process new tickets:

Monitor: Checks HubSpot every 10 minutes for newly created tickets.
Enrich: Retrieves the associated Contact's data, specifically looking for Annual Revenue and Lifecycle Stage.
Analyze: A Code node evaluates the ticket content and customer value. It assigns a "Severity" level (Critical/High/Normal) based on revenue thresholds (>10k) or churn-risk keywords (e.g., "refund," "lawyer," "cancel").
Action: Creates a formatted Jira task with all context included and notifies a Slack channel.
SLA Check: Waits 15 minutes to allow for a response.
Escalate: If the Jira ticket status hasn't changed to "In Progress" or "Done" after the wait period, it triggers a high-priority "Churn Risk Escalation" alert in Slack.

Requirements

**HubSpot** account (CRM and Service Hub).
**Jira Software Cloud** account.
**Slack** workspace.

How to set up

Configure your credentials for HubSpot, Jira Software, and Slack.
In the HubSpot: Get Associations and Get Contact Data nodes, ensure the properties match your internal naming conventions.
In the Jira: Create Triage Ticket node, select your specific Project and Issue Type from the dropdown lists.
In the Slack nodes, select the channel where you want alerts to be posted.

How to customize the workflow

**Integrate other tools:** This system is modular and works with any other tool (contact us for help).
You can easily replace the nodes to use your specific stack:

CRM: Pipedrive, WeClapp
Ticketing System: Zendesk, Intercom, FreshDesk

**Modify Logic:** Open the **Code: Calculate Severity** node to change the revenue threshold (currently set to 10,000). You can also replace the manual keyword matching with an LLM (AI) node to intelligently analyze ticket sentiment and intent.
**Adjust SLA:** Change the duration in the **Wait: Response Timer** node if your Service Level Agreement (SLA) differs from the default 15 minutes.
**Change Status Check:** Update the **Check: Escalation Needed?** node if your team uses different Jira statuses (e.g., "Under Review" instead of "In Progress") to determine if a ticket is being handled.
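As a hedged sketch of what the **Code: Calculate Severity** node might contain (field names like `annualrevenue` and `subject` are assumptions here — match them to your actual HubSpot properties):

```javascript
// Hypothetical severity logic for the Code node. The keyword list and
// revenue threshold mirror the defaults described above.
const CHURN_KEYWORDS = ["refund", "lawyer", "cancel"];
const REVENUE_THRESHOLD = 10000;

function calculateSeverity(ticket, contact) {
  // Combine ticket text fields and normalize case for keyword matching.
  const text = `${ticket.subject || ""} ${ticket.content || ""}`.toLowerCase();
  const hasChurnKeyword = CHURN_KEYWORDS.some((kw) => text.includes(kw));
  const revenue = Number(contact.annualrevenue) || 0;

  if (hasChurnKeyword && revenue > REVENUE_THRESHOLD) return "Critical";
  if (hasChurnKeyword || revenue > REVENUE_THRESHOLD) return "High";
  return "Normal";
}
```

Raising `REVENUE_THRESHOLD` or editing `CHURN_KEYWORDS` is all that is needed to retune the triage rules.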
by Alexandru Burca
Daily News Digest Video Generator for YouTube Shorts

Overview

This workflow automatically creates and publishes daily news digest videos from WordPress articles to YouTube. It runs every evening at 7 PM, compiling the day's top stories from a news portal into a professionally formatted vertical video (1080x1920px) optimized for social media platforms like YouTube Shorts.

What It Does

1. Scheduled Trigger
Runs automatically every day at 19:00 (7 PM)

2. Fetches Today's Articles
Retrieves all published WordPress posts from the current day

3. Validates Content
Ensures there are at least 3 articles before proceeding

4. Video Detection
Scans article content HTML for embedded videos
Extracts MP4 URLs from WordPress video players
Parses wp-playlist-script JSON data
Falls back to `<video>` and `<source>` tag detection

5. Data Processing
**Extracts** article titles, links, and featured media IDs
**Decodes HTML entities**: Converts `&#8211;` to –, `&quot;` to ", etc.
**Fetches featured images** from the WordPress Media API
**Assigns default images** for articles without featured media
**Calculates reading time** per article (3-7 seconds based on word count)
**Cleans text**: Removes HTML tags and normalizes whitespace

6. Video Generation (via Shotstack API)

Intro Slide (3 seconds)
Black background
Large logo (centered)
Title in the center
Current date in DD-MM-YYYY format

News Slides (3-7 seconds each)
Each article is displayed with:
**Background**: Video (if available) or featured image, cropped to fit
**Dark overlay**: 40% opacity black layer for text readability
**Article headline**: Large white text at top
**Small logo**: Top-right corner
**Pagination counter**: Bottom-right white badge (e.g., "1 / 22")
**CTA button**: Centered CTA
**Background music**: Subtle looped audio track
**Transitions**: Smooth fade in/out between slides

Outro Slide (3 seconds)
Identical to intro slide
Provides a clean ending to the video
7. Processing Wait
Waits 30 seconds for Shotstack to render the video
Polls the Shotstack API to verify video completion

8. Download Video
Retrieves the finished MP4 file from Shotstack
Downloads video data for YouTube upload

9. YouTube Upload
Automatically uploads to YouTube with:
**Title**: "Daily Digest - [Day] [Weekday], [Year]"
**Description**: Same as title
**Category**: News & Politics
**Made for kids**: Yes
**Tags**: dailydigest

Key Features

Intelligent Content Handling
- Automatic video/image detection and intelligent media selection
- Dynamic reading time calculation for optimal viewer engagement
- HTML entity cleaning for proper text display (WordPress compatibility)
- Fallback default images for articles without media
- Video background support with automatic muting

Professional Video Production
- Vertical format optimized for mobile viewing (1080x1920px)
- Professional branding with logos and consistent styling
- Smooth fade transitions between slides
- Background music with looping support
- Dynamic pagination counters
- Call-to-action buttons for engagement

Customization
- Centralized variables for easy branding updates
- Configurable logos, colors, and text
- Adjustable reading time calculation
- Flexible date formatting
- Customizable audio track

Use Cases

Perfect for:
- News websites wanting to repurpose daily articles
- Media outlets creating social media content
- Content creators automating video production
- Publishers maximizing content distribution
- Marketing teams driving traffic from social platforms

Customization Options

Easy Changes
- Update logos by changing the logo_big and logo_small URLs
- Modify branding colors via the button_bg_color variable
- Adjust button text with the button_text variable
- Change the video title with the daily_digest_text variable
- Update background music by replacing the audio URL

Advanced Customization
- Adjust the reading time formula in the calculateReadingTime() function
- Modify the date format in the getRomanianDate() function
- Change video dimensions
(currently 1080x1920)
- Update font family and sizes
- Adjust overlay opacity and colors
- Modify transition effects

Prerequisites

Required Credentials
- WordPress API - access to your WordPress site
- Shotstack API - API key for video rendering (Stage environment)
- YouTube OAuth2 - authenticated YouTube account for uploads
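A hedged sketch of a reading-time calculation like the workflow's calculateReadingTime() — the template's exact formula may differ. It assumes roughly three words read per second, clamped to the 3-7 second range mentioned above:

```javascript
// Illustrative reading-time helper: estimate seconds from word count,
// then clamp to the slide-duration window used by the video generator.
function calculateReadingTime(title) {
  const words = title.trim().split(/\s+/).filter(Boolean).length;
  const seconds = Math.ceil(words / 3); // assumed ~3 words per second
  return Math.min(7, Math.max(3, seconds)); // clamp to 3-7 seconds
}
```

Tuning the words-per-second divisor or the clamp bounds changes how long each news slide stays on screen.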
by Salman Mehboob
Stop writing blog posts manually. This workflow monitors Google News every 12 hours on any topic you choose, automatically selects the most relevant article, scrapes the source content, generates a 100% original SEO-optimized blog post using Claude AI, creates a featured image with Google Gemini, and publishes the complete post to WordPress with RankMath meta title and meta description. Fully automated, zero manual work.

Perfect for bloggers, content marketers, digital agencies, WordPress site owners, and anyone who wants to keep their website updated with fresh, AI-written content without spending hours writing or publishing.

What problem does this solve?

Publishing fresh, well-written blog content consistently is one of the hardest parts of running a website. Finding relevant news, writing original articles, creating images, and publishing everything to WordPress takes hours every single time. This workflow eliminates all of that. It runs on a schedule, picks the best news story on your topic, writes a completely original article from scratch, and publishes it to your WordPress site, including featured image, meta title, and meta description, while you focus on everything else.

How it works

Scheduled trigger runs every 12 hours automatically
No manual action required. The workflow runs on its own twice a day.

Fetch latest news via SerpAPI Google News
SerpAPI fetches the freshest articles from Google News based on your search query. The default query is "seo" but you can change it to any topic: tech, finance, health, marketing, AI, real estate, or anything your blog covers. One parameter change is all it takes.

Check already-published articles via Google Sheets
A Google Sheets node reads all previously published news links from your tracking sheet. This ensures the same article is never processed twice, even if it keeps appearing in Google News days later.
Remove duplicates with the Merge node
The Merge node compares incoming articles against your published history and removes any matches. Only genuinely new, unprocessed articles move forward.

Sort by date and limit to top 10
Remaining articles are sorted by publish date, newest first. A Limit node keeps only the top 10 most recent items for evaluation.

AI agent selects the single best article
All 10 article titles and links are combined into one item and sent to an AI Agent powered by Claude (via OpenRouter). The agent reads every title and picks the one most relevant and valuable for your website's audience, filtering out off-topic, low-quality, or irrelevant results. It returns the winning article's title and link.

Scrape the full source article
An HTTP Request node fetches the complete HTML of the selected article. An HTML Extract node pulls only the meaningful content (headings h1, h2, h3 and paragraphs p), stripping out ads, navigation menus, and everything else. An Aggregate node joins all extracted text into one clean block ready for the AI.

Claude generates a 100% original article
A second AI Agent (Claude via OpenRouter) receives the scraped content as research material only. It does not rewrite or paraphrase the source. It writes a completely new, original, SEO-optimized article based on the topic and ideas, with its own structure, wording, and insights. The output includes:
- Article title
- Meta title (under 60 characters)
- Meta description (under 160 characters)
- Full article body in Markdown format
- A featured image generation prompt

Google Gemini generates the featured image
The image prompt produced by Claude is sent to Google Gemini's image generation model. The image is created in 16:9 ratio, suitable as a WordPress featured image.

Upload image to WordPress Media Library
An HTTP Request node uploads the generated image binary directly to your WordPress site via the REST API. WordPress returns an image ID, which is stored for the next step.
Convert Markdown to HTML
The article content is in Markdown format. A Markdown node converts it to clean HTML before publishing.

Arrange all data in one place
A Set node collects all required fields (title, HTML content, image ID, meta title, meta description, and original news source link) into one organised item.

Create the WordPress post
The WordPress node creates the post with title, content, author, and category. You can set the post to draft for review or publish directly. The category ID is configurable to match your site structure.

Attach featured image and RankMath SEO meta
An HTTP PUT request updates the post to attach the featured image using the stored image ID, and writes the RankMath SEO title and meta description using registered REST API meta fields.

Log to Google Sheets
The original news link and the published WordPress post URL are saved back to your Google Sheets tracking file. This is what prevents the same article from ever being processed again on future runs.

What you need

**SerpAPI** - for Google News fetching
**OpenRouter** - to access Claude (anthropic/claude-sonnet-4.5)
**Google Gemini (PaLM API)** - for featured image generation
**WordPress** - with Application Password authentication
**Google Sheets** - with two columns: news_link and wp_post_link
**RankMath SEO plugin** on WordPress (or adapt for Yoast - see note below)

One-time WordPress setup for RankMath meta fields

Add this PHP snippet via the Code Snippets plugin or your theme's functions.php to enable writing the RankMath SEO title and description through the REST API:

```php
add_action('rest_api_init', function () {
    register_post_meta('post', 'rank_math_title', [
        'show_in_rest'  => true,
        'single'        => true,
        'type'          => 'string',
        'auth_callback' => fn() => current_user_can('edit_posts'),
    ]);
    register_post_meta('post', 'rank_math_description', [
        'show_in_rest'  => true,
        'single'        => true,
        'type'          => 'string',
        'auth_callback' => fn() => current_user_can('edit_posts'),
    ]);
});
```

Using Yoast SEO instead?
Replace the meta keys with _yoast_wpseo_title and _yoast_wpseo_metadesc in the last HTTP node.

How to customise it for your topic

Change the q parameter in the SerpAPI node to any keyword: digital marketing, cryptocurrency, AI tools, content marketing, web design, or anything else. The entire workflow adapts automatically. The AI agent will select the most relevant article for your niche and write accordingly.

For assistance and support: salmanmehboob1947@gmail.com
LinkedIn: https://www.linkedin.com/in/salman-mehboob-pro/
by n8n Team
This workflow creates or updates a Mautic contact when a new event is scheduled in Calendly. The first name and email address are the only two fields that get updated.

Prerequisites

Calendly account and Calendly credentials.
Mautic account and Mautic credentials.

How it works

Triggers on a new event in Calendly.
Creates a new contact in Mautic if the email address does not exist in Mautic.
Updates the contact's first name in Mautic if the email address exists in Mautic.
by vinci-king-01
How it works

This workflow automatically scrapes commercial real estate listings from LoopNet and sends opportunity alerts to Telegram while logging data to Google Sheets.

Key Steps

1. Scheduled Trigger - runs every 24 hours to collect fresh CRE market data
2. AI-Powered Scraping - uses ScrapeGraphAI to extract property information from LoopNet
3. Market Analysis - analyzes listings for opportunities and generates market insights
4. Smart Notifications - sends Telegram alerts only when investment opportunities are found
5. Data Logging - stores daily market metrics in Google Sheets for trend analysis

Set up steps

Setup time: 10-15 minutes

1. Configure ScrapeGraphAI credentials - add your ScrapeGraphAI API key for web scraping
2. Set up the Telegram connection - connect your Telegram bot and specify the target channel
3. Configure Google Sheets - set up the Google Sheets integration for data logging
4. Customize the LoopNet URL - update the URL to target specific CRE markets or property types
5. Adjust the schedule - modify the trigger timing based on your market monitoring needs

Keep detailed configuration notes in sticky notes inside your workflow.
by Jorge Martínez
Generate social posts from GitHub pushes to Twitter and LinkedIn

On each GitHub push, this workflow checks whether the commit set includes README.md and CHANGELOG.md, fetches both files, lets an LLM generate a Twitter and a LinkedIn post, then publishes to Twitter and LinkedIn (Person).

Apps & Nodes

**Trigger:** Webhook
**Logic:** IF, Merge, Aggregate
**GitHub:** Get Repository File (×2)
**Files:** Extract from File (text) (×2)
**AI:** OpenAI Chat Model → LLM Chain (+ Structured Output Parser)
**Publish:** Twitter, LinkedIn (Person)

Prerequisites

**GitHub:** OAuth2 or PAT with repo read.
**OpenAI:** API key.
**Twitter:** OAuth2 app with **Read and Write**; scopes tweet.read tweet.write users.read offline.access.
**LinkedIn (Person):** OAuth2 credentials; **required scopes:** w_member_social, openid.

Setup

GitHub Webhook: Repo → Settings → Webhooks
Payload URL: https://<your-n8n-domain>/webhook/github/push
Content type: application/json • Event: Push • Secret (optional) • Branches as needed.
Credentials: Connect GitHub, OpenAI, Twitter, and LinkedIn (Person).

How it Works

1. Webhook receives the GitHub push payload.
2. IF checks that README and CHANGELOG appear in added/modified.
3. GitHub (Get Repository File) pulls README.md and CHANGELOG.md.
4. Extract from File (text) converts both binaries to text.
5. Merge & Aggregate combine them into one item with both contents.
6. LLM (OpenAI + Parser) returns a JSON with twitter and linkedin.
7. Twitter posts the tweet.
8. LinkedIn (Person) posts the LinkedIn text.
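The IF check described above can be sketched as follows. The `commits[].added` / `commits[].modified` arrays follow the GitHub push-event payload shape; the helper name is illustrative, not the template's exact node expression:

```javascript
// Does this push touch both README.md and CHANGELOG.md?
// Collects every added or modified path across all commits in the push.
function touchesDocs(pushPayload) {
  const changed = new Set();
  for (const commit of pushPayload.commits || []) {
    for (const file of [...(commit.added || []), ...(commit.modified || [])]) {
      changed.add(file);
    }
  }
  return changed.has("README.md") && changed.has("CHANGELOG.md");
}
```

Only when this condition is true does the workflow proceed to fetch the two files and generate posts.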
by n8n Team
This workflow creates a new contact in Mautic when a new customer is created in Shopify. By default, the workflow will fill in the first name, last name, and email address. You can add any other fields you require.

Prerequisites

Shopify account and Shopify credentials.
Mautic account and Mautic credentials.

How it works

Triggers on a new customer in Shopify.
Sends the required data to Mautic to create a new contact.
by Cheng Siong Chin
Introduction

Generates complete scientific papers from a title and abstract using AI. Designed for researchers, it automates literature search, content generation, and citation formatting.

How It Works

Extracts input, searches academic databases (CrossRef, Semantic Scholar, OpenAlex), merges sources, processes citations, generates AI sections (Introduction, Literature Review, Methodology, Results, Discussion, Conclusion), and compiles the document.

Workflow Template

Webhook → Extract Data → Search (CrossRef + Semantic Scholar + OpenAlex) → Merge Sources → Process References → Prepare Context → AI Generate (Introduction + Literature Review + Methodology + Results + Discussion + Conclusion via OpenAI) → Merge Sections → Compile Document

Workflow Steps

Input & Search: Webhook receives title/abstract; searches CrossRef, Semantic Scholar, OpenAlex; merges and processes references.
AI Generation: OpenAI generates six sections with in-text citations using the retrieved references.
Assembly: Merges sections; compiles a formatted document with a reference list.

Setup Instructions

Trigger & APIs: Configure the webhook URL; add your OpenAI API key; customize prompts.
Databases: Set up CrossRef, Semantic Scholar, and OpenAlex API access; configure search parameters.

Prerequisites

OpenAI API, CrossRef API, Semantic Scholar API, OpenAlex API, webhook platform, n8n instance

Customization

Adjust reference limits, modify prompts for research fields, add citation styles (APA/IEEE), integrate databases (PubMed, arXiv), customize outputs (DOCX/LaTeX/PDF)

Benefits

Automates paper drafting, comprehensive literature integration, proper citations
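As a minimal sketch of the CrossRef leg of the search step — building the query URL from the paper's topic and pulling title/DOI pairs out of the response. The endpoint shape follows the public CrossRef REST API; the Semantic Scholar and OpenAlex legs would be analogous, and the exact field mapping in the template may differ:

```javascript
// Build a CrossRef /works search URL for a topic query.
function crossrefSearchUrl(query, rows = 5) {
  return `https://api.crossref.org/works?query=${encodeURIComponent(query)}&rows=${rows}`;
}

// Extract a simplified reference list from a CrossRef response body.
function extractReferences(crossrefResponse) {
  const items = ((crossrefResponse.message || {}).items || []);
  return items.map((item) => ({
    title: (item.title || [])[0] || "", // CrossRef titles arrive as arrays
    doi: item.DOI || "",
  }));
}
```

The processed references then feed the "Prepare Context" step so the AI sections can cite real sources.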
by Rahul Joshi
Description:

Automate your developer onboarding quality checks with this n8n workflow template. Whenever a new onboarding task is created in ClickUp, the workflow logs it to Google Sheets, evaluates its completeness using Azure OpenAI GPT-4o-mini, and alerts your team in Slack if critical details are missing.

Perfect for engineering managers, DevOps leads, and HR tech teams who want to maintain consistent onboarding quality and ensure every developer gets the tools, credentials, and environment setup they need, without manual review.

What This Template Does (Step-by-Step)

Step 1: Auto-Trigger on ClickUp Task Creation
Listens for new task creation events (taskCreated) in your ClickUp workspace to initiate the audit automatically.

Step 2: Log Task Details to Google Sheets
Records essential task data (task name, assignee, and description), creating a central audit trail for all onboarding activities.

Step 3: AI Completeness Analysis (GPT-4o-mini)
Uses Azure OpenAI GPT-4o-mini to evaluate each onboarding task for completeness across key areas:
- Tooling requirements
- Credential setup
- Environment configuration
- Instruction clarity

Outputs:
- Score (0-100)
- List of missing items
- Suggestions for improvement

Step 4: Apply Quality Gate
Checks whether the AI-generated completeness score is below 80. Incomplete tasks automatically move to the alert stage for review.

Step 5: Alert Team via Slack
Sends a structured Slack message summarizing the issue, including:
- Task name & assignee
- Completeness score
- Missing checklist items
- Recommended next actions

This ensures your team fixes incomplete onboarding items before they impact new hires.
Key Features

- AI-driven task completeness scoring
- Automatic task logging for audit visibility
- Smart quality gate (score threshold < 80)
- Instant Slack alerts for incomplete tasks
- End-to-end automation from ClickUp to Slack

Use Cases

- Audit onboarding checklists for new developers
- Standardize environment setup and credential handover
- Identify missing steps before onboarding deadlines
- Maintain onboarding consistency across teams

Required Integrations

- ClickUp API - to detect new onboarding tasks
- Google Sheets API - to store audit logs and history
- Azure OpenAI (GPT-4o-mini) - to evaluate completeness
- Slack API - to alert the team on incomplete entries

Why Use This Template?

- Ensures every new developer receives a full, ready-to-start setup
- Eliminates manual checklist verification
- Improves onboarding quality and compliance tracking
- Creates a transparent audit trail for continuous improvement
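The quality gate and alert steps can be sketched as below. Field names such as `score` and `missingItems` are illustrative assumptions about the AI's structured output, not the template's exact keys:

```javascript
// Gate: tasks scoring below the threshold go to the Slack alert branch.
const SCORE_THRESHOLD = 80;

function needsAlert(aiResult) {
  return aiResult.score < SCORE_THRESHOLD;
}

// Build the Slack message body for an incomplete onboarding task.
function buildAlert(task, aiResult) {
  return [
    `Onboarding task "${task.name}" (assignee: ${task.assignee}) is incomplete.`,
    `Completeness score: ${aiResult.score}/100`,
    `Missing: ${aiResult.missingItems.join(", ")}`,
  ].join("\n");
}
```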
by Influencers Club
How it works:

Get multi-platform social data for event attendees from their email addresses and send personalized emails to onboard them as organic creators or ambassadors.

A step-by-step workflow to enrich event attendees' emails from Eventbrite with multi-platform social profiles, analytics, and metrics (Instagram, TikTok, YouTube, Twitter, OnlyFans, Twitch, and more) using the influencers.club API, and to send personalized partnership outreach via SendGrid.

Set up:

- Eventbrite (can be swapped for any event CRM, general CRM, or a Google Sheet)
- Influencers.club
- SendGrid (can be swapped for any marketing email sender, e.g. Mailchimp or Drip, or a programmatic email sender like Mailgun)
by Rahul Joshi
Description

This workflow automates end-to-end AI-driven inventory intelligence, transforming Airtable stock data into optimized reorder recommendations, daily operational summaries, and instant Slack alerts. It fetches all inventory rows, validates their structure, computes reorder and safety-stock metrics using strict formulas, merges multi-batch AI output into a unified dataset, and distributes actionable insights across email and Slack. Invalid or corrupted Airtable rows are logged to Google Sheets for audit and cleanup. The workflow ensures deterministic inventory math (zero improvisation), strict JSON compliance, and reliable multi-channel reporting for operations teams.

What This Workflow Does (Step-by-Step)

Manual Trigger - Start Inventory Optimization
Runs the full optimization and reporting pipeline on demand.

Fetch Inventory Records from Airtable
Retrieves all SKU records (ID, ItemName, SKU, quantities, reorder levels) from the Airtable Inventory table.

Validate Inventory Record Structure (IF)
Ensures each record contains a valid id. Valid records are routed to AI optimization; invalid records are saved to Google Sheets.

Log Invalid Inventory Rows to Google Sheets
Captures malformed or incomplete Airtable items for audit checks and data hygiene.

Configure GPT-4o - Inventory Optimization Model
Defines the AI model for stock-level calculations using strict formulas:
- SuggestedReorderPoint = ReorderLevel × 1.2
- SuggestedSafetyStock = ReorderLevel × 0.5
- StockStatus logic:
  - Critical if QuantityInStock ≤ SuggestedSafetyStock
  - Needs Reorder if QuantityInStock ≤ SuggestedReorderPoint
  - OK otherwise

Generate Inventory Optimization Output (AI)
The AI engine analyzes each SKU and returns:
- Suggested reorder point
- Suggested safety stock
- Updated stock status
- Clean structured JSON for each item
All without markdown, hallucination, or additional logic.
Merge AI Optimization Results (Code)
Consolidates all partial AI responses into one complete JSON dataset containing all SKUs.

Configure GPT-4o - Email Summary Model
Prepares the AI model used for generating a professional operations-team email.

Generate Inventory Email Summary (AI)
Creates a manager-ready email including:
- High-level inventory health
- Detailed SKU summaries
- Alerts for low, reorder-level, or critical stock
- Recommended actions for today's operations

Email Inventory Summary to Manager (Gmail)
Sends the completed inventory summary to the operations manager.

Configure GPT-4o - Slack Summary Model
Sets up GPT-4o to produce a compact, emoji-supported Slack summary.

Generate Inventory Slack Summary (AI)
Builds a Slack-optimized message containing:
- A one-line inventory health summary
- A bullet list of SKUs with stock status
- Clear alerts for reorder-level or critical items
- One recommended action line

Notify Operations Team on Slack
Delivers the optimized Slack summary to the operations Slack user/channel for real-time visibility.

Prerequisites

- Airtable access token
- Azure OpenAI GPT-4o credentials
- Google Sheets OAuth
- Slack API credentials
- Gmail OAuth

Key Benefits

- AI-powered stock calculations with strict formulas
- Reliable reorder and safety-stock predictions
- Instant multi-channel reporting (email + Slack)
- Full audit logging for invalid data
- Zero hallucinations: pure structured JSON
- Faster decision-making for operations teams

Perfect For

- Operations & supply-chain teams
- Inventory managers
- Retail & e-commerce units
- Businesses using Airtable for stock tracking
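The strict formulas above are deterministic, so they can be expressed directly in code. A sketch (the template delegates the math to GPT-4o with these rules fixed in the prompt; field names follow the Airtable columns listed above):

```javascript
// Apply the fixed reorder/safety-stock rules to one SKU record.
function optimizeSku(sku) {
  const reorderPoint = sku.ReorderLevel * 1.2; // SuggestedReorderPoint
  const safetyStock = sku.ReorderLevel * 0.5;  // SuggestedSafetyStock

  let status = "OK";
  if (sku.QuantityInStock <= safetyStock) status = "Critical";
  else if (sku.QuantityInStock <= reorderPoint) status = "Needs Reorder";

  return {
    ...sku,
    SuggestedReorderPoint: reorderPoint,
    SuggestedSafetyStock: safetyStock,
    StockStatus: status,
  };
}
```

Because the rules are this simple, the same logic could also live in a plain Code node if you prefer to reserve the AI steps for the email and Slack summaries.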
by Ian Kerins
Overview

This n8n template automates the process of researching niche topics. It searches for a topic on Wikipedia, scrapes the relevant page using ScrapeOps, extracts the history or background section, and uses AI to generate a concise summary and timeline. The results are automatically saved to Google Sheets for easy content planning.

Who is this for?

**Content Creators**: Quickly gather background info for videos or articles.
**Marketers**: Research niche markets and product histories.
**Educators/Students**: Generate timelines and summaries for study topics.
**Researchers**: Automate the initial data gathering phase.

What problems it solves

**Time Consumption**: Manually reading and summarizing Wikipedia pages takes time.
**Blocking**: Scraping Wikipedia directly can sometimes lead to IP blocks; ScrapeOps handles this.
**Unstructured Data**: Raw HTML is hard to use; this workflow converts it into a clean, structured format (JSON/CSV).

How it works

1. Define Topic: You set a keyword in the workflow.
2. Locate Page: The workflow queries the Wikipedia API to find the correct page URL.
3. Smart Scraping: It uses the ScrapeOps Proxy API to fetch the page content reliably.
4. Extraction: A code node intelligently parses the HTML to find "History", "Origins", or "Background" sections.
5. AI Processing: GPT-4o-mini summarizes the text and extracts key dates for a timeline.
6. Storage: The structured data is appended to a Google Sheet.

Setup steps (~5-10 minutes)

1. ScrapeOps Account: Register for a free API key at ScrapeOps. Configure the ScrapeOps Scraper node with your API key.
2. OpenAI Account: Add your OpenAI credentials to the Message a model node.
3. Google Sheets: Create a Google Sheet. You can duplicate this Template Sheet (copy the headers). Connect your Google account to the Append row in sheet node and select your new sheet.

Pre-conditions

- An active ScrapeOps account.
- An OpenAI API key (or another LLM credential).
- A Google account for Sheets access.
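Two pieces of the pipeline above can be sketched briefly: the Wikipedia API query used in the "Locate Page" step, and the section-heading match used by the extraction code node. Both are hedged illustrations (the `action=query&list=search` endpoint is the standard MediaWiki search API; the regex is an assumption about how the template matches headings):

```javascript
// Build a MediaWiki search URL to locate the page for a topic.
function wikipediaSearchUrl(topic) {
  const params = new URLSearchParams({
    action: "query",
    list: "search",
    srsearch: topic,
    format: "json",
  });
  return `https://en.wikipedia.org/w/api.php?${params.toString()}`;
}

// Does a section heading look like a History/Origins/Background section?
function isTargetSection(headingText) {
  return /^(history|origins|background)/i.test(headingText.trim());
}
```

The extraction node would walk the page's headings, keep the first one for which `isTargetSection` is true, and pass that section's text to the AI step.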
Disclaimer This template uses ScrapeOps as a community node. You are responsible for complying with Wikipedia's Terms of Use, robots directives, and applicable laws in your jurisdiction. Scraping targets may change at any time; adjust render/scroll/wait settings and parsers as needed. Use responsibly for legitimate business purposes.