by WeblineIndia
# Zoho CRM Data Quality Guardian using n8n Schedule, Code & Email Nodes

## 🚀 Quick Start Guide
This n8n workflow automatically audits your Zoho CRM leads on a schedule, cleans and validates emails and phone numbers, detects duplicates, enriches missing company data, generates a structured quality report, and sends it as a styled HTML email. It helps maintain a clean, reliable CRM without manual effort.

## ⚡ Quick Implementation Steps
1. Import the workflow into n8n
2. Connect your Zoho CRM credentials
3. Configure Email (SMTP/Gmail/Outlook) credentials
4. Set the module (Leads is used by default)
5. Configure the Schedule Trigger (hourly/daily)
6. Run a test execution
7. Check the email report in your inbox

## 📌 What It Does
This workflow acts as an automated data quality auditor for your Zoho CRM. It runs on a schedule, fetches all lead records, and processes each record individually through a series of validation and transformation steps. Email addresses are cleaned and validated using a regex pattern, while phone numbers are standardized into a consistent international format. It also detects duplicate records by comparing combinations of email and phone values. If company information is missing, the workflow derives it from the email domain or assigns a fallback value.

Each record is then evaluated against defined business rules to generate quality flags such as invalid email, invalid phone, duplicate entry, or missing company. Finally, the workflow aggregates all processed records into a structured summary report, converts it into clean HTML, and automatically sends it via email. This gives teams a clear, actionable view of CRM data quality.
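The per-record logic described above can be sketched as a single Code-node function. This is a minimal sketch: the field names (Email, Phone, Company) follow Zoho's Lead module, but the exact regex and country-code rules in the actual workflow may differ.

```javascript
// Sketch of the per-record cleaning, deduplication, and scoring logic.
function auditRecord(record, seenKeys) {
  const issues = [];

  // 1. Email cleaning (trim + lowercase) and regex validation
  const email = (record.Email || '').trim().toLowerCase();
  if (!/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) issues.push('Invalid Email');

  // 2. Phone normalization: strip non-digits; clear implausible lengths
  let phone = (record.Phone || '').replace(/\D/g, '');
  if (phone.length < 7 || phone.length > 15) { phone = ''; issues.push('Invalid Phone'); }

  // 3. Duplicate detection on the email+phone combination key
  const key = `${email}|${phone}`;
  if (seenKeys.has(key)) issues.push('Duplicate'); else seenKeys.add(key);

  // 4. Company enrichment from the email domain, with a fallback value
  let company = record.Company;
  if (!company) {
    const domain = email.split('@')[1];
    company = domain ? domain.split('.')[0] : 'Unknown Company';
    if (!domain) issues.push('Missing Company');
  }

  // 5. Quality score: 100 minus 25 per issue, floored at 0
  const score = Math.max(0, 100 - issues.length * 25);
  return { email, phone, company, issues, score };
}
```

In the workflow this would run once per item after the Split Records node, with `seenKeys` held in workflow static data so duplicates are caught across the whole batch.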
## 👥 Who’s It For
- Sales and CRM teams managing large datasets
- Data quality and operations teams
- Marketing teams relying on clean lead data
- Businesses using Zoho CRM with growing databases
- Automation engineers building CRM governance workflows

## ⚙️ Requirements
To use this workflow, you will need:
- An active n8n instance
- Access to Zoho CRM
- Configured Zoho OAuth2 credentials in n8n
- Configured Email (SMTP/Gmail/Outlook) credentials
- Basic understanding of workflow execution in n8n

## 🛠️ How It Works & How To Set Up
1. **Schedule Trigger Setup** – The workflow starts with the Schedule Trigger node. It is configured to run every hour at minute 5; modify this interval to suit your needs.
2. **Connect Zoho CRM** – Open the Fetch Zoho CRM Records node and connect your Zoho OAuth2 credentials. Resource is set to Lead and operation to getAll, so it retrieves all available records.
3. **Split Records** – The Split Records for Processing node parses the fetched data and converts bulk records into individual items for processing.
4. **Email Validation** – Cleans each email (trim + lowercase), validates it with a regex, and flags invalid formats.
5. **Phone Normalization** – Removes non-numeric characters, converts numbers to international format, assigns country codes (IN, US, UK, UAE, DE, AU), and clears numbers with invalid lengths.
6. **Duplicate Detection** – Combines email + phone as a unique key and marks repeated records as duplicates.
7. **Company Enrichment** – If company is missing, extracts the domain from the email, converts it to a readable company name, and falls back to “Unknown Company”.
8. **Quality Flags Generation** – Flags include Invalid Email, Invalid Phone, Duplicate, and Missing Company; a quality score is calculated (100 minus 25 per issue).
9. **Summary Report** – Aggregates results into total records, valid records, invalid counts, and duplicate counts, and generates a detailed report array.
10. **HTML Report Formatting** – Converts the JSON report into a structured HTML layout, formats summary and detailed records into readable sections, and applies inline styling for email compatibility.
11. **Email Delivery** – Sends the formatted HTML report via email for automated delivery to stakeholders, ensuring timely visibility of CRM data quality.

## 🔧 How To Customize Nodes
- **Schedule Trigger** – Change frequency (e.g., every 4 hours, daily)
- **Zoho CRM Node** – Switch from Leads to Contacts if needed
- **Email Validation Node** – Modify the regex to enforce stricter validation
- **Phone Normalization Node** – Add more country codes or rules
- **Duplicate Detection Node** – Change the logic (e.g., email-only duplicates)
- **Company Enrichment Node** – Integrate external enrichment APIs
- **Quality Flags Node** – Add custom business rules or scoring logic
- **HTML Formatting Node** – Customize the layout, colors, or structure of the report
- **Email Node** – Change recipients, subject line, or add CC/BCC

## ➕ Add-ons (Extend This Workflow)
- Send reports to Slack or Microsoft Teams
- Store reports in Google Sheets or a database
- Auto-update cleaned records back in Zoho CRM
- Attach a CSV/Excel report to the email
- Integrate with data enrichment tools (e.g., Clearbit)
- Build dashboards using BI tools
- Add alerts for high duplicate rates

## 💼 Use Case Examples
- CRM data cleaning automation
- Lead validation before sales outreach
- Duplicate lead prevention
- Data quality monitoring dashboards
- Automated reporting to stakeholders

> There can be many more use cases depending on how you extend and integrate this workflow.
## 🧯 Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| No data fetched | Zoho credentials not connected | Reconnect OAuth2 credentials |
| Workflow not triggering | Schedule not active | Enable workflow and check trigger settings |
| Emails not sent | Email credentials not configured | Verify SMTP/Gmail settings |
| Email layout broken | Unsupported HTML/CSS | Use inline styles only |
| Emails marked invalid incorrectly | Regex too strict | Adjust validation pattern |
| Phone numbers missing | Invalid length or format | Update normalization logic |
| Duplicate detection not accurate | Key logic too simple | Enhance matching criteria |
| Company not enriched | Email missing or malformed | Add fallback logic or external API |

## 🤝 Need Help?
If you need assistance setting up this workflow, customizing validation rules, or building advanced automation on top of it, the team at WeblineIndia is here to help. We specialize in:
- n8n workflow development
- CRM automation solutions
- Data quality and enrichment systems
- Custom integrations and dashboards

Feel free to reach out to WeblineIndia for tailored solutions or to extend this workflow to match your business needs.
by Stephan Koning
## Who is it for
This is built for plastering and stucco business owners who are out on the tools while a partner (the "Patricia" of the business) tries to keep the admin from exploding. It's for anyone losing money because high-value quotes or urgent fires are buried under newsletters and spam.

## What it does
This workflow is the Headless Brain that turns your messy inbox into an actionable Action-Grid. It doesn't just read emails; it classifies them into a modular SLA system (Red, Orange, Yellow, Green). It instantly routes leads and complaints into specific NinjaPipe Lists, ensuring that your most critical inquiries never go cold and your "Sleeping Revenue" is protected.

## The StuccoOS Method
This classifying system is a core module of the StuccoOS ecosystem. In this modular setup, your Pipeline shows the journey, but your Lists show the urgency. Even on its own, this workflow solves your "inquiry leakage" by making sure you only see what needs an immediate response.

## How it works
The webhook triggers on every incoming email.
- **Noise Filter:** Weeds out automated junk and sent items to keep your CRM data clean.
- **AI Classification:** The LLM analyzes sentiment and category, identifying **AFTERCARE (RED)** for urgent complaints or **SLEEPING REVENUE (ORANGE)** for quote follow-ups.
- **Action Routing:** Checks NinjaPipe for the contact. If they don't exist, it creates them.
- **SLA Injection:** The contact is dropped into the correct priority List in NinjaPipe, triggering your internal response deadlines (1h, 4h, 24h, etc.).

## Requirements
- **AI API Key** (any provider will do)
- **NinjaPipe CRM** account and API Token
- **AgentMail** (if you want advanced labels) to feed the webhook

## How to set up
1. **Credentials:** Connect your AI provider and NinjaPipe API token.
2. **List Mapping:** Open the **Select CRM List for SLA** node. Replace the placeholder UUIDs in the listMap with your actual NinjaPipe List IDs.
3. **Webhook:** Any email trigger will work.

Note: This workflow is a modular component.
It is built for NinjaPipe CRM, but it can be adapted to any CRM that supports a basic Contact + List structure through API. Right now, this workflow mainly uses the CRM for routing and SLA tracking. It does not heavily rely on AgentMail Labels yet, but it can. Labels are a great way to organize conversations inside AgentMail before or alongside CRM sync.
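The list mapping mentioned in the setup steps could look something like the sketch below. The classification labels and UUIDs are placeholders (the "Select CRM List for SLA" node ships with its own), so replace both with your actual NinjaPipe List IDs.

```javascript
// Hypothetical sketch of the SLA routing table in the "Select CRM List for SLA" Code node.
const listMap = {
  'AFTERCARE (RED)':           { listId: '00000000-0000-0000-0000-0000000000r1', slaHours: 1 },
  'SLEEPING REVENUE (ORANGE)': { listId: '00000000-0000-0000-0000-0000000000o1', slaHours: 4 },
  'YELLOW':                    { listId: '00000000-0000-0000-0000-0000000000y1', slaHours: 24 },
  'GREEN':                     { listId: '00000000-0000-0000-0000-0000000000g1', slaHours: 72 },
};

function routeToList(classification) {
  // Fall back to the lowest-urgency list if the LLM returns an unknown label
  return listMap[classification] || listMap['GREEN'];
}
```

Keeping the map in one place means adapting the workflow to another CRM is mostly a matter of swapping the IDs and the API call that follows.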
by vinci-king-01
# Approval Workflow Handler – SendGrid & Baserow

This workflow automates the end-to-end approval process for any request type (e.g., purchase orders, content sign-off, access permissions). It routes the request to designated approvers, records every decision in a Baserow table, and notifies requesters and stakeholders via SendGrid at each stage.

## Pre-conditions / Requirements

### Prerequisites
- n8n instance (self-hosted, desktop, or n8n cloud)
- SendGrid account with an API Key
- Baserow workspace & table set up to store approval records
- Basic understanding of n8n node configuration

### Required Credentials
- **SendGrid API Key** – for sending transactional emails
- **Baserow Personal API Token** – for creating, updating, and querying table rows

### Specific Setup Requirements

| Baserow Column | Type | Purpose | Example Value |
|----------------|------|---------|---------------|
| request_id | text | Unique identifier for each request | 2542 |
| title | text | Short description of the request | “PO > $5K” |
| status | single select | Tracks state (Pending, Approved, Rejected) | “Pending” |
| requester | text | Email of person creating the request | alice@acme.io |
| approver | text | Email of assigned approver | bob@acme.io |
| updated_at | date | Last status change timestamp | |

## How it works

Key Steps:
1. **Trigger**: A Manual Trigger (or any upstream workflow) injects the initial request data.
2. **Create Record (Baserow)**: Store the new request as a “Pending” row.
3. **Notify Approver (SendGrid)**: Email the approver with approval/denial links.
4. **Wait for Action**: Hold execution until the approver clicks a link that calls the workflow’s Webhook URL.
5. **Decision Branch (If node)**: Determine whether the request is Approved or Rejected.
6. **Update Record (Baserow)**: Write the new status and timestamp back to the row.
7. **Notify Requester (SendGrid)**: Send the final decision to the original requester.
8. **Error Handling**: An Error Trigger captures any unhandled failures and notifies ops.

## Set up steps

Setup Time: 15-25 minutes

1. Clone or import the template into your n8n instance.
2. Add credentials:
   a. Go to Credentials → New → SendGrid and paste your API key.
   b. Go to Credentials → New → Baserow and paste your Personal API Token.
3. Configure environment variables (optional but recommended):
   - APPROVER_EMAILS – comma-separated list of default approvers.
   - STAKEHOLDER_EMAILS – comma-separated list of CC recipients.
4. Edit the Baserow node: select your workspace and the “Approvals” table that matches the column schema above.
5. Customize email templates in both SendGrid nodes (subject, HTML content, variables).
6. Update the Wait node’s webhook URL if running self-hosted behind a reverse proxy.
7. Run a test execution using the Manual Trigger; confirm emails are delivered and the Baserow table updates correctly.
8. Switch the trigger (optional) from Manual to Webhook or Schedule for production use.
9. Enable the workflow to begin processing live approval requests.

## Node Descriptions

Core Workflow Nodes:
- **Manual Trigger** – Starts the workflow during testing or via the UI.
- **Set (Initialize Request)** – Normalizes incoming data and generates a unique request_id.
- **Baserow (Create Row)** – Inserts a new “Pending” record.
- **SendGrid (Notify Approver)** – Sends the approval request email with dynamic links.
- **Wait** – Pauses execution until the approver responds.
- **If (Decision)** – Routes the flow based on approved vs. rejected.
- **Baserow (Update Row)** – Writes the final status and timestamp.
- **SendGrid (Notify Requester)** – Communicates the final decision.
- **Merge** – Consolidates parallel branches before ending.
- **Error Trigger** – Captures errors, logs them, and optionally notifies ops via email.
- **Sticky Notes** – Contain inline documentation for maintainers.

Data Flow:
Manual Trigger → Set → Baserow (Create Row) → SendGrid (Notify Approver) → Wait
Wait → If → (Approved / Rejected) → Baserow (Update Row) → SendGrid (Notify Requester) → Merge

## Customization Examples

### 1. Auto-assign approver based on request amount

```javascript
// Code node: Dynamic approver selection
const amount = items[0].json.amount;
items[0].json.approver = amount > 10000 ? 'cfo@acme.io' : 'manager@acme.io';
return items;
```

### 2. Slack notification instead of email

Replace the SendGrid node with a Slack node and use an n8n expression for the message text:

```json
{
  "channel": "#approvals",
  "text": "=Request {{ $json.request_id }} was approved by {{ $json.approver }}"
}
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "request_id": "2542",
  "title": "PO > $5K",
  "status": "Approved",
  "requester": "alice@acme.io",
  "approver": "bob@acme.io",
  "updated_at": "2024-04-27T15:41:22.347Z"
}
```

## Troubleshooting

Common Issues:
- **Emails not sending** – Verify the SendGrid API key and account sender verification; check node credentials.
- **Baserow “permission denied”** – Ensure the Personal API Token has access to the workspace and table.
- **Wait node never resumes** – Confirm the public webhook URL is reachable and correctly embedded in the email links.

Performance Tips:
- Batch approvals in a single workflow run when possible to reduce API overhead.
- Set up Baserow table indexing on request_id for faster lookups.

Pro Tips:
- Use the Error Trigger to post incidents to a dedicated Slack or Microsoft Teams channel.
- Store reusable email templates in a separate “Settings” sheet or in n8n’s global static data.
- Add analytics by sending events to PostHog or Amplitude after each approval.

This is a community-contributed workflow template. It is provided “as-is” without warranty; review and test thoroughly before using in production.
by Websensepro
## Overview
Stop applying manually. This workflow acts as your personal AI recruiter, automating the end-to-end process of finding high-quality jobs, tailoring your resume, and preparing personalized outreach emails to decision-makers.

## What this workflow does
- **Scrapes Real-Time Jobs:** Uses Apify to pull live job listings from LinkedIn based on your specific keywords (e.g., "AI Automation").
- **Smart Filtering:** Uses GPT-4o-mini to analyze job descriptions against your skills and automatically discards roles that aren't a good fit.
- **Hyper-Personalized Resume:** Uses GPT-4o to rewrite your "Master Resume" specifically for the target job description.
- **Document Generation:** Creates a new Google Doc with the tailored resume and automatically sets sharing permissions.
- **Decision Maker Enrichment:** Uses Anymail Finder to locate the verified email address of the company's CEO or hiring manager.
- **Cold Email Draft:** Generates a personalized pitch in Gmail (Drafts folder) with a link to your custom resume attached.

## Setup Requirements
To run this workflow, you will need to set up credentials in n8n for the following services. Please use n8n credentials and do not hardcode API keys into the HTTP nodes:
- **Google Drive & Docs:** To read your master resume and create new application files.
- **Apify Account:** To run the LinkedIn Job Scraper actor.
- **OpenAI API Key:** For logic (GPT-4o-mini) and writing (GPT-4o).
- **Anymail Finder API:** To find contact email addresses.
- **Gmail:** To create the draft emails.

## How to use
1. **Upload Resume:** Paste your "Master Resume" text into the first Google Docs node or connect your existing file.
2. **Configure Credentials:** Add your API keys in the n8n credentials section for all services listed above.
3. **Set Search Criteria:** Update the JSON body in the Apify node with your desired LinkedIn job search URL.
4. **Run:** Execute the workflow and watch your drafts folder fill up with ready-to-send applications.
by Tomohiro Goto
## 🧠 How it works
This workflow automatically translates messages between Japanese and English inside Slack — perfect for mixed-language teams.

In our real-world use case, our 8-person team includes Arif, an English-speaking teammate from Indonesia, while the rest mainly speak Japanese. Before using this workflow, our daily chat often included:
- “Can someone translate this for Arif?”
- “I don’t understand what Arif wrote — can someone summarize it in Japanese?”
- “I need to post this announcement in both languages, but I don’t know the English phrasing.”

This workflow fixes that communication gap without forcing anyone to change how they talk. Built with n8n and Google Gemini 2.5 Flash, it automatically detects the input language, translates to the opposite one, and posts the result in the same thread, keeping every channel clear and contextual.

## ⚙️ Features
A unified translation system with three Slack triggers:
1. Slash Command `/trans` – bilingual posts for announcements.
2. Mention Trigger `@trans` – real-time thread translation for team discussions.
3. Reaction 🇯🇵 / 🇺🇸 – personal translation view for readers.

- Automatic JA ↔ EN detection and translation via Gemini 2.5 Flash
- 3-second instant ACK to satisfy Slack's response timeout
- Shared Gemini translation core across all three modes
- Clean thread replies using chat.postMessage

## 💼 Use Cases
- **Global teams** – keep Japanese and English speakers in sync without switching tools.
- **Project coordination** – use mentions for mixed-language stand-ups and updates.
- **Announcements** – auto-generate bilingual company posts with `/trans`.
- **Cross-cultural communication** – help one-language teammates follow along instantly.

## 💡 Perfect for
- **Global companies** with bilingual or multilingual teams
- **Startups** collaborating across Japan and Southeast Asia
- **Developers** exploring Slack + Gemini + n8n automation patterns

## 🧩 Notes
You can force a specific translation direction (JA→EN or EN→JA) inside the Code node.
Adjust the system prompt to match your tone (“business-polite”, “casual”, etc.), and add glossary replacements for consistent terminology. If the bot doesn’t respond, ensure your app includes the following scopes: app_mentions:read, chat:write, reactions:read, channels:history, and groups:history. Always export your workflow with credentials OFF before sharing or publishing.

✨ Powered by Google Gemini 2.5 Flash × n8n × Slack API — a complete multilingual layer for your workspace, all in one workflow. 🌍
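The automatic JA ↔ EN direction detection in the Code node could be sketched as below. This is an assumption about the implementation, not the template's exact code: it simply checks for Japanese script (hiragana, katakana, CJK ideographs) to pick a direction, with an override parameter for forcing one.

```javascript
// Hypothetical sketch of the translation-direction logic in the Code node.
function detectDirection(text, forced) {
  if (forced) return forced; // e.g. pass 'JA->EN' to force a direction
  const hasJapanese = /[\u3040-\u30ff\u4e00-\u9fff]/.test(text);
  return hasJapanese ? 'JA->EN' : 'EN->JA';
}
```

The result can then be injected into the Gemini system prompt ("Translate the following from Japanese to English", or the reverse) before the shared translation core runs.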
by Ranjan Dailata
## Who this is for
This workflow is designed for:
- Recruiters, talent intelligence teams, and HR tech builders automating resume ingestion.
- Developers and data engineers building ATS (Applicant Tracking System) or CRM data pipelines.
- AI and automation enthusiasts looking to extract structured JSON data from unstructured resume sources (PDFs, DOCs, HTML, or LinkedIn-like URLs).

## What problem this workflow solves
Resumes often arrive in formats (PDF, DOCX, web profile, etc.) that are difficult to process automatically. Manually extracting fields like candidate name, contact info, skills, and experience wastes time and is prone to human error. This workflow:
- Converts any unstructured resume into a structured JSON Resume format.
- Ensures the output aligns with the JSON Resume Schema.
- Saves the structured result to Google Sheets and local disk for easy tracking and integration with other tools.

## What this workflow does
The workflow automates the entire resume parsing pipeline:
1. **Trigger** – Starts manually with an Execute Workflow button.
2. **Input Setup** – A Set node defines the resume_url (e.g., a hosted resume link).
3. **Resume Content Extraction** – Sends the URL to the Thordata Universal API, which retrieves the web content, cleans HTML/CSS, and extracts structured text and metadata.
4. **Convert HTML → Markdown** – Converts the HTML content into Markdown to prepare it for AI model parsing.
5. **JSON Resume Builder (AI Extraction)** – Sends the Markdown to OpenAI GPT-4.1-mini, which extracts:
   - basics: name, email, phone, location
   - work: companies, roles, achievements
   - education: institutions, degrees, dates
   - skills, projects, certifications, languages, and more
   The output adheres to the JSON Resume Schema.
6. **Output Handling** – Saves the final structured resume locally to disk and appends it to a Google Sheet for analytics or visualization.

## Setup

### Prerequisites
- n8n instance (self-hosted or cloud)
- Credentials for:
  - Thordata Universal API (HTTP Bearer Token).
  (First-time users: sign up for Thordata.)
  - OpenAI API Key
  - Google Sheets OAuth2 integration

### Steps
1. Import the provided workflow JSON into n8n.
2. Configure your Thordata Universal API Token under Credentials → HTTP Bearer Auth.
3. Connect your OpenAI account under Credentials → OpenAI API.
4. Link your Google Sheets account (used in the Append or update row in sheet node).
5. Replace the resume_url in the Set node with your own resume file or hosted link.
6. Execute the workflow.

## How to customize this workflow
**Input Sources** – Replace the Manual Trigger with:
- A Webhook Trigger to accept resumes uploaded from your website.
- A Google Drive / Dropbox Trigger to process uploaded files automatically.

**Output Destinations** – Send results to:
- Notion, Airtable, or Supabase via API nodes.
- Slack / Email for recruiter notifications.

**Language Model Options** – You can upgrade from gpt-4.1-mini to gpt-4.1 or a custom fine-tuned model for improved accuracy.

## Summary
Unstructured Resume Parser with Thordata Universal API + OpenAI GPT-4.1-mini automates the process of converting messy, unstructured resumes into clean, structured JSON data. It leverages Thordata’s Universal API for document ingestion and preprocessing, then uses OpenAI GPT-4.1-mini to extract key fields such as name, contact details, skills, experience, education, and achievements with high accuracy.
by Ayis Saliaris Fasseas
## How It Works

### 1. Gmail Trigger
- Continuously monitors your Gmail inbox for new messages.
- Captures the email’s subject, body, and metadata.
- Sends the extracted content to the Email Content Classifier.

### 2. Email Content Classification
- The Email Content Classifier analyzes the email content using natural language processing.
- Compares the message against predefined Gmail labels: Ads, Work, Personal, Financial, and Other (fallback label).
- Users can add or rename categories to match their specific needs.
- Uses context, tone, and keywords to determine the most accurate label.

### 3. Applying Gmail Labels
- Sends the classification result to the corresponding Gmail label node.
- Automatically applies the matching Gmail label in your inbox.
- If the classifier cannot confidently match the message, the Other label is used as a fallback.

## Setup Steps
1. **Connect Gmail Accounts** – Connect your Gmail account in the Gmail Trigger and in each Gmail label node.
2. **Configure the Email Content Classifier** – Map the incoming Gmail message body to inputText and ensure the classifier node has access to a language model credential (Anthropic or another provider).
3. **Test the Workflow** – Send a few sample emails to yourself to confirm that labels are correctly applied.
4. **Tweak Categories if Needed** – Adjust category names in the classifier node to match your Gmail labels exactly.

## Customization
- Add or rename categories in the classifier to reflect your specific email types.
- Create corresponding Gmail label nodes for each new category.
- Expand or modify categories as your workflow evolves to improve organization and efficiency.

## Use Cases
- Automatic inbox organization and sorting.
- Separation of work, personal, financial, and promotional emails.
- Improved productivity by making important emails easier to locate.
- Custom categorization for specialized workflows.

## Troubleshooting Tips
- Emails not being labeled → check API permissions and message ID references.
- Wrong label assigned → update classifier examples or refine category descriptions.
- Classifier not returning a category → confirm the fallback category “Other” is configured.
- Workflow not triggering → reconnect Gmail Trigger authentication and ensure the workflow is active.
by Roshan Ramani
# Nano Banana Pro AI Product Advertisement Generator via Telegram

## Who's It For
- E-commerce businesses needing quick product ads
- Social media marketers without design resources
- Small business owners creating promotional content
- Product photographers seeking automated enhancements

## What It Does
Transforms basic product photos into professional advertisements using AI. Users send a product image with caption text via Telegram and receive commercial-grade ads with studio lighting, premium backgrounds, and typography overlays.

## How It Works
1. User sends a product photo with a caption to the Telegram bot
2. The image is converted to base64 for AI processing
3. Google Gemini analyzes the image and extracts marketing text from the caption
4. The AI generates detailed design enhancement instructions (400+ words)
5. Nano Banana Pro creates 1-2 professional advertisement versions
6. The enhanced images are automatically sent back to the user

## Requirements
- Telegram Bot API credentials (via BotFather)
- Google Gemini API key with nano-banana-pro-preview access
- n8n instance (self-hosted or cloud)

## Setup Instructions
1. **Create Telegram Bot**
   - Message BotFather on Telegram
   - Send the /newbot command and follow the prompts
   - Copy the API token
2. **Configure n8n Credentials**
   - Add the Telegram Bot API token
   - Add the Google Gemini API key
   - Import the workflow JSON
   - Update credential references
   - Activate the workflow
3. **Test the Workflow**
   - Send an image with a caption in the format: "Product Name | Tagline | Call to Action"
   - Example: "Premium Sneakers | Mountain Edition | Shop Now"

## Key Features
- Original product remains 100% unchanged
- Text extracted only from the user's caption (no AI-generated taglines)
- Professional design enhancements applied
- Studio-quality lighting and color grading
- Luxury background selection based on product category
- Typography overlays using caption text
- 30-60 second processing time
- Returns 1-2 advertisement variants

## Node Breakdown
- **Telegram Trigger** – Listens for messages with images
- **Download Image File** – Retrieves the image from Telegram servers
- **Image to Base64** – Converts the image for AI processing
- **AI Design Analysis** – Gemini extracts caption text and generates a design blueprint covering composition, lighting, backgrounds, color grading, effects, and typography
- **Combine Image & Analysis** – Merges image data with design instructions
- **Prepare API Payload** – Structures data for the Nano Banana Pro API
- **Generate Enhanced Image** – Creates the professional ad using AI
- **Convert Base64 to Image** – Converts the first generated ad to a file
- **Convert Base64 to Image1** – Converts the second ad variant (if available)
- **Send Image** – Returns the enhanced ads to the user via Telegram

## Customization Options
- **Adjust Design Style** – Modify the AI Design Analysis prompt to change lighting intensity, background preferences, color grading, or typography styles
- **Change Caption Parsing** – Update the extraction rules for different text elements or multi-language support
- **Add Output Formats** – Request different aspect ratios (16:9 social media, 4:5 Instagram, 9:16 Stories)
- **Error Handling** – Add fallback nodes to handle image generation failures
- **Usage Analytics** – Insert a database node to track requests and caption data

## Caption Examples
- "NIKE AIR MAX | Run Beyond Limits | Shop Now"
- "Himalayan Coffee Beans - Fresh from the Mountains - Order Today"
- "Luxury Smartwatch | Track Your Success | Available Now"

## Important Notes
- The product is never altered, only enhanced visually
- Empty captions result in ads without text overlays
- Works best with clear photos on simple backgrounds
- Monitor API quotas to avoid rate limits
- Processing time varies by API response speed
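The caption parsing could be sketched as a small helper like this. It is a hedged sketch, not the template's actual code: it accepts either "|" or " - " as the separator, matching the caption examples above, and treats an empty caption as "no text overlay" per the Important Notes.

```javascript
// Hypothetical sketch of parsing "Product Name | Tagline | Call to Action" captions.
function parseCaption(caption = '') {
  const parts = caption
    .split(/\s*\|\s*|\s+-\s+/) // split on "|" or spaced hyphens
    .map(p => p.trim())
    .filter(Boolean);
  return {
    productName: parts[0] || null,
    tagline: parts[1] || null,
    callToAction: parts[2] || null,
    hasTextOverlay: parts.length > 0, // empty captions yield ads without overlays
  };
}
```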
by Masaki Go
## About This Template
This workflow turns complex data or topics sent via LINE into beautiful, easy-to-understand infographics. It combines Gemini (to analyze data and structure the visual layout) with Nano Banana Pro (accessed via the Kie.ai API) to generate high-quality, data-rich graphics (charts, timelines, processes).

## How It Works
1. **Input:** The user sends a topic or data points via LINE (e.g., "Japan's Energy Mix: 20% Solar, 10% Wind...").
2. **Data Visualization Logic:** Gemini acts as an information designer, deciding the best chart type (pie, bar, flow) and layout for the data.
3. **Render:** Nano Banana generates a professional 3:4 vertical infographic.
4. **Smart Polling:** The workflow uses a loop to check the API status every 5 seconds, so it waits exactly as long as needed.
5. **Delivery:** Uploads the image to S3 and sends the visual report back to LINE.

## Who It’s For
- Social media managers needing quick visual content.
- Educators and presenters summarizing data.
- Consultants creating quick visual reports on the go.

## Requirements
- **n8n** (Cloud or self-hosted).
- **Kie.ai API Key** (Nano Banana Pro).
- **Google Gemini API Key**.
- **AWS S3 Bucket** (public access).
- **LINE Official Account**.

## Setup Steps
1. **Credentials:** Configure Header Auth for Kie.ai and your other service credentials.
2. **Webhook:** Add the production URL to the LINE Developers console.
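The smart-polling decision in step 4 can be sketched as a small function that the loop evaluates after each status check. The status values ('pending', 'done', 'failed') and the imageUrl field are assumptions; check the actual Kie.ai response shape before relying on them.

```javascript
// Hypothetical sketch of the polling decision: deliver, abort, or wait 5s and loop.
function nextPollAction(statusResponse, attempt, maxAttempts = 60) {
  if (statusResponse.status === 'done') {
    return { action: 'deliver', imageUrl: statusResponse.imageUrl };
  }
  if (statusResponse.status === 'failed' || attempt >= maxAttempts) {
    return { action: 'abort' };
  }
  return { action: 'wait', seconds: 5 }; // loop back through a 5-second Wait node
}
```

In n8n this maps naturally onto an IF node feeding a Wait node that loops back to the HTTP status check, with the attempt counter capping total wait time.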
by WeblineIndia
# Webhook → OpenAI → Jira “Bug Suspicion” → Slack QA Escalation

This workflow ingests bug reports via a webhook, uses OpenAI to triage and tag them, creates a Jira Bug in project APP with AI-driven labels, and alerts QA in Slack. Import the JSON, add OpenAI + Jira + Slack credentials, set the webhook path, choose your Slack channels, and activate.

## Quick Start – Implement in 60 Seconds
1. Import the JSON into n8n.
2. Add credentials to the AI Bug Analysis (OpenAI), Create Jira, and both Slack Alert nodes.
3. Set the webhook path advanced-bug-triage; test with a POST body containing priority, summary, and category.
4. Adjust the Slack channels qa-alerts-high and qa-general if needed.
5. Activate and verify that a test POST flows through Jira and Slack.

That’s it. A Jira issue gets created and Slack gets notified instantly.

## What It Does
The workflow acts as an AI-assisted bug triage bridge. A webhook receives incoming bug suspicions, which are then analyzed by OpenAI to determine priority and category. Based on the AI output, the flow routes to the appropriate Jira creation path and applies standardized labels for consistent reporting. After creating the Jira Bug in project APP, the workflow escalates to Slack: high-priority items go to qa-alerts-high, while normal items go to qa-general. The result is a fast, low-friction path from external bug signals to actionable Jira issues with immediate QA visibility.

## Who’s It For
- QA teams wanting automated Jira escalation.
- Developers integrating external systems with Jira.
- Product teams capturing automated “bug suspicion” signals.
- Monitoring or Sentry-like pipelines.
- Companies wanting lightweight reporting without building custom infrastructure.

## Pre-Requisites
- n8n (cloud or self-hosted).
- Jira account with permission to create Bug issues.
- Jira project key: APP (or customize).
- OpenAI credentials (for AI Bug Analysis).
- Slack workspace + bot token.
- Ability to send a POST request to the n8n Webhook endpoint.
## How It Works & Setup Instructions
- **Webhook Trigger** (advanced-bug-triage): Accepts POST payloads (e.g., summary, description, priority, category).
- **AI Bug Analysis** (OpenAI): Analyzes the payload for sentiment/priority/category (configure your prompt/fields as needed).
- **Priority Switch**: Routes items to the correct Jira creation path (High/Medium/Low).
- **Create Jira (High/Medium/Low)**: Creates Bug issues in project APP, labeling with ai-triaged and the AI-detected category.
- **Slack Alert (High / Normal)**: Notifies QA with the Jira key; high priority goes to qa-alerts-high, others to qa-general.

### Step 1: Configure Webhook Node
- Method: POST
- Path: bug-suspicion
- Endpoint example: https://YOUR-N8N-URL/webhook/bug-suspicion

### Step 2: Add OpenAI Credentials
- Open the OpenAI node
- Select credentials
- Modify the prompts as needed

### Step 3: Add Jira Credentials
- Open the Create Jira Bug node
- Select credentials
- Ensure access to project APP
- Ensure permission to create the Bug issue type

### Step 4: Add Slack Credentials
- Open the Slack QA Escalation node
- Choose Slack Bot credentials
- Set the QA channel

The Slack message uses: Issue is created in Jira for this key <ISSUE-KEY>

### Step 5: Test the Webhook

```json
{ "title": "Login button unresponsive" }
```

### Step 6: Activate the Workflow
Enable the Active toggle.

## How to Customize Nodes
- **Webhook Trigger** – Add API keys, tokens, or Basic Auth; add JSON validation.
- **Jira Node** – You may add:

```json
"additionalFields": {
  "labels": "bug-suspicion,auto-detected",
  "description": "={{ $json.details }}"
}
```

- **Slack Node** – Customize formatting, attachments, mentions, or channels.
- **AI Node for Bug Analysis** – Tune the prompt, map input fields, or adjust model parameters for stricter/looser triage.
- **Priority Switch** – Modify routing thresholds, add more branches, or change the default fallback.

## Add-ons (Optional Enhancements)
- Email alerts.
- Severity scoring using AI.
- Push bug data to Notion or Google Sheets.
- Add screenshots/logs.
- Multi-channel notifications.
- Auto-assign Jira issues based on category or component.
- Add a fallback email notification for high-priority tickets.
- Push payloads to a data store (e.g., Sheets/DB) for analytics.
- Add a secondary Slack DM to on-call for P1.
- Enrich tickets with logs/links/screenshots from the payload.

## Use Case Examples
- Automated QA test failures → Jira + Slack.
- A monitoring system detects abnormal activity.
- Browser extension for internal bug reporting.
- CI/CD pipeline error → instant QA alert.
- External scripts or tools triggering bug reports.
- Monitoring alerts auto-create Jira bugs with AI-prioritized severity and Slack escalation.
- Customer support form pushes suspected bugs directly into Jira with category labels.
- QA automation failures stream into Jira with priority-based Slack alerts.
- SRE on-call receives P1 Slack alerts while lower priorities route to the general QA channel.
- Product beta feedback is categorized by AI and logged as Jira bugs for triage.

## Troubleshooting Guide

| Issue | Cause | Solution |
|-------|-------|----------|
| Webhook not receiving data | Wrong URL/method | Use POST + the correct path |
| Jira issue not created | Wrong credentials/project | Verify Jira credentials + the APP project |
| Slack message not sent | Bot not allowed in channel | Invite the bot to the channel |
| Jira fields empty | Missing JSON field | Ensure the payload includes `"title"` |
| Slack shows `undefined` | Jira response changed | Add a Debug node to inspect the output |
| Workflow not running | Not activated | Turn ON the "Active" toggle |

## Need Help?
If you want help customizing this workflow or building similar n8n automations, the WeblineIndia team can assist with:
- Jira integrations
- Slack automation
- API-based bug pipelines
- DevOps automation
- AI-driven severity scoring

And so much more. Reach out anytime for implementation or enhancements.
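To make the triage path described above concrete, here is a hypothetical sketch of the Priority Switch plus label assembly as an n8n Code node body. The field names (`priority`, `category`) match the webhook payload, but the exact node configuration in the template may differ.

```javascript
// Sketch of priority routing + standardized Jira labels (assumed logic).
const ALLOWED = ["High", "Medium", "Low"];

function routeAndLabel(item) {
  // Normalize the AI-detected priority; fall back to Medium when unknown.
  const raw = (item.priority || "").trim();
  const priority =
    ALLOWED.find((p) => p.toLowerCase() === raw.toLowerCase()) || "Medium";

  // Standardized labels: the fixed ai-triaged marker plus the AI category.
  const labels = ["ai-triaged"];
  if (item.category) labels.push(item.category.toLowerCase());

  // High priority escalates to the dedicated QA channel.
  const slackChannel = priority === "High" ? "qa-alerts-high" : "qa-general";
  return { priority, labels, slackChannel };
}
```

In the actual workflow this decision is made by the Switch node's rules rather than code, but the effect on routing and labeling is the same.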
by s3110
# Japanese Document Translation Quality Checker with DeepL & Google Drive to Slack

## Who's it for
Localization teams, QA reviewers, and operations leads who need a fast, objective signal on Japanese document translation quality without manual checks.

## What it does / How it works
This workflow watches a Google Drive folder for new Japanese documents, exports the text, translates JA→EN with DeepL, then back-translates EN→JA. It compares the original and the back-translation to estimate a quality score and summarizes the differences. A Google Docs report is generated, and a Slack message posts the score, difference count, and report link so teams can triage quickly.

## How to set up
1. Connect credentials for Google Drive, DeepL, and Slack.
2. Point the Google Drive Trigger to your "incoming JP docs" folder.
3. In the Workflow Configuration (Set) node, fill in `targetFolder` (report destination) and `slackChannel`.
4. Run once, then activate and drop in a test doc.

## Requirements
n8n (Cloud or self-hosted); Google Drive, DeepL, and Slack credentials; two Drive folders (incoming, reports).

## How to customize the workflow
Tune the diff logic (character → token/line level, normalization rules), adjust score thresholds and Slack formatting, or add reviewer routing/Jira creation for low-score cases. Avoid hardcoded secrets; keep user-editable variables in the Set node.
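One way to picture the back-translation comparison is a character-bigram overlap (Dice coefficient) between the original Japanese text and its JA→EN→JA round trip. This is an illustrative sketch, not the template's exact diff logic, and the 0–100 scale is an assumption.

```javascript
// Count character bigrams in a string.
function bigrams(text) {
  const counts = new Map();
  for (let i = 0; i < text.length - 1; i++) {
    const b = text.slice(i, i + 2);
    counts.set(b, (counts.get(b) || 0) + 1);
  }
  return counts;
}

// Dice-coefficient similarity between original and back-translation, 0-100.
// Empty or single-character inputs are treated as identical (score 100).
function qualityScore(original, backTranslation) {
  const a = bigrams(original);
  const b = bigrams(backTranslation);
  let overlap = 0;
  let total = 0;
  for (const [gram, count] of a) overlap += Math.min(count, b.get(gram) || 0);
  for (const count of a.values()) total += count;
  for (const count of b.values()) total += count;
  if (total === 0) return 100;
  return Math.round((2 * overlap / total) * 100);
}
```

A low score signals that meaning drifted during translation and the document deserves human review; the threshold for "low" belongs in the Set node alongside the other user-editable variables.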
by vinci-king-01
# Breaking News Aggregator with Telegram and Redis

⚠️ **COMMUNITY TEMPLATE DISCLAIMER**: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow monitors selected government websites, regulatory bodies, and legal-news portals for new or amended regulations relevant to specific industries. It scrapes the latest headlines, compares them against previously recorded items in Redis, and pushes real-time compliance alerts to a Telegram channel or chat.

## Prerequisites
- n8n instance (self-hosted or cloud)
- ScrapeGraphAI community node installed
- Redis server accessible from n8n
- Telegram bot created via BotFather
- (Optional) Cron node if you want fully automated scheduling instead of a manual trigger

## Required Credentials
- **ScrapeGraphAI API Key** – Enables ScrapeGraphAI scraping functionality
- **Telegram Bot Token** – Allows n8n to send messages via your bot
- **Redis Credentials** – Host, port, and (if set) password for your Redis instance

## Redis Setup Requirements

| Key Name | Description | Example |
|----------|-------------|---------|
| latestRegIds | Redis Set used to store hashes/IDs of the most recent regulatory articles processed | latestRegIds |

> Hint: Use a dedicated Redis DB (e.g., DB 1) to keep workflow data isolated from other applications.

## How it works
Key steps:
- **Manual Trigger / Cron**: Starts the workflow manually or on a set schedule (e.g., daily at 06:00 UTC).
- **Code (Define Sources)**: Returns an array of URL objects pointing to the regulatory pages to monitor.
- **SplitInBatches**: Iterates through each source URL in manageable chunks.
- **ScrapeGraphAI**: Extracts article titles, publication dates, and article URLs from each page.
- **Merge (Combine Results)**: Consolidates scraped items into a single stream.
- **If (Deduplication Check)**: Verifies whether each article ID already exists in Redis.
- **Set (Format Message)**: Creates a human-readable Telegram message string.
- **Telegram**: Sends the formatted compliance alert to your chosen chat/channel.
- **Redis (Add New IDs)**: Stores the article ID so it is not sent again in the future.
- **Sticky Note**: Provides inline documentation on the workflow canvas.

## Set up steps
Setup time: 10–15 minutes

1. **Install community nodes**: In n8n, go to Settings → Community Nodes and install `n8n-nodes-scrapegraphai`.
2. **Create credentials**:
   a. Telegram → Credentials → Telegram API → paste your bot token.
   b. Redis → Credentials → Redis → fill in host, port, password, DB.
   c. ScrapeGraphAI → Credentials → ScrapeGraphAI API → enter your key.
3. **Configure the "Define Sources" Code node**: Replace the placeholder URLs with the regulatory pages you need to monitor.
4. **Update the Telegram chat ID**: Open any chat with your bot and use `https://api.telegram.org/bot<token>/getUpdates` to find the `chat.id`. Insert this value in the Telegram node.
5. **Adjust frequency**: Replace the Manual Trigger with a Cron node (e.g., daily 06:00 UTC).
6. **Test the workflow**: Execute once manually; confirm messages appear in Telegram and that Redis keys are created.
7. **Activate**: Enable the workflow so it runs automatically on your schedule.

## Node Descriptions
Core workflow nodes:
- **Manual Trigger** – Allows on-demand execution during development/testing.
- **Code (Define Sources)** – Returns an array of page URLs and meta info to the workflow.
- **SplitInBatches** – Prevents overloading websites by scraping in controlled groups.
- **ScrapeGraphAI** – Performs the actual web scraping using an AI-assisted parser.
- **Merge** – Merges data streams from multiple batches into one.
- **If (Check Redis)** – Filters out already-processed articles using Redis SET membership.
- **Set** – Shapes output into a user-friendly Telegram message.
- **Telegram** – Delivers compliance alerts to stakeholders in real time.
- **Redis** – Persists article IDs to avoid duplicate notifications.
- **Sticky Note** – Contains usage tips directly on the canvas.

Data flow:
Manual Trigger → Code (Define Sources) → SplitInBatches → ScrapeGraphAI → Merge → If (Check Redis) → (true) → Set → Telegram → Redis

## Customization Examples
Change industries or keywords:

```javascript
// Code node snippet
return [
  { url: "https://regulator.gov/energy-updates", industry: "Energy", keywords: ["renewable", "grid", "tariff"] },
  { url: "https://financewatch.gov/financial-rules", industry: "Finance", keywords: ["AML", "KYC", "cryptocurrency"] }
];
```

Modify Telegram message formatting (note the template-literal backticks, which are required for the `${…}` interpolation to work):

```javascript
// Set node "Parameters → Value"
items[0].json.message = `🛡️ ${$json.industry} Regulation Update\n\n${$json.title}\n${$json.date}\n${$json.url}`;
return items;
```

## Data Output Format
The workflow outputs structured JSON data:

```json
{
  "title": "EU Proposes New ESG Disclosure Rules",
  "date": "2024-04-18",
  "url": "https://europa.eu/legal/eu-proposes-esg-disclosure",
  "industry": "Finance"
}
```

## Troubleshooting
Common issues:
- **Empty scraped data** – Verify the CSS selectors/XPath in the ScrapeGraphAI node; the website structure may have changed.
- **Duplicate alerts** – Ensure the Redis credentials point to the same DB across nodes; otherwise IDs are not shared.

Performance tips:
- Limit SplitInBatches to 2–3 URLs at a time if sites implement rate limiting.
- Use environment variables for credentials to simplify migration between stages.

Pro tips:
- Combine this workflow with n8n's Error Trigger to log failures to Slack or email.
- Maintain a CSV of source URLs in Google Sheets and fetch it dynamically via the Google Sheets node.
- Pair with the Webhook node to let team members add new sources on the fly.