by Cuong Nguyen
**Description:** Turn your n8n workflow into an automated competitive intelligence unit. This template monitors competitor activities across blog feeds and YouTube channels to detect strategic shifts. Instead of simply aggregating links, it uses Apify to fetch full video transcripts and Google Gemini to analyze the content's underlying message, tone, and positioning. The final output is a structured strategic briefing sent to Telegram and archived in Notion for long-term research.

## Who is this for

This workflow is designed for market researchers, founders, and content strategists who need deep insights into competitor activities without manually scrubbing video timelines or reading daily blog posts.

## How it works

1. **Ingest:** The workflow pulls the latest articles from configured RSS feeds and searches for new videos on specific YouTube channels.
2. **Deep Dive:** It automatically triggers an Apify actor to scrape the full transcript of any new video, ensuring the AI analyzes the spoken content rather than just the metadata.
3. **Analyze:** Google Gemini processes the consolidated text to identify core messages and hidden strategies, and suggests potential counter-tactics.
4. **Report:**
   - **Telegram:** Delivers a concise HTML executive summary with direct source links.
   - **Notion:** Appends a comprehensive report to a database using formatted Notion blocks (headings, toggles, and bullet points).

## Requirements

- **Apify account:** A free account is sufficient to run the youtube-transcript-scraper.
- **Services:** Google Gemini (PaLM), YouTube Data API, Telegram, Notion.

## How to set up

1. **Credentials:** Configure your API keys for YouTube, Apify, Google Gemini, Telegram, and Notion.
2. **Notion setup:** Create a new database with two properties: Name (Title) and date (Date). Copy the Database ID from the URL into the Notion node.
3. **Data sources:** Update the Channel ID in the YouTube nodes to track your target competitors, and update the Feed URL in the RSS nodes.
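Because RSS items and YouTube results arrive with different fields, a Code node ahead of the Merge node can map both onto one shape before analysis. This is only an illustrative sketch: the field names (`contentSnippet`, `videoId`, `transcript`) are assumptions, not the template's actual node output.

```javascript
// Hypothetical normalization step before the Merge node. The input
// field names are assumptions and should be adapted to the actual
// RSS Feed Read and Apify actor outputs.
function normalizeItem(source, item) {
  if (source === 'rss') {
    return {
      source,
      title: item.title,
      url: item.link,
      text: item.contentSnippet || '', // article excerpt for Gemini
    };
  }
  // 'youtube': transcript text comes from the Apify actor
  return {
    source,
    title: item.title,
    url: `https://www.youtube.com/watch?v=${item.videoId}`,
    text: item.transcript || '',
  };
}
```

With both sources in one shape, the Gemini prompt only has to reason over `title`, `url`, and `text`, regardless of origin.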
## How to customize the workflow

- **Scale up:** To monitor more competitors, duplicate the YouTube or RSS nodes in the "Data Sources" section and connect them to the Merge node.
- **Adjust analysis:** Modify the system prompt in the **Google Gemini** node to focus on specific intelligence needs (e.g., "Focus on pricing changes" or "Identify new feature releases").
- **Token optimization:** Use the configuration in the Code - Data Prep node to limit the number of items processed daily if you are on free-tier API limits.

## Need help or want to customize this?

Contact me for consulting and support. Email: cuongnguyen@aiops.vn
by Alok Kumar
Make your unstructured large documents LLM-ready markdown using LandingAI document parsing. The workflow automatically watches a Google Drive folder, submits new documents to Landing.ai for parsing, caches processed files in Supabase to avoid reprocessing, and reliably polls results with retry and timeout handling.

## Use cases

- Automated document ingestion for RAG pipelines
- Invoice, contract, or report parsing
- AI-powered document analysis workflows
- Knowledge base ingestion from Google Drive
- Preventing duplicate document processing in ETL pipelines

## External services

Google Drive, Landing.ai, Supabase

## Credentials required

- Google Drive OAuth2
- Landing.ai API (HTTP Bearer Token)
- Supabase API

## How it works

Once a PDF lands in the configured Google Drive location, the workflow triggers and converts it (even documents of more than 200 pages) into LLM-ready markdown. It also checks the database to see whether parsing has already been done, which avoids unnecessary LandingAI API calls.

## Setup instructions

### Step 1: Google Drive

1. Create or select a folder in Google Drive
2. Copy the folder ID
3. Update the Google Drive Trigger node with this folder ID

### Step 2: Landing.ai

1. Create a Landing.ai account
2. Generate an API key
3. Add it in n8n as an HTTP Bearer Auth credential
4. Update the organization-id header if required

### Step 3: Supabase

1. Create a Supabase project
2. Create a table named landing_parse_cache
3. Add fields such as: file_id, document_name, mime_type, file_size_bytes, job_id, job_status, markdown, uploaded_at, workflow_run_id
4. Connect Supabase credentials in n8n

## Expected input

A document uploaded into the configured Google Drive folder (PDF, DOCX, or other supported formats)

## Expected output

- Parsed markdown content stored in Supabase
- Metadata including: file ID, file name, MIME type, file size, job ID, processing status
- Early exit if the document already exists in the cache

## Error handling & edge cases

- Cache check to prevent duplicate processing
- Retry-based polling for async job completion
- Timeout detection for stuck jobs
- Large file output URL handling
- Detailed logging for debugging and audits

## Customization ideas

- Push parsed output to a vector database
- Trigger Slack or email notifications
- Store results in cloud storage (S3, GCS)
- Extend into a RAG or AI agent pipeline

## Categories

Document Processing, AI & LLM, Knowledge Management, Automation

## Difficulty level

Advanced

Happy Automating - from Alok
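The cache check that prevents duplicate processing can be sketched as a small decision function over rows of the `landing_parse_cache` table described above. The field names follow that table; the status values (`completed`, `failed`, `timeout`) are assumptions about how the workflow records job state.

```javascript
// Sketch of the duplicate-processing guard, assuming rows from the
// landing_parse_cache table. Status values are illustrative.
function shouldProcess(fileId, cacheRows) {
  const hit = cacheRows.find((row) => row.file_id === fileId);
  if (!hit) return true; // never seen before: parse it
  if (hit.job_status === 'completed') return false; // cached: early exit
  // Re-parse only if a previous job failed or timed out; an in-flight
  // 'processing' job is left to the polling/timeout logic.
  return hit.job_status === 'failed' || hit.job_status === 'timeout';
}
```

In the workflow, an IF node after the Supabase lookup would route `false` results to the early-exit branch, saving a LandingAI API call.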
by WeblineIndia
# Environment Config Diff & Propagate for Android Builds

This workflow automatically detects changes to the .env.staging file in a GitHub repository and keeps Android configuration files (build.gradle and gradle.properties) in sync. It creates a new Git branch, applies updates safely, opens a pull request, and notifies the team on Slack — all without manual effort.

Whenever .env.staging changes, the workflow:

1. Detects the change via a GitHub webhook
2. Compares ENV values with the Android config files
3. Automatically updates build.gradle and gradle.properties
4. Creates a new Git branch
5. Opens a pull request
6. Sends a Slack notification

You get:

- **Automatic Android config synchronization**
- **Safe updates via pull requests**
- **Zero manual copying of ENV values**
- **Instant Slack visibility for the team**

Ideal for teams managing Android app configuration across environments without mistakes.

## What It Does

This workflow automates Android configuration updates end-to-end:

1. Listens for GitHub push events.
2. Checks if .env.staging was modified.
3. Stops execution if the ENV file was not changed.
4. Fetches .env.staging and gradle.properties from main.
5. Converts both files into easy-to-compare key-value pairs.
6. Compares ENV values against Gradle values.
7. Creates a new Git branch for the changes.
8. Fetches files from the new branch.
9. Identifies which variables must be updated.
10. Applies ENV values to build.gradle and gradle.properties.
11. Commits the changes.
12. Creates a pull request.
13. Sends a Slack notification with PR details.

This ensures Android configs are always aligned with ENV files.
## Who’s It For

This workflow is ideal for:

- Android development teams
- DevOps & platform engineers
- CI/CD automation teams
- Teams managing multiple environments (staging / prod)
- Organizations avoiding manual config drift
- Anyone tired of copy-pasting ENV values

## Requirements to Use This Workflow

To run this workflow, you need:

- **n8n instance** (cloud or self-hosted)
- **GitHub repository** with: .env.staging, app/build.gradle, gradle.properties
- **GitHub Personal Access Token**
- **Slack workspace** + API credentials
- Basic understanding of Android config files

## How It Works

1. **GitHub Webhook Trigger:** Listens for commits pushed to the repository.
2. **ENV Change Check:** The workflow continues only if .env.staging changed.
3. **Fetch & Parse Files:** Reads the ENV and Gradle files and converts them into key-value format.
4. **Compare Config Values:** Finds missing or mismatched variables.
5. **Create Safe Branch:** Generates a timestamp-based branch from main.
6. **Apply Updates:** Updates only the required values in build.gradle and gradle.properties.
7. **Commit Changes:** Saves the updates to the new branch.
8. **Create Pull Request:** Opens a PR for review.
9. **Notify Slack:** Sends the PR link and details to the team.

## Setup Steps

1. Import the provided n8n workflow JSON.
2. Configure GitHub credentials in all GitHub & HTTP nodes.
3. Set your repository name and owner.
4. Configure Slack credentials and select a channel.
5. Ensure .env.staging exists in your repo.
6. Activate the workflow.
7. Push a change to .env.staging — automation starts instantly.

## How To Customize Nodes

### Customize ENV File

- Replace .env.staging with .env.production or others.
- Update the filename in the GitHub fetch + check nodes.

### Customize Android Files

Extend the logic to include: local.properties, BuildConfig.kt, settings.gradle

### Customize Branch Naming

- Modify the Generate New Branch Name node.
- Add environment names or commit IDs.
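A minimal sketch of what a customized Generate New Branch Name node could produce, with the environment name included as suggested above. The `config-sync` prefix and the exact timestamp format are assumptions, not the node's actual output.

```javascript
// Hypothetical branch-name generator: timestamp-based branch from main,
// tagged with the environment. Prefix and format are illustrative.
function generateBranchName(env, now = new Date()) {
  // e.g. 2024-05-01T12:30:45.000Z -> 20240501123045
  const stamp = now.toISOString().replace(/[-:T]/g, '').slice(0, 14);
  return `config-sync/${env}-${stamp}`;
}
```

Injecting `now` keeps the function deterministic for testing; in the workflow the node would simply call it with the current time.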
### Customize Slack Message

You can add: emojis, PR author name, changed variable list, mentions (@team, @android)

## Add-Ons (Optional Enhancements)

You can extend this workflow to:

- Support multiple ENV files
- Add approval checks before PR creation
- Auto-merge after approval
- Validate ENV variable formats
- Send a diff summary to Slack
- Block secrets from being committed
- Add Jira / Linear ticket creation
- Trigger an Android CI build after PR creation

## Use Case Examples

1. **Environment Sync:** Keep Android configs aligned with the staging ENV automatically.
2. **CI/CD Safety:** Prevent broken builds due to mismatched config values.
3. **Team Transparency:** Everyone sees config updates via Slack + PRs.
4. **DevOps Automation:** Remove repetitive manual config updates.
5. **Audit Friendly:** All changes are tracked through Git history & PRs.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|---|---|---|
| Workflow not triggered | Webhook not configured | Re-register the GitHub webhook |
| No PR created | .env.staging not changed | Ensure the file is modified |
| Wrong values updated | Parsing logic issue | Check the KEY=VALUE format |
| Slack message not sent | Invalid credentials | Reconnect the Slack API |
| Commit failed | Missing permissions | Check GitHub token scopes |

## Need Help?

If you need help extending or customizing this workflow, adding production support, CI integrations, security checks or enterprise-scale automation, then our n8n workflow development team at WeblineIndia can help you build robust, production-ready workflows.
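For reference, the "convert both files into key-value pairs and compare" steps described under How It Works can be sketched as two small functions. Both .env and gradle.properties use KEY=VALUE lines, so one parser covers both; this is a simplified sketch that ignores multi-line values and property escapes.

```javascript
// Parse KEY=VALUE lines, skipping blanks and # / ! comment lines.
function parseKeyValues(text) {
  const pairs = {};
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    if (!line || line.startsWith('#') || line.startsWith('!')) continue;
    const eq = line.indexOf('=');
    if (eq === -1) continue; // not a KEY=VALUE line
    pairs[line.slice(0, eq).trim()] = line.slice(eq + 1).trim();
  }
  return pairs;
}

// Return the keys whose values must be copied from ENV into Gradle
// (missing keys or mismatched values).
function diffConfigs(envPairs, gradlePairs) {
  return Object.keys(envPairs).filter(
    (key) => gradlePairs[key] !== envPairs[key]
  );
}
```

The resulting key list is what the update step applies to build.gradle and gradle.properties before committing to the new branch.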
by Antonio Gasso
Build an intelligent WhatsApp assistant that automatically responds to customer messages using AI. This template uses the Evolution API community node for WhatsApp integration and OpenAI for natural language processing, with built-in conversation memory powered by Redis to maintain context across messages.

> ⚠️ Self-hosted requirement: This workflow uses the Evolution API community node, which is only available on self-hosted n8n instances. It will not work on n8n Cloud.

## What this workflow does

- Receives incoming WhatsApp messages via the Evolution API webhook
- Filters and processes text, audio, and image messages
- Transcribes audio messages using OpenAI Whisper
- Analyzes images using GPT-4 Vision
- Generates contextual responses with conversation memory
- Sends replies back through WhatsApp

## Who is this for?

- Businesses wanting to automate customer support on WhatsApp
- Teams needing 24/7 automated responses with AI
- Developers building multimodal chat assistants
- Companies looking to reduce response time on WhatsApp

## Setup instructions

1. **Evolution API:** Install and configure Evolution API on your server. Create an instance and obtain your API key and instance name.
2. **Redis:** Set up a Redis instance for conversation memory. You can use a local installation or a cloud service like Redis Cloud.
3. **OpenAI:** Get your API key from platform.openai.com with access to GPT and Whisper models.
4. **Webhook:** Configure your Evolution API instance to send webhooks to your n8n webhook URL.

## Customization options

- Modify the system prompt in the AI node to change the assistant's personality and responses
- Adjust the Redis TTL to control how long conversation history is retained
- Add additional message type handlers for documents, locations, or contacts
- Integrate with your CRM or database to personalize responses

## Credentials required

- Evolution API credentials (self-hosted)
- OpenAI API key
- Redis connection
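The "filters and processes" step can be pictured as a small router over the incoming webhook payload. The payload shape below (`audioMessage`, `imageMessage`, `conversation`) mirrors typical Evolution API webhook events, but you should verify it against the events your instance actually emits.

```javascript
// Routing sketch for incoming WhatsApp messages. Payload field names
// are assumptions based on common Evolution API webhook shapes.
function classifyMessage(message) {
  if (message.audioMessage) return 'audio'; // route to Whisper transcription
  if (message.imageMessage) return 'image'; // route to GPT-4 Vision
  if (message.conversation || message.extendedTextMessage) return 'text';
  return 'unsupported'; // documents, locations, contacts, etc.
}
```

A Switch node keyed on this value would send each branch to its handler, with the `unsupported` branch being the natural place to add the extra message-type handlers mentioned above.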
by Asfandyar Malik
## Who’s it for

For HR professionals, recruiters, and hiring managers who want to automate the initial CV screening and candidate evaluation process. This workflow helps teams efficiently assess applicants based on submitted answers and resume data — saving hours of manual review and ensuring fair, consistent scoring.

## How it works

This workflow automates CV screening using Google Drive, Google Sheets, and Gemini AI. When a candidate submits a form with their answers and CV, the file is uploaded to Drive, converted from PDF to plain text, and merged with the form data. Gemini AI then analyzes both inputs, comparing skills, experience, and responses to generate consistency, job-fit, and final scores. Finally, the results are parsed, saved to Google Sheets, and automatically sorted by score, providing a ranked list of candidates for easy review.

## How to set up

1. Connect your Google Drive and Google Sheets credentials in n8n.
2. Configure your Form Trigger to capture candidate answers and CV uploads.
3. Set up the Extract from File node to parse PDF files into text.
4. Add your Gemini AI credentials securely using n8n’s credential system (no hardcoded keys).
5. Execute the workflow once to verify that CVs are uploaded, analyzed, and ranked in the connected Google Sheet.

## Requirements

- n8n account (cloud or self-hosted)
- Google Drive and Google Sheets integrations
- Gemini AI (Chat Model) API credentials
- A connected form (e.g., Typeform, n8n Form Trigger)

## How to customize

- Modify the AI prompt to align with your company’s job criteria or evaluation style.
- Add more scoring categories (e.g., education, technical skills, experience).
- Change the output destination — send results to Airtable, Notion, or Slack.
- Enhance it with dashboards or extra nodes for reporting and analytics.

## ⚠️ Disclaimer

This workflow uses Gemini AI, which may require self-hosting for community node compatibility. Ensure that no personal or sensitive candidate data is shared externally when using AI services.
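The final "sorted by score" step amounts to ranking the parsed Gemini output before it is written to the sheet. A minimal sketch, assuming each parsed row carries a numeric `final_score` field (the field name is an assumption standing in for whatever your output parser emits):

```javascript
// Hypothetical ranking step: sort parsed candidate rows by final_score
// (descending) and attach a 1-based rank for the Google Sheet.
function rankCandidates(rows) {
  return rows
    .slice() // don't mutate the incoming items
    .sort((a, b) => b.final_score - a.final_score)
    .map((row, i) => ({ rank: i + 1, ...row }));
}
```

Adding extra scoring categories (education, technical skills) would just mean summing them into `final_score` before this step, or sorting on a weighted combination instead.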
by Ranjan Dailata
## Who this is for

This workflow is designed for researchers, marketing teams, customer success managers, and survey analysts who want to automatically generate AI-powered summaries of form responses collected via Jotform — turning raw feedback into actionable insights.

It is ideal for:

- Teams conducting market research or post-event surveys.
- Customer experience teams that collect feedback via forms and need instant, digestible summaries.
- Product managers seeking concise overviews of user comments and suggestions.
- Analysts who want to compare comprehensive vs. abstract summaries for richer intelligence.

## What problem this workflow solves

Analyzing open-ended Jotform responses manually can be slow, repetitive, and error-prone. This workflow automates the process by generating two AI summaries for every response:

- **Comprehensive Summary** — captures all factual details from the response.
- **Abstract Summary** — rephrases and synthesizes insights at a higher, conceptual level.

With this workflow:

- Each response is summarized instantly using Google Gemini AI.
- Both comprehensive and abstract summaries are automatically generated and stored.
- Data is persisted in Google Sheets, a DataTable, and Google Docs for further use.

## What this workflow does

This n8n workflow transforms Jotform submissions into structured summaries using Google Gemini.

### Step-by-step breakdown

- **Webhook Trigger (Jotform integration):** Listens for new Jotform submissions using the Webhook node and receives the full form data via the webhook response.
- **Set the Input Fields:** Extracts and assigns key fields such as FormTitle, SubmissionID, and Body (the formatted form data), and prepares structured JSON to feed into the AI summarization stage.
- **Comprehensive & Abstract Summarizer:** Powered by the Google Gemini Chat Model (models/gemini-2.0-flash-exp) with the custom prompt: "You are an expert comprehensive summarizer. Build a detailed and abstract summary of the following {{ $json.body.pretty }}."
  It produces two distinct summaries: comprehensive_summary and abstract_summary.
- **Structured Output Parser:** Ensures the Gemini output matches a defined JSON schema ({ "comprehensive_summary": "", "abstract_summary": "" }), guaranteeing reliable downstream integration with Sheets and Docs.
- **Persist on DataTable:** Saves both summaries into an n8n DataTable for historical tracking or visualization. Useful for teams running internal analytics within n8n Cloud or self-hosted environments.
- **Append or Update Row in Google Sheets:** Writes both summaries into a connected Google Sheet, in the columns comprehensive_summary and abstract_summary.
- **Create Google Document:** Automatically generates a Google Docs file titled {FormTitle}-{SubmissionID}, acting as a per-submission record with a placeholder ready for AI summary insertion.
- **Update Google Document:** Inserts both summaries directly into the newly created Google Doc (Comprehensive Summary: the full detailed summary; Abstract Summary: the conceptual summary). Each doc becomes a polished, shareable insight artifact.

## Concepts used in the workflow

### Comprehensive summarization

Comprehensive summarization captures every important detail in a factual, exhaustive way — ideal when accuracy and completeness matter.

- **Goal:** Provide a detailed understanding of user responses without losing nuance.
- **Best for:** Research surveys, customer service logs, support ticket summaries, feedback traceability.

### Abstract summarization

Abstract summarization rephrases and synthesizes ideas, offering high-level insights rather than copying text.

- **Goal:** Capture the essence and implications of feedback — ideal for storytelling and executive reviews.
- **Best for:** Executive summaries, marketing insights, customer trend analysis, blog-style recaps.

## Setup instructions

### Prerequisite

If you are new to Jotform, please sign up via the Jotform signup page. For the purpose of demonstration, we use a Jotform prebuilt form as an example.
Follow these steps to deploy and customize the workflow.

### Step 0: Local n8n

This step is required only for locally hosted n8n. Install ngrok, then configure and run it against your local n8n port:

ngrok http 5678

Copy the base URL (e.g., https://2c6ab9f2c746.ngrok-free.app/), as it will be used in the webhook configuration for Jotform.

### Step 1: Configure the Jotform webhook

1. Copy the webhook URL generated by n8n’s Jotform Trigger node (double-click the node to copy it).
2. In your Jotform dashboard, go to: Settings → Integrations → Webhooks → Add Webhook.
3. If you are running this workflow on a self-hosted n8n instance, replace the webhook URL's base with the public ngrok base URL from Step 0 so that Jotform can POST to a publicly reachable address.

### Step 2: Connect Google Gemini

1. Navigate to n8n → Credentials → Google Gemini (PaLM API).
2. Add API credentials and select the model: models/gemini-2.0-flash-exp.
3. Test the connection before proceeding.

### Step 3: Configure the Structured Output Parser

1. Open the Structured Output Parser node.
2. Ensure the schema includes: { "comprehensive_summary": "", "abstract_summary": "" }
3. Modify or expand the schema fields if additional summaries (e.g., "sentiment_summary") are needed.

### Step 4: Connect Google Sheets

1. Link your Google Sheets OAuth2 credentials.
2. Specify the Document ID (Google Sheet URL) and Sheet Name (e.g., "Sheet1").
3. Map columns to comprehensive_summary and abstract_summary.

### Step 5: Enable DataTable storage (optional)

1. Use the DataTable node to maintain a permanent database within n8n Cloud.
2. Configure the schema fields for comprehensive_summary and abstract_summary.

### Step 6: Generate and update Google Docs

Link your Google Docs account under n8n credentials.
The workflow auto-creates and updates a doc per submission, embedding both summaries for easy sharing.

## How to customize

- **Add sentiment analysis:** After generating the summary, insert another Google Gemini node to classify the tone of each response — for example, Positive, Neutral, or Negative. This helps you track user sentiment trends over time.
- **Send alerts for urgent feedback:** Use an IF node to check whether the abstract summary contains words such as "urgent," "issue," or "negative." If triggered, automatically send an alert through Slack, Gmail, or Discord so the team can respond immediately.
- **Enable multi-language support:** Insert a language detection node before the Gemini summarizer. Once the language is detected, modify the summarizer prompt dynamically to summarize in that same language — ensuring localized insights.
- **Add topic extraction:** Include an additional Gemini text extraction node that identifies major topics or recurring themes from each response before summarization. This creates structured insights ready for analytics or tagging.
- **Integrate with CRM or ticketing systems:** Connect the workflow to HubSpot, Salesforce, or Zendesk to automatically create new records or tickets based on the feedback type or sentiment. This closes the loop between survey collection and actionable response.

## Summary

This workflow automates survey intelligence generation from Jotform submissions — powered by Google Gemini AI — delivering dual-layer summarization outputs directly into Google Sheets, DataTables, and Google Docs.

Benefits:

- Instant comprehensive and abstract summaries per submission.
- Ready-to-use outputs for reports, dashboards, and client deliverables.
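The base-URL swap from Step 0/Step 1 (local webhook path, public ngrok base) can be sketched as a small helper. The URLs below are placeholders; the ngrok base is the example one shown earlier.

```javascript
// Swap the local webhook base for the public ngrok base so Jotform can
// reach a locally hosted n8n. Keeps the path and query string intact.
function toPublicWebhookUrl(localUrl, ngrokBase) {
  const { pathname, search } = new URL(localUrl);
  return new URL(pathname + search, ngrokBase).toString();
}
```

This avoids hand-editing the URL and accidentally dropping the webhook path.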
by Yasser Sami
# Outlook & Notion Knowledge Base Builder for AI Agents

This n8n template lets you automatically build and maintain an AI-ready knowledge base from Outlook emails and Notion pages. It stores both sources in a Pinecone vector database so your AI agent can reference them when generating answers, extracting information, or drafting context-aware emails.

## Who’s it for

- Teams using Outlook for client or customer communications.
- Businesses maintaining documentation or notes in Notion.
- Developers or AI builders who want to create custom knowledge bases for intelligent agents.

## How it works / What it does

**Outlook integration:** When you move an email or thread to a specific folder (e.g., “knowledgebase”), the workflow triggers automatically. It captures the entire conversation, removes duplicates, and stores it in Pinecone under a namespace called "emails". The AI agent later uses these stored emails to learn tone, structure, and context for drafting responses.

**Notion integration:** When a page is added to a designated Notion database, the workflow triggers. The page content is extracted and embedded into Pinecone under the "knowledgebase" namespace. This allows your agent to use your Notion pages as factual reference material when answering questions.

**AI agent setup:** Uses two Pinecone tools — one to reference Notion content (knowledge base) and another for emails (style & tone). The agent is equipped with Cohere embeddings for multilingual support and OpenAI for conversation and drafting tasks. When queried, it retrieves relevant knowledge from both namespaces to produce accurate, context-rich replies.

## How to set up

1. Import this template into your n8n account.
2. Connect your Outlook account and specify which folder to watch (e.g., “knowledgebase”).
3. Connect your Notion account and select your target database.
4. Connect your Pinecone, Cohere, and OpenAI credentials.
5. Activate the workflow — new Notion pages and Outlook threads will automatically update your knowledge base.
## Requirements

- n8n account
- Microsoft Outlook account
- Notion account and database
- Pinecone account for vector storage
- Cohere API key for embeddings
- OpenAI API key for the AI model

## How to customize the workflow

- **Namespaces:** Rename or expand namespaces (e.g., “sales_emails”, “internal_docs”) to organize knowledge types.
- **Embeddings model:** Replace Cohere with OpenAI embeddings if preferred.
- **Agent behavior:** Adjust the agent’s system message to change its tone or purpose (e.g., “act as a sales assistant”).
- **Extra sources:** Extend this workflow to include PDFs, websites, or Slack messages.
- **Sync schedule:** Modify trigger intervals to control how frequently updates are captured.

This workflow automatically builds a living, AI-powered knowledge base from your Outlook and Notion data — perfect for intelligent support, research, or content generation.
by Nikitha
## Who’s it for

This template is ideal for IT support teams, internal helpdesk automation engineers, and developers building intelligent ticketing systems. It helps streamline ITSM workflows by automatically classifying user queries, retrieving relevant knowledge base entries, and triggering incident creation in ServiceNow.

## How it works / What it does

This workflow uses Google Gemini and Qdrant to power an intelligent ITSM assistant. When a user submits a query via chat:

1. The Text Classifier categorizes the input as an Incident, Request, or Other.
2. Based on the category:
   - Incidents are automatically logged in ServiceNow.
   - Requests trigger an HTTP call (e.g., for provisioning or access).
   - Other queries are routed to an AI Agent that searches the FAQBase in Qdrant and responds contextually.
3. The Gemini LLM enriches responses and summaries.
4. The Qdrant Vector Store retrieves semantically similar answers from a pre-embedded FAQ knowledge base.
5. The Summarization Chain condenses incident details for better tracking.

Sticky notes are used throughout the workflow to document each node’s purpose and improve maintainability.

## How to set up

1. Connect your Google Gemini API, Qdrant, and ServiceNow credentials.
2. Populate the FAQBase collection in Qdrant with your ITSM knowledge base.
3. Deploy the webhook to receive chat inputs.
4. Test the flow using the Manual Trigger node.
5. Customize the classifier categories and Gemini prompts as needed.

## Requirements

- Google Gemini API access
- Qdrant vector database with embedded FAQ data
- ServiceNow account with API access
- n8n instance with LangChain nodes installed

## How to customize the workflow

- Modify the Text Classifier categories to suit your organization’s ticket types.
- Add more FAQ entries to Qdrant for broader coverage.
- Replace the HTTP Request node with integrations relevant to your ITSM tools.
- Adjust the Gemini prompts to reflect your tone and support style.
- Extend the workflow with Slack, Teams, or email notifications for ticket updates.
by AureusR
# WhatsApp customer service bot (with voice note transcription) handling FAQs, service enquiries, and appointment scheduling

## Who’s it for

This template is designed for businesses that provide customer support and appointment-based services over WhatsApp. It’s ideal for service providers (e.g., clinics, salons, repair shops, consultants) that want to automate FAQs, share service information, handle voice note inquiries, and schedule appointments without manual effort.

## How it works / What it does

This workflow creates a WhatsApp customer service assistant that:

- **Transcribes voice notes** sent by customers into text for further processing.
- Answers customer FAQs by looking up a Google Sheet knowledge base.
- Provides service information (name, description, price) from a Google Sheet.
- Schedules appointments by:
  1. Asking the customer which service they want.
  2. Collecting their preferred day and time.
  3. Checking Google Calendar for available slots.
  4. Offering 3 options and letting the customer choose.
  5. Collecting name, email, and phone number.
  6. Creating the confirmed appointment in Google Calendar.
- Sends all customer-facing messages via a WhatsApp integration node.

## How to set up

1. **Connect your tools**
   - Link your Google Sheets for FAQs and Services.
   - Connect your Google Calendar account.
   - Configure your WhatsApp integration.
   - Connect a transcription service (e.g., Whisper, Google Speech-to-Text, or another transcription API).
2. **Prepare your data**
   - FAQs Google Sheet → must contain columns: id | question | answer
   - Services Google Sheet → must contain columns: id | service_name | service_description | price
3. **Adjust the flow**
   - Update the service names and questions to match your business.
   - Set the correct time zone in the Google Calendar node.
   - Update the WhatsApp integration node with your business account.
   - Configure the transcription node with your chosen API credentials.
## Requirements

- Google Sheets (for FAQs and Services)
- Google Calendar
- WhatsApp integration in n8n
- Speech-to-Text API (for transcribing voice notes)

## How to customize the workflow

- **Adding new FAQs:** Update the Google Sheet with new rows.
- **Changing services:** Modify the Services Google Sheet to reflect updated offerings or prices.
- **Custom messages:** Update the agent_reply node text to reflect your brand tone.
- **Advanced logic:** Add routing for voice-note-only customers, VIP handling, or multilingual support.

## Notes

This template uses multiple external integrations (Google Sheets, Google Calendar, WhatsApp, Speech-to-Text).
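The "offer 3 options" step can be sketched as follows: given busy intervals fetched from Google Calendar, propose the first few free slots in a working day. Everything here is a simplified assumption (whole-hour slots, hours expressed as integers); the real node would work with full date-times in the configured time zone.

```javascript
// Illustrative slot finder: hours are integers (e.g. 9 = 09:00), busy
// intervals come from the Google Calendar lookup, and slots are 1h long.
function suggestSlots(busy, dayStartHour, dayEndHour, count = 3) {
  const free = [];
  for (let h = dayStartHour; h < dayEndHour && free.length < count; h++) {
    // A candidate slot [h, h+1) clashes if it overlaps any busy interval.
    const clash = busy.some((b) => h < b.end && h + 1 > b.start);
    if (!clash) free.push({ start: h, end: h + 1 });
  }
  return free;
}
```

The three returned slots would then be formatted into the WhatsApp reply for the customer to pick from.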
by Shadrack
## How it works

An AI-powered sales agent on WhatsApp handles product inquiries using your Supabase knowledge base and n8n catalog. Customers can send text, voice notes, or images to ask about products, pricing, and specs. The agent responds with detailed answers, product images, and FAQs, creating a complete self-service sales experience. (This was primarily designed for a furniture business; consider tailoring it.)

## Setup steps

1. Connect Supabase credentials for your knowledge base and FAQs.
2. Configure n8n tables with your product catalog (prices, descriptions, image links).
3. Set up the WhatsApp Business API integration.
4. Add your product categories and common queries to the AI context.
5. Test with sample product questions and image uploads.

## Customization tips

- Structure your catalog tables with clear columns (SKU, price, description, image_url).
- Add industry-specific terminology to the AI prompt.
- Create templated responses for common FAQs to ensure consistency.
- Enable voice-to-text transcription for better voice note handling.
- Add product recommendation logic based on customer queries.
- Set up image recognition to identify items from product photos customers send.
by ConceptRecall
## Who is this for?

This workflow is designed for software teams, project managers, and developers who manage work across Azure DevOps and GitHub. It helps organizations that use Azure DevOps for work item tracking but rely on GitHub for issue management and collaboration. If you need to ensure that your DevOps Stories and Tasks are mirrored in GitHub issues while keeping a single source of truth in Google Sheets, this workflow is for you.

## What problem is this workflow solving? / Use case

Managing projects across multiple platforms often leads to missed updates and poor traceability:

- Stories created in Azure DevOps may not be tracked properly in GitHub.
- Tasks under Stories often lose visibility when teams split between platforms.
- Manual syncing between tools takes time and causes human errors.

This workflow solves that problem by automating the sync between Azure DevOps Stories and GitHub Issues, while also keeping a Google Sheets record for cross-referencing and reporting.

## What this workflow does

- **Triggers from Azure DevOps Stories:** When a Story is created or updated, the workflow is activated.
- **Creates a GitHub Issue:** A new issue is generated in the specified GitHub repository.
- **Assigns a random collaborator:** One repository collaborator is randomly assigned to the issue.
- **Logs the mapping in Google Sheets:** The Azure DevOps Story ID, GitHub Issue number, and URL are stored for tracking.
- **Triggers from Azure DevOps Tasks:** When a Task linked to a Story is created, the workflow looks up its parent in Google Sheets.
- **Updates the GitHub Issue:** The parent GitHub Issue is updated with a clickable link to the new Task for better visibility.

## Setup

- **Connect your accounts:** GitHub (OAuth2 or personal token), Google Sheets (OAuth2), Azure DevOps (webhook integration).
- **Configure webhooks:** Add the workflow's webhook URLs to Azure DevOps service hooks for Work Item Created/Updated events.
- **Update repository details:** Set the GitHub repository where issues should be created.
- **Customize Sheets:** Use the provided Google Sheet or link your own for issue mappings.

## How to customize this workflow to your needs

- **Modify assignment logic:** Instead of random collaborator assignment, edit the Code node to assign issues based on workload or labels.
- **Change the Sheet schema:** Add more fields (e.g., State, IterationPath) to your Google Sheet for richer reporting.
- **Expand task linking:** Customize the way Tasks are appended to GitHub issues (e.g., group by state, show due dates).

Powered by Concept Recall: https://conceptrecall.com
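The random-assignment Code node is roughly this, with the random source injectable so swapping in workload- or label-based logic is a one-line change. This is a sketch of the described behavior, not the template's exact node code.

```javascript
// Pick one collaborator login at random. `rng` defaults to Math.random
// but can be replaced (e.g., by a workload-weighted chooser) or pinned
// in tests for determinism.
function pickAssignee(collaborators, rng = Math.random) {
  if (collaborators.length === 0) return null; // leave the issue unassigned
  return collaborators[Math.floor(rng() * collaborators.length)];
}
```

The chosen login would then be passed to the GitHub node's assignee field when creating the issue.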
by Alexandra Spalato
# Companies Email Finder & Lead Generation Automation

## Short Description

Automatically find company domains, extract decision maker emails (CEO, Sales, Marketing), validate email quality, and build a comprehensive prospect database using AI-powered search and professional email finding APIs.

## Detailed Description

This comprehensive lead generation workflow transforms a simple list of company names into a complete prospect database with verified decision maker contacts. The system automatically discovers official company websites, finds key decision makers' email addresses, validates email quality, and organizes everything in a structured database for immediate outreach. Perfect for sales teams, marketing agencies, business developers, and anyone who needs to build high-quality prospect lists efficiently and cost-effectively.

## Key Features

- **Intelligent Domain Discovery**: Uses Serper.dev and AI to find official company websites from search results
- **Multi-Role Email Finding**: Automatically extracts emails for:
  - CEOs and C-level executives
  - Sales decision makers
  - Marketing decision makers
- **Email Quality Validation**: Classifies emails as "valid" or "risky" for better deliverability
- **Smart Fallback System**: Searches for additional company emails when decision makers aren't found
- **Duplicate Prevention**: Removes duplicate contacts automatically
- **Batch Processing**: Handles large company lists efficiently with intelligent batching
- **Database Integration**: Stores all data in NocoDB with proper organization and status tracking
- **Rate Limiting**: Includes delays and error handling to respect API limits

## Who This Workflow Is For

- **Sales Teams**: Building targeted prospect lists for outbound campaigns
- **Marketing Agencies**: Creating lead databases for client campaigns
- **Business Development**: Finding decision makers for partnership opportunities
- **Recruiters**: Locating hiring managers and HR contacts
- **Entrepreneurs**: Building contact lists for product launches or fundraising
- **Lead Generation Services**: Automating prospect research for clients

## Problems This Workflow Solves

- **Manual Research Time**: Eliminates hours of manual company and contact research
- **Incomplete Contact Data**: Ensures you have decision makers, not just generic emails
- **Email Deliverability Issues**: Validates email quality before outreach campaigns
- **Data Organization**: Maintains clean, structured prospect databases
- **Scaling Bottlenecks**: Processes hundreds of companies automatically
- **High Lead Generation Costs**: Reduces dependency on expensive lead generation services

## Setup Requirements

### Required API Credentials

- **Serper.dev API Key**: For company domain search and discovery
- **OpenAI API Key**: For intelligent domain extraction from search results
- **AnyMailFinder API Key**: For decision maker email discovery and validation
- **NocoDB API Token**: For database storage and management

### Database Structure

**Companies Table:**

- Id (Number): Unique company identifier
- company_name (Text): Company name to search
- location (Text): Company location for better search results
- url (URL): Discovered company website
- domain (Text): Extracted domain name
- status (Select): Processing status tracking
- emails (Text): All discovered company emails
- company_emails_status (Text): Email validation status

**Contacts Table:**

- companies_id (Number): Link to parent company
- name (Text): Contact full name
- position (Text): Job title/role
- email (Email): Contact email address
- email_status (Text): Email validation status
- linkedin_url (URL): LinkedIn profile (when available)

### System Requirements

- Active n8n instance (self-hosted or cloud)
- NocoDB database instance
- Active API subscriptions for Serper.dev, OpenAI, and AnyMailFinder

## How It Works

### Phase 1: Domain Discovery

1. **Company Processing**: Retrieves companies from the database in batches
2. **Domain Search**: Uses Serper.dev to search for official company websites
3. **AI Domain Extraction**: OpenAI analyzes search results to identify official domains
4. **Database Updates**: Stores discovered domains and URLs

### Phase 2: Decision Maker Discovery

1. **Multi-Role Search**: Finds emails for CEO, Sales, and Marketing decision makers using AnyMailFinder
2. **Email Validation**: Validates email deliverability and flags risky addresses
3. **Contact Creation**: Stores validated contacts with full details
4. **Status Tracking**: Updates company status based on email discovery success

### Phase 3: Company Email Backup

1. **Gap Analysis**: Identifies companies with no valid decision maker emails
2. **Bulk Email Search**: Finds up to 20 additional company emails using AnyMailFinder
3. **Final Updates**: Stores all discovered emails for comprehensive coverage

## Customization Options

### Search Parameters

- Modify Serper.dev search queries for better domain discovery
- Adjust location-based search parameters
- Customize AI prompts for domain extraction

### Decision Maker Roles

- Add new decision maker categories (HR, Finance, Operations, etc.)
- Modify existing role search parameters in AnyMailFinder
- Customize email validation criteria

### Data Processing

- Adjust batch sizes for different processing speeds
- Modify rate limiting delays
- Customize error handling and retry logic

### Database Schema

- Add custom fields for industry, company size, etc.
- Integrate with different database systems
- Customize data validation rules

## API Costs and Credits

- **AnyMailFinder**: 2 credits per valid email found, 1 credit per bulk company search
- **Serper.dev**: ~$1 per 1,000 searches
- **OpenAI**: Minimal costs for domain extraction prompts
- **Estimated Cost**: About $0.03 per company processed (depending on email discovery success)

## Benefits

- **Save 20+ Hours Weekly**: Automate prospect research that takes hours manually
- **Higher Quality Leads**: Get decision makers, not generic contact@ emails
- **Better Deliverability**: Email validation reduces bounce rates
- **Scalable Process**: Handle thousands of companies automatically
- **Cost Effective**: Much cheaper than traditional lead generation services
- **Complete Database**: Build comprehensive prospect databases with all contact details

## Use Cases

- **Outbound Sales Campaigns**: Build targeted prospect lists for cold outreach
- **Partnership Development**: Find decision makers at potential partner companies
- **Market Research**: Understand company structures and contact hierarchies
- **Recruitment**: Locate hiring managers and HR contacts
- **Investor Relations**: Find contacts at potential investor companies
- **Vendor Outreach**: Identify procurement and operations contacts

## Installation Instructions

1. Import the workflow JSON into your n8n instance
2. Set up the NocoDB database with the required table structures
3. Configure all API credentials in the credential manager (including Serper.dev and AnyMailFinder)
4. Update NocoDB connection settings with your database details
5. Test with a small batch of companies before full deployment
6. Monitor API usage and adjust batch sizes as needed

## Best Practices

- Start with high-quality company names and locations
- Monitor AnyMailFinder credit usage to manage costs
- Use Serper.dev efficiently with targeted search queries
- Regularly clean and validate your prospect database
- Respect email deliverability best practices
- Follow GDPR and data privacy regulations
- Use rate limiting to avoid API restrictions

## Error Handling
- Built-in retry mechanisms for API failures
- Continues processing even if individual companies fail
- Create an Error Workflow
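The retry-and-continue behaviour described above can be sketched as a small wrapper around any per-company API call; the attempt count and delay are illustrative assumptions, not the workflow's exact settings:

```javascript
// Retry an async API call a fixed number of times with a delay between
// attempts; rethrow the last error so the caller can mark that company
// as failed and continue with the rest of the batch.
async function withRetry(fn, attempts = 3, delayMs = 1000) {
  let lastError;
  for (let attempt = 1; attempt <= attempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < attempts) {
        await new Promise(resolve => setTimeout(resolve, delayMs));
      }
    }
  }
  throw lastError;
}

// Usage sketch (findEmails is a stand-in for one company lookup):
// const result = await withRetry(() => findEmails(company), 3, 2000)
//   .catch(() => null); // null marks the company as failed; batch continues
```

Pairing a wrapper like this with the batch delays mentioned under Data Processing keeps one flaky lookup from aborting an entire run.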