by Humble Turtle
# GitHub Deployer Agent

## Overview
The GitHub Deployer Agent is an intelligent automation tool that integrates with Slack to streamline code deployment workflows. Powered by Anthropic's Claude 3.5 and Tavily for web search, it enables seamless, context-aware file pushes to a GitHub repository with minimal user input.

## Capabilities
- Accepts natural language instructions via Slack
- Automatically pushes code to a default GitHub repository
- Uses Claude 3.5 for code generation and decision-making
- Leverages Tavily for real-time web search to enhance context
- Supports folder structure hints to keep repositories clean and organized

## Required Connections
To operate correctly, the following integrations must be in place:
- Slack API token with permission to read messages and post responses
- GitHub Personal Access Token with repo write permissions
- Tavily API key for external search functionality
- Claude 3.5 API access via Anthropic

Detailed configuration instructions are provided in the workflow.

## Example Input
From Slack, you can send messages like:

> "Generate a basic README.md for my Python project and store it in the root directory."

A minimal sketch of the underlying GitHub push call appears at the end of this description.

## Customising This Workflow
You can tailor the workflow by:
- Modifying default folder paths or repository settings
- Integrating a Jira node to use issue keys as default folder names
- Adding a Slack file upload option
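For context on what the push step amounts to under the hood, here is a minimal sketch using GitHub's contents API. It is illustrative only: the template uses n8n's GitHub node rather than a hand-written request, and `OWNER`, `REPO`, and the token variable are placeholders you would replace.

```javascript
// Illustrative only: roughly the REST call a "push file to the default repository" step boils down to.
// OWNER, REPO, and GITHUB_TOKEN are placeholders.
async function pushFile(path, content, message, branch = 'main') {
  const response = await fetch(`https://api.github.com/repos/OWNER/REPO/contents/${path}`, {
    method: 'PUT',
    headers: {
      Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
      Accept: 'application/vnd.github+json',
    },
    body: JSON.stringify({
      message,                                          // commit message
      content: Buffer.from(content).toString('base64'), // the contents API expects base64
      branch,
      // Updating an existing file additionally requires its current `sha`.
    }),
  });
  return response.json();
}

// e.g. after the agent generates a README from the Slack prompt:
// pushFile('README.md', '# My Python Project\n', 'Add basic README').then(console.log);
```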
by Evoort Solutions
# TikTok Transcript Generator

## Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with transcripts. It handles batches, errors, and rate limits to provide a seamless transcription process.

## Key Features
- **Batch processing:** Reads and processes multiple TikTok video URLs from Google Sheets.
- **Automatic transcript generation:** Uses the TikTok Transcript Generator API on RapidAPI.
- **Clean subtitle output:** Removes timestamps and headers for clear transcripts (a sketch of this cleanup step appears at the end of this description).
- **Error handling:** Marks videos with no available transcript.
- **Rate limiting:** Implements wait times to avoid API throttling on RapidAPI.
- **Seamless Google Sheets integration:** Updates the same sheet with transcript results and statuses.

## API Used
- TikTok Transcript Generator API (RapidAPI)

## Google Sheet Columns

| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by the workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |

## Workflow Nodes Explanation

| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking ‘Execute workflow’ | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps. |
| Google Sheets | Google Sheets (Update) | Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update) | Updates the sheet with a "No transcription available" message on error. |
| Wait | Wait | Adds a delay between batches to avoid API rate limits on RapidAPI. |

## Challenges Resolved
- **Manual transcription effort:** Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- **API rate limits:** Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
- **Incomplete or missing data:** Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
- **Data formatting issues:** Cleans raw subtitle data to provide readable, timestamp-free transcripts.
- **Data synchronization:** Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.

## Use Cases
- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into their workflows.

## How to Use
1. Prepare a Google Sheet with the columns: Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View transcripts and generated dates directly in your Google Sheet.

Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:

👉 Start Automating with n8n

Save time, stay consistent, and grow your content library effortlessly!
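For reference, the timestamp-stripping done by the Code node can be sketched roughly as below. This is a minimal illustration assuming WebVTT-style subtitles arrive in a `transcript` field; the actual field name and subtitle format returned by the RapidAPI endpoint may differ.

```javascript
// n8n Code node sketch (illustrative, not the template's exact code).
// Assumes each incoming item carries raw subtitles in item.json.transcript.
const results = [];
for (const item of $input.all()) {
  const raw = item.json.transcript || '';
  const cleaned = raw
    .split('\n')
    .filter(line =>
      line.trim() !== 'WEBVTT' &&                      // drop the WebVTT header
      !/^\d+$/.test(line.trim()) &&                    // drop cue numbers
      !/\d{2}:\d{2}:\d{2}[.,]\d{3}\s*-->/.test(line)   // drop timestamp lines
    )
    .join(' ')
    .replace(/\s+/g, ' ')
    .trim();
  results.push({
    json: {
      ...item.json,
      Transcript: cleaned,
      'Generated Date': new Date().toISOString().slice(0, 10), // YYYY-MM-DD
    },
  });
}
return results;
```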
by Yaron Been
# CTO Agent with Engineering Team

## Description
Complete AI-powered engineering department with a Chief Technology Officer (CTO) agent orchestrating specialized engineering team members for comprehensive software development and technical operations.

## Overview
This n8n workflow creates a comprehensive engineering department using AI agents. The CTO agent analyzes technical requests and delegates tasks to specialized agents for software architecture, DevOps, security, quality assurance, backend development, and frontend development.

## Features
- Strategic CTO agent using OpenAI O3 for complex technical decision-making
- Six specialized engineering agents powered by GPT-4.1-mini for efficient execution
- Complete software development lifecycle coverage from architecture to deployment
- Automated DevOps pipelines and infrastructure management
- Security assessments and compliance frameworks
- Quality assurance and test automation strategies
- Full-stack development capabilities

## Team Structure
- **CTO Agent**: Technical leadership and strategic delegation (O3 model)
- **Software Architect Agent**: System design, patterns, technology stack decisions
- **DevOps Engineer Agent**: CI/CD pipelines, infrastructure automation, containerization
- **Security Engineer Agent**: Application security, vulnerability assessments, compliance
- **QA Test Engineer Agent**: Test automation, quality strategies, performance testing
- **Backend Developer Agent**: Server-side development, APIs, database architecture
- **Frontend Developer Agent**: UI/UX development, responsive design, frontend frameworks

## How to Use
1. Import the workflow into your n8n instance
2. Configure OpenAI API credentials for all chat models
3. Deploy the webhook for chat interactions (an example request is sketched at the end of this description)
4. Send technical requests via chat (e.g., "Design a scalable microservices architecture for our e-commerce platform")
5. The CTO will analyze and delegate to appropriate specialists
6. Receive comprehensive technical deliverables

## Use Cases
- **Full Stack Development**: Complete application architecture and implementation
- **System Architecture**: Scalable designs for microservices and distributed systems
- **DevOps Automation**: CI/CD pipelines, containerization, cloud deployment strategies
- **Security Audits**: Vulnerability assessments, secure coding practices, compliance
- **Quality Assurance**: Test automation frameworks, performance testing strategies
- **Technical Documentation**: API documentation, system diagrams, deployment guides

## Requirements
- n8n instance with LangChain nodes
- OpenAI API access (O3 for the CTO, GPT-4.1-mini for specialists)
- Webhook capability for chat interactions
- Optional: integration with development tools and platforms

## Cost Optimization
- O3 model used only for strategic CTO decisions
- GPT-4.1-mini provides 90% cost reduction for specialist tasks
- Parallel processing enables simultaneous agent execution
- Code template library reduces redundant development work

## Integration Options
- Connect to development platforms (GitHub, GitLab, Bitbucket)
- Integrate with project management tools (Jira, Trello, Asana)
- Link to monitoring and logging systems
- Export to documentation platforms

## Contact & Resources
- **Website**: nofluff.online
- **YouTube**: @YaronBeen
- **LinkedIn**: Yaron Been

## Tags
#SoftwareEngineering #TechStack #DevOps #SecurityFirst #QualityAssurance #FullStackDevelopment #Microservices #CloudNative #TechLeadership #EngineeringAutomation #n8n #OpenAI #MultiAgentSystem #EngineeringExcellence #DevAutomation #TechInnovation
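As an illustration of step 3 above, the snippet below shows one way a client could call the deployed chat webhook. The URL is a placeholder, and the payload fields (`action`, `sessionId`, `chatInput`) are assumptions based on a typical n8n chat trigger; check your own webhook's expected request shape before using it.

```javascript
// Illustrative only: sending a technical request to the deployed chat webhook.
async function askCto(question) {
  const response = await fetch('https://your-n8n-instance/webhook/cto-agent-chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      action: 'sendMessage',
      sessionId: 'demo-session-001', // keeps multi-turn context together
      chatInput: question,
    }),
  });
  return response.json();
}

// askCto('Design a scalable microservices architecture for our e-commerce platform')
//   .then(reply => console.log(reply));
```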
by Robert Breen
This n8n workflow dynamically generates a realistic sample dataset based on a single topic you provide. It uses OpenAI (via LangChain) and n8n's built-in nodes to:

- Generate structured JSON data for 5 columns with 3–5 values each
- Flatten that data into a single text blob
- Infer meaningful column names via a second AI call
- Pivot, split, merge, and rename columns automatically
- Output a clean, labeled dataset ready for export or further processing

## ⚙️ Prerequisites
- **OpenAI API Key**
  - Visit: https://platform.openai.com/account/api-keys
  - Create a new key
  - In n8n: Credentials → New → OpenAI API, paste the key, name it "OpenAi account"
- LangChain nodes enabled in your n8n instance

## 🥇 Step 1: Set Up OpenAI Credential
- Go to OpenAI API Keys
- Create and copy your key
- In n8n: Credentials → New → OpenAI API → paste the key as "OpenAi account"

## 🥈 Step 2: Manual Trigger
- Add a **Manual Trigger** to start the workflow

## 🥉 Step 3: Set Topic
- Add a **Set** node named `Set Topic to Search`
- Field: `Topic` = `n8n use cases` (or any topic you choose)

## ✨ Step 4: Generate Structured Data
- **LangChain Agent** node `Generate Random Data`
- Connect to `OpenAI Chat Model1` and Tool: `Inject Creativity1`
- System prompt: instruct the AI to output 5 columns of realistic values in JSON

## 🔧 Step 5: Parse AI Output
- **Structured Output Parser** to validate the JSON

## 🔄 Step 6: Flatten Data
- **Code** node `Outpt all Data to One Field`
- Joins all values into a comma-separated string for column naming (a sketch of this step appears at the end of this description)

## 🧠 Step 7: Generate Column Names
- **LangChain Agent** `Generate Column Names`
- Connect to `OpenAI Chat Model2`
- Prompt: infer 5 column names from the string

## 🔢 Step 8: Pivot Names Row
- **Code** node `Pivot Column Names` transforms the array into `{ column1: name1, … }`

## 🪓 Step 9: Split Columns
- 5 **SplitOut** nodes to break each array back into rows per column

## 🔗 Step 10: Merge Rows
- **Merge** node `Merge Columns together` using `combineByPosition`

## 🏷️ Step 11: Rename Columns
- **Set** node `Rename Columns` assigns the AI-generated names to each column

## 🔗 Step 12: Final Output
- **Merge** `Append Column Names` combines the data and header row

## 🏁 Done!
You now have a fully AI-driven, labeled dataset generated from a single topic—no external services needed. Easily extend by adding a Google Sheets or HTTP node to export.

## 📬 Need Help or Want to Customize This?
📧 robert@ynteractive.com
🔗 LinkedIn
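The two Code nodes in Steps 6 and 8 can be sketched roughly as below. These are illustrative stand-ins, not the template's exact node code: in n8n each body would live in its own Code node and end with a `return`, and the `allValues` field name is an assumption.

```javascript
// Step 6 – flatten all generated values into a single comma-separated string
// so the next AI call can infer column names from one text blob.
function flattenItems(items) {
  const values = items.flatMap(item => Object.values(item.json));
  return [{ json: { allValues: values.join(', ') } }];
}

// Step 8 – pivot an array of AI-generated names into { column1: name1, ... }
// so a later Set node can rename columns by position.
function pivotColumnNames(names) {
  const pivoted = {};
  names.forEach((name, i) => { pivoted[`column${i + 1}`] = name; });
  return [{ json: pivoted }];
}

// Example usage with mock data:
console.log(flattenItems([{ json: { a: 'Lead scoring', b: 'Slack alerts' } }]));
console.log(pivotColumnNames(['Use Case', 'Department', 'Trigger', 'Action', 'Benefit']));
```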
by Puspak
## Workflow Overview
This workflow automatically fetches the latest "Ask HN: Who is hiring?" posts from Hacker News, extracts individual job listings, cleans the raw text, converts them into structured job listings using Google Gemini AI, and saves them into Airtable.

## Components
It's a full end-to-end automation system combining:
- **Algolia API** for HN data
- **Text cleaning**
- **Gemini AI (via LangChain)** for parsing job descriptions
- **Structured JSON extraction**
- **Airtable integration** to store the final data

A sketch of the Hacker News fetch step appears at the end of this description.

## 🎯 Use Cases
- Automatically build a job board from HN posts
- Track startup hiring trends
- Feed remote job alerts into a CRM or Slack
- Enrich a hiring intelligence database

## 🔧 Nodes & Services Used
- HTTP Request (Algolia + Firebase API)
- SplitOut, Set, Filter, Function, Limit
- Google Gemini (via LangChain integration)
- Output Parser Structured
- Airtable (API token required)

## 📌 Credentials Required
- Google Gemini (PaLM/Gemini API)
- Airtable Personal Access Token
- Algolia Application ID & API Key (via Header Auth)

## 📦 Tags
hacker-news, jobs, airtable, ai, gemini, automation, hn, langchain, workflow
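To make the fetch step concrete, here is a minimal sketch of pulling the latest hiring thread and its top-level comments from the public Algolia HN API. It is illustrative only; the template's HTTP Request nodes (and its use of the Firebase API) may be configured differently, and the query parameters shown are one reasonable choice rather than the template's exact settings.

```javascript
// Find the latest "Ask HN: Who is hiring?" thread, then pull its top-level
// comments (the individual job listings) for downstream cleaning and parsing.
const search = await fetch(
  'https://hn.algolia.com/api/v1/search_by_date?' +
    new URLSearchParams({ query: 'Ask HN: Who is hiring?', tags: 'story,author_whoishiring' })
).then(res => res.json());

const latestThread = search.hits[0];
const thread = await fetch(
  `https://hn.algolia.com/api/v1/items/${latestThread.objectID}`
).then(res => res.json());

// Each top-level child is one job listing in raw HTML; later nodes clean this
// text and hand it to Gemini for structured extraction.
const listings = (thread.children || [])
  .filter(child => child.text)
  .map(child => child.text);
console.log(`Found ${listings.length} listings in "${latestThread.title}"`);
```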
by PDF Vector
## Overview
Transform your contract management process with this enterprise-grade workflow that handles the complete contract lifecycle, from initial intake through execution, monitoring, and renewal. This comprehensive solution combines AI-powered contract analysis with automated risk scoring, clause comparison, obligation tracking, and proactive alerts. It integrates with multiple data sources, including email, SharePoint, and CLM systems, and creates a centralized contract intelligence hub that prevents revenue leakage, ensures compliance, and accelerates deal velocity.

## What You Can Do
This advanced workflow orchestrates a complete contract management ecosystem. It monitors multiple channels (email, Google Drive, SharePoint, APIs) for new contracts and amendments, extracts and analyzes over 50 contract data points using AI, performs multi-dimensional risk assessment across legal, financial, and operational factors, compares clauses against your approved template library, tracks all obligations and key dates with automated reminders, integrates with Salesforce/CRM for deal alignment, routes contracts through dynamic approval workflows based on risk scores, generates executive dashboards with contract analytics, and maintains a searchable repository with version control. The system handles complex scenarios including multi-party agreements, framework contracts with statements of work, international contracts requiring jurisdiction analysis, and M&A due diligence requiring bulk contract review.

## Who It's For
Designed for enterprise legal operations teams managing thousands of contracts annually, procurement departments negotiating complex vendor agreements, contract managers overseeing multi-million dollar portfolios, compliance teams ensuring regulatory adherence across jurisdictions, sales operations needing faster contract turnaround, and C-suite executives requiring contract intelligence for strategic decisions. Essential for organizations in regulated industries (healthcare, finance, government) and companies undergoing digital transformation of their legal operations.

## The Problem It Solves
Manual contract management creates massive operational risks and inefficiencies. Organizations typically have contracts scattered across emails, shared drives, and filing cabinets with no central visibility. This leads to missed renewal deadlines costing 5–10% of contract value, unauthorized contract variations creating compliance risks, obligation failures resulting in penalties and damaged relationships, and an inability to leverage favorable terms across similar contracts. Studies show that inefficient contract management costs organizations up to 9% of annual revenue. This workflow creates a single source of truth for all contracts, automates tracking and compliance, and provides predictive insights to prevent issues before they occur.
## Setup Instructions
1. **Multi-Channel Integration**: Configure connectors for email (Office 365/Gmail), Google Drive, SharePoint, and contract management systems
2. **PDF Vector Setup**: Install the PDF Vector node and configure the API with enterprise rate limits
3. **Database Configuration**: Set up PostgreSQL/MySQL for the contract repository with proper indexing
4. **Template Library**: Upload your standard contract templates and approved clause library
5. **Risk Framework**: Configure the risk scoring matrix for your industry (legal, financial, operational risks); a minimal scoring sketch appears at the end of this description
6. **Approval Matrix**: Define approval routing based on contract value, type, and risk score
7. **CRM Integration**: Connect to Salesforce/HubSpot for opportunity and account alignment
8. **Notification Setup**: Configure Slack/Teams channels and email distribution lists
9. **Dashboard Creation**: Set up Tableau/Power BI connectors for executive reporting
10. **Security Configuration**: Enable encryption, audit logging, and role-based access controls

## Key Features
- **Intelligent Intake System**: Monitor email attachments, shared folders, CRM uploads, and API submissions
- **Advanced AI Extraction**: Extract 50+ data points including nested obligations and conditional terms
- **Multi-Dimensional Risk Scoring**: Analyze legal, financial, operational, and reputational risks
- **Clause Library Comparison**: Compare against approved templates and flag deviations
- **Obligation Management**: Track deliverables, milestones, and SLAs with automated alerts
- **Dynamic Approval Routing**: Route based on AI risk score, contract value, and deviation analysis
- **Version Control & Redlining**: Track all changes and maintain a complete audit trail
- **Salesforce Integration**: Sync contract data with opportunities and accounts
- **Predictive Analytics**: Forecast renewal likelihood and negotiation outcomes
- **Bulk Processing**: Handle M&A due diligence with parallel processing of hundreds of contracts
- **Multi-Language Support**: Process contracts in 15+ languages with automatic translation
- **Executive Dashboards**: Real-time visibility into contract portfolio and risk exposure

## Customization Options
Implement industry-specific modules for healthcare (BAAs, DPAs), financial services (ISDAs, loan agreements), technology (SaaS, licensing), or government contracting. Add AI models trained on your historical contracts for better extraction accuracy. Create custom risk factors for emerging regulations such as AI governance or ESG compliance. Build integrations with specific CLM systems (Ironclad, DocuSign CLM, Icertis). Implement advanced analytics including contract similarity scoring, win-rate analysis by clause variation, and automatic playbook generation. Add blockchain integration for smart contract execution and configure automated contract assembly for standard agreements.

Note: This workflow uses the PDF Vector community node. Make sure to install it from the n8n community nodes collection before using this template.
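As a reference for the risk framework step, here is a minimal sketch of how a multi-dimensional risk score might be combined and mapped to an approval tier. The dimensions, weights, and thresholds are illustrative assumptions, not values from the template; you would replace them with your own risk matrix.

```javascript
// Illustrative multi-dimensional risk score, combining per-dimension scores (0–10)
// from the AI extraction step into a weighted total and an approval tier.
function scoreContract(risks) {
  const weights = { legal: 0.35, financial: 0.3, operational: 0.2, reputational: 0.15 };
  const total = Object.entries(weights)
    .reduce((sum, [dim, w]) => sum + w * (risks[dim] ?? 0), 0);
  // Route approvals by score band (bands are illustrative).
  const tier = total >= 7 ? 'legal-review' : total >= 4 ? 'manager-approval' : 'auto-approve';
  return { score: Number(total.toFixed(2)), tier };
}

console.log(scoreContract({ legal: 8, financial: 5, operational: 3, reputational: 4 }));
// → { score: 5.5, tier: 'manager-approval' }
```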
by Babish Shrestha
# 🚀 Build Your Own Knowledge Chatbot Using Google Drive

Create a smart chatbot that answers questions using your Google Drive PDFs—perfect for support, internal docs, education, or research.

## 🛠️ Quick Setup Guide

### Step 1: Prerequisites
- n8n instance (cloud or self-hosted)
- Google Drive account (with PDFs)
- Supabase account (vector database)
- OpenAI API key
- PostgreSQL database for chat memory (if you don't use one, remove the chat memory node)

### Step 2: Supabase Setup
1. Create a Supabase account (it's free)
2. Create a project
3. Copy the SQL below and paste it into the Supabase SQL editor

```sql
-- Enable the pgvector extension to work with embedding vectors
create extension vector;

-- Create a table to store your documents
create table documents (
  id bigserial primary key,
  content text, -- corresponds to Document.pageContent
  metadata jsonb, -- corresponds to Document.metadata
  embedding vector(1536) -- 1536 works for OpenAI embeddings, change if needed
);

-- Create a function to search for documents
create function match_documents (
  query_embedding vector(1536),
  match_count int default null,
  filter jsonb DEFAULT '{}'
) returns table (
  id bigint,
  content text,
  metadata jsonb,
  similarity float
)
language plpgsql
as $$
#variable_conflict use_column
begin
  return query
  select
    id,
    content,
    metadata,
    1 - (documents.embedding <=> query_embedding) as similarity
  from documents
  where metadata @> filter
  order by documents.embedding <=> query_embedding
  limit match_count;
end;
$$;
```

### Step 3: Import & Configure n8n Workflow
1. Import this template into n8n
2. Add credentials:
   - OpenAI API key
   - Google Drive OAuth2
   - Supabase URL & service key
   - PostgreSQL connection
3. Set your Google Drive folder ID in the triggers

### Step 4: Test & Use
- Add a PDF to your Drive folder → check Supabase for new entries
- Start the workflow and chat → ask questions about your documents, e.g. "What can you help me with?"
- Multi-turn chat → context is maintained per user

## ⚡ Features
- Auto-syncs new/updated PDFs from Google Drive
- Extracts, chunks, and vectorizes text
- Finds relevant info and answers questions
- Maintains chat history per user

## 📝 Troubleshooting
- Check folder permissions & IDs if no documents are found
- Verify API keys & Supabase setup for errors
- Ensure PostgreSQL is connected for chat memory

Tags: RAG, Chatbot, Google Drive, Supabase, OpenAI, n8n
Setup Time: ~20 minutes
by Alexandra Spalato
# Companies Email Finder & Lead Generation Automation

## Short Description
Automatically find company domains, extract decision maker emails (CEO, Sales, Marketing), validate email quality, and build a comprehensive prospect database using AI-powered search and professional email finding APIs.

## Detailed Description
This comprehensive lead generation workflow transforms a simple list of company names into a complete prospect database with verified decision maker contacts. The system automatically discovers official company websites, finds key decision makers' email addresses, validates email quality, and organizes everything in a structured database for immediate outreach.

Perfect for sales teams, marketing agencies, business developers, and anyone who needs to build high-quality prospect lists efficiently and cost-effectively.

## Key Features
- **Intelligent Domain Discovery**: Uses Serper.dev and AI to find official company websites from search results
- **Multi-Role Email Finding**: Automatically extracts emails for:
  - CEOs and C-level executives
  - Sales decision makers
  - Marketing decision makers
- **Email Quality Validation**: Classifies emails as "valid" or "risky" for better deliverability
- **Smart Fallback System**: Searches for additional company emails when decision makers aren't found
- **Duplicate Prevention**: Removes duplicate contacts automatically
- **Batch Processing**: Handles large company lists efficiently with intelligent batching
- **Database Integration**: Stores all data in NocoDB with proper organization and status tracking
- **Rate Limiting**: Includes delays and error handling to respect API limits

## Who This Workflow Is For
- **Sales Teams**: Building targeted prospect lists for outbound campaigns
- **Marketing Agencies**: Creating lead databases for client campaigns
- **Business Development**: Finding decision makers for partnership opportunities
- **Recruiters**: Locating hiring managers and HR contacts
- **Entrepreneurs**: Building contact lists for product launches or fundraising
- **Lead Generation Services**: Automating prospect research for clients

## Problems This Workflow Solves
- **Manual Research Time**: Eliminates hours of manual company and contact research
- **Incomplete Contact Data**: Ensures you have decision makers, not just generic emails
- **Email Deliverability Issues**: Validates email quality before outreach campaigns
- **Data Organization**: Maintains clean, structured prospect databases
- **Scaling Bottlenecks**: Processes hundreds of companies automatically
- **High Lead Generation Costs**: Reduces dependency on expensive lead generation services

## Setup Requirements

### Required API Credentials
- **Serper.dev API Key**: For company domain search and discovery
- **OpenAI API Key**: For intelligent domain extraction from search results
- **AnyMailFinder API Key**: For decision maker email discovery and validation
- **NocoDB API Token**: For database storage and management

### Database Structure

Companies Table:
- Id (Number): Unique company identifier
- company_name (Text): Company name to search
- location (Text): Company location for better search results
- url (URL): Discovered company website
- domain (Text): Extracted domain name
- status (Select): Processing status tracking
- emails (Text): All discovered company emails
- company_emails_status (Text): Email validation status

Contacts Table:
- companies_id (Number): Link to parent company
- name (Text): Contact full name
- position (Text): Job title/role
- email (Email): Contact email address
- email_status (Text): Email validation status
- linkedin_url (URL): LinkedIn profile (when available)

### System Requirements
- Active n8n instance (self-hosted or cloud)
- NocoDB database instance
- Active API subscriptions for Serper.dev, OpenAI, and AnyMailFinder

## How It Works

### Phase 1: Domain Discovery
1. **Company Processing**: Retrieves companies from the database in batches
2. **Domain Search**: Uses Serper.dev to search for official company websites (a sketch of this step appears at the end of this description)
3. **AI Domain Extraction**: OpenAI analyzes search results to identify official domains
4. **Database Updates**: Stores discovered domains and URLs

### Phase 2: Decision Maker Discovery
1. **Multi-Role Search**: Finds emails for CEO, Sales, and Marketing decision makers using AnyMailFinder
2. **Email Validation**: Validates email deliverability and flags risky addresses
3. **Contact Creation**: Stores validated contacts with full details
4. **Status Tracking**: Updates company status based on email discovery success

### Phase 3: Company Email Backup
1. **Gap Analysis**: Identifies companies with no valid decision maker emails
2. **Bulk Email Search**: Finds up to 20 additional company emails using AnyMailFinder
3. **Final Updates**: Stores all discovered emails for comprehensive coverage

## Customization Options

### Search Parameters
- Modify search queries for better domain discovery using Serper.dev
- Adjust location-based search parameters
- Customize AI prompts for domain extraction

### Decision Maker Roles
- Add new decision maker categories (HR, Finance, Operations, etc.)
- Modify existing role search parameters in AnyMailFinder
- Customize email validation criteria

### Data Processing
- Adjust batch sizes for different processing speeds
- Modify rate limiting delays
- Customize error handling and retry logic

### Database Schema
- Add custom fields for industry, company size, etc.
- Integrate with different database systems
- Customize data validation rules

## API Costs and Credits
- **AnyMailFinder**: 2 credits per valid email found, 1 credit per bulk company search
- **Serper.dev**: ~$1 per 1,000 searches
- **OpenAI**: Minimal costs for domain extraction prompts
- **Estimated Cost**: about $0.03 per company processed (depending on email discovery success)

## Benefits
- **Save 20+ Hours Weekly**: Automate prospect research that takes hours manually
- **Higher Quality Leads**: Get decision makers, not generic contact@ emails
- **Better Deliverability**: Email validation reduces bounce rates
- **Scalable Process**: Handle thousands of companies automatically
- **Cost Effective**: Much cheaper than traditional lead generation services
- **Complete Database**: Build comprehensive prospect databases with all contact details

## Use Cases
- **Outbound Sales Campaigns**: Build targeted prospect lists for cold outreach
- **Partnership Development**: Find decision makers at potential partner companies
- **Market Research**: Understand company structures and contact hierarchies
- **Recruitment**: Locate hiring managers and HR contacts
- **Investor Relations**: Find contacts at potential investor companies
- **Vendor Outreach**: Identify procurement and operations contacts

## Installation Instructions
1. Import the workflow JSON into your n8n instance
2. Set up the NocoDB database with the required table structures
3. Configure all API credentials in the credential manager (including Serper.dev and AnyMailFinder)
4. Update NocoDB connection settings with your database details
5. Test with a small batch of companies before full deployment
6. Monitor API usage and adjust batch sizes as needed

## Best Practices
- Start with high-quality company names and locations
- Monitor AnyMailFinder credit usage to manage costs
- Use Serper.dev efficiently with targeted search queries
- Regularly clean and validate your prospect database
- Respect email deliverability best practices
- Follow GDPR and data privacy regulations
- Use rate limiting to avoid API restrictions

## Error Handling
- Built-in retry mechanisms for API failures
- Processing continues even if individual companies fail
- Create an Error Workflow to handle failed executions
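To illustrate Phase 1, here is a rough sketch of the Serper.dev search plus a naive domain pick. It is not the template's exact logic: in the workflow an OpenAI call chooses the official domain from the search results, and the query wording and field handling below are assumptions for demonstration.

```javascript
// Illustrative Phase 1 sketch: search for a company's official site via Serper.dev,
// then derive a candidate domain from the top organic result.
async function findCompanyDomain(companyName, location, serperApiKey) {
  const response = await fetch('https://google.serper.dev/search', {
    method: 'POST',
    headers: { 'X-API-KEY': serperApiKey, 'Content-Type': 'application/json' },
    body: JSON.stringify({ q: `${companyName} ${location} official website` }),
  });
  const data = await response.json();

  // In the template, an OpenAI call inspects the organic results and picks the
  // official domain; here we naively take the first organic link as a placeholder.
  const firstLink = data.organic?.[0]?.link;
  if (!firstLink) return null;
  return new URL(firstLink).hostname.replace(/^www\./, '');
}

// Example usage (hypothetical values):
// findCompanyDomain('Acme Robotics', 'Berlin', process.env.SERPER_API_KEY).then(console.log);
```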
by System Admin
Example Input Data:

```json
{"projectIdea":"A hub for all things Oasis","projectName":"Oasis Hub","teamMembers":[{"name":"Will Stenzel","email":"stenzel.w@northeastern.edu"},{"name":"Jane Doe","email":"doe.j@...
```
by AureusR
# WhatsApp Customer Service Bot (with Voice Note Transcription) for FAQs, Service Enquiries, and Appointment Scheduling

## Who's it for
This template is designed for businesses that provide customer support and appointment-based services over WhatsApp. It's ideal for service providers (e.g., clinics, salons, repair shops, consultants) that want to automate FAQs, share service information, handle voice note inquiries, and schedule appointments without manual effort.

## How it works / What it does
This workflow creates a WhatsApp customer service assistant that:
- **Transcribes voice notes** sent by customers into text for further processing.
- Answers customer FAQs by looking up a Google Sheet knowledge base.
- Provides service information (name, description, price) from a Google Sheet.
- Schedules appointments by:
  1. Asking the customer which service they want.
  2. Collecting their preferred day and time.
  3. Checking Google Calendar for available slots.
  4. Offering 3 options and letting the customer choose (a minimal slot-selection sketch appears at the end of this description).
  5. Collecting name, email, and phone number.
  6. Creating the confirmed appointment in Google Calendar.
- Sends all customer-facing messages via a WhatsApp integration node.

## How to set up
1. **Connect your tools**
   - Link your Google Sheets for FAQs and Services.
   - Connect your Google Calendar account.
   - Configure your WhatsApp integration.
   - Connect a transcription service (e.g., Whisper, Google Speech-to-Text, or another transcription API).
2. **Prepare your data**
   - FAQs Google Sheet → must contain columns: id | question | answer
   - Services Google Sheet → must contain columns: id | service_name | service_description | price
3. **Adjust the flow**
   - Update the service names and questions to match your business.
   - Set the correct time zone in the Google Calendar node.
   - Update the WhatsApp integration node with your business account.
   - Configure the transcription node with your chosen API credentials.

## Requirements
- Google Sheets (for FAQs and Services)
- Google Calendar
- WhatsApp integration in n8n
- Speech-to-Text API (for transcribing voice notes)

## How to customize the workflow
- **Adding new FAQs**: Update the Google Sheet with new rows.
- **Changing services**: Modify the Services Google Sheet to reflect updated offerings or prices.
- **Custom messages**: Update the agent_reply node text to reflect your brand tone.
- **Advanced logic**: Add routing for voice-note-only customers, VIP handling, or multilingual support.

## Notes
This template uses multiple external integrations (Google Sheets, Google Calendar, WhatsApp, Speech-to-Text).
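For illustration, the "offer three available slots" step could be reduced to something like the sketch below. In the template this is handled by the AI agent together with the Google Calendar node; the 30-minute slot length, working hours, and data shapes here are assumptions.

```javascript
// Illustrative slot suggestion: walk the working day in fixed-size steps and
// collect the first few slots that don't overlap any busy calendar interval.
function suggestSlots(busy, dayStart, dayEnd, slotMinutes = 30, maxOptions = 3) {
  // busy: array of { start: Date, end: Date } from the calendar free/busy lookup
  const options = [];
  let cursor = new Date(dayStart);
  while (cursor < dayEnd && options.length < maxOptions) {
    const slotEnd = new Date(cursor.getTime() + slotMinutes * 60_000);
    const overlaps = busy.some(b => cursor < b.end && slotEnd > b.start);
    if (!overlaps && slotEnd <= dayEnd) options.push(new Date(cursor));
    cursor = slotEnd;
  }
  return options;
}

// Example: working day 09:00–17:00 with one existing appointment 10:00–11:00.
const day = '2024-05-06';
const busy = [{ start: new Date(`${day}T10:00:00`), end: new Date(`${day}T11:00:00`) }];
console.log(
  suggestSlots(busy, new Date(`${day}T09:00:00`), new Date(`${day}T17:00:00`))
    .map(d => d.toTimeString().slice(0, 5)) // → [ '09:00', '09:30', '11:00' ]
);
```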
by ConceptRecall
## Who is this for?
This workflow is designed for software teams, project managers, and developers who manage work across Azure DevOps and GitHub. It helps organizations that use Azure DevOps for work item tracking but rely on GitHub for issue management and collaboration. If you need to ensure that your DevOps Stories and Tasks are mirrored in GitHub issues while keeping a single source of truth in Google Sheets, this workflow is for you.

## What problem is this workflow solving? / Use case
Managing projects across multiple platforms often leads to missed updates and poor traceability:
- Stories created in Azure DevOps may not be tracked properly in GitHub.
- Tasks under Stories often lose visibility when teams split between platforms.
- Manual syncing between tools takes time and causes human errors.

This workflow solves that problem by automating the sync between Azure DevOps Stories and GitHub Issues, while also keeping a Google Sheets record for cross-referencing and reporting.

## What this workflow does
1. **Triggers from Azure DevOps Stories** – When a Story is created or updated, the workflow is activated.
2. **Creates a GitHub Issue** – A new issue is generated in the specified GitHub repository.
3. **Assigns a random collaborator** – One repository collaborator is randomly assigned to the issue (a sketch of this Code node appears at the end of this description).
4. **Logs the mapping in Google Sheets** – The Azure DevOps Story ID, GitHub Issue number, and URL are stored for tracking.
5. **Triggers from Azure DevOps Tasks** – When a Task linked to a Story is created, the workflow looks up its parent in Google Sheets.
6. **Updates the GitHub Issue** – The parent GitHub Issue is updated with a clickable link to the new Task for better visibility.

## Setup
1. **Connect your accounts**
   - GitHub (OAuth2 or personal token)
   - Google Sheets (OAuth2)
   - Azure DevOps (webhook integration)
2. **Configure webhooks**
   - Add the workflow's webhook URLs to Azure DevOps service hooks for Work Item Created/Updated events.
3. **Update repository details**
   - Set the GitHub repository where issues should be created.
4. **Customize Sheets**
   - Use the provided Google Sheet or link your own for issue mappings.

## How to customize this workflow to your needs
- **Modify assignment logic**: Instead of random collaborator assignment, edit the Code node to assign issues based on workload or labels.
- **Change the Sheet schema**: Add more fields (e.g., State, IterationPath) to your Google Sheet for richer reporting.
- **Expand task linking**: Customize the way Tasks are appended to GitHub issues (e.g., group by state, show due dates).

Powered by Concept Recall: https://conceptrecall.com
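For reference, the random-assignment Code node can be sketched roughly as follows. It assumes the previous node returns the repository's collaborator list from the GitHub API (each item exposing `json.login`); the template's actual node may differ.

```javascript
// n8n Code node sketch: pick one collaborator at random for the new issue.
const collaborators = $input.all().map(item => item.json.login);
const assignee = collaborators[Math.floor(Math.random() * collaborators.length)];

// To assign by workload instead (see "How to customize"), you could fetch
// open-issue counts per collaborator here and pick the minimum rather than random.
return [{ json: { assignee } }];
```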
by tanaypant
This workflow is the second of three. You can find the other workflows here:

- Incident Response Workflow - Part 1
- Incident Response Workflow - Part 2
- Incident Response Workflow - Part 3

We have the following nodes in the workflow:

- **Webhook node**: This trigger node listens for the event fired when the Acknowledge button is clicked.
- **PagerDuty node**: This node changes the status of the incident report from 'Triggered' to 'Acknowledged' in PagerDuty.
- **Mattermost node**: This node publishes a message in the auxiliary channel saying that the status of the incident report has been changed to Acknowledged.