by Yaron Been
## CTO Agent with Engineering Team

### Description
Complete AI-powered engineering department with a Chief Technology Officer (CTO) agent orchestrating specialized engineering team members for comprehensive software development and technical operations.

### Overview
This n8n workflow creates a comprehensive engineering department using AI agents. The CTO agent analyzes technical requests and delegates tasks to specialized agents for software architecture, DevOps, security, quality assurance, backend development, and frontend development.

### Features
- Strategic CTO agent using OpenAI O3 for complex technical decision-making
- Six specialized engineering agents powered by GPT-4.1-mini for efficient execution
- Complete software development lifecycle coverage from architecture to deployment
- Automated DevOps pipelines and infrastructure management
- Security assessments and compliance frameworks
- Quality assurance and test automation strategies
- Full-stack development capabilities

### Team Structure
- **CTO Agent**: Technical leadership and strategic delegation (O3 model)
- **Software Architect Agent**: System design, patterns, technology stack decisions
- **DevOps Engineer Agent**: CI/CD pipelines, infrastructure automation, containerization
- **Security Engineer Agent**: Application security, vulnerability assessments, compliance
- **QA Test Engineer Agent**: Test automation, quality strategies, performance testing
- **Backend Developer Agent**: Server-side development, APIs, database architecture
- **Frontend Developer Agent**: UI/UX development, responsive design, frontend frameworks

### How to Use
1. Import the workflow into your n8n instance
2. Configure OpenAI API credentials for all chat models
3. Deploy the webhook for chat interactions
4. Send technical requests via chat (e.g., "Design a scalable microservices architecture for our e-commerce platform")
5. The CTO will analyze and delegate to appropriate specialists
6. Receive comprehensive technical deliverables

### Use Cases
- **Full Stack Development**: Complete application architecture and implementation
- **System Architecture**: Scalable designs for microservices and distributed systems
- **DevOps Automation**: CI/CD pipelines, containerization, cloud deployment strategies
- **Security Audits**: Vulnerability assessments, secure coding practices, compliance
- **Quality Assurance**: Test automation frameworks, performance testing strategies
- **Technical Documentation**: API documentation, system diagrams, deployment guides

### Requirements
- n8n instance with LangChain nodes
- OpenAI API access (O3 for the CTO, GPT-4.1-mini for specialists)
- Webhook capability for chat interactions
- Optional: integration with development tools and platforms

### Cost Optimization
- O3 model used only for strategic CTO decisions
- GPT-4.1-mini provides 90% cost reduction for specialist tasks
- Parallel processing enables simultaneous agent execution
- Code template library reduces redundant development work

### Integration Options
- Connect to development platforms (GitHub, GitLab, Bitbucket)
- Integrate with project management tools (Jira, Trello, Asana)
- Link to monitoring and logging systems
- Export to documentation platforms

### Contact & Resources
- **Website**: nofluff.online
- **YouTube**: @YaronBeen
- **LinkedIn**: Yaron Been

### Tags
#SoftwareEngineering #TechStack #DevOps #SecurityFirst #QualityAssurance #FullStackDevelopment #Microservices #CloudNative #TechLeadership #EngineeringAutomation #n8n #OpenAI #MultiAgentSystem #EngineeringExcellence #DevAutomation #TechInnovation
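As a rough illustration of the delegation step, here is a minimal sketch in Python. The actual workflow routes requests through the O3-powered CTO agent; this stand-in uses simple keyword matching, and the keyword sets are assumptions chosen only for the example.

```python
# Hypothetical keyword routing standing in for the O3 CTO agent's delegation.
# Agent names mirror the Team Structure list; keyword sets are assumptions.
SPECIALISTS = {
    "Software Architect Agent": {"architecture", "design", "microservices", "stack"},
    "DevOps Engineer Agent": {"ci/cd", "pipeline", "docker", "deploy", "infrastructure"},
    "Security Engineer Agent": {"security", "vulnerability", "compliance", "audit"},
    "QA Test Engineer Agent": {"test", "qa", "performance", "quality"},
    "Backend Developer Agent": {"api", "database", "server-side", "backend"},
    "Frontend Developer Agent": {"ui", "ux", "frontend", "responsive"},
}

def delegate(request: str) -> list[str]:
    """Return the specialist agents whose keywords appear in the request."""
    words = request.lower()
    chosen = [name for name, kws in SPECIALISTS.items()
              if any(kw in words for kw in kws)]
    # Fall back to the architect for open-ended technical questions.
    return chosen or ["Software Architect Agent"]

print(delegate("Design a scalable microservices architecture for our e-commerce platform"))
```

In the real workflow the LLM makes this call from context rather than keywords, which is why O3 is reserved for it while the cheaper GPT-4.1-mini agents execute the delegated tasks.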
by Davide
This workflow automates the entire process of creating, managing, and publishing AI-generated videos using OpenAI Sora2 Pro, Google Sheets, Google Drive, and YouTube.

### Advantages
- ✅ **Fully Automated Video Pipeline** – From idea to YouTube publication with zero manual intervention after setup.
- ✅ **Centralized Control via Google Sheets** – Simple spreadsheet interface; no need to use APIs or dashboards directly.
- ✅ **AI-Powered Video Creation** – Uses OpenAI Sora2 Pro to generate professional-quality videos from text prompts.
- ✅ **SEO-Optimized Titles with GPT-5** – Automatically creates catchy, keyword-rich titles optimized for YouTube engagement.
- ✅ **Cloud Integration** – Seamless use of Google Drive for file management and YouTube for publishing.
- ✅ **Scalable and Repeatable** – Can handle multiple videos in sequence, triggered manually or at regular intervals.
- ✅ **Error-Resilient and Transparent** – Uses conditional checks (the "Completed?" node) and real-time updates in Google Sheets to ensure reliability and visibility.

### How it Works
This workflow automates the entire process of generating AI videos and publishing them to YouTube, using a Google Sheet as the central control panel.

1. **Trigger & Data Fetch**: The workflow is triggered either manually or on a schedule. It starts by reading a specific Google Sheet to find new video requests. A new request is identified as a row where the "PROMPT" and "DURATION" columns are filled but the "VIDEO" column is empty.
2. **AI Video Generation**: For each new request, it takes the prompt and duration, then sends a request to the Fal.ai Sora-2 Pro model via its API to generate the video. The system then enters a polling loop, checking the video generation status every 60 seconds until it is COMPLETED.
3. **Post-Processing & Upload**: Once the video is ready, the workflow performs three parallel actions:
   - **Fetch Video & Upload to Drive**: Retrieves the generated video file and uploads it to a specified folder in Google Drive for archiving.
   - **Generate YouTube Title**: Sends the original prompt to OpenAI's GPT-5 (or another specified model) to generate an optimized, SEO-friendly title for the YouTube video.
   - **Publish to YouTube**: Takes the generated video file and the AI-created title and uses the Upload-Post.com service to automatically publish the video to a connected YouTube channel.
4. **Update & Log**: Finally, the workflow updates the original Google Sheet row with the URL of the archived video in Google Drive and the newly created YouTube video URL, providing a complete audit trail.

### Set-up Steps
1. **Prepare the Google Sheet**: Create a Google Sheet with at least these columns: PROMPT, DURATION, VIDEO, and YOUTUBE_URL. In the n8n "Get new video" and update nodes, configure the documentId and sheetName to point to your specific Google Sheet.
2. **Configure the Fal.ai API Key**: Create an account on fal.ai and obtain your API key. In both the "Create Video" and "Get status" HTTP Request nodes, set up HTTP Header Authentication with the Name set to Authorization and the Value set to Key YOUR_API_KEY.
3. **Set up Upload-Post.com for YouTube**: Create an account on Upload-Post.com and get your API key. Connect your YouTube channel as a "profile". In the "HTTP Request" node (for uploading), configure Header Auth with Name: Authorization and Value: Apikey YOUR_UPLOAD_POST_API_KEY. Replace YOUR_USERNAME in the node's body parameters with the profile name you created on Upload-Post.com (e.g., test1).
4. **Configure OpenAI (optional but recommended)**: The "Generate title" node uses an OpenAI model. Ensure you have valid OpenAI API credentials set up in n8n for this node to function and create optimized titles.
5. **Finalize Paths and Activate**: In the "Upload Video" node, specify the correct Google Drive folder ID where you want the videos to be saved. Once all credentials and paths are set, activate the workflow and set the "Schedule Trigger" node to run at your desired interval (e.g., every 5 minutes).

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
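The polling loop from step 2 of "How it Works" can be sketched as a small Python function. This is a hedged illustration of the logic only: in the real workflow the loop is built from n8n Wait and HTTP Request nodes against Fal.ai, and `fetch_status` here is a stand-in for that status call.

```python
import time

def poll_until_completed(fetch_status, interval_s=60, max_attempts=30):
    """Call fetch_status() every interval_s seconds until it reports COMPLETED."""
    for _ in range(max_attempts):
        status = fetch_status()
        if status == "COMPLETED":
            return True
        time.sleep(interval_s)
    return False  # give up so a stuck job cannot loop forever

# Simulated generation that completes on the third check (no real API call).
states = iter(["IN_QUEUE", "IN_PROGRESS", "COMPLETED"])
print(poll_until_completed(lambda: next(states), interval_s=0))
```

Capping `max_attempts` mirrors why a "Completed?" conditional check matters: without an exit condition, a failed generation would keep the workflow polling indefinitely.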
by InfyOm Technologies
### ✅ What problem does this workflow solve?
Salon staff often spend hours juggling appointment calls, managing bookings manually, and keeping track of customer preferences. This workflow automates your entire salon appointment system via WhatsApp, delivering a personalized and human-like booking experience using AI, memory, and Google Sheets.

### 💡 Main Use Cases
- 💁♀️ Offer personalized stylist recommendations by remembering customer preferences and past visits.
- 📅 Provide real-time availability and salon opening hour information.
- 📝 Book and update appointments directly from customer chat.
- 🔁 Simplify appointment changes by recalling previous booking details.
- 🧠 Enable context-aware, memory-driven conversations across multiple interactions.

### 🧠 How It Works – Step-by-Step

**1. 📲 Chat Message Trigger**
The workflow is triggered whenever a customer sends a message to your WhatsApp salon assistant.

**2. 🧠 Memory Buffer for Context Management**
The assistant uses a Memory Buffer to:
- Recognize returning customers
- Avoid repeating questions
- Maintain conversation flow across multiple sessions

This enables a seamless and intelligent dialogue with each customer.

**3. 💇 Stylist & Service Lookup**
When the customer asks for stylist suggestions, available time slots, or services, the workflow:
- Extracts request details using AI
- Queries a Google Sheet containing stylist availability, service types, and salon opening hours
- Returns personalized recommendations based on preferences and availability

**4. ✅ Appointment Booking**
- Collects all necessary info: date, time, selected service, stylist, contact info
- Stores the appointment in Google Sheets
- Sends a confirmation message to the customer on WhatsApp

**5. 🔄 Modify or Cancel Bookings**
- Customers can update or cancel appointments
- The bot matches records by phone number
- It modifies or deletes the appointment in the sheet accordingly

### 🧩 Integrations Used
- **WhatsApp Integration** (via Twilio, the Meta API, or another connector)
- **OpenAI/GPT Model** for natural conversation flow and extraction
- **Google Sheets** as a simple and effective appointment database
- **Memory Buffer** for ongoing context across chats

### 👤 Who can use this?
Perfect for:
- 💇♀️ Salons and barbershops
- 💅 Spas and beauty centers
- 🧖♀️ Wellness studios
- 🛠 Developers building vertical AI assistants for SMBs

If you're looking to modernize your booking process and impress customers with an AI-powered, memory-enabled WhatsApp bot—this workflow delivers.

### 🚀 Benefits
- ⏰ Save time for your staff
- 🧠 Offer truly personalized experiences
- 📲 Book appointments 24/7 via WhatsApp
- 📋 Keep all records organized in Google Sheets
- 🧘 Reduce human error and double bookings

### 📦 Ready to Launch?
Just configure:
- ✅ Your WhatsApp number + webhook integration
- ✅ A Google Sheet with stylist and service data
- ✅ An OpenAI key for AI-powered conversation
- ✅ The Memory Buffer to enable smart replies

And your salon will be ready to offer automated, intelligent booking—right from a simple WhatsApp chat.
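The "match records by phone number" step from Modify or Cancel Bookings can be sketched in a few lines. This is a hedched, illustrative sketch: the real lookup happens against Google Sheets rows, and the dict layout (a "phone" key per row) is an assumption standing in for your sheet's columns.

```python
def find_booking(rows: list[dict], phone: str):
    """Return the first appointment row whose phone matches, ignoring spaces."""
    normalized = phone.replace(" ", "")
    for row in rows:
        if row["phone"].replace(" ", "") == normalized:
            return row
    return None

def cancel_booking(rows: list[dict], phone: str) -> bool:
    """Delete the matching row in place; True if a booking was removed."""
    row = find_booking(rows, phone)
    if row is None:
        return False
    rows.remove(row)
    return True

sheet = [{"phone": "+1 555 0100", "service": "Haircut", "time": "10:00"}]
print(cancel_booking(sheet, "+15550100"), sheet)
```

Normalizing the phone number before comparing is what lets the bot recognize the same customer whether they typed "+1 555 0100" or "+15550100".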
by Frankie Wong
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n workflow template helps you automatically convert unstructured contact information—such as customer details copied from emails, web forms, or chat messages—into clean, structured JSON using an AI agent.

### What It Does
- Accepts unstructured contact data via a Webhook (as form-data under the key prompt)
- Uses AI to intelligently extract key fields such as: Company Name, First Name, Last Name, Address, City, Country, Phone, Fax, Email
- Parses and formats the extracted data into a valid JSON object
- Prepares the output for seamless integration into systems like Dolibarr, other ERP/CRM platforms, or any service that consumes JSON via API or webhook

### Use Cases
- Automate manual data entry from emails into your ERP system
- Clean and normalize contact data from various input sources
- Reduce human error in your customer onboarding workflows

This template saves you time and ensures consistency across your business systems. Simply connect your systems and let the automation handle the rest.
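To make the target output concrete, here is a sketch of the JSON shape the AI extraction aims for, plus a guard that fills missing fields with empty strings so a downstream system always receives every key. The field names are assumptions for illustration; the actual extraction is done by the AI agent, and you should match the names to your ERP's import format.

```python
import json

# Hypothetical field list; align these names with your ERP/CRM import schema.
EXPECTED_FIELDS = ["company_name", "first_name", "last_name", "address",
                   "city", "country", "phone", "fax", "email"]

def normalize_contact(raw: dict) -> str:
    """Return a valid JSON object containing exactly the expected fields."""
    contact = {field: str(raw.get(field, "")).strip() for field in EXPECTED_FIELDS}
    return json.dumps(contact)

print(normalize_contact({"first_name": " Ada ", "email": "ada@example.com"}))
```

Guaranteeing a fixed key set like this is what makes the output "valid JSON" from the consumer's point of view: an API such as Dolibarr's never sees a payload with keys missing or misspelled.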
by Isight
## Dental Clinic Automation: Scheduling, Availability & Patient Lookup

This workflow automates dental appointment management through a phone-based assistant. It listens for requests like booking, rescheduling, canceling, checking insurance, looking up appointments, and finding available time slots. Each request is processed through a Switch node and then routed to your Supabase database for action.

### How it works
Once a request is received, the workflow uses the patient's phone number to identify them. Then, it:
- **Booking**: Checks for available time, creates or retrieves the patient record, and stores the appointment.
- **Rescheduling**: Confirms the new date, avoids double-booking, and updates the record.
- **Canceling**: Removes the appointment and sends a confirmation.
- **Insurance**: Looks up the member ID and provides a status (accepted or not).
- **Availability**: Finds the doctor's existing appointments and generates available 60-minute slots.
- **Appointment & doctor lists**: Retrieves and presents clean, structured information for the assistant.

Each action ends with a webhook response that the phone system reads back to the patient.

### Setup steps
1. Add your Supabase credentials to the Supabase nodes.
2. Connect your phone/voice system to the webhook URL.
3. Ensure Supabase table and column names match the workflow.
4. Test all actions (booking, rescheduling, canceling, etc.) before going live.

### Customization tips (optional)
You can update working hours, appointment durations, or add new services by modifying the availability logic or the Switch node routing.
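The availability logic above can be sketched as a Python function that subtracts booked times from working hours. This is an illustrative sketch only: the working hours (9:00 to 17:00) and the time format are assumptions you would adjust to your Supabase schema.

```python
from datetime import datetime, timedelta

def available_slots(day: str, booked: set, open_h: int = 9, close_h: int = 17):
    """Return HH:MM start times of free 60-minute slots on the given day."""
    slots = []
    start = datetime.strptime(f"{day} {open_h:02d}:00", "%Y-%m-%d %H:%M")
    end = datetime.strptime(f"{day} {close_h:02d}:00", "%Y-%m-%d %H:%M")
    while start + timedelta(minutes=60) <= end:
        label = start.strftime("%H:%M")
        if label not in booked:  # skip times with an existing appointment
            slots.append(label)
        start += timedelta(minutes=60)
    return slots

print(available_slots("2024-06-03", {"10:00", "14:00"}))
```

Changing the `timedelta(minutes=60)` step is the equivalent of the "appointment durations" customization tip: a 30-minute clinic would step in half-hour increments instead.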
by Gabriel Santos
### Who's it for
Teams and project managers who want to turn meeting transcripts into actionable Trello tasks automatically, without worrying about duplicate cards.

### What it does
This workflow receives a transcript file in .txt format and processes it with AI to extract clear, concise tasks. Each task includes a short title, a description, an assignee (if mentioned), and a deadline (if available).

The workflow then checks Trello for duplicates across all lists, comparing both card titles (name) and descriptions (desc). If a matching card already exists, the workflow returns the existing Trello card ID. If not, it creates a new card in the predefined default list.

Finally, the workflow generates a user-friendly summary: how many tasks were found, how many already existed, how many new cards were created, and how many tasks had no assignee or deadline.

### Requirements
- A Trello account with API credentials configured in n8n (no hardcoded keys).
- An OpenAI (or compatible) LLM account connected in n8n.

### How to customize
- Adjust similarity thresholds for title/description matching in the Trello Sub-Agent.
- Modify the summary text to always return in your preferred language.
- Extend the Trello card creation step with labels, members, or due dates.
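The duplicate check described above can be sketched with fuzzy string matching. This is a hedged sketch of the idea, not the workflow's exact implementation: the 0.8 threshold is an assumed value of the kind you would tune in the Trello Sub-Agent, and the card fields mirror Trello's `name` and `desc`.

```python
from difflib import SequenceMatcher

def similar(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between 0.0 and 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicate(task: dict, cards: list, threshold: float = 0.8):
    """Return the id of an existing card matching the task, or None."""
    for card in cards:
        if (similar(task["title"], card["name"]) >= threshold
                or similar(task["description"], card["desc"]) >= threshold):
            return card["id"]
    return None

cards = [{"id": "abc123", "name": "Update onboarding docs", "desc": "Refresh the guide"}]
task = {"title": "Update the onboarding docs", "description": "Rewrite intro section"}
print(find_duplicate(task, cards))
```

Raising the threshold toward 1.0 makes matching stricter (fewer cards flagged as duplicates, more near-duplicates created); lowering it does the opposite, which is exactly the trade-off the "adjust similarity thresholds" customization controls.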
by Sk developer
## Automated DA PA Checker Workflow for SEO Analysis

### Description
This n8n workflow collects a website URL via form submission, retrieves SEO metrics like Domain Authority (DA) and Page Authority (PA) using the Moz DA PA Checker API, and stores the results in Google Sheets for easy tracking and analysis.

### Node-by-Node Explanation
1. **On form submission** – Captures the website input from the user to pass to the Moz DA PA Checker API.
2. **DA PA API Request** – Sends the website to the Moz DA PA Checker API via RapidAPI to fetch DA, PA, spam score, DR, and organic traffic.
3. **If** – Checks whether the API request to the Moz DA PA Checker API returned a successful response.
4. **Clean Output** – Extracts only the useful data from the Moz DA PA Checker API response for saving.
5. **Google Sheets** – Appends the cleaned SEO metrics to a Google Sheet for record-keeping.

### Use Cases
- **SEO Analysis** – Quickly evaluate a website's DA/PA metrics for optimization strategies.
- **Competitor Research** – Compare domain authority and organic traffic with competitors.
- **Link Building** – Identify high-authority domains for guest posting and backlinks.
- **Domain Purchase Decisions** – Check metrics before buying expired or auctioned domains.

### Benefits
- **Automated Workflow** – From input to Google Sheets without manual intervention.
- **Accurate Metrics** – Uses the trusted **Moz DA PA Checker API** for DA, PA, spam score, DR, and traffic.
- **Instant Insights** – Get SEO scores in seconds for faster decision-making.
- **Easy Integration** – Seamless connection between RapidAPI and Google Sheets for data storage.
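The "Clean Output" node's job can be sketched as a simple field projection. The response field names below are assumptions made for illustration; check the actual Moz DA PA Checker API payload on RapidAPI before relying on them.

```python
def clean_output(response: dict) -> dict:
    """Keep only the metrics worth saving; drop everything else."""
    keep = ["da", "pa", "spam_score", "dr", "organic_traffic"]  # assumed names
    return {key: response.get(key) for key in keep}

raw = {"da": 54, "pa": 47, "spam_score": 2, "dr": 60,
       "organic_traffic": 12000, "debug": "ignore-me"}
print(clean_output(raw))
```

Projecting to a fixed field list keeps the Google Sheet's columns stable even if the API later adds extra fields to its response.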
by Shelly-Ann Davy
Build authentic Reddit presence and generate qualified leads through AI-powered community engagement that provides genuine value without spam or promotion.

### 🎯 What This Workflow Does
This intelligent n8n workflow monitors 9 targeted subreddits every 4 hours, uses AI to analyze posts for relevance and lead potential, generates authentic, helpful responses that add value to discussions, posts comments automatically, and captures high-quality leads (70%+ potential score) directly into your CRM—all while maintaining full Reddit compliance and looking completely human.

### ✨ Key Features
- **6 Daily Checks**: Monitors subreddits every 4 hours for fresh content
- **9 Subreddit Coverage**: Customizable list of target communities
- **AI Post Analysis**: Determines relevance, intent, and lead potential
- **Intelligent Engagement**: Only comments when you can add genuine value
- **Authentic Responses**: AI-generated comments that sound human, not promotional
- **Lead Scoring**: 0-1.0 scale identifies high-potential prospects (0.7+ captured)
- **Automatic CRM Integration**: High-quality leads flow directly to Supabase
- **Rate Limit Protection**: 60-second delays ensure Reddit API compliance
- **Native Reddit Integration**: Official n8n Reddit node with OAuth2
- **Beginner-Friendly**: 14+ detailed sticky notes explaining every component

### 🎯 Target Subreddits (Customizable)

**Insurance & Claims:**
- r/Insurance – General insurance questions
- r/ClaimAdvice – Claim filing help
- r/AutoInsurance – Auto coverage discussions
- r/FloodInsurance – Flood damage queries
- r/PropertyInsurance – Property coverage

**Property & Home:**
- r/homeowners – Property issues and claims
- r/RoofingContractors – Roof damage discussions

**Financial & Legal:**
- r/PersonalFinance – Insurance decisions
- r/legaladvice – Legal aspects of claims

### 🤖 AI Analysis Components

**Post Evaluation:**
- Relevance score (0-100%)
- User intent detection
- Damage type identification (hail, water, fire, wind)
- Urgency level (low/medium/high)
- Lead potential score (0-1.0)
- Recommended services
- Engagement opportunity assessment

**Decision Criteria:**
- Should engage? (boolean)
- Can we add value? (quality check)
- Is this promotional? (avoid spam)
- Lead worth capturing? (70%+ threshold)

Typical engagement rate: 5-10% of analyzed posts (67-135 comments/day)

### 🔧 Technical Stack
- **Trigger**: Schedule (every 4 hours, 6x daily)
- **Reddit API**: Native n8n node with OAuth2
- **AI Analysis**: Supabase Edge Functions
- **Response Generation**: AI-powered contextual replies
- **Lead Capture**: Supabase CRM integration
- **Rate Limiting**: Wait node (60-second delays)
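The decision criteria can be combined into a single gate function. This is an illustrative sketch: the real analysis runs in a Supabase Edge Function, and only the 0.7 lead-capture threshold comes from the description; the boolean field names are assumptions.

```python
def decide(analysis: dict) -> dict:
    """Apply the engagement gates and the 0.7 lead-capture threshold."""
    engage = (analysis["should_engage"]
              and analysis["adds_value"]
              and not analysis["is_promotional"])
    return {
        "post_comment": engage,
        # Only leads scoring 0.7+ on the 0-1.0 scale reach the CRM.
        "capture_lead": engage and analysis["lead_score"] >= 0.7,
    }

post = {"should_engage": True, "adds_value": True,
        "is_promotional": False, "lead_score": 0.82}
print(decide(post))
```

Note that `capture_lead` requires `engage` to be true first: a high-scoring post that the bot should not comment on (for example, one flagged as promotional) never becomes a lead, which is what keeps the presence authentic rather than spammy.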
by Nima Salimi
### Overview
This n8n workflow automatically retrieves the monthly CrUX (Chrome User Experience) Report from Google BigQuery and updates the data in NocoDB. It removes the previous month's data before inserting the new dataset, ensuring your database always contains the latest CrUX rankings for website origins. The flow is fully automated, using schedule triggers to handle both data cleanup and data insertion each month.

### ✅ Tasks
- ⏰ Runs automatically on a monthly schedule
- 🔢 Converts the month name to a numeric value for table selection
- 🧹 Deletes last month's CrUX data from NocoDB
- 🌐 Queries Google BigQuery for the latest monthly dataset
- 💾 Inserts the new CrUX rankings into NocoDB
- ⚙️ Keeps your database up to date with zero manual effort

### 🛠 How to Use

**1️⃣ Set Up BigQuery Access**
- Connect your Google BigQuery credentials.
- Make sure your project includes access to the chrome-ux-report public dataset.

**2️⃣ Adjust the Query**
- In the Google BigQuery node, change the LIMIT value to control how many top-ranked sites are retrieved.
- Ensure the {{ $json.table }} field correctly references the dataset for the desired month (e.g., 202509).

**3️⃣ Prepare the NocoDB Table**
- Create a table in NocoDB with the fields origin, crux_rank, and any additional metadata you wish to track.

**4️⃣ Schedule Automation**
The workflow includes two Schedule Trigger nodes:
- One runs the data cleanup process (deletes last month's data).
- One runs the data insertion for the new month.

**5️⃣ Run or Activate the Workflow**
- Activate it to run automatically each month.
- You can also run it manually to refresh data on demand.

### 📋 Prerequisites
Before running this workflow, make sure you complete the following setup steps:

**🧱 Enable the BigQuery API**
- Go to Google Cloud Console → APIs & Services.
- Enable the BigQuery API for your project.

**📊 Access the Chrome UX Report Dataset**
- In BigQuery, search for "Chrome UX Report" in the Marketplace or go directly to: https://console.cloud.google.com/marketplace/product/chrome-ux-report/chrome-ux-report
- Click "View Dataset" and add it to your BigQuery project.

**🔑 Connect BigQuery to n8n**
- In n8n, create credentials for your Google BigQuery account using Service Account Authentication.
- Ensure the account has permission to query the chrome-ux-report dataset.

**🗄️ Create a NocoDB Table**
In NocoDB, create a new table to store your CrUX data with the following fields:
- origin → Short text
- crux_rank → Number

**⚙️ Connect NocoDB to n8n**
- Use your NocoDB API Token to connect and allow the workflow to read/write data.

### What is CrUX Rank?
CrUX Rank (Chrome User Experience Rank) is a metric from Google's Chrome UX Report (CrUX) dataset that indicates a website's popularity based on real user visits. It reflects how frequently an origin (website) is loaded by Chrome users around the world.
- A lower rank number means the site is more popular (e.g., rank 1 = top site).
- The data is collected from anonymized Chrome usage statistics, aggregated monthly.

This rank helps you track site popularity trends and compare your domain's visibility over time.
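The "month name to numeric value" conversion boils down to building the YYYYMM table identifier (e.g., 202509) for the month being refreshed. Here is a minimal sketch, assuming the workflow targets the previous calendar month, since a month's CrUX report is only available after that month ends.

```python
from datetime import date

def crux_table_for_previous_month(today: date) -> str:
    """Build the YYYYMM table id for the month before `today`, e.g. 202509."""
    year, month = today.year, today.month - 1
    if month == 0:  # January rolls back to December of the prior year
        year, month = year - 1, 12
    return f"{year}{month:02d}"

print(crux_table_for_previous_month(date(2025, 10, 15)))  # 202509
```

The returned string is what the `{{ $json.table }}` field would carry into the BigQuery node so the query hits the correct monthly dataset.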
by Gilbert Onyebuchi
This workflow automatically discovers developers on GitHub, enriches their data with email addresses, removes duplicates, and saves everything into a structured Google Sheets CRM. No manual searching, copying, or data cleaning required.

It's perfect for recruiting teams, SaaS founders, agencies, and outbound marketers who need fresh developer leads every day without spending hours on GitHub.

### How It Works
This automation is divided into 3 clear stages:
1. **Find Developers on GitHub**: The workflow runs on a schedule (daily/hourly).
2. **Enrich Developer Data with Emails**: The workflow checks if a developer already has an email. If not, it automatically uses Hunter.io to find a professional email address.
3. **Remove Duplicates & Save to Google Sheets**

### What You Get
- Automatic developer sourcing
- Email enrichment using Hunter.io
- Built-in duplicate detection
- Clean, enriched data you can use instantly for outreach

### What You Need
- GitHub API
- Hunter.io API key
- Google Sheets connection
- n8n (self-hosted or cloud)
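The duplicate-detection stage can be sketched as a set-based filter: drop any developer whose GitHub login or enriched email is already in the sheet. The dict keys below are assumptions standing in for the Google Sheets columns.

```python
def dedupe(existing: list, incoming: list) -> list:
    """Return only the incoming developers not already present in the CRM."""
    seen = {row["login"] for row in existing}
    seen |= {row["email"] for row in existing if row.get("email")}
    fresh = []
    for dev in incoming:
        if dev["login"] in seen or dev.get("email") in seen:
            continue  # already in the CRM, skip it
        fresh.append(dev)
        seen.add(dev["login"])  # also guard against duplicates within the batch
    return fresh

crm = [{"login": "octocat", "email": "octo@example.com"}]
new = [{"login": "octocat"}, {"login": "hubber", "email": "h@example.com"}]
print(dedupe(crm, new))
```

Checking both login and email matters because the same person can surface twice: once from a GitHub search (login only) and once after Hunter.io enrichment (with email).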
by SIENNA
## Automated FTP/SFTP to MinIO Object Backup with Scheduling

Works with FTP/SFTP servers, such as the one hosting your WordPress website!

### What this workflow does
This workflow performs automated, periodic backups of files from an FTP or SFTP server directory to a MinIO S3 bucket running locally or on a dedicated container/VM/server. It also works if the MinIO bucket is running on a remote cloud provider's infrastructure; you just need to change the URL and keys.

### Who's this intended for?
Storage administrators, cloud architects, or DevOps engineers who need a simple and scalable solution for retrieving data from a remote FTP or SFTP server. It is also practical for WordPress developers who need to back up data from a server hosting a WordPress website. In that case, you just need to specify the folder you want to back up (one under wordpress/uploads, or even the root directory).

### How it works
This workflow uses commands to list and download files from a specific directory on an FTP/SFTP server, then sends them to MinIO using its version of the S3 API. The source directory can be a specific one or the entire server (the root directory).

### Requirements
None, just a source folder/directory on an FTP/SFTP server and a destination bucket on MinIO. You'll also need to get MinIO running.

Using Proxmox VE? Create a MinIO LXC container: https://community-scripts.github.io/ProxmoxVE/scripts?id=minio

### Need a backup from another Cloud Storage Provider?
Check out our templates: we've done it with AWS, Azure, and GCP, and we even have a version for FTP/SFTP servers! These workflows can be integrated into bigger ones and modified to best suit your needs. You can, for example, replace the MinIO node with another S3 bucket from another cloud storage provider (Backblaze, Wasabi, Scaleway, OVH, ...).
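One detail worth illustrating is how downloaded files map onto object names in the bucket. A common approach, sketched below under assumed paths and prefix, is to keep each file's path relative to the backup root so the bucket mirrors the server's directory layout.

```python
from pathlib import PurePosixPath

def object_key(remote_path: str, source_root: str, prefix: str = "backup") -> str:
    """Turn a remote file path into a bucket object key under `prefix`,
    preserving the directory structure relative to the backup root."""
    rel = PurePosixPath(remote_path).relative_to(source_root)
    return f"{prefix}/{rel}"

# Hypothetical WordPress uploads file backed up from its server root.
print(object_key("/var/www/wordpress/uploads/2024/img.png", "/var/www/wordpress"))
```

Preserving relative paths this way makes restores straightforward: copying the `backup/` prefix back to the source root reproduces the original tree.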
by SerpApi
## Sync Google Maps Reviews to Google Sheets for Any Google Maps Query

### How it works
- This workflow accepts any query you might run on actual Google Maps to search for places. The search happens through SerpApi's Google Maps API.
- Once the workflow receives place results from Google Maps, it loops through each place, fetching reviews using SerpApi's Google Maps Reviews API.
- By default, the workflow is limited to fetching up to 50 reviews per place. This can be customized in the 'Set Review Limit' node.
- The first page of reviews for a place returns only 8 reviews. All subsequent pages return up to 20 reviews.
- The fetched reviews are sent to a connected Google Sheet.

### How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help is available here: https://n8n.io/integrations/google-sheets/
4. Create a Google Sheet with these column headers: name, iso_date, rating, snippet
5. Connect your Google Sheet in the 'Append Reviews' Google Sheets node
6. Update the Search Query in the 'Search Google Maps' node to set your own query
7. (Optional) Update the review limit from the default 50 in the 'Set Review Limit' node. Set it to a very high number (e.g., 50000) to get all possible reviews.
8. Hit 'Test Workflow' to manually trigger the workflow.

### Limitations
Can only retrieve the top 20 results from Google Maps; it won't paginate to get more results. The workflow could be extended to support Google Maps pagination.

### Warning
Each request to SerpApi consumes 1 search credit. Be mindful of how many search credits your account has before requesting more reviews than your account supports. As an example, if a Google Maps query returns 20 results and you fetch the default limit of 50 reviews per place, this can use up to 61 SerpApi search credits.

### Documentation
- Google Maps API
- Google Maps Reviews API
- SerpApi n8n Node Intro Guide