by Oneclick AI Squad
This workflow automatically tracks shipments across multiple courier services, updates Google Sheets, and notifies customers via WhatsApp/Email when their shipment status changes.

Good to know
- Runs daily at 9 AM and only sends notifications when a shipment's status actually changes
- API costs may apply for courier tracking (Delhivery, DHL) and WhatsApp notifications
- Requires a Google Sheet with the proper column structure and valid API credentials
- Currently supports the Delhivery and DHL APIs but is easily expandable

How it works
1. **Daily Trigger**: Cron node runs the workflow every day at 9 AM
2. **Get Shipments List**: Fetches all shipment data from the Google Sheet
3. **Filter Active Shipments**: Excludes delivered orders and empty tracking numbers
4. **Route by Courier**: Directs shipments to the appropriate API (Delhivery or DHL)
5. **Track via APIs**: Makes real-time tracking calls to the courier services
6. **Parse Tracking Data**: Normalizes the different API responses and detects status changes (see the sketch below)
7. **Check Status Change**: Only processes shipments with actual status updates
8. **Update & Notify**: Simultaneously updates the Google Sheet, sends a WhatsApp message, and sends an email notification
9. **Execution Summary**: Logs workflow performance metrics

How to use
1. Import the JSON workflow into n8n
2. Create a Google Sheet with columns: tracking_number, order_id, customer_email, customer_phone, courier, status, location, last_updated, estimated_delivery
3. Configure credentials: Google Sheets OAuth2, Delhivery API, DHL API, WhatsApp API, SMTP
4. Replace YOUR_GOOGLE_SHEET_ID with your actual sheet ID
5. Test the workflow manually before enabling the daily schedule

Requirements
- Google Sheets API access
- Courier API keys (Delhivery, DHL)
- WhatsApp Business API credentials
- SMTP email service
- n8n instance (self-hosted or cloud)

Customizing this workflow
- **Add courier services**: Create a new HTTP Request node and update the routing logic
- **Change frequency**: Modify the cron expression (hourly: 0 * * * *, twice daily: 0 9,17 * * *)
- **Customize notifications**: Edit the WhatsApp/Email templates in the respective nodes
- **Add CRM integration**: Insert a CRM update node after status change detection
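The normalization step is the heart of the workflow, since each courier returns a differently shaped payload. A minimal Code-node sketch of what this could look like, assuming illustrative field names for both APIs and a `previous_status` value carried over from the sheet (the real Delhivery and DHL response shapes may differ):

```js
// n8n Code node (JavaScript) — illustrative only; the field names assumed
// here are not the confirmed Delhivery/DHL response schemas.
const results = [];

for (const item of $input.all()) {
  const raw = item.json;
  let normalized;

  if (raw.courier === 'delhivery') {
    // Assumed Delhivery-style response shape
    normalized = {
      tracking_number: raw.waybill,
      status: raw.Status?.Status ?? 'UNKNOWN',
      location: raw.Status?.StatusLocation ?? '',
    };
  } else {
    // Assumed DHL-style response shape
    normalized = {
      tracking_number: raw.trackingNumber,
      status: raw.shipments?.[0]?.status?.statusCode ?? 'UNKNOWN',
      location: raw.shipments?.[0]?.status?.location?.address?.addressLocality ?? '',
    };
  }

  // Flag a change only when the fresh status differs from the sheet value
  normalized.status_changed = normalized.status !== raw.previous_status;
  results.push({ json: normalized });
}

return results;
```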
by Rahul Joshi
Description:
Automate your developer onboarding quality checks with this n8n workflow template. Whenever a new onboarding task is created in ClickUp, the workflow logs it to Google Sheets, evaluates its completeness using Azure OpenAI GPT-4o-mini, and alerts your team in Slack if critical details are missing.

Perfect for engineering managers, DevOps leads, and HR tech teams who want to maintain consistent onboarding quality and ensure every developer gets the tools, credentials, and environment setup they need — without manual review.

✅ What This Template Does (Step-by-Step)

⚡ Step 1: Auto-Trigger on ClickUp Task Creation
Listens for new task creation events (taskCreated) in your ClickUp workspace to initiate the audit automatically.

📊 Step 2: Log Task Details to Google Sheets
Records essential task data — task name, assignee, and description — creating a central audit trail for all onboarding activities.

🧠 Step 3: AI Completeness Analysis (GPT-4o-mini)
Uses Azure OpenAI GPT-4o-mini to evaluate each onboarding task for completeness across key areas:
- Tooling requirements
- Credential setup
- Environment configuration
- Instruction clarity

Outputs:
- ✅ Score (0–100)
- ⚠️ List of missing items
- 💡 Suggestions for improvement

🚦 Step 4: Apply Quality Gate
Checks whether the AI-generated completeness score is below 80. Incomplete tasks automatically move to the alert stage for review.

📢 Step 5: Alert Team via Slack
Sends a structured Slack message summarizing the issue, including:
- Task name & assignee
- Completeness score
- Missing checklist items
- Recommended next actions

This ensures your team fixes incomplete onboarding items before they impact new hires (see the sketch below).

🧠 Key Features
- 🤖 AI-driven task completeness scoring
- 📊 Automatic task logging for audit visibility
- ⚙️ Smart quality gate (score threshold < 80)
- 📢 Instant Slack alerts for incomplete tasks
- 🔄 End-to-end automation from ClickUp to Slack

💼 Use Cases
- 🎓 Audit onboarding checklists for new developers
- 🧩 Standardize environment setup and credential handover
- 🚨 Identify missing steps before onboarding deadlines
- 📈 Maintain onboarding consistency across teams

📦 Required Integrations
- ClickUp API – to detect new onboarding tasks
- Google Sheets API – to store audit logs and history
- Azure OpenAI (GPT-4o-mini) – to evaluate completeness
- Slack API – to alert the team on incomplete entries

🎯 Why Use This Template?
✅ Ensures every new developer receives a full, ready-to-start setup
✅ Eliminates manual checklist verification
✅ Improves onboarding quality and compliance tracking
✅ Creates a transparent audit trail for continuous improvement
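A minimal sketch of how the Slack alert text could be assembled in a Code node. The field names (task_name, assignee, score, missing_items) are illustrative assumptions about what the AI analysis step produces, not fields fixed by the template:

```js
// n8n Code node (JavaScript) — formats the Slack alert for an incomplete task.
// All field names below are illustrative assumptions.
return $input.all().map((item) => {
  const { task_name, assignee, score, missing_items = [] } = item.json;

  const text = [
    `⚠️ *Onboarding task incomplete:* ${task_name}`,
    `*Assignee:* ${assignee}`,
    `*Completeness score:* ${score}/100 (threshold: 80)`,
    `*Missing items:*`,
    ...missing_items.map((m) => `• ${m}`),
    `*Next step:* update the ClickUp task before the new hire's start date.`,
  ].join('\n');

  return { json: { text } };
});
```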
by Pawel
Description
A very straightforward workflow. It checks the Epic Games website to see whether the HTML container listing the free games has changed. If it has, it sends a notification to Discord with a list of embeds describing those games.

Requirements
You will need to install the n8n-nodes-puppeteer community node.

Setup
There are two nodes that notify Discord: one at the very end, and one inside the loop in case of error. Configure them with a webhook or a bot, whichever suits you. That's all.
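For reference, the Discord notification boils down to a webhook payload with an `embeds` array. A minimal Code-node sketch of building it, assuming each incoming item carries a game `title`, `url`, and `image` (those field names are assumptions, not necessarily what the scraper outputs):

```js
// n8n Code node (JavaScript) — builds a Discord webhook payload.
// The game field names (title, url, image) are illustrative assumptions.
const embeds = $input.all().map((item) => ({
  title: item.json.title,
  url: item.json.url,
  image: { url: item.json.image },
  description: 'Free on the Epic Games Store this week',
}));

// Discord webhooks accept at most 10 embeds per message
return [{ json: { content: 'New free games on Epic!', embeds: embeds.slice(0, 10) } }];
```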
by Rahi
n8n Workflow: AI-Personalized Email Outreach (Smartlead)

🔄 Purpose
This workflow automates cold email campaigns by:
- Fetching leads
- Generating hyper-personalized email content using AI
- Sending emails via the Smartlead API
- Logging campaign activity into Google Sheets

🧩 Workflow Structure
1. **Schedule Trigger**: Starts the workflow automatically at scheduled intervals. Ensures continuous campaign execution.
2. **Get Leads**: Fetches lead data (name, email, company, role, industry). Serves as the input for personalization.
3. **Loop Over Leads**: Processes each lead one by one. Maintains individualized email generation.
4. **Aggregate Lead Data**: Collects and formats lead attributes. Prepares structured input for the AI model.
5. **Basic LLM Chain #1**: Generates personalized snippets/openers using AI, tailored to company, role, and industry.
6. **Update Row (Google Sheets)**: Saves AI outputs (snippets) for tracking and QA.
7. **Basic LLM Chain #2**: Expands each snippet into a full personalized email draft, including subject line + email body.
8. **Information Extractor**: Extracts structured fields from the AI output: subject, greeting, call-to-action (CTA), and closing.
9. **Update Row (Google Sheets)**: Stores the finalized draft in Google Sheets. Provides visibility and an audit trail.
10. **Code**: Formats the email into a Smartlead-compatible payload, mapping fields like subject, body, and recipient details (see the sketch below).
11. **Smartlead API Request**: Sends the personalized email through Smartlead. Returns the message ID and delivery status.
12. **Basic LLM Chain #3 (Optional)**: Generates follow-up versions for multi-step campaigns. Ensures varied engagement over time.
13. **Information Extractor (Follow-ups)**: Structures follow-up emails into a ready-to-send format.
14. **Update Row (Google Sheets)**: Updates campaign logs with the Smartlead send status, message IDs, and AI personalization notes.

⚙️ Data Flow Summary
- **Trigger** → Runs workflow
- **Get Leads** → Fetch lead records
- **LLM Personalization** → Create openers + full emails
- **Google Sheets** → Save drafts & logs
- **Smartlead API** → Send personalized email
- **Follow-ups** → Generate and log structured follow-up messages

📊 Use Case
- Automates hyper-personalized cold email outreach at scale.
- Uses AI to improve response rates with contextual personalization.
- Provides full visibility by saving drafts and send logs in Google Sheets.
- Integrates seamlessly with Smartlead for sending and tracking.
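A minimal sketch of the Code node's payload mapping. The field names below are assumptions for illustration, not the confirmed Smartlead schema; check the Smartlead API docs for the exact shape your account expects:

```js
// n8n Code node (JavaScript) — maps extracted email parts into a payload
// for the HTTP Request node that calls Smartlead. All field names here are
// illustrative assumptions.
return $input.all().map((item) => {
  const { subject, greeting, body, cta, closing, email, full_name } = item.json;

  return {
    json: {
      recipient_email: email,
      recipient_name: full_name,
      subject,
      // Assemble the final body from the extracted parts
      email_body: [greeting, body, cta, closing].filter(Boolean).join('\n\n'),
    },
  };
});
```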
by Rahul Joshi
📘 Description
This workflow automates dependency update risk analysis and reporting using Jira, GPT-4o, Slack, and Google Sheets. It continuously monitors Jira for new package or dependency update tickets, uses AI to assess their risk levels (Low, Medium, High), posts structured comments back into Jira, and alerts the DevOps team in Slack — all while logging historical data into Google Sheets for visibility and trend analysis.

This ensures fast, data-driven decisions for dependency upgrades, improved code stability, and reduced security risks — with zero manual triage.

⚙️ What This Workflow Does (Step-by-Step)

🟢 When Clicking “Execute Workflow”
Manually triggers the dependency risk analysis sequence for immediate review or scheduled monitoring.

📋 Fetch All Active Jira Issues
Retrieves all active Jira issues to identify tickets related to dependency or package updates. Provides the complete dataset — including summary, status, and assignee information — for AI-based risk evaluation.

✅ Validate Jira Query Response
Verifies that Jira returned valid issue data before proceeding.
- If data exists → continues filtering dependency updates.
- If no data or an API error → logs the failure to Google Sheets.
This prevents the workflow from continuing with empty or broken datasets.

🔍 Identify Dependency Update Issues
Filters Jira issues to find only dependency-related tickets (keywords like “update,” “bump,” “package,” or “library”). This ensures only relevant version update tasks are analyzed, filtering out unrelated feature or bug tickets.

🏷️ Extract Relevant Issue Metadata
Extracts essential fields such as key, summary, priority, assignee, status, and created date for downstream AI processing. Simplifies the data payload and ensures accurate, structured analysis.

📢 Alert DevOps Team in Slack
Immediately notifies the assigned DevOps engineer via Slack DM about any new dependency update issue. Includes formatted details like summary, key, status, priority, and a direct Jira link for quick access. Ensures rapid visibility and faster response to potential risk tickets.

🤖 AI-Powered Risk Assessment Analyzer
Uses GPT-4o (Azure OpenAI) to intelligently evaluate each dependency update’s risk level and impact summary. Considers factors such as:
- Dependency criticality
- Version change type (major/minor/patch)
- Security or EOL indicators
- Potential breaking changes

Outputs clean JSON with two fields:
`{"risk_level": "Low | Medium | High", "impact_summary": "Short human-readable explanation"}`
Helps DevOps teams prioritize updates with context.

🧠 GPT-4o Language Model Configuration
Configures the AI reasoning engine for precise, context-aware DevOps assessments. Optimized for a consistent technical tone and cost-efficient batch evaluation.

📊 Parse AI Response to Structured Data
Safely parses the AI’s JSON output, removing markdown artifacts and ensuring structure. Adds the parsed fields — risk_level and impact_summary — back to the Jira context. Includes fail-safes to prevent crashes on malformed AI output (falling back to “Unknown” and “Failed to parse”); see the sketch below.

💬 Post AI Risk Assessment to Jira Ticket
Automatically posts the AI’s analysis as a comment on the Jira issue:
- Displays a 🤖 AI Risk Assessment Report header
- Shows the Risk Level and Impact Summary
- Includes a checklist of next steps for developers
This creates a permanent audit trail for each dependency decision inside Jira.
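A minimal sketch of the fail-safe parsing step as a Code node, assuming the raw model text arrives in an `ai_output` field (the field name is an assumption for illustration):

```js
// n8n Code node (JavaScript) — parses the model's JSON with fallbacks,
// mirroring the "Unknown" / "Failed to parse" behavior described above.
return $input.all().map((item) => {
  const raw = String(item.json.ai_output ?? '');
  let risk_level = 'Unknown';
  let impact_summary = 'Failed to parse';

  try {
    // Strip ```json fences or stray backticks the model may wrap around its output
    const parsed = JSON.parse(raw.replace(/`{3}(json)?/gi, '').trim());
    if (parsed.risk_level) risk_level = parsed.risk_level;
    if (parsed.impact_summary) impact_summary = parsed.impact_summary;
  } catch (e) {
    // Keep the fallback values on malformed output instead of crashing
  }

  return { json: { ...item.json, risk_level, impact_summary } };
});
```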
📈 Log Dependency Updates to Tracking Dashboard
Appends all analyzed updates into Google Sheets, recording:
- Date
- Jira Key & Summary
- Risk Level & Impact Summary
- Assignee & Status

This builds a historical dependency risk database that supports:
- Trend monitoring
- Security compliance reviews
- Dependency upgrade metrics
- DevOps productivity tracking

📊 Log Jira Query Failures to Error Sheet
If the Jira query fails, the workflow automatically logs the error (API/auth/network) into a centralized error sheet for troubleshooting and visibility.

🧩 Prerequisites
- Jira Software Cloud API credentials
- Azure OpenAI (GPT-4o) access
- Slack API connection
- Google Sheets OAuth2 credentials

💡 Key Benefits
✅ Automated dependency risk assessment
✅ Instant Slack alerts for update visibility
✅ Historical tracking in Google Sheets
✅ Reduced manual triage and faster decision-making
✅ Continuous improvement in release reliability and security

👥 Perfect For
- DevOps and SRE teams managing large dependency graphs
- Engineering managers monitoring package updates and risks
- Security/compliance teams tracking vulnerability fix adoption
- Product teams aiming for stable CI/CD pipelines
by Jorge Martínez
Generate social posts from GitHub pushes to Twitter and LinkedIn

On each GitHub push, this workflow checks whether the commit set includes README.md and CHANGELOG.md, fetches both files, lets an LLM generate a Twitter and a LinkedIn post, then publishes to Twitter and LinkedIn (Person).

Apps & Nodes
- **Trigger:** Webhook
- **Logic:** IF, Merge, Aggregate
- **GitHub:** Get Repository File (×2)
- **Files:** Extract from File (text) (×2)
- **AI:** OpenAI Chat Model → LLM Chain (+ Structured Output Parser)
- **Publish:** Twitter, LinkedIn (Person)

Prerequisites
- **GitHub:** OAuth2 or PAT with repo read access.
- **OpenAI:** API key.
- **Twitter:** OAuth2 app with **Read and Write**; scopes tweet.read tweet.write users.read offline.access.
- **LinkedIn (Person):** OAuth2 credentials; **required scopes:** w_member_social, openid.

Setup
1. GitHub Webhook: Repo → Settings → Webhooks
   - Payload URL: https://<your-n8n-domain>/webhook/github/push
   - Content type: application/json • Event: Push • Secret (optional) • Branches as needed.
2. Credentials: Connect GitHub, OpenAI, Twitter, and LinkedIn (Person).

How it Works
1. Webhook receives the GitHub push payload.
2. IF checks that README and CHANGELOG appear in added/modified (see the sketch below).
3. GitHub (Get Repository File) pulls README.md and CHANGELOG.md.
4. Extract from File (text) converts both binaries to text.
5. Merge & Aggregate combines them into one item with both contents.
6. The LLM (OpenAI + Parser) returns a JSON with twitter and linkedin fields.
7. Twitter posts the tweet.
8. LinkedIn (Person) posts the LinkedIn text.
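A minimal sketch of the file check, written as a Code node rather than the template's IF node, assuming the standard GitHub push payload shape (`body.commits[].added` / `body.commits[].modified`):

```js
// n8n Code node (JavaScript) — equivalent of the IF check, assuming the
// standard GitHub push payload (commits[].added / commits[].modified).
const body = $input.first().json.body;
const touched = (body.commits ?? []).flatMap((c) => [
  ...(c.added ?? []),
  ...(c.modified ?? []),
]);

const hasReadme = touched.includes('README.md');
const hasChangelog = touched.includes('CHANGELOG.md');

// Downstream nodes can branch on this flag
return [{ json: { proceed: hasReadme && hasChangelog } }];
```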
by A Z
⚡ Quick Setup
1. Import this workflow into your n8n instance.
2. Add your Apify, Google Sheets, and Firecrawl credentials.
3. Activate the workflow to start your automated lead enrichment system.
4. Copy the webhook URL from the MCP trigger node.
5. Connect AI agents using the MCP URL.

🔧 How it Works
This solution combines two powerful workflows to deliver fully enriched, AI-ready business leads from Google Maps:
- **Apify Google Maps Scraper Node:** Collects business data and, if enabled, enriches each lead with contact details and social profiles.
- **Leads Missing Enrichment:** Any leads without contact or social info are automatically saved to a Google Sheet.
- **Firecrawl & Code Node Workflow:** A second workflow monitors the Google Sheet, crawls each business’s website using Firecrawl, and extracts additional social media profiles or contact info using a Code node (see the sketch below).
- **Personalization Logic:** AI-powered nodes generate tailored outreach content for each enriched lead.
- **Native Integration:** The entire process is exposed as an MCP-compatible interface, returning enriched and personalized lead data directly to the AI agent.

📋 Available Operations
- **Business Search:** Find businesses on Google Maps by location, category, or keyword.
- **Lead Enrichment:** Automatically append contact details, social profiles, and other business info using Apify and Firecrawl.
- **Personalized Outreach Generation:** Create custom messages or emails for each lead.
- **Batch Processing:** Handle multiple leads in a single request.
- **Status & Error Reporting:** Get real-time feedback on processing, enrichment, and crawling.

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
- Search queries (location, keywords, categories)
- Enrichment options (contact, social, etc.)
- Personalization variables (name, business type, etc.)

Response Format: Returns fully enriched lead data and personalized outreach content in a structured format.
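A minimal sketch of what the extraction Code node could look like, assuming Firecrawl's output arrives as page text/HTML in a `content` field (an assumption for illustration):

```js
// n8n Code node (JavaScript) — pulls emails and social profile links out of
// crawled page content. The `content` field name is an illustrative assumption.
return $input.all().map((item) => {
  const content = String(item.json.content ?? '');

  const emails = [...new Set(content.match(/[\w.+-]+@[\w-]+\.[\w.]+/g) ?? [])];
  const socials = [
    ...new Set(
      content.match(
        /https?:\/\/(?:www\.)?(?:facebook|instagram|linkedin|twitter|x)\.com\/[\w\-./?=]+/gi
      ) ?? []
    ),
  ];

  return { json: { ...item.json, emails, socials } };
});
```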
by Rahul Joshi
📊 Description
Automate post-purchase workflows by instantly fetching successful Stripe payments, matching them to corresponding automation templates in Google Sheets, and sending customers personalized access emails using AI-generated content. This system ensures each buyer receives their digital template, password, and onboarding details automatically after payment. 💳📩🤖

What This Template Does
- Step 1: Triggers daily at 7:00 AM IST to fetch all successful payment charges from Stripe. ⏰
- Step 2: Retrieves payment intent and product details for each successful charge to enrich context. 💰
- Step 3: Validates required fields (order reference, product name, customer name, email). ✅
- Step 4: Matches the purchased product with the automation record in Google Sheets via AI lookup. 🔍
- Step 5: Combines Stripe and Sheet data into one record, ensuring accuracy and completeness. 🔄
- Step 6: Filters out already-processed customers to avoid duplicate sends (see the sketch below). 🧮
- Step 7: Generates a personalized thank-you email using Azure OpenAI (GPT-4o-mini), including access links, password, and onboarding tips. 💌
- Step 8: Sends the email through Gmail to the customer automatically. 📧
- Step 9: Logs each transaction and email delivery into Google Sheets for tracking and auditing. 📊

Key Benefits
✅ Fully automated Stripe-to-email delivery flow
✅ Zero manual intervention — instant template delivery
✅ AI-personalized HTML emails with customer details
✅ Centralized purchase logging and analytics
✅ Eliminates duplicates and ensures a smooth customer experience

Features
- Scheduled daily trigger (7:00 AM IST)
- Stripe API integration for payment and product details
- Google Sheets lookup for automation files and passwords
- GPT-powered email content generation
- Gmail API integration for delivery
- Google Sheets logging for an audit trail

Requirements
- Stripe API credentials
- Google Sheets OAuth2 credentials
- Gmail OAuth2 credentials
- Azure OpenAI API credentials

Target Audience
- SaaS or digital product sellers using Stripe
- Automation template marketplaces
- Small teams delivering digital assets via email
- Businesses seeking instant customer fulfillment
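A minimal sketch of the duplicate filter as a Code node, assuming the log sheet rows were fetched earlier by a node named, say, 'Get Processed Log' and expose a `charge_id` column (both names are illustrative assumptions):

```js
// n8n Code node (JavaScript) — drops charges already present in the log sheet.
// The node name 'Get Processed Log' and the charge_id field are illustrative.
const processed = new Set(
  $('Get Processed Log').all().map((row) => row.json.charge_id)
);

return $input.all().filter((item) => !processed.has(item.json.charge_id));
```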
by Onur
Amazon Product Scraper with Scrape.do & AI Enrichment

> This workflow is a fully automated Amazon product data extraction engine. It reads product URLs from a Google Sheet, uses Scrape.do to reliably fetch each product page’s HTML without getting blocked, and then applies an AI-powered extraction process to capture key product details such as name, price, rating, review count, and description. All structured results are neatly stored back into a Google Sheet for easy access and analysis.

This template is designed for consistency and scalability—ideal for marketers, analysts, and e-commerce professionals who need clean product data at scale.

🚀 What does this workflow do?
- **Reads Input URLs:** Pulls a list of Amazon product URLs from a Google Sheet.
- **Scrapes HTML Reliably:** Uses **Scrape.do** to bypass Amazon’s anti-bot measures, ensuring the page HTML is always retrieved successfully.
- **Cleans & Pre-processes HTML:** Strips scripts, styles, and unnecessary markup, isolating only relevant sections like title, price, ratings, and feature bullets.
- **AI-Powered Data Extraction:** A LangChain/OpenRouter GPT-4 node verifies and enriches key fields—product name, price, rating, reviews, and description.
- **Stores Structured Results:** Appends all extracted and verified product data to a results tab in Google Sheets.
- **Batch & Loop Control:** Handles multiple URLs efficiently with Split In Batches to process as many products as you need.

🎯 Who is this for?
- **E-commerce Sellers & Dropshippers:** Track competitor prices, ratings, and key product features automatically.
- **Marketing & SEO Teams:** Collect product descriptions and reviews to optimize campaigns and content.
- **Analysts & Data Teams:** Build accurate product databases without manual copy-paste work.

✨ Benefits
- **High Success Rate:** **Scrape.do** handles proxy rotation and CAPTCHA challenges automatically, outperforming traditional scrapers.
- **AI Validation:** LLM verification ensures data accuracy and fills in gaps when HTML elements vary.
- **Full Automation:** Runs on demand or on a schedule to keep product datasets fresh.
- **Clean Output:** Results are neatly organized in Google Sheets, ready for reporting or integration with other tools.

⚙️ How it Works
1. Manual or Scheduled Trigger: Start the workflow manually or via a cron schedule.
2. Input Source: Fetch URLs from a Google Sheet (TRACK_SHEET_GID).
3. Scrape with Scrape.do: Retrieve the full HTML of each Amazon product page using your SCRAPEDO_TOKEN.
4. Clean & Pre-Extract: Strip irrelevant code and use regex to pre-extract key fields (see the sketch after the setup steps).
5. AI Extraction & Verification: The LangChain GPT-4 model refines and validates product name, description, price, rating, and reviews.
6. Save Results: Append the enriched product data to the results sheet (RESULTS_SHEET_GID).

📋 n8n Nodes Used
- Manual Trigger / Schedule Trigger
- Google Sheets (read & append)
- Split In Batches
- HTTP Request (Scrape.do)
- Code (clean & pre-extract HTML)
- LangChain LLM (OpenRouter GPT-4)
- Structured Output Parser

🔑 Prerequisites
- Active n8n instance.
- **Scrape.do API token** (bypasses Amazon anti-bot measures).
- **Google Sheets** with:
  - TRACK_SHEET_GID: tab containing product URLs.
  - RESULTS_SHEET_GID: tab for results.
- **Google Sheets OAuth2 credentials** shared with your service account.
- **OpenRouter / OpenAI API credentials** for the GPT-4 model.

🛠️ Setup
1. Import the workflow into your n8n instance.
2. Set workflow variables:
   - SCRAPEDO_TOKEN – your Scrape.do API key.
   - WEB_SHEET_ID – Google Sheet ID.
   - TRACK_SHEET_GID – sheet/tab name for input URLs.
   - RESULTS_SHEET_GID – sheet/tab name for results.
3. Configure credentials for Google Sheets and OpenRouter.
4. Map columns in the “add results” node to match your Google Sheet (e.g., name, price, rating, reviews, description).
5. Run or schedule: Start manually or configure a schedule for continuous data extraction.

This Amazon Product Scraper delivers fast, reliable, and AI-enriched product data, ensuring your e-commerce analytics, pricing strategies, or market research stay accurate and fully automated.
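A minimal sketch of the clean-and-pre-extract Code node, assuming the raw HTML arrives in an `html` field. Amazon's markup changes often, so the element IDs used below are assumptions that may need adjusting:

```js
// n8n Code node (JavaScript) — strips noise from the scraped HTML and
// regex-extracts a few fields. The element IDs (productTitle,
// acrCustomerReviewText) are assumptions about Amazon's current markup.
return $input.all().map((item) => {
  let html = String(item.json.html ?? '');

  // Drop scripts, styles, and comments to shrink the payload for the LLM
  html = html
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    .replace(/<!--[\s\S]*?-->/g, '');

  const title = html.match(/id="productTitle"[^>]*>([\s\S]*?)</)?.[1]?.trim() ?? '';
  const price = html.match(/class="a-price-whole">([\d,.]+)/)?.[1] ?? '';
  const reviews = html.match(/id="acrCustomerReviewText"[^>]*>([\d,]+)/)?.[1] ?? '';

  return { json: { ...item.json, cleaned_html: html, title, price, reviews } };
});
```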
by Destiya Wijayanto
This template provides a set of MCP tools for managing personal budgets and expenses. The tools can be integrated with any AI client that supports MCP.

How it works
- It stores transaction records and budgets in a Google Sheet
- It warns you when an expense exceeds the budget (see the sketch below)

How to set up
1. Sign in with Google in the Google Sheets nodes
2. Copy the Google Sheet template (link available in the sticky note)
3. Point the Google Sheets nodes to the right sheet
4. Integrate with your AI client
5. Enjoy!!
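A minimal sketch of the over-budget check as a Code node, assuming each category's budget and running spend are read from the sheet into `budget` and `spent` fields (the field names are illustrative):

```js
// n8n Code node (JavaScript) — flags categories where spending exceeds the
// budget. The field names (category, budget, spent) are illustrative.
return $input.all().map((item) => {
  const { category, budget, spent } = item.json;
  const overBudget = Number(spent) > Number(budget);

  return {
    json: {
      ...item.json,
      over_budget: overBudget,
      warning: overBudget
        ? `⚠️ ${category}: spent ${spent} of a ${budget} budget`
        : '',
    },
  };
});
```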
by Abdul Matheen
Automated Invoice-Processing AI Agent for n8n

Overview
The Automated Invoice-Processing AI Agent in n8n is designed to streamline and optimize invoice management for finance teams and accounts payable (AP) professionals. This solution addresses the common challenge of verifying invoice data manually, cross-checking it against purchase orders (POs), and ensuring compliance before releasing payments. By intelligently fetching invoices from Google Drive, extracting key details, validating them against PO records from Google Sheets, and automating the next actions, this system reduces human intervention, minimizes errors, and accelerates the payment process.

Target Audience
This automation primarily serves finance and AP teams responsible for managing large volumes of vendor invoices. It also supports finance managers, procurement departments, and auditors who require accuracy in payment reconciliation, ensuring that invoices align with approved POs before processing.

Business Problem Addressed
Organizations frequently struggle with time-consuming manual invoice verification and data entry. Discrepancies between invoices and purchase orders can lead to payment delays, compliance risks, or duplicate payments. This n8n-based AI agent automates that process—ensuring that every invoice is validated, exceptions are flagged to the finance team promptly, and payments of smaller value (under defined thresholds) are processed automatically.

Prerequisites
- Active n8n account or self-hosted instance
- Google Drive and Google Sheets connected via n8n credentials
- LLM (AI node) configured for document extraction (optional but recommended)
- A Google Sheet set up with existing PO data (including PO Number, Amount, and Date fields)

Setup Instructions
1. Connect the Google Drive and Google Sheets integrations within n8n.
2. Configure the workflow trigger to monitor a designated "Invoices" folder.
3. Add a document-parsing node to extract invoice details such as PO Number, Invoice Date, and Amount.
4. Implement the conditional logic (sketched below):
   - If the invoice amount > 5000, the agent cross-references PO details from the Google Sheet.
     - If it matches, it updates the PO sheet status to “Process Payment.”
     - If not, an automated email notifies the finance team.
   - If the amount ≤ 5000, the workflow marks it for direct payment.
5. Test the workflow with sample invoices before full deployment.

Customization Options
- Adjust the payment threshold value (e.g., 10,000 instead of 5,000).
- Customize the email notification template and recipient list.
- Integrate with accounting systems such as QuickBooks or SAP for end-to-end automation.
- Add audit logging nodes to create traceability for every action taken.

This AI-driven automation brings speed, accuracy, and scalability to invoice verification—empowering finance professionals to focus on analytical and strategic tasks rather than repetitive manual work.
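A minimal sketch of step 4's branching as a Code node, assuming the extracted invoice carries `amount` and `po_number` fields and the PO rows were fetched in a node named 'Get PO Sheet' (all names here are illustrative assumptions):

```js
// n8n Code node (JavaScript) — routes each invoice per the threshold logic.
// Node name 'Get PO Sheet' and all field names are illustrative assumptions.
const THRESHOLD = 5000; // adjustable, e.g. 10000

const poRows = $('Get PO Sheet').all().map((r) => r.json);

return $input.all().map((item) => {
  const { amount, po_number } = item.json;
  let action;

  if (Number(amount) <= THRESHOLD) {
    action = 'direct_payment';
  } else {
    // Cross-reference: the PO must exist and its amount must match
    const po = poRows.find((r) => r['PO Number'] === po_number);
    action = po && Number(po.Amount) === Number(amount)
      ? 'process_payment'   // update PO sheet status downstream
      : 'notify_finance';   // send exception email downstream
  }

  return { json: { ...item.json, action } };
});
```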
by Matthew
Automated Cold Email Personalization

This workflow automates the creation of highly personalized cold outreach emails by extracting lead data, scraping company websites, and leveraging AI to craft unique email components. It is ideal for sales teams, marketers, and business development professionals looking to scale their outreach while maintaining a high degree of personalization.

How It Works
1. Generate Batches: The workflow starts by generating a sequence of numbers, defining how many leads to process in batches (see the sketch at the end of this description).
2. Scrape Lead Data: It uses an external API (Apify) to pull comprehensive lead information, including contact details, company data, and social media links.
3. Fetch Client Data: The workflow then retrieves relevant client details from your Google Sheet based on the scraped data.
4. Scrape Company Website: The lead's company website is automatically scraped to gather content for personalization.
5. Summarize Prospect Data: An OpenAI model analyzes both the scraped website content and the individual's profile data to create concise summaries and identify unique angles for outreach.
6. Craft Personalized Email: A more advanced OpenAI model uses these summaries and specific instructions to generate the "icebreaker," "intro," and "value proposition" components of a personalized cold email.
7. Update Google Sheet: Finally, the generated email components are saved back into your Google Sheet, enriching your lead records for future outreach.

Google Sheet Structure
Your Google Sheet must have the following exact column headers to ensure proper data flow:
- **Email** (unique identifier for each lead)
- **Full Name**
- **Headline**
- **LinkdIn**
- **cityName**
- **stateName**
- **company/cityName**
- **Country**
- **Company Name**
- **Website**
- **company/businessIndustry**
- **Keywords**
- **icebreaker** (will be populated by the workflow)
- **intro** (will be populated by the workflow)
- **value_prop** (will be populated by the workflow)

Setup Instructions
1. Add credentials:
   - In n8n, add your OpenAI API key via the Credentials menu.
   - Connect your Google account via the Credentials menu for Google Sheets access.
   - You will also need an Apify API key for the Scraper node.
2. Configure the Google Sheets nodes:
   - Select the Client data and Add email data to sheet nodes.
   - For each, choose your Google Sheets credential, select your spreadsheet, and pick the specific sheet name.
   - Ensure all column mappings are correct according to the "Google Sheet Structure" section above.
3. Configure the Apify Scraper node:
   - Select the Scraper node.
   - Update the Authorization header with your Apify API token (Bearer KEY).
   - In the JSON Body, set the searchUrl to your Apollo link (or an equivalent source URL for lead data).
4. Configure the OpenAI nodes:
   - Select both the Summarising prospect data and Creating detailed email nodes.
   - Choose your OpenAI credential from the dropdown.
   - In the Creating detailed email node's prompt, replace PUT YOUR COMPANY INFO HERE with your company's context and verify the target sector for the email generation.
5. Verify the update node:
   - On the final Add email data to sheet node, ensure the Operation is set to Append Or Update and the Matching Columns field is set to Email.

Customization Options 💡
- **Trigger**: Change the When clicking 'Execute workflow' node to an automatic trigger, such as a **Cron** node for daily runs, or a Google Sheets trigger when new rows are added.
- **Lead Generation**: Modify the **Code** node to change the number of leads processed per run (currently set to 50).
- **Scraping Logic**: Adjust the Scraper node's parameters (e.g., count) or replace the Apify integration with another data source if needed.
- **AI Prompting**: Experiment with the prompts in the **Summarising prospect data** and **Creating detailed email** OpenAI nodes to refine the tone, style, length, or content focus of the generated summaries and emails.
- **AI Models**: Test different OpenAI models (e.g., gpt-3.5-turbo, gpt-4o) in the OpenAI nodes to find the optimal balance between cost, speed, and output quality.
- **Data Source/CRM**: Replace the Google Sheets nodes with integrations for your preferred CRM (e.g., HubSpot, Salesforce) or a database (e.g., PostgreSQL, Airtable) to manage your leads.
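For reference, the batch-generation Code node mentioned in step 1 of "How It Works" boils down to emitting one numbered item per lead. A minimal sketch, assuming the batch size of 50 noted above:

```js
// n8n Code node (JavaScript) — emits a sequence of numbered items so the
// downstream nodes process one lead per item. Batch size per the template: 50.
const BATCH_SIZE = 50;

return Array.from({ length: BATCH_SIZE }, (_, i) => ({
  json: { index: i + 1 },
}));
```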