by Nikan Noorafkan
# Google Ads + OpenAI + Sheets: Monthly AI Performance Analysis

Automate monthly ad performance insights with AI-powered recommendations.

## Overview

This workflow automatically analyzes Google Ads performance every month, using the Google Ads API and OpenAI (GPT-4o) to uncover which ad themes, categories, and messages perform best. It then generates a structured AI report, saves it to Google Sheets, and sends a Slack summary to your marketing team.

Perfect for digital marketers, agencies, and growth analysts who want automated campaign insights without manually crunching numbers.

## Features

- Automatically runs on the 1st of each month
- Fetches the last 30 days of ad performance via the Google Ads API (GAQL)
- Uses GPT-4o for natural-language insights and improvement ideas
- Groups ads by category and theme (e.g., "Free Shipping," "Premium")
- Generates a clean, formatted Markdown report
- Archives reports in Google Sheets for trend tracking
- Notifies your Slack channel with AI-driven recommendations

## Architecture

| Component | Purpose |
| --- | --- |
| n8n | Workflow engine |
| Google Ads API | Source of ad performance data |
| OpenAI (GPT-4o) | Analyzes CTR patterns and writes recommendations |
| Google Sheets | Report archiving and history tracking |
| Slack | Team notifications |

## Workflow Logic (Summary)

Monthly Trigger (1st of Month), then:

1. **Get Performance Data (Google Ads API)** - Fetches 30-day CTR, clicks, and impressions for all responsive search ads.
2. **Prepare Performance Data** - Groups data by ad group and theme keywords, builds an AI prompt.
3. **AI Agent (LangChain) + GPT-4o** - Analyzes patterns and generates actionable insights.
4. **Generate Report (Code)** - Formats a Markdown report with AI recommendations and KPIs.
5. **Save to Google Sheets** - Archives results for long-term analytics.
6. **Send Report to Slack** - Delivers the summary directly to your marketing channel.

## Environment Variables

| Variable | Example | Description |
| --- | --- | --- |
| GOOGLE_ADS_CUSTOMER_ID | 123-456-7890 | Google Ads customer account ID |
| GOOGLE_ADS_API_VERSION | v17 | Current Ads API version |
| GOOGLE_SHEET_ID | 1xA1B2c3D4EFgH... | Target spreadsheet ID |
| OPENAI_API_KEY | sk-xxxxx | OpenAI API key for GPT-4o |
| SLACK_WEBHOOK_URL | https://hooks.slack.com/... | Slack incoming webhook |

## Credential Setup

| Service | Type | Required Scopes |
| --- | --- | --- |
| Google Ads | OAuth2 (googleAdsOAuth2Api) | https://www.googleapis.com/auth/adwords |
| OpenAI | API key (openAiApi) | Full access |
| Google Sheets | OAuth2 | https://www.googleapis.com/auth/spreadsheets |
| Slack | Webhook | chat:write |

## Node-by-Node Breakdown

| Node | Purpose | Key Configuration |
| --- | --- | --- |
| Monthly Trigger | Starts workflow on 1st of every month | Cron: 0 0 1 * * |
| Get Performance Data | Queries Ads data | Endpoint: https://googleads.googleapis.com/v17/customers/{id}/googleAds:search; Query: GAQL (CTR, clicks, impressions, last 30 days) |
| Prepare Performance Data | Aggregates and builds AI prompt | Groups by ad group and theme, computes CTRs |
| AI Agent: Analyze Performance | Passes formatted data to GPT-4o | System message: "You are a Google Ads performance analyst..." |
| OpenAI Chat Model (GPT-4o) | Analytical reasoning engine | Model: gpt-4o, Temperature 0.2 |
| Generate Report | Parses AI output, formats Markdown report | Adds recommendations + next steps |
| Save Report to Sheets | Archives report | Sheet name: Performance Reports |
| Send Report (Slack) | Sends summary | Uses report_markdown variable |

## AI Report Example

**30-Day Performance Analysis Report**

Executive Summary
- Analyzed: 940 ads
- Period: Last 30 days

Top Performing Categories
- Running Shoes: 9.4% CTR (120 ads)
- Fitness Apparel: 8.2% CTR (90 ads)

Top Performing Themes
- "Free Shipping" messaging: 9.8% CTR (58 ads)
- "Premium" messaging: 8.5% CTR (44 ads)

AI-Powered Recommendations
- [HIGH] Emphasize "Free Shipping" across more ad groups. Expected Impact: +5% CTR
- [MEDIUM] Test "Premium Quality" vs. "New Arrivals." Expected Impact: +3% CTR

Next Steps
- Implement new ad variations
- A/B test messaging
- Re-analyze next month

## Testing Procedure

1. Temporarily disable the cron trigger.
2. Run the workflow manually.
3. Confirm:
   - The Google Ads node returns JSON with results.
   - The AI Agent output is valid JSON.
   - The report is written to Sheets.
   - The Slack message is received.
4. Re-enable the monthly trigger once verified.

## Output in Google Sheets

| Date | Ads Analyzed | Top Category | Top Theme | Key Recommendations | Generated At |
| --- | --- | --- | --- | --- | --- |
| 2025-10-01 | 940 | Running Shoes | Free Shipping | "Add Free Shipping copy to 10 ads" | 2025-10-01T00:05Z |

## Maintenance

| Frequency | Task |
| --- | --- |
| Monthly | Review AI accuracy and update themes list |
| Quarterly | Refresh Google Ads API credentials |
| As needed | Update GAQL fields for new metrics |

## API Verification

- Endpoint: POST https://googleads.googleapis.com/v17/customers/{customer_id}/googleAds:search
- Scopes: https://www.googleapis.com/auth/adwords

GAQL Query:

    SELECT ad_group_ad.ad.id, ad_group_ad.ad.responsive_search_ad.headlines, ad_group.name, metrics.impressions, metrics.clicks, metrics.ctr
    FROM ad_group_ad
    WHERE segments.date DURING LAST_30_DAYS AND metrics.impressions > 100
    ORDER BY metrics.clicks DESC
    LIMIT 1000

Fully valid query, verified for GAQL syntax, fields, and resource joins.
- OAuth2 flow handled by n8n's googleAdsOAuth2Api.
- Optional: add "timeout": 60000 for large accounts.

## Metrics of Success

| KPI | Target |
| --- | --- |
| Report accuracy | ≥ 95% |
| Monthly automation success | ≥ 99% |
| CTR improvement tracking | +3-5% over time |

## References

- Google Ads API Docs
- LangChain in n8n
- OpenAI API Reference
- Google Sheets API
- Slack Incoming Webhooks

## Conclusion

You now have a fully automated Google Ads performance analysis workflow powered by:

- **Google Ads API** for granular metrics
- **OpenAI GPT-4o** for intelligent recommendations
- **Google Sheets** for archiving
- **Slack** for team-wide updates

Result: a recurring, data-driven optimization loop that improves ad performance every month with zero manual effort.
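As a rough illustration of the "Prepare Performance Data" step described above, the sketch below shows how an n8n Code node might group Google Ads API rows by theme keyword and compute aggregate CTRs before the AI prompt is built. This is an assumption, not the template's actual node: the theme list and result field paths are placeholders to adapt to your own query output.

```javascript
// Hypothetical n8n Code node: group ad rows by theme keyword and compute CTR.
// Assumes each input item carries one googleAds:search result row; adjust the
// field paths to match your actual query output.
const THEMES = ['Free Shipping', 'Premium']; // assumed theme keywords

const totals = {};
for (const item of $input.all()) {
  const row = item.json;
  const headlines = (row.adGroupAd?.ad?.responsiveSearchAd?.headlines || [])
    .map(h => h.text || '')
    .join(' ');
  const clicks = Number(row.metrics?.clicks || 0);
  const impressions = Number(row.metrics?.impressions || 0);

  // Attribute the ad to every theme keyword found in its headlines.
  const matched = THEMES.filter(t => headlines.toLowerCase().includes(t.toLowerCase()));
  for (const theme of matched.length ? matched : ['Other']) {
    totals[theme] = totals[theme] || { ads: 0, clicks: 0, impressions: 0 };
    totals[theme].ads += 1;
    totals[theme].clicks += clicks;
    totals[theme].impressions += impressions;
  }
}

// Emit one item per theme with an aggregate CTR, ready to feed the AI prompt.
return Object.entries(totals).map(([theme, t]) => ({
  json: {
    theme,
    ads: t.ads,
    clicks: t.clicks,
    impressions: t.impressions,
    ctr: t.impressions ? +((t.clicks / t.impressions) * 100).toFixed(2) : 0,
  },
}));
```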
by Fayzul Noor
This workflow is built for digital marketers, sales professionals, influencer agencies, and entrepreneurs who want to automate Instagram lead generation. If you're tired of manually searching for profiles, copying email addresses, and updating spreadsheets, this automation will save you hours every week. It turns your process into a smart system that finds, extracts, and stores leads while you focus on growing your business.

## How it works / What it does

This n8n automation completely transforms how you collect Instagram leads using AI and API integrations. Here's a simple breakdown of how it works:

1. Set your targeting parameters using the Edit Fields node. You can specify your platform (Instagram), field of interest such as "beauty & hair," and target country like "USA."
2. Generate intelligent search queries with an AI Agent powered by GPT-4o-mini. It automatically creates optimized Google search queries to find relevant Instagram profiles in your chosen niche and location.
3. Extract results from Google using Apify's Google Search Scraper, which collects hundreds of Instagram profile URLs that match your search criteria.
4. Fetch detailed Instagram profile data using Apify's Instagram Scraper. This includes usernames, follower counts, and profile bios where contact information usually appears.
5. Use AI to extract emails from the profile biographies with the Information Extractor node powered by GPT-3.5-turbo. It identifies emails even when they are hidden or creatively formatted.
6. Store verified leads in a PostgreSQL database. The workflow automatically adds new leads or updates existing ones with fields like username, follower count, email, and niche.

Once everything is set up, the system runs on autopilot and keeps building your database of quality leads around the clock.

## How to set up

Follow these steps to get your Instagram Lead Generation Machine running:

1. Import the JSON file into your n8n instance.
2. Add your API credentials:
   - Apify token for the Google and Instagram scrapers
   - OpenAI API key for the AI-powered nodes
   - PostgreSQL credentials for storing leads
3. Open the Edit Fields node and set your platform, field of interest, and target country.
4. Run the workflow manually using the Manual Trigger node to test it.
5. Once confirmed, replace the manual trigger with a schedule or webhook to run it automatically.
6. Check your PostgreSQL database to ensure the leads are being saved correctly.

## Requirements

Before running the workflow, make sure you have the following:

- An n8n account or instance (self-hosted or n8n Cloud)
- An Apify account for accessing the Google and Instagram scrapers
- OpenAI API access for generating smart search queries and extracting emails
- A PostgreSQL database to store your leads
- Basic understanding of how n8n workflows and nodes operate

## How to customize the workflow

This workflow is flexible and can be customized to fit your business goals. Here's how you can tailor it:

- Change your niche or location by updating the Edit Fields node. You can switch from "beauty influencers in the USA" to "fitness coaches in Canada" in seconds.
- Add more data fields to collect additional information such as engagement rates, bio keywords, or profile categories. Just modify the PostgreSQL node and database schema.
- Connect to your CRM or email system to automatically send introduction emails or add new leads to your marketing pipeline.
- Use different triggers such as a scheduled cron trigger for daily runs or a webhook trigger to start the workflow through an API call.
- Filter higher-quality leads by adding logic to capture only profiles with a minimum number of followers or verified emails (one possible sketch follows this list).
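To complement the AI-based email extraction, here is a minimal, hypothetical n8n Code node that pre-filters scraped profiles: it keeps only profiles above an assumed follower threshold and tries a simple regex pass over the bio before the Information Extractor handles trickier cases. The field names (username, followersCount, biography) are assumptions about the scraper output, not part of the original template.

```javascript
// Hypothetical pre-filter for scraped Instagram profiles (n8n Code node).
// Field names below are assumptions about the Apify scraper output; adjust as needed.
const MIN_FOLLOWERS = 1000; // assumed quality threshold
const EMAIL_RE = /[a-z0-9._%+-]+@[a-z0-9.-]+\.[a-z]{2,}/i;

const out = [];
for (const item of $input.all()) {
  const p = item.json;
  const followers = Number(p.followersCount || 0);
  if (followers < MIN_FOLLOWERS) continue; // drop low-follower profiles

  const bio = String(p.biography || '');
  const match = bio.match(EMAIL_RE); // cheap first pass; the AI extractor handles obfuscated emails

  out.push({
    json: {
      username: p.username,
      followers,
      bio,
      email: match ? match[0].toLowerCase() : null,
    },
  });
}
return out;
```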
by Trung Tran
# Code of Conduct Q&A Slack Chatbot (RAG-Powered)

> Empower employees to instantly access and understand the company's Code of Conduct via a Slack chatbot, powered by Retrieval-Augmented Generation (RAG) and LLMs.

## Who's it for

This workflow is designed for:

- **HR and compliance teams** to automate policy-related inquiries
- **Employees** who want quick answers to Code of Conduct questions directly inside Slack
- **Startups or enterprises** that need internal compliance self-service tools powered by AI

## How it works / What it does

This RAG-powered Slack chatbot answers user questions based on your uploaded Code of Conduct PDF using GPT-4 and embedded document chunks. Here's the flow:

1. **Receive Message from Slack:** A webhook triggers when a message is posted in Slack.
2. **Check if it's a valid query:** Filters out non-user messages (e.g., bot mentions); see the sketch at the end of this listing.
3. **Run Agent with RAG:** Uses GPT-4 with the Query Data Tool to retrieve relevant document chunks and returns a well-formatted, context-aware answer.
4. **Send Response to Slack:** Fetches user info and posts the answer back in the same channel.

**Document Upload Flow:** HR can upload the PDF Code of Conduct file. It's parsed, chunked, embedded using OpenAI, and stored for future query retrieval. A backup copy is saved to Google Drive.

## How to set up

1. **Prepare your environment:**
   - Slack Bot token & webhook configured (sample Slack app manifest: https://wisestackai.s3.ap-southeast-1.amazonaws.com/slack_bot_manifest.json)
   - OpenAI API key (for GPT-4 & embedding)
   - Google Drive credentials (optional, for backup)
2. **Upload the Code of Conduct PDF:**
   - Use the designated node to upload your document (sample file: https://wisestackai.s3.ap-southeast-1.amazonaws.com/20220419-ingrs-code-of-conduct-policy-en.pdf)
   - This triggers chunking, then embedding, then the data store.
3. **Deploy the chatbot:**
   - Host the webhook and connect it to your Slack app.
   - Share the command format with employees (e.g., @CodeBot Can I accept gifts from partners?)
4. **Monitor and iterate:**
   - Improve chunk size or the embedding model if queries aren't accurate.
   - Review unanswered queries to enhance coverage.

## Requirements

- n8n (self-hosted or Cloud)
- Slack App (with chat:write, users:read, commands)
- OpenAI account (embedding + GPT-4 access)
- Google Drive integration (for backups)
- Uploaded Code of Conduct in PDF format

## How to customize the workflow

| What to Customize | How to Do It |
|-------------------|--------------|
| Prompt style | Edit the System & User prompts inside the Code Of Conduct Agent node |
| Document types | Upload additional policy PDFs and tag them differently in metadata |
| Agent behavior | Tune GPT temperature or replace with a different LLM |
| Slack interaction | Customize message formats or trigger phrases |
| Data store engine | Swap to Pinecone, Weaviate, Supabase, etc., depending on use case |
| Multilingual support | Preprocess text and support locale detection via Slack metadata |
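As an illustration of the "Check if it's a valid query" step, here is a minimal sketch of an n8n Code node that drops Slack events that did not come from a human user. The event fields follow Slack's Events API (type, bot_id, subtype), but the exact payload path depends on your webhook configuration, so treat this as an assumption to adapt rather than the template's actual node.

```javascript
// Hypothetical guard for incoming Slack events (n8n Code node).
// Keeps only genuine user messages so the RAG agent never answers bots or system notices.
const out = [];
for (const item of $input.all()) {
  // Payload path may differ depending on your Webhook node settings.
  const event = item.json.body?.event || item.json.event || {};

  const isUserMessage =
    (event.type === 'message' || event.type === 'app_mention') &&
    !event.bot_id &&                 // ignore messages posted by bots
    event.subtype === undefined &&   // ignore edits, joins, bot_message, etc.
    typeof event.text === 'string' &&
    event.text.trim().length > 0;

  if (isUserMessage) {
    out.push({ json: { user: event.user, channel: event.channel, text: event.text } });
  }
}
return out;
```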
by Trung Tran
# Create AI-Powered Chatbot for Candidate Evaluation on Slack

> This workflow connects a Slack chatbot with AI agents and Google Sheets to automate candidate resume evaluation. It extracts resume details, identifies the applied job from the message, fetches the correct job description, and provides a summarized evaluation via Slack and a tracking sheet. Perfect for HR teams using Slack.

## Who's it for

This workflow is designed for:

- **HR Teams, Recruiters, and Hiring Managers**
- Working in software or tech companies using Slack, Google Sheets, and n8n
- Who want to automate candidate evaluation based on uploaded profiles and applied job positions

## How it works / What it does

This workflow is triggered when a Slack user mentions the HR bot and attaches a candidate profile PDF. The workflow performs the following steps:

1. **Trigger from Slack Mention:** A user mentions the bot in Slack with a message like: @HRBot Please evaluate this candidate for the AI Engineer role. (with PDF attached)
2. **Input Validation:** If no file is attached, the bot replies: "Please upload the candidate profile file before sending the message." (A minimal sketch of this check appears at the end of this listing.)
3. **Extract Candidate Profile:** Downloads the attached PDF from Slack and uses Extract from File to parse the resume into text.
4. **Profile Analysis (AI Agent):** Sends the resume text and message to the Profile Analyzer Agent, which identifies the candidate name, email, and summary plus the applied position (from the message), and looks up the Job Description PDF URL using Google Sheets.
5. **Job Description Retrieval:** Downloads and parses the matching JD PDF.
6. **HR Evaluation (AI Agent):** Sends both the candidate profile and job description to the HR Expert Agent and receives a summarized fit evaluation and insights.
7. **Output and Logging:** Sends the evaluation result back to Slack in the original thread and updates a Google Sheet with evaluation data for tracking.

## How to set up

1. **Slack Setup**
   - Create a Slack bot and install it into your workspace
   - Enable the app_mention event and generate a bot token
   - Connect Slack to n8n using Slack Bot credentials
2. **Google Sheets Setup**
   - Create a sheet mapping Position Title → Job Description URL
   - Create another sheet for logging evaluation results
3. **n8n Setup**
   - Add a Webhook Trigger for Slack mentions
   - Connect Slack, Google Sheets, and GPT-4 credentials
   - Set up agents (Profile Analyzer Agent, HR Expert Agent) with appropriate prompts
4. **Deploy & Test**
   - Mention your bot in Slack with a message and file
   - Confirm the reply and the entry in the evaluation tracking sheet

## Requirements

- n8n (self-hosted or cloud)
- Slack App with Bot Token
- OpenAI or Azure OpenAI account (for GPT-4)
- Google Sheets (2 sheets: job mapping + evaluation log)
- Candidate profiles in PDF format
- Defined job titles and descriptions

## How to customize the workflow

You can easily adapt this workflow to your team's needs:

| Customization Area | How to Customize |
|--------------------|------------------|
| Job Mapping Source | Replace Google Sheet with Airtable or Notion DB |
| JD Format | Use Markdown or inline descriptions instead of PDF |
| Evaluation Output Format | Change from Slack message to Email or Notion update |
| HR Agent Prompt | Customize to match your company tone or include scoring rubrics |
| Language Support | Add support for bilingual input/output (e.g., Vietnamese & English) |
| Workflow Trigger | Trigger from slash command or form instead of @mention |
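Here is a rough sketch (an assumption, not the template's actual code) of how the input-validation step could look in an n8n Code node: it checks whether the Slack mention event carries a PDF attachment and flags the item accordingly. The location of the files array in the Slack payload can vary, so verify it against your own event data.

```javascript
// Hypothetical validation step: did the Slack mention include a candidate PDF?
// The files array location depends on the Slack payload shape; adjust the path as needed.
return $input.all().map(item => {
  const event = item.json.body?.event || item.json.event || {};
  const files = Array.isArray(event.files) ? event.files : [];
  const pdf = files.find(
    f => (f.mimetype || '').includes('pdf') || (f.name || '').toLowerCase().endsWith('.pdf')
  );

  return {
    json: {
      ...item.json,
      hasProfilePdf: Boolean(pdf),
      profileFileUrl: pdf ? pdf.url_private_download : null, // downloading requires the bot token
      replyText: pdf ? null : 'Please upload the candidate profile file before sending the message.',
    },
  };
});
```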
by DIGITAL BIZ TECH
# AI-Powered Website Chatbot with Google Drive Knowledge Base

## Overview

This workflow combines website chatbot intelligence with automated document ingestion and vectorization, enabling live Q&A from both chat input and processed Google Drive files. It uses Mistral AI for OCR + embeddings, and Qdrant for vector search.

## Chatbot Flow

- **Trigger:** When chat message received, or webhook, depending on the deployed chatbot
- **Model:** OpenAI gpt-4.1-mini
- **Memory:** Simple Memory (Buffer Window)
- **Vector Search Tool:** Qdrant Vector Store
- **Embeddings:** Mistral Cloud
- **Agent:** website chat agent
  - Responds based on chatdbtai Supabase content
  - Enforces brand tone and informative documents
  - Integration with both the embedded chat UI and the webhook

## Document → Knowledge Base Pipeline

Triggered manually to keep the vector store up-to-date.

Steps:

1. Google Drive (brand folder): fetch files from the folder Website kb (ID: 1o3DK9Ceka5Lqb8irvFSfEeB8SVGG_OL7)
2. Loop Over Items. For each file:
   - Set metadata
   - Download file
   - Upload to Mistral for OCR
   - Get Signed URL
   - Run OCR extraction (mistral-ocr-latest)
   - If OCR succeeds, pass to the chunking pipeline; else skip and continue
3. Chunking Logic (Code node)
   - Splits the document into 1,000-character JSON chunks
   - Adds metadata (source, char positions, file ID)
   - See the chunking sketch at the end of this listing.
4. Default Data Loader + Text Splitter: prepares chunks for embedding
5. Embeddings (Mistral Cloud): generates embeddings for the text chunks
6. Qdrant Vector Store (Insert mode): saves embeddings into the docragtestkb collection
7. Wait: optional delay between batches

## Integrations Used

| Service | Purpose | Credential |
|---------|---------|------------|
| Google Drive | File source | Google Drive account 6 rn dbt |
| Mistral Cloud | OCR + embeddings | Mistral Cloud account 2 dbt rn |
| Qdrant | Vector storage | QdrantApi account |
| OpenAI | Chat model | OpenAi account 8 dbt digi |

## Agent System Prompt Summary

> "You are the official AI assistant for this website. Use chatdbtai only as your knowledge source. Respond conversationally, list offerings clearly, link blogs, and say 'I couldn't find that on this site' if no match."

## Key Features

- Automated OCR + chunking → vectorization
- Persistent memory for chat sessions
- Multi-channel (Webhook + Embedded Chat)
- Fully brand-guided, structured responses
- Live data retrieval from the Qdrant vector store

## Summary

> A unified workflow that turns brand files + web content into a knowledge base that powers an intelligent chatbot, capable of responding to visitors in real time, powered by Mistral, OpenAI, and Qdrant.

## Need Help or More Workflows?

Want to customize this workflow for your business or integrate it with your existing tools? Our team at Digital Biz Tech can tailor it precisely to your use case, from automation logic to AI-powered enhancements. We can help you set it up for free, from connecting credentials to deploying it live.

Contact: shilpa.raju@digitalbiz.tech
Website: https://www.digitalbiz.tech
LinkedIn: https://www.linkedin.com/company/digitalbiztech/
You can also DM us on LinkedIn for any help.
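The chunking step described above is simple enough to sketch. The following is a hypothetical version of the Code node: it slices the OCR text into 1,000-character chunks and attaches source, character-position, and file-ID metadata. The input field names (text, fileId, fileName) are assumptions about what the OCR step returns, so map them to your own data.

```javascript
// Hypothetical chunking Code node: split OCR output into 1,000-character chunks
// with metadata for downstream embedding and Qdrant insertion.
const CHUNK_SIZE = 1000;
const out = [];

for (const item of $input.all()) {
  const text = String(item.json.text || '');      // assumed OCR text field
  const fileId = item.json.fileId || null;        // assumed Google Drive file ID
  const source = item.json.fileName || 'unknown'; // assumed source name

  for (let start = 0; start < text.length; start += CHUNK_SIZE) {
    const chunk = text.slice(start, start + CHUNK_SIZE);
    out.push({
      json: {
        pageContent: chunk,
        metadata: {
          source,
          fileId,
          charStart: start,
          charEnd: start + chunk.length,
        },
      },
    });
  }
}
return out;
```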
by Jitesh Dugar
Transform college admissions from an overwhelming manual process into an intelligent, efficient, and equitable system that analyzes essays, scores applicants holistically, and identifies top candidates, saving 40+ hours per week while improving decision quality.

## What This Workflow Does

Automates comprehensive application review with AI-powered analysis:

- **Application Intake** - Captures complete college applications via Jotform
- **AI Essay Analysis** - Deep analysis of personal statements and supplemental essays for:
  - Writing quality, authenticity, and voice
  - AI-generated content detection
  - Specificity and research quality
  - Red flags (plagiarism, inconsistencies, generic writing)
- **Holistic Review AI** - Evaluates applicants across five dimensions:
  - Academic strength (GPA, test scores, rigor)
  - Extracurricular profile (leadership, depth, impact)
  - Personal qualities (character, resilience, maturity)
  - Institutional fit (values alignment, contribution potential)
  - Diversity contribution (unique perspectives, experiences)
- **Smart Routing** - Automatically categorizes and routes applications:
  - Strong Admit (85-100): Slack alert → Director email → Interview invitation → Fast-track
  - Committee Review (65-84): Detailed analysis → Committee discussion → Human decision
  - Standard Review (<65): Acknowledgment → Human verification → Standard timeline
- **Comprehensive Analytics** - All applications logged with scores, recommendations, and outcomes

## Key Features

### AI Essay Analysis Engine

- **Writing Quality Assessment**: Grammar, vocabulary, structure, narrative coherence
- **Authenticity Detection**: Distinguishes genuine voice from AI-generated content (GPT detectors)
- **Content Depth Evaluation**: Self-awareness, insight, maturity, storytelling ability
- **Specificity Scoring**: Generic vs tailored "Why Us" essays with research depth
- **Red Flag Identification**: Plagiarism indicators, privilege blindness, inconsistencies, template writing
- **Thematic Analysis**: Core values, motivations, growth narratives, unique perspectives

### Holistic Review Scoring (0-100 Scale)

- **Academic Strength (35%)**: GPA in context, test scores, course rigor, intellectual curiosity
- **Extracurricular Profile (25%)**: Quality over quantity, leadership impact, commitment depth
- **Personal Qualities (20%)**: Character, resilience, empathy, authenticity, self-awareness
- **Institutional Fit (15%)**: Values alignment, demonstrated interest, contribution potential
- **Diversity Contribution (5%)**: Unique perspectives, life experiences, background diversity

(A minimal sketch of this weighted scoring and routing logic appears at the end of this listing.)

### Intelligent Candidate Classification

- **Admit**: Top 15% - clear admit, exceptional across multiple dimensions
- **Strong Maybe**: Top 15-30% - competitive, needs committee discussion
- **Maybe**: Top 30-50% - solid but not standout, waitlist consideration
- **Deny**: Below threshold - does not meet competitive standards (always human-verified)

### Automated Workflows

- **Priority Candidates**: Immediate Slack alerts, director briefs, interview invitations
- **Committee Cases**: Detailed analysis packets, discussion points, voting workflows
- **Standard Processing**: Professional acknowledgments, timeline communications
- **Interview Scheduling**: Automated invitations with candidate-specific questions

## Perfect For

- **Selective Colleges & Universities**: 15-30% acceptance rates, holistic review processes
- **Liberal Arts Colleges**: Emphasis on essays, personal qualities, institutional fit
- **Large Public Universities**: Processing thousands of applications efficiently
- **Graduate Programs**: MBA, law, medical school admissions
- **Scholarship Committees**: Evaluating merit- and need-based awards
- **Honors Programs**: Identifying top candidates for selective programs
- **Private High Schools**: Admissions teams with holistic processes

## Admissions Impact

### Efficiency & Productivity

- **40-50 hours saved per week** on initial application review
- **70% faster** essay evaluation with AI pre-analysis
- **3x more applications** processed per reader
- **Zero data entry** - all information auto-extracted
- **Consistent evaluation** across thousands of applications
- **Same-day turnaround** for top candidate identification

### Decision Quality Improvements

- **Objective scoring** reduces unconscious bias
- **Consistent criteria** applied to all applicants
- **Essay authenticity checks** catch AI-written applications
- **Holistic view** considers all dimensions equally
- **Data-driven insights** inform committee discussions
- **Fast-track top talent** before competitors

### Equity & Fairness

- **Standardized evaluation** ensures fair treatment
- **First-generation flagging** provides context
- **Socioeconomic consideration** in holistic scoring
- **Diverse perspectives valued** in diversity score
- **Bias detection** in essay analysis
- **Audit trail** for compliance and review

### Candidate Experience

- **Instant acknowledgment** of application receipt
- **Professional communication** at every stage
- **Clear timelines** and expectations
- **Interview invitations** for competitive candidates
- **Respectful process** for all applicants regardless of outcome

## What You'll Need

### Required Integrations

- **Jotform** - Application intake forms. Create your form for free on JotForm using this link.
- **OpenAI API** - GPT-4o for analysis (~$0.15-0.25 per application)
- **Gmail/Outlook** - Applicant and staff communication (free)
- **Google Sheets** - Application database and analytics (free)

### Optional Integrations

- **Slack** - Real-time alerts for strong candidates ($0-8/user/month)
- **Google Calendar** - Interview scheduling automation (free)
- **Airtable** - Advanced application tracking (alternative to Sheets)
- **Applicant Portal Integration** - Status updates via API
- **CRM Systems** - Slate, TargetX, Salesforce for higher ed

## Setup Guide (3-4 Hours)

### Step 1: Create Application Form (60 min)

Build a comprehensive Jotform with sections:

- Basic Information
  - Full name, email, phone
  - High school, graduation year
  - Intended major
- Academic Credentials
  - GPA (weighted/unweighted, scale)
  - SAT score (optional)
  - ACT score (optional)
  - Class rank (if available)
  - Academic honors
- Essays (Most Important!)
  - Personal statement (650 words max)
  - "Why Our College" essay (250-300 words)
  - Supplemental prompts (program-specific)
- Activities & Achievements
  - Extracurricular activities (list with hours/week, years)
  - Leadership positions (with descriptions)
  - Honors and awards
  - Community service hours
  - Work experience
- Additional Information
  - First-generation college student (yes/no)
  - Financial aid needed (yes/no)
  - Optional: demographic information
  - Optional: additional context

### Step 2: Import n8n Workflow (15 min)

- Copy JSON from the artifact
- n8n: Workflows → Import → Paste
- Includes all nodes + 7 detailed sticky notes

### Step 3: Configure OpenAI API (20 min)

- Get API key: https://platform.openai.com/api-keys
- Add to both AI nodes (Essay Analysis + Holistic Review)
- Model: gpt-4o (best for nuanced analysis)
- Temperature: 0.3 (consistency with creativity)
- Test with a sample application
- Cost: $0.15-0.25 per application (essay analysis + holistic review)

### Step 4: Customize Institutional Context (45 min)

Edit the AI prompts to reflect YOUR college.

In the Holistic Review prompt, update:
- College name and type
- Acceptance rate
- Average admitted student profile (GPA, test scores)
- Institutional values and culture
- Academic programs and strengths
- What makes your college unique
- Desired student qualities

In the Essay Analysis prompt, add:
- Specific programs to look for mentions of
- Faculty names applicants should reference
- Campus culture keywords
- Red flags specific to your institution

### Step 5: Setup Email Communications (30 min)

- Connect Gmail/Outlook OAuth
- Update all recipient addresses:
  - admissions-director@college.edu
  - admissions-committee@college.edu
  - Email addresses for strong candidate alerts
- Customize email templates:
  - Add college name, logo, branding
  - Update contact information
  - Adjust tone to match institutional voice
  - Include decision release dates
  - Add applicant portal links

### Step 6: Configure Slack Alerts (15 min, Optional)

- Create channel: #admissions-strong-candidates
- Add webhook URL or bot token
- Test with a mock strong candidate
- Customize alert format and recipients

### Step 7: Create Admissions Database (30 min)

Google Sheet with columns:
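As referenced in the Holistic Review Scoring section, here is a minimal, hypothetical sketch of the weighted composite score and routing thresholds described in this listing. It is not the template's actual Code node: the sub-score field names are assumptions, and in the real workflow the dimension scores come from the AI review.

```javascript
// Hypothetical holistic scoring + routing sketch (n8n Code node style).
// Weights and thresholds mirror the listing: 35/25/20/15/5 weights and 85/65 cut-offs.
const WEIGHTS = {
  academic: 0.35,
  extracurricular: 0.25,
  personal: 0.20,
  fit: 0.15,
  diversity: 0.05,
};

return $input.all().map(item => {
  const s = item.json.scores || {}; // assumed: each dimension rated 0-100 by the AI review
  const total = Object.entries(WEIGHTS)
    .reduce((sum, [dim, w]) => sum + w * Number(s[dim] || 0), 0);
  const score = Math.round(total);

  let route;
  if (score >= 85) route = 'Strong Admit';          // Slack alert, director email, interview invite
  else if (score >= 65) route = 'Committee Review'; // detailed packet, human decision
  else route = 'Standard Review';                   // acknowledgment, human verification

  return { json: { ...item.json, holistic_score: score, route } };
});
```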
by aditya vadaganadam
This n8n template turns chat questions into structured financial reports using Gemini and posts them to a Discord channel via webhook. Ask about tickers, sectors, or theses (e.g., "NVDA long-term outlook?" or "Gold ETF short-term drivers?") and receive a concise, shareable report.

## Good to know

- Not financial advice: use for insights only; verify independently.
- Model availability can vary by region. If you see "model not found," it may be geo-restricted.
- Costs depend on model and tokens. Check current Gemini pricing for updates.
- Discord messages are limited to ~2000 characters per post; long reports may need splitting (see the sketch at the end of this listing).
- Rate limits: Discord webhooks are rate-limited; add short waits for bursts.

## How it works

1. Chat Trigger collects the user's question (public chat supported when the workflow is activated).
2. Conversation Memory keeps a short window of recent messages to maintain context.
3. Connect Gemini provides the LLM (e.g., gemini-2.5-flash-lite) and parameters (temperature, tokens).
4. Agent (agent1) applies a financial analysis System Message to produce structured insights.
5. Structured Output Parser enforces a simple JSON schema: idea (one-line thesis) + analysis (Markdown sections).
6. Code formats a Discord-ready Markdown report (title, question, executive summary, sections, disclaimer).
7. Edit Fields maps the formatted report to a clean content field.
8. Discord Webhook posts the final report to your channel.

## How to use

- Start with the built-in Chat Trigger: click Open chat, ask a question, and verify the Discord post.
- Replace or augment with a Cron or Webhook trigger for scheduled or programmatic runs.
- For richer context, add HTTP Request nodes (prices, news, filings) and pass summaries to the agent.

## Requirements

- n8n instance with internet access
- Google AI (Gemini) API key
- Discord server with a webhook URL

## Customising this workflow

- System Message: adjust tone, depth, risk profile, and required sections (Summary, Drivers, Risks, Metrics, Next Steps, Takeaway).
- Model settings: switch models or tune temperature/tokens in Connect Gemini.
- Schema: extend the parser and formatter with fields like drivers[], risks[], or metrics{}.
- Formatting: edit the Code node to change headings, emojis, disclaimers, or add timestamps.
- Operations: add retries, message splitting for long outputs, and rate-limit handling for Discord.
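Since Discord caps webhook posts at roughly 2000 characters, here is a small, hypothetical helper in the style of the template's Code node that splits a long Markdown report into webhook-sized chunks, preferring to break on blank lines. The chunk size and input field names are assumptions, not the template's exact implementation.

```javascript
// Hypothetical report splitter for the Discord webhook (n8n Code node style).
// Splits a long Markdown report into <= 1900-character chunks, breaking on
// blank lines where possible so sections stay intact.
const LIMIT = 1900; // safety margin under Discord's ~2000-character cap

const out = [];
for (const item of $input.all()) {
  const report = String(item.json.content || item.json.report_markdown || '');
  const paragraphs = report.split('\n\n');

  let chunk = '';
  const flush = () => {
    if (chunk.trim()) out.push({ json: { content: chunk.trim() } });
    chunk = '';
  };

  for (const p of paragraphs) {
    if ((chunk + '\n\n' + p).length > LIMIT) flush();
    if (p.length > LIMIT) {
      // A single paragraph is too long: hard-split it.
      for (let i = 0; i < p.length; i += LIMIT) {
        out.push({ json: { content: p.slice(i, i + LIMIT) } });
      }
    } else {
      chunk = chunk ? chunk + '\n\n' + p : p;
    }
  }
  flush();
}
return out; // each item becomes one Discord webhook post
```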
by vinci-king-01
# Sales Pipeline Automation Dashboard with AI Lead Intelligence

## Target Audience

- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

## Problem Statement

Manual lead qualification and sales pipeline management is inefficient and often leads to missed opportunities or poor lead prioritization. This template solves the challenge of automatically scoring, qualifying, and routing leads using AI-powered intelligence to maximize conversion rates and sales team productivity.

## How it Works

This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

### Key Components

1. **Dual Trigger System** - Scheduled monitoring and webhook triggers for real-time lead processing
2. **AI-Powered Lead Intelligence** - Advanced scoring algorithm based on 7 key factors
3. **Multi-Source Data Enrichment** - LinkedIn and Crunchbase integration for comprehensive lead profiles
4. **Automated Sales Actions** - Intelligent routing, task creation, and follow-up sequences
5. **Multi-Platform Integration** - HubSpot CRM, Slack notifications, and Google Sheets dashboard

## Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |

## Setup Instructions

Estimated setup time: 25-30 minutes

### Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

### Step-by-Step Configuration

1. **Install Community Nodes**
   - Install the required community nodes: `npm install n8n-nodes-scrapegraphai` and `npm install n8n-nodes-slack`
2. **Configure ScrapeGraphAI Credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up HubSpot CRM Integration**
   - Add HubSpot API credentials
   - Grant necessary permissions for contacts, deals, and tasks
   - Configure custom properties for lead scoring and qualification
   - Test the connection to ensure it's working
4. **Set up Google Sheets Connection**
   - Add Google Sheets OAuth2 credentials
   - Grant necessary permissions for spreadsheet access
   - Create a new spreadsheet for sales pipeline data
   - Configure the sheet name (default: "Sales Pipeline")
5. **Configure Lead Scoring Parameters**
   - Update the lead scoring weights in the Code node (see the sketch at the end of this listing)
   - Customize ideal customer profile criteria
   - Set automation trigger thresholds
   - Adjust sales rep assignment logic
6. **Set up Notification Channels**
   - Configure Slack webhook or API credentials
   - Set up email service credentials for welcome emails
   - Define notification preferences for different lead grades
   - Test notification delivery
7. **Configure Triggers**
   - Set up the webhook endpoint for real-time lead capture
   - Configure the scheduled trigger for periodic monitoring
   - Choose appropriate time zones for your business hours
   - Test both trigger mechanisms
8. **Test and Validate**
   - Run the workflow manually with sample lead data
   - Check HubSpot for proper contact and deal creation
   - Verify Google Sheets data formatting
   - Test all notification channels

## Workflow Customization Options

### Modify Lead Scoring Algorithm

- Adjust scoring weights for different factors
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

### Extend Data Enrichment

- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent signal detection

### Customize Sales Automation

- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

### Output Customization

- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion rate tracking and analysis

## Use Cases

- **Lead Qualification**: Automatically score and qualify incoming leads
- **Sales Pipeline Management**: Streamline the entire sales process
- **Lead Routing**: Intelligently assign leads to appropriate sales reps
- **Follow-up Automation**: Ensure consistent and timely follow-up
- **Sales Intelligence**: Provide comprehensive lead insights
- **Performance Tracking**: Monitor sales team and pipeline performance

## Important Notes

- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

## Troubleshooting

Common Issues:

- ScrapeGraphAI connection errors: verify API key and account status
- HubSpot API errors: check API key and permissions
- Google Sheets permission errors: check OAuth2 scope and permissions
- Lead scoring errors: review the Code node's JavaScript logic
- Rate limiting: adjust request frequency and implement delays

Support Resources:

- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
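As a starting point for step 5 above, here is a hypothetical sketch of lead-scoring weights and grade thresholds in the style of the workflow's Code node. The factor names, weights, and grade bands are illustrative assumptions only; replace them with your own ideal-customer-profile criteria.

```javascript
// Hypothetical lead scoring sketch (n8n Code node style).
// Factors, weights, and grade bands are examples; tune them to your ICP.
const WEIGHTS = {
  jobTitleFit: 20,     // decision-maker titles score higher
  companySizeFit: 15,
  industryFit: 15,
  fundingStage: 10,
  emailQuality: 10,    // business domain vs free mail provider
  engagement: 15,      // form fields completed, content downloaded, etc.
  geographyFit: 15,
};

function grade(score) {
  if (score >= 90) return 'A+';
  if (score >= 80) return 'A';
  if (score >= 70) return 'B+';
  if (score >= 60) return 'B';
  return 'C+';
}

return $input.all().map(item => {
  const f = item.json.factors || {}; // assumed: each factor pre-rated 0-1 upstream
  const lead_score = Math.round(
    Object.entries(WEIGHTS).reduce((sum, [k, w]) => sum + w * Number(f[k] || 0), 0)
  );
  return { json: { ...item.json, lead_score, grade: grade(lead_score) } };
});
```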
by AppUnits AI
# Automated Invoice Creation & Team Notification with Jotform, Xero, Outlook, and Telegram

This workflow automates the entire process of receiving a product/service order, checking or creating a customer in Xero, generating an invoice, emailing it, and notifying the sales team via Telegram if no action has been taken after a set time, all triggered by a Jotform form submission.

## How It Works

1. **Receive Submission:** Triggered when a user submits a form. Collects data like customer details, the selected product/service, etc.
2. **Check If Customer Exists:** Searches Xero to determine if the customer already exists.
   - If the customer exists: update the customer details.
   - If the customer doesn't exist: create a new customer in Xero.
3. **Create The Invoice:** Generates a new invoice for the customer using the selected item.
4. **Send The Invoice:** Automatically sends the invoice via email to the customer.
5. **Wait For Some Time:** Waits 30 seconds (by default; you can change it) and then fetches the invoice details from Xero. A sketch of the follow-up check appears at the end of this listing.
6. **Notify The Team:** Notifies the sales team (for example via Telegram) if no action has been taken on the invoice, so the team can act fast.

## Who Can Benefit from This Workflow?

- **Freelancers**
- **Service Providers**
- **Consultants & Coaches**
- **Small Businesses**
- **E-commerce or Custom Product Sellers**

## Requirements

- Jotform webhook setup, more info here
- Xero credentials, more info here
- Make sure that product/service values in Jotform are exactly the same as your item Code in your Xero account
- Email setup: update the email node (Send email); more info about Outlook setup here
- LLM model credentials
- Telegram credentials, more info here
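To make the "notify if no action" decision concrete, here is a hypothetical check that could sit between the Wait node and the Telegram node. The Xero invoice fields used (Invoices, Status, AmountPaid, InvoiceNumber) are assumptions about the fetched payload; verify them against your own Xero response before relying on this.

```javascript
// Hypothetical follow-up check after the Wait node (n8n Code node style).
// Flags invoices that still look untouched so the Telegram notification fires.
// Field names are assumptions about the Xero response shape; adjust as needed.
return $input.all().map(item => {
  const invoice = (item.json.Invoices && item.json.Invoices[0]) || item.json;

  const stillUnapproved = invoice.Status === 'DRAFT' || invoice.Status === 'SUBMITTED';
  const unpaid = Number(invoice.AmountPaid || 0) === 0;

  return {
    json: {
      invoiceNumber: invoice.InvoiceNumber,
      status: invoice.Status,
      needsFollowUp: stillUnapproved && unpaid, // route to the Telegram node when true
    },
  };
});
```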
by Mirai
# Icebreaker Generator powered with ChatGPT

This n8n template crawls a company website, distills the content with AI, and produces a short, personalized icebreaker you can drop straight into your cold emails or CRM. Perfect for SDRs, founders, and agencies who want "real research" at scale.

## Good to know

- Works from a Google Sheet of leads (domain + LinkedIn, etc.).
- Handles common scrape failures gracefully and marks the lead's Status as Error.
- Uses ChatGPT to summarize pages and craft one concise, non-generic opener.
- Output is written back to the same Google Sheet (IceBreaker, Status).
- You'll need Google credentials (for Sheets) and OpenAI credentials (for GPT).

## How it works

### Step 1: Discover internal pages

- Reads a lead's website from Google Sheets.
- Scrapes the home page and extracts all links.
- A Code node cleans the list (removes emails/anchors/social/external domains, normalizes paths, de-duplicates) and returns unique internal URLs (see the sketch at the end of this listing).
- If the home page is unreachable or no links are found, the lead is marked Error and the workflow moves on.

### Step 2: Convert pages to text

- Visits each collected URL and converts the response into HTML/Markdown text for analysis.
- You can cap depth/amount with the Limit node.

### Step 3: Summarize & generate the icebreaker

- A GPT node produces a two-paragraph abstract for each page (JSON output).
- An Aggregate node merges all abstracts for the company.
- Another GPT node turns the merged summary into a personalized, multi-line icebreaker (spartan tone, non-obvious details).
- The result is written back to Google Sheets (IceBreaker = ..., Status = Done). The workflow loops to the next lead.

## How to use

1. **Prepare your sheet**
   - Include at least: organization_website_url, linkedin_url, and any other lead fields you track.
   - Keep an empty IceBreaker and Status column for the workflow to fill.
2. **Connect credentials**
   - Google Sheets: use the Google account that owns the sheet and link it in the nodes.
   - OpenAI: add your API key to the GPT nodes ("Summarize Website Page", "Generate Multiline Icebreaker").
3. **Run the workflow**
   - Start with the Manual Trigger (or replace with a schedule/webhook).
   - Adjust Limit if you want fewer/more pages per company.
   - Watch Status (Done/Error) and IceBreaker populate in your sheet.

## Requirements

- n8n instance
- Google Sheets account & access to the leads sheet
- OpenAI API key (for summarization + icebreaker generation)

## Customizing this workflow

- Tone & format: tweak the prompts (both GPT nodes) to match your brand voice and structure.
- Depth: change the Limit node to scan more/fewer pages; add simple rules to prioritize certain paths (e.g., /about, /blog/*).
- Fields: write additional outputs (e.g., Company Summary, Key Products, Recent News) back to new sheet columns.
- Lead selection: filter rows by Status = "" (or custom flags) to only process untouched leads.
- Error handling: expand the Error branch to retry with www. or HTTP/HTTPS variants, or to log diagnostics in a separate tab.

## Tips

- Keep icebreakers short, specific, and free of clichés; small, non-obvious details from the site convert best.
- Start with a small batch to validate quality, then scale up.
- Consider adding a rate limit if target sites throttle requests.

In short: Sheet → crawl internal pages → AI abstracts → single tailored icebreaker → write back to the sheet, then repeat for the next lead. This automation can work great with our automation for automated cold emailing.
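The link-cleaning Code node from Step 1 is easy to sketch. Below is a hypothetical version, assuming an upstream node provides the home page's links array and the lead's base URL; the exact input fields in the real template may differ.

```javascript
// Hypothetical link cleaner for Step 1 (n8n Code node style).
// Input assumptions: item.json.links is an array of hrefs extracted from the
// home page, and item.json.baseUrl is the lead's website URL.
const SOCIAL = ['facebook.com', 'twitter.com', 'x.com', 'linkedin.com', 'instagram.com', 'youtube.com'];

return $input.all().map(item => {
  const baseUrl = new URL(item.json.baseUrl);
  const seen = new Set();

  for (const href of item.json.links || []) {
    if (!href || href.startsWith('mailto:') || href.startsWith('tel:') || href.startsWith('#')) continue;
    let url;
    try {
      url = new URL(href, baseUrl); // resolves relative paths against the home page
    } catch (e) {
      continue; // skip malformed hrefs
    }
    if (url.hostname.replace(/^www\./, '') !== baseUrl.hostname.replace(/^www\./, '')) continue; // external domain
    if (SOCIAL.some(s => url.hostname.includes(s))) continue; // social links

    const clean = url.origin + url.pathname.replace(/\/+$/, ''); // drop anchors, query, trailing slash
    seen.add(clean || url.origin);
  }

  return { json: { ...item.json, internalUrls: [...seen] } };
});
```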
by Intuz
This n8n template from Intuz provides a complete solution to automate your entire invoicing process. It intelligently syncs confirmed sales orders from your Airtable base to QuickBooks, automatically creating new customers if they don't exist before generating a perfectly matched invoice. It then logs all invoice details back into Airtable, creating a flawless, end-to-end financial workflow.

## Use Cases

1. **Accounting & Finance Teams:** Automatically generate QuickBooks invoices from new orders confirmed in Airtable. Keep all invoices and customer details synced across systems in real time.
2. **Sales & Operations Teams:** Track order status and billing progress directly from Airtable without switching platforms. Ensure every confirmed sale automatically triggers an invoice in QuickBooks.
3. **Business Owners / Admins:** Eliminate double-entry between Airtable and QuickBooks. Maintain accurate, audit-ready financial records with minimal effort.

## How it works

1. **Trigger from Airtable:** The workflow starts instantly when a sales order is ready to be invoiced in your Airtable base (triggered via a webhook).
2. **Check for Customer in QuickBooks:** It searches your QuickBooks account to see if the customer from the sales order already exists.
3. **Create New Customer (If Needed):** If the customer is not found, it automatically creates a new customer record in QuickBooks using the details from your Airtable Customers table.
4. **Create QuickBooks Invoice:** Using the correct customer record (either existing or newly created), it gathers all order line items from Airtable and generates a detailed invoice in QuickBooks (a hypothetical payload sketch appears at the end of this listing).
5. **Log Invoice Back to Airtable:** After the invoice is successfully created, the workflow updates your Airtable base by adding a new record to your Invoices & Payments table and updating the original Confirmed Orders record with the new QuickBooks Invoice ID, marking it as synced.

## Key Requirements to Use This Template

1. **n8n Instance:** An active n8n account (Cloud or self-hosted).
2. **Airtable Base:** An Airtable base on a "Pro" plan or higher with tables for Confirmed Orders, Customers, Order Lines, Product & Service, and Invoices & Payments. Field names must match those in the setup guide.
3. **QuickBooks Online Account:** An active QuickBooks Online account with API access.

## Step-by-Step Setup Instructions

### Step 1: Import and Configure the n8n Workflow

- **Import Workflow:** In n8n, import the Client-Quickbook-Invoices-via-AirTable.json file.
- **Get Webhook URL:** Click on the first node, "Webhook". Copy the "Test URL". Keep this n8n tab open.
- **Configure Airtable Nodes:** There are six Airtable nodes. For each one, connect your Airtable credentials and select the correct Base and Table.
- **Configure QuickBooks Nodes:** There are four QuickBooks-related nodes. For each one, connect your QuickBooks Online credentials.
- **CRITICAL:** Click on the "Create Invoice URL" (HTTP Request) node. You must edit the URL and replace the placeholder number (9341455145770046) with your own QuickBooks Company ID. (Find this in your QuickBooks account settings under "Billing & Subscription".)
- **Save and Activate:** Click "Save", then toggle the workflow to "Active". After activating, copy the new "Production URL" from the Webhook node.

## Customization Guide

You can adapt this template for various workflows by tweaking a few nodes:

- **Use a different Airtable Base:** Update the Base ID and Table ID in all Airtable nodes (Get Orders Records, Get Customer Details, Get Products, etc.).
- **Switch from Sandbox to Live QuickBooks:** Replace the Sandbox company ID and endpoint in the "Create Invoice URL" node with your production QuickBooks company ID.
- **Add more invoice details:** Edit the Code and Parse in HTTP nodes to include additional fields (like Tax, Shipping, or Notes).
- **Support multiple currencies:** Add a "Currency" field mapping in both the Airtable and QuickBooks nodes.

## Connect with us

Website: https://www.intuz.com/services
Email: getstarted@intuz.com
LinkedIn: https://www.linkedin.com/company/intuz
Get Started: https://n8n.partnerlinks.io/intuz

For Custom Workflow Automation, click here: Get Started
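As referenced in step 4 of "How it works," here is a hypothetical sketch of how a Code node might assemble the invoice body from Airtable order lines before the "Create Invoice URL" HTTP Request node posts it. The Airtable field names are assumptions; the payload shape follows QuickBooks Online's invoice create endpoint (CustomerRef plus SalesItemLineDetail lines), but confirm it against your own account and sandbox before use.

```javascript
// Hypothetical payload builder for the "Create Invoice URL" HTTP Request node.
// Airtable field names (Quantity, Unit Price, QuickBooks Item ID, Description)
// are assumptions; map them to your actual Order Lines table.
const customerId = $input.first().json.quickbooksCustomerId; // assumed upstream field

const lines = $input.all().map(item => {
  const row = item.json;
  const qty = Number(row['Quantity'] || 1);
  const rate = Number(row['Unit Price'] || 0);
  return {
    DetailType: 'SalesItemLineDetail',
    Amount: +(qty * rate).toFixed(2),
    Description: row['Description'] || '',
    SalesItemLineDetail: {
      ItemRef: { value: String(row['QuickBooks Item ID']) },
      Qty: qty,
      UnitPrice: rate,
    },
  };
});

// Body for POST https://quickbooks.api.intuit.com/v3/company/{companyId}/invoice
// (use the sandbox host while testing).
return [{ json: { CustomerRef: { value: String(customerId) }, Line: lines } }];
```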
by Colton Randolph
This n8n workflow automatically scrapes TechCrunch articles, filters for AI-related content using OpenAI, and delivers curated summaries to your Slack channels. Perfect for individuals or teams who need to stay current on artificial intelligence developments without manually browsing tech news sites.

## Who's it for

- AI product teams tracking industry developments and competitive moves
- Tech investors monitoring AI startup coverage and funding announcements
- Marketing teams following AI trends for content and positioning strategies
- Executives needing daily AI industry briefings without manual research overhead
- Development teams staying current on AI tools, frameworks, and breakthrough technologies

## How it works

The workflow runs on a daily schedule, crawling a specified number of TechCrunch articles from the current year. Firecrawl extracts clean markdown content while bypassing anti-bot measures and handling JavaScript rendering automatically.

Each article gets analyzed by an AI research assistant that determines if the content relates to artificial intelligence, machine learning, AI companies, or AI technology. Articles marked as "NOT_AI_RELATED" get filtered out automatically (see the sketch at the end of this listing for one way to implement this filter).

For AI-relevant articles, OpenAI generates focused 3-bullet-point summaries that capture key insights. These summaries get delivered to your specified Slack channel with the original TechCrunch article title and source link for deeper reading.

## How to set up

1. Configure Firecrawl: add your Firecrawl API key to the HTTP Request node
2. Set OpenAI credentials: add your OpenAI API key to the AI Agent node
3. Connect Slack: configure your Slack webhook URL and target channel
4. Adjust scheduling: set your preferred trigger frequency (daily recommended)
5. Test the workflow: run manually to verify article extraction and Slack delivery

## Requirements

- **Firecrawl account** with API access for TechCrunch web scraping
- **OpenAI API key** for AI content analysis and summarization
- **Slack workspace** with webhook permissions for message delivery
- **n8n instance** (cloud or self-hosted) for workflow execution

## How to customize the workflow

- **Source expansion:** Modify the HTTP node URL to target additional tech publications beyond TechCrunch, or adjust the article limit and date filtering for different coverage needs.
- **AI focus refinement:** Update the OpenAI prompt to focus on specific AI verticals like generative AI, robotics, or ML infrastructure. Add company names or technology terms to the relevance filtering logic.
- **Summary formats:** Change from 3-bullet summaries to executive briefs, technical analyses, or competitive intelligence reports by modifying the OpenAI summarization prompt.
- **Multi-channel delivery:** Extend beyond Slack to email notifications, Microsoft Teams, or database storage for historical trend analysis and executive dashboards.
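As a rough illustration of the relevance filter described above (not the template's exact logic), an n8n Code node placed after the AI classification step could drop irrelevant articles and shape the Slack message like this. The field names (classification, summary, title, url) are assumptions about the upstream output.

```javascript
// Hypothetical filter + Slack formatter after the AI classification step.
// Field names below are assumptions about the upstream AI output; adapt them.
const out = [];
for (const item of $input.all()) {
  const a = item.json;
  const verdict = String(a.classification || '').trim().toUpperCase();

  if (verdict === 'NOT_AI_RELATED') continue; // drop non-AI articles

  // Build a simple Slack message: title, 3-bullet summary, source link.
  const text = [
    `*${a.title}*`,
    a.summary,            // expected to already be 3 bullet points
    `<${a.url}|Read on TechCrunch>`,
  ].join('\n');

  out.push({ json: { text } });
}
return out;
```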