by Michael Taleb
How it works
Watches a Google Drive folder for new (scanned) invoices. Each new file automatically triggers the workflow. Downloads and processes each invoice through OCR.Space to extract the text. Extracts the company name (e.g. from the “billed to” field) and uses an AI agent to cross-reference it against a database in Google Sheets. If a match is found, retrieves the correct recipient email and sends the invoice as an attachment. If no match is found or an error occurs, the workflow alerts an operator by email for manual review.

Setting up the workflow

Connect Google Drive
• In n8n, connect your Google Drive account.
• Create or select a folder where you will upload scanned invoices.

Connect Gmail (or another email service)
• Add your Gmail account as a credential in n8n.
• This will be used to send the processed invoice to the correct recipient.

Set up OCR.Space
• Create a free OCR.Space account: https://ocr.space
• In n8n, create a Generic Credential (Header Auth).
• Use apikey as the name and your OCR API key as the value.
• (A sketch of the OCR call appears at the end of this description.)

Connect the AI Agent
• Add your OpenAI API key as a credential in n8n.
• The AI Agent will extract the company name from the invoice text and match it against your database.
• If a match is found, it retrieves the correct email.

Prepare the Google Sheet database
• Make a copy of the database sheet: Google Sheet Template
• Fill it with company names and recipient emails.
• Connect your Google account to n8n and link this sheet to the workflow.

Run the workflow
When a new invoice is uploaded to your Google Drive folder, the workflow will:
• Extract the text with OCR.Space
• Use the AI Agent to identify the company name
• Cross-reference it with your Google Sheet database
• Send the invoice automatically to the correct recipient via Gmail
• If no match is found, an error email is sent to you for manual review
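For reference, here is a minimal sketch of what the OCR step could look like in an n8n Code node (newer n8n versions expose `fetch`; otherwise configure the same request in an HTTP Request node with the Header Auth credential described above). The environment variable name, the Google Drive field, and the `isTable` option are assumptions; verify the endpoint and response shape against the current OCR.Space docs.

```javascript
// Hypothetical sketch: extract text from an invoice reachable by URL via OCR.Space.
const apiKey = $env.OCR_SPACE_API_KEY;          // assumption: key stored as an env var
const fileUrl = $json.webContentLink;           // assumption: link from the Google Drive node

const response = await fetch("https://api.ocr.space/parse/image", {
  method: "POST",
  headers: { apikey: apiKey },                  // same header name as the Header Auth credential
  body: new URLSearchParams({ url: fileUrl, language: "eng" }),
});
const result = await response.json();

// OCR.Space returns one parsed result per page; join them into one text blob.
const text = (result.ParsedResults || []).map(p => p.ParsedText).join("\n");
return [{ json: { text } }];
```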
by Trung Tran
Automated SSL/TLS Certificate Expiry Report for AWS

> Automatically generates a weekly report of all AWS ACM certificates, including status, expiry dates, and renewal eligibility. The workflow formats the data into both Markdown (for PDF export to Slack) and HTML (for an email summary), helping teams stay on top of certificate compliance and expiration risks.

Who's it for
This workflow is designed for DevOps engineers, cloud administrators, and compliance teams who manage AWS infrastructure and need automated weekly visibility into the status of their SSL/TLS certificates in AWS Certificate Manager (ACM). It's ideal for teams that want to reduce the risk of expired certs, track renewal eligibility, and maintain reporting for audit or operational purposes.

How it works / What it does
This n8n workflow performs the following actions on a weekly schedule:
1. Trigger: Automatically runs once a week using the Weekly schedule trigger.
2. Fetch Certificates: Uses the Get many certificates action from AWS Certificate Manager to retrieve all certificate records.
3. Parse Data: Processes and reformats certificate data (dates, booleans, SANs, etc.) into a clean JSON object.
4. Generate Reports:
   - 📄 Markdown Report: Uses the Certificate Summary Markdown Agent (OpenAI) to generate a Markdown report for PDF export.
   - 🌐 HTML Report: Uses the Certificate Summary HTML Agent to generate a styled HTML report for email.
5. Deliver Reports: Converts the Markdown to PDF and sends it to Slack as a file, and sends the HTML content as a formatted email.

How to set up
1. Configure AWS credentials in n8n to allow access to AWS ACM.
2. Create a new workflow and use the following nodes in sequence:
   - Schedule Trigger: Weekly (e.g., every Monday at 08:00 UTC)
   - AWS ACM → Get many certificates
   - Function Node → Parse ACM Data: converts and summarizes certificate metadata (see the sketch at the end of this description)
   - OpenAI Chat Node (Markdown Agent) with a system/user prompt to generate Markdown
   - Configure Metadata → define the file name and MIME type (.md)
   - Create document file → converts the Markdown to a document stream
   - Convert to PDF
   - Slack Node → upload the PDF to a channel
   - (Optional) Add a second OpenAI Chat Node for generating HTML and sending it via email
3. Connect the outputs:
   - Markdown report → Slack file upload
   - HTML report → Email node with embedded HTML

Requirements
🟩 n8n instance (self-hosted or cloud)
🟦 AWS account with access to ACM
🟨 OpenAI API key (for the ChatGPT agent)
🟥 Slack webhook or OAuth credentials (for file upload)
📧 Email integration (e.g., SMTP or SendGrid)
📝 Permissions to write documents (Google Drive / file node)

How to customize the workflow
**Change report frequency:** Adjust the Weekly schedule trigger to daily or monthly as needed.
**Filter certificates:** Modify the function node to only include EXPIRED, IN_USE, or INELIGIBLE certs. Add tags or domains to include/exclude.
**Add visuals:** Enhance the HTML version with colored rows, icons, or company branding.
**Change delivery channels:** Replace Slack with Microsoft Teams, Discord, or Telegram. Send the Markdown as an email attachment instead of a PDF.
**Integrate ticketing:** Create a JIRA/GitHub issue for each certificate that is EXPIRED or INELIGIBLE.
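As a reference for the Parse ACM Data step, here is a minimal Code/Function node sketch. The field names (CertificateArn, DomainName, Status, NotAfter, RenewalEligibility, InUseBy, SubjectAlternativeNames) follow ACM's DescribeCertificate output and may differ slightly from what the n8n ACM node returns, so adjust them to your actual data.

```javascript
// Minimal sketch: flatten ACM certificate records into report-friendly JSON.
const now = Date.now();

return $input.all().map(item => {
  const cert = item.json;
  const notAfter = cert.NotAfter ? new Date(cert.NotAfter) : null;
  const daysLeft = notAfter ? Math.floor((notAfter.getTime() - now) / 86400000) : null;

  return {
    json: {
      domain: cert.DomainName,
      arn: cert.CertificateArn,
      status: cert.Status,                                   // e.g. ISSUED, EXPIRED
      expiresAt: notAfter ? notAfter.toISOString().slice(0, 10) : "n/a",
      daysUntilExpiry: daysLeft,
      renewalEligibility: cert.RenewalEligibility,           // ELIGIBLE / INELIGIBLE
      inUse: Array.isArray(cert.InUseBy) && cert.InUseBy.length > 0,
      sans: (cert.SubjectAlternativeNames || []).join(", "),
    },
  };
});
```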
by Raymond Camden
How It Works
This n8n template demonstrates using Foxit's Extraction API to get information from an incoming document and then using Diffbot's APIs to turn the text into a list of organizations mentioned in the document and create a summary.

How it works
1. Listen for a new file added to a Google Drive folder.
2. When executed, the file bits are downloaded.
3. Upload the bits to Foxit.
4. Call the Extract API to get the text contents of the document.
5. Poll the API to see if it's done, and when it is, grab the text.
6. Send the text to the Diffbot API to get a list of entities mentioned in the doc as well as the summary.
7. Use a code step to filter the entities returned from Diffbot to those that are organizations, keeping only ones with a high confidence score (see the sketch below).
8. Use another code step to build an HTML string from the previous data.
9. Email it using the Gmail node.

Requirements
- A Google account for Google Drive and Gmail
- Foxit developer account (https://developer-api.foxit.com)
- Diffbot developer account (https://app.diffbot.com/get-started)

Next Steps
This workflow assumes PDF input, but Foxit has APIs to convert Office docs to PDF, and that flow could be added before the Extract API is called. Diffbot returns an incredible set of information, and more of it could be used in the email. Instead of emailing, you could sort documents into new folders by organization.
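A minimal sketch of the organization-filtering code step follows. The field names (entities, allTypes, confidence) reflect Diffbot's Natural Language API response as commonly documented; verify them against the actual output of your Diffbot call, and tune the threshold to taste.

```javascript
// Sketch: keep only high-confidence organization entities from the Diffbot response.
const MIN_CONFIDENCE = 0.9;               // assumption: adjust as needed
const entities = $json.entities || [];

const organizations = entities.filter(e => {
  const types = (e.allTypes || []).map(t => (t.name || "").toLowerCase());
  return types.includes("organization") && (e.confidence || 0) >= MIN_CONFIDENCE;
});

return organizations.map(e => ({
  json: { name: e.name, confidence: e.confidence },
}));
```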
by Cheng Siong Chin
How It Works
MQTT ingests real-time sensor data from connected devices. The workflow normalizes the values and trains or retrains machine learning models on a defined schedule. An AI agent detects anomalies, validates the results for accuracy, and ensures reliable alerts. Detected issues are then routed to dashboards for visualization and sent via email notifications to relevant stakeholders, enabling timely monitoring and response. (A sketch of a possible normalization step follows this description.)

Setup Steps
- MQTT: Configure the broker connection, set topic subscriptions, and verify data flow.
- ML Model: Define the retraining schedule and specify historical data sources for model updates.
- AI Agent: Connect Claude or OpenAI APIs and configure anomaly validation prompts.
- Alerts: Set the dashboard URL and email recipients to receive real-time notifications.

Prerequisites
MQTT broker credentials; historical training data; OpenAI/Claude API key; dashboard access; email service

Use Cases
IoT sensor monitoring; server performance tracking; network traffic anomalies; application log analysis; predictive maintenance alerts

Customization
Adjust sensitivity thresholds; swap ML models; modify notification channels; add Slack/Teams integration; customize validation rules

Benefits
Reduces detection latency by 95%; eliminates manual monitoring; prevents false alerts; enables rapid incident response; improves system reliability
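The template does not spell out how normalization is done, so here is one hypothetical sketch of that step as an n8n Code node: converting raw MQTT readings to z-scores against the current batch so the downstream anomaly detector works on comparable values. All field names are illustrative.

```javascript
// Hypothetical normalization step: z-score raw sensor values in the incoming batch.
const readings = $input.all().map(item => Number(item.json.value));

const mean = readings.reduce((a, b) => a + b, 0) / readings.length;
const variance = readings.reduce((a, b) => a + (b - mean) ** 2, 0) / readings.length;
const stdDev = Math.sqrt(variance) || 1;   // avoid division by zero on flat signals

return $input.all().map(item => ({
  json: {
    ...item.json,
    zScore: (Number(item.json.value) - mean) / stdDev,
  },
}));
```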
by Jitesh Dugar
Verified Gym Trial Pass with Photo ID

Overview
Automate gym trial pass generation with email verification, photo ID integration, QR codes, and professional PDF passes. This workflow handles the complete member onboarding process - from signup to verified pass delivery - in under 10 seconds.

What This Workflow Does
- Receives signup data via webhook (name, email, photo URL, validity dates)
- Verifies email authenticity using the VerifiEmail API (blocks disposable emails)
- Generates a unique Pass ID in the format GYM-{timestamp} (see the sketch at the end of this description)
- Creates a QR code for quick check-in at the gym entrance
- Builds a branded pass design with gradient styling, member photo, and validity dates
- Exports to PDF format for mobile-friendly viewing
- Sends an email with the PDF attachment and a welcome message
- Logs all registrations in Google Sheets for record-keeping
- Returns an API response with complete pass details

Key Features
✅ Email Verification - Blocks fake and disposable email addresses
✅ Photo ID Integration - Displays the member photo on the digital pass
✅ QR Code Generation - Instant check-in scanning capability
✅ Professional Design - Gradient purple design with modern styling
✅ PDF Export - Mobile-friendly format that members can save
✅ Automated Emails - Welcome message with pass attachment
✅ Spreadsheet Logging - Automatic record-keeping in Google Sheets
✅ Error Handling - Proper 400 responses for invalid signups
✅ Success Responses - Detailed JSON with all pass information

Use Cases
- **Gyms & Fitness Centers** - Trial pass management for new members
- **Yoga Studios** - Week-long trial class passes
- **Sports Clubs** - Guest pass generation with photo verification
- **Wellness Centers** - Temporary access cards for trial periods
- **Co-working Spaces** - Day pass generation with member photos
- **Swimming Pools** - Verified trial memberships with photo IDs

What You Need
Required Credentials
- VerifiEmail API - Email verification service. Get an API key: https://verifi.email
- HTMLCSSToImage API - PNG image generation. Get credentials: https://htmlcsstoimg.com
- HTMLCSSToPDF API - PDF conversion. Get credentials: https://pdfmunk.com
- Gmail OAuth2 - Email delivery. Connect your Google account and enable the Gmail API in Google Cloud Console.
- Google Sheets API - Data logging. Connect your Google account (same OAuth2 as Gmail).

Setup Instructions

Step 1: Create Google Sheet
- Create a new Google Sheet named "Gym Trial Passes 2025"
- Add these column headers in Row 1: Pass ID, Name, Email, Start Date, Valid Till, Issued At, Email Verified, Status

Step 2: Configure Credentials
- Add VerifiEmail API credentials
- Add HTMLCSSToImage credentials
- Add HTMLCSSToPDF credentials
- Connect Gmail OAuth2
- Connect Google Sheets OAuth2

Step 3: Update Google Sheets Node
- Open the "Log to Google Sheets" node
- Select your "Gym Trial Passes 2025" sheet
- Confirm the column mappings match your headers

Step 4: Test the Workflow
- Copy the webhook URL from the Webhook node
- Open Postman and create a POST request
- Use this test payload:

```json
{
  "name": "Rahul Sharma",
  "email": "your-email@gmail.com",
  "photo_url": "https://images.unsplash.com/photo-1633332755192-727a05c4013d?w=400",
  "start_date": "2025-11-15",
  "valid_till": "2025-11-22"
}
```

- Send the request and check:
  ✅ Email received with the PDF pass
  ✅ Google Sheet updated with a new row
  ✅ Success JSON response returned

Step 5: Activate & Use
- Click the "Active" toggle to enable the workflow
- Integrate the webhook URL with your gym's website form
- Members receive instant verified passes upon signup

Expected Responses

✅ Success Response (200 OK)

```json
{
  "status": "success",
  "message": "Gym trial pass verified and sent successfully! 🎉",
  "data": {
    "pass_id": "GYM-1731398400123",
    "email": "member@example.com",
    "name": "Rahul Sharma",
    "valid_from": "November 15, 2025",
    "valid_till": "November 22, 2025",
    "email_verified": true,
    "recorded_in_sheets": true,
    "pass_sent_to_email": true
  },
  "timestamp": "2025-11-12T10:30:45.123Z"
}
```

❌ Error Response (400 Bad Request)

```json
{
  "status": "error",
  "message": "Invalid or disposable email address. Please use a valid email to register.",
  "email_verified": false,
  "email_provided": "test@tempmail.com"
}
```

Customization Options

Modify Pass Design
Edit the Build HTML Pass node to customize: colors and gradient (currently a purple gradient), layout and spacing, fonts and typography, logo placement (add your gym logo), and additional branding elements.

Change Email Template
Edit the Send Email with Pass node to modify: subject line, welcome message, instructions, branding elements, and footer content.

Adjust Validity Period
The workflow accepts custom start_date and valid_till values from the webhook payload. You can also hardcode validity periods in the Generate Pass Details node.

Add Additional Fields
Extend the workflow to capture: phone number, emergency contact, medical conditions, membership preferences, referral source.

Performance
- **Average execution time:** 8-12 seconds
- **Handles:** 100+ passes per hour
- **PDF size:** ~150-250 KB
- **Email delivery:** Instant (Gmail API)
- **Success rate:** 99%+ with valid emails

Security & Privacy
✅ Email verification prevents fake signups
✅ Unique Pass IDs prevent duplication
✅ All data logged in your private Google Sheet
✅ No data stored in n8n (passes through only)
✅ HTTPS webhook for secure data transmission
✅ OAuth2 authentication for Google services

Tags
gym, fitness, trial-pass, email-verification, qr-code, pdf-generation, member-onboarding, automation, verification, photo-id
by Alex
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n template automatically parses bank transaction emails (HDFC, Indian Bank, Indian Overseas Bank, and UPI apps like Google Pay, Paytm, etc.) - the sender email (bank name/UPI app) is configurable - classifies them using Gemini AI, and logs them into a structured Google Sheets budget tracker. It helps you consolidate expenses, compare against monthly budgets, and get real-time alerts when limits are exceeded.

📝 Problem Statement
Tracking expenses manually from different bank emails and UPI apps is frustrating, time-consuming, and error-prone. Small transactions often slip through, making budget control difficult. This workflow solves that by:
- Automatically extracting financial data from Gmail.
- Categorizing expenses using AI parsing.
- Saving all data into Google Sheets in a structured way.
- Comparing with monthly budgets and raising alerts.

Target Audience:
- Individuals who want personal budget automation.
- Families managing shared household spending.
- Small teams looking for a lightweight financial log.

⚙️ Setup Prerequisites
- An n8n instance (self-hosted or cloud).
- A Google account with Gmail + Google Sheets enabled.
- A pre-created Google Sheets file with 2 tabs: Expenses and Budgets.
- A configured Gemini API connection in n8n.

📊 Google Sheets Template
Expenses Tab (columns in order):
Timestamp | Date | Account | From | To | Type | Category | Description | Amount | Currency | Source | MessageId | Status

Budget Tab (columns in order):
Month | Category | Budget Amount | Notes | UpdatedAt

Yearly Summary Tab (auto-calculated):
Year | Month | Category | Total Expense | Budget | Variance | Alert
Variance = Budget - Total Expense
Alert = ⚠️ Over Budget when spending > budget

🚀 How It Works

Gmail:
1. The Gmail Trigger captures new bank/UPI emails.
2. The Gemini AI Parser extracts structured details (date, amount, category, etc.).
3. A Filter node ensures only valid financial transactions are logged.
4. The Information Extractor pulls fields such as date, account, transaction type (Credit/Debit), description, currency, status, messageId, from email, to email, and category, then checks whether the transaction is Credit or Debit and appends the details to the respective Google Sheet.
5. The Budget Validator checks against the monthly allocations. If the expense is above the budget, it raises an alert and sends an email to the connected account. For sending the email I wrote a Google Sheets Apps Script:

```javascript
// Runs as a time-driven (monthly) Apps Script trigger on the spreadsheet.
function monthlyBudgetCheck() {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var monthly = ss.getSheetByName("MonthlySummary");
  var yearly = ss.getSheetByName("YearlySummary");

  // Get values from Monthly Summary
  var totalExpense = monthly.getRange("D2").getValue();
  var budget = monthly.getRange("E2").getValue();

  // Get current date info
  var now = new Date();
  var month = Utilities.formatDate(now, "GMT+5:30", "MM");
  var year = Utilities.formatDate(now, "GMT+5:30", "yyyy");
  var status = (totalExpense > budget) ? "Alert" : "";

  // Append to Yearly Summary
  yearly.appendRow([year, month, totalExpense, status]);

  // If budget exceeded, send alert email
  if (status === "Alert") {
    var emailAddress = "YOUR EMAIL";
    var subject = "⚠️ Budget Exceeded - " + month + "/" + year;
    var body = "Your total expenses this month (" + totalExpense + ") have exceeded your budget of " + budget + ".\n\n" +
      "Please review your spending.";
    MailApp.sendEmail(emailAddress, subject, body);
  }

  // 🔄 Reset Monthly Summary
  var lastRow = monthly.getLastRow();
  if (lastRow > 3) { // assuming headers in first 2-3 rows
    monthly.getRange("A4:C" + lastRow).clearContent();
  }

  // Reset total in D2
  monthly.getRange("D2").setValue(0);
}
```

6. The Monthly Summary auto-calculates the expenses and updates the total for every month, along with the budgets (summing all budgets if there is more than one).
7. The Yearly Summary auto-updates and raises over-budget alerts.

Telegram:
1. The Telegram Trigger takes input from a Telegram bot connected to the n8n workflow.
2. The Gemini AI Parser extracts structured details (date, amount, category, etc.).
3. The workflow then checks whether the manually entered details describe a 'budget' or an 'expense', splits the data, parses it, checks again whether it is a Budget or an Expense, and appends the structured data to the respective Google Sheet.
4. The Monthly Summary auto-calculates the expenses and updates the total for every month, along with the budgets (summing all budgets if there is more than one).
5. The Yearly Summary auto-updates and raises over-budget alerts.

🔧 Customization
Add support for more banks/UPI apps by extending the parser schema:

```javascript
// n8n Code node: detect which bank or UPI app an email came from.
const senderEmail = $input.first().json.From || "";
// Email body/snippet from the Gmail item (field name depends on your Gmail node output).
const emailBody = ($input.first().json.snippet || $input.first().json.text || "").toString();

// Account detection
let account = "";
// you can modify the bank names and UPI names here
if (/alerts@hdfcbank\.net/i.test(senderEmail)) account = "HDFC Bank";
else if (/ealerts@iobnet\.co\.in/i.test(senderEmail)) account = "Indian Overseas Bank";
else if (/alerts@indianbank\.in/i.test(senderEmail)) account = "Indian Bank";
else if (/@upi|@okhdfcbank|@okaxis|@okicici/i.test(emailBody)) {
  if (/gpay|google pay/i.test(emailBody)) account = "Google Pay";
  else if (/phonepe/i.test(emailBody)) account = "PhonePe";
  else if (/paytm/i.test(emailBody)) account = "Paytm";
  else account = "UPI";
} else {
  account = "Other";
}

// If account is "Other", skip output
if (account === "Other") {
  return [];
}

// Output
return [{
  account,
  from: senderEmail, // exact Gmail "From" metadata
  snippet: emailBody,
  messageId: $input.first().json.id || ""
}];
```

- Create custom categories (e.g., Travel, Groceries, Subscriptions).
- Send real-time alerts via Telegram/Slack/Email using n8n nodes.
- Share the Google Sheet with family or team for collaborative use.

📌 Usage
- The workflow runs automatically on every new Gmail transaction email and on every financial input sent to the Telegram bot.
- At the end of each month, totals are calculated in the Yearly Summary tab.
- Users only need to maintain the Budget tab with updated monthly allocations.
by Gracewell
Who Is This For?
This workflow is designed for educators, universities, examination departments, and EdTech institutions that need a faster, smarter, and standardized way to prepare exam question papers.

What Problem Does This Solve?
Creating balanced, outcome-based question papers can take hours or even days of manual effort. Faculty often struggle to:
- Ensure syllabus coverage across units
- Maintain Bloom's Taxonomy alignment
- Keep a consistent difficulty balance
- Format papers in institution-specific templates

How it works
This workflow automatically generates an exam question paper based on syllabus topics submitted via a form and sends it to the entered email address. Here's the flow in simple steps:
1. Form Submission - A student or faculty member fills out a form with the subject code, syllabus topics, and their email.
2. AI Question Generation - The workflow passes the syllabus to AI agents (Part A with 2-mark questions, Part B with 13-mark questions, and Part C with 14-mark questions) to create question sets. The marks and the number of questions generated can be customized as needed.
3. Merging Questions - All AI-generated questions are combined into a single structured document.
4. Format into HTML - The questions are formatted into a clean HTML exam paper (this can also be extended to PDF); see the sketch at the end of this description.
5. Send by Email - The formatted exam paper is sent to the user's email (with an option to CC/BCC).

Set up steps
1. Connect Accounts
   - Connect your OpenAI (or LLM) credentials for AI-powered question generation.
   - Connect your Gmail (or preferred email service) to send emails.
2. Prepare Form
   - Create an n8n form trigger with the required fields: Subject with Code; Syllabus for Unit 1, 2, 3…; Email to receive the paper.
3. Customize Question Generation
   - Modify the AI prompts for Parts A, B, and C to fit your syllabus style (e.g., 2-mark, 13-mark, 14-mark).
4. Format the Exam Paper
   - Adjust the HTML template to match your institution's exam paper layout.
5. Test & Deploy
   - Submit a test form entry.
   - Check the received email to ensure the formatting looks good.
   - Deploy the workflow to production for real usage.

Need help customizing?
✉️ Contact Me
💼 LinkedIn
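Here is a minimal, hypothetical sketch of the merging and HTML-formatting steps as a single n8n Code node. The input field names (`part`, `questions`, `subject`) are illustrative and should be mapped to whatever your AI agent nodes actually output.

```javascript
// Hypothetical merge + format step: combine Part A/B/C question sets into one HTML paper.
const parts = { A: [], B: [], C: [] };

for (const item of $input.all()) {
  const part = item.json.part;                     // "A", "B" or "C"
  if (parts[part]) parts[part].push(...(item.json.questions || []));
}

const section = (title, marks, questions) =>
  `<h2>${title} (${marks} marks each)</h2><ol>` +
  questions.map(q => `<li>${q}</li>`).join("") + "</ol>";

const html =
  `<h1>${$json.subject || "Exam Question Paper"}</h1>` +
  section("Part A", 2, parts.A) +
  section("Part B", 13, parts.B) +
  section("Part C", 14, parts.C);

return [{ json: { html } }];
```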
by Incrementors
Wikipedia to LinkedIn AI Content Poster with Image via Bright Data

📋 Overview
Workflow Description: Automatically scrapes Wikipedia articles, generates AI-powered LinkedIn summaries with custom images, and posts professional content to LinkedIn using Bright Data extraction and intelligent content optimization.

🚀 How It Works
The workflow follows these simple steps:
1. Article Input: The user submits a Wikipedia article name through a simple form interface.
2. Data Extraction: Bright Data scrapes the Wikipedia article content, including the title and full text.
3. AI Summarization: Advanced AI models (OpenAI GPT-4 or Claude) create professional LinkedIn-optimized summaries under 2000 characters.
4. Image Generation: Ideogram AI creates relevant visual content based on the article summary.
5. LinkedIn Publishing: Automatically posts the summary with the generated image to your LinkedIn profile.
6. URL Generation: Provides a shareable LinkedIn post URL for easy access and sharing (see the sketch at the end of this description).

⚡ Setup Requirements
Estimated Setup Time: 10-15 minutes

Prerequisites
- n8n instance (self-hosted or cloud)
- Bright Data account with Wikipedia dataset access
- OpenAI API account (for GPT-4 access)
- Anthropic API account (for Claude access - optional)
- Ideogram AI account (for image generation)
- LinkedIn account with API access

🔧 Configuration Steps

Step 1: Import Workflow
1. Copy the provided JSON workflow file.
2. In n8n: Navigate to Workflows → + Add workflow → Import from JSON.
3. Paste the JSON content and click Import.
4. Save the workflow with a descriptive name.

Step 2: Configure API Credentials

🌐 Bright Data Setup
- Go to Credentials → + Add credential → Bright Data API.
- Enter your Bright Data API token.
- Replace BRIGHT_DATA_API_KEY in all HTTP request nodes.
- Test the connection to ensure access.

🤖 OpenAI Setup
- Configure OpenAI credentials in n8n.
- Ensure GPT-4 model access.
- Link the credentials to the "OpenAI Chat Model" node.
- Test API connectivity.

🎨 Ideogram AI Setup
- Obtain an Ideogram AI API key.
- Replace IDEOGRAM_API_KEY in the "Image Generate" node.
- Configure the image generation parameters.
- Test image generation functionality.

💼 LinkedIn Setup
- Set up LinkedIn OAuth2 credentials in n8n.
- Replace LINKEDIN_PROFILE_ID with your profile ID.
- Configure posting permissions.
- Test posting functionality.

Step 3: Configure Workflow Parameters
Update node settings:
- **Form Trigger:** Customize the form title and field labels as needed.
- **AI Agent:** Adjust the system message for different content styles.
- **Image Generate:** Modify image resolution and rendering speed settings.
- **LinkedIn Post:** Configure additional fields like hashtags or mentions.

Step 4: Test the Workflow
Testing recommendations:
- Start with a simple Wikipedia article (e.g., "Artificial Intelligence").
- Monitor each node execution for errors.
- Verify the generated summary quality.
- Check image generation and LinkedIn posting.
- Confirm the final LinkedIn URL generation.

🎯 Usage Instructions
Running the workflow:
1. Access the Form: Use the generated webhook URL to access the submission form.
2. Enter Article Name: Type the exact Wikipedia article title you want to process.
3. Submit Request: Click submit to start the automated process.
4. Monitor Progress: Check the n8n execution log for real-time progress.
5. View Results: The workflow will return a LinkedIn post URL upon completion.

Expected Output
- 📝 Content Summary: professional LinkedIn-optimized text, under 2000 characters, engaging and informative tone, bullet points for readability.
- 🖼️ Generated Image: high-quality AI-generated visual, 1280x704 resolution, relevant to the article content, professional appearance.
- 🔗 LinkedIn Post: published to your LinkedIn profile, includes both text and image, shareable public URL, professional formatting.

🛠️ Customization Options

Content Personalization
- **AI Prompts:** Modify the system message in the AI Agent node to change the writing style.
- **Character Limits:** Adjust the summary length requirements.
- **Tone Settings:** Change from professional to casual or technical.
- **Hashtag Integration:** Add relevant hashtags to LinkedIn posts.

Visual Customization
- **Image Style:** Modify Ideogram prompts for different visual styles.
- **Resolution:** Change image dimensions based on LinkedIn requirements.
- **Rendering Speed:** Balance between speed and quality.
- **Brand Elements:** Include company logos or brand colors.

🔍 Troubleshooting
Common issues & solutions:

⚠️ Bright Data Connection Issues
- Verify the API key is correctly configured.
- Check dataset access permissions.
- Ensure sufficient API credits.
- Validate that the Wikipedia article exists.

🤖 AI Processing Errors
- Check OpenAI API quotas and limits.
- Verify model access permissions.
- Review the input text length and format.
- Test with simpler article content.

🖼️ Image Generation Failures
- Validate the Ideogram API key.
- Check the image prompt content.
- Verify API usage limits.
- Test with shorter prompts.

💼 LinkedIn Posting Issues
- Re-authenticate LinkedIn OAuth.
- Check posting permissions.
- Verify the profile ID configuration.
- Test with shorter content.

⚡ Performance & Limitations
Expected processing times:
- **Wikipedia Scraping:** 30-60 seconds
- **AI Summarization:** 15-30 seconds
- **Image Generation:** 45-90 seconds
- **LinkedIn Posting:** 10-15 seconds
- **Total Workflow:** 2-4 minutes per article

Usage recommendations (best practices):
- Use well-known Wikipedia articles for better results.
- Monitor API usage across all services.
- Test content quality before bulk processing.
- Respect LinkedIn posting frequency limits.
- Keep a backup of successful configurations.

📊 Use Cases
- 📚 Educational Content: Create engaging educational posts from Wikipedia articles on science, history, or technology topics.
- 🏢 Thought Leadership: Transform complex topics into accessible LinkedIn content to establish industry expertise.
- 📰 Content Marketing: Generate regular, informative posts to maintain an active LinkedIn presence with minimal effort.
- 🔬 Research Sharing: Quickly summarize and share research findings or scientific discoveries with your network.

🎉 Conclusion
This workflow provides a powerful, automated solution for creating professional LinkedIn content from Wikipedia articles. By combining web scraping, AI summarization, image generation, and social media posting, you can maintain an active and engaging LinkedIn presence with minimal manual effort. The workflow is designed to be flexible and customizable, allowing you to adapt the content style, visual elements, and posting frequency to match your professional brand and audience preferences.

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
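As a reference for the URL Generation step, here is a hypothetical sketch of turning the URN returned by the LinkedIn node into a shareable post link. The exact field holding the URN (`id` vs `urn`) depends on the LinkedIn node's response, and the feed/update URL pattern should be verified against a real post.

```javascript
// Hypothetical "URL Generation" step: build a shareable LinkedIn post URL from the post URN.
const postUrn = $json.urn || $json.id;   // e.g. "urn:li:share:7123456789012345678"

if (!postUrn) {
  throw new Error("No post URN found in the LinkedIn node output");
}

// Public posts are typically reachable under the feed/update path with the full URN.
const postUrl = `https://www.linkedin.com/feed/update/${postUrn}/`;

return [{ json: { postUrl } }];
```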
by Rahul Joshi
📊 Description
Automatically analyze your Instagram posts' engagement and audience sentiment using GPT-4 to uncover top-performing content and improvement opportunities. 💬📈
This workflow fetches your latest Instagram posts using the Facebook Graph API, evaluates likes, comments, and tone, then generates structured performance insights. The results are logged into Google Sheets, shared via Slack alerts, and emailed through Outlook - empowering your social media team with daily, AI-powered engagement intelligence. 🚀

What This Template Does
1️⃣ Trigger - Runs daily at 10 AM to fetch the latest Instagram posts. ⏰
2️⃣ Data Fetching - Uses the Facebook Graph API to extract post data, captions, likes, and comments (see the request sketch at the end of this description). 📲
3️⃣ Formatting - Cleans and structures post and comment data for analysis. 🧩
4️⃣ AI Evaluation - GPT-4 analyzes engagement metrics and comment sentiment to score post performance. 🤖
5️⃣ Decision Routing - Flags high- and low-performing posts for separate processing. ⚙️
6️⃣ Notifications - Sends positive performance summaries or negative alerts to Slack. 💬
7️⃣ Logging - Records engagement metrics, sentiment labels, and AI recommendations in Google Sheets. 📊
8️⃣ Reporting - Emails detailed performance summaries to the marketing team via Outlook. 💌

Key Benefits
✅ Automates social performance tracking across Instagram posts
✅ Provides AI-driven sentiment and engagement insights
✅ Flags top or underperforming content for immediate action
✅ Delivers Slack and email reports for team visibility
✅ Centralizes analytics in Google Sheets for trend tracking

Features
- Facebook Graph API integration for Instagram post and comment retrieval
- GPT-4 sentiment and engagement evaluation
- Structured JSON insights and recommendations
- Slack alerts for both positive and negative performance
- Google Sheets data logging with trend metrics
- Outlook email reporting for management visibility
- Customizable scheduling and thresholds

Requirements
- Facebook Graph API credentials connected to your Instagram Business account
- OpenAI API key for GPT-4 or GPT-4o-mini
- Slack Bot token with chat:write permissions
- Google Sheets OAuth2 credentials with edit rights
- Microsoft Outlook OAuth2 credentials for email delivery
- Optional environment variables for IDs: SHEET_ID, SLACK_CHANNEL_ID, OUTLOOK_EMAIL

Target Audience
- Marketing and social media teams optimizing engagement 📣
- Agencies managing multiple Instagram accounts 🧑‍💼
- Analysts tracking performance metrics and audience tone 📊
- Brands automating daily performance reports 📅

Step-by-Step Setup Instructions
1️⃣ Connect your Facebook Graph API to your Instagram Business Account.
2️⃣ Add OpenAI API credentials (use the GPT-4 model for best results).
3️⃣ Configure Slack for team notifications and specify the channel ID.
4️⃣ Link Google Sheets and set your sheet ID for data logging.
5️⃣ Connect Microsoft Outlook for daily performance email reports.
6️⃣ Adjust the schedule (default: 10 AM daily) to suit your workflow.
7️⃣ Run a test once, verify the data mapping, and enable automation. ✅
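For reference, here is a hypothetical sketch of the data-fetching step, written as an n8n Code node (or as a guide for configuring an HTTP Request node). The Graph API version, field list, limit, and environment variable names are assumptions; adjust them to your Instagram Business account setup.

```javascript
// Hypothetical Instagram media fetch via the Facebook Graph API.
const igUserId = $env.IG_USER_ID;            // Instagram Business account ID (assumption)
const token = $env.FB_GRAPH_TOKEN;           // long-lived access token (assumption)

const fields = "id,caption,like_count,comments_count,permalink,timestamp";
const url = `https://graph.facebook.com/v21.0/${igUserId}/media` +
  `?fields=${fields}&limit=10&access_token=${token}`;

const res = await fetch(url);
const data = await res.json();

// One n8n item per post, ready for the formatting and GPT-4 evaluation steps.
return (data.data || []).map(post => ({ json: post }));
```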
by PollupAI
Who's it for
This template is for Customer Success and Sales teams who use HubSpot. It automates the critical handoff from sales to success, ensuring every new customer gets a fast, personalized welcome. It's perfect for anyone looking to standardize their onboarding process, save time on manual tasks, and improve the new customer experience using AI.

What it does
This workflow triggers when a deal's "Is closed won" property is set to True in HubSpot. It assigns a Customer Success Manager (CSM) by querying an n8n Data Table to find the 'least busy' CSM (based on a deal count) and fetches the deal's details to find all associated contacts. It then loops to identify the "Champion" contact by checking their "Buying Role" (hs_buying_role). An AI agent (in the AI: Write Welcome Email node) generates a personalized welcome email, which is converted to HTML and sent via Gmail. Finally, the workflow updates the Champion's contact record in HubSpot and updates the CSM's deal count in the Data Table to keep the logic in sync. (A sketch of the 'least busy' selection logic follows this description.)

How to set up
1. Create and populate the Data Table: This template requires an n8n Data Table to manage CSM assignments.
   - Create a Data Table named csm_assignments.
   - Add two columns: csm_id (String) and deal_count (Number).
   - Add one row for each CSM with their HubSpot Owner ID and a starting deal_count of 0.
2. Link the Data Table nodes: Open the Get CSM List and Increment CSM Deal Count nodes and select the csm_assignments table you just created from the Table dropdown.
3. Configure variables: In the Configure Template Variables node, you must set your sender info (company_name, sender_name, and sender_email).
4. Customize the AI prompt: In the AI: Write Welcome Email node, update the placeholder [Link to Your Video] and [Link to Your Help Doc] links with your own URLs.
5. Check the HubSpot property: This workflow assumes you use the "Buying Role" (hs_buying_role) contact property to identify your "Champion". If you use a different property, you must update the HubSpot: Get Contact Details and If Role is 'Champion' nodes.

Requirements
- Access to n8n Data Tables.
- **HubSpot (Developer API):** a credential for the Trigger: Deal Is 'Closed Won' node.
- **HubSpot (OAuth2):** a credential for all other HubSpot nodes (Get Deal Details, Get Contact Details, Assign Contact Owner).
- **AI Credentials:** (e.g., OpenAI) credentials for the AI Model node (the node connected to AI: Write Welcome Email).
- **Email Credentials:** (e.g., Gmail) credentials for the Gmail: Send Welcome Email node.

How to customize the workflow
You can easily customize this workflow to send different emails based on deal properties. Add an If node after the HubSpot: Get Deal Details node to check for the deal's value, product line, or region. Based on these properties, you can route the flow to different AI: Write Welcome Email nodes with unique prompts. For example, you could check the contact's 'industry' or 'company size' to send them links to different, more relevant 'Getting Started' videos and documentation.
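Here is a minimal sketch of how the 'least busy' CSM selection could be implemented in a Code node placed after Get CSM List. It assumes that node outputs one item per Data Table row with the csm_id and deal_count columns described above; the template itself may implement this differently.

```javascript
// Sketch: pick the csm_assignments row with the lowest deal_count.
const rows = $input.all().map(item => item.json);

if (rows.length === 0) {
  throw new Error("csm_assignments Data Table is empty - add at least one CSM row");
}

const leastBusy = rows.reduce((best, row) =>
  Number(row.deal_count) < Number(best.deal_count) ? row : best
);

// Downstream nodes can use csm_id as the HubSpot owner ID and increment
// deal_count for this row (Increment CSM Deal Count) to keep the rotation in sync.
return [{ json: { csm_id: leastBusy.csm_id, deal_count: Number(leastBusy.deal_count) } }];
```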
by Masaki Go
About This Template
This workflow automatically generates and sends AI-powered responses to user inquiries from a LINE Official Account. It uses RAG (Retrieval-Augmented Generation) technology to produce natural, context-aware answers based on your FAQ database (Supabase/PostgreSQL).

How It Works
1. Receive Questions: An n8n webhook receives messages from your LINE Official Account.
2. FAQ Search: The n8n LangChain Agent analyzes the user's question and performs a vector search on your Supabase FAQ database. It can also fetch user-specific data (e.g., reservation info) from PostgreSQL.
3. AI Generation: The OpenAI GPT model generates a context-aware answer based on the retrieved information and conversation history.
4. Reply: The response is sent back to the user via the LINE Messaging API.
5. Admin Notifications: (Optional) If the AI cannot answer, the workflow can notify admins (e.g., via LINE WORKS or Slack).

Who It's For
- Businesses wanting to automate customer support on LINE.
- Developers building intelligent chatbots with existing FAQ data.
- Organizations aiming for 24/7 customer service.

Requirements
- An n8n account (cloud or self-hosted)
- An OpenAI API key
- A Supabase account (for FAQ data)
- A PostgreSQL database (for conversation history)
- A LINE Official Account & Messaging API access token

Setup Steps
1. Configure Credentials: Register credentials for OpenAI, Supabase, PostgreSQL, and the LINE Messaging API in n8n.
2. Prepare Databases: Create your tables in Supabase (for FAQs) and PostgreSQL (for conversation history).
3. Customize the Prompt: In the "RAG AI Agent" node, edit the system prompt to fit your business and tone.
4. Set Environment Variables: Update URLs, Channel IDs, and API endpoints in the nodes to match your environment.

Customization Options
- **Change AI Model:** Select a different model (e.g., gpt-4o) in the "OpenAI Chat Model" node.
- **Add Data Sources:** Add new "Tool" nodes (like an HTTP Request) to the "RAG AI Agent" to access other APIs (e.g., booking systems).
- **Change Notifications:** Replace the "LINE WORKS" nodes with a Slack or Email node to change the admin notification channel.
by Abdullah Alshiekh
📈 Automated Customer Rewards Platform: Jotform Integration
This blueprint details a highly efficient, AI-powered workflow designed to automate customer reward fulfillment. Leveraging the accessible interface of Jotform, this system delivers superior reliability and exceptional processing speed.

📊 Reliability, Productivity, and Performance
This workflow is engineered to maximize operational efficiency and maintain data integrity:
- Instant Fulfillment: Automation handles receipt scanning (OCR), AI calculation, logging, and notification in seconds, eliminating manual delays.
- Seamless Data Capture: Leverages the user-friendly Jotform interface for fast, reliable customer submission and file uploads.

🛠️ Quick Configuration Guide
1. **Jotform Webhook:** In your Jotform settings, paste the n8n Jotform Trigger URL into the Webhook integration. Done.
2. **API Access:** Generate a "Full Access" Jotform API key and insert it into the required n8n nodes (Jotform Trigger and Fetch All Receipts).
3. **Credential Setup:** Plug in your necessary API keys (Gemini, OCR.Space) and update the Notion Database ID and internal email recipient.

🚀 How It Works (Practical Flow)
1. **Submission:** The customer submits their request via Jotform.
2. **Processing:** The system extracts text from the receipt (OCR), the AI calculates the reward, and the If node verifies the total.
3. **Fulfillment:** The transaction is logged, and confirmation emails are sent to both the customer and the internal team.

If you need any help, Get in Touch.