by Amjid Ali
## Overview

This workflow template automates lead management and customer inquiry processing by integrating ERPNext, AI agents, and email notifications. It streamlines the process of capturing leads, analyzing inquiries, and generating actionable responses. The workflow uses ERPNext to capture inquiries, analyzes them with AI, and notifies the appropriate team or individual, all while maintaining a professional approach.

## What This Template Does

- **ERPNext Webhook Integration:** Captures leads and inquiries through ERPNext webhooks and triggers the workflow when a new lead is created.
- **AI-Powered Inquiry Analysis:** Uses AI to extract key details from lead notes (e.g., customer name, organization, inquiry summary) and classifies inquiries as valid or invalid based on relevance to products, services, or solutions.
- **Contact Assignment:** Matches inquiries to the appropriate contact(s) using a Google Sheets database or ERPNext contact information. Handles multiple contacts if required.
- **Email Notifications:** Generates professional email notifications for valid inquiries and sends them to the appropriate contact(s) with inquiry details and action steps.
- **Invalid Lead Handling:** Identifies invalid inquiries (e.g., unrelated to products or services) and flags them for follow-up or dismissal.
- **Custom Email Formatting:** Converts plain text into professionally formatted HTML emails, ensuring that communication is clear, concise, and visually appealing.

## How It Works

### Step 1: Capture Lead Data

- **Webhook in ERPNext:** Create a webhook in ERPNext for the "Lead" DocType and set the trigger to on_insert to capture new leads in real time (see the payload sketch below).
- **Lead Details:** The workflow fetches lead details, including notes, contact information, and the source of the lead.

### Step 2: Validate and Analyze Inquiry

- **AI Agent for Analysis:** An AI agent analyzes the lead notes to extract key details and classify the inquiry as valid or invalid. The analysis includes checking the relevance of the inquiry to products, services, or solutions offered by the company.
- **Invalid Leads:** If the inquiry is invalid, the workflow flags it and stops further processing.

### Step 3: Assign Contact(s)

- **Google Sheets Integration:** Uses a Google Sheets database to map products, services, or solutions to responsible contacts, ensuring that inquiries are directed to the right person or team.
- **Multiple Contacts:** Handles cases where multiple contacts are responsible for a particular product or service.

### Step 4: Generate and Send Email Notifications

- **AI-Generated Emails:** The workflow generates a professional email summarizing the inquiry, including details like customer name, organization, inquiry summary, and action steps.
- **Custom HTML Formatting:** Emails are converted to HTML for a polished and professional appearance.
- **Send Notifications:** Sends email notifications through Microsoft Outlook or another configured email client. Optionally, notifies via WhatsApp or SMS for urgent inquiries.

### Step 5: Post-Inquiry Actions

- **ERPNext Record Updates:** Updates the lead record in ERPNext with relevant details, including inquiry status and contact information.

## Setup Instructions

### Prerequisites

- **ERPNext:** A configured ERPNext instance with lead data and a webhook for the "Lead" DocType.
- **Google Sheets:** A sheet mapping products, services, or solutions to responsible contacts.
- **AI Integration:** Credentials for OpenAI or other supported AI platforms.
- **Email Client:** Credentials for Microsoft Outlook or another email client.
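To make Steps 1 and 2 concrete, here is a minimal n8n Code node sketch for normalizing the incoming ERPNext webhook payload before it reaches the AI agent. The field names (lead_name, company_name, notes, source) are assumptions; align them with the fields you selected in your ERPNext webhook configuration.

```javascript
// n8n Code node: normalize the incoming ERPNext "Lead" webhook payload.
// Field names below are assumptions — adjust them to match the fields
// you configured in your ERPNext webhook for the "Lead" DocType.
const body = $input.first().json.body || $input.first().json;

return [{
  json: {
    leadName: body.lead_name || '',
    organization: body.company_name || '',
    notes: body.notes || '',
    source: body.source || 'Unknown',
    // The AI analysis in Step 2 classifies the inquiry from these notes.
  },
}];
```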
## Step-by-Step Setup

1. **ERPNext Configuration:** Create a webhook for the "Lead" DocType in ERPNext and test it with sample data to ensure proper integration.
2. **Workflow Import:** Import the workflow template into n8n and configure the nodes with your API credentials for ERPNext, Google Sheets, and AI tools.
3. **Google Sheets Integration:** Prepare a Google Sheet with columns for the product, service, or solution and the responsible contact(s), then link the sheet to the workflow.
4. **AI Agent Configuration:** Customize the AI agent's prompts to align with your business's products and services, and adjust the criteria for valid and invalid inquiries as needed.
5. **Email Setup:** Configure the email client node with your email service credentials and customize the email template for your organization.
6. **Testing:** Run the workflow with sample leads to validate the entire process. Check email notifications, contact assignments, and record updates in ERPNext.

## Dos and Don'ts

**Dos:**

- **Test Thoroughly:** Test the workflow with various scenarios before deploying it in production.
- **Secure Credentials:** Keep API and email credentials secure to avoid unauthorized access.
- **Customize Prompts:** Tailor AI prompts to match your business needs and language style.
- **Use Professional Email Templates:** Ensure emails are clear and well formatted.

**Don'ts:**

- **Skip Validation:** Always validate inquiry data to avoid sending irrelevant notifications.
- **Overload the Workflow:** Avoid adding unnecessary nodes that can slow down processing.
- **Ignore Errors:** Monitor logs and address errors promptly for a smooth workflow.

## Resources

- GET n8n Now
- N8N COURSE
- n8n Book
- **YouTube Tutorial:** Watch the full step-by-step tutorial on setting up this workflow: SyncBricks YouTube Channel
- **Courses and Training:** Learn more about ERPNext and AI automation through my comprehensive courses: SyncBricks LMS
- **Support and Contact:**
  - Email: amjid@amjidali.com
  - Website: SyncBricks
  - LinkedIn: Amjid Ali
by Uche Madu
## 🧩 What This Workflow Does

This workflow automates the process of identifying and enriching decision-maker contacts from a list of companies. By integrating with Apollo's APIs and Google Sheets, it streamlines lead generation, ensures data accuracy through human verification, and maintains an organized leads database.

## 📚 Use Case

Ideal for sales and marketing teams aiming to:

- Automate the discovery of key decision-makers (e.g., CEOs, CTOs).
- Enrich contact information with LinkedIn profiles, emails, and phone numbers.
- Maintain an up-to-date leads database with minimal manual intervention.
- Receive weekly summaries of newly verified leads.

## 🧪 Setup

### 1. Google Sheets Preparation

- Use the following pre-configured Google Sheet: Company Decision Maker Discovery Sheet.
- This spreadsheet includes the necessary tabs and columns: Companies, Contacts, and Contacts (Verified).
- It also contains a custom onEdit Apps Script function that automatically updates the Status column to Pending whenever the Domain field is modified (see the sketch below).
- To review or modify the script, navigate to Extensions > Apps Script within the Google Sheet.

### 2. Credentials Setup

Configure the following credentials in your n8n instance:

- Google Sheets: To read from and write to the spreadsheet.
- Slack: To send verification prompts and weekly reports.
- Apollo: To access the Organization Search, Organization Enrichment, People Search, and Bulk People Enrichment APIs.
- LLM Service (e.g., OpenAI): To generate company summaries and determine departments based on job titles.

### 3. Workflow Configuration

- Import the workflow into your n8n instance.
- Update the nodes to reference the correct Google Sheet and Slack channel.
- Ensure that the Apollo and LLM nodes have the appropriate API keys and configurations.

### 4. Testing the Workflow

- Add a new company entry in the Companies tab of the Google Sheet.
- Verify that the workflow triggers automatically, processes the data, and updates the Contacts and Contacts (Verified) tabs accordingly.
- Check Slack for any verification prompts and confirm that weekly reports are sent as scheduled.
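For reference, the bundled onEdit script behaves roughly like the minimal reconstruction below. It assumes the Domain and Status columns live in the Companies tab and are identified by their header names; check the actual script under Extensions > Apps Script, since column names and layout may differ.

```javascript
// Google Apps Script: illustrative onEdit handler (assumed column layout).
// Whenever a cell in the "Domain" column of the "Companies" tab changes,
// set the same row's "Status" column to "Pending" so n8n picks it up.
function onEdit(e) {
  const sheet = e.range.getSheet();
  if (sheet.getName() !== 'Companies') return;

  const headers = sheet.getRange(1, 1, 1, sheet.getLastColumn()).getValues()[0];
  const domainCol = headers.indexOf('Domain') + 1; // assumed header names
  const statusCol = headers.indexOf('Status') + 1;
  if (domainCol === 0 || statusCol === 0) return;

  if (e.range.getColumn() === domainCol && e.range.getRow() > 1) {
    sheet.getRange(e.range.getRow(), statusCol).setValue('Pending');
  }
}
```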
by Jimleuk
This n8n template builds a meeting assistant that compiles timely reminders of upcoming meetings, filled with email history and recent LinkedIn activity of the other people on the invite. The reminder is then discreetly sent via WhatsApp, ensuring the user is always prepared, informed and ready to impress!

## How it works

- A scheduled trigger fires hourly to check for upcoming personal meetings (as sketched below).
- When one is found, the invite is analysed by an AI agent to pull the email addresses and LinkedIn details of the other invitees.
- Two subworkflows are then triggered for each invitee to (1) search for the last email correspondence with them and (2) scrape their LinkedIn profile and recent activity for social updates.
- Using both sources, another AI agent summarises this information and generates a short meeting prep message for the user.
- The notification is finally sent to the user's WhatsApp, allowing them ample time to review.

## How to use

There are a lot of moving parts in this template, so in its current form it's best used for personal rather than team calendars. The LinkedIn scraping method used in this workflow requires you to paste in your LinkedIn cookies from your browser, which essentially lets n8n impersonate you. You can retrieve these from the dev console or ask someone technical for help! Note: it may be wise to switch to other LinkedIn scraping approaches which do not impersonate your own account for production.

## Requirements

- OpenAI for LLM
- Gmail for email
- Google Calendar for upcoming events
- WhatsApp Business account for notifications

## Customising this workflow

- Try adding information sources which are relevant to you and your invitees, such as company search or other social media sites.
- Create an on-demand version which doesn't rely on the scheduled trigger. Sometimes you want to prepare for meetings hours or days in advance, where this could help immensely.
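As an illustration of the hourly check, here is a minimal n8n Code node sketch that keeps only calendar events starting within the next hour. The field path (start.dateTime) follows the Google Calendar node's typical output, but treat it as an assumption and adjust to your own data.

```javascript
// n8n Code node: keep only events that start within the next 60 minutes.
// Assumes each incoming item is a Google Calendar event with start.dateTime (or start.date).
const now = Date.now();
const oneHour = 60 * 60 * 1000;

return $input.all().filter((item) => {
  const start = new Date(item.json.start?.dateTime || item.json.start?.date).getTime();
  return start > now && start - now <= oneHour;
});
```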
by n8n Team
This workflow lets a Slack bot notify a specific channel when a new product in WooCommerce is published and live on the site.

## Prerequisites

- WooCommerce account
- Slack and a Slack bot

## How it works

1. Listen for WooCommerce product creation.
2. An IF node checks whether the permalink starts with https://[your-url-here].com/product/ (sketched below).
3. The Slack bot notifies the channel that a new product has been added.

Please note, you must update the URL in the IF node to match your URL. If your WooCommerce store doesn't use the slug /product/, that will need to be updated too.
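The IF node's condition amounts to a simple prefix check on the product permalink. A minimal Code node equivalent, assuming WooCommerce delivers the product URL in a permalink field and using a placeholder store domain:

```javascript
// n8n Code node sketch: equivalent of the IF node's "starts with" check.
// Replace the domain (and the /product/ slug, if you changed it) with your own store URL.
const product = $input.first().json;
const isLiveProduct = (product.permalink || '').startsWith('https://your-store.com/product/');
return [{ json: { ...product, isLiveProduct } }];
```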
by Sheryl
## Description

This workflow provides a powerful AI assistant for content creators, book editors, and marketers. It automates the collection and analysis of trending discussions from Reddit, YouTube, and X (Twitter), generating insightful topic reports. This frees you from hours of tedious data compilation, allowing you to make faster, more accurate topic decisions based on deep AI analysis.

## How it works

This workflow simulates the complete research process of a strategic editor:

1. Initiate & Collect: A user submits a keyword via a public Form Trigger. The workflow then automatically fetches relevant, trending content in parallel from the official APIs of Reddit, YouTube, and X (Twitter).
2. Multi-stage AI Processing & Analysis: The workflow utilizes a layered AI pipeline to process the data. First, a lightweight Gemini model in the AI Pre-filter Content node rapidly screens the vast amount of content to filter out noise (see the sketch below). Next, a more powerful Gemini Pro model in the AI Deep Analysis node performs a detailed, structured analysis on each high-value item, extracting summaries, sentiment, and key arguments. Finally, a "strategist" AI model in the AI Synthesize Final Report node aggregates all analyses to generate the comprehensive final topic report in HTML.
3. Multi-Channel Report Distribution: The workflow distributes the final report to multiple channels based on pre-defined templates. The Send Gmail Report node sends the complete HTML report, the Send Feishu Notification node sends a concise summary card to a group chat, and the Archive to Google Sheets node archives key data.

## Setup Steps

This workflow takes approximately 20-30 minutes to set up, with most of the time spent connecting your accounts.

1. Connect Your API Accounts. In the n8n Credentials section, you will need to prepare and connect credentials for the following services:
   - Google: For the Gemini AI model, Gmail sending, and Google Sheets archiving. This requires a Google Cloud API Key and OAuth2 credentials.
   - Reddit: For fetching Reddit posts. This requires a Reddit account with OAuth2 configured in n8n to allow searches.
   - YouTube: For collecting YouTube videos. You'll need to enable the YouTube Data API v3 in your Google Cloud Console and get an API Key.
   - Twitter: For the official Twitter node, requiring a free developer account and an App with v2 API access.
2. Configure Output Channels. In the final nodes (Send Gmail Report, Send Feishu Notification, Archive to Google Sheets), update the recipient email address, the Feishu bot's Webhook URL, and the target spreadsheet ID to match your own.
3. Activate and Share the Trigger. Activate the workflow. The first Form Trigger node will automatically generate a public URL. Share this link with your team members to let them start using the tool.
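A hedged sketch of the pre-filter hand-off: a Code node that parses the lightweight Gemini model's verdict and drops low-value items before the deep-analysis stage. The verdict shape (a JSON string with relevant and score fields in json.text) and the threshold are assumptions; adjust them to your actual prompt and output parser.

```javascript
// n8n Code node: keep only items the pre-filter model judged worth deep analysis.
// Assumes the previous LLM node attached its verdict as a JSON string in json.text,
// e.g. {"relevant": true, "score": 0.82} — adjust to your real output schema.
const MIN_SCORE = 0.6; // assumed threshold

return $input.all().filter((item) => {
  try {
    const verdict = JSON.parse(item.json.text);
    return verdict.relevant === true && (verdict.score ?? 0) >= MIN_SCORE;
  } catch {
    return false; // unparseable verdicts are treated as noise
  }
});
```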
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# MCP Gmail Workflow – AI-Powered Email Management

## ✨ What It Does

A smart n8n workflow that connects Gmail with an AI agent (via MCP), letting you send, read, and organize emails using natural language.

## ⚙️ Key Features

- 🧠 AI Commands: "Send email to John about the budget"
- 📥 Inbox Control: Mark read/unread, apply/remove labels
- 🗂 Smart Organization: Auto-label based on content
- 🤖 MCP-Ready: Works with Claude, ChatGPT, etc.

## 🎯 Use Cases

- "📤 Send a follow-up to the client about yesterday's meeting"
- "📬 Mark all newsletters as read and label 'Newsletter'"
- "🧾 Summarize the latest email from Sarah"
- "🗃 Label all Project X emails as 'Project-X-2024'"
- "⭐ Find unread emails from my manager and mark them as important"

## 🛠 Setup Guide

### 🔑 Prerequisites

- n8n (self-hosted or cloud)
- Gmail API credentials
- MCP-compatible AI (optional but powerful)

### 📥 1. Import Workflow

Copy JSON → Open n8n → Import → Paste → Done ✅

### 🔐 2. Gmail OAuth2 Setup

- Create a Google project → Enable the Gmail API
- Create OAuth2 credentials → Add the n8n redirect URI
- In n8n: Add Gmail OAuth2 → Paste Client ID/Secret → Connect

### 🧩 3. Update Credential References

- Find your credential ID in n8n
- Update each Gmail node with your ID

### 🧠 4. MCP Trigger (Optional)

- Use the provided webhook URL in your AI system
- Send test prompts to verify the connection (see the test sketch below)

### 🧪 5. Test Key Actions

- ✅ "Send a test email"
- ✅ "Read latest email"
- ✅ "Label last email as 'Test'"
- ✅ "Mark latest email as unread"

### ⚙️ 6. Advanced Tips

- Create custom labels in Gmail
- Use HTTPS + webhook auth
- Add retries and error handling in n8n

## 🧯 Troubleshooting

- ❗ Gmail Auth Error? → Re-auth and check the redirect URI
- ❗ Webhook not firing? → Check the endpoint + run a manual test
- ❗ Label errors? → Use correct label names or IDs

✅ Required Gmail Scopes: gmail.modify, gmail.send

## 📈 Best Practices

- 🔁 Test regularly
- 🔒 Use minimal permissions
- 🏷 Consistent label naming
- 🔍 Monitor execution + webhook logs

🎉 You're All Set! Control Gmail with your voice or text through AI. Make managing emails smarter, faster, and 100% automated 💌
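As a quick way to verify step 4, here is a hedged Node.js sketch of a test call to the trigger's webhook URL. Both the URL and the payload shape are placeholders, not a fixed contract; copy the real URL from your trigger node and match whatever request format your AI system sends.

```javascript
// Minimal Node.js sketch: post a test prompt to the workflow's webhook URL.
// The URL and payload below are placeholders — adjust to your own setup.
const WEBHOOK_URL = 'https://your-n8n-instance.example.com/webhook/mcp-gmail';

async function sendTestPrompt() {
  const res = await fetch(WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt: 'Send a test email to myself with subject "Hello from n8n"' }),
  });
  console.log('Status:', res.status, await res.text());
}

sendTestPrompt();
```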
by Dr. Firas
# Convert YouTube videos to viral Shorts with Klap and auto-post with Blotato

> ⚠️ Disclaimer: This workflow uses Community Nodes and requires a self-hosted n8n instance.

## Who is this for?

This workflow is perfect for content creators, YouTubers, marketing teams and entrepreneurs who want to effortlessly convert long YouTube videos into short, viral-ready clips and publish them automatically on TikTok, Instagram, YouTube Shorts and other platforms.

## What problem is this workflow solving?

Manually creating short, engaging clips from YouTube videos takes hours:

- Selecting highlights
- Adding subtitles and effects
- Downloading and editing
- Posting individually on each platform

This workflow eliminates all of that:

- AI-powered Shorts generation with Klap
- Smart scheduling based on your posting calendar
- Full automation of uploads to multiple platforms

## What this workflow does

From a simple YouTube link sent via Telegram, the workflow:

1. Extracts the YouTube URL and the number of Shorts requested (see the sketch below)
2. Sends the video to Klap for AI-powered Shorts generation
3. Checks when the Shorts are ready
4. Schedules publication times based on your custom settings
5. Uploads the Shorts to Blotato
6. Auto-posts on TikTok, YouTube Shorts, Instagram and more
7. Sends a confirmation recap to Telegram

## Setup

1. Connect your Telegram bot to the trigger node
2. Add your Klap API key for video processing
3. Link your Google Sheets with your scheduling preferences
4. Add your Blotato API key and social platform IDs
5. Adjust the number of Shorts generated if needed
6. Modify the scheduling logic or time windows in the Google Sheets

## How to customize this workflow to your needs

- Change AI video settings in the Klap API request
- Adjust time windows and frequency in the scheduling nodes
- Limit the workflow to specific platforms (e.g., TikTok only)
- Add a manual approval step before publishing
- Modify the Telegram recap message content

📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: Linkedin / Youtube
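As an illustration of step 1, a minimal n8n Code node sketch that pulls the YouTube URL and the requested number of Shorts out of the Telegram message. The expected message format and the default count are assumptions; adapt them to how you actually send requests to the bot.

```javascript
// n8n Code node: parse a Telegram message like
// "https://www.youtube.com/watch?v=abc123 3" into a URL + Shorts count.
// The default of 3 Shorts and the message field path are assumptions.
const text = $input.first().json.message?.text || '';

const urlMatch = text.match(/https?:\/\/(?:www\.)?(?:youtube\.com|youtu\.be)\S+/);
const countMatch = text.replace(urlMatch?.[0] || '', '').match(/\d+/);

return [{
  json: {
    youtubeUrl: urlMatch ? urlMatch[0] : null,
    shortsCount: countMatch ? parseInt(countMatch[0], 10) : 3,
  },
}];
```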
by Joseph LePage
# Empower Your AI Chatbot with Long-Term Memory and Dynamic Tool Routing

This n8n workflow equips your AI agent with long-term memory and a dynamic tools router, enabling it to provide intelligent, context-aware responses while managing tasks across multiple tools. By combining persistent memory and modular task routing, this workflow makes your AI smarter, more efficient, and highly adaptable.

## 👥 Who Is This For?

- AI Developers & Automation Enthusiasts: Integrate advanced AI features like long-term memory and task routing without coding expertise.
- Businesses & Teams: Automate tasks while maintaining personalized, context-aware interactions.
- Customer Support Teams: Improve user experience with chatbots that remember past interactions.
- Marketers & Content Creators: Streamline communication across platforms like Gmail and Telegram.
- AI Researchers: Experiment with persistent memory and multi-tool integration.

## 🚀 What Problem Does This Solve?

This workflow simplifies the creation of intelligent AI systems that retain memory, manage tasks dynamically, and automate notifications across tools like Gmail and Telegram, saving time and improving efficiency.

## 🛠️ What This Workflow Does

- **Save & Retrieve Memories:** Uses Google Docs for long-term storage to recall past interactions or user preferences.
- **Dynamic Task Routing:** Routes tasks to the right tools (e.g., saving/retrieving memories or sending notifications); see the sketch below.
- **AI-Powered Context Understanding:** Combines OpenAI GPT-based short-term memory with long-term memory for smarter responses.
- **Multi-Channel Notifications:** Sends updates via Gmail or Telegram.

## 🔧 Setup

1. API Credentials: Connect to OpenAI (AI processing), Google Docs (memory storage), and Gmail/Telegram (notifications).
2. Customize Parameters: Adjust the AI agent's system message for your use case and define task-routing rules in the tools router node.
3. Test & Deploy: Verify memory saving/retrieval, task routing, and notification delivery.

## 💡 How to Customize

- Modify the system message in the OpenAI node to tailor your agent's behavior.
- Add or adjust routing rules for additional tools.
- Update notification settings to match your communication preferences.
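To make the routing idea concrete, here is a minimal Code node sketch that derives a route key from the AI agent's parsed intent, which a Switch node could then use to pick a branch. The intent labels and field names are assumptions; the actual template drives routing from the agent's own tool decisions.

```javascript
// n8n Code node sketch: map an assumed "intent" field to a route key for a Switch node.
// Intent values ("save_memory", "retrieve_memory", "notify") are illustrative only.
const item = $input.first().json;
const intent = (item.intent || '').toLowerCase();

let route = 'fallback';
if (intent === 'save_memory') route = 'google_docs_append';
else if (intent === 'retrieve_memory') route = 'google_docs_read';
else if (intent === 'notify') route = item.channel === 'telegram' ? 'telegram' : 'gmail';

return [{ json: { ...item, route } }];
```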
by AppStoneLab Technologies LLP
# 🎉 Festival Social Media Automation with Gemini AI for X/Twitter & Facebook

Transform your festival marketing with this comprehensive automation workflow that creates and posts culturally authentic social media content across multiple platforms daily.

## ⚙️ What this workflow does

This workflow automatically:

- **Fetches festival data** from Google Sheets based on today's date
- **Generates AI-powered prompts** for both image creation and social media content
- **Creates stunning festival images** using Google Gemini 2.0 Flash Preview
- **Produces platform-specific content** optimized for X (Twitter) and Facebook
- **Posts automatically** with proper image attachments and error handling

## ✨ Key Features

### 🎯 Intelligent Content Generation

- AI-powered prompt generation tailored to each festival's cultural context
- Platform-specific content optimization (character limits, hashtag strategies)
- Culturally sensitive and authentic messaging

### 🎨 Visual Content Creation

- Automated image generation using Google Gemini 2.0 Flash Preview
- Festival-themed graphics with vibrant, culturally appropriate designs
- Optimized for social media engagement

### 📲 Multi-Platform Publishing

- Simultaneous posting to X (Twitter) and Facebook
- Platform-specific formatting and optimization
- Built-in error handling and backup posting methods

### ⏰ Fully Automated

- Daily execution at 8:00 AM
- Date-based festival data retrieval
- Zero manual intervention required

## 📱 Apps and Integrations

- **Google Sheets** - Festival calendar and data storage
- **Google Gemini 2.0 Flash Preview** - AI content and image generation
- **X (Twitter)** - Social media posting
- **Facebook Graph API** - Facebook page posting
- **Schedule Trigger** - Daily automation

## 🛠️🕊️ Setup Instructions

### 1. 📊 Google Sheets Configuration

- Create a Google Sheets document with the columns: Date, Name of the Festival, Description
- Format dates as DD/MM/YYYY
- Connect your Google Sheets credential in n8n

### 2. 🤖 Google Gemini API Setup

- Obtain a Google AI Studio API key from Google AI Studio
- Configure the Google Gemini credential in n8n
- Ensure you have access to Gemini 2.0 Flash Preview

### 3. 🕊️ X (Twitter) Credentials Setup

Important: Due to X API limitations, you'll need TWO separate OAuth2 credentials.

**X API For Image Upload (Generic OAuth2):**

- Create a new OAuth2 credential with these settings:
  - Grant Type: PKCE
  - Authorization URL: https://x.com/i/oauth2/authorize
  - Access Token URL: https://api.x.com/2/oauth2/token
  - Scope: media.write offline.access tweet.read users.read
- Note: media.write cannot be combined with tweet.write in the same credential

**For Tweet Posting (X OAuth2):**

- Use the predefined X OAuth2 credential
- Configure it with the scopes: tweet.write offline.access tweet.read users.read
### 4. 📘 Facebook Graph API Setup

- Create a Facebook App and get your access token from Meta for Developers
- Configure the Facebook Graph API credential
- Update the node with your Facebook page ID

## 🎬 How to Use

1. Populate your Google Sheets with festival data for upcoming dates
2. Activate the workflow - it will run automatically daily at 8:00 AM
3. Monitor the execution - check logs for successful posts or any errors
4. Customize content by modifying the prompt generation logic if needed

## 🔄 Workflow Components

### 🔗 Data Flow

Daily Trigger → Get Today's Date → Fetch Festival Data
Generate AI Prompts → Create Image & Content
Process Media → Merge Data → Post to Platforms

### 🛡️ Error Handling

- Backup HTTP posting method for X if the primary method fails
- Continue execution even if individual platform posting fails
- Comprehensive error logging for troubleshooting

## 🎨 Customization Options

### ✍️ Content Personalization

- Modify the prompt generation logic for different content styles
- Adjust platform-specific character limits and hashtag strategies
- Customize image generation prompts for different visual styles

### 🌐 Platform Extension

- Add Instagram, LinkedIn, or other social media platforms
- Implement additional content formats (Stories, Reels, etc.)
- Create platform-specific posting schedules

### 📊 Data Sources

- Connect to different data sources (Airtable, Notion, CMS)
- Add support for multiple festival categories
- Implement content approval workflows

## 💡 Best Practices

### 📝 Content Quality

- Regularly review and update your festival database
- Monitor AI-generated content for cultural sensitivity
- Test different prompt styles for optimal engagement

### 🔑 API Management

- Monitor API usage limits for all connected services
- Implement rate limiting for high-volume posting
- Set up alerts for credential expiration

### ⏰ Scheduling

- Consider time zones for optimal posting times
- Implement staggered posting across platforms
- Add weekend/holiday scheduling logic

## 🔧 Troubleshooting

### ⚠️ Common Issues

- **Image upload fails:** Check OAuth2 credentials and API limits
- **Content generation errors:** Verify the Gemini API key and model availability
- **Date matching issues:** Ensure date format consistency in Google Sheets (see the date sketch at the end of this template)

### ⚡️ Performance Tips

- Optimize image generation prompts for faster processing
- Use structured output parsing for consistent results
- Implement content caching for repeated festivals

## 🎯 Use Cases

- **Cultural Organizations** - Automate festival announcements and celebrations
- **Event Management Companies** - Scale social media presence across multiple events
- **Tourism Boards** - Promote local festivals and cultural events
- **Marketing Agencies** - Manage multiple client festival campaigns
- **Community Organizations** - Engage audiences with regular cultural content

## ⭐️ Benefits

- **Time Savings** - Eliminate manual social media posting
- **Consistency** - Maintain a regular posting schedule
- **Cultural Authenticity** - AI-generated content respects cultural context
- **Multi-Platform Reach** - Simultaneous posting increases visibility
- **Scalability** - Handle unlimited festivals with zero additional effort

This workflow transforms festival marketing from a time-consuming manual process into a fully automated, culturally intelligent system that engages audiences across multiple platforms while maintaining authenticity and relevance.
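Because date matching depends on consistent DD/MM/YYYY formatting, here is a minimal Code node sketch for the "Get Today's Date" step. The output field name is an assumption; align it with whatever field the "Fetch Festival Data" node filters on.

```javascript
// n8n Code node: produce today's date as DD/MM/YYYY so it can be matched
// against the "Date" column of the festival Google Sheet.
// The output field name (todayFormatted) is an assumption.
const now = new Date();
const dd = String(now.getDate()).padStart(2, '0');
const mm = String(now.getMonth() + 1).padStart(2, '0');
const yyyy = now.getFullYear();

return [{ json: { todayFormatted: `${dd}/${mm}/${yyyy}` } }];
```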
by Dina Lev
# Automate Legal Document Generation with n8n, Apify, Google Drive, and AI

This tutorial details an end-to-end automation solution for streamlining the lien filing process for Homeowners Associations (HOAs) using an n8n workflow. It significantly reduces manual effort and potential errors for legal professionals by automating document retrieval, information extraction, and document generation.

## Who's it for

This template is ideal for legal professionals, law firms, and property management companies that frequently handle lien filings for Homeowners Associations. If you're looking to reduce manual document processing time, minimize errors, and improve efficiency in your legal operations, this workflow is for you.

## The Problem

Legal professionals often allocate a significant portion of their time (up to 40%) to manual document processing tasks. The traditional process for filing a lien is particularly time-consuming (e.g., 15 minutes per case) and error-prone, involving steps like manually searching, downloading, extracting, and populating legal documents.

## The Automation Solution Overview

This automation leverages an n8n workflow in conjunction with external services like Playwright (via Apify), Google Drive, Google Sheets, Gmail, and the Gemini API. The primary objective is to automate the legal document generation process, from initial data submission to final document generation and notification.

## Requirements

Before importing and running the n8n workflow, you need the following:

- **n8n Instance:** A running n8n instance (self-hosted or cloud).
- **Google Account:** With access to Google Sheets, Google Drive, and Gmail.
- **Google Sheets:**
  - An Input Sheet to receive form responses (e.g., "Legal Automation Input Form (Responses)").
  - An Output/Review Sheet for extracted data and approval (e.g., "Automation Output data Sheet") with specific columns like "Timestamp", "Legal Description", "Association Name", "Debt", "Parcel", "Owner", "Doc link", "Approval", and "Created".
- **Google Drive:**
  - A main folder for n8n outputs (e.g., "N8N Folder").
  - A Google Docs Lien Template with placeholders (e.g., {{ASSOCIATION}}, {{DEBT}}, {{PROPERTY}}, {{MONTH}}, {{YEAR}}, {{DAY}}, {{PARCEL}}, {{OWNER}}).
- **Google Gemini API Key:** For text and image processing.
- **Apify Account & Playwright Actor:** An Apify account with access to a Playwright actor capable of scraping property information from your target county's website.

## Setup Steps

1. **n8n Credentials:**
   - Add Google Sheets, Google Drive, and Gmail credentials in your n8n instance.
   - Add an HTTP Query Auth credential for your Gemini API key (named "Query Auth account" in the template).
   - Ensure your Apify API token is configured within the Apify Playwright script to find property info node.
2. **Google Sheets Configuration:**
   - Link the Google Sheets Trigger node to your Input Sheet.
   - Link the Google Sheets node (for appending data) and the Intermediate data received trigger to your Output/Review Sheet.
3. **Google Drive Configuration:**
   - Update the Create folder to output node with the ID of your "N8N Folder".
   - Update the Make Copy of Template node with the ID of your Google Docs Lien Template.
4. **Email Addresses:** Update the recipient email addresses in the Approve Through Email and Notify complete nodes to your desired notification email.

## Detailed Tutorial Steps and n8n Workflow Breakdown Summary

This n8n workflow, "Legal Document Generator E2E", automates the process of generating legal lien documents, from initial data input to final document creation and notification.
1. **Initiate Workflow:** The workflow starts with a Google Sheets Trigger node, which listens for new lien requests submitted via a form that populates a Google Sheet.
2. **Gather Property Data:** An Apify Playwright script to find property info node fetches property details from county websites, and a Get file for property node downloads the associated legal documents.
3. **Process and Store Document:** The downloaded document is transformed to base64 using Transform to base64 and then uploaded to Google Drive via Upload legal doc for storage and further processing.
4. **Extract Information with AI:** The Call Gemini API for legal desc and Property metadata nodes leverage the Gemini API to extract the precise legal description, parcel number, and owner's name from the document. This extracted data is then structured by the Property Information Extractor.
5. **Review and Approve:** The extracted information is appended to an intermediate Google Sheet by the first Google Sheets node, and an email is sent via Approve Through Email to the user for review and approval.
6. **Generate Documents on Approval:** A second Intermediate data received Google Sheets Trigger node monitors the approval status in the sheet. Once "Approved", an If node allows the workflow to proceed.
7. **Create and Populate Documents:** A new client-specific folder is created in Google Drive using Create folder to output. A blank lien template is copied (Make Copy of Template), and its custom variables are populated with the extracted data using Change Custom Variables.
8. **Finalize and Store Output:** The populated document is converted to PDF (Generate PDF), and both the new PDF (Add PDF To Drive) and the original source document (Move file in Google Drive) are saved to the client's new folder.
9. **Update Records and Notify:** The Update Creation Google Sheets node marks the document as "Created" in the tracking sheet and updates the document link. Finally, Notify complete sends a notification email about the completion.

## How to Customize the Workflow

- **Adjust Input Form Fields:** Modify the column names in your initial Google Sheet and update the expressions in the Google Sheets Trigger and Apify Playwright script to find property info nodes to match your form.
- **Change County Website/Scraper:** If you need to fetch data from a different county or property database, modify the Apify Playwright script to find property info node to call a different Apify actor, or configure a new HTTP Request node to interact with your chosen data source.
- **Customize Document Template:** Update the placeholders in your Google Docs Lien Template to match your specific document needs. Ensure the corresponding replaceAll actions are updated in the Change Custom Variables node (see the sketch at the end of this template).
- **Modify AI Prompts:** Refine the prompts within the Call Gemini API for legal desc and Property metadata nodes to improve the accuracy of information extraction based on your document types.
- **Notification Preferences:** Adjust the sendTo email addresses and the subject/message content in the Approve Through Email and Notify complete nodes.

## Benefits of this Automation

This automation offers significant advantages for legal professionals:

- **Streamlined Organization:** Ensures all relevant documents (original source files, editable templates, and final PDFs) are systematically organized, tracked, and easily accessible within Google Drive.
- **Time-Saving and Efficiency:** Documents are quickly generated and ready for client sharing, leading to faster turnaround times and improved service delivery.
- **Scalability:** Provides a scalable solution for handling a higher volume of document processing tasks without a proportional increase in manual effort.

Learn more about Chill Labs and our services on our website: Chill Labs
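For reference, the placeholder replacement performed by the Change Custom Variables step can be pictured as the Code node sketch below. The input field names (associationName, debt, and so on) are assumptions, and the actual workflow uses the Google Docs node's replaceAll actions rather than a Code node; this only illustrates the mapping from extracted data to the template placeholders listed in the Requirements section.

```javascript
// n8n Code node sketch: build the replacement map for the lien template placeholders.
// Input field names are assumptions — align them with the Property Information Extractor output.
const d = $input.first().json;
const today = new Date();

const replacements = {
  '{{ASSOCIATION}}': d.associationName,
  '{{DEBT}}': d.debt,
  '{{PROPERTY}}': d.legalDescription,
  '{{PARCEL}}': d.parcel,
  '{{OWNER}}': d.owner,
  '{{DAY}}': String(today.getDate()),
  '{{MONTH}}': today.toLocaleString('en-US', { month: 'long' }),
  '{{YEAR}}': String(today.getFullYear()),
};

// Each entry corresponds to one replaceAll action on the copied Google Docs template.
return [{ json: { replacements } }];
```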
by Onur
# Automated AI Content Creation & Instagram Publishing from Google Sheets

This n8n workflow automates the creation and publishing of social media content directly to Instagram, using ideas stored in a Google Sheet. It leverages AI (Google Gemini and Replicate Flux) to generate concepts, image prompts, captions, and the final image, turning your content plan into reality with minimal manual intervention.

Think of this as the execution engine for your content strategy. It assumes you have a separate process (whether manual entry, another workflow, or a different tool) for populating the Google Sheet with initial post ideas (including Topic, Audience, Voice, and Platform). This workflow takes those ideas and handles the rest, from AI generation to final publication.

## What does this workflow do?

This workflow streamlines the content execution process by:

- **Automatically fetching** unprocessed content ideas from a designated Google Sheet based on a schedule.
- Using Google Gemini to generate a platform-specific content concept (specifically for a 'Single Image' format).
- Generating two distinct AI image prompt options based on the concept using Gemini.
- Writing an engaging, platform-tailored caption (including hashtags) using Gemini, based on the first prompt option.
- Generating a visual image using the first prompt option via the Replicate API (using the Flux model).
- **Publishing** the generated image and caption directly to a connected **Instagram Business account**.
- **Updating the status** in the Google Sheet to mark the idea as completed, preventing reprocessing.

## Who is this for?

- **Social Media Managers & Agencies:** Automate the execution of your content calendar stored in Google Sheets.
- **Marketing Teams:** Streamline content production from planned ideas and ensure consistent posting schedules.
- **Content Creators & Solopreneurs:** Save significant time by automating the generation and publishing process based on your pre-defined ideas.
- **Anyone** using Google Sheets to plan social media content and wanting to automate the creative generation and posting steps with AI.

## Benefits

- **Full Automation:** From fetching planned ideas to Instagram publishing, automate the entire content execution pipeline.
- **AI-Powered Generation:** Leverage Google Gemini for creative concepts, prompts, and captions, and Replicate for image generation based on your initial topic.
- **Content Calendar Execution:** Directly turn your Google Sheet plan into published posts.
- **Time Savings:** Drastically reduce the manual effort involved in creating visuals and text for each planned post.
- **Consistency:** Maintain a regular posting schedule by automatically processing your queue of ideas.
- **Platform-Specific Content:** AI prompts are designed to tailor concepts, prompts, and captions for the platform specified in your sheet (e.g., Instagram or LinkedIn).

## How it Works

1. **Scheduled Trigger:** The workflow starts automatically based on the schedule you set (e.g., every hour, daily).
2. **Fetch Idea:** Reads the next row from your Google Sheet where the 'Status' column indicates it's pending (e.g., '0'). It only fetches one idea per run.
3. **Prepare Inputs:** Extracts Topic, Audience, Voice, and Platform from the sheet data.
4. **AI Concept Generation (Gemini):** Creates a single content concept suitable for a 'Single Image' post on the target platform.
5. **AI Prompt Generation (Gemini):** Develops two detailed, distinct image prompt options based on the concept.
6. **AI Caption Generation (Gemini):** Writes a caption tailored to the platform, using the first image prompt and other context.
7. **Image Generation (Replicate):** Sends the first prompt to the Replicate API (Flux model) to generate the image.
8. **Prepare for Instagram:** Formats the generated image URL and caption.
9. **Publish to Instagram:** Uses the Facebook Graph API in three steps (see the sketch at the end of this template):
   - Creates a media container by uploading the image URL and caption.
   - Waits for Instagram to process the container.
   - Publishes the processed container to your feed.
10. **Update Sheet:** Changes the 'Status' in the Google Sheet for the processed row (e.g., to '1') to mark it as complete.

## n8n Nodes Used

- Schedule Trigger
- Google Sheets (Read & Update operations)
- Set (multiple instances for data preparation)
- Langchain Chain - LLM (multiple instances for Gemini calls)
- Langchain Chat Model - Google Gemini (multiple instances)
- Langchain Output Parser - Structured (multiple instances)
- HTTP Request (for the Replicate API call)
- Wait
- Facebook Graph API (multiple instances for the Instagram publishing steps)

## Prerequisites

- Active n8n instance (Cloud or Self-Hosted).
- **Google Account** with access to Google Sheets.
- **Google Sheets API Credentials (OAuth2):** Configured in n8n.
- A **Google Sheet** structured with columns like Topic, Audience, Voice, Platform, Status (or similar). Ensure your 'pending' and 'completed' statuses are defined (e.g., '0' and '1').
- **Google Cloud Project** with the Vertex AI API enabled.
- **Google Gemini API Credentials:** Configured in n8n (usually via Google Vertex AI credentials).
- **Replicate Account** and API Token.
- **Replicate API Credentials (Header Auth):** Configured in n8n.
- **Facebook Developer Account.**
- **Instagram Business Account** connected to a Facebook Page.
- **Facebook App** with the necessary permissions: instagram_basic, instagram_content_publish, pages_read_engagement, pages_show_list.
- **Facebook Graph API Credentials (OAuth2):** Configured in n8n with the required permissions.

## Setup

1. Import the workflow JSON into your n8n instance.
2. **Configure Schedule Trigger:** Set the desired frequency (e.g., every 30 minutes, every 4 hours) for checking new ideas in the sheet.
3. **Configure Google Sheets Nodes:**
   - Select your Google Sheets OAuth2 credentials for both Google Sheets nodes.
   - In 1. Get Next Post Idea..., enter your Spreadsheet ID and Sheet Name. Verify the Status filter matches your 'pending' value (e.g., 0).
   - In 7. Update Post Status..., enter the same Spreadsheet ID and Sheet Name. Ensure the Matching Columns (e.g., Topic) and the Status value to update match your 'completed' value (e.g., 1).
4. **Configure Google Gemini Nodes:** Select your configured Google Vertex AI / Gemini credentials in all Google Gemini Chat Model nodes.
5. **Configure Replicate Node (4. Generate Image...):** Select your Replicate Header Auth credentials. The workflow uses black-forest-labs/flux-1.1-pro-ultra by default; you can change this if needed.
6. **Configure Facebook Graph API Nodes (6a, 6c):**
   - Select your Facebook Graph API OAuth2 credentials.
   - Crucially, update the Instagram Account ID in the Node parameter of both Facebook Graph API nodes (6a and 6c). The template uses a placeholder (17841473009917118); replace this with your actual Instagram Business Account ID.
7. **Adjust Wait Node (6b):** The default wait time might be sufficient, but if you encounter errors during publishing (especially with larger images/videos in the future), you might need to increase the wait duration.
8. Activate the workflow.
9. **Populate your Google Sheet:** Ensure you have rows with your content ideas and the correct 'pending' status (e.g., '0'). The workflow will pick them up on its next scheduled run.
This workflow transforms your Google Sheet content plan into a fully automated AI-powered Instagram publishing engine. Start automating your social media presence today!
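For reference, the three-step publish described in "How it Works" maps onto Instagram's content publishing endpoints in the Facebook Graph API. The Node.js sketch below is illustrative only: the Graph API version, the access token handling, and the fixed 30-second wait are assumptions, and in the workflow itself these calls are made by the Facebook Graph API and Wait nodes.

```javascript
// Node.js sketch of the container-then-publish flow used for Instagram posting.
// IG_USER_ID, ACCESS_TOKEN and the API version are placeholders/assumptions.
const IG_USER_ID = '17841400000000000';
const ACCESS_TOKEN = 'YOUR_PAGE_ACCESS_TOKEN';
const BASE = 'https://graph.facebook.com/v19.0';

async function publishImage(imageUrl, caption) {
  // Step 1: create a media container with the hosted image URL and caption.
  const createRes = await fetch(
    `${BASE}/${IG_USER_ID}/media?image_url=${encodeURIComponent(imageUrl)}` +
      `&caption=${encodeURIComponent(caption)}&access_token=${ACCESS_TOKEN}`,
    { method: 'POST' }
  );
  const { id: creationId } = await createRes.json();

  // Step 2: give Instagram time to process the container (the workflow uses a Wait node).
  await new Promise((r) => setTimeout(r, 30_000));

  // Step 3: publish the processed container to the feed.
  const publishRes = await fetch(
    `${BASE}/${IG_USER_ID}/media_publish?creation_id=${creationId}&access_token=${ACCESS_TOKEN}`,
    { method: 'POST' }
  );
  console.log(await publishRes.json());
}
```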
by NanaB
## What it does

This n8n workflow creates a cutting-edge, multi-modal AI Memory Assistant designed to capture, understand, and intelligently recall your personal or business information from diverse sources. It automatically processes voice notes, images, documents (like PDFs), and text messages sent via Telegram. Leveraging GPT-4o for advanced AI processing (including visual analysis, document parsing, transcription, and semantic understanding) and MongoDB Atlas Vector Search for persistent and lightning-fast recall, this assistant acts as an external brain. Furthermore, it integrates with Gmail, allowing the AI to send and search emails as part of its memory and response capabilities. This end-to-end solution blueprint provides a powerful starting point for personal knowledge management and intelligent automation.

## How it works

### 1. Multi-Modal Input Ingestion 🗣️📸📄💬

Your memories begin when you send a voice note, an image, a document (e.g., a PDF), or a text message to your Telegram bot. The workflow immediately identifies the input type.

### 2. Advanced AI Content Processing 🧠✨

Each input type undergoes specialized AI processing by GPT-4o:

- Voice notes are transcribed into text using OpenAI Whisper.
- Images are visually analyzed by GPT-4o Vision, generating detailed textual descriptions.
- Documents (PDFs) are processed for text extraction, leveraging GPT-4o for robust parsing and understanding of content and structure. Unsupported document types are gracefully handled with a user notification.
- Text messages are directly forwarded for further processing.

This phase transforms all disparate input formats into a unified, rich textual representation.

### 3. Intelligent Memory Chunking & Vectorization ✂️🏷️➡️🔢

The processed content (transcriptions, image descriptions, extracted document text, or direct text) is then fed back into GPT-4o. The AI intelligently chunks the information into smaller, semantically coherent pieces, extracts relevant keywords and tags, and generates concise summaries. Each of these enhanced memory chunks is then converted into a high-dimensional vector embedding using OpenAI Embeddings.

### 4. Persistent Storage & Recall (MongoDB Atlas Vector Search) 💾🔍

These vector embeddings, along with their original content, metadata, and tags, are stored in your MongoDB Atlas cluster, which is configured with Atlas Vector Search. This allows for highly efficient and semantically relevant retrieval of memories based on user queries, forming the core of your "smart recall" system.

### 5. AI Agent & External Tools (Gmail Integration) 🤖🛠️

When you ask a question, the AI Agent (powered by GPT-4o) acts as the central intelligence. It uses the MongoDB Chat Memory to maintain conversational context and, crucially, queries the MongoDB Atlas Vector Search store to retrieve relevant past memories. The agent also has access to Gmail tools, enabling it to send emails on your behalf or search your past emails to find information or context that might not be in your personal memory store.

### 6. Smart Response Generation & Delivery 💬➡️📱

Finally, using the retrieved context from MongoDB and the conversational history, GPT-4o synthesizes a concise, accurate, and contextually aware answer. This response is then delivered back to you via your Telegram bot.

## How to set it up (~20 Minutes)

Getting this powerful workflow running requires a few key configurations and external service dependencies.

### Telegram Bot Setup

- Use BotFather in Telegram to create a new bot and obtain its API Token.
- In your n8n instance, add a new Telegram API credential. Give it a clear name (e.g., "My AI Memory Bot") and paste your API Token.

### OpenAI API Key Setup

- Log in to your OpenAI account and generate a new API key.
- Within n8n, create a new OpenAI API credential. Name it appropriately (e.g., "My OpenAI Key for GPT-4o") and paste your API key. This credential will be used by the OpenAI Chat Model (GPT-4o for processing, chunking, and RAG), Analyze Image, and Transcribe Audio nodes.

### MongoDB Atlas Setup

- If you don't have one, create a free-tier or paid cluster on MongoDB Atlas.
- Create a database and a collection within your cluster to store your memory chunks and their vector embeddings.
- Crucially, configure an Atlas Vector Search index on your chosen collection. This index will be on the field containing your embeddings (e.g., an embedding field of type knnVector). Refer to the MongoDB Atlas documentation for detailed instructions on creating vector search indexes (see the index sketch at the end of this template).
- In n8n, add a new MongoDB credential. Provide your MongoDB Atlas connection string (ensure it includes your username, password, and database name), and give it a clear name (e.g., "My Atlas DB"). This credential will be used by the MongoDB Chat Memory node and for any custom HTTP requests you might use for Atlas Vector Search insertion/querying.

### Gmail Account Setup

- Go to the Google Cloud Console, enable the Gmail API for your project, and configure your OAuth consent screen.
- Create an OAuth 2.0 Client ID for a Desktop app (or Web application, depending on your n8n setup and redirect URI) and download the JSON credentials.
- In n8n, add a new Gmail OAuth2 API credential. Follow the n8n instructions to configure it using your Google Client ID and Client Secret, and authenticate with your Gmail account, ensuring it has sufficient permissions to send and search emails.

### External API Services

- If your Extract from File node relies on an external service for robust PDF/DocX text extraction, ensure you have an API key and the service is operational. The current flow uses ConvertAPI; add the necessary credential (e.g., ConvertAPI) in n8n.

## How you could enhance it ✨

This workflow offers numerous avenues for advanced customization and expansion:

- Expanded Document Type Support: Enhance the "Document Processing" section to handle a wider range of document types beyond just PDFs (e.g., .docx, .xlsx, .pptx, Markdown, CSV) by integrating additional conversion APIs or specialized parsing libraries (e.g., using a custom Code node or dedicated third-party services like Apache Tika or Unstructured.io).
- Fine-tuned Memory Chunks & Metadata: Implement more sophisticated chunking strategies for very long documents, perhaps based on semantic breaks or document structure (headings, sections), to improve recall accuracy. Add more metadata fields (e.g., original author, document date, custom categories) to your MongoDB entries for richer filtering and context.
- Advanced AI Prompting: Allow users to dynamically set parameters for their memory inputs (e.g., "This is a high-priority meeting note," "This image contains sensitive information") which can influence how GPT-4o processes, tags, and stores the memory, or how it's retrieved later.
- n8n Tool Expansion for Proactive Actions: Significantly expand the AI Agent's capabilities by providing it with access to a wider range of n8n tools, moving beyond just information retrieval and email.
- External Data Source Integration (APIs): Expand the AI Agent's tools to query other external APIs (e.g., weather, stock prices, news, CRM systems) so it can provide real-time information relevant to your memories.

## Getting Assistance & More Resources

Need assistance setting this up, adapting it to a unique use case, or exploring more advanced customizations? Don't hesitate to reach out! You can contact me directly at nanabrownsnr@gmail.com. Also, feel free to check out my YouTube Channel, where I discuss other n8n templates as well as innovation and automation solutions.
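For the MongoDB Atlas setup above, here is a hedged Node.js sketch of creating the vector search index programmatically with the official driver. The database, collection and index names, the embedding field path, the dimension count (1536 matches common OpenAI embedding models) and the similarity metric are all assumptions; adjust them to your own collection and embedding model, or create the index through the Atlas UI instead.

```javascript
// Node.js sketch: create the Atlas Search index (knnVector mapping) used for memory recall.
// Requires the official "mongodb" driver (v6+) and an Atlas connection string.
const { MongoClient } = require('mongodb');

async function createMemoryIndex() {
  const client = new MongoClient(process.env.MONGODB_ATLAS_URI);
  await client.connect();
  const collection = client.db('memory_assistant').collection('memories'); // assumed names

  await collection.createSearchIndex({
    name: 'memory_vector_index', // assumed index name
    definition: {
      mappings: {
        dynamic: true,
        fields: {
          embedding: {          // field holding the OpenAI embedding
            type: 'knnVector',
            dimensions: 1536,   // adjust to your embedding model
            similarity: 'cosine',
          },
        },
      },
    },
  });

  await client.close();
}

createMemoryIndex().catch(console.error);
```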