by DigiMetaLab
A reasoning agent that can think, search, calculate, and remember, powered by GROQ inference and ready to deploy in one click. Unlike traditional AI bots that only respond, this assistant reasons before replying, fetches real-time facts, does math, and keeps short-term memory of your conversation.

## How it works

This template builds a conversational AI agent using the GROQ LLaMA 3 or LLaMA 4 API, combined with modular tools:

- **Think Tool**: performs step-by-step logical reasoning
- **SerpAPI**: fetches live data from Google search
- **Calculator**: handles arithmetic and math queries
- **Memory Buffer**: keeps track of the last 5 messages for context

Everything is integrated inside n8n and optimized for blazing-fast replies using GROQ's ultra-low latency.

## Your agent will

- Understand and analyze your queries
- Think through solutions before answering
- Pull real-time data via SerpAPI
- Perform calculations with the built-in math engine
- Recall prior context using short-term memory
- Respond clearly and conversationally, like a real assistant

## Who is this template for?

Perfect for:

- AI builders and creators using GROQ + n8n
- Teams needing a real-time LLaMA-powered assistant
- Beginners exploring LangChain + n8n workflows
- Developers combining LLMs + tools + memory

## How to set up

1. Plug in your GROQ API key.
2. Add your SerpAPI key.
3. Import and run. It's ready to chat!

All tools are pre-wired. You can expand the memory, customize prompts, or plug in more tools.

## Use cases

Connect this agent with:

- Telegram bots
- WhatsApp via Twilio
- Slack, Discord, or Gmail
- Manual triggers in n8n

Check out more templates by this creator: https://n8n.io/creators/digimetalab
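The "last 5 messages" memory buffer described above can be sketched as a simple rolling window. This is an illustrative Python sketch of the concept only, not the actual n8n memory node:

```python
from collections import deque

class MemoryBuffer:
    """Keep only the most recent `max_messages` chat messages for context."""

    def __init__(self, max_messages: int = 5):
        # A deque with maxlen automatically evicts the oldest entry
        self.messages = deque(maxlen=max_messages)

    def add(self, role: str, content: str) -> None:
        self.messages.append({"role": role, "content": content})

    def context(self) -> list:
        # Returned oldest-first, ready to prepend to an LLM prompt
        return list(self.messages)

buffer = MemoryBuffer(max_messages=5)
for i in range(7):
    buffer.add("user", "message %d" % i)
# Only the last 5 messages remain in buffer.context()
```

The fixed-size window is what keeps prompt length (and therefore latency and cost) bounded while still giving the model recent conversational context.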
by Yaron Been
# LinkedIn AI Agent: Auto-Post Creator & Multi-Group Distributor

Transform simple topic ideas into engaging LinkedIn posts and automatically distribute them across your profile and multiple LinkedIn groups. This n8n workflow combines AI content generation with intelligent distribution, helping you maintain a consistent professional presence while maximizing your reach across relevant communities.

## How It Works

This 6-step automation turns content ideas into LinkedIn posts:

**Step 1: Smart Content Monitoring.** The workflow continuously monitors your Google Sheets for new post topics marked as "Pending", checking every minute for fresh content to process.

**Step 2: AI-Powered Content Generation.** GPT-4 transforms your basic topic into a professionally crafted LinkedIn post featuring:

- Compelling opening hooks that grab attention
- 3-4 informative paragraphs with valuable insights
- Strategic questions to encourage engagement
- 4-6 relevant hashtags for discoverability
- Professional emoji placement for visual appeal
- Optimized formatting for LinkedIn's platform

**Step 3: Professional Formatting.** The workflow ensures your content meets LinkedIn's technical requirements with proper JSON formatting, character limits, and special character handling.

**Step 4: Personal Profile Publishing.** Your generated post is automatically published to your personal LinkedIn profile, maintaining your professional brand presence.

**Step 5: Multi-Group Distribution.** The same content is distributed across all your specified LinkedIn groups, amplifying your reach to targeted professional communities.

**Step 6: Status Management.** The workflow automatically updates your Google Sheets to mark posts as "Posted", providing clear tracking of your content pipeline.
## Setup Steps

### Prerequisites

- Active LinkedIn account with API access
- Google Sheets access for content management
- OpenAI API key with GPT-4 access
- LinkedIn group memberships with posting permissions
- n8n instance (cloud or self-hosted)

### Required Google Sheets Structure

Sheet 1 (Main Content):

| ID | LinkedIn Post Title | Status  |
|----|---------------------|---------|
| 1  | AI Trends in 2024   | Pending |
| 2  | Remote Work Tips    | Posted  |

Sheet 2 (Groups):

| GroupIds  |
|-----------|
| 123456789 |
| 987654321 |
| 456789123 |

Note: Collect LinkedIn group IDs from groups where you have posting permissions. These can be found in the group URL or through LinkedIn's API.

### Configuration Steps

**Credential Setup**

- Google Sheets OAuth2: access your content spreadsheet
- OpenAI API Key: required for AI content generation
- LinkedIn OAuth2: enable profile and group posting
- HTTP Authentication: configure LinkedIn API headers

**Google Sheets Preparation**

- Create a spreadsheet with the required two-sheet structure
- Populate group IDs from your joined LinkedIn groups
- Add initial post topics with "Pending" status
- Ensure proper column naming and data types

**LinkedIn Group Setup**

- Join relevant professional LinkedIn groups
- Verify posting permissions in each group
- Collect group IDs using LinkedIn's interface or API
- Test posting permissions before full automation

**AI Content Customization**

The default prompt generates professional LinkedIn content, but it can be customized for:

- Industry-specific terminology and trends
- Company voice and brand guidelines
- Target audience preferences
- Content style (educational, promotional, thought leadership)

**Workflow Activation**

1. Import the workflow JSON into your n8n instance
2. Configure all credential connections
3. Test with sample content before going live
4. Activate the Google Sheets trigger

## Use Cases

### Content Creators & Influencers

- Consistent Posting: maintain a regular LinkedIn presence without daily manual work
- Audience Growth: reach multiple professional communities simultaneously
- Content Scaling: transform brief ideas into full-length engaging posts
- Brand Building: establish thought leadership across industry groups

### Marketing Teams

- Lead Generation: share valuable content across targeted professional groups
- Brand Awareness: increase visibility in relevant industry communities
- Thought Leadership: position company experts as industry authorities
- Content Distribution: maximize reach of marketing messages and insights

### Sales Professionals

- Pipeline Building: share insights that attract potential clients
- Network Expansion: engage with prospects across multiple professional groups
- Authority Building: establish credibility through valuable content sharing
- Relationship Nurturing: maintain visibility with existing connections

### Consultants & Freelancers

- Client Acquisition: demonstrate expertise to potential clients
- Professional Branding: build reputation across industry-specific groups
- Service Promotion: share case studies and success stories broadly
- Network Building: connect with peers and potential collaborators

### Business Leaders & Executives

- Industry Influence: share strategic insights across professional networks
- Talent Attraction: showcase company culture and opportunities
- Partnership Development: connect with potential business partners
- Market Education: share expertise to influence industry conversations

## Advanced Customization Options

### Content Strategy Enhancement

- Multi-Tone Generation: create different content styles for various audiences
- Industry Templates: pre-built prompts for specific professional sectors
- Engagement Optimization: A/B test different post formats and styles
- Content Calendar Integration: schedule posts for optimal timing

### Distribution Intelligence

- Group Performance Tracking: monitor engagement across different groups
- Selective Distribution: choose specific groups based on content type
- Audience Segmentation: tailor content for different professional communities
- Engagement Analysis: track which groups provide the best ROI

### Content Quality Control

- Approval Workflows: add human review before automatic posting
- Content Scoring: rate post quality before distribution
- Brand Compliance: ensure posts meet company guidelines
- Duplicate Detection: avoid posting similar content too frequently

### Extended Integration Options

- CRM Integration: track leads generated from LinkedIn posts
- Analytics Platforms: monitor engagement and conversion metrics
- Content Management: connect to existing content planning tools
- Social Media Expansion: extend to other professional platforms

## Content Generation Features

### AI Writing Capabilities

The workflow generates LinkedIn posts that include:

**Professional Structure:**

- Attention-grabbing opening statements
- Well-organized multi-paragraph content
- Clear value propositions and insights
- Strategic calls-to-action for engagement

**LinkedIn Optimization:**

- Platform-specific formatting and spacing
- Proper hashtag research and placement
- Emoji usage that enhances readability
- Character count optimization for maximum impact

**Engagement Drivers:**

- Thought-provoking questions to encourage comments
- Industry insights that spark discussions
- Personal anecdotes that build connections
- Actionable tips that provide immediate value

### Sample Output

Input topic: "Remote Work Productivity Tips"

Generated post:

> Working from home has taught me that productivity isn't about being busy; it's about being intentional. After managing remote teams for 3 years, I've discovered that the most successful remote workers share three key habits that transform their daily performance.
>
> First, they create physical boundaries even in small spaces. A dedicated workspace signals to your brain that it's time to focus, even if it's just a corner of your kitchen table.
>
> Second, they batch similar tasks together. Instead of jumping between emails, calls, and deep work, they group activities to maintain mental flow and reduce cognitive switching costs.
>
> Third, they communicate proactively rather than reactively.
> They share progress updates before being asked and clarify expectations upfront to avoid confusion later.
>
> What's your most effective remote work strategy? I'd love to hear what's working for your team!
>
> #RemoteWork #Productivity #WorkFromHome #Leadership #TeamManagement #ProfessionalDevelopment

## Troubleshooting & Best Practices

### Common Issues & Solutions

**LinkedIn API Limitations**

- Respect posting frequency limits to avoid account restrictions
- Monitor API usage and implement appropriate delays between posts
- Ensure compliance with LinkedIn's terms of service
- Maintain authentic engagement rather than purely automated interactions

**Group Posting Permissions**

- Verify membership status and posting rights before adding group IDs
- Some groups require administrator approval for posts
- Monitor group rules and community guidelines
- Remove inactive or restricted groups from your list

**Content Quality Control**

- Review AI-generated content periodically for brand consistency
- Adjust prompts based on engagement performance
- Maintain a balance between automation and personal touch
- Monitor comments and engage authentically with your audience

### Optimization Strategies

**Performance Enhancement**

- Track engagement metrics across different groups
- A/B test posting times and content formats
- Refine prompts based on successful post patterns
- Gradually expand to new groups based on performance

**Content Strategy**

- Develop content themes that resonate with your target audience
- Create series of related posts for deeper engagement
- Balance promotional content with value-driven insights
- Maintain consistency in voice and messaging

**Network Growth**

- Engage with comments on your automated posts
- Connect with active commenters to expand your network
- Participate in group discussions beyond your own posts
- Build genuine relationships through authentic interactions

## Success Metrics

### Engagement Indicators

- Post Reach: total views across profile and groups
- Interaction Rate: comments, likes, and shares per post
- Network Growth: new connections from content engagement
- Group Performance: which communities provide the best engagement

### Business Impact Measurements

- Lead Generation: connections and inquiries from LinkedIn posts
- Brand Awareness: mentions and sharing of your content
- Thought Leadership: recognition as an industry expert
- Professional Opportunities: speaking, collaboration, or job opportunities

## Questions & Support

Need help setting up or optimizing your LinkedIn AI Agent workflow?

**Direct Technical Support**

- Email: Yaron@nofluff.online
- Response Time: within 24 hours on business days
- Expertise: LinkedIn API integration, AI prompt optimization, workflow scaling

**Comprehensive Learning Resources**

YouTube Channel: https://www.youtube.com/@YaronBeen/videos

- Complete setup walkthrough and configuration
- Advanced customization techniques and strategies
- LinkedIn API best practices and limitations
- Content strategy optimization for maximum engagement
- Troubleshooting common integration issues

**Professional Networking & Updates**

LinkedIn: https://www.linkedin.com/in/yaronbeen/

- Connect for ongoing automation support and advice
- Share your LinkedIn growth success stories
- Get early access to new workflow templates and features
- Join discussions about LinkedIn marketing automation

**Support Request Guidelines**

Include in your support message:

- Your current LinkedIn strategy and goals
- Target audience and industry focus
- Specific LinkedIn groups you want to target
- Any technical errors or integration issues
- Current content creation process and pain points
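The "Professional Formatting" step (character limits plus special-character handling) can be sketched in Python. The 3,000-character cap and the `commentary` field name here are illustrative assumptions about the LinkedIn post API, not the workflow's exact node logic:

```python
import json

LINKEDIN_POST_LIMIT = 3000  # assumed character cap for a LinkedIn post

def prepare_post(text: str) -> str:
    """Trim a generated post to the platform limit and JSON-escape it
    for an API request body."""
    if len(text) > LINKEDIN_POST_LIMIT:
        # Cut at the last whole word that fits, then add an ellipsis
        text = text[:LINKEDIN_POST_LIMIT - 1].rsplit(" ", 1)[0] + "…"
    # json.dumps handles quotes, newlines, and special characters
    return json.dumps({"commentary": text})  # field name is hypothetical

payload = prepare_post('Working from home has taught me that "less" is more.\n#RemoteWork')
```

Escaping through `json.dumps` rather than string concatenation is what prevents quotes and newlines in AI-generated text from producing malformed request bodies.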
by Batu รztรผrk
Extract the main idea and key takeaways from YouTube videos and turn them into Airtable content ideas

## Description

Automatically turn YouTube videos into clear, structured content ideas stored in Airtable. This workflow pulls new video links from Airtable, extracts transcripts using a RapidAPI service, summarizes them with your favourite LLM, and logs the main idea and key takeaways, keeping your content pipeline fresh with minimal effort.

## What It Does

- Scans Airtable for new YouTube video links every 5 minutes.
- Extracts the transcript of the video using a third-party API via RapidAPI.
- Summarizes the content to generate a main idea and takeaways.
- Updates the original Airtable entry with the insights and marks it as completed.

## Prerequisites

Before using this template, make sure you have:

- A RapidAPI account with access to the youtube-video-summarizer-gpt-ai API.
- A valid RapidAPI key.
- An OpenAI, Claude, or Gemini account connected to n8n.
- An Airtable account with a base and table ready.

## Setup Instructions

1. Clone this template into your n8n workspace.
2. Open the Get YouTube Sources node and configure your Airtable credentials.
3. In the Get video transcript node, enter your X-RapidAPI-Key under headers. The API endpoint is pre-configured.
4. Connect your LLM credentials to the Extract detailed summary node.
5. (Optional) Adjust the summarization prompt in the LangChain node to better suit your tone.
6. Set your preferred schedule in the Trigger node.
## Airtable Setup

Create a base (e.g., Content Hub) with a table named Ideas and the following columns:

| Column Name | Type | Required | Notes |
|-------------|------|----------|-------|
| Type | Single select | Yes | Must be set to Youtube Video |
| Source | URL | Yes | The YouTube video URL |
| Status | Checkbox | No | Leave empty initially; updated after processing |
| MainIdea | Single line text | No | Summary generated by the LLM |
| Key Takeaways | Long text | No | List of takeaways extracted from the transcript |

Activate the workflow, and you're done!
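Transcript APIs typically want the video's ID rather than the full URL stored in the Source column. Extracting it from the common YouTube URL shapes can be sketched like this (an illustrative helper, not part of the template itself):

```python
from urllib.parse import urlparse, parse_qs

def youtube_video_id(url: str):
    """Extract the 11-character video ID from common YouTube URL formats,
    or return None if the URL is not recognised."""
    parsed = urlparse(url)
    if parsed.hostname == "youtu.be":
        return parsed.path.lstrip("/") or None
    if parsed.hostname in ("www.youtube.com", "youtube.com", "m.youtube.com"):
        if parsed.path == "/watch":
            return parse_qs(parsed.query).get("v", [None])[0]
        if parsed.path.startswith(("/shorts/", "/embed/")):
            return parsed.path.split("/")[2]
    return None
```

Returning `None` for unrecognised URLs lets the workflow mark bad rows instead of sending garbage to the transcript API.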
by Zain Ali
# Email Real-Time RAG Assistant with Gmail, OpenAI & PGVector

## Who's it for

This workflow is ideal for:

- Professionals
- Project managers
- Sales and support teams
- Anyone managing high volumes of Gmail messages

It enables fast and intelligent search through your email inbox using natural language queries.

## How it works / What it does

- Continuously monitors your Gmail inbox for new emails.
- Extracts email content and metadata (subject, body, sender, date).
- Converts email content into vector embeddings using OpenAI.
- Stores embeddings in a PostgreSQL database with PGVector.
- A conversational AI agent performs semantic search on your stored email history.
- Supports time-sensitive and context-aware responses via the OpenAI Chat model.

## How to set up

1. Connect your Gmail account to the Gmail Trigger node (with API access enabled).
2. Configure OpenAI credentials for the Embedding and Chat nodes.
3. Set up a PostgreSQL database with the PGVector extension enabled.
4. Import the workflow into your n8n instance (Cloud or Self-hosted).
5. Customize parameters like polling frequency, embedding settings, or vector query depth.

## Requirements

- n8n instance (Self-hosted or Cloud)
- Gmail account with API access
- OpenAI API key
- PostgreSQL database with the PGVector extension installed

## How to customize the workflow

- **Email Filtering**: change filters in the Gmail Trigger to watch specific labels or senders.
- **Text Splitting Granularity**: adjust chunkSize and chunkOverlap in the text splitter node.
- **Query Depth**: modify topK in the vector search node to retrieve more or fewer similar results.
- **Prompt Tuning**: customize the system message or agent instructions in the RAG node.
- **Workflow Extensions**: add notifications, error logging, Slack/Telegram alerts, or data exports.
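The chunkSize / chunkOverlap behaviour of the text splitter can be illustrated with a minimal character-based splitter. This is a sketch of the concept, not LangChain's actual implementation:

```python
def split_text(text: str, chunk_size: int = 200, chunk_overlap: int = 50) -> list:
    """Split text into fixed-size chunks; consecutive chunks share
    `chunk_overlap` characters so context isn't lost at chunk boundaries."""
    if chunk_overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk size")
    step = chunk_size - chunk_overlap
    # Stop once the remaining tail is already covered by the previous chunk
    return [text[i:i + chunk_size]
            for i in range(0, max(len(text) - chunk_overlap, 1), step)]
```

Larger overlap improves retrieval recall (a sentence cut in half still appears whole in one chunk) at the cost of more embeddings to compute and store.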
by Jimleuk
This n8n template shows you how to connect GitHub's free Models to your existing n8n AI workflows. While it is possible to access GitHub Models with HTTP nodes, the aim of this template is to use them with existing n8n LLM nodes, which saves the trouble of refactoring.

Please note, GitHub states their model APIs are not intended for production usage. If you need higher rate limits, you'll need to use a paid service.

## How it works

- The approach builds a custom OpenAI-compatible API around the GitHub Models API, all done in n8n.
- First, we attach an OpenAI subnode to our LLM node and configure a new OpenAI credential. Within this new credential, we change the "Base URL" to point at an n8n webhook we've prepared as part of this template.
- Next, we create 2 webhooks which the LLM node will now attempt to connect with: "models" and "chat completion".
- The "models" webhook simply calls the GitHub Models "list all models" endpoint and remaps the response to be compatible with our LLM node.
- The "chat completion" webhook does a similar task with GitHub's chat completion endpoint.

## How to use

Once connected, just open chat and ask away! Any LLM or AI agent node connected with this custom LLM subnode will send requests to the GitHub Models API, allowing you to try out a range of SOTA models for free.

## Requirements

- GitHub account and credentials for access to Models. If you've used the GitHub node previously, you can reuse this credential for this template.

## Customising this workflow

This template is just an example. Use the custom OpenAI credential for your other workflows to test GitHub models.

## References

- https://docs.github.com/en/github-models/prototyping-with-ai-models
- https://docs.github.com/en/github-models
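The remapping done by the "models" webhook is a pure transform: take GitHub's model catalog response and reshape it into the `{"object": "list", "data": [...]}` structure an OpenAI-compatible client expects. A sketch, with the GitHub-side field names (`name`, `publisher`) being assumptions to check against the actual catalog response:

```python
def to_openai_models(github_models: list) -> dict:
    """Reshape a GitHub Models catalog response into the OpenAI
    /v1/models list format expected by the n8n LLM node."""
    return {
        "object": "list",
        "data": [
            {
                "id": m["name"],  # model identifier (field name assumed)
                "object": "model",
                "owned_by": m.get("publisher", "unknown"),
            }
            for m in github_models
        ],
    }
```

Because the LLM node only ever sees the remapped shape, it behaves exactly as it would against the real OpenAI API, which is what makes the drop-in credential trick work.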
by Nurseflow
# LinkedIn Content Machine: AI-Powered Post Generator & Scheduler for X and LinkedIn

## How it works

This end-to-end workflow automates your personal or brand content strategy by:

- Using Google Gemini or OpenAI to generate engaging LinkedIn/X content from a title or trending posts.
- Posting directly to LinkedIn and X (formerly Twitter).
- Pulling high-performing LinkedIn posts to inspire new ideas.
- Saving AI-generated drafts to Google Sheets for review.
- Notifying your team on Slack when drafts are ready.

## Steps to set up

1. Add your API keys for Google Gemini or OpenAI.
2. Set up your LinkedIn, X (Twitter), Google Sheets, and Slack credentials.
3. Customize prompt logic or post filters if needed.
4. Schedule the idea generation module or trigger it manually.
5. Start generating and posting consistent, high-quality content with zero manual effort!
by Louis Chan
## How it works

Transform medical documents into structured data using Google Gemini AI with enterprise-grade accuracy.

- Classifies document types (receipts, prescriptions, lab reports, clinical notes)
- Extracts text with 95%+ accuracy using advanced OCR
- Structures data according to medical taxonomy standards
- Supports multiple languages (English, Chinese, auto-detect)
- Tracks processing costs and quality metrics automatically

## Set up steps

### Prerequisites

- Google Gemini API key (get one from Google AI Studio)

### Quick setup

1. Import this workflow template
2. Configure Google Gemini API credentials in n8n
3. Test with a sample medical document URL
4. Deploy your webhook endpoint

## Usage

Send a POST request to your webhook:

```json
{
  "image_url": "https://example.com/medical-receipt.jpg",
  "expected_type": "financial",
  "language_hint": "auto"
}
```

Get a structured response:

```json
{
  "success": true,
  "result": {
    "documentType": "financial",
    "metadata": {
      "providerName": "Dr. Smith Clinic",
      "createdDate": "2025-01-06",
      "currency": "USD"
    },
    "content": {
      "amount": 150.00,
      "services": [...]
    },
    "quality_metrics": {
      "overall_confidence": 0.95
    }
  }
}
```

## Use cases

### Healthcare Organizations

- Medical billing automation: process receipts and invoices automatically
- Insurance claim processing: extract data from claim documents
- Clinical documentation: digitize patient records and notes
- Data standardization: consistent structured output format

### System Integrators

- EMR integration: connect with existing healthcare systems
- Workflow automation: reduce manual data entry by 90%
- Multi-language support: handle international medical documents
- Quality assurance: built-in confidence scoring and validation

## Supported Document Types

- Financial: medical receipts, bills, insurance claims, invoices
- Clinical: medical charts, progress notes, consultation reports
- Prescription: prescriptions, medication lists, pharmacy records
- Administrative: referrals, authorizations, patient registration
- Diagnostic: lab reports, test results, screening reports
- Legal: medical certificates, documentation forms
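A caller-side sketch of the webhook's request contract: the field names come from the usage example above, while the validation rules (http(s) URLs only, the six document categories, a small language set) are illustrative assumptions:

```python
VALID_TYPES = {"financial", "clinical", "prescription",
               "administrative", "diagnostic", "legal"}

def validate_request(payload: dict) -> list:
    """Return a list of problems with a webhook payload; empty means OK."""
    errors = []
    url = payload.get("image_url", "")
    if not url.startswith(("http://", "https://")):
        errors.append("image_url must be an http(s) URL")
    if payload.get("expected_type") not in VALID_TYPES:
        errors.append("expected_type must be one of: " + ", ".join(sorted(VALID_TYPES)))
    # language_hint defaults to auto-detect when omitted (assumed behaviour)
    if payload.get("language_hint", "auto") not in {"auto", "en", "zh"}:
        errors.append("language_hint must be 'auto', 'en', or 'zh'")
    return errors
```

Validating before the Gemini call keeps malformed requests from consuming API quota and makes error responses predictable for integrators.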
by Nick Saraev
This workflow creates an end-to-end Instagram content pipeline that automatically discovers trending content from competitor channels, extracts valuable insights, and generates new high-quality scripts for your own content creation. The system helped scale an Instagram channel from 0 to 10,000 followers in just 15 days through intelligent content repurposing.

## Benefits

- **Complete Content Automation**: monitors competitor Instagram accounts, downloads new reels, and processes them without manual intervention
- **AI-Powered Script Generation**: uses ChatGPT and Perplexity to analyze content, identify tools/technologies, and rewrite scripts with fresh angles
- **Smart Duplicate Prevention**: automatically tracks processed content in a database to avoid redundant work
- **Multi-Platform Intelligence**: combines Instagram scraping, AI transcription, web research, and content generation in one seamless flow
- **Scalable Content Strategy**: process content from multiple niches and creators to fuel unlimited content ideas
- **Revenue-Focused Approach**: specifically designed to identify monetizable tools and technologies for business-focused content

## How It Works

**Instagram Content Discovery**

- Uses Apify's Instagram scraper to monitor specified creator accounts for new reels
- Automatically downloads video content and metadata from target accounts
- Filters content based on engagement metrics and relevance

**Intelligent Processing Pipeline**

- Transcribes video content using OpenAI Whisper for accurate text extraction
- Filters content using AI to identify tools, technologies, and automation opportunities
- Cross-references against the existing database to prevent duplicate processing

**Enhanced Research & Analysis**

- Searches Perplexity AI for additional insights about discovered tools
- Generates step-by-step usage guides and implementation instructions
- Identifies unique angles and opportunities for content improvement

**Script Generation & Optimization**

- Creates new, original scripts optimized for your specific audience
- Maintains consistent brand voice while adding fresh perspectives
- Includes strategic call-to-action elements for audience engagement

## Required Google Sheets Database Setup

Before running this workflow, create a Google Sheets database with these exact column headers.

**Essential Columns**

- id: unique Instagram post identifier (primary key for duplicate detection)
- timestamp: when the reel was posted
- caption: original reel caption text
- hashtags: hashtags used in the post
- videoUrl: direct link to download the video file
- username: account that posted the reel
- scrapedTranscript: original transcript from the video (added by the workflow)
- newTranscript: AI-generated script for your content (added by the workflow)

**Additional Tracking Columns**

- shortCode: Instagram's internal post code
- url: public Instagram post URL
- commentsCount: number of comments
- firstComment: top comment on the post
- likesCount: number of likes
- videoViewCount: view count metrics
- videoDuration: length of the video in seconds

**Setup Instructions**

1. Create a new Google Sheet with these column headers in the first row
2. Name the sheet "Reels"
3. Connect your Google Sheets OAuth credentials in n8n
4. Update the document ID in the workflow nodes

The merge logic relies on the id column to prevent duplicate processing, so this structure is essential for the workflow to function correctly.

## Business Use Cases

- Content Creators: scale content production by 10x while maintaining quality and originality
- Marketing Agencies: offer content research and ideation as a premium service
- Course Creators: identify trending tools and technologies for educational content

## Revenue Potential

This exact system can be sold as a service for $3,000-$5,000 to growing channels or agencies. The automation saves 10+ hours weekly of manual research and content planning.
- Difficulty Level: Intermediate
- Estimated Build Time: 1-2 hours
- Monthly Operating Cost: ~$30 (API usage)

## Watch the Complete Build Process

Want to see exactly how this system was built from scratch? Nick Saraev walks through the entire development process in this comprehensive tutorial, including all the debugging, dead ends, and problem-solving that goes into building real automation systems.

Watch: "The N8N Instagram Parasite System (10K Followers In 15 Days)". This 1.5-hour deep dive shows the actual build process: not a polished demo, but real system development with all the thinking and iteration included.

## Set Up Steps

**Configure Apify Integration**

1. Sign up for an Apify account and obtain an API key
2. Replace the bearer token in the "Run Actor Synchronously" node
3. Customize the username array with your target Instagram accounts

**Set Up AI Services**

1. Add OpenAI API credentials for transcription and content generation
2. Configure the Perplexity API for enhanced research capabilities
3. Set up appropriate rate limiting for cost control

**Database Configuration**

1. Create the Google Sheets database with the provided column structure
2. Connect Google Sheets OAuth credentials
3. Configure the merge logic for duplicate detection

**Content Filtering Setup**

1. Customize the AI prompts for your specific niche and requirements
2. Adjust the filtering criteria for tool/technology detection
3. Set up the script generation template to match your brand voice

**Automation Schedule**

1. Configure the schedule trigger for daily content monitoring
2. Set optimal timing based on your content creation workflow
3. Test the complete flow with a small number of accounts first

**Advanced Customization**

- Add additional content sources beyond Instagram
- Integrate with your existing content management systems
- Scale up monitoring to dozens of competitor accounts

## More AI Automation Systems

For more advanced automation tutorials and business systems, check out my YouTube channel, where I share proven automation strategies that generate real revenue.
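The duplicate-prevention merge on the id column can be sketched as a set lookup: new scraped reels are kept only if their id is neither in the sheet already nor earlier in the same batch (illustrative Python, not the exact n8n merge node):

```python
def filter_new_reels(scraped: list, existing_ids: set) -> list:
    """Keep only reels whose `id` has not been processed before,
    also dropping duplicates within the scraped batch itself."""
    seen = set(existing_ids)
    fresh = []
    for reel in scraped:
        if reel["id"] not in seen:
            seen.add(reel["id"])
            fresh.append(reel)
    return fresh
```

Deduplicating before transcription is what keeps Whisper and Perplexity costs flat as the monitored account list grows.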
by Jimleuk
This n8n template demonstrates how to build a simple Google Drive MCP server to search and get the contents of files from Google Drive.

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/gdrive

## How it works

- An MCP server trigger is used and connected to 1x Google Drive tool and 1x Custom Workflow tool.
- The Google Drive tool is set to perform a search on files within our Google Drive folder.
- The Custom Workflow tool downloads target files found in our drive and converts the binaries to their text representation. E.g. PDFs have only their text contents extracted and returned to the MCP client.

## How to use

This Google Drive MCP server allows any compatible MCP client to manage a personal or shared Google Drive. Simply select a drive or, for better control, specify a folder within the drive to scope the operations to.

Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop

Try the following queries in your MCP client:

- "Please help me search for last month's expense reports."
- "What does the company policy document say about cancellations and refunds?"

## Requirements

- Google Drive for documents.
- OpenAI for image and audio understanding.
- MCP client or agent, such as Claude Desktop: https://claude.ai/download

## Customising this workflow

- Add additional capabilities such as renaming, moving and/or deleting files.
- Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
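The binary-to-text conversion step is essentially a dispatch on the file's MIME type. The strategy names below are hypothetical labels for illustration; the actual workflow uses n8n's extraction nodes plus OpenAI for images and audio:

```python
def extractor_for(mime_type: str) -> str:
    """Map a Google Drive file's MIME type to the extraction strategy
    that turns its binary into text for the MCP client."""
    dispatch = {
        "application/pdf": "extract-pdf-text",
        "text/plain": "pass-through",
        "application/vnd.google-apps.document": "export-as-text",
        "image/png": "describe-with-vision-model",  # OpenAI handles images
        "audio/mpeg": "transcribe-with-speech-model",  # and audio
    }
    # Unknown binaries fall back to a generic text extraction attempt
    return dispatch.get(mime_type, "generic-binary-to-text")
```

Keeping this mapping in one place makes it easy to add the extra capabilities mentioned below (e.g. new document formats) without touching the rest of the workflow.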
by Gain FLow AI
## Overview

This workflow automates the process of sending personalized cold email sequences to your prospects. It fetches un-emailed leads from your Google Sheet, validates their email addresses, and then dispatches tailored emails according to a predefined schedule. It updates your CRM (Google Sheet) with the status of each sent email, ensuring your outreach efforts are tracked and efficient.

## Use Case

This workflow is perfect for:

- **Sales Teams**: automate the delivery of multi-stage cold email campaigns to a large volume of leads.
- **Business Development**: nurture prospects over time with a structured email sequence.
- **Recruiters**: send out introductory emails to potential candidates for open positions.
- **Marketers**: distribute personalized outreach for events, content, or product launches.
- **Anyone doing cold outreach**: ensure consistent follow-up and track email performance without manual effort.

## How It Works

- **Scheduled Trigger**: the workflow runs automatically at a defined interval (e.g., every 6 hours, as currently configured by the "Set Timer" node). This ensures regular outreach without manual intervention.
- **Fetch Unsent Emails**: the "Get Emails" node queries your Google Sheet to identify prospects who haven't yet received the current email in the sequence (i.e., "Email Sent " is "No").
- **Control Volume**: a "Limit" node can be used to control the number of emails sent in each batch, preventing you from sending too many emails at once and potentially hitting sending limits.
- **Loop Through Prospects**: the "Loop Over Items" node processes each selected prospect individually.
- **Email Validation (Conditional Send)**: an "If" node checks whether the prospect's "Email Address" is valid and exists. This prevents sending emails to invalid addresses, improving deliverability.
- **Send Email**: for valid email addresses, the "Send Email" node dispatches the personalized email to the prospect. It retrieves the recipient's email, subject, and body from your Google Sheet.
"connect" Node: (Note: The provided JSON uses a generic emailSend node named "connect" that links to an SMTP credential. This represents the actual email sending mechanism, whether it's Gmail or a custom SMTP server.) Update CRM: After successfully sending an email, the "Update Records" node updates your Google Sheet. It marks the "Email Sent " column as "Yes" and records the "Sent on" timestamp and a "Message Id" for tracking. Delay Between Sends: A "Wait" node introduces a delay between sending emails to individual prospects. This helps mimic human sending behavior and can improve deliverability. How to Set It Up To set up your Automated Cold Email Sender, follow these steps: Google Sheet Setup: Duplicate the Provided Template: Make a copy of the Google Sheet Template (1TjXelyGPg5G8lbPDI9_XOReTzmU1o52z2R3v8dYaoQM) into your own Google Drive. This sheet should contain columns for "Name", "Email Address ", "Sender Email", "Email Subject", "Email Body", "Email Sent ", "Sent on", and "Message Id". Connect Google Sheets: Ensure your Google Sheets OAuth2 API credentials are set up in n8n and linked to the "Get Emails" and "Update Records" nodes. Update Sheet IDs: In both "Get Emails" and "Update Records" nodes, update the documentId with the ID of your copied template. Email Sending Service Credentials: Gmail: If using Gmail, ensure your Gmail OAuth2 credentials are configured and connected to the "Send Email" node (or the "connect" node, if that's your chosen sender). Other Email Services (SMTP): If you use a different email service, you'll need to set up an SMTP credential in n8n and connect it to the "connect" node. Refer to the "Sticky Note4" for guidance on non-Google email services. Configure Timer: In the "Set Timer" node, adjust the hoursInterval or other time settings to define how frequently you want the email sending process to run (e.g., every 6 hours, once a day, etc.). 
Control Volume (Optional): In the "Limit" node, you can set the maxItems to control how many emails are processed and sent in each batch. This is useful for managing email sending limits or gradual outreach. Import the Workflow: Import the provided workflow JSON into your n8n instance. Populate Your Sheet: Fill your copied Google Sheet with prospect data, including the email subject and body for each email you wish to send. Ensure the "Email Sent " column is initially "No". Activate and Monitor: Activate the workflow. It will begin fetching and sending emails based on your configured schedule. Monitor your Google Sheet to track the "Email Sent " status. This workflow provides a robust and automated solution for managing your cold email campaigns, saving you time and increasing your outreach efficiency.
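The fetch-validate-send-update loop described above can be sketched in plain Python. This is a hypothetical illustration only, not the n8n implementation: `run_batch`, `send_fn`, and the regex check are stand-ins for the "Limit", "If", "Send Email", "Update Records", and "Wait" nodes; the column names (including their trailing spaces) mirror the template's sheet.

```python
import re
import time

# Rough stand-in for the "If" node's email validation (illustrative, not
# the exact check the workflow performs).
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid_email(address):
    return bool(address) and EMAIL_RE.match(address) is not None

def run_batch(rows, send_fn, max_items=10, delay_seconds=0):
    """Process pending rows the way the Limit -> Loop -> If -> Send ->
    Update chain does: cap the batch, skip invalid addresses, send,
    then mark the row as sent."""
    pending = [r for r in rows if r.get("Email Sent ") == "No"]
    for row in pending[:max_items]:              # "Limit" node
        addr = row.get("Email Address ", "")
        if not is_valid_email(addr):             # "If" node
            continue
        message_id = send_fn(addr, row["Email Subject"], row["Email Body"])
        row["Email Sent "] = "Yes"               # "Update Records" node
        row["Message Id"] = message_id
        time.sleep(delay_seconds)                # "Wait" node
    return rows
```

In n8n the equivalent state lives in the Google Sheet rather than in memory, but the control flow, including skipping invalid rows instead of failing the whole batch, is the same idea.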
by HoangSP
# Medical Q&A Chatbot for Urology using RAG with Pinecone and GPT-4o

This template provides an AI-powered Q&A assistant for the Urology domain using Retrieval-Augmented Generation (RAG). It uses Pinecone for vector search and GPT-4o for conversational responses.

## Use Case

This chatbot is designed for clinics or medical pages that want to automate question answering for Urology-related conditions. It uses a vector store of domain knowledge to return verified responses.

## Requirements

- OpenAI API key (GPT-4o or GPT-4o-mini)
- Pinecone account with an active index
- Verified Urology documents embedded into Pinecone

## Setup Instructions

1. Create a Pinecone vector index and connect it using the Pinecone credentials node.
2. Upload Urology-related documents to embed using the "Create Embeddings for Urology Docs" node.
3. Customize the chatbot system message to reflect your medical specialty.
4. Deploy this chatbot on your website or link it with Telegram via the chat trigger node.

## Components

- **chatTrigger**: Listens for user messages and starts the workflow.
- **Medical AI Agent**: GPT-based agent guided by domain-specific instructions.
- **RAG Tool Vector Store**: Fetches relevant documents from Pinecone using vector search.
- **Memory Buffer**: Maintains conversation context.
- **Create Embeddings for Urology Docs**: Encodes documents into vector format.

## Customization

You can replace the knowledge base with any other medical domain by:

- Updating the documents stored in Pinecone.
- Modifying the system prompt in the AI Agent node.

## CTA

This chatbot is ideal for clinics, medical consultants, or educational websites that want a reliable AI assistant in Urology.
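The retrieval step behind the "RAG Tool Vector Store" node can be illustrated with a toy example. Note the assumptions: a real deployment uses Pinecone's approximate-nearest-neighbor index and OpenAI embeddings over your Urology documents; here the vectors are tiny hand-made lists so the ranking logic is runnable on its own.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, index, top_k=2):
    """Return the texts of the top_k documents most similar to the query
    vector -- the same ranking a Pinecone query performs at scale."""
    scored = sorted(index, key=lambda d: cosine(query_vec, d["vector"]), reverse=True)
    return [d["text"] for d in scored[:top_k]]

# Hypothetical 3-dimensional "embeddings"; real embedding vectors have
# hundreds or thousands of dimensions.
index = [
    {"text": "Kidney stones: symptoms and treatment", "vector": [0.9, 0.1, 0.0]},
    {"text": "UTI prevention guidelines",             "vector": [0.1, 0.9, 0.0]},
    {"text": "Prostate health overview",              "vector": [0.0, 0.2, 0.9]},
]
```

A query vector close to the first document (e.g., `[0.8, 0.2, 0.1]`) ranks "Kidney stones: symptoms and treatment" first; the agent then passes the retrieved text to GPT-4o as grounding context, which is what makes the responses "verified" rather than free-form generation.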
by Airtop
# Recursive Web Scraping

## Use Case

Automating web scraping with recursive depth is ideal for collecting content across multiple linked pages, perfect for content aggregation, lead generation, or research projects.

## What This Automation Does

This automation reads a list of URLs from a Google Sheet, scrapes each page, stores the content in a document, and adds newly discovered links back to the sheet. It repeats this process for a number of iterations determined by the configured scraping depth.

### Input Parameters

- **Seed URL**: The starting URL for the scraping process. Example: `https://example.com/`
- **Links must contain**: Restricts followed links to those that contain this string. Example: `https://example.com/`
- **Depth**: The number of iterations (layers of links) to scrape beyond the initial set. Example: `3`

## How It Works

1. Starts by reading the Seed URL from the Google Sheet.
2. Scrapes each page and saves its content to the specified document.
3. Extracts new links from each page that match the "Links must contain" string and appends them to the Google Sheet.
4. Repeats steps 2–3 the number of times specified by `Depth - 1`.

## Setup Requirements

- Airtop API key (free to generate).
- Credentials set up for Google Docs (requires creating a project in the Google Cloud Console). Read how to.
- Credentials set up for Google Sheets.

## Next Steps

- **Add Filtering Rules**: Filter which links to follow based on domain, path, or content type.
- **Combine with a Scheduler**: Run this automation on a schedule to continuously explore newly discovered pages.
- **Export Structured Data**: Extend the process to store extracted data in a CSV or database for analysis.

Read more about website scraping for LLMs.
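The depth-limited loop described above can be sketched as a breadth-first expansion. This is an illustrative assumption of the control flow, not Airtop's implementation: a dictionary stands in for the web, `get_links` replaces the real page-scraping step, and the sheet that stores discovered URLs is modeled as an in-memory set.

```python
def crawl(seed_url, get_links, must_contain, depth):
    """Breadth-first expansion: the seed counts as layer 0, then `depth`
    further layers of matching links are followed. `get_links(url)` is a
    stand-in for scraping a page and extracting its links."""
    visited = set()
    frontier = [seed_url]
    for _ in range(depth + 1):
        next_frontier = []
        for url in frontier:
            if url in visited:
                continue
            visited.add(url)
            # Keep only links containing the filter string, mirroring
            # the "Links must contain" parameter.
            next_frontier.extend(
                link for link in get_links(url)
                if must_contain in link and link not in visited
            )
        frontier = next_frontier
    return visited

# Hypothetical link graph standing in for live pages.
fake_web = {
    "https://example.com/": ["https://example.com/a", "https://other.com/x"],
    "https://example.com/a": ["https://example.com/b"],
    "https://example.com/b": [],
}
pages = crawl("https://example.com/", lambda u: fake_web.get(u, []),
              "https://example.com/", depth=1)
```

With `depth=1`, the crawl visits the seed plus one layer of matching links, and `https://other.com/x` is excluded by the filter; raising `depth` pulls in successive layers, which is how the automation's Depth parameter bounds the recursion.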