by David Roberts
OpenAI Assistant is a powerful tool, but at the time of writing it doesn't automatically remember past messages from a conversation. This workflow demonstrates how to get around this by managing the chat history in n8n and passing it to the assistant when required. This makes it possible to use OpenAI Assistant for chatbot use cases. Note that to use this template, you need to be on n8n version 1.28.0 or later.
by Robert Breen
This guide walks you through building an intelligent AI Agent in n8n that routes tasks to the appropriate sub-agent using the @n8n/n8n-nodes-langchain agent framework. You'll create a Manager Agent that evaluates user input and delegates it to either an Email Agent or a Data Agent, each with its own role, memory, and OpenAI model. This is perfect for use cases where you want a single entry point but intelligent branching behind the scenes.

🔧 Step 1: Set Up the Manager Agent

Start by dragging in an Agent node and naming it something like ManagerAgent. This agent acts as the "brain" of your system, analyzing the user's input and determining whether it should be handled by the email-writing sub-agent or the data-summary sub-agent. Open the node's settings and paste the following into the System Message:

> You are an AI Manager that delegates tasks to specialized agents. Your job is to analyze the user's message and decide whether it requires: an EmailAgent for writing outreach, follow-up, or templated emails, or a DataAgent for tasks involving data summaries, metrics, or analysis. Send the instructions to the sub-agents.

This instruction gives the Manager Agent clarity on which roles exist and which types of tasks belong to each one.

🧠 Step 2: Add Memory to the Manager Agent

Drag in a Memory (Buffer Window) node and label it Manager Memory. Connect it to the ai_memory input of the Manager Agent. This ensures the agent can remember recent inputs and outputs from the user and sub-agents during the conversation. No extra configuration is needed in this memory node; just connect it to the agent.

🔌 Step 3: Connect a Language Model to the Manager Agent

Next, add a Language Model node and choose OpenAI Chat Model. Select a model like gpt-4o-mini or gpt-4, depending on what you have access to. Under Credentials, connect your OpenAI API key. If you haven't created this credential yet, click "OpenAI API" under Credentials and choose "Create New".
Paste your OpenAI API key (found at https://platform.openai.com/account/api-keys), save it, and return to the workflow. Once the model is set, connect it to the ai_languageModel input of the Manager Agent.

✉️ Step 4: Create the Email Agent Tool

Now you'll create a specialized sub-agent that only writes emails. Add an Agent Tool node and call it EmailAgent. In the tool's settings, describe its job clearly. For example: "Writes professional, friendly, or action-oriented emails based on instructions." Then scroll down to the System Message section and enter the following:

> You are a professional Email Writing Assistant. You write polished, effective emails for tasks such as outreach, follow-ups, and client communication. Follow the instruction provided exactly and return only the email content. Use a warm, business-appropriate tone.

For the text input field, use the expression:

{{ $fromAI('Prompt__User_Message_', ``, 'string') }}

This allows the Email Agent to receive exactly what the Manager Agent wants it to handle. Add another Memory node and link it to this tool to help it maintain short-term context. Then add a second Language Model node, configured just like the first one (you can even clone it), and connect it to the EmailAgent. Finally, connect this entire EmailAgent setup back to the ManagerAgent by attaching it to its ai_tool input.

📊 Step 5: Create the Data Agent Tool

Repeat the same steps, but this time for data summaries and analysis. Add another Agent Tool node and name it DataAgent. In the Tool Description, write something like: "Responds to instructions requiring metrics, summaries, or data analysis explanations." For its input text field, you can use:

{{ $json.query }}

If desired, provide a system message that gives the agent more detailed instruction on how to behave:

> You are a helpful Data Analyst. Summarize trends, explain metrics, and break down data clearly based on user instructions.
As with the EmailAgent, you'll also need:

- A dedicated Memory node
- A dedicated Language Model node
- A connection to the ai_tool input of the Manager Agent

Now the Manager Agent has two tools it can delegate to: one for communication and one for insights.

🧪 Step 6: Test Your AI Agent System

Deploy the workflow and start testing by sending prompts like:

> "Write a cold outreach email to a software company."

The ManagerAgent should route that to the EmailAgent. Then try:

> "Summarize how our lead volume changed last month."

The DataAgent should receive that task. If routing isn't working as expected, double-check your system messages and input bindings in each agent tool.

✅ You're Done!

You now have a modular, multi-agent AI system powered by n8n. The Manager Agent delegates intelligently, each sub-agent is optimized for its role, and all of them benefit from context memory. For more advanced setups, you can chain tools, add additional memory types, or use retrieval (RAG) tools for external document support.
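The delegation decision above is made by the LLM, but the branching behavior can be illustrated with a simplified, keyword-based sketch. The keyword lists here are assumptions for demonstration only; the real workflow relies on the Manager Agent's system message, not keywords.

```javascript
// Rule-based illustration of the ManagerAgent's routing decision.
// The keyword lists are hypothetical, for demonstration only.
function routeTask(userMessage) {
  const text = userMessage.toLowerCase();
  const emailHints = ["email", "outreach", "follow-up", "follow up"];
  const dataHints = ["summarize", "summary", "metric", "data", "analysis"];

  if (emailHints.some((hint) => text.includes(hint))) return "EmailAgent";
  if (dataHints.some((hint) => text.includes(hint))) return "DataAgent";
  return "ManagerAgent"; // no match: the manager answers directly
}

console.log(routeTask("Write a cold outreach email to a software company")); // EmailAgent
console.log(routeTask("Summarize how our lead volume changed last month")); // DataAgent
```

Testing with the two prompts from Step 6 shows each one landing on the expected sub-agent.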
by Robert Breen
A step-by-step demo that shows how to pull your Outlook calendar events for the week and ask GPT-4o to write a short summary. Along the way you'll practice basic data-transform nodes (Code, Filter, Aggregate) and see where to attach the required API credentials.

1️⃣ Manual Trigger — Run Workflow

| Why | Lets you click "Execute" in the n8n editor so you can test each change. |
| --- | --- |

2️⃣ Get Outlook Events — Get many events

Node type: Microsoft Outlook → Event → Get All
Fields selected: subject, start

API setup (inside this node):
- Click Credentials ▸ Microsoft Outlook OAuth2 API
- If you haven't connected before: choose "Microsoft Outlook OAuth2 API" → "Create New", sign in, and grant the Calendars.Read permission.
- Save the credential (e.g., "Microsoft Outlook account").

Output: A list of events with the raw ISO start time.

> Teaching moment: Outlook returns a full dateTime string. We'll normalize it next so it's easy to filter.

3️⃣ Normalize Dates — Convert to Date Format

```javascript
// Code node contents
return $input.all().map(item => {
  const startDateTime = new Date(item.json.start.dateTime);
  const formattedDate = startDateTime.toISOString().split('T')[0]; // YYYY-MM-DD
  return {
    json: {
      ...item.json,
      startDateFormatted: formattedDate
    }
  };
});
```

4️⃣ Filter the Events Down to This Week

After we've normalized the start date-time into a simple YYYY-MM-DD string, we drop in a Filter node. Add one rule for every day you want to keep, for example 2025-08-07 or 2025-08-08. Rows that match any of those dates continue through the workflow; everything else is quietly discarded. Why we're doing this: we only want to summarize tomorrow's and the following day's meetings, not the entire calendar.

5️⃣ Roll All Subjects Into a Single Item

Next comes an Aggregate node. Tell it to aggregate the subject field and choose the option "Only aggregated fields." The result is one clean item whose subject property is now a tidy list of every meeting title.
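The Filter node in step 4 uses hardcoded dates, which means editing the workflow every week. As a sketch, the same effect can be computed dynamically in a Code node instead, assuming each item carries the startDateFormatted (YYYY-MM-DD) field added in step 3:

```javascript
// Keep only events whose start date falls in the current Monday-to-Sunday week.
// The week definition (Monday start, UTC) is an assumption, not from the workflow.
function isInCurrentWeek(dateString, today = new Date()) {
  const date = new Date(dateString + "T00:00:00Z");
  const day = today.getUTCDay(); // 0 = Sunday … 6 = Saturday
  const monday = new Date(Date.UTC(
    today.getUTCFullYear(), today.getUTCMonth(),
    today.getUTCDate() - ((day + 6) % 7) // back up to the most recent Monday
  ));
  const nextMonday = new Date(monday);
  nextMonday.setUTCDate(monday.getUTCDate() + 7);
  return date >= monday && date < nextMonday;
}

// In an n8n Code node this would be:
// return $input.all().filter(item => isInCurrentWeek(item.json.startDateFormatted));
```

This replaces the per-day Filter rules with one check that never goes stale.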
It's far easier (and cheaper) to pass one prompt to GPT than dozens of small ones.

6️⃣ Turn That List Into Plain Text

Insert a small Code node right after the aggregation:

```javascript
return [{
  json: {
    text: items
      .map(item => JSON.stringify(item.json))
      .join('\n')
  }
}];
```

Need a Hand?

I'm always happy to chat automation, n8n, or Outlook API quirks.

Robert Breen – Automation Consultant & n8n Instructor
📧 robert@ynteractive.com | LinkedIn
by Jitesh Dugar
Streamline client onboarding and project setup from hours to minutes with AI-driven automation. This intelligent workflow eliminates manual coordination, builds proposals, creates projects in Asana, welcomes clients on Slack, and logs everything, ensuring 90% faster onboarding and zero dropped steps.

What This Workflow Does

Transforms your client onboarding from scattered tools and emails into one seamless automation:

- 📝 Capture Client Details – JotForm intake form collects client, company, and project information.
- 🧠 AI-Powered Analysis – LangChain AI Agent analyzes the project scope, estimates effort, and recommends team composition.
- 📄 Generate Proposal – Automatically builds a professional HTML proposal summarizing goals, timeline, and estimated hours.
- 🗂️ Create Asana Project – Generates a new project with all key details, milestones, and assigned team members.
- 💬 Slack Collaboration – Creates a dedicated Slack channel, sends welcome messages, and introduces the project team.
- 📧 Welcome Email – Sends a personalized onboarding email to the client with project summary and next steps.
- 💼 CRM Sync – Creates or updates a HubSpot contact with complete project and client information.
- 📊 Audit Logging – Logs all onboarding activity to Google Sheets for centralized record-keeping.

Key Features

- 🤖 AI Proposal Generation – Uses LangChain AI to generate smart project summaries and resource plans.
- ⚙️ End-to-End Automation – From form submission to project creation, communication, and CRM logging.
- 💬 Smart Slack Setup – Automatic channel creation and messaging for internal coordination.
- 📧 Personalized Client Emails – Beautifully formatted, professional onboarding emails.
- 🗂️ Asana Integration – Project creation with dynamic task templates and priorities.
- 📊 Google Sheets Logging – Instant audit trail for every client submission and generated proposal.
- 💼 CRM Integration – Automatically syncs client data with HubSpot for sales and account tracking.
Perfect For

- 🚀 Agencies & Service Providers – Automate client onboarding, proposal creation, and task setup.
- 🏢 Consultancies – Quickly turn client requests into structured projects with assigned resources.
- 💻 Freelancers & Creators – Impress clients with AI-built proposals and instant communication.
- 📈 Growing Teams – Scale onboarding without extra admin or coordination time.
- 🧠 Operations Teams – Ensure consistency and transparency across all onboarding activities.

What You'll Need

Required Integrations

- 🧾 JotForm – Client intake form (project details, budget, company info). Create your form for free on JotForm using this link
- 🤖 AI Agent – For analyzing project scope and building proposals.
- 🗂️ Asana – Project creation and task assignment.
- 💬 Slack – For automated client channel creation and internal communication.
- 📧 Gmail – For onboarding and proposal emails.
- 💼 HubSpot – CRM contact creation and project linkage.
- 📊 Google Sheets – For logging all onboarding and AI results.

Optional Enhancements

- 📄 PDF Generation (PDF Munk) – Convert AI-generated proposals into downloadable PDFs.
- 💬 Slack Interactive Approvals – Add buttons for internal review before client communication.
- 📈 Performance Dashboard – Connect Google Sheets data to Looker Studio for tracking onboarding times.
- 🌍 Multilingual Support – Add translation nodes for international clients.
- 🔐 File Attachments – Send proposal PDFs or onboarding kits automatically via Gmail.

Quick Start

1️⃣ Import Template – Copy and import the JSON file into your n8n workspace.
2️⃣ Set Up JotForm – Create a form with fields for client name, company, project name, budget, and requirements.
3️⃣ Add Credentials – Connect JotForm, AI Agent, Asana, Slack, Gmail, HubSpot, and Google Sheets.
4️⃣ Configure Sheet ID – Replace YOUR_SHEET_ID in the Google Sheets node.
5️⃣ Customize Proposal HTML – Edit the AI prompt and branding to reflect your company's style.
6️⃣ Test Workflow – Submit a test form and verify Slack, Asana, Gmail, and Sheets outputs.
7️⃣ Deploy – Activate the workflow and share the JotForm link with your sales or operations team.

Customization Options

1️⃣ Proposal Branding – Customize proposal HTML with logos, brand colors, and formatting.
2️⃣ AI Prompt Tuning – Refine the LangChain AI prompt to match your tone or project style.
3️⃣ Task Templates – Adjust task names, assignees, and due dates in the Asana creation node.
4️⃣ Slack Messaging – Update welcome message formatting and team introduction details.
5️⃣ CRM Fields – Map additional HubSpot properties for better data tracking.
6️⃣ Sheet Logging – Add more columns for tracking team recommendations or proposal scores.

Expected Results

- ⚡ 90% Faster Onboarding – Reduce manual setup from hours to minutes.
- 🤖 AI Precision – Intelligent proposals and team allocations that impress clients instantly.
- 📈 Zero Missed Steps – Every project automatically created, communicated, and logged.
- 💬 Seamless Collaboration – Slack, Gmail, and Asana in perfect sync.
- 🗂️ Complete Transparency – Every onboarding step logged for accountability and improvement.

🏆 Use Cases

- 🧑💼 Marketing & Creative Agencies – Automate creative project scoping and proposal creation.
- 💻 Software Development Teams – Rapidly assess client tech requirements and allocate developers.
- 🧾 Consulting Firms – Build data-backed, AI-enhanced proposals for client engagements.
- 🏢 Corporate PMOs – Standardize project setup and approvals across multiple departments.

Pro Tips

- 💡 Refine AI Prompt – Include examples of past projects to improve proposal quality.
- 💬 Add Slack Approvals – Insert "manager approval" logic before sending proposals.
- 📄 Attach PDFs – Use PDF Munk for branded, downloadable proposals.
- 📊 Track Conversion – Link HubSpot deal stage changes based on Asana progress.
- 📅 Monitor Efficiency – Use Sheet timestamps to calculate average onboarding time.
Learning Resources

This workflow demonstrates:

- AI integration using Agents
- Multi-app orchestration and data syncing
- Advanced HTML and email template customization
- Real-world Asana and Slack API usage
- CRM syncing and Google Sheets logging
- Modular, scalable n8n workflow design

Workflow Structure Visualization

📝 JotForm Submission
↓
🧠 AI Project Analysis (Agent)
↓
📄 Proposal Generation (HTML)
↓
🗂️ Asana Project Creation
↓
💬 Slack Channel Setup & Message
↓
📧 Gmail Welcome Email
↓
💼 HubSpot Contact Creation
↓
📊 Google Sheets Log

Ready to Revolutionize Client Onboarding?

Import this template today and let AI handle the heavy lifting. Your team saves hours, your clients get instant engagement, and your entire process runs flawlessly. ✨
by Fenngbrotalk
n8n Workflow: AI-Powered Stock Chart Analysis Bot for Telegram

This is a powerful n8n automation workflow that integrates a Telegram bot with OpenAI's multimodal large language model (GPT-4 Vision) to provide users with real-time stock chart analysis.

Workflow Breakdown

- **Receive Image:** The workflow is initiated by a Telegram Trigger. It activates whenever a user sends an image (e.g., a stock's candlestick chart) to a designated Telegram chat, automatically downloading the file.
- **Image Pre-processing:** To optimize the AI's performance and efficiency, the Edit Image node resizes the incoming image to a standard 512x512 pixel format.
- **AI Vision Analysis:** The processed image is then passed to a LangChain Chain, which utilizes the OpenAI GPT-4 Vision model. A sophisticated system prompt instructs the AI to act as a professional stock analyst.
- **Intelligent Interpretation:** The AI analyzes the image to identify the stock's name, price trend (uptrend, downtrend, or sideways), key support/resistance levels, and volume changes. It then generates a comprehensive analysis report combining technical indicators and market sentiment.
- **Structured Output:** To ensure reliability and consistency, the AI's output is parsed into a specific JSON format. This structure includes a search_word (for the industry/sector) and the main content (the analysis text).
- **Send Response:** Finally, the workflow extracts the content field from the JSON output and uses the Telegram node to send this professional analysis back to the user as a text message in the same chat.

Key Features

- **User-Friendly:** Users simply send an image to get an analysis, requiring no complex commands.
- **Instant & Efficient:** The entire analysis and response process is fully automated and completed within moments.
- **Professional-Grade Analysis:** Leverages the advanced image recognition and reasoning capabilities of GPT-4 Vision to deliver insights comparable to those of a human analyst.
- **Reliable & Consistent:** The use of structured output ensures that the format of the response is always consistent and easy to read or process further.
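The structured-output step described above can be made more robust with a small validation pass before the Telegram node sends the reply. A sketch, using the field names from the description (search_word, content); the fallback message is an assumption:

```javascript
// Validate the AI's structured JSON output before sending it to Telegram.
// If parsing or validation fails, return a safe fallback instead of crashing.
function parseAnalysis(raw) {
  try {
    const parsed = JSON.parse(raw);
    if (typeof parsed.search_word === "string" && typeof parsed.content === "string") {
      return parsed;
    }
  } catch (err) {
    // malformed JSON: fall through to the fallback below
  }
  return {
    search_word: "",
    content: "Sorry, the chart could not be analyzed. Please try another image.",
  };
}

const ok = parseAnalysis('{"search_word":"semiconductors","content":"Uptrend with support at 120."}');
console.log(ok.content); // the text the Telegram node would send back
```

This keeps the bot's replies consistent even when the model occasionally returns malformed JSON.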
by Hemanth Arety
Generate AEO strategy from brand input using AI competitor analysis

This workflow automatically creates a comprehensive Answer Engine Optimization (AEO) strategy by identifying your top competitors, analyzing their positioning, and generating custom recommendations to help your brand rank in AI-powered search engines like ChatGPT, Perplexity, and Google SGE.

Who it's for

This template is perfect for:

- **Digital marketing agencies** offering AEO services to clients
- **In-house marketers** optimizing content for AI search engines
- **Brand strategists** analyzing competitive positioning
- **Content teams** creating AI-optimized content strategies
- **SEO professionals** expanding into Answer Engine Optimization

What it does

The workflow automates the entire AEO research and strategy process in 6 steps:

1. Collects brand information via a user-friendly web form (brand name, website, niche, product type, email)
2. Identifies top 3 competitors using Google Gemini AI based on product overlap, market position, digital presence, and geographic factors
3. Scrapes the target brand website with Firecrawl to extract value propositions, features, and content themes
4. Scrapes competitor websites in parallel to gather competitive intelligence
5. Generates a comprehensive AEO strategy using OpenAI GPT-4 with 15+ actionable recommendations
6. Delivers a formatted report via email with executive summary, competitive analysis, and implementation roadmap

The entire process runs automatically and takes approximately 5-7 minutes to complete.
How to set up

Requirements

You'll need API credentials for:

- **Google Gemini API** (for competitor analysis) - Get API key
- **OpenAI API** (for strategy generation) - Get API key
- **Firecrawl API** (for web scraping) - Get API key
- **Gmail account** (for email delivery) - Use OAuth2 authentication

Setup Steps

1. Import the workflow into your n8n instance
2. Configure credentials:
   - Add your Google Gemini API key to the "Google Gemini Chat Model" node
   - Add your OpenAI API key to the "OpenAI Chat Model" node
   - Add your Firecrawl API key as HTTP Header Auth credentials
   - Connect your Gmail account using OAuth2
3. Activate the workflow and copy the form webhook URL
4. Test the workflow by submitting a real brand through the form
5. Check your email for the generated AEO strategy report

Credentials Setup Tips

- For Firecrawl: create HTTP Header Auth credentials with header name Authorization and value Bearer YOUR_API_KEY
- For Gmail: use OAuth2 to avoid authentication issues with 2FA
- Test each API credential individually before running the full workflow

How it works

Competitor Identification

The Google Gemini AI agent analyzes your brand based on 4 weighted criteria: product/service overlap (40%), market position (30%), digital presence (20%), and geographic overlap (10%). It returns structured JSON data with competitor names, URLs, overlap percentages, and detailed reasoning.

Web Scraping

Firecrawl extracts structured data from websites using custom schemas. For each site, it captures: company name, products/services, value proposition, target audience, key features, pricing info, and content themes. This runs asynchronously with 60-second waits to allow for complete extraction.
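The 40/30/20/10 weighting used for competitor identification can be sketched as a simple weighted sum. In the workflow the per-criterion scores come from the Gemini agent; the numbers below are illustrative only:

```javascript
// Weighted competitor-ranking sketch matching the 40/30/20/10 criteria above.
const WEIGHTS = {
  productOverlap: 0.4,
  marketPosition: 0.3,
  digitalPresence: 0.2,
  geographicOverlap: 0.1,
};

// scores: per-criterion values in [0, 1]; missing criteria count as 0.
function competitorScore(scores) {
  return Object.entries(WEIGHTS).reduce(
    (total, [criterion, weight]) => total + weight * (scores[criterion] ?? 0),
    0
  );
}

const score = competitorScore({
  productOverlap: 0.9,
  marketPosition: 0.7,
  digitalPresence: 0.5,
  geographicOverlap: 1.0,
});
console.log(score.toFixed(2)); // "0.77"
```

Sorting candidates by this score and keeping the top 3 mirrors what the prompt asks the agent to do.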
Strategy Generation

OpenAI GPT-4 analyzes the combined brand and competitor data to generate a comprehensive report including: executive summary, competitive analysis, 15+ specific AEO tactics across 4 categories (content optimization, structural improvements, authority building, answer engine targeting), a content priority matrix with 10 ranked topics, and a detailed implementation roadmap.

Email Delivery

The strategy is formatted as a professional HTML email with clear sections, visual hierarchy, and actionable next steps. Recipients get an immediately implementable roadmap for improving their AEO performance.

How to customize the workflow

Change AI Models

- **Replace Google Gemini** with Claude, GPT-4, or another LLM in the competitor analysis node
- **Replace OpenAI** with Anthropic Claude or Google Gemini in the strategy generation node
- Both use LangChain agent nodes, making model swapping straightforward

Modify Competitor Analysis

- **Find more competitors**: Edit the AI prompt to request 5 or 10 competitors instead of 3
- **Add filtering criteria**: Include factors like company size, funding stage, or geographic focus
- **Change ranking weights**: Adjust the 40/30/20/10 weighting in the prompt

Enhance Data Collection

- **Add social media scraping**: Include LinkedIn, Twitter/X, or Facebook page analysis
- **Pull review data**: Integrate G2, Capterra, or Trustpilot APIs for customer sentiment
- **Include traffic data**: Add SimilarWeb or Semrush API calls for competitive metrics

Change Output Format

- **Export to Google Docs**: Replace Gmail with a Google Docs node to create shareable documents
- **Send to Slack/Discord**: Post strategy summaries to team channels for collaboration
- **Save to database**: Store results in Airtable, PostgreSQL, or MongoDB for tracking
- **Create presentations**: Generate PowerPoint slides using automation tools

Add More Features

- **Schedule periodic analysis**: Run monthly competitive audits for specific brands
- **A/B test strategies**: Generate multiple strategies and compare results over time
- **Multi-language support**: Add translation nodes for international brands
- **Custom branding**: Modify email templates with your agency's logo and colors

Adjust Scraping Behavior

- **Change Firecrawl schema**: Customize extracted data fields based on industry needs
- **Add timeout handling**: Implement retry logic for failed scraping attempts
- **Scrape more pages**: Extend beyond the homepage to include blog, pricing, and about pages
- **Use different scrapers**: Replace Firecrawl with Apify, Browserless, or custom solutions

Tips for best results

- **Provide clear brand information**: The more specific the product type and niche, the better the competitor identification
- **Ensure websites are accessible**: Some sites block scrapers; consider adding user agents or rotating IPs
- **Monitor API costs**: Firecrawl and OpenAI charges can add up; set usage limits
- **Review generated strategies**: AI recommendations should be reviewed and customized for your specific context
- **Iterate on prompts**: Fine-tune the AI prompts based on output quality over multiple runs

Common use cases

- **Client onboarding** for marketing agencies - Generate initial AEO assessments
- **Content strategy planning** - Identify topics and angles competitors are missing
- **Quarterly audits** - Track competitive positioning changes over time
- **Product launches** - Understand the competitive landscape before entering a market
- **Sales enablement** - Equip sales teams with competitive intelligence

Note: This workflow uses community and AI nodes that require external API access. Make sure your n8n instance can make outbound HTTP requests and has the necessary LangChain nodes installed.
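For reference, the request shape the HTTP Header Auth credential produces for Firecrawl looks roughly like this. Only the Authorization header shape comes from the setup notes above; the endpoint path and body fields are assumptions based on Firecrawl's scrape API and should be checked against their current docs:

```javascript
// Sketch of the Firecrawl scrape request the workflow's HTTP node issues.
// Endpoint and body fields are assumptions; verify against Firecrawl's docs.
function buildFirecrawlRequest(url, apiKey) {
  return {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`, // matches the HTTP Header Auth credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ url, formats: ["extract"] }),
  };
}

// Usage (hypothetical endpoint):
// const res = await fetch("https://api.firecrawl.dev/v1/scrape",
//   buildFirecrawlRequest("https://example.com", process.env.FIRECRAWL_API_KEY));
```

Seeing the request laid out this way makes it easier to debug 401s: they almost always mean the Bearer prefix or header name is wrong in the credential.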
by franck fambou
Overview

This advanced automation workflow enables deep web scraping combined with Retrieval-Augmented Generation (RAG) to transform websites into intelligent, queryable knowledge bases. The system recursively crawls target websites, extracts content, and indexes all data in a vector database for AI conversational access.

How the system works

Intelligent Web Scraping and RAG Pipeline

1. Recursive Web Scraper - Automatically crawls every accessible page of a target website
2. Data Extraction - Collects text, metadata, emails, links, and PDF documents
3. Supabase Integration - Stores content in PostgreSQL tables for scalability
4. RAG Vectorization - Generates embeddings and stores them for semantic search
5. AI Query Layer - Connects embeddings to an AI chat engine with citations
6. Error Handling - Automatically retriggers failed queries

Setup Instructions

Estimated setup time: 30-45 minutes

Prerequisites

- Self-hosted n8n instance (v0.200.0 or higher)
- Supabase account and project (PostgreSQL enabled)
- OpenAI/Gemini/Claude API key for embeddings and chat
- Optional: external vector database (Pinecone, Qdrant)

Detailed configuration steps

Step 1: Supabase configuration

- **Project creation**: New Supabase project with PostgreSQL enabled
- **Generating credentials**: API keys (anon key and service_role key) and connection string
- **Security configuration**: RLS policies according to your access requirements

Step 2: Connect Supabase to n8n

- **Configure Supabase node**: Add credentials to n8n Credentials
- **Test connection**: Verify with a simple query
- **Configure PostgreSQL**: Direct connection for advanced operations

Step 3: Preparing the database

- **Main tables**:
  - pages: URLs, content, metadata, scraping statuses
  - documents: Extracted and processed PDF files
  - embeddings: Vectors for semantic search
  - links: Link graph for navigation
- **Management functions**: Scripts to reactivate failed URLs and manage retries

Step 4: Configuring automation

- **Recursive scraper**: Starting URL, crawling depth, CSS selectors
- **HTTP extraction**: User-Agent, headers, timeouts, and retry policies
- **Supabase backup**: Batch insertion, data validation, duplicate management

Step 5: Error handling and re-executions

- **Failure monitoring**: Automatic detection of failed URLs
- **Manual triggers**: Selective re-execution by domain or date
- **Recovery sub-streams**: Retry logic with exponential backoff

Step 6: RAG processing

- **Embedding generation**: Text-embedding models with intelligent chunking
- **Vector storage**: Supabase pgvector or external database
- **Conversational engine**: Connection to chat models with source citations

Data structure

Main Supabase tables

| Table | Content | Usage |
|-------|---------|-------|
| pages | URLs, HTML content, metadata | Main storage for scraped content |
| documents | PDF files, extracted text | Downloaded and processed documents |
| embeddings | Vectors, text chunks | Semantic search and RAG |
| links | Link graph, navigation | Relationships between pages |

Use cases

Business and enterprise
- Competitive intelligence with conversational querying
- Market research from complex web domains
- Compliance monitoring and regulatory watch

Research and academia
- Literature extraction with semantic search
- Building datasets from fragmented sources

Legal and technical
- Scraping legal repositories with intelligent queries
- Technical documentation transformed into a conversational assistant

Key features

Advanced scraping
- Recursive crawling with automatic link discovery
- Multi-format extraction (HTML, PDF, emails)
- Intelligent error handling and retry

Intelligent RAG
- Contextual embeddings for semantic search
- Multi-document queries with citations
- Intuitive conversational interface

Performance and scalability
- Processing of thousands of pages per execution
- Embedding cache for fast responses
- Scalable architecture with Supabase

Technical Architecture

Main flow: Target URL → Recursive scraping → Content extraction → Supabase storage → Vectorization → Conversational interface

Supported types: HTML pages, PDF documents, metadata, links, emails

Performance specifications

- **Capacity**: 10,000+ pages per run
- **Response time**: < 5 seconds for RAG queries
- **Accuracy**: > 90% relevance for specific domains
- **Scalability**: Distributed architecture via Supabase

Advanced configuration

Customization
- Crawling depth and scope controls
- Domain and content type filters
- Chunking settings to optimize RAG

Monitoring
- Real-time monitoring in Supabase
- Cost and performance metrics
- Detailed conversation logs
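The chunking step that precedes embedding generation can be sketched as overlapping fixed-size chunks, so context is preserved across chunk boundaries. The size and overlap values below are assumptions, not values taken from the workflow:

```javascript
// Split text into overlapping chunks for embedding.
// chunkSize/overlap are illustrative defaults; tune per model and content.
function chunkText(text, chunkSize = 800, overlap = 100) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
    start += chunkSize - overlap; // step forward, keeping `overlap` chars of context
  }
  return chunks;
}

const chunks = chunkText("x".repeat(2000), 800, 100);
console.log(chunks.length); // 3
```

Each chunk would then be embedded and written to the embeddings table alongside its source page ID, so citations can point back to the originating URL.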
by Dr. Firas
💥 Automate YouTube thumbnail creation from video links (with templated.io)

Who is this for?

This workflow is designed for content creators, YouTubers, and automation enthusiasts who want to automatically generate stunning YouTube thumbnails and streamline their publishing workflow, all within n8n. If you regularly post videos and spend hours designing thumbnails manually, this automation is built for you.

What problem is this workflow solving?

Creating thumbnails is time-consuming, yet crucial for video performance. This workflow completely automates that process: no more manual design, no more downloading screenshots, no more repetitive uploads. In less than 2 minutes, you can refresh your entire YouTube thumbnail library and make your channel look brand new.

What this workflow does

Once activated, this workflow can:

✅ Receive YouTube video links via Telegram
✅ Extract metadata (title, description, channel info) via the YouTube API
✅ Generate a custom thumbnail automatically using Templated.io
✅ Upload the new thumbnail to Google Drive
✅ Log data in Google Sheets
✅ Send email and Telegram notifications when ready
✅ Create and publish AI-generated social posts on LinkedIn, Facebook, and Twitter via Blotato

Bonus: You can re-create dozens of YouTube covers in minutes, saving up to 5 hours per week and around $500/month in manual design effort.
Setup

1️⃣ Get a YouTube Data API v3 key from Google Cloud Console
2️⃣ Create a Templated.io account and get your API key + template ID
3️⃣ Set up a Telegram bot using @BotFather
4️⃣ Create a Google Drive folder and copy the folder ID
5️⃣ Create a Google Sheet with columns: Date, Video ID, Video URL, Title, Thumbnail Link, Status
6️⃣ Get your Blotato API key from the dashboard
7️⃣ Connect your social media accounts to Blotato
8️⃣ Fill all credentials in the Workflow Configuration node
9️⃣ Test by sending a YouTube URL to your Telegram bot

How to customize this workflow

- Replace the Templated.io template ID with your own custom thumbnail layout
- Modify the OpenAI node prompts to change text tone or style
- Add or remove social platforms in the Blotato section
- Adjust the wait time (default: 5 minutes) based on template complexity
- Localize or translate the generated captions as needed

Expected Outcome

With one Telegram message, you'll receive: a professional custom thumbnail, an instant email + Telegram notification, and a Google Drive link with your ready-to-use design. And your social networks will be automatically updated — no manual uploads.

Credits

Thumbnail generation powered by Templated.io
Social publishing powered by Blotato
Automation orchestrated via n8n

👋 Need help or want to customize this?
📩 Contact: LinkedIn
📺 YouTube: @DRFIRASS
🚀 Workshops: Mes Ateliers n8n
🎥 Watch This Tutorial
📄 Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / 🚀 Mes Ateliers n8n
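Before the workflow can call the YouTube Data API, it needs the video ID out of whatever link the user pastes into Telegram. A sketch of that extraction, covering the common URL shapes only (playlists and embeds are not handled):

```javascript
// Extract an 11-character YouTube video ID from common URL formats.
function extractVideoId(url) {
  const patterns = [
    /[?&]v=([\w-]{11})/,      // https://www.youtube.com/watch?v=ID
    /youtu\.be\/([\w-]{11})/, // https://youtu.be/ID
    /shorts\/([\w-]{11})/,    // https://www.youtube.com/shorts/ID
  ];
  for (const pattern of patterns) {
    const match = url.match(pattern);
    if (match) return match[1];
  }
  return null; // not a recognizable YouTube link
}

console.log(extractVideoId("https://youtu.be/dQw4w9WgXcQ")); // "dQw4w9WgXcQ"
```

Returning null for unrecognized links lets the workflow reply with a friendly "please send a YouTube link" message instead of failing at the API call.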
by Marth
Automated AI-Driven Competitor & Market Intelligence System

**Problem Solved:** Small and medium-sized IT companies often struggle to stay ahead in a rapidly evolving market. Manually tracking competitor moves, pricing changes, product updates, and emerging market trends is time-consuming, inconsistent, and often too slow for agile sales strategies. This leads to missed sales opportunities, ineffective pitches, and a reactive rather than proactive market approach.

**Solution Overview:** This n8n workflow automates the continuous collection and AI-powered analysis of competitor data and market trends. By leveraging web scraping, RSS feeds, and advanced AI models, it transforms raw data into actionable insights for your sales and marketing teams. The system generates structured reports, notifies relevant stakeholders, and stores intelligence in your database, empowering your team with real-time, strategic information.

**For Whom:** This high-value workflow is perfect for:

- IT Solution Providers & SaaS Companies: To maintain a competitive edge and tailor sales pitches based on competitor weaknesses and market opportunities.
- Sales & Marketing Leaders: To gain comprehensive, automated market intelligence without extensive manual research.
- Product Development Teams: To identify market gaps and validate new feature development based on competitive landscapes and customer sentiment.
- Business Strategists: To inform strategic planning with data-driven insights into industry trends and competitive threats.

How It Works (Scope of the Workflow) ⚙️

This system establishes a powerful, automated pipeline for market and competitor intelligence:

- Scheduled Data Collection: The workflow runs automatically at predefined intervals (e.g., weekly), initiating data retrieval from various online sources.
- Diverse Information Gathering: It pulls data from competitor websites (pricing, features, blogs via web scraping services), industry news and blogs (via RSS feeds), and potentially other sources.
3. **Intelligent Data Preparation:** Collected data is aggregated, cleaned, and pre-processed using custom code to ensure it's in an optimal format for AI analysis, removing noise and extracting relevant text.
4. **AI-Powered Analysis:** An advanced AI model (like OpenAI's GPT-4o) performs in-depth analysis on the cleaned data. It identifies competitor strengths, weaknesses, new offerings, pricing changes, customer sentiment from reviews, and emerging market trends, and suggests specific opportunities and threats for your company.
5. **Automated Report Generation:** The AI's structured insights are automatically populated into a professional Google Docs report using a predefined template, making the intelligence easily digestible for your team.
6. **Team Notification:** Stakeholders (sales leads, marketing managers) receive automated notifications via Slack (or email), alerting them to the new report and key insights.
7. **Strategic Data Storage & Utilization:** All analyzed insights are stored in a central database (e.g., PostgreSQL). This builds a historical record for long-term trend analysis and can optionally trigger sub-workflows to generate personalized sales talking points directly relevant to ongoing deals or specific prospects.

## Setup Steps 🛠️ (Building the Workflow)

To implement this workflow in your n8n instance, follow these detailed steps.

### Prepare Your Digital Assets & Accounts

**Google Sheet (optional, if using it for CRM data):** For a simpler CRM, create a sheet with the columns CompetitorName, LastAnalyzedDate, Strengths, Weaknesses, Opportunities, Threats, and SalesTalkingPoints.

**API Keys & Credentials:**

- **OpenAI API Key:** Essential for the AI analysis.
- **Web Scraping Service API Key:** For services like Apify, Crawlbase, or similar (e.g., Bright Data, ScraperAPI).
- **Database Access:** Credentials for your PostgreSQL/MySQL database. Ensure you've created the necessary tables (competitor_profiles, market_trends) with appropriate columns.
- **Google Docs Credential:** To link n8n to your Google Drive for report generation. Create a template Google Doc with placeholders (e.g., {{competitorName}}, {{strengths}}).
- **Slack Credential:** For sending team notifications to specific channels.
- **CRM API Key (optional):** If integrating directly with HubSpot, Salesforce, or a custom CRM via API.

### Identify Data Sources for Intelligence

- Compile a list of competitor website URLs you want to monitor (e.g., pricing pages, blog sections, news).
- Identify relevant online review platforms (e.g., G2, Capterra) for competitor products.
- Gather RSS feed URLs from key industry news sources, tech blogs, and competitors' own blogs.
- Define keywords for general market trends or competitor mentions, if using tools that provide RSS feeds (like Google Alerts).

### Build the n8n Workflow (10 Key Nodes)

Start a new workflow in n8n and add the following nodes, configuring their parameters and connections carefully:

1. **Cron (Scheduled Analysis Trigger):** Set this to trigger daily or weekly at a specific time (e.g., Every Week, At Hour: 0, At Minute: 0).
2. **HTTP Request (Fetch Competitor Web Data):** Configure this to call your chosen web scraping service's API. Set Method to POST, URL to the service's API endpoint, and build the JSON/Raw Body with the startUrls (competitor websites, review sites) for scraping, including your API key in Authentication (e.g., Header Auth).
3. **RSS Feed (Fetch News & Blog RSS):** Add the URLs of competitor blogs and industry news RSS feeds.
4. **Merge (Combine Data Sources):** Connect inputs from both Fetch Competitor Web Data and Fetch News & Blog RSS. Use Merge By Position.
5. **Code (Pre-process Data for AI):** Write JavaScript code to iterate through the merged items, extract relevant text content, perform basic cleaning (e.g., HTML stripping), and limit text length for AI input. Output should be an array of objects with content, title, url, and source.
6. **OpenAI (AI Analysis & Competitor Insights):** Select your OpenAI credential.
Set Resource to Chat Completion and Model to gpt-4o. In Messages, create a System message defining the AI's role and a User message containing the dynamic prompt (referencing {{ $json.map(item => ... ).join('\\n\\n') }} for content, title, url, source) and requesting a structured JSON output for the analysis. Set Output to Raw Data.
7. **Google Docs (Generate Market Intelligence Report):** Select your Google Docs credential. Set Operation to Create document from template. Provide your Template Document ID and map the Values from the parsed AI output (using JSON.parse($json.choices[0].message.content).PropertyName) to your template placeholders.
8. **Slack (Sales & Marketing Team Notification):** Select your Slack credential. Set Chat ID to your team's Slack channel ID. Compose the Text message, referencing the report link ({{ $json.documentUrl }}) and key AI insights (e.g., {{ JSON.parse($json.choices[0].message.content).Competitor_Name }}).
9. **PostgreSQL (Store Insights to Database):** Select your PostgreSQL credential. Set Operation to Execute Query. Write an INSERT ... ON CONFLICT DO UPDATE SQL query to store the AI insights into your competitor_profiles or market_trends table, mapping values from the parsed AI output.
10. **OpenAI (Generate Personalized Sales Talking Points, optional branch):** This node can be part of the main workflow or a separate, manually triggered workflow. Configure it similarly to the main AI node, but with a prompt tailored to generate sales talking points based on a specific sales context and the stored insights.

### Final Testing & Activation

- **Run a test:** Before going live, manually trigger the workflow from the first node. Carefully review the data at each stage to ensure correct processing and output. Verify that reports are generated, notifications are sent, and data is stored correctly.
- **Activate the workflow:** Once testing is complete and successful, activate the workflow in n8n.
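The pre-processing Code node described above can be sketched as a plain function so the logic is testable outside n8n (inside the node you would feed `$input.all()` into it). The field fallbacks and the character cap are illustrative assumptions, not part of the template:

```javascript
// Sketch of the "Pre-process Data for AI" Code node logic.
// Field fallbacks and MAX_CHARS are illustrative assumptions.
const MAX_CHARS = 4000; // assumed cap to keep each item within the model's context

function prepareForAI(items) {
  return items.map((item) => {
    const raw = item.content || item.description || '';
    // Basic cleaning: strip HTML tags and collapse whitespace.
    const text = raw.replace(/<[^>]*>/g, ' ').replace(/\s+/g, ' ').trim();
    return {
      content: text.slice(0, MAX_CHARS),
      title: item.title || 'Untitled',
      url: item.url || item.link || '',
      source: item.source || 'unknown',
    };
  });
}

// Example input resembling one merged item after the Merge node.
const prepared = prepareForAI([
  {
    content: '<p>Acme cut prices by <b>20%</b></p>',
    title: 'Acme pricing',
    url: 'https://example.com',
    source: 'scraper',
  },
]);
```

In the actual Code node, you would return the result wrapped as n8n items, e.g. `return prepared.map((json) => ({ json }));`.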
This system will empower your IT company's sales team with invaluable, data-driven intelligence, enabling them to close more deals and stay ahead in the market.
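As a companion to the node configuration above, the `JSON.parse` pattern shared by the Google Docs, Slack, and PostgreSQL mappings can be sketched as a small helper. The property names here are assumptions: they must match whatever JSON schema you request in the OpenAI prompt:

```javascript
// Sketch of the JSON.parse pattern used when mapping the AI output into the
// Google Docs, Slack, and PostgreSQL nodes. Property names are assumptions
// that must match the schema requested in the prompt.
function parseInsights(openAiResponse) {
  // A Chat Completion puts the model's text in choices[0].message.content.
  return JSON.parse(openAiResponse.choices[0].message.content);
}

// Equivalent to the n8n expression:
//   {{ JSON.parse($json.choices[0].message.content).Competitor_Name }}
const insights = parseInsights({
  choices: [
    {
      message: {
        content:
          '{"Competitor_Name":"Acme Corp","Strengths":["fast releases"],"Weaknesses":["pricing"]}',
      },
    },
  ],
});
```

Note that this only works if the model actually returns valid JSON, which is why the prompt must explicitly request structured JSON output.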
by May Ramati Kroitero
# Automated Job Hunt with Tavily — Setup & Run Guide

**What this template does:** Automatically searches for recent job postings (example: "Software Engineering Intern"), extracts structured details from each posting using an AI agent + Tavily, bundles the results, and emails a single weekly digest.

**Estimated setup time:** ~30 minutes

## 1. Required credentials

Before you import or run the workflow, create/configure these credentials in your n8n instance:

- **OpenAI (Chat model)** — used by the OpenAI Chat Model and Message a model nodes. Add an OpenAI credential (name it e.g. OpenAI account) and paste your OpenAI API key.
- **Tavily API** — used by the Search in Tavily node. Add a Tavily credential (name it e.g. Tavily account) and add your Tavily API key.
- **Gmail (OAuth2)** — used by the Send a message node to deliver the digest email. Configure a Gmail OAuth2 credential and select it for the Gmail node (e.g. Gmail account).

## 2. Node-by-node configuration (what to check/change)

**Schedule Trigger**
- Node name: Schedule Trigger
- Configure the interval: daily or weekly (example: weekly, trigger at 08:00).
- Note: this is the workflow start; adjust to your preferred cadence.

**AI Agent**
- Node name: AI Agent
- Important: as a first step, set the agent's prompt / system message.

**Search in Tavily (Tavily Tool node)**
- Node name: Tavily
- Query: user-editable field (example default: Roles posted this week for Software Engineering)
- Advice: keep the query under 400 characters; change it to target role/location keywords.
- Recommended options:
  - Search Depth: advanced (optional, better extraction)
  - Max Results: 15
  - Time Range: week (limit to the past week)
  - Include Raw Content: true (fetch full page content for better extraction)
  - Include Domains: indeed.com, glassdoor.com, linkedin.com — prioritize trusted sources

**Edit Fields / Set (bundle)**
- Node name: Edit Fields (Set)
- Purpose: collect the agent output into one field (e.g., $json.output or Response) for downstream processing.
**Message a Model (OpenAI formatting step)**
- Node name: Message a model
- Uses OpenAI (the openAiApi credential). This node can reformat or normalize the agent output into consistent blocks if needed.
- Use the same system rules you used for the agent (the prompt/system message earlier). You can also leave this minimal if the agent already outputs structured blocks.

**Code node (parsing & structuring)**
- Node name: Code
- Purpose: split the agent/LLM text into separate job postings and extract fields with regex.

**Aggregate node**
- Node name: Aggregate
- Mode: aggregateAllItemData (this combines all parsed postings into a single data array so the Gmail node can loop over them)

**Gmail node (Send a message)**
- Node name: Send a message
- sendTo: set to your recipient(s) (e.g., your inbox)
- subject: e.g. New Jobs for this week!
- emailType: text (or html if you build HTML content)
- message (body): use the expression that loops through data and formats every posting.

## 3. How to test (quick steps)

1. Set credentials in n8n (OpenAI, Tavily, Gmail).
2. Run the Schedule Trigger manually (use "Execute Workflow" or trigger the nodes manually).
3. Inspect the Search in Tavily node output — confirm it returns results.
4. Inspect the AI Agent and Message a model outputs — ensure formatted postings are produced and separated by --- END JOB POSTING ---.
5. Run the Code node — confirm it returns structured items with posting_number, job_title, requirements[], etc.
6. Check the Aggregate output: you should see a single item with a data array.
7. In the Gmail node, run a test send — confirm the email arrives as one combined message with all postings.

## 4. Troubleshooting tips

- **Gmail body shows [Array: …]:** avoid dragging in the raw array — use the expression that maps data to formatted strings.
- **Code node split error:** occurs when raw is undefined. Ensure the previous node returns message.content, or adjust the code to use $input.all() and join contents safely.
- **Missing fields after parsing:** check that the LLM/agent output labels match the Code node's regex (e.g., Job Title:). If the labels differ, update the regex or the LLM formatting.

## 5. Customization ideas

- Filter by location or remote-only roles, or add keyword filters (seniority, stack).
- Send results to Google Sheets or Slack instead of, or in addition to, Gmail.
- Add an LLM summarization step to create a one-line highlight per posting.
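The Code node's split-and-regex parsing described above can be sketched like this. The `--- END JOB POSTING ---` delimiter and the `Job Title:` / `Company:` / `Requirements:` labels are assumptions that must match the format your agent prompt enforces:

```javascript
// Sketch of the parsing Code node: split the LLM text into postings and
// extract labeled fields with regex. Labels must match the agent's output.
const DELIMITER = '--- END JOB POSTING ---';

function parsePostings(raw) {
  return raw
    .split(DELIMITER)
    .map((block) => block.trim())
    .filter((block) => block.length > 0)
    .map((block, i) => ({
      posting_number: i + 1,
      job_title: (block.match(/Job Title:\s*(.+)/) || [])[1] || '',
      company: (block.match(/Company:\s*(.+)/) || [])[1] || '',
      // Requirements assumed to be comma-separated on one labeled line.
      requirements: ((block.match(/Requirements:\s*(.+)/) || ['', ''])[1])
        .split(',')
        .map((r) => r.trim())
        .filter(Boolean),
    }));
}

const jobs = parsePostings(
  'Job Title: SWE Intern\nCompany: Acme\nRequirements: Python, Git\n--- END JOB POSTING ---'
);
```

If a field comes back empty, the label in the agent output does not match the regex, which is exactly the "missing fields after parsing" case in the troubleshooting tips.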
by Harry Siggins
This n8n template transforms your daily meeting preparation by automatically researching attendees and generating comprehensive briefing documents. Every weekday morning, it analyzes your calendar events, researches each external attendee using multiple data sources, and delivers professionally formatted meeting briefs directly to your Slack channel.

## Who's it for

Business professionals, sales teams, account managers, and executives who regularly attend meetings with external contacts and want to arrive fully prepared with relevant context, conversation starters, and strategic insights about their attendees.

## How it works

The workflow triggers automatically Monday through Friday at 6 AM, fetching your day's calendar events and filtering for meetings with external attendees. For each meeting, an AI agent researches attendees using your CRM (Attio), email history (Gmail), past calendar interactions, and external company research via Perplexity when needed. The system then generates structured meeting briefs containing attendee background, relationship context, key talking points, and strategic objectives, delivering everything as a formatted Slack message to start your day.
## Requirements

- Google Calendar with OAuth2 credentials
- Gmail with OAuth2 credentials
- Slack workspace with bot token and channel access
- Attio CRM with API bearer token
- OpenRouter API key for AI model access (or other API credentials to connect your AI agents to a model provider)
- Perplexity API key for company research

## How to set up

1. Configure credentials for all required services in your n8n instance.
2. Update the personal identifiers in the workflow:
   - Replace "YOUR_EMAIL@example.com" with your actual calendar email in both Google Calendar nodes.
   - Replace "YOUR_SLACK_CHANNEL_ID" with your target channel ID in both Slack nodes.
3. Adjust the AI models in the OpenRouter nodes based on your preferences and model availability.
4. Test the workflow manually on a day that contains external meetings.
5. Verify that the Slack formatting appears correctly in your channel.

## How to customize the workflow

- **Change meeting research depth:** modify the AI agent prompt to focus on specific research areas like company financial data, recent news, or technical background.
- **Adjust notification timing:** update the cron expression in the Schedule Trigger to run at different times or days.
- **Expand CRM integration:** add additional Attio API calls to capture more contact details or create follow-up tasks.
- **Enhance Slack formatting:** customize the Block Kit message structure in the JavaScript code node to include additional meeting metadata or visual elements.
- **Add more research sources:** connect additional tools like LinkedIn, company databases, or news APIs to the AI agent for richer attendee insights.

The template uses multiple AI models through OpenRouter for different processing stages, allowing you to optimize costs and performance by selecting appropriate models for research tasks versus text formatting operations.
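Customizing the Block Kit message amounts to editing the array of blocks that the JavaScript code node builds. A minimal sketch of such a builder is shown below; the brief's field names (`title`, `attendees`, `talkingPoints`) are illustrative assumptions, not the template's actual schema:

```javascript
// Minimal sketch of building a Slack Block Kit payload for a meeting brief.
// The brief's field names are illustrative assumptions.
function buildBriefBlocks(brief) {
  return [
    // Header block: the meeting title.
    { type: 'header', text: { type: 'plain_text', text: brief.title } },
    // Section block: attendee list, using Slack's mrkdwn formatting.
    {
      type: 'section',
      text: { type: 'mrkdwn', text: `*Attendees:* ${brief.attendees.join(', ')}` },
    },
    // Section block: one bullet per talking point.
    {
      type: 'section',
      text: {
        type: 'mrkdwn',
        text: brief.talkingPoints.map((p) => `• ${p}`).join('\n'),
      },
    },
  ];
}

const blocks = buildBriefBlocks({
  title: 'Acme sync, 10:00',
  attendees: ['Jane Doe', 'John Smith'],
  talkingPoints: ['Recent funding round', 'Open support tickets'],
});
```

Adding metadata or visual elements then means appending further block objects (context blocks, dividers, and so on) to the returned array.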
by Tom
This n8n workflow template simplifies the process of digesting cybersecurity reports by summarizing, deduplicating, organizing, and identifying viral topics of interest in daily emails. It generates two types of emails:

1. A daily digest with summaries of deduplicated cybersecurity reports organized into various topics.
2. A daily viral topic report with summaries of recurring topics identified over the last seven days.

This workflow helps threat intelligence analysts digest the high volume of cybersecurity reports they must analyse daily by reducing the noise and tracking topics of importance with additional care, while remaining customizable with regards to sources and output format.

## How it works

The workflow follows the threat intelligence lifecycle, as labelled by the coloured notes:

1. Every morning, collect news articles from a set of RSS feeds.
2. Merge the feeds' output and prepare it for LLM consumption.
3. Task an LLM with writing an intelligence briefing that summarizes, deduplicates, and organizes the topics.
4. Generate and send an email with the daily digest.
5. Collect the daily digests of the last seven days and prepare them for LLM consumption.
6. Task an LLM with writing a report that covers 'viral' topics that have appeared prominently in the news.
7. Store this report and send it out over email.

## How to use & customization

- The workflow triggers daily at 7am.
- The workflow can be reused for other types of news as well: the RSS feeds can be swapped out and the AI prompts can easily be altered.
- The parameters used for the viral topic identification process can easily be changed (number of previous days considered, requirements for a topic to be 'viral').

## Requirements

The workflow leverages Gemini (free tier) for email content generation and Baserow for storing generated reports. The viral topic identification relies on the Gemini Pro model because of the higher data quantity and more complex task.
An SMTP email account must be provided to send the emails; this can be through Gmail.
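To make the tunable viral-topic parameters concrete, the thresholding idea can be expressed in plain code. This is purely illustrative: in the template it is the Gemini prompt, not a Code node, that applies these rules, and both constants below are assumptions:

```javascript
// Illustrative only: the template asks the LLM to apply these rules via its
// prompt; this sketch just makes the tunable parameters concrete.
const DAYS_CONSIDERED = 7;  // number of previous daily digests examined
const MIN_APPEARANCES = 3;  // a topic counts as 'viral' if it recurs this often

function findViralTopics(digests) {
  // digests: one array of topic strings per day, newest first.
  const counts = new Map();
  for (const day of digests.slice(0, DAYS_CONSIDERED)) {
    // Count each topic at most once per day.
    for (const topic of new Set(day)) {
      counts.set(topic, (counts.get(topic) || 0) + 1);
    }
  }
  return [...counts.entries()]
    .filter(([, n]) => n >= MIN_APPEARANCES)
    .map(([topic]) => topic);
}

const viral = findViralTopics([
  ['ransomware', 'zero-day'],
  ['ransomware', 'phishing'],
  ['ransomware'],
]);
```

Loosening either constant (fewer days considered, fewer required appearances) would surface more, but noisier, viral topics; the same trade-off applies when adjusting the prompt wording.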