by Igor Chernyaev
**Template name:** Smart AI Support Assistant for Telegram

**Short description:** Smart AI Support Assistant for Telegram automatically answers repeated questions in your group using a Q&A knowledge base in Pinecone and forwards new or unclear questions to a human expert.

**Long description (Description field)**

**How it works**

- **Question detection** listens to messages in a Telegram group and checks whether each new message is a real question or an expert reply.
- **Knowledge base search** looks for an existing answer in the Pinecone vector store for valid questions from the group.
- **Auto-reply from cache** sends the saved answer straight back to the group when a good match is found, without involving the expert.
- **Escalation to expert** creates a ticket and forwards unanswered questions to the expert in a private chat with the same bot.
- **Expert learning loop** saves the expert's reply to Pinecone so that similar questions are answered automatically in the future.

**Setup steps**

1. Connect the Telegram Trigger to a single Telegram bot that is added as an admin to the group/supergroup and receives all user messages. Use the same bot for the expert: the expert's private chat with this bot is where tickets and questions are delivered.
2. Set up Pinecone: create an index, note the environment and index name, and add your Pinecone API key to n8n credentials.
3. Add your AI model API key (for example, OpenAI) and select the model used for embeddings and answer rewriting.
4. Configure any environment variables or n8n credentials for project IDs and spaces/namespaces used in Pinecone.
5. Test the full flow: send a question in the group, confirm that a ticket reaches the expert in a private chat, reply once, and check that the next similar question is answered automatically from the cache.
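The auto-reply decision above boils down to a similarity search with a cutoff. A minimal Python sketch, using an in-memory list as a stand-in for the Pinecone index; the 0.85 threshold and the cache entries are illustrative assumptions, not values from the template:

```python
import math

# Toy stand-in for the Pinecone Q&A cache: each entry pairs a question
# embedding with the expert's saved answer. Embeddings here are tiny
# hand-made vectors; the real workflow would use an embedding model.
QA_CACHE = [
    {"embedding": [0.9, 0.1, 0.0], "answer": "Restart the bot with /start."},
    {"embedding": [0.1, 0.9, 0.2], "answer": "Billing is handled monthly."},
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def answer_or_escalate(query_embedding, threshold=0.85):
    """Return a cached answer when similarity clears the threshold,
    otherwise signal that the question should go to the expert."""
    best = max(QA_CACHE, key=lambda e: cosine(query_embedding, e["embedding"]))
    score = cosine(query_embedding, best["embedding"])
    if score >= threshold:
        return {"action": "auto_reply", "answer": best["answer"]}
    return {"action": "escalate_to_expert"}

print(answer_or_escalate([0.88, 0.15, 0.05]))  # close to first entry: auto-reply
print(answer_or_escalate([0.3, 0.3, 0.9]))     # no good match: escalate
```

In the real workflow the expert learning loop then upserts the expert's reply (question embedding plus answer metadata) back into the same index, which is what makes the cache grow over time.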
by Yusuke
🧠 Overview

Discover and analyze the most valuable community-built n8n workflows on GitHub. This automation searches public repositories, analyzes JSON workflows using AI, and saves a ranked report to Google Sheets — including summaries, use cases, difficulty, stars, node count, and repository links.

⚙️ How It Works

1. Search GitHub Code API — queries for extension:json n8n and splits the results
2. Fetch & Parse — downloads each candidate file's raw JSON and safely parses it
3. Extract Metadata — detects AI-powered flows and collects key node information
4. AI Analysis — evaluates the top N workflows (description, use case, difficulty)
5. Merge Insights — combines the AI analysis with GitHub data
6. Save to Google Sheets — appends or updates by workflow name

🧩 Setup Instructions (5–10 min)

1. Open the Config node and set:
   - search_query — e.g., "openai" extension:json n8n
   - max_results — number of results to fetch (1–100)
   - ai_analysis_top — number of workflows analyzed with AI
   - SPREADSHEET_ID, SHEET_NAME — Google Sheets target
2. Add a GitHub PAT via an HTTP Header Credential: Authorization: Bearer <YOUR_TOKEN>
3. Connect an OpenAI Credential to the OpenAI Chat Model
4. Connect Google Sheets (OAuth2) to Save to Google Sheets
5. (Optional) Enable the Schedule Trigger to run weekly for automatic updates

> 💡 Tip: If you need to show literal brackets, use backticks like `<example>` (no HTML entities needed).

📚 Use Cases

1) Trend Tracking for AI Automations
   - **Goal:** Identify the fastest-growing AI-powered n8n workflows on GitHub.
   - **Output:** A list sorted by stars and AI detection, updated weekly.

2) Internal Workflow Benchmarking
   - **Goal:** Compare your organization's workflows against top public examples.
   - **Output:** Difficulty, node count, and AI usage metrics in Google Sheets.

3) Market Research for Automation Agencies
   - **Goal:** Discover trending integrations and tool combinations (e.g., OpenAI + Slack).
   - **Output:** Data-driven insights for client projects and content planning.
🧪 Notes & Best Practices

- 🔐 No hardcoded secrets — use n8n Credentials
- 🧱 Works with self-hosted or cloud n8n
- 🧪 Start small (max_results = 10) before scaling
- 🧭 Use the "AI Powered" + "Stars" columns in Sheets to identify top templates
- 🧩 Uses only Markdown sticky notes — no HTML formatting required

🔗 Resources

- **GitHub (template JSON):** github-workflow-finder-ai.json
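The "Search GitHub Code API" and "Fetch & Parse" steps can be sketched as plain request-building plus safe parsing. A hedged Python sketch: the endpoint and `q` parameter follow GitHub's code search API, the query string mirrors the Config example, and `looks_like_n8n_workflow` is an assumed heuristic based on n8n workflow JSON containing top-level `nodes` and `connections` keys:

```python
import json

def build_search_request(search_query='"openai" extension:json n8n',
                         per_page=10, token="<YOUR_TOKEN>"):
    """Build the GitHub code-search request the workflow would send.
    `token` is a placeholder; the real workflow stores it as an n8n
    HTTP Header Credential."""
    return {
        "url": "https://api.github.com/search/code",
        "params": {"q": search_query, "per_page": per_page},
        "headers": {
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    }

def looks_like_n8n_workflow(raw_json):
    """Safely parse a candidate file and check for n8n's workflow shape
    (top-level 'nodes' and 'connections'), skipping anything malformed."""
    try:
        data = json.loads(raw_json)
    except (json.JSONDecodeError, TypeError):
        return False
    return isinstance(data, dict) and "nodes" in data and "connections" in data

print(looks_like_n8n_workflow('{"nodes": [], "connections": {}}'))  # True
print(looks_like_n8n_workflow("not json at all"))                   # False
```

Starting with a small `per_page` matches the "start small before scaling" advice above: a failed parse on a handful of candidates costs little.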
by Cheng Siong Chin
How It Works

This workflow automates student progress monitoring and academic intervention orchestration through intelligent AI-driven analysis. Designed for educational institutions, learning management systems, and academic advisors, it solves the critical challenge of identifying at-risk students while coordinating timely interventions across faculty and support services.

The system receives student data via webhook, fetches historical learning records, and merges these sources for comprehensive progress analysis. It employs a dual-agent AI framework for student progress validation and academic orchestration, detecting performance gaps, engagement issues, and intervention opportunities.

The workflow routes findings based on validation status, triggering orchestration actions for students requiring support while logging compliant progress for successful learners. By executing multi-channel interventions through HTTP APIs and email notifications, it ensures educators and students receive timely guidance while maintaining complete audit trails for academic accountability and accreditation compliance.
Setup Steps

1. Configure the Student Data Webhook trigger endpoint
2. Connect the Workflow Configuration node with academic performance parameters
3. Set up the Fetch Student Learning History node with LMS API credentials
4. Configure the Merge Student Data node for data consolidation
5. Connect the Student Progress Validation Agent with Claude/OpenAI API credentials
6. Set up the AI processing nodes
7. Configure the Route by Validation Status node with performance thresholds
8. Connect the Academic Orchestration Agent with AI API credentials for intervention planning
9. Set up orchestration processing

Prerequisites

Claude/OpenAI API credentials for the AI agents; learning management system API access.

Use Cases

Universities identifying students requiring academic support; online learning platforms detecting engagement drops.

Customization

Adjust validation thresholds to match institutional academic standards.

Benefits

Reduces student identification lag by 75% and eliminates manual progress tracking.
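The "Route by Validation Status" step is essentially a threshold check on the validation agent's output. A minimal sketch, assuming illustrative field names (`risk_score`, `engagement_drop`) and a 0.7 cutoff that are not taken from the workflow itself:

```python
def route_student(validation):
    """Mirror of the 'Route by Validation Status' branch: students flagged
    by the validation agent go to academic orchestration, everyone else is
    logged as compliant progress. Field names and the 0.7 threshold are
    hypothetical stand-ins for the node's configured parameters."""
    at_risk = (
        validation.get("risk_score", 0) >= 0.7
        or validation.get("engagement_drop", False)
    )
    return "academic_orchestration" if at_risk else "log_compliant_progress"

print(route_student({"risk_score": 0.9}))   # academic_orchestration
print(route_student({"risk_score": 0.2}))   # log_compliant_progress
```

Tuning this threshold is the "Customization" lever mentioned above: institutions with stricter academic standards would lower the cutoff to flag students earlier.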
by giangxai
Overview

This workflow automatically creates short-form AI videos using Sora 2 Cameos, powered by n8n and AI agents. It connects viral content collection, AI script and prompt generation, video rendering, video merging, and multi-platform publishing into a single fully automated system. Once configured, the workflow runs end to end without manual editing or intervention.

The workflow is designed for creators, marketers, and affiliate builders who want to scale faceless or avatar-style AI videos consistently using viral ideas and automated publishing.

What can this workflow do?

- Automatically collect viral content ideas from external sources
- Analyze viral ideas and generate structured Sora 2–ready prompts
- Create AI videos using pre-selected Sora 2 Cameos characters
- Merge multiple AI video clips into a single final video
- Publish videos automatically to TikTok, Facebook, and Instagram
- Track publishing status, successes, and errors in Google Sheets

This workflow reduces manual video production work while keeping the process structured, scalable, and repeatable.

How it works

The workflow starts by automatically collecting viral content ideas on a schedule and storing them in Google Sheets as a content backlog. Before prompt generation, a Cameos character is selected and configured manually on Sora2.com. This selected avatar is used consistently across all generated videos.

Each viral idea is then analyzed to extract hooks, themes, and video direction. An AI Agent generates structured scripts and Sora 2–ready prompts based on both the viral content and the selected Cameos character. These prompts are sent to the Sora 2 Cameos video generation API to render short video clips. The workflow monitors rendering status and retries if needed.

Once all clips are ready, they are automatically merged into a single final video. The finished video is published to social platforms such as TikTok, Facebook, and Instagram.
Publishing results and errors are logged back to Google Sheets for monitoring and optimization.

Setup steps

1. Connect an AI model (Gemini or a compatible LLM) for script and prompt generation
2. Select and configure a Cameos character on Sora2.com
3. Add Sora 2 Cameos API credentials for AI video generation
4. Configure the video merge step to combine multiple clips into one final video
5. Connect the social publishing APIs (TikTok, Facebook, Instagram)
6. Connect Google Sheets for content intake and status tracking

Once configured, the workflow runs automatically on a schedule without manual input.

Documentation

For a full walkthrough, optimization tips, and scaling strategies, watch the detailed tutorial on YouTube.
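The render-monitoring step ("monitors rendering status and retries if needed") can be sketched as a generic polling loop. Here `check_status` stands in for the Sora 2 status API call, and the attempt count and delay are placeholder values, not settings from the workflow:

```python
import time

def wait_for_render(check_status, max_attempts=5, delay_seconds=0.01):
    """Poll a render job until it completes, the way the workflow watches
    Sora 2 rendering status. `check_status` is any callable standing in
    for the status API; it should return 'pending', 'completed', or
    'failed'."""
    for _attempt in range(max_attempts):
        status = check_status()
        if status == "completed":
            return True
        if status == "failed":
            return False  # the real workflow would retry the render here
        time.sleep(delay_seconds)
    return False  # gave up after max_attempts polls

# Example: a fake job that completes on the third poll.
responses = iter(["pending", "pending", "completed"])
print(wait_for_render(lambda: next(responses)))  # True
```

A real implementation would use a longer delay (video rendering takes minutes, not milliseconds) and, on failure, re-submit the prompt rather than simply returning.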
by Joe V
🔧 Setup Guide - Hiring Bot Workflow

📋 Prerequisites

Before importing this workflow, make sure you have:

- ✅ n8n Instance (cloud or self-hosted)
- ✅ Telegram Bot Token (from @BotFather)
- ✅ OpenAI API Key (with GPT-4 Vision access)
- ✅ Gmail Account (with OAuth setup)
- ✅ Google Drive (to store your resume)
- ✅ Redis Instance (free tier available at Redis Cloud)

🚀 Step-by-Step Setup

1️⃣ Upload Your Resume to OpenAI

First, upload your resume to OpenAI's Files API:

```bash
curl https://api.openai.com/v1/files \
  -H "Authorization: Bearer YOUR_OPENAI_API_KEY" \
  -F purpose="assistants" \
  -F file="@/path/to/your/resume.pdf"
```

Important: save the file_id from the response (it looks like file-xxxxxxxxxxxxx).

Alternative: use the OpenAI Playground, or Python:

```python
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY")
with open("resume.pdf", "rb") as file:
    response = client.files.create(file=file, purpose="assistants")
print(f"File ID: {response.id}")
```

2️⃣ Upload Your Resume to Google Drive

1. Go to Google Drive
2. Upload your resume PDF
3. Right-click → "Get link" → copy the file ID from the URL
   - URL format: https://drive.google.com/file/d/FILE_ID_HERE/view
   - Example ID: 1h79U8IFtI2dp_OBtnyhdGaarWpKb9qq9

3️⃣ Create a Telegram Bot

1. Open Telegram and message @BotFather
2. Send /newbot
3. Choose a name and username
4. Save the Bot Token (it looks like 123456789:ABCdefGHIjklMNOpqrsTUVwxyz)
5. (Optional) Set bot commands:
   - /start - Start the bot
   - /help - Get help

4️⃣ Set Up Redis

Option A: Redis Cloud (recommended, free)

1. Go to Redis Cloud
2. Create a free account
3. Create a database
4. Note the host, port, and password

Option B: Local Redis

```bash
# Docker
docker run -d -p 6379:6379 redis:latest

# Or via package manager
sudo apt-get install redis-server
```

5️⃣ Import the Workflow to n8n

1. Open n8n
2. Click "+" → "Import from File"
3. Select Hiring_Bot_Anonymized.json
4. The workflow will import with placeholder values

6️⃣ Configure Credentials

A. Telegram Bot Credentials
1. In n8n, go to Credentials → Create New
2. Select "Telegram API"
3. Enter your Bot Token from Step 3
4. Test & Save

B. OpenAI API Credentials
1. Go to Credentials → Create New
2. Select "OpenAI API"
3. Enter your OpenAI API Key
4. Test & Save

C. Redis Credentials
1. Go to Credentials → Create New
2. Select "Redis"
3. Enter:
   - Host: your Redis host
   - Port: 6379 (default)
   - Password: your Redis password
4. Test & Save

D. Gmail Credentials
1. Go to Credentials → Create New
2. Select "Gmail OAuth2 API"
3. Follow the OAuth setup flow and authorize n8n to access Gmail
4. Test & Save

E. Google Drive Credentials
1. Go to Credentials → Create New
2. Select "Google Drive OAuth2 API"
3. Follow the OAuth setup flow and authorize n8n to access Drive
4. Test & Save

7️⃣ Update Node Values

A. Update the OpenAI File ID in the "PayloadForReply" Node

1. Double-click the "PayloadForReply" node
2. Find this line in the code:
   const resumeFileId = "YOUR_OPENAI_FILE_ID_HERE";
3. Replace it with your actual OpenAI file ID from Step 1:
   const resumeFileId = "file-xxxxxxxxxxxxx";
4. Save the node

B. Update the Google Drive File ID (Both "Download Resume" Nodes)

There are TWO nodes that need updating:

Node 1: "Download Resume"
1. Double-click the node
2. In the "File ID" field, click "Expression"
3. Replace YOUR_GOOGLE_DRIVE_FILE_ID with your actual ID
4. Update "Cached Result Name" to your resume filename
5. Save

Node 2: "Download Resume1" (same process)
1. Double-click the node
2. Update the File ID
3. Update the filename
4. Save

8️⃣ Assign Credentials to Nodes

After importing, assign your credentials to each node.

Nodes that need credentials:

| Node Name | Credential Type |
|-----------|-----------------|
| Telegram Trigger | Telegram API |
| Generating Reply | OpenAI API |
| Store AI Reply | Redis |
| GetValues | Redis |
| Download Resume | Google Drive OAuth2 |
| Download Resume1 | Google Drive OAuth2 |
| Schedule Email | Gmail OAuth2 |
| SendConfirmation | Telegram API |
| Send a message | Telegram API |
| Edit a text message | Telegram API |
| Send a text message | Telegram API |
| Send a chat action | Telegram API |

How to assign:
1. Click on each node
2. In the "Credentials" section, select your saved credential
3. Save the node

🧪 Testing the Workflow

1️⃣ Activate the Workflow
- Click the "Active" toggle in the top-right
- The workflow should now be listening for Telegram messages

2️⃣ Test with a Job Post
1. Find a job post online (LinkedIn, Indeed, etc.)
2. Take a screenshot
3. Send it to your Telegram bot
4. The bot should respond with:
   - "Analyzing job post..." (typing indicator)
   - A full email draft with a confirmation button

3️⃣ Test Email Sending
1. Click the "Send The Email" button
2. Check Gmail to verify the email was sent
3. Check that the resume was attached

🐛 Troubleshooting

Issue: "No binary image found"
- **Solution:** Make sure you're sending an image file, not a document

Issue: "Invalid resume file_id"
- **Solution:** Check the OpenAI file_id format (starts with file-), verify the file was uploaded successfully, and make sure you updated the code in the PayloadForReply node

Issue: "Failed to parse model JSON"
- **Solution:** Check your OpenAI API quota/limits, verify the model name is correct (gpt-5.2), and check that the image is readable

Issue: Gmail not sending
- **Solution:** Re-authenticate Gmail OAuth, check Gmail permissions, and verify the "attachments" field is set to "Resume"

Issue: Redis connection failed
- **Solution:** Test the Redis connection in credentials, check firewall rules, and verify host/port/password

Issue: Telegram webhook not working
- **Solution:** Deactivate and reactivate the workflow, check that the Telegram bot token is valid, and make sure the bot is not blocked

🔐 Security Best Practices

- Never share your credentials - keep API keys private
- Use environment variables in n8n for sensitive data
- Set up a Redis password - don't use default settings
- Limit OAuth scopes - only grant necessary permissions
- Rotate API keys regularly
- Monitor usage - check for unexpected API calls

🎨 Customization Ideas

Change the AI Model

In the PayloadForReply node, update:

```javascript
const MODEL = "gpt-5.2"; // Change to gpt-4, claude-3-opus, etc.
```

Adjust Email Length

Modify the system prompt:

```javascript
// From:
Email body: ~120–180 words unless INSIGHTS specify otherwise.
// To:
Email body: ~100–150 words for concise applications.
```

Add More Languages

Update the language detection logic in the system prompt to support more languages.
Custom Job Filtering

Edit the system prompt to target specific roles:

```javascript
// From:
Only pick ONE job offer to process — the one most clearly related to Data roles
// To:
Only pick ONE job offer to process — the one most clearly related to [YOUR FIELD]
```

Add Follow-up Reminders

Add a "Wait" node after the email sends to schedule a reminder after 7 days.

📊 Workflow Structure

```
Telegram Input
   ↓
Switch (Route by type)
   ↓
├─ New Job Post
│     ↓
│  Send Chat Action (typing...)
│     ↓
│  PayloadForReply (Build AI request)
│     ↓
│  Generating Reply (Call OpenAI)
│     ↓
│  FormatAiReply (Parse JSON)
│     ↓
│  Store AI Reply (Redis cache)
│     ↓
│  SendConfirmation (Show preview)
│
└─ Callback (User clicked "Send")
      ↓
   GetValues (Fetch from Redis)
      ↓
   Format Response
      ↓
   Download Resume (from Drive)
      ↓
   ├─ Path A: Immediate Send
   │     ↓
   │  Send Confirmation Message
   │     ↓
   │  Edit Message (update status)
   │
   └─ Path B: Scheduled Send
         ↓
      Wait (10 seconds)
         ↓
      Download Resume Again
         ↓
      Schedule Email (Gmail)
         ↓
      Send Success Message
```

💡 Tips for Best Results

- High-Quality Resume: upload a well-formatted PDF resume
- Clear Screenshots: take clear, readable job post screenshots
- Use Captions: add instructions via Telegram captions
  - Example: "make it more casual"
  - Example: "send to recruiter@company.com"
- Review Before Sending: always read the draft before clicking send
- Update Resume Regularly: keep your Google Drive resume current
- Test First: try with a few test jobs before mass applying

🆘 Need Help?

- 📚 n8n Documentation
- 💬 n8n Community Forum
- 📺 n8n YouTube Channel
- 🤖 OpenAI Documentation
- 📱 Telegram Bot API Docs

📝 Checklist

Use this checklist to verify your setup:

- [ ] OpenAI resume file uploaded (got file_id)
- [ ] Google Drive resume uploaded (got file ID)
- [ ] Telegram bot created (got bot token)
- [ ] Redis instance created (got credentials)
- [ ] All n8n credentials created and tested
- [ ] PayloadForReply node updated with the OpenAI file_id
- [ ] Both Download Resume nodes updated with the Drive file_id
- [ ] All nodes have credentials assigned
- [ ] Workflow activated
- [ ] Test message sent successfully
- [ ] Test email received successfully

🎉 You're all set! Start applying to jobs in 10 seconds!

Made with ❤️ and n8n
by Roshan Ramani
Who's it for

This workflow is ideal for:

- Content creators who want to replicate successful LinkedIn strategies
- Social media managers monitoring competitor content performance
- Marketing teams analyzing trending topics in their industry
- Personal brands looking to create data-driven content
- Agencies managing multiple LinkedIn accounts

What it does

This comprehensive workflow automates the entire LinkedIn content lifecycle: it scrapes viral posts from target accounts, analyzes engagement patterns, identifies trending topics, generates original AI-powered content based on those trends, creates accompanying images, and automatically publishes to your LinkedIn profile or company page.

How it works

Phase 1: Data Collection (runs every 12 hours)

1. A scheduler triggers the workflow twice daily
2. Fetches LinkedIn profile URLs from Google Sheets
3. Processes profiles in batches of 3 to respect API limits
4. Uses the Apify API to scrape recent posts from each profile
5. Adds 3-second delays between requests to avoid rate limiting
6. Filters for high-engagement posts (20+ likes, comments, or reposts)
7. Saves viral posts to Google Sheets with full metadata

Phase 2: Content Generation (triggered by new data)

1. Monitors Google Sheets for new viral posts every minute
2. Filters posts published within the last 3 days that haven't been analyzed
3. Aggregates trending content into a single dataset
4. Analyzes patterns using Google Gemini AI to identify:
   - Common themes and topics
   - Engagement triggers and hooks
   - Successful content structures
   - Trending hashtags and formats
5. Generates an original LinkedIn post with proper formatting
6. Creates an AI image prompt optimized for minimal text
7. Generates a professional image using Google Imagen
8. Publishes the complete post to your LinkedIn account
9. Marks analyzed posts as complete to prevent duplication

Setup steps

1. Configure Google Sheets
   - Create a new Google Sheet with two tabs:
     - Tab 1: "usernames & links" - add the LinkedIn profile URLs you want to monitor
     - Tab 2: "scrape data" - leave empty (auto-populated by the workflow)
   - Connect your Google Sheets credentials in both nodes
   - Replace all instances of YOUR_GOOGLE_SHEET_ID with your actual sheet ID
   - Replace the SHEET_GID values with your actual sheet GIDs
2. Set up the Apify API
   - Sign up for an Apify account and get an API token
   - Replace YOUR_APIFY_API_TOKEN in the "Scrape LinkedIn Posts API" node
   - Note: Apify has a free tier with limited requests
3. Configure Google Gemini credentials
   - Obtain Google PaLM API credentials
   - Add the credentials to both the "Google Gemini Chat Model" and "Generate an image" nodes
4. Set up LinkedIn publishing
   - Connect your LinkedIn credentials in the "Publish to LinkedIn" node
   - If posting as an organization, replace YOUR_LINKEDIN_ORGANIZATION_ID with your company page ID
   - If posting as an individual, change the "postAs" parameter to "person"
5. Configure scheduling
   - Default schedule: every 12 hours
   - Adjust the "LinkedIn Content Automation Scheduler" trigger if needed
   - Consider your API rate limits when changing the frequency
6. Test the workflow
   - Manually trigger Phase 1 to scrape posts
   - Verify that data appears in the Google Sheets "scrape data" tab
   - Wait for the Phase 2 trigger or manually activate it
   - Check that content is generated and published correctly
   - Verify that posts are marked as analyzed in Google Sheets

Requirements

- Google Sheets API access (free)
- Google Sheets Trigger OAuth2 (free)
- Apify API token (free tier available, $49/month for more)
- Google PaLM/Gemini API key (pay-per-use pricing)
- LinkedIn OAuth credentials (free)

How to customize

Adjust scraping targets:
- Add more LinkedIn profile URLs to your Google Sheets
- Change the batch size in "Process Profiles in Batches" (default: 3)
- Modify the post limit per profile in the Apify API call (default: 1 post)

Modify engagement filters:
- Edit the "Filter High-Engagement Posts" node thresholds
- Default: 20+ likes OR 20+ comments OR 20+ reposts
- Adjust based on your niche's typical engagement rates
- Add additional criteria like views or impressions

Customize the content analysis window:
- Change "Filter Recent Posts (3 Days)" to analyze different timeframes
- Options: 24 hours for fast-moving trends, 7 days for broader patterns
- Balance between recency and data volume

Refine AI content generation:
- Edit the system prompt in the "LinkedIn Content Strategy AI" node
- Adjust content length, tone, or style preferences
- Add industry-specific guidelines
- Include brand voice requirements
- Modify the hashtag strategy

Customize image generation:
- Edit the image prompt structure in the AI prompt
- Change the visual style, colors, or composition
- Adjust for brand guidelines
- Modify dimensions or aspect ratios

Change the posting schedule:
- Adjust the "LinkedIn Content Automation Scheduler" frequency
- Consider optimal posting times for your audience
- Balance content quality against posting frequency
- Coordinate with other marketing activities

Enhance data collection:
- Increase posts per profile in the Apify settings
- Add more profile URLs to monitor
- Implement competitor tracking
- Track additional metrics like impressions or click-through rates
Add notifications:
- Connect Slack/Email nodes after successful posts
- Set up alerts for high-performing content
- Create reports of analyzed trends
- Monitor API usage and errors
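The Phase 1 engagement filter described above ("20+ likes, comments, or reposts") comes down to a simple OR over three counters. A short Python sketch; the field names are illustrative, not the exact keys returned by Apify:

```python
def is_viral(post, min_engagement=20):
    """The 'Filter High-Engagement Posts' rule from Phase 1: a post
    passes when likes OR comments OR reposts reach the threshold
    (default 20). Missing fields count as zero."""
    return any(
        post.get(field, 0) >= min_engagement
        for field in ("likes", "comments", "reposts")
    )

posts = [
    {"likes": 45, "comments": 3, "reposts": 1},   # passes on likes
    {"likes": 5, "comments": 2, "reposts": 0},    # filtered out
    {"likes": 2, "comments": 21, "reposts": 0},   # passes on comments
]
viral = [p for p in posts if is_viral(p)]
print(len(viral))  # 2
```

Raising `min_engagement` is how you would adapt the filter to a high-volume niche, as the customization notes suggest.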
by Blaine Holt
WhatsApp Booking Flow | Consultation Scheduling

This n8n template automates appointment booking via WhatsApp Flows with real-time calendar availability, AI-powered intent classification, and CRM synchronization. It transforms manual booking conversations into a seamless self-service experience directly within WhatsApp.

Who is this for?

- Service businesses wanting WhatsApp-based appointment booking
- Consultants and agencies offering scheduled calls or consultations
- Teams using Airtable for CRM and Google Calendar for scheduling
- Businesses looking to reduce booking friction with conversational commerce

What problem does this workflow solve?

Appointment booking typically involves back-and-forth messaging to find available times, collect customer details, and confirm bookings. This workflow eliminates that friction by providing an interactive booking flow within WhatsApp that checks real-time calendar availability, collects customer information, and automatically syncs bookings to your CRM and calendar.

How it works

1. Webhook Entry Point: a single webhook handles both GET (Meta verification) and POST (messages/flows) requests, required because of Meta's single-webhook-per-app restriction.
2. Message Routing: incoming requests are classified as either regular WhatsApp messages or encrypted Flow requests, then routed accordingly.
3. WhatsApp Flow Handling: Flow requests are decrypted using RSA/AES-GCM encryption. The workflow handles multiple Flow actions:
   - INIT: returns consultation types and date constraints
   - SERVICE_SELECTION: processes service and customer details
   - DATE_TIME_SELECTION: queries calendar availability for the selected date
   - CONFIRMATION: displays the booking summary
   - COMPLETE_BOOKING: finalizes the booking
4. AI Agent: for regular text messages, an OpenAI-powered agent classifies user intent. When booking intent is detected, it triggers the consultation template with the WhatsApp Flow.
Booking Process: on completion, the workflow:
- Creates or updates the customer in Airtable
- Creates a booking record with event details
- Creates a Google Calendar event
- Sends a WhatsApp confirmation message

Good to Know

- Verify Token: the WHATSAPP_VERIFY_TOKEN environment variable is required for Meta webhook verification. Set it to any secure string and use the same value in the Meta Developer Portal.
- Cloud vs Self-hosted: cloud n8n instances are easier to configure because they have public URLs. Self-hosted instances require additional setup for public accessibility.
- Hostinger/Docker Setup: for self-hosted instances, configure public webhook access in your docker-compose.yml or reverse proxy configuration.
- Meta Prerequisites: you'll need a Facebook account, a Meta Developer App, a WhatsApp Business Account (linked to the Developer App), and a WhatsApp Business phone number.
- Health Checks: Meta sends periodic ping requests to verify webhook availability. The workflow handles these automatically with a 200 response.
- Single Webhook Restriction: Meta only allows one webhook URL per WhatsApp app, which is why all message types flow through a single endpoint with routing logic.
- Encryption: WhatsApp Flows use end-to-end encryption. The workflow handles RSA decryption (for the AES key exchange) and AES-128-GCM encryption/decryption for Flow data.
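Meta's webhook verification handshake (the GET branch of the single webhook) follows a documented pattern: echo `hub.challenge` back only when `hub.mode` is `subscribe` and `hub.verify_token` matches your configured value. A minimal sketch of that check, independent of n8n:

```python
def handle_meta_verification(query, verify_token):
    """Handle Meta's GET verification request: return the challenge
    when the mode and token match, otherwise refuse with 403. `query`
    is the parsed query string of the incoming GET request."""
    if (
        query.get("hub.mode") == "subscribe"
        and query.get("hub.verify_token") == verify_token
    ):
        return 200, query.get("hub.challenge", "")
    return 403, "verification failed"

status, body = handle_meta_verification(
    {"hub.mode": "subscribe", "hub.verify_token": "s3cret", "hub.challenge": "12345"},
    verify_token="s3cret",
)
print(status, body)  # 200 12345
```

In this template the `verify_token` value would come from the WHATSAPP_VERIFY_TOKEN environment variable described above; POST requests (messages, Flow payloads, and Meta's health-check pings) take the other branch of the webhook.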
Requirements

- Meta Business Account with WhatsApp Business API access
- WhatsApp Business App in the Meta Developer Portal
- n8n instance (cloud or self-hosted with a public URL)
- OpenAI API key (for intent classification with GPT-4o)
- Airtable account with a base containing Customers, Services, and Bookings tables
- Google Calendar with OAuth credentials

Environment Variables

| Variable | Description |
|----------|-------------|
| WHATSAPP_VERIFY_TOKEN | Webhook verification token (must match the Meta Developer Portal) |
| WHATSAPP_PRIVATE_KEY | RSA private key for Flow encryption (PEM format) |
| WHATSAPP_PRIVATE_KEY_PASSPHRASE | Passphrase for the RSA private key |
| GOOGLE_CALENDAR_ID | Calendar ID for availability checks and event creation |

Setup

1. Meta Business Setup
   - Create a Meta Developer App at developers.facebook.com
   - Add the WhatsApp product to your app
   - Set up a WhatsApp Business Account
   - Generate a permanent access token
2. Import Workflow
   - Import personal-booking-whatsapp-flow.json into n8n
   - Replace placeholder credential IDs with your actual credentials
3. Configure Credentials
   - WhatsApp: HTTP Bearer Auth with your access token
   - OpenAI: API key for GPT-4o
   - Airtable: OAuth2 authentication
   - Google Calendar: OAuth2 authentication
4. Set Environment Variables
   - Configure all the variables listed above in n8n settings
5. Configure the Webhook in Meta
   - Navigate to WhatsApp > Configuration in the Developer Portal
   - Set the webhook URL to your n8n webhook endpoint
   - Enter your verify token
   - Subscribe to the messages webhook field
6. Create a WhatsApp Flow
   - In WhatsApp Manager, create a new Flow
   - Use the JSON from whatsapp-flow.json as your Flow definition
   - Publish the Flow and note the Flow ID
7. Create a Message Template
   - Create a template with a Flow button component
   - Link it to your published Flow
   - Submit it for approval

Airtable Schema

Customers table:
- customer_name (text)
- customer_email (email)
- phone_number (text)
- created_at (date)

Services table:
- service_name (text)
- service_key (text) - e.g., "30_min", "60_min"
- duration_minutes (number)

Bookings table:
- customer_email (text)
- service_type (linked to Services)
- event_date (date)
- event_time (text)
- booking_status (select: Pending, Confirmed, Cancelled)
- calendar_event_id (text)
- created_at (date)

Customizing this workflow

- Consultation Types: modify the INIT handler code node to add or change consultation options and durations.
- Business Hours: adjust the calendar availability logic in the date refresh handler to match your working hours.
- AI Agent Prompts: customize the system prompt in the AI Agent node to match your business context and available services.
- Messaging Templates: create additional WhatsApp templates for different services (quotes, information requests) and add corresponding tools to the AI agent.
- CRM Fields: extend the Airtable schema and update the booking creation nodes to capture additional customer data.

Made by www.fenrirlabs.nl
by Yaron Been
Description

This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It helps crypto traders and investors identify buying opportunities without constantly watching the markets.

Overview

This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It uses Bright Data to scrape real-time price data and can be configured to notify you through various channels.

Tools Used

- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping cryptocurrency exchange data without getting blocked.
- **Notification Services:** Email, SMS, Telegram, or other messaging platforms.

How to Install

1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Bright Data: add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: configure your preferred notification method.
4. Customize: set your price thresholds, monitoring frequency, and which exchanges to track.

Use Cases

- **Crypto Traders:** Get notified of buying opportunities during price dips.
- **Investors:** Monitor your crypto investments and make informed decisions.
- **Financial Analysts:** Track Bitcoin price movements for market analysis.

Connect with Me

- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #bitcoin #cryptocurrency #brightdata #pricealerts #cryptotrading #bitcoinalerts #cryptoalerts #cryptomonitoring #n8nworkflow #workflow #nocode #cryptoinvesting #bitcoinprice #cryptomarket #tradingalerts #cryptotools #bitcointrading #pricemonitoring #cryptoautomation #bitcoininvestment #cryptotracker #marketalerts #tradingopportunities #cryptoprices
by JustinLee
This workflow demonstrates a simple Retrieval-Augmented Generation (RAG) pipeline in n8n, split into two main sections:

🔹 Part 1: Load Data into Vector Store

- Reads files from disk (or Google Drive).
- Splits content into manageable chunks using a recursive text splitter.
- Generates embeddings using the Cohere Embedding API.
- Stores the vectors in an In-Memory Vector Store (for simplicity; this can be replaced with Pinecone, Qdrant, etc.).

🔹 Part 2: Chat with the Vector Store

- Takes user input from a chat UI or trigger node.
- Embeds the query using the same Cohere embedding model.
- Retrieves similar chunks from the vector store via similarity search.
- Uses a Groq-hosted LLM to generate a final answer based on the context.

🛠️ Technologies Used

- 📦 Cohere Embedding API
- ⚡ Groq LLM for fast inference
- 🧠 n8n for orchestrating and visualizing the flow
- 🧲 In-Memory Vector Store (for prototyping)

🧪 Usage

1. Upload or point to your source documents.
2. Embed them and populate the vector store.
3. Ask questions through the chat trigger node.
4. Receive context-aware responses based on retrieved content.
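The chunking step in Part 1 can be illustrated with a simplified sliding-window splitter. The real recursive text splitter also respects separators like paragraphs and sentences; the sizes here are arbitrary placeholders:

```python
def split_text(text, chunk_size=200, overlap=40):
    """A minimal stand-in for the recursive text splitter: slide a
    window across the document so neighbouring chunks share `overlap`
    characters of context, which helps retrieval keep sentences whole."""
    chunks = []
    step = chunk_size - overlap
    for start in range(0, max(len(text) - overlap, 1), step):
        chunks.append(text[start:start + chunk_size])
    return chunks

doc = ("word " * 100).strip()  # 499 characters of toy content
chunks = split_text(doc)
print(len(chunks), len(chunks[0]))  # 3 200
```

Each chunk would then be embedded (via the Cohere API in this workflow) and stored alongside its text, so that Part 2 can run a similarity search over the vectors and hand the matching chunk text to the Groq-hosted LLM as context.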
by Yaron Been
Description

This workflow automatically monitors sneaker prices across multiple retailers and sends you alerts when prices drop on your favorite models. It helps sneaker enthusiasts and collectors find the best deals without constantly checking multiple websites.

Overview

This workflow automatically monitors sneaker prices across multiple retailers and sends you alerts when prices drop. It uses Bright Data to scrape sneaker websites and can notify you through various channels when your desired models go on sale.

Tools Used

- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping sneaker retailer websites without getting blocked.
- **Notification Services:** Email, SMS, or other messaging platforms.

How to Install

1. Import the Workflow: download the .json file and import it into your n8n instance.
2. Configure Bright Data: add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: configure your preferred notification method.
4. Customize: add the sneaker models you want to track and your price thresholds.

Use Cases

- **Sneaker Collectors:** Get notified when rare models drop in price.
- **Resellers:** Find profitable buying opportunities.
- **Budget Shoppers:** Wait for the best deals on your favorite sneakers.

Connect with Me

- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #sneakers #pricealerts #brightdata #webscraping #sneakerdeals #sneakermonitor #pricedrop #sneakerhead #n8nworkflow #workflow #nocode #sneakersales #dealfinder #sneakermarket #pricetracking #sneakerprices #shoealerts #sneakercollector #sneakershopping #reselling #sneakerreseller #dealnotifications #sneakerautomation #shoedeals
by Yaron Been
Description
This workflow automatically monitors your competitors' product prices and notifies you of any changes. It helps you stay competitive in the market by providing real-time insights into pricing strategies without manual checking.

Overview
The workflow uses Bright Data to scrape pricing data from competitor websites and can generate reports or send alerts when prices change.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping competitor websites without being blocked.
- **Spreadsheets/Databases:** For storing and analyzing price data.
- **Notification Services:** For alerting you to significant price changes.

How to Install
1. **Import the Workflow:** Download the .json file and import it into your n8n instance.
2. **Configure Bright Data:** Add your Bright Data credentials to the Bright Data node.
3. **Set Up Data Storage:** Configure where you want to store the price data.
4. **Customize:** Add your competitors' URLs and the specific products to monitor.

Use Cases
- **E-commerce Businesses:** Stay competitive with real-time price monitoring.
- **Pricing Analysts:** Automate data collection for pricing strategy.
- **Retailers:** Adjust your pricing strategy based on market trends.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #pricing #competitoranalysis #brightdata #ecommerce #pricemonitoring #competitiveintelligence #pricetracking #marketanalysis #n8nworkflow #workflow #nocode #pricingstrategy #competitortracking #ecommercetools #pricecomparison #businessintelligence #marketresearch #competitivepricing #retailautomation #pricealerts #datadriven #webscraping #pricinganalysis #competitorinsights
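The change-detection step can be sketched as a diff between the last stored prices and the latest scrape, reporting only moves above a noise threshold. A Python illustration with hypothetical product names (in the workflow this logic would run after reading stored prices from your spreadsheet or database):

```python
def detect_price_changes(previous, current, min_pct=1.0):
    """Compare the latest scrape against stored prices and report
    changes larger than min_pct percent, up or down."""
    changes = []
    for product, new_price in current.items():
        old_price = previous.get(product)
        if old_price is None:
            continue  # newly seen product: nothing to compare against yet
        pct = (new_price - old_price) / old_price * 100
        if abs(pct) >= min_pct:
            changes.append({"product": product, "old": old_price,
                            "new": new_price, "pct": round(pct, 1)})
    return changes

# Hypothetical stored prices vs. today's scrape.
previous = {"Widget Pro": 49.99, "Widget Lite": 19.99}
current = {"Widget Pro": 44.99, "Widget Lite": 19.99}
print(detect_price_changes(previous, current))
# → [{'product': 'Widget Pro', 'old': 49.99, 'new': 44.99, 'pct': -10.0}]
```

The `min_pct` floor keeps rounding jitter and trivial adjustments from triggering alerts; tune it to what counts as "significant" in your market.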
by Oneclick AI Squad
Automate your post-event networking with this intelligent n8n workflow. Triggered instantly after an event, it collects attendee and interaction data, enriches profiles with LinkedIn insights, and uses GPT-4 to analyze engagement and generate tailored follow-up messages. High-value leads are prioritized, messages are sent via email, LinkedIn, or Slack, and all activity is logged in your CRM and database. Save hours of manual follow-up while boosting relationship-building and ROI. 🤝✨

Advanced Features
- **Webhook Automation** – Starts instantly on event completion
- **Multi-Source Enrichment** – Combines event data, interactions, and LinkedIn profiles
- **AI-Powered Insights** – GPT-4 analyzes behavior and suggests personalized talking points
- **Smart Priority Filtering** – Routes leads into High, Medium, and Low priority paths
- **Personalized Content Generation** – AI crafts custom emails and LinkedIn messages
- **Multi-Channel Outreach** – Sends via Email, LinkedIn DM, and Slack
- **CRM Integration** – Automatically updates HubSpot with contact notes and engagement
- **PostgreSQL Logging** – Stores full interaction history and analytics
- **ROI Dashboard** – Tracks response rates, meetings booked, and pipeline impact

What It Does
1. Collects attendee data from your event platform
2. Enriches it with LinkedIn profiles and real-time interaction logs
3. Scores networking potential using engagement algorithms
4. Uses AI to analyze conversations, roles, and mutual interests
5. Generates hyper-personalized follow-up emails and LinkedIn messages
6. Sends messages through preferred channels (email, LinkedIn, Slack)
7. Updates HubSpot CRM with follow-up status and next steps
8. Logs all actions and tracks analytics for performance reporting

Workflow Process
1. **Webhook Trigger** initiates the workflow via a POST request with event and attendee data.
2. **Get Attendees** fetches the participant list from the event platform.
3. **Get Interactions** pulls Q&A, chat, poll, and networking activity logs.
4. **Enrich LinkedIn Data** retrieves professional profiles, job titles, and company details via the LinkedIn API.
5. **Merge & Enrich Data** combines all sources into a unified lead profile.
6. **AI Analyze Profile** uses GPT-4 to evaluate interaction depth, role relevance, and conversation context.
7. **Filter High Priority** routes top-tier leads (e.g., decision-makers with strong engagement).
8. **Filter Medium Priority** handles warm prospects for lighter follow-up.
9. **AI Agent1** generates personalized email content using the chat model and memory.
10. **Generate Email** creates a professional, context-aware follow-up email.
11. **Send Email** delivers the message to the lead's inbox.
12. **AI Agent2** crafts a concise, friendly LinkedIn connection message.
13. **Generate LinkedIn Msg** produces a tailored outreach note.
14. **Send LinkedIn** posts the message via the LinkedIn API.
15. **Slack Notification** alerts your team in real time about high-priority outreach.
16. **Update CRM (HubSpot)** adds the contact, tags, and follow-up tasks automatically.
17. **Save to Database (Insert)** logs the full lead journey and message content in PostgreSQL.
18. **Generate Analytics** compiles engagement metrics and success rates.
19. **Send Response** confirms completion back to the event system.

Setup Instructions
1. Import the workflow JSON into n8n.
2. Configure credentials:
   - Event Platform API (for attendees and interactions)
   - LinkedIn API (OAuth2)
   - OpenAI (GPT-4)
   - SMTP or an email service (SendGrid, etc.)
   - HubSpot API key
   - PostgreSQL database
   - Slack webhook URL
3. Trigger with a webhook POST containing the event ID and settings.
4. Watch personalized outreach happen automatically!
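The Merge & Enrich Data step amounts to joining three record sets (event attendee, interaction log, LinkedIn profile) into one lead record. A Python sketch with hypothetical field names; the real event platform and LinkedIn APIs will use different schemas:

```python
def build_lead_profile(attendee, interactions, linkedin):
    """Merge event, interaction, and LinkedIn data into a single lead
    record. All field names here are illustrative assumptions."""
    return {
        "email": attendee["email"],
        "name": attendee["name"],
        # Simple engagement counts the AI analysis can weight later.
        "questions_asked": sum(1 for i in interactions if i["type"] == "qa"),
        "chat_messages": sum(1 for i in interactions if i["type"] == "chat"),
        # LinkedIn enrichment may be missing, so default gracefully.
        "job_title": linkedin.get("job_title", "unknown"),
        "company": linkedin.get("company", "unknown"),
    }

attendee = {"email": "ada@example.com", "name": "Ada"}
interactions = [{"type": "qa"}, {"type": "chat"}, {"type": "qa"}]
linkedin = {"job_title": "CTO"}
print(build_lead_profile(attendee, interactions, linkedin))
```

Keying everything to one unified record is what lets the downstream AI Analyze Profile step reason over role, company, and engagement in a single prompt.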
Prerequisites
- Event platform with webhook and attendee/interaction API
- LinkedIn Developer App with API access
- OpenAI API key with GPT-4 access
- HubSpot account with API enabled
- PostgreSQL database (table for leads and logs)
- Slack workspace (optional, for team alerts)

Example Webhook Payload

```json
{
  "eventId": "evt_spring2025",
  "eventName": "Annual Growth Summit",
  "triggerFollowUp": true,
  "priorityThreshold": { "high": 75, "medium": 50 }
}
```

Modification Options
- Adjust scoring logic in AI Analyze Profile (e.g., weight Q&A participation higher)
- Add custom email templates in Generate Email with your brand voice
- Include meeting booking links (Calendly) in high-priority messages
- Route VIP leads to Send SMS via Twilio
- Export analytics to Google Sheets or BI tools (Looker, Tableau)
- Add an approval step before sending LinkedIn messages

Ready to 10x your event ROI? Get in touch with us for custom n8n automation!
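The `priorityThreshold` values in the example payload drive the High/Medium Priority filters: a lead's engagement score is compared against the two cutoffs to pick an outreach path. A minimal sketch of that routing, assuming a 0–100 score produced by the profile-analysis step:

```python
def route_lead(score, thresholds):
    """Bucket a lead by engagement score using the thresholds from the
    webhook payload, e.g. {"high": 75, "medium": 50}."""
    if score >= thresholds["high"]:
        return "high"    # full personalized email + LinkedIn outreach
    if score >= thresholds["medium"]:
        return "medium"  # lighter follow-up
    return "low"         # log only, no immediate outreach

thresholds = {"high": 75, "medium": 50}
print([route_lead(s, thresholds) for s in (82, 60, 30)])
# → ['high', 'medium', 'low']
```

Raising `high` in the payload narrows who gets the full outreach treatment per event, without touching the workflow itself.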