by Yaron Been
Automated workflow that transforms BuiltWith technology data into actionable sales leads in Trello, creating a visual sales pipeline.

**What It Does**
- Converts tech stack data into Trello cards
- Organizes leads by technology stack
- Tracks sales pipeline stages
- Enables team collaboration
- Updates automatically

**Perfect For**
- Sales teams
- Business development
- Account executives
- Tech startups
- Digital agencies

**Key Benefits**
- Visual sales pipeline
- Easy lead qualification
- Team collaboration
- Technology-based filtering
- Automated data entry

**What You Need**
- BuiltWith API access
- Trello account
- n8n instance
- Google account (for authentication)

**Data Mapped to Trello**
- Company details
- Technology stack
- Contact information
- Website metrics
- Custom labels

**Setup & Support**
Quick setup: start in 20 minutes with the step-by-step guide, watch the tutorial, or get expert support and direct help.

Turn technology intelligence into sales opportunities with automated lead management.
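To illustrate the card-creation step, here is a minimal sketch of how one lead record could become a Trello card through the Trello REST API. The template's own nodes handle this internally; the `lead` field names, the list ID, and the credential placeholders are assumptions for illustration only.

```javascript
// Minimal sketch (not part of the template): create one Trello card from a
// BuiltWith-style lead record via the Trello REST API.
const TRELLO_KEY = 'YOUR_TRELLO_KEY';
const TRELLO_TOKEN = 'YOUR_TRELLO_TOKEN';
const LIST_ID = 'YOUR_PIPELINE_LIST_ID'; // e.g. the "New Leads" list

const lead = {
  company: 'Example Corp',
  website: 'https://example.com',
  technologies: ['Shopify', 'Klaviyo', 'Google Analytics'],
};

async function createLeadCard(lead) {
  const params = new URLSearchParams({
    key: TRELLO_KEY,
    token: TRELLO_TOKEN,
    idList: LIST_ID,
    name: `${lead.company} (${lead.website})`,
    desc: `Tech stack: ${lead.technologies.join(', ')}`,
  });
  const res = await fetch(`https://api.trello.com/1/cards?${params}`, { method: 'POST' });
  if (!res.ok) throw new Error(`Trello API error: ${res.status}`);
  return res.json();
}

createLeadCard(lead).then(card => console.log('Created card', card.id));
```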
by David Olusola
AI Lead Capture System: Complete Setup Guide

**Prerequisites**
- n8n instance (cloud or self-hosted)
- Google AI Studio account (free tier available)
- Google account for Sheets integration
- Website with chat widget capability

**Phase 1: Core Infrastructure Setup**

Step 1: Set Up Google AI Studio
- Go to Google AI Studio
- Create an account or sign in with Google
- Navigate to "Get API Key"
- Create a new API key for your project
- Copy and securely store the API key
- Free tier limits: 15 requests/minute, 1 million tokens/month

Step 2: Configure Google Sheets
- Create a new Google Sheet for lead storage
- Add column headers (exact names): Full Name, Company Name, Email Address, Phone Number, Project Intent/Needs, Project Timeline, Budget Range, Preferred Communication Channel, How they heard about DAEX AI
- Copy the Google Sheet ID from the URL (between /d/ and /edit)
- Ensure the sheet is accessible to your Google account

Step 3: Import n8n Workflow
- Open your n8n instance and create a new workflow
- Click the "..." menu, then Import from JSON
- Paste the provided workflow JSON; the workflow will appear with all nodes connected

**Phase 2: Credential Configuration**

Step 4: Set Up Google Gemini API
- In n8n, go to Credentials, then Add Credential
- Search for "Google PaLM API"
- Enter your API key from Step 1 and test the connection
- Link the credential to the "Google Gemini Chat Model" node

Step 5: Configure Google Sheets Access
- Go to Credentials, then Add Credential, and select "Google Sheets OAuth2 API"
- Follow the OAuth flow to authorize your Google account
- Test the connection with your sheet
- Link the credential to the "Google Sheets" node

**Phase 3: Workflow Customization**

Step 6: Update Company Information
- Open the AI Agent node
- In the system message, replace all mentions of: company name and description, service offerings and specializations, the FAQ knowledge base, typical project timelines and pricing ranges
- Adjust the conversation tone to match your brand voice

Step 7: Configure Lead Qualification Fields
- In the AI Agent system message, modify the required information list: add or remove qualification questions, adjust budget ranges for your services, customize timeline options, update communication channel preferences
- In the Google Sheets node, update the column mappings if you changed fields

Step 8: Set Up Sheet Integration
- Open the Google Sheets node and click the Document ID dropdown
- Select your lead capture sheet
- Verify all column mappings match your sheet headers
- Test with sample data

**Phase 4: Website Integration**

Step 9: Get Webhook URL
- Open the Webhook node in n8n and copy the webhook URL (it starts with your n8n domain)
- Note: the URL format is https://your-n8n-domain.com/webhook/[unique-id]

Step 10: Connect Your Chat Widget by choosing an integration method:

Option A: Direct JavaScript Integration

```javascript
// Add to your website
function sendMessage(message, sessionId) {
  fetch('YOUR_WEBHOOK_URL', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message: message,
      sessionId: sessionId || 'visitor-' + Date.now()
    })
  })
    .then(response => response.json())
    .then(data => {
      // Display AI response in your chat widget
      displayMessage(data.message);
    })
    .catch(error => console.error('Webhook error:', error));
}
```

Option B: Chat Platform Webhook
- Open your chat platform settings (Intercom, Crisp, etc.)
- Find the webhook/integration section
- Add a webhook URL pointing to your n8n endpoint
- Configure it to send the message and session data

Option C: Zapier/Make.com Integration
- Create a new Zap/Scenario
- Trigger: new chat message from your platform
- Action: HTTP POST to your n8n webhook
- Map the message content and session ID

**Phase 5: Testing & Optimization**

Step 11: Test the Complete Flow
- Send a test message through your chat widget
- Verify the AI responds appropriately and conversation context is maintained
- Confirm lead data appears in Google Sheets
- Test with various conversation scenarios

Step 12: Monitor Performance
- Check n8n execution logs for errors
- Monitor Google Sheets for data quality
- Review conversation logs for improvement opportunities
- Track response times and conversion rates

Step 13: Fine-Tune Conversations
- Analyze real conversation logs
- Update system prompts based on common questions
- Add new FAQ knowledge to the AI agent
- Adjust qualification questions based on lead quality
- Optimize for your specific customer patterns

**Phase 6: Advanced Features (Optional)**

Step 14: Add Lead Scoring
- Create a new column in Google Sheets for "Lead Score"
- Update the AI agent to calculate scores based on: budget range (higher budget = higher score), timeline urgency (sooner = higher score), project complexity (complex = higher score)
- Add conditional formatting in Google Sheets to highlight high-value leads

Step 15: Set Up Notifications
- Add an email notification node after Google Sheets
- Configure it to send alerts for high-priority leads
- Include lead details and a conversation summary
- Set up different notification rules for different lead scores

Step 16: Analytics Dashboard
- Connect Google Sheets to Google Data Studio or similar
- Create a dashboard showing: daily lead volume, conversion rates by source, average qualification time, lead quality scores, revenue pipeline from captured leads

**Troubleshooting Common Issues**

AI Not Responding
- Check Google Gemini API key validity
- Verify the API quota is not exceeded
- Review n8n execution logs for errors

Data Not Saving to Sheets
- Confirm Google Sheets permissions
- Check that column names match
- Verify the sheet ID is correct

Chat Widget Not Connecting
- Test the webhook URL directly with curl/Postman (see the sketch below)
- Verify the JSON format matches the expected structure
- Check CORS settings for browser-based integrations

Conversation Context Lost
- Ensure sessionId is unique per visitor
- Check the memory node configuration
- Verify sessionId is passed consistently
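For the curl/Postman check above, an equivalent Node.js test is sketched below. The URL and the payload keys mirror the Option A snippet and must match your actual Webhook node configuration; treat them as placeholders.

```javascript
// Minimal sketch for testing the webhook outside a browser (Node 18+ has a
// global fetch). Replace the URL with your own production webhook URL.
const WEBHOOK_URL = 'https://your-n8n-domain.com/webhook/REPLACE-WITH-YOUR-ID';

async function testWebhook() {
  const res = await fetch(WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      message: 'Hi, I need a quote for a small project',
      sessionId: 'test-' + Date.now(),
    }),
  });
  console.log('Status:', res.status);
  console.log('Body:', await res.text()); // raw text, in case the response is not JSON
}

testWebhook().catch(console.error);
```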
by Halfbit
Jura Coffee Counter: Webhook API & Google Sheets Logger

Track how many coffees your Jura E8 espresso machine makes, fully automated via webhook and Google Sheets.

This workflow exposes a custom API endpoint that can be called by smart devices, such as an ESP8266 or ESP32 reading data from a Jura E8 coffee machine via Bluetooth Low Energy (BLE). The incoming data (including the total coffee count) is timestamped and appended to a Google Sheet, making it easy to visualize or analyze your machine usage. Originally built for a Jura E8, based on the AlexxIT/Jura reverse-engineering project.

> This workflow uses Google Sheets as a logging backend. You can easily switch it to Airtable, Notion, or a database of your choice. Live example available at: https://halfbitstudio.com/o-nas/
> In our setup, this workflow provides real-time coffee consumption stats displayed directly on our website.
> Some Jura machines require an accessory Bluetooth transmitter to enable connectivity. Communication is based on the Bluetooth Low Energy (BLE) protocol.

**Use Case**
- Tracking usage of a Jura coffee machine
- Logging IoT sensor data into Google Sheets
- Creating dashboards for daily consumption
- Smart office setups with coffee stats!

**Features**
- Two webhook endpoints:
  - POST /{{WEBHOOK_POST_PATH}}: receives JSON from the ESP (coffee machine reader)
  - GET /{{WEBHOOK_GET_PATH}}: returns the latest records as JSON
- Timestamping via the Date & Time node
- Coffee counter extraction from the incoming JSON
- Appends structured rows to Google Sheets
- Webhook response for external status or dashboards

**Setup Instructions**

Jura Coffee Machine Integration (Hardware)
- Use an ESP device (e.g. ESP8266 or ESP32) to connect to the Jura E8 via Bluetooth Low Energy (BLE).
- Send POST requests with the JSON payload: { "total_coffees": 123 }
- Reverse-engineered protocol reference: AlexxIT/Jura

Google Sheets Configuration
- Create a new Google Sheet with column headers like: date | time | coffee counter
- Connect your Google account in n8n and authorize access to this sheet.
- Replace the documentId and sheetName fields in the Google Sheets nodes: use the full URL of your spreadsheet and the actual sheet name (e.g. Sheet1).

**Environment Variables & Placeholders**

| Placeholder | Description |
| --- | --- |
| {{WEBHOOK_POST_PATH}} | Endpoint to receive coffee counter data |
| {{WEBHOOK_GET_PATH}} | Endpoint to return latest data (for dashboards) |
| {{SHEET_ID}} | Google Spreadsheet ID |
| {{GOOGLE_CREDENTIALS}} | OAuth2 credentials for Google Sheets |
| {{DATA_COLUMNS}} | Column names in the target sheet |

**Testing the Workflow**
- Send a test request: use Postman or the ESP to send a POST request to /{{WEBHOOK_POST_PATH}} with a total_coffees value in the body (see the sketch below).
- Check the Google Sheet: open your sheet and verify that a new row was appended.
- Test the GET endpoint: access the second webhook URL (e.g. /{{WEBHOOK_GET_PATH}}) in a browser or fetch it via API.
- Optional: use the Respond to Webhook output in a dashboard or frontend.

**Customization Tips**
- **Sheet format**: add more columns if you want to track additional data (e.g. machine temperature, errors)
- **Output format**: replace Google Sheets with any other storage (e.g. MySQL, Notion)
- **Auth layer**: add basic auth or token verification if needed for public exposure
- **Notifications**: send alerts to Discord/Slack when reaching thresholds (e.g. 200 coffees brewed)

Tags: google-sheets, iot, webhook, jura, coffee, api, automation
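A minimal client-side sketch of both endpoints follows, assuming placeholder paths and an example n8n domain. On the ESP the same POST would be issued by its HTTP client library; this version is only meant to show the request shapes.

```javascript
// Sketch only: log a reading, then read the data back. The base URL and the
// two paths are assumptions; substitute your real {{WEBHOOK_POST_PATH}} and
// {{WEBHOOK_GET_PATH}} values.
const BASE = 'https://your-n8n-domain.com/webhook';

async function logCoffees(totalCoffees) {
  // What the ESP sends after reading the Jura counter over BLE
  await fetch(`${BASE}/coffee-post`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ total_coffees: totalCoffees }),
  });
}

async function fetchLatest() {
  // What a dashboard or website widget would call
  const res = await fetch(`${BASE}/coffee-get`);
  return res.json();
}

logCoffees(123)
  .then(fetchLatest)
  .then(rows => console.log(rows))
  .catch(console.error);
```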
by Yang
**What this workflow does**
This workflow turns YouTube video links into ready-to-edit newsletter drafts using Dumpling AI and GPT-4o. It reads new video URLs from a Google Sheet, extracts their transcripts, summarizes them into email-friendly content, and logs the finished draft back into the same sheet. An email notification is also sent to alert the user once each draft is created.

**Who is this for**
- Newsletter writers or marketers repurposing video content
- YouTube creators building email follow-ups from videos
- Agencies or VAs batching social-to-email content
- Automation users streamlining content workflows

**How to set up**

Requirements
- **Google Sheet** with the following columns:
  - link: YouTube video URL
  - blog post: for saving the generated newsletter draft
- Active accounts for: Dumpling AI (API for YouTube transcripts), OpenAI GPT-4 or GPT-4o, Google Sheets, Gmail (OAuth2 credential)

Setup steps
- Connect all credentials using n8n's Credential Manager: Google Sheets (OAuth2), Dumpling AI (via HTTP Header Auth), OpenAI, Gmail.
- Update the sheet ID and tab name in both Google Sheets nodes.
- Customize the GPT-4o prompt (optional): it lives in the "GPT-4o: Write Newsletter Draft from Transcript" node, and you can edit tone, structure, and audience targeting in the system message.
- Verify the email recipient in the Gmail node and update it if needed.

**How it works**
1. The workflow is triggered manually or on a schedule.
2. It pulls YouTube links without drafts from the sheet (see the filtering sketch below).
3. Each video's transcript is fetched using Dumpling AI.
4. GPT-4o summarizes the transcript into a clean, friendly newsletter format.
5. The draft is written back to the same row in Google Sheets.
6. An email is sent to notify the user that the draft is ready.

**Customization ideas**
- Send finished drafts to Notion or Airtable instead of Sheets
- Generate social media posts from the same transcript
- Add automatic review steps using GPT scoring or editing
- Trigger this on new form submissions or YouTube uploads instead

This is a fast, AI-powered way to turn long-form video content into clean, polished newsletters, ready to share or schedule with minimal editing.
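A minimal Code-node-style sketch of the filtering step in "How it works", assuming the column names from the setup section. In the actual template this filtering may be handled by the Google Sheets node itself; this is only an equivalent, illustrative version.

```javascript
// Keep only rows that have a video link but no newsletter draft yet.
const rowsToProcess = $input.all().filter(item => {
  const row = item.json;
  const hasLink = (row['link'] || '').trim() !== '';
  const hasDraft = (row['blog post'] || '').trim() !== '';
  return hasLink && !hasDraft;
});

return rowsToProcess;
```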
by Abhishek Patoliya
This n8n automation sends daily weather updates directly to your Telegram chat using live data from OpenWeatherMap. It supports automatic daily updates and manual lookups via form input.

**Prerequisites**
Before you begin, make sure you have:
- A working n8n instance (v1.0 or later recommended).
- An account with OpenWeatherMap (the free plan is sufficient).
- A Telegram bot created via @BotFather.
- Your Telegram user ID or chat ID.

**API & Bot Setup**

OpenWeatherMap API
- Go to https://openweathermap.org/api
- Sign up and verify your account.
- Navigate to API Keys in your account dashboard.
- Copy your API key (used later in the HTTP Request node).

Telegram Bot
- Open @BotFather in Telegram.
- Run /newbot and follow the prompts: choose a name and username for your bot, and copy the bot token you receive.
- Start a chat with your new bot to activate it.
- To get your Telegram user ID, use @userinfobot or an n8n Telegram Trigger node.

**Trigger Options**
- Schedule Trigger (automatic): runs daily at 8:00 AM IST, ideal for consistent, passive updates.
- Form Trigger (manual): input city and country manually and instantly receive weather info in Telegram.

**How the Flow Works**
1. Trigger activated (scheduled or form).
2. City and country fetched (default or from the form).
3. HTTP request sent to OpenWeatherMap with the API key.
4. Weather data parsed and formatted: current date, city and country, weather description, temperature (°C), humidity (%), wind speed (m/s), atmospheric pressure, sunrise time (IST), sunset time (IST).
5. Message sent to Telegram.

**Nodes Used**
- **Schedule Trigger**: runs every day at 8:00 AM IST
- **Form Trigger**: accepts user input
- **Set Node**: default city/country values and date formatting
- **HTTP Request**: calls the OpenWeatherMap API
- **Function Node**: converts timestamps to IST (see the sketch below)
- **Telegram Node**: sends the formatted weather message

**Example Telegram Output**
Wednesday, 10 July 2025
Weather in Mumbai, IN:
Condition: Clear sky
Temperature: 30°C
Humidity: 70%
Wind Speed: 3 m/s
Pressure: 1013 hPa
Sunrise: 5:57:12 AM
Sunset: 6:53:45 PM

**Customization Tips**
- Change the default city/country: locate the Set node (used before the API call) and replace "Mumbai" and "IN" with your preferred location, or connect the Form Trigger input to allow dynamic values.
- Change the schedule time: open the Schedule Trigger node and adjust it to your preferred time zone and daily timing (e.g., 7 AM IST).
- Add extra data: OpenWeatherMap returns more fields such as visibility, UV index, etc. You can include these in your Telegram message via the Function node and Set node.
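A minimal sketch of the timestamp conversion done in the Function node, assuming the standard OpenWeatherMap current-weather response shape (`sys.sunrise` and `sys.sunset` as Unix seconds in UTC); the exact field names the template maps afterwards may differ.

```javascript
// Convert the Unix sunrise/sunset timestamps to IST time strings.
const data = $input.first().json;   // response of the HTTP Request node

const toIST = (unixSeconds) =>
  new Date(unixSeconds * 1000).toLocaleTimeString('en-IN', {
    timeZone: 'Asia/Kolkata',
    hour12: true,
  });

return [{
  json: {
    ...data,
    sunriseIST: toIST(data.sys.sunrise),
    sunsetIST: toIST(data.sys.sunset),
  },
}];
```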
by Roshan Ramani
**Overview**
An intelligent automation workflow that monitors your Gmail inbox and sends AI-powered summaries of important emails directly to your Telegram chat. Perfect for staying updated on critical communications without constantly checking your email.

**Key Features**
- **Real-time Email Monitoring**: checks Gmail every minute for new emails
- **Smart Content Filtering**: only processes emails containing important keywords
- **AI-Powered Summarization**: uses GPT-4o-mini to create concise, human-readable summaries
- **Instant Telegram Notifications**: delivers summaries directly to your preferred Telegram chat
- **Customizable Keywords**: easily modify filters to match your specific needs

**How It Works**
1. Email Trigger: continuously monitors your Gmail inbox for new messages.
2. Smart Filter: analyzes the email subject and body for important keywords (sales, jobs, etc.); see the filter sketch below.
3. AI Processing: sends relevant emails to OpenAI for intelligent summarization.
4. Telegram Delivery: sends the formatted summary to your Telegram chat.

Sample output:
- Your Flipkart order "Bluetooth Speaker" was delivered today. Enjoy!
- Invoice from AWS for $23.50 is due by July 20. Check the billing portal.
- HR shared your July payslip. No action needed unless there's an error.

**Setup Requirements**
- Gmail account with OAuth2 credentials
- OpenAI API key
- Telegram bot token and chat ID
- n8n instance (cloud or self-hosted)

**Use Cases**
- **Business Alerts**: payment due notices, invoice reminders
- **E-commerce**: order confirmations, delivery updates
- **HR Communications**: payslips, policy updates, announcements
- **Security**: login alerts, security notifications
- **Job Hunting**: application responses, interview invitations

**Customization Options**
- **Keyword Filters**: add/remove keywords in the filter node (invoice, payment, security, delivery, etc.)
- **AI Prompt**: modify the summarization style and format
- **Polling Frequency**: adjust the email checking interval
- **Multiple Chats**: send to different Telegram chats based on email type

**Privacy & Security**
- Processes emails locally through n8n
- No email content stored permanently
- Uses secure OAuth2 authentication
- Respects Gmail API rate limits

**Performance**
- Lightweight and efficient with minimal resource usage
- Fast AI processing with GPT-4o-mini
- Reliable Telegram delivery

**Pro Tips**
- Start with broad keywords and refine based on results
- Use multiple condition branches for different email types
- Set up different Telegram chats for work vs personal emails
- Monitor your OpenAI usage to avoid unexpected costs
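A minimal sketch of the keyword check from the "Smart Filter" step. The template does this with a filter/IF node; the keyword list and the email field names here are illustrative assumptions.

```javascript
// Pass an email through only when its subject or body contains a keyword.
const KEYWORDS = ['invoice', 'payment', 'delivery', 'security', 'interview', 'payslip'];

const email = $input.first().json;
const haystack = `${email.subject || ''} ${email.text || ''}`.toLowerCase();

const isImportant = KEYWORDS.some(keyword => haystack.includes(keyword));

return isImportant ? [{ json: email }] : [];
```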
by Adrian Bent
This workflow takes two inputs: a YouTube video URL (required) and a description of what information to extract from the video. If the description ("what you want") field is left empty, the default prompt generates a detailed summary and description of the video's contents; you can ask for something more specific using this field.

Don't forget to make the workflow Active and use the production URL from the form node.

**Benefits**
- Instant Summary Generation: convert hours of watching YouTube videos into familiar, structured paragraphs and sentences in less than a minute
- Live Integration: generate a summary or extract information from a YouTube video whenever, wherever
- Virtually Complete Automation: all that needs to be done is to add the video URL and describe what you want to know from the video
- Presentation: you can ask for a specific structure or tone to better help you understand or study the contents of the video

**How It Works**
- Smart Form Interface: a simple n8n form captures the video URL and a description of what is to be extracted, designed for rapid and repeated completion anywhere and anytime.
- Description Check: uses JavaScript to determine whether the description was filled in or left empty. If it was left empty, the default prompt is: "Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary." If it is filled in, that input is used to describe what information to extract from the video.
- HTTP Request: we use the Gemini API, specifically the video-understanding endpoint, and make a POST request passing the video URL and the description of what information to extract.

**Setup Instructions**

HTTP Request setup:
- Sign up for a Google Cloud account, join the Developer Program, and get your Gemini API key.
- Get the curl snippet for the Gemini Video Understanding API.
- The video understanding relies on the inputs from the form, code, and HTTP Request nodes, so correct mapping is essential for the workflow to function correctly.

Feel free to reach out for additional help or clarification at my Gmail: terflix45@gmail.com, and I'll get back to you as soon as I can.

**Setup Steps**

Code Node setup: the Code node acts as a filter to ensure a description prompt is always passed on. Use JavaScript along these lines (the field name must match the label of the description field in your form node exactly):

```javascript
// Ensure a description prompt is always passed on to the HTTP Request node.
const DEFAULT_PROMPT =
  "Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary.";

for (const item of $input.all()) {
  const description = (item.json['What do you want?'] || '').trim();
  if (description === '') {
    item.json['What do you want?'] = DEFAULT_PROMPT;
  }
}

return $input.all();
```

HTTP Request:
- To use Gemini Video Understanding, you'll need your Gemini API key.
- Go to https://ai.google.dev/gemini-api/docs/video-understanding#youtube. This link takes you directly to the snippet. Select REST as the programming language, copy the curl command, then paste it into the HTTP Request node.
- Replace "Please summarize the video in 3 sentences." with the code node's output, which should be either the default description or the one entered by the user (the second output field variable).
- Replace "https://www.youtube.com/watch?v=9hE5-98ZeCg" with the n8n form node's first output field, which should be the YouTube video URL variable.
- Replace $GEMINI_API_KEY with your API key.

Redirect: use an n8n Form node with page type "Final Ending" to redirect the user to the initial n8n form for another analysis, or to a preferred destination.
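For orientation, here is a hedged sketch of roughly what that HTTP request looks like, written as fetch instead of curl. The model name, endpoint version, and response path are assumptions based on the Gemini video-understanding docs and may change; defer to the current curl snippet from the docs linked above.

```javascript
// Sketch only: mirror of the documented curl command, with the two
// substitutions described above applied as function parameters.
const GEMINI_API_KEY = 'YOUR_GEMINI_API_KEY';
const MODEL = 'gemini-2.0-flash'; // assumed model name; use whatever the docs show

async function describeVideo(videoUrl, prompt) {
  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent?key=${GEMINI_API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        contents: [{
          parts: [
            { text: prompt },                       // code node output (description)
            { file_data: { file_uri: videoUrl } },  // form node output (YouTube URL)
          ],
        }],
      }),
    }
  );
  const data = await res.json();
  return data.candidates?.[0]?.content?.parts?.[0]?.text;
}

describeVideo(
  'https://www.youtube.com/watch?v=9hE5-98ZeCg',
  'Please summarize the video in 3 sentences.'
).then(console.log).catch(console.error);
```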
by Dhruv Dalsaniya
**Description**: This n8n workflow automates a Discord bot to fetch messages from a specified channel and send AI-generated responses in threads. It ensures smooth message processing and interaction, making it ideal for managing community discussions, customer support, or AI-based engagement. The workflow leverages Redis for memory persistence, ensuring that conversation history is maintained even if the workflow restarts, providing a seamless user experience.

**How It Works**
- The bot listens for new messages in a specified Discord channel.
- It sends the messages to an AI model for response generation.
- The AI-generated reply is posted as a thread under the original message.
- The bot runs on an Ubuntu server and is managed using PM2 for uptime stability.
- The Discord bot (Python script) acts as the bridge, capturing messages from Discord and sending them to the n8n webhook. The n8n workflow then processes these messages, interacts with the AI model, and sends the AI's response back to Discord via the bot.

**Prerequisites to host the bot**
- Sign up on Pella, a managed hosting service for Discord bots (easy setup).
- A Redis instance for memory persistence. Redis is an in-memory data structure store, used here to store and retrieve conversation history so the AI can maintain context across multiple interactions. This is crucial for coherent and continuous conversations.

**Set Up Steps**

1. Create a Discord Bot
- Go to the Discord Developer Portal.
- Click "New Application", enter a name, and create it.
- Navigate to Bot > Reset Token, then copy the Bot Token.
- Enable Privileged Gateway Intents (Presence, Server Members, Message Content).
- Under OAuth2 > URL Generator, select the bot scope and required permissions.
- Copy the generated URL, open it in a browser, select your server, and click Authorize.

2. Deploy the Bot on Pella
- Create a new folder discord-bot and navigate into it.
- Create and configure an .env file to store your bot token and webhook URL (you can copy the webhook URL from the n8n workflow):

```
TOKEN=your-bot-token-here
WEBHOOK_URL=https://your-domain.tld/webhook/getmessage
```

- Create a file main.py, copy the code below into it, and save it:

```python
import discord
import requests
import json
import os
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()
TOKEN = os.getenv("TOKEN")
WEBHOOK_URL = os.getenv("WEBHOOK_URL")

# Bot configuration
LISTEN_CHANNELS = ["YOUR_CHANNEL_ID_1", "YOUR_CHANNEL_ID_2"]  # Replace with your target channel IDs

# Intents setup
intents = discord.Intents.default()
intents.messages = True          # Enable message events
intents.guilds = True
intents.message_content = True   # Required to read message content

client = discord.Client(intents=intents)


@client.event
async def on_ready():
    print(f"Logged in as {client.user}")


@client.event
async def on_message(message):
    if message.author == client.user:
        return  # Ignore the bot's own messages

    if str(message.channel.id) in LISTEN_CHANNELS:
        try:
            fetched_message = await message.channel.fetch_message(message.id)  # Ensure correct fetching
            payload = {
                "channel_id": str(fetched_message.channel.id),
                "chat_message": fetched_message.content,
                "timestamp": str(fetched_message.created_at),
                "message_id": str(fetched_message.id),
                "user_id": str(fetched_message.author.id),
            }
            headers = {"Content-Type": "application/json"}
            response = requests.post(WEBHOOK_URL, data=json.dumps(payload), headers=headers)
            if response.status_code == 200:
                print(f"Message sent successfully: {payload}")
            else:
                print(f"Failed to send message: {response.status_code}, Response: {response.text}")
        except Exception as e:
            print(f"Error fetching message: {e}")


client.run(TOKEN)
```

- Create requirements.txt with the bot's dependencies (requests is included because main.py imports it):

```
discord
python-dotenv
requests
```

3. Follow the video to set up the bot so it runs 24/7: https://www.youtube.com/watch?v=rNnK3XlUtYU
Note: the free plan expires after 24 hours, so please opt for the paid plan on Pella to keep your bot running.

4. n8n Workflow Configuration
The n8n workflow consists of the following nodes:
- **Get Discord Messages (Webhook)**: the entry point for messages from the Discord bot. It receives the channel_id, chat_message, timestamp, message_id, and user_id from Discord when a new message is posted in a configured channel. Its webhook path is /getmessage and it expects a POST request.
- **Chat Agent (Langchain Agent)**: processes the incoming Discord message (chat_message). It is configured as a conversational agent, integrating the language model and memory to generate an appropriate response, with a prompt that keeps replies under 1800 characters.
- **OpenAI -4o-mini (Langchain Language Model)**: connects to the OpenAI API and uses the gpt-4o-mini-2024-07-18 model for generating AI responses. This is the core AI component of the workflow.
- **Message History (Redis Chat Memory)**: manages the conversation history using Redis. It stores and retrieves chat messages, ensuring the Chat Agent maintains context for each user based on their user_id. This is critical for coherent multi-turn conversations.
- **Calculator (Langchain Tool)**: provides a calculator tool the AI agent can use whenever a mathematical calculation is required, extending the AI's capabilities beyond text generation.
- **Response fromAI (Discord)**: sends the AI-generated response back to the Discord channel. It uses the Discord Bot API credentials and replies in a thread under the original message (message_id) in the specified channel_id.
- **Sticky Notes (1 through 5 and unnumbered)**: informational nodes within the workflow providing instructions, code snippets for the Discord bot, and setup guidance, covering the .env file, requirements.txt, the Python bot code, and general recommendations for channel configuration and adding tools.

5. Setting up Redis
- Choose a Redis hosting provider: you can use a cloud provider such as Redis Labs or Aiven, or set up your own Redis instance on a VPS.
- Obtain the Redis connection details: once your instance is running, you will need the host, port, and password (if applicable).
- Configure the n8n Redis nodes: in your n8n workflow, configure the "Message History" node with your Redis connection details. Ensure the Redis credential named redis-for-n8n is properly set up with your instance details (host, port, password).

6. Customizing the Template
- **AI Model**: you can swap out the "OpenAI -4o-mini" node for any other AI service supported by n8n (e.g., Cohere, Hugging Face) to use a different language model. Ensure the new language model node is connected to the ai_languageModel input of the "Chat Agent" node.
- **Agent Prompt**: modify the text parameter in the "Chat Agent" node to change the AI's persona, provide specific instructions, or adjust the response length.
- **Additional Tools**: the "Calculator" node is one example of an AI tool. You can add more Langchain tool nodes (e.g., search, data lookup) and connect them to the ai_tool input of the "Chat Agent" node to extend the AI's capabilities. Refer to "Sticky Note5" in the workflow for a reminder.
- **Channel Filtering**: adjust the LISTEN_CHANNELS list in the main.py file of your Discord bot to include or exclude the Discord channel IDs where the bot should listen for messages.
- **Thread Management**: the "Response fromAI" node can be modified to change how threads are created or managed, or to send responses directly to the channel instead of a thread. The current setup links the response to the original message ID (message_reference); see the API sketch below.

7. Testing Instructions
- Start the Discord bot: ensure your main.py script is running on Pella.
- Activate the n8n workflow: make sure the workflow is active and listening for webhooks.
- Send a message in Discord: go to one of the LISTEN_CHANNELS in your Discord server and send a message.
- Verify the response: the bot should capture the message, send it to n8n, receive an AI-generated response, and post it as a thread under your original message.
- Check Redis: verify that the conversation history is being stored and updated correctly in your Redis instance. Look for keys related to user IDs.

Now your bot is running in the background!
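For reference, here is a minimal sketch of the underlying Discord API call that the "Response fromAI" node conceptually performs when it links a reply to the original message via message_reference. The n8n Discord node handles this for you; the token and IDs below are placeholders.

```javascript
// Sketch only: post the AI reply referencing the original message.
const BOT_TOKEN = 'YOUR_BOT_TOKEN';

async function replyToMessage(channelId, messageId, content) {
  const res = await fetch(`https://discord.com/api/v10/channels/${channelId}/messages`, {
    method: 'POST',
    headers: {
      Authorization: `Bot ${BOT_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      content,                                       // keep under 2000 characters
      message_reference: { message_id: messageId },  // ties the reply to the original message
    }),
  });
  if (!res.ok) throw new Error(`Discord API error: ${res.status}`);
  return res.json();
}
```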
by Aurélien P.
Daily Weather Forecast Bot

A comprehensive n8n workflow that fetches detailed weather forecasts from OpenWeatherMap and sends beautifully formatted daily summaries to Telegram.

**Features**
- **Daily Overview**: complete temperature range, rainfall totals, and wind conditions
- **Hourly Forecast**: weather predictions at key times (9AM, 12PM, 3PM, 6PM, 9PM)
- **Smart Emojis**: context-aware weather icons and temperature indicators
- **Smart Recommendations**: contextual advice (umbrella alerts, clothing suggestions, sun protection)
- **Enhanced Details**: feels-like temperature, humidity levels, wind speed, UV warnings
- **Rich Formatting**: HTML-formatted messages with emojis for excellent readability
- **Timezone-Aware**: proper handling of the Luxembourg timezone (CET/CEST)

**What This Workflow Does**
1. Triggers daily at 7:50 AM to send morning weather updates
2. Fetches the 5-day forecast from the OpenWeatherMap API with 3-hour intervals
3. Processes and analyzes the weather data with smart algorithms (see the sketch below)
4. Formats a comprehensive report with HTML styling and emojis
5. Sends it to Telegram with professional formatting and actionable insights

**Setup Instructions**

1. OpenWeatherMap API
- Sign up at OpenWeatherMap
- Get your free API key (1000 calls/day included)
- Replace API_KEY in the HTTP Request node URL

2. Telegram Bot
- Message @BotFather on Telegram
- Send the /newbot command and follow the instructions
- Copy the bot token into n8n credentials
- Get your chat ID by messaging the bot, then visiting: https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getUpdates
- Update the chatId parameter in the Telegram node

3. Location Configuration
- Default location: Strassen, Luxembourg
- To change it: modify q=Strassen in the HTTP Request URL
- Format: q=CityName,CountryCode (e.g., q=Paris,FR)

**Technical Details**
- **API Source**: OpenWeatherMap 5-day forecast
- **Schedule**: daily at 7:50 AM (configurable)
- **Format**: HTML with rich emoji formatting
- **Error Handling**: 3 retry attempts with 5-second delays
- **Rate Limits**: uses only 1 API call per day
- **Timezone**: Europe/Luxembourg (handles CET/CEST automatically)

**Weather Data Analyzed**
- Temperature ranges and "feels like" temperatures
- Precipitation forecasts and accumulation
- Wind speed and conditions
- Humidity levels and comfort indicators
- Cloud coverage and visibility
- UV index recommendations
- Time-specific weather patterns

**Smart Features**
- **Conditional Recommendations**: only shows relevant advice
- **Night/Day Awareness**: different emojis for time of day
- **Temperature Context**: color-coded temperature indicators
- **Weather Severity**: appropriate icons for weather intensity
- **Humidity Comfort**: comfort level indicators
- **Wind Analysis**: descriptive wind condition text

**Customization Options**
- **Schedule**: modify the trigger time in the Schedule node
- **Location**: change the city in the HTTP Request URL
- **Forecast Hours**: adjust the desiredHours array in the code
- **Temperature Thresholds**: modify the emoji temperature ranges
- **Recommendation Logic**: customize the advice triggers

**Sample Output**
Weather Forecast for Strassen, LU
Monday, 2 June 2025

Daily Overview
Range: 12°C - 22°C
Comfortable (65%)

Hourly Forecast
09:00: 15°C
12:00: 20°C
15:00: 22°C (feels 24°C)
18:00: 19°C
21:00: 16°C

Data from OpenWeatherMap | Updated: 07:50 CET

**Getting Started**
1. Import this workflow into your n8n instance
2. Add your OpenWeatherMap API key
3. Set up the Telegram bot credentials
4. Test manually first
5. Activate it for daily automated runs

**Requirements**
- n8n instance (cloud or self-hosted)
- Free OpenWeatherMap API account
- Telegram bot token
- Basic understanding of n8n workflows

Perfect for: daily weather updates, team notifications, personal weather tracking, smart home automation triggers.
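A minimal Code-node-style sketch of the "processes and analyzes" step, assuming the public OpenWeatherMap 5-day/3-hour forecast schema (`list[].dt`, `list[].dt_txt`, `list[].main.temp`). It ignores the Luxembourg timezone handling, emoji selection, and recommendations for brevity; the template's real code node does considerably more.

```javascript
// Pick out the desired hours and compute today's temperature range
// from the 5-day / 3-hour forecast response.
const forecast = $input.first().json;          // response of the HTTP Request node
const desiredHours = [9, 12, 15, 18, 21];

const today = new Date().toISOString().slice(0, 10);
const todayEntries = forecast.list.filter(entry => entry.dt_txt.startsWith(today));

const temps = todayEntries.map(e => e.main.temp);
const summary = {
  min: Math.min(...temps),
  max: Math.max(...temps),
  hourly: todayEntries
    .filter(e => desiredHours.includes(new Date(e.dt * 1000).getUTCHours()))
    .map(e => ({ time: e.dt_txt.slice(11, 16), temp: Math.round(e.main.temp) })),
};

return [{ json: summary }];
```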
by Derek Cheung
**Purpose of workflow**
Automate scraping of a website, transform the result into a structured format, and load it directly into a Google Sheets spreadsheet.

**How it works**
- Web Scraping: uses the Jina AI service to scrape website data and convert it into LLM-friendly text.
- Information Extraction: employs an AI node to extract specific book details (title, price, availability, image URL, product URL) from the scraped data.
- Data Splitting: splits the extracted information into individual book entries.
- Google Sheets Integration: automatically populates a Google Sheets spreadsheet with the structured book data.

**Step-by-step setup**
1. Set up the Jina AI service: sign up for a Jina AI account and obtain an API key.
2. Configure the HTTP Request node: enter the Jina AI URL with the target website, and add the API key to the request headers for authentication (see the sketch below).
3. Set up the Information Extractor node: use Claude AI to generate a JSON schema for data extraction, upload a screenshot of the target website to Claude AI, ask it to suggest a JSON schema for extracting the required information, and copy the generated schema into the Information Extractor node.
4. Configure the Split node: set it up to separate the extracted data into individual book entries.
5. Set up the Google Sheets node: create a spreadsheet with columns for title, price, availability, image URL, and product URL, and configure the node to map the extracted data to the appropriate columns.
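The template's HTTP Request node performs the scraping call; the standalone sketch below shows the same idea, assuming the Jina Reader endpoint (r.jina.ai) and a placeholder target site used for book-scraping demos.

```javascript
// Sketch only: the Jina Reader endpoint returns the page as LLM-friendly text.
// API key and target URL are placeholders.
const JINA_API_KEY = 'YOUR_JINA_API_KEY';
const targetUrl = 'https://books.toscrape.com/';   // example target site

async function scrapeAsText(url) {
  const res = await fetch(`https://r.jina.ai/${url}`, {
    headers: { Authorization: `Bearer ${JINA_API_KEY}` },
  });
  if (!res.ok) throw new Error(`Jina Reader error: ${res.status}`);
  return res.text();   // LLM-friendly text, passed on to the Information Extractor
}

scrapeAsText(targetUrl).then(text => console.log(text.slice(0, 500)));
```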
by Yaron Been
Automated system to track and analyze the technology stacks used by target companies, helping identify decision-makers and technology trends.

**What It Does**
- Tracks the technology stack of target companies
- Identifies key decision-makers (CTOs, Tech Leads)
- Monitors technology changes and updates
- Provides competitive intelligence
- Generates actionable insights

**Perfect For**
- B2B SaaS companies
- Technology vendors
- Sales and business development teams
- Competitive intelligence analysts
- Market researchers

**Key Benefits**
- Identify potential customers
- Stay ahead of technology trends
- Target decision-makers effectively
- Monitor competitor technology stacks
- Data-driven sales strategies

**What You Need**
- BuiltWith API key
- n8n instance
- CRM integration (optional)
- Email/Slack for alerts

**Data Tracked**
- Company technologies
- Hosting providers
- Frameworks and libraries
- Analytics tools
- Marketing technologies

**Setup & Support**
Quick setup: deploy in 20 minutes with the step-by-step guide, watch the tutorial, or get expert support and direct help.

Gain a competitive edge by understanding the technology landscape of your target market.
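As an illustration of the request behind the tracking step, here is a hedged sketch of a BuiltWith domain lookup. The endpoint version and response shape are assumptions based on the public BuiltWith Domain API and depend on your plan; check the BuiltWith API docs before relying on them.

```javascript
// Sketch only: look up the technologies reported for one domain.
const BUILTWITH_API_KEY = 'YOUR_BUILTWITH_API_KEY';

async function lookupTechnologies(domain) {
  const url = `https://api.builtwith.com/v21/api.json?KEY=${BUILTWITH_API_KEY}&LOOKUP=${domain}`;
  const res = await fetch(url);
  const data = await res.json();
  // Flatten technology names out of the nested Results/Paths/Technologies structure
  const techs = (data.Results?.[0]?.Result?.Paths || [])
    .flatMap(path => path.Technologies || [])
    .map(t => t.Name);
  return [...new Set(techs)];
}

lookupTechnologies('example.com').then(console.log).catch(console.error);
```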
by Manuel
Effortlessly optimize your workflow by automatically importing hundreds of manufacturers from a Google Sheet into your Shopware online store, saving countless hours of manual work.

**How it works**
- Retrieve all manufacturers from a Google Sheet
- Add each manufacturer to Shopware via the Shopware Sync API endpoint
- Upload a logo for each manufacturer from a provided public URL to Shopware

**Set Up Steps**
1. Add your Shopware URL to the first node, called Settings.
2. Create a Google Sheet in your Google account with the following columns (Demo Sheet):
   - name (the name of the manufacturer; must be unique and is required)
   - website (URL of the manufacturer website)
   - description
   - logo_url (public manufacturer logo URL; must be a PNG, JPG, or SVG file)
   - translation_language_code_1 (optional; language code of your language, for example 'es-ES' for Spanish. Make sure a language with this code exists in your Shopware shop.)
   - translation_name_1 (optional; manufacturer name translated into the language defined in translation_language_code_1)
   - translation_description_1 (optional; manufacturer description translated into the language defined in translation_language_code_1)
   - translation_language_code_2 (optional; same as translation_language_code_1 for another language)
   - translation_name_2 (optional; same as translation_name_1 for another language)
   - translation_description_2 (optional; same as translation_description_1 for another language)
   - translation_language_code_3 (optional; same as translation_language_code_1 for another language)
   - translation_name_3 (optional; same as translation_name_1 for another language)
   - translation_description_3 (optional; same as translation_description_1 for another language)
3. Connect to your Google account.
4. Connect to your Shopware account: create a Shopware Integration, then connect to Shopware in the nodes "Import Manufacturer" and "Upload Manufacturer Logo" using Generic OAuth2 API authentication with grant type "Client Credentials". The Access Token URL is https://your-shopware-domain.com/api/oauth/token.
5. Run the workflow.
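A minimal sketch of the Sync API call the "Import Manufacturer" node issues conceptually, assuming Shopware 6, an already-obtained OAuth2 access token, and example rows; the field names follow the product_manufacturer entity, and the logo upload is a separate media request handled by the "Upload Manufacturer Logo" node.

```javascript
// Sketch only: upsert manufacturers through the Shopware 6 Sync API.
// Domain, token, and the example rows are placeholders.
const SHOPWARE_URL = 'https://your-shopware-domain.com';
const ACCESS_TOKEN = 'OAUTH2_ACCESS_TOKEN'; // from /api/oauth/token (client credentials)

const rows = [
  { name: 'Acme Tools', website: 'https://acme.example', description: 'Example manufacturer' },
];

async function importManufacturers(rows) {
  const res = await fetch(`${SHOPWARE_URL}/api/_action/sync`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      'write-manufacturers': {
        entity: 'product_manufacturer',
        action: 'upsert',
        payload: rows.map(row => ({
          name: row.name,           // must be unique, per the sheet requirements
          link: row.website,
          description: row.description,
        })),
      },
    }),
  });
  if (!res.ok) throw new Error(`Shopware sync failed: ${res.status}`);
  return res.json();
}

importManufacturers(rows).then(() => console.log('Import done')).catch(console.error);
```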