by Joseph LePage
The n8n Nostr Community Node integrates Nostr functionality into n8n workflows, allowing users to interact with the Nostr protocol seamlessly. It provides both read and write capabilities and can be used for various automation tasks.

Disclaimer

This node is ideal for self-hosted n8n setups, as **community nodes are not supported on n8n cloud**. It opens up exciting possibilities for integrating workflows with the decentralized Nostr protocol.

n8n Community Node for Nostr: n8n-nodes-nostrobots

Features

- **Write Operations**: Send notes and events (kind 1) to the Nostr network.
- **Read Operations**: Fetch events based on criteria such as event ID, public key, hashtags, mentions, or search terms.
- **Utility Functions**: Convert events into different formats like naddr or nevent and handle key transformations between bech32 and hex formats.
- **Trigger Events**: Monitor the Nostr network for specific mentions or events and trigger workflows automatically.

Use Cases

- Automating note posting without exposing private keys.
- Setting up notifications for mentions or specific events.
- Creating bots or AI assistants that respond to mentions on Nostr.

Installation

1. Install n8n on your system.
2. Add the Nostr Community Node to your instance.
3. Configure your credentials using a Nostr secret key (bech32 and hex formats are supported).
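For context, the kind-1 notes that the Write operation publishes follow Nostr's NIP-01 event layout. A minimal sketch of an unsigned event in Python (the node handles signing and publishing for you; the pubkey here is a placeholder):

```python
import json
import time

def build_kind1_event(pubkey_hex: str, content: str, tags=None) -> dict:
    """Build an unsigned Nostr kind-1 (text note) event per NIP-01.

    The `id` (a SHA-256 hash of the serialized event) and `sig` fields
    are added at signing time, which the community node performs with
    your configured secret key.
    """
    return {
        "pubkey": pubkey_hex,          # hex-encoded public key
        "created_at": int(time.time()),
        "kind": 1,                     # kind 1 = plain text note
        "tags": tags or [],
        "content": content,
    }

event = build_kind1_event("ab" * 32, "Hello from n8n!")
print(json.dumps(event, indent=2))
```

Keeping signing inside the node is what lets workflows post notes without ever exposing the private key to other nodes.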
by David Harvey
iMessage AI-Powered Smart Calorie Tracker

> 📌 What it looks like in use: this image shows the workflow in action. Use it for reference when replicating or customizing the template.

This n8n template transforms a user-submitted food photo into a detailed, friendly, AI-generated nutritional report — sent back seamlessly as a chat message. It combines OpenAI's visual reasoning, Postgres-based memory, and real-time messaging with Blooio to create a hands-free calorie and nutrition tracker.

🧠 Use Cases

- Auto-analyze meals based on user-uploaded images.
- Daily/weekly/monthly diet summaries with no manual input.
- Virtual food journaling integrated into messaging apps.
- Nutrition companion for healthcare, fitness, and wellness apps.

📌 Good to Know

- ⚠️ This uses GPT-4 with image capabilities, which may incur higher usage costs depending on your OpenAI pricing tier. Review OpenAI's pricing.
- The model uses visual reasoning and estimation to determine nutritional info — results are estimates and should not replace medical advice.
- Blooio is used for sending/receiving messages. You will need a valid API key and a project set up with webhook delivery.
- A Postgres database is required for long-term memory (optional but recommended). You can use any memory node in its place.

⚙️ How It Works

1. **Webhook Trigger**: The workflow begins when a message is received via Blooio. This webhook listens for user-submitted content, including any image attachments.
2. **Image Validation and Extraction**: A conditional check verifies the presence of attachments. If images are found, their URLs are extracted using a Code node and prepared for processing.
3. **Image Analysis via AI Agent**: Images are passed to an OpenAI-based agent using a custom system prompt that:
   - Identifies the meal,
   - Estimates portion sizes,
   - Calculates calories, macros, fiber, sugar, and sodium,
   - Scores the meal with a health and confidence rating,
   - Responds in a chatty, human-like summary format.
4. **Memory Integration**: A Postgres memory node stores user interactions for recall and contextual continuity, allowing day/week/month reports to be generated from cumulative messages.
5. **Response Aggregation & Summary**: Messages are aggregated and summarized by a second AI agent into a single concise message to be sent back to the user via Blooio.
6. **Message Dispatch**: The final message is posted back to the originating conversation using the Blooio Send Message API.

🚀 How to Use

- The included webhook can be triggered manually or programmatically by linking Blooio to a frontend chat UI.
- You can test the flow using a manual POST request containing mock Blooio payloads.
- Want to use a different messaging app? Replace the Blooio nodes with your preferred messaging API (e.g., Twilio, Slack, Telegram).

✅ Requirements

- OpenAI API access with GPT-4 Vision or equivalent multimodal support.
- Blooio account with access to incoming and outgoing message APIs.
- Optional: Postgres DB (e.g., via Neon) for tracking message context over time.

🛠️ Customising This Workflow

- **Prompt Tuning**: Tailor the system prompt in the AI Agent node to fit specific diets (e.g., keto, diabetic), age groups, or regionally specific foods.
- **Analytics Dashboards**: Hook up your Postgres memory to a data visualization tool for nutritional trends over time.
- **Multilingual Support**: Adjust the response prompt to translate messages into other languages or regional dialects.
- **Image Preprocessing**: Insert a preprocessing node before sending images to the model to resize, crop, or enhance clarity for better results.
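To test the flow with a manual POST request, a mock payload along these lines can be sent to the webhook. The field names below are hypothetical — Blooio's actual webhook schema may differ, so mirror a real delivery from your project:

```python
import json

def mock_blooio_payload(text: str, image_urls=None) -> dict:
    """Build a mock inbound-message payload for testing the webhook.

    All field names here (`conversation_id`, `message`, `attachments`)
    are assumptions for illustration, not Blooio's documented schema.
    """
    return {
        "conversation_id": "test-conversation-1",
        "message": {
            "text": text,
            "attachments": [
                {"type": "image", "url": u} for u in (image_urls or [])
            ],
        },
    }

payload = mock_blooio_payload("Lunch today", ["https://example.com/meal.jpg"])
print(json.dumps(payload, indent=2))
```

POST the serialized payload to the workflow's webhook URL (e.g. with curl) to exercise the image-extraction branch without a live Blooio project.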
by Agent Studio
Overview

This n8n workflow processes user feedback automatically, tags it with sentiment, and links it to relevant insights in Notion. It uses GPT-4 to analyze each feedback entry, determine whether it corresponds to an existing insight or a new one, and update the Notion databases accordingly. It helps teams centralize and structure qualitative user feedback at scale.

Who It's For

- Product teams looking to organize and prioritize user feedback.
- Founders or solo builders seeking actionable insights from qualitative data.
- Anyone managing a Notion workspace where feedback is collected and needs to be tagged or linked to features and improvements.

Prerequisites

- A Notion account with:
  - A Feedback database (must include fields for feedback content and status).
  - An Insights database with multi-select fields for Solution, User Persona, and a relation to Feedback.
  - The Notion template (linked below) helps you get started quickly — just remove the mock data.
- A configured Notion API integration in n8n. 👉 Don't forget to connect the n8n integration to the correct Notion page.
- An OpenAI API key.

Notion Template

This workflow is designed to work seamlessly with a pre-configured Notion template that includes the required feedback and insights structure. 👉 User Feedback Analysis – Notion Template

How It Works

1. The workflow is triggered when a feedback item is updated in Notion (e.g. new feedback is submitted).
2. Sentiment analysis (Positive, Neutral, or Negative) is run using OpenAI and stored in a select field in Notion.
3. The AI agent analyzes the feedback to:
   - Identify whether it matches an existing insight, or
   - Create a new insight in Notion with a concise name, solution, and user persona.
4. The feedback is then linked to the appropriate insight and marked as "Processed."

How to Use It

- Connect your Notion databases in all Notion nodes (including those used by the AI agent) for both Feedback and Insights — follow the node names provided.
- Ensure your OpenAI and Notion credentials are correctly set.
- Set up your product context: define a "Product Overview" and list your "Core Features". This helps the AI agent categorize insights more accurately. (The Basecamp product is used as an example in the template.)
- (Optional) Modify the prompt to better fit your specific product context.
- Once feedback is added or updated in Notion, the workflow triggers automatically.

Notes

- Only feedback with the status Received is processed.
- New insights are only created if no relevant match is found.
- Feedback is linked to insights via Notion's relation property.
- A fallback parser is included to fix potential formatting issues in the AI output.
- You can swap the default n8n memory for a more robust backend like Supabase.

🙏 Please share your feedback with us. It helps us tremendously!
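As a rough idea of what a fallback parser guards against, this sketch extracts JSON from model output that arrives wrapped in code fences or surrounding prose (the actual node's logic may differ):

```python
import json
import re

def parse_agent_output(raw: str) -> dict:
    """Parse JSON from an LLM response, tolerating markdown code fences
    and surrounding chatter. Illustrative sketch, not the template's
    exact parser."""
    # Happy path: the model returned clean JSON.
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        pass
    # Fallback: grab everything from the first "{" to the last "}".
    match = re.search(r"\{.*\}", raw, re.DOTALL)
    if match:
        return json.loads(match.group(0))
    raise ValueError("No JSON object found in model output")

messy = (
    'Sure! Here is the result:\n'
    '```json\n{"sentiment": "Positive", "insight": "Faster onboarding"}\n```'
)
print(parse_agent_output(messy))
```

A parser like this keeps the workflow from failing when the model wraps an otherwise valid insight object in explanatory text.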
by IvanCore
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Telegram Voice AI Assistant

This n8n template creates a multimodal Telegram bot that dynamically responds to users:

- **Replies with voice** when receiving voice messages (using ElevenLabs TTS)
- **Replies with text** for text-based queries
- Supports custom AI tools (e.g., crypto APIs, databases, or custom functions)

Built with LangChain Agents, it can integrate any external API or data source into conversations.

Key Features

🎙️ Smart Response Logic

- **Voice query? → Voice reply**
  - Transcribes audio via ElevenLabs STT
  - Processes with AI (Groq/Gemini)
  - Converts the text response to natural speech (ElevenLabs TTS)
- **Text query? → Text reply**
  - Bypasses TTS/STT for faster responses

🛠️ Extensible AI Tools

Add your own tools:

- Database lookups
- Weather/stock APIs
- Custom Python functions
- RAG (document retrieval)

Supports multi-step tool chaining (e.g., "Get BTC price → analyze trends → summarize").

🌐 Language & Context

- Auto-detects user language (via Telegram's `language_code`)
- Maintains session memory (remembers conversation history)

Use Cases

- **Voice-first customer support**
- **Crypto/analytics assistants** (e.g., "What's Ethereum's current gas fee?")
- **Multilingual FAQ bots**
- **Educational tutors** (voice-interactive learning)

Requirements

- **Telegram Bot Token**
- **ElevenLabs API Key** (for TTS/STT)
- **Groq API Key** or **Google Gemini API Key**

Customization Tips

- **Change AI personality**: Modify the `systemMessage` in the Voice Assistant node
- **Add more models**: Swap Groq/Gemini for OpenAI, Anthropic, etc.
- **Extend functionality**: Add RAG (Retrieval-Augmented Generation) for document queries

Take this template to create a Siri-like AI assistant for Telegram in minutes! 🚀
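The voice/text branching above can be pictured as a simple check on Telegram's message object, which carries a `voice` field for voice notes:

```python
def route_reply_mode(update: dict) -> str:
    """Decide whether to answer with voice or text, mirroring the
    template's branching: Telegram's Bot API marks voice notes with a
    `voice` object (containing a file_id) on the message."""
    message = update.get("message", {})
    return "voice" if "voice" in message else "text"

# A voice note carries a `voice` object; a plain message carries `text`.
voice_update = {"message": {"voice": {"file_id": "abc"}, "chat": {"id": 1}}}
text_update = {"message": {"text": "hi", "chat": {"id": 1}}}
print(route_reply_mode(voice_update))  # voice
print(route_reply_mode(text_update))   # text
```

In the workflow, the "voice" branch runs STT → agent → TTS, while the "text" branch skips straight to the agent for faster replies.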
by Eduard
Transform static digital assets into dynamic, self-updating powerhouses that stay relevant for years to come!

This workflow solves a common problem: once you publish forms, emails, or templates, their content becomes frozen in time. Users discovering them months later see outdated information, missed opportunities, and stale offers. Stop losing opportunities to stale content – make your digital assets work harder and stay fresher, automatically!

Here's how it works:

- 🔗 Stable embed links mean your original assets never need updating
- 🔄 Dynamic URL redirects automatically point to the latest pages
- 🖼️ Auto-updating images showcase fresh offers or content
- 📅 Scheduled updates keep everything current without manual intervention

Perfect for:

- Workflow sticky notes that become evergreen marketing billboards
- Registration forms with current promotions
- Email signatures with the latest offers
- Website banners that stay seasonally relevant
- Any digital asset you want to "future-proof"

The magic: set it up once, embed the stable URLs/images in your content, then forget about it. Years later, users will still see fresh, current information automatically pulled from your workflow.

Requirements: free accounts with GitHub (image storage) and shorten.rest (URL redirects). Both can be swapped for your preferred services.

Follow me on LinkedIn for more tips on AI automation and n8n workflows!
by Open Paws
This general-purpose sub-agent combines multiple research and automation tools to support high-impact decision-making for animal advocacy workflows. It's designed to act as a reusable, modular unit within larger multi-agent systems—handling search, scraping, scoring, and domain-specific semantic lookup. It powers many of the advanced workflows released by Open Paws and serves as a versatile backend utility agent.

🛠️ What It Does

- Performs real-time Google Search using Serper
- Scrapes and extracts page content using Jina AI and Scraping Dog
- Conducts semantic search over the Open Paws knowledge base
- Generates OpenAI embeddings for similarity search and analysis
- Routes search and content analysis through OpenRouter LLMs
- Connects with downstream tools like the Text Scoring Sub-Workflow to evaluate message performance

> 🧩 This agent is typically used as a sub-workflow in larger automations where agents need access to external tools or advocacy-specific knowledge.

🧠 Domain Focus: Animal Advocacy

The agent is pre-configured to interface with the Open Paws database—an open-source, animal advocacy-specific knowledge graph—and is optimized for content and research tasks relevant to farmed animal issues, corporate campaigns, and activist communication.

🔗 Integrated Tools and APIs

| Tool          | Purpose                                      |
|---------------|----------------------------------------------|
| Serper API    | Real-time Google Search queries              |
| Jina AI       | Web scraping and content extraction          |
| Scraping Dog  | Social media scraping where Jina is blocked  |
| OpenAI API    | Embedding generation for semantic search     |
| OpenRouter    | Proxy to multiple LLMs (e.g., GPT-4, Claude) |
| Open Paws DB  | Advocacy-specific semantic knowledge base    |

📦 Use Cases

- Create and evaluate online content (e.g. social media, emails, petitions) for predicted performance and advocacy alignment
- Act as a research and reasoning agent within multi-agent workflows
- Automate web and social media research for real-time campaign support
- Surface relevant facts or arguments from an advocacy-specific knowledge base
- Assist communications teams with message testing and content ideation
- Monitor search results and scrape pages to inform rapid response messaging
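As an illustration of the Serper integration, this sketch assembles the POST request Serper's `/search` endpoint expects (a JSON body with `q`, authenticated via the `X-API-KEY` header, per Serper's public docs); the query itself is illustrative:

```python
import json

SERPER_ENDPOINT = "https://google.serper.dev/search"

def build_serper_request(query: str, api_key: str, num: int = 10) -> dict:
    """Assemble the pieces of a Serper search call as a plain dict,
    ready to hand to any HTTP client (or an n8n HTTP Request node)."""
    return {
        "url": SERPER_ENDPOINT,
        "headers": {"X-API-KEY": api_key, "Content-Type": "application/json"},
        "body": json.dumps({"q": query, "num": num}),
    }

req = build_serper_request("plant-based meat market share", "YOUR_KEY")
print(req["url"])
```

The response is JSON with ranked organic results, which the agent can then hand to a scraper (Jina AI or Scraping Dog) for full-page extraction.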
by Dan Rahimi
Sync Notion Contacts to Google Contacts with Group Labels

Overview

Seamlessly transfer your Notion contacts to Google Contacts with organized group labels, simplifying your CRM management. This n8n workflow automates syncing contacts from a Notion database to Google Contacts, applying group labels based on Notion properties. It triggers on new or updated contacts, ensuring your Google Contacts stay organized without manual effort.

✨ Key Features

- 🔄 **Automatic Sync**: Updates Google Contacts when Notion entries are added or modified.
- 🏷️ **Group Organization**: Assigns labels to contacts based on Notion's `property_buy` field.
- ✅ **Duplicate Prevention**: Marks synced contacts in Notion with a checkbox.
- 🛠️ **Flexible Customization**: Add fields like email in the "Map Notion Contact Fields" node.
- 📡 **Community Nodes**: Leverages Notion and Google Contacts nodes for integration.

📋 Prerequisites

Required credentials:

- **Notion API Token**: Set up OAuth2 in n8n. Get your token from Notion's API settings.
- **Google Contacts OAuth2**: Configure in n8n. See n8n's Google Contacts guide.
- **Notion Database**: Must include name, phone, labels (`property_buy`), and an "Added to Contacts" checkbox.
- **Self-Hosted n8n**: Required for community nodes.

🔄 Workflow Process

1. **Trigger**: Activates on new or updated Notion database entries.
2. **Fetch Data**: Retrieves contact details (name, phone, labels) from Notion.
3. **Map Fields**: Organizes data in the "Map Notion Contact Fields" node.
4. **Verify Groups**: Checks for existing Google Contact groups; creates new ones if needed.
5. **Sync Contacts**: Adds contacts to Google Contacts with labels.
6. **Update Notion**: Marks contacts as synced.
7. **Result**: Organized, labeled contacts in Google Contacts, updated automatically.

📊 Output Data Structure

- **Name**: Contact's first name from Notion.
- **Phone**: Contact's phone number.
- **Group Labels**: Assigned from Notion's `property_buy` field.
- **Sync Status**: Notion checkbox updated to confirm sync.

💡 Pro Tips

- **Real-Time Updates**: Set the Notion Trigger node to check every minute for faster syncing.
- **Expand Fields**: Add email or other fields in the "Map Notion Contact Fields" node.
- **Clean Labels**: Use consistent Notion labels for better Google Contacts organization.
- **Test Small**: Start with a small dataset to verify the setup.

🆘 Troubleshooting

- **Authentication Issues**: Verify Notion and Google Contacts OAuth2 credentials.
- **Sync Failures**: Ensure the Notion database ID and field names match the workflow.
- **Group Errors**: Check that `property_buy` labels are valid.

👨‍💻 Creator Information

- 👤 Created by: Dan Rahimi
- 🌐 Website: DanRahimi.com
- 📧 Email: Fa.Danial@gmail.com
- 📺 YouTube: @DanRahimi
- 👥 LinkedIn: Dan-Rahimi

🤝 Support & Contributions

Enjoyed this workflow? Support my work or explore more:

- ☕ Buy Me a Coffee
- 📚 AI Automation Courses: Visit DanRahimi.com for more articles and tutorials about AI automation.

Disclaimer: This workflow uses community nodes and requires a self-hosted n8n instance.
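Conceptually, the "Map Notion Contact Fields" step flattens Notion's nested property objects into plain contact fields. A sketch, assuming the template's property names (`Name`, `Phone`, `property_buy`) — adjust these to match your own database schema:

```python
def map_notion_contact(page: dict) -> dict:
    """Flatten a Notion page's properties into the fields a Google
    Contacts node would consume. The nested shapes (`title`,
    `phone_number`, `multi_select`) follow Notion's API property
    format; the property names are the template's assumptions."""
    props = page["properties"]
    return {
        "givenName": props["Name"]["title"][0]["plain_text"],
        "phone": props["Phone"]["phone_number"],
        "groups": [opt["name"] for opt in props["property_buy"]["multi_select"]],
    }

page = {
    "properties": {
        "Name": {"title": [{"plain_text": "Ada"}]},
        "Phone": {"phone_number": "+1 555 0100"},
        "property_buy": {"multi_select": [{"name": "Buyers"}]},
    }
}
print(map_notion_contact(page))
```

Extending the sync with email would mean adding one more line here, pulling from Notion's `email` property type.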
by AiAgent
What It Does

This intelligent workflow simplifies the complex task of determining whether a website is legitimate or potentially a scam. By simply submitting a URL through a form, the system initiates a multi-agent evaluation process. Four dedicated AI agents—each powered by GPT-4o and connected to SerpAPI—analyze different dimensions of the website: domain and technical details, search engine signals, product and pricing patterns, and on-site content analysis. Their findings are then passed to a fifth AI agent, the Analyzer, powered by GPT-4o mini, which consolidates the data, scores the site on a scale of 1–10 for scam likelihood, and presents the findings in a clear, structured format for the user.

Who It's For

This workflow is ideal for anyone who needs to quickly and reliably assess the trustworthiness of a website. Whether you're a consumer double-checking a store before making a purchase, a small business owner validating supplier sites, a cybersecurity analyst conducting threat assessments, or a developer building fraud detection into your platform — this tool offers fast, AI-powered insights without the need for manual research or technical expertise. It's designed for both individuals and teams who value accurate, scalable scam detection.

How It Works

The process begins with a simple form submission where the user enters the URL of the website they want to investigate. Once submitted, the workflow activates four specialized AI agents—each powered by GPT-4o and connected to SerpAPI—to independently analyze the site from different angles:

- **Agent 1** examines domain age, SSL certificates, and TLD trustworthiness.
- **Agent 2** reviews search engine results, forum mentions, and public scam reports.
- **Agent 3** analyzes product pricing patterns and brand authenticity.
- **Agent 4** assesses on-site content quality, grammar, legitimacy of claims, and presence of business info.

Each agent returns its findings, which are then aggregated and passed to a fifth AI agent—the Analyzer. This final agent, powered by GPT-4o mini, evaluates all the input, assigns a scam likelihood score from 1 to 10, and compiles a neatly formatted summary with organized insights and a disclaimer for context.

Set Up

1. Obtain an OpenAI API key from platform.openai.com/api-keys.
2. Connect this key to the OpenAI Chat Model for each of the tool agents (Analyzer, Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis).
3. Fund your OpenAI account. GPT-4o costs roughly $0.01 per workflow run.
4. Create a SerpAPI account at https://serpapi.com/users/sign_up and obtain a SerpAPI key.
5. Use this key to connect the SerpAPI tool for each of the tool agents (Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis).

Tip: SerpAPI allows 100 free searches each month, and this workflow uses roughly 5–15 SerpAPI searches per run. If you would like to use the workflow more often than that, create multiple SerpAPI accounts with an API key for each. When one account's 100 free searches are exhausted, switch the workflow to another account's key.

Disclaimer

This tool is designed to assist in evaluating the potential risk of websites using AI-generated insights. The scam likelihood score and analysis provided are based on publicly available information and should not be considered a definitive or authoritative assessment. This tool does not guarantee the accuracy, safety, or legitimacy of any website. Users should perform their own due diligence and use independent judgment before engaging with any site. n8n, OpenAI, their affiliates, and the creators of this workflow are not responsible for any loss, damages, or consequences arising from the use of this tool or the actions taken based on its results.
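For reference, each agent's SerpAPI tool ultimately issues a Google search request like the one sketched below (`engine`, `q`, and `api_key` are standard SerpAPI parameters; the query itself is illustrative of what a Search Engine Signals agent might ask):

```python
from urllib.parse import urlencode

SERPAPI_ENDPOINT = "https://serpapi.com/search.json"

def build_serpapi_url(query: str, api_key: str) -> str:
    """Build the GET URL for a SerpAPI Google search."""
    params = {"engine": "google", "q": query, "api_key": api_key}
    return f"{SERPAPI_ENDPOINT}?{urlencode(params)}"

# e.g. searching for scam reports about a site under investigation
url = build_serpapi_url('"example-shop.com" scam OR reviews', "YOUR_KEY")
print(url)
```

Since each run fires 5–15 such searches across the four agents, a single free-tier key covers roughly 6–20 runs per month.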
by Oneclick AI Squad
This workflow listens for incoming book request emails, extracts the user's intent using the Ollama LLM, queries book data (title, summary, details) via an API, and sends a personalized recommendation email. Ideal for automated book suggestions using LLMs and structured APIs, great for newsletters, reading clubs, and educational bots.

How It Works

1. **Email Request**: Triggers the workflow when a new email with a book request is received.
2. **Analyze Email with Ollama**: Extracts user intent and book preferences using the Ollama LLM.
3. **Create Book Search Query**: Generates a query based on the analyzed intent.
4. **Book Search API**: Fetches book data (title, summary, details) from an API.
5. **Check API Response**: Validates the API response for book availability.
6. **Handle No Book Found**: Manages cases where no suitable book is found.
7. **Extract Book Summary**: Pulls the summary from the API response.
8. **Wait for Summary Response**: Pauses to ensure summary data is ready.
9. **Retrieve Book Details**: Gathers additional book details.
10. **Format Book Data**: Structures the book information for the recommendation.
11. **Enhance Data with Code**: Refines the data using custom code.
12. **Generate Email Content**: Creates a personalized email recommendation.
13. **Send Email**: Delivers the recommendation to the user.

How to Use

1. Import the workflow into n8n.
2. Configure email credentials for the Email Request node.
3. Set up Ollama LLM API credentials and endpoint.
4. Configure the Book Search API with appropriate credentials and endpoint.
5. Test with a sample email requesting a book recommendation.
6. Adjust the Generate Email Content node for custom email templates if needed.
7. Ensure the Send Email node is linked to a valid email service.

Requirements

- Email service API credentials (e.g., Gmail, SMTP)
- Ollama LLM API access
- Book Search API credentials

Customizing This Workflow

- Modify the Analyze Email with Ollama node to refine intent extraction for specific genres.
- Adjust the Book Search API query to target different book databases.
- Customize the Generate Email Content node to include additional details like author bios.
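The "Analyze Email with Ollama" step amounts to a call to Ollama's `/api/generate` endpoint. A sketch of the non-streaming payload, with an illustrative prompt and model name (swap in whatever model you have pulled locally):

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local endpoint

def build_intent_request(email_body: str, model: str = "llama3") -> dict:
    """Assemble the JSON payload for Ollama's /api/generate endpoint
    (stream=False returns one complete response instead of chunks).
    The prompt wording here is an assumption for illustration."""
    prompt = (
        "Extract the requested genre and topic from this book request "
        "email. Reply as JSON with keys 'genre' and 'topic'.\n\n"
        + email_body
    )
    return {"model": model, "prompt": prompt, "stream": False}

payload = build_intent_request("Hi! Any good sci-fi novels about first contact?")
print(json.dumps(payload, indent=2))
```

POSTing this payload to `OLLAMA_URL` returns a JSON object whose `response` field holds the model's answer, which the next node parses into a book search query.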
by Max Tkacz
Easily generate images with Black Forest Labs' Flux text-to-image AI models using Hugging Face's Inference API. This template serves a web form where you can enter prompts and select predefined visual styles that are customizable with no code. The workflow integrates seamlessly with Hugging Face's free tier, and it's easy to modify for any text-to-image model that supports API access.

Try It

Curious what this template does? Try a public version here: https://devrel.app.n8n.cloud/form/flux

Set Up

Watch this quick set-up video 👇

Accounts required:

- Huggingface.co account (free)
- Cloudflare.com account (free, used for storage; can be swapped easily, e.g. for Google Drive)

Key Features

- **Text-to-Image Creation**: Generates unique visuals based on your prompt and style.
- **Hugging Face Integration**: Utilizes Hugging Face's Inference API for reliable image generation.
- **Customizable Visual Styles**: Select from preset styles or easily add your own.
- **Adaptable**: Swap in any Hugging Face text-to-image model that supports API calls.

Ideal For

- **Creators**: Rapidly create visuals for projects.
- **Marketers**: Prototype campaign visuals.
- **Developers**: Test different AI image models effortlessly.

How It Works

You submit an image prompt via the web form and select a visual style, which appends style instructions to your prompt. The Hugging Face Inference API then generates and returns the image, which gets hosted on Cloudflare's S3-compatible R2 storage. The workflow can be easily adjusted to use other models and styles for complete flexibility.
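The style-appending step described above can be sketched as follows. The style presets and model ID are illustrative (any Flux variant or other text-to-image model on the Inference API would work), and the API responds with raw image bytes rather than JSON:

```python
import json

# Illustrative presets — the template ships its own, and you can add more.
STYLES = {
    "watercolor": "soft watercolor painting, pastel palette, paper texture",
    "cyberpunk": "neon-lit cyberpunk scene, high contrast, cinematic lighting",
}

def build_flux_request(prompt: str, style: str, token: str) -> dict:
    """Append style instructions to the user prompt and assemble the
    POST for Hugging Face's Inference API."""
    model = "black-forest-labs/FLUX.1-schnell"
    return {
        "url": f"https://api-inference.huggingface.co/models/{model}",
        "headers": {"Authorization": f"Bearer {token}"},
        "body": json.dumps({"inputs": f"{prompt}, {STYLES[style]}"}),
    }

req = build_flux_request("a lighthouse at dawn", "watercolor", "hf_xxx")
print(json.loads(req["body"])["inputs"])
```

Adding a new visual style is just another entry in the presets table, which is what makes the template customizable with no code changes elsewhere.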
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

📁 Google Drive MCP Workflow – AI-Powered File Management Automation 🚀

🧠 Overview

A secure and intelligent n8n workflow that connects with Google Drive via MCP (Model Context Protocol). Ideal for AI agent tasks, compliance-driven storage, and document automation.

🌟 Key Features

🔒 Built-In Safety

- Backs up files before edits (timestamped)
- Supports rollback using file history
- Validates file size, type, and permissions

📁 Smart Organization

- Automatically converts file types (PDF, DOCX, etc.)
- Moves files to structured folders
- Auto-archives old files based on age or rules

🔄 MCP Integration

- Accepts standardized JSON via webhook
- Real-time execution for AI agents
- Fully customizable input (action, fileId, format, etc.)

✅ AI-Callable MCP Actions

These are the commands AI agents can perform via MCP:

- Download a file (with optional format conversion)
- Upload a new file to Google Drive
- Copy a file for backup
- Move a file to a specific folder
- Archive old or inactive files
- Organize documents into folders
- Convert files to a new format (PDF, DOCX, etc.)
- Retrieve and review file history for rollback

📝 Example Input

```json
{
  "action": "download",
  "fileId": "abc123",
  "folderPath": "/projects/clientA",
  "convertFormat": "pdf"
}
```

🔐 Security & Performance

- OAuth2-secured access to the Google Drive API
- No sensitive data stored in transit
- Real-time audit logs and alerts
- Batch-friendly with built-in rate limiting

📌 Ideal For

- Businesses automating file management
- AI agents retrieving, sorting, converting, or archiving files
- Compliance teams needing file versioning and backups

⚙️ Requirements

- n8n + Google Drive API v3
- MCP server + webhook integration
- Google OAuth2 credentials
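Building on the example input above, a webhook handler might guard incoming actions with a small validator before anything touches Drive. The action names here mirror the list above; the exact values accepted by the workflow may differ:

```python
ALLOWED_ACTIONS = {
    "download", "upload", "copy", "move",
    "archive", "organize", "convert", "history",
}

def validate_mcp_input(payload: dict) -> dict:
    """Reject malformed MCP action payloads early. Illustrative sketch
    of the kind of validation the workflow performs."""
    action = payload.get("action")
    if action not in ALLOWED_ACTIONS:
        raise ValueError(f"Unsupported action: {action!r}")
    # Every action except upload operates on an existing file.
    if action != "upload" and "fileId" not in payload:
        raise ValueError("fileId is required for this action")
    return payload

ok = validate_mcp_input(
    {"action": "download", "fileId": "abc123", "convertFormat": "pdf"}
)
print(ok["action"])
```

Failing fast on bad input keeps AI agents from triggering partial Drive operations that would then need the rollback path.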
by n8n Team
This workflow offers an effective way to handle a chatbot's functionality, making use of multiple tools for information retrieval, conversation context storage, and message sending. It's a setup tailored for a Slack environment, aiming to offer an interactive, AI-driven chatbot experience. Note that to use this template, you need to be on n8n version 1.19.4 or later.