by David Olusola
## Overview
This workflow watches for new rows in a Google Sheet (e.g., where you manually log customer reviews), uses a Code node to perform a simple sentiment analysis, and then updates the same row with the detected sentiment.

**Use Case:** Quickly gauge customer satisfaction, identify positive/negative trends, and prioritize follow-ups based on sentiment.

## How It Works
This workflow operates in three main steps:
1. **Google Sheets Trigger (New Row):** A Google Sheets Trigger node monitors a specific Google Sheet and fires whenever a new review row is added.
2. **Code Node (Sentiment Analysis):** A Code node receives the new row data (containing the review text). JavaScript inside this node performs a basic sentiment analysis by checking for keywords (e.g., "great", "excellent" for positive; "bad", "problem" for negative) and assigns "Positive", "Negative", or "Neutral" (see the code sketch at the end of this section).
3. **Update Google Sheet Row:** A Google Sheets node updates the same row that triggered the workflow, writing the sentiment result (and potentially other analysis data) to a new column.

## Setup Steps
To get this workflow up and running, follow these instructions:

### Step 1: Create Google Sheets Credentials in n8n
1. In your n8n instance, click **Credentials** in the left sidebar.
2. Click **New Credential**.
3. Search for and select **Google Sheets OAuth2 API**, follow the authentication steps with your Google account, and save it.
4. Make note of the credential name (e.g., "My Google Sheets Account").

### Step 2: Prepare Your Google Sheet (or better, make a copy of the one provided in the template)
1. Create a new Google Sheet in your Google Drive (e.g., "Customer Reviews").
2. In the first row, add these column headers:
   - Timestamp
   - Customer Name
   - Review Text
   - Sentiment (this column will be updated by the workflow)
   - Review ID (optional, for tracking)
3. Copy the Sheet ID from the URL (e.g., `https://docs.google.com/spreadsheets/d/YOUR_GOOGLE_SHEET_ID_HERE/edit`).
4. Copy the GID of the specific sheet tab (e.g., `https://docs.google.com/spreadsheets/d/YOUR_GOOGLE_SHEET_ID_HERE/edit#gid=YOUR_GID_HERE`). This is the `sheetName` value.

### Step 3: Import the Workflow JSON

### Step 4: Activate and Test the Workflow
1. Click the **Activate** toggle in the top right corner of the n8n workflow editor.
2. Go to your Google Sheet and manually add a new row with a "Review Text" (e.g., "This product is great, I love it!"). Leave the "Sentiment" column empty.
3. The workflow should trigger automatically (it polls every minute by default), analyze the sentiment, and update the "Sentiment" column in your Google Sheet. You can also manually **Execute Workflow** to test immediately.
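For reference, here is a minimal sketch of what the Code node in step 2 can look like. The column name "Review Text" and the keyword lists are illustrative; tune them to your sheet and vocabulary.

```javascript
// Minimal keyword-based sentiment check for an n8n Code node
// ("Run Once for All Items" mode). Assumes each incoming item
// carries the new row under item.json, with the review text in
// a "Review Text" column.
const positive = ["great", "excellent", "love", "amazing", "happy"];
const negative = ["bad", "problem", "terrible", "awful", "refund"];

return $input.all().map((item) => {
  const text = String(item.json["Review Text"] || "").toLowerCase();
  const posHits = positive.filter((w) => text.includes(w)).length;
  const negHits = negative.filter((w) => text.includes(w)).length;

  let sentiment = "Neutral";
  if (posHits > negHits) sentiment = "Positive";
  else if (negHits > posHits) sentiment = "Negative";

  return { json: { ...item.json, Sentiment: sentiment } };
});
```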
by Alex Hi
# Automate Instagram DMs with OpenAI GPT and ManyChat

## How It Works
Once connected, GPT automatically responds to messages from new recipients on Instagram.

## Who Is This For?
This workflow is ideal for marketers, business owners, and content creators who want to automatically respond to Instagram direct messages using OpenAI GPT. By integrating ManyChat, you can manage conversations, nurture leads, and provide instant replies at scale.

## What This Workflow Does
- **Captures** incoming Instagram DMs through ManyChat's integration.
- **Processes** messages with GPT to generate a relevant response.
- **Delivers** instant replies back to Instagram users, creating efficient, AI-driven communication.

## Setup
1. **Import the Template:** Copy the n8n workflow into your workspace.
2. **OpenAI Credentials:** Add your OpenAI API key in n8n so GPT can generate responses.
3. **ManyChat Account:** Create (or log in to) your ManyChat account.
4. **Connect Instagram:** Link your Instagram profile as a channel in ManyChat.
5. **ManyChat Custom Field:** Create a custom field for storing user input or conversation context.
6. **Configure Default Reply:** In ManyChat, set up the default Instagram reply flow to point to your n8n webhook.
7. **Add External Request:** Create an external request step in ManyChat to send messages to n8n (see the sketch below).
8. **Test the Flow:** Send yourself a DM on Instagram to confirm the workflow triggers and GPT responds correctly.

Instructions and links: Notion instruction · Register in ManyChat
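As a rough guide for step 7, the external request body can be as simple as the sketch below. ManyChat lets you compose the request body yourself, so every field name here is a placeholder; map them to the custom field you created in step 5.

```javascript
// Hypothetical JSON body for the ManyChat "External Request" step.
// n8n receives this on its Webhook node, passes `message` to GPT,
// and returns the generated reply for ManyChat to send back.
const examplePayload = {
  subscriberId: "123456789",                   // ManyChat subscriber ID
  message: "Hi! Do you ship internationally?", // incoming Instagram DM text
};
```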
by Don Jayamaha Jr
This advanced agent analyzes long-term price action in the Binance Spot Market using 1-day candles. It calculates key macro indicators like RSI, MACD, BBANDS, EMA, SMA, and ADX to identify high-confidence trend setups and market momentum. Used by the Quant AI system for directional bias and macro-level signal validation.

🎥 Watch Tutorial:

🎯 Purpose
- Detect major trend reversals, consolidation zones, and macro bias
- Support long-term swing trading decisions
- Provide reliable 1-day signals for downstream agents

🧠 Core Features

| Feature | Description |
| --- | --- |
| 🔁 Trigger | Called by parent workflows via Execute Workflow |
| 📥 Input Format | `{ "message": "MATICUSDT", "sessionId": "telegram_id" }` |
| 📡 Webhook Call | Sends request to internal 1d indicators webhook |
| 🧮 Technical Indicators | RSI, MACD, BBANDS, EMA, SMA, ADX (based on 40 daily candles) |
| 🧠 GPT (gpt-4.1-mini) Agent | Interprets numerical data into human-readable trend signals |
| 💬 Output | Summary suitable for Telegram or further agent consumption |

🔗 External Tools Called
`https://treasurium.app.n8n.cloud/webhook/1d-indicators` (sends `{ "symbol": "SOLUSDT" }`; see the test sketch at the end of this section)

📊 Indicator Calculations

| Indicator | Purpose |
| --- | --- |
| RSI (14) | Overbought / Oversold Signals |
| MACD (12,26,9) | Trend Reversals / Momentum |
| BBANDS (20, 2) | Volatility Expansion |
| EMA (20) | Short-Term Trend Confirmation |
| SMA (20) | Macro-Level Support/Resistance |
| ADX (14) | Trend Strength + Directional DI |

📦 Setup
1. Import the JSON into n8n.
2. Add your OpenAI API credentials.
3. Ensure the webhook `/1d-indicators` is connected and working.
4. Use this agent as a sub-workflow in:
   - Binance SM Financial Analyst Tool
   - Binance Spot Market Quant AI Agent

📤 Output Example

📅 1D Overview – MATICUSDT
• RSI: 71 → Overbought
• MACD: Bearish Cross forming
• BBANDS: Widening Volatility
• EMA < SMA → Downtrend Momentum
• ADX: 33 → High Trend Strength

📌 Notes
- Not user-facing: outputs are structured JSON or Telegram-style summaries.
- Pairs well with shorter timeframe tools (15m–4h) for confidence stacking.

🧾 Licensing & Attribution
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 Need help? Reach out on LinkedIn – Don Jayamaha
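To verify the webhook referenced under External Tools Called before wiring it in, a quick test call using the documented input shape looks like this (the response format depends on the webhook's own workflow):

```javascript
// Test request to the 1d-indicators webhook listed above.
const res = await fetch("https://treasurium.app.n8n.cloud/webhook/1d-indicators", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ symbol: "SOLUSDT" }),
});
console.log(await res.json());
```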
by Joseph LePage
# MCP AI Chatbot using Brave Search

Disclaimer: This workflow only works with local installations of n8n because it uses a community MCP node.

## Who is this for?
This workflow is ideal for developers, automation enthusiasts, and businesses looking to integrate AI-powered chat capabilities into their workflows. It's particularly useful for those leveraging Brave Search and MCP tools to enhance user interactions and streamline data retrieval.

## What problem is this workflow solving?
This workflow addresses the challenge of creating an intelligent chatbot that can process user queries, execute searches using Brave Search, and provide responses enriched by AI. It simplifies the integration of multiple tools into a cohesive system, saving time and effort for users who need a robust conversational AI solution.

## What this workflow does
1. Listens for incoming chat messages using the Chat Trigger node.
2. Processes user input with an AI Agent powered by GPT-4o.
3. Retrieves relevant tools using the MCP Get Brave Tools node.
4. Executes specific search queries via the MCP Execute Brave Search node.
5. Maintains short-term memory of conversations with the Simple Memory node.

## Setup
Prerequisites:
- Access to a self-hosted n8n instance.
- API credentials for OpenAI and MCP Client Tools.
- Brave Search API key (https://brave.com/search/api/).

Steps:
1. Import the workflow JSON into your n8n instance.
2. Configure the API credentials for OpenAI and MCP Client Tools in their respective nodes.
3. Set up your Brave Search API key in the MCP nodes (see the key-check sketch below).

Testing:
- Use the built-in chat interface to send test messages.
- Verify that the chatbot processes queries and returns results as expected.

## How to customize this workflow to your needs
- Modify the AI Agent's prompt settings to tailor responses to your specific use case.
- Adjust the memory buffer in the Simple Memory node to retain more or less conversational context.
- Replace or add additional tools in the MCP nodes to expand functionality.
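To confirm your Brave Search API key works before wiring it into the MCP nodes, a direct call to the Brave Search endpoint is a quick sanity check. The endpoint and header below are as published in Brave's API docs; treat the response path as an assumption and inspect the raw JSON first.

```javascript
// Standalone key check for the Brave Search API (not part of the workflow).
const res = await fetch(
  "https://api.search.brave.com/res/v1/web/search?q=" + encodeURIComponent("n8n MCP"),
  {
    headers: {
      Accept: "application/json",
      "X-Subscription-Token": "YOUR_BRAVE_API_KEY",
    },
  }
);
const data = await res.json();
console.log(data.web?.results?.slice(0, 3)); // top results, if the key is valid
```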
by keisha kalra
## Try It Out!
This n8n template creates a fully automated Instagram content schedule using AI and Google Sheets. It is perfect for content creators, marketing teams, or local businesses looking to organize and scale their social media posting.

## How it works
The workflow starts by reading two sets of inputs from a Google Sheet:
1. Your content strategy inputs (Pillar, Objective, Frequency, Format, Structure, Examples).
2. A list of scraped blog posts with title, URL, and description (fetched from your website).

Blog posts are scraped using Apify and parsed to extract key fields, which are stored in a tab labeled "Input (blog month)". You can assign a preferred posting month for each blog (e.g., fall blog posts get tagged for September). The workflow then merges both inputs and extracts the relevant fields for ChatGPT to build on.

## AI Scheduling & Personalization
Once merged, the workflow loops through each content item and:
- Identifies whether the scheduled post falls on or near a holiday (like Mother's Day) and adjusts the content accordingly.
- Consults an attached reference tool that guides structure and tone, based on a library of post examples.
- Sends the content to an AI Agent (using GPT-4, but customizable) that generates:
  - A compelling Instagram caption
  - A visual description
  - Hashtags
  - Suggested post date, day, content pillar, and format (carousel, reel, image, etc.)

## Output
All generated content, including captions, structure, dates, hashtags, and pillar, is exported into a tab titled "Output" in your Google Sheet. The final schedule is ready for manual review, editing, or publishing to social media.

## How to use
The workflow uses a manual trigger to start, but you can replace it with a Webhook, cron job, or form submission. Add or edit your content strategy in Google Sheets.

## How to Set Up

### Initial Input Tab: define your content pillars and structure
Create a tab named "Input" or "Strategy" and include these columns:
- Pillar: e.g., Family images
- Objective: e.g., Showcase images
- Frequency: e.g., Bi-weekly
- Content Form: e.g., Images, Reels
- Structure: brief description of the expected layout (e.g., carousel Q&A, single photo)
- Examples: prompts or questions to guide the AI (e.g., "Why do you think families should do a session?")

### Input (blog month) Tab: store scraped blog content
Include these columns:
- URL: direct link to blog post
- Title: blog post title
- Description: short summary of the post
- Preferred Month: month you want it posted (e.g., August, September)

This sheet is partially auto-filled by the workflow (everything except Preferred Month).

### Output Tab: final scheduled content
Include these columns:
- Date: scheduled posting date (YYYY-MM-DD)
- Day: day of the week
- Pillar: content category assigned
- Format: e.g., Images, Reels, Carousel
- Description: visual summary
- Caption: Instagram-ready caption
- Hashtags: complete hashtag block

### To use the Apify HTTP Request node
1. Drag an HTTP Request node into your n8n workflow.
2. Set the Method and URL based on how you're using Apify:
   - Use POST if you want to run an actor live with dynamic input (e.g., scrape blog posts in real time).
   - Use GET if you want to retrieve results from a completed or static dataset run (faster and cheaper if you're reusing previous data).
3. Configure query or body parameters:
   - Include your Apify API token for authentication (e.g., `token=YOUR_API_KEY`).
   - For POST: include an input object with any required actor settings (e.g., the blog URL to scrape).
   - For GET: specify the dataset ID in the URL.
4. Test the node to ensure you're retrieving the blog titles, descriptions, and URLs as expected.
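For orientation, the two call styles map onto Apify's REST API roughly as follows. The actor ID, dataset ID, and input fields are placeholders; substitute your own from the Apify console and check the actor's documentation for its exact input schema.

```javascript
// Sketch of the POST (live run) and GET (reuse results) styles described above.
const token = "YOUR_APIFY_TOKEN";

// POST: run an actor and wait for its dataset items in one call.
const run = await fetch(
  `https://api.apify.com/v2/acts/YOUR_ACTOR_ID/run-sync-get-dataset-items?token=${token}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ startUrls: [{ url: "https://yourblog.com" }] }), // actor-specific input
  }
);
console.log(await run.json());

// GET: fetch items from an existing dataset run (faster and cheaper).
const items = await fetch(
  `https://api.apify.com/v2/datasets/YOUR_DATASET_ID/items?token=${token}`
);
console.log(await items.json());
```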
## Requirements
- Apify account for scraping blog posts
- OpenAI key (e.g., GPT-4) or another model of your choice
- Google Sheets credentials

## Example Use Cases
- A photographer repurposing blogs into Instagram carousels
- A nonprofit automatically generating seasonal posts
- A small team managing multi-pillar content across weeks or months

## Need Help?
Join the n8n Discord or ask in the n8n Forum!

Happy Content Making! 📅✨
by Robert Breen
This no-code n8n workflow finds recent Instagram posts by hashtag, scrapes profile data, and uses an AI agent to evaluate whether each account is a good collaboration lead. The workflow filters based on the number of followers and the content of their bio, and outputs structured reasoning for outreach decisions. Perfect for creators, marketers, or business developers looking to automate influencer or community partnership prospecting, especially in niche ecosystems like n8n.

## ✅ Key Features
- **🔍 Hashtag Discovery**: Finds recent Instagram posts from a specified hashtag (e.g., #n8n)
- **👤 Account Scraping**: Retrieves profile details such as follower count and biography
- **🧠 AI Evaluation**: Uses OpenAI and LangChain to determine if the profile is a good fit for outreach
- **📦 Structured Output**: Returns a JSON object with "Yes/No" lead status and reasoning
- **🛠️ Manual Execution**: Run on demand using the manual trigger

## 🧰 What You'll Need

| Tool / API | Purpose | Setup Steps |
| --- | --- | --- |
| Apify Account | To access Instagram scraping actors | Create account → Generate API Token → Use in httpQueryAuth credential in n8n |
| OpenAI API Key | To power the AI decision-making agent | Sign up at OpenAI → Create API key → Paste into OpenAI credential in n8n |
| LangChain Plugin for n8n | AI orchestration with system message | Install LangChain nodes from Community Nodes (already installed in this workflow) |

## 🔧 Step-by-Step Setup

### 1️⃣ Manual Trigger
- **Node**: When clicking 'Execute workflow'
- **Use**: Allows you to run the workflow manually while testing.

### 2️⃣ Define Hashtag
- **Node**: Create Search Term
- **Value**: Sets "n8n" as the default Instagram hashtag to scan. You can edit this to any other hashtag you'd like.

### 3️⃣ Find Recent Posts
- **Node**: Find Recent Posts
- **API**: Apify Instagram Hashtag Scraper
- **Auth Setup**:
  1. Go to your Apify Console
  2. Click "Create new token"
  3. In n8n, create a new HTTP Query Auth credential
  4. Set the token in the `token` query param (e.g., `?token=yourTokenHere`)
  5. Choose the credential in this node

### 4️⃣ Scrape Each Profile
- **Node**: Scrape Accounts
- **API**: Apify Instagram Profile Scraper
- **Body**: JSON with usernames from the hashtag search
- **Note**: Uses the same httpQueryAuth credential as the previous node.

### 5️⃣ Extract Fields
- **Node**: Set bio and follower count
- **What it does**: Extracts `biography` and `followersCount` from the profile JSON and stores them in clean variables for AI input (see the sketch after this list).

### 6️⃣ AI Lead Scoring
- **Node**: AI Agent
- **Purpose**: Uses GPT-4o-mini to analyze the bio and follower count
- **Prompt Details**:

### 7️⃣ AI Model
- **Node**: OpenAI Chat Model
- **Model**: gpt-4o-mini
- **Credential**: Connect your OpenAI account via API key. Go to OpenAI API Keys, copy your key, and create a new OpenAI API credential in n8n.

### 8️⃣ Output Parser
- **Node**: Structured Output Parser
- **What it does**: Parses the response from the AI into structured JSON for further use (e.g., storing leads, sending to Airtable, etc.)

## 🧪 Sample Output
```json
{
  "lead status": "Yes",
  "Reasoning": "The user has 3.5k followers and their bio shows they build automations with n8n."
}
```

## 📬 Need More Help?
If you'd like assistance setting this up, customizing it to your niche, or expanding it to score and store leads automatically, I can help!

👤 Robert Breen
Automation Consultant | AI Workflow Designer | n8n Expert
📧 robert@ynteractive.com
🌐 ynteractive.com
🔗 LinkedIn
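The extraction in step 5 amounts to something like the sketch below, assuming the Apify profile scraper returns `biography` and `followersCount` fields (field names can vary by actor version, so inspect one raw item first):

```javascript
// n8n Code-node sketch ("Run Once for All Items"): keep only the
// fields the AI agent needs for scoring.
return $input.all().map((item) => ({
  json: {
    username: item.json.username,
    bio: item.json.biography || "",
    followersCount: item.json.followersCount || 0,
  },
}));
```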
by Vitorio Magalhães
# Auto-publish NASA APOD to LinkedIn with AI translation and hashtags

Transform NASA's daily astronomical wonders into engaging LinkedIn content automatically. This workflow fetches NASA's Astronomy Picture of the Day, translates it with AI (Brazilian Portuguese by default), generates strategic hashtags, and publishes everything to your LinkedIn profile with the stunning space image attached.

## Who's it for
Content creators, astronomy enthusiasts, science communicators, and anyone wanting to share high-quality educational content consistently on LinkedIn. Perfect for Portuguese-speaking professionals who want to engage their network with fascinating space discoveries while building their personal brand as a science advocate.

## How it works
The workflow runs on a daily schedule and handles the complete content pipeline automatically. It fetches the latest NASA APOD through the official API, including both the image and detailed explanation (see the request sketch at the end of this section). The English description gets professionally translated to your selected language using Google Gemini 2.5 Flash, while maintaining scientific accuracy and terminology. Smart hashtag generation combines fixed branding tags with content-specific ones, mixing Portuguese and English for maximum reach. The final post includes the NASA image, translated description, and strategic hashtags, then gets published to your LinkedIn profile automatically.

## How to set up
You'll need accounts for Google AI Studio (free), LinkedIn Developer (free), and a Telegram bot for notifications. The setup takes about 15 minutes and uses only free services and APIs.

First, create your Google AI Studio account and get an API key for the AI translation services. Then set up a LinkedIn OAuth2 application to enable posting permissions. Create a Telegram bot through BotFather and get your chat ID for notifications. Configure the Settings node with your Telegram chat ID and preferred language. The workflow comes with all prompts and configurations ready to use. Test each component individually before activating the daily automation.

## Requirements
- LinkedIn account with posting permissions
- Google AI Studio API key (free tier available)
- Telegram bot token and your chat ID
- Basic understanding of OAuth2 setup for LinkedIn
- NASA API key (optional; a demo key is included)

All services used have generous free tiers, making this workflow completely free to operate indefinitely.

## How to customize the workflow
The centralized Settings node makes customization simple. Change the target language from Brazilian Portuguese to any other language by updating the `translate_to_language` variable. Modify the posting schedule in the CRON trigger to match your preferred timing. Customize the post template in the "Create Final Post Text" node to match your personal brand voice. Adjust the hashtag strategy by editing the AI prompt in the "Generate Hashtags" node. Add additional social platforms by duplicating the LinkedIn publisher with different credentials.

The AI prompts can be fine-tuned for different writing styles or specific astronomical topics. You can also extend the workflow to include additional content processing, image enhancements, or cross-posting to multiple platforms while maintaining the core NASA APOD automation.
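For reference, the APOD fetch at the heart of the pipeline boils down to a single API call. DEMO_KEY works for light testing; request a free key at api.nasa.gov for daily use.

```javascript
// Fetch today's Astronomy Picture of the Day from NASA's public API.
const res = await fetch("https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY");
const apod = await res.json();
// Standard APOD fields include: title, date, url, media_type, explanation.
console.log(apod.title, apod.url);
console.log(apod.explanation.slice(0, 120) + "...");
```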
by Aadarsh Jain
## Who is this for?
This workflow is designed for DevOps engineers, platform engineers, and Kubernetes administrators who want to interact with their Kubernetes clusters through natural language queries in n8n. It's perfect for teams who need quick cluster insights without memorizing complex kubectl commands or switching between multiple cluster contexts manually.

## How it works
The workflow operates in three intelligent stages:
1. **Cluster Discovery & Context Switching**: Automatically lists available clusters from your kubeconfig and switches to the appropriate cluster based on your natural language query.
2. **Command Generation**: Uses GPT-4o to analyze your request and generate the correct kubectl command with proper flags, selectors, and output formatting.
3. **Command Execution**: Executes the generated kubectl command against your selected cluster and returns the results.

The workflow supports multi-cluster environments and can handle queries like:
- "Show me all pods in production cluster"
- "List failing deployments in production"
- "Get pod details in kube-system namespace"

## Setup
1. Clone the MCP server:
   ```
   git clone https://github.com/aadarshjain/kubectl-mcp-server
   cd kubectl-mcp-server
   ```
2. Configure your kubeconfig: ensure your `~/.kube/config` contains all the clusters you want to access.
3. Set up MCP STDIO credentials in n8n:
   - Command: `/full/path/to/python-package`
   - Arguments: `/full/path/to/kubectl-mcp-server/server.py`
4. Import the workflow into your n8n instance.
5. Configure OpenAI credentials for the GPT-4o models.
6. Test the workflow using the chat interface with queries like "show pods in [cluster-name]".
by clancy jack
This n8n workflow recommends Taiwan indie music based on a user's city, mood, birthday, today's weather, and star sign. Here's a concise overview:

- **Trigger**: Starts manually with the "When clicking 'Test workflow'" node.
- **Input Setup**: The "infomation" node sets user inputs (e.g., city: Taipei, mood: Happy, birthday: 1996/11/21).
- **Song Recommendation**: The "get song recommendation" node uses OpenAI's GPT-4o-mini to:
  - Fetch today's weather for the specified city.
  - Determine the user's zodiac sign from their birthday.
  - Check the zodiac sign's daily fortune.
  - Recommend a Taiwan indie song considering weather and fortune.
  - Explain the song choice and highlight its features.
  - Return results in JSON format.
- **Data Extraction**: The "Information Extractor" node parses the JSON output, extracting fields like date, city, weather, zodiac sign, fortune, song, artist, and additional info (see the example shape below).
- **Spotify Search**: The "Spotify" node searches for the recommended song using the artist and song name, retrieving a Spotify URL.
- **Final Output**: The "Final Output" node compiles all data, including the Spotify link, into a structured format.
- **Additional Note**: A Sticky Note provides context about the workflow's purpose and credits the creator, n8nguide.

This workflow integrates AI, weather data, astrology, and Spotify to deliver personalized Taiwan indie music recommendations.
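As an illustration, the extracted fields look roughly like this. The field names depend on how the Information Extractor is configured, and the song pairing here is just an example, not workflow output:

```javascript
// Example shape of the parsed recommendation.
const example = {
  date: "2025-05-01",
  city: "Taipei",
  weather: "Light rain, 22°C",
  zodiacSign: "Scorpio",
  fortune: "A good day for creative detours.",
  song: "My Jinji",
  artist: "Sunset Rollercoaster",
  additionalInfo: "Dreamy synth-pop that suits a rainy evening.",
};
```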
by Viktor
# Nightly Discord Channel Cleanup

This workflow runs every day at 9:00 p.m. and:
1. Retrieves all Discord channels using your provided credentials.
2. Pauses briefly to respect Discord API rate limits.
3. Loops through each channel and fetches messages.
4. Filters out messages older than seven days (see the filter sketch below).
5. Deletes those older messages, again pausing to stay within deletion rate limits.

By setting up this workflow on a schedule, you can automatically keep Discord channels tidy and compliant with retention policies.

## 👨🎤 Setup
1. Add your Discord credentials.
2. Change the server in each Discord node to the correct one.
3. Click the Test Workflow button.
4. Activate the workflow to run on a schedule.
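The seven-day filter in step 4 can be expressed as a small Code-node sketch, assuming each message item exposes an ISO `timestamp` field as Discord's API does (check one fetched message to confirm the field name in your setup):

```javascript
// Keep only messages older than seven days, so the next node deletes them.
const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;

return $input.all().filter(
  (item) => new Date(item.json.timestamp).getTime() < cutoff
);
```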
by Basil Irfan
# 🚀 LinkedIn Lead-Gen Flywheel – Apify → GPT-4o → Google Sheets → Phantombuster

## What this workflow does
1. **Collect audience specs**: a simple web form asks for your ideal company profile.
2. **Generate a laser-targeted Apollo search URL** with GPT-4o (no manual filtering).
3. **Scrape the matching leads** via an Apify actor (returns clean JSON).
4. **Craft hyper-personalized icebreakers** for each lead using GPT-4o (ultra-short, human-sounding).
5. **Log everything to Google Sheets**: name, LinkedIn URL, company site, summary, and the icebreaker.
6. **(Optional) Auto-launch Phantombuster** to fire off those connection requests at scale.

## Why it matters
- **Zero grunt work**: audience research, scraping, copywriting, and outreach all run hands-free.
- **Punchy personalization**: micro-icebreakers outperform canned intros, boosting accept rates.
- **Scales with you**: flip a switch to go from 10 to 1,000+ connections/day.

## Node rundown

| Step | Node | Key Inputs | Key Outputs |
| --- | --- | --- | --- |
| 1 | Form Trigger | Audience description | description_of_company |
| 2 | OpenAI (GPT-4o) | Audience text | SearchUrl |
| 3 | HTTP Request – Apify | SearchUrl, APIFY_TOKEN | Lead JSON |
| 4 | OpenAI (GPT-4o) | Lead JSON | Icebreaker |
| 5 | Google Sheets | Lead + Icebreaker | Row append/update |
| 6 | Aggregate | Sheet rows | Batched output |
| 7 | HTTP Request – Phantombuster | PHANTOM_KEY, AGENT_ID | Launch status |

## Prerequisites
- **OpenAI API key** (GPT-4o access recommended)
- **Apify API token** with access to the actor ID
- **Google Service Account creds** shared with your target sheet
- **Phantombuster API key** and Agent ID for your LinkedIn connector
- Active Apollo account to open the generated search URL (only required for debugging)

## Setup (5-minute sprint)
1. Import the workflow into n8n.
2. Add the required credentials in Credentials → OpenAI, Apify, Google Sheets, Phantombuster.
3. Paste your Phantombuster Agent ID into the HTTP Request node URL (see the launch sketch after this list).
4. Publish the Form Trigger URL; this is where you (or your SDRs) describe the target audience.
5. Hit Execute Workflow once to verify data flows end-to-end.

## Customization tips
- **Titles & keywords**: tweak the prompt in the first GPT-4o node to lock in different roles or industries.
- **Icebreaker style**: adjust the second GPT-4o prompt to match your brand voice.
- **Data columns**: map extra fields from Apify into Google Sheets as needed.
- **Skip outreach**: disable the Phantombuster node if you only want the leads + icebreakers.
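For step 3 of the setup, the Phantombuster launch request is roughly the sketch below. Both the endpoint and the header name are assumptions based on Phantombuster's commonly cited v2 API; verify them against the current API docs for your account before relying on this.

```javascript
// Assumed launch call for a Phantombuster agent (verify endpoint and header).
const res = await fetch("https://api.phantombuster.com/api/v2/agents/launch", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "X-Phantombuster-Key-1": "YOUR_PHANTOM_KEY", // API key header
  },
  body: JSON.stringify({ id: "YOUR_AGENT_ID" }),
});
console.log(await res.json());
```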
by Angel Menendez
## Who is this for?
This workflow is for professionals and teams who want to automate LinkedIn message replies with intelligent, human-like responses, without losing control over tone or accuracy. Ideal for founders, sales teams, DevRel, or community managers handling high-volume inbound messages.

## What problem is this workflow solving?
Responding to every LinkedIn message manually is slow and inconsistent. Basic AI bots generate replies without context or nuance. This subworkflow solves both problems by using structured message routing from Notion and profile insights from UniPile to craft smart, context-aware responses.

## What this workflow does
This workflow takes the sender's message and profile (from LinkedIn Auto Message Router with Request Detection) and references your centralized Notion database of message types. It uses that to either match the message to a known response or generate a new one using OpenAI's GPT model, all while following professional tone guidelines.

This is the third workflow in a 3-part automation system:
1. Receives data from LinkedIn Auto Message Router with Request Detection
2. Uses UniPile LinkedIn Profile Lookup Subworkflow to enrich responses based on follower count or org data

## Example Use Case
If a message comes from someone with low reach (e.g., under 1,000 followers), the AI politely deflects a meeting request. If an influencer reaches out, the AI immediately offers a booking link. Your team controls this logic by updating the Notion database, with no edits to the workflow required. (A sketch of this gate appears below.)

## Setup
1. Connect this workflow as a subworkflow in your router or Slack approval flow.
2. Store your Notion API key and database ID in n8n.
3. Provide the following parent inputs:
   - `message` – The LinkedIn message text
   - `sender` – Name of the sender
   - `chatid` – Session ID (optional, for memory)
   - `linkedinprofile` – Enriched array with LinkedIn context (follower count, connection info, etc.)
4. Add your preferred AI model credentials (supports OpenAI, Gemini, or Ollama).
5. Optional: Customize the system prompt to better match your brand voice.

## How to customize this workflow to your needs
- Update the Notion schema to include industry-specific categories or actions.
- Change the AI tone (e.g., humorous, more corporate, etc.).
- Add conditional logic for auto-sending messages without Slack approval.
- Extend to support multiple platforms (e.g., email, X/Twitter, Instagram DMs).
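The reach-based gate from the example use case can be sketched as below. Thresholds and field names are illustrative; in the real workflow this logic lives in your Notion rules and the AI system prompt.

```javascript
// Hypothetical routing sketch (Code node, "Run Once for Each Item" mode),
// reading the enriched linkedinprofile input documented in Setup step 3.
const followers = $json.linkedinprofile?.[0]?.followerCount ?? 0;

let strategy;
if (followers < 1000) {
  strategy = "polite_deflect";     // gracefully decline meeting requests
} else if (followers > 10000) {
  strategy = "offer_booking_link"; // fast-track high-reach senders
} else {
  strategy = "standard_reply";
}

return { json: { ...$json, strategy } };
```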