by Avkash Kakdiya
How it works

This workflow turns a single planning row in Google Sheets into a fully structured content engine. It generates weighted content pillars, builds a rule-based posting calendar, and then creates publish-ready social posts using AI. The workflow strictly controls format routing, CTA rules, and execution order. All outputs are written back to Google Sheets for easy review and execution.

Step-by-step

**Step 1: Input capture & pillar generation**
- Google Sheets Trigger – Detects new or updated planning rows.
- Get row(s) in sheet – Fetches brand, platform, scheduling, and promotion inputs.
- Message a model – Calculates calendar metrics and generates platform-specific content pillars.
- Code in JavaScript – Validates AI output and enforces 100% weight distribution.
- Append row in sheet – Stores finalized content pillars in the pillars sheet.

**Step 2: Calendar generation & routing**
- Message a model7 – Generates a full day-by-day content calendar from the pillars.
- Code in JavaScript7 – Normalizes calendar data into a sheet-compatible structure.
- Append row in sheet6 – Saves calendar entries with dates, formats, CTAs, and status.
- Switch By Format – Routes items based on Video vs Non-Video formats.

**Step 3: Post creation & final storage**
- Loop Over Items – Processes each calendar entry one at a time.
- Message a model6 – Creates complete hooks, captions, CTAs, and hashtags.
- Code in JavaScript6 – Formats AI output for final storage.
- Append row in sheet7 – Stores publish-ready posts in the final sheet.
- Wait – Controls pacing to avoid API rate limits.

Why use this?
- Eliminates manual content planning and ideation.
- Enforces strategic content mix and CTA discipline.
- Produces platform-ready posts automatically.
- Keeps all planning, calendars, and content in Google Sheets.
- Scales content operations without extra overhead.
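A minimal sketch of what the weight-enforcement step might look like. The proportional-rescale approach and the pillar field names below are illustrative assumptions, not the template's actual Code node:

```javascript
// Hypothetical sketch: force AI-generated pillar weights to sum to exactly 100%.
function enforceWeights(pillars) {
  const total = pillars.reduce((sum, p) => sum + p.weight, 0);
  if (total <= 0) throw new Error('AI returned no usable pillar weights');
  // Rescale proportionally so the weights sum to roughly 100
  const scaled = pillars.map((p) => ({
    ...p,
    weight: Math.round((p.weight / total) * 100),
  }));
  // Push any rounding remainder onto the heaviest pillar
  const drift = 100 - scaled.reduce((sum, p) => sum + p.weight, 0);
  scaled.sort((a, b) => b.weight - a.weight)[0].weight += drift;
  return scaled;
}
```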
by Guido X Jansen
AI Council: Multi-Model Consensus with Peer Review

Inspired by Andrej Karpathy's LLM Council, but rebuilt in n8n. This workflow creates a "council" of AI models that independently answer your question, then peer-review each other's responses before a final arbiter synthesizes the best answer.

Who is this for?
- Anyone preparing for a meeting with different people who wants to prep for their different views
- Anyone who wants to find "blind spots" in their view on a certain subject
- Researchers wanting more robust AI-generated answers
- Developers exploring multi-model architectures
- Anyone seeking higher-quality responses through AI consensus, potentially with faster/cheaper models
- Teams evaluating different LLM capabilities side-by-side

How it works
1. Ask a Question — Submit your query via the Chat Trigger
2. Individual Answers — Four different models (Gemini, Llama, Gemma, Mistral) independently generate responses
3. Peer Review — Each model reviews ALL answers, identifying pros, cons, and an overall assessment
4. Final Synthesis — DeepSeek R1 analyzes all peer reviews and produces a refined, consensus-based final answer

Setup Instructions

Prerequisites
- Access to an LLM (e.g. an OpenRouter account with API credits)

Steps
1. Create OpenRouter credentials in n8n: go to Settings → Credentials → Add Credential, select "OpenRouter", and paste your API key
2. Connect all model nodes to your OpenRouter credential. In this example I used Gemini, Llama, Gemma, Mistral, and DeepSeek, but you can use whatever you want. You can also use the same models but change their parameters. Play around to find out what suits you best.
3. Activate the workflow and open the Chat interface to test

Customization Ideas
- You can add as many answer and review models as you want. Do note that each AI node is executed in series, so each will add to the total duration.
- Swap models via OpenRouter's model selector (e.g., use Claude, GPT-4, etc.)
- Adjust the peer review prompt to represent a certain persona or to use domain-specific evaluation criteria
- Add memory nodes for multi-turn conversations
- Connect to Slack/Discord instead of the Chat Trigger
by Daniel Turgeman
How it works
- A daily schedule pulls your target accounts from HubSpot
- All companies are bulk-enriched with Lusha in a single API call
- A code node detects growth signals: headcount increase, revenue growth, and funding activity
- For accounts showing signals, Lusha searches for key contacts and alerts your sales team via Slack

Set up steps
1. Install the Lusha community node
2. Add your Lusha API, HubSpot, and Slack credentials
3. Define your target account list or ICP filters in HubSpot
4. Set the Slack channel for signal alerts and activate
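The signal-detection code node could look something like the sketch below. The field names and thresholds are assumptions; map them to the shape Lusha actually returns in your account:

```javascript
// Hypothetical growth-signal detection for one enriched company record.
const SIX_MONTHS_MS = 1000 * 60 * 60 * 24 * 182;

function detectSignals(company, now = Date.now()) {
  const signals = [];
  // Headcount increase: more than 10% over the previously stored value
  if (company.employeeCount && company.previousEmployeeCount &&
      company.employeeCount > company.previousEmployeeCount * 1.1) {
    signals.push('headcount_growth');
  }
  // Revenue growth since the last sync
  if (company.revenue && company.previousRevenue &&
      company.revenue > company.previousRevenue) {
    signals.push('revenue_growth');
  }
  // Funding activity within the last ~6 months
  if (company.lastFundingDate &&
      now - new Date(company.lastFundingDate).getTime() < SIX_MONTHS_MS) {
    signals.push('recent_funding');
  }
  return signals;
}
```

Accounts where `detectSignals` returns an empty array would simply be filtered out before the contact search.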
by Kevin Armbruster
Automatically add Travel time blockers before Appointments

This bot automatically adds Travel time blockers to your calendar, so you never arrive late to an appointment again.

How it works
- **Trigger**: The workflow is initiated daily at 7 AM by a Schedule Trigger.
- **AI Agent**: An AI Agent node orchestrates the main logic.
- **Fetch events**: It uses the get_calendar_events tool to retrieve all events scheduled for the current day.
- **Identify events with location**: It then filters these events to identify those that have a specified location.
- **Check for existing travel time blockers**: For each event with a location, it checks if a Travel time blocker already exists. Events that do *not* have such a blocker are marked for processing.
- **Calculate travel time**: Using the Google Directions API, it determines how long it takes to get to the location of the event. The starting location is your **Home Address** by default, unless there is a previous event within 2 hours before the event, in which case it uses the location of that previous event.
- **Create Travel time blocker**: Finally, it uses the create_calendar_event tool to create the Travel time blocker with a duration equal to the calculated travel time plus 10 minutes of buffer.

Set up steps
1. Set variables: Home address, Blocker name, Mode of Transportation
2. Connect your LLM provider
3. Connect your Google Calendar
4. Connect your Google Directions API
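The origin-selection and blocker-sizing rules above can be sketched as plain JavaScript. The event shape and the home address are illustrative assumptions; travel time itself would come from the Directions API call:

```javascript
// Hypothetical sketch of the blocker-placement rules.
const HOME_ADDRESS = '123 Example Street'; // assumed placeholder
const BUFFER_MINUTES = 10;
const TWO_HOURS_MS = 2 * 60 * 60 * 1000;

// Origin is home by default, or the previous event's location if that
// event ended within 2 hours before the upcoming event starts.
function pickOrigin(event, previousEvent) {
  if (previousEvent && previousEvent.location &&
      event.start - previousEvent.end <= TWO_HOURS_MS) {
    return previousEvent.location;
  }
  return HOME_ADDRESS;
}

// Blocker = travel time + 10-minute buffer, ending when the event starts.
function buildBlocker(event, travelMinutes) {
  const durationMs = (travelMinutes + BUFFER_MINUTES) * 60 * 1000;
  return {
    title: `Travel to: ${event.title}`,
    start: event.start - durationMs,
    end: event.start,
  };
}
```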
by Denis
What this workflow does

Complete Airtable database management system using MCP (Model Context Protocol) for AI agents. Create bases, tables with complex field types, manage records, and maintain state with Redis storage.

Setup steps
1. Add your Airtable Personal Access Token to credentials
2. Configure Redis connection for ID storage
3. Get your workspace ID from Airtable (starts with wsp...)
4. Connect to the MCP Server Trigger
5. Configure your AI agent with the provided instructions

Key features
- Create new Airtable bases and custom tables
- Support for all field types (date, number, select, etc.)
- Full CRUD operations on records
- Rename tables and fields
- Store base/workspace IDs to avoid repeated requests
- Generic operations work with ANY Airtable structure

Included operations
- create_base, create_custom_table, add_field
- get_table_ids, get_existing_records
- update_record, rename_table, rename_fields
- delete_record
- get/set base_id and workspace_id (Redis storage)

Notes

Check the sticky notes in the workflow for ID locations and field type requirements.
by higashiyama
AI Team Morale Monitor

Who’s it for

For team leads, HR, and managers who want to monitor the emotional tone and morale of their teams based on message sentiment.

How it works
1. Trigger: Runs every Monday at 9 AM.
2. Config: Defines your Teams and Slack channels.
3. Fetch: Gathers messages for the week.
4. AI Analysis: Evaluates tone and stress levels.
5. Aggregate: Computes team sentiment averages.
6. Report: Creates a readable morale summary.
7. Slack Post: Sends the report to your workspace.

How to set up
1. Connect Microsoft Teams and Slack credentials.
2. Enter your Team and Channel IDs in the Workflow Configuration node.
3. Adjust the schedule if desired.

Requirements
- Microsoft Teams and Slack access.
- Gemini (or OpenAI) API credentials set in the AI nodes.

How to customize
- Modify the AI prompts for different insight depth.
- Replace Gemini with other LLMs if preferred.
- Change the posting platform or format.

Note: This workflow uses only linguistic data — no personal identifiers or private metadata.
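The Aggregate step could be as simple as averaging per-message sentiment scores per channel. This is a hedged sketch; the scoring scale (numbers in [-1, 1]) and field names are assumptions about the AI node's output:

```javascript
// Hypothetical sketch: average per-message sentiment into a per-channel score.
function aggregateTeamSentiment(messages) {
  const byChannel = {};
  for (const m of messages) {
    (byChannel[m.channel] ||= []).push(m.sentiment);
  }
  const summary = {};
  for (const [channel, scores] of Object.entries(byChannel)) {
    const avg = scores.reduce((a, b) => a + b, 0) / scores.length;
    summary[channel] = Number(avg.toFixed(2)); // round for the report
  }
  return summary;
}
```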
by Daniel Turgeman
How it works
- Triggers instantly when a new contact is created in HubSpot
- Checks that the contact has a valid email, then enriches it with Lusha
- A data quality check verifies that Lusha returned meaningful data (phone, title, company)
- Enriched fields are written back to HubSpot; high-seniority contacts trigger a Slack alert

Set up steps
1. Install the Lusha community node
2. Add your Lusha API, HubSpot OAuth2, and Slack credentials
3. Customize the seniority levels that trigger alerts
4. Activate the workflow — every new HubSpot contact will be enriched in real time
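The data quality gate could look like the sketch below: the enrichment only counts if at least one meaningful field came back non-empty. Field names are assumptions; map them to your actual Lusha output:

```javascript
// Hypothetical quality check on a Lusha enrichment result.
function hasMeaningfulData(enriched) {
  const phone = enriched.phone && enriched.phone.trim();
  const title = enriched.jobTitle && enriched.jobTitle.trim();
  const company = enriched.companyName && enriched.companyName.trim();
  // At least one non-blank field makes the enrichment worth writing back
  return Boolean(phone || title || company);
}
```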
by Rahul Joshi
📊 Description

This workflow automatically creates a daily market intelligence brief for your stock portfolio. Instead of checking prices, news, and social media separately, it brings everything together into one clear update.

On a scheduled basis, the workflow reads your stock list from Google Sheets and processes each stock individually. It fetches the latest stock price data, recent market news, and investor sentiment from public sources. All this information is then analyzed by AI to identify what truly matters, filtering out noise and repeated information.

The AI generates a concise market summary that highlights overall sentiment, key drivers, risks, and one actionable insight for the day. The final result is sent directly to Slack as a clean, easy-to-read message, helping you stay informed without manual effort.

This workflow is ideal for anyone who wants a clear daily view of market conditions without spending hours monitoring multiple platforms.

⚙️ What This Workflow Does
- Runs automatically on a daily schedule
- Reads stock symbols from Google Sheets
- Fetches the latest stock price data
- Collects recent market news
- Gathers investor sentiment from public discussions
- Uses AI to summarize market-moving signals
- Sends one actionable daily brief to Slack

✅ Key Benefits
- Saves time by automating market monitoring
- Reduces noise and highlights what actually matters
- Combines prices, news, and sentiment in one place
- Provides clear daily insights instead of raw data
- Easy to adjust for different portfolios or schedules

🧩 Features
- Scheduled daily execution
- Portfolio-based stock tracking
- Market news collection via RSS
- Social sentiment analysis from Reddit
- AI-driven market intelligence summary
- Structured output for alerts or reporting
- Slack integration for daily delivery

🔐 Requirements

To use this workflow, you will need:
- Alpha Vantage API key for stock price data
- OpenAI account for AI analysis
- Google Sheets access for portfolio input
- Slack account for message delivery
- n8n instance (cloud or self-hosted)

🎯 Target Audience
- Stock investors
- Portfolio managers
- Market analysts
- Finance teams
- Founders and operators tracking markets
- Automation builders in finance
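A hedged sketch of the per-symbol price lookup, assuming Alpha Vantage's GLOBAL_QUOTE endpoint. The workflow's actual HTTP node may use a different function or response mapping:

```javascript
// Parse the fields the daily brief needs out of a GLOBAL_QUOTE response.
function parseQuote(symbol, data) {
  const q = data['Global Quote'] || {};
  return {
    symbol,
    price: parseFloat(q['05. price']),
    changePercent: q['10. change percent'],
  };
}

// Fetch one symbol's quote (requires your own Alpha Vantage API key).
async function fetchQuote(symbol, apiKey) {
  const url = 'https://www.alphavantage.co/query?function=GLOBAL_QUOTE' +
              `&symbol=${encodeURIComponent(symbol)}&apikey=${apiKey}`;
  const res = await fetch(url);
  return parseQuote(symbol, await res.json());
}
```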
by Databox
Stop switching between Slack and your analytics dashboards. Mention the bot in any Slack channel, ask a business question, and get an AI-powered answer from Databox in seconds - without leaving Slack.

Who's it for
- **Marketing and growth teams** who want instant data answers during standups
- **Managers** who need quick metric checks without logging into dashboards
- **Anyone using Slack** who wants to query Databox data conversationally

How it works
1. @mention the bot in Slack with a business question
2. The Slack Trigger captures the message and passes it to the AI Agent
3. The AI Agent queries Databox via MCP in real time
4. The answer is posted back to the same Slack channel

Requirements
- **Databox account** (free plan works)
- OpenAI API key (or Anthropic)
- Slack account with a custom Slack app (setup guide in the sticky notes)

How to set up

Setup takes around 15 minutes. The main step is creating a custom Slack app:
1. Go to api.slack.com/apps, create an app, and add the app_mentions:read and chat:write scopes
2. Copy the Bot Token (xoxb-) and Signing Secret, and add them as a Slack API credential in n8n
3. Click the Databox MCP Tool, set Authentication to OAuth2, and authorize
4. Add your OpenAI API key to the Chat Model node
5. Activate, then paste the webhook URL into Slack Event Subscriptions
6. Invite the bot to your channel with /invite @BotName
by Guillaume Duvernay
Stop duplicating your work! This template demonstrates a powerful design pattern to handle multiple triggers (e.g., Form, Webhook, Sub-workflow) within a single, unified workflow. By using a "normalize and consolidate" technique, your core logic becomes independent of the trigger that started it, making your automations cleaner, more scalable, and far easier to maintain.

Who is this for?
- **n8n developers & architects:** Build robust, enterprise-grade workflows that are easy to maintain.
- **Automation specialists:** Integrate the same core process with multiple external systems without repeating yourself.
- **Anyone who values clean design:** Apply the DRY (Don't Repeat Yourself) principle to your automations.

What problem does this solve?
- **Reduces duplication:** Avoids creating near-identical workflows for each trigger source.
- **Simplifies maintenance:** Update your core logic in one place, not across multiple workflows.
- **Improves scalability:** Easily add new triggers without altering the core processing logic.
- **Enhances readability:** A clear separation of data intake from core logic makes workflows easier to understand.

How it works (The "Normalize & Consolidate" Pattern)
1. Trigger: The workflow starts from one of several possible entry points, each with a unique data structure.
2. Normalize: Each trigger path immediately flows into a dedicated Set node. This node acts as an adapter, reformatting the unique data into a standardized schema with consistent key names (e.g., mapping body.feedback to feedback).
3. Consolidate: All "normalize" nodes connect to a single Set node. This node uses the generic {{ $json.key_name }} expression to accept the standardized data from any branch. From here, the workflow is a single, unified path.

Setup

This template is a blueprint. To adapt it:
1. Replace the triggers with your own.
2. Normalize your data: After each trigger, use a Set node to map its unique output to your common schema.
3. Connect to the consolidator: Link all your "normalize" nodes to the Consolidate trigger data node.
4. Build your core logic after the consolidation point, referencing the unified data.

Taking it further
- **Merge any branches:** Use this pattern to merge any parallel branches in a workflow, not just triggers.
- **Create robust error handling:** Unify "success" and "error" paths before a final notification step to report on the outcome.
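The pattern above, expressed as plain JavaScript for illustration (the template itself uses Set nodes, not code). The `feedback`/`email` schema is an assumed example; only the `body.feedback` → `feedback` mapping comes from the description:

```javascript
// One adapter per trigger maps its unique payload to a shared schema.
const normalizers = {
  // Form trigger: fields arrive at the top level
  form: (payload) => ({
    feedback: payload.feedback,
    email: payload.email,
    source: 'form',
  }),
  // Webhook trigger: fields arrive nested under body
  webhook: (payload) => ({
    feedback: payload.body.feedback,
    email: payload.body.email,
    source: 'webhook',
  }),
};

// Downstream logic reads the same keys regardless of which trigger fired.
function consolidate(triggerType, payload) {
  return normalizers[triggerType](payload);
}
```

Adding a new trigger then means adding one adapter, with no change to the core logic.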
by Daniel Turgeman
How it works
- A webhook receives a new lead with an email address, which is validated
- Lusha enriches the lead with seniority, company size, industry, and phone data
- A lead score is calculated based on ICP fit (seniority, company size, industry match, data quality)
- Three-tier routing: hot leads (60+ pts) get a Slack alert + CRM upsert, warm leads (35-59) get CRM + Slack, cold leads go to nurture
- A webhook response returns the score and tier to the calling system

Set up steps
1. Install the Lusha community node
2. Add your Lusha API, Slack, and HubSpot credentials
3. Customize the scoring logic in the Code node to match your ICP criteria
4. Set the Slack channels (#hot-leads, #warm-leads) and activate
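The tier routing reduces to two thresholds. Only the 60/35 cutoffs come from the template's description; the scoring weights themselves live in the workflow's Code node and are meant to be customized:

```javascript
// Route a scored lead into one of three tiers.
function routeLead(score) {
  if (score >= 60) return 'hot';   // Slack alert + CRM upsert
  if (score >= 35) return 'warm';  // CRM + Slack
  return 'cold';                   // nurture sequence
}
```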
by James Carter
This n8n template generates a dynamic weekly sales report from Airtable and sends it to Slack. It calculates key sales metrics like total pipeline value, weighted pipeline (based on deal stage), top deal, closed revenue, and win rate, all formatted in a clean Slack message.

How it works

A schedule trigger starts the workflow (e.g., every Monday). It fetches deal data from Airtable, splits open vs. closed deals, calculates all metrics with JavaScript, and formats the output. The message is then sent to Slack using Markdown for readability.

How to use
1. Update the Airtable credentials and select your base and table with fields: Deal Name, Value, Status, etc.
2. Set the Slack channel in the final node to your preferred sales or ops channel.

Requirements
- Airtable base with relevant deal data (see field structure)
- Slack webhook or token for sending messages

Customising this workflow

You can adapt the logic to other CRMs like Salesforce or HubSpot, add charts, or tweak stage weights. You can also change the schedule or add filters (e.g., by rep or region).
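The metrics step might look like the sketch below. The stage names, weights, and `Won`/`Lost` status values are illustrative assumptions; the template expects you to tweak the stage weights anyway:

```javascript
// Hypothetical stage weights for the weighted pipeline.
const STAGE_WEIGHTS = { Prospect: 0.1, Qualified: 0.3, Proposal: 0.6, Negotiation: 0.8 };

function calcMetrics(deals) {
  const closed = deals.filter((d) => d.Status === 'Won' || d.Status === 'Lost');
  const open = deals.filter((d) => d.Status !== 'Won' && d.Status !== 'Lost');
  const won = closed.filter((d) => d.Status === 'Won');

  return {
    totalPipeline: open.reduce((s, d) => s + d.Value, 0),
    weightedPipeline: open.reduce(
      (s, d) => s + d.Value * (STAGE_WEIGHTS[d.Status] ?? 0), 0),
    topDeal: open.reduce((top, d) => (!top || d.Value > top.Value ? d : top), null),
    closedRevenue: won.reduce((s, d) => s + d.Value, 0),
    winRate: closed.length ? won.length / closed.length : 0,
  };
}
```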