by Simon
# Address Validation Workflow

## About

This workflow automates the process of validating and correcting client shipping addresses in Billbee, ensuring accurate delivery information. It's ideal for e-commerce businesses looking to save time and reduce errors in their order fulfillment process. The workflow uses Billbee, an order management platform for small to medium-sized online retailers, and the Endereco API for address validation.

## Who Is This For?

- **E-Commerce Businesses**: Streamline order fulfillment by automatically correcting common shipping address errors.
- **Warehouse Teams**: Reduce manual work and ensure packages are shipped to the correct address.
- **Small to Medium-Sized Retailers**: Businesses using Billbee to manage orders and requiring efficient, automated solutions for address validation.

## How It Works

1. **Trigger**: The workflow starts via a Billbee webhook when an order is imported.
2. **Fetch Data**: Retrieve the client's shipping address using the Order ID.
3. **Validate Address**: Send the address to the Endereco API for validation and correction (e.g., house number errors).
4. **Conditional Actions**:
   - Valid address: update the address in Billbee.
   - Invalid address: tag the order with "Validation Error".
5. **Track Status**: Add tags in Billbee for processed orders.

## Setup Steps

1. **API Keys**: Obtain Billbee Developer/User API keys and an Endereco API key.
2. **Billbee Rule**: Create an automation rule:
   - Trigger: Order imported.
   - Action: Call External URL with the OrderId to trigger the n8n workflow (see the sample payload below).
3. **Optional**: Use a secondary trigger (e.g., order state changes to "gepackt") for manual corrections.

## Customization Options

- **Filter Delivery Addresses**: Customize filters to exclude specific delivery types, such as pickup shops ("Postfiliale", "Paketshop", or "Packstation"). Filters can be adjusted within Billbee or in the workflow.
- **Error Handling**: Configure additional actions for orders that fail validation, such as notifying your team or flagging orders for manual review.
- **Order Tags**: Define custom tags in Billbee to better track order statuses (e.g., "Address Corrected", "Validation Error").
- **Trigger Types**: Use additional triggers such as changes to order states (e.g., "gepackt" or "In Fulfillment") for manual corrections or validations.
- **Address Fields**: Modify the workflow to focus on specific address components, such as postal codes, city names, or country codes.
- **Validation Rules**: Adjust Endereco API settings or add custom logic to refine validation criteria based on your business needs.

## API Documentation

- **Endereco**: Endereco API Docs
- **Billbee**: Billbee API Docs
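For reference, a minimal sketch of the JSON body the Billbee "Call External URL" action could send to the n8n webhook. The field name is an assumption that depends on how you configure the rule in Billbee, so adjust it to match your setup:

```json
{
  "OrderId": 123456
}
```

The webhook node passes this ID to the "Fetch Data" step, which looks up the full shipping address via the Billbee API.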
by Simeon
# 🔄 Reddit Content Operations via MCP Server

## 🧑‍💼 Who is this for?

This workflow is built for content creators, marketers, Reddit automation enthusiasts, and AI agent developers who want structured, programmable access to Reddit content. If you're researching niche communities, tracking trends, or automating Reddit engagement, this is for you.

## 💡 What problem is this workflow solving?

Reddit has valuable content scattered across subreddits, but manual analysis or engagement is inefficient. This workflow acts as a centralized API interface to:

- Query and manage Reddit posts
- Create, fetch, delete, and reply to comments
- Analyze subreddit metadata and behavior
- Enable AI agents to autonomously operate on Reddit data

It does this using an MCP (Model Context Protocol) Server over Server-Sent Events (SSE).

## ⚙️ What this workflow does

This template sets up a custom MCP Server that listens for JSON-based operation commands sent via SSE. Based on the operation, it routes the request to one of the following branches:

🟥 **Post CRUD**
- Create a new Reddit post
- Search posts across subreddits
- Fetch posts by ID
- Delete existing posts

🟩 **Comment CRUD**
- Create or reply to comments
- Fetch multiple comments from posts
- Delete specific comments

🟦 **Subreddit Read Operations**
- Get information about subreddits
- List subreddit posts
- Retrieve subreddit rules

## 🛠 Setup

1. Import this workflow into your self-hosted n8n instance.
2. Configure Reddit credentials (OAuth2).
3. Connect your input system to the MCP Server Trigger node via SSE.
4. Send operation payloads to the server like this:

```json
{
  "operation": "post_search",
  "params": {
    "query": "AI agents",
    "subreddit": "machinelearning"
  }
}
```

The workflow will route to the appropriate node based on operation type.

## 🧩 Supported Operations

- post_create
- post_get_many
- post_search
- post_delete
- post_get_by_id
- comment_create
- comment_reply
- comment_get_many
- comment_delete
- subreddit_get_about
- subreddit_get_many
- subreddit_get_rules

## 🧠 How to customize this workflow to your needs

- Add new operations to the operation_switch node for additional API functionality.
- Chain results into Notion, Slack, Airtable, or external APIs.
- Integrate with OpenAI/GPT to summarize posts or filter content.
- Add logic to score and sort content by engagement, sentiment, or keywords.

## 🟨 Sticky Notes

- Each operation group is color-coded (Posts, Comments, Subreddits).
- Sticky Notes explain the purpose and dependencies of each section.
- Easy to maintain and extend with clear logical separation.

> ⚠️ This template uses a custom MCP Server node and only works in self-hosted n8n.

## 🖼 Workflow Preview
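Building on the `post_search` example above, and assuming the same payload shape, a `comment_create` call might look like the sketch below. The `params` field names are illustrative assumptions rather than names guaranteed by the template:

```json
{
  "operation": "comment_create",
  "params": {
    "postId": "t3_abc123",
    "text": "Great write-up, thanks for sharing!"
  }
}
```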
by KlickTipp
**Community Node Disclaimer:** This workflow uses KlickTipp community nodes.

## How It Works

**AI Agent and KlickTipp Tools Integration via Telegram:** This component connects a large language model (LLM), such as Claude or OpenAI, to the KlickTipp contact management platform through Telegram messaging. The AI Agent interprets natural language queries received from Telegram and dynamically maps them to KlickTipp API operations, enabling intuitive and automated contact handling through a familiar messaging interface.

## Key Features

**Telegram & LLM Interaction Setup:**
- Captures messages received via the Telegram bot as an alternative to the chat message node.
- Maintains conversation state using a memory buffer tied to Telegram chat IDs.
- Interprets user input using an LLM (Claude or OpenAI).
- Routes interpreted commands to specific KlickTipp tools based on detected intent.
- Sends responses back to Telegram users with operation results.

**KlickTipp Integration:** A complete set of KlickTipp API endpoints is included:
- **Contact Management:** Add, update, get, list, delete, and unsubscribe contacts.
- **Contact Tagging:** Tag, untag, and list tagged contacts.
- **Tag Operations:** Create, get, update, delete, and list tags.
- **Opt-In Processes:** List and retrieve opt-in process details.
- **Data Fields:** List and get custom data fields.
- **Redirects:** Retrieve redirect URLs.

## Use Cases Supported

- Query contact information via email or name through Telegram messages.
- Identify and segment contacts by city, region, or behavior via Telegram commands.
- Create or update contacts from data provided in Telegram messages.
- Dynamically apply or remove tags to initiate campaigns through Telegram bot interactions.
- Automate targeted outreach based on contact attributes using Telegram as the control interface.

## Setup Instructions

**Install and Configure Nodes:**
1. Set up a Telegram bot using BotFather and obtain the bot token.
2. Configure the Telegram Trigger node in n8n with your bot token.
3. Configure the LLM model (e.g., OpenAI or Claude) and the memory node if used.
4. Connect all required KlickTipp nodes and authenticate using valid API credentials.
5. Activate the workflow.

**Define Tagging and Field Mapping:**
- Identify which fields and tags are relevant to your use cases.
- Ensure the necessary tags and custom fields are already created in KlickTipp.

## Workflow Logic

1. **Trigger via Telegram:** A message is received by the Telegram bot and passed to the AI Agent.
2. **Query Handling via LLM Agent:** The AI interprets the natural language input and determines the action.
3. **Contact Search & Segmentation:** Searches contacts using identifiers (email, address) or criteria.
4. **Data Operations:** Retrieves, updates, or manages contact and tag data based on the interpreted command.
5. **Campaign Preparation:** Applies tags or sends campaign triggers depending on query results.
6. **Response via Telegram:** Sends formatted results back to the Telegram user.

## Benefits

- **Mobile-First Interface:** Users can manage KlickTipp contacts directly from Telegram on any device.
- **AI-Powered Automation:** Reduces manual contact search and tagging efforts through intelligent processing.
- **Scalable Integration:** Built-in support for the full range of KlickTipp operations allows diverse use-case handling.
- **Data Consistency:** Ensures structured data flows between Telegram, AI, and KlickTipp, minimizing errors.

## Testing and Deployment

Use defined Telegram messages such as:
- "Tell me something about the contact with email address X"
- "Tag all contacts from region Y"
- "Send campaign Z to customers in area A"

Validate the expected actions in KlickTipp after message execution and confirm the responses in Telegram (see the sketch below for how a message might map to a tool call).

## Notes

- **Customization:** Adjust tag logic, AI prompts, and contact field mappings based on project needs.
- **Extensibility:** The template can be expanded with further logic for Google Sheets input or campaign feedback loops.

## Resources

- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
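To make the routing concrete, here is a hedged sketch of how the agent might translate a Telegram message into a KlickTipp tool call. The tool and argument names are illustrative assumptions for this write-up, not the KlickTipp node's actual parameter names:

```json
{
  "input": "Tag all contacts from region Bavaria with 'summer-campaign'",
  "resolved_tool": "contact_tagging",
  "arguments": {
    "filter": { "field": "region", "value": "Bavaria" },
    "tag": "summer-campaign"
  }
}
```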
by Mark Shcherbakov
## Video Guide

I prepared a detailed guide to help you set up your workflow effectively, enabling you to extract insights from YouTube for content generation using an AI agent.

Youtube Link

## Who is this for?

This workflow is ideal for content creators, marketers, and analysts looking to enhance their YouTube strategies through data-driven insights. It's particularly beneficial for individuals wanting to understand audience preferences and improve their video content.

## What problem does this workflow solve?

Navigating the content generation and optimization process can be complex, especially without significant audience insight. This workflow automates insight extraction from YouTube videos and comments, empowering users to create more engaging and relevant content.

## What this workflow does

The workflow integrates various APIs to gather insights from YouTube videos, enabling automated comment analysis, video transcription, and thumbnail evaluation. The main functionalities include:

- Extracting user preferences from comments.
- Transcribing video content for enhanced understanding.
- Analyzing thumbnails via AI for maximum viewer engagement insights.

Specifically:

- **AI Insights Extraction:** Automatically pulls comments and metrics from selected YouTube creators to evaluate trends and gaps.
- **Dynamic Video Planning:** Uses transcriptions to help creators outline video scripts and topics based on audience interest.
- **Thumbnail Assessment:** Provides analysis of thumbnail designs to improve click-through rates and viewer attraction.

## Setup

### N8N Workflow

1. **API Setup:** Create a Google Cloud project and enable the YouTube Data API. Generate an API key to include in your workflow requests.
2. **YouTube Creator and Video Selection:** Start by defining a request to identify top creators based on their video views. Capture the YouTube video IDs for further analysis of comments and other video metrics.
3. **Comment Analysis:** Gather comments associated with the selected videos and analyze them for user insights (see the request sketch below).
4. **Video Transcription:** Use the insights from transcriptions to formulate content plans.
5. **Thumbnail Analysis:** Evaluate your video thumbnails by submitting the URL through the OpenAI API to gain insights into their effectiveness.
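For orientation, here is a minimal sketch of the kind of request the comment-gathering step could make against the YouTube Data API. The endpoint and parameters match the public `commentThreads.list` method; the video ID and API key are placeholders:

```json
{
  "method": "GET",
  "url": "https://www.googleapis.com/youtube/v3/commentThreads",
  "query": {
    "part": "snippet",
    "videoId": "VIDEO_ID",
    "maxResults": 50,
    "order": "relevance",
    "key": "YOUR_API_KEY"
  }
}
```

Each returned item contains the top-level comment text and like count under `snippet.topLevelComment.snippet`, which is the raw material for the comment-analysis step.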
by David Ashby
Complete MCP server exposing 1 Recommendation API operation to AI agents.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Recommendation API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the Recommendation API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com{basePath}
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (1 total)

🔧 Find (1 endpoint)
• POST /find: Get Promoted Listings Recommendations

## 🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Recommendation API responses with full data structure

Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

## ✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
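As a usage illustration, here is a hedged sketch of what an agent might supply for the single `POST /find` operation, shown as a generic operation payload. The `listingIds` body field and the `filter` query parameter follow eBay's published Recommendation API documentation, but verify them against the current spec before relying on them:

```json
{
  "operation": "POST /find",
  "query": {
    "filter": "recommendationTypes:{AD}",
    "limit": 5
  },
  "body": {
    "listingIds": ["1234567890", "2345678901"]
  }
}
```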
by Joseph LePage
# Multi-AI Agent Chatbot for Postgres/Supabase Databases and QuickChart Generation

## Who is this for?

This workflow is ideal for data analysts, developers, and business intelligence teams who need an AI-powered chatbot to query Postgres/Supabase databases and generate dynamic charts for data visualization.

## What problem does this solve?

It simplifies data exploration by combining conversational AI with database querying and chart generation. Users can interact with their database using natural language, retrieve insights, and visualize data without manual SQL queries or chart configuration.

## What this workflow does

**AI-Powered Chat Interface:**
- Accepts natural language prompts to query databases or generate charts.
- Routes user requests through a tool agent system to determine the appropriate action (query or chart).

**Database Querying:**
- Executes SQL queries on Postgres/Supabase databases based on user input.
- Retrieves schema information, table definitions, and specific data records.

**Dynamic Chart Generation:**
- Uses QuickChart to create bar charts, line charts, or other visualizations from database records.
- Outputs a shareable chart URL or a JSON configuration for further customization (see the sample configuration below).

**Memory Integration:**
- Maintains chat history using Postgres memory nodes, enabling context-aware interactions.

*Workflow diagram showcasing AI agents, database querying, and chart generation paths.*

## Setup

**Prerequisites:**
- A Postgres-compatible database (e.g., Supabase).
- API credentials for OpenAI.

**Configuration Steps:**
1. Add your database connection credentials in the Postgres nodes.
2. Set up OpenAI credentials for GPT-4o-mini in the language model nodes.
3. Adjust the QuickChart schema in the "QuickChart Object Schema" node to fit your use case.

**Testing:**
- Trigger the chat workflow via the "When chat message received" node.
- Test with prompts like "Generate a bar chart of sales data" or "Show me all users in the database."

## How to customize this workflow

- **Modify AI Prompts**
- **Add Chart Types**
- **Integrate Other Tools**
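To illustrate the chart output, here is a minimal QuickChart configuration of the kind the "QuickChart Object Schema" node might produce. QuickChart accepts standard Chart.js configuration objects; the labels and data values below are invented examples:

```json
{
  "type": "bar",
  "data": {
    "labels": ["Jan", "Feb", "Mar"],
    "datasets": [
      { "label": "Sales", "data": [1200, 1900, 1500] }
    ]
  }
}
```

URL-encoding this object into `https://quickchart.io/chart?c=...` returns a rendered chart image whose URL can be shared directly in the chat response.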
by Tanay Agarwal
## Who is this for?

This workflow is ideal for HR teams, startups, and enterprises that want to handle employee interactions through WhatsApp and automate responses using an LLM (OpenAI) and intelligent routing.

## What problem is this workflow solving?

Managing WhatsApp messages manually can be time-consuming and error-prone. This workflow solves that by:

- Auto-classifying messages using an LLM
- Routing them to the right AI-powered agent
- Automating leave approvals, attendance, HR FAQs, complaints, and candidate shortlisting
- Delivering final responses interactively via WhatsApp

## What this workflow does

1. The WhatsApp Trigger captures incoming messages.
2. LLM classification analyzes message intent and outputs a category (1–5); see the classifier-output sketch below.
3. A Switch node routes the message to the correct agent:
   - 1 → Leave Agent
   - 2 → HR FAQ Chatbot
   - 3 → Attendance Agent
   - 4 → Complaint/Request Agent
   - 5 → Shortlisting Agent
4. Each agent performs specific tasks using tools like:
   - Google Sheets (fetch department head emails, JD/applicants, logs)
   - Google Calendar (schedule meetings)
   - Vector search (for policy embeddings)
   - OpenAI (transcription, classification, chatbot)
5. A final WhatsApp Response node sends updates and interactive options to the user.

## Setup

1. Connect the WhatsApp API (e.g., via Twilio or the WhatsApp Business Cloud API).
2. Configure OpenAI credentials.
3. Set up Google Sheets with:
   - Employee data
   - JD and applicant info
   - Policy documents (for embedding)
4. Prepare Google Calendar access.
5. Create a vector store with embedded company policy docs.

## How to customize this workflow to your needs

- Update the LLM prompt to suit your company's categories or expand to more intents.
- Replace the sample sheets with your organization's actual data.
- Train your own policy embeddings if needed.
- Add or modify agents (e.g., Payroll Bot, IT Support Bot) by cloning an existing pattern.
- Adjust the Switch node if you add more classifications.

With this modular and intelligent setup, you can turn WhatsApp into a smart HR and operations assistant powered by AI, accessible 24/7.
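For clarity, here is a hedged sketch of what the LLM classification step could emit before the Switch node. The field names are illustrative assumptions rather than fields fixed by the template; the Switch node only needs the numeric category:

```json
{
  "category": 1,
  "intent": "leave_request",
  "reason": "Employee requests two days of sick leave next week"
}
```

Routing the Switch node on `category` keeps the classifier prompt and the routing logic decoupled, so you can add a sixth intent by extending both in one place each.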
by Jimleuk
This n8n workflow builds an appointment-scheduling AI agent which can:

- Take enquiries from prospective customers and help them book an appointment by checking appointment availability.
- Send follow-up messages to re-engage leads where no appointment has been booked.
- Reschedule or even cancel a booking for the user, without human intervention, after an appointment is booked.

For small outfits, this workflow could contribute the "man-power" needed to increase business sales.

The sample Airtable can be found here: https://airtable.com/appO2nHiT9XPuGrjN/shroSFT2yjf87XAox

2024-10-22: Updated to Cal.com API v2.

## How it works

1. The customer sends an enquiry via SMS to trigger our workflow. For this trigger, we'll use a Twilio webhook.
2. The prospective or existing customer's number is logged in an Airtable base, which we'll use to track all our enquiries.
3. Next, the message is sent to our AI Agent, which can reply to the user and decide if an appointment booking can be made. The reply is sent via SMS using Twilio.
4. A scheduled trigger, which runs every day, checks our chat logs for a list of prospective customers who have yet to book an appointment but still show interest.
5. This list is sent to our AI Agent to formulate a personalised follow-up message for each lead, asking them if they want to continue with the booking.
6. The follow-up interaction is logged so as not to send too many messages to the customer.

## Requirements

- A Twilio account to receive customer messages.
- An Airtable account and base to use as our datastore for enquiries.
- A Cal.com account to use as our scheduling service.
- An OpenAI account for our AI model.

## Customising this workflow

- Not using Airtable? Swap it out for your CRM of choice, such as HubSpot, or your own service.
- Not using Cal.com? Swap it out for API-enabled services such as Acuity Scheduling or your own service.
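For reference, a minimal sketch of a Cal.com API v2 booking request the agent's scheduling tool could issue. The field names follow Cal.com's v2 documentation at the time of writing, but double-check them against the current spec; all values here are placeholders:

```json
{
  "method": "POST",
  "url": "https://api.cal.com/v2/bookings",
  "body": {
    "start": "2025-01-15T10:00:00Z",
    "eventTypeId": 123456,
    "attendee": {
      "name": "Jane Doe",
      "email": "jane@example.com",
      "timeZone": "Europe/London"
    }
  }
}
```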
by Geekaz / Kazen
## Who is this for?

This template is designed for social media managers, content creators, data analysts, and anyone who wants to automatically save and analyze their Meta Threads posts in Notion. It's particularly useful for:

- Building a personal archive of your Threads content.
- Training AI models on your social media data.
- Tracking your online presence and engagement.

## What this workflow does

This workflow uses the Meta Threads API to automatically retrieve your posts and import them into a Notion database. It retrieves the post content, date, and time, and stores them in designated properties within your Notion database.

## Setup

1. **Get a Threads Access Token and ID:** Obtain a long-lived access token and your Threads ID from the Meta Threads developer platform. This token auto-refreshes, ensuring uninterrupted workflow operation.
2. **Configure Credentials and Date Range:** In the "Set Credentials" node (using edit fields), enter your token and ID. Set the since and until parameters in the "Set Date Range" node to specify the post import period (see the request sketch below).
3. **Connect to Notion and Create a Database:** Connect to your Notion workspace and create a database with these properties (customize with the "Create Properties" node):
   - **Title:** Threads post URL (Notion entry title).
   - **Threads ID:** Unique post ID (prevents duplicate imports).
   - **Username:** Post author (for future multi-account/source management).
   - **Post Date:** Original post date.
   - **Source (Multi-Select):** "Threads" tag (for future multi-platform import and filtering).
   - **Created:** Import date and time.
   - **Import Check (Optional):** For use with a separate post-categorization workflow.
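As an illustration, here is the kind of request the retrieval step could make against the Threads API. The endpoint and `fields` follow Meta's published Threads API; the user ID, token, and dates are placeholders:

```json
{
  "method": "GET",
  "url": "https://graph.threads.net/v1.0/THREADS_USER_ID/threads",
  "query": {
    "fields": "id,text,timestamp,username,permalink",
    "since": "2025-01-01",
    "until": "2025-01-31",
    "access_token": "LONG_LIVED_ACCESS_TOKEN"
  }
}
```

The returned `permalink`, `id`, `username`, and `timestamp` map directly onto the Title, Threads ID, Username, and Post Date properties listed above.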
by Zacharia Kimotho
This workflow automates sentiment analysis of Reddit posts related to Apple's WWDC25 event. It extracts data, categorizes posts, analyzes the sentiment of comments, and updates a Google Sheet with the results.

## Prerequisites

- **Bright Data account:** You need a Bright Data account to scrape Reddit data. Ensure you have the correct permissions to use their API: https://brightdata.com/
- **Google Sheets API credentials:** Enable the Google Sheets API in your Google Cloud project and create credentials (OAuth 2.0 Client IDs).
- **Google Gemini API credentials:** You need a Gemini API key to run the sentiment analysis. Ensure you have the correct permissions to use their API: https://ai.google.dev/. You can use any other model of choice.

## Setup

1. **Import the workflow:** Import the provided JSON workflow into your n8n instance.
2. **Configure Bright Data credentials:** In the "scrap reddit" and "get status" nodes, find the Authorization field in Header Parameters and replace `Bearer 1234` with your Bright Data API key. Apply this to every node that uses your Bright Data API key.
3. **Set up the Google Sheets API credentials:** In the "Append Sentiments" node, set up the Google Sheets API by connecting your Google Sheets account through OAuth 2.0 credentials.
4. **Configure the Google Gemini credentials:** In the "Sentiment Analysis per comment" node, set up the Google Gemini API by connecting your Google AI account through the API credentials.
5. **Configure additional parameters:**
   - In the "scrap reddit" node, modify the JSON body to adjust the search term, date, or sort method (a hedged sketch of this request follows below).
   - In the "Wait" node, alter the "Amount" to adjust the polling interval for scraping status; it is set to 15 seconds by default.
   - In the "Text Classifier" node, customize the categories and descriptions to suit your sentiment analysis needs. Review categories such as "WWDC events" to ensure relevancy.
   - In the "Sentiment Analysis per comment" node, modify the system prompt template to improve context.

## Customization Options

- Bright Data API parameters to adjust scraping behavior.
- Wait node duration to optimize polling.
- Text Classifier categories and descriptions.
- Sentiment Analysis system prompt.

## Use Case Examples

- **Brand Monitoring:** Track public sentiment towards Apple during and after the WWDC25 event.
- **Product Feedback Analysis:** Gather insights into user reactions to new product announcements.
- **Competitive Analysis:** Compare sentiment towards Apple's announcements versus competitors.
- **Event Impact Assessment:** Measure the overall impact of the WWDC25 event on various aspects of Apple's business.

## Target Audiences

Marketing professionals in the tech industry, brand managers, product managers, market research analysts, and social media managers.

## Troubleshooting

- **Workflow fails to start:** Check that all necessary credentials (Bright Data and Google Sheets API) are correctly configured and that the Bright Data API key is valid.
- **Data scraping fails:** Verify the Bright Data API key, ensure the dataset ID is correct, and inspect the Bright Data dashboard for any issues with the scraping job.
- **Sentiment analysis is inaccurate:** Refine the categories and descriptions in the "Text Classifier" node. Check that you have the correct Google Gemini API key, as the original is a placeholder.
- **Google Sheets are not updating:** Ensure the Google Sheets API credentials have the necessary permissions to write to the specified spreadsheet and sheet. Check API usage limits.
- **Workflow does not produce the correct output:** Check the data connections by clicking the connections and looking at which data is being produced. Check all formulas for errors.

Happy productivity!
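To ground the "scrap reddit" step, here is a hedged sketch of a Bright Data dataset-trigger request. The endpoint shape matches Bright Data's Datasets API, but the dataset ID and the input field names (`keyword`, `date`, `sort_by`) depend on the specific Reddit dataset you use, so treat them as assumptions to be checked against your dataset's input schema:

```json
{
  "method": "POST",
  "url": "https://api.brightdata.com/datasets/v3/trigger",
  "query": { "dataset_id": "gd_XXXXXXXX" },
  "headers": { "Authorization": "Bearer YOUR_BRIGHT_DATA_API_KEY" },
  "body": [
    { "keyword": "WWDC25", "date": "2025-06-09", "sort_by": "top" }
  ]
}
```

The trigger call returns a snapshot ID, which the "get status" and "Wait" nodes poll until the scrape completes.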
by David Ashby
Complete MCP server exposing 1 Buy Marketing API operation to AI agents.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Add Buy Marketing API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the Buy Marketing API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com/buy/marketing/v1_beta
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (1 total)

🔧 Merchandised_Product (1 endpoint)
• GET /merchandised_product: Fetch Merchandised Products

## 🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Buy Marketing API responses with full data structure

Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

## ✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
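As a usage illustration, here is a hedged sketch of the query parameters an agent might supply for the single `GET /merchandised_product` operation, shown as a generic operation payload. The `category_id` and `metric_name=BEST_SELLING` parameters follow eBay's published documentation for this endpoint, but confirm them against the current spec:

```json
{
  "operation": "GET /merchandised_product",
  "query": {
    "category_id": "9355",
    "metric_name": "BEST_SELLING",
    "limit": 8
  }
}
```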
by Custom Workflows AI
## Introduction

The "Automatic Weekly Digital PR Stories Suggestions" workflow is a sophisticated automated system designed to identify trending news stories on Reddit, analyze public sentiment through comment analysis, extract key information from source articles, and generate strategic angles for potential digital PR campaigns. This workflow leverages the power of social media trends, natural language processing, and AI-driven analysis to deliver curated, sentiment-analyzed news opportunities for PR professionals.

Operating on a weekly schedule, the workflow searches Reddit for posts related to specified topics, filters them based on engagement metrics, and performs a deep analysis of both the content and the public reaction. It then generates comprehensive reports that include story opportunities, audience insights, and strategic recommendations. These reports are automatically compiled, stored in Google Drive, and shared with team members via Mattermost for immediate collaboration.

This workflow solves the time-consuming process of manually monitoring social media for trending stories, analyzing public sentiment, and identifying PR opportunities. By automating these tasks, PR professionals can focus on strategy development and execution rather than spending hours on research and analysis.

## Who is this for?

This workflow is designed for digital PR professionals, content marketers, communications teams, and media relations specialists who need to stay on top of trending stories and public sentiment to develop timely and effective PR campaigns. It's particularly valuable for:

- PR agencies managing multiple clients across different industries
- In-house PR teams needing to identify media opportunities quickly
- Content marketers looking for trending topics to create timely content
- Communications professionals monitoring public perception of industry news

Users should have basic familiarity with n8n workflows and the PR strategy development process. While technical knowledge of the integrated APIs is not required to use the workflow, some understanding of Reddit, sentiment analysis, and PR campaign development would be beneficial for interpreting and acting on the generated reports.

## What problem is this workflow solving?

Digital PR professionals face several challenges that this workflow addresses:

1. **Information Overload:** Manually monitoring social media platforms for trending stories is time-consuming and often results in missed opportunities.
2. **Sentiment Analysis Complexity:** Understanding public perception of news stories requires reading through hundreds of comments and identifying patterns, which is labor-intensive and subjective.
3. **Content Extraction:** Visiting multiple news sources to read and analyze articles takes significant time.
4. **Strategic Angle Development:** Identifying unique PR angles that leverage trending stories and public sentiment requires synthesizing large amounts of information.
5. **Team Collaboration:** Sharing findings and insights with team members in a structured format can be cumbersome.

By automating these processes, the workflow enables PR professionals to quickly identify trending stories with PR potential, understand public sentiment, and develop strategic angles based on comprehensive analysis, all while maintaining a structured approach to team collaboration.
## What this workflow does

### Overview

The workflow automatically identifies trending posts on Reddit related to specified topics, analyzes both the content of linked articles and public sentiment from comments, and generates comprehensive PR strategy reports. These reports include story opportunities, audience insights, and strategic recommendations based on the analysis. The final reports are compiled, stored in Google Drive, and shared with team members via Mattermost.

### Process

1. **Topic Selection and Reddit Search:**
   - The workflow starts with a list of topics specified in the "Set Data" node.
   - It searches Reddit for posts related to these topics.
   - Posts are filtered based on upvotes and other criteria to focus on trending content.
2. **Comment Analysis:**
   - For each post, the workflow retrieves comments.
   - It extracts the top 30 comments based on score.
   - Using Claude AI, it analyzes the comments to understand overall sentiment, dominant narratives, audience insights, and PR implications.
3. **Content Analysis:**
   - The workflow extracts the content of the linked article using Jina AI.
   - It analyzes the content to identify core story elements, technical aspects, narrative opportunities, and viral elements.
4. **PR Strategy Development:** Based on the combined analysis of comments and content, the workflow generates:
   - First-mover story opportunities
   - Trend-amplifier story ideas
   - Priority rankings
   - An execution roadmap
   - Strategic recommendations
5. **Report Generation and Distribution:**
   - The workflow compiles comprehensive reports for each post.
   - Reports are converted to text files.
   - All files are compressed into a ZIP archive.
   - The archive is uploaded to Google Drive.
   - A link to the archive is shared with team members via Mattermost (see the webhook payload sketch below).

## Setup

To set up this workflow, follow these steps:

1. **Import the Workflow:**
   - Download the workflow JSON file.
   - Import it into your n8n instance.
2. **Configure API Credentials:**
   - Reddit: Add a new "Reddit OAuth2 API" credential by following the guide at https://docs.n8n.io/integrations/builtin/credentials/reddit/
   - Anthropic: Add a new "Anthropic Account" credential by following the guide at https://docs.n8n.io/integrations/builtin/credentials/anthropic/
   - Google Drive: Add a new "Google Drive OAuth2 API" credential by following the guide at https://docs.n8n.io/integrations/builtin/credentials/google/oauth-single-service/
3. **Configure the "Set Data" Node:**
   - Set your topics of interest (one per line).
   - Add your Jina API key (obtain it from https://jina.ai/api-dashboard/key-manager).
4. **Configure the Mattermost Node:**
   - Update your Mattermost instance URL.
   - Set your Webhook ID and Channel.
   - Follow the guide at https://developers.mattermost.com/integrate/webhooks/incoming/ for webhook setup.
5. **Adjust the Schedule (Optional):**
   - The workflow is set to run every Monday at 6am.
   - Modify the "Schedule Trigger" node if you need a different schedule.
6. **Test the Workflow:**
   - Run the workflow manually to ensure all connections are working properly.
   - Check the output to verify the reports are being generated correctly.

## How to customize this workflow to your needs

This workflow can be customized in several ways to better suit your specific requirements:

1. **Topic Selection:**
   - Modify the topics in the "Set Data" node to focus on industries or subjects relevant to your PR strategy.
   - Add multiple topics to cover different client interests or market segments.
2. **Filtering Criteria:**
   - Adjust the "Upvotes Requirement Filtering" node to change the minimum upvotes threshold.
   - Modify the filtering conditions to include or exclude certain types of posts.
3. **Analysis Parameters:**
   - Customize the prompts in the "Comments Analysis," "News Analysis," and "Stories Report" nodes to focus on specific aspects of the content or comments.
   - Adjust the temperature settings in the Anthropic Chat Model nodes to control the creativity of the AI responses.
4. **Report Format:**
   - Modify the "Set Final Report" node to change the structure or content of the final reports.
   - Add or remove sections based on your specific reporting needs.
5. **Distribution Method:**
   - Replace or supplement the Mattermost notification with email notifications, Slack messages, or other communication channels.
   - Add additional storage options beyond Google Drive.
6. **Schedule Frequency:**
   - Change the "Schedule Trigger" node to run the workflow more or less frequently.
   - Set up multiple triggers for different topics or clients.
7. **Integration with Other Systems:**
   - Add nodes to integrate with your CRM, content management system, or project management tools.
   - Create connections to automatically populate content calendars or task management systems.