by Juan Carlos Cavero Gracia
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Description
This automation template is designed for content curators, marketers, and anyone looking to supercharge their content-sharing strategy. It transforms any web article, blog post, or news link into a series of platform-specific social media posts, generated by AI. It also captures a live screenshot of the webpage to use as the post image, automating the entire process of publishing across X (Twitter), LinkedIn, Threads, and Reddit.

*Note: The default example is configured to share n8n templates, but this workflow can promote any web page, article, or news story. Just change the URL! The upload-post node only works for self-hosted n8n instances, but you can use the standard HTTP Request node for uploading the content.*

## Who Is This For?
- **Content Curators & Marketers:** Effortlessly share valuable industry news and articles with tailored messages and visuals for each audience.
- **Social Media Managers:** Keep your social feeds consistently active with relevant, high-quality content without the manual overhead.
- **Community Builders & Brand Evangelists:** Quickly disseminate product updates, tutorials, and blog posts to your community on all relevant platforms.
- **Professionals & Thought Leaders:** Build your personal brand by easily sharing insightful articles with automated visuals, adding your unique perspective.

## What Problem Does This Workflow Solve?
Sharing a single piece of content across multiple social platforms is tedious: you need to manually write unique posts, create visuals, and then publish everything. This workflow addresses these challenges by:
- **Automating Content Creation:** Uses a powerful AI agent (Google Gemini) to read any URL and write compelling, unique posts for each social network.
- **Generating Visuals Automatically:** Captures a high-quality screenshot of the source webpage to use as a visually appealing image in your posts, increasing engagement.
- **Ensuring Platform-Specific Tone:** The AI is instructed to generate professional posts for LinkedIn, concise threads for X, conversational updates for Threads, and community-focused posts for Reddit.
- **One-Click Distribution:** Takes a single URL as input and handles the entire content creation and sharing process across multiple platforms automatically.

## How It Works
1. **Input a URL:** In the "Set Input Data" node, simply paste the URL of the article or page you want to share.
2. **AI Analysis & Generation:** The workflow sends the URL to the AI agent, which scrapes the content and generates four distinct, ready-to-publish posts.
3. **Screenshot Generation:** In parallel, it uses the ScreenshotOne service to capture a high-quality image of the provided URL.
4. **Cross-Platform Publishing:** The generated content and the screenshot are automatically sent to the corresponding nodes to be posted on X, LinkedIn, and Threads, while the text-only version is sent to Reddit.

## Setup
1. **AI Model Credentials:** Add your Google Gemini API key to the Google Gemini Chat Model node to power the AI agent.
2. **Screenshot Service (ScreenshotOne):** The workflow uses ScreenshotOne to generate images for your posts. Create a free account at screenshotone.com to get your own API key; the free plan includes 100 screenshots per month. In the Upload Post X, Upload Post LinkedIn, and Upload Post Threads nodes, go to the Photos parameter (under Additional Fields) and replace the existing access_key in the URL with your own (see the URL sketch at the end of this section).
3. **Upload-Post Account:** This workflow uses upload-post.com for multi-platform posting. Create a free account at upload-post.com to get your API Token and User ID, then add the credentials in the Upload Post X, Upload Post LinkedIn, and Upload Post Threads nodes.
4. **Reddit Credentials:** Connect your Reddit account using OAuth2 in the Reddit node to enable posting.
5. **Customize the AI (optional):** Edit the prompt in the Social Media Agent node to match your content. The default prompt is optimized for sharing n8n templates, but you can easily adapt it for any topic to fit your brand's voice and style.

## Requirements
- **Accounts:** n8n, Google (for Gemini API), ScreenshotOne, upload-post.com, Reddit.
- **API Keys & Credentials:** Google Gemini API Key, ScreenshotOne API Key, Upload-post.com API Token & User ID, Reddit OAuth2 credentials.

Use this template to become a content-sharing powerhouse, saving hours of work while increasing your reach and engagement across the web.
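As referenced in the setup steps, the Photos parameter holds a ScreenshotOne capture URL. Here is a minimal sketch of how that URL is typically assembled, assuming ScreenshotOne's public `take` endpoint and common query options (verify parameter names and defaults against your ScreenshotOne dashboard):

```typescript
// Minimal sketch: build a ScreenshotOne capture URL for the Photos parameter.
// Assumes the public `take` endpoint and common query options; verify them
// against your ScreenshotOne dashboard before relying on this.
const buildScreenshotUrl = (targetUrl: string, accessKey: string): string => {
  const params = new URLSearchParams({
    access_key: accessKey, // your own key goes here
    url: targetUrl,        // the page being shared
    viewport_width: "1280",
    viewport_height: "720",
    format: "png",
  });
  return `https://api.screenshotone.com/take?${params.toString()}`;
};

// Example: paste the result into the Photos field of the upload nodes.
console.log(buildScreenshotUrl("https://n8n.io/workflows/", "YOUR_ACCESS_KEY"));
```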
by Yar Malik (Asfandyar)
## How it works
- **Trigger:** Listens for an incoming chat message.
- **Copy Assistant:** Feeds the message (plus memory) into an OpenAI Chat Model and exposes two "tools":
  - Cold Email Writer Tool
  - Sales Letter Tool
- **Tool execution:** Depending on the user's intent, the appropriate tool generates the copy (see the routing sketch at the end of this section).
- **Save output:** Writes the generated email or sales letter into your target document via the Update a document node.

## Set up steps
- Configure your OpenAI Chat Model credentials in n8n (no hard-coded keys!).
- Add and authenticate the Simple Memory credential (to keep context across messages).
- Create Google Docs (or MS Word) credentials for the Update a document node.
- Ensure your Chat trigger is pointing at your incoming-message endpoint.
- **Mandatory:** Drop sticky-note annotations on each tool node explaining where to enter API keys and how to tweak prompts.

Once everything's wired up, send a test chat message like "Write me a cold email for a fintech startup" and watch the workflow spin up a polished draft in your document.

## How to use
1. Import the workflow JSON into n8n.
2. Configure your Chat trigger (webhook or form) to receive incoming messages.
3. Send a chat prompt like: "Write me a cold email for a B2B SaaS offering."
4. The "Copy Assistant" custom GPT picks the right tool (Cold Email or Sales Letter).
5. Generated copy is written directly into your linked Google Doc or Word document.

## Requirements
- OpenAI API Key (with Chat Completions & Custom GPTs enabled)
- Custom Assistant created in your ChatGPT dashboard (Assistant ID pasted into the Chat Model node)
- n8n instance (Cloud or self-hosted) with credentials set up for:
  - Simple Memory (to persist context)
  - Google Docs or Microsoft Word (for document output)

## Customising this workflow
- Tweak system and user prompts inside the Copy Assistant node to fit your brand voice.
- Swap in Slack, Teams, or email nodes instead of a document writer to deliver copy where you need it.
- Add or remove tools (e.g., a "Follow-up Email Writer") by duplicating the existing tool pattern.
- Use sticky-note annotations on every node to explain where to enter API keys, Assistant IDs, or prompt tweaks.
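To make the tool-execution step concrete, here is a simplified, hypothetical sketch of the decision the assistant makes. In the real workflow the OpenAI model performs this routing from the conversation itself; the keyword matching below is only an illustration, not the template's mechanism:

```typescript
// Hypothetical sketch of the agent's choice between its two tools.
// The real routing is done by the LLM, not by keyword matching.
type CopyTool = "cold_email_writer" | "sales_letter_writer";

function pickTool(userMessage: string): CopyTool {
  const text = userMessage.toLowerCase();
  // "Write me a cold email for a fintech startup" -> cold_email_writer
  if (text.includes("cold email") || text.includes("outreach")) {
    return "cold_email_writer";
  }
  // Longer-form persuasion requests fall through to the letter tool.
  return "sales_letter_writer";
}

console.log(pickTool("Write me a cold email for a B2B SaaS offering"));
// -> "cold_email_writer"
```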
by Praveena
## Idea
The idea for this app came about because I wanted to build a unique gift for my niece, who gets excited about her birthday (which I'm going to miss this year). The web app has a simple countdown (in HTML and JS), but more importantly there is an AI agent that will answer some specific questions and knows her preferences.

## How it works
Questions from the app are sent via webhook to n8n, which pulls a preferences file (about her likes, dislikes, and personality) from Postgres and runs an AI Agent that answers and responds (see the webhook sketch at the end of this section). The current status is stored back in Postgres (especially the status of the cat and universe happenings) before responding.

## Features
- Integrated AI chatbot via n8n webhook
- Persistent conversation history
- Minimizable chat interface
- Fallback support for offline testing
- "Where's Mittens?": a query to track her lost cat across the multiverse
- Multiverse updates, with the most recent update stored

## Prerequisites
- A PostgreSQL database. Alternatively, use any other database, but change the n8n nodes accordingly.
- An LLM API key.

## Step by Step Instructions
1. Export this n8n workflow.
2. Add your LLM API key; I used OpenAI GPT-4.1.
3. For the web app scaffolding, you will need Node, HTML, and JavaScript. I've created a mini version using Node and JS, with the web app and n8n connection settings, here: <https://github.com/productiser/FiBirthdayAgent>
4. Run the PostgreSQL database script (one table for memory and context storage):

```sql
CREATE TABLE fifi_world_context (
  id TEXT PRIMARY KEY,                              -- e.g., 'agent_fifi'
  cat_location TEXT,                                -- e.g., "Bubble Nebula"
  cat_activity TEXT,                                -- e.g., "Playing laser tag with moon mice"
  fifi_preferences JSONB,                           -- e.g., likes/dislikes/foods/shows
  world_history TEXT,                               -- Summary of narrative events
  last_updated TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```

5. Modify the system prompt as per your needs.

## Built With
- n8n (self-hosted)
- Self-hosted web app, hosted on Vercel
- Total spend: <£1 (AI costs only)
- Total time: <1 day

## Support
Watch this video for a web app overview and how it looks: <https://youtu.be/e7PlrTdvwoM>

Contact me at info@pankstr.com or superllmuser@gmail.com for any queries. Hope you enjoy!!
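A minimal sketch of the web-app side of the webhook call, assuming a hypothetical webhook path and a simple `{ question }` / `{ reply }` JSON contract; match both to your actual Webhook and Respond to Webhook node settings:

```typescript
// Sketch of the web app calling the n8n webhook. The URL path and the
// { question } / { reply } JSON shape are assumptions; align them with
// your Webhook and Respond to Webhook node configuration.
async function askFifiAgent(question: string): Promise<string> {
  const res = await fetch("https://your-n8n-host/webhook/fifi-agent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ question }),
  });
  if (!res.ok) throw new Error(`Agent call failed: ${res.status}`);
  const data = (await res.json()) as { reply: string };
  return data.reply;
}

askFifiAgent("Where's Mittens?").then(console.log);
```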
by Pavel Duchovny
## Who is this for?
This workflow is designed for:
- Database administrators and developers working with MongoDB
- Content managers handling movie databases
- Organizations looking to implement AI-powered search and recommendation systems
- Developers interested in combining LangChain, OpenAI, and MongoDB capabilities

## What problem does this workflow solve?
Traditional database queries can be complex and require specific MongoDB syntax knowledge. This workflow addresses:
- The complexity of writing MongoDB aggregation pipelines
- The need for natural language interaction with movie databases
- The challenge of maintaining user preferences and favorites
- The gap between AI language models and database operations

## What this workflow does
This workflow creates an intelligent agent that:
- Accepts natural language queries about movies
- Translates user requests into MongoDB aggregation pipelines (see the sample pipeline at the end of this section)
- Queries a movie database containing detailed information, including:
  - Plot summaries
  - Genre classifications
  - Cast and director information
  - Runtime and release dates
  - Ratings and awards
- Provides contextual responses using OpenAI's language model
- Allows users to save favorite movies to the database
- Maintains conversation context using a window buffer memory

## Setup
1. **Required Credentials:**
   - OpenAI API credentials
   - MongoDB connection details
2. **Node Configuration:**
   - Configure the MongoDB connection in the MongoDBAggregate node
   - Set up the OpenAI Chat Model with your API key
   - Ensure the webhook trigger is properly configured for receiving chat messages
3. **Database Requirements:**
   - A MongoDB collection named "movies" with the specified document structure
   - Proper indexes for efficient querying
   - Appropriate user permissions for read/write operations

## How to customize this workflow
- **Modify the Document Structure:**
  - Update the tool description in the MongoDBAggregate node to match your collection schema
  - Adjust the aggregation pipeline templates for your specific use case
- **Enhance the AI Agent:**
  - Customize the prompt in the "AI Agent - Movie Recommendation" node
  - Modify the window buffer memory size based on your context needs
  - Add additional tools for more functionality
- **Extend Functionality:**
  - Add more MongoDB operations beyond aggregation
  - Implement additional workflows for different types of queries
  - Create custom error handling and validation
  - Add user authentication and rate limiting
- **Integration Options:**
  - Connect to external APIs for additional movie data
  - Add webhook endpoints for different platforms
  - Implement caching mechanisms for frequent queries
  - Add data transformation nodes for specific output formats

This workflow serves as a foundation that can be adapted to various use cases beyond movie recommendations, such as e-commerce product search, content management systems, or any scenario requiring intelligent database interaction.
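To make the translation step concrete, here is the kind of aggregation pipeline the agent might produce for a request like "top 5 sci-fi movies of the 1990s by rating". This is a sketch against the widely used sample movies schema; field names such as `genres`, `released`, and `imdb.rating` are assumptions to check against your own collection:

```typescript
import { MongoClient } from "mongodb";

// Sketch: the kind of pipeline the agent generates from natural language.
async function topNinetiesSciFi(uri: string) {
  const client = new MongoClient(uri);
  try {
    const movies = client.db("sample_mflix").collection("movies");
    return await movies
      .aggregate([
        // Filter to the genre and decade the user asked about.
        { $match: {
            genres: "Sci-Fi",
            released: { $gte: new Date("1990-01-01"), $lt: new Date("2000-01-01") },
        } },
        // Rank by rating and keep the top five.
        { $sort: { "imdb.rating": -1 } },
        { $limit: 5 },
        { $project: { _id: 0, title: 1, "imdb.rating": 1, plot: 1 } },
      ])
      .toArray();
  } finally {
    await client.close();
  }
}
```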
by Yaron Been
Transform raw customer feedback into powerful testimonial quotes automatically. This intelligent n8n workflow monitors feedback forms, uses AI to identify and extract the most emotionally engaging testimonial content, and organizes everything into a searchable database for your marketing campaigns.

## 🔄 How It Works
This streamlined 4-step automation turns feedback into marketing assets:

**Step 1: Continuous Feedback Monitoring**
The workflow monitors your Google Sheets (connected to feedback forms) every minute, instantly detecting new customer submissions and triggering the extraction process.

**Step 2: Intelligent Quote Extraction**
Google Gemini AI analyzes each feedback submission using specialized prompts (see the extraction sketch after the examples below) designed to:
- Identify emotionally engaging phrases and statements
- Extract short, impactful testimonial quotes from longer feedback
- Filter out neutral, irrelevant, or negative content
- Focus on marketing-ready, quotable customer experiences
- Preserve the authentic voice and emotion of the original feedback

**Step 3: Automated Database Population**
Extracted testimonials are automatically written back to your Google Sheets in a dedicated "Testimony" column, creating an organized, searchable database of customer quotes ready for marketing use.

**Step 4: Instant Team Notification**
Email alerts are sent immediately to your marketing team with each new extracted testimonial, ensuring no valuable social proof goes unnoticed or unused.

## ⚙️ Setup Steps

### Prerequisites
- Google Workspace account for Forms, Sheets, and Gmail
- Google Gemini API access for intelligent quote extraction
- n8n instance (cloud or self-hosted)
- Basic understanding of Google Forms and customer feedback collection

### Required Google Forms Structure
Create a customer feedback form with these essential fields:

📝 **Required Form Fields:**
- Name (short answer text)
- Email Address (email field with validation)
- Feedback (paragraph text; this is where testimonials are extracted from)
- Testimony (leave blank; will be auto-populated by AI)

**Form Design Best Practices:**
- Use open-ended questions to encourage detailed responses
- Ask specific questions about customer experience and outcomes
- Include questions about before/after results for powerful testimonials
- Make the feedback field prominent and easy to complete

### Configuration Steps

**1. Credential Setup**
- **Google Sheets OAuth2**: Monitor feedback responses and update the testimonial database
- **Google Gemini API Key**: Extract intelligent, emotionally engaging quotes from feedback
- **Gmail OAuth2**: Send automated notifications to the marketing team
- **Google Forms Integration**: Ensure seamless data flow from feedback forms

**2. Google Sheets Configuration**
Verify your feedback response sheet contains the proper column structure:

| Timestamp | Name | Email | Feedback | Testimony |
|-----------|------|-------|----------|-----------|

**3. AI Extraction Optimization**
The default prompt extracts impactful testimonials, but can be customized for:
- **Industry-Specific Language**: Healthcare, technology, finance, retail terminology
- **Quote Length Preferences**: Short punchy quotes vs. longer detailed testimonials
- **Emotional Tone Targeting**: Excitement, relief, satisfaction, transformation
- **Content Focus**: Results-oriented, process-focused, or relationship-based testimonials
**4. Notification Customization**
Email alerts can be configured for:
- **Multiple Recipients**: Marketing team, sales team, customer success
- **Custom Subject Lines**: Include customer name, product type, or urgency indicators
- **Rich Content**: Include full feedback alongside the extracted testimonial
- **Categorization**: Different alerts for different product lines or service types

**5. Quality Control Implementation**
- **Extraction Confidence**: Set minimum quality thresholds for extracted quotes
- **Manual Review Process**: Flag testimonials for human review before publication
- **Approval Workflows**: Add approval steps for high-value or sensitive testimonials
- **Version Control**: Track original feedback alongside extracted quotes

## 🚀 Use Cases

### E-commerce & Retail
- **Product Reviews**: Extract compelling quotes from detailed product feedback
- **Customer Success Stories**: Identify transformation narratives from user experiences
- **Social Proof Collection**: Build testimonial libraries for product pages and ads
- **Review Mining**: Turn long reviews into short, shareable testimonial quotes

### SaaS & Technology Companies
- **User Experience Feedback**: Extract quotes about software usability and impact
- **ROI Testimonials**: Identify statements about business results and efficiency gains
- **Feature Feedback**: Capture specific praise for product capabilities and benefits
- **Customer Success Metrics**: Extract quantifiable results and outcome statements

### Professional Services
- **Client Success Stories**: Transform project feedback into powerful case study quotes
- **Service Quality Testimonials**: Extract praise for expertise, communication, and results
- **Consulting Impact**: Identify statements about business transformation and growth
- **Relationship Testimonials**: Capture quotes about trust, partnership, and collaboration

### Healthcare & Wellness
- **Patient Experience**: Extract quotes about care quality and health outcomes
- **Treatment Success**: Identify statements about symptom improvement and recovery
- **Provider Relationships**: Capture testimonials about bedside manner and communication
- **Wellness Journey**: Extract quotes about lifestyle changes and health transformations

### Education & Training
- **Student Success Stories**: Extract quotes about learning outcomes and career impact
- **Course Effectiveness**: Identify statements about skill development and knowledge gains
- **Instructor Praise**: Capture testimonials about teaching quality and support
- **Career Transformation**: Extract quotes about professional growth and opportunities

## 🔧 Advanced Customization Options

### Multi-Category Extraction
Enhance extraction with specialized processing:
- **Product-Specific**: Extract testimonials for different product lines separately
- **Service-Based**: Customize extraction for various service offerings
- **Demographic-Focused**: Tailor extraction for different customer segments
- **Journey-Stage**: Extract testimonials for awareness, consideration, and retention phases

### Quality Enhancement Features
Implement advanced quality control:
- **Sentiment Scoring**: Rate extracted testimonials for emotional impact
- **Authenticity Verification**: Cross-reference testimonials with customer records
- **Duplicate Detection**: Prevent similar testimonials from the same customer
- **Content Enrichment**: Add context and customer details to extracted quotes

### Marketing Integration Extensions
Connect to marketing and sales tools:
- **Social Media Publishing**: Auto-post testimonials to Facebook, LinkedIn, Twitter
- **Website Integration**: Push testimonials to website testimonial sections
- **Email Marketing**: Include fresh testimonials in newsletter campaigns
- **Sales Enablement**: Provide the sales team with relevant testimonials for prospects

### Analytics and Reporting
Generate insights from testimonial data:
- **Testimonial Performance**: Track which quotes generate the most engagement
- **Customer Satisfaction Trends**: Analyze testimonial sentiment over time
- **Product/Service Insights**: Identify the most praised features and benefits
- **Competitive Advantages**: Extract testimonials highlighting differentiators

## 📊 Extraction Examples

**Before (Raw Feedback):**
"I was really struggling with managing my team's projects and keeping track of all the deadlines. Everything was scattered across different tools and I was spending way too much time just trying to figure out what everyone was working on. Since we started using your project management software about 6 months ago, it's been a complete game changer. Now I can see everything at a glance, our team communication has improved dramatically, and we're actually finishing projects ahead of schedule. The reporting features are amazing too - I can finally show my boss concrete data about our team's productivity. I honestly don't know how we managed without it. The customer support team has been fantastic as well, always quick to help when we had questions during setup."

**After (AI Extracted Testimonial):**
"Complete game changer - now I can see everything at a glance, our team communication has improved dramatically, and we're actually finishing projects ahead of schedule."

**Healthcare Example**

**Before (Raw Feedback):**
"I had been dealing with chronic back pain for over 3 years and had tried everything - physical therapy, medication, different doctors. Nothing seemed to help long-term. When I found Dr. Martinez, I was honestly pretty skeptical because I'd been disappointed so many times before. But after our first consultation, I felt hopeful for the first time in years. She really listened to me and explained everything clearly. The treatment plan she developed was comprehensive but manageable. Within just 2 months, I was experiencing significant pain reduction, and now after 6 months, I'm practically pain-free. I can play with my kids again, sleep through the night, and even started hiking on weekends. Dr. Martinez didn't just treat my symptoms - she helped me get my life back."

**After (AI Extracted Testimonial):**
"Within just 2 months, I was experiencing significant pain reduction, and now I'm practically pain-free. Dr. Martinez didn't just treat my symptoms - she helped me get my life back."
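Under the hood, the Gemini call behind extractions like these can be approximated as follows. This is a sketch assuming the official `@google/generative-ai` SDK and a Gemini 1.5 model; the prompt wording is illustrative, not the template's exact default prompt:

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Sketch of the extraction step, standalone for clarity.
async function extractTestimonial(feedback: string, apiKey: string) {
  const genAI = new GoogleGenerativeAI(apiKey);
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

  const prompt =
    "From the customer feedback below, extract one short, emotionally " +
    "engaging, marketing-ready quote in the customer's own words. " +
    "If nothing is quotable or the feedback is negative, reply NONE.\n\n" +
    feedback;

  const result = await model.generateContent(prompt);
  const quote = result.response.text().trim();
  return quote === "NONE" ? null : quote; // null -> leave the Testimony cell empty
}
```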
## 🛠️ Troubleshooting & Best Practices

### Common Issues & Solutions

**Low-Quality Extractions**
- **Improve Feedback Questions**: Ask more specific, outcome-focused questions
- **Refine AI Prompts**: Adjust extraction criteria for better quote selection
- **Set Minimum Length**: Ensure feedback has sufficient content for meaningful extraction
- **Quality Scoring**: Implement a rating system for extracted testimonials

**Insufficient Feedback Volume**
- **Multiple Feedback Channels**: Collect testimonials through various touchpoints
- **Incentivized Feedback**: Offer small rewards for detailed feedback submissions
- **Follow-up Automation**: Send feedback requests to satisfied customers
- **Timing Optimization**: Request feedback at optimal moments in the customer journey

**Privacy and Consent Issues**
- **Permission Management**: Ensure customers consent to testimonial use
- **Attribution Control**: Allow customers to specify how they want to be credited
- **Approval Workflows**: Implement customer approval before publishing testimonials
- **Data Protection**: Maintain compliance with privacy regulations

### Optimization Strategies

**Extraction Quality Enhancement**
- **Prompt Engineering**: Continuously refine AI prompts based on output quality
- **A/B Test Extractions**: Test different extraction approaches for effectiveness
- **Human Review Integration**: Combine AI extraction with human editorial oversight
- **Context Preservation**: Maintain customer context alongside extracted quotes

**Marketing Integration**
- **Campaign Alignment**: Extract testimonials that support specific marketing campaigns
- **Audience Segmentation**: Categorize testimonials for different target audiences
- **Channel Optimization**: Format testimonials for specific marketing channels
- **Performance Tracking**: Monitor which testimonials drive the best marketing results

**Process Automation**
- **Multi-Stage Processing**: Implement multiple extraction and refinement steps
- **Quality Gates**: Add checkpoints for testimonial quality and relevance
- **Workflow Branching**: Route different types of feedback to appropriate processes
- **Error Handling**: Implement fallbacks for failed extractions or poor-quality feedback

## 📈 Success Metrics

**Extraction Efficiency**
- **Processing Speed**: Reduce time from feedback submission to usable testimonial
- **Success Rate**: Percentage of feedback submissions yielding quality testimonials
- **Quote Quality**: Average rating of extracted testimonials by the marketing team
- **Volume Increase**: Growth in testimonial collection and database size

**Marketing Impact**
- **Testimonial Usage**: Frequency of extracted testimonials in marketing campaigns
- **Conversion Rates**: Impact of AI-extracted testimonials on sales metrics
- **Social Proof Effectiveness**: Engagement rates on testimonial-based content
- **Customer Acquisition**: Attribution of new customers to testimonial-driven campaigns

## 📞 Questions & Support
Need help implementing your AI Testimonial Extractor Agent?
**📧 Specialized Technical Support**
- **Email**: Yaron@nofluff.online
- **Response Time**: Within 24 hours on business days
- **Expertise**: AI testimonial extraction, feedback form optimization, marketing automation

**🎥 Comprehensive Learning Library**
- **YouTube Channel**: https://www.youtube.com/@YaronBeen/videos
- Complete setup guides for feedback form design and AI extraction
- Advanced prompt engineering techniques for testimonial quality
- Integration tutorials for marketing platforms and social media
- Best practices for customer feedback collection and testimonial usage
- Troubleshooting common extraction and quality issues

**🤝 Professional Marketing Community**
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- Connect for ongoing testimonial marketing automation support
- Share your customer success story automation achievements
- Access exclusive templates for feedback forms and testimonial campaigns
- Join discussions about social proof marketing and customer experience automation

**💬 Support Request Guidelines**
Include in your support message:
- Your industry and typical customer feedback patterns
- Current testimonial collection process and challenges
- Specific marketing channels where testimonials will be used
- Volume expectations and quality requirements
- Integration needs with existing marketing tools

Ready to turn every piece of customer feedback into marketing gold? Deploy this AI Testimonial Extractor Agent and build a powerful testimonial database that drives sales and builds trust with prospects automatically!
by Jaruphat J.
## Who is this for?
This workflow is ideal for businesses, accountants, and finance teams who receive bank slip images via LINE and want to automate the extraction of transaction details. It eliminates manual data entry and speeds up financial tracking.

## What problem does this workflow solve?
Many businesses receive bank transfer slips via LINE from customers, but manually recording transaction details into spreadsheets is time-consuming and error-prone. This workflow automates the entire process, extracting structured data from the bank slips and storing it in Google Sheets for seamless record-keeping.

## What this workflow does
- Receives bank slip images from a LINE BOT
- Extracts transaction details (sender, receiver, amount, transaction ID) using SpaceOCR
- Automatically logs extracted data into Google Sheets
- Works with standard bank slips & PromptPay transactions
- Eliminates manual data entry and reduces errors

## Setup Instructions

### 1. Prerequisites
- A LINE BOT with Messaging API enabled
- A SpaceOCR API Key (get one from https://spaceocr.com/)
- A Google Sheets account to store extracted data
- An n8n instance running (Cloud or self-hosted)

### 2. Setup Google Sheets
Create a Google Sheet with the following structure:

| A (Date) | B (Time) | C (Sender) | D (Receiver) | E (Bank Name) | F (Amount) | G (Transaction ID) |
|----------|----------|------------|--------------|---------------|------------|--------------------|

Ensure your Google Sheets API is enabled and connected to n8n. For an example of the required format, check this Google Sheets template: Google Sheets Template

### 3. Configure n8n Workflow

#### 1. Webhook Node (Receives bank slip from LINE BOT)
- Set the method (LINE delivers webhooks as POST requests)
- Set the path for your webhook URL

#### 2. HTTP Request (Download Image from LINE Message)
- Retrieves the image referenced in the LINE message payload (see the download sketch at the end of this section)

#### 3. SpaceOCR Node (Extract Text from Bank Slip)
- Input: the downloaded slip image
- API Key: your SpaceOCR key

#### 4. Google Sheets Node (Save Transaction Data)
- Select your Google Sheet
- Map extracted data (sender, receiver, amount, etc.) to the respective columns

### 4. Deploy & Test
1. Activate the workflow in n8n
2. Set the webhook URL in the LINE Developer Console
3. Send a test bank slip image to the LINE BOT
4. Check Google Sheets for the extracted transaction data
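For reference, here is a sketch of what the image-download step does, using the LINE Messaging API content endpoint. The channel access token comes from your LINE bot settings and the message ID from the webhook payload; the function name is illustrative:

```typescript
// Sketch of step 2: fetch the slip image from the LINE content endpoint.
async function downloadSlipImage(
  messageId: string,
  channelAccessToken: string,
): Promise<Buffer> {
  const res = await fetch(
    `https://api-data.line.me/v2/bot/message/${messageId}/content`,
    { headers: { Authorization: `Bearer ${channelAccessToken}` } },
  );
  if (!res.ok) throw new Error(`LINE content fetch failed: ${res.status}`);
  return Buffer.from(await res.arrayBuffer()); // binary image for the OCR step
}
```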
by Yang
## Who is this for?
This template is designed for content creators, marketing teams, educators, or media managers who want to repurpose video content into written blog posts with visuals. It's ideal for anyone looking to automate the process of transforming YouTube videos into professional blog articles and custom images.

## What problem is this workflow solving?
Creating written content from video material is time-consuming and manual. This workflow solves that by automating the entire pipeline: from detecting new YouTube video uploads to transcribing the audio, turning it into an engaging blog post, generating a matching visual, and saving both in Airtable. It saves hours of work while keeping your blog or social feed active and consistent.

## What this workflow does
This automation listens for new YouTube videos added to a Google Drive folder, extracts the full transcript using Dumpling AI, and sends it to GPT-4o to generate a blog post and an image prompt. Dumpling AI then turns the prompt into a 16:9 visual. The blog and visual are saved into Airtable for easy publishing or curation.

## Setup
1. **Google Drive Trigger**
   - Create a folder in Google Drive and upload your YouTube videos there.
   - Link this folder in the "Watch Folder for New YouTube Videos" node.
   - Enable polling every minute, or adjust as needed.
2. **Download & Prepare the Video**
   - The video is downloaded and converted into base64 format by the next two nodes: Download Video File and Convert Downloaded Video to Base64 (see the sketch at the end of this section).
3. **Transcription with Dumpling AI**
   - The base64 video is sent to Dumpling AI's extract-video endpoint.
   - You must have a Dumpling AI account and an API key with access to this endpoint: Dumpling AI Docs
4. **Generate Blog Content with GPT-4o**
   - GPT-4o takes the transcript and generates a human-like blog post and a descriptive prompt for AI image generation.
   - Make sure your OpenAI credentials are configured.
5. **Generate the Visual**
   - The prompt is passed to Dumpling AI's generate-ai-image endpoint using model FLUX.1-pro. The result is a clean 1024x576 image.
6. **Save to Airtable**
   - Blog content is stored under the Content field in Airtable.
   - The image prompt is also added to the Attachments column as a visual reference.
   - Ensure the Airtable base and table are preconfigured with the correct field names.

## How to customize this workflow to your needs
- Change the GPT prompt to alter the tone or format of the blog post (e.g., add bullet points or SEO tags).
- Modify the Dumpling AI prompt to generate different image styles.
- Add a scheduler or webhook trigger to run at different intervals or through other integrations.
- Connect the output to Ghost, Notion, or your CMS using additional nodes.

## 🧠 Sticky Note Summary

**Part 1: Transcription & Blog Prompt**
- Watches a Google Drive folder for new video uploads.
- Downloads and encodes the video.
- Transcribes the full audio with Dumpling AI.
- GPT-4o writes a blog post and a descriptive image prompt.

**Part 2: Image Generation & Airtable Save**
- Dumpling AI generates a visual from the image prompt.
- Blog content is saved to Airtable.
- The image prompt is patched into the Attachments field in the same record.

✅ Use this if you want to automate repurposing YouTube videos into blog content with zero manual work.
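A sketch of what the "Convert Downloaded Video to Base64" step amounts to, written as an n8n Code node body (which runs as JavaScript). It assumes the downloaded file sits in a binary property named `data`; adjust the property name to match your Download Video File node:

```typescript
// n8n Code node sketch: read the binary video and emit its base64 string.
// Assumes the binary property from the previous node is called "data".
const buffer = await this.helpers.getBinaryDataBuffer(0, "data");
return [
  {
    json: {
      // base64 payload for Dumpling AI's extract-video endpoint
      videoBase64: buffer.toString("base64"),
    },
  },
];
```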
by Sleak
## Who is this template for?
This workflow template is designed for people seeking alerts when specific changes are made to any web page. Leveraging agentic AI, it analyzes the page every day and autonomously decides whether to send you an e-mail notification.

## Example use cases
- Track price changes on [competitor's website]. Notify me when the price drops below €50.
- Monitor new blog posts on [industry leader's website] and summarize key insights.
- Check [competitor's job page] for new job postings related to software development.
- Watch for new product launches on [e-commerce site] and send me a summary.
- Detect any changes in the terms and conditions of [specific website].
- Track customer reviews for [specific product] on [review site] and extract key themes.

## How it works
When you click 'Test workflow' in the editor, a new browser tab will open where you can fill in the details of your espionage assignment. Be as concise as possible when instructing the AI: instruct specifically and to the point (see the examples above). After submission, the flow starts by extracting both the relevant website URL and an optimized prompt. OpenAI's structured outputs feature is used here, followed by a Code node that parses the results for further use.

From there, the endless loop of daily checks begins (see the comparison sketch at the end of this section):
1. Initial scrape
2. 1-day delay
3. Second scrape
4. The AI agent decides whether or not to notify you
5. Back to step 1

You can cancel an espionage assignment at any time in the executions tab.

## Set up steps
1. Insert your OpenAI API key in the structured outputs node (the second one).
2. Create a Firecrawl account and connect your Firecrawl API key in both 'Scrape page' nodes.
3. Connect your OpenAI account in the AI agent's model node.
4. Connect your Gmail account in the AI agent's Gmail tool node.
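The daily decision the AI agent makes can be pictured as a comparison between yesterday's and today's scrape. In the workflow this judgment is made by the LLM against your instruction; the naive string comparison below only illustrates the loop's inputs and output:

```typescript
// Illustrative sketch of the daily check; the real decision is made by
// the LLM, not by literal string comparison.
interface CheckResult {
  changed: boolean;
  summary: string;
}

function compareSnapshots(yesterday: string, today: string): CheckResult {
  if (yesterday.trim() === today.trim()) {
    return { changed: false, summary: "No change detected; skip the e-mail." };
  }
  return {
    changed: true,
    summary:
      "Page content differs from yesterday; the agent now checks whether " +
      "the difference matches your instruction (e.g. price below €50) " +
      "before sending a Gmail notification.",
  };
}
```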
by ömer
# Generate and Publish AI Content to LinkedIn and X (Twitter) with n8n

## Overview
This n8n workflow automates the generation and publishing of AI-powered social media content across LinkedIn and X (formerly Twitter). By leveraging AI, this workflow helps social media managers, marketers, and content creators streamline their posting process.

## Who is this for?
- Social media managers
- Content creators
- Digital marketers
- Businesses looking to automate content generation

## Features
- **AI-powered content creation** tailored for LinkedIn and X (Twitter)
- **Automated publishing** to both platforms
- **Structured output parsing** to ensure consistency
- **OAuth2 authentication** for secure posting
- **Merge and confirmation steps** to track successful postings

## Setup Instructions

### Prerequisites
Before using this workflow, ensure you have:
- An n8n instance set up
- API credentials for:
  - Google Gemini AI (for content generation)
  - X Developer Account with OAuth2 authentication
  - LinkedIn Developer Account with OAuth2 authentication
- A form submission service integrated with n8n

## Workflow Breakdown

### 1. Trigger: Form Submission
- A user submits a form containing the post title.
- The form is secured with Basic Authentication.
- The submitted title is passed to the AI Agent.

### 2. AI Content Generation
The Google Gemini Chat Model processes the title and generates:
- LinkedIn post content
- Twitter (X) post content
- Hashtags
- A call-to-action (LinkedIn)
- A character limit check (Twitter)

### 3. Parsing AI Output
- A structured output parser converts the AI-generated content into JSON (see the schema sketch at the end of this section).
- Ensures correct formatting for LinkedIn and Twitter (X).

### 4. Publishing to Social Media
- **X (Twitter) Posting:** Extracts the Twitter post from the AI output and publishes it via an OAuth2-authenticated X (Twitter) account.
- **LinkedIn Posting:** Extracts the LinkedIn post from the AI output and publishes it via an OAuth2-authenticated LinkedIn account.

### 5. Merging Post Results
- Merges the response data from both LinkedIn and Twitter after publishing.

### 6. Confirmation Step
- Displays a final confirmation form once the posts are successfully published.

## Benefits
- **Save time** by automating content creation and publishing.
- **Ensure consistency** across platforms with structured AI-generated posts.
- **Secure authentication** using OAuth2 for LinkedIn and Twitter.
- **Increase engagement** with AI-optimized hashtags and CTAs.

This workflow enables seamless social media automation, helping professionals post engaging AI-powered content effortlessly. 🚀
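A sketch of the JSON shape the structured output parser might enforce. The field names here are illustrative assumptions; align them with the schema configured in your Structured Output Parser node:

```typescript
// Hypothetical output shape for the parsed AI content.
interface SocialPostOutput {
  linkedin: {
    post: string;          // long-form, professional tone
    hashtags: string[];
    callToAction: string;
  };
  twitter: {
    post: string;          // must fit the platform limit
    hashtags: string[];
  };
}

// A simple guard matching the workflow's character-limit check for X:
function fitsTwitterLimit(output: SocialPostOutput): boolean {
  const full = `${output.twitter.post} ${output.twitter.hashtags.join(" ")}`;
  return full.length <= 280;
}
```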
by Luke
Built this for a dedicated Slack outage-notifications channel; it works well on both desktop and mobile.

## This is for
- IT administrators & small MSPs looking to streamline M365 alerts from one or multiple mailboxes into a single or specific Slack channels
- IT admins who prefer ChatOps over management-by-email

## What does it do
- Scans for M365 outage alert emails (every 1 min)
- Checks if an alert impacts a specific user region (if the alert calls it out; countries have to be set manually)
- Summarizes the incident using OpenAI GPT-4o-mini (a cheap model; you can swap in a local Ollama model)
- Sends a Slack Block to your outage channel with the incident link (can be extended; see the payload sketch at the end of this section)
- Deletes the original alert email after successful delivery

## Credentials
- **Outlook:** Create an Outlook credential (OAuth2.0) pointing to the mailbox (regular or shared) where M365 service alerts are received
- **Slack:** Create a Slack bot credential with access to the Slack channel you want updates posted to
- **OpenAI:** Create an OpenAI credential that has access to the GPT-4o-mini model. It's recommended to use projects in OpenAI so that you can set a per-project budget and not impact other projects; review the OpenAI documentation for more info on managing projects in the API portal. Expect this to consume no more than 1-2 cents per month on average.

## Setup
1. Download & import the workflow.
2. Modify the first Outlook block (Check for 365 Service Alert) to use the Outlook credential.
3. Modify the OpenAI block's system prompt to call out the countries your users reside in, e.g. "Assume the organization has users primarily in the U.S. and Australia. If those regions are affected, state: 'Your users may have been affected.' Otherwise, add: 'No impact expected for your user base.'" (swap U.S. & Australia for your desired countries).
4. Modify the Slack block (Post outage to Slack) to specify the channel updates will be posted to.

## Sample Slack Output

## Workflow Diagram
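For orientation, here is a sketch of the kind of Slack Block Kit payload the "Post outage to Slack" step sends. The channel name, status line, and incident-link text are assumptions to adapt to your own setup:

```typescript
// Sketch of the Slack message body, using standard Block Kit JSON.
const outageBlocks = {
  channel: "#m365-outages", // your outage channel
  blocks: [
    {
      type: "header",
      text: { type: "plain_text", text: "⚠️ M365 Service Incident" },
    },
    {
      type: "section",
      text: {
        type: "mrkdwn",
        text:
          "*Exchange Online: delays in message delivery*\n" +
          "Your users may have been affected.\n" +
          "<https://admin.microsoft.com/#/servicehealth|View incident in Service Health>",
      },
    },
  ],
};
```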
by Aditya Sharma
## Description
This intelligent n8n automation streamlines the process of collecting, extracting, and scoring resumes sent to a Gmail inbox, making it an ideal solution for recruiters who regularly receive hundreds of applications. The workflow scans incoming emails with attachments, extracts relevant candidate information from resumes using AI, evaluates each candidate based on customizable criteria, and logs their scores alongside contact details in a connected Google Sheet.

## Who Is This For?
- **Recruiters & Hiring Managers:** Automate the resume screening process and save hours of manual work.
- **HR Teams at Startups & SMBs:** Quickly evaluate talent without needing large HR ops infrastructure.
- **Agencies & Talent Acquisition Firms:** Screen large volumes of resumes efficiently and with consistent criteria.
- **Solo Founders Hiring for Roles:** Use AI to help score and shortlist top candidates from email applications.

## What Problem Does This Workflow Solve?
Manually reviewing resumes is time-consuming, error-prone, and inconsistent. This workflow solves these challenges by:
- Automatically detecting and extracting resumes from Gmail attachments.
- Using OpenAI to intelligently extract candidate info from unstructured PDFs.
- Scoring resumes using customizable evaluation criteria (e.g., relevant experience, skills, education).
- Logging all candidate data (Name, Email, LinkedIn, Score) in a centralized, filterable Google Sheet.
- Enabling faster, fairer, and more efficient candidate screening.

## How It Works

### 1. Gmail Trigger
- Runs on a scheduled interval (e.g., every 6 or 24 hours).
- Scans a connected Gmail inbox (using OAuth credentials) for unread emails that contain PDF attachments.

### 2. Extract Attachments
- Downloads the attached resumes from matching emails.

### 3. Parse Resume Text
- Sends the PDF file to OpenAI's API (via GPT-4 or GPT-3.5 with file support, or via base64 plus a PDF-to-text tool).
- Prompts GPT with a structured format to extract fields like Name, Email, LinkedIn, Skills, and Education.

### 4. Score Resume
- Evaluates the resume against predefined scoring logic, using AI or logic inside the workflow (e.g., "has skill X = +10 points"; see the sketch at the end of this section).

### 5. Log to Google Sheets
Appends a new row in a connected Google Sheet, including:
- Candidate Name
- Email Address
- LinkedIn URL
- Resume Score

## Setup

### Accounts & API Keys
You'll need accounts and credentials for:
- **n8n** (hosted or self-hosted)
- **Google Cloud Platform** (for Gmail, Drive, and Sheets APIs)
- **OpenAI** (for GPT model access)

### Google Sheet
Make a Google Sheet and connect it via the Google Sheets node in n8n. Columns should include:
- Name
- Email
- LinkedIn
- Score

### Configuration
1. **Google Cloud:** Enable the Gmail API and Google Sheets API, set up OAuth 2.0 credentials in the Google Console, and connect the n8n Gmail, Drive, and Sheets nodes to these credentials.
2. **OpenAI:** Generate an API key. Use the HTTP Request node or the official OpenAI node to send prompt requests.
3. **n8n Workflow:** Add the Gmail Trigger, add extraction logic (e.g., filter PDFs), add the OpenAI prompt for resume parsing and scoring, and connect the structured output to a Google Sheets node.

## Requirements

### Accounts
- **n8n**
- **Google** (Gmail, Sheets, Drive, Cloud Console)
- **OpenAI**

### API Keys & Credentials
- OpenAI API Key
- Google Cloud OAuth Credentials
- Gmail access scopes (for reading attachments)
- Configured Google Sheet

### Costs
- OpenAI usage (after free tier)
- Google Cloud API usage (if exceeding free quota)
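The in-workflow scoring logic referenced in step 4 might look like the following, written as it could appear in an n8n Code node. The skill list and weights are illustrative assumptions; tune them to the role you are hiring for:

```typescript
// Sketch of rule-based resume scoring ("has skill X = +10 points").
interface Candidate {
  name: string;
  email: string;
  linkedin: string;
  skills: string[];
  yearsExperience: number;
}

const WEIGHTS: Record<string, number> = {
  typescript: 10,
  react: 8,
  "node.js": 8,
  sql: 5,
};

function scoreCandidate(c: Candidate): number {
  let score = 0;
  for (const skill of c.skills) {
    score += WEIGHTS[skill.toLowerCase()] ?? 0; // unknown skills score 0
  }
  score += Math.min(c.yearsExperience, 10) * 2;  // cap the experience bonus
  return score;
}
```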
by Pavel Duchovny
Building agentic AI workflows often requires multiple moving parts: memory management, document retrieval, vector similarity, and orchestration. Until now, these pieces had to be custom-wired. But with the new native n8n nodes for MongoDB Atlas, we reduce that overhead dramatically.

With just a few clicks you can:
- Store and recall long-term memory from MongoDB
- Query vector embeddings stored in Atlas Vector Search
- Use these results in your LLM chains and automation logic

In this example we present ingestion and AI Agent flows focused on travel planning. The points of interest that we want the agent to know about are ingested into the vector store, and the AI Agent uses the vector store tool to fetch relevant context about those points of interest when it needs to.

## Prerequisites
- MongoDB Atlas project and cluster
- A valid OpenAI API key for embeddings (can be another provider)
- A Gemini API key for the LLM (can be another provider)

## How it works
There are two main flows.

**Ingestion flow:** Gets a document from a webhook (see the payload sketch at the end of this section) and uses the MongoDB Atlas Vector Store node to embed the document title and description into the points_of_interest collection.
- Embeddings are stored in a field named embedding
- The embeddings used are OpenAI's, but any supported embedder will work

**Agent flow:** An AI Agent node with chat memory stored in MongoDB Atlas and a Vector Search node as a tool:
- **Chat Message Trigger:** Chatting with the AI Agent triggers the conversation store in the MongoDB Chat Memory node. When data is needed, like a location search or details, the agent calls the "Vector Search" tool.
- **Vector Search Tool:** Uses an Atlas Vector Search index created on the points_of_interest collection:

```json
// index name: "vector_index"
// If you change the embedding provider, make sure numDimensions matches the model.
{
  "fields": [
    {
      "type": "vector",
      "path": "embedding",
      "numDimensions": 1536,
      "similarity": "cosine"
    }
  ]
}
```

## Additional Resources
- MongoDB Atlas Vector Search
- n8n Atlas Vector Search docs
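A sketch of a webhook payload for the ingestion flow. The `title` and `description` fields mirror what the flow embeds; everything else (the extra fields, the webhook path) is an illustrative assumption to adapt to your own points_of_interest schema and n8n host:

```typescript
// Example point-of-interest document posted to the ingestion webhook.
const pointOfInterest = {
  title: "Sagrada Família",
  description:
    "Gaudí's unfinished basilica in Barcelona; book tower access in advance " +
    "and visit in late afternoon for the stained-glass light.",
  city: "Barcelona",
  tags: ["architecture", "landmark"],
};

// POST it to the ingestion webhook (URL is a placeholder for your n8n host).
await fetch("https://your-n8n-host/webhook/ingest-poi", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(pointOfInterest),
});
```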