by Michael Muenzer
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. It generates relevant keywords and questions from a customer profile, enriches the keyword data via Ahrefs, and stores everything in a Google Sheet. This is great for market and customer research: it captures search intent for a well-defined audience and delivers actionable data in a fraction of the time manual research takes.

How it works
- We define a customer profile in the 'Data' node (a sketch of what this might look like follows below)
- An OpenAI LLM derives relevant search intent as keywords and questions
- An SEO MCP server fetches keyword data from Ahrefs' free tooling
- The fetched data is stored in the Google Sheet

Set up steps
- Copy the Google Sheet template and add it to all Google Sheets nodes
- Make sure that n8n has read & write permissions for your Google Sheet
- Add your list of domains in the first column of the Google Sheet
- Add MCP credentials for seo-mcp
- Add OpenAI API credentials
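For illustration, here is a minimal sketch of what the customer profile in the 'Data' node might contain, expressed as an n8n Code node. Every field name is an assumption for this example, not the template's actual schema:

```javascript
// Hypothetical customer profile, returned as an n8n item.
// All field names are illustrative; mirror your own 'Data' node instead.
const profile = {
  industry: "B2B SaaS",
  audience: "heads of marketing at mid-size e-commerce companies",
  painPoints: ["low organic traffic", "unclear keyword strategy"],
  region: "US",
  language: "en",
};

return [{ json: profile }];
```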
by Jason Krol
Notion Weekly Journal AI Summary

This workflow runs on a weekly schedule, retrieves your Notion daily Journal pages from the past week, and aggregates them into a concise ChatGPT-generated summary. It saves that weekly summary back to Notion as a new Note and also posts it to a personal Discord channel. Additionally, it retrieves all of the Tasks you've completed in the past week and sends a quick total with a congratulatory message to a Discord channel as well.

Requirements/Setup:
- You need Notion set up with a Notes database
- If you want the Discord messages, set up a Discord webhook for your channel as well, or simply delete the Discord nodes
- One of the properties of the Notes db should be Type, with a value of Journal
- The contents of your daily Journal pages can be whatever you want. What works best for me is the format "What was a highlight of the day?", "What was a low point of the day?", and "What decisions did I delegate, delay, or dodge?"
- You should also create an additional Type for the Weekly summary page that gets created; in this case I simply used Weekly
- Automate this to run weekly on your day of choice. I tend to journal only on weekdays, so I've set mine up to run every Friday, retrieving the past week's Journal entries (a sketch of the underlying Notion query follows below)

Options:
- You don't have to use Discord; feel free to swap it out for Slack or remove it altogether
- You also don't need the Tasks summary in the bottom half; simply remove it if you don't want or need it
- You can easily reuse this workflow to aggregate the Weekly Summary notes it auto-generates into a Quarterly or even Yearly summary!
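The retrieval step boils down to a Notion database query filtered by Type and date. A minimal sketch of that query body, assuming Type is a Select property and filtering on the page's built-in created time (your setup may differ):

```javascript
// Illustrative Notion API query body for the past week's Journal pages.
// Assumes "Type" is a Select property; adjust to your Notes database.
const oneWeekAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString();

const queryBody = {
  filter: {
    and: [
      { property: "Type", select: { equals: "Journal" } },
      { timestamp: "created_time", created_time: { on_or_after: oneWeekAgo } },
    ],
  },
  sorts: [{ timestamp: "created_time", direction: "ascending" }],
};

return [{ json: queryBody }];
```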
by Anna Bui
🎯 Universal Meeting Transcript to LinkedIn Content

Automatically transform your meeting insights into engaging LinkedIn content with AI. Perfect for coaches, consultants, sales professionals, and content creators who want to share valuable insights from their meetings without the manual effort of content creation.

How it works
- A calendar trigger detects when your coaching session or meeting ends
- After the meeting completes, it sends you a form via email
- You provide the meeting transcript and specify post preferences
- AI analyzes the transcript using your personal brand guidelines (see the prompt sketch below)
- It generates professional LinkedIn content based on real insights
- It creates organized Google Docs with both the transcript and the final post
- It sends you links to review and publish your content

How to use
- Connect your Google Calendar and Gmail accounts
- Update the calendar filter to match your meeting types
- Customize the AI prompts with your brand voice and style
- Replace the email addresses with your own
- Test with a sample meeting transcript

Requirements
- Google Calendar (for meeting detection)
- Gmail (for form delivery and notifications)
- Google Drive & Docs (for content storage)
- LangChain AI nodes (for content generation)

Good to know
- AI processing may incur costs based on your LangChain provider
- Works with any meeting platform; just copy/paste transcripts
- Can be adapted to use webhooks from recording tools like Fireflies.ai
- Memory nodes store your brand guidelines for consistent output

Happy Content Creating!
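As a rough illustration of the AI step, here is how a Code node might assemble the prompt from the transcript, your post preferences, and stored brand guidelines. The variable names and guideline text are placeholders, not the template's actual prompt:

```javascript
// Illustrative prompt assembly for the LinkedIn-post generation step.
// The $json fields and brandGuidelines are placeholders for your own values.
const transcript = $json.transcript;      // pasted into the emailed form
const postAngle = $json.postPreference;   // e.g. "lessons learned"
const brandGuidelines =
  "Friendly, first person, no buzzwords, end with a question.";

const prompt = [
  "You are a LinkedIn ghostwriter. Follow these brand guidelines strictly:",
  brandGuidelines,
  `Write one LinkedIn post with this angle: ${postAngle}.`,
  "Base it only on insights from this meeting transcript:",
  transcript,
].join("\n\n");

return [{ json: { prompt } }];
```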
by vinci-king-01
How it works

This workflow automatically extracts data from invoice documents (PDFs and images) and processes them through a comprehensive validation and approval system.

Key Steps
- Multi-Input Triggers - Accepts invoices via email attachments or direct file uploads through a webhook.
- AI-Powered Extraction - Uses ScrapeGraphAI to extract structured data from invoice documents.
- Data Cleaning & Validation - Processes and validates extracted data against business rules (a sketch follows below).
- Approval Workflow - Routes invoices requiring approval through a multi-stage approval process.
- System Integration - Automatically sends validated invoices to your accounting system.

Set up steps

Setup time: 10-15 minutes
- Configure ScrapeGraphAI credentials - Add your ScrapeGraphAI API key for invoice data extraction.
- Set up Telegram connection - Connect your Telegram account for approval notifications.
- Configure email trigger - Set up an IMAP connection for processing emailed invoices.
- Customize validation rules - Adjust business rules, amount thresholds, and vendor lists.
- Set up accounting system integration - Configure the HTTP Request node with your accounting system's API endpoint.
- Test the workflow - Upload a sample invoice to verify the extraction and approval process.

Features
- **Multi-format support**: PDF, PNG, JPG, JPEG, TIFF, BMP
- **Intelligent validation**: Business rules, duplicate detection, amount thresholds
- **Approval automation**: Multi-stage approval workflow with role-based routing
- **Data quality scoring**: Confidence levels and completeness analysis
- **Audit trail**: Complete processing history and metadata tracking
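To make the validation step concrete, here is a hedged Code-node sketch of what enforcing business rules might look like. The threshold, vendor list, and field names are assumptions; the template's actual rules will differ:

```javascript
// Illustrative validation pass over extracted invoice data.
// APPROVAL_THRESHOLD, APPROVED_VENDORS, and field names are placeholders.
const APPROVAL_THRESHOLD = 1000; // amounts above this require approval
const APPROVED_VENDORS = ["Acme Corp", "Globex"];

const invoice = $json; // output of the AI extraction step
const issues = [];

if (!invoice.invoiceNumber) issues.push("missing invoice number");
if (!(invoice.total > 0)) issues.push("total is missing or non-positive");
if (!APPROVED_VENDORS.includes(invoice.vendor)) issues.push("unknown vendor");

return [{
  json: {
    ...invoice,
    valid: issues.length === 0,
    needsApproval: Number(invoice.total) > APPROVAL_THRESHOLD,
    issues,
  },
}];
```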
by vinci-king-01
How it works

Transform your business with intelligent deal monitoring and automated customer engagement! This AI-powered coupon aggregator continuously tracks competitor deals and creates personalized marketing campaigns that convert.

Key Steps
- 24/7 Deal Monitoring - Automatically scans competitor websites daily for the best deals and offers
- Smart Customer Segmentation - Uses AI to intelligently categorize and target your customer base
- Personalized Offer Generation - Creates tailored coupon campaigns based on customer behavior and preferences
- Automated Email Marketing - Sends targeted email campaigns with personalized deals to the right customers
- Performance Analytics - Tracks campaign performance and provides detailed insights and reports
- Daily Management Reports - Delivers comprehensive analytics to the management team every morning

Set up steps

Setup time: 10-15 minutes
- Configure competitor monitoring - Add the target websites and deal sources you want to track
- Set up customer database - Connect your customer data source for intelligent segmentation
- Configure email integration - Connect your email service provider for automated campaigns
- Customize deal criteria - Define what types of deals and offers to prioritize (a filtering sketch follows below)
- Set up analytics tracking - Configure Google Sheets or a database for performance monitoring
- Test automation flow - Run a test cycle to ensure all integrations work smoothly

Never miss a profitable deal opportunity - let AI handle the monitoring and targeting while you focus on growth!
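As one way to picture the "deal criteria" step, here is a minimal Code-node sketch that filters scraped offers by discount and category. The field names and thresholds are illustrative only:

```javascript
// Illustrative deal filter: keep only offers matching your criteria.
// MIN_DISCOUNT_PCT, CATEGORIES, and field names are placeholders.
const MIN_DISCOUNT_PCT = 20;
const CATEGORIES = ["electronics", "home"];

const deals = $input.all().map((item) => item.json);
const worthwhile = deals.filter(
  (d) => d.discountPct >= MIN_DISCOUNT_PCT && CATEGORIES.includes(d.category)
);

return worthwhile.map((d) => ({ json: d }));
```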
by vinci-king-01
How it works

Turn Amazon into your personal competitive intelligence goldmine! This AI-powered workflow automatically monitors Amazon markets 24/7, delivering deep competitor insights and pricing intelligence that would take you 10+ hours of manual research weekly.

Key Steps
- Daily Market Scan - Runs automatically at 6:00 AM UTC to capture fresh competitive data
- AI-Powered Analysis - Uses ScrapeGraphAI to intelligently extract pricing, product details, and market positioning
- Competitive Intelligence - Analyzes competitor strategies, pricing gaps, and market opportunities
- Keyword Goldmine - Identifies high-value keyword opportunities your competitors are missing
- Strategic Insights - Generates actionable recommendations for pricing and positioning
- Automated Reporting - Delivers comprehensive market reports directly to Google Docs

Set up steps

Setup time: 15-20 minutes
- Configure ScrapeGraphAI credentials - Add your ScrapeGraphAI API key for intelligent web scraping
- Set up Google Docs integration - Connect Google OAuth2 for automated report generation
- Customize the Amazon search URL - Target your specific product category or market niche
- Configure IP rotation - Set up proxy rotation if needed for large-scale monitoring
- Test with sample products - Start with a small product set to validate data accuracy
- Set competitive alerts - Define thresholds for price changes and market opportunities (a threshold-check sketch follows below)

Save 10+ hours weekly while staying ahead of your competition with real-time market intelligence!
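To illustrate what a competitive alert threshold might look like in practice, here is a small Code-node sketch flagging large price moves. The field names and the 10% threshold are assumptions for this example:

```javascript
// Illustrative alert check: flag products whose price moved beyond a threshold.
// The currentPrice/previousPrice fields and the 10% value are placeholders.
const ALERT_THRESHOLD = 0.1;

const products = $input.all().map((i) => i.json);
const alerts = products.filter((p) => {
  if (!p.previousPrice) return false; // nothing to compare against yet
  const change = Math.abs(p.currentPrice - p.previousPrice) / p.previousPrice;
  return change >= ALERT_THRESHOLD;
});

return alerts.map((p) => ({ json: { ...p, alert: "price moved >= 10%" } }));
```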
by Anna Bui
This n8n template automatically syncs website visitors identified by RB2B into your Attio CRM, creating comprehensive contact records and associated sales deals for immediate follow-up. Perfect for sales teams who want to capture every website visitor as a potential lead without manual data entry!

Good to know
- RB2B identifies anonymous website visitors and sends structured data via Slack notifications
- The workflow prevents duplicate contacts by checking email addresses before creating new records
- All RB2B leads are automatically tagged with source tracking for easy identification

How it works
- RB2B sends website visitor notifications to your designated Slack channel with visitor details
- The workflow extracts structured data from the Slack messages, including name, email, company, LinkedIn, and location (a parsing sketch follows below)
- It searches Attio CRM to check whether the person already exists, based on email address
- For new visitors, it creates a complete contact record with all available information
- For existing contacts, it updates their record and manages deal creation intelligently
- It automatically creates sales deals tagged as "RB2B Website Visitor" for proper lead tracking

How to use
- Configure RB2B to send visitor notifications to a dedicated Slack channel
- The Slack trigger can be replaced with other triggers, such as webhooks, if you prefer a different notification method
- Customize the deal naming conventions and stages to match your sales pipeline

Requirements
- RB2B account with Slack integration enabled
- Attio CRM account with API access
- Slack workspace with bot permissions for the designated RB2B channel

Customising this workflow
- Modify deal stages and values based on your sales process
- Add lead scoring based on company domain or visitor behavior patterns
- Integrate additional enrichment APIs to enhance contact data
- Set up automated email sequences or Slack notifications for high-value leads
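The extraction step depends on how RB2B formats its Slack messages. As a hedged sketch (the "Label: value" layout below is an assumption; adapt the pattern to your actual notifications), a Code node might parse them like this:

```javascript
// Illustrative parser for an RB2B Slack notification.
// Assumes a "Label: value" line format, which may not match your messages.
const text = $json.text || "";

const grab = (label) => {
  const m = text.match(new RegExp(`${label}:\\s*(.+)`, "i"));
  return m ? m[1].trim() : null;
};

return [{
  json: {
    name: grab("Name"),
    email: grab("Email"),
    company: grab("Company"),
    linkedin: grab("LinkedIn"),
    location: grab("Location"),
    source: "RB2B Website Visitor",
  },
}];
```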
by Paul
🚀 Google Search Console MCP Server

📋 Description
This n8n workflow serves as a Model Context Protocol (MCP) server, connecting MCP-compatible AI tools (like Claude) directly to the Google Search Console APIs. With this workflow, users can automate critical SEO tasks and manage Google Search Console data effortlessly via MCP endpoints.

Included Functionalities:
📌 List Verified Sites
📌 Retrieve Detailed Site Information
📌 Access Search Analytics Data
📌 Submit and Manage Sitemaps
📌 Request URL Indexing

OAuth2 is fully supported for secure and seamless API interactions.

🛠️ Setup Instructions

🔑 Prerequisites
- **n8n instance** (cloud or self-hosted)
- Google Cloud project with enabled APIs:
  - Google Search Console API
  - Web Search Indexing API
- OAuth2 Credentials from Google Cloud

⚙️ Workflow Setup

Step 1: Import Workflow
- Open n8n, select "Import from JSON", and paste this workflow JSON.

Step 2: Configure OAuth2 Credentials
- Navigate to Settings → Credentials.
- Add new credentials (Google OAuth2 API) with the Client ID and Client Secret from Google Cloud.
- Scopes:
  - https://www.googleapis.com/auth/webmasters.readonly
  - https://www.googleapis.com/auth/webmasters
  - https://www.googleapis.com/auth/indexing

Step 3: Configure Webhooks
- Webhook URLs auto-generate in the MCP Server Trigger node.
- Ensure webhooks are publicly accessible via HTTPS.

Step 4: Testing
- Test your endpoints with sample HTTP requests to confirm everything is working correctly (an example Search Analytics request body follows below).

🎯 Usage Examples
- **List Sites**: Fetch all verified Search Console sites.
- **Get Site Info**: Get detailed information about a particular site.
- **Search Analytics**: Pull metrics such as clicks, impressions, and rankings.
- **Submit Sitemap**: Automatically submit sitemaps.
- **Request URL Indexing**: Trigger Google's indexing for specific URLs instantly.

🚩 Use Cases & Applications
- SEO automation workflows
- AI-driven SEO analytics
- Real-time website performance monitoring
- Automated sitemap management
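For the testing step, a typical Search Analytics call posts a small JSON body to the webmasters v3 API. A sketch of that body (dates, dimensions, and row limit are placeholders):

```javascript
// Illustrative request body for Search Console's searchanalytics.query:
// POST https://www.googleapis.com/webmasters/v3/sites/{siteUrl}/searchAnalytics/query
// {siteUrl} must be a verified property; the values below are placeholders.
const body = {
  startDate: "2024-01-01",
  endDate: "2024-01-31",
  dimensions: ["query", "page"],
  rowLimit: 100,
};

return [{ json: body }];
```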
by Vishal Kumar
Trigger
The workflow runs when a GitLab Merge Request (MR) is created or updated.

Extract & Analyze
It retrieves the code diff and sends it to Claude AI or GPT-4o for risk assessment and issue detection (a sketch of the diff retrieval follows below).

Generate Report
AI produces a structured summary with:
- Risk levels
- Identified issues
- Recommendations
- Test cases

Notify Developers
The report is:
- Emailed to developers and QA teams
- Posted as a comment on the GitLab MR

Setup Guide
1. Connect GitLab - Add GitLab API credentials and select the repositories to track
2. Configure AI Analysis - Enter your Anthropic (Claude) or OpenAI (GPT-4o) API key
3. Set Up Notifications - Add Gmail credentials and update the email distribution list
4. Test & Automate - Create a test MR to verify analysis and email delivery

Key Benefits
- **Automated Code Review** – AI-driven risk assessment and recommendations
- **Security & Compliance** – Identifies vulnerabilities before code is merged
- **Integration with GitLab CI/CD** – Works within existing DevOps workflows
- **Improved Collaboration** – Keeps developers and QA teams informed

Developed by Quantana, an AI-powered automation and software development company.
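For orientation, here is a standalone sketch (Node 18+, run as an ES module) of how the diff could be pulled from GitLab before being handed to the AI. The project ID, MR IID, and token variable are placeholders:

```javascript
// Illustrative fetch of a Merge Request's diff via the GitLab REST API.
// projectId, mrIid, and GITLAB_TOKEN are placeholders.
const projectId = 123;
const mrIid = 45;

const res = await fetch(
  `https://gitlab.com/api/v4/projects/${projectId}/merge_requests/${mrIid}/changes`,
  { headers: { "PRIVATE-TOKEN": process.env.GITLAB_TOKEN } }
);
const mr = await res.json();

// One string containing every file's diff, ready to embed in the AI prompt.
const diffText = mr.changes
  .map((c) => `--- ${c.new_path}\n${c.diff}`)
  .join("\n\n");
console.log(diffText);
```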
by Jean-Marie Rizkallah
🧩 Jamf Policies Export to Slack

Quickly export and review your entire Jamf policy configuration—including triggers, frequencies, and scope—directly in Slack. This enables IT and security teams to audit policy setups without logging into Jamf or generating reports manually.

❗The Problem
Jamf Pro lacks a straightforward way to quickly review or share a list of all configured policies, including key attributes like frequency, scope, or triggers. Security teams often need this for audit or compliance reviews, but navigating Jamf’s UI or exporting via the API is time-consuming.

🔧 This Fixes It
This workflow fetches all policies, extracts the most relevant fields, compiles them into a CSV file, and posts that readable file into a designated Slack channel—automatically or on demand.

✅ Prerequisites
• A Jamf Pro API key (OAuth2) with read access to policies
• A Slack app with permission to post files into your chosen channel

🔍 How it works
• Manually trigger or use the webhook to initiate the flow
• Retrieve all policies from Jamf via the XML API
• Convert the XML response into JSON
• Split and loop through each policy ID
• Retrieve detailed data for each policy
• Format relevant fields (ID, name, trigger, scope, etc.) into rows (a sketch of this step follows below)
• Convert the final data set into a .csv file
• Upload the file to your Slack channel

⚙️ Set up steps
• Takes ~10 minutes to configure
• Set the Jamf BaseURL in the “Jamf Server” node
• Configure Jamf OAuth2 credentials in the HTTP Request nodes
• Adjust the fields for export in the “Set-fields” node
• Set your Slack credentials and target channel in the “Post to Slack” node
• Optional: Customize the exported fields or filename

🔄 Automation Ready
Schedule this flow daily/weekly, or tie it to change events to keep your team informed.
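To show the formatting step, here is a hedged Code-node sketch that flattens per-policy details into CSV rows. The JSON paths follow Jamf's Classic API shape but are assumptions; verify them against your actual responses:

```javascript
// Illustrative CSV assembly from the per-policy detail calls.
// The p.general/p.scope paths are assumptions; check your Jamf JSON first.
const policies = $input.all().map((i) => i.json.policy);

const esc = (v) => `"${String(v ?? "").replace(/"/g, '""')}"`;
const header = "id,name,enabled,trigger,frequency,scope_all_computers";
const rows = policies.map((p) =>
  [
    p.general.id,
    esc(p.general.name),
    p.general.enabled,
    esc(p.general.trigger),
    esc(p.general.frequency),
    p.scope.all_computers,
  ].join(",")
);

return [{ json: { csv: [header, ...rows].join("\n") } }];
```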
by Lucas Peyrin
How it works

This template is a hands-on tutorial for one of n8n's most powerful data tools: the Compare Datasets node. It's the perfect next step after learning basic logic, showing you how to build robust data synchronization workflows.

We use a simple Warehouse Audit analogy to make the concept crystal clear:
- **Warehouse A:** Our main, "source of truth" database. This is the master list of what our inventory *should* be.
- **Warehouse B:** A second, remote database (like a Notion page or Google Sheet) that we need to keep in sync.
- **The Compare Datasets Node:** This is our **Auditor**. It takes both inventory lists and meticulously compares them to find any discrepancies.

The Auditor then sorts every item into one of four categories, which correspond to the node's four outputs:
- In A only: New products found in our main warehouse that need to be added to Warehouse B.
- Same: Products that match perfectly in both warehouses. No action needed!
- Different: Products that exist in both places but have different details (e.g., stock count). These need to be updated in Warehouse B.
- In B only: Extra products found in Warehouse B that aren't in our master list. These need to be deleted.

This pattern is the foundation for any two-way data sync you'll ever need to build (a plain-JavaScript sketch of the comparison follows below).

Set up steps

Setup time: 0 minutes! This workflow is a self-contained tutorial and requires no setup or credentials.
1. Click "Execute Workflow" to start the audit.
2. Explore the two Set nodes ("Warehouse A" and "Warehouse B") to see the initial data we are comparing.
3. Click on "The Auditor" (Compare Datasets node) to see how it's configured to use product_id as the matching key.
4. Follow the outputs to the four NoOp nodes to see which products were sorted into each category.
5. Read the sticky notes next to each output—they explain exactly why each item ended up there.
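If you prefer to see the logic in code, here is a plain-JavaScript sketch (with made-up data) of what the Compare Datasets node does when product_id is the matching key:

```javascript
// Plain-JavaScript model of the four Compare Datasets outputs.
// The warehouse data is invented purely for illustration.
const warehouseA = [
  { product_id: 1, name: "Widget", stock: 10 },
  { product_id: 2, name: "Gadget", stock: 5 },
  { product_id: 3, name: "Doohickey", stock: 7 },
];
const warehouseB = [
  { product_id: 1, name: "Widget", stock: 10 }, // identical
  { product_id: 2, name: "Gadget", stock: 3 },  // stock differs
  { product_id: 4, name: "Gizmo", stock: 2 },   // only in B
];

const bById = new Map(warehouseB.map((p) => [p.product_id, p]));
const inAOnly = [], same = [], different = [];

for (const a of warehouseA) {
  const b = bById.get(a.product_id);
  if (!b) inAOnly.push(a);                                        // add to B
  else if (JSON.stringify(a) === JSON.stringify(b)) same.push(a); // no action
  else different.push({ a, b });                                  // update B
  bById.delete(a.product_id);
}
const inBOnly = [...bById.values()];                              // delete from B

console.log({ inAOnly, same, different, inBOnly });
```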
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works

This workflow is an intelligent SEO analysis pipeline that ethically scrapes blog content and performs comprehensive SEO evaluation using AI. It receives blog URLs via webhook, validates permissions through robots.txt compliance, extracts content, and generates detailed SEO insights across four strategic dimensions: Content Optimization, Keyword Strategy, Technical SEO, and Backlink Building potential.

The system prioritizes ethical web scraping by checking robots.txt permissions before proceeding, ensuring compliance with website policies (a sketch of such a check follows below). Upon successful analysis, it returns a structured JSON report with actionable SEO recommendations, performance scores, and optimization strategies.

Technical Specifications
- Trigger: HTTP POST webhook
- Processing Time: 30-60 seconds depending on content size
- AI Model: GPT-4.1 (minimum), with a specialized SEO analysis prompt
- Output Format: Structured JSON
- Error Handling: Graceful failure with informative messages
- Compliance: Respects website robots.txt policies
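Below is a simplified, standalone sketch (Node 18+) of a robots.txt compliance check like the one this pipeline performs. It only scans Disallow rules and ignores user-agent grouping, so treat it as an illustration rather than the workflow's actual logic:

```javascript
// Rough robots.txt check: does any Disallow rule cover the target path?
// Real parsers honor user-agent groups and Allow rules; this one does not.
const blogUrl = new URL("https://example.com/blog/my-post");

const res = await fetch(`${blogUrl.origin}/robots.txt`);
const robots = res.ok ? await res.text() : "";

const disallowed = robots
  .split("\n")
  .map((l) => l.trim())
  .filter((l) => l.toLowerCase().startsWith("disallow:"))
  .map((l) => l.slice("disallow:".length).trim())
  .filter(Boolean);

const blocked = disallowed.some((path) => blogUrl.pathname.startsWith(path));
console.log(blocked ? "Scraping not permitted" : "OK to scrape");
```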