by Rosh Ragel
This workflow processes emails received in Gmail and saves detailed information about each email to a MySQL database.

**Before using, you need to have:**
- Gmail credentials
- MySQL database credentials
- A table in your database with the following columns:
  - messageId (Gmail message ID)
  - threadId
  - snippet
  - sender_name (nullable)
  - sender_email
  - recipient_name (nullable)
  - recipient_email
  - subject (nullable)

**How it works:**
1. The Gmail Trigger listens for new emails (checked every minute).
2. A Code node extracts the sender's name and email and the recipient's name and email from each message, as shown in the sketch below.
3. The MySQL node inserts the extracted data into your database. If an entry with the same sender email already exists, it updates the record with the new details.

**How to use:**
- Make sure your database table has all the required columns listed above.
- Select the appropriate table and configure the matching column (e.g., id) to avoid duplicates.

**Customizing this workflow:**
You can further modify the workflow to store attachments, timestamps, labels, or any other Gmail metadata as needed.
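For reference, the extraction step can look like this minimal n8n Code node sketch ("Run Once for All Items" mode). The Gmail field names (`From`, `To`, `subject`) are assumptions; inspect one trigger execution to confirm them before wiring up the MySQL node.

```javascript
// Minimal sketch of the Code node that shapes Gmail items for MySQL.
// Assumes "From"/"To" arrive as strings like 'Jane Doe <jane@example.com>'.
const parseAddress = (raw = '') => {
  const match = raw.match(/^\s*"?([^"<]*?)"?\s*<([^>]+)>\s*$/);
  if (match) return { name: match[1].trim() || null, email: match[2].trim() };
  return { name: null, email: raw.trim() }; // bare address, no display name
};

return $input.all().map((item) => {
  const from = parseAddress(item.json.From ?? item.json.from ?? '');
  const to = parseAddress(item.json.To ?? item.json.to ?? '');
  return {
    json: {
      messageId: item.json.id,
      threadId: item.json.threadId,
      snippet: item.json.snippet,
      sender_name: from.name,      // nullable, per the table schema
      sender_email: from.email,
      recipient_name: to.name,     // nullable
      recipient_email: to.email,
      subject: item.json.subject ?? null,
    },
  };
});
```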
by Parag Javale
**Social Media Auto-Poster (Google Sheets → Twitter & Instagram)**

This workflow automatically:
- Pulls rows marked as Pending from a Google Sheet.
- Generates a formatted Instagram caption and HTML preview.
- Converts the HTML into an image via HCTI.io.
- Posts the content:
  - As a tweet (text only) to Twitter (X).
  - As a post (image + caption) to Instagram via the Facebook Graph API.
- Marks the row in Google Sheets as Posted with a timestamp.

It runs every 5 hours (configurable via the Schedule Trigger).

**Requirements**
- **Google Sheets API credentials** connected in n8n.
- **HCTI.io account** (HTML → Image API).
- **Twitter (X) OAuth1 credentials.**
- **Facebook/Instagram Graph API** access token (for the business account/page).
- A Google Sheet with at least these columns: RowID, Caption, Desc, Hashtags, Status. Set Status to Pending for any row you want posted.

**Setup**
1. Import the JSON workflow (My_workflow.json) into your n8n instance.
2. Link all credentials (replace placeholders with your own API keys and tokens).
3. Update the Google Sheet ID and Sheet Name inside the "Get row(s) in sheet" and "Update Status Posted" nodes.
4. (Optional) Adjust the posting interval in the Schedule Trigger node.

**How It Works**
1. Trigger: runs every 5 hours.
2. Fetch rows: reads Google Sheets for rows with Status = Pending.
3. Caption generation: combines Desc + Hashtags into final_caption (see the sketch below).
4. HTML → image: converts the caption to a styled 1080x1080 post.
5. Social posting: posts the caption to Twitter (text only) and uploads the image + caption to Instagram.
6. Update status: marks the row as Posted on [timestamp].

**Notes**
- Facebook/Instagram tokens expire; refresh them or use long-lived tokens.
- HCTI.io may require a paid plan for high volumes.
- Works best with a business Instagram account linked to a Facebook Page.

**License**
This workflow can be reused and adapted freely under the MIT license.
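A minimal sketch of the caption-generation step as an n8n Code node, assuming each sheet row arrives with `Desc` and `Hashtags` fields (column names as listed in the Requirements above):

```javascript
// Builds final_caption and the HTML card that HCTI.io will render to an image.
return $input.all().map((item) => {
  const { Desc = '', Hashtags = '' } = item.json;
  const final_caption = `${Desc}\n\n${Hashtags}`.trim();

  // A simple 1080x1080 card; adjust the inline CSS to match your brand.
  const html = `
    <div style="width:1080px;height:1080px;display:flex;align-items:center;
                justify-content:center;padding:60px;box-sizing:border-box;
                background:#111;color:#fff;font-family:sans-serif;
                font-size:42px;text-align:center;">${Desc}</div>`;

  return { json: { ...item.json, final_caption, html } };
});
```

A downstream HTTP Request node can then POST the `html` field to HCTI.io (per their docs, https://hcti.io/v1/image with Basic auth using your User ID and API key) and read the returned image URL.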
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How it works**
This workflow is an intelligent SEO analysis pipeline that ethically scrapes blog content and performs comprehensive SEO evaluation using AI. It receives blog URLs via webhook, validates permissions through robots.txt compliance, extracts content, and generates detailed SEO insights across four strategic dimensions: Content Optimization, Keyword Strategy, Technical SEO, and Backlink Building potential.

The system prioritizes ethical web scraping by checking robots.txt permissions before proceeding, ensuring compliance with website policies (see the sketch below). Upon successful analysis, it returns a structured JSON report with actionable SEO recommendations, performance scores, and optimization strategies.

**Technical Specifications**
- Trigger: HTTP POST webhook
- Processing time: 30-60 seconds depending on content size
- AI model: GPT-4.1 (minimum) with a specialized SEO analysis prompt
- Output format: structured JSON
- Error handling: graceful failure with informative messages
- Compliance: respects website robots.txt policies
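A minimal sketch of what the robots.txt check can look like as an n8n Code node, assuming an upstream HTTP Request node has already fetched `<origin>/robots.txt` into a `robots` field and the webhook supplied the blog URL in `url` (both field names are illustrative). It only honors the `User-agent: *` group; a production version should also match agent-specific groups.

```javascript
// Parses Disallow rules from the wildcard group and flags whether the
// requested path may be scraped.
const target = new URL($json.url);
const robots = $json.robots ?? '';

let inWildcardGroup = false;
const disallowed = [];
for (const raw of robots.split('\n')) {
  const line = raw.trim();
  if (/^user-agent:/i.test(line)) {
    inWildcardGroup = line.split(':')[1].trim() === '*';
  } else if (inWildcardGroup && /^disallow:/i.test(line)) {
    disallowed.push(line.slice(line.indexOf(':') + 1).trim());
  }
}

const allowed = !disallowed.some((rule) => rule && target.pathname.startsWith(rule));
return [{ json: { url: target.href, allowed } }];
```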
by Lucas Peyrin
**How it works**

This template is a hands-on tutorial for one of n8n's most powerful data tools: the Compare Datasets node. It's the perfect next step after learning basic logic, showing you how to build robust data synchronization workflows.

We use a simple Warehouse Audit analogy to make the concept crystal clear:
- **Warehouse A:** Our main, "source of truth" database. This is the master list of what our inventory *should* be.
- **Warehouse B:** A second, remote database (like a Notion page or Google Sheet) that we need to keep in sync.
- **The Compare Datasets Node:** This is our **Auditor**. It takes both inventory lists and meticulously compares them to find any discrepancies.

The Auditor then sorts every item into one of four categories, which correspond to the node's four outputs:
- **In A only:** New products found in our main warehouse that need to be added to Warehouse B.
- **Same:** Products that match perfectly in both warehouses. No action needed!
- **Different:** Products that exist in both places but have different details (e.g., stock count). These need to be updated in Warehouse B.
- **In B only:** Extra products found in Warehouse B that aren't in our master list. These need to be deleted.

This pattern is the foundation for any two-way data sync you'll ever need to build (see the plain-JavaScript sketch below).

**Set up steps**

Setup time: 0 minutes! This workflow is a self-contained tutorial and requires no setup or credentials.
1. Click "Execute Workflow" to start the audit.
2. Explore the two Set nodes ("Warehouse A" and "Warehouse B") to see the initial data we are comparing.
3. Click on "The Auditor" (Compare Datasets node) to see how it's configured to use product_id as the matching key.
4. Follow the outputs to the four NoOp nodes to see which products were sorted into each category.
5. Read the sticky notes next to each output—they explain exactly why each item ended up there.
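If you want to see the sorting logic outside n8n, here is a plain-JavaScript sketch of the same four-way audit, using hypothetical sample data and `product_id` as the matching key:

```javascript
// What the Compare Datasets node does conceptually: match on product_id,
// then sort rows into the four output categories.
const warehouseA = [
  { product_id: 1, name: 'Widget', stock: 10 },
  { product_id: 2, name: 'Gadget', stock: 5 },
  { product_id: 3, name: 'Gizmo', stock: 7 },
];
const warehouseB = [
  { product_id: 2, name: 'Gadget', stock: 5 },    // identical -> Same
  { product_id: 3, name: 'Gizmo', stock: 2 },     // stock differs -> Different
  { product_id: 4, name: 'Doohickey', stock: 1 }, // -> In B only
];

const byId = (rows) => new Map(rows.map((r) => [r.product_id, r]));
const a = byId(warehouseA);
const b = byId(warehouseB);

const inAOnly = warehouseA.filter((r) => !b.has(r.product_id)); // add to B
const inBOnly = warehouseB.filter((r) => !a.has(r.product_id)); // delete from B
const same = [];
const different = []; // update in B
for (const row of warehouseA) {
  const match = b.get(row.product_id);
  if (!match) continue;
  (JSON.stringify(row) === JSON.stringify(match) ? same : different).push(row);
}

console.log({ inAOnly, same, different, inBOnly });
```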
by Akash Kankariya
🚀 Discover trending and viral YouTube videos easily with this powerful n8n automation! This workflow helps you perform bulk research on YouTube videos related to any search term, analyzing engagement data like views, likes, comments, and channel statistics — all in one streamlined process.

✨ Perfect for:
- Content creators wanting to find viral video ideas
- Marketers analyzing competitor content
- YouTubers optimizing their content strategy

**How It Works 🎯**
1️⃣ Input Your Search Term — Simply enter any keyword or topic you want to research.
2️⃣ Select Video Format — Choose between short, medium, or long videos.
3️⃣ Choose Number of Videos — Define how many videos to analyze in bulk.
4️⃣ Automatic Data Fetch — The workflow grabs video IDs, then fetches detailed video data and channel statistics from the YouTube API.
5️⃣ Performance Scoring — Videos are scored based on engagement rates, with easy-to-understand labels like 🚀 HOLY HELL (viral) or 💀 Dead (see the sketch below).
6️⃣ Export to Google Sheets — All data, including thumbnails and video URLs, is appended to your Google Sheet for comprehensive review and easy sharing.

**Setup Instructions 🛠️**
1. Google API key
   - Get your YouTube Data API key from the Google Developers Console.
   - Add it securely in the n8n credentials manager (do not hardcode it).
2. Google Sheets setup
   - Create a Google Sheet to store your results (a template link is provided).
   - Share the sheet with the Google account used in n8n.
   - Update the workflow with your sheet's Document ID and Sheet Name if needed.
3. Run the workflow
   - Trigger the form webhook via browser or POST call.
   - Enter the search term, format, and number of videos.
   - Let it process, then check your Google Sheet for insights!

**Features ✨**
- Bulk fetches the latest and top-viewed YouTube videos.
- Intelligent video performance scoring with emojis for quick insights 🔥🎬.
- Organizes data into Google Sheets with thumbnail previews 🖼️.
- Easy to customize search parameters via an intuitive form.
- Fully automated, no manual API calls needed.

**Get Started Today! 🌟**
Boost your YouTube content strategy and stay ahead with this powerful viral video research automation! Try it now on your n8n instance and tap into the world of viral content like a pro 🎥💡
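The scoring step can be pictured as an n8n Code node like the following. The thresholds, labels, and field names are illustrative rather than the template's exact values; the YouTube API returns statistics as strings, hence the Number() conversions.

```javascript
// Illustrative engagement scoring: likes + comments relative to views,
// plus views relative to channel size.
return $input.all().map((item) => {
  const { viewCount = 0, likeCount = 0, commentCount = 0, subscriberCount = 1 } = item.json;

  const views = Math.max(Number(viewCount), 1);
  const subs = Math.max(Number(subscriberCount), 1);
  const engagementRate = (Number(likeCount) + Number(commentCount)) / views;
  const viewsPerSub = views / subs;
  const score = engagementRate * 100 + viewsPerSub;

  let label = '💀 Dead';
  if (score > 50) label = '🚀 HOLY HELL';
  else if (score > 10) label = '🔥 Hot';
  else if (score > 2) label = '🙂 Decent';

  return { json: { ...item.json, engagementRate, score, label } };
});
```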
by Jean-Marie Rizkallah
**🧩 Jamf Policies Export to Slack**

Quickly export and review your entire Jamf policy configuration—including triggers, frequencies, and scope—directly in Slack. This enables IT and security teams to audit policy setups without logging into Jamf or generating reports manually.

**❗The Problem**
Jamf Pro lacks a straightforward way to quickly review or share a list of all configured policies, including key attributes like frequency, scope, or triggers. Security teams often need this for audit or compliance reviews, but navigating Jamf's UI or exporting via the API is time-consuming.

**🔧 This Fixes It**
This workflow fetches all policies, extracts the most relevant fields, compiles them into a CSV file, and posts that readable file into a designated Slack channel—automatically or on demand.

**✅ Prerequisites**
- A Jamf Pro API key (OAuth2) with read access to policies
- A Slack app with permission to post files into your chosen channel

**🔍 How it works**
- Manually trigger or use the webhook to initiate the flow
- Retrieve all policies from Jamf via the XML API
- Convert the XML response into JSON
- Split and loop through each policy ID
- Retrieve detailed data for each policy
- Format the relevant fields (ID, name, trigger, scope, etc.) — see the sketch below
- Convert the final data set into a .csv file
- Upload the file to your Slack channel

**⚙️ Set up steps**
- Takes ~10 minutes to configure
- Set the Jamf base URL in the "Jamf Server" node
- Configure Jamf OAuth2 credentials in the HTTP Request nodes
- Adjust the fields for export in the "Set-fields" node
- Set your Slack credentials and target channel in the "Post to Slack" node
- Optional: customize the exported fields or filename

**🔄 Automation Ready**
Schedule this flow daily/weekly, or tie it to change events to keep your team informed.
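The "Set-fields" step can be pictured as an n8n Code node like this sketch. The JSON paths assume Jamf's classic-API policy payload after the XML-to-JSON conversion and may differ on your instance, so check one execution's data first.

```javascript
// Flattens each policy detail payload into the columns exported to CSV.
return $input.all().map((item) => {
  const p = item.json.policy ?? item.json; // detail payload per policy
  return {
    json: {
      id: p.general?.id,
      name: p.general?.name,
      enabled: p.general?.enabled,
      trigger: p.general?.trigger ?? p.general?.trigger_other,
      frequency: p.general?.frequency,
      scope_all_computers: p.scope?.all_computers,
    },
  };
});
```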
by Paul
**🚀 Google Search Console MCP Server**

**📋 Description**
This n8n workflow serves as a Model Context Protocol (MCP) server, connecting MCP-compatible AI tools (like Claude) directly to the Google Search Console APIs. With this workflow, users can automate critical SEO tasks and manage Google Search Console data effortlessly via MCP endpoints.

Included functionalities:
- 📌 List verified sites
- 📌 Retrieve detailed site information
- 📌 Access Search Analytics data
- 📌 Submit and manage sitemaps
- 📌 Request URL indexing

OAuth2 is fully supported for secure and seamless API interactions.

**🛠️ Setup Instructions**

**🔑 Prerequisites**
- **n8n instance** (cloud or self-hosted)
- Google Cloud project with these APIs enabled:
  - Google Search Console API
  - Web Search Indexing API
- OAuth2 credentials from Google Cloud

**⚙️ Workflow Setup**
1. **Import workflow:** Open n8n, select "Import from JSON", and paste this workflow JSON.
2. **Configure OAuth2 credentials:** Navigate to Settings → Credentials and add new credentials (Google OAuth2 API) with the Client ID and Client Secret from Google Cloud, using these scopes:
   - https://www.googleapis.com/auth/webmasters.readonly
   - https://www.googleapis.com/auth/webmasters
   - https://www.googleapis.com/auth/indexing
3. **Configure webhooks:** Webhook URLs auto-generate in the MCP Server Trigger node. Ensure the webhooks are publicly accessible via HTTPS.
4. **Testing:** Test your endpoints with sample HTTP requests to confirm everything is working correctly (see the sketch below).

**🎯 Usage Examples**
- **List sites:** Fetch all verified Search Console sites.
- **Get site info:** Get detailed information about a particular site.
- **Search analytics:** Pull metrics such as clicks, impressions, and rankings.
- **Submit sitemap:** Automatically submit sitemaps.
- **Request URL indexing:** Trigger Google's indexing for specific URLs instantly.

**🚩 Use Cases & Applications**
- SEO automation workflows
- AI-driven SEO analytics
- Real-time website performance monitoring
- Automated sitemap management
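For step 4, a standalone Node 18+ sketch of the underlying Search Analytics call is handy for verifying your OAuth2 token outside n8n. The site URL and token below are placeholders to replace with your own:

```javascript
// Queries the Search Console searchAnalytics.query endpoint directly.
const SITE_URL = encodeURIComponent('https://example.com/'); // your verified property
const ACCESS_TOKEN = 'YOUR_OAUTH2_ACCESS_TOKEN'; // must carry a webmasters scope

const res = await fetch(
  `https://www.googleapis.com/webmasters/v3/sites/${SITE_URL}/searchAnalytics/query`,
  {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${ACCESS_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      startDate: '2024-01-01',
      endDate: '2024-01-31',
      dimensions: ['query', 'page'],
      rowLimit: 25,
    }),
  },
);
console.log(await res.json()); // rows of clicks, impressions, ctr, position
```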
by Yang
**📄 What this workflow does**
This workflow captures a full-page screenshot of any website added to a Google Sheet and automatically uploads the screenshot to a designated Google Drive folder. It uses Dumpling AI's screenshot API to generate the image and manages file storage through Google Drive.

**👤 Who is this for**
This is ideal for:
- Marketers and outreach teams capturing snapshots of client or lead websites
- Lead generation specialists tracking landing page visuals
- Researchers or analysts who need to archive website visuals from URLs
- Anyone looking to automate website screenshot collection at scale

**✅ Requirements**
- A Google Sheet with a column labeled Website where URLs will be added
- **Dumpling AI** API access for screenshot capture
- A connected Google Drive account with an accessible folder to store screenshots

**⚙️ How to set up**
1. Replace the Google Sheet and folder IDs in the workflow with your own.
2. Connect your Dumpling AI and Google credentials in n8n.
3. Make sure your sheet contains a Website column with valid URLs.
4. Activate the workflow to begin watching for new entries.

**🔁 How it works (Workflow Steps)**
1. Watch New Row in Google Sheets: triggers when a new row is added to the sheet.
2. Request Screenshot from Dumpling AI: sends the website URL to Dumpling AI and gets a screenshot URL.
3. Download Screenshot: fetches the image file from the returned URL.
4. Upload Screenshot to Google Drive: uploads the file to a selected folder in Google Drive.

**🛠️ Customization Ideas**
- Add timestamped filenames using the current date or domain name (see the sketch below)
- Append the Google Drive URL back to the same row in the sheet for easy access
- Extend the workflow to send Slack or email notifications when screenshots are saved
- Add filters to validate URLs before sending them to Dumpling AI
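As a starting point for the first customization idea, a small n8n Code node can build a timestamped, domain-based filename before the Drive upload (assuming the item carries the URL in a `Website` field, matching the sheet column):

```javascript
// Builds a filename like "example.com_2024-01-31T09-15-00-000Z.png".
const url = new URL($json.Website);
const stamp = new Date().toISOString().replace(/[:.]/g, '-');
const fileName = `${url.hostname}_${stamp}.png`;

return [{ json: { ...$json, fileName } }];
```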
by Anna Bui
This n8n template automatically syncs website visitors identified by RB2B into your Attio CRM, creating comprehensive contact records and associated sales deals for immediate follow-up. Perfect for sales teams who want to capture every website visitor as a potential lead without manual data entry!

**Good to know**
- RB2B identifies anonymous website visitors and sends structured data via Slack notifications
- The workflow prevents duplicate contacts by checking email addresses before creating new records
- All RB2B leads are automatically tagged with source tracking for easy identification

**How it works**
1. RB2B sends website visitor notifications to your designated Slack channel with visitor details
2. The workflow extracts structured data from Slack messages, including name, email, company, LinkedIn, and location (see the sketch below)
3. It searches Attio CRM to check if the person already exists based on email address
4. For new visitors, it creates a complete contact record with all available information
5. For existing contacts, it updates their record and manages deal creation intelligently
6. It automatically creates sales deals tagged as "RB2B Website Visitor" for proper lead tracking

**How to use**
- Configure RB2B to send visitor notifications to a dedicated Slack channel
- The Slack trigger can be replaced with other triggers, like webhooks, if you prefer different notification methods
- Customize the deal naming conventions and stages to match your sales pipeline

**Requirements**
- RB2B account with Slack integration enabled
- Attio CRM account with API access
- Slack workspace with bot permissions for the designated RB2B channel

**Customising this workflow**
- Modify deal stages and values based on your sales process
- Add lead scoring based on company domain or visitor behavior patterns
- Integrate additional enrichment APIs to enhance contact data
- Set up automated email sequences or Slack notifications for high-value leads
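The extraction in step 2 can be pictured as an n8n Code node like this sketch. The `Label: value` line format is an assumption about how RB2B lays out its Slack messages, so adjust the patterns to match your channel's actual notifications:

```javascript
// Pulls labeled fields out of the Slack message text on the incoming item.
const text = $json.text ?? '';
const grab = (label) =>
  text.match(new RegExp(`${label}:\\s*(.+)`, 'i'))?.[1]?.trim() ?? null;

return [{
  json: {
    name: grab('Name'),
    email: grab('Email'),
    company: grab('Company'),
    linkedin: grab('LinkedIn'),
    location: grab('Location'),
    source: 'RB2B Website Visitor', // source tag used for the deal
  },
}];
```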
by Davide
This workflow automates the generation of AI-enhanced, contextualized images using FLUX Kontext, based on prompts stored in a Google Sheet. The generated images are then saved to Google Drive, and their URLs are written back to the spreadsheet for easy access.

Example prompt: "The girl is lying on the bed and sleeping"

**Perfect for E-commerce and Social Media**
This workflow is especially useful for e-commerce businesses:
- Generate product images with dynamic backgrounds based on the use case or season.
- Create contextual marketing visuals for ads, newsletters, or product pages.
- Scale visual content creation without the need for manual design work.

**How It Works**
1. **Trigger:** The workflow can be started manually (via "Test workflow") or scheduled at regular intervals (e.g., every 5 minutes) using the "Schedule Trigger" node.
2. **Data fetch:** The "Get new image" node retrieves a row from a Google Sheet where the RESULT column is empty. It extracts the prompt, image URL, output format, and aspect ratio for processing.
3. **Image generation:** The "Create Image" node sends a request to the FLUX Kontext API (fal.run) with the provided parameters to generate a new AI-contextualized image (see the request sketch below).
4. **Status check:** The workflow waits 60 seconds ("Wait 60 sec." node) before checking the status of the image generation request via the "Get status" node. If the status is "COMPLETED," it proceeds; otherwise, it loops back to wait.
5. **Result handling:** Once completed, the "Get Image Url" node fetches the generated image URL, which is then downloaded ("Get Image File"), uploaded to Google Drive ("Upload Image"), and the Google Sheet is updated with the result ("Update result").

**Set Up Steps**
To configure this workflow, follow these steps:
1. **Google Sheet setup:** Create a Google Sheet with columns for PROMPT, IMAGE URL, ASPECT RATIO, OUTPUT FORMAT, and RESULT (leave this last one empty). Link the sheet in the "Get new image" and "Update result" nodes.
2. **API key configuration:** Sign up at fal.ai to obtain an API key. In the "Create Image" node, set the Header Auth with:
   - Name: Authorization
   - Value: Key YOURAPIKEY
3. **Google Drive setup:** Specify the target folder ID in the "Upload Image" node where generated images will be saved.
4. **Schedule Trigger (optional):** Adjust the "Schedule Trigger" node to run the workflow at desired intervals (e.g., every 5 minutes).
5. **Test execution:** Run the workflow manually via the "Test workflow" node to verify all steps function correctly.

Once configured, the workflow will automatically process pending prompts, generate images, and update the Google Sheet with results.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
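For orientation, here is a standalone Node 18+ sketch of the kind of request the "Create Image" node sends. The endpoint path and body keys are assumptions based on fal's queue API; check the FLUX Kontext documentation for the exact model id and parameters.

```javascript
// Submits a FLUX Kontext generation request to fal's queue.
const FAL_KEY = 'YOURAPIKEY'; // placeholder, as in the Header Auth setup above

const res = await fetch('https://queue.fal.run/fal-ai/flux-pro/kontext', {
  method: 'POST',
  headers: {
    Authorization: `Key ${FAL_KEY}`, // matches the "Key YOURAPIKEY" header
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: 'The girl is lying on the bed and sleeping', // PROMPT column
    image_url: 'https://example.com/source.jpg',         // IMAGE URL column
    aspect_ratio: '1:1',                                 // ASPECT RATIO column
    output_format: 'jpeg',                               // OUTPUT FORMAT column
  }),
});
console.log(await res.json()); // queue response with a request id to poll
```

The "Get status" node then polls that request id until the status is "COMPLETED", which is why the workflow loops through the 60-second wait.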
by vinci-king-01
**How it works**
Turn Amazon into your personal competitive intelligence goldmine! This AI-powered workflow automatically monitors Amazon markets 24/7, delivering deep competitor insights and pricing intelligence that would take you 10+ hours of manual research weekly.

**Key Steps**
1. Daily Market Scan - Runs automatically at 6:00 AM UTC to capture fresh competitive data
2. AI-Powered Analysis - Uses ScrapeGraphAI to intelligently extract pricing, product details, and market positioning
3. Competitive Intelligence - Analyzes competitor strategies, pricing gaps, and market opportunities
4. Keyword Goldmine - Identifies high-value keyword opportunities your competitors are missing
5. Strategic Insights - Generates actionable recommendations for pricing and positioning
6. Automated Reporting - Delivers comprehensive market reports directly to Google Docs

**Set up steps**
Setup time: 15-20 minutes
1. Configure ScrapeGraphAI credentials - Add your ScrapeGraphAI API key for intelligent web scraping
2. Set up Google Docs integration - Connect Google OAuth2 for automated report generation
3. Customize the Amazon search URL - Target your specific product category or market niche
4. Configure IP rotation - Set up proxy rotation if needed for large-scale monitoring
5. Test with sample products - Start with a small product set to validate data accuracy
6. Set competitive alerts - Define thresholds for price changes and market opportunities

Save 10+ hours weekly while staying ahead of your competition with real-time market intelligence!
by vinci-king-01
**How it works**
Transform your business with intelligent deal monitoring and automated customer engagement! This AI-powered coupon aggregator continuously tracks competitor deals and creates personalized marketing campaigns that convert.

**Key Steps**
1. 24/7 Deal Monitoring - Automatically scans competitor websites daily for the best deals and offers
2. Smart Customer Segmentation - Uses AI to intelligently categorize and target your customer base
3. Personalized Offer Generation - Creates tailored coupon campaigns based on customer behavior and preferences
4. Automated Email Marketing - Sends targeted email campaigns with personalized deals to the right customers
5. Performance Analytics - Tracks campaign performance and provides detailed insights and reports
6. Daily Management Reports - Delivers comprehensive analytics to the management team every morning

**Set up steps**
Setup time: 10-15 minutes
1. Configure competitor monitoring - Add the target websites and deal sources you want to track
2. Set up your customer database - Connect your customer data source for intelligent segmentation
3. Configure email integration - Connect your email service provider for automated campaigns
4. Customize deal criteria - Define what types of deals and offers to prioritize
5. Set up analytics tracking - Configure Google Sheets or a database for performance monitoring
6. Test the automation flow - Run a test cycle to ensure all integrations work smoothly

Never miss a profitable deal opportunity - let AI handle the monitoring and targeting while you focus on growth!