by Brian Money
Overview This template is designed for Amazon sellers and advertisers who want to automate their campaign performance analysis and bidding strategy. It solves the common challenge of manually reviewing Sponsored Products reports and guessing how to adjust keywords, placements, and budgets. By combining Amazon Advertising reports with OpenAI's GPT-4o, this workflow delivers real-time, personalized optimization instructions — automatically. Features 📥 Automatically downloads Sponsored Products reports from Google Drive 🧠 Uses AI to analyze campaign, keyword, placement, targeting, and budget performance 📊 Supports both .csv and .xlsx report formats 🔁 Handles multiple ASINs and scales easily across ad accounts 📧 Sends structured optimization recommendations to your inbox via Gmail 🗂 Built-in logic to normalize filenames and correctly map reports 🧹 Includes error handling and formatting cleanup for AI-ready input Requirements To use this workflow, you’ll need: An Amazon Ads account with access to Sponsored Products reports A Google Drive folder where Amazon Ads reports are delivered (manually or via Gmail automation) A Gmail account (for sending summaries) An OpenAI API key with access to GPT-4o Optional: a developer account for the Amazon Ads API to fully automate report generation in the future Setup Instructions 📂 Connect your Amazon Ads reports folder in the Google Drive node 🔐 Add your credentials to the OpenAI and Gmail nodes 📝 Schedule five reports in the Amazon Ads Console: Search Term Report → Detailed Targeting Report → Detailed Campaign Report → Summary Placement Report → Summary Budget Report → Summary Use “Last 30 Days”, “Daily”, and .xlsx or .csv format 🔁 (Optional) Automate report ingestion using Gmail + Drive workflows 🧪 Test with one account, then replicate across additional ad accounts as needed ⏱️ Setup time: 15–30 minutes 📌 All field-specific guidance is included in workflow notes`
by Leandro Melo
Keep your Hostinger VPS servers secure with automated backups! This n8n (self-hosted) workflow for is designed to create daily snapshots and send server metrics effortlessly, ensuring you always have an up-to-date recovery copy. Key Features: ✅ Automated Snapshots: Daily execution with zero manual intervention. ✅ Smart Replacement: Hostinger allows only 1 snapshot per VPS—the workflow automatically replaces the previous one. ✅ Notifications: Alerts via WhatsApp (Evolution API) or other configurable channels for execution confirmation. Quick Setup: Prerequisites: Install the Community Node n8n-nodes-hostinger-api and n8n-nodes-evolution-api in your n8n instance. Generate a Hostinger API Key in their dashboard: hpanel.hostinger.com/profile/api. Workflow Configuration: Add the Hostinger API credential in the first node and reuse it across the workflow. Customize the schedule (e.g., daily at 2 AM) and notification method (Evolution API for WhatsApp, email, etc.). Important Note: Hostinger overwrites the previous snapshot with each new execution, keeping only the latest version. VPS Metrics avaliables (send in messages): 🔹Status: snapshot status 🔹Date: snapshot date time 🔹Server: server name 🔹IP: external server IP ⚙️ Métrics: 🔹 Number of vCPUs 🔹 Ram usage / avaliable 🔹 Hard Disk usage / avaliable 🔹 Operational Sys and version 🔹 Uptime time (days, hours)
by Custom Workflows AI
Introduction The Content SEO Audit Workflow is a powerful automated solution that generates comprehensive SEO audit reports for websites. By combining the crawling capabilities of DataForSEO with the search performance metrics from Google Search Console, this workflow delivers actionable insights into content quality, technical SEO issues, and performance optimization opportunities. The workflow crawls up to 1,000 pages of a website, analyzes various SEO factors including metadata, content quality, internal linking, and search performance, and then generates a professional, branded HTML report that can be shared directly with clients. The entire process is automated, transforming what would typically be hours of manual analysis into a streamlined workflow that produces consistent, thorough results. This workflow bridges the gap between technical SEO auditing and practical, client-ready deliverables, making it an invaluable tool for SEO professionals and digital marketing agencies. Who is this for? This workflow is designed for SEO consultants, digital marketing agencies, and content strategists who need to perform comprehensive content audits for clients or their own websites. It's particularly valuable for professionals who: Regularly conduct SEO audits as part of their service offerings Need to provide branded, professional reports to clients Want to automate the time-consuming process of content analysis Require data-driven insights to inform content strategy decisions Users should have basic familiarity with SEO concepts and metrics, as well as a basic understanding of how to set up API credentials in n8n. While no coding knowledge is required to run the workflow, users should be comfortable with configuring workflow parameters and following setup instructions. What problem is this workflow solving? Content audits are essential for SEO strategy but are traditionally labor-intensive and time-consuming. This workflow addresses several key challenges: Manual Data Collection: Gathering data from multiple sources (crawlers, Google Search Console, etc.) typically requires hours of work. This workflow automates the entire data collection process. Inconsistent Analysis: Manual audits can suffer from inconsistency in methodology. This workflow applies the same comprehensive analysis criteria to every page, ensuring thorough and consistent results. Report Generation: Creating professional, client-ready reports often requires additional design work after the analysis is complete. This workflow generates a fully branded HTML report automatically. Data Integration: Correlating technical SEO issues with actual search performance metrics is difficult when working with separate tools. This workflow seamlessly integrates crawl data with Google Search Console metrics. Scale Limitations: Manual audits become increasingly difficult with larger websites. This workflow can efficiently process up to 1,000 pages without additional effort. What this workflow does Overview The Content SEO Audit Workflow crawls a specified website, analyzes its content for various SEO issues, retrieves performance data from Google Search Console, and generates a comprehensive HTML report. The workflow identifies issues in five key categories: status issues (404 errors, redirects), content quality (thin content, readability), metadata SEO (title/description issues), internal linking (orphan pages, excessive click depth), and performance (underperforming content). The final report includes executive summaries, detailed issue breakdowns, and actionable recommendations, all branded with your company's colors and logo. Process Initial Configuration: The workflow begins by setting parameters including the target domain, crawl limits, company information, and branding colors. Website Crawling: The workflow creates a crawl task in DataForSEO and periodically checks its status until completion. Data Collection: Once crawling is complete, the workflow: Retrieves the raw audit data from DataForSEO Extracts all URLs with status code 200 (successful pages) Queries Google Search Console API for each URL to get clicks and impressions data Identifies 404 and 301 pages and retrieves their source links Data Analysis: The workflow analyzes the collected data to identify issues including: Technical issues: 404 errors, redirects, canonicalization problems Content issues: thin content, outdated content, readability problems SEO metadata issues: missing/duplicate titles and descriptions, H1 problems Internal linking issues: orphan pages, excessive click depth, low internal links Performance issues: underperforming pages based on GSC data Report Generation: Finally, the workflow: Calculates a health score based on the severity and quantity of issues Generates prioritized recommendations Creates a comprehensive HTML report with interactive tables and visualizations Customizes the report with your company's branding Provides the report as a downloadable HTML file Setup To set up this workflow, follow these steps: Import the workflow: Download the JSON file and import it into your n8n instance. Configure DataForSEO credentials: Create a DataForSEO account at https://app.dataforseo.com/api-access (they offer a free $1 credit for testing) Add a new "Basic Auth" credential in n8n following the HTTP Request Authentication guide Assign this credential to the "Create Task", "Check Task Status", "Get Raw Audit Data", and "Get Source URLs Data" nodes Configure Google Search Console credentials: Add a new "Google OAuth2 API" credential following the Google OAuth guide Ensure your Google account has access to the Google Search Console property you want to analyze Assign this credential to the "Query GSC API" node Update the "Set Fields" node with: dfs_domain: The website domain you want to audit dfs_max_crawl_pages: Maximum number of pages to crawl (default: 1000) dfs_enable_javascript: Whether to enable JavaScript rendering (default: false) company_name: Your company name for the report branding company_website: Your company website URL company_logo_url: URL to your company logo brand_primary_color: Your primary brand color (hex code) brand_secondary_color: Your secondary brand color (hex code) gsc_property_type: Set to "domain" or "url" depending on your Google Search Console property type Run the workflow: Click "Start" and wait for it to complete (approximately 20 minutes for 500 pages). Download the report: Once complete, download the HTML file from the "Download Report" node. How to customize this workflow to your needs This workflow can be adapted in several ways to better suit your specific requirements: Adjust crawl parameters: Modify the "Set Fields" node to change: The maximum number of pages to crawl (dfs_max_crawl_pages). This workflow supports up to 1000 pages. Whether to enable JavaScript rendering for JavaScript-heavy sites (dfs_enable_javascript) Customize issue detection thresholds: In the "Build Report Structure" code node, you can modify: Word count thresholds for thin content detection (currently 1500 words) Click depth thresholds (currently flags pages deeper than 4 clicks) Title and description length parameters (currently 40-60 chars for titles, 70-155 for descriptions) Readability score thresholds (currently flags Flesch-Kincaid scores below 55) Modify the report design: In the "Generate HTML Report" code node, you can: Adjust the HTML/CSS to change the report layout and styling Add or remove sections from the report Change the recommendations logic Modify the health score calculation algorithm Add additional data sources: You could extend the workflow by: Adding Pagespeed Insights data for performance metrics Incorporating backlink data from other APIs Adding keyword ranking data from rank tracking APIs Implement automated delivery: Add nodes after the "Download Report" to: Send the report directly to clients via email Upload it to cloud storage Create a PDF version of the report
by Ranjan Dailata
Who this is for? The Brand Content Extract, Summarization & Sentiment Analysis workflow is designed for professionals and teams who need to monitor, understand, and act on public brand perception at scale. It is ideal for: Brand Managers - Looking to track how their brand is portrayed online. Marketing Analysts - Seeking insights from competitor and industry content. PR & Communications Teams - Evaluating media tone and potential reputation risks. Data Scientists & AI Developers - Automating content intelligence pipelines. Growth Hackers - Performing large-scale web listening for campaign optimization. What problem is this workflow solving? Manually tracking and interpreting how your brand is mentioned across blogs, news sites, or product reviews is labor-intensive and unscalable. Traditional scraping tools return raw data but lack insights like summarization, sentiment analysis etc. This workflow addresses: Scalable extraction of brand-related content using Bright Data's infrastructure. Textual data extract for easy decision-making or alerting. Automated summarization of verbose or multi-paragraph articles using Gemini. Sentiment analysis of how a brand is being portrayed. What this workflow does Receives input: A brand URL for the data extraction and analysis. Uses Bright Data's Web Unlocker to extract content from relevant sites. Cleans and preprocesses the scraped content for readability. Sends the content to Google Gemini for: Enriched results including: Cleaned content Summary Sentiment Analysis Sends the response to a target system via Webhook notification Perists the response to disk Setup Sign up at Bright Data. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set with the Bearer XXXXXXXXXXXXXX. The XXXXXXXXXXXXXX should be replaced by the Web Unlocker Token. A Google Gemini API key (or access through Vertex AI or proxy). Update the Set URL and Bright Data Zone for setting the brand content URL and the Bright Data Zone name. Update the Webhook HTTP Request node with the Webhook endpoint of your choice. How to customize this workflow to your needs Update Source** : Update the workflow input to read from Google Sheet or Airbase for dynamically tracking multiple brands or topics. AI Prompt Customization** : Tailor Gemini prompts for: Summary length (brief vs. detailed) Detailed Sentiment with the custom structured data format. Brand-specific tone detection (e.g., trust, excitement, dissatisfaction) Output Destinations**: Configure the output node to send the responses to various platforms, such as Slack, CRM systems, or databases.
by James Carter
This n8n workflow automatically fetches trending news articles based on your chosen country, category, and keyword — then enriches the data with AI-powered business insights before posting a concise summary to Slack. Ideal for sales teams, executives, marketers, or anyone who wants fast, actionable news briefings directly in their Slack workspace. ⸻ Who it’s for Executives, analysts, sales teams, or marketing professionals who want curated, AI-enhanced news summaries tailored to business opportunities, risks, and trends — delivered automatically to Slack. ⸻ How it works / What it does A Schedule Trigger runs on a daily, weekly, or custom frequency. It queries the NewsAPI to retrieve top headlines by country, category, or keyword. Headlines are formatted and enriched with your configured query context. The AI model (GPT-4) analyzes articles and summarizes key insights, categorizing them as Opportunities, Risks, or Trends. Finally, the summarized insights are posted directly into a Slack channel of your choice. ⸻ How to set up Set your schedule frequency in the Schedule Trigger node. Configure your preferred country, category, and keyword in the Inject Config node. Add your NewsAPI Key inside the Fetch News Articles node. Connect your Slack credentials in the Post to Slack node. Optional: Adjust the AI prompt for more tailored analysis. ⸻ Requirements A NewsAPI account to fetch headlines. An OpenAI API key for GPT-4 summarization. A Slack workspace and connected credentials via n8n. ⸻ How to customize the workflow Change the country, category, or keyword in the Inject Config to focus on specific markets or sectors. Adjust the AI prompt in the GPT node to prioritize certain insights like ESG factors, M&A activity, or market sentiment. Extend the workflow to log results to Google Sheets, email summaries, or send SMS alerts. Replace the Schedule Trigger with a Webhook if you want to trigger summaries on demand. This template is designed to be modular, making it easy to adapt for competitive intelligence, investment tracking, or industry news curation.
by Giannis Kotsakiachidis
🏦 GoCardless ⇄ Maybe Finance — Automatic Multi-Bank Sync & Weekly Overview 💸 Who’s it for 🤔 Freelancers, founders, households, and side-hustlers who work with several bank accounts but want one, always-up-to-date budget inside Maybe Finance—no more CSV exports or copy-paste. How it works / What it does ⚙️ Schedule Trigger (cron) fires every Monday 📅 (switch to Manual Trigger while testing) Get access token — fresh 24 h GoCardless token 🔑 Fetch transactions for each account: Revolut Pro Revolut Personal ABN AMRO (add extra HTTP Request nodes for any other GoCardless-supported banks) Extract booked — keep only settled items 🗂️ Set transactions … — map every record to Maybe Finance’s schema 📝 Merge all arrays into one payload 🔄 Create transactions to Maybe — POSTs each item via API 🚀 Resend Email — sends you a “Weekly transactions overview” 📧 All done in a single run — your Maybe dashboard is refreshed and you get an inbox alert. How to set up 🛠️ Import the template into n8n (cloud or self-hosted). Create credentials GoCardless secret_id & secret_key Maybe Finance API key (Optional) Resend API key for email notifications One-time GoCardless config (run the blocks on the left): /token/new/ → obtain token /institutions → find institution IDs /agreements/enduser/ → create agreements /requisitions/ → get the consent URL & finish bank login /requisitions/{id} → copy the GoCardless account_ids Create the same accounts in Maybe Finance and run the HTTP GET request in the purple frame and copy their account_ids. Open each Set transactions … node and paste the correct Maybe account_id. Adjust the Schedule Trigger (e.g. daily, monthly). Save & activate 🎉 Requirements 📋 n8n 1.33 + GoCardless app (secret ID & key, live or sandbox) Maybe Finance account & API key (Optional) Resend account for email How to customize ✨ Include pending transactions**: change the Item Lists filter. Add more banks**: duplicate the “Get … transactions” → “Extract booked” → “Set transactions” path and plug its output into the Merge node. Different interval**: edit the cron rule in Schedule Trigger. Disable emails**: just remove or deactivate the Resend node. Send alerts to Slack / Teams**: branch after the Merge node and add a chat node. Happy budgeting! 💰
by Nadia Privalikhina
This n8n template offers a free and automated way to convert images from a Google Drive folder into a single PDF document. It uses Google Slides as an intermediary, allowing you to control the final PDF's page size and orientation. If you're looking for a no-cost solution to batch convert images to PDF and need flexibility over the output dimensions (like A4, landscape, or portrait), this template is for you! It's especially handy for creating photo albums, visual reports, or simple portfolios directly from your Google Drive. How it works The workflow first copies a Google Slides template you specify. The page setup of this template (e.g., A4 Portrait) dictates your final PDF's dimensions. It then retrieves all images from a designated Google Drive folder, sorts them by creation date. Each image is added to a new slide in the copied presentation. Finally, the entire Google Slides presentation is converted into a PDF and saved back to your Google Drive. How to use Connect your Google Drive and Google Slides accounts in the relevant nodes. In the "Set Pdf File Name" node, define the name for your output PDF. In the "CopyPdfTemplate" node: Select your Google Slides template file (this sets the PDF page size/orientation). Choose the Google Drive folder containing your source images. Ensure your images are in the specified folder. For best results, images should have an aspect ratio similar to your chosen Slides template. Run the workflow to generate your PDF by clicking 'Test Workflow' Requirements Google Drive account. Google Slides account. Google Slides Template stored on your Google Drive Customising this workflow Adjust the "Filter: Only Images" node if you use image formats other than PNG (e.g., image/jpeg for JPGs). Modify the image sorting logic in the "Sort by Created Date" node if needed.
by Nikan Noorafkan
🤖 AI-Powered Content Marketing Research Tool > Transform your content strategy with automated competitor intelligence ⚡ What It Does Never miss a competitor move again. This workflow automatically: 🔍 Monitors competitor content across multiple domains 📊 Tracks trending keywords by region 💬 Extracts audience pain points from Reddit & forums 🤖 Generates AI strategy recommendations via OpenAI 📋 Outputs to Airtable, Notion & Slack for instant action 🎯 Perfect For Growth marketers** tracking competitor strategies Content teams** discovering trending topics SEO specialists** finding keyword opportunities Marketing agencies** managing multiple clients 🛠️ Technical Setup Required APIs & Credentials | Service | Credential Type | Monthly Cost | Purpose | |---------|----------------|--------------|---------| | Ahrefs | Header Auth | $99+ | Backlink & traffic analysis | | SEMrush | Query Auth | $119+ | Keyword research | | BuzzSumo | Header Auth | $199+ | Content performance | | OpenAI | Header Auth | ~$50 | AI recommendations | | Reddit | OAuth2 | Free | Audience insights | | Google Trends | Public API | Free | Trending topics | 📊 Database Schema Airtable Base: content-research-base Table 1: competitor-intelligence timestamp (Date) domain (Single line text) traffic_estimate (Number) backlinks (Number) content_gaps (Long text) publishing_frequency (Single line text) Table 2: keyword-opportunities timestamp (Date) trending_keywords (Long text) top_questions (Long text) content_opportunities (Long text) 🚀 Quick Start Guide Step 1: Import & Configure Import the workflow JSON Update competitor domains in 📋 Configuration Settings Map all API credentials Step 2: Setup Storage Airtable:** Create base with exact schema above Notion:** Create database with properties listed Slack:** Create #content-research-alerts channel Step 3: Test & Deploy First run populates: ✅ Airtable tables with competitor data ✅ Notion database with AI insights ✅ Slack channel with formatted alerts 💡 Example Output AI Recommendations Format { "action_items": [ { "topic": "Copy trading explainer", "format": "Video", "region": "UK", "priority": "High" } ], "publishing_calendar": [ {"week": "W34", "posts": 3} ], "alerts": [ "eToro gained 8 .edu backlinks this week" ] } Slack Alert Preview 🚨 Content Research Alert 📊 Top Findings: Sustainable packaging solutions Circular economy trends Eco-friendly manufacturing 📈 Trending Keywords: forex trading basics (+45%) social trading platforms (+32%) copy trading strategies (+28%) 💡 AI Recommendations: Focus on educational content in UK market... 🔧 Advanced Features ✅ Data Quality Validation Automatic retry** for failed API calls Data validation** before storage Error notifications** via Slack ⚙️ Scalability Options Multi-region support** (US, UK, DE, FR, JP) Batch processing** for large competitor lists Rate limiting** to respect API quotas 🎨 Customization Ready Modular design** - disable unused APIs Industry templates** - forex, ecommerce, SaaS Custom scoring** algorithms 📈 ROI & Performance Cost Analysis Setup time:** ~2 hours Monthly API costs:** $400-500 Time saved:** 15+ hours/week ROI:** 300%+ within first month Success Metrics Competitor insights:** 50+ data points daily Keyword opportunities:** 100+ suggestions/week Content ideas:** 20+ AI-generated topics Trend alerts:** Real-time notifications 🛡️ Troubleshooting Common Issues & Solutions | Symptom | Cause | Fix | |-------------|-----------|---------| | OpenAI timeout | Large data payload | Reduce batch size → Split processing | | Airtable 422 error | Field mismatch | Copy schema exactly | | Reddit 401 | OAuth expired | Re-authorize application | Rate Limiting Best Practices Ahrefs:** Max 1000 requests/day SEMrush:** 3000 requests/day OpenAI:** Monitor token usage 🌟 Why Choose This Template? > "From manual research to automated intelligence in 15 minutes" ✅ Production-ready - No additional coding required ✅ Cost-optimized - Uses free tiers where possible ✅ Scalable - Add competitors with one click ✅ Actionable - AI outputs ready for immediate use ✅ Community-tested - 500+ successful deployments Start your competitive intelligence today 🚀 Built with ❤️ for the n8n community
by Oneclick AI Squad
In this guide, we’ll walk you through setting up a smart workflow that triggers on new restaurant orders, extracts and formats customer and dish details from Google Sheets, uses Gemini AI to recommend dishes or offers, and sends suggestions via Telegram. Ready to automate your order processing and enhance customer experience? Let’s dive in! What’s the Goal? Automatically trigger the workflow when a new order is placed. Extract and format customer information and order details from Google Sheets. Use Gemini AI to analyze orders and recommend dishes or offers. Send personalized suggestions to customers via Telegram. Enable real-time order processing and customer engagement. By the end, you’ll have a smart system that processes orders and suggests items effortlessly. Why Does It Matter? Manual order processing and suggestion generation are inefficient and miss opportunities. Here’s why this workflow is a game changer: Real-Time Efficiency**: Instantly process orders and suggest items. Personalized Engagement**: AI-driven suggestions enhance customer satisfaction. Time-Saving Automation**: Reduce manual effort in order management. Improved Sales**: Targeted recommendations can boost order value. Think of it as your intelligent assistant for orders and customer delight. How It Works Here’s the step-by-step magic behind the automation: Step 1: New Order Trigger Trigger the workflow when a new order is detected (e.g., via a form submission). Step 2: Extract & Format Order Extract and format dish ordering details from the customer order details sheet for further processing. Step 3: Save Customer Info Save customer information (e.g., ID, name, mobile number) from the customer details sheet. Step 4: Save Dish Info Save dish details (e.g., name, quantity, price) from the customer order details sheet. Step 5: Prepare Dish Details for AI Prepare the dish details for AI analysis to generate recommendations. Step 6: Clean Data for Input to Improve AI Understanding Clean and structure the data to enhance AI comprehension. Step 7: Use Gemini AI to Recommend Dishes or Offers Utilize Gemini AI (via Google Chat Model and Think Tool) to recommend dishes or offers based on order data. Step 8: Format AI Suggestions Format the AI-generated suggestions into a Telegram-friendly message. Step 9: Send Suggestions via Telegram Send the formatted suggestions directly to the customer via Telegram. How to Use the Workflow? Importing a workflow in n8n is a straightforward process that allows you to use pre-built workflows to save time. Below is a step-by-step guide to importing the Smart Restaurant Order & Suggestion System workflow in n8n. Steps to Import a Workflow in n8n Obtain the Workflow JSON Source the Workflow: Workflows are shared as JSON files or code snippets, e.g., from the n8n community, a colleague, or exported from another n8n instance. Format: Ensure you have the workflow in JSON format, either as a file (e.g., workflow.json) or copied text. Access the n8n Workflow Editor Log in to n8n (via n8n Cloud or self-hosted instance). Navigate to the Workflows tab in the n8n dashboard. Click Add Workflow to create a blank workflow. Import the Workflow Option 1: Import via JSON Code (Clipboard): Click the three dots (⋯) in the top-right corner to open the menu. Select Import from Clipboard. Paste the JSON code into the text box. Click Import to load the workflow. Option 2: Import via JSON File: Click the three dots (⋯) in the top-right corner. Select Import from File. Choose the .json file from your computer. Click Open to import. Setup Notes Google Sheet Columns**: Customer Details Sheet: Customer id, Customer name, Customer mobile number (e.g., CUST-JW4Z8Y, ajay, 9898989898; CUST-VEITPW, akash, 9898976898). Customer Order Details Sheet: Customer id, Dish name, Dish quantity, Per unit price, Actual price (e.g., CUST-JW4Z8Y, Tandoori Chicken, 1, 250, 250; CUST-VEITPW, Masala Dosa, 1, 150, 150). Google Sheets Credentials**: Configure OAuth2 settings in the extract and save nodes with your Google Sheet ID and credentials. Gemini AI**: Set up the Gemini AI node with Google Chat Model and Think Tool credentials. Telegram Integration**: Authorize the Send Suggestions node with Telegram API credentials and the customer’s chat ID or mobile number. Trigger Setup**: Configure the New Order Trigger node to detect new orders (e.g., via form or webhook).
by MattF
This workflow tracks week-over-week changes in Google Search Console performance and highlights the top movers across keyword segments like brand, nonbrand, and content categories. Instead of providing a routine check, it focuses on significant movements by: Sending a Slack alert only if a query crosses a defined movement threshold. Emailing a structured report with the Top 25 increases and Top 25 decreases for clicks, including % changes and linked URLs It’s designed to surface the most important shifts, helping SEO teams catch big wins, losses, or anomalies early. How it works Runs weekly (e.g. every Monday) to compare last week’s GSC data to the week prior. Segments traffic based on query and page (e.g. brand terms, category page URLs, etc.). Calculates delta and % change for clicks, CTR, impressions, and position. Filters and flags top movers with large shifts (default: ±200 clicks and ±30%). Sends Slack alerts only if meaningful changes are detected. Emails a full HTML table report showing the Top 25 up/down queries per segment. Setup steps Requires a connected Google Search Console account. Slack alert is included by default (can be replaced with email, webhook, or other tools). Customize your brand terms and URL filters to match your segments (e.g. recipes, blog, category pages). Typical setup time: 15–25 minutes depending on the number of segments and filters you want. Note: “Recipes” is used in the example to show how to segment by content type. You can update this to reflect your own site’s structure.
by Arlin Perez
📨 Categorize and Label Existing Gmail Emails Automatically with GPT-4o mini 👥 Who's it for This workflow is perfect for individuals or teams who want to sort and label existing emails in their Gmail inbox 🗃️ using AI. Ideal for cleaning up unlabeled emails in bulk — no coding required! For sorting incoming emails messages in your gmail inbox, please use this free workflow: Categorize and Label Incoming Gmail Emails Automatically with GPT-4o mini 🤖 What it does It manually processes a selected number of existing Gmail emails, skips those that already have labels, sends the content to an AI Agent powered by GPT-4o mini 🧠, and applies a relevant Gmail label based on the email content. All labels must already exist in Gmail. ⚙️ How it works ▶️ Manual Trigger – The workflow starts manually when you click "Execute Workflow". 📥 Gmail Get Many Messages – Pulls a batch of existing inbox emails (default: 50). 🚫 Filter – Skips emails that already have one or more labels. 🧠 AI Agent (GPT-4o mini) – Analyzes the content and assigns a category. 🧾 Structured Output Parser – Converts the AI output into structured JSON. 🔀 Switch Node – Routes each email to the right label based on the AI result. 🏷️ Gmail Nodes – Apply the correct Gmail label to the email. 📋 Requirements Gmail account connected to n8n Gmail labels must be manually created in your inbox beforehand Labels must exactly match the category names defined in the AI prompt OpenAI credentials with GPT-4o mini access n8n's AI Agent & Structured Output Parser nodes 🛠️ How to set up In your Gmail account, create all the labels you want to use for categorizing emails Open the workflow and adjust the email fetch limit in the Gmail node (e.g., 50, 100) Confirm that the Filter skips emails that already have labels Define your categories in the AI Agent prompt — these must match the Gmail labels exactly In the Switch Node, create a condition for each label/category Ensure each Gmail Label Node applies the correct existing label Save the workflow and run it manually whenever you want to organize your inbox ✅ 🎨 How to customize the workflow Add or remove categories in the AI prompt & Switch Node Adjust the batch size of emails to process more or fewer per run Fine-tune the AI prompt to suit your inbox type (e.g., work, personal, client support)
by Adrian Bent
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. This workflow scrapes job listings on indeed via Apify, automatically gets that dataset, extracts information about the listing filters jobs off relevance, finds a decision maker at the company and updates a database (google sheets) with that info for outreach. All you need to do is run Apify actor then the database will update with the processed data. Benefits: Complete Job search Automation - A webhook monitors the Apify actor which sends a integration and starts the process AI-Powered Filter - Uses ChatGPT to analyze content/context, identify company goals, and filters based on job description Smart Duplicate Prevention - Automatically tracks processed job listings in a database to avoid redundancy Multi-Platform Intelligence - Combines Indeed scraping, web research via Tavily, and enriches each listing Niche Focus - Process content from multiple niches 6 currently (hardcoded) but can be changed to fit other niches (just prompt the "job filter" node) How It Works: Indeed Job Discovery: Search and apply filter for relevant job listings, copy and use URL in Apify Uses Apify's Indeed job scraper to scrape job listings from the URL of interest Automatically scrapes the information, stores it in a dataset and initiates a integration Oncoming Data Processing: Loops over 500 items (can be changed) with a batch size of 55 items (can be changed) to avoid running into API timeouts. Multiple filters to ensure all fields are scrapped with our required metrics (website must exist and number of employees < 250) Duplicate job listings are removed from oncoming batch to be processed Job Analysis & Filter: An additional filter to remove any job listing from the oncoming batch if it already exists in the google sheets database Then all new job listings gets pasted to chatGPT which uses information about the job post/description to determine if it is relevant to us All relevant jobs get a new field "verdict" which is either true or false and we keep the ones where verdict is true Enrich & Update Database: Uses Tavily to search for a decision maker (doesn't always finds one) and populate a row in google sheet with information about the job listing, the company and a decision maker at that company. Waits for 1 minute and 30 seconds to avoid google sheets and chatGPT API timeouts then loops back to the next batch to start filtering again until all job listings are processed Required Google Sheets Database Setup: Before running this workflow, create a Google Sheets database with these exact column headers: Essential Columns: jobUrl - Unique identifier for job listings title - Position Title descriptionText - Description of job listing hiringDemand/isHighVolumeHiring - Are they hiring at high volume? hiringDemand/isUrgentHire - Are they hiring at high urgency? isRemote - Is this job remote? jobType/0 - Job type: In person, Remote, Part-time, etc. companyCeo/name - CEO name collected from Tavily's search icebreaker - Column for holding custom icebreakers for each job listing (Not completed in the workflow. I will upload another that does this called "Personalized IJSFE") scrapedCeo - CEO name collected from Apify Scraper email - Email listed on for job listing companyName - Name of company that posted the job companyDescription - Description of the company that posted the job companyLinks/corporateWebsite - Website of the company that posted the job companyNumEmployees - Number of employees the company listed that they have location/country - Location of where the job is to take place salary/salaryText - Salary on job listing Setup Instructions: Create a new Google Sheet with these column headers in the first row Name the sheet whatever you please Connect your Google Sheets OAuth credentials in n8n Update the document ID in the workflow nodes The merge logic relies on the id column to prevent duplicate processing, so this structure is essential for the workflow to function correctly. Feel free to reach out for additional help or clarification at my gmail: terflix45@gmail.com and I'll get back to you as soon as I can. Set Up Steps: Configure Apify Integration: Sign up for an Apify account and obtain API key Get indeed job scraper actor and use Apify's integration to send a HTTP request to your n8n webhook (if test URL doesn't work use production URL) Use Apify node with Resource: Dataset, Operation: Get items and use your Api key as your credentials Set Up AI Services: Add OpenAI API credentials for job filtering Add Tavily API credentials for company research Set up appropriate rate limiting for cost control Database Configuration: Create Google Sheets database with provided column structure Connect Google Sheets OAuth credentials Configure the merge logic for duplicate detection Content Filtering Setup: Customize the AI prompts for your specific niche, requirements or interest Adjust the filtering criteria to fit your needs