by Nick Saraev
# AI Facebook Ad Spy Tool with Apify, OpenAI, Gemini & Google Sheets

Categories: Competitive Intelligence, Marketing Automation, AI Analysis

This workflow creates a comprehensive Facebook ad spy tool that scrapes competitor ads from Facebook's ad library and generates detailed analysis with rewritten versions. The system processes text, image, and video ads using different AI models, providing strategic intelligence for PPC agencies and marketers. Built to be sold as a premium service for $2,000+, this tool combines web scraping, multi-modal AI analysis, and competitor intelligence into one powerful automation.

## Benefits

- **Complete Competitive Intelligence** - Analyze competitor strategies across all ad formats (text, image, video)
- **Multi-Modal AI Analysis** - Uses GPT-4 Vision for images and Gemini for video content understanding
- **Automated Ad Rewriting** - Generates inspired variations of successful competitor ads
- **Quality Filtering** - Targets high-performing advertisers with significant page likes
- **Scalable Processing** - Handles hundreds of competitor ads with detailed strategic analysis
- **Premium Service Potential** - Easily sold to agencies and marketers for $2,000+ implementations

## How It Works

**Facebook Ad Library Scraping:**
- Connects to Facebook's public ad library through Apify's specialized scraper
- Searches for active ads using customizable keywords and targeting parameters
- Extracts comprehensive ad data including creative assets, targeting info, and engagement metrics
- Filters results to focus on high-quality advertisers with substantial page followings

**Intelligent Content Routing:**
- Automatically categorizes ads into text-only, image-based, or video content types
- Routes each ad type to a specialized processing pipeline optimized for that content format
- Ensures appropriate AI models are used for each type of creative analysis
- Maintains data integrity while processing different content formats simultaneously

**Advanced Video Analysis Pipeline:**
- Downloads video ads directly from Facebook's content delivery network
- Uploads videos to Google Drive for temporary storage and processing
- Initiates Gemini AI video upload sessions for multi-modal analysis
- Uses Gemini's advanced video understanding to generate detailed content descriptions
- Processes video narrative, visual elements, messaging strategy, and target audience insights

**Image and Text Processing:**
- Analyzes image ads using GPT-4 Vision for comprehensive visual content understanding
- Processes text-only ads using GPT-4 for messaging strategy and copywriting analysis
- Identifies key persuasion techniques, target demographics, and messaging frameworks
- Generates detailed competitive intelligence reports for each ad format

**Strategic Intelligence Generation:**
- Creates comprehensive summaries analyzing competitor messaging strategies and target audiences
- Generates rewritten ad copy that captures successful elements while avoiding direct copying
- Produces recreation prompts for images and videos that can be used with AI generation tools
- Organizes all insights in a structured Google Sheets database for easy analysis and reporting
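The content-routing step can be as simple as a Code node that tags each scraped ad before the Switch node. A minimal sketch, assuming the Apify items expose `snapshot.videos` and `snapshot.images` arrays - these field names vary by actor version, so check your scraper's actual output and adjust:

```javascript
// Tag each scraped ad as video, image, or text so the Switch node can route it.
for (const item of $input.all()) {
  const snap = item.json.snapshot ?? {}; // assumed field - varies by actor version
  if (Array.isArray(snap.videos) && snap.videos.length > 0) {
    item.json.type = 'video';
  } else if (Array.isArray(snap.images) && snap.images.length > 0) {
    item.json.type = 'image';
  } else {
    item.json.type = 'text'; // no creative assets found → text-only ad
  }
}
return $input.all();
```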
## Required Setup Configuration

**Apify Integration:**
- Sign up for an Apify account and obtain an API key
- Replace `<your-apify-api-key-here>` in the "Run Ad Library Scraper" node
- Customize the Facebook Ad Library search URLs with your target keywords and regions

**AI Service Configuration:**
- **OpenAI API**: Set up for text analysis and image understanding with GPT-4 Vision
- **Gemini API**: Configure for advanced video content analysis and description
- Replace `<your-gemini-api-key-here>` in all Gemini-related nodes

**Google Services Setup:**
- **Google Drive**: Configure OAuth for temporary video storage during Gemini processing
- **Google Sheets**: Create a results database with the proper column structure for ad intelligence storage

**Facebook Ad Library Search Configuration:**
- Customize the search parameters in the Apify scraper

## Google Sheets Database Structure

Create a sheet with these columns:

- `ad_archive_id` - Unique Facebook ad identifier
- `page_id` - Advertiser's Facebook page ID
- `page_name` - Advertiser's business name
- `page_url` - Link to the advertiser's Facebook page
- `type` - Ad format (text, image, or video)
- `date_added` - When the ad was analyzed
- `summary` - Detailed competitive intelligence analysis
- `rewritten_ad_copy` - AI-generated inspired version
- `image_prompt` - Description for recreating image ads
- `video_prompt` - Description for recreating video ads

## Business Use Cases

- **PPC Agencies** - Offer comprehensive competitor analysis services to clients for strategic advantage
- **Marketing Teams** - Research competitor strategies and messaging before launching new campaigns
- **E-commerce Businesses** - Analyze successful ads in your industry for creative inspiration
- **SaaS Companies** - Study how competitors position their products and target audiences
- **Course Creators** - Research educational content marketing approaches and messaging strategies
- **Affiliate Marketers** - Identify successful promotional strategies and high-converting ad formats

**Difficulty Level:** Advanced
**Estimated Build Time:** 3-4 hours
**Monthly Operating Cost:** ~$200 (Apify + OpenAI + Gemini + Google Workspace APIs)

## Watch My Complete Build Process

Want to see exactly how I built this entire Facebook ad spy system from scratch? I walk through the complete development process live, including API integrations, multi-modal AI setup, error handling, and the exact business strategy for selling this as a premium service.

🎥 **Watch My Live Build:** "Build A Facebook Ads Spy Tool With N8N (Sell for $2k+)"

This comprehensive tutorial shows the real development process - including complex API orchestration, multi-modal AI integration, and proven strategies for monetizing competitive intelligence systems.
## Set Up Steps

**Apify Scraper Configuration:**
- Set up an Apify account and configure the Facebook Ad Library scraper
- Customize search parameters for your target industries and regions
- Configure result limits and filtering parameters for quality control
- Test the scraper with sample searches to verify data quality

**Multi-Modal AI Setup:**
- Configure OpenAI API credentials for text and image analysis
- Set up Gemini API access for advanced video content understanding
- Configure appropriate rate limits and error handling for API stability
- Test AI analysis with sample ads to optimize prompt quality

**Google Services Integration:**
- Set up Google Drive OAuth for temporary video storage during processing
- Create the Google Sheets database with the proper column structure for intelligence storage
- Configure sharing permissions and access controls for team collaboration
- Test the complete data flow from scraping to final intelligence reports

**Quality Control and Filtering:**
- Configure the page likes threshold in the "Filter For Likes" node (1,000+ recommended for quality)
- Adjust the content routing logic in the Switch node based on your analysis needs
- Set up error handling and retry logic for reliable large-scale processing
- Test the complete workflow with various ad types to ensure proper routing

**Advanced Customization:**
- Customize AI prompts for your specific industry analysis needs
- Configure additional filtering criteria beyond page likes
- Set up automated scheduling for regular competitor monitoring
- Add custom fields to the database for tracking specific competitive metrics

## Advanced Features

Scale the system with additional capabilities:

- **Industry-Specific Analysis** - Customize prompts and filters for different verticals
- **Trend Tracking** - Monitor messaging changes over time for strategic insights
- **Performance Correlation** - Cross-reference ad engagement with business outcomes
- **Alert Systems** - Notify when competitors launch new campaign types
- **Custom Reporting** - Generate client-ready intelligence reports automatically
- **Integration Extensions** - Connect to CRM and marketing platforms for strategic workflows

## Important Considerations

- **API Rate Limits** - Built-in delays and error handling prevent service interruptions
- **Content Rights** - The system generates inspired variations, not direct copies, for legal compliance
- **Data Storage** - Organize the intelligence database for easy client reporting and analysis
- **Scalability** - Batch processing handles hundreds of ads efficiently without blocking
- **Quality Assurance** - Filtering logic ensures analysis focuses on successful, high-quality advertisers

## Why This System Works

The competitive advantage lies in comprehensive multi-modal analysis:

- **Complete format coverage** - analyzes text, image, and video ads with appropriate AI models
- **Strategic depth** - goes beyond basic scraping to provide actionable intelligence
- **Automation scale** - processes competitor research that would take weeks manually
- **Premium positioning** - advanced AI analysis justifies higher service pricing
- **Immediate value** - clients receive actionable insights within hours of setup

## Check Out My Channel

For more advanced automation systems that generate real business results and premium service opportunities, explore my YouTube channel where I share proven strategies for building profitable automation businesses.
by Nick Saraev
# Google Maps Email Scraper System

Categories: Lead Generation, Web Scraping, Business Automation

This workflow creates a completely free Google Maps email scraping system that extracts unlimited business emails without requiring expensive third-party APIs. Built entirely in N8N using simple HTTP requests and JavaScript, this system can generate thousands of targeted leads for any industry or location while operating at a 99% free cost structure.

## Benefits

- **Zero API Costs** - Operates entirely through free Google Maps scraping without expensive third-party services
- **Unlimited Lead Generation** - Extract emails from thousands of Google Maps listings across any industry
- **Geographic Targeting** - Search by specific cities, regions, or business types for precise lead targeting
- **Complete Automation** - From search query to organized email list with minimal manual intervention
- **Built-in Data Cleaning** - Automatic duplicate removal, filtering, and data validation
- **Scalable Processing** - Handle hundreds of businesses per search with intelligent rate limiting

## How It Works

**Google Maps Search Integration:**
- Uses strategic HTTP requests to Google Maps search URLs
- Processes search queries like "Calgary + dentist" to extract business listings
- Bypasses API restrictions through direct HTML scraping techniques

**Intelligent URL Extraction:**
- Custom JavaScript regex patterns extract website URLs from Google Maps data
- Filters out irrelevant domains (Google, schema, static files)
- Returns a clean list of actual business websites for processing

**Smart Website Processing:**
- Loop-based architecture prevents IP blocking through intelligent batching
- Built-in delays and redirect handling for reliable scraping
- Processes each website individually with error handling

**Email Pattern Recognition:**
- Advanced regex patterns identify email addresses within website HTML
- Extracts contact emails, info emails, and administrative addresses
- Handles multiple email formats and validation patterns

**Data Aggregation & Cleaning:**
- Automatically removes duplicate emails across all processed websites
- Filters null entries and invalid email formats
- Exports clean, organized email lists to Google Sheets
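The two Code nodes boil down to a couple of regex passes. A minimal sketch, assuming the HTTP Request nodes return the raw page HTML in a `data` field and using simplified patterns for illustration (the workflow's actual patterns may differ):

```javascript
// Two helpers mirroring the workflow's Code nodes. Both assume the HTTP
// Request node returns raw HTML in a `data` field - adjust to your output.

// Node 1: candidate business website URLs from the Google Maps results page.
function extractWebsiteUrls(mapsHtml) {
  return [...new Set(
    (mapsHtml.match(/https?:\/\/[a-zA-Z0-9.-]+\.[a-z]{2,}[^\s"'<>]*/g) ?? [])
      // Drop Google, static-asset, and schema noise - keep real business sites.
      .filter(u => !/google|gstatic|schema\.org|googleapis/.test(u))
  )];
}

// Node 2 (runs per website): email addresses found anywhere in the page HTML.
function extractEmails(siteHtml) {
  return [...new Set(
    siteHtml.match(/[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g) ?? []
  )];
}

// In an n8n Code node you would return items, e.g.:
return extractEmails($input.first().json.data ?? '').map(email => ({ json: { email } }));
```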
## Required Google Sheets Setup

Create a Google Sheet with these exact column headers:

- **Search Tracking Sheet:** `searches` - Contains your search queries (e.g., "Calgary dentist", "Miami lawyers")
- **Email Results Sheet:** `emails` - Contains extracted email addresses from all processed websites

**Setup Instructions:**
1. Create a Google Sheet with two tabs: "searches" and "emails"
2. Add your target search queries to the searches tab (one per row)
3. Connect Google Sheets OAuth credentials in n8n
4. Update the Google Sheets document ID in all sheet nodes

The workflow reads search queries from the first sheet and exports results to the second sheet automatically.

## Business Use Cases

- **Local Service Providers** - Find competitors and potential partners in specific geographic areas
- **B2B Sales Teams** - Generate targeted prospect lists for cold outreach campaigns
- **Marketing Agencies** - Build industry-specific lead databases for client campaigns
- **Real Estate Professionals** - Identify businesses in target neighborhoods for commercial opportunities
- **Franchise Development** - Research potential markets and existing competition
- **Market Research** - Analyze business density and contact information across regions

## Revenue Potential

This system transforms lead generation economics:

- $0 per lead vs. $2-5 per lead from paid databases
- Process 1,000+ leads daily without hitting API limits
- Sell as a service for $500-2,000 per industry/location
- Perfect for agencies offering lead generation to local businesses

**Difficulty Level:** Intermediate
**Estimated Build Time:** 1-2 hours
**Monthly Operating Cost:** $0 (completely free)

## Watch My Complete Build Process

Want to watch me build this entire system live from scratch? I walk through every single step - including the JavaScript code, regex patterns, error handling, and all the debugging that goes into creating a bulletproof scraping system.

🎥 **Watch My Live Build:** "Scrape Unlimited Leads WITHOUT Paying for APIs (99% FREE)"

This comprehensive tutorial shows the real development process - including writing custom JavaScript, handling rate limits, and building systems that actually work at scale without getting blocked.

## Set Up Steps

**Basic Workflow Architecture:**
- Set up a manual trigger for testing and Google Sheets integration
- Configure the initial HTTP request node for Google Maps searches
- Enable SSL ignore and response headers for reliable scraping

**URL Extraction Code Setup:**
- Configure a JavaScript code node with custom regex patterns
- Set up input data processing from Google Maps HTML responses
- Implement URL filtering logic to remove irrelevant domains

**Website Processing Pipeline:**
- Add a "Split in Batches" node for intelligent loop processing
- Configure HTTP request nodes with proper delays and redirect handling
- Set up error handling for websites that can't be scraped

**Email Extraction System:**
- Implement a JavaScript code node with email-specific regex patterns
- Configure email validation and format checking
- Set up data aggregation for multiple emails per website

**Data Cleaning & Export:**
- Configure filtering nodes to remove null entries and duplicates
- Set up a "Split Out" node to aggregate emails into a single list
- Connect the Google Sheets integration for organized data export

**Testing & Optimization:**
- Use limit nodes during testing to prevent IP blocking
- Test with small batches before scaling to full searches
- Implement proxy integration for high-volume usage

## Advanced Optimizations

Scale the system with:

- **Multi-Page Scraping:** Extract URLs from homepages, then scrape contact pages for more emails
- **Proxy Integration:** Add residential proxies for unlimited scraping without rate limits
- **Industry Templates:** Create pre-configured searches for different business types
- **Contact Information Expansion:** Extract phone numbers, addresses, and social media profiles
- **CRM Integration:** Automatically add leads to sales pipelines and marketing sequences

## Important Considerations

- **Rate Limiting:** Built-in delays prevent IP blocking during normal usage
- **Scalability:** For high-volume usage, consider proxy services for unlimited requests
- **Compliance:** Ensure proper usage rights for extracted contact information
- **Data Quality:** The system includes filtering, but manual verification is recommended for critical campaigns

## Check Out My Channel

For more advanced automation systems and business-building strategies that generate real revenue, explore my YouTube channel where I share proven automation techniques used by successful agencies and entrepreneurs.
by Mario
## Purpose

This workflow snippet allows for advanced error catching during retry attempts. There are cases where you want to check if an item exists first, so you can determine the following actions. Some APIs (e.g. Todoist for completed tasks) do not provide an endpoint for such a check, which is why you would work with the error branch instead - only that this does not work well in combination with the built-in retry functionality.

## How it works

- Instead of the built-in retry function of a node, a custom loop is used to get more granular control between the iterations
- If the main executed node fails, the error can be filtered for an expected error, which can trigger a separate action
- The retries only happen if an unexpected error occurred
- The workflow only stops if the defined number of retries is exceeded

## Setup

- Copy the nodes into your existing workflow
- Replace the "Replace me" placeholder with the node you want to apply the retry logic to
- Follow the sticky notes for more instructions and optional settings
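As a rough sketch of the error filter inside the loop - the matching pattern here is an assumption; adapt it to whatever your API actually returns on an "expected" failure:

```javascript
// Code node after the failing node (with "Continue on Error" enabled):
// classify the failure and track the remaining retries.
const error = $json.error ?? {};
// Assumed pattern for the expected case (e.g. item does not exist) -
// replace with the message/status your API really returns.
const expectedError = /404|not found/i.test(error.message ?? '');
return [{
  json: {
    expectedError,                               // true → take the alternate branch
    retriesLeft: ($json.retriesLeft ?? 3) - 1,   // 0 → stop the loop and fail
  },
}];
```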
by Julian Kaiser
This automated workflow scrapes and processes the monthly "Who is Hiring" thread from Hacker News, transforming raw job listings into structured data for analysis or integration with other systems. Perfect for job seekers, recruiters, or anyone looking to monitor tech job market trends.

## How it works

1. Automatically fetches the latest "Who is Hiring" thread from Hacker News
2. Extracts and cleans relevant job posting data using the HN API
3. Splits and processes individual job listings into a structured format
4. Parses key information like location, role, requirements, and company details
5. Outputs clean, structured data ready for analysis or export

## Set up steps

1. Configure API access to [Hacker News](https://github.com/HackerNews/API) (no authentication required)
2. Follow the steps to get your cURL command from https://hn.algolia.com/
3. Set up the desired output format (JSON structured data or custom format)
4. Optional: Configure additional parsing rules for specific job listing information
5. Optional: Set up integration with preferred storage or analysis tools

The workflow transforms unstructured job listings into clean, structured data following this pattern:

- **Input:** Raw HN thread comments
- **Process:** Extract, clean, and parse text
- **Output:** Structured job listing data

This template saves hours of manual work collecting and organizing job listings, making it easier to track and analyze tech job opportunities from Hacker News's popular monthly hiring threads.
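For orientation, a minimal sketch of the two public API calls involved - finding the latest thread via Algolia's HN search, then pulling the item through the official HN API (both unauthenticated):

```javascript
// Locate the most recent "Who is hiring?" story via the Algolia HN search API,
// then fetch the full item (including top-level comment IDs) from the HN API.
const search = await fetch(
  'https://hn.algolia.com/api/v1/search_by_date?query=%22who%20is%20hiring%22&tags=story'
).then(r => r.json());

const threadId = search.hits[0].objectID; // newest matching story
const thread = await fetch(
  `https://hacker-news.firebaseio.com/v0/item/${threadId}.json`
).then(r => r.json());

console.log(thread.title, '-', (thread.kids ?? []).length, 'top-level comments');
```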
by Tiartyos
# Voice Cloning Workflow - Zyphra Zonos API

## Who is this for?

This workflow is designed for developers, content creators, and businesses looking to automate high-quality voice synthesis using AI voice cloning technology.

## What problem does this solve?

It automates the process of generating natural-sounding speech from text using a sample voice file, eliminating the need for manual voice recording and providing consistent voice output for applications like audiobooks, virtual assistants, or content localization.

## What this workflow does

The workflow receives text and voice cloning parameters via webhook, reads a sample voice file from your storage, sends the data to Zyphra's Zonos API for voice synthesis, and saves the generated audio file to your specified output location.

## Prerequisites

You'll need:

- API key from Zyphra (obtain from https://playground.zyphra.com/settings/api-keys)
- Account registration at https://playground.zyphra.com
- Sample voice file stored on accessible disk/cloud storage
- n8n instance running with webhook capabilities

## Setup

1. Configure your Zyphra API key in the "Call Zyphra Clone API" node under Header Parameters (Name: `X-API-Key`, Value: your-api-key)
2. Ensure your sample voice files are accessible at the paths you'll specify
3. Test that the webhook endpoint is accessible

## Supported Audio Formats

The API supports multiple output formats through the `mime_type` parameter:

- **WebM** (default) - audio/webm
- **Ogg** - audio/ogg
- **WAV** - audio/wav
- **MP3** - audio/mp3 or audio/mpeg
- **MP4/AAC** - audio/mp4 or audio/aac

## Usage Example

Endpoint: `POST http://localhost:5678/webhook-test/voice-clone`

Headers: `Content-Type: application/json`

Request Body:

```json
{
  "text": "Hello there! This voice sounds just like the sample!",
  "speaking_rate": 18,
  "sample_voice_path": "/data/output/sampleVoice.wav",
  "output_path": "/data/output/",
  "language_iso_code": "en-us",
  "mime_type": "audio/wav",
  "model": "zonos-v0.1-transformer",
  "emotion": {
    "happiness": 0.8,
    "neutral": 0.3,
    "sadness": 0.05,
    "disgust": 0.05,
    "fear": 0.05,
    "surprise": 0.05,
    "anger": 0.05,
    "other": 0.5
  }
}
```

## Parameters

**Required Parameters**

- **text**: Text to synthesize into speech
- **sample_voice_path**: Path to your voice sample file
- **output_path**: Directory where generated audio will be saved

**Optional Parameters (with defaults)**

- **speaking_rate**: `15` - Speech speed
- **language_iso_code**: `"en-us"` - Language code
- **mime_type**: `"audio/wav"` - Output audio format
- **model**: `"zonos-v0.1-transformer"` - AI model to use
- **emotion**: Object with emotion levels (0-1 scale)
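To try it end to end, a quick caller sketch against the endpoint above - host, port, and paths mirror the usage example; adjust them for your instance:

```javascript
// Fire a test request at the voice-clone webhook and report the result.
const res = await fetch('http://localhost:5678/webhook-test/voice-clone', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    text: 'Hello there! This voice sounds just like the sample!',
    sample_voice_path: '/data/output/sampleVoice.wav',
    output_path: '/data/output/',
    mime_type: 'audio/wav',
  }),
});
console.log(res.status); // 200 once the workflow has accepted the request
```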
by Franz
# 🕸️ Dynamic Website Change Monitor with Smart Email Alerts

Never miss important website updates again! This workflow automatically tracks changes on dynamic websites (think React apps, JavaScript-heavy sites) and sends you instant email notifications when something changes. Perfect for keeping tabs on competitors, monitoring product updates, or staying on top of important announcements.

## ✨ What makes this special?

- 🚀 **Handles Dynamic Websites:** Uses the Firecrawl API to scrape JavaScript-rendered content that basic scrapers can't touch
- 📧 **Smart Email Alerts:** Only sends notifications when content actually changes (no spam!)
- 📊 **Historical Tracking:** Keeps a complete log of all changes in Google Sheets
- 🛡️ **Bulletproof:** Continues working even if one part fails
- ⚡ **Ready to Deploy:** Webhook-triggered, perfect for cron jobs or external schedulers

## 🎯 Perfect for monitoring:

- Competitor pricing pages
- Job board postings
- Product availability updates
- News sites for breaking stories
- API documentation changes
- Terms of service updates

## 🛠️ What you'll need to get started:

**API Accounts & Keys:**

1. **Firecrawl Account** 🔥
   - Sign up at firecrawl.dev
   - Grab your API key from the dashboard
   - Create a "Bearer Auth" credential in n8n
2. **Google Cloud Setup** ☁️
   - Enable the Google Sheets API
   - Enable the Gmail API
   - Set up OAuth2 credentials
   - Add both as credentials in n8n
3. **Google Sheets Document** 📋
   - Create a new spreadsheet
   - Add two tabs: "Log" and "comparison"
   - Follow the structure outlined in the workflow notes

## 🚀 How it works:

1. **Webhook receives trigger** → Starts the monitoring process
2. **Firecrawl scrapes website** → Gets fresh content (even JavaScript-rendered!)
3. **Smart comparison** → Checks against previously stored content
4. **Change detected?** → If yes, send email + log everything
5. **Update storage** → Prepares for the next monitoring cycle

## ⚙️ Setup Steps:

1. Import this workflow into your n8n instance
2. Configure credentials for Firecrawl, Google Sheets, and Gmail
3. Update the target URL in the Firecrawl node
4. Set your email address in the Gmail node
5. Create your Google Sheets with the required structure
6. Test it manually first, then activate!

## 🎨 Customize it your way:

- **Target any website** by updating the URL
- **Change email templates** to match your style
- **Adjust monitoring frequency** with external cron jobs
- **Switch between markdown/HTML** extraction formats
- **Fine-tune change detection** sensitivity

## 🔧 Troubleshooting:

- **Firecrawl errors?** Check your API key and rate limits
- **Google Sheets issues?** Verify OAuth permissions and sheet structure
- **Email not sending?** Check Gmail API quotas and spam folders
- **Webhook problems?** Make sure the workflow is activated

Ready to never miss another website change? Let's get this automation running! 🎉
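If you're curious what the Firecrawl node sends under the hood, it boils down to something like this sketch (v1 scrape endpoint; the target URL and env var are placeholders - check Firecrawl's current docs for the exact response shape):

```javascript
// Scrape a JavaScript-rendered page via Firecrawl and get it back as markdown.
const res = await fetch('https://api.firecrawl.dev/v1/scrape', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`, // your Bearer Auth key
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    url: 'https://example.com/pricing', // the page you want to monitor
    formats: ['markdown'],              // or 'html', matching the workflow setting
  }),
});
const { data } = await res.json();
console.log(data.markdown.slice(0, 200)); // compare against the stored copy
```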
by David Olusola
# AI-Powered Airtable Contact Manager

## Overview

The AI-Powered Airtable Contact Manager is an intelligent n8n workflow that enables AI assistants to seamlessly manage contact data in Airtable through natural language interactions. Using the Model Context Protocol (MCP), this workflow bridges the gap between conversational AI and structured data management.

## How It Works

This workflow creates a powerful AI-to-database interface that allows users to manage their Airtable contacts through natural language commands. Here's the complete flow:

**1. AI Interaction Layer**
- Users interact with an AI assistant using natural language
- Examples: "Add John Doe to contacts", "Find all contacts assigned to Sarah", "Show me contact details for ID xyz"

**2. MCP Server Trigger**
- The AI assistant processes the user's request and identifies the needed operation
- Sends structured commands to the n8n workflow via the MCP (Model Context Protocol)
- Acts as the intelligent routing system for all contact operations

**3. Airtable Operations**

The workflow provides four core contact management functions:

- 🔍 **Get Record:** Retrieves specific contact details using a Record ID
  - Input: Record ID from AI
  - Output: Complete contact information (Name, Email, Assignee, Status)
- ➕ **Create Record:** Adds new contacts to the database
  - Input: Contact details (Name, Email, Assignee)
  - Output: New record with auto-generated ID and default status
- 🗑️ **Delete Record:** Removes contacts permanently
  - Input: Record ID to delete
  - Output: Confirmation of successful deletion
- 🔎 **Search Records:** Finds contacts using flexible criteria
  - Input: Airtable formula for filtering
  - Output: All matching contact records

**4. Smart Data Handling**
- The workflow uses AI-powered field mapping with `$fromAI()` functions
- Automatically extracts relevant information from natural language requests
- Maintains data integrity with proper field validation
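For instance, the Create Record node can map its fields with `$fromAI()` expressions along these lines - a sketch only; the key names and descriptions are illustrative, matched to the table schema in the setup steps below:

```json
{
  "Name":     "={{ $fromAI('name', 'Full name of the contact', 'string') }}",
  "email":    "={{ $fromAI('email', 'Email address of the contact', 'string') }}",
  "Assignee": "={{ $fromAI('assignee', 'Person the contact is assigned to', 'string') }}"
}
```

Each value is resolved by the AI agent at call time via n8n's `$fromAI(key, description, type)` helper.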
## Setup Steps

**Prerequisites**
- n8n instance (cloud or self-hosted)
- Airtable account with API access
- MCP-compatible AI system

**Step 1: Airtable Preparation**
1. **Create an Airtable Base:** Set up a new base or use an existing one, and note your Base ID (starts with `app`)
2. **Set Up the Contact Table:** Create a table with these fields, and note your Table ID (starts with `tbl`):
   - Name (Single line text)
   - email (Email)
   - Assignee (Single line text)
   - Status (Single select: Todo, In progress, Done)
3. **Generate an API Token:**
   - Go to https://airtable.com/developers/web/api/introduction
   - Create a personal access token with full permissions
   - Save this token securely

**Step 2: n8n Configuration**
1. **Import the Workflow:** Copy the enhanced JSON workflow and import it into your n8n instance
2. **Configure Airtable Credentials:**
   - Go to Credentials in n8n
   - Create a new "Airtable Personal Access Token" credential
   - Enter your Airtable API token
   - Name it "full access" (or update the credential references in the workflow)
3. **Update Base and Table IDs:**
   - Replace `YOUR_AIRTABLE_BASE_ID` with your actual Base ID (starts with `app`)
   - Replace `YOUR_AIRTABLE_TABLE_ID` with your actual Table ID (starts with `tbl`)
   - Update these in all four Airtable nodes
4. **Update Credential References:**
   - Replace `your-airtable-credential-id` with your actual credential ID
   - Or rename your credential to match "Airtable API Token"

**Step 3: MCP Integration**
1. **Configure the MCP Server:**
   - Set up your MCP server to communicate with n8n
   - Replace `your-webhook-path-here` and `your-webhook-id-here` with your actual webhook details
   - Configure your AI system to use this workflow
2. **Update Node IDs (Optional):**
   - The workflow uses placeholder node IDs for privacy
   - n8n will auto-generate new IDs when you import
   - No action needed unless you're referencing specific nodes
3. **Test the Integration:**
   - Activate the workflow in n8n
   - Test each operation through your AI interface
   - Verify data flows correctly between the AI and Airtable

**Step 4: Customization (Optional)**
- **Add More Fields:** Extend the Airtable schema as needed, update the Create Record node field mappings, and modify the Search Record filters
- **Enhanced Error Handling:** Add error handling nodes, set up notifications for failed operations, and implement retry logic for reliability

## Usage Examples

Once set up, users can interact with the system naturally:

**Creating Contacts:**
- "Add Sarah Johnson with email sarah@company.com, assign to Mike"
- "Create a new contact for David Wilson, email david@startup.io"

**Finding Contacts:**
- "Show me all contacts assigned to Jennifer"
- "Find contacts with status 'In progress'"
- "Search for contacts with gmail addresses"

**Managing Records:**
- "Get details for contact rec123ABC"
- "Delete the contact with ID rec456DEF"
- "Update John's status to Done"

## Benefits

- **Natural Language Interface**: No technical knowledge required
- **Automated Data Entry**: Reduces manual work and errors
- **Flexible Searching**: Find contacts using any criteria
- **AI-Powered**: Leverages advanced language understanding
- **Scalable**: Easily extend with more operations
- **Integrated**: Works seamlessly with existing Airtable workflows

## Technical Notes

- Uses n8n's `$fromAI()` function for intelligent data extraction
- Implements MCP for standardized AI-to-automation communication
- Supports Airtable's formula syntax for complex searches
- Maintains security through proper credential management
- Designed for high reliability with error handling capabilities

This workflow transforms contact management from a manual, time-consuming task into an effortless, conversational experience powered by AI.
by David Ashby
Complete MCP server exposing 2 Wayback API operations to AI agents.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Credentials: Add Wayback API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the Wayback API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.archive.org
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (2 total)

🔧 Wayback (2 endpoints)

• GET /wayback/v1/available
• POST /wayback/v1/available

## 🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:

• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Wayback API responses with full data structure

Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

• Claude Desktop: Add the MCP server URL to the configuration
• Cursor: Add the MCP server SSE URL to the configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

## ✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
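For a feel of what the wrapped call looks like, a sketch of the underlying request - the `url` query parameter follows the public Wayback availability API, so confirm the parameter names against archive.org's documentation:

```javascript
// What an AI agent's tool call resolves to once the MCP server fills in
// the $fromAI() parameters. The query parameter name is an assumption.
const res = await fetch(
  'https://api.archive.org/wayback/v1/available?url=example.com'
);
console.log(await res.json()); // closest archived snapshot metadata, if any
```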
by Ranjan Dailata
## Who this is for?

This workflow is designed for professionals and teams who need real-time, structured insights from Perplexity Search results without manual effort.

## What problem is this workflow solving?

This n8n workflow solves the problem of automating Perplexity Search result extraction, cleanup, summarization, and AI-enhanced formatting for downstream use, like sending the results to a webhook or another system.

## What this workflow does

**Automates Perplexity Search via Bright Data**
- Uses Bright Data's proxy-based SERP API to run a Perplexity search query programmatically
- Makes the process repeatable and scriptable with different search terms and regions/zones

**Cleans and Extracts Useful Content**
- The Readable Data Extractor uses LLM-based cleaning to remove HTML/CSS/JS from the response and extract pure text data
- Converts messy, unstructured web content into a structured, machine-readable format

**Summarizes Search Results**
- Through the Gemini Flash + Summarization Chain, it generates a concise summary of the search results
- Ideal for users who don't have time to read full pages of search results

**Formats Data Using an AI Agent**
- The AI Agent acts like a virtual assistant that understands the search results, formats them in a readable, JSON-compatible form, and prepares them for webhook delivery

**Delivers Results to a Webhook**
- Sends the final summary + structured search result to a webhook (your app, a Slack bot, Google Sheets, or a CRM)

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to `Bearer XXXXXXXXXXXXXX`, where `XXXXXXXXXXXXXX` is replaced by your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Perplexity Search Request node with the prompt you wish to search for.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

**1. Change the Perplexity Search Input**
- Default: It searches a fixed query or dataset.
- Customize: Accept input from a Google Sheet, Airtable, or a form. Auto-trigger searches based on keywords or schedules.

**2. Customize Summarization Style (LLM Output)**
- Default: General summary using Google Gemini or OpenAI.
- Customize: Add tone (formal, casual, technical, executive-summary, etc.). Focus on specific sections (pricing, competitors, FAQs, etc.). Translate the summaries into multiple languages. Add bullet points, pros/cons, or insight tags.

**3. Choose Where the Results Go**
- Options: Email, Slack, Notion, Airtable, Google Docs, or a dashboard.
- Auto-create content drafts for WordPress or newsletters.
- Feed into CRM notes or attach to Salesforce leads.
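Under the hood, the Bright Data request made by the search node looks roughly like this sketch - the zone name, token variable, and query are placeholders; see Bright Data's Web Unlocker docs for the exact request shape:

```javascript
// Run a Perplexity search through a Bright Data Web Unlocker zone.
const res = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.BRIGHTDATA_TOKEN}`, // Web Unlocker token
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'web_unlocker1',                                    // your zone name
    url: 'https://www.perplexity.ai/search?q=best+crm+tools', // query to run
    format: 'raw',                                            // raw HTML back
  }),
});
const html = await res.text(); // handed to the LLM-based Readable Data Extractor
```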
by max e
Turn plain-language chat like "Tomorrow 9 AM: write blog post" into neatly organised Todoist tasks with GPT-4o and n8n - zero code.

# 🪄 Ultimate Personal Todoist Agent

Turn natural-language requests into perfectly-organized Todoist tasks - all on autopilot inside n8n.

> "Add Finish quarterly report by Friday afternoon" → the agent creates the task, sets the due date & priority, and even drops it into the right project. ✨

## 🌟 Why this workflow rocks

- **All-in-one Todoist super-powers** - create, update, complete, move, archive… every major Todoist endpoint is wired up (tasks, projects, sections, labels, comments).
- **LLM-powered intent detection** - an OpenAI model interprets plain-English (or emoji-filled!) messages so you don't have to remember slash-commands.
- **Minimal setup** - just two credentials and you're live.
- **Battle-tested building block** - use it as-is, or plug the Todoist Agent node into your own agents & chatbots.

## 🛠️ What you'll need

| Credential | Where it's used | How to set it up |
| --- | --- | --- |
| OpenAI API | Orchestrator & LLM nodes | Paste your OpenAI secret key into an OpenAI credential in n8n. |
| Todoist OAuth2 | Todoist node and HTTP Request node | Log in to Todoist from your browser to set up the credential in n8n. |

> That's it - no webhooks, no extra secrets.
> Tested with *gpt-4o-latest* - the fastest & most accurate model in our trials.

## ⚡ Quick-start (5 minutes)

1. Import the JSON template (hit ▶️ Try it out on the n8n template page or drag-drop the file into your canvas).
2. Select your credentials in the two credential dropdowns.
3. Click Test workflow.
4. In the sample Function node, tweak the message field (e.g. "Tomorrow at 9 am: write blog post").
5. Run → watch your new Todoist task appear.
6. (Optional) Swap the Function node for your favourite chat trigger (Telegram, Slack, WhatsApp, Discord, you name it).

Boom - your personal Todoist genie is alive! 🧞‍♂️

## 🧩 How it works (under the hood)

```
[Trigger / Chat message]
        │
        ▼
[🗂️ Orchestrator Agent] ← OpenAI Chat Model + Short-term Memory
        │  ↳ Parses intent & entities
        ▼
[🤖 Todoist Agent] ← 15+ Todoist endpoints
        │  ↳ Executes the right call (create, update, complete, etc.)
        ▼
     [Done ✅]
```

The Orchestrator is an example. In production you can drop it and simply expose the Todoist Agent as a tool for any other agent workflow.

## 🎛️ Customising & extending

| Idea | How to do it |
| --- | --- |
| Notion / Sheets sync | After the Todoist Agent node, add a Notion or Google Sheets node to log completed items. |
| Voice commands | Swap the chat trigger for a Speech-to-Text node (e.g. Whisper). |

## 🤝 Need custom automations?

Want me to build or tweak something for you? → Email maxemelyanenko@gmail.com and let's make it happen!

## ⚠️ What's not included (yet)

- Shared projects & other Todoist Pro/Business endpoints.
- File attachments in the comments.
- Editing comments.

Pull requests welcome! 🙌
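For intuition about what the Todoist Agent does once the LLM has parsed a message like the sample above, the call it issues boils down to something like this sketch (Todoist REST v2 task creation; the token variable is a placeholder):

```javascript
// Create "write blog post", due tomorrow 9 AM, via the Todoist REST v2 API.
const res = await fetch('https://api.todoist.com/rest/v2/tasks', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.TODOIST_TOKEN}`, // OAuth2/API token
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    content: 'Write blog post',
    due_string: 'tomorrow at 9am', // Todoist parses natural-language due dates
  }),
});
console.log(await res.json()); // the created task: id, project_id, due, url…
```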
by David Ashby
Complete MCP server exposing 2 CarbonDoomsDay API operations to AI agents.

## ⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator - all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Credentials: Add CarbonDoomsDay credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the CarbonDoomsDay API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.carbondoomsday.com/api
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (2 total)

🔧 Co2 (2 endpoints)

• GET /co2/: Get CO2 measurements by date
• GET /co2/{date}/: CO2 measurements from the Mauna Loa observatory

This data is made available through the good work of the people at the Mauna Loa observatory. Their release notes say: "These data are made freely available to the public and the scientific community in the belief that their wide dissemination will lead to greater understanding and new scientific insights."

The API currently scrapes the following sources:

• co2_mlo_weekly.csv: https://www.esrl.noaa.gov/gmd/webdata/ccgg/trends/co2_mlo_weekly.csv
• co2_mlo_surface-insitu_1_ccgg_DailyData.txt: ftp://aftp.cmdl.noaa.gov/data/trace_gases/co2/in-situ/surface/mlo/co2_mlo_surface-insitu_1_ccgg_DailyData.txt
• weekly_mlo.csv: http://scrippsco2.ucsd.edu/sites/default/files/data/in_situ_co2/weekly_mlo.csv

Daily CO2 measurements are available as far back as 1958. Learn about pagination via the third-party documentation: http://www.django-rest-framework.org/api-guide/pagination/#pagenumberpagination

## 🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:

• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native CarbonDoomsDay API responses with full data structure

Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

• Claude Desktop: Add the MCP server URL to the configuration
• Cursor: Add the MCP server SSE URL to the configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

## ✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
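A quick sketch of the wrapped endpoints - base URL and paths are from the listing above; the `YYYY-MM-DD` date format and the paginated `results` field are assumptions based on the DRF pagination the API documents:

```javascript
// Latest CO2 measurements (paginated) and a single day's reading.
const latest = await fetch('https://api.carbondoomsday.com/api/co2/')
  .then(r => r.json());
const oneDay = await fetch('https://api.carbondoomsday.com/api/co2/2017-05-06/')
  .then(r => r.json());
console.log(latest.results?.[0], oneDay); // ppm value, date, and source info
```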
by Arlin Perez
# Sort New Gmail messages by category with AI

## 👥 Who's it for

This workflow is perfect for individuals or teams who receive a high volume of emails 📥 and want to automatically organize them into Gmail labels 🏷️ using AI. No coding required!

For sorting existing email messages in your Gmail inbox, please use this free workflow: Categorize and Label Existing Gmail Emails Automatically with GPT-4o mini.

## 🤖 What it does

It automatically processes new Gmail emails, skips those that already have labels, sends the content to an AI Agent powered by GPT-4o mini 🧠, and applies a relevant label based on the content. All labels must exist in Gmail beforehand.

## ⚙️ How it works

1. 📬 **Gmail Trigger** - Activates on each new email received.
2. 🚫 **Filter** - Skips emails that already have a label.
3. 🧠 **AI Agent (GPT-4o mini)** - Analyzes the message and decides which label fits best.
4. 🧾 **Structured Output Parser** - Formats the AI output into clean JSON.
5. 🔀 **Switch Node** - Routes each email to the correct label path based on the AI result.
6. 🏷️ **Gmail Nodes** - Assign the Gmail label to the original email.

## 📋 Requirements

- Gmail account connected to n8n
- Pre-created labels in Gmail matching the AI categories
- OpenAI credentials with GPT-4o mini access
- n8n's AI Agent & Structured Output Parser nodes

## 🛠️ How to set up

1. Open the workflow and adjust the trigger interval (e.g., every minute, every hour, or a custom schedule using cron ⏱️)
2. Check that the Filter skips emails with existing labels
3. Define your categories in the AI Agent prompt and make sure they match the Gmail labels
4. Configure the Switch Node conditions for each category
5. Ensure each Gmail Label Node applies the correct label
6. Save and activate the workflow ✅

## 🎨 How to customize the workflow

- Add or remove categories in the AI prompt & Switch Node
- Fine-tune prompt instructions to match your specific use case
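To keep the Switch Node routing simple, the Structured Output Parser can pin the AI's answer to a single field. A sketch of the expected output shape - the category names are examples and must match your Gmail labels exactly:

```json
{
  "category": "Finance"
}
```

The Switch Node then compares this one field (typically available as `{{ $json.output.category }}`, depending on how the Agent node is configured) against one condition per label.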