by Dvir Sharon
## 🛒 Monitor Google Shopping Prices with Bright Data & Email Alerts

This template requires a self-hosted n8n instance to run.

A comprehensive n8n automation that monitors product prices daily using Bright Data's Google Shopping dataset and sends smart email alerts when price conditions are met.

### 📋 Overview

This workflow provides an automated price-monitoring solution that tracks product prices from Google Shopping daily and sends intelligent email notifications. Perfect for e-commerce monitoring, competitor analysis, deal hunting, and inventory management.

### ✨ Key Features

- 🕘 **Scheduled Monitoring:** Daily automated price checks at 9 AM
- 🛍️ **Google Shopping Integration:** Uses Bright Data's dataset for accurate pricing
- 📊 **Smart Price Comparison:** Compares current prices with historical data
- 📧 **Intelligent Alerts:** Sends emails only when prices meet criteria
- 📈 **Data Storage:** Updates Google Sheets with the latest pricing data
- 🔄 **Batch Processing:** Handles multiple products with rate limiting
- ⚡ **Fast & Reliable:** Built-in error handling
- 🎯 **Customizable Filters:** Advanced price comparison logic

### 🎯 What This Workflow Does

1. **Schedule Trigger:** Runs daily at 9 AM
2. **Data Retrieval:** Fetches the product list from Google Sheets
3. **Price Extraction:** Scrapes current prices using Bright Data
4. **Data Update:** Updates Google Sheets with new prices
5. **Price Comparison:** Compares new vs. old prices
6. **Smart Filtering:** Filters products that meet alert criteria
7. **Email Notifications:** Sends alerts for qualifying changes
8. **Rate Limiting:** Adds a delay between emails

**Output Data Points**

| Field | Description | Example |
| :--- | :--- | :--- |
| Product URL | Original Google Shopping URL | https://shopping.google.com/product/... |
| Product Name | Product title | iPhone 15 Pro Max 256GB |
| Ratings | Product rating score | 4.5 |
| Reviews | Number of reviews | 1,247 |
| Old Price | Previous price | $1,199.00 |
| New Price | Current scraped price | $1,199.00 |
| Timestamp | When the check occurred | 2025-05-30T09:00:00Z |

### 🚀 Setup Instructions

**Prerequisites**

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Google Shopping dataset access
- Gmail account for notifications

**Steps**

1. Import the workflow JSON into n8n
2. Configure Bright Data credentials and dataset access
3. Set up Google Sheets with the required columns
4. Configure Gmail OAuth2 credentials
5. Update sheet IDs and schedule settings
6. Test with sample products and activate

### 📖 Usage Guide

**Google Sheet Structure**

Your Google Sheet should have the following columns to ensure the workflow functions correctly:

- **Product URL** (Text): The direct URL to the Google Shopping product page. This is the primary identifier for the product.
- **Product Name** (Text): The name of the product. This will be automatically populated or updated by the workflow.
- **Old Price** (Number/Currency): The price of the product from the previous check. This column is crucial for price comparison.
- **New Price** (Number/Currency): The most recently scraped price of the product.
- **Ratings** (Number): The star rating of the product.
- **Reviews** (Number): The total number of reviews for the product.
- **Timestamp** (Datetime): The date and time when the price check was performed.

**Adding Products**

Add Google Shopping URLs to your Google Sheet. The workflow will fetch product details and track prices. Historical price data builds over time.

**Understanding Price Alerts**

The default setting for this workflow is to send an email alert when the new price equals the old price. This might seem counterintuitive, but it's useful for specific scenarios, such as:

- **Monitoring stable pricing:** If you are tracking a product and want to be notified when its price has remained consistent over time, indicating a potential stable buying opportunity or a benchmark.
- **Verifying data consistency:** To confirm that the scraping process is working correctly and consistently retrieving the same price when no changes are expected.

You can easily customize the alert logic to trigger on different conditions, as described below and sketched in the code example at the end of this section.

**Customizing Alert Logic**

- **Price drops:** `new_price < old_price`
- **Significant drops:** `new_price < (old_price * 0.9)` (price dropped by more than 10%)
- **Price increases:** `new_price > old_price`
- **Any change:** `new_price != old_price`

**Reading the Results**

- Real-time pricing data
- Historical tracking
- Product metadata
- Timestamps for each check

### 🔧 Customization Options

- **Add More Data:** Descriptions, availability, seller info, shipping, images
- **Modify Email Templates:** Customize subject and body
- **Multiple Recipients:** Duplicate the email node and change recipients
- **Webhook Integration:** Add real-time triggers or Slack alerts

### 🚨 Troubleshooting

- **Bright Data connection failed:** Check API credentials and dataset access
- **No price data extracted:** Verify URLs and test with different products
- **Google Sheets permission denied:** Re-authenticate and check sharing
- **Emails not sending:** Re-auth Gmail OAuth and verify recipients
- **Filter not working:** Check price formats and logic
- **Workflow failed:** Check logs, retry logic, and network status

### 📊 Use Cases & Examples

- **E-commerce Monitoring:** Track competitor pricing and trends
- **Deal Hunting:** Get alerts for price drops on wishlist items
- **Inventory Management:** Monitor supplier pricing for procurement
- **Market Research:** Analyze pricing trends and generate reports

### ⚙️ Advanced Configuration

- **Batch Processing:** Increase batch size, add delays, use parallel processing
- **Price History:** Store historical data, calculate averages, forecast trends
- **Tool Integration:** CRM, Slack, databases, BI tools (Tableau, Power BI)

### 📈 Performance & Limits

- **Single URL:** 2–5 seconds
- **Concurrent Requests:** 3–5 (depends on Bright Data plan)
- **Data Accuracy:** 95%+
- **Success Rate:** 90%+
- **Daily Capacity:** 100–500 products
- **Memory:** ~100 MB per execution
- **API Calls:** 1 Bright Data + 2 Google Sheets per product

### 🤝 Support & Community

- **n8n Forum:** <https://community.n8n.io>
- **Documentation:** <https://docs.n8n.io>
- **Bright Data Support:** Via your Bright Data dashboard
- **GitHub Issues:** Report bugs and request features

### 🎯 Ready to Use!

Your workflow provides a solid foundation for automated price monitoring. Customize it to fit your specific needs and use cases for maximum effectiveness in tracking Google Shopping prices with intelligent email notifications.

Please note that this template uses Community Nodes. Ensure you understand the risks before using community nodes.
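For reference, here is what the alert condition from "Customizing Alert Logic" can look like as an n8n Code node. This is a minimal sketch, assuming each item carries `old_price` and `new_price` fields matching the sheet columns; it is not the template's exact node.

```javascript
// Hypothetical n8n Code node body. Field names (old_price, new_price) are
// assumptions based on the sheet columns above; adjust to your column mapping.
const alerts = [];

for (const item of $input.all()) {
  // Prices may arrive as strings like "$1,199.00", so strip currency formatting.
  const oldPrice = parseFloat(String(item.json.old_price).replace(/[^0-9.]/g, ''));
  const newPrice = parseFloat(String(item.json.new_price).replace(/[^0-9.]/g, ''));

  // Default template rule: alert when the new price equals the old price.
  // Swap in `newPrice < oldPrice * 0.9` for "dropped by more than 10%", etc.
  if (newPrice === oldPrice) {
    alerts.push(item);
  }
}

return alerts;
```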
by Samir Saci
**Tags:** Scraping, Events, European Union, Networking

### Context

Hey! I'm Samir, a Supply Chain Engineer and Data Scientist from Paris, and the founder of LogiGreen Consulting. We use AI, automation, and data to support sustainable and data-driven operations across all types of organizations. This workflow is part of our networking strategy (as a business) to track official EU events that may relate to topics we cover.

> Want to stay ahead of critical EU meetings and events without checking the website every day?

This n8n workflow automatically scrapes the EU's official event portal and logs the latest entries with clean metadata including date, location, category, and link.

📬 For collaborations, feel free to connect with me on LinkedIn.

### Who is this template for?

This workflow is useful for:

- **Policy & public affairs teams** following institutional activities
- **Sustainability teams** watching for relevant climate-related summits
- **NGOs and researchers** interested in event calendars
- **Data teams** building dashboards on public event trends

### What does it do?

This n8n workflow:

- 🌐 Scrapes the EU events portal for new meetings and conferences
- 📅 Extracts event metadata (title, date, location, type, and link)
- 🔁 Handles pagination across multiple pages
- 🚫 Checks for duplicates already stored
- 📊 Saves new records into a connected Google Sheet

### How it works

1. Triggered daily via cron
2. An HTTP node loads the event listing HTML
3. Extracts the HTML block for each event article
4. Parses event name, link, type, location, and full date
5. Concatenates and cleans dates for easy tracking
6. Stores non-duplicate entries in Google Sheets

The workflow uses static data to track pagination and ensure only new events are stored (see the sketch at the end of this section), making it ideal for building up a clean dataset over time.

### What do I need to get started?

You'll need:

- A Google Sheet connected to your n8n instance
- No code or AI tools needed — just n8n and this template

### Follow the Guide!

Sticky notes are included directly inside the workflow to guide you step-by-step through setup and customisation.

🎥 Watch My Tutorial

### Notes

- This is ideal for analysts and consultants who want clean, structured data from the EU portal.
- You can add filtering, email alerts, or AI classifiers later.
- This workflow was built using n8n version 1.93.0.

Submitted: June 1, 2025
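As referenced above, n8n's workflow static data is what lets the workflow remember what it has already stored between runs. A minimal sketch of that deduplication step as a Code node, assuming each item has a `link` field and using an assumed `seenUrls` key:

```javascript
// Hypothetical n8n Code node body. The field name (link) and key (seenUrls)
// are assumptions; note static data only persists when the workflow runs
// from its trigger, not during manual test executions.
const staticData = $getWorkflowStaticData('global');
staticData.seenUrls = staticData.seenUrls || [];

// Keep only events whose link has not been stored on a previous run.
const fresh = $input.all().filter(
  (item) => !staticData.seenUrls.includes(item.json.link)
);

// Remember the new links so the next scheduled run skips them.
for (const item of fresh) {
  staticData.seenUrls.push(item.json.link);
}

return fresh;
```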
by Yaron Been
### Workflow Overview

This sophisticated n8n automation is a powerful lead generation and outreach tool designed to transform YouTube channel research into actionable marketing opportunities. By intelligently connecting multiple services and APIs, this workflow:

1. **Discovers Targeted Channels**
   - Scrapes YouTube channels based on specific keywords
   - Extracts comprehensive channel metadata
   - Identifies potential business opportunities
2. **Intelligent Lead Qualification**
   - Filters channels with contact emails
   - Validates email authenticity
   - Ensures high-quality lead generation
3. **Personalized Outreach**
   - Sends customized cold emails
   - Leverages channel-specific personalization
   - Automates the initial contact process

### Key Benefits

- 🕵️ **Automated Lead Discovery:** Find potential collaborators or clients
- 🧠 **Smart Filtering:** Eliminate invalid or irrelevant leads
- 📧 **Personalized Outreach:** Contextual, channel-specific communication
- ⏱️ **Time-Saving:** Eliminate manual research and email hunting

### Workflow Architecture

**🔍 Stage 1: Channel Scraping**

- **Apify Integration:** Scrapes YouTube channels
- **Keyword-Based Search:** Target specific niches
- **Metadata Extraction:** Collect channel details and emails

**🧩 Stage 2: Lead Qualification**

- **Email Existence Check:** Filter channels with contact info
- **ZeroBounce Verification:** Validate email authenticity (see the sketch at the end of this section)
- **Quality Control:** Ensure only valid leads proceed

**📬 Stage 3: Personalized Outreach**

- **Gmail Integration:** Send customized cold emails
- **Dynamic Personalization:** Use channel-specific details
- **Automated Communication:** Streamline initial contact

### Potential Use Cases

- **Marketing Agencies:** Find potential clients
- **Influencer Marketers:** Discover collaboration opportunities
- **Content Creators:** Network and expand professional connections
- **Sales Teams:** Generate targeted lead lists
- **Recruitment Specialists:** Identify industry professionals

### Setup Requirements

1. **Apify Account**
   - API token
   - YouTube Scraper Actor
   - Configured search keywords
2. **ZeroBounce Account**
   - Email verification API
   - Validation credits
3. **Gmail Account**
   - OAuth2 authentication
   - Configured sending profile
4. **n8n Installation**
   - Cloud or self-hosted instance
   - Import workflow configuration
   - Configure API credentials

### Future Enhancement Suggestions

- 🤖 AI-powered email personalization
- 📊 Advanced lead-scoring mechanisms
- 🔄 Automated follow-up sequences
- 📈 Integration with CRM platforms
- 🌐 Multi-platform lead generation

### Ethical Considerations

- Respect email communication guidelines
- Comply with anti-spam regulations
- Provide clear opt-out mechanisms
- Maintain professional, value-driven outreach

### Connect With Me

Ready to supercharge your lead generation?

- 📧 Email: Yaron@nofluff.online
- 🎥 YouTube: @YaronBeen
- 💼 LinkedIn: Yaron Been

Transform your outreach strategy with intelligent, automated workflows!
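As referenced in Stage 2, the verification step can be pictured as a single call to ZeroBounce. A minimal sketch, assuming Node 18+ for `fetch`; the endpoint and status values follow ZeroBounce's documented `/v2/validate` API, but the rest is illustrative rather than the template's exact node configuration:

```javascript
// Returns true only for addresses ZeroBounce reports as deliverable.
async function isDeliverable(email, apiKey) {
  const url = 'https://api.zerobounce.net/v2/validate' +
    `?api_key=${apiKey}&email=${encodeURIComponent(email)}&ip_address=`;
  const res = await fetch(url);
  const data = await res.json();
  // Possible statuses include: valid, invalid, catch-all, unknown, do_not_mail.
  return data.status === 'valid';
}

// Only leads that come back "valid" should proceed to the Gmail outreach stage.
isDeliverable('creator@example.com', process.env.ZEROBOUNCE_API_KEY)
  .then((ok) => console.log(ok ? 'send outreach' : 'drop lead'));
```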
by Ranjan Dailata
### Who this is for

The LinkedIn Profile Extract and JSON Resume Builder is a powerful workflow that scrapes professional profile data from LinkedIn using Bright Data's infrastructure, then transforms that data into a clean, structured JSON resume using Google Gemini. The workflow is ideal for automating resume parsing, candidate profiling, or integration into recruiting platforms.

This workflow is tailored for:

- HR professionals & recruiters automating resume screening
- Talent acquisition platforms enriching candidate profiles
- Developers & AI builders creating resume-parsing AI pipelines
- Data scientists working on labor market analytics
- Growth hackers profiling prospects via public data

### What problem is this workflow solving?

Parsing resumes or LinkedIn profiles into machine-readable formats is often a manual, error-prone process. Most scraping tools either fail due to anti-bot protections or return unstructured HTML that's hard to work with.

This workflow solves that by:

- Using Bright Data's Web Unlocker for reliable, CAPTCHA-free LinkedIn scraping
- Extracting clean text and structured profile data via the Google Gemini LLM
- Automatically generating a standards-compliant JSON Resume and skills list
- Sending the resume to webhooks or storing it for downstream usage

### What this workflow does

1. Accepts a LinkedIn profile URL and required metadata (Bright Data zone, webhook)
2. Scrapes the LinkedIn profile using Bright Data Web Unlocker
3. Extracts clean content and skills using the Google Gemini LLM
4. Builds a JSON-formatted resume following the JSON Resume schema (see the example at the end of this section)
5. Sends the JSON resume via webhook notification
6. Persists the output by saving the file to disk

### Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the LinkedIn profile, Bright Data zone, and webhook notification URL. For testing purposes, you can obtain a webhook URL using https://webhook.site/.

### How to customize this workflow to your needs

- **Add language translation:** Insert a translation LLM node to support multilingual profiles.
- **Generate PDF resumes:** Convert JSON to formatted PDF resumes using an HTML-to-PDF module.
- **Push to ATS or CRM:** Add integration nodes to pipe data into applicant tracking systems (ATS), CRMs, or databases.
- **Use alternative LLMs:** Swap Gemini with OpenAI or Anthropic Claude if preferred.
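For reference, here is a minimal document in the JSON Resume schema (jsonresume.org) of the kind the Gemini node is prompted to produce. The profile values are invented for illustration; only the field names follow the schema:

```json
{
  "basics": {
    "name": "Jane Doe",
    "label": "Senior Data Engineer",
    "summary": "Ten years building data platforms for retail and logistics.",
    "location": { "city": "Paris", "countryCode": "FR" }
  },
  "work": [
    {
      "name": "Acme Corp",
      "position": "Data Engineer",
      "startDate": "2019-03",
      "summary": "Built streaming pipelines and a company-wide metrics layer."
    }
  ],
  "skills": [
    { "name": "Data Engineering", "keywords": ["Python", "Spark", "Airflow"] }
  ]
}
```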
by Sobek
### 📝 Description of the Workflow

This workflow connects Salesforce and Geotab to streamline fleet tracking for field service jobs (Work Orders). When a new Work Order is created in Salesforce (with a 'New' status and valid coordinates), it creates a circular geofence zone in Geotab (see the sketch at the end of this section) and updates the Work Order with the zone ID. If geolocation is missing, an alert email is sent to a dedicated address.

The workflow uses a Salesforce Outbound Message to trigger an n8n webhook. It includes robust credential handling and optional logic to skip or notify on bad data.

**Use Cases:**

- Automating vehicle geofence setup for service visits
- Enhancing CRM-to-fleet system synchronisation
- Enforcing work order data quality via alerts

**Integrations Used:**

- Salesforce
- Geotab API
- Microsoft Outlook (or any SMTP-compatible service)

**Tags:** geotab, salesforce, fleet management, gps tracking, field service, crm, automation, webhook, integration

### Additional Resources

**🔗 Salesforce**

- Salesforce Login
- Salesforce Setup (Admin Console): https://login.salesforce.com/ → click the "Setup" gear icon
- Outbound Messages Documentation
- Salesforce Developer Documentation
- Salesforce Workbench (API Testing Tool)

**🔗 Geotab**

- Geotab Login (MyGeotab)
- Geotab Developer Portal
- Geotab API Explorer
- Geotab SDK (JavaScript Samples)
- Geotab Support Centre
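As referenced in the description, the geofence creation can be pictured as a MyGeotab JSON-RPC `Add` call. This is a hedged sketch: Geotab zones are polygons, so a "circular" geofence is approximated by points around the Work Order's coordinates. The radius, step count, and naming convention below are assumptions, not the template's exact values:

```javascript
// Approximate a circle as a polygon of lat/lon points (x = longitude, y = latitude).
function circlePoints(lat, lon, radiusMeters = 150, steps = 12) {
  const points = [];
  for (let i = 0; i < steps; i++) {
    const angle = (2 * Math.PI * i) / steps;
    points.push({
      x: lon + (radiusMeters * Math.cos(angle)) / (111320 * Math.cos((lat * Math.PI) / 180)),
      y: lat + (radiusMeters * Math.sin(angle)) / 111320,
    });
  }
  return points;
}

// Build the JSON-RPC body for creating the zone.
function buildAddZoneBody(workOrderNumber, lat, lon, sessionCredentials) {
  return {
    method: 'Add',
    params: {
      typeName: 'Zone',
      entity: {
        name: `WO-${workOrderNumber}`, // assumed naming convention
        points: circlePoints(lat, lon),
      },
      credentials: sessionCredentials, // from a prior Authenticate call
    },
  };
}

// POST the body to https://my.geotab.com/apiv1; the result contains the new
// zone's ID, which the workflow writes back to the Salesforce Work Order.
```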
by Sergey Skorobogatov
## Accept YooKassa payments and log transactions in Google Sheets

### 🧾 Summary

This workflow allows you to accept online payments via YooKassa and log both orders and transactions in Google Sheets — all without writing a single line of code. It supports the full payment flow: product selection, payment initiation, webhook processing, refund updates, and payment status checks.

### 👥 Who is this for?

This template is ideal for:

- Online stores with simple checkout flows
- Sellers of digital products or info-courses
- Entrepreneurs using Telegram bots or web forms
- Anyone needing quick payment integration with Google Sheets tracking

### 🎯 What problem does this workflow solve?

Setting up online payments usually requires backend infrastructure. This no-code solution automates the entire payment flow:

- Handles product listing and price retrieval
- Initiates payments with email and return URL
- Listens for payment.succeeded and refund.succeeded events
- Records every action into structured Google Sheets

### ⚙️ What this workflow does

**1. GET /products**

Returns a sorted list of products from a Google Sheet (products).

**2. POST /payment**

- Validates required fields (product_id, email, return_url)
- Checks the email format
- Fetches product data from products
- Generates a unique idempotence key
- Sends a request to the YooKassa API (see the sketch at the end of this section)
- Saves the order into the orders sheet
- Returns a payment confirmation link

**3. POST /yoomoney**

Webhook to process payment/refund events:

- On payment.succeeded, adds an entry to transactions
- On refund.succeeded, updates the transaction status

**4. GET /status/:id**

Returns the real-time payment status from YooKassa.

### 🚀 Setup

1. Connect credentials:
   - Google Sheets (OAuth2)
   - YooKassa (Basic Auth using shopId and secretKey)
2. Update the following Google Sheets:
   - products: should contain product_id, title, price
   - orders: for saving confirmed purchases
   - transactions: for logging all successful or refunded payments
3. Test the endpoints using any HTTP client. Example payload for /payment:

```json
{
  "product_id": "abc123",
  "email": "user@example.com",
  "return_url": "https://your.site/success"
}
```

### 🔧 How to customize this workflow

- Add delivery logic (e.g., an email with the product link after successful payment)
- Replace Google Sheets with a database (e.g., PostgreSQL)
- Connect Telegram or other messengers for post-payment notifications
- Add promo codes, discounts, or subscription logic

### 💼 Use cases

- Simple online checkouts
- Telegram bots selling access
- Educational product sales
- MVP e-commerce flows
- Donation or membership payments

### 📎 Notes

- ✅ Includes sticky notes for sections
- ✅ Includes error handling and validation
- ✅ No custom code needed except UUID generation
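As referenced in the POST /payment step, here is a sketch of the payment-creation request. The endpoint, Basic auth scheme, and Idempotence-Key header follow YooKassa's documented v3 API; the field values and helper shape are illustrative (assumes Node 18+ for `fetch`):

```javascript
const { randomUUID } = require('node:crypto');

async function createPayment(product, email, returnUrl, shopId, secretKey) {
  const res = await fetch('https://api.yookassa.ru/v3/payments', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // Must be unique per payment attempt; the template generates a UUID.
      'Idempotence-Key': randomUUID(),
      Authorization: 'Basic ' + Buffer.from(`${shopId}:${secretKey}`).toString('base64'),
    },
    body: JSON.stringify({
      // YooKassa expects the amount value as a string like "990.00".
      amount: { value: Number(product.price).toFixed(2), currency: 'RUB' },
      capture: true,
      confirmation: { type: 'redirect', return_url: returnUrl },
      description: product.title,
      metadata: { product_id: product.product_id, email },
    }),
  });
  const payment = await res.json();
  return payment.confirmation.confirmation_url; // link returned to the buyer
}
```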
by slow-groovin@api2o.com
## AI Comprehensive Research on User's Query with Gemini and Web Search

### What is this?

Perform comprehensive research on a user's query by dynamically generating search terms, querying the web using Google Search (via Gemini), reflecting on the results to identify knowledge gaps, and iteratively refining the search until it can provide a well-supported answer with citations (like Perplexity).

This workflow is a reproduction of gemini-fullstack-langgraph-quickstart in n8n. The gemini‑fullstack‑langgraph‑quickstart is a demo by the Google Gemini team that showcases how to build a powerful full‑stack AI agent using Gemini and LangGraph.

### How It Works

1. **Generate Query 💬:** Generates one or more search-query tasks based on the user's question, using Gemini 2.0 Flash.
2. **Web Research 🌐:** Executes web search tasks using the native Google Search API tool in combination with Gemini 2.0 Flash.
3. **Reflection 📚:** Identifies knowledge gaps and generates potential follow-up queries.

### Setup

1. Configure API credentials:
   - Create a Google Gemini (PaLM) API credential using your own Gemini key.
   - Connect the credential with three nodes: Google Gemini Chat Model, GeminiSearch, and reflection.
2. Configure the Redis source:
   - Prepare a Redis service that can be accessed by n8n.
   - Create a Redis credential and connect it with all Redis nodes.

### Customize

- Try using different Gemini models.
- Try modifying the parameters number_of_initial_queries and max_research_loops.

### Why use Redis?

Redis serves as external storage to maintain global variables (counter, search results, etc.). This workflow contains a loop, which needs global variables (like State in LangGraph), and it is difficult to manage global variables without external storage in n8n. A sketch of this loop state follows below.
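As referenced above, the Redis nodes effectively implement a loop counter plus an accumulator. This is illustrative only: the key names and max-loop handling are assumptions about how the template maintains its LangGraph-style state, written with the node-redis client:

```javascript
const { createClient } = require('redis');

async function researchPass(runId, maxLoops = 3) {
  const client = createClient({ url: 'redis://localhost:6379' });
  await client.connect();

  // Each reflection pass increments the loop counter; stop at max_research_loops.
  const loops = await client.incr(`research:${runId}:loops`);
  const keepGoing = loops <= maxLoops;

  // Web-search summaries accumulate across iterations for the final answer.
  if (keepGoing) {
    await client.rPush(`research:${runId}:results`, 'summary for this pass');
  }

  await client.quit();
  return keepGoing;
}
```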
by Jez
## Workflow: Automated Weekly Google Calendar Summary via Email with AI ✨🗓️📧

Get a personalized, AI-powered summary of your upcoming week's Google Calendar events delivered straight to your inbox! This workflow automates the entire process, from fetching events to generating an intelligent summary and emailing it to you.

### 🌟 Overview

This n8n workflow connects to your Google Calendar, retrieves events for the upcoming week (Monday to Sunday, based on the day the workflow runs), uses Google Gemini AI to create a well-structured and insightful summary, and then emails this summary to you. It's designed to help you start your week organized and aware of your commitments.

**Key Features:**

- **Automated Weekly Summary:** Runs on a schedule (default: weekly) to keep you updated.
- **AI-Powered Insights:** Leverages Google Gemini to not just list events, but to identify important ones and offer a brief weekly outlook.
- **Personalized Content:** Uses your specified timezone, locale, name, and city for accurate and relevant information.
- **Clear Formatting:** Events are grouped by day and displayed chronologically with start and end times. Important events are highlighted.
- **Email Delivery:** Receive your schedule directly in your inbox in a clean HTML format.
- **Customizable:** Easily adapt to your specific calendar, AI preferences, and email settings.

### ⚙️ How It Works: Step-by-Step

The workflow consists of the following nodes, working in sequence:

1. **weekly_schedule (Schedule Trigger):** Initiates the workflow. By default it triggers once a week at 12:00 PM; you can adjust this to your preference (e.g., Sunday evening or Monday morning).
2. **locale (Set Node):** This is a crucial node for you to configure! It sets user-specific parameters like your preferred language/region (users-locale), timezone (users-timezone), your name (users-name), and your home city (users-home-city). These are used throughout the workflow for correct date/time formatting and personalizing the AI prompt.
3. **date-time (Set Node):** Dynamically generates various date and time strings based on the current execution time and the locale settings. This is used to define the precise 7-day window (from the current day to 7 days ahead, ending at midnight) for fetching calendar events.
4. **get_next_weeks_events (Google Calendar Node):** Connects to your specified Google Calendar and fetches all events within the 7-day window calculated by the date-time node. Requires Google Calendar API credentials and the ID of the calendar you want to use.
5. **simplify_evens_json (Code Node):** Runs a small JavaScript snippet to clean up the raw event data from Google Calendar. It removes several fields that aren't needed for the summary (like htmlLink, etag, iCalUID), making the data more concise for the AI. (A sketch of this node appears at the end of this section.)
6. **aggregate_events (Aggregate Node):** Takes all the individual (and now simplified) event items and groups them into a single JSON array called eventdata. This is the format the AI agent expects for processing.
7. **Google Gemini (LM Chat Google Gemini Node):** The connection point to the Google Gemini language model. Requires Google Gemini (or PaLM) API credentials.
8. **event_summary_agent (Agent Node):** This is where the magic happens! It uses the Google Gemini model and a detailed system prompt to generate the weekly schedule summary. The prompt instructs the AI to:
   - Start with a friendly greeting.
   - Group events by day (Monday to Sunday) for the upcoming week, using the user's timezone and locale.
   - Format event times clearly (e.g., 09:30 AM - 10:30 AM: Event Summary).
   - Identify and prefix "IMPORTANT:" to events with keywords like "urgent," "deadline," "meeting," etc., in their summary or description.
   - Conclude with a 1-2 sentence helpful insight about the week's schedule.
   - Process the input eventdata (the JSON array of calendar events).
9. **Markdown (Markdown to HTML Node):** Converts the text output from the event_summary_agent (which is generated in Markdown format for easy structure) into HTML. This ensures the email body is well-formatted with proper line breaks, lists, and emphasis.
10. **send_email (Email Send Node):** Sends the final HTML summary to your specified email address. Requires SMTP (email sending) credentials and your desired "From" and "To" email addresses.

### 🚀 Getting Started: Setup Instructions

Follow these steps to get the workflow up and running:

1. **Import the Workflow:** Download the workflow JSON file. In your n8n instance, go to "Workflows", click "Import from File", and select the downloaded JSON file.
2. **Configure Credentials:** You'll need to set up credentials for three services. In n8n, go to "Credentials" in the left sidebar and click "Add credential."
   - **Google Calendar API:** Search for "Google Calendar" and create new credentials using OAuth2. Follow the authentication flow. Once created, select these credentials in the get_next_weeks_events node.
   - **Google Gemini (PaLM) API:** Search for "Google Gemini" or "Google PaLM" and create new credentials. You'll typically need an API key from Google AI Studio or Google Cloud. Once created, select these credentials in the Google Gemini node.
   - **SMTP / Email:** Search for your email provider (e.g., "SMTP," "Gmail," "Outlook") and create credentials. This usually involves providing your email server details, username, and password/app password. Once created, select these credentials in the send_email node.
3. **‼️ IMPORTANT: Customize User Settings in the locale Node.** Open the locale node and update the following values in the "Assignments" section:
   - users-locale: Set your locale string (e.g., "en-AU" for English/Australia, "en-US" for English/United States, "de-DE" for German/Germany). This affects how dates, times, and numbers are formatted.
   - users-timezone: Set your timezone string (e.g., "Australia/Sydney", "America/New_York", "Europe/London"). This is critical for ensuring event times are displayed correctly for your location.
   - users-name: Enter your name (e.g., "Bob"). This is used to personalize the email greeting.
   - users-home-city: Enter your home city (e.g., "Sydney"). This can be used for additional context by the AI.
4. **Configure the get_next_weeks_events (Google Calendar) Node:** Open the node. In the "Calendar" parameter, specify which calendar to fetch events from. The default might be a placeholder like c_4d9c2d4e139327143ee4a5bc4db531ffe074e98d21d1c28662b4a4d4da898866@group.calendar.google.com. Change this to your primary calendar (often your email address) or the specific Calendar ID you want to use. You can find Calendar IDs in your Google Calendar settings.
5. **Configure the send_email Node:** Open the node. Set the fromEmail parameter to the address the summary should be sent from, and the toEmail parameter to the address(es) where you want to receive it. You can also customize the subject line if desired.
6. **(Optional) Customize the AI Prompt in event_summary_agent:** If you want to change how the AI summarizes events (e.g., different keywords for important events, a different tone, or specific formatting tweaks), edit the "System Message" within the event_summary_agent node's parameters.
7. **(Optional) Adjust the Schedule in weekly_schedule:** Open the weekly_schedule node and modify the "Rule" to change when and how often the workflow runs (e.g., a specific day of the week, a different time).
8. **Activate the Workflow:** Once everything is configured, toggle the "Active" switch in the top right corner of the workflow editor to ON.

### 📬 What You Get

You'll receive an email (based on your schedule) with a subject like "Next Week Calendar Summary: [Start Date] - [End Date]". The email body will contain:

- A friendly greeting.
- Your schedule for the upcoming week (Monday to Sunday), with events listed chronologically under each day.
- Event times displayed in your local timezone (e.g., 09:30 AM - 10:30 AM: Team Meeting).
- Priority events clearly marked (e.g., IMPORTANT: 02:00 PM - 03:00 PM: Project Deadline Review).
- A brief, insightful observation about your week's schedule.

### 🛠️ Troubleshooting & Notes

- **Timezone is Key:** Ensure your users-timezone in the locale node is correct. This is the most common source of incorrect event times.
- **Google API Permissions:** When setting up Google Calendar and Gemini credentials, make sure you grant the necessary permissions.
- **AI Output Varies:** The AI-generated summary can vary slightly each time. The prompt is designed to guide it, but LLMs have inherent creativity.
- **Calendar Event Details:** The quality of the summary (especially for identifying important events) depends on how detailed your calendar event titles and descriptions are. Including keywords like "meeting," "urgent," "prepare for," etc., in your events helps the AI.

### 💬 Feedback & Contributions

Feel free to modify and enhance this workflow! If you have suggestions, improvements, or run into issues, please share them in the n8n community. Happy scheduling!
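As referenced in step 5 of the node walkthrough, here is a sketch of what the simplify_evens_json Code node can look like. The template names htmlLink, etag, and iCalUID; any further fields in the list below are assumptions you can adjust:

```javascript
// Drop Google Calendar fields the summarizer doesn't need,
// keeping the AI prompt small and focused on event content.
const FIELDS_TO_DROP = ['htmlLink', 'etag', 'iCalUID', 'kind', 'sequence'];

return $input.all().map((item) => {
  const event = { ...item.json };
  for (const field of FIELDS_TO_DROP) {
    delete event[field];
  }
  return { json: event };
});
```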
by Naveen Choudhary
### Who is this for?

Marketing, content, and enablement teams that need a quick, human-readable summary of every new video published by the YouTube channels they care about—without leaving Slack.

### What problem does this workflow solve?

Manually checking multiple channels, skimming long videos, and pasting the highlights into Slack wastes time. This template automates the whole loop: detect a fresh upload from your selected channels → pull subtitles → distill the key take-aways with GPT-4o-mini → drop a neatly formatted digest in Slack.

### What this workflow does

1. A Schedule Trigger fires every 10 minutes, then grabs a list of YouTube RSS feeds from a Google Sheet.
2. HTTP + XML nodes fetch and parse each feed; only brand-new videos continue.
3. The YouTube API fetches the title and description; RapidAPI grabs English subtitles.
4. Code nodes build an AI payload; OpenAI returns a JSON summary + article.
5. A formatter turns that JSON into Slack Block Kit (see the sketch at the end of this section), and Slack posts it.
6. Processed links are appended back to the "Video Links" sheet to prevent dupes.

### Setup

1. Make a copy of this Google Sheet and connect a Google Sheets OAuth2 credential with edit rights.
2. **Slack App:** create it → add chat:write, channels:read, app_mention; enable Event Subscriptions; install and store the Bot OAuth token in an n8n Slack credential.
3. **RapidAPI key** for https://yt-api.p.rapidapi.com/subtitles (300 free calls/mo) → save as HTTP Header Auth.
4. **OpenAI key** → save in an OpenAI credential.
5. Add your RSS feed URLs to the "RSS Feed URLs" tab; press Execute Workflow.

### How to customise

- Adjust the schedule interval or freshness window in "If newly published".
- Swap the OpenAI model or prompt for shorter/longer digests.
- Point the Slack node at a different channel or DM.
- Extend the AI payload to include thumbnails or engagement stats.

### Use-case ideas

- **Product marketing:** Instantly brief sales & CS teams when a competitor uploads a feature demo.
- **Internal learning hub:** Auto-summarise conference talks and share bullet-point notes with engineers.
- **Social media managers:** Get ready-to-post captions and key moments for re-purposing across platforms.
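As referenced in step 5, the formatter maps the AI's JSON onto Slack Block Kit. A minimal sketch; the summary field names (title, url, bullets) are assumptions about the OpenAI payload, while the block structure itself follows Slack's documented Block Kit format:

```javascript
// Turn one video summary into a small stack of Slack blocks.
function toSlackBlocks(summary) {
  return [
    {
      type: 'section',
      text: { type: 'mrkdwn', text: `*<${summary.url}|${summary.title}>*` },
    },
    {
      type: 'section',
      text: { type: 'mrkdwn', text: summary.bullets.map((b) => `• ${b}`).join('\n') },
    },
    { type: 'divider' },
  ];
}

// The Slack node then posts { blocks: toSlackBlocks(summary) } to the channel.
```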
by Dvir Sharon
## 🎯 Automated TikTok Influencer Discovery & Analysis

A complete n8n automation that discovers TikTok influencers using Bright Data, evaluates their fit using Claude AI, and sends personalized outreach emails. Designed for marketing teams and brands that need a scalable, intelligent way to find and connect with relevant creators.

### 📋 Overview

This workflow provides a full-service influencer discovery pipeline: it finds TikTok profiles using search keywords, uses AI to assess alignment with your brand, and initiates contact with qualified influencers. Ideal for influencer marketing, brand partnerships, and campaign planning.

### ✨ Key Features

- 🔍 **Keyword-Based Discovery:** Locate TikTok influencers by specific niche-related keywords.
- 📊 **Bright Data Integration:** Access accurate TikTok profile data from Bright Data's datasets.
- 🤖 **AI-Powered Analysis:** Claude AI evaluates each profile's fit with your brand based on bio, content, and more.
- 📧 **Smart Email Notifications:** Sends tailored outreach emails to creators deemed highly relevant.
- 📈 **Data Storage:** Google Sheets stores profile details, AI evaluation results, and outreach status.
- 🎯 **Intelligent Filtering:** Processes only influencers who meet your criteria (e.g., 5,000+ followers, industry match).
- ⚡ **Fast & Reliable:** Uses professional scraping with robust error handling.
- 🔄 **Batch Processing:** Supports bulk influencer processing through a single automated flow.

### 🎯 What This Workflow Does

**Input**

- **Search Keywords:** TikTok terms for finding niche creators
- **Business Info:** Brand description and industry
- **Collaboration Criteria:** Follower count minimum, niche alignment

**Processing Steps**

1. Form Submission
2. TikTok Discovery via Bright Data
3. Data Extraction and Normalization
4. Save to Google Sheets
5. Relevance Scoring via Claude AI
6. Filtering Based on AI Score + Follower Count
7. Personalized Email Outreach

**Output Data Points**

| Field | Description | Example |
|---------------|-------------------------------------|-----------------------------------|
| Profile ID | TikTok profile identifier | tiktoker123456 |
| Username | TikTok handle | @creativecreator |
| URL | Profile link | https://tiktok.com/@creativecreator |
| Description | Creator bio | "Fashion & lifestyle content..." |
| Followers | Total follower count | 50,000 |
| Collaboration | AI assessment of brand fit | "Highly Relevant" |
| Analysis | 50-word Claude AI relevance summary | "Strong alignment with fashion..." |

### 🚀 Setup Instructions

**Prerequisites**

- n8n (cloud or self-hosted)
- Bright Data account with TikTok access
- Google Sheets + Gmail
- Anthropic Claude API key
- 10–15 minutes setup time

**Step-by-Step Setup**

1. Import the workflow via JSON in n8n
2. Configure Bright Data: add API credentials and the dataset ID
3. Google Sheets: set up credentials and map columns
4. Claude AI: insert your API key and select the desired model
5. Gmail: authenticate Gmail and update the mail node settings
6. Update variables: replace placeholders with your business info
7. Test & launch: submit a sample form and verify all outputs

### 📖 Usage Guide

**Adding Search Keywords**

Submit the form with search terms, a business description, and an industry category to trigger the workflow.

**Understanding AI Analysis**

Emails are sent only if:

- Collaboration status = Highly Relevant
- Follower count ≥ 5,000
- Industry alignment confirmed

Claude AI returns a 50-word analysis justifying the match. (A sketch of the qualification filter appears at the end of this section.)

**Customizing Filters**

Edit the "Find the Collaborator" prompt to adjust:

- Follower thresholds
- Industry relevance
- Additional metrics (e.g., engagement rate)

**Viewing Results**

The Google Sheets log includes:

- Influencer metadata
- AI scores and rationale
- Collaboration status
- Email delivery timestamp

### 🔧 Customization Options

- **Add More Fields:** Engagement rate, contact email, content themes
- **Email Personalization:** Customize message templates or integrate other mail services
- **Enhanced Filtering:** Use engagement rates, region, content frequency

### 🚨 Troubleshooting

| Issue | Fix |
|-------|-----|
| Bright Data fails | Recheck the API key and dataset ID |
| No influencer data | Adjust keywords or dataset scope |
| Sheets permission error | Re-authenticate and check sharing |
| Claude fails | Validate the API key and prompt |
| Emails not sent | Re-auth Gmail or update the recipient field |
| Form not triggering | Reconfirm the webhook URL and permissions |

**Advanced Debugging**

- Check n8n execution logs
- Run individual nodes to pinpoint failures
- Confirm all data formats
- Handle API rate limits
- Add error-catch nodes for retries

### 📊 Use Cases & Examples

- **Brand Discovery:** Fashion, tech, fitness creators
- **Competitor Insights:** Find influencers used by rival brands
- **Campaign Planning:** Build targeted influencer lists
- **Market Research:** Identify creator trends across regions

### ⚙️ Advanced Configuration

- **Batch Execution:** Process multiple keywords with delay logic
- **Engagement Metrics:** Scrape and calculate likes-to-follower ratios
- **CRM Integration:** Sync qualified profiles to HubSpot, Salesforce, or Slack

### 📈 Performance & Limits

- **Processing Time:** 3–5 minutes per keyword
- **Concurrency:** 3–5 simultaneous fetches (depends on plan)
- **Accuracy:** >95% influencer data reliability
- **Success Rate:** 90%+ for outreach and processing
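As referenced in the Usage Guide, the qualification step boils down to a simple filter. A minimal sketch in n8n Code-node form; the field names follow the output table above, the threshold mirrors the documented 5,000 minimum, and the comma-stripping is an assumption about how follower counts arrive:

```javascript
// Keep only profiles Claude marked as highly relevant with enough reach.
return $input.all().filter((item) => {
  const { collaboration, followers } = item.json;
  const followerCount = Number(String(followers).replace(/,/g, ''));
  return collaboration === 'Highly Relevant' && followerCount >= 5000;
});
```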
by Roman Rozenberger
### How it works

- **Extract AI Overviews from Google Search** - receives data from the browser extension via webhook
- **Convert HTML to Markdown** - automatically processes and cleans AI Overview content
- **Store in Google Sheets** - archives all extracted AI Overviews with metadata and sources
- **Generate SEO Guidelines** - AI analyzes page content vs. the AI Overview to suggest improvements
- **Automate Analysis** - batch-process multiple URLs and schedule regular checks

### Set up steps

- **Import workflow** - load the JSON template into your n8n instance (2 minutes)
- **Configure Google Sheets** - set up the OAuth connection and create a spreadsheet with the required columns (5 minutes)
- **Set up AI provider** - add OpenRouter API credentials for Gemini 2.5 Pro (3 minutes)
- **Install browser extension** - deploy the companion Chrome/Firefox extension for data extraction (5 minutes)
- **Test webhook endpoint** - verify the connection between the extension and the n8n workflow (2 minutes)

Total setup time: ~15 minutes

**What you'll need:**

- Google account for Sheets integration
- Google Sheet template with the required columns
- OpenRouter API key for Gemini 2.5 Pro model access
- Browser extension: Chrome Extension or Firefox Add-on
- n8n instance (local or cloud)

**Use cases:**

- **SEO agencies** - monitor AI Overview presence for client keywords
- **Content marketers** - analyze what content gets featured in AI Overviews
- **E-commerce** - track AI Overview coverage for product-related searches
- **Research** - build datasets of AI Overview content across different topics

The workflow comes with a free browser extension (Chrome | Firefox) that automatically extracts AI Overview content from Google Search and sends it via webhook to your n8n workflow for processing and analysis. (An illustrative payload appears at the end of this section.)

GitHub Repository: https://github.com/romek-rozen/ai-overview-extractor/

### Detailed Setup Instructions - AI Overview Extractor

**Prerequisites**

- **n8n instance** (local or cloud) - version 1.95.3+
- **Google account** for Sheets integration
- **OpenRouter API account** for Gemini 2.5 Pro access
- **Browser** (Chrome/Firefox) for the extension

**Step 1: Import the Workflow**

1. Open n8n and navigate to Workflows
2. Click "Add workflow" → "Import from JSON"
3. Upload the AI_OVERVIES_EXTRACTOR_TEMPLATE.json file
4. Save the workflow

**Step 2: Configure Google Sheets**

Create a new Google Sheet with these columns:

extractedAt | searchQuery | sources | markdown | myURL | task | guidelines | key

Here is a public Google Sheets template: https://docs.google.com/spreadsheets/d/15xqZ2dTiLMoyICYnnnRV-HPvXfdgVeXowr8a7kU4uHk/edit?gid=0#gid=0

Copy the Google Sheets URL (you'll need it for the workflow).

Set up Google Sheets credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "Google Sheets OAuth2 API"
3. Follow the OAuth setup to authorize n8n access to Google Sheets
4. Name the credential (e.g., "Google Sheets AI Overview")

Configure the Google Sheets nodes by updating these nodes with your Google Sheets URL:

- Get URLs to Analyze
- Save AI Overview to Sheets
- Save SEO Guidelines to Sheets

In each node:

- Set documentId to your Google Sheets URL
- Set sheetName to your Google Sheets URL
- Select your Google Sheets credential

**Step 3: Configure the AI Provider (OpenRouter)**

Get an OpenRouter API key:

1. Sign up at https://openrouter.ai/
2. Generate an API key in your account settings
3. Add credits to your account

Set up OpenRouter credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "OpenRouter API"
3. Enter your API key
4. Name the credential (e.g., "OpenRouter AI Overview")

Configure the OpenRouter node:

1. Select the Gemini 2.5 Pro Model node
2. Choose your credential from the dropdown
3. Verify the model (default: google/gemini-2.5-pro-preview)

**Step 4: Install the Browser Extension**

Install in Chrome (official extension, recommended):

1. Visit: https://chromewebstore.google.com/detail/ai-overview-extractor/cbkdfibgmhicgnmmdanlhnebbgonhjje
2. Click "Add to Chrome"

Install in Firefox (official add-on):

1. Visit: https://addons.mozilla.org/en-US/firefox/addon/ai-overview-extractor/
2. Click "Add to Firefox"

**Step 5: Configure the Webhook Connection**

Get the webhook URL:

1. In the n8n workflow, click on the Webhook node
2. Copy the webhook URL (it should look like http://localhost:5678/webhook/ai-overview-extractor-template-123456789)

Configure the extension:

1. Go to Google Search and perform any search with an AI Overview
2. Click the browser extension button (AI Overview Extractor)
3. In the webhook configuration section, paste your webhook URL
4. Click "Test"; it should show ✅ Test successful
5. Click "Save" to store the configuration

**Step 6: Activate and Test**

1. In n8n, toggle the workflow to "Active" (top-right switch)
2. Verify all nodes are properly configured

Test end-to-end:

1. Go to Google Search and search for something that shows an AI Overview
2. Use the extension to extract the AI Overview
3. Send it via webhook, then check your Google Sheets for the data
4. Verify the markdown conversion worked correctly

**Optional: Batch Analysis Setup**

For the SEO analysis features:

1. In your Google Sheets, add URLs in the myURL column
2. Set the task column to "create guidelines"
3. Run the workflow manually or wait for the 15-minute scheduler
4. Check the guidelines column for AI-generated SEO recommendations

**Troubleshooting**

Webhook issues:

- Ensure n8n is running on port 5678
- Check whether the workflow is activated
- Verify the webhook URL format

Google Sheets errors:

- Confirm the OAuth credentials are working
- Check the sheet URL format
- Verify the column names match exactly
- Ensure the Get URLs to Analyze, Save AI Overview to Sheets, and Save SEO Guidelines to Sheets nodes are properly configured

OpenRouter issues:

- Check API key validity
- Ensure sufficient account credits
- Try different models if Gemini 2.5 Pro fails
- Verify the Gemini 2.5 Pro Model node is properly connected

Extension problems:

- Check the browser console for errors
- Verify the extension is properly installed
- Ensure you're on google.com/search pages
- Confirm the webhook URL is correctly configured in the extension

**Next Steps**

- **Customize AI prompts** in the Generate SEO Recommendations node for your specific needs
- **Adjust scheduler frequency** (default: 15 minutes)
- **Add more URL analysis** by populating Google Sheets
- **Monitor usage** and API costs

**Support**

- **GitHub Issues:** https://github.com/romek-rozen/ai-overview-extractor/issues
- **n8n Community:** https://community.n8n.io/
- **Template Documentation:** check the included README files
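For reference, here is the rough shape of the payload the extension posts to the webhook. The field names mirror the sheet columns above (searchQuery, extractedAt, markdown, sources); the exact payload structure the extension sends may differ, and the values are invented:

```json
{
  "searchQuery": "best running shoes 2025",
  "extractedAt": "2025-06-01T10:15:00Z",
  "markdown": "## AI Overview\nFor most runners, reviewers recommend...",
  "sources": [
    { "title": "Example source", "url": "https://example.com/article" }
  ]
}
```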
by Ranjan Dailata
### Notice

Community nodes can only be installed on self-hosted instances of n8n.

### Who this is for

This workflow automates the real-time extraction of job descriptions and salary information from job listing pages using Bright Data MCP and analyzes the content using OpenAI GPT-4o mini. It is ideal for:

- **Recruiters & HR tech startups:** Automate job data collection from public listings
- **Market intelligence teams:** Analyze compensation trends across companies or geographies
- **Job boards & aggregators:** Power search results with structured, enriched listings
- **AI workflow builders:** Extend to other career platforms or automate resume-job match analysis
- **Analysts & researchers:** Track hiring signals and salary benchmarks in real time

### What problem is this workflow solving?

Traditional scraping of job portals can be challenging due to cluttered content, anti-scraping measures, and inconsistent formatting. Manually analyzing salary ranges and job descriptions is tedious and error-prone.

This workflow solves the problem by:

- Simulating user behavior with the Bright Data MCP Client to bypass anti-scraping systems
- Extracting structured, clean job data in Markdown format
- Using OpenAI GPT-4o mini to analyze and extract precise salary details and refined job descriptions
- Merging and formatting the result for easy consumption
- Delivering the final output via webhook, Google Sheets, or the file system

### What this workflow does

**Input Nodes**

- job_search_url: The job listing or search result URL
- job_role: The title or role being searched for (used in logging/formatting)

**MCP Client Operations**

- **MCP Salary Data Extractor:** Simulates browser behavior and scrapes salary-related content (if available)
- **MCP Job Description Extractor:** Extracts the full job description as structured Markdown content

**OpenAI GPT-4o mini Nodes**

- **Salary Information Extractor:** Uses GPT-4o mini to detect, clean, and standardize salary range data (if any)
- **Job Description Refiner:** Extracts role responsibilities, qualifications, and benefits from unstructured text
- **Company Information Extractor:** Uses Bright Data MCP and GPT-4o mini to extract company information

**Merge Node**

Combines the refined job description and extracted salary information into a unified JSON response object.

**Aggregate Node**

Aggregates the job description and salary information into a single JSON response object.

**Final Output Handling**

The output is handled in three different formats depending on your downstream needs:

- **Save to Disk:** Output stored with a filename including the timestamp and job role
- **Google Sheet Update:** Adds a new row with job role, salary, summary, and link
- **Webhook Notification:** Pushes the merged response to an external system

### Pre-conditions

- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP server @brightdata/mcp
- You need to install n8n-nodes-mcp

### Setup

1. Make sure to set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
2. Make sure to install the Bright Data MCP server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials for the MCP Client (STDIO) account with the Bright Data MCP server (a sketch of this configuration follows below). Make sure to set the Bright Data API token in the Environments textbox as API_TOKEN=<your-token>.

### How to customize this workflow to your needs

**Modify the input source**

- Change job_search_url to point to any job board or aggregator
- Customize job_role to reflect the type of jobs being analyzed

**Tweak LLM prompts (optional)**

- Refine the GPT-4o mini prompts to extract additional fields like benefits, tech stacks, or remote eligibility

**Change the output format**

- Customize the merged object to output JSON, CSV, or Markdown based on downstream needs
- Add additional destinations (e.g., Slack, Airtable, Notion) via n8n nodes
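As referenced in Setup step 7, the MCP server launch configuration can be sketched in the generic MCP JSON form below. The API_TOKEN variable is what the Environments textbox expects; WEB_UNLOCKER_ZONE is an assumption based on the mcp_unlocker zone created earlier, so verify it against the @brightdata/mcp documentation:

```json
{
  "mcpServers": {
    "brightdata": {
      "command": "npx",
      "args": ["-y", "@brightdata/mcp"],
      "env": {
        "API_TOKEN": "<your-bright-data-api-token>",
        "WEB_UNLOCKER_ZONE": "mcp_unlocker"
      }
    }
  }
}
```

In the n8n MCP Client (STDIO) credential, this maps to Command `npx`, Arguments `-y @brightdata/mcp`, and an Environments entry of `API_TOKEN=<your-token>`.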