by Billy Christi
**Who is this for?**

This workflow is ideal for:

- **Finance teams** that need to process incoming invoices faster with minimal errors
- **Small to mid-sized businesses** that want to automate invoice intake, review, and storage
- **Operations managers** who require approval workflows and centralized record-keeping

**What problem is this workflow solving?**

Manually processing invoices is time-consuming, error-prone, and often lacks structure. This workflow solves those challenges by:

- **Automating the intake of invoices** from multiple sources (email, Google Drive, web form)
- **Extracting invoice data using AI**, eliminating manual data entry
- **Implementing an email-based approval system** to add human oversight
- **Automatically storing approved invoice data** in Google Sheets for easy access and reporting
- **Notifying stakeholders** when invoices are approved or rejected

**What this workflow does**

This end-to-end invoice processing workflow includes:

- Three invoice input methods: Google Drive folder monitor, Gmail attachments, and web form uploads
- PDF-to-text extraction for each input method using native PDF parsing
- AI-powered invoice analysis with GPT-4 to extract structured fields such as vendor, total, and due date (a hedged sketch of handling this output follows at the end of this section)
- Dynamic categorization of invoice type (e.g., Travel, Software, Utilities) via AI
- Email-based approval workflow with embedded forms to collect decisions and notes
- Automated Google Sheets logging of all invoice data, approval status, and reviewer feedback
- Rejection notifications sent automatically to your finance team for transparency and follow-up

**Setup**

1. Copy the Google Sheet template here: 👉 PDF Invoice Parser with Approval Workflow – Google Sheet Template
2. Connect your Google Drive account and specify the invoice folder ID
3. Set up Gmail to monitor incoming invoices with PDF attachments
4. Enable your form trigger to accept direct uploads from your internal or external users
5. Enter your OpenAI API key in the AI processing node for data extraction
6. Configure Google Sheets with a target spreadsheet to store invoice data
7. Set recipient email addresses for invoice approvals and rejection notifications
8. Test with a sample invoice to ensure the end-to-end flow is working

**How to customize this workflow to your needs**

- **Change input sources**: Replace Gmail with Outlook or use Slack uploads instead
- **Add validation steps**: Include regex or keyword checks before AI analysis
- **Customize the AI schema**: Modify the expected JSON structure based on your internal finance system
- **Integrate with accounting tools**: Add Xero, QuickBooks, or custom API nodes to push data
- **Route based on category**: Add conditional logic to handle invoices differently based on vendor or category
- **Multi-level approvals**: Add additional email steps if higher-level signoff is needed
- **Audit logging**: Use a database or Google Sheets to maintain a historical log of approvals and rejections
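As a minimal sketch of the extraction-handling step, the n8n Code node below validates the AI's structured output before it reaches the approval email. The field names (vendor, total, due_date, etc.) are illustrative assumptions, not the template's exact schema:

```javascript
// n8n Code node sketch: validate the AI's structured invoice output before
// approval routing. Field names are assumptions, not the template's schema.
const invoice = $input.first().json;

const required = ['vendor', 'invoice_number', 'total', 'currency', 'due_date', 'category'];
const missing = required.filter((key) => invoice[key] === undefined || invoice[key] === '');

if (missing.length > 0) {
  // Surface incomplete extractions instead of silently logging bad rows
  throw new Error(`Invoice extraction incomplete, missing: ${missing.join(', ')}`);
}

return [{
  json: {
    ...invoice,
    total: Number(invoice.total),       // normalize numeric strings for Sheets
    approval_status: 'pending_review',  // consumed by the email approval step
  },
}];
```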
by Ranjan Dailata
**Notice**

Community nodes can only be installed on self-hosted instances of n8n.

**Who this is for**

The Brave Search Structured Data Extractor workflow is designed for professionals and teams that need high-quality, structured insights from Brave search results in real time. Whether you're performing market research, tracking competitors, training AI models, or powering content engines, this workflow offers a robust and automated solution.

This workflow is tailored for:

- **Market Researchers** - who analyze trends across multimedia channels
- **AI Developers** - who require clean, structured datasets for model fine-tuning
- **SEO & Content Analysts** - looking to monitor visibility across news, images, and videos
- **Media Researchers** - curating timely and relevant information across formats
- **Automation Engineers** - integrating search insights into downstream workflows

**What problem is this workflow solving?**

Traditional web scraping and search result parsing is fragmented, inconsistent, and error-prone, especially when dealing with multimedia (images, videos, news) data from search engines. This workflow provides:

- Centralized Brave search data extraction across all content types
- Search execution that switches based on the configured search type, e.g. news, images, videos, or all (a hedged sketch of this switch follows at the end of this section)
- Automated structured data transformation using Google Gemini
- Unified output persistence and notification across disk, webhook, and Google Sheets

**What this workflow does**

- **Input Configuration**: Define your Brave search query, set the search type (videos, images, news, or all), and configure your Bright Data MCP zone
- **Bright Data MCP Search Execution**: Initiates a Brave search via Bright Data MCP using the correct URL pattern for each search type and returns the raw HTML of the search results
- **Google Gemini LLM Structured Data Extraction**: Transforms raw results into structured data (e.g., title, URL, source, snippet)
- **Output Handling**: Saves to disk (e.g., a JSON or CSV file), sends a webhook notification with the structured data (e.g., to Slack or internal dashboards), and stores results in Google Sheets for team-wide access or dashboarding

**Pre-conditions**

- Knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below
- You need a Google Gemini API key. Visit Google AI Studio
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

**Setup**

1. Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine
3. Sign up at Bright Data
4. Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel: navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions
5. In n8n, configure the Google Gemini(PaLM) Api account with the Google Gemini API key (or access through Vertex AI or a proxy)
6. In n8n, configure the credentials for the MCP Client (STDIO) account to connect with the Bright Data MCP Server. Make sure to copy the Bright Data API token into the Environments textbox as API_TOKEN=<your-token>

**How to customize this workflow to your needs**

- **Enhance output analysis**: Add additional LLM prompts for topic classification, sentiment scoring, or trend forecasting
- **Output format options**: Choose to output CSV, Markdown, or HTML reports based on your integration target
- **Schedule automation**: Trigger the workflow on a schedule (daily/weekly) to keep monitoring topical content
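As a minimal sketch of the search-type switch, assuming Brave's public URL structure (the template's actual URL patterns may differ):

```javascript
// Hedged sketch: pick the Brave search URL per configured search type.
// URL patterns are assumptions based on Brave's public site structure.
const query = encodeURIComponent($json.search_query);
const searchType = $json.search_type; // 'news' | 'images' | 'videos' | 'all'

const patterns = {
  all:    `https://search.brave.com/search?q=${query}`,
  news:   `https://search.brave.com/news?q=${query}`,
  images: `https://search.brave.com/images?q=${query}`,
  videos: `https://search.brave.com/videos?q=${query}`,
};

// Fall back to the 'all' pattern when the type is unset or unrecognized
return [{ json: { target_url: patterns[searchType] ?? patterns.all } }];
```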
by Nasser
**For Who?**

- Content Creators
- YouTube Automation
- Marketing Teams

**How it works**

1. Enter your content idea in the Edit Fields node in a "raw" format, e.g., Boil Eggs Perfectly
2. An LLM creates 3 keyword requests based on the idea, and Apify scrapes the YouTube search results
3. Wait until the dataset is completed in Apify
4. Retrieve the dataset from Apify, calculate an approximate CTR, and filter the top-performing videos (a hedged sketch follows at the end of this section)
5. An LLM analyzes the patterns of the best-performing titles and creates a prompt based on them; another LLM creates 5 titles based on these criteria
6. An LLM analyzes the patterns of the best-performing thumbnails and creates a prompt based on them; another LLM creates 1 thumbnail based on these criteria
7. Return the titles and thumbnail in an HTML page

📺 YouTube Video Tutorial

**Setup**

Input (Content Idea):

- Enter a keyword related to the niche you want. The trigger can be replaced with anything as long as you retrieve a content idea, for example: form submission, database entry, etc.
- If you want to change the number of keywords, update the data accordingly in the "Create Keywords" LLM Chain node ➡️ Structured Output Parser AND in the "YTB Search Scrape" HTTP Request node in Body ➡️ JSON ➡️ searchQueries.
- If you want to change the number of scraped videos for each keyword, update the data accordingly in the "Create Videos Dataset" HTTP Request node in Body ➡️ JSON ➡️ maxResults.
- If you want to adjust the CTR calculation, update it in the Code node ➡️ follow the comments (after "//") to find what you're looking for.
- If you want to adjust the level of virality of the videos kept for analysis, go to the Filter node ➡️ Value.

Output (HTML Page):

- You can also replace this part with any type of storage, for example: an Airtable database, Google Drive/Google Sheets, sending an email, etc.

APIs: for the following third-party integrations, replace [YOUR_API_TOKEN] with your API token or connect your account via Client ID / Secret to your n8n instance:

- Apify: https://docs.apify.com/api/v2/getting-started
- OpenAI: https://platform.openai.com/docs/overview (base URL: https://api.openai.com/v1) OR OpenRouter: https://openrouter.ai/docs/quickstart (base URL: https://openrouter.ai/api/v1)
- HuggingFace (FLUX.1): https://huggingface.co/docs

👨‍💻 More Workflows: https://n8n.io/creators/nasser/
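A minimal sketch of the CTR approximation in the Code node, assuming views relative to subscriber count as a proxy (the template's actual formula may differ; true click-through rates are not exposed by YouTube's public data):

```javascript
// Hedged sketch of the CTR approximation. viewCount/subscriberCount field
// names are assumptions about the Apify dataset's shape.
const items = $input.all();

return items.map((item) => {
  const { viewCount = 0, subscriberCount = 0 } = item.json;
  // Proxy CTR: views per subscriber, capped so outliers stay comparable
  const ctrApprox = subscriberCount > 0
    ? Math.min(viewCount / subscriberCount, 10)
    : 0;
  return { json: { ...item.json, ctrApprox } };
});
```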
by Billy Christi
**Who is this for?**

This workflow is perfect for:

- **HR professionals** seeking to automate employee and department management
- **Startups and SMBs** that want an AI-powered HR assistant on Telegram
- **Internal operations teams** that want to simplify onboarding and employee data tracking

**What problem is this workflow solving?**

Managing employee databases manually is error-prone and inefficient—especially for growing teams. This workflow solves that by:

- Enabling natural language-based HR operations directly through Telegram
- Automating the creation, retrieval, and deletion of employee records in Airtable
- Dynamically managing related data such as departments and job titles
- Handling data consistency and linking across relational tables automatically (a hedged sketch follows at the end of this section)
- Providing a conversational interface backed by OpenAI for smart decision-making

**What this workflow does**

Using Telegram as the interface and Airtable as the backend database, this intelligent HR workflow allows users to:

- Chat in natural language (e.g. “Show me all employees” or “Create employee: Sarah, Marketing…”)
- Interpret and route requests via an AI Agent that acts as the orchestrator
- Query employee, department, and job title data from Airtable
- Create or update records as needed: add new departments and job titles automatically if they don’t exist, and create new employees linked to the correct department and job title
- Delete employees based on ID
- Respond directly in Telegram, providing user-friendly feedback

**Setup**

1. View & copy the Airtable base here: 👉 Employee Database Management – Airtable Base Template
2. Telegram bot: set up a Telegram bot and connect it to the Telegram Trigger node
3. Airtable: prepare three Airtable tables: Employees (with links to Departments and Job Titles), Departments (Name & Description), and Job Titles (Title & Description)
4. Connect your Airtable API key and base/table IDs in the appropriate Airtable nodes
5. Add your OpenAI API key to the AI Agent nodes
6. Deploy both workflows: the main chatbot workflow and the employee creation sub-workflow
7. Test with sample messages like: “Create employee: John Doe, john@company.com, Engineering, Software Engineer” or “Remove employee ID rec123xyz”

**How to customize this workflow to your needs**

- **Switch databases**: Replace Airtable with Notion, PostgreSQL, or Google Sheets if desired
- **Enhance security**: Add authentication and validation before allowing deletion
- **Add approval flows**: Integrate Telegram button-based approvals for sensitive actions
- **Multi-language support**: Expand system prompts to support multiple languages
- **Add logging**: Store every user action in a log table for auditability
- **Expand capabilities**: Integrate payroll, time tracking, or Slack notifications

**Extra Tips**

- This is a two-workflow setup. Make sure the sub-workflow is deployed and accessible from the main agent.
- Use Simple Memory per chat ID to preserve context across user queries.
- You can expand the orchestration logic by adding more tools to the main agent—such as “Get active employees only” or “List employees by job title.”
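As a minimal sketch of the record-linking step, the snippet below shows an Airtable create call with linked fields, in n8n Code-node style. Base ID, table names, and field names are placeholders; if your Code node does not expose fetch, use this.helpers.httpRequest instead:

```javascript
// Hedged sketch: create an employee linked to a department and job title.
// Airtable linked-record fields take arrays of record IDs, so the agent must
// first look up (or create) the related records. All IDs are placeholders.
const response = await fetch('https://api.airtable.com/v0/BASE_ID/Employees', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_AIRTABLE_API_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    fields: {
      Name: 'John Doe',
      Email: 'john@company.com',
      Department: ['recDepartmentId'],   // linked record in Departments
      'Job Title': ['recJobTitleId'],    // linked record in Job Titles
    },
  }),
});
return [{ json: await response.json() }];
```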
by Samir Saci
**Tags**: Scraping, Events, European Union, Networking

**Context**

Hey! I’m Samir, a Supply Chain Engineer and Data Scientist from Paris, and the founder of LogiGreen Consulting. We use AI, automation, and data to support sustainable and data-driven operations across all types of organizations. This workflow is part of our networking strategy (as a business) to track official EU events that may relate to topics we cover.

> Want to stay ahead of critical EU meetings and events without checking the website every day?

This n8n workflow automatically scrapes the EU’s official event portal and logs the latest entries with clean metadata including date, location, category, and link.

📬 For collaborations, feel free to connect with me on LinkedIn

**Who is this template for?**

This workflow is useful for:

- **Policy & public affairs teams** following institutional activities
- **Sustainability teams** watching for relevant climate-related summits
- **NGOs and researchers** interested in event calendars
- **Data teams** building dashboards on public event trends

**What does it do?**

This n8n workflow:

- 🌐 Scrapes the EU events portal for new meetings and conferences
- 📅 Extracts event metadata (title, date, location, type, and link)
- 🔁 Handles pagination across multiple pages
- 🚫 Checks for duplicates already stored
- 📊 Saves new records into a connected Google Sheet

**How it works**

1. Triggered daily via cron
2. An HTTP node loads the event listing HTML
3. Extract HTML blocks for each event article
4. Parse the event name, link, type, location, and full date
5. Concatenate and clean dates for easy tracking
6. Store non-duplicate entries in Google Sheets

The workflow uses static data to track pagination and ensure only new events are stored (a hedged sketch follows at the end of this section), making it ideal for building up a clean dataset over time.

**What do I need to get started?**

You’ll need:

- A Google Sheet connected to your n8n instance
- No code or AI tools needed — just n8n and this template

**Follow the Guide!**

Sticky notes are included directly inside the workflow to guide you step-by-step through setup and customisation.

🎥 Watch My Tutorial

**Notes**

- This is ideal for analysts and consultants who want clean, structured data from the EU portal
- You can add filtering, email alerts, or AI classifiers later
- This workflow was built using n8n version 1.93.0

Submitted: June 1, 2025
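A minimal sketch of the duplicate check using n8n's workflow static data (the template's actual key names are assumptions here; note that static data only persists when the workflow runs in production, not in manual test runs):

```javascript
// Hedged sketch: remember already-stored event links across daily runs.
const staticData = $getWorkflowStaticData('global');
staticData.seenEventLinks = staticData.seenEventLinks || [];

const newEvents = $input.all().filter(
  (item) => !staticData.seenEventLinks.includes(item.json.link)
);

// Record what we stored so the next run skips duplicates
staticData.seenEventLinks.push(...newEvents.map((item) => item.json.link));

return newEvents;
```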
by Ranjan Dailata
**Who this is for?**

The LinkedIn Profile Extract and JSON Resume Builder is a powerful workflow that scrapes professional profile data from LinkedIn using Bright Data's infrastructure, then transforms that data into a clean, structured JSON resume using Google Gemini. The workflow is ideal for automating resume parsing, candidate profiling, or integrating into recruiting platforms.

This workflow is tailored for:

- HR professionals & recruiters automating resume screening
- Talent acquisition platforms enriching candidate profiles
- Developers & AI builders creating resume-parsing AI pipelines
- Data scientists working on labor market analytics
- Growth hackers profiling prospects via public data

**What problem is this workflow solving?**

Parsing resumes or LinkedIn profiles into machine-readable formats is often a manual, error-prone process. Most scraping tools either fail due to anti-bot protections or return unstructured HTML that's hard to work with. This workflow solves that by:

- Using Bright Data's Web Unlocker for reliable, CAPTCHA-free LinkedIn scraping
- Extracting clean text and structured profile data via the Google Gemini LLM
- Automatically generating a standards-compliant JSON Resume and skills list
- Sending the resume to webhooks or storing it for downstream usage

**What this workflow does**

1. Accepts a LinkedIn profile URL and required metadata (Bright Data zone, webhook)
2. Scrapes the LinkedIn profile using Bright Data Web Unlocker
3. Extracts clean content and skills using the Google Gemini LLM
4. Builds a JSON-formatted resume following the JSON Resume schema (a hedged example follows at the end of this section)
5. Sends the JSON resume via webhook notification
6. Persists the output by saving the file to disk

**Setup**

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token.
4. In n8n, configure the Google Gemini(PaLM) Api account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set URL and Bright Data Zone node with the LinkedIn profile, the Bright Data zone, and the webhook notification URL. For testing purposes, you can obtain a webhook URL using https://webhook.site/

**How to customize this workflow to your needs**

- **Add language translation**: Insert a translation LLM node to support multilingual profiles.
- **Generate PDF resumes**: Convert the JSON to formatted PDF resumes using an HTML-to-PDF module.
- **Push to an ATS or CRM**: Add integration nodes to pipe data into applicant tracking systems (ATS), CRMs, or databases.
- **Use alternative LLMs**: Swap Gemini with OpenAI or Anthropic Claude if preferred.
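For reference, a hedged example of the JSON Resume shape the Gemini step targets. The schema itself is public (jsonresume.org); the values and the exact subset of fields emitted by the template are illustrative assumptions:

```javascript
// Illustrative JSON Resume object (https://jsonresume.org/schema).
const jsonResume = {
  basics: {
    name: 'Jane Example',
    label: 'Senior Data Engineer',
    summary: 'Drawn from the profile headline and about section.',
    location: { city: 'Berlin', countryCode: 'DE' },
    profiles: [{ network: 'LinkedIn', url: 'https://www.linkedin.com/in/jane-example' }],
  },
  work: [
    {
      name: 'Acme Corp',
      position: 'Data Engineer',
      startDate: '2021-03',
      summary: 'Mapped from the profile experience entries.',
    },
  ],
  skills: [{ name: 'Data Engineering', keywords: ['Python', 'Airflow', 'dbt'] }],
};

console.log(JSON.stringify(jsonResume, null, 2));
```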
by Sobek
📝 DESCRIPTION OF THE WORKFLOW

This workflow connects Salesforce and Geotab to streamline fleet tracking for field service jobs (Work Orders). When a new Work Order is created in Salesforce (with a 'New' status and valid coordinates), it creates a circular geofence zone in Geotab (a hedged API sketch follows at the end of this section) and updates the Work Order with the zone ID. If geolocation is missing, an alert email is sent to a dedicated address. The workflow uses a Salesforce Outbound Message to trigger an n8n webhook. It includes robust credential handling and optional logic to skip or notify on bad data.

**Use Cases:**

- Automating vehicle geofence setup for service visits
- Enhancing CRM-to-fleet system synchronisation
- Enforcing work order data quality via alerts

**Integrations Used:**

- Salesforce
- Geotab API
- Microsoft Outlook (or any SMTP-compatible service)

**Tags:** geotab, salesforce, fleet management, gps tracking, field service, crm, automation, webhook, integration

ADDITIONAL RESOURCES

🔗 Salesforce

- Salesforce Login
- Salesforce Setup (Admin Console): https://login.salesforce.com/ → click the “Setup” gear icon
- Outbound Messages Documentation
- Salesforce Developer Documentation
- Salesforce Workbench (API Testing Tool)

🔗 Geotab

- Geotab Login (MyGeotab)
- Geotab Developer Portal
- Geotab API Explorer
- Geotab SDK (JavaScript Samples)
- Geotab Support Centre
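As a sketch of the zone-creation step, assuming Geotab's public JSON-RPC style API (an "Add" method with typeName "Zone"): Geotab zones are polygons, so a "circular" geofence is approximated by points around the Work Order's coordinates. Field names follow the public Geotab SDK but are not copied from the template:

```javascript
// Hedged sketch: build a Geotab Add-Zone request body from Work Order data.
// $json.latitude/$json.longitude/$json.workOrderNumber are assumed fields.
function circlePoints(lat, lon, radiusMeters, count = 12) {
  const points = [];
  for (let i = 0; i < count; i++) {
    const angle = (2 * Math.PI * i) / count;
    points.push({
      // ~111,320 m per degree of latitude; longitude shrinks with cos(lat)
      x: lon + ((radiusMeters / 111320) * Math.cos(angle)) / Math.cos((lat * Math.PI) / 180),
      y: lat + (radiusMeters / 111320) * Math.sin(angle),
    });
  }
  return points;
}

const body = {
  method: 'Add',
  params: {
    typeName: 'Zone',
    entity: {
      name: `WO-${$json.workOrderNumber}`,
      points: circlePoints($json.latitude, $json.longitude, 150),
      mustIdentifyStops: true,
    },
    credentials: { /* Geotab session credentials go here */ },
  },
};
return [{ json: body }];
```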
by slow-groovin@api2o.com
**AI Comprehensive Research on User's Query with Gemini and Web Search**

**What is this?**

Perform comprehensive research on a user's query by dynamically generating search terms, querying the web using Google Search (via Gemini), reflecting on the results to identify knowledge gaps, and iteratively refining the search until it can provide a well-supported answer with citations (like Perplexity).

This workflow is a reproduction of gemini-fullstack-langgraph-quickstart in n8n. The gemini‑fullstack‑langgraph‑quickstart is a demo by the Google‑Gemini team that showcases how to build a powerful full‑stack AI agent using Gemini and LangGraph.

**How It Works**

- **Generate Query** 💬: generates one or more search query tasks based on the user's question, using Gemini 2.0 Flash
- **Web Research** 🌐: executes web search tasks using the native Google Search API tool in combination with Gemini 2.0 Flash
- **Reflection** 📚: identifies knowledge gaps and generates potential follow-up queries

**Setup**

1. Configure API credentials: create a Google Gemini(PaLM) Api credential using your own Gemini key, then connect the credential with three nodes: Google Gemini Chat Model, GeminiSearch, and reflection
2. Configure the Redis source: prepare a Redis service that can be accessed by n8n, create a Redis credential, and connect it with all Redis nodes

**Customize**

- Try using different Gemini models.
- Try modifying the parameters number_of_initial_queries and max_research_loops.

**Why use Redis?**

Redis serves as external storage to maintain global variables (counter, search results, etc.). This workflow contains a loop, which needs global variables (like State in LangGraph), and it is difficult to manage global variables without external storage in n8n. A hedged sketch of this pattern follows below.
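A minimal sketch of the loop state kept in Redis. The template uses Redis nodes rather than code; this shows the same idea with the node-redis client, and the key names (loop_count, search_results) are illustrative assumptions:

```javascript
import { createClient } from 'redis';

const redis = createClient({ url: 'redis://localhost:6379' });
await redis.connect();

const runId = 'research-run-123';
const maxResearchLoops = 3; // mirrors the max_research_loops parameter

// One research iteration: bump the counter and append this loop's results
const loop = await redis.incr(`${runId}:loop_count`);
await redis.rPush(`${runId}:search_results`, JSON.stringify({ loop, results: ['...'] }));

if (loop >= maxResearchLoops) {
  // Stop condition: hand all accumulated results to the final answer step
  const allResults = await redis.lRange(`${runId}:search_results`, 0, -1);
  console.log(`Finalizing with ${allResults.length} result batches`);
}
await redis.quit();
```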
by Jez
**Workflow: Automated Weekly Google Calendar Summary via Email with AI ✨🗓️📧**

Get a personalized, AI-powered summary of your upcoming week's Google Calendar events delivered straight to your inbox! This workflow automates the entire process, from fetching events to generating an intelligent summary and emailing it to you.

🌟 **Overview**

This n8n workflow connects to your Google Calendar, retrieves events for the upcoming week (Monday to Sunday, based on the day the workflow runs), uses Google Gemini AI to create a well-structured and insightful summary, and then emails this summary to you. It's designed to help you start your week organized and aware of your commitments.

Key Features:

- **Automated Weekly Summary:** Runs on a schedule (default: weekly) to keep you updated.
- **AI-Powered Insights:** Leverages Google Gemini to not just list events, but to identify important ones and offer a brief weekly outlook.
- **Personalized Content:** Uses your specified timezone, locale, name, and city for accurate and relevant information.
- **Clear Formatting:** Events are grouped by day and displayed chronologically with start and end times. Important events are highlighted.
- **Email Delivery:** Receive your schedule directly in your inbox in a clean HTML format.
- **Customizable:** Easily adapt to your specific calendar, AI preferences, and email settings.

⚙️ **How It Works: Step-by-Step**

The workflow consists of the following nodes, working in sequence:

1. **weekly_schedule (Schedule Trigger):** Initiates the workflow. By default it triggers once a week at 12:00 PM; you can adjust this to your preference (e.g., Sunday evening or Monday morning).
2. **locale (Set Node):** This is a crucial node for you to configure! It sets user-specific parameters like your preferred language/region (users-locale), timezone (users-timezone), your name (users-name), and your home city (users-home-city). These are used throughout the workflow for correct date/time formatting and personalizing the AI prompt.
3. **date-time (Set Node):** Dynamically generates various date and time strings based on the current execution time and the locale settings. This defines the precise 7-day window (from the current day to 7 days ahead, ending at midnight) for fetching calendar events.
4. **get_next_weeks_events (Google Calendar Node):** Connects to your specified Google Calendar and fetches all events within the 7-day window calculated by the date-time node. Requires Google Calendar API credentials and the ID of the calendar you want to use.
5. **simplify_evens_json (Code Node):** Runs a small JavaScript snippet to clean up the raw event data from Google Calendar. It removes several fields that aren't needed for the summary (like htmlLink, etag, iCalUID), making the data more concise for the AI. (A hedged sketch of this snippet appears at the end of this section.)
6. **aggregate_events (Aggregate Node):** Takes all the individual (and now simplified) event items and groups them into a single JSON array called eventdata. This is the format the AI agent expects for processing.
7. **Google Gemini (LM Chat Google Gemini Node):** The connection point to the Google Gemini language model. Requires Google Gemini (or PaLM) API credentials.
8. **event_summary_agent (Agent Node):** This is where the magic happens! It uses the Google Gemini model and a detailed system prompt to generate the weekly schedule summary. The prompt instructs the AI to:
   - Start with a friendly greeting.
   - Group events by day (Monday to Sunday) for the upcoming week, using the user's timezone and locale.
   - Format event times clearly (e.g., 09:30 AM - 10:30 AM: Event Summary).
   - Identify and prefix "IMPORTANT:" to events with keywords like "urgent," "deadline," "meeting," etc., in their summary or description.
   - Conclude with a 1-2 sentence helpful insight about the week's schedule.
   - Process the input eventdata (the JSON array of calendar events).
9. **Markdown (Markdown to HTML Node):** Converts the text output from the event_summary_agent (which is generated in Markdown format for easy structure) into HTML. This ensures the email body is well-formatted with proper line breaks, lists, and emphasis.
10. **send_email (Email Send Node):** Sends the final HTML summary to your specified email address. Requires SMTP (email sending) credentials and your desired "From" and "To" email addresses.

🚀 **Getting Started: Setup Instructions**

Follow these steps to get the workflow up and running:

1. **Import the Workflow:** Download the workflow JSON file. In your n8n instance, go to "Workflows", click the "Import from File" button, and select the downloaded JSON file.
2. **Configure Credentials:** You'll need to set up credentials for three services. In n8n, go to "Credentials" on the left sidebar and click "Add credential."
   - **Google Calendar API:** Search for "Google Calendar" and create new credentials using OAuth2. Follow the authentication flow. Once created, select these credentials in the get_next_weeks_events node.
   - **Google Gemini (PaLM) API:** Search for "Google Gemini" or "Google PaLM" and create new credentials. You'll typically need an API key from Google AI Studio or Google Cloud. Once created, select these credentials in the Google Gemini node.
   - **SMTP / Email:** Search for your email provider (e.g., "SMTP," "Gmail," "Outlook") and create credentials. This usually involves providing your email server details, username, and password/app password. Once created, select these credentials in the send_email node.
3. ‼️ **IMPORTANT: Customize User Settings in the locale Node.** Open the locale node and update the following values in the "Assignments" section:
   - users-locale: Set your locale string (e.g., "en-AU" for English/Australia, "en-US" for English/United States, "de-DE" for German/Germany). This affects how dates, times, and numbers are formatted.
   - users-timezone: Set your timezone string (e.g., "Australia/Sydney", "America/New_York", "Europe/London"). This is critical for ensuring event times are displayed correctly for your location.
   - users-name: Enter your name (e.g., "Bob"). This is used to personalize the email greeting.
   - users-home-city: Enter your home city (e.g., "Sydney"). This can be used for additional context by the AI.
4. **Configure the get_next_weeks_events (Google Calendar) Node:** In the "Calendar" parameter, specify which calendar to fetch events from. The default might be a placeholder like c_4d9c2d4e139327143ee4a5bc4db531ffe074e98d21d1c28662b4a4d4da898866@group.calendar.google.com. Change this to your primary calendar (often your email address) or the specific Calendar ID you want to use. You can find Calendar IDs in your Google Calendar settings.
5. **Configure the send_email Node:** Set the fromEmail parameter to the email address you want the summary to be sent from, and the toEmail parameter to the email address(es) where you want to receive the summary. You can also customize the subject line if desired.
6. **(Optional) Customize the AI Prompt in event_summary_agent:** If you want to change how the AI summarizes events (e.g., different keywords for important events, a different tone, or specific formatting tweaks), edit the "System Message" within the event_summary_agent node's parameters.
7. **(Optional) Adjust the Schedule in weekly_schedule:** Open the weekly_schedule node and modify the "Rule" to change when and how often the workflow runs (e.g., a specific day of the week, a different time).
8. **Activate the Workflow:** Once everything is configured, toggle the "Active" switch in the top right corner of the workflow editor to ON.

📬 **What You Get**

You'll receive an email (based on your schedule) with a subject like "Next Week Calendar Summary : [Start Date] - [End Date]". The email body will contain:

- A friendly greeting.
- Your schedule for the upcoming week (Monday to Sunday), with events listed chronologically under each day.
- Event times displayed in your local timezone (e.g., 09:30 AM - 10:30 AM: Team Meeting).
- Priority events clearly marked (e.g., IMPORTANT: 02:00 PM - 03:00 PM: Project Deadline Review).
- A brief, insightful observation about your week's schedule.

🛠️ **Troubleshooting & Notes**

- **Timezone is Key:** Ensure your users-timezone in the locale node is correct. This is the most common source of incorrect event times.
- **Google API Permissions:** When setting up Google Calendar and Gemini credentials, make sure you grant the necessary permissions.
- **AI Output Varies:** The AI-generated summary can vary slightly each time. The prompt is designed to guide it, but LLMs have inherent creativity.
- **Calendar Event Details:** The quality of the summary (especially for identifying important events) depends on how detailed your calendar event titles and descriptions are. Including keywords like "meeting," "urgent," "prepare for," etc., in your events helps the AI.

💬 **Feedback & Contributions**

Feel free to modify and enhance this workflow! If you have suggestions, improvements, or run into issues, please share them in the n8n community. Happy scheduling!
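As referenced in the node walkthrough above, a hedged sketch of the simplify_evens_json Code node; the template's actual snippet may prune more fields than the three named:

```javascript
// Drop Google Calendar fields the summary doesn't need, keep everything else.
return $input.all().map((item) => {
  const { htmlLink, etag, iCalUID, ...rest } = item.json;
  return { json: rest };
});
```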
by Naveen Choudhary
**Who is this for?**

Marketing, content, and enablement teams that need a quick, human-readable summary of every new video published by the YouTube channels they care about—without leaving Slack.

**What problem does this workflow solve?**

Manually checking multiple channels, skimming long videos, and pasting the highlights into Slack wastes time. This template automates the whole loop: detect a fresh upload from your selected channels → pull subtitles → distill the key take-aways with GPT-4o-mini → drop a neatly formatted digest in Slack.

**What this workflow does**

1. A Schedule Trigger fires every 10 minutes, then grabs a list of YouTube RSS feeds from a Google Sheet.
2. HTTP + XML nodes fetch and parse each feed; only brand-new videos continue.
3. The YouTube API fetches the title/description, and RapidAPI grabs the English subtitles.
4. Code nodes build an AI payload; OpenAI returns a JSON summary + article.
5. A formatter turns that JSON into Slack Block Kit, and Slack posts it. (A hedged sketch of this step follows at the end of this section.)
6. Processed links are appended back to the “Video Links” sheet to prevent dupes.

**Setup**

1. Make a copy of this Google Sheet and connect a Google Sheets OAuth2 credential with edit rights.
2. Slack App: create it → add chat:write, channels:read, app_mention; enable Event Subscriptions; install and store the Bot OAuth token in an n8n Slack credential.
3. RapidAPI key for https://yt-api.p.rapidapi.com/subtitles (300 free calls/mo) → save as HTTP Header Auth.
4. OpenAI key → save in an OpenAI credential.
5. Add your RSS feed URLs to the “RSS Feed URLs” tab; press Execute Workflow.

**How to customise**

- Adjust the schedule interval or freshness window in “If newly published”.
- Swap the OpenAI model or prompt for shorter/longer digests.
- Point the Slack node at a different channel or DM.
- Extend the AI payload to include thumbnails or engagement stats.

**Use-case ideas**

- **Product marketing**: Instantly brief sales & CS teams when a competitor uploads a feature demo.
- **Internal learning hub**: Auto-summarise conference talks and share bullet-point notes with engineers.
- **Social media managers**: Get ready-to-post captions and key moments for re-purposing across platforms.
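A minimal sketch of the Block Kit formatting step. The AI's JSON field names (title, summary, key_points) are assumptions, not the template's exact schema:

```javascript
// Hedged sketch: turn the AI's JSON digest into Slack Block Kit blocks.
const { title, summary, key_points = [], videoUrl } = $json;

const blocks = [
  { type: 'header', text: { type: 'plain_text', text: `🎬 ${title}` } },
  { type: 'section', text: { type: 'mrkdwn', text: summary } },
  {
    type: 'section',
    text: { type: 'mrkdwn', text: key_points.map((p) => `• ${p}`).join('\n') },
  },
  {
    type: 'context',
    elements: [{ type: 'mrkdwn', text: `<${videoUrl}|Watch on YouTube>` }],
  },
];

return [{ json: { blocks } }];
```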
by Dvir Sharon
🎯 **Automated TikTok Influencer Discovery & Analysis**

A complete n8n automation that discovers TikTok influencers using Bright Data, evaluates their fit using Claude AI, and sends personalized outreach emails. Designed for marketing teams and brands that need a scalable, intelligent way to find and connect with relevant creators.

📋 **Overview**

This workflow provides a full-service influencer discovery pipeline: it finds TikTok profiles using search keywords, uses AI to assess alignment with your brand, and initiates contact with qualified influencers. Ideal for influencer marketing, brand partnerships, and campaign planning.

✨ **Key Features**

- 🔍 **Keyword-Based Discovery**: Locate TikTok influencers by specific niche-related keywords.
- 📊 **Bright Data Integration**: Access accurate TikTok profile data from Bright Data’s datasets.
- 🤖 **AI-Powered Analysis**: Claude AI evaluates each profile's fit with your brand based on bio, content, and more.
- 📧 **Smart Email Notifications**: Sends tailored outreach emails to creators deemed highly relevant.
- 📈 **Data Storage**: Google Sheets stores profile details, AI evaluation results, and outreach status.
- 🎯 **Intelligent Filtering**: Processes only influencers who meet your criteria (e.g., 5,000+ followers, industry match).
- ⚡ **Fast & Reliable**: Uses professional scraping with robust error handling.
- 🔄 **Batch Processing**: Supports bulk influencer processing through a single automated flow.

🎯 **What This Workflow Does**

Input:

- **Search Keywords** – TikTok terms for finding niche creators
- **Business Info** – Brand description and industry
- **Collaboration Criteria** – Follower count minimum, niche alignment

Processing steps:

1. Form Submission
2. TikTok Discovery via Bright Data
3. Data Extraction and Normalization
4. Save to Google Sheets
5. Relevance Scoring via Claude AI
6. Filtering Based on AI Score + Follower Count
7. Personalized Email Outreach

Output data points:

| Field | Description | Example |
|---------------|------------------------------------|----------------------------------|
| Profile ID | TikTok profile identifier | tiktoker123456 |
| Username | TikTok handle | @creativecreator |
| URL | Profile link | https://tiktok.com/@creativecreator |
| Description | Creator bio | "Fashion & lifestyle content..." |
| Followers | Total follower count | 50,000 |
| Collaboration | AI assessment of brand fit | "Highly Relevant" |
| Analysis | 50-word Claude AI relevance summary| "Strong alignment with fashion..."|

🚀 **Setup Instructions**

Prerequisites:

- n8n (cloud or self-hosted)
- Bright Data account with TikTok access
- Google Sheets + Gmail
- Anthropic Claude API key
- 10–15 minutes setup time

Step-by-step setup:

1. Import the workflow via JSON in n8n
2. Configure Bright Data – add API credentials and the dataset ID
3. Google Sheets – set up credentials and map columns
4. Claude AI – insert the API key and select the desired model
5. Gmail – authenticate Gmail and update the mail node settings
6. Update variables – replace placeholders with your business info
7. Test & launch – submit a sample form and verify all outputs

📖 **Usage Guide**

**Adding search keywords**: Submit the form with search terms, business description, and industry category to trigger the workflow.
**Understanding the AI analysis**: Emails are sent only if the collaboration status is "Highly Relevant", the follower count is ≥ 5,000, and industry alignment is confirmed. Claude AI returns a 50-word analysis justifying the match. (A hedged sketch of this filter follows at the end of this section.)

**Customizing filters**: Edit the "Find the Collaborator" prompt to adjust follower thresholds, industry relevance, or additional metrics (e.g., engagement rate).

**Viewing results**: The Google Sheets log includes influencer metadata, AI scores and rationale, collaboration status, and the email delivery timestamp.

🔧 **Customization Options**

- **Add More Fields:** engagement rate, contact email, content themes
- **Email Personalization:** customize message templates or integrate other mail services
- **Enhanced Filtering:** use engagement rates, region, content frequency

🚨 **Troubleshooting**

| Issue | Fix |
|-------|-----|
| Bright Data fails | Recheck the API key and dataset ID |
| No influencer data | Adjust keywords or dataset scope |
| Sheets permission error | Re-authenticate and check sharing |
| Claude fails | Validate the API key and prompt |
| Emails not sent | Re-auth Gmail or update the recipient field |
| Form not triggering | Reconfirm the webhook URL and permissions |

Advanced debugging:

- Check the n8n execution logs
- Run individual nodes to pinpoint failures
- Confirm all data formats
- Handle API rate limits
- Add error-catch nodes for retries

📊 **Use Cases & Examples**

- **Brand Discovery:** fashion, tech, fitness creators
- **Competitor Insights:** find influencers used by rival brands
- **Campaign Planning:** build targeted influencer lists
- **Market Research:** identify creator trends across regions

⚙️ **Advanced Configuration**

- **Batch Execution:** process multiple keywords with delay logic
- **Engagement Metrics:** scrape and calculate likes-to-follower ratios
- **CRM Integration:** sync qualified profiles to HubSpot, Salesforce, or Slack

📈 **Performance & Limits**

- **Processing Time:** 3–5 minutes per keyword
- **Concurrency:** 3–5 simultaneous fetches (depends on plan)
- **Accuracy:** >95% influencer data reliability
- **Success Rate:** 90%+ for outreach and processing
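A minimal sketch of the qualification filter described above, as an n8n Code node. The column names (collaboration, followers) are assumptions mirroring the data-point table:

```javascript
// Keep only creators the AI rated "Highly Relevant" with enough followers.
const qualified = $input.all().filter((item) => {
  const { collaboration, followers } = item.json;
  return collaboration === 'Highly Relevant' && Number(followers) >= 5000;
});

// Only qualified creators continue to the personalized outreach email step
return qualified;
```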
by Roman Rozenberger
**How it works**

- **Extract AI Overviews from Google Search** – receives data from the browser extension via webhook
- **Convert HTML to Markdown** – automatically processes and cleans the AI Overview content
- **Store in Google Sheets** – archives all extracted AI Overviews with metadata and sources
- **Generate SEO Guidelines** – AI analyzes your page content vs. the AI Overview to suggest improvements
- **Automate Analysis** – batch-process multiple URLs and schedule regular checks

**Set up steps**

- **Import workflow** – load the JSON template into your n8n instance (2 minutes)
- **Configure Google Sheets** – set up the OAuth connection and create a spreadsheet with the required columns (5 minutes)
- **Set up AI provider** – add OpenRouter API credentials for Gemini 2.5 Pro (3 minutes)
- **Install browser extension** – deploy the companion Chrome/Firefox extension for data extraction (5 minutes)
- **Test webhook endpoint** – verify the connection between the extension and the n8n workflow (2 minutes)

Total setup time: ~15 minutes

**What you'll need:**

- Google account for Sheets integration
- Google Sheet template with the required columns
- OpenRouter API key for Gemini 2.5 Pro model access
- Browser extension: Chrome Extension or Firefox Add-on
- n8n instance (local or cloud)

**Use cases:**

- **SEO agencies** – monitor AI Overview presence for client keywords
- **Content marketers** – analyze what content gets featured in AI Overviews
- **E-commerce** – track AI Overview coverage for product-related searches
- **Research** – build datasets of AI Overview content across different topics

The workflow comes with a free browser extension (Chrome | Firefox) that automatically extracts AI Overview content from Google Search and sends it via webhook to your n8n workflow for processing and analysis. (A hedged sketch of the webhook payload appears at the end of this guide.)

GitHub Repository: https://github.com/romek-rozen/ai-overview-extractor/

**Detailed Setup Instructions - AI Overview Extractor**

Prerequisites:

- **n8n instance** (local or cloud) - version 1.95.3+
- **Google account** for Sheets integration
- **OpenRouter API account** for Gemini 2.5 Pro access
- **Browser** (Chrome/Firefox) for the extension

**Step 1: Import the Workflow**

1. Open n8n and navigate to Workflows
2. Click "Add workflow" → "Import from JSON"
3. Upload the AI_OVERVIES_EXTRACTOR_TEMPLATE.json file
4. Save the workflow

**Step 2: Configure Google Sheets**

Create the Google Sheets document:

1. Create a new Google Sheet with these columns: extractedAt | searchQuery | sources | markdown | myURL | task | guidelines | key
2. A public Google Sheets template is available here: https://docs.google.com/spreadsheets/d/15xqZ2dTiLMoyICYnnnRV-HPvXfdgVeXowr8a7kU4uHk/edit?gid=0#gid=0
3. Copy the Google Sheets URL (you'll need it for the workflow)

Set up Google Sheets credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "Google Sheets OAuth2 API"
3. Follow the OAuth setup to authorize n8n access to Google Sheets
4. Name the credential (e.g., "Google Sheets AI Overview")

Configure the Google Sheets nodes. Update these nodes with your Google Sheets URL:

- Get URLs to Analyze
- Save AI Overview to Sheets
- Save SEO Guidelines to Sheets

In each node:

- Set documentId to your Google Sheets URL
- Set sheetName to your Google Sheets URL
- Select your Google Sheets credential

**Step 3: Configure the AI Provider (OpenRouter)**

Get an OpenRouter API key:

1. Sign up at https://openrouter.ai/
2. Generate an API key in your account settings
3. Add credits to your account

Set up OpenRouter credentials:

1. In n8n, go to Settings → Credentials
2. Click "Add credential" → "OpenRouter API"
3. Enter your API key
4. Name the credential (e.g., "OpenRouter AI Overview")

Configure the OpenRouter node:

1. Select the Gemini 2.5 Pro Model node
2. Choose your credential from the dropdown
3. Verify the model (default: google/gemini-2.5-pro-preview)

**Step 4: Install the Browser Extension**

Install in Chrome (official extension, recommended):

1. Visit: https://chromewebstore.google.com/detail/ai-overview-extractor/cbkdfibgmhicgnmmdanlhnebbgonhjje
2. Click "Add to Chrome"

Install in Firefox (official add-on):

1. Visit: https://addons.mozilla.org/en-US/firefox/addon/ai-overview-extractor/
2. Click "Add to Firefox"

**Step 5: Configure the Webhook Connection**

Get the webhook URL:

1. In the n8n workflow, click on the Webhook node
2. Copy the webhook URL (it should look like: http://localhost:5678/webhook/ai-overview-extractor-template-123456789)

Configure the extension:

1. Go to Google Search and perform any search with an AI Overview
2. Click the browser extension button (AI Overview Extractor)
3. In the webhook configuration section, paste your webhook URL
4. Click "Test" - it should show ✅ Test successful
5. Click "Save" to store the configuration

**Step 6: Activate and Test**

Activate the workflow:

1. In n8n, toggle the workflow to "Active" (top-right switch)
2. Verify all nodes are properly configured

Test end-to-end:

1. Go to Google Search
2. Search for something that shows an AI Overview
3. Use the extension to extract the AI Overview
4. Send it via webhook - check your Google Sheets for the data
5. Verify the markdown conversion worked correctly

**Optional: Batch Analysis Setup (SEO analysis features)**

1. In your Google Sheets, add URLs in the myURL column
2. Set the task column to "create guidelines"
3. Run the workflow manually or wait for the 15-minute scheduler
4. Check the guidelines column for AI-generated SEO recommendations

**Troubleshooting**

Webhook issues:

- Ensure n8n is running on port 5678
- Check that the workflow is activated
- Verify the webhook URL format

Google Sheets errors:

- Confirm the OAuth credentials are working
- Check the sheet URL format
- Verify the column names match exactly
- Ensure the nodes Get URLs to Analyze, Save AI Overview to Sheets, and Save SEO Guidelines to Sheets are properly configured

OpenRouter issues:

- Check the API key validity
- Ensure sufficient account credits
- Try different models if Gemini 2.5 Pro fails
- Verify the Gemini 2.5 Pro Model node is properly connected

Extension problems:

- Check the browser console for errors
- Verify the extension is properly installed
- Ensure you're on google.com/search pages
- Confirm the webhook URL is correctly configured in the extension

**Next Steps**

- **Customize AI prompts** in the Generate SEO Recommendations node for your specific needs
- **Adjust the scheduler frequency** (default: 15 minutes)
- **Add more URL analysis** by populating Google Sheets
- **Monitor usage** and API costs

**Support**

- **GitHub Issues**: https://github.com/romek-rozen/ai-overview-extractor/issues
- **n8n Community**: https://community.n8n.io/
- **Template Documentation**: check the included README files
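For reference, a hedged sketch of what the extension's webhook POST might look like, useful for testing the endpoint without the extension. The payload fields are assumptions inferred from the sheet columns (searchQuery, sources, markdown); the extension's actual payload may differ:

```javascript
// Simulate the extension's webhook call against a local n8n instance.
await fetch('http://localhost:5678/webhook/ai-overview-extractor-template-123456789', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    searchQuery: 'best running shoes 2025',
    htmlContent: '<div>...raw AI Overview HTML...</div>', // converted to Markdown by the workflow
    sources: ['https://example.com/article'],
    extractedAt: new Date().toISOString(),
  }),
});
```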