by Ranjan Dailata
## Who this is for

This workflow is designed for professionals and teams who need real-time, structured insights from Google Search results without manual effort.

## What problem is this workflow solving

This n8n workflow solves the problem of automating Google Search result extraction, cleanup, summarization, and AI-enhanced formatting for downstream use, such as sending the results to a webhook or another system.

## What this workflow does

**Automates Google Search via Bright Data**
- Uses Bright Data's proxy-based SERP API to run a Google Search query programmatically (a request sketch appears at the end of this section).
- Makes the process repeatable and scriptable with different search terms and regions/zones.

**Cleans and Extracts Useful Content**
- The Google Search Data Extractor uses LLM-based cleaning to remove HTML/CSS/JS from the response and extract pure text data.
- Converts messy, unstructured web content into a structured, machine-readable format.

**Summarizes Search Results**
- Through the Gemini Flash + Summarization Chain, it generates a concise summary of the search results.
- Ideal for users who don't have time to read full pages of search results.

**Formats Data Using an AI Agent**
- The AI Agent acts like a virtual assistant that:
  - Understands the search results
  - Formats them in a readable, JSON-compatible form
  - Prepares them for webhook delivery

**Delivers Results to a Webhook**
- Sends the final summary plus the structured search results to a webhook (your app, a Slack bot, Google Sheets, or a CRM).

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. Make sure you have a Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Google Search query as you wish in the Set Google Search Query node.
6. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

**1. Change the Search Input**
- Default: searches a fixed query or dataset.
- Customize: accept input from a Google Sheet, Airtable, or a form; auto-trigger searches based on keywords or schedules.

**2. Customize the Summarization Style (LLM Output)**
- Default: a general summary using Google Gemini or OpenAI.
- Customize: add a tone: formal, casual, technical, executive-summary, etc.
- Focus on specific sections: pricing, competitors, FAQs, etc.
- Translate the summaries into multiple languages.
- Add bullet points, pros/cons, or insight tags.

**3. Choose Where the Results Go**
- Options: Email, Slack, Notion, Airtable, Google Docs, or a dashboard.
- Auto-create content drafts for WordPress or newsletters.
- Feed into CRM notes or attach to Salesforce leads.
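For reference, here is a minimal sketch of the kind of request the Bright Data step makes, written as Code-node-style JavaScript. The zone name (`web_unlocker1`), the query, and the environment variable are assumptions for illustration; check your own zone settings and the template's HTTP Request node for the exact parameters.

```javascript
// Minimal sketch: run a Google Search through Bright Data's Web Unlocker.
// Assumes a zone named "web_unlocker1" and a token in an env var.
const query = encodeURIComponent('n8n workflow automation');

const response = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.BRIGHT_DATA_TOKEN}`, // Web Unlocker token
  },
  body: JSON.stringify({
    zone: 'web_unlocker1',                            // your Web Unlocker zone name
    url: `https://www.google.com/search?q=${query}`,  // the search to run
    format: 'raw',                                    // return the raw HTML body
  }),
});

const html = await response.text(); // messy HTML, cleaned up by the LLM step next
return [{ json: { html } }];
```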
by Ranjan Dailata
## Who this is for

- Researchers who need structured information from Wikipedia pages regularly.
- Data Engineers building knowledge bases or enriching datasets with factual data.
- Digital Marketers or Content Writers automating fact-checking or content sourcing.
- Automation Enthusiasts who want to trigger external systems with rich context from Wikipedia.

## What problem is this workflow solving

This workflow addresses the challenges of manually retrieving, structuring, and using data from Wikipedia at scale.

## What this workflow does

This workflow automates Wikipedia data extraction using the Bright Data Web Unlocker, parses and cleans the data, and then sends the results to a specified webhook URL for downstream processing, reporting, or integration.

### Workflow Breakdown

**Trigger**
- Type: Scheduled or Manual.
- Purpose: Starts the workflow either on a fixed schedule (e.g., daily) or on demand via a manual trigger or incoming webhook.

**Bright Data Wikipedia Scraping**
- Tool used: Bright Data Web Unlocker.
- Action: Scrapes the HTML content of one or multiple Wikipedia article URLs.

**Parse & Extract Structured Data**
- The Basic LLM Chain node is responsible for producing human-readable content.

**Summarization**
- Summarizes the Wikipedia content using the Summarization Chain node.

**Send to Webhook**
- Sends a webhook notification to the specified URL via the "Summary Webhook Notifier" node (a delivery sketch appears at the end of this section).

## Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). Set the Value field to `Bearer XXXXXXXXXXXXXX`, replacing `XXXXXXXXXXXXXX` with your Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Set Wikipedia URL with Bright Data Zone node with the Wikipedia URL and Bright Data zone.
6. Update the Summary Webhook Notifier node with the webhook endpoint of your choice.

## How to customize this workflow to your needs

**Update the Wikipedia URL**
- Replace it with a Wikipedia URL of your interest. Make sure to set it in the "Set Wikipedia URL with Bright Data Zone" node.

**Modify the Data Extraction Logic**
- Extract the entire article content or just specific sections by extending the "LLM Data Extractor" node prompt.

**Extend the AI Summarization**
- Extract key bullet points or entities, or create short-form summaries, by extending the "Concise Summary Generator" node.

**Extend the Summary Webhook Notifier**
- Send to Slack, Discord, Telegram, or MS Teams via the webhook notification mechanism.
- Connect to your internal database/API via the webhook notification mechanism.
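To make the final delivery step concrete, here is a minimal sketch of what the "Summary Webhook Notifier" call amounts to. The endpoint URL and payload fields are assumptions for illustration; the actual node maps whatever fields your extraction and summarization steps produce.

```javascript
// Minimal sketch: forward the article summary to a downstream webhook.
// The endpoint and payload shape are hypothetical; adjust to your receiver.
const payload = {
  source_url: 'https://en.wikipedia.org/wiki/Workflow', // the scraped article
  summary: $json.summary,       // output of the Summarization Chain node
  extracted_at: new Date().toISOString(),
};

await fetch('https://example.com/hooks/wiki-summary', { // your webhook URL
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload),
});

return [{ json: payload }]; // pass the payload along in n8n
```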
by Jimleuk
This n8n template reviews and audits recently active Google Drive files and reports on files with excessively open permissions. It shows how you can automate simple compliance tasks for access control management.

File sharing permissions are routinely abused when access needs and scopes expand to many colleagues, clients and users. Often, granting excessively open permissions means you can get back to work rather than deal with numerous access request notifications. Whilst sometimes justified, the problem is that the permissions are rarely reverted to a safer setting later, when they are no longer needed. This template improves your security posture by giving frequent reminders of these open files so that they can be actioned and not forgotten about.

See an example audit report here: https://docs.google.com/spreadsheets/d/1V2aiLhp3_nH7EBniMn7D0kFHg7-A5NjpDZXMhb4F5UI/edit?gid=503992967

## How it works

- A scheduled trigger runs every day to generate a new audit report.
- A new sheet is created in a designated Google Sheets document to store the day's results.
- The Google Drive node is used with advanced search params to fetch recently modified files for the user, with each file result containing the current permission settings.
- The results are filtered for those that are publicly accessible ("anyone with link") or shared with external users via domain (see the permission-check sketch below).
- The results are then manipulated into rows so that we can append them to the sheet we created earlier.
- The audit Google Sheet is updated with the results and an audit report is sent to the user to action.

## How to use

- Set the scheduled trigger to an interval which works for you or your organisation.
- Consider using allowlists for organisations you frequently share with, to reduce the number of false positives.
- The results can be forwarded to other security or analytical products as required.

## Requirements

- Google Drive for document management
- Google Sheets for reports and data collection
- Gmail to email reports

## Customising the workflow

- Not using Google? Apply the same approach using Microsoft SharePoint or Dropbox.
- If your security policies require it, you could automate fixing the file permissions as a proactive action instead, and notify the user afterwards.
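As a reference for the filtering step, here is a minimal sketch of how such a permission check can look in an n8n Code node. It assumes the Google Drive node returned each file with a `permissions` array in the Google Drive API shape, and `internal.example.com` stands in for your own domain.

```javascript
// Minimal sketch: flag files whose permissions are too open.
// Assumes items carry Google Drive API file objects with a `permissions` array.
const OWN_DOMAIN = 'internal.example.com'; // replace with your org's domain

const flagged = $input.all().filter((item) => {
  const permissions = item.json.permissions ?? [];
  return permissions.some(
    (p) =>
      p.type === 'anyone' ||                            // "anyone with the link"
      (p.type === 'domain' && p.domain !== OWN_DOMAIN)  // shared with an external domain
  );
});

return flagged; // only the risky files continue on to the report
```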
by merfy
## Use Case

Manually extracting images from PDF files for analysis is often slow and inefficient. Many users resort to taking screenshots of each page, uploading them to an AI tool like OpenAI for image analysis, and then manually copying the insights into a document. This manual process is time-consuming and prone to errors.

This workflow streamlines the entire process by automatically extracting images from a PDF, analyzing them using the GPT-4o model, and saving the results in seconds, eliminating the need for manual effort.

## What This Workflow Does

**Extracts all images from the uploaded PDF file automatically**
The workflow scans each page of the PDF and identifies embedded images without manual intervention.

**Uses the GPT-4o model to analyze each extracted image**
Each image is processed through GPT-4o to generate descriptive insights, summaries, or context-specific analysis depending on the use case.

**Saves the analysis results to a .txt file, including image URLs**
The final output is a plain text file containing both the image URLs (e.g., hosted on cloud storage) and the corresponding GPT-4o analysis, ready for further use or sharing.

## Setup

1. Set up your credentials when you first open the workflow. You'll need accounts for OpenAI, ConvertAPI, and Google Drive.
2. ConvertAPI does not rate-limit your requests, but you may sometimes receive a 503 Service Unavailable error. This doesn't mean your file can't be converted; simply retry the conversion after a few seconds.
3. Upload a PDF with images to Google Drive.
4. Remove unnecessary parts and retrieve the image-related information.
5. Integrate the image and image analysis information together.
6. Analyze each image using the OpenAI GPT-4o model.
7. Retrieve all image analysis content and image URLs.
8. Combine the multiple image URLs and analysis content (a sketch of this step follows below).
9. Output the content to a .txt file.

The template was created in n8n v1.83.2.

## How to Customize

- Replace the manual trigger with a Google Drive trigger or other automation triggers.
- Change the image analysis model (e.g., switch or fine-tune GPT-4o).
- Send the results to other platforms (e.g., Slack, Telegram, LINE, etc.) instead of saving to a .txt file.
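For steps 7–9, a Code node along these lines could merge each image URL with its GPT-4o analysis into the text that gets written out. The field names (`image_url`, `analysis`) are assumptions for illustration; map them to whatever your extraction and OpenAI nodes actually output.

```javascript
// Minimal sketch: combine image URLs and their analyses into one text blob.
// Field names are hypothetical; align them with your upstream nodes' output.
const sections = $input.all().map((item, i) => {
  const { image_url, analysis } = item.json;
  return `Image ${i + 1}: ${image_url}\n${analysis}`;
});

// One item whose `report` field can feed a "Convert to File" (.txt) step.
return [{ json: { report: sections.join('\n\n---\n\n') } }];
```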
by GuanNan
## Who is this for?

This template is designed for anyone who wants to integrate MCP with their AI Agents. Whether you're a developer, a data analyst, or an automation enthusiast, if you're looking to leverage the power of MCP and Google Calendar in your n8n workflows, this template is for you.

## What problem is this workflow solving?

This template caters to MCP beginners seeking a hands-on example and to developers looking to integrate the Google Calendar MCP service. When integrating MCP with Google Calendar, manually updating AI Agents after changes to the Google Calendar tools on the MCP Server is time-consuming and error-prone. This template automates the process, enabling the AI Agent to instantly recognize changes made to Google Calendar tools on the MCP Server. In project management, for example, it ensures that task schedule updates in Google Calendar are automatically detected by the AI Agent. With detailed steps, it simplifies the integration process for all users.

## What this workflow does

This workflow focuses on integrating MCP with Google Calendar within n8n. Specifically, it lets you build an MCP Server and Client using Google Calendar nodes in n8n. Any changes made to the Google Calendar tools on the MCP Server are automatically recognized by the MCP Client in the workflow. This means you can make changes to your Google Calendar (such as adding, deleting, or modifying events) on the MCP Server, and the MCP Client in the n8n workflow will immediately detect these changes without any manual intervention.

## Setup

**Requirements**

- An active n8n account.
- Access to the Google Calendar API. You need to enable the Google Calendar API and create the necessary credentials (OAuth 2.0 client ID).
- Basic knowledge of n8n workflows and MCP concepts.

**Step-by-step guide**

1. Create a new workflow in n8n: log in to your n8n account and create a new workflow.
2. Add Google Calendar nodes: search for and add the Google Calendar nodes to your workflow, then configure them with your Google Calendar API credentials.
3. Set up the MCP Server and Client: use the appropriate nodes in n8n to set up the MCP Server and Client, and connect the Google Calendar nodes to the MCP nodes as required.
4. Test the workflow: make some changes to your Google Calendar on the MCP Server and check whether the MCP Client in the n8n workflow detects them.

## How to customize this workflow to your needs

- **Modify the triggers**: You can change the conditions under which the MCP Client detects changes. For example, you can set it to detect only specific types of events in Google Calendar.
- **Integrate with other services**: You can add more nodes to the workflow to integrate with other services, such as sending notifications to Slack or saving data to a database when a change is detected.
by Jimleuk
This n8n template imports purchase order submissions from Outlook and converts attached purchase order forms in XLSX format into structured output.

Data entry jobs with user-submitted XLSX forms are time-consuming, incredibly mundane but necessary tasks which in all likelihood are inherited and critical to business operation. While we could dream of system overhauls and modernisation, the fact is that change is hard. There is another way, however - using n8n and AI! n8n offers an end-to-end solution to parse XLSX form attachments using LLM-powered OCR and send the extracted output to your ERP or elsewhere.

## How it works

- An Outlook trigger is used to watch for incoming purchase order forms submitted via a shared inbox.
- The email attachment for the submission is a form in XLSX format - like this one: Purchase Order Example - which is imported into the workflow.
- The 'Extract from File' node is used with the 'Code' node to convert the XLSX file to markdown, so that our LLM can understand it (see the sketch below).
- The Information Extractor node reads and extracts the relevant purchase order details and line items from the form.
- A simple validation step checks for common errors such as a missing PO number or amounts not matching up. A reply to the buyer is automated if so.
- Once validation passes, a confirmation is sent to the buyer and the structured purchase order output can be sent along to internal systems.

## How to use

- This template only works if you're expecting and receiving forms in XLSX format. These can be invoices and request forms as well as purchase order forms.
- Update the Outlook nodes with your email or other emails as required.

## What's next?

I've omitted the last steps to send to an ERP or accounting system, as this is dependent on your org.

## Requirements

- Outlook for emails. Check out how to set up credentials here: https://docs.n8n.io/integrations/builtin/credentials/microsoft
- OpenAI for LLM document understanding and extraction.

## Customising the workflow

- This template should work for other Excel files. Some will be more complicated than others, so experiment with different parsers, extraction tools and strategies.
- Customise the Information Extractor schema to pull out the specific data you need. For example, capture any notes or comments given by the buyer.
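As a reference for the XLSX-to-markdown step, here is a minimal Code-node-style sketch. It assumes the 'Extract from File' node produced one item per spreadsheet row, with the column headers as JSON keys; the template's actual code may differ.

```javascript
// Minimal sketch: turn extracted spreadsheet rows into a markdown table
// so the LLM can read the form. Assumes one n8n item per row, keyed by header.
const rows = $input.all().map((item) => item.json);
if (rows.length === 0) return [{ json: { markdown: '' } }];

const headers = Object.keys(rows[0]);
const lines = [
  `| ${headers.join(' | ')} |`,
  `| ${headers.map(() => '---').join(' | ')} |`,
  ...rows.map((row) => `| ${headers.map((h) => row[h] ?? '').join(' | ')} |`),
];

return [{ json: { markdown: lines.join('\n') } }];
```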
by Yaron Been
# Scrape Indeed Job Listings for Hiring Signals Using Bright Data and LLMs

## How the flow runs

1. Fill in the form with the job position you're hunting for.
2. Bright Data's scraper scrapes Indeed based on your requirements (see the trigger sketch at the end of this section).
3. The workflow waits for the snapshot.
4. Data returns as JSON.
5. Jobs are appended to Google Sheets.
6. Each row goes to an LLM to analyze whether you're a good fit for the job (based on your prompts).
7. The LLM writes YES or NO next to each job opportunity, helping you find job posts that are relevant to you.

## What you need

- Google Sheets with our template.
- Bright Data dataset and API key.
- OpenAI key for GPT-4o mini (or any other LLM).
- n8n with the required nodes.

## Form fields to fill

- **Job Location** – city or region.
- **Keyword** – role or skills.
- **Country** – two-letter code.

## Setup steps

1. Copy the sheet template link.
2. Import the JSON workflow.
3. Add your credentials in the nodes.
4. Test the form manually.
5. Add a schedule if desired.

## Bright Data filter example

```json
[
  {
    "country": "US",
    "domain": "indeed.com",
    "keyword_search": "Growth Marketer",
    "location": "Miami",
    "date_posted": "Last 24 hours"
  }
]
```

## Tips

- Choose "Last 24 hours" often.
- Increase the wait time for big snapshots.
- Narrow keywords to save credits.

**Need help?** Email me anytime: Yaron@nofluff.online

- YouTube: @YaronBeen
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Bright Data Docs: https://docs.brightdata.com/introduction
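For orientation, triggering a Bright Data dataset scrape with the filter above looks roughly like the sketch below. The dataset ID is a placeholder and the snapshot handling is simplified; consult the Bright Data docs linked above for the exact trigger-and-poll flow.

```javascript
// Minimal sketch: trigger a Bright Data dataset collection for Indeed jobs.
// The dataset_id is a placeholder; the filter matches the example above.
const filter = [
  {
    country: 'US',
    domain: 'indeed.com',
    keyword_search: 'Growth Marketer',
    location: 'Miami',
    date_posted: 'Last 24 hours',
  },
];

const res = await fetch(
  'https://api.brightdata.com/datasets/v3/trigger?dataset_id=gd_XXXXXXXX&format=json',
  {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}`,
    },
    body: JSON.stringify(filter),
  }
);

const { snapshot_id } = await res.json(); // poll this snapshot until it's ready
return [{ json: { snapshot_id } }];
```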
by Ifeoluwa Ajetomobi
This workflow helps you stay updated with daily launches on Product Hunt. It automatically fetches product details (name, tagline, description, and website), checks if the website redirects to another URL, and logs the final information into a Google Sheet.

Perfect for indie hackers, product managers, content curators, and anyone tracking daily launches.

## How It Works

1. Schedule Trigger – runs the workflow daily.
2. Set Date – captures today's date in ISO format for filtering Product Hunt posts.
3. HTTP Request (Product Hunt API) – retrieves Product Hunt posts for the day using GraphQL (a query sketch follows at the end of this section).
4. Extract Product Info (Code Node) – parses the response to pull key details: name, tagline, description, website URL.
5. HTTP Request (URL Check) – follows each website URL to detect if it redirects.
6. Merge Data – combines the product info with the final destination URL.
7. Google Sheets Node – appends all processed product info to your sheet.

## Pre-conditions

- A valid Product Hunt API token
- A Google account with access to Google Sheets
- A Google Sheet already created with the correct columns (see below)
- Connected Google Sheets and HTTP credentials in n8n

## Google Sheets Setup

Your spreadsheet should include the following columns (in order):

1. Name
2. Tagline
3. Description
4. Original URL
5. Final URL (after redirect)

Ensure your Google Sheets node uses the correct Spreadsheet ID and Sheet Name.

## Setup Instructions

**Product Hunt API Auth:** Replace `{{YOUR_PRODUCT_HUNT_API_KEY}}` in the HTTP Request headers:

```json
{
  "Authorization": "Bearer {{YOUR_PRODUCT_HUNT_API_KEY}}"
}
```

**Google Sheets Node:**

- Connect your Google account.
- Insert your Spreadsheet ID in the settings.
- Specify the sheet name (e.g., Daily Launches).
- Use the "Append" operation and map the 5 data fields accordingly.

## Notes

- Only fetches the first 10 posts for the day (can be extended).
- Consider adding Slack, Discord, or Email nodes to notify you of new entries.
- Useful for building launch databases, research, or content inspiration.
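As a reference for step 3, the request sent to the Product Hunt GraphQL API can look roughly like this sketch. The field selection mirrors the sheet columns above; treat the exact argument and field names as assumptions to verify against the Product Hunt API v2 schema.

```javascript
// Minimal sketch: fetch today's Product Hunt posts via the GraphQL API.
// Verify argument and field names against the Product Hunt v2 schema.
const query = `
  query ($postedAfter: DateTime!) {
    posts(first: 10, postedAfter: $postedAfter) {
      edges {
        node { name tagline description website }
      }
    }
  }`;

const todayStart = new Date();
todayStart.setUTCHours(0, 0, 0, 0);

const res = await fetch('https://api.producthunt.com/v2/api/graphql', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: `Bearer ${process.env.PRODUCT_HUNT_API_KEY}`,
  },
  body: JSON.stringify({ query, variables: { postedAfter: todayStart.toISOString() } }),
});

const { data } = await res.json();
return data.posts.edges.map(({ node }) => ({ json: node })); // one n8n item per post
```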
by Yang
## Who is this for?

This workflow is built for marketers, sales teams, agencies, virtual assistants, and anyone who regularly researches or contacts local businesses. It's ideal for building lead lists, tracking competitors, or creating location-specific outreach campaigns.

## What problem is this workflow solving?

Instead of manually searching Google Maps and copying business info into spreadsheets, this automation pulls structured business data (e.g., restaurants, gyms, service providers) and logs it directly into Google Sheets. It saves hours of work and ensures cleaner, more usable data.

## What this workflow does

The workflow takes a Google Maps search query (like "best restaurants in New York") and sends it to Dumpling AI. It returns a list of places including their name, address, website, phone number, rating, and more. Each result is split into a row and automatically added to a Google Sheet.

## Setup

**Dumpling AI**

1. Sign up at Dumpling AI.
2. Generate your API key.
3. In the HTTP Request node, select Header Auth and paste your key in the Authorization field.

**Google Sheets**

1. Create a sheet with a tab named Leads.
2. Add the following column headers to row 1: Name, Address, Phone number, Website, Rating, Price Level, Type, Booking Link, Position.
3. Connect your Google Sheets account and link this sheet in the node.

**Customize the Query**

- In the HTTP node, replace the query string (e.g., "best+restaurants+in+New+York") with your own search term.

**Run It**

- Use the manual trigger to test.
- Optionally swap in a Schedule or Webhook node to run it automatically.

## How to customize this workflow to your needs

- Change the search query to target different cities or business types.
- Use filters to only save leads with a minimum rating or price level (see the sketch below).
- Add GPT to summarize listings or qualify leads.
- Swap Google Sheets for Airtable or a CRM system for deeper integration.
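For the rating/price filter idea above, a small Code node between the HTTP Request and Google Sheets nodes could look like this sketch. The field names (`rating`, `priceLevel`) and thresholds are assumptions; match them to the actual keys in Dumpling AI's response.

```javascript
// Minimal sketch: keep only well-rated, affordable places before writing rows.
// Field names are hypothetical; align them with Dumpling AI's response keys.
const MIN_RATING = 4.0;
const MAX_PRICE_LEVEL = 2; // e.g. "$$" and below

return $input.all().filter((item) => {
  const { rating = 0, priceLevel = 0 } = item.json;
  return rating >= MIN_RATING && priceLevel <= MAX_PRICE_LEVEL;
});
```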
by RealSimple Solutions
# 🧠 Analyze and Diagnose n8n Workflow Errors Automatically via OpenAI and Email

> ⚠️ This template is available on ☁️ Cloud & 🖥️ self-hosted n8n instances with the OpenAI node enabled.

## 👤 Who is this for?

This workflow is designed for n8n developers, automation engineers, and DevOps teams who want to automatically capture and analyze workflow errors, and receive professional HTML-styled diagnostics directly in their inbox.

## 💥 What problem does this solve?

Manually troubleshooting failed workflows in n8n can be time-consuming. This template streamlines error detection by:

- Capturing workflow failures using the Error Trigger node
- Diagnosing root causes with the help of OpenAI
- Sending a fully formatted, human-readable HTML error report via email
- Including practical resolutions and next-step suggestions

This helps you or your team resolve issues faster and avoid repeated manual debugging.

## ⚙️ What this workflow does

- ⚡ Triggers on any n8n workflow error
- 📦 Extracts relevant error metadata, including the node, execution ID, and timestamps (see the sketch below)
- 🧠 Sends the error content to OpenAI for analysis and recommendations
- 💌 Generates an HTML email report with inline styles and clear formatting
- 📥 Emails the result to a system administrator or support email

## 🛠️ Setup

1. Install the OpenAI node in your self-hosted n8n instance.
2. Add your OpenAI API key securely in credentials.
3. Configure the SMTP Email node with your email credentials.
4. Adjust the Error Trigger to monitor specific workflows or all workflows.
5. Set your preferred admin or dev email address in the final node.

## 🔧 How to customize this workflow to your needs

- 🧩 Use a Set node to define your variables, such as the default admin email and an optional workflow filter
- ✍️ Customize the prompt sent to OpenAI if you want deeper or more specific analysis
- 🎨 Modify the email HTML styles to match your brand or internal format
- 💾 Add additional logging (e.g., to Airtable, Google Sheets, or Notion) for long-term error tracking

## 📌 Sticky Note

**Title:** Automated Error Reporter with AI-Powered Diagnosis

**Description:** Captures any n8n error, sends it to OpenAI, and emails a beautiful HTML report to the administrator with steps to resolve the issue. Requires OpenAI credentials and SMTP configured.
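To give a feel for the metadata-extraction step, here is a minimal Code-node-style sketch of reading the Error Trigger's payload. The shape follows n8n's error-trigger output, but treat the exact paths as something to verify on your own instance.

```javascript
// Minimal sketch: pull the key fields from the Error Trigger payload.
// Verify the exact paths against your instance's error-trigger output.
const { execution = {}, workflow = {} } = $json;

return [
  {
    json: {
      workflowName: workflow.name,
      workflowId: workflow.id,
      executionId: execution.id,
      executionUrl: execution.url,              // deep link to the failed run
      failedNode: execution.lastNodeExecuted,
      errorMessage: execution.error?.message,
      occurredAt: new Date().toISOString(),
    },
  },
];
```

These fields can then be interpolated into the OpenAI prompt and the HTML email body.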
by SamirLiu
## 📝 What this workflow does

Every morning at 8 a.m., this workflow fetches the latest AI-related articles from both GNews and NewsAPI. It merges up to 40 new articles daily, selects the 15 most relevant ones on AI technology and applications, and uses GPT-4.1 to generate concise summaries in accurate Traditional Chinese (while preserving essential English technical terms). Each summary also includes the article link for easy referral. The compiled digest is then posted to your designated Telegram account or group.

## 👥 Who is this for?

- AI enthusiasts, professionals, and anyone interested in artificial intelligence news
- Individuals and teams wanting a concise daily digest of AI developments in Traditional Chinese
- Telegram users who prefer automated information delivery

## 🎯 What problem does this workflow solve?

With the rapid evolution of AI technology, it can be overwhelming to keep up with new developments. This workflow addresses information overload by automatically collecting, summarizing, and translating the most important AI news each morning, all delivered conveniently to your chosen Telegram channel or group.

## ⚙️ Setup

**🔑 Add NewsAPI and GNews API keys**

- Register for accounts on NewsAPI.org and GNews to obtain your API keys.
- Input your NewsAPI key directly into the Fetch NewsAPI articles node (a request sketch follows at the end of this section).
- Input your GNews API key into the Fetch GNews articles node.

**🤖 Set up your Telegram Bot**

- Create a Telegram bot via BotFather and copy the generated bot token.
- In n8n, create Telegram Bot credentials using this token.
- In the Send summary to Telegram node, enter the chat ID of your target user, group, or channel to receive the messages.

**🧠 Configure OpenAI credentials**

- In n8n, create a new credential using your OpenAI API key.
- Assign this credential to the GPT-4.1 Model node (or equivalent OpenAI/AI nodes).

After completing these steps, your workflow is fully configured to fetch, summarize, and deliver daily AI news to your selected Telegram chat automatically.

## 🛠️ How to customize this workflow

- **🔍 Change the topic:** Update the keywords in the NewsAPI and GNews nodes for other subjects (e.g., "blockchain", "quantum computing").
- **⏰ Adjust delivery time:** Modify the scheduled trigger to your preferred hour.
- **✍️ Tweak summary style or language:** Refine the prompt in the AI summarizer node for different tones, or translate into other languages as needed.

## 📦 Dependencies

- NewsAPI account
- GNews account
- Telegram Bot
- OpenAI API access (for GPT-4.1) or a compatible AI model for the Langchain agent
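For orientation, the NewsAPI side of the fetch can look roughly like this sketch, which requests recent articles matching an AI keyword from the `/v2/everything` endpoint. The keyword and page size are illustrative; adjust them to match the template's nodes.

```javascript
// Minimal sketch: fetch recent AI articles from NewsAPI's /v2/everything.
// The keyword and pageSize mirror the template's intent; tune as needed.
const params = new URLSearchParams({
  q: 'artificial intelligence OR AI',
  from: new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString(), // last 24h
  sortBy: 'publishedAt',
  pageSize: '20',
  apiKey: process.env.NEWSAPI_KEY,
});

const res = await fetch(`https://newsapi.org/v2/everything?${params}`);
const { articles = [] } = await res.json();

// One n8n item per article, keeping only what the summarizer needs.
return articles.map((a) => ({
  json: { title: a.title, url: a.url, description: a.description },
}));
```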
by n8n Team
This workflow automatically syncs your Zendesk tickets to your HubSpot contacts. Every 5 minutes, it collects all the new or updated tickets from your Zendesk account and syncs them with your HubSpot contacts database.

## Prerequisites

- Zendesk account and Zendesk credentials
- HubSpot account and HubSpot credentials

## How it works

1. A Cron node triggers the workflow every 5 minutes.
2. A Function Item node checks for tickets received after the last execution timestamp (see the sketch below).
3. A Zendesk node collects all tickets updated after the last execution.
4. A Zendesk node collects the user data of the ticket requester.
5. A Set node collects the contact's email, name, and external ID.
6. A Merge by Key node combines the two inputs: ticket data and ticket requester data.
7. An If node splits the workflow conditionally, based on the data received.
8. If the data corresponds to a ticket that already exists, a HubSpot node updates the ticket.
9. If the data does not correspond to an existing ticket, a HubSpot node creates/updates the contact, a Zendesk node updates the external ID in Zendesk for that contact, a HubSpot node creates a new ticket, and a Zendesk node updates the ticket with the external ID.
10. The Function Item node sets the new last execution timestamp.
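A common way to implement the last-execution timestamp in a Function/Code node is n8n's workflow static data, sketched below under the assumption that the value simply persists between runs of the active workflow.

```javascript
// Minimal sketch: persist a "last run" timestamp between executions.
// $getWorkflowStaticData survives across runs of an active workflow.
const staticData = $getWorkflowStaticData('global');

const lastExecution = staticData.lastExecution ?? new Date(0).toISOString();
staticData.lastExecution = new Date().toISOString(); // store for the next run

// Downstream Zendesk nodes can filter on tickets updated after this value.
return [{ json: { lastExecution } }];
```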