by Alex Kim
Overview

The n8n Workflow Cloner is a powerful automation tool designed to copy, sync, and migrate workflows across different n8n instances or projects. Whether you're managing multiple environments (development, staging, production) or organizing workflows within a team, this workflow automates the transfer process, ensuring seamless deployment with minimal manual effort. By automatically detecting and copying only the missing workflows, it helps maintain consistency, improve collaboration, and streamline workflow migration between projects or instances.

How to Use

1️⃣ Set Up API Credentials
Configure API credentials for both the source and destination n8n instances. Ensure the credentials have read and write access to manage workflows.

2️⃣ Select Source & Destination
Update the "GET - Workflows" node to define the source instance. Set the "CREATE - Workflow" node to specify the destination instance.

3️⃣ Run the Workflow
Click "Test Workflow" to start the transfer. The system will fetch all workflows from the source, compare them with the destination, and copy any missing workflows (see the sketch after this section).

4️⃣ Change the Destination Project (Optional)
By default, workflows are moved to the "KBB Workflows" project. Modify the "Filter" node to transfer workflows to a different project.

5️⃣ Monitor & Verify
The Loop Over Items node ensures batch processing for multiple workflows. Log outputs provide details on transferred workflows and statuses.

Key Benefits

✅ Automate Workflow Transfers – No more manual exports/imports.
✅ Sync Workflows Across Environments – Keep workflows up to date in dev, staging, and production.
✅ Effortless Team Collaboration – Share workflows across projects seamlessly.
✅ Backup & Migration Ready – Easily move workflows between n8n instances.

Use Cases

🔹 CI/CD for Workflows – Deploy workflows between development and production environments.
🔹 Team Workflow Sharing – Share workflows across multiple n8n projects.
🔹 Workflow Backup Solution – Store copies of workflows in a dedicated backup project.

Tags

🚀 Workflow Migration 🚀 n8n Automation 🚀 Sync Workflows 🚀 Backup & Deployment
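A minimal sketch of how step 3️⃣'s comparison could be expressed as an n8n Code node. The destination-fetch node name and the match-by-name rule are assumptions, not the template's confirmed implementation:

```javascript
// Minimal sketch: keep only workflows missing from the destination.
// "GET - Destination Workflows" is a hypothetical node name; the template
// may fetch and match differently.
const source = $('GET - Workflows').all().map((item) => item.json);
const existingNames = new Set(
  $('GET - Destination Workflows').all().map((item) => item.json.name),
);

return source
  .filter((wf) => !existingNames.has(wf.name))
  .map((wf) => ({
    json: {
      name: wf.name,
      nodes: wf.nodes,
      connections: wf.connections,
      settings: wf.settings, // fields the "CREATE - Workflow" node needs
    },
  }));
```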
by Angel Menendez
Automate Report Generation with n8n & Qualys

Introducing the Save Qualys Reports to TheHive Workflow—a robust solution designed to automate the retrieval and storage of Qualys reports in TheHive. This workflow fetches reports from Qualys, filters out already processed reports, and creates cases in TheHive for the new ones. It runs every hour to ensure continuous monitoring and up-to-date vulnerability management, making it ideal for Security Operations Centers (SOCs).

How It Works:

**Set Global Variables:** Initializes necessary global variables like base_url and newtimestamp. This step ensures that the workflow operates with the correct configuration and up-to-date timestamps. Be sure to change the global variables to match your environment.

**Fetch Reports from Qualys:** Sends a GET request to the Qualys API to retrieve finished reports. Automating this step ensures timely updates and consistent data retrieval.

**Convert XML to JSON:** Converts the XML response to JSON format for easier data manipulation. This transformation simplifies further processing and integration into TheHive.

**Filter Reports:** Checks whether reports have already been processed using their creation timestamps. This filtering ensures that only new reports are handled, avoiding duplicates (see the sketch below).

**Process Each Report:** Loops through the list of new reports, ensuring each is processed individually. This step-by-step handling prevents issues related to bulk processing and improves reliability.

**Create Case in TheHive:** Generates a new case in TheHive for each report, serving as a container for the report data. Automating case creation improves efficiency and ensures that all relevant data is captured.

**Download and Attach Report:** Downloads the report from Qualys and attaches it to the respective case in TheHive. This automation ensures that all data is properly archived and easily accessible for review.

Get Started:

Ensure your Qualys and TheHive integrations are properly set up. Customize the workflow to fit your specific vulnerability management needs.

Need Help? Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your vulnerability management process, improve response times, and enhance the efficiency of your security operations.
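A minimal sketch of the Filter Reports step as an n8n Code node, assuming the Qualys report list exposes a CREATION_DATETIME field after the XML-to-JSON conversion and that the stored timestamp lives on the Set Global Variables node:

```javascript
// Minimal sketch of the Filter Reports step. Field and node names
// (newtimestamp, CREATION_DATETIME, "Set Global Variables") are assumptions
// based on the description above.
const lastRun = new Date($('Set Global Variables').first().json.newtimestamp);

return $input.all().filter((item) => {
  const created = new Date(item.json.CREATION_DATETIME);
  return created > lastRun; // keep only reports not yet processed
});
```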
by n8n Team
This n8n workflow automates the analysis of email messages received in a Microsoft Outlook inbox to identify indicators of compromise (IOCs), specifically suspicious URLs. It can be triggered manually or scheduled to run daily at midnight.

The workflow begins by retrieving up to 100 read email messages from the Outlook inbox. Note that this appears to be a configuration issue: the node should retrieve unread messages, not read ones. It then marks these messages as read to avoid processing them again in the future. The messages are then split into individual items using the Split In Batches node for sequential processing.

For each email, the workflow analyzes its content to find URLs, which are treated as potential IOCs. If URLs are found, the workflow checks them for potential threats using two services in parallel, URLScan.io and VirusTotal.

In the first path, URLScan.io scans each URL, and if there are no errors, the results from URLScan.io and VirusTotal are merged. If there are errors, the workflow waits 1 minute before attempting to retrieve the URLScan results again. The loop then continues with the next email. In the second path, VirusTotal scans the URLs and the results are retrieved.

Finally, the workflow checks that the data field is not empty, filtering out items where no data was found. It then sends a summarized Slack message reporting details about the analyzed email, including the subject, sender, date, URLScan report URL, and VirusTotal verdict for URLs that were reported as malicious.

Potential issues during setup include configuring the Outlook node to retrieve unread messages, resolving a configuration issue in the VirusTotal node, and handling authentication and API keys for both the URLScan.io and VirusTotal nodes. Proper error handling and testing with various email content types and URLs are also essential to ensure the workflow accurately identifies IOCs and reports them to the Slack channel.
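The URL-extraction step can be pictured as a small Code node like the following minimal sketch. The body.content path assumes the Outlook node's standard Microsoft Graph message shape, and the regex is illustrative rather than the template's exact pattern:

```javascript
// Minimal sketch: collect candidate IOC URLs from one Outlook message.
// The body.content and from.emailAddress.address paths are assumptions
// about the Outlook node's Microsoft Graph output.
const text = $json.body?.content ?? '';
const urls = [...new Set(text.match(/https?:\/\/[^\s"'<>)]+/g) ?? [])];

return urls.map((url) => ({
  json: {
    url,
    subject: $json.subject,
    sender: $json.from?.emailAddress?.address,
  },
}));
```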
by NanaB
Description

This n8n workflow automates the entire process of creating and publishing AI-generated videos, triggered by a simple message from a Telegram bot (YTAdmin). It transforms a text prompt into a structured video with scenes, visuals, and voiceover, stores assets in MongoDB, renders the final output using Creatomate, and uploads the video to YouTube. Throughout the process, YTAdmin receives real-time updates on the workflow's progress. This is ideal for content creators, marketers, or businesses looking to scale video production using automation and AI. You can see a video demonstrating this template in action here: https://www.youtube.com/watch?v=EjI-ChpJ4xA&t=200s

How it Works

Trigger: Message from YTAdmin (Telegram Bot). The flow starts when YTAdmin sends a content prompt.

Generate Structured Content. A Mistral language model processes the input and outputs structured content, typically broken into scenes.

Split & Process Content into Scenes. The content is split into categorized parts for scene generation.

Generate Media Assets. For each scene, images are generated using OpenAI's image model and voiceovers are created using OpenAI's text-to-speech. Audio files are encoded and stored in MongoDB.

Scene Composition. Assets are grouped into coherent scenes.

Render with Creatomate. A complete payload is generated and sent to the Creatomate rendering API to produce the video. Progress messages are sent to YTAdmin, and the flow pauses briefly to avoid rate limits.

Render Callback. Once Creatomate completes rendering, it sends a callback to the flow. If the render fails, an error message is sent to YTAdmin. If the render succeeds, the flow proceeds to post-processing.

Generate Title & Description. A second Mistral prompt generates a compelling title and description for YouTube.

Upload to YouTube. The rendered video is retrieved from Creatomate and uploaded to YouTube with the AI-generated metadata.

Final Update. A success message is sent to YTAdmin, confirming upload completion.

Set Up Steps (Approx. 10–15 Minutes)

Step 1: Set Up the YTAdmin Bot. Create a Telegram bot via BotFather and get your API token. Add this token to n8n's Telegram credentials and link it to the "Receive Message from YTAdmin" trigger.

Step 2: Connect Your AI Providers. Mistral: add your API key under the HTTP Request or AI Model nodes. OpenAI: create an account at platform.openai.com and obtain an API key; use it for both image generation and voiceover synthesis.

Step 3: Configure Audio File Storage with MongoDB via a Custom API. The API receives the Base64-encoded audio data sent in the request body, connects to the configured MongoDB instance (connection details are managed securely within the API), and uses the MongoDB driver and GridFS to store the audio data. It returns the unique _id (ObjectId) of the stored file in GridFS as a response. This _id is crucial, as it is used in subsequent steps to generate the download URL for the audio file. My API code can be found here for reference: https://github.com/nanabrownsnr/YTAutomation.git (a sketch of such an endpoint follows after these steps).

Step 4: Set Up Creatomate. Create a Creatomate account, define your video templates, and retrieve your API key. Configure the HTTP Request node to match your Creatomate payload requirements.

Step 5: Connect YouTube. In n8n, add OAuth2 credentials for your YouTube account. Make sure your Google Cloud project has the YouTube Data API enabled.

Step 6: Deploy and Test. Send a message to YTAdmin and monitor the flow in n8n. Verify that content is generated, media is created, and the final video is rendered and uploaded.
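For orientation, a GridFS upload endpoint along the lines of Step 3 might look like this Node.js sketch. Route, database, bucket, and field names are illustrative assumptions; see the linked repository for the actual implementation:

```javascript
// Minimal sketch of a GridFS upload endpoint (Express + the official MongoDB
// driver). All names here (route, env var, bucket, audioBase64 field) are
// assumptions for illustration, not the exact code from the repository.
const express = require('express');
const { MongoClient, GridFSBucket } = require('mongodb');

const app = express();
app.use(express.json({ limit: '25mb' })); // Base64 audio payloads can be large

const client = new MongoClient(process.env.MONGO_URI);

app.post('/audio', async (req, res) => {
  try {
    await client.connect(); // no-op if already connected
    const bucket = new GridFSBucket(client.db('ytautomation'), { bucketName: 'audio' });
    const buffer = Buffer.from(req.body.audioBase64, 'base64'); // decode request body

    const upload = bucket.openUploadStream('voiceover.mp3');
    upload.end(buffer);
    upload.on('finish', () => res.json({ id: upload.id.toString() })); // _id used for the download URL
    upload.on('error', (err) => res.status(500).json({ error: err.message }));
  } catch (err) {
    res.status(500).json({ error: err.message });
  }
});

app.listen(3000);
```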
Customization Options

Change the AI Prompts: modify the generation prompts to adjust tone, voice, or content type (e.g., news recaps, product videos, educational summaries).

Switch Messaging Platform: replace Telegram (YTAdmin) with Slack, Discord, or WhatsApp by swapping out the trigger and response nodes.

Add Subtitles or Effects: integrate Whisper or another speech-to-text tool to generate subtitles, and add overlay or transition effects in the Creatomate video payload.

Use Local File Storage Instead of MongoDB: swap out the MongoDB upload HTTP nodes for filesystem or S3-compatible storage.

Repurpose for Other Platforms: swap the YouTube upload for TikTok, Instagram, or Vimeo endpoints for broader publishing.

**Need Help or Want to Customize This Workflow?** If you'd like assistance setting this up or adapting it for a different use case, feel free to reach out to me at nanabrownsnr@gmail.com. I'm happy to help!
by Oneclick AI Squad
This automated n8n workflow scrapes job listings from Upwork using Apify, processes and cleans the data, and generates daily email reports with job summaries. The system uses Google Sheets for data storage and keyword management, providing a comprehensive solution for tracking relevant job opportunities and market trends.

What is Apify?

Apify is a web scraping and automation platform that provides reliable APIs for extracting data from websites like Upwork. It handles the complexities of web scraping, including rate limiting, proxy management, and data extraction, while maintaining compliance with website terms of service.

Good to Know

Apify API calls may incur costs based on usage; check Apify pricing for details.
Google Sheets access must be properly authorized to avoid data sync issues.
The workflow includes data cleaning and deduplication to ensure high-quality results.
Email reports provide structured summaries for easy review and decision-making.
Keyword management through Google Sheets allows for flexible job targeting.

How It Works

The workflow is organized into three main phases:

Phase 1: Job Scraping & Initial Processing
This phase handles the core data collection and initial storage:
Trigger Manual Run - Manually starts the workflow for on-demand job scraping
Fetch Keywords from Google Sheet - Reads the list of job-related keywords from the All Keywords sheet
Loop Through Keywords - Iterates over each keyword to trigger Apify scraping
Trigger Apify Scraper - Sends an HTTP request to start the Apify actor for job scraping
Wait for Apify Completion - Waits for the Apify actor to finish execution
Delay Before Dataset Read - Waits a few seconds to ensure the dataset is ready for processing
Fetch Scraped Job Dataset - Fetches the latest dataset from Apify
Process Raw Job Data - Filters jobs posted in the last 24 hours and formats the data
Save Jobs to Daily Sheet - Appends new job data to the daily Google Sheet
Update Keyword Job Count - Updates the job count in the All Keywords summary sheet

Phase 2: Data Cleaning & Deduplication
This phase ensures data quality and removes duplicates (see the sketch at the end of this description):
Load Today's Daily Jobs - Loads all jobs added in today's sheet for processing
Remove Duplicates by Title/Desc - Removes duplicates based on title and description matching
Save Clean Job Data - Saves the cleaned, unique entries back to the sheet
Clear Old Daily Sheet Data - Deletes old or duplicate entries from the sheet
Reload Clean Job Data - Loads the clean data again after deletion for final processing

Phase 3: Daily Summary & Email Report
This phase generates summaries and delivers the final report:
Generate Keyword Summary Stats - Counts job totals per keyword for analysis
Update Summary Sheet - Updates the summary sheet with keyword statistics
Fetch Final Summary Data - Reads the summary sheet for reporting purposes
Build Email Body - Formats the email with statistics and a sheet link
Send Daily Report Email - Sends the structured daily summary email to recipients

Data Sources

The workflow uses Google Sheets for data management:

AI Keywords Sheet - Contains keyword management data with columns:
Keyword (text) - Job search terms
Job Count (number) - Number of jobs found for each keyword
Status (text) - Active/Inactive status
Last Updated (timestamp) - When the keyword was last processed

Daily Jobs Sheet - Contains scraped job data with columns:
Job Title (text) - Title of the job posting
Description (text) - Job description content
Budget (text) - Job budget or hourly rate
Client Rating (number) - Client's rating on Upwork
Posted Date (timestamp) - When the job was posted
Job URL (text) - Direct link to the job posting
Keyword (text) - Which keyword found this job
Scraped At (timestamp) - When the data was collected

Summary Sheet - Contains daily statistics with columns:
Date (date) - Report date
Total Jobs (number) - Total jobs found
Keywords Processed (number) - Number of keywords searched
Top Keyword (text) - Most productive keyword
Average Budget (currency) - Average job budget
Report Generated (timestamp) - When the summary was created

How to Use

Import the workflow into n8n.
Configure Apify API credentials and Google Sheets API access.
Set up email credentials for daily report delivery.
Create three Google Sheets with the specified column structures.
Add relevant job keywords to the AI Keywords sheet.
Test with sample keywords and adjust as needed.

Requirements

Apify API credentials and actor access
Google Sheets API access
Email service credentials (Gmail, SMTP, etc.)
Upwork job search keywords for targeting

Customizing This Workflow

Modify the Process Raw Job Data node to filter jobs by additional criteria like budget range, client rating, or job type. Adjust the email report format to include more detailed statistics or add visual aids, such as charts. Customize the data cleaning logic to better handle duplicate detection based on your specific requirements, or add additional data sources beyond Upwork for comprehensive job market analysis.
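For reference, the Remove Duplicates by Title/Desc step in Phase 2 can be sketched as an n8n Code node. This is a minimal sketch assuming the sheet's Job Title and Description columns; the template's actual matching logic may differ:

```javascript
// Minimal sketch of Remove Duplicates by Title/Desc as an n8n Code node.
// Column names are assumptions taken from the Daily Jobs Sheet description.
const seen = new Set();

return $input.all().filter((item) => {
  const key = `${item.json['Job Title']}::${item.json['Description']}`
    .toLowerCase()
    .trim();
  if (seen.has(key)) return false; // drop exact title+description duplicates
  seen.add(key);
  return true;
});
```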
by Sirhexalot
This n8n workflow enables you to export data from Zammad, including Users, Roles, Groups, and Organizations, into individual Excel files. It simplifies data handling and reporting by creating structured outputs for further processing or sharing.

Features

Export Users with associated details such as email, firstname, lastname, role_ids, and group_ids.
Export Roles and Organizations with their respective identifiers and names.
Convert all data into separate Excel files for easy access and use.

Usage

Import this workflow into your n8n instance.
Configure the required Zammad API credentials (zammad_base_url and zammad_api_key) in the Basic Variables node.
Run the workflow to generate Excel files containing Zammad data.

Issues and Suggestions

If you encounter any issues or have suggestions for improvement, please report them on the GitHub repository. We appreciate your feedback to help enhance this workflow!
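To illustrate the export's first step, here is a minimal sketch of fetching users from the Zammad REST API in an n8n Code node. It assumes zammad_base_url and zammad_api_key are passed downstream from the Basic Variables node; the workflow itself may use dedicated Zammad nodes instead:

```javascript
// Minimal sketch: page through Zammad users and keep the exported fields.
// zammad_base_url and zammad_api_key mirror the Basic Variables node;
// pagination values are assumptions.
const baseUrl = $json.zammad_base_url;
const apiKey = $json.zammad_api_key;

const users = [];
let page = 1;
while (true) {
  const batch = await this.helpers.httpRequest({
    url: `${baseUrl}/api/v1/users?page=${page}&per_page=100`,
    headers: { Authorization: `Token token=${apiKey}` },
  });
  if (!batch.length) break; // no more pages
  users.push(...batch);
  page += 1;
}

return users.map((u) => ({
  json: {
    email: u.email,
    firstname: u.firstname,
    lastname: u.lastname,
    role_ids: u.role_ids,
    group_ids: u.group_ids,
  },
}));
```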
by Sk developer
YouTube Transcript Summarization in Any Language for Social Media

This n8n workflow automates the process of:

Retrieving YouTube Video Transcripts: It fetches the transcript for any YouTube video URL provided, using the YouTube Transcript API from RapidAPI.
Generating a Concise Summary in Any Language: The workflow uses Google Gemini (PaLM) to create a concise summary of the transcript in the language specified by the user (e.g., English, Spanish, etc.).
Storing the Summary in Google Docs: The generated summary is inserted into a predefined Google Document, making it easy for users to share or edit.

Features:

**Language Flexibility:** Summaries are created in the desired language.
**Fully Automated:** From fetching the transcript to updating Google Docs, the process is fully automated.
**Social Media Ready:** The summary is formatted and stored in a Google Doc, ready for use in social media posts.

This workflow integrates with the YouTube Transcript API via RapidAPI, allowing you to easily fetch video transcripts and summarize them with AI. The entire process is automated and seamless.

Powered by RapidAPI:

**API Used:** *YouTube Transcript API* via **RapidAPI** to get the transcript data.

Benefits:

**Saves Time:** Automates the transcript summarization process, eliminating the need for manual content extraction and summarization.
**Customizable Language Support:** Provides summaries in any language, enabling accessibility and engagement for a global audience.
**Streamlined Content Creation:** Automatically generates concise, engaging summaries that are ready for social media use.
**Google Docs Integration:** Saves summaries directly into a Google Doc for easy sharing, editing, and content management.

Challenges Addressed:

**Manual Transcript Extraction:** Problem: Manually transcribing and summarizing YouTube videos for social media can be time-consuming and error-prone. Solution: This workflow fully automates the process, saving hours of manual work using the YouTube Transcript API.
**Lack of Language Support in Summaries:** Problem: Many automated tools only summarize content in a single language, limiting their accessibility. Solution: With language flexibility, the workflow creates summaries in the language of your choice, helping you cater to diverse audiences.
**Inconsistent Video Quality & Transcript Accuracy:** Problem: Not all YouTube videos have well-structured or accurate transcripts, leading to incomplete or inaccurate summaries. Solution: The workflow can process and format even imperfect transcripts, ensuring that the generated summaries are still accurate and useful.
**Managing Content Across Platforms:** Problem: Transcripts and summaries often need to be stored in multiple locations for social media posts, which can be cumbersome. Solution: The workflow integrates with Google Docs to automatically store and manage summaries in one place, making it easier to share and reuse content.
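As a rough illustration, a Code node between the RapidAPI call and the Gemini prompt might flatten the transcript like this. The response shape (an array of { text } segments under transcript) and the targetLanguage field are assumptions; adjust them to the API's actual payload:

```javascript
// Minimal sketch: flatten the RapidAPI transcript response into one text
// block for the Gemini summarization prompt. The response shape is an
// assumption; adjust to the actual payload.
const segments = $json.transcript ?? [];
const fullText = segments
  .map((segment) => segment.text)
  .join(' ')
  .replace(/\s+/g, ' ')
  .trim();

return [{ json: { transcript: fullText, language: $json.targetLanguage ?? 'English' } }];
```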
by Franz
🚀 What the "Agent Builder" template does

Need to turn a one-line chat request into a fully-wired n8n workflow template—complete with AI agents, RAG, and web-search super-powers—without lifting a finger? That's exactly what Agent Builder automates:

Listens to any incoming chat message (via the Chat Trigger).
Spins up an AI architect that analyses the request, searches the web, reads n8n docs from a Pinecone vector store, and designs the smallest possible set of nodes.
Auto-generates a ready-to-import JSON template and hands it back as a downloadable file—plus all the supporting assets (embeddings, vector store, etc.) so the next prompt is even smarter.

Think of it as your personal "workflow chef": you shout the order, it shops for ingredients, cooks, plates, and serves the meal. All you do is eat.

🤗 Who will love this?

**No-code builders / power users** who don't want to wrestle with AI node wiring.
**Agencies & consultants** delivering lots of bespoke automations.
**Internal platform teams** who need a "workflow self-service portal" for non-technical colleagues.

🧩 How it's wired

| Sub-process | What happens inside | Key nodes |
| --- | --- | --- |
| Web Crawler (optional) | Firecrawl scrapes docs.n8n.io (or any URL you drop in) and streams raw markdown back. | Set URL → HTTP Request (Extract) → Wait & Retry |
| RAG Trainer | Splits the scraped docs, embeds them with OpenAI, and upserts vectors into Pinecone. | Recursive Text Splitter → Embeddings OpenAI → Train Pinecone |
| Agent Builder | The star of the show – orchestrates GPT-4o (via OpenRouter), SerpAPI web search, your Pinecone index, and a Structured Output Parser to produce → validate → prettify the final n8n template. | Chat Trigger → AI Agent → OpenAI (validator) → Code (extract) → Convert to JSON file |

Every arrow in the drawn workflow is pre-connected, so the generated template always passes n8n's import check.

🛠️ Getting set up (5 quick creds)

| Service | Credential type |
| --- | --- |
| OpenAI / Azure OpenAI – embeddings & validation | OpenAI API |
| Pinecone – vector store | Pinecone API |
| OpenRouter – GPT-4o LLM | OpenRouter API Key |
| SerpAPI – web search | SerpAPI Key |
| Firecrawl (only if you plan to crawl) | Generic Header Auth → Authorization: Bearer YOUR_KEY |

Each node already expects those creds; just create them once, select them in the dropdown, and hit Activate.

🏃‍♀️ What a typical run looks like

User says: "Build me a workflow that monitors our support inbox, summarises new tickets with GPT and posts to Slack."
The Chat Trigger captures the message.
The AI Agent queries Pinecone for relevant n8n docs, fires a SerpAPI search for "n8n gmail trigger example", and sketches an architecture (Gmail Trigger → GPT Model → Slack).
The agent returns JSON ➜ the OpenAI node double-checks field names, connections, and type versions.
A tiny JS Code node slices the JSON out of the chat blob and saves it as template.json, ready for download (a sketch of this step follows below).
You download, import, and… done.

✏️ Customising

**Switch the LLM** – plug in Claude 3, Gemini 1.5, or a local model; just swap the *OpenRouter Chat Model* node.
**Point the RAG at your own docs** – change the crawl URL or feed PDFs via the *Default Data Loader*.
**Hard-code preferred nodes** – edit the "User node preferences" in the system message so the agent always chooses *Notion* for databases, etc.

🥡 Take-away notes

It's a prototype – feel free to experiment with it to improve its capabilities. Have fun building!
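For reference, the JSON-slicing Code node mentioned above can be as small as this sketch (the output field name is an assumption about the agent node's reply shape):

```javascript
// Minimal sketch of the "Code (extract)" step, assuming the agent's reply
// arrives in an `output` field: slice the JSON template out of the chat
// blob so the "Convert to JSON file" node can offer it as a download.
const blob = $json.output ?? '';
const first = blob.indexOf('{');
const last = blob.lastIndexOf('}');
const template = JSON.parse(blob.slice(first, last + 1)); // fail fast on invalid JSON
return [{ json: template }];
```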
by Liam
Ever wanted to just tell your 3D printer what to do remotely? This game changer lets you converse with OpenAI agents to manage OctoPrint-connected 3D printers. Great for remote management and monitoring with easy-to-read responses. Query what jobs are available, load a print job (an upload workflow is in development and will be released soon for community use), start the print job, and request detailed reports on your printer's performance as it runs.

Chat > AI > OctoPrint API > Your 3D Printer

++Example 1:++
Human Input: "What print job is running?"
AI Response: "Current Print Job Details: Job Name: Stand v1.gcode, File Size: 2,533,233 bytes, Estimated Print Time: Approximately 6 hours 15 minutes (22,536 seconds). Progress: Completion: 1.97%, Current Position in File: 49,940 bytes, Time Spent Printing: 2,611 seconds (~43.5 minutes), Time Remaining: Approximately 5 hours 34 minutes (20,123 seconds). Status: Current State: Printing."

++Example 2:++
Human Input: "Is my 3D printer OK?"
AI Response: "Status: Printing, Operational: Yes. ++Temperature:++ Bed Temperature: Actual: 50°C, Target: 50°C; Nozzle (Tool 0) Temperature: Actual: 200°C, Target: 200°C. ++Connection Status:++ Connection State: Currently connected to the printer, Port: /dev/ttyUSB0, Baudrate: 115200. Summary: Your 3D printer is currently operational and printing without errors. The bed and nozzle temperatures are both at their target values."
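Under the hood, the agent's job query maps to OctoPrint's REST API. A minimal sketch of that tool call as an n8n Code node follows; the host and API key are placeholders, and the template may wire this differently:

```javascript
// Minimal sketch of the tool call behind "What print job is running?":
// query OctoPrint's /api/job endpoint. Host and API key are placeholders.
const job = await this.helpers.httpRequest({
  url: 'http://octopi.local/api/job',
  headers: { 'X-Api-Key': '<YOUR_OCTOPRINT_API_KEY>' },
});

return [{
  json: {
    file: job.job.file.name,                                  // e.g. "Stand v1.gcode"
    completion: `${job.progress.completion.toFixed(2)}%`,     // percent done
    timeLeftMin: Math.round(job.progress.printTimeLeft / 60), // minutes remaining
    state: job.state,                                         // e.g. "Printing"
  },
}];
```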
by Miquel Colomer
Do you want to discover company-related information to enrich a signup process? This workflow enriches any company by name using the uProc Get Company by Name tool. The tool combines Google Maps and email research on the internet to return results; you get no results if the company has no presence on Google Maps.

You need to add your credentials (Email and API Key - real -) located in the Integration section to n8n.

You can replace the "Create Company Item" node with any other supported service that returns company names and countries, like Hubspot, Google Sheets, MySQL, or Typeform.

You can set up the uProc node with several parameters:
country: the country name you want to use.
name: the name of the company you need to locate.

Every uProc node returns the following fields for each located company:
name: the company's given name.
email: the company's given email.
cif: the company's CIF number.
address: the company's formatted address.
city: the city where the company is located.
state: the province where the company is located.
county: the state where the company is located.
country: the country where the company is located.
zipcode: the company's zip code.
phone: the company's phone number.
website: the company's website.
latitude: the company's latitude.
longitude: the company's longitude.

Next, you can save the results to a CRM or Google Sheets, and use the returned email or phone to launch an email or telemarketing campaign.
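For illustration, a located company comes back shaped roughly like this (all values are invented sample data, not real API output):

```json
{
  "name": "Acme Widgets S.L.",
  "email": "info@acme-widgets.example",
  "cif": "B12345678",
  "address": "Carrer de Exemple 1, 08001 Barcelona",
  "city": "Barcelona",
  "state": "Barcelona",
  "county": "Catalonia",
  "country": "Spain",
  "zipcode": "08001",
  "phone": "+34 930 000 000",
  "website": "https://acme-widgets.example",
  "latitude": 41.3851,
  "longitude": 2.1734
}
```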
by Yaron Been
This workflow automatically identifies trending topics and hashtags across social media platforms to keep you informed of current trends and viral content. It saves you time by eliminating the need to manually research trending topics and provides data-driven insights for content strategy and social media planning.

Overview

This workflow automatically scrapes trending-hashtag platforms and social media sites to extract currently trending topics, hashtags, and viral content themes. It uses Bright Data to access trend data sources without restrictions and AI to intelligently analyze trending content and provide actionable insights for content creators and marketers.

Tools Used

**n8n**: The automation platform that orchestrates the workflow
**Bright Data**: For scraping trend platforms and social media without being blocked
**OpenAI**: AI agent for intelligent trend analysis and content insights
**Google Sheets**: For storing trending topics data and analysis results

How to Install

Import the Workflow: Download the .json file and import it into your n8n instance
Configure Bright Data: Add your Bright Data credentials to the MCP Client node
Set Up OpenAI: Configure your OpenAI API credentials
Configure Google Sheets: Connect your Google Sheets account and set up your trending topics tracking spreadsheet
Customize: Define target trend platforms and topics of interest

Use Cases

**Content Marketing**: Discover trending topics for timely and relevant content creation
**Social Media Strategy**: Plan posts around viral hashtags and trending themes
**Brand Monitoring**: Track whether your brand or industry topics are trending
**Influencer Marketing**: Identify trending content opportunities for collaborations

Connect with Me

**Website**: https://www.nofluff.online
**YouTube**: https://www.youtube.com/@YaronBeen/videos
**LinkedIn**: https://www.linkedin.com/in/yaronbeen/
**Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #trendingtopics #hashtags #brightdata #webscraping #contentmarketing #n8nworkflow #workflow #nocode #socialmediatrends #trendanalysis #viralcontent #contentresearch #socialmediamonitoring #trendtracking #contentdiscovery #hashtagresearch #socialmediamarketing #contentautomation #trendmonitoring #socialmediainsights #contentplanning #trendalerts #viralmarketing #socialtrends #contentoptimization #trendingcontent #socialmediadata #contentintelligence
by Michael Yang
Who is this template for?

This workflow is perfect for competitive-intel analysts, product managers, content marketers, and anyone who tracks multiple company blogs or news sources. If you need a weekly snapshot of fresh, on-topic articles—without wading through dozens of tabs—this template is for you.

What does it do?

The workflow reads a curated list of candidate URLs from Google Sheets, filters out duplicates and off-topic pages with an AI agent, scrapes the surviving links, generates three-sentence summaries, logs the results back to Sheets, and delivers a polished HTML digest to your inbox every week.

Why is it useful?

Instead of manually opening competitor links, checking for relevance, copying highlights, and pasting them into reports, this automation does the grunt work for you. It turns scattered URLs into a searchable knowledge base and a ready-to-share email, freeing you to focus on insights and strategy—not housekeeping.

How does it work?

A Sunday-morning cron trigger kicks things off. The workflow pulls links from the Input Links tab, compares them to the existing Summary tab, and passes fresh candidates to an AI "bouncer" that keeps only blog posts, tutorials, news, and product updates. Firecrawl then scrapes each page; Gemini 2.5 Flash and OpenAI condense the content into title, author, date, and summary. The structured data is appended to your Summary sheet and formatted into a company-grouped HTML digest (sketched below), which lands in your email before the workweek starts.

Set up steps

1. Clone the workflow: import the JSON into your n8n Cloud workspace.
2. Create the Google Sheet: make a new spreadsheet with two tabs, Input Links and Summary (names must match). In Input Links, add columns Company, Page Type, and Link (or rename to match the node mapping). Leave Summary blank—the workflow will populate it. Copy the Sheet URL; you'll paste it into two Google Sheets nodes.
3. Add credentials (n8n ▸ Credentials): Google Sheets OAuth2 – authorise with the Google account that owns the spreadsheet. Gmail OAuth2 – authorise the Gmail account that should send the digest. Firecrawl HTTP Header Auth – set Authorization: Bearer <YOUR_FIRECRAWL_API_KEY>.
4. Point nodes to your Sheet: open each Google Sheets node (Input Links, Read_Url_Summary_Tool, Append row in sheet, Get row(s) in sheet), paste the Document ID (found in the Sheet URL), and select the correct tab (Input Links or Summary).
5. Update email recipients: in the Send a message (Gmail) node, replace the sample addresses with your own distribution list.
6. Adjust scheduling (optional): double-click the Schedule Trigger node and change the cron expression if you prefer a different day/time.
7. Tune AI models (optional): the OpenAI o4-mini and Gemini 2.5 Flash nodes default to cost-efficient settings. Feel free to switch models or tweak the temperature to suit your tone.
8. Test with a single URL: add one row in Input Links, then execute the workflow manually (▶ Run). Verify that a new row appears in Summary and an email lands in your inbox.
9. Go live: activate the workflow (toggle in the top bar), confirm the green status badge, and wait for the next scheduled run.

Tip: The Firecrawl free tier limits you to ~10 requests/min. If you scale beyond that, raise the batching interval in both Firecrawl nodes or upgrade your Firecrawl plan.
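For reference, the company-grouped digest step can be sketched as an n8n Code node like the one below. The column names (Company, Title, Summary, Link) are assumptions matching the Summary tab described above:

```javascript
// Minimal sketch: group summarized rows by company and render the HTML
// body for the Gmail node. Column names are assumptions (see lead-in).
const rows = $input.all().map((item) => item.json);

const byCompany = {};
for (const row of rows) {
  (byCompany[row.Company] ??= []).push(row);
}

let html = '<h2>Weekly Blog Digest</h2>';
for (const [company, posts] of Object.entries(byCompany)) {
  html += `<h3>${company}</h3><ul>`;
  for (const post of posts) {
    html += `<li><a href="${post.Link}">${post.Title}</a>: ${post.Summary}</li>`;
  }
  html += '</ul>';
}

return [{ json: { html } }];
```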