by LuisBetancourt.co
**Description**

Whenever a Zoom “Meeting assets” email arrives in your Gmail inbox, this workflow will:

1. Trigger on new Gmail messages filtered by the subject “Meeting assets”.
2. Extract from the email (HTML or plain text):
   - Type of session (e.g. “1 hour”, “2 hours”, or “exploratory call”)
   - Client’s full name
   - Session date & time (from the GMT… timestamp)
   - Duration (HH:MM:SS)
   - Recording link
   - Quick summary
   - Detailed summary
   - List of next steps
3. Look up the client in your Master Airtable base, table People, by full name.
4. Send a personalized Gmail to the client with all extracted details.
5. Create a new record in your Sessions table in Airtable, linking back to that client.

**Quick Start**

1. Import this JSON into n8n as a new workflow.
2. Connect your Gmail credentials (OAuth2).
3. Connect your Airtable credentials (Personal Access Token).
4. In the Search Records node:
   - Base → your Master base ID
   - Table → your People table
   - Filter By Formula → `={Full Name} = '{{ $json.clientName }}'`
5. In the Create Record node:
   - Table → “Sessions”
   - Map each field (dateTime, duration, summaries, next steps, client link)
6. Activate the workflow.

**Prerequisites**

- n8n v1.50 or higher
- A Gmail account with OAuth2 credentials configured
- An Airtable base containing:
  - a People table with a Full Name field (and email)
  - a Sessions table with fields: DateTime, Duration, Quick Summary, Detailed Summary, Next Steps, and a Linked Record to People
- An Airtable Personal Access Token with read/write access to that base

**Tips & Extensions**

- Timezone conversion: use a Function node with moment-timezone to convert from UTC if needed (a sketch follows below).
- Error handling: add error handling (e.g., an error workflow) to log or notify you if any field fails to parse.
- Alternate notifications: swap the Gmail node for Slack, Microsoft Teams, or SMS integrations.

With this documentation, your team can import and deploy the workflow in minutes. Enjoy!
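A minimal sketch of that timezone-conversion node, assuming the parsed timestamp lands in a `dateTime` field and that moment-timezone is whitelisted via NODE_FUNCTION_ALLOW_EXTERNAL on a self-hosted instance:

```javascript
// n8n Code node sketch: convert the parsed GMT timestamp to a local timezone.
// Assumptions: item.json.dateTime holds a UTC timestamp string, and
// moment-timezone is whitelisted via NODE_FUNCTION_ALLOW_EXTERNAL.
const moment = require('moment-timezone');

return $input.all().map((item) => {
  const utc = moment.utc(item.json.dateTime);
  // Replace 'America/Bogota' with the client's timezone.
  item.json.localDateTime = utc.tz('America/Bogota').format('YYYY-MM-DD HH:mm');
  return item;
});
```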
by Samir Saci
Tags: Productivity, Education, Learning, Language

**Context**

I’m a Supply Chain Data Scientist from Paris who lived six years in China — and yes, learning Mandarin while working full-time was tough. Learning Mandarin as an adult can be very difficult, especially if you have a full-time job. With AI, you can now have a Chinese tutor available 24/7 on your phone — no excuses left! It is in this spirit that I designed this workflow to support fellow Mandarin learners with a Chinese teacher powered by GPT-4o.

> Boost your language skills with AI using n8n!

📬 For business inquiries, you can add me on LinkedIn.

**Who is this template for?**

This workflow template is designed for language learners and educators who need support learning a vocabulary list in Mandarin (or any other language), using OpenAI GPT-4o, an AI agent, and a Telegram bot to interact with users.

For the vocabulary list, you can use another template shared in my profile, 🉑 Generate Anki Flash Cards for Language Learning with Google Translate and GPT-4o, to generate the Google Sheet needed in this workflow.

**How does it work?**

The workflow loads a vocabulary list stored in your Google Sheet. The bot will:

- 📥 Load your vocabulary list from Google Sheets
- 🧠 Generate multiple-choice questions with GPT-4o
- ✅ Evaluate your answer and give instant feedback
- 🔁 Loop to the next word until you're fluent

**What do I need to start?**

This workflow does not require any advanced programming skills.

Prerequisites:

- A Google Drive account with a folder including a Google Sheet filled with the vocabulary list you want to learn
- API credentials: OpenAI API for the chat model, plus Google Drive API and Google Sheets API activated with OAuth2 credentials
- A Telegram bot with its token recorded in the Telegram node credentials
- A Google Sheet with two columns (initialText: words in your own language, translatedText: words in the target language); a sample layout appears below

**Next Steps**

Follow the sticky notes to set up the parameters inside each node and get ready to pump up your learning skills.

🎥 Watch My Tutorial

🚀 Curious how n8n can supercharge learning or supply chain? 📬 Let’s connect on LinkedIn.

**Notes**

This workflow can be used for any language. In the AI Agent prompt, you just need to replace Chinese with your language.

This workflow was created with n8n 1.82.1.

Submitted: March 23rd, 2025
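For reference, a minimal vocabulary sheet might look like this (illustrative rows):

| initialText | translatedText |
| --- | --- |
| hello | 你好 |
| thank you | 谢谢 |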
by Niko
Capture URL Screenshots Automatically from Google Sheets & Drive with ScreenshotOne & Gmail Alerts

**Summary**

This automation template streamlines the process of capturing screenshots for multiple URLs. Instead of manually visiting each URL, taking a screenshot, and organizing the results, this workflow automates everything. When a spreadsheet is added to a designated Google Drive folder, the template extracts URLs from the column named "Url". These URLs are then processed through ScreenshotOne to capture screenshots, which are saved back to the same folder. Finally, an email notification is sent via Gmail with a link to the folder containing the screenshots.

**Problem Solved**

This template addresses the challenge of manual screenshot capture for multiple URLs. Without this automation, a user would need to:

1. Open each URL from a spreadsheet.
2. Take a screenshot manually.
3. Save each screenshot with an appropriate name.
4. Organize the screenshots in a folder.
5. Notify stakeholders when the process is complete.

These steps are not only time-consuming but also repetitive, especially when handling a large number of URLs.

**Who Can Benefit**

- **Digital Marketers:** Monitor website appearances for competitive analysis or to track campaign landing pages.
- **Web Developers/Designers:** Capture screenshots of multiple websites for inspiration or reference.
- **QA Teams:** Document the visual state of web pages during various stages of development.
- **SEO Specialists:** Track visual changes to websites they are optimizing.
- **Content Managers:** Monitor how content appears across various web properties.

**Prerequisites**

- **Google Drive node:** Must have appropriate permissions to create and access folders.
- **Google Sheets node:** Connected, to extract URLs from the spreadsheet.
- **Gmail node:** Authenticated, for sending notifications.
- **ScreenshotOne account:** Either a free or paid plan depending on volume needs, along with an access key. Ensure you replace the placeholder --YOUR ACCESS KEY-- with your generated access key in the "Get Screenshots" node.

**Workflow Details**

Step 1: Google Drive Integration
- **Trigger node:** Monitors a specific folder in Google Drive. When a spreadsheet is added, the workflow is initiated.

Step 2: Google Sheets Processing
- **Google Sheets node:** Extracts URLs from the column named "Url".

Step 3: Screenshot Capture
- **Get Screenshots node:** Sends each extracted URL to ScreenshotOne to capture screenshots (an example request appears at the end of this description).

Step 4: Saving Screenshots and Notifications
- **Google Drive node:** Saves the captured screenshots back into the same folder.
- **Gmail node:** Sends an email notification with a link to the folder, alerting stakeholders that the screenshots are ready.

**Customization Guidance**

- **Folder monitoring:** The workflow is set to monitor a specific Google Drive folder. It can be customized by selecting a different folder in the node settings.
- **Spreadsheet structure:** While the template expects a spreadsheet with a column named "Url" for extracting URLs, users can add additional columns (e.g., titles, categories, or tags) and modify the workflow to utilize them as needed.
- **Email settings:** Customize the recipient, subject, and body of the notification email to suit your needs. If required, enable optional notifications for different stakeholders.
- **ScreenshotOne access key & configuration:** A valid ScreenshotOne access key is required to capture screenshots. Users can further refine screenshot settings (e.g., viewport size, device emulation, or delay timing) by exploring the available options in the ScreenshotOne API documentation.
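Under the hood, the "Get Screenshots" node issues a request along these lines against ScreenshotOne's take endpoint (a sketch; `full_page` and `format` are optional parameters you may want to tune per the API documentation):

```
https://api.screenshotone.com/take?access_key=--YOUR ACCESS KEY--&url=https://example.com&full_page=true&format=png
```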
by Abbas Ali
This automation fetches the latest article from a WordPress blog, summarizes it using OpenAI, and sends the summary to a list of subscribers via email. Ideal for content creators and bloggers who want to distribute digestible content without manual effort.

**Use Case**

Perfect for:

- Newsletter creators
- Content marketers
- Bloggers
- Knowledge managers

**Nodes Used**

- Schedule Trigger
- HTTP Request
- Set
- OpenAI
- Google Sheets
- Email (Gmail/SMTP)
- IF
- SplitInBatches

**Workflow Steps**

1. Trigger: Starts on a schedule (e.g., daily at 9:00 AM).
2. Fetch Blog Post: Retrieves the most recent post from a WordPress blog via HTTP Request (an example request appears at the end of this description).
3. Extract Fields: A Set node extracts the title, link, and content.
4. Summarize Article: OpenAI processes the article and returns a 3-point summary.
5. Fetch Subscribers: Google Sheets reads email addresses from a subscriber list.
6. Loop Emails: SplitInBatches and Send Email nodes loop through subscribers.
7. Conditional Logic: An IF node skips articles shorter than 300 words.

**Credentials Required**

- OpenAI API key (for content summarization)
- Google Sheets OAuth2 (to read subscriber emails)
- Gmail or SMTP (for sending emails)

**Test Instructions**

1. Replace the blog URL in the HTTP Request node.
2. Connect your OpenAI API key.
3. Link your Google Sheet with a column named Email.
4. Set up Gmail or SMTP credentials.
5. Run manually for testing, then activate the schedule.
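If the blog exposes the standard WordPress REST API, the HTTP Request node can fetch the latest post with a call like this (a sketch; replace the domain with your own blog):

```
GET https://yourblog.example.com/wp-json/wp/v2/posts?per_page=1&orderby=date&order=desc
```

Each returned post carries `title.rendered`, `link`, and `content.rendered`, which is what the Set node maps into the title, link, and content fields.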
by Oneclick AI Squad
This n8n workflow monitors email alerts for disk utilization exceeding 80%, extracts the server IP, logs into the server, and purges logs from Nginx, PM2, Docker, and system files to clear disk space.

**Key Insights**

- Ensure email alerts are consistently formatted with server IP details.
- SSH access must be properly configured to avoid authentication failures.

**Workflow Process**

1. Initiate the workflow with the Check Disk Alert Emails node when an email triggers on high disk usage.
2. Parse the email to extract the server IP using the Extract Server IP from Email node (a sketch of this step appears at the end of this description).
3. Set up SSH credentials and paths manually with the Prepare SSH Variables node.
4. Execute cleanup commands to delete logs from Nginx, PM2, Docker, and system files using the Run LogCleanup Commands via SSH node.

**Usage Guide**

1. Import the workflow into n8n and configure email and SSH credentials.
2. Test with a sample email alert to verify IP extraction and log deletion.

**Prerequisites**

- Email service (e.g., IMAP or API) for alert monitoring
- SSH access with valid credentials

**Customization Options**

Modify the Prepare SSH Variables node to target specific log directories or adjust cleanup commands for different server setups.
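A minimal Code-node sketch of the IP-extraction step (illustrative, assuming the alert body arrives in a `text` field and contains a plain IPv4 address):

```javascript
// Extract the first IPv4 address from the alert email body.
// Assumes the email text is in item.json.text; adjust to your trigger's output.
const ipPattern = /\b(?:\d{1,3}\.){3}\d{1,3}\b/;

return $input.all().map((item) => {
  const match = (item.json.text || '').match(ipPattern);
  item.json.serverIp = match ? match[0] : null;
  return item;
});
```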
by Yaron Been
**Description**

This workflow automatically generates comprehensive property market reports by scraping real estate listings and market data from multiple sources. It helps real estate professionals save time and provide data-driven insights to clients without manual research.

**Overview**

This workflow automatically generates property market reports by scraping real estate listings and market data. It uses Bright Data to access multiple real estate websites and compiles the data into comprehensive reports.

**Tools Used**

- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping real estate websites and property data without getting blocked.
- **Spreadsheets/Databases:** For storing and analyzing property data.
- **Document Generation:** For creating professional PDF reports.

**How to Install**

1. Import the workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set up data storage: Configure where you want to store the property data.
4. Customize: Specify locations, property types, and report format.

**Use Cases**

- **Real Estate Agents:** Generate market reports for clients.
- **Property Investors:** Track market trends in target areas.
- **Market Analysts:** Automate data collection for property market analysis.

**Connect with Me**

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #realestate #propertymarket #brightdata #marketreports #propertyanalysis #realestatedata #markettrends #propertyinvestment #n8nworkflow #workflow #nocode #realestateanalysis #propertyreports #realestateintelligence #marketresearch #propertyscraping #realestateautomation #investmentanalysis #propertytrends #datadriven #realestatetech #propertyinsights #marketanalysis #realestateinvesting
by Emmanuel Bernard
🎉 Do you want to master AI automation, so you can save time and build cool stuff? I’ve created a welcoming Skool community for non-technical yet resourceful learners. 👉🏻 Join the AI Atelier 👈🏻

Keeping your YouTube video descriptions updated and consistent across your channel can be a daunting task. Manually editing each video is not only time-consuming but also prone to errors.

📋 Blog post
📺 YouTube video

This workflow streamlines the process, allowing you to maintain a shared section in all your video descriptions and effortlessly update them all at once. By incorporating a unique identifier, you can automate updates across your entire channel, keeping your content fresh and relevant with minimal effort.

**How it Works**

- **Define your unique delimiter:** Choose your unique delimiter (e.g., "---n8ninja---"). It will be visible, so select something appropriate for your audience.
- **Automate updates:** Anything below the delimiter can be automatically updated by this workflow (a sketch of this logic appears at the end of this description).
- **Configure text updates:** Set the text you wish to add to every video description in the configuration node.

**Getting Started**

1. **Integrate Google (YouTube) credentials:** Securely add your credentials to enable API access.
2. **Set up the configuration node:** Define your delimiter and the text for the shared section you wish to append to your video descriptions.
3. **Prepare your videos:** Add the chosen delimiter to all videos you want to update automatically.
4. **Execute the workflow:** Run the workflow whenever you wish to batch update the descriptions of your videos.

Created by the n8ninja ✨ follow on X 📺 follow on YT
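The core update logic amounts to splitting each description at the delimiter and re-appending the shared section, as in this Code-node sketch (illustrative; the field names `description` and `sharedSection` are assumptions, not the template's exact names):

```javascript
// Rebuild each video description: keep everything above the delimiter,
// then append the updated shared section below it.
const DELIMITER = '---n8ninja---';

return $input.all().map((item) => {
  const { description = '', sharedSection = '' } = item.json;
  // Everything before the first occurrence of the delimiter is preserved.
  const ownPart = description.split(DELIMITER)[0].trimEnd();
  item.json.newDescription = `${ownPart}\n${DELIMITER}\n${sharedSection}`;
  return item;
});
```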
by PiAPI
**Who is this template for?**

This workflow is specifically designed for content creators and social media professionals: it enables Instagram and X (Twitter) influencers to produce highly artistic visual posts, empowers marketing teams to quickly generate event promotional graphics, assists blog authors in creating featured images and illustrations, and helps knowledge-based creators transform key insights into easily shareable card visuals.

**Setup Instructions**

1. Fill in your API key from PiAPI.
2. Fill in the Basic Params node following the sticky note guidelines.
3. Set up a design template in Switchboard Canvas: make a simple template in Switchboard, click cURL, and copy the API code into the JSON of the Design in Canvas node.
4. Click Test Workflow and get a URL result.

**Use Case**

Below are some example settings to help users find a proper way to use this workflow. You can change these settings to fit your specific purposes.

Basic Params setting:

- theme: Hope
- scenario: Don't know about the future; confused and feeling lost with tech development.
- style: Cinematic Grandeur, Sci-Tech Aesthetic, 3D style
- example:
  1. March. Because of your faith, it will happen.
  2. Something in me will save me.
  3. To everyone carrying a heavy heart in silence. You are going to be okay.
  4. Tomorrow will be better.
- image prompt: A cinematic sci-fi metropolis where Deep Neural Nets control a hyper-connected society. Holographic interfaces glow in the air as robotic agents move among humans, symbolizing Industry 4.0. The scene contrasts organic human emotion with cold machine precision, rendered in a hyper-realistic 3D style with futuristic lighting. Epic wide shots showcase the grandeur of this civilization’s industrial evolution.

Output image and more example results are shown on the template page.
by Lucas Walter
**Who's it for**

This template is perfect for sales professionals, marketers, and business developers who need to quickly gather contact information from company websites. Whether you're building prospect lists, researching potential partners, or collecting leads for outreach campaigns, this automation saves hours of manual email hunting.

**What it does**

This workflow automatically discovers and extracts email addresses from any website by:

- Taking a website URL as input through a simple form
- Using Firecrawl's mapping API to find relevant pages (about, contact, team pages)
- Batch scraping those pages to extract email addresses
- Intelligently handling common email obfuscations like "(at)" and "(dot)" (a sketch of this step appears at the end of this description)
- Returning a clean, deduplicated list of valid email addresses

The automation handles rate limiting, retries failed requests, and filters out invalid or hidden email addresses to ensure you get quality results.

**How to set up**

1. Get Firecrawl API access: Sign up at firecrawl.dev and obtain your API key.
2. Configure credentials: In n8n, create a new HTTP Header Auth credential named "Firecrawl" with:
   - Header Name: Authorization
   - Header Value: Bearer YOUR_API_KEY
3. Import the workflow: Copy the workflow JSON into your n8n instance.
4. Test the form: Activate the workflow and test with a sample website URL.

**How to customize the workflow**

- **Search parameters:** Modify the search parameter in the map_website node to target different page types (currently searches for "about contact company authors team").
- **Extraction limits:** Adjust the limit parameter to scrape more or fewer pages per website.
- **Retry logic:** The workflow includes retry logic with a 12-attempt limit; modify the check_retry_count node to change this.
- **Output format:** The set_result node formats the final output; customize this to match your preferred data structure.
- **Email validation:** The JSON schema in start_batch_scrape defines how emails are extracted; modify the prompt or schema for different extraction rules.

The workflow is designed to be reliable and handle common edge cases like rate limiting and failed requests, making it production-ready for regular use.
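The obfuscation handling boils down to normalizing "(at)"/"(dot)" spellings before validation and deduplication; a minimal sketch (illustrative, not the template's exact code; it assumes scraped emails arrive in an `emails` array on each item):

```javascript
// Normalize common email obfuscations, then validate and deduplicate.
function normalizeEmail(raw) {
  return raw
    .toLowerCase()
    .replace(/\s*\(at\)\s*|\s+at\s+/g, '@')   // "(at)" or " at " -> "@"
    .replace(/\s*\(dot\)\s*|\s+dot\s+/g, '.') // "(dot)" or " dot " -> "."
    .trim();
}

const emailPattern = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

const emails = [...new Set(
  $input.all()
    .flatMap((item) => item.json.emails || [])
    .map(normalizeEmail)
    .filter((e) => emailPattern.test(e))
)];

return [{ json: { emails } }];
```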
by Joachim Brindeau
Are you looking to install external libraries for your self-hosted n8n instance? This automated workflow makes adding npm packages to your n8n environment quick and effortless. Beware: this workflow only works on self-hosted instances.

**What This Workflow Does**

This solution automatically installs npm packages like axios, cheerio, or node-fetch in your self-hosted n8n Docker container, making them immediately available in Code nodes.

**Key Features**

- ✅ Automated installation: No manual npm commands needed
- ✅ Daily updates: A scheduled trigger keeps packages current
- ✅ Smart installation: Only installs missing packages
- ✅ Multiple triggers: Manual, scheduled, and on startup of the n8n instance, so you can upgrade your n8n version without worrying about external libraries

**How to Install and Update External Libraries Automatically**

Step 1: Set up your environment variables

Before using external libraries in n8n Code nodes, configure these environment variables in your Docker Compose file (a Compose sketch appears at the end of this description).

Option A, to allow specific external npm packages in Code nodes:

NODE_FUNCTION_ALLOW_EXTERNAL=axios,cheerio,node-fetch

Option B, to allow all external npm packages in Code nodes:

NODE_FUNCTION_ALLOW_EXTERNAL=*

Step 2: Import the external packages workflow

Import the workflow into your n8n instance by copy-pasting all nodes.

Step 3: Input the list of external libraries you need

Edit the libraries_set node and change the comma-separated list (e.g., axios,cheerio,node-fetch). If you chose Option A above, update your NODE_FUNCTION_ALLOW_EXTERNAL variable with the same packages.

Step 4: Start the workflow!

Run the workflow manually or let it trigger automatically.

**Why Use This to Install npm Packages in n8n?**

Managing external packages manually in n8n can be time-consuming. This workflow automates the entire process, making sure your libraries are always installed and up to date.
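For Step 1, a minimal Docker Compose sketch showing where the variable goes (the service name and image tag are illustrative; keep the rest of your existing configuration):

```yaml
# Illustrative docker-compose.yml excerpt for a self-hosted n8n instance.
services:
  n8n:
    image: n8nio/n8n:latest
    restart: unless-stopped
    ports:
      - "5678:5678"
    environment:
      # Option A: whitelist specific packages for Code nodes
      - NODE_FUNCTION_ALLOW_EXTERNAL=axios,cheerio,node-fetch
      # Option B (instead): allow all external packages
      # - NODE_FUNCTION_ALLOW_EXTERNAL=*
    volumes:
      - n8n_data:/home/node/.n8n

volumes:
  n8n_data:
```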
by Davide
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This n8n workflow integrates the powerful Pipedream MCP server with AI capabilities to create a smart, extensible assistant that can interact with over 2,700 APIs and 10,000+ tools, all within a secure and modular structure. This setup seamlessly integrates Pipedream's MCP server with n8n, enabling your AI assistant to leverage thousands of APIs and tools securely.

**Benefits**

- **Massive tool access:** Instantly connect 2,700+ APIs using Pipedream MCP tools, from productivity apps to custom APIs, with zero-code integration.
- **Dynamic AI agent:** The use of a LangChain agent allows for flexible tool execution and contextual conversations, powered by GPT.
- **Easy customization:** Simply copy your MCP tool URL into the respective sseEndpoint field to extend the agent's capabilities.
- **Scalable and modular:** Add or remove tools (like Slack, Notion, Stripe, etc.) without altering the core logic.
- **Secure and revocable:** Credentials and API access can be managed directly via Pipedream's MCP dashboard.

**How It Works**

1. Chat Trigger: The workflow begins when a chat message is received via the When chat message received node, which acts as the entry point.
2. AI Agent Processing: The message is passed to the AI Agent node, which orchestrates the interaction using the connected tools and memory.
3. Language Model: The OpenAI Chat Model (GPT-4.1-mini) processes the user's input and generates responses or actions.
4. Memory: The Simple Memory node retains context from the conversation to enable coherent multi-turn interactions.
5. Tool Integration: The Calendly and Gmail nodes (connected via Pipedream's MCP server) allow the AI to perform actions like scheduling events or sending emails. These tools use SSE (Server-Sent Events) endpoints provided by Pipedream.
6. Response: The AI Agent combines the model's output and tool responses to deliver a final reply to the user.

**Set Up Steps**

1. Sign up for Pipedream: Create an account on Pipedream and set up your MCP server.
2. Configure MCP tools: Connect your accounts (e.g., Calendly, Gmail) in Pipedream and obtain the SSE endpoints for each tool (e.g., https://mcp.pipedream.net/xxx/calendly_v2).
3. Update n8n nodes: Replace the placeholder SSE endpoints in the Calendly and Gmail nodes with your Pipedream MCP URLs.
4. OpenAI credentials: Ensure the OpenAI Chat Model node has valid API credentials (configured under "OpenAi account").
5. Activate the workflow: Enable the When chat message received node (currently disabled) and deploy the workflow.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
by Yaron Been
**Description**

This workflow automatically collects weather data from multiple sources and compiles it into comprehensive reports. It helps you make informed decisions based on accurate weather forecasts without manually checking multiple weather services.

**Overview**

This workflow automatically scrapes weather data from multiple sources and compiles it into a comprehensive report. It uses Bright Data to access weather websites and can be configured to send you regular weather updates for your locations of interest.

**Tools Used**

- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping weather websites and forecast data without getting blocked.
- **Notification Services:** Email, messaging apps, or other platforms.

**How to Install**

1. Import the workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set up notifications: Configure how you want to receive weather reports.
4. Customize: Add your locations of interest and reporting frequency.

**Use Cases**

- **Event Planners:** Get weather forecasts for upcoming outdoor events.
- **Farmers:** Monitor weather conditions for agricultural planning.
- **Travelers:** Check weather forecasts for destinations before trips.

**Connect with Me**

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #weather #weatherforecasts #brightdata #webscraping #weatherreports #weatheralerts #weatherdata #weathermonitoring #n8nworkflow #workflow #nocode #weatherautomation #weatherscraping #weathertracking #weathernotifications #weatherupdates #forecastdata #weatherplanning #weatherservice #outdoorevents #weatherapi #weatherinformation #climatedata #weathertech