by Oleg Ivaniv
If you previously upgraded to n8n version 0.214.3, some of your workflows may have had their node connections accidentally rewired. This issue affected nodes with more than one output, such as If, Switch, and Compare Datasets. This workflow helps you identify potentially affected workflows and nodes that you should check. ❗️Please make sure you run this workflow as the instance owner.❗️
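For a sense of what the check involves, here is a rough standalone sketch (not the template itself, which runs inside n8n as the instance owner): it scans exported workflow JSON files and lists multi-output nodes worth reviewing. The node type names and the export folder are assumptions.

```typescript
// Sketch: flag exported workflows that contain multi-output nodes to review manually.
import { readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";

const MULTI_OUTPUT_TYPES = [
  "n8n-nodes-base.if",
  "n8n-nodes-base.switch",
  "n8n-nodes-base.compareDatasets", // assumed type name for Compare Datasets
];

const dir = process.argv[2] ?? "./exported-workflows"; // hypothetical export folder

for (const file of readdirSync(dir).filter((f) => f.endsWith(".json"))) {
  const wf = JSON.parse(readFileSync(join(dir, file), "utf8"));
  const suspects = (wf.nodes ?? []).filter((n: { type: string; name: string }) =>
    MULTI_OUTPUT_TYPES.includes(n.type)
  );
  if (suspects.length > 0) {
    console.log(`${file}: check ${suspects.map((n: { name: string }) => n.name).join(", ")}`);
  }
}
```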
by Yulia
This workflow demonstrates how to convert a CSV file to Excel format. First, an example CSV file is downloaded via a direct link. The source file comes from the European Open Data Portal: https://data.europa.eu/data/datasets/veranstaltungsplaetze-potsdam-potsdam?locale=en The binary data is then imported via the Spreadsheet File node and converted to Excel format. Note that as of n8n version 1.23.0, the Spreadsheet File node has been redesigned and is now called the Convert to File node. Learn more on the release notes page: https://docs.n8n.io/release-notes/#n8n1230
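For reference, the same conversion can be sketched outside n8n with the SheetJS `xlsx` package; the file names below are illustrative, and the template itself needs no code.

```typescript
// Standalone sketch of the CSV -> XLSX conversion using SheetJS ("xlsx").
import * as XLSX from "xlsx";
import { readFileSync, writeFileSync } from "node:fs";

const csvText = readFileSync("veranstaltungsplaetze.csv", "utf8"); // hypothetical local copy of the downloaded CSV
const workbook = XLSX.read(csvText, { type: "string" });           // SheetJS parses the CSV into a one-sheet workbook
const xlsxBuffer = XLSX.write(workbook, { type: "buffer", bookType: "xlsx" });
writeFileSync("veranstaltungsplaetze.xlsx", xlsxBuffer);
```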
by Lucas Perret
Get recent funding rounds from Crunchbase in Google Sheets, along with 10+ data points (LinkedIn URL, monthly traffic, company size, etc.). You'll be able to:
- Create a custom database
- Reach out to interesting leads at the right time
- Send custom alerts to your tools

This workflow scrapes recent funding rounds from Crunchbase and adds them to Google Sheets. It uses the Piloterr API to get this data with ease. A full guide can be found here: https://lempire.notion.site/Get-recent-fundraising-in-Google-Sheets-dafbbda2635544b4925c4fb04abac8f5?pvs=74
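The core of the workflow is a simple mapping from the API response to spreadsheet rows. A minimal sketch of that step as an n8n Code node follows; the field names are hypothetical, since the actual Piloterr response may differ.

```typescript
// Code node sketch (Run Once for All Items): flatten each funding-round record
// into one row for the Google Sheets node. Field names are illustrative only.
return $input.all().map((item) => {
  const r = item.json;
  return {
    json: {
      company: r.company_name,        // hypothetical response fields;
      round: r.funding_type,          // adjust to the real Piloterr payload
      amount_usd: r.amount_raised,
      announced_on: r.announced_date,
      linkedin_url: r.linkedin_url,
      monthly_traffic: r.monthly_visits,
      company_size: r.employee_range,
    },
  };
});
```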
by David Roberts
This workflow allows you to ask questions about data stored in a database using AI. To use it, you'll need an OpenAI API key (although you could also swap in a model from another service). Supported databases: Postgres, MySQL, and SQLite. The workflow uses n8n's embedded chat, but you could also modify it to work with a chat service such as Slack, MS Teams or WhatsApp. Note that to use this template, you need to be on n8n version 1.19.4 or later.
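Conceptually, the template's AI step boils down to: hand the model the schema and the question, get back one SQL statement, run it. A bare-bones sketch under those assumptions (the model name and the `runQuery` helper are placeholders, not the template's exact chain):

```typescript
// Sketch: turn a natural-language question plus a schema dump into a SQL query via OpenAI.
async function questionToSql(schema: string, question: string, apiKey: string): Promise<string> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json", Authorization: `Bearer ${apiKey}` },
    body: JSON.stringify({
      model: "gpt-4o-mini", // placeholder model
      messages: [
        { role: "system", content: `You translate questions into a single read-only SQL query.\nSchema:\n${schema}` },
        { role: "user", content: question },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content.trim();
}

// Usage idea:
// const sql = await questionToSql(schemaDump, "How many orders shipped last week?", process.env.OPENAI_API_KEY!);
// const rows = await runQuery(sql); // runQuery = your Postgres/MySQL/SQLite client of choice
```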
by ConvertAPI
Who is this for? Developers and organizations that need to combine PDF files.

What problem is this workflow solving? Merging multiple PDF files into one.

What this workflow does:
- Downloads two PDF files from the web.
- Merges the two PDF files into one.
- Stores the resulting PDF file in the local file system.

How to customize this workflow to your needs:
- Open the PDF merge HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, additional Body Parameters can be added for the converter.
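For orientation, a plain-fetch equivalent of the merge request might look like the sketch below. The endpoint path and parameter shape are my best guess at the ConvertAPI merge call, so verify them against the ConvertAPI documentation.

```typescript
// Rough equivalent of the merge HTTP Request node as a plain fetch call.
const secret = process.env.CONVERTAPI_SECRET; // query-auth secret from your ConvertAPI account

const res = await fetch(`https://v2.convertapi.com/convert/pdf/to/merge?Secret=${secret}`, {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    Parameters: [
      {
        Name: "Files",
        FileValues: [
          { Url: "https://example.com/first.pdf" },  // placeholder source PDFs
          { Url: "https://example.com/second.pdf" },
        ],
      },
    ],
  }),
});
const result = await res.json(); // merged file is returned in the response (typically base64-encoded)
console.log(result);
```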
by Yaron Been
This workflow provides automated access to the Paigedutcher2 Paige AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides seamless integration of generation tasks into your n8n automation workflows.

**Overview**
This workflow handles the complete generation process using the Paigedutcher2 Paige model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: "Custom AI model trained on Paige — bold, curvy, confident energy. Think Barbie meets boss. Great for glam, fantasy, seductive, and influencer-style prompts. Use trigger word CharacterPGE to activa...

**Key Capabilities**
- **Specialized AI model with unique capabilities**
- **Advanced processing and generation features**
- **Custom AI-powered automation tools**
- **Artistic style control and customization**

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Paigedutcher2/paige AI model
- **Paigedutcher2 Paige**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

**Use Cases**
- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
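Under the hood this amounts to creating a Replicate prediction and polling until it completes. A minimal sketch, with placeholder values for the model version hash and input parameters:

```typescript
// Sketch: create a Replicate prediction and poll until it finishes.
const token = process.env.REPLICATE_API_TOKEN;

const create = await fetch("https://api.replicate.com/v1/predictions", {
  method: "POST",
  headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  body: JSON.stringify({
    version: "MODEL_VERSION_HASH",                       // placeholder: copy the version id from the model page
    input: { prompt: "CharacterPGE, studio portrait" },  // placeholder input parameters
  }),
});
let prediction = await create.json();

while (prediction.status !== "succeeded" && prediction.status !== "failed") {
  await new Promise((r) => setTimeout(r, 2000)); // simple poll loop, similar in spirit to the workflow's retry logic
  const poll = await fetch(`https://api.replicate.com/v1/predictions/${prediction.id}`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  prediction = await poll.json();
}
console.log(prediction.output);
```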
by explorium
Research Agent - Automated Sales Meeting Intelligence

This n8n workflow automatically prepares comprehensive sales research briefs every morning for your upcoming meetings by analyzing both the companies you're meeting with and the individual attendees. The workflow connects to your calendar, identifies external meetings, enriches companies and contacts with deep intelligence from Explorium, and delivers personalized research reports, giving your sales team everything they need for informed, confident conversations.

Demo: Template Demo

**Credentials Required**
To use this workflow, set up the following credentials in your n8n environment:

Google Calendar (or Outlook)
- **Type:** OAuth2
- **Used for:** Reading daily meeting schedules and identifying external attendees
- Alternative: Microsoft Outlook Calendar
- Get credentials at Google Cloud Console

Explorium API
- **Type:** Generic Header Auth
- **Header:** Authorization
- **Value:** Bearer YOUR_API_KEY
- **Used for:** Business/prospect matching, firmographic enrichment, professional profiles, LinkedIn posts, website changes, competitive intelligence
- Get your API key at Explorium Dashboard

Explorium MCP
- **Type:** HTTP Header Auth
- **Used for:** Real-time company intelligence and supplemental research for AI agents
- Connect to: https://mcp.explorium.ai/mcp

Anthropic API
- **Type:** API Key
- **Used for:** AI-powered company and attendee research analysis
- Get your API key at Anthropic Console

Slack (or preferred output)
- **Type:** OAuth2
- **Used for:** Delivering research briefs
- Alternative options: Google Docs, Email, Microsoft Teams, CRM updates

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

**Workflow Overview**

Node 1: Schedule Trigger
Automatically runs the workflow on a recurring schedule.
- **Type:** Schedule Trigger
- **Default:** Every morning before business hours
- **Customizable:** Set to any interval (hourly, daily, weekly) or specific times
- Alternative trigger options: Manual Trigger (on-demand execution) or Webhook (triggered by calendar events or CRM updates)

Node 2: Get many events
Retrieves meetings from your connected calendar.
- **Calendar Source:** Google Calendar (or Outlook)
- **Authentication:** OAuth2
- **Time Range:** Current day + 18 hours (configurable via timeMax)
- **Returns:** All calendar events with attendee information, meeting titles, times, and descriptions

Node 3: Filter for External Meetings
Identifies meetings with external participants and filters out internal-only meetings (a code sketch of this filter appears at the end of this template description).
- Filtering logic: extracts attendee email domains, excludes your company domain (e.g. 'explorium.ai'), excludes calendar system addresses (e.g. 'resource.calendar.google.com'), and only passes events with at least one external attendee
- Important setup note: replace 'explorium.ai' in the code node with your company domain to properly filter internal meetings
- Output: events with external participants only, with external_attendees (array of external contact emails), company_domains (unique list of external company domains per meeting), and external_attendee_count (number of external participants)

**Company Research Pipeline**

Node 4: Loop Over Items
Iterates through each meeting with external attendees for company research.

Node 5: Extract External Company Domains
Creates a deduplicated list of all external company domains from the current meeting.

Node 6: Explorium API: Match Business
Matches company domains to Explorium's business entity database.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Authentication:** Header Auth (Bearer token)
- **Returns:** business_id (unique Explorium identifier) and matched_businesses (array of matches with confidence scores, company name, and basic info)

Node 7: If
Validates that a business match was found before proceeding to enrichment.
- **Condition:** business_id is not empty
- **If True:** Proceed to parallel enrichment nodes
- **If False:** Skip to next company in loop

Nodes 8-9: Parallel Company Enrichment

Node 8: Explorium API: Business Enrich
- **Endpoints:** /v1/businesses/firmographics/enrich, /v1/businesses/technographics/enrich
- **Enrichment Types:** firmographics, technographics
- **Returns:** Company name, description, website, industry, employees, revenue, headquarters location, ticker symbol, LinkedIn profile, logo, full tech stack, nested tech stack by category, BI & analytics tools, sales tools, marketing tools

Node 9: Explorium API: Fetch Business Events
- **Endpoint:** /v1/businesses/events/fetch
- **Event Types:** New funding rounds, new investments, mergers & acquisitions, new products, new partnerships
- **Date Range:** September 1, 2025 - November 4, 2025
- **Returns:** Recent business milestones and financial events

Node 10: Merge
Combines enrichment responses and events data into a single data object.

Node 11: Cleans Merge Data Output
Transforms merged enrichment data into a structured format for AI analysis.

Node 12: Company Research Agent
AI agent (Claude Sonnet 4) that analyzes company data to generate actionable sales intelligence.
- Input: structured company profile with all enrichment data
- Analysis focus: company overview and business context, recent website changes and strategic shifts, tech stack and product focus areas, potential pain points and challenges, how Explorium's capabilities align with their needs, and timely conversation starters based on recent activity
- Connected to Explorium MCP: can pull additional real-time intelligence if needed to create a more detailed analysis

Node 13: Create Company Research Output
Formats the AI analysis into a readable, shareable research brief.

**Attendee Research Pipeline**

Node 14: Create List of All External Attendees
Compiles all unique external attendee emails across all meetings.

Node 15: Loop Over Items2
Iterates through each external attendee for individual enrichment.

Node 16: Extract External Company Domains1
Extracts the company domain from each attendee's email.

Node 17: Explorium API: Match Business1
Matches the attendee's company domain to get the business_id for prospect matching.
- **Method:** POST
- **Endpoint:** /v1/businesses/match
- **Purpose:** Link attendee to their company

Node 18: Explorium API: Match Prospect
Matches the attendee email to Explorium's professional profile database.
- **Method:** POST
- **Endpoint:** /v1/prospects/match
- **Authentication:** Header Auth (Bearer token)
- **Returns:** prospect_id (unique professional profile identifier)

Node 19: If1
Validates that a prospect match was found.
- **Condition:** prospect_id is not empty
- **If True:** Proceed to prospect enrichment
- **If False:** Skip to next attendee

Node 20: Explorium API: Prospect Enrich
Enriches the matched prospect using multiple Explorium endpoints.
- **Enrichment Types:** contacts, profiles, linkedin_posts
- **Endpoints:** /v1/prospects/contacts/enrich, /v1/prospects/profiles/enrich, /v1/prospects/linkedin_posts/enrich
- **Returns:** Contacts (professional email, email status, all emails, mobile phone, all phone numbers), Profiles (full professional history, current role, skills, education, company information, experience timeline, job titles and seniority), LinkedIn Posts (recent LinkedIn activity, post content, engagement metrics, professional interests and thought leadership)

Node 21: Cleans Enrichment Outputs
Structures prospect data for AI analysis.

Node 22: Attendee Research Agent
AI agent (Claude Sonnet 4) that analyzes prospect data to generate personalized conversation intelligence.
- Input: structured professional profile with activity data
- Analysis focus: career background and progression, current role and responsibilities, recent LinkedIn activity themes and interests, potential pain points in their role, relevant Explorium capabilities for their needs, personal connection points (education, interests, previous companies), and opening conversation starters
- Connected to Explorium MCP: can gather additional company or market context if needed

Node 23: Create Attendee Research Output
Formats attendee analysis into a readable brief with clear sections.

Node 24: Merge2
Combines company research output with attendee information for final assembly.

Node 25: Loop Over Items1
Manages the final loop that combines company and attendee research for output.

Node 26: Send a message (Slack)
Delivers combined research briefs to the specified Slack channel or user.
- Alternative output options: **Google Docs** (create a formatted document per meeting), **Email** (send to meeting organizer or sales rep), **Microsoft Teams** (post to channels or DMs), **CRM** (update opportunity/account records with research), **PDF** (generate downloadable research reports)

**Workflow Flow Summary**
- Schedule: Workflow runs automatically every morning
- Fetch Calendar: Pull today's meetings from Google Calendar/Outlook
- Filter: Identify meetings with external attendees only
- Extract Companies: Get unique company domains from external attendees
- Extract Attendees: Compile a list of all external contacts

Company Research Path:
- Match Companies: Identify businesses in the Explorium database
- Enrich (Parallel): Pull firmographics, website changes, competitive landscape, events, and challenges
- Merge & Clean: Combine and structure company data
- AI Analysis: Generate a company research brief with insights and talking points
- Format: Create readable company research output

Attendee Research Path:
- Match Prospects: Link attendees to professional profiles
- Enrich (Parallel): Pull profiles, job changes, and LinkedIn activity
- Merge & Clean: Combine and structure prospect data
- AI Analysis: Generate attendee research with background and approach
- Format: Create readable attendee research output

Delivery:
- Combine: Merge company and attendee research for each meeting
- Send: Deliver complete research briefs to Slack/preferred platform

This workflow eliminates manual pre-meeting research by automatically preparing comprehensive intelligence on both companies and individuals, giving sales teams the context and confidence they need for every conversation.

**Customization Options**

Calendar Integration (works with multiple calendar platforms):
- **Google Calendar:** Full OAuth2 integration
- **Microsoft Outlook:** Calendar API support
- **CalDAV:** Generic calendar protocol support

Trigger Flexibility (adjust when research runs):
- **Morning Routine:** Default daily at 7 AM
- **On-Demand:** Manual trigger for specific meetings
- **Continuous:** Hourly checks for new meetings

Enrichment Depth (add or remove enrichment endpoints):
- **Company:** Technographics, funding history, news mentions, hiring signals
- **Prospects:** Contact information, social profiles, company changes
- **Customizable:** Select only needed data to optimize speed and costs

Research Scope (configure what gets researched):
- **All External Meetings:** Default behavior
- **Filtered by Keywords:** Only meetings with specific titles
- **By Attendee Count:** Only meetings with X+ external attendees
- **By Calendar:** Specific calendars only

Output Destinations (deliver research to your preferred platform):
- **Messaging:** Slack, Microsoft Teams, Discord
- **Documents:** Google Docs, Notion, Confluence
- **Email:** Gmail, Outlook, custom SMTP
- **CRM:** Salesforce, HubSpot (update account notes)
- **Project Management:** Asana, Monday.com, ClickUp

AI Model Options (swap AI providers based on needs):
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

**Setup Notes**
- Domain Configuration: Replace 'explorium.ai' in the Filter for External Meetings code node with your company domain
- Calendar Connection: Ensure OAuth2 credentials have calendar read permissions
- Explorium Credentials: Both API key and MCP credentials must be configured
- Output Timing: The schedule trigger should run with enough lead time before the first meetings
- Rate Limits: Adjust loop batch sizes if hitting API rate limits during enrichment
- Slack Configuration: Select the destination channel or user for research delivery
- Data Privacy: Research is based on publicly available professional information and company data

This workflow acts as your automated sales researcher, preparing detailed intelligence reports every morning so your team walks into every meeting informed, prepared, and ready to have meaningful conversations that drive business forward.
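As referenced in the Node 3 description, a minimal sketch of the external-meeting filter as an n8n Code node might look like this; the attendee field layout follows the Google Calendar node's usual output, so adjust it to your calendar source as needed.

```typescript
// Code node sketch (Run Once for All Items): keep only meetings with external attendees.
const MY_DOMAIN = "explorium.ai"; // replace with your own company domain
const IGNORED_DOMAINS = ["resource.calendar.google.com"];

return $input.all().flatMap((item) => {
  const attendees = item.json.attendees ?? []; // Google Calendar event attendees
  const external = attendees
    .map((a) => (a.email || "").toLowerCase())
    .filter((email) => {
      if (!email.includes("@")) return false;
      const domain = email.split("@")[1];
      return domain !== MY_DOMAIN && !IGNORED_DOMAINS.includes(domain);
    });

  if (external.length === 0) return []; // internal-only meeting: drop it

  return [{
    json: {
      ...item.json,
      external_attendees: external,
      company_domains: [...new Set(external.map((e) => e.split("@")[1]))],
      external_attendee_count: external.length,
    },
  }];
});
```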
by Jonathan
This workflow will archive empty pages in your Notion databases. Add your n8n integration to the Notion databases that you want to process. To configure this workflow, set the Notion credentials in the four Notion nodes and, if needed, change the time in the Cron node. The default is to run at 2am every day.
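For context, the Notion API calls behind "archive empty pages" reduce to listing a page's child blocks and, if there are none, patching the page with `archived: true`. A hedged sketch, assuming "empty" simply means "no child blocks" and using placeholder ids:

```typescript
// Sketch of the underlying Notion API calls; the integration token and page id are placeholders.
const headers = {
  Authorization: `Bearer ${process.env.NOTION_TOKEN}`,
  "Notion-Version": "2022-06-28",
  "Content-Type": "application/json",
};

async function archiveIfEmpty(pageId: string): Promise<void> {
  // List the page's child blocks; an "empty" page has none (assumption for this sketch).
  const children = await fetch(
    `https://api.notion.com/v1/blocks/${pageId}/children?page_size=1`,
    { headers }
  ).then((r) => r.json());

  if ((children.results ?? []).length === 0) {
    // Archiving in Notion is a PATCH on the page with archived: true.
    await fetch(`https://api.notion.com/v1/pages/${pageId}`, {
      method: "PATCH",
      headers,
      body: JSON.stringify({ archived: true }),
    });
  }
}
```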
by Vitali
Template Description
This n8n workflow is designed to manage Fastmail masked email addresses using the Fastmail API. The workflow provides the following functionalities:
- Retrieve all masked emails: Fetches all masked email addresses associated with the Fastmail account.
- Create masked email: Allows creating a new masked email with a specified state (pending, enabled, etc.).
- Update masked email state: Updates the state of a masked email, such as enabling, disabling, or deleting it.
- Generate HTML template: Constructs an HTML table to display the masked emails in a user-friendly format.

Steps to Make it Work
- Webhook Node: Listens for incoming requests to manage masked emails. Needs Basic Authentication credentials to secure the endpoint.
- Session Node: Sends a request to obtain session information from Fastmail's API. Requires an HTTP Header Auth credential with your Fastmail API token.
- Switch Node: Routes the workflow based on the state of the incoming masked email request (pending, enabled, disabled, deleted).
- HTTP Request Nodes: Handle the various Fastmail API calls for masked emails (get, set, update, delete). All HTTP Request nodes require an HTTP Header Auth credential attached, using the Fastmail API token.
- Set Node: Gathers the retrieved masked email list into an array for further processing.
- HTML Node: Generates an HTML template to render the masked email addresses in a table format.
- Respond to Webhook Node: Sends the HTML table back to the client in response to the webhook request.

Needed Credentials
- Fastmail Masked E-Mail Addresses: An API token from Fastmail's API. Each HTTP call to Fastmail requires this credential for authentication.

Note
Ensure that you correctly configure authentication for the API calls and webhook security. Use your actual Fastmail API credentials with the correct scope. The workflow assumes that the Fastmail API is correctly configured and accessible from your n8n instance. Update URLs and credential IDs according to your n8n configuration.
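For readers unfamiliar with JMAP, here is my understanding of the masked-email calls the Session and HTTP Request nodes perform, shown as a standalone sketch; the capability URI and account-id lookup are assumptions to verify against Fastmail's masked-email documentation.

```typescript
// Sketch: fetch the JMAP session, then list all masked email addresses.
const token = process.env.FASTMAIL_API_TOKEN;

// 1. Fetch the session to discover the API URL and account id.
const session = await fetch("https://api.fastmail.com/jmap/session", {
  headers: { Authorization: `Bearer ${token}` },
}).then((r) => r.json());

const MASKED_EMAIL_CAPABILITY = "https://www.fastmail.com/dev/maskedemail"; // assumed capability URI
const accountId = session.primaryAccounts[MASKED_EMAIL_CAPABILITY];

// 2. Call MaskedEmail/get for all masked addresses in the account.
const response = await fetch(session.apiUrl, {
  method: "POST",
  headers: { Authorization: `Bearer ${token}`, "Content-Type": "application/json" },
  body: JSON.stringify({
    using: ["urn:ietf:params:jmap:core", MASKED_EMAIL_CAPABILITY],
    methodCalls: [["MaskedEmail/get", { accountId, ids: null }, "0"]],
  }),
}).then((r) => r.json());

console.log(response.methodResponses[0][1].list); // array of masked email objects
```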
by Yaron Been
This workflow provides automated access to the Aihilums Sehatsanjha AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides seamless integration of generation tasks into your n8n automation workflows.

**Overview**
This workflow handles the complete generation process using the Aihilums Sehatsanjha model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation. It follows the same Replicate prediction pattern sketched in the Paigedutcher2 Paige template above.

Model Description: Advanced AI model for automated processing and generation tasks.

**Key Capabilities**
- **Specialized AI model with unique capabilities**
- **Advanced processing and generation features**
- **Custom AI-powered automation tools**

**Tools Used**
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Aihilums/sehatsanjha AI model
- **Aihilums Sehatsanjha**: The core AI model for generation
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

**How to Install**
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

**Use Cases**
- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

**Connect with Me**
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Replicate API**: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Eduard
> Like this template? Connect with Eduard via LinkedIn.

This workflow is a prototype of an AI-powered image editing interface, similar to Photoshop's Generative Fill feature, but running entirely in the browser. It provides a web-based editor that allows users to:
- Select areas in images using an adjustable brush tool
- Input text prompts to guide the AI generation
- Compare original and generated images side by side
- Iterate on edits with different prompts and settings
- Save or reuse generated images

> 🎨 Perfect for product catalog management, seasonal content updates, and creative image editing tasks!

📋 Requirements
FLUX API Access: You'll need API credentials from FLUX to use this workflow. Configure the HTTP Header Auth credential in n8n with your FLUX API key.

🔧 Key Components
- FLUX Fill API for AI-powered image generation
- Konva.js for canvas manipulation
- img-comparison-slider for result visualization
- Custom CSS/JS for editor functionality

Simple Editor Interface
- HTML page with an editor is served on the webhook call
- Adjustable brush selection tool
- Provides several mock examples and allows uploading custom images
- Basic prompt and FLUX model parameter controls

Image Processing Pipeline
- Handles image and mask separately
- Processes FLUX Fill API requests
- Delivers results back to the editor

Result Viewer
- Split-screen comparison of original and generated images
- Interactive slider for before/after comparison
- Options to save or continue editing
- Support for multiple iteration cycles

🎯 Use Cases
This prototype is particularly useful for:
- Testing AI-powered image editing concepts
- Quick product visualization experiments
- Exploring creative image variations
- Demonstrating inpainting capabilities

> 💡 Pro Tip: Save masks for frequently edited areas to quickly generate variations with different prompts!

The workflow can be extended to integrate with various data sources and can be customized for specific business needs.
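To illustrate the editor-to-workflow handoff, here is a browser-side sketch of how the page might post the image, mask, and prompt to the n8n webhook; the webhook path and field names are illustrative, not the template's exact contract.

```typescript
// Browser-side sketch: send the original image, the painted mask, and the prompt
// to the n8n webhook, then display the generated result.
async function requestFill(imageDataUrl: string, maskDataUrl: string, prompt: string): Promise<void> {
  const res = await fetch("https://your-n8n-instance/webhook/flux-fill", { // hypothetical webhook path
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      image: imageDataUrl, // base64 data URL exported from the Konva canvas
      mask: maskDataUrl,   // white = area to regenerate, black = keep
      prompt,              // text prompt guiding FLUX Fill
    }),
  });
  const { resultImage } = await res.json(); // assumed response field
  (document.getElementById("result") as HTMLImageElement).src = resultImage;
}
```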
by Yaron Been
Description
This workflow automates the process of finding local events and adding them directly to your Google Calendar. It eliminates the need for manual event tracking by automatically scraping event information and creating calendar entries.

Overview
The workflow uses Bright Data to scrape event information from a specified source and then creates new events in your calendar, ensuring you never miss out on what's happening around you.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping event data from websites without getting blocked.
- **Google Calendar API:** To create and manage calendar events.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Google Calendar: Authenticate your Google Calendar account in the Google Calendar node.
4. Customize: Adjust the workflow to target the specific websites and event types you're interested in.

Use Cases
- **Community Managers:** Keep track of local meetups and community events.
- **Event Enthusiasts:** Never miss a concert, festival, or local gathering.
- **Marketing Professionals:** Monitor competitor events and industry conferences.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #googlecalendar #brightdata #webscraping #events #eventautomation #localevents #calendarintegration #eventtracking #n8nworkflow #workflow #nocode #eventmanagement #productivitytools #timemanagement #eventplanning #automatedcalendar #eventdiscovery #techautomation #eventnotifications #eventscheduling #calendarsync #eventorganizer #automatedevents
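Between the Bright Data scrape and the Google Calendar node, the scraped results need to be shaped into calendar-event fields. A minimal Code-node sketch of that mapping, with hypothetical scraped field names (`title`, `startIso`, `endIso`, `venue`, `url`):

```typescript
// Code node sketch (Run Once for All Items): turn one scraped event into the
// fields the Google Calendar node expects to map.
return $input.all().map((item) => {
  const e = item.json;
  return {
    json: {
      summary: e.title,
      location: e.venue,
      description: e.url,
      start: { dateTime: e.startIso, timeZone: "Europe/Berlin" }, // pick your own time zone
      end: { dateTime: e.endIso ?? e.startIso, timeZone: "Europe/Berlin" },
    },
  };
});
```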