by Niklas Hatje
**Use case**

When collecting leads via an online form, you often need to manually add those new leads to your Pipedrive CRM. This not only takes a lot of time but is also error-prone. This workflow automates that tedious work for you.

**What this workflow does**

The workflow is triggered each time a form is submitted in n8n. It validates the email address using Hunter.io (see the sketch below). If the email is valid, the workflow checks for an existing person with that email in Pipedrive. If no existing person is found, it uses Clearbit to enrich the person's information. It then checks whether the person's organization already exists in Pipedrive, creating a new organization if necessary. The workflow then registers the person in Pipedrive. Lastly, it creates a lead in Pipedrive using information from the person and organization.

**Setup**

This workflow is very quick to set up:

1. Add your Hunter.io, Clearbit and Pipedrive credentials
2. Click the test workflow button
3. Activate the workflow and use the form trigger's production URL to collect your leads in a smart way

**How to adjust it to your needs**

- Exchange the n8n form trigger for your form of choice (Typeform, Google Forms, SurveyMonkey...)
- Add filter criteria to only add new leads if they match certain requirements
- Remove the email check with Hunter.io if you don't own this tool and expect new form submissions to have a correct email anyway
- Add ways to handle invalid emails or existing persons
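For reference, here is a minimal sketch of the kind of email check the Hunter.io step performs. It is not taken from the template itself: the endpoint and the `data.status` field follow Hunter's public v2 API, but double-check the field names against Hunter's docs before relying on them.

```typescript
// Rough sketch of the email check the Hunter.io node performs.
// Endpoint and field names follow Hunter's public v2 API and may differ;
// treat this as illustrative only.
const HUNTER_API_KEY = process.env.HUNTER_API_KEY!; // assumption: key kept in an env var

async function isEmailValid(email: string): Promise<boolean> {
  const url = `https://api.hunter.io/v2/email-verifier?email=${encodeURIComponent(email)}&api_key=${HUNTER_API_KEY}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Hunter.io request failed: ${res.status}`);
  const body = await res.json();
  // "valid" means the mailbox was verified; anything else should be routed
  // to your invalid-email handling branch.
  return body?.data?.status === "valid";
}

// Example: gate the rest of the workflow on the verification result.
isEmailValid("jane.doe@example.com").then((ok) => console.log(ok ? "continue" : "stop"));
```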
by L Hùng
**Pre-conditions**

- A Facebook Developer account with an active app.
- Basic understanding of n8n workflows.
- Access to a database (optional, for storing tokens).

**Setup**

- **Webhook activation**: Configure the webhook to receive user requests and process input data. Ensure the webhook URL is correctly set in your Facebook app settings.
- **Short-lived token retrieval**: Use Facebook OAuth to fetch a short-lived token from the authorization code.
- **Long-lived token conversion**: Convert the short-lived token into a long-lived token (valid for ~60 days), as sketched below.
- **Page token retrieval**: Follow the provided instructions to retrieve Page Tokens for posting on managed Facebook Pages.
- **Customizable scopes**: Edit the `correctScopes` array to include or exclude permissions as needed.
- **Optional database storage**: Extend the workflow to save tokens to a database instead of displaying them on-screen.
- **Step-by-step instructions**: Detailed guidance is provided via sticky notes for activating the app, configuring the webhook, and editing parameters like `fb_redirect_uri`, `app_id`, and `app_secret`.

**Who the template is for**

- **Developers** integrating Facebook APIs into their applications.
- **Social media managers** automating posting and engagement on Facebook Pages.
- **n8n users** looking for a ready-to-use workflow for Facebook token management.

**Primary use**

- Automates Facebook token retrieval and management.
- Supports posting to Facebook Pages via Page Tokens.
- Easily customizable and extendable for specific requirements.
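For orientation, here is a rough sketch of the Graph API calls behind the token conversion and the Page Token retrieval. The Graph API version ("v19.0") and the environment variable names are assumptions, not values from the template.

```typescript
// Minimal sketch of the token calls the workflow makes against the
// Facebook Graph API. The Graph API version ("v19.0") and the env-var names
// are assumptions; adjust to your app's configuration.
const APP_ID = process.env.FB_APP_ID!;
const APP_SECRET = process.env.FB_APP_SECRET!;
const REDIRECT_URI = process.env.FB_REDIRECT_URI!;
const GRAPH = "https://graph.facebook.com/v19.0";

// 1) Authorization code -> short-lived user token
async function getShortLivedToken(code: string): Promise<string> {
  const url = `${GRAPH}/oauth/access_token?client_id=${APP_ID}&client_secret=${APP_SECRET}&redirect_uri=${encodeURIComponent(REDIRECT_URI)}&code=${code}`;
  const res = await fetch(url);
  return (await res.json()).access_token;
}

// 2) Short-lived token -> long-lived token (~60 days)
async function getLongLivedToken(shortLived: string): Promise<string> {
  const url = `${GRAPH}/oauth/access_token?grant_type=fb_exchange_token&client_id=${APP_ID}&client_secret=${APP_SECRET}&fb_exchange_token=${shortLived}`;
  const res = await fetch(url);
  return (await res.json()).access_token;
}

// 3) Page tokens for the Pages the user manages
async function getPageTokens(longLived: string) {
  const res = await fetch(`${GRAPH}/me/accounts?access_token=${longLived}`);
  const body = await res.json();
  return body.data; // each entry contains id, name and a page access_token
}
```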
by Agent Studio
**Overview**

This workflow allows you to trigger custom logic in n8n directly from Retell's Voice Agent using Custom Functions. It captures a POST webhook from Retell every time a Voice Agent reaches a Custom Function node. You can plug in any logic: call an external API, book a meeting, update a CRM, or even return a dynamic response back to the agent.

**Who is it for**

For builders using Retell who want to extend Voice Agent functionality with real-time custom workflows or AI-generated responses.

**Prerequisites**

- A Retell AI account
- A Retell agent with a Custom Function node in its conversation flow (see template below)
- Your n8n webhook URL set in the Custom Function configuration (see "How to use it" below)
- (Optional) Familiarity with Retell's Custom Function docs
- Start a conversation with the agent (text or voice)

**Retell agent example**

To get you started, we've prepared a Retell agent ready to be imported that includes the call to this template.

- Import the agent to your Retell workspace (top-right button on your agent's page)
- You will need to modify the function URL in order to call your own instance
- This template is a simple hotel agent that calls the custom function to confirm a booking, passing basic formatted data

**How it works**

Retell sends a webhook to n8n whenever a Custom Function is triggered during a call (or test chat). The webhook includes:

- Full call context (transcript, call ID, etc.)
- Parameters defined in the Retell function node

You can process this data and return a response string back to the Voice Agent in real time (see the sketch below).

**How to use it**

1. Copy the webhook URL (e.g. https://your-instance.app.n8n.cloud/webhook/hotel-retell-template)
2. Modify the Retell Custom Function webhook URL (see template description for screenshots): edit the function and modify the URL
3. Modify the logic in the Set node or replace it with your own custom flow
4. Deploy and test: Retell will hit your n8n workflow during the conversation

**Extension ideas**

- Call a third-party API to fetch data (e.g. hotel availability, CRM records)
- Use an LLM node to generate dynamic responses
- Trigger a parallel automation (Slack message, calendar invite, etc.)

👉 Reach out to us if you're interested in analyzing your Retell Agent conversations.
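To make the exchange concrete, here is an illustrative sketch of the kind of payload Retell might send and the response string the workflow returns. The field names are assumptions (check Retell's Custom Function docs for the real contract); only the overall request/response pattern is the point.

```typescript
// Illustrative only: the exact field names in Retell's Custom Function
// webhook are defined in Retell's docs and may differ from this sketch.
// The point is the shape of the exchange: Retell POSTs call context plus
// the function arguments, and the workflow answers with a response string.
interface RetellFunctionCall {
  call?: { call_id?: string; transcript?: string }; // call context (assumed names)
  name?: string;                                     // the custom function's name
  args?: Record<string, unknown>;                    // parameters defined in Retell
}

// What an n8n "Respond to Webhook" node might send back: a short string the
// Voice Agent can speak. Here we pretend the function confirms a hotel booking.
function buildResponse(payload: RetellFunctionCall): string {
  const guest = (payload.args?.guest_name as string) ?? "the guest"; // hypothetical parameter
  return `Booking confirmed for ${guest}. A confirmation email is on its way.`;
}

console.log(buildResponse({ args: { guest_name: "Alice" } }));
```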
by Lucía Maio Brioso
**🧑💼 Who is this for?**

If you're using Notion to manage a database (like saving links, tasks, notes, or anything really), and it's starting to get messy with duplicate entries, this workflow is for you. It's especially useful if you want to keep things tidy without doing any manual cleanup.

**🧠 What problem is this workflow solving?**

Notion doesn't have a built-in way to find or remove duplicates, so you either clean them up manually 😩 or just let them pile up. This workflow automatically finds entries that share the same property (like a URL or title) and archives the extra copies, keeping just one.

**⚙️ What this workflow does**

1. Pulls all pages from a Notion database.
2. Identifies duplicates based on a property you choose (see the sketch below).
3. Archives the duplicate pages (which is like soft-deleting them).
4. Keeps one version of each duplicate group.

It includes two optional triggers:

- Run it every day ⏰
- Or trigger it automatically when a new page is added to the database ⚡

**🛠️ Setup**

1. Connect your Notion account in n8n.
2. Select your database in the Notion nodes.
3. In the "Format items properly" node, replace "SET YOUR PROPERTY HERE" with a reference to the property you want to use for detecting duplicates. I recommend using n8n's property drag-and-drop feature.
4. Enable whichever trigger you prefer, or both.

And that's it. It runs on its own after that.

**🧩 How to customize this workflow to your needs**

- Use a different property for detecting duplicates by updating the Set node.
- Want to tag duplicates instead of archiving them? Just replace the last Notion node with an update operation.
- Adjust the schedule to run it hourly, weekly, or whenever suits your setup.
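If you are curious what the duplicate detection boils down to, here is a minimal sketch written the way it might look in an n8n Code node. The property name and item shape are placeholders for whatever you configure in the "Format items properly" node.

```typescript
// A minimal sketch of the duplicate-detection step. The property used as the
// key ("dedupeKey") and the item shape are placeholders; the real workflow
// reads whatever property you configure in the "Format items properly" node.
interface NotionItem {
  id: string;        // Notion page ID
  dedupeKey: string; // the property used to detect duplicates (e.g. a URL)
}

function findDuplicates(items: NotionItem[]): NotionItem[] {
  const seen = new Set<string>();
  const duplicates: NotionItem[] = [];
  for (const item of items) {
    if (seen.has(item.dedupeKey)) {
      duplicates.push(item); // everything after the first occurrence gets archived
    } else {
      seen.add(item.dedupeKey); // the first occurrence is kept
    }
  }
  return duplicates;
}

const pages = [
  { id: "a", dedupeKey: "https://example.com" },
  { id: "b", dedupeKey: "https://example.com" },
  { id: "c", dedupeKey: "https://n8n.io" },
];
console.log(findDuplicates(pages)); // -> [{ id: "b", ... }]
```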
by Jordan Lee
This flexible template scrapes business listings for any industry and location, making it perfect for sales teams, marketers, and researchers.

**Good to know**

- Works with any business category (restaurants, contractors, retailers, etc.)
- Fully customizable search parameters
- Results automatically organized in Google Sheets
- A built-in delay ensures scraping completes before data collection

**How it works**

1. Trigger: manual or scheduled start
2. Apify configuration: sets scraping parameters (industry, location, data fields)
3. Scraping execution: runs the web scraping job
4. Data processing: cleans and structures the raw data
5. Storage: saves results to your Google Sheets

**What is Apify?**

Apify is a web scraping tool. In this workflow the data is scraped with a Google Maps scraper: https://apify.com/compass/crawler-google-places

**How to use**

**Apify Small # Lead Generation (Purple)**

1. Open https://apify.com/compass/crawler-google-places
2. Add the location and industry to scrape (in Apify)
3. Add the number of leads to output (in Apify)
4. Copy the JSON file over into n8n
5. Copy & paste the "Get Run URL" API endpoint into n8n

**Apify Large # Lead Generation (Grey)**

Configure the Manual Trigger:

- The "When clicking 'Execute workflow'" node is ready to use as-is
- This triggers the entire lead generation process

Set up the "Start Results (Apify)" node. First, get your Apify API information:

1. Go to Apify.com and create a free account
2. Navigate to Settings → Integrations → API tokens
3. Copy your API token
4. Find the Google Maps scraper actor ID

Configure the HTTP Request (start results):

- Method: POST
- URL: replace "enter apify (get run)" with https://api.apify.com/v2/acts/nwua9Gu5YrADL7ZDj/runs?token=YOUR_API_TOKEN

Customize the JSON body parameters (a sample body is sketched below). In the JSON body, modify these key fields for location & search:

- "locationQuery": change "Toronto" to your target city
- "searchStringsArray": change ["barber"] to your business type, e.g. ["restaurants"], ["dentists"], ["contractors"]

Configure the HTTP Request (get results):

- Method: GET
- URL: enter the "get dataset" URL from Apify

Split Out node:

- Select the fields to append to the Google Sheet

Test the configuration:

1. Click Execute workflow to test
2. Check that the Apify job starts successfully
3. Note the job ID returned for the next section

This section initiates the scraping process and should complete in 30-60 seconds depending on your lead count.

**Setup Google Sheets**

Create a new Google Sheet with these columns:

- title (business name)
- address (full address)
- state (state/province)
- neighborhood (area/district)
- phone (contact number)
- emails (email addresses)

Copy your Google Sheets document ID for the workflow configuration.

**Requirements**

- Apify account
- Google Sheets document
- Google OAuth credentials

**Customization options**

For different use cases:

- Lead gen: get business leads
- Local SEO: collect competitor data
- Market research: analyze industry trends

Advanced modifications:

- Add email enrichment
- Integrate with CRM systems
- Set up automatic daily runs
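As a reference for the "Start Results (Apify)" request, here is a sketch of the call with the JSON body fields mentioned above. The actor URL, `locationQuery`, and `searchStringsArray` come from this template; the lead-count field is an assumption, so check the actor's input schema on Apify.

```typescript
// A sketch of what the "Start Results (Apify)" HTTP Request node sends.
// The actor ID and the first two body fields come from this template; the
// lead-count field is an assumption, so check the actor's input schema.
const APIFY_TOKEN = process.env.APIFY_TOKEN!; // your token from Settings → Integrations

async function startGoogleMapsRun() {
  const url = `https://api.apify.com/v2/acts/nwua9Gu5YrADL7ZDj/runs?token=${APIFY_TOKEN}`;
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      locationQuery: "Toronto",       // change to your target city
      searchStringsArray: ["barber"], // change to your business type
      maxCrawledPlacesPerSearch: 50,  // assumption: caps the number of leads
    }),
  });
  const body = await res.json();
  return body.data?.id; // the run ID, used later to fetch the dataset
}

startGoogleMapsRun().then((runId) => console.log("Apify run started:", runId));
```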
by Nicolas
**What is it**

This workflow builds a simple bot that sends a message to a Telegram channel every time a new item is saved to the Reader. Thanks to existing n8n nodes, it can easily be modified to send the notification through other channels.

Warning: this is only for folks who already have access to the Reader; it won't work if you don't.

This workflow also uses a file to store the last update time, so it doesn't re-sync everything on every run (see the sketch below).

**Setup**

The config node contains:

- The Telegram channel ID
- The file used as storage

To get the header auth, you have to:

1. Go to the Reader
2. Open the devtools: Option + ⌘ + J (on macOS), or Shift + CTRL + J (on Windows/Linux)
3. Go to Network and find a profile_details/ request, then click on it
4. Go to Request Headers
5. Copy the value of "Cookie"
6. In n8n, set the name of the Header Auth account to Cookie and the value to the one you copied before
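Here is a hedged sketch of the "only notify about new items" logic that the storage file enables. The file path and item fields are placeholders; the real workflow works with whatever the Reader request returns and the path you set in the config node.

```typescript
// A sketch of the incremental-sync logic backed by the storage file.
// File path and item shape are placeholders, not values from the template.
import { readFileSync, writeFileSync, existsSync } from "node:fs";

const STATE_FILE = "/tmp/reader-last-sync.txt"; // hypothetical path

interface ReaderItem {
  title: string;
  savedAt: string; // ISO timestamp of when the item was saved
}

function filterNewItems(items: ReaderItem[]): ReaderItem[] {
  const lastSync = existsSync(STATE_FILE)
    ? new Date(readFileSync(STATE_FILE, "utf8").trim())
    : new Date(0); // first run: everything counts as new

  const fresh = items.filter((i) => new Date(i.savedAt) > lastSync);

  // Persist the newest timestamp so the next run skips what was already sent.
  if (fresh.length > 0) {
    const newest = fresh.map((i) => i.savedAt).sort().at(-1)!;
    writeFileSync(STATE_FILE, newest);
  }
  return fresh; // each of these becomes one Telegram message
}
```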
by Julian Kaiser
**🗂️ Bulk File Upload to Google Drive with Folder Management**

**How it works**

1. The user submits files and a target folder name via a form
2. The workflow checks if the folder exists in Drive (see the query sketch below)
3. It creates the folder if needed, or uses the existing one
4. It processes and uploads all files, maintaining the structure

**Set up steps (est. 10-15 mins)**

1. Set up Google Drive credentials in n8n
2. Replace the parent folder ID in the search query with your Drive folder ID
3. Configure the form node with:
   - A multiple file upload field
   - A folder name text field
4. Test the workflow with sample files

💡 Detailed configuration steps and patterns are documented in sticky notes within the workflow.

**Perfect for**

- Bulk file organization
- Automated Drive folder management
- File upload automation
- Maintaining consistent file structures
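For context, here is a sketch of the folder lookup using the Google Drive v3 query syntax the search is based on. The parent folder ID is a placeholder, and in the actual workflow the Google Drive node and your OAuth2 credentials handle the request for you.

```typescript
// A sketch of the folder check, using the Google Drive v3 files.list query
// syntax. The parent folder ID and token handling are placeholders; in n8n
// the Google Drive node and your OAuth2 credentials do this for you.
const PARENT_FOLDER_ID = "YOUR_PARENT_FOLDER_ID"; // replace with your Drive folder ID

async function findFolder(name: string, accessToken: string) {
  const q = [
    `name = '${name.replace(/'/g, "\\'")}'`,
    "mimeType = 'application/vnd.google-apps.folder'",
    `'${PARENT_FOLDER_ID}' in parents`,
    "trashed = false",
  ].join(" and ");

  const url = `https://www.googleapis.com/drive/v3/files?q=${encodeURIComponent(q)}&fields=files(id,name)`;
  const res = await fetch(url, { headers: { Authorization: `Bearer ${accessToken}` } });
  const body = await res.json();
  return body.files?.[0] ?? null; // null -> the workflow creates the folder instead
}
```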
by Sirhexalot
This workflow facilitates seamless synchronization between Entra (Microsoft Azure AD) and Zammad. It automates the following processes:

1. **Fetch Entra contacts**
2. **Create a universal user object**: extracts key user information, such as email, phone, and name, and formats it for Zammad compatibility.
3. **Synchronize with Zammad**: identifies users in Zammad who need updates based on Entra data, adds new users from Entra to Zammad, and deactivates users in Zammad if they are no longer in Entra (see the sketch below).

**Key features**

- **Dynamic matching**: compares contacts from Entra with existing Zammad users based on email and updates records accordingly.
- **Efficient management**: automatically creates, updates, or deactivates Zammad users based on their status in Entra.
- **Custom fields**: supports custom field mapping, ensuring enriched user profiles in Zammad.

**Setup instructions**

1. Microsoft Entra integration:
   - Ensure proper API permissions for accessing Entra contacts.
   - Configure Microsoft OAuth2 credentials in n8n.
2. Zammad integration:
   - Set up Zammad API credentials with appropriate access rights.
   - Customize the workflow to include additional fields or map existing fields as needed.
3. Run the workflow:
   - Trigger the workflow manually or set up an automation schedule (e.g., daily sync).
   - Review created/updated/deactivated users in Zammad.

**Use cases**

- **IT administration**: keep your support system in sync with the organization's Entra data.
- **Customer management**: ensure accurate and up-to-date user records in Zammad.

**Prerequisites**

- Access to an Entra (Azure AD) environment with contacts data.
- A Zammad instance with API credentials for user management.
- A custom field in the Zammad user object (entra_key) of type String.
- A custom field in the Zammad user object (entra_object_type) of type Single selection, with two key-value pairs: user = User, contact = Contact.

This workflow is fully customizable and can be adapted to your organization's specific needs. Save time and reduce manual errors by automating your user sync process with this template! If you have found an error or have any suggestions, please report them here on GitHub.
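Conceptually, the synchronization step reduces to a diff keyed on email. The sketch below is illustrative only; the record shapes are simplified and the real workflow also carries the entra_key and entra_object_type custom fields.

```typescript
// A sketch of the create / update / deactivate decision the sync makes,
// keyed on email. Simplified placeholder shapes; the real workflow also
// tracks custom fields such as entra_key and entra_object_type.
interface EntraContact { email: string; name: string; phone?: string }
interface ZammadUser { id: number; email: string; active: boolean }

function planSync(entra: EntraContact[], zammad: ZammadUser[]) {
  const entraByEmail = new Map(entra.map((c) => [c.email.toLowerCase(), c]));
  const zammadByEmail = new Map(zammad.map((u) => [u.email.toLowerCase(), u]));

  const toCreate = entra.filter((c) => !zammadByEmail.has(c.email.toLowerCase()));
  const toUpdate = entra.filter((c) => zammadByEmail.has(c.email.toLowerCase()));
  const toDeactivate = zammad.filter(
    (u) => u.active && !entraByEmail.has(u.email.toLowerCase())
  );

  return { toCreate, toUpdate, toDeactivate };
}
```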
by Juan Carlos Cavero Gracia
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Description**

See the transformation in action! Here's an example of what this workflow can achieve.

This automation template is designed for content creators, social media managers, and anyone looking to breathe new life into old family photos and historical images. It transforms any old black-and-white or sepia photograph into a colorized, animated video using cutting-edge AI technology, then automatically publishes the results across multiple social media platforms, including Facebook, Instagram, YouTube, and X (Twitter).

The workflow combines powerful AI services to create engaging content from vintage photographs: first enhancing and colorizing the image using FLUX Kontext, then bringing it to life with realistic animations using Kling Video AI, and finally distributing the results across your social media channels automatically.

Note: the estimated cost per workflow execution is approximately $0.29 USD, covering the AI processing for both image colorization and video animation. The upload-post node only works on self-hosted n8n instances, but you can use the standard HTTP Request node for uploading content on n8n Cloud.

**Who is this for?**

- **Content creators & social media managers**: transform historical content into engaging videos that capture audience attention and drive engagement across platforms.
- **Family history enthusiasts**: bring old family photos to life by adding color and motion, creating emotional connections with your audience.
- **Marketing professionals**: leverage nostalgic content for brand storytelling, using vintage aesthetics to create compelling social media campaigns.
- **Digital artists & photo restorers**: streamline the process of enhancing and sharing restored vintage photographs with automated AI enhancement.
- **Social media influencers**: create unique, eye-catching content from historical images that stands out in crowded social feeds.

**What problem does this workflow solve?**

Creating engaging social media content from old photos typically requires multiple manual steps: photo restoration, colorization, animation, and then individual posting to each platform. This workflow addresses these challenges by:

- **Automating photo enhancement**: uses advanced AI (FLUX Kontext) to automatically colorize and enhance old photographs, removing artifacts and improving quality.
- **Creating dynamic content**: transforms static images into animated videos using Kling Video AI, making historical photos come alive with natural movements.
- **Streamlining multi-platform publishing**: automatically distributes the final animated videos across Facebook, Instagram, YouTube, and X with a single workflow execution.
- **Saving time & effort**: eliminates the need for manual photo editing, video creation, and individual social media posting.

**How it works**

1. Photo upload: users submit old photographs through a simple web form, with an optional custom animation description.
2. Image enhancement: the workflow uploads the photo to imgbb, then sends it to FLUX Kontext AI for colorization and quality enhancement.
3. Animation creation: the colorized image is processed by Kling Video AI to create a 5-second animated video with natural movements.
4. Cloud storage: the final video is automatically saved to Google Drive for backup and easy access.
5. Multi-platform publishing: the animated video is simultaneously posted to Facebook, Instagram, YouTube, and X using the upload-post service.
**Setup**

1. **FAL.AI API key**: sign up at fal.ai and add your API key to the HTTP Request nodes for both the FLUX Kontext and Kling Video AI services.
2. **imgbb API token**: create a free account at api.imgbb.com to get an API token for image hosting, then update the "Upload Image to imgbb" node (see the sketch below).
3. **Google Drive connection**: connect your Google Drive account to enable automatic video storage and backup.
4. **Upload-post service**: create an account at upload-post.com to get your API credentials for multi-platform social media posting. Important: the upload-post node currently only works with self-hosted n8n instances. For n8n Cloud users, replace the upload-post node with standard HTTP Request nodes to publish to each social media platform individually.
5. **Form customization** (optional): modify the form fields in the "Photo Upload Form" node to collect additional information or customize the user experience.

**Requirements**

- **Accounts**: n8n, FAL.AI, imgbb, Google Drive, upload-post.com
- **API keys & credentials**: FAL.AI API key, imgbb API token, Google Drive OAuth2, upload-post.com API token & user ID
- **File types**: supports JPG and PNG image formats for photo uploads
- **Cost**: approximately $0.29 USD per workflow execution for AI processing

Transform your old photographs into viral social media content with this powerful AI-driven workflow that handles everything from restoration to distribution automatically.
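As an illustration of the image-hosting step, here is roughly what the imgbb upload looks like, based on imgbb's public v1 upload endpoint. Treat the response field as illustrative; the FAL.AI calls are not shown because their endpoints depend on the exact FLUX Kontext and Kling models used.

```typescript
// A sketch of the upload the "Upload Image to imgbb" node performs, based on
// imgbb's public v1 endpoint. The response field ("data.url") is taken from
// imgbb's docs but should be verified before you rely on it.
const IMGBB_KEY = process.env.IMGBB_API_KEY!;

async function uploadToImgbb(base64Image: string): Promise<string> {
  const form = new FormData();
  form.append("key", IMGBB_KEY);
  form.append("image", base64Image); // base64 payload or a public image URL

  const res = await fetch("https://api.imgbb.com/1/upload", {
    method: "POST",
    body: form,
  });
  const body = await res.json();
  return body.data.url; // public URL that is then handed to FLUX Kontext
}
```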
by Airtop
**Trump-o-meter: Extract and Evaluate Truth Social Posts**

**Use case**

Automatically extracting posts from Donald Trump's Truth Social account and estimating their potential impact on the U.S. stock market enables teams to monitor high-profile communications that may influence financial markets. This automation streamlines intelligence gathering for analysts, traders, and policy observers.

**What this automation does**

This automation retrieves up to 3 posts from Donald Trump's Truth Social profile and outputs structured information including:

- Author name
- Image URL
- Post text
- Post URL
- Estimated stock market impact:
  - Direction: positive, negative, or neutral
  - Magnitude: None, Small, Medium, Large

**How it works**

1. Creates a browser session on Truth Social using an Airtop profile.
2. Navigates to https://truthsocial.com/@realDonaldTrump.
3. Uses a natural language prompt with a defined JSON schema to extract structured data for up to 3 posts (an illustrative schema is sketched below).
4. Splits the results into individual post items.
5. Filters posts that contain actual content and have a non-zero estimated market impact.
6. Sends selected posts and impact summaries to a Slack channel.
7. Terminates the browser session to clean up.

**Setup requirements**

- Airtop API key (free to generate)
- An Airtop profile that is connected and logged into Truth Social
- A Slack workspace and an authorized app with write permissions to a target channel

**Next steps**

- **Integrate with trading signals**: link the output to financial alert systems or dashboards for timely insights.
- **Expand monitoring**: extend to other high-impact accounts (e.g., politicians, CEOs).
- **Enhance analysis**: add sentiment scoring or topic classification for deeper context.

**Legal disclaimer**

This tool is intended solely for informational and analytical purposes. The market impact estimations provided are speculative and should not be construed as financial advice. Do not make investment decisions based on this automation. Always consult with a licensed financial advisor before making any trades.

Read more about the Trump-o-meter automation.
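For reference, the sketch below shows one way the extraction output and the content/impact filter could be expressed. The field names mirror the list above but are not copied from the template's actual JSON schema.

```typescript
// An illustrative shape of the structured output the Airtop extraction step
// asks for. The real workflow defines this as a JSON schema in the extraction
// prompt; these field names mirror the list above, not the template itself.
interface ExtractedPost {
  author: string;   // Author name
  imageUrl: string; // Image URL (empty string if the post has no image)
  text: string;     // Post text
  postUrl: string;  // Post URL
  marketImpact: {
    direction: "positive" | "negative" | "neutral";
    magnitude: "None" | "Small" | "Medium" | "Large";
  };
}

// The filter step keeps only posts with content and a non-zero impact.
function shouldNotify(post: ExtractedPost): boolean {
  return post.text.trim().length > 0 && post.marketImpact.magnitude !== "None";
}
```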
by Airtop
**Use case**

Automatically responding to X (formerly Twitter) posts can help you engage with potential customers at scale, saving time while maintaining a personal touch.

**What this automation does**

This automation replies to specified X posts using the following input parameters (a sample input is sketched below):

- airtop_profile: the name of your Airtop profile connected to X.
- thread_url: the URL of the X post to reply to.
- reply_text: the message you want to post as a reply.

**How it works**

1. Creates a browser session using Airtop.
2. Navigates to the specified X post.
3. Types and submits the reply text.

**Setup requirements**

- Airtop API key (free to generate)
- An Airtop profile connected to X (requires a one-time login)

**Next steps**

- **Combine with X monitoring**: use this with the X monitoring automation to create a fully automated engagement pipeline.
- **Extend to other platforms**: adapt the automation for use on LinkedIn, Reddit, or any web community.

Read more about this Airtop automation.
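The sample below shows the shape of the three inputs, for example as the JSON body of a call into this workflow. The values are placeholders.

```typescript
// A minimal sample of the three inputs this automation expects.
// All values are placeholders.
const input = {
  airtop_profile: "my-x-profile",                          // your Airtop Profile connected to X
  thread_url: "https://x.com/someuser/status/1234567890",  // the post to reply to
  reply_text: "Thanks for sharing, this is really useful!",
};

console.log(JSON.stringify(input, null, 2));
```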
by Extruct AI
**Who's it for**

Investors, analysts, and startup enthusiasts who need a complete overview of startups, including industry, product, funding, and leadership information.

**How it works / What it does**

Enter a startup's name into the form, and the workflow will automatically collect and organize details such as the company's industry, product, investors, and key decision-makers. All this information is neatly updated in your Google Sheet, making it easy to track and compare startups.

**How to set up**

1. Sign up for Extruct at www.extruct.ai/.
2. Open the Extruct table template, copy the table ID from the URL, and save it.
3. Copy the Google Sheets template to your own Drive.
4. Paste the table ID into the variables node in your n8n flow.
5. Set up Bearer authentication in each HTTP Request node using your Extruct API token (see the sketch below).
6. In the Google Sheets node, paste your template link and connect your Google account.
7. Run the flow once to reveal the mapping fields, then match each field to the correct column.
8. Activate the flow and add startups via the form.

**Requirements**

- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

**How to customize the workflow**

Add new columns in both the Extruct table and your Google Sheet, then map them in the Google Sheets node to track additional startup data.
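For step 5, here is a minimal sketch of Bearer authentication in an HTTP request. The base URL, path, and body are hypothetical placeholders (the real endpoints come from the Extruct table template); only the Authorization header pattern matters here.

```typescript
// A minimal sketch of the Bearer authentication each HTTP Request node uses.
// The base URL, path, and payload are hypothetical placeholders; only the
// Authorization header pattern is the point of this example.
const EXTRUCT_TOKEN = process.env.EXTRUCT_API_TOKEN!;
const TABLE_ID = "YOUR_TABLE_ID"; // copied from the Extruct table URL

async function callExtruct(path: string, payload: unknown) {
  const res = await fetch(`https://api.extruct.ai/${path}`, { // hypothetical base URL
    method: "POST",
    headers: {
      Authorization: `Bearer ${EXTRUCT_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  });
  return res.json();
}

// Hypothetical usage: add a startup row to the configured table.
callExtruct(`tables/${TABLE_ID}/rows`, { company_name: "Acme AI" }).then(console.log);
```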