by Niklas Hatje
## Use case

When collecting new leads via a form, you need to follow up on new submissions. This often requires a lot of manual work: reviewing each submission, checking whether it meets your criteria, and then reaching out. This workflow does all of that automatically and saves you a lot of valuable time.

## What this workflow does

The workflow runs every time you receive a new submission from an n8n form. It filters out typical personal email domains (Gmail, Hotmail, Yahoo, etc.) before enriching the submission via Clearbit. It then checks whether the submitter's company is a B2B company with more than 499 employees. If it is, it sends the submitter an email via Gmail.

## Setup

1. Add the Clearbit and Gmail credentials.
2. Click on Test Workflow.
3. Enter your own email (which needs to be a business email to work) in the form.
4. Check your email.
5. Once you're happy, don't forget to activate the workflow.

## How to adjust this template

- Replace the form trigger with your form provider of choice (e.g. Typeform, SurveyMonkey, Google Forms).
- Adjust the criteria to your needs via the If node (a sketch of equivalent filter logic in a Code node follows below).
- Adjust the email you're sending in the Gmail node.
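If you ever replace the If node with a Code node, the filtering criteria might look roughly like this minimal sketch. The field names (`email`, `company.tags`, `company.metrics.employees`) are assumptions based on a typical form submission plus Clearbit-style enrichment, not taken from the template:

```javascript
// A minimal sketch of the filtering criteria, assuming the form submission is on
// item.json.email and the Clearbit enrichment was written to item.json.company.
// These field names are illustrative, not taken from the template.
const FREE_EMAIL_DOMAINS = ['gmail.com', 'hotmail.com', 'yahoo.com', 'outlook.com'];

return items.filter((item) => {
  const email = (item.json.email || '').toLowerCase();
  const domain = email.split('@')[1] || '';
  if (FREE_EMAIL_DOMAINS.includes(domain)) return false; // drop personal email providers

  const company = item.json.company || {};
  const isB2B = (company.tags || []).includes('B2B'); // Clearbit-style tags, assumed
  const employees = (company.metrics && company.metrics.employees) || 0;
  return isB2B && employees > 499; // keep only large B2B companies
});
```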
by Milorad Filipovic
## How it works

This workflow sends you a monthly list of live music events for a specific location. Events are fetched from songkick.com and delivered by email to the address(es) you provide. Each event in the list links to its full SongKick page, where you can see more details about the event and buy tickets.

Example email:

## How to set it up

First, this workflow needs a location link for your desired city from the SongKick website. You can get it by following these steps:

1. Visit songkick.com and enter the city name in the search box.
2. On the results page, click the result that contains the location info. These results have the Location tag above the location name.
3. Once on the location page, copy the URL.
4. Back on the n8n workflow page, paste the URL into the location parameter of the node called Setup location and email.

Second, it needs the email address that will receive the monthly list. Simply enter it in the email field of the Setup location and email node. If you want multiple receivers, separate the email addresses with commas (see the sketch after this section).

## Required accounts

This workflow requires a properly set up Gmail account, which the Gmail node uses to send emails. You can read more about how to create credentials for a Gmail node in the n8n documentation here.
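For reference, the two setup values could be sketched like this. This is a hypothetical illustration only; the real parameter names live in the Setup location and email node and the location URL is just an example:

```javascript
// Hypothetical example of the two setup values; the real parameter names live in
// the "Setup location and email" node and may differ slightly.
const location = 'https://www.songkick.com/metro-areas/28714-germany-berlin';
const emails = 'alice@example.com, bob@example.com';

// Multiple receivers are comma-separated, as described above.
const receivers = emails.split(',').map((e) => e.trim());

return [{ json: { location, receivers } }];
```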
by Arthur Braghetto
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Clean Web Content Extraction with Anti-Bot Fallback

Extract clean, structured text from any webpage, with an optional fallback to an anti-bot scraping service. Ideal for AI tools and content workflows.

### 🧠 How it Works

This sub-workflow enables reliable, clean scraping of any public webpage by simply passing a `url` parameter. It is designed to be embedded in other workflows or used as a tool for AI agents. It supports two output modes:

- `fulltext: true` — returns `{ title, text }` with the full page content
- `fulltext: false` — returns `{ title, url, content }` with a short excerpt

💡 If the site is protected by anti-bot systems (like Cloudflare), the workflow automatically falls back to Scrape.do, a scraping API with a generous free plan.

🧩 This template requires the n8n-nodes-webpage-content-extractor community node, so it only works in self-hosted n8n environments.

### 🚀 Use Cases

- As a reusable sub-workflow, via the Execute Sub-workflow node.
- As a tool for an AI Agent, compatible with the Call n8n Workflow Tool.
- Perfect for chatbots, summarization workflows, or RSS/feed enrichment.
- Empowers your AI Agent with the ability to browse and extract readable content from websites automatically.

### 🔖 Parameters

- `url` (string): the webpage URL to scrape
- `fulltext` (boolean): set to true for full page content, false for summarized output

### ⚙️ Setup

1. Install the community node n8n-nodes-webpage-content-extractor in your self-hosted n8n instance.
2. Create a free account at Scrape.do and obtain your API Token.
3. In the workflow, locate the Scrape.do HTTP Request node and configure the credentials using your API Token.

Detailed step-by-step instructions are available in the workflow notes. The Scrape.do API is only used as a fallback when conventional scraping fails, helping you preserve your API credits.
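When calling this sub-workflow from an Execute Sub-workflow node (or from an AI agent tool), the expected input can be prepared in a Code node like the sketch below. The field names follow the parameters listed above; the URL is just a placeholder:

```javascript
// Example input for the sub-workflow, e.g. prepared in a Code node before an
// Execute Sub-workflow node. The URL is just a placeholder.
return [{
  json: {
    url: 'https://example.com/some-article',
    // true  => { title, text } with the full page content
    // false => { title, url, content } with a short excerpt
    fulltext: true,
  },
}];
```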
by scrapeless official
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Prerequisites

- An n8n account (free trial available)
- A Scrapeless account and API key
- A Google account to access Google Sheets

## 🛠️ Step-by-Step Setup

### 1. Create a New Workflow in n8n

Start by creating a new workflow in n8n. Add a Manual Trigger node to begin.

### 2. Add the Scrapeless Node

- Add the Scrapeless node and choose the Scrape operation
- Paste in your API key
- Set your target website URL
- Execute the node to fetch data and verify the results

### 3. Clean Up the Data

Add a Code node to clean and format the scraped data. Focus on extracting key fields like:

- Title
- Description
- URL

### 4. Set Up Google Sheets

- Create a new spreadsheet in Google Sheets
- Name the sheet (e.g., Business Leads)
- Add columns like Title, Description, and URL

### 5. Connect Google Sheets in n8n

- Add the Google Sheets node
- Choose the Append or update row operation
- Select the spreadsheet and worksheet
- Manually map each column to the cleaned data fields

### 6. Run and Test the Workflow

- Click "Execute Workflow" in n8n
- Check your Google Sheet to confirm the data is properly inserted

## Results

With this automated workflow, you can continuously extract business lead data, clean it, and push it directly into a spreadsheet, which is perfect for outbound sales, lead lists, or internal analytics.

## How to Use

- ⚙️ Open the Variables node and plug in your Scrapeless credentials.
- 📄 Confirm the Google Sheets node points to your desired spreadsheet.
- ▶️ Run the workflow manually from the Start node.

## Perfect For

- Sales teams doing outbound prospecting
- Marketers building lead lists
- Agencies running data aggregation tasks
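A minimal sketch of what the cleanup Code node in step 3 might contain. The input shape is an assumption; always inspect the actual Scrapeless node output first and adjust the field names accordingly:

```javascript
// Illustrative cleanup: keep only the fields the Google Sheet expects.
// The input shape is an assumption; inspect the actual Scrapeless node output first.
const results = $input.first().json.results || [];

return results.map((r) => ({
  json: {
    Title: r.title || '',
    Description: (r.description || '').trim(),
    URL: r.url || '',
  },
}));
```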
by Mark Shcherbakov
## Video Guide

I prepared a detailed guide that shows the whole process of building this workflow.

## Who is this for?

This workflow is ideal for developers, data analysts, and business owners who want to enable conversational interactions with their database. It's particularly useful when users need to extract, analyze, or aggregate data without writing SQL queries manually.

## What problem does this workflow solve?

Accessing and analyzing database data often requires SQL expertise or dedicated reports, which can be time-consuming. This workflow empowers users to interact with a database conversationally through an AI-powered agent. It dynamically generates SQL queries based on user requests, streamlining data retrieval and analysis.

## What this workflow does

This workflow integrates OpenAI with a Supabase database, enabling users to interact with their data via an AI agent. The agent can:

- Retrieve records from the database.
- Extract and analyze JSON data stored in tables.
- Provide summaries, aggregations, or specific data points based on user queries.

Key capabilities:

- Dynamic SQL querying: the agent uses user prompts to create and execute SQL queries on the database.
- Understand JSON structure: the workflow identifies the JSON schema from sample records, enabling the agent to parse and analyze JSON fields effectively.
- Database schema exploration: it provides the agent with tools to retrieve table structures, column details, and relationships for precise query generation.

## Setup

### Preparation

Create accounts:

- n8n: for workflow automation.
- Supabase: for database hosting and management.
- OpenAI: for building the conversational AI agent.

Configure the database connection:

- Set up a PostgreSQL database in Supabase.
- Use the appropriate credentials (username, password, host, and database name) in your workflow.

### n8n Workflow

AI agent with tools:

- Code Tool: execute SQL queries based on user input.
- Database Schema Tool: retrieve a list of all tables in the database, using a predefined SQL query to fetch table definitions, including column names, types, and references.
- Table Definition: retrieve a list of columns with types for one table.
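The schema-exploration tools typically boil down to queries against Postgres's information_schema. A hypothetical sketch of such a query, wrapped the way a Code Tool might hold it (the template's exact query may differ):

```javascript
// Hypothetical sketch of the query behind the Database Schema tool.
// Standard Postgres information_schema; the template's exact query may differ.
const schemaQuery = `
  SELECT table_name, column_name, data_type, is_nullable
  FROM information_schema.columns
  WHERE table_schema = 'public'
  ORDER BY table_name, ordinal_position;
`;

return [{ json: { query: schemaQuery } }];
```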
by PollupAI
## Who is this for?

This workflow is designed for professionals and teams who need to monitor multiple RSS feeds, filter the latest content, and distribute actionable updates as a Trello comment. Ideal for content managers, marketers, and team leads managing news or content pipelines.

## What problem is this workflow solving?

Manually monitoring RSS feeds and keeping track of the latest content can be time-consuming. This workflow automates the aggregation, filtering, and distribution of news, ensuring that only relevant and timely updates are shared with your team or audience.

## What this workflow does

- Aggregates RSS feeds: pulls data from up to three RSS feeds simultaneously.
- Filters content: filters articles based on their publication date (default: last 7 days).
- Organizes and sorts: sorts the filtered articles by date for clarity.
- Formats updates: transforms news items into Markdown format for better readability.
- Publishes and notifies: posts comments to Trello cards and sends an email to a moderator to check the comment.

## Setup

1. Connect your RSS feeds by configuring the RSS Read nodes.
2. Link your Trello and Gmail accounts for seamless integration.
3. Adjust the schedule trigger to set how often the workflow should run (e.g., daily, weekly).
4. Test the workflow to ensure all connections and configurations are correct.

## How to customize this workflow to your needs

- Change the number of RSS feeds: add or remove RSS Read nodes and update the merge configuration accordingly.
- Adjust the date filter: modify the date logic in the "Filter by date" node to include more or fewer days.
- Limit the number of articles: adjust the limit in the "Limit news to x" node.
- Custom formatting: update the Transform node to format the news items differently.
- Alternative notifications: replace Trello and Gmail with other integrations, such as Slack or Microsoft Teams.

This workflow ensures your team stays informed with minimal effort and delivers content updates in an organized and professional manner.
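For orientation, the date filter, sort, and Markdown formatting steps could be sketched in a single Code node like this. It is a simplified illustration only; the template implements these as separate nodes, and the RSS field names (isoDate, title, link) are typical defaults that should be checked against your feeds:

```javascript
// Simplified combination of the date filter, sort, and Markdown formatting steps.
// The RSS field names (isoDate, title, link) are typical but should be checked
// against your feeds.
const DAYS = 7;
const cutoff = Date.now() - DAYS * 24 * 60 * 60 * 1000;

const recent = $input.all()
  .map((item) => item.json)
  .filter((a) => new Date(a.isoDate || a.pubDate).getTime() >= cutoff)
  .sort((a, b) => new Date(b.isoDate || b.pubDate) - new Date(a.isoDate || a.pubDate));

const markdown = recent
  .map((a) => `- [${a.title}](${a.link}) (${new Date(a.isoDate || a.pubDate).toDateString()})`)
  .join('\n');

return [{ json: { markdown } }];
```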
by Extruct AI
## Who's it for

Sales and business development professionals who want to monitor company news, hiring trends, and business signals for their leads.

## How it works / What it does

Add a company to the form, and the workflow will automatically search for the latest news, recent hires, company stage, and LinkedIn activity. The results are sent straight to your Google Sheet, helping you stay up to date with your leads and prospects.

## How to set up

1. Register for Extruct at www.extruct.ai/.
2. Open the Extruct table template and copy the table ID from the browser's address bar.
3. Make a copy of the Google Sheets template to your Drive.
4. Enter the table ID into the variables node in your n8n flow.
5. Set up Bearer authentication in all HTTP Request nodes using your Extruct API token.
6. In the Google Sheets node, paste your template link and connect your Google account.
7. Run the flow once to load the mapping fields, then match each output to the correct column.
8. Activate the flow and start adding companies through the form.

## Requirements

- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

## How to customize the workflow

To track more business development signals, add new columns in both the Extruct table and your Google Sheet, then map them in the Google Sheets node.
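The Bearer authentication in step 5 simply means every request carries an Authorization header with your token. The sketch below only illustrates that; the endpoint URL is a hypothetical placeholder, since the real Extruct endpoints are already configured in the template's HTTP Request nodes:

```javascript
// Placeholder endpoint and token: the real Extruct endpoints and credentials are
// configured in the template's HTTP Request nodes. This only illustrates what
// Bearer authentication amounts to.
const EXTRUCT_API_TOKEN = 'your-extruct-api-token';
const url = 'https://api.extruct.ai/some-endpoint'; // hypothetical path

const response = await fetch(url, {
  headers: {
    Authorization: `Bearer ${EXTRUCT_API_TOKEN}`,
    'Content-Type': 'application/json',
  },
});

return [{ json: await response.json() }];
```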
by Henry
## Who is this for?

This workflow is ideal for SEO specialists, web designers, and digital marketers who want to quickly draft effective landing page layouts by referencing established competitors. It suits users who need a fast, structured starting point for web design while ensuring competitive relevance.

## What problem is this workflow solving? / Use case

Designing a high-converting landing page from scratch can be time-consuming. This workflow automates the process of analyzing a competitor's website, identifying essential sections, and producing a tailored layout, helping users save time and improve their website's effectiveness.

## What this workflow does

The workflow fetches and analyzes your chosen competitor's landing page, using web scraping and structure-detection nodes in n8n. It identifies primary sections like hero banners, service highlights, testimonials, and contact forms, and then generates a simplified, customizable layout suitable for wireframing or initial design.

## Setup

1. Prepare your unique services and target audience profile for customization later.
2. Gather the competitor's landing page URL you wish to analyze.
3. Run the workflow, inputting your competitor's URL when prompted.

## How to customize this workflow to your needs

- After generating the initial layout, adapt section names and content blocks to highlight your services and brand messaging.
- Add or remove sections based on your objectives and audience insights.
- Integrate additional nodes for richer analysis, such as keyword extraction or design pattern detection, to tailor the output further.
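To give a feel for what structure detection involves, one very rough heuristic is to pull the headings out of the fetched HTML and treat them as candidate sections. This sketch is only illustrative and is not the template's actual structure-detection logic; it also assumes a previous node stored the raw HTML in a field called `data`:

```javascript
// Rough heuristic sketch, not the template's actual structure-detection logic:
// list the <h1>/<h2> headings of a fetched page as candidate sections.
// Assumes a previous node stored the raw HTML in a field called `data`.
const html = $input.first().json.data || '';

const sections = [...html.matchAll(/<h[12][^>]*>(.*?)<\/h[12]>/gis)]
  .map((m) => m[1].replace(/<[^>]+>/g, '').trim())
  .filter(Boolean);

return [{ json: { sections } }];
```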
by Extruct AI
## Who's it for

Sales teams, marketers, and analysts who need to quickly access all the social media and public profile links for any company.

## How it works / What it does

When you enter a company into the form, this workflow automatically searches for and collects all available links to the company's social media accounts, review sites, and public profiles from sources like Crunchbase and Zoominfo. All discovered URLs are added directly to your Google Sheet.

## How to set up

1. Create an Extruct account at www.extruct.ai/.
2. Open the Extruct table template, find the table ID in your browser's address bar, and copy it.
3. Make a copy of the provided Google Sheets template to your own Google Drive.
4. In n8n, paste the table ID into the variables node of your flow.
5. Set up Bearer authentication in every HTTP Request node using your Extruct API token (found on the API page in Extruct).
6. In the Google Sheets node, paste the link to your copied template and connect your Google account.
7. Run the flow once to load the fields, then map the output fields to the correct columns in your sheet.
8. Activate the flow and start adding companies via the form.

## Requirements

- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

## How to customize the workflow

You can add your own columns to the Extruct table and your Google Sheet. Just add the new column in both places and map it in the Google Sheets node in n8n.
by n8n Team
This workflow combines customer details with their payment data and passes the result to Pipedrive as a note on the organization.

## Prerequisites

- Stripe account and Stripe credentials
- Pipedrive account and Pipedrive credentials

## How it works

1. The Cron node triggers the workflow every day at 8 a.m.
2. The HTTP Request node searches for payments in Stripe.
3. The Item Lists node creates separate items from the list of payment data.
4. A Merge node takes in the payment data as input 1.
5. The Stripe node gets all the customer data.
6. The Set node renames the customer-related data fields and keeps only the needed fields.
7. The Merge node takes in the customer data as input 2.
8. The Merge node combines the payment data with the customer data.
9. The Pipedrive node searches for the organization and creates a note with the payment data.
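The Set node in step 6 essentially maps Stripe customer fields onto the names the note expects. A rough, illustrative equivalent in a Code node (the renamed field names are assumptions; the Stripe customer object does expose id, name, and email):

```javascript
// Rough equivalent of the Set node: rename and keep only the customer fields
// needed for the note. The renamed field names are illustrative assumptions.
return $input.all().map((item) => ({
  json: {
    customerId: item.json.id,
    customerName: item.json.name,
    customerEmail: item.json.email,
  },
}));
```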
by Bao Duy Nguyen
## Who is this for?

This template is ideal for developers, DevOps engineers, and automation managers who manage their n8n workflows using GitHub. It helps teams streamline their CI/CD automation by syncing changes from GitHub directly into n8n after a pull request (PR) is merged.

## What problem is this workflow solving?

Manually restoring workflows after reviewing and merging code in GitHub can be tedious and error-prone. This workflow solves that by automating the restore process, ensuring that any new or updated workflow committed to your GitHub repo is automatically imported into your n8n environment.

## What this workflow does

1. Triggers when a GitHub pull request is closed and merged.
2. Fetches the details of the merge commit.
3. Retrieves the list of added and modified workflow files.
4. Downloads and decodes each workflow file.
5. Creates or updates the corresponding workflow in your n8n instance automatically.

## Setup

1. Connect GitHub: use the GitHub Trigger node and configure GitHub API credentials. Note: I'd recommend using a classic GitHub PAT (Personal Access Token) with the repo and admin:repo_hook permission scopes enabled.
2. Connect the n8n API: provide your n8n API credentials in the n8n nodes. Check this doc.
3. Set repository variables: update github_owner and repo_name in the Define Local Variables node.
4. Enable the webhook: make sure your GitHub repository has a webhook for pull_request events pointing to this workflow.

## How to customize this workflow to your needs

- Modify filters to handle only certain branches or file paths.
- Add Slack or email notifications to confirm successful imports.
- Insert logging or version tagging for better traceability.
- Extend with conditional logic for workflow testing before applying changes.

This automated flow provides a seamless CI/CD loop between GitHub and n8n, empowering teams to manage workflow versioning efficiently and securely.
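The "downloads and decodes" step reflects the fact that GitHub's contents API returns file bodies base64-encoded. A minimal sketch of that decoding step in a Code node, assuming the previous node returns the standard `content` field:

```javascript
// GitHub's contents API returns file bodies base64-encoded in a `content` field.
// This sketch assumes the previous node returns that standard shape.
return $input.all().map((item) => {
  const decoded = Buffer.from(item.json.content, 'base64').toString('utf8');
  return { json: { workflow: JSON.parse(decoded) } };
});
```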
by Allan Daemon
This workflow creates a git backup of your workflows and credentials. It uses the n8n export command together with git diff, so you can run it as many times as you want: a commit is only created when there are changes.

## Setup

You need some access to the server.

Create a repository in a remote place to host your project, such as GitHub, GitLab, or your favorite private repo.

Clone the repository on the server, in a place that n8n has access to. In the example, the path is . and the repository name is repo. Change these in the commands below and in the workflow commands (you can set the path as a variable in the workflow). Check out another branch if you won't use master.

```bash
cd .
git clone YOUR_REPO_URL repo
```

Or you could git init and then add the remote (git remote add origin YOUR_REPO_URL), whatever pleases you more.

On the server, check that everything is set up for committing. Very likely you'll need to configure the git user email and name. Try to create a commit and push it upstream; anything else you need to configure (like the commit user) will come up along the way. I strongly suggest also testing the export commands to guarantee they work.

```bash
cd ./repo
git commit -m "Initial commit" --allow-empty
# -u is the same as --set-upstream
git push -u origin master

# Test the push to upstream with the first exported data
npx n8n export:workflow --backup --output ./repo/workflows/
npx n8n export:credentials --backup --output ./repo/credentials/
cd ./repo
git add .
git commit -m "manual backup: first export"
git push
```

After that, if everything is OK, the workflow should work just fine.

## Adjustments

Adjust the path used in the workflow. Note that git -C PATH ... is the same as cd PATH; git .... Also, adjust the cron schedule to run as often as you need. As mentioned above, you can run it even every minute, but it will only create commits when there are changes.

## Credentials encryption

By default, credentials are exported encrypted. You can add the --decrypted flag to the n8n export:credentials command if you need to save them in plain text. But as a general rule, it's better to save the encryption key (you only need to do that once) and export the credentials safely encrypted.