by Milorad Filipović
How it works
It’s very important to come prepared to sales calls, which often means a lot of manual research about the people you’re meeting with. This workflow delivers a summary of the latest social media activity (LinkedIn + X) for the businesses you are about to interact with each day.
**Scans Your Calendar**: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies, identified from each attendee's email address.
**Fetches Latest Posts**: For each identified company, it fetches recent LinkedIn and X posts and summarizes them using AI to deliver a quick overview for a busy sales rep.
**Delivers Insights**: You receive personalized emails via Gmail, each dedicated to a company you're meeting with that day, containing a reminder of the meeting and a summary of the company's recent social media activity.
Setup steps
The workflow requires you to have the following accounts set up in their respective nodes:
Google Calendar
Gmail
Clearbit
OpenAI
Besides those, you will need an account on the RapidAPI platform and a subscription to the following APIs:
Fresh LinkedIn Profile Data
Twitter
Email example
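A minimal sketch of the calendar-scanning step as an n8n Code node: it pulls the company domain out of each attendee's email address and drops common personal providers. The event field names and the free-email list are assumptions to adapt to your calendar output.

```javascript
// n8n Code node sketch: derive company domains from today's meeting attendees.
// Assumes each incoming item is a Google Calendar event with an `attendees` array;
// the free-email list below is illustrative, not exhaustive.
const FREE_EMAIL_DOMAINS = new Set(['gmail.com', 'yahoo.com', 'hotmail.com', 'outlook.com']);

const results = [];
for (const item of $input.all()) {
  const event = item.json;
  const domains = new Set();
  for (const attendee of event.attendees || []) {
    const email = (attendee.email || '').toLowerCase();
    const domain = email.split('@')[1];
    if (domain && !FREE_EMAIL_DOMAINS.has(domain)) {
      domains.add(domain);
    }
  }
  for (const domain of domains) {
    results.push({ json: { meeting: event.summary, start: event.start, domain } });
  }
}
return results;
```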
by Niklas Hatje
Use case
When collecting new leads via a form, you need to follow up on new submissions. Often this requires a lot of manual work: reviewing each submission, checking whether it meets your criteria, and then reaching out. With this workflow you can do all of that fully automatically and save a lot of your valuable time.
What this workflow does
This workflow runs every time you receive a new submission from an n8n form. It filters out typical personal emails (such as Gmail, Hotmail, Yahoo, etc.) before enriching the submission via Clearbit. It then checks whether the submitter's company is a B2B company with more than 499 employees. If it is, it sends an email to the user via Gmail.
Setup
Add the Clearbit and Gmail credentials
Click on Test Workflow
Enter your own email (which needs to be a business email to work) in the form
Check your email
Once you're happy, don't forget to activate this workflow
How to adjust this template
Replace the form trigger with your form provider of choice (e.g. Typeform, SurveyMonkey, Google Forms, etc.)
Adjust the criteria to your needs via the If node
Adjust the email you're sending in the Gmail node
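As an illustration of the qualification check, here is a hedged Code-node equivalent of the If node. The Clearbit field names (tags, metrics.employees) are assumptions; map them to whatever the enrichment step actually returns.

```javascript
// Code-node equivalent of the If node (sketch). The Clearbit field names below
// (tags, metrics.employees) are assumptions; adjust to the enrichment output you see.
const company = $json.company || $json;
const employees = company.metrics?.employees ?? 0;
const tags = company.tags || [];

const qualifies = tags.includes('B2B') && employees > 499;

return [{ json: { ...$json, qualifies } }];
```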
by Milorad Filipovic
How it works?
This workflow sends you a monthly list of live music events for a specific location. Events are fetched from songkick.com and delivered by email to the provided email address(es). Each event in the list has a link to its full SongKick page where you can see more details about the event and buy tickets for it.
Example email:
How to set it up?
The first thing this workflow needs is a location link for your desired city from the SongKick website. You can get it by following these steps:
Visit songkick.com and enter the city name in the search box:
From the results page, click the result that contains the location info. These will have the Location tag on top of the location name:
Once on the location page, copy the URL.
Back on the n8n workflow page, paste the URL into the location parameter of the node called Setup location and email:
The second thing it needs is the email address which will receive the monthly list. Simply enter it in the email field of the Setup location and email node. If you want to set multiple recipients, separate the email addresses with commas:
Required accounts
This workflow requires you to have a properly set up Gmail account that will be used by the Gmail node to send emails. You can read more about how to create credentials for a Gmail node in the n8n documentation here.
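For reference, a rough Code-node sketch of how the email body and recipient list could be assembled. The event fields (name, date, url) and the email field on the Setup location and email node are assumptions based on the description above.

```javascript
// n8n Code node sketch: assemble the monthly email body and recipient list.
// Assumes each incoming item has { name, date, url } for one event, and that the
// Setup location and email node exposes the comma-separated addresses as `email`.
const recipients = $('Setup location and email')
  .first()
  .json.email.split(',')
  .map((e) => e.trim())
  .filter(Boolean);

const rows = $input.all().map(({ json }) =>
  `<li><a href="${json.url}">${json.name}</a> (${json.date})</li>`
);

const html = `<h3>Live music events this month</h3><ul>${rows.join('')}</ul>`;

return [{ json: { to: recipients.join(','), html } }];
```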
by TreyDong
How it works
• Automatically detects when new pages are created in your Notion workspace
• Uses AI to generate contextually relevant icons based on page titles for perfect visual representation
• Fetches random high-quality cover images from Unsplash to add visual appeal to each page
• Seamlessly integrates with your existing Notion workflow without manual intervention
Set up steps
• Connect your Notion workspace using API credentials - takes about 5 minutes to configure
• Set up AI service integration for intelligent icon generation based on page titles
• Configure Unsplash API access for random cover image fetching
• Configure webhook triggers to monitor new page creation events
• Test the workflow with a sample page to ensure proper functionality
• Keep detailed setup instructions and troubleshooting tips in the workflow notes for future reference
This template helps streamline your Notion workspace by automatically beautifying new pages with AI-generated icons and stunning Unsplash covers, saving you time while maintaining a visually appealing and professional appearance across your knowledge base.
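A small sketch of what the page-beautifying step might prepare, assuming the standard Notion pages endpoint accepts an emoji icon and an external cover URL; the pageId, icon, and coverUrl fields are placeholders coming from earlier nodes.

```javascript
// n8n Code node sketch: build the body for a Notion "update page" call that sets the
// AI-generated icon and the Unsplash cover. `pageId`, `icon`, and `coverUrl` are
// assumed to come from earlier nodes; send the body via an HTTP Request node to
// PATCH https://api.notion.com/v1/pages/{pageId}.
const { pageId, icon, coverUrl } = $json;

return [{
  json: {
    pageId,
    body: {
      icon: { type: 'emoji', emoji: icon },
      cover: { type: 'external', external: { url: coverUrl } },
    },
  },
}];
```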
by Milorad Filipović
How it works
It’s very important to come prepared to sales calls, which often means a lot of manual research about the people you’re meeting with. This workflow delivers the latest social media activity (LinkedIn + X) for the businesses you are about to interact with each day.
**Scans Your Calendar**: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies, identified from each attendee's email address.
**Fetches Latest Posts**: For each identified company, it fetches recent LinkedIn and X posts.
**Delivers Insights**: You receive personalized emails via Gmail, each dedicated to a company you're meeting with that day, containing a reminder of the meeting, a list of posts categorized by social media platform, and direct links to the posts.
Setup steps
The workflow requires you to have the following accounts set up in their respective nodes:
Google Calendar
Gmail
Clearbit
Besides those, you will need an account on the RapidAPI platform and a subscription to the following APIs:
Fresh LinkedIn Profile Data
Twitter
Email example
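A possible shape for the step that groups fetched posts by platform before building each company's email; the item fields (company, platform, text, url) are assumptions, not the template's actual schema.

```javascript
// n8n Code node sketch: group fetched posts by platform before building each
// company's email. The item shape { company, platform, text, url } is an assumption.
const byCompany = {};
for (const { json } of $input.all()) {
  const key = json.company;
  byCompany[key] = byCompany[key] || { LinkedIn: [], X: [] };
  (byCompany[key][json.platform] || []).push(`<li><a href="${json.url}">${json.text}</a></li>`);
}

return Object.entries(byCompany).map(([company, posts]) => ({
  json: {
    company,
    html:
      `<h3>${company}</h3>` +
      `<h4>LinkedIn</h4><ul>${posts.LinkedIn.join('')}</ul>` +
      `<h4>X</h4><ul>${posts.X.join('')}</ul>`,
  },
}));
```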
by Yaron Been
Automated weekly report that summarizes technology stack changes, trends, and insights from your tracked companies.
🚀 What It Does
Compiles weekly technology updates
Highlights significant changes
Identifies emerging trends
Provides actionable insights
Delivers scheduled reports
🎯 Perfect For
CTOs and technical leaders
Sales and marketing teams
Business intelligence
Technology consultants
Market researchers
⚙️ Key Benefits
✅ Weekly digest of changes
✅ Trend analysis
✅ Competitive intelligence
✅ Time-saving automation
✅ Data-driven decisions
🔧 What You Need
BuiltWith API access
n8n instance
Email service (for delivery)
Google Sheets (for data storage)
📊 Report Includes
New technology adoptions
Technology removals
Industry trends
Competitive analysis
Custom metrics
🛠️ Setup & Support
Quick Setup: Get your first report in 15 minutes with our step-by-step guide
📺 Watch Tutorial
💼 Get Expert Support
📧 Direct Help
Stay ahead of technology trends with a comprehensive weekly digest of your industry's technology landscape.
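One way the weekly change detection could look in an n8n Code node, assuming each item carries the current and previous technology lists pulled from Google Sheets; adjust the field names to your own sheet layout.

```javascript
// n8n Code node sketch: diff this week's technology list against last week's snapshot
// to surface adoptions and removals. Assumes each item carries { company, current, previous }
// where `current` and `previous` are arrays of technology names (e.g. read from Google Sheets).
return $input.all().map(({ json }) => {
  const current = new Set(json.current || []);
  const previous = new Set(json.previous || []);

  const added = [...current].filter((t) => !previous.has(t));
  const removed = [...previous].filter((t) => !current.has(t));

  return { json: { company: json.company, added, removed } };
});
```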
by Marcelo Abreu
What this workflow does
Runs automatically every Monday morning at 8 AM
Collects your Meta Ads data from the last 7 days for a given account (date range is configurable)
Formats the data, aggregating it at the campaign, ad set, and ad levels
Generates AI-driven analysis and insights on your results, providing actionable recommendations
Renders the report as a visually appealing PDF with charts and tables
Sends the report via Slack (you can also add email or WhatsApp)
A sample of the first page of the report:
Setup Guide
Create an account on pdforge and use the pre-made Meta Ads template.
Connect Meta Ads, OpenAI and Slack to n8n
Set your Ad Account Id and date range (choose from 'last_7d', 'last_14d', 'last_30d')
(optional) Customize the scheduling date and time
Requirements
Meta Ads (via Facebook Graph API): Documentation
pdforge access: Integration guide
AI API access (e.g. via OpenAI, Anthropic, Google or Ollama)
Slack access (via OAuth2): Documentation
Feel free to contact me via LinkedIn if you have any questions! 👋🏻
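A simplified sketch of the aggregation step at campaign level, assuming Graph API insights rows with campaign_name, spend, impressions, and clicks; the real template also aggregates at ad set and ad level.

```javascript
// n8n Code node sketch: aggregate ad-level insights up to the campaign level.
// Assumes each incoming item is a Graph API insights row with
// { campaign_name, spend, impressions, clicks } (numeric values arrive as strings).
const campaigns = {};
for (const { json } of $input.all()) {
  const key = json.campaign_name;
  campaigns[key] = campaigns[key] || { campaign: key, spend: 0, impressions: 0, clicks: 0 };
  campaigns[key].spend += Number(json.spend || 0);
  campaigns[key].impressions += Number(json.impressions || 0);
  campaigns[key].clicks += Number(json.clicks || 0);
}

return Object.values(campaigns).map((c) => ({
  json: { ...c, ctr: c.impressions ? (c.clicks / c.impressions) * 100 : 0 },
}));
```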
by Jimleuk
This n8n workflow demonstrates how to automate the indexing of images to build an object-based image search. By utilising a Detr-Resnet-50 object classification model, we can identify objects within an image and store these associations in Elasticsearch along with a reference to the image.
How it works
An image is imported into the workflow via the HTTP Request node.
The image is then sent to Cloudflare's Workers AI API, where the service runs the image through the Detr-Resnet-50 object classification model.
The API returns the object associations with their positions in the image, labels and confidence scores of the classification. Detections with a confidence score of less than 0.9 are discarded for brevity.
The image's URL and its associations are then indexed in an Elasticsearch server, ready for searching.
Requirements
A Cloudflare account with Workers AI enabled to access the object classification model.
An Elasticsearch instance to store the image URL and related associations.
Extending this workflow
Further enrich your indexed data with additional attributes or metrics relevant to your users.
Use a vectorstore to provide similarity search over the images.
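A rough Code-node sketch of the indexing payload, assuming the Workers AI response arrives as an array of { label, score, box } detections and the image URL is carried along as imageUrl.

```javascript
// n8n Code node sketch: turn the Workers AI detections into an Elasticsearch document.
// Assumes the classification response is an array of { label, score, box } objects
// under `result`, and that the image URL travels with the item as `imageUrl`.
const detections = $json.result || [];

const associations = detections
  .filter((d) => d.score >= 0.9)
  .map((d) => ({ label: d.label, score: d.score, box: d.box }));

return [{
  json: {
    imageUrl: $json.imageUrl,
    objects: associations.map((a) => a.label),
    associations,
    indexedAt: new Date().toISOString(),
  },
}];
```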
by Jean-Marie Rizkallah
🧩 Jamf Patch Summary to Slack
Stay on top of software patch compliance by automatically posting Jamf patch summaries to Slack. This helps IT and security teams quickly identify outdated installs and take action—without logging into Jamf.
✅ Prerequisites
• A Jamf Pro API key with permissions to read software titles and patch summary
• A Slack app or incoming webhook URL with permission to post messages to your desired channel
🔍 How it works
• Manually trigger the flow or add a webhook trigger
• Fetch a list of software titles from Jamf Pro
• Filter to select the software you're tracking (e.g. Chrome, Edge)
• Retrieve the patch summary for that software (latest version, up-to-date and out-of-date counts)
• Format the summary into Slack Block Kit
• Post the formatted summary into a Slack channel
⚙️ Set up steps
• Takes ~5–10 minutes to configure
• Set your server BaseURL variable in the Set node
• Add your Jamf Pro API credentials in the HTTP Request nodes (Get & Retrieve)
• Set the target software ID in the Filter node
• Add your Slack webhook URL or token in the final HTTP node
• Optional: Adjust Slack formatting inside the Function node
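As a starting point, the Function node could build the Slack Block Kit payload roughly like this; the input field names are assumptions to map onto the actual Jamf patch summary response.

```javascript
// Function node sketch: format the patch summary as Slack Block Kit.
// The input field names (title, latestVersion, upToDate, outOfDate) are assumptions;
// map them to the fields returned by your Jamf Pro patch summary call.
const { title, latestVersion, upToDate, outOfDate } = $json;

const blocks = [
  { type: 'header', text: { type: 'plain_text', text: `Patch summary: ${title}` } },
  {
    type: 'section',
    fields: [
      { type: 'mrkdwn', text: `*Latest version:*\n${latestVersion}` },
      { type: 'mrkdwn', text: `*Up to date:*\n${upToDate}` },
      { type: 'mrkdwn', text: `*Out of date:*\n${outOfDate}` },
    ],
  },
];

return [{ json: { blocks } }];
```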
by Arthur Braghetto
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.
Clean Web Content Extraction with Anti-Bot Fallback
Extract clean and structured text from any webpage, with an optional fallback to an anti-bot scraping service. Ideal for AI tools and content workflows.
🧠 How it Works
This sub-workflow enables reliable and clean scraping of any public webpage by simply passing a url parameter. It is designed to be embedded into other workflows or used as a tool for AI agents. It supports two output modes:
fulltext: true returns { title, text } with the full page content
fulltext: false returns { title, url, content } with a short excerpt
💡 If the site is protected by anti-bot systems (like Cloudflare), it will automatically fall back to Scrape.do, a scraping API with a generous free plan.
🧩 This template requires the n8n-nodes-webpage-content-extractor community node, so it only works in self-hosted n8n environments.
🚀 Use Cases
As a reusable sub-workflow, via the Execute Sub-workflow node.
As a tool for an AI Agent, compatible with the Call n8n Workflow Tool.
Perfect for chatbots, summarization workflows, or RSS/feed enrichment.
Empowers your AI Agent with the ability to browse and extract readable content from websites automatically.
🔖 Parameters
url (string): the webpage URL to scrape
fulltext (boolean): set true for full page content, false for summarized output
⚙️ Setup
Install the community node n8n-nodes-webpage-content-extractor in your self-hosted n8n instance.
Create a free account at Scrape.do and obtain your API token.
In the workflow, locate the Scrape.do HTTP Request node and configure the credentials using your API token.
Detailed step-by-step instructions are available in the workflow notes. The Scrape.do API is only used as a fallback when conventional scraping fails, helping you preserve your API credits.
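A hedged sketch of how the two output modes could be shaped once the content extractor has run; the node name used for reading the url and fulltext inputs is illustrative, not the template's actual node.

```javascript
// n8n Code node sketch: shape the extractor's output into the two documented modes.
// Assumes the previous node yields { title, text }, and that `url` and `fulltext`
// arrive as workflow inputs (the node name below is illustrative).
const { title, text } = $json;
const { url, fulltext } = $('Workflow Input').first().json;

if (fulltext) {
  return [{ json: { title, text } }];
}

const excerpt = (text || '').replace(/\s+/g, ' ').slice(0, 300);
return [{ json: { title, url, content: excerpt } }];
```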
by scrapeless official
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.
Prerequisites
An n8n account (free trial available)
A Scrapeless account and API key
A Google account to access Google Sheets
🛠️ Step-by-Step Setup
1. Create a New Workflow in n8n
Start by creating a new workflow in n8n. Add a Manual Trigger node to begin.
2. Add the Scrapeless Node
Add the Scrapeless node and choose the Scrape operation
Paste in your API key
Set your target website URL
Execute the node to fetch data and verify the results
3. Clean Up the Data
Add a Code node to clean and format the scraped data, as sketched below. Focus on extracting key fields like:
Title
Description
URL
4. Set Up Google Sheets
Create a new spreadsheet in Google Sheets
Name the sheet (e.g., Business Leads)
Add columns like Title, Description, and URL
5. Connect Google Sheets in n8n
Add the Google Sheets node
Choose the operation Append or update row
Select the spreadsheet and worksheet
Manually map each column to the cleaned data fields
6. Run and Test the Workflow
Click "Execute Workflow" in n8n
Check your Google Sheet to confirm the data is properly inserted
Results
With this automated workflow, you can continuously extract business lead data, clean it, and push it directly into a spreadsheet — perfect for outbound sales, lead lists, or internal analytics.
How to Use
⚙️ Open the Variables node and plug in your Scrapeless credentials.
📄 Confirm the Google Sheets node points to your desired spreadsheet.
▶️ Run the workflow manually from the Start node.
Perfect For:
Sales teams doing outbound prospecting
Marketers building lead lists
Agencies running data aggregation tasks
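A minimal sketch of the clean-up Code node from step 3; the Scrapeless field names are assumptions, so inspect the node's real output before mapping columns.

```javascript
// n8n Code node sketch for step 3 (clean up the data). The field names on the
// Scrapeless response (title, description, link) are assumptions; inspect the
// node's actual output and adjust the mapping before wiring up Google Sheets.
return $input.all().map(({ json }) => ({
  json: {
    Title: (json.title || '').trim(),
    Description: (json.description || '').replace(/\s+/g, ' ').trim(),
    URL: json.link || json.url || '',
  },
}));
```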
by Mark Shcherbakov
Video Guide
I prepared a detailed guide that shows the whole process of building a resume analyzer.
Who is this for?
This workflow is ideal for developers, data analysts, and business owners who want to enable conversational interactions with their database. It’s particularly useful for cases where users need to extract, analyze, or aggregate data without writing SQL queries manually.
What problem does this workflow solve?
Accessing and analyzing database data often requires SQL expertise or dedicated reports, which can be time-consuming. This workflow empowers users to interact with a database conversationally through an AI-powered agent. It dynamically generates SQL queries based on user requests, streamlining data retrieval and analysis.
What this workflow does
This workflow integrates OpenAI with a Supabase database, enabling users to interact with their data via an AI agent. The agent can:
Retrieve records from the database.
Extract and analyze JSON data stored in tables.
Provide summaries, aggregations, or specific data points based on user queries.
Dynamic SQL Querying: The agent uses user prompts to create and execute SQL queries on the database.
Understand JSON Structure: The workflow identifies the JSON schema from sample records, enabling the agent to parse and analyze JSON fields effectively.
Database Schema Exploration: It provides the agent with tools to retrieve table structures, column details, and relationships for precise query generation (a sketch of such a query follows below).
Setup
Preparation
Create Accounts:
N8N: for workflow automation.
Supabase: for database hosting and management.
OpenAI: for building the conversational AI agent.
Configure Database Connection:
Set up a PostgreSQL database in Supabase.
Use the appropriate credentials (username, password, host, and database name) in your workflow.
N8N Workflow
AI agent with tools:
Code Tool: execute SQL queries based on user input.
Database Schema Tool: retrieve a list of all tables in the database. Use a predefined SQL query to fetch table definitions, including column names, types, and references.
Table Definition: retrieve a list of columns with types for one table.
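For the Database Schema Tool, a hedged sketch of the kind of query it could run, wrapped in an n8n Code node and using only standard information_schema views; pass the query on to your Postgres/Supabase node.

```javascript
// Sketch of the schema-exploration step: a Code node that emits the SQL used to
// fetch table definitions from Postgres/Supabase. The query relies only on
// standard information_schema views; execute it with your Postgres/Supabase node.
const tableDefinitionsQuery = `
  SELECT
    c.table_name,
    c.column_name,
    c.data_type,
    tc.constraint_type
  FROM information_schema.columns c
  LEFT JOIN information_schema.key_column_usage kcu
    ON kcu.table_name = c.table_name AND kcu.column_name = c.column_name
  LEFT JOIN information_schema.table_constraints tc
    ON tc.constraint_name = kcu.constraint_name
  WHERE c.table_schema = 'public'
  ORDER BY c.table_name, c.ordinal_position;
`;

return [{ json: { query: tableDefinitionsQuery } }];
```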