by James Francis
Overview

Slack quietly released an update to their API that allows developers to build "AI Apps & Agents", a special classification of apps with access to several special capabilities, including:

- Multiple simultaneous chat threads with one user
- A loading "three dots" UI while your agent is thinking
- The option for users to pin your app to their top bar for quick chat access

This workflow demonstrates how to build a Slack agent that takes advantage of all of these features. For a full video walkthrough of this workflow, watch this YouTube tutorial.

Setup Instructions

All of the steps below are required for this workflow to function properly unless otherwise noted.

Create a Slack App

1. Visit api.slack.com and click "Your Apps".
2. Create a new app from scratch and follow the setup instructions.
3. In the Agents & AI Apps tab, enable the toggle and give your app a brief description.
4. In the OAuth & Permissions tab, enable the following bot token scopes: assistant:write, chat:write, channels:read, im:history.
5. Install the app into your workspace and grant the requested permissions.
6. In your Slack workspace, right-click your app's name in the sidebar, click "View app details", and make note of your app's Channel ID - you'll need this later.
7. Copy your app's Bot User OAuth Token - you'll need it to create your n8n credentials.
8. In the Event Subscriptions tab, enable events and paste the workflow's PRODUCTION webhook URL (from this workflow's trigger node) into the input. In the same tab, under "Subscribe to bot events", select message.im.

Create a Postgres database

To save the chat history and give your agent a working memory, you'll need your own Postgres database. You can use Supabase, Neon, or any other Postgres database provider. Once you've added your database's credentials to n8n, select them in the Postgres Chat Memory node. This workflow saves all chat history in a table called chat_histories, but you can name the table whatever you want.

Create n8n Credentials

You'll need to create the following credentials:

- Slack API: use your Bot User OAuth Token referenced above.
- Bearer Auth: use the same Bot User OAuth Token.
- Postgres: use the connection string or config from your database provider.
- OpenRouter (or any other LLM provider for the agent's model node).

Wire Everything Up

Now that you've created your Slack app, set up your Postgres database, and created your credentials, follow these steps to wire up the workflow:

1. In the "On Message Received" trigger, use your Slack API credential and enter your app's Channel ID in the "Channel To Watch" field.
2. In the "Set Thinking Status" node, use your Bearer Auth credential (a sketch of the underlying Slack API call appears at the end of this entry).
3. In the "Postgres Chat Memory" node, use your Postgres credential.
4. In the "Send Reply" node, use your Slack API credential.

Using the Chatbot

Once you've completed the setup process and added your credentials, you'll have a fully functional Slack chatbot complete with threads, loading UI, and the ability to pin your app to your workspace's top bar.

Taking the Next Steps

Now that this skeleton app is in place, it's up to you to add horsepower to the AI agent at the center of it all. Customize the prompts and add whatever tools you'd like. The sky is the limit! If you have any questions or feedback about this workflow, or would like me to build custom workflows for your business, email me at n8n@paperjam.agency.
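For reference, the "Set Thinking Status" node is an authenticated HTTP call to Slack's assistant.threads.setStatus method, which is why it uses the Bearer Auth credential. A minimal sketch with cURL, where the token, channel ID, thread timestamp, and status text are placeholders to replace with your own values:

```bash
# Show the "three dots" thinking indicator in an AI app thread.
# The token, channel_id, thread_ts, and status values are placeholders.
curl -X POST "https://slack.com/api/assistant.threads.setStatus" \
  -H "Authorization: Bearer xoxb-your-bot-token" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d '{
    "channel_id": "C0123456789",
    "thread_ts": "1712345678.000100",
    "status": "is thinking..."
  }'
```

Passing an empty string for status should clear the indicator; in this workflow the agent's reply takes care of that for you.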
by Santhej Kallada
Who is this for?
Marketers, lead generation agencies, freelancers, consultants, and sales teams who need to collect business leads from Google Maps. Small business owners looking to build targeted local business lists. Anyone interested in automating web scraping without coding skills.

What problem is this workflow solving?
Manually scraping business data from Google Maps is time-consuming and repetitive. This automation simplifies the process by:
- Collecting business details based on search terms and location.
- Filtering out irrelevant results.
- Delivering qualified leads directly to your inbox.

What this workflow does
This workflow automates Google Maps lead scraping using APIFY and sends the gathered leads via email. The steps include:
1. Collecting user input through a simple form (business type, location, recipient email).
2. Sending an HTTP request to APIFY to run a Google Maps scraper (actor) - a sample request appears at the end of this entry.
3. Filtering results to include only businesses with email addresses.
4. Converting results to CSV format.
5. Sending an automated email with the leads as a CSV attachment via Gmail.

Setup
1. Create an APIFY account: sign up at apify.com. You get $5 in free credits (~1,000 leads).
2. Get your API key: copy your API key from APIFY.
3. Prepare n8n: create a new workflow, add an HTTP Request node to interact with APIFY, and configure authentication with your API key.
4. Customize the form: build a simple form inside n8n to collect user inputs: Business Type, City, Country, Recipient Email.
5. Filter results: use IF and Filter nodes to remove entries without email addresses.
6. Convert to CSV: use a "Spreadsheet File" node to generate a CSV from the filtered leads.
7. Send email: use the Gmail node (or any email node) to send the CSV file to the provided recipient.

How to customize this workflow to your needs
- Change search parameters to target different business niches or locations.
- Add filters to only include businesses with websites.
- Customize the email subject and body.
- Integrate with CRM or marketing platforms for direct lead injection.
- Expand filtering logic for more refined targeting.

Notes
This template uses APIFY (a paid service after the free credits). You will need an APIFY API key and a Gmail account (or SMTP credentials) to run this automation. For self-hosted n8n users: ensure you have internet access and proper credentials set up for external HTTP requests.

Want a video tutorial on how to set up this automation? https://www.youtube.com/watch?v=Kz_Gfx7OH6o
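To illustrate the HTTP Request step, a synchronous Apify actor run that returns the scraped places as JSON could look like the sketch below. The actor ID (compass~crawler-google-places, Apify's Google Maps scraper) and the input field names are assumptions; check the actor's documentation and adapt them to the actor you use:

```bash
# Run Apify's Google Maps scraper and return the scraped places as JSON.
# Actor ID and input field names are assumptions; verify against the actor docs.
curl -X POST \
  "https://api.apify.com/v2/acts/compass~crawler-google-places/run-sync-get-dataset-items?token=$APIFY_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "searchStringsArray": ["dentist"],
    "locationQuery": "Austin, Texas, USA",
    "maxCrawledPlacesPerSearch": 50
  }'
```

In n8n, the same call is made by the HTTP Request node, with the form fields (business type, city, country) mapped into the JSON body.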
by Damian Karzon
This workflow checks a configured list of GitHub repositories daily to see if a new release has been published.

How it works:
- The workflow has a daily trigger.
- The RepoConfig node is a JSON array that defines the list of repositories to check releases for.
- For each of the configured repos it fetches the latest release.
- If the release was published within the last 24 hours, it is passed on.
- The release is sent as a Slack message showing the repo name, release name, and link.

Setup
- Update the JSON in the RepoConfig node to the GitHub repos you wish to get notifications for.
- Set up your Slack connection (or replace it with your choice of notification).
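The per-repo check boils down to a single GitHub API call. A minimal sketch, assuming a public repository and unauthenticated access (rate-limited; add a token header for heavier use), with the repo name as a placeholder:

```bash
# Fetch the latest release for a repo and pull out the fields the Slack
# message needs (name, publish date, link). The repo path is a placeholder.
curl -s "https://api.github.com/repos/n8n-io/n8n/releases/latest" \
  -H "Accept: application/vnd.github+json" \
  | jq '{name: .name, published_at: .published_at, url: .html_url}'
```

Comparing published_at against the current time minus 24 hours reproduces the "published within the last 24 hours" filter.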
by Jah coozi
AI Social Media Content Generator & Scheduler

Transform your social media strategy with AI-powered content generation that creates platform-specific posts in seconds!

🚀 What It Does
This workflow uses AI to generate optimized content for multiple social media platforms from a single topic input. Perfect for marketers, content creators, and businesses looking to maintain a consistent social media presence.

✨ Key Features
- **Multi-Platform Support**: LinkedIn, Twitter/X, Instagram, Facebook, TikTok
- **AI-Powered Generation**: Uses GPT-4 for creative, engaging content
- **Platform Optimization**: Respects character limits and best practices
- **Hashtag Generation**: Platform-specific hashtag strategies
- **Posting Time Suggestions**: Optimal times for each platform
- **Tone Customization**: Professional, casual, friendly, or custom
- **Multi-Language Support**: Generate content in any language
- **Engagement Predictions**: Estimate reach and engagement
- **Daily Automation**: Schedule automatic content generation
- **Bulk Processing**: Generate content for multiple topics at once

📊 Use Cases
- Marketing Teams: Streamline content creation across channels
- Small Businesses: Maintain a consistent social presence
- Content Agencies: Scale content production efficiently
- Personal Brands: Build thought leadership
- E-commerce: Product launches and promotions

🛠️ Setup Instructions
1. Add OpenAI credentials: get an API key from OpenAI and add it to your n8n credentials.
2. Configure the webhook (optional): set a custom path if needed and enable it for external integrations.
3. Customize settings: adjust tone and style, set platform preferences, and configure the posting schedule.
4. Test generation: use the example prompts and verify output quality.

💡 Example Inputs
- "New product launch - eco-friendly water bottle"
- "Company milestone - 10 years in business"
- "Industry insights - Future of AI in healthcare"
- "Team spotlight - Meet our new developer"
- "Seasonal campaign - Summer sale 50% off"

📈 Benefits
- **10x Faster**: Create content in seconds vs hours
- **Consistency**: Maintain brand voice across platforms
- **Engagement**: Platform-optimized for maximum reach
- **Scalability**: Generate unlimited content
- **Cost-Effective**: Reduce content creation costs by 80%

🔧 Customization Options
- Custom brand voice training
- Industry-specific content rules
- Competitor analysis integration
- A/B testing capabilities
- Analytics webhook integration
- Auto-posting to platforms
- Image generation add-on
- Translation services

🎯 Pro Tips
- Train the AI with your best-performing posts
- Use platform analytics to refine strategies
- Test different tones for audience engagement
- Schedule content during peak hours
- Monitor and iterate based on performance

Start creating engaging social media content today!

Categories: Marketing & Growth, Content Creation, Social Media, AI & Automation, Productivity
Difficulty: Beginner
Required Services: OpenAI API (or compatible LLM), n8n instance. Optional: social media APIs for auto-posting.
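Under the hood, the generation step amounts to one chat-completion request per topic. A minimal sketch with cURL, assuming the OpenAI Chat Completions API; the model name and prompt wording below are illustrative placeholders, not the template's exact prompt:

```bash
# Ask the model for platform-specific posts for a single topic.
# Model name and prompt text are illustrative placeholders.
curl -s "https://api.openai.com/v1/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "system", "content": "You write social media posts. Return JSON with one post per platform. Respect per-platform character limits and add hashtags."},
      {"role": "user", "content": "Topic: New product launch - eco-friendly water bottle. Platforms: LinkedIn, Twitter/X, Instagram. Tone: friendly."}
    ],
    "response_format": {"type": "json_object"}
  }'
```

Requesting a JSON object makes it straightforward for downstream n8n nodes to split the result into one item per platform.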
by Miquel Colomer
This n8n workflow template checks for new major releases (tagged with .0) of the n8n project using its official GitHub releases feed. It runs multiple times a day and sends notifications via email and Telegram if a new release is found.

> ⚠️ Note: You must *activate the workflow* to start receiving release notifications.

🚀 What It Does
- Monitors the n8n GitHub releases feed
- Detects major versions (e.g., 1.0.0, 2.0.0)
- Sends alert messages via Telegram and email (SES) when a release is published

⏰ Scheduling Details
The Cron node checks for new releases three times per day: 10:00, 14:00, and 18:00 server time.

🛠️ Step-by-Step Setup
1. Configure Telegram Bot: connect your Telegram bot and specify the chat ID where you want to receive notifications.
2. Set up AWS SES Credentials: use a verified sender email and set up AWS SES credentials in your n8n instance.
3. Activate the Workflow: enable the workflow in your instance to start receiving notifications.
4. Customize Notification Messages (optional): you can modify the email subject, Telegram format, or filter logic.

🧠 How It Works: Workflow Overview
1. Cron Trigger: runs the workflow at 10:00, 14:00, and 18:00 daily.
2. Read RSS Feed: pulls data from https://github.com/n8n-io/n8n/releases.atom.
3. Filter by Current Day: filters the feed to releases published in the last 4 hours whose titles start with n8n@ and end with .0.
4. Condition Check: uses a regex to check if the filter result contains any release data.
5. Notifications: if a new major release is found, sends a Telegram message to the specified chat and an email via AWS SES with the release info.

📨 Final Output
You'll receive a Telegram message and email when a new major n8n version is released.

🔐 Credentials Used
- **Telegram API** – for sending chat notifications
- **AWS SES** – to send email alerts

✨ Customization Tips
- **Change Notification Channels**: add Slack, Discord, or other preferred channels.
- **Adjust Cron Schedule**: modify the Cron node to fit your check frequency.
- **Modify Filters**: detect patch or beta versions by changing the .0 condition.
- **Send Release Notes**: extend the feed parsing to include release content.

❓ Questions?
Template created by Miquel Colomer and n8nhackers.com. Need help customizing or deploying? Contact us for consulting and support.
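To make the feed-and-filter step concrete, here is a rough shell equivalent of what the workflow does, assuming the Atom feed's entry titles look like n8n@1.2.0 (the grep pattern and the lack of date handling are simplifications of the regex filter described above):

```bash
# Pull the n8n releases feed and keep only titles ending in ".0"
# (what this template treats as major releases). Date filtering omitted.
curl -s "https://github.com/n8n-io/n8n/releases.atom" \
  | grep -oE '<title>n8n@[0-9]+\.[0-9]+\.0</title>' \
  | sed -E 's#</?title>##g'
```

The workflow itself also checks the publication timestamp so a release is only reported during the 4-hour window after it appears.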
by n8n Team
This workflow automatically adds a note about a GitHub PR to a Pipedrive contact if the GitHub user's email matches a Person in Pipedrive.

Prerequisites
- Pipedrive account and Pipedrive credentials
- GitHub account and GitHub credentials

How it works
1. The GitHub Trigger node activates the workflow when a GitHub user opens a PR.
2. The HTTP Request node gets the user's data and passes it on.
3. The Pipedrive node searches Pipedrive for the GitHub user's email address.
4. The IF node checks whether a person with that email exists in Pipedrive.
5. If such a person exists, the Pipedrive node creates a note on the person's profile.
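For context on the HTTP Request step: the PR webhook payload carries the author's username rather than an email, so the user is presumably looked up via the GitHub REST API, which returns an email only if the user has made it public. A minimal sketch with a placeholder username:

```bash
# Look up a GitHub user's public profile and extract the email field.
# "octocat" is a placeholder username; email is null unless publicly set.
curl -s "https://api.github.com/users/octocat" \
  -H "Accept: application/vnd.github+json" \
  | jq '{login: .login, email: .email}'
```

If the email is null, there is nothing to match against Pipedrive and the IF branch ends the run.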
by Artur
Streamline your accounting by automatically creating QuickBooks Online customers and sales receipts whenever a successful Stripe payment is processed. Ideal for businesses looking to reduce manual data entry and improve accounting efficiency.

How it works
1. Trigger: the workflow is triggered when a new successful payment intent event is received from Stripe.
2. Retrieve Customer Data: fetches the customer details from Stripe associated with the payment.
3. Check QuickBooks Customer: searches QuickBooks Online to see if the customer already exists, using their email address.
4. Create or Use Existing Customer: if the customer doesn't exist in QuickBooks, they are created; otherwise, the existing customer is used.
5. Generate Sales Receipt: a sales receipt is created in QuickBooks Online with payment details, including item descriptions, amounts, and currency.

Set up steps
1. Connect accounts: authenticate both your QuickBooks Online and Stripe accounts in n8n.
2. Webhook setup: configure the Stripe webhook to send payment_intent.succeeded events to this workflow.
3. Test the workflow: trigger a test payment in Stripe to validate the integration.
4. Customize details: adjust item descriptions or other fields in the QuickBooks sales receipt JSON body as needed (see the sketch at the end of this entry).

This workflow requires basic familiarity with n8n, but setup can be completed in under 15 minutes for most users.
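For reference when customizing the sales receipt body, creating a receipt through the QuickBooks Online API takes a JSON payload along these lines. The realm ID, customer and item references, and amounts below are placeholders; the n8n QuickBooks node builds a similar body for you:

```bash
# Create a minimal QuickBooks Online sales receipt.
# Realm ID, CustomerRef, ItemRef, and amount values are placeholders.
curl -X POST \
  "https://quickbooks.api.intuit.com/v3/company/REALM_ID/salesreceipt" \
  -H "Authorization: Bearer $QBO_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
    "CustomerRef": { "value": "123" },
    "CurrencyRef": { "value": "USD" },
    "Line": [{
      "DetailType": "SalesItemLineDetail",
      "Amount": 49.99,
      "Description": "Stripe payment (payment intent ID)",
      "SalesItemLineDetail": { "ItemRef": { "value": "1" } }
    }]
  }'
```

Mapping the Stripe payment description and amount into the Line entry is the part most users will want to adjust.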
by Yang
Who is this for?
This workflow is perfect for customer support teams, sales departments, or solopreneurs who receive frequent email enquiries and want to automate the initial response process using AI. If you spend too much time answering similar questions, this system helps you respond faster and more intelligently—without writing a single line of code.

What problem is this workflow solving?
Manually responding to repeated customer enquiries slows productivity and increases response delays. This workflow classifies whether an incoming email is a real enquiry, analyzes the content with a LangChain-powered agent, fetches helpful context using Dumpling AI, and sends a personalized reply using Gmail—all within minutes.

What this workflow does
1. Listens for new incoming Gmail messages using the Gmail Trigger node.
2. Classifies whether the email is an enquiry using a GPT-4o classification prompt.
3. Uses a Filter node to continue only if the email was classified as an enquiry.
4. Passes the email content to a LangChain Agent, enhanced with memory, AI tools, and Dumpling AI, to search for relevant information.
5. The agent constructs a smart, relevant response, then sends it to the original sender via Gmail.

Setup
1. Connect Gmail: use the Gmail Trigger node to connect the Gmail account that receives enquiries. Make sure the Gmail OAuth2 credentials are authenticated.
2. Configure the Dumpling AI agent: sign up at Dumpling AI, create an agent trained to search your help docs, site content, or FAQs, copy your Dumpling agent ID and API key, and paste them into the "Dumpling AI Agent – Search for Relevant Info" HTTP Request node.
3. Set up the LangChain Agent: no extra setup is needed beyond connecting OpenAI credentials. GPT-4o is used for classification and reply generation.
4. Enable the Gmail reply node: the final "Send Email Response via Gmail" node sends the AI-generated reply back to the same thread.

How to customize this workflow to your needs
- Change the classification prompt to include other email types like "support", "complaint", or "sales".
- Add additional logic if you want to CC someone or forward certain types of enquiries.
- Add a Notion or Google Sheets node to log the conversation for analytics.
- Replace Gmail with Outlook or another email provider by switching the nodes.
- Improve context by adding more AI tools like database queries or preloaded FAQs.
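To illustrate the classification step, a stand-alone version of a GPT-4o enquiry classifier could look like the sketch below. The prompt wording and example email are assumptions, not the template's exact prompt; inside n8n this call is made by the classification node with your OpenAI credentials:

```bash
# Classify an email as "enquiry" or "not_enquiry".
# Prompt text and the example email are illustrative placeholders.
curl -s "https://api.openai.com/v1/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "system", "content": "Reply with exactly one word: enquiry or not_enquiry."},
      {"role": "user", "content": "Subject: Pricing question. Body: Hi, do you offer annual plans?"}
    ]
  }'
```

A single-word answer keeps the downstream Filter node trivial: continue only when the reply equals "enquiry".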
by Yang
Who is this for?
This workflow is built for newsletter writers, marketers, content creators, or anyone who curates and summarizes web articles. It’s especially helpful for virtual assistants and founders who need to quickly turn web content into digestible, branded newsletters using AI.

What problem is this workflow solving?
Manually reading, summarizing, and formatting multiple articles into a newsletter takes time and focus. This workflow automates the process using Dumpling AI for crawling, GPT-4o for summarization, and Gmail for delivery—so you can go from raw URLs to a polished email in minutes.

What this workflow does
1. Starts manually (can also be scheduled)
2. Reads a list of article URLs from Google Sheets
3. Sends the URLs to Dumpling AI to crawl and extract content
4. Splits each article into a single item for processing
5. Uses a Code node to clean and structure the article data
6. Uses an Edit Fields node to merge the articles into one JSON block
7. GPT-4o summarizes the articles and generates HTML content for the newsletter
8. Sends the formatted newsletter via Gmail

Setup
- Google Sheets: create a sheet with a column (A) for article URLs, update the "Read URLs from Google Sheet" node to use your Sheet ID and tab name, and connect your Google account in the credentials.
- Dumpling AI: sign up at https://app.dumplingai.com, create an agent for web crawling under /crawl, and add your Dumpling API key in the HTTP headers of the "Crawl Content with Dumpling AI" node.
- Split node: breaks apart the array of articles from Dumpling AI so each article is processed individually.
- Code node: structures each article as JSON with title, url, and cleaned text content.
- Edit Fields node: gathers all structured articles back into a single JSON array to prepare for AI summarization.
- OpenAI (GPT-4o): processes the article list and returns a formatted subject line and HTML newsletter content.
- Gmail: connect your Gmail account to send the AI-generated newsletter to your inbox or team, and update the recipient field in the "Send HTML Email via Gmail" node.

How to customize this workflow to your needs
- Replace the manual trigger with a Schedule node to send newsletters weekly
- Modify the GPT-4o prompt to change the tone (e.g., more professional, funny, casual)
- Add filtering logic to skip low-value articles
- Connect Slack, Airtable, or Notion for internal team usage
- Change Gmail to SendGrid or Outlook if preferred

Final Notes
This workflow uses:
- **Dumpling AI** /crawl endpoint to extract article content
- **Split**, **Code**, and **Edit Fields** nodes to format multi-article input
- **GPT-4o** for summarization and HTML formatting
- **Gmail** for delivery

This setup eliminates manual steps and delivers fast, consistent newsletters powered by AI.
by AlQaisi
Template Information

Who is this template for?
This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data.

🎥 quick set up video

How it works
The template utilizes a series of nodes to fetch email information from LinkedIn profiles. It starts with a Schedule Trigger node that sets the interval for the workflow. The Conditional Check node verifies that fields like Name, Gender, Job Title, Summary, and LinkedIn URL are not empty. The HTTP Request node sends a POST request to the specified URL with the API key and profile information. The Data Merge node merges the data collected. The Field Editing node modifies the fields as needed. Finally, the Google Sheets Update node updates the Google Sheets document with the gathered information.

Set Up Instructions
1. Make sure to have the necessary credentials and permissions for accessing LinkedIn and Google Sheets.
2. Set up the API key required for the HTTP Request node.
3. Configure the Google Sheets Update node with the appropriate document ID and sheet name.
4. Check and adjust field mappings in the Field Editing node according to your needs.
5. Run the workflow and monitor the updates in your Google Sheets document.

Overview
The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here.

Copy this template to get started: Google Sheets

Using the Prospeo.io LinkedIn Email Finder API with cURL
To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, follow these steps:

1. Use the cURL command with the following parameters:

```bash
curl -X POST \
  -H "Content-Type: application/json" \
  -H "X-KEY: your_api_key" \
  -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
  "https://api.prospeo.io/linkedin-email-finder"
```

2. Replace "your_api_key" with your actual API key.
3. Update the "url" field in the JSON data with the LinkedIn profile URL for which you want to find the email address.

To get access to this API and obtain your API key, sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once you have subscribed, you will receive an API key that you can use to authenticate your requests to the API endpoint.

Description
- **Schedule Trigger**: triggers the workflow based on a defined schedule interval, in this case based on minutes. (Schedule Trigger Node Documentation)
- **Google Sheets Read**: reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. (Google Sheets Node Documentation)
- **Conditional Check**: checks multiple conditions based on the input data and performs actions accordingly. (Conditional Node Documentation)
- **HTTP Request**: sends an HTTP POST request to a specified URL with headers and body parameters. (HTTP Request Node Documentation)
- **No Operation, do nothing**: placeholder node that does not perform any operation.
- **Data Merge**: merges data based on specified mode and combination settings. (Merge Node Documentation)
- **Field Editing**: edits fields by setting specific values for each field based on input data. (Set Node Documentation)
- **Google Sheets Update**: updates data in a Google Sheets document and sheet based on specified columns and values. (Google Sheets Node Documentation)
by Aditya Sharma
Description
This intelligent n8n automation streamlines the process of collecting, extracting, and scoring resumes sent to a Gmail inbox—making it an ideal solution for recruiters who regularly receive hundreds of applications. The workflow scans incoming emails with attachments, extracts relevant candidate information from resumes using AI, evaluates each candidate based on customizable criteria, and logs their scores alongside contact details in a connected Google Sheet.

Who Is This For?
- **Recruiters & Hiring Managers**: Automate the resume screening process and save hours of manual work.
- **HR Teams at Startups & SMBs**: Quickly evaluate talent without needing large HR ops infrastructure.
- **Agencies & Talent Acquisition Firms**: Screen large volumes of resumes efficiently and with consistent criteria.
- **Solo Founders Hiring for Roles**: Use AI to help score and shortlist top candidates from email applications.

What Problem Does This Workflow Solve?
Manually reviewing resumes is time-consuming, error-prone, and inconsistent. This workflow solves these challenges by:
- Automatically detecting and extracting resumes from Gmail attachments.
- Using OpenAI to intelligently extract candidate info from unstructured PDFs.
- Scoring resumes using customizable evaluation criteria (e.g., relevant experience, skills, education).
- Logging all candidate data (Name, Email, LinkedIn, Score) in a centralized, filterable Google Sheet.
- Enabling faster, fairer, and more efficient candidate screening.

How It Works
1. Gmail Trigger: runs on a scheduled interval (e.g., every 6 or 24 hours) and scans a connected Gmail inbox (using OAuth credentials) for unread emails that contain PDF attachments.
2. Extract Attachments: downloads the attached resumes from matching emails.
3. Parse Resume Text: sends the PDF file to OpenAI's API (via GPT-4 or GPT-3.5 with file support, or via base64 plus a PDF-to-text tool) and prompts GPT with a structured format to extract fields like Name, Email, LinkedIn, Skills, and Education.
4. Score Resume: evaluates the resume using predefined scoring logic, either with AI or with logic inside the workflow (e.g., "has X skill = +10 points").
5. Log to Google Sheets: appends a new row in a connected Google Sheet with the candidate name, email address, LinkedIn URL, and resume score.

Setup
Accounts & API Keys: you'll need accounts and credentials for:
- **n8n** (hosted or self-hosted)
- **Google Cloud Platform** (for the Gmail, Drive, and Sheets APIs)
- **OpenAI** (for GPT model access)

Google Sheet: make a Google Sheet and connect it via the Google Sheets node in n8n. Columns should include: Name, Email, LinkedIn, Score.

Configuration
1. Google Cloud: enable the Gmail API and Google Sheets API, set up OAuth 2.0 credentials in the Google Console, and connect the n8n Gmail, Drive, and Sheets nodes to these credentials.
2. OpenAI: generate an API key and use the HTTP Request node or the official OpenAI node to send prompt requests (see the sketch at the end of this entry).
3. n8n Workflow: add the Gmail Trigger, add extraction logic (e.g., filter PDFs), add the OpenAI prompt for resume parsing and scoring, and connect the structured output to a Google Sheets node.

Requirements
- Accounts: **n8n**, **Google** (Gmail, Sheets, Drive, Cloud Console), **OpenAI**
- API Keys & Credentials: OpenAI API key, Google Cloud OAuth credentials, Gmail access scopes (for reading attachments), a configured Google Sheet
- Usage costs: OpenAI usage (after the free tier), Google Cloud API usage (if exceeding the free quota)
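As a rough illustration of the resume parsing and scoring step, a direct OpenAI API call might look like the sketch below. The prompt wording, model name, scoring scale, and pasted resume text are placeholders; the template's actual prompt and criteria will differ:

```bash
# Extract structured candidate fields and a score from resume text.
# Prompt, model, and the pasted resume text are illustrative placeholders.
curl -s "https://api.openai.com/v1/chat/completions" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o",
    "messages": [
      {"role": "system", "content": "Extract name, email, linkedin, and a 0-100 score for fit with the given role. Return JSON only."},
      {"role": "user", "content": "Role: backend developer. Resume text: Jane Doe, jane@example.com, 5 years of Python and PostgreSQL experience."}
    ],
    "response_format": {"type": "json_object"}
  }'
```

Keeping the output as a flat JSON object makes it easy to map each field straight into the Name, Email, LinkedIn, and Score columns of the Google Sheet.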
by Manuel
Who is this template for?
This workflow template is designed for everyone with a Gmail address who wants to forward all Netflix emails, including temporary login codes, to friends and family effortlessly.

How it works
- Scans your Gmail inbox every minute for new emails from Netflix
- Forwards all Netflix emails to all desired email addresses via the email provider Mailjet

Setup Steps
1. Connect your Google Mail account to n8n following the official n8n instructions.
2. Add all recipients you want to the recipients array in the "Set all recipients" node.
3. Create and connect your Mailjet account to n8n following the official n8n instructions.

Note: You cannot use a Gmail email address as the sender address, as Mailjet does not support this. I recommend using your own email address from a custom domain; this works perfectly.
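For reference, n8n's Mailjet node wraps Mailjet's Send API. A minimal sketch of the equivalent v3.1 request, assuming a verified custom-domain sender; the addresses, subject, and API key pair are placeholders:

```bash
# Forward an email body through Mailjet's Send API (v3.1).
# Sender, recipient, subject, and API key/secret values are placeholders.
curl -s -X POST "https://api.mailjet.com/v3.1/send" \
  -u "$MJ_APIKEY_PUBLIC:$MJ_APIKEY_PRIVATE" \
  -H "Content-Type: application/json" \
  -d '{
    "Messages": [{
      "From": { "Email": "forwarder@yourdomain.com", "Name": "Netflix Forwarder" },
      "To": [{ "Email": "family-member@example.com" }],
      "Subject": "Fwd: Your Netflix temporary access code",
      "TextPart": "Forwarded Netflix email body goes here."
    }]
  }'
```

The Messages array accepts multiple To entries, which is how the recipients list from the "Set all recipients" node fans out to everyone at once.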