by n8n Team
This workflow automatically adds a note about a GitHub pull request to the matching Pipedrive contact, provided the GitHub user's email matches a Person in Pipedrive.

**Prerequisites**
- Pipedrive account and Pipedrive credentials
- GitHub account and GitHub credentials

**How it works**
1. The GitHub Trigger node activates the workflow when a GitHub user opens a PR.
2. The HTTP Request node fetches the user's profile data and passes it on.
3. The Pipedrive node searches Pipedrive for the GitHub user's email address.
4. The IF node checks whether a person with that email exists in Pipedrive.
5. If such a person exists, the Pipedrive node creates a note on the person's profile (see the API sketch below).
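For reference, the person search and note creation that the Pipedrive node performs can also be done directly against the Pipedrive REST API. The snippet below is a minimal sketch, assuming an API token in a `PIPEDRIVE_API_TOKEN` environment variable; the note text and helper name are illustrative, not taken from the workflow itself.

```python
import os
import requests

API = "https://api.pipedrive.com/v1"
TOKEN = os.environ["PIPEDRIVE_API_TOKEN"]  # assumption: token provided via env var

def add_pr_note(email: str, pr_title: str, pr_url: str) -> bool:
    # Search for a person whose email matches the GitHub user's email
    search = requests.get(
        f"{API}/persons/search",
        params={"term": email, "fields": "email", "api_token": TOKEN},
        timeout=10,
    ).json()
    items = (search.get("data") or {}).get("items") or []
    if not items:
        return False  # no matching person, nothing to do

    person_id = items[0]["item"]["id"]
    # Attach a note about the PR to the matched person
    requests.post(
        f"{API}/notes",
        params={"api_token": TOKEN},
        json={"person_id": person_id, "content": f"New PR: {pr_title} ({pr_url})"},
        timeout=10,
    )
    return True
```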
by Artur
Streamline your accounting by automatically creating QuickBooks Online customers and sales receipts whenever a successful Stripe payment is processed. Ideal for businesses looking to reduce manual data entry and improve accounting efficiency.

**How it works**
1. **Trigger:** The workflow is triggered when a new successful payment intent event is received from Stripe.
2. **Retrieve customer data:** Fetches the customer details from Stripe associated with the payment.
3. **Check QuickBooks customer:** Searches QuickBooks Online by email address to see if the customer already exists.
4. **Create or use existing customer:** If the customer doesn't exist in QuickBooks, they are created; otherwise, the existing customer is used.
5. **Generate sales receipt:** A sales receipt is created in QuickBooks Online with the payment details, including item descriptions, amounts, and currency.

**Set up steps**
1. **Connect accounts:** Authenticate both your QuickBooks Online and Stripe accounts in n8n.
2. **Webhook setup:** Configure the Stripe webhook to send payment_intent.succeeded events to this workflow (a minimal event-handling sketch follows below).
3. **Test the workflow:** Trigger a test payment in Stripe to validate the integration.
4. **Customize details:** Adjust item descriptions or other fields in the QuickBooks sales receipt JSON body as needed.

This workflow requires basic familiarity with n8n, but setup can be completed in under 15 minutes for most users.
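To make the trigger step concrete, here is a minimal sketch of how a payment_intent.succeeded event could be filtered and reduced to the fields the rest of the workflow needs (customer reference, amount, currency). The field paths follow Stripe's standard event envelope; the helper name and return shape are illustrative, not part of the template.

```python
def extract_payment(event: dict) -> dict | None:
    """Return the fields needed for the QuickBooks sales receipt, or None."""
    if event.get("type") != "payment_intent.succeeded":
        return None  # ignore every other Stripe event type

    intent = event["data"]["object"]  # the PaymentIntent object
    return {
        "customer_id": intent.get("customer"),      # Stripe customer reference
        "amount": intent["amount_received"] / 100,  # Stripe amounts are in the currency's minor unit (e.g. cents)
        "currency": intent["currency"].upper(),
        "description": intent.get("description") or "Stripe payment",
    }
```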
by Yaron Been
**Audio-to-Insights: Auto Meeting Summarizer**

Transform your meeting recordings into actionable insights automatically. This powerful n8n workflow monitors your Google Drive for new audio files, transcribes them using OpenAI's Whisper, generates intelligent summaries with ChatGPT, and logs everything in Google Sheets, all without lifting a finger.

**How it works**

This workflow operates as a seamless six-step automation pipeline:
1. **Smart detection:** The workflow continuously monitors a designated Google Drive folder (polling every minute) for newly uploaded audio files.
2. **Secure download:** When a new audio file is detected, the system automatically downloads it from Google Drive for processing.
3. **AI transcription:** OpenAI's Whisper converts your audio recording into an accurate text transcription, supporting multiple audio formats.
4. **Intelligent summarization:** ChatGPT processes the transcript using a specialized prompt that extracts key discussion points and decisions, action items with assigned persons and deadlines, and priority levels and follow-up tasks, in clean, professional formatting.
5. **Timestamp generation:** The system automatically adds the current date and formats it consistently for tracking purposes.
6. **Automated logging:** The final summary is appended to your Google Sheets document with the date, creating a searchable archive of all meeting insights.

**Setup steps**

Prerequisites:
- Active Google Drive account
- OpenAI API key with credits
- Google Sheets access
- n8n instance (cloud or self-hosted)

Configuration steps:
1. **Credential setup**
   - **Google Drive OAuth2**: required for folder monitoring and file downloads
   - **OpenAI API key**: needed for both transcription (Whisper) and summarization (ChatGPT)
   - **Google Sheets OAuth2**: essential for writing summaries to your spreadsheet
2. **Google Drive configuration**
   - Create a dedicated folder in Google Drive for meeting recordings.
   - Copy the folder ID from the URL (the long string after /folders/).
   - Update the folderToWatch parameter in the workflow.
3. **Google Sheets preparation**
   - Create a new Google Sheet or use an existing one.
   - Ensure it has the columns Date and Meeting Summary.
   - Copy the spreadsheet ID from the URL.
   - Update the documentId parameter in the workflow.
4. **Audio requirements**
   - **Supported formats**: MP3, WAV, M4A, MP4
   - **Recommended size**: under 100 MB for optimal processing
   - **Language**: optimized for English (customizable for other languages)
   - **Quality**: clear audio produces better transcriptions
5. **Workflow activation**
   - Import the workflow JSON into your n8n instance.
   - Configure all credential connections.
   - Test with a sample audio file.
   - Activate the workflow trigger.

A minimal code sketch of the transcription and summarization steps follows below.
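For readers who want to see what the transcription and summarization nodes do under the hood, here is a minimal sketch using the OpenAI Python SDK. It assumes an OPENAI_API_KEY environment variable and a locally downloaded audio file; the prompt text and model name are illustrative stand-ins for the workflow's actual configuration.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def summarize_meeting(audio_path: str) -> str:
    # Step 3: transcribe the recording with Whisper
    with open(audio_path, "rb") as audio_file:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio_file,
        )

    # Step 4: summarize the transcript with ChatGPT
    completion = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": (
                "Summarize this meeting transcript. List key discussion points, "
                "decisions made, and action items with owners and deadlines."
            )},
            {"role": "user", "content": transcript.text},
        ],
    )
    return completion.choices[0].message.content

if __name__ == "__main__":
    print(summarize_meeting("meeting.mp3"))
```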
**Use cases**

- **Project management:** team standup summaries (convert daily standups into actionable task lists), sprint retrospectives (extract improvement points and action items), stakeholder updates (generate concise reports for leadership).
- **Sales & customer success:** discovery call notes (capture prospect pain points and requirements), demo follow-ups (track questions, objections, and next steps), customer check-ins (monitor satisfaction and expansion opportunities).
- **Consulting & professional services:** client strategy sessions (document recommendations and implementation plans), requirements gathering (organize complex project specifications), progress reviews (track deliverables and milestone achievements).
- **HR & training:** interview debriefs (standardize candidate evaluation notes), training sessions (create searchable knowledge bases), performance reviews (document development plans and goals).
- **Research & development:** brainstorming sessions (capture innovative ideas and concepts), technical reviews (log decisions and architectural choices), user research (organize feedback and insights systematically).

**Advanced customization options**

- **Enhanced summarization:** modify the ChatGPT prompt to add speaker identification for multi-person meetings, include sentiment analysis for customer calls, generate department-specific summaries (technical, sales, legal), or extract financial figures and metrics automatically.
- **Integration expansions:** Slack integration (auto-post summaries to relevant channels), email notifications (send summaries to meeting participants), CRM updates (push action items directly to Salesforce/HubSpot), calendar integration (schedule follow-up meetings based on action items).
- **Quality improvements:** audio preprocessing (add noise reduction before transcription), multi-language support (configure for international teams), custom templates (create industry-specific summary formats), approval workflows (add human review before final storage).

**Troubleshooting & best practices**

Common issues:
- **Large file processing:** split recordings over 100 MB into smaller segments.
- **Poor audio quality:** use noise-reduction tools before uploading.
- **API rate limits:** add delay nodes for high-volume usage.
- **Formatting issues:** adjust the ChatGPT prompts for consistent output.

Optimization tips:
- Upload files in supported formats only.
- Ensure a stable internet connection for cloud processing.
- Monitor OpenAI API usage and costs.
- Regularly back up your Google Sheets data.
- Test workflow changes with sample files first.

**Expected outputs**

Sample summary format:

Meeting Summary - March 15, 2024
Key Discussion Points: Q1 budget review and allocation decisions; new product launch timeline and milestones; team restructuring and role assignments.
Action Items: John: finalize budget proposal by March 20th (high priority); Sarah: schedule product demo sessions for March 25th; Team: submit org chart feedback by March 18th.
Decisions Made: approved additional marketing budget of $50K; delayed product launch to April 15th for quality assurance; promoted Lisa to Senior Developer role.

**Questions & support**

For any questions, customizations, or technical support regarding this workflow:

Email support:
- **Primary contact**: Yaron@nofluff.online
- **Response time**: within 24 hours on business days
- **Best for**: setup questions, customization requests, troubleshooting
**Learning resources**
- **YouTube channel**: https://www.youtube.com/@YaronBeen/videos (step-by-step setup tutorials, advanced customization guides, workflow optimization tips)

**Professional network**
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/ (connect for ongoing support, share your workflow success stories, get updates on new automation ideas)

**What to include in your support request**
- Describe your specific use case.
- Share any error messages or logs.
- Mention your n8n version and setup type.
- Include sample audio file characteristics (if relevant).

Ready to transform your meeting chaos into organized insights? Download the workflow and start automating your meeting summaries today!
by Airtop
**Extracting Comments from an X Post**

**Use case**
Engaging with conversations on X (formerly Twitter) is critical for brands and individuals monitoring sentiment, leads, or emerging trends. Manually collecting comments is time-consuming; this automation enables scalable extraction of comment data to inform your outreach or analysis.

**What this automation does**
This automation extracts comments from a specified X post, with the following input parameters:
- **airtop_profile**: the name of your Airtop Profile connected to X.
- **x_post_url**: the URL of the X post to extract comments from.
- **max_number_of_comments**: the maximum number of comments to retrieve.

**How it works**
1. Takes input via a form or another workflow.
2. Normalizes the input values.
3. Creates a new browser session using Airtop.
4. Navigates to the provided X post.
5. Uses a prompt to extract up to the specified number of comments, returning the author name, author profile URL, and comment text (an illustrative record shape is sketched below).

**Setup requirements**
- Airtop API key (free to generate).
- An Airtop Profile connected to X (requires a one-time login).

**Next steps**
- **Pair with X monitoring**: use this with the X monitoring automation to detect relevant posts and extract discussion context automatically.
- **Feed into analytics**: combine with summarization or sentiment analysis tools to understand audience response at scale.
- **Export for CRM/BI**: pipe the structured comment data into your CRM or business intelligence stack for lead tracking or reporting.

Read more about Extracting Comments from X Posts.
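As a reference for downstream nodes, a plausible shape for one extracted comment record is sketched below. The exact keys depend on the extraction prompt you configure in Airtop, so treat these field names and values as illustrative only.

```python
from typing import List, TypedDict

class Comment(TypedDict):
    author_name: str
    author_profile_url: str
    comment_text: str

# Illustrative example of what one run might return for a post
comments: List[Comment] = [
    {
        "author_name": "Jane Doe",
        "author_profile_url": "https://x.com/janedoe",
        "comment_text": "Great thread, following for updates!",
    },
]
```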
by n8n Team
This template shows how to sync data from one service to another. Specifically, in this example we're saving a new qualified lead from a Postgres database to a Google Sheets file. Setup instructions are located inside the workflow template.
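As a rough illustration of the same sync pattern outside n8n, the snippet below pulls qualified leads from Postgres and appends them to a sheet. The table name, the qualified flag, the connection string, and the sheet layout are assumptions for the sake of the example, not details taken from the template.

```python
import psycopg2   # Postgres client
import gspread    # Google Sheets client

# Assumed schema: a "leads" table with a boolean "qualified" column
conn = psycopg2.connect("dbname=crm user=n8n password=secret host=localhost")
cur = conn.cursor()
cur.execute("SELECT name, email, created_at FROM leads WHERE qualified = TRUE")
rows = cur.fetchall()

# Assumes a service-account JSON is configured for gspread
sheet = gspread.service_account().open("Qualified Leads").sheet1
for name, email, created_at in rows:
    sheet.append_row([name, email, created_at.isoformat()])
```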
by Yang
**Who is this for?**
This workflow is perfect for marketers, SEO specialists, product teams, and competitive analysts who want to monitor and summarize public reviews of their competitors. It's especially helpful for small teams who want fast insights from Google reviews without spending hours manually reading and sorting them.

**What problem is this workflow solving?**
Manually going through competitor reviews is time-consuming and repetitive. You risk missing patterns or insights, and it's hard to share summaries with your team quickly. This workflow automatically scrapes reviews from Google and generates a structured summary of pain points and positive feedback, so you can focus on strategy instead of sorting through dozens of reviews.

**What this workflow does**
This automation watches for new competitor entries in a Google Sheet, then:
1. Uses Dumpling AI to scrape the latest Google reviews (up to 20) for each business.
2. Splits and cleans the reviews for analysis.
3. Sends them to GPT-4o, which summarizes the most common complaints and praises.
4. Saves the structured result back to the same Google Sheet.

You'll instantly get an overview of what people are saying about any competitor.

**Setup**
1. Google Sheet setup
   - Create a Google Sheet with at least one column: Business.
   - Add names or search queries for the competitors you want to analyze.
   - Optional: add columns for Summary of Reviews and Pain Points.
2. Connect Dumpling AI
   - Sign up at Dumpling AI.
   - Create an agent using the get-google-reviews endpoint.
   - Copy your agent key and use it in the HTTP Request node in this workflow.
3. OpenAI setup
   - Use your API key with GPT-4o access.
   - The prompt is already structured to generate grouped summaries from reviews.
4. Run the workflow
   - Trigger it manually or on a schedule.
   - Make sure your Google Sheets, OpenAI, and Dumpling AI connections are active.

**How to customize this workflow to your needs**
- Expand the number of reviews retrieved by changing the Dumpling AI agent config.
- Replace Google Sheets with Airtable if you want more robust data views.
- Add more fields such as star ratings or review dates in your agent for richer analysis.
- Change the GPT prompt to highlight emotional tone, urgency, or feature mentions.

**Node details**
- **Google Sheets Trigger**: watches for new competitor names.
- **HTTP Request (Dumpling AI)**: scrapes 20 recent reviews from Google.
- **SplitOut node**: breaks the review array into individual items.
- **Code node**: extracts and combines the review text (sketched below).
- **Edit Fields node**: structures the review content before GPT.
- **GPT-4o node**: analyzes and summarizes the top pain points and praise.
- **Google Sheets output**: saves the summary back to the same sheet.

**Dependencies**
- Dumpling AI account and review-scraping agent setup
- OpenAI API key with GPT-4o access
- Google Sheets OAuth2 credentials
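The Code node's job of combining the scraped reviews into a single prompt is easy to picture in code. Below is a minimal sketch of that step plus the GPT-4o call, assuming the reviews arrive as a list of dicts with a text field; the key name and prompt wording are illustrative rather than the workflow's exact configuration.

```python
from openai import OpenAI

client = OpenAI()  # uses OPENAI_API_KEY

def summarize_reviews(reviews: list[dict]) -> str:
    # Combine the individual review texts into one block, as the Code node does
    combined = "\n\n".join(r["text"] for r in reviews if r.get("text"))

    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": (
                "Group these Google reviews into the most common complaints "
                "(pain points) and the most common praise. Be concise."
            )},
            {"role": "user", "content": combined},
        ],
    )
    return response.choices[0].message.content
```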
by please-open.it
**Intro**
This workflow requires the user to authenticate with an OpenID Connect provider before the webhook can be called. If the user is not authenticated, it starts a login flow using the Authorization Code grant with PKCE (https://datatracker.ietf.org/doc/html/rfc7636), a standard way to authenticate users with OpenID Connect (a short PKCE example appears at the end of this description). After the user logs in, the webhook is refreshed and reads the user's token from a cookie. With this token, the user's details are requested from the identity provider's userinfo endpoint.

**How to set up with Keycloak**
Keycloak is an open-source identity and access management solution. Feel free to get a demo realm at https://please-open.it or run your own Keycloak server.
1. After creating a realm, go to "Realm Settings" and click "OpenID Endpoint Configuration". Retrieve the authorization_endpoint, token_endpoint, and userinfo_endpoint values, and set these variables in the "Set variables" node.
2. In Keycloak, create a new client (name it as you like). Disable client authentication and check only "standard flow".
3. At the third step, put the webhook URL in "Valid redirect URIs" and fill "Web origins" with "+".

You're done: open the webhook and it asks you to authenticate.

**Usage**

User information: the userinfo node returns this structure describing the logged-in user:

    [
      {
        "sub": "73a6543f-f420-4fa6-9811-209e903c348b",
        "email_verified": true,
        "preferred_username": "mathieu.passenaud@please-open.it",
        "email": "mathieu.passenaud@please-open.it"
      }
    ]

You can use this information in your workflow for custom operations.

API calls: the "code" node returns a cookie named "n8n-custom-auth", which holds the access_token returned by the identity provider. This access_token can be used to call APIs connected to that identity provider (for example, the userinfo API is called with this token). Example: ask a user to log in with their Google account, then call an API (Gmail, Drive, ...) with their own token.

**How it works**
We published a blog post about this flow, how it works, and how you can use it: https://blog.please-open.it/n8n-openid-client/
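For context on the PKCE step mentioned in the intro, RFC 7636 defines how the client derives a code_challenge from a random code_verifier. A minimal illustration in Python using only the standard library (the function name is just for this example):

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    # code_verifier: a high-entropy random string, 43-128 characters (RFC 7636, section 4.1)
    code_verifier = secrets.token_urlsafe(64)

    # code_challenge: BASE64URL(SHA256(code_verifier)) without padding, i.e. the S256 method (section 4.2)
    digest = hashlib.sha256(code_verifier.encode("ascii")).digest()
    code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

    return code_verifier, code_challenge

verifier, challenge = make_pkce_pair()
# The challenge is sent on the authorization request; the verifier is sent
# later when exchanging the authorization code at the token endpoint.
print(challenge)
```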
by n8n Team
This workflow syncs Outlook Calendar events to a Notion database. An Outlook Calendar event must fall within a specific time frame (by default, within the next year) for the workflow to pick it up. The event subject becomes the title of the Notion page, and the event link is added to the Notion page as a property.

**Prerequisites**
- Notion account and Notion credentials.
- Microsoft account and Microsoft credentials.

**How it works**
1. On scheduled intervals, find all Outlook Calendar events within the specified time frame.
2. For each event, check whether the event already exists in the Notion database.
3. If it does not exist, create a new page in the Notion database; otherwise update the existing page (see the upsert sketch below).

**Setup**
This workflow requires a Notion database, new or existing, with at least the following fields:
- Title (title)
- Date (date)
- Event ID (text)
- Link (URL)
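The check-then-create-or-update step maps onto two Notion API calls: a database query filtered on the Event ID property, followed by either a page create or a page update. Below is a rough sketch with the official notion-client Python SDK, assuming the database fields listed above and a simplified event dict; property names and the event shape must match your own setup.

```python
import os
from notion_client import Client

notion = Client(auth=os.environ["NOTION_TOKEN"])  # assumption: integration token in env

def upsert_event(database_id: str, event: dict) -> None:
    """event is a simplified dict: {'id', 'subject', 'start' (ISO date), 'link'}."""
    # 1. Look for an existing page whose "Event ID" matches this Outlook event
    existing = notion.databases.query(
        database_id=database_id,
        filter={"property": "Event ID", "rich_text": {"equals": event["id"]}},
    )["results"]

    # 2. Build the page properties matching the fields listed in the Setup section
    properties = {
        "Title": {"title": [{"text": {"content": event["subject"]}}]},
        "Date": {"date": {"start": event["start"]}},
        "Event ID": {"rich_text": [{"text": {"content": event["id"]}}]},
        "Link": {"url": event["link"]},
    }

    # 3. Update the existing page, or create a new one
    if existing:
        notion.pages.update(page_id=existing[0]["id"], properties=properties)
    else:
        notion.pages.create(parent={"database_id": database_id}, properties=properties)
```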
by Nskha
**n8n Creators Template: Creator Profile Stats Updater**

This n8n workflow template automates updating a creator's profile statistics, including total workflows, complex workflows, approved workflows, pending workflows, total nodes, and total views. It uses various nodes to fetch data, process it, and update an SVG file hosted on GitHub so that it reflects the latest stats.

**Workflow overview**
1. **Schedule Trigger:** triggers the workflow execution at specified intervals.
2. **Config:** sets up configuration details such as the creator username and the colors for text, icons, border, and card.
3. **Get Workflows:** fetches workflows associated with the creator from the n8n API.
4. **Workflows Data:** processes the fetched data to calculate various statistics.
5. **Get User:** fetches user details from the n8n API.
6. **Download Image:** downloads the creator's profile image.
7. **Extract From File:** extracts binary data from the downloaded image file.
8. **SVG:** generates an SVG file with the updated stats and visual representation (see the sketch at the end of this description).
9. **GitHub:** commits the updated SVG file to the specified GitHub repository.
10. **Final:** prepares the final data set for further processing or output.
11. **Sticky Note:** provides a visual note or reminder within the workflow editor.

**Embed & live preview**
Since the output is an .svg file, you can host it anywhere and treat it like a normal image, so you can embed it in any site, forum, or page that supports posting images; example Markdown embed code, the rendered result, and a CDN-cached version are shown in the template.

**Setup instructions**
1. **GitHub credentials:** ensure you have GitHub credentials set up in your n8n instance so the workflow can commit changes to your repository.
2. **Configure trigger:** adjust the Schedule Trigger node to the desired execution intervals.
3. **Set configuration:** customize the Config node with your GitHub username and preferred aesthetic options for the SVG.
4. **Deploy workflow:** import the workflow into your n8n instance and deploy it.

**Customization options**
- **Text and icon colors**: customize the colors used in the SVG by modifying the respective fields in the Config node.
- **Profile image size**: adjust the image size in the Download Image node URL if needed.
- **Commit messages**: modify the commit messages in the GitHub nodes to suit your version-control conventions (the $now function is used to include the current time in the message, which always gives a different commit value).

**Requirements**
- n8n (self-hosted or cloud version compatible with 2024 releases and up)
- GitHub account and repository
- Basic understanding of n8n workflow configuration

**Support and contributions**
For support, please refer to the n8n community forum or the official n8n documentation. You're welcome to reuse this workflow and reshare it with edits (such as a new design or colors) under your name.
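To give a feel for what the SVG node produces, here is a minimal sketch of building a stats card as an SVG string in Python. The layout, colors, field names, and example numbers are illustrative placeholders, not the template's actual design.

```python
def build_stats_svg(username: str, stats: dict, text_color: str = "#333333",
                    card_color: str = "#ffffff", border_color: str = "#dddddd") -> str:
    # Render each stat as a line of text inside a simple rounded card
    lines = "".join(
        f'<text x="20" y="{60 + i * 24}" fill="{text_color}" font-size="14">{k}: {v}</text>'
        for i, (k, v) in enumerate(stats.items())
    )
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" width="320" height="220">'
        f'<rect width="320" height="220" rx="12" fill="{card_color}" stroke="{border_color}"/>'
        f'<text x="20" y="32" fill="{text_color}" font-size="18" font-weight="bold">{username}</text>'
        f"{lines}</svg>"
    )

# Example usage with made-up numbers; the real workflow fills these from the n8n API
svg = build_stats_svg("Nskha", {"Total workflows": 12, "Total nodes": 240, "Total views": 9800})
```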
by Tony Duffy
**IoT device control with MQTT and webhook**

This workflow is for users who want a practical example of how to control IoT systems using the MQTT protocol in an n8n environment. The template provides the typical n8n MQTT and Webhook node implementation and configuration settings needed to drive IoT device inputs and outputs.

**How it works**
A webpage with IoT control 'on' and 'off' buttons is presented to the user. When a button is clicked, the value is sent via a webhook (GET request) to trigger the active workflow. The workflow's Set node prepares the received value as a message payload and passes it to the MQTT node, which publishes the topic with the payload to a cloud-based MQTT broker. A remote ESP32 microcontroller subscribes to the broker, reads the payload contained in the topic, and toggles a GPIO pin depending on the payload value.

**The IoT control webpage**
The webpage is a simple HTML page containing the clickable 'on' and 'off' buttons. It also holds the GET webhook URL that sends the selected value to the n8n workflow, in this case running locally. The webhook URL format is:
http://localhost:5678/webhook/pin-control?value=action
The webpage code is in IOT-control.html.

**IoT device**
The IoT device is an ESP32 microcontroller running on a remote network. To keep it simple, GPIO2 is used as the control output: when the received value is "on", GPIO2 goes high and an LED on the ESP32 turns on; it goes off when the received value is "off". The program for the ESP32 is 'main.py'. A MicroPython interpreter must be flashed onto the ESP32 for the program to run automatically. The code can easily be edited to accommodate further attached IoT devices. The ESP32 code is in main.py (a hedged sketch of the same idea is included at the end of this description).

**How to customise this workflow to your needs**
- **ESP32:** you will need a working ESP32 with a MicroPython interpreter installed. The main.py program can be loaded and edited with a Python IDE; Thonny was used for this example. Use a free MQTT broker to get started; "broker.emqx.io" is used in the code.
- **IoT control webpage:** the webpage is plain HTML and can easily be edited to enhance functionality. The embedded webhook is configured for n8n production mode: http://localhost:5678/webhook/pin-control?value=action. If you want to run the page in test mode, use this URL instead: http://localhost:5678/webhook-test/pin-control?value=action
- **n8n workflow:** the workflow is a good demonstration of how to control IoT devices with n8n, and following these steps gives a good insight into microcontroller automation.
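Since main.py is not reproduced in this description, here is a minimal MicroPython sketch of the same idea: subscribe to a topic on broker.emqx.io and drive GPIO2 from the payload. The topic name and client ID are assumptions, and Wi-Fi setup is omitted; adapt everything to your own main.py.

```python
# MicroPython sketch for an ESP32 (assumes Wi-Fi is already connected and umqtt is available)
from machine import Pin
from umqtt.simple import MQTTClient

led = Pin(2, Pin.OUT)                 # GPIO2 drives the on-board LED
TOPIC = b"n8n/pin-control"            # hypothetical topic name - match your workflow

def on_message(topic, payload):
    # Payload published by the n8n MQTT node: b"on" or b"off"
    led.value(1 if payload == b"on" else 0)

client = MQTTClient("esp32-demo", "broker.emqx.io")
client.set_callback(on_message)
client.connect()
client.subscribe(TOPIC)

while True:
    client.wait_msg()                 # block until the next MQTT message arrives
```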
by Yaron Been
Automated pipeline to collect and analyze investor data from Crunchbase, tracking investment patterns, funding history, and portfolio companies for market analysis and lead generation.

**What it does**
- **Investor profiling**: collects comprehensive data on investors and VC firms.
- **Investment pattern analysis**: tracks funding history and investment preferences.
- **Portfolio monitoring**: keeps tabs on investor portfolios and new investments.
- **Data enrichment**: enhances raw data with additional context and metrics.

**Perfect for**
- Startup founders seeking investors
- Market research analysts
- Investment professionals
- Business development teams
- Competitive intelligence

**Key benefits**
- Comprehensive investor profiles
- Real-time investment tracking
- Market trend analysis
- Data-driven investment decisions
- Time-saving automation

**What you need**
- Crunchbase API access
- n8n instance
- Storage solution (database or spreadsheet)

**Data points collected**
- Investor/firm details
- Investment history
- Portfolio companies
- Funding rounds participated in
- Investment focus areas
- Contact information (when available)

**Setup & support**
Quick setup: deploy in 30 minutes with the step-by-step configuration guide. Watch the tutorial, get expert support, or reach out for direct help.

Transform your investor research with automated data collection and analysis. Spend less time gathering data and more time making strategic decisions.
by AlQaisi
**Template information**

**Who is this template for?**
This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data. A quick set-up video is available.

**How it works**
The template uses a series of nodes to fetch email information from LinkedIn profiles. It starts with a Schedule Trigger node that sets the interval for the workflow. The Conditional Check node verifies that fields such as Name, Gender, Job Title, Summary, and LinkedIn URL are not empty. The HTTP Request node sends a POST request to the specified URL with the API key and profile information. The Data Merge node merges the collected data. The Field Editing node modifies the fields as needed. Finally, the Google Sheets Update node updates Google Sheets with the gathered information.

**Set up instructions**
1. Make sure you have the necessary credentials and permissions for accessing LinkedIn and Google Sheets.
2. Set up the API key required for the HTTP Request node.
3. Configure the Google Sheets Update node with the appropriate document ID and sheet name.
4. Check and adjust the field mappings in the Field Editing node according to your needs.
5. Run the workflow and monitor the updates in your Google Sheets document.

**Overview**
The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations, such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here. Copy this template to get started: Google Sheets.

**Using the Prospeo.io LinkedIn Email Finder API with cURL**
To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, run the command with the following parameters:

    curl -X POST \
      -H "Content-Type: application/json" \
      -H "X-KEY: your_api_key" \
      -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
      "https://api.prospeo.io/linkedin-email-finder"

Replace "your_api_key" with your actual API key and update the "url" field in the JSON data with the LinkedIn profile URL for which you want to find the email address (a Python equivalent is shown at the end of this description). To access this API and obtain your API key, sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once subscribed, you will receive an API key that you can use to authenticate your requests to the endpoint.

**Node descriptions**
- **Schedule Trigger**: triggers the workflow on a defined schedule interval, in this case based on minutes. (Schedule Trigger Node Documentation)
- **Google Sheets Read**: reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. (Google Sheets Node Documentation)
- **Conditional Check**: checks multiple conditions based on the input data and acts accordingly. (Conditional Node Documentation)
- **HTTP Request**: sends an HTTP POST request to a specified URL with headers and body parameters. (HTTP Request Node Documentation)
- **No Operation, do nothing**: placeholder node that does not perform any operation.
- **Data Merge**: merges data based on the specified mode and combination settings. (Merge Node Documentation)
- **Field Editing**: edits fields by setting specific values for each field based on input data. (Set Node Documentation)
- **Google Sheets Update**: updates data in a Google Sheets document and sheet based on specified columns and values. (Google Sheets Node Documentation)
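The same Prospeo request can be issued from Python if you prefer scripting over cURL. This mirrors the command above exactly (endpoint, X-KEY header, and JSON body), with the API key supplied via an environment variable.

```python
import os
import requests

response = requests.post(
    "https://api.prospeo.io/linkedin-email-finder",
    headers={"Content-Type": "application/json", "X-KEY": os.environ["PROSPEO_API_KEY"]},
    json={"url": "https://www.linkedin.com/in/john-doe/"},
    timeout=15,
)
response.raise_for_status()
print(response.json())  # the response contains the email data found for the profile
```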