by Robert Breen
Automate company enrichment directly in Google Sheets using Dun & Bradstreet (D&B) Data Blocks. This workflow reads DUNS numbers from a sheet, fetches a Bearer token (via Basic Auth → /v3/token), calls the Data Blocks API for each row (/v1/data/duns/...), extracts Paydex, and appends or updates the sheet. A Filter node skips rows already marked Complete for efficient, idempotent runs.

## What this template does

- Pulls DUNS values from a Google Sheet
- (Option A) Uses an HTTP Header Auth credential for D&B, or (Option B) dynamically fetches a Bearer token from /v3/token (Basic Auth)
- Calls D&B Data Blocks per row to retrieve payment insights
- Extracts Paydex and upserts results back to the sheet
- Skips rows already Complete

## Who's it for

- RevOps/Data teams enriching company lists at scale
- SDR/Marketing teams validating firmographic/credit signals
- BI/Automation builders who want a no-code/low-code enrichment loop

## How it works (node-by-node)

1. **Get Companies** (Google Sheets) – Reads rows with at least duns, paydex, Complete.
2. **Only New Rows** (Filter) – Passes only rows where Complete is empty.
3. **D&B Info** (HTTP Request) – Calls Data Blocks for each DUNS using a header credential (Authorization: Bearer <token>).
4. **Keep Score** (Set) – Maps nested JSON to a single Paydex field: {{$json.organization.businessTrading[0].summary[0].paydexScoreHistory[0].paydexScore}}
5. **Append to g-sheets** (Google Sheets) – Append or Update by duns, writing paydex and setting Complete = Yes.

> The workflow also includes Sticky Notes with in-canvas setup help.

## Setup instructions (from the JSON)

### 1) Connect Google Sheets (OAuth2)

- In n8n → Credentials → New → Google Sheets (OAuth2) and sign in.
- Use/prepare a sheet with columns like: duns, paydex, Complete.
- In your Google Sheets nodes, select your credential and target spreadsheet/tab.
- For upsert behavior, set Operation to Append or Update and Matching column to duns.

> Replace any example Sheet IDs/URLs with your own (avoid publishing private IDs).
### 2) Get a D&B Bearer Token (Basic Auth → /v3/token) – Optional Dynamic Token Node

Add/enable an HTTP Request node named Get Bearer Token1 and configure:

- **Authentication:** Basic Auth (your D&B username/password)
- **Method:** POST
- **URL:** https://plus.dnb.com/v3/token
- **Body Parameters:** grant_type = client_credentials
- **Headers:** Accept = application/json

Execute to receive access_token. Reference the token in other nodes via:
`Authorization: Bearer {{$node["Get Bearer Token1"].json["access_token"]}}`

> ⚠️ Security: Don't hardcode tokens. Prefer credentials or fetch dynamically.

### 3) Call D&B Data Blocks (use Header Auth or dynamic token)

Node: **D&B Info** (HTTP Request)

- **Authentication:** Header Auth (recommended)
- **URL:** https://plus.dnb.com/v1/data/duns/{{ $json.duns }}?blockIDs=paymentinsight_L4_v1&tradeUp=hq&customerReference=customer%20reference%20text&orderReason=6332
- **Headers:** Accept = application/json
- If not using a stored Header Auth credential, set: Authorization = Bearer {{$node["Get Bearer Token1"].json["access_token"]}}

> {{ $json.duns }} is resolved from the current row provided by Get Companies.

### 4) Map Paydex and Upsert to Google Sheets

**Keep Score (Set)**

- Field Paydex (Number): {{$json.organization.businessTrading[0].summary[0].paydexScoreHistory[0].paydexScore}}

**Append to g-sheets (Google Sheets)**

- Operation: Append or Update
- Matching column: duns
- Columns mapping:
  - duns = {{ $('Get Companies').item.json.duns }}
  - paydex = {{ $json.Paydex }}
  - Complete = Yes

## Test checklist

- Add a few test DUNS rows (leave Complete blank).
- Run the workflow and confirm Only New Rows passes expected items.
- Check D&B Info returns payment insight data.
- Confirm Paydex is set and the row is updated with Complete = Yes.

## Security & best practices

- Store secrets in Credentials (HTTP Header Auth/Basic Auth).
- Avoid publishing real Sheet IDs or tokens in screenshots/notes.
- Consider rate limits and backoff for large sheets.
- Log/handle API errors (e.g., invalid DUNS or expired tokens).
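The nested Paydex path used by the Keep Score node can fail when D&B returns a sparser response. A minimal Python sketch of the same extraction with guards — the sample response dictionaries below are hypothetical stand-ins for a real Data Blocks payload, not actual D&B output:

```python
def extract_paydex(response: dict):
    """Walk the nested D&B payment-insight path, returning None
    instead of raising when any level is missing or empty."""
    try:
        return (response["organization"]["businessTrading"][0]
                        ["summary"][0]["paydexScoreHistory"][0]["paydexScore"])
    except (KeyError, IndexError, TypeError):
        return None

# Hypothetical sample responses for illustration only:
full = {"organization": {"businessTrading": [{"summary": [
    {"paydexScoreHistory": [{"paydexScore": 80}]}]}]}}
sparse = {"organization": {"businessTrading": []}}

print(extract_paydex(full))    # → 80
print(extract_paydex(sparse))  # → None
```

This is the same guard the template's troubleshooting section suggests adding via an IF node before mapping.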
## Troubleshooting

- **401/403 from D&B:** Verify credentials/token; ensure correct environment and entitlements.
- **Missing Paydex path:** D&B responses vary by subscription/data availability; add guards (IF node) before mapping.
- **Rows not updating:** Confirm *Append or Update* is used and *Matching column* exactly matches your sheet header duns.
- **Filtered out rows:** Ensure Complete is truly empty (no spaces) for new items.

## Customize further

- Enrich additional fields (e.g., viability score, portfolio comparison, credit limits).
- Add retry logic, batching, or scheduled triggers.
- Push results to a CRM/DB or notify teams via Slack/Email.

## Contact

Need help customizing this (e.g., enriching more fields, normalizing responses, or bulk-processing large sheets)?

- Email: robert@ynteractive.com
- LinkedIn: https://www.linkedin.com/in/robert-breen-29429625/
- Website: https://ynteractive.com
by 1Shot API
# Automated Savings with 1Shot API and Aave Protocol

With the increasing popularity of stablecoins like USDC, it's becoming easier to pay for everyday items with crypto thanks to debit cards from issuers like MetaMask. These solutions work by processing payments on traditional payment rails, then pulling the stablecoin out of your onchain account to cover the cost of the purchase.

One downside is that if a stablecoin like USDC is simply sitting in your onchain account, you are missing out on competitive savings rates offered by decentralized protocols like Aave. Often the APY offered by these onchain lending protocols is much higher than those offered by traditional banks, while maintaining a low-risk profile for high-quality assets like USDC or mUSD. Ideally your balance would automatically stay in something like Aave when you don't need it and automatically refill the onchain account used for payments when it gets low. This workflow, powered by 1Shot API, does just that. It also sends you real-time Telegram notifications about your account balance and savings every time your account rebalances.

## Aave Protocol

This workflow integrates the Aave lending protocol to generate savings APY on your excess USDC funds on the Base network. When your holdings in your primary MetaMask smart account exceed a threshold you set, it will deposit them into the Aave Lending Pool. This results in your account receiving aBasUSDC, a deposit receipt token used to withdraw your savings at a later time. When your account drops below a refill threshold you set, it will exchange the aBasUSDC back to USDC and put it back into your primary account so that you can spend it.

## Setup Instructions

This workflow will automatically balance your USDC funds on the Base network so that funds you aren't using stay in Aave earning interest, and automatically move funds out of savings back into your wallet when your balance gets low.

1. Create a free 1Shot API account.
2. Generate an API key & secret and use these to create a credential for the 1Shot API nodes.
3. Click the trigger on the subworkflow to automatically import the required smart contract functions and provision a 1Shot API server wallet on Base that will relay your transactions when you are not online.
4. Input the contract method IDs from 1Shot API into the "Savings Config" node.
5. Set your savings thresholds in the "Savings Config" node.
6. Create a Telegram bot and use the bot's API key to generate a credential for the Telegram nodes.
7. Get your chat id with the bot and input it into the "Savings Config" node.
8. Configure your desired schedule in the "Schedule" trigger node (e.g., every 24 hours).
9. Activate the workflow.
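The threshold behavior described above can be sketched in plain Python. The parameter names and the idea of rebalancing toward a target balance are illustrative assumptions; the real workflow reads its thresholds from the "Savings Config" node and moves funds via 1Shot API transactions:

```python
from decimal import Decimal

def rebalance(balance, deposit_threshold, refill_threshold, target):
    """Decide how to move USDC between the spending account and Aave.
    All four parameters are hypothetical illustrations of the values
    configured in the Savings Config node."""
    if balance > deposit_threshold:
        # Excess funds: supply the surplus above the target to Aave.
        return ("deposit", balance - target)
    if balance < refill_threshold:
        # Running low: withdraw from Aave back up to the target.
        return ("withdraw", target - balance)
    return ("hold", Decimal(0))

print(rebalance(Decimal("500"), Decimal("300"), Decimal("100"), Decimal("200")))
# → ('deposit', Decimal('300'))
print(rebalance(Decimal("50"), Decimal("300"), Decimal("100"), Decimal("200")))
# → ('withdraw', Decimal('150'))
```

Using `Decimal` rather than floats avoids rounding surprises when working with token amounts.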
by Milan Vasarhelyi - SmoothWork
Video Introduction

Want to automate your inbox or need a custom workflow? Book a Call | DM me on LinkedIn

## Overview

This workflow automates invoice creation in QuickBooks Online by importing data directly from a Google Sheet. Instead of manually entering invoice details one by one, this template reads structured data from your spreadsheet and automatically generates corresponding invoices in QuickBooks, saving time and reducing data entry errors.

## Key Features

- Automatically reads invoice data from Google Sheets, including customer IDs, descriptions, and amounts
- Creates properly formatted invoices in QuickBooks Online with line items
- Eliminates manual data entry and reduces human error
- Scalable solution for processing multiple invoices at once

## Common Use Cases

- Batch invoice generation from sales or order data
- Automated billing workflows for recurring services
- Syncing invoice data from external systems via Google Sheets
- Streamlining accounting processes for small businesses

## Setup and Configuration

**QuickBooks Developer Account:**

1. Register at developer.intuit.com and create a new app in the App dashboard.
2. Select 'Accounting' scope permissions for your application.
3. Copy your Client ID and Client Secret from the Keys & Credentials section.
4. Add the n8n OAuth redirect URL to your app's authorized redirect URIs.
5. In n8n, create a QuickBooks Online OAuth2 credential using your Client ID and Secret.
6. Set Environment to 'Sandbox' for testing or 'Production' for live data.
7. Click 'Connect my account' and authorize the connection.

**Google Sheets Setup:**

1. Connect your Google Sheets account in n8n using OAuth2 authentication.
2. Update the 'Config - Sheet URL' node with your Google Sheets URL.
3. Your sheet must contain these columns: CustomerId (QuickBooks customer ID), Description (line item description), and Amount (invoice amount).

**Invoice Customization:** In the 'Create Invoice in QuickBooks' node, adjust the itemId and Qty fields to match your QuickBooks accounting setup and product catalog.
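To make the sheet-to-invoice mapping concrete, here is a Python sketch of how rows with the required columns could be turned into invoice payloads. The payload shape loosely mirrors QuickBooks Online's Invoice object (CustomerRef plus SalesItemLineDetail lines) but is deliberately simplified; the real workflow does this mapping inside the QuickBooks node:

```python
def rows_to_invoices(rows):
    """Map spreadsheet rows (CustomerId, Description, Amount) to
    simplified QuickBooks-style invoice payloads. This is an
    illustrative shape, not the full QBO Invoice schema."""
    invoices = []
    for row in rows:
        invoices.append({
            "CustomerRef": {"value": str(row["CustomerId"])},
            "Line": [{
                "DetailType": "SalesItemLineDetail",
                "Amount": float(row["Amount"]),
                "Description": row["Description"],
            }],
        })
    return invoices

rows = [{"CustomerId": 42, "Description": "Consulting", "Amount": "150.00"}]
invoice = rows_to_invoices(rows)[0]
print(invoice["CustomerRef"]["value"], invoice["Line"][0]["Amount"])
# → 42 150.0
```

Coercing CustomerId to a string and Amount to a number up front catches malformed rows before they reach the accounting API.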
by DevCode Journey
# Telegram AI Chatbot with Google & Gemini Integration

## Simple overview

This workflow connects a Telegram bot to Google Gemini (PaLM API) so the bot can reply to users with AI-generated answers. Useful for FAQs, assistants, classroom helpers, or bots that fetch document content to answer questions.

## Who is this for

Educators, creators, developers, and support teams who want a low-code Telegram chatbot powered by Gemini.

## What it does (quick)

- Listens for messages sent to your Telegram bot.
- Sends incoming text to Google Gemini and receives a generated reply.
- Optionally fetches content from Google Docs or an external API to enrich replies.
- Sends the reply back to the original Telegram user.
- Processes messages in batches and adds short delays to avoid spamming.

## Quick setup (5 steps)

1. Create a Telegram bot with @BotFather and copy the bot token.
2. Add Telegram credentials to n8n (Telegram node).
3. Get a Google Gemini (PaLM) API key and add it to n8n.
4. (Optional) Connect Google Docs OAuth2 if you want the bot to read documents.
5. Activate the workflow and test by messaging the bot.

## Required items

- Telegram bot token
- Google Gemini (PaLM) API key
- n8n instance with Telegram and HTTP nodes enabled
- (Optional) Google Docs OAuth2 credential

## How it works (step-by-step)

1. Telegram message arrives → Trigger node.
2. Workflow extracts message and user info.
3. (Optional) Pull supporting content from Google Docs or an API.
4. Send prompt + context to Gemini → receive reply.
5. Send reply back to the Telegram user.
6. Add small delays and batch processing to handle volume safely.

## How to customize

- Edit the Gemini prompt to change response style and behavior.
- Switch the Gemini model (Flash vs Pro) for speed vs. quality.
- Add conditions (If / Switch) to route different inputs to different behaviors.
- Append more data sources (Sheets, external APIs) to enrich replies.
- Add error handling to retry or log failed requests.

## Testing checklist

- Send a test message to the bot and confirm a reply.
- If using Google Docs, confirm the bot can read the target document.
- Check logs and node outputs in n8n for any errors.

## Tips and best practices

- Keep prompts concise and include only needed context to reduce costs.
- Use rate limiting (Wait node) and batching to avoid API throttling.
- Store API keys securely in n8n credentials.
- Start with small tests before enabling automated production runs.

## For Help

- Our Website: devcodejourney.com
- LinkedIn: LinkedIn
- WhatsApp Channel: Chat Now
- WhatsApp: Chat Now
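The batching-plus-delay pattern recommended above (Wait node between batches) can be sketched as follows. Batch size and delay are illustrative defaults, not values taken from the template:

```python
import time

def process_in_batches(messages, handler, batch_size=5, delay_seconds=1.0):
    """Handle messages in small batches with a pause between batches,
    mirroring the template's batching + Wait-node pattern."""
    replies = []
    for i in range(0, len(messages), batch_size):
        batch = messages[i:i + batch_size]
        replies.extend(handler(m) for m in batch)
        if i + batch_size < len(messages):
            time.sleep(delay_seconds)  # throttle to avoid API rate limits
    return replies

out = process_in_batches([f"msg{n}" for n in range(7)], str.upper,
                         batch_size=3, delay_seconds=0.01)
print(out)  # → ['MSG0', 'MSG1', 'MSG2', 'MSG3', 'MSG4', 'MSG5', 'MSG6']
```

`handler` stands in for the "send to Gemini, return reply" step; in n8n the same effect comes from a Split In Batches (Loop Over Items) node feeding a Wait node.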
by Omar Kennouche
## How it works

- Triggers manually or on schedule (03:00 daily by default)
- Fetches workflows tagged backup-workflows via the n8n API
- Normalizes workflow names and applies the [client: NAME] tag convention
- Prepares JSON in the same structure as an n8n UI export
- Checks the GitLab repository:
  - Creates a new file if missing
  - Updates the file if content differs
  - Skips if unchanged
- Logs results with a recap (created, updated, unchanged, total)

## Set up steps

1. Configure your GitLab credentials in n8n
2. Create a repository and branch for workflow backups
3. Set global variables (owner, project, branch, backup path)
4. Tag workflows to include with backup-workflows
5. Run manually once to test, then enable the schedule
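The create/update/skip decision at the heart of the backup loop can be sketched like this. The repo is modeled as a simple path-to-content dictionary; in the real workflow the comparison happens against files fetched from the GitLab API:

```python
def plan_backup_action(existing: dict, path: str, new_content: str) -> str:
    """Decide what to do with one workflow file in the backup repo:
    create it if absent, update it if the content differs, skip it
    if unchanged. `existing` maps repo paths to current content."""
    if path not in existing:
        return "create"
    if existing[path] != new_content:
        return "update"
    return "skip"

repo = {"backups/wf-a.json": '{"name": "wf-a"}'}
print(plan_backup_action(repo, "backups/wf-b.json", "{}"))                 # → create
print(plan_backup_action(repo, "backups/wf-a.json", '{"x": 1}'))           # → update
print(plan_backup_action(repo, "backups/wf-a.json", '{"name": "wf-a"}'))   # → skip
```

Counting the three outcomes across all workflows yields exactly the recap (created, updated, unchanged, total) the workflow logs.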
by SerpApi
# Google Play Store App Rank and Rating Monitoring

## What and who this is for

This workflow will be useful for anyone looking to do SEO tracking on the Google Play Store. It automates checking Google Play Store rank positions and average ratings for a list of app titles. The SerpApi component can also be modified to use other APIs for anyone looking for SEO tracking on any other search engine supported by SerpApi.

## How it works

This workflow takes in a list of keywords and app titles to identify the apps' rank in Google Play Store search results. It also grabs the average rating of each app. The search uses SerpApi's Google Play Store API.

The results are then synced to two different sheets in a Google Sheet. The first is a log of all past runs; the latest results are appended to the bottom of the log. The second updates a kind of "dashboard" to show the results from the latest run.

The workflow includes a Wait node that delays 4 seconds between each app title and keyword pair to prevent hitting the default Google Sheets API per-minute rate limit. You can delete this if you have a high enough custom rate limit on the Google Sheets API.

The Schedule Trigger is configured to run at 10 AM UTC every day.

## How to use

1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets accounts to n8n. Help available here: https://n8n.io/integrations/google-sheets/
4. Copy this Google Sheet to your own Google account: https://docs.google.com/spreadsheets/d/1DiP6Zhe17tEblzKevtbPqIygH3dpPCW-NAprxup0VqA/edit?gid=1750873622#gid=1750873622
5. Set your own list of keywords and app titles to match in the 'Latest Run' sheet. This is the source list used to run the searches and must be set.
6. Connect your Google Sheet in the 'Get Keywords and Titles to Match' Google Sheets node.
7. Connect your Google Sheet in the 'Update Rank & Rating Log' Google Sheets node.
8. Connect your Google Sheet again in the 'Update Latest Run' Google Sheets node.
9. (Optional) Update the schedule, or disable it to run only manually.

## Documentation

- SerpApi Google Play Store API
- SerpApi n8n Node
- Intro Guide
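The rank lookup itself reduces to scanning the ordered search results for an exact title match. A sketch in Python — the result dictionaries are a simplified stand-in for SerpApi's Google Play Store API response, not its actual schema:

```python
def find_app_rank(results, app_title):
    """Return (1-based rank, rating) for `app_title` in an ordered
    list of search results, or None if the app is not found.
    Each result is a simplified {'title', 'rating'} dict."""
    for rank, item in enumerate(results, start=1):
        if item["title"] == app_title:
            return rank, item.get("rating")
    return None

results = [{"title": "App A", "rating": 4.1},
           {"title": "App B", "rating": 4.7}]
print(find_app_rank(results, "App B"))  # → (2, 4.7)
print(find_app_rank(results, "App Z"))  # → None
```

A `None` result (app not ranked for that keyword) is worth logging explicitly in the sheet rather than leaving the cell blank, so missing data is distinguishable from a failed run.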
by Kev
Automatically create and publish ready-to-post social media news updates, all powered by AI. This workflow turns any RSS feed into professional, branded posts, complete with visuals and captions. Use cases include automating news updates, sharing industry insights, or maintaining an active social presence without manual work.

## Good to know

- Fully automated end-to-end publishing, from RSS feed to social post
- Uses JsonCut for dynamic image composition (backgrounds, text overlays, logos)
- Publishes directly to Instagram (or other channels) via Blotato
- Utilizes OpenAI GPT-5 for post text and image prompt generation
- A polling mechanism checks job status every 3 seconds
- Setup time: under 10 minutes once credentials are in place

## How it works

1. The RSS Trigger monitors any RSS feed for new content.
2. OpenAI GPT-5 rewrites the headline and creates a short, social-friendly post caption.
3. An AI image prompt is generated to match the article's topic and mood.
4. JsonCut combines the background, logo, and headline text into a branded image.
5. Once the image is ready, Blotato uploads and publishes the post directly to Instagram (or other connected platforms).

The process runs completely automatically; no human input is required after setup.

## How to use

1. Import the workflow into your n8n instance.
2. Configure your RSS feed URL(s).
3. Add your JsonCut, Blotato, and OpenAI credentials.
4. Activate the workflow; it will automatically generate and post new content whenever your RSS source updates.

## Requirements

- Free account at jsoncut.com
- Account at blotato.com (paid service; can be replaced with any social media API or publishing platform)
- API keys for both services: JsonCut API Key via app.jsoncut.com, Blotato API Key via www.blotato.com
- **OpenAI credential** (GPT-5 or compatible model)
- **RSS Feed URL** (e.g. from a news site, blog, or press page)

## Setup steps

1. Sign up for a free account at app.jsoncut.com.
2. If you use Blotato, create an account at blotato.com and generate an API key.
3. In n8n, add:
   - JsonCut API Key (HTTP Header Auth, header: x-api-key)
   - Blotato API credential (optional; can be replaced)
   - OpenAI credential for GPT-5
4. Replace the example RSS URL in the RSS Feed Trigger node with your own.
5. Activate the workflow; it will start monitoring, generating, and posting automatically.

## Customising this workflow

You can easily adjust:

- The image layout and branding (in the 'Create JsonCut Job' node)
- The tone or length of social captions (in the 'Create Instagram Text' node prompt)
- The publishing platform: replace Blotato with another integration (e.g. Buffer, Hootsuite, or a native social API)
- Posting frequency via the RSS trigger interval

For advanced customization, check out:

- JsonCut Documentation
- JsonCut Image Generation Examples
- Blotato Website
- n8n Documentation
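The 3-second job-status polling mentioned above is a simple wait-and-check loop. A hedged sketch — `get_status` stands in for the real HTTP request to JsonCut's job endpoint, whose exact response fields are not shown here:

```python
import time

def poll_job(get_status, interval=3.0, timeout=60.0):
    """Poll a job-status callable every `interval` seconds until it
    reports 'done' or the timeout elapses. Mirrors the template's
    3-second JsonCut polling; the timeout is an added safeguard."""
    waited = 0.0
    while waited <= timeout:
        if get_status() == "done":
            return True
        time.sleep(interval)
        waited += interval
    return False

statuses = iter(["pending", "processing", "done"])
print(poll_job(lambda: next(statuses), interval=0.01))  # → True
```

Adding a timeout (rather than polling forever) prevents a stuck render job from blocking the whole workflow run.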
by V3 Code Studio
## How it works

This workflow provides an API endpoint /api/v1/get-companies that retrieves company records directly from your Odoo database. It's built for teams who need to query or export company data, either as structured JSON for integrations or as Excel (.xlsx) for reporting.

When a request is made, the workflow:

1. Accepts query parameters (name, response_format).
2. Validates the name input (required for company search).
3. Fetches all matching companies from Odoo using a like filter for partial name matches.
4. Returns results as a JSON response or an Excel file depending on the response_format parameter.

This makes it ideal for quickly exporting or syncing company information with other tools.

## Setup steps

1. Open the Webhook node and note the endpoint /api/v1/get-companies.
2. Connect your Odoo API credentials in the Odoo node.
3. Optionally update the fieldsList in the Odoo node to include more company details (VAT, address, etc.).
4. Test using a browser or Postman:
   - /api/v1/get-companies?name=Tech&response_format=json
   - /api/v1/get-companies?name=Tech&response_format=excel
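The validate-then-branch behavior of the endpoint can be sketched as follows. `fetch_companies` stands in for the Odoo search (name matched with a `like` filter), and the response dictionaries are illustrative, not the workflow's exact output:

```python
def handle_get_companies(params, fetch_companies):
    """Validate query params and branch on response_format, as the
    /api/v1/get-companies endpoint does."""
    name = params.get("name")
    if not name:
        # name is required for the company search
        return {"status": 400, "error": "name is required"}
    companies = fetch_companies(name)
    if params.get("response_format") == "excel":
        # The real workflow returns an .xlsx binary here
        return {"status": 200, "content_type":
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"}
    return {"status": 200, "body": companies}

fake_fetch = lambda name: [c for c in ["TechCorp", "Acme"] if name in c]
print(handle_get_companies({"name": "Tech"}, fake_fetch))
# → {'status': 200, 'body': ['TechCorp']}
print(handle_get_companies({}, fake_fetch)["status"])  # → 400
```

Returning 400 early for a missing `name` keeps the Odoo node from running an unbounded query.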
by Francisco Rivera
## What this template does

Connect a Vapi AI voice agent to Google Calendar to capture contact details and auto-book appointments. The agent asks for name, address, service type, and a preferred time. The workflow checks availability and either proposes times or books the slot, no code needed.

## How it works (node map)

- **Webhook** (Production URL = Vapi Server URL) – receives tool calls from Vapi and returns results.
- **CONFIGURATION (EDIT ME)** – your timezone, work hours, meeting length, buffers, and cadence.
- **Route by Tool Name** – routes Vapi tool calls:
  - checkAvailability → calendar lookup path
  - bookAppointment → create event path
- **Get Calendar Events (EDIT ME)** – reads events for the requested day.
- **Calculate Potential Slots / Filter for Available Slots** – builds conflict-free options with buffers.
- **Respond with Available Times** – returns formatted slots to Vapi.
- **Book Appointment in Calendar (EDIT ME)** – creates the calendar event with details.
- **Booking Confirmation** – returns success back to Vapi.

> Sticky notes in the canvas show exactly what to edit (required by n8n). No API keys are hardcoded; Google uses OAuth credentials.

## Requirements

- n8n (Cloud or self-hosted)
- Google account with Calendar (OAuth credential in n8n)
- Vapi account + one Assistant

## Setup (5 minutes)

### A) Vapi → n8n connection

1. Open the Webhook node and copy the Production URL.
2. In Vapi → Assistant → Messaging, set Server URL = that Production URL.
3. In Server Messages, enable only toolCalls.

### B) Vapi tools (names must match exactly)

Create two Custom Tools in Vapi and attach them to the assistant:

**Tool 1: checkAvailability**

- Arguments: initialSearchDateTime (string, ISO-8601 with timezone offset, e.g. 2025-09-09T09:00:00-05:00)

**Tool 2: bookAppointment**

- Arguments:
  - startDateTime (string, ISO-8601 with tz)
  - endDateTime (string, ISO-8601 with tz)
  - clientName (string)
  - propertyAddress (string)
  - serviceType (string)

> The Switch node routes based on the tool name.

### C) Configure availability

Open 1. CONFIGURATION (EDIT ME) and set your timezone, work hours, meeting length, buffers, and cadence.

### D) Connect Google Calendar

1. Open 2. Get Calendar Events (EDIT ME) → Credentials: select/create Google Calendar OAuth. Then choose the calendar to check availability.
2. Open 3. Book Appointment in Calendar (EDIT ME) → use the same credential and same calendar to book.

### E) Activate & test

1. Toggle the workflow Active.
2. Call your Vapi number (or start a session) and book a test slot.
3. Verify the event appears with description fields (client, address, service type, call id).

## Customising

- Change the summary/description format in 3. Book Appointment.
- Add SMS/Email confirmations, CRM sync, rescheduling, or analytics as follow-ups (see sticky note "I'm a note").

## Troubleshooting

- **No response back to Vapi** – confirm Vapi is set to send toolCalls only and the Server URL matches the Production URL.
- **Switch doesn't route** – tool names must be exactly checkAvailability and bookAppointment.
- **No times returned** – ensure timezone + work hours + cadence generate at least one future slot; confirm Google credential and calendar selection.
- **Event not created** – use the same Google credential & calendar in both nodes; check OAuth scopes/consent.

## Security & privacy

- Google uses OAuth; credentials live in n8n. No API keys hardcoded.
- The webhook receives only the fields needed to check times or book.
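The "Calculate Potential Slots / Filter for Available Slots" step amounts to stepping through the working day and discarding candidates that (with buffer) overlap a busy event. A Python sketch under assumed parameters — the names `length_min`, `buffer_min`, and the half-hour cadence are illustrative, not the node's exact configuration:

```python
from datetime import datetime, timedelta

def available_slots(events, day_start, day_end, length_min=30, buffer_min=10):
    """Build conflict-free slot start times between day_start and
    day_end, skipping any slot that (including buffer) overlaps an
    existing (start, end) event."""
    length = timedelta(minutes=length_min)
    buffer = timedelta(minutes=buffer_min)
    slots, cursor = [], day_start
    while cursor + length <= day_end:
        # Pad the candidate with the buffer on both sides
        candidate = (cursor - buffer, cursor + length + buffer)
        if not any(s < candidate[1] and candidate[0] < e for s, e in events):
            slots.append(cursor)
        cursor += length
    return slots

day = datetime(2025, 9, 9)
events = [(day.replace(hour=10), day.replace(hour=11))]
free = available_slots(events, day.replace(hour=9), day.replace(hour=12))
print([t.strftime("%H:%M") for t in free])  # → ['09:00', '11:30']
```

Note how the 10-minute buffer knocks out the 09:30 slot even though it technically ends as the 10:00 event begins; that is exactly the "buffers" behavior configured in the CONFIGURATION node.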
by Port IO
# Auto-resolve Jira tickets with coding agents

Coding agents can significantly speed up development, but crucial engineering context often gets lost in the process. This guide demonstrates how to use Port as a context lake in n8n workflows to automatically generate GitHub issues from Jira tickets with rich organizational context, ensuring that important information is preserved when assigning them to GitHub Copilot and linking pull requests back to Jira.

This setup helps establish a seamless ticket-to-PR workflow, bridging the gap between Jira and GitHub while leveraging Port's comprehensive software catalog as a source of truth.

## How it works

The n8n workflow orchestrates the following steps:

1. **Jira trigger** – The workflow listens for Jira issue updates via webhook.
2. **Condition check** – Verifies that the issue status is "In Progress" and has the required label (e.g., "product_approved") without the "copilot_assigned" label.
3. **Port context extraction** – Uses Port's n8n node to query your software catalog for relevant context about services, repositories, teams, dependencies, and documentation related to the Jira issue.
4. **Parse response** – Retrieves the AI-generated GitHub issue title and body from Port.
5. **Create GitHub issue** – Creates a new GitHub issue with the enriched context from Port.
6. **Assign to Copilot** – Adds a comment to the GitHub issue instructing Copilot to take ownership.
7. **Add issue link to Jira ticket** – Adds a comment to the Jira ticket with the GitHub issue URL, providing clear traceability.
8. **Mark ticket as assigned** – Updates the Jira ticket to add the "copilot_assigned" label, preventing duplicate processing.
## Setup

- [ ] Connect your Jira Cloud account and enable issue_updated events
- [ ] Register for free on Port.io
- [ ] Connect your Port.io account and add the API key
- [ ] Connect your GitHub account and select the target repository
- [ ] Ensure a Copilot bot or @copilot user has access to the repository
- [ ] Confirm the workflow webhook or Jira trigger URL is active
- [ ] Test by moving a product_approved ticket to In Progress
- [ ] You should be good to go!

## Prerequisites

- You have a Port account and have completed the onboarding process.
- Port's GitHub app is installed in your account.
- Port's Jira integration is installed in your account.
- You have a working n8n instance (Cloud or self-hosted) with Port's n8n custom node installed.
- Your GitHub organization has GitHub Copilot enabled, so Copilot can be automatically assigned to any issues created through this guide.

> ⚠️ This template is intended for self-hosted instances only.
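The condition check (step 2 of the workflow) is the gate that prevents duplicate processing. A sketch in Python — the issue dictionary is a simplified stand-in for the Jira webhook payload, whose real field paths are nested and not shown here:

```python
def should_process(issue) -> bool:
    """Replicate the workflow's condition check: status must be
    'In Progress', the approval label must be present, and the
    'copilot_assigned' label must be absent."""
    labels = set(issue.get("labels", []))
    return (issue.get("status") == "In Progress"
            and "product_approved" in labels
            and "copilot_assigned" not in labels)

print(should_process({"status": "In Progress",
                      "labels": ["product_approved"]}))           # → True
print(should_process({"status": "In Progress",
                      "labels": ["product_approved",
                                 "copilot_assigned"]}))           # → False
print(should_process({"status": "Done",
                      "labels": ["product_approved"]}))           # → False
```

Because step 8 adds the `copilot_assigned` label after a successful run, this predicate returns False the next time the same issue fires an update event, making the workflow idempotent.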
by Jonathan
This workflow creates a project in Clockify that any user can track time against. Syncro should be set up with a webhook via a Notification Set for Ticket - created (for anyone).

> This workflow is part of an MSP collection. The original can be found here: https://github.com/bionemesis/n8nsyncro
by Jonathan
This workflow uses a WooCommerce trigger that runs when a new product has been added. It then posts the product to Slack so your team is always kept up to date with new products.

To use this workflow, you will need to set the credentials for the WooCommerce and Slack nodes. You will also need to pick a channel to post the message to.