by Harshil Agrawal
This workflow sends updates about the position of the ISS to a RabbitMQ topic every minute.
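A minimal Node.js sketch of what one update could look like, assuming the Open Notify API as the position source and amqplib as the RabbitMQ client; the exchange and routing-key names are placeholders, and the actual workflow does this with n8n nodes rather than code.

```javascript
// Sketch only: fetch the ISS position and publish it to a RabbitMQ topic.
// Assumes Node 18+ (global fetch) and the amqplib package.
const amqp = require('amqplib');

async function publishIssPosition() {
  const res = await fetch('http://api.open-notify.org/iss-now.json');
  const { iss_position, timestamp } = await res.json();

  const message = {
    latitude: Number(iss_position.latitude),
    longitude: Number(iss_position.longitude),
    timestamp, // Unix seconds
  };

  const conn = await amqp.connect('amqp://localhost'); // placeholder broker URL
  const channel = await conn.createChannel();
  await channel.assertExchange('iss', 'topic', { durable: true }); // placeholder exchange
  channel.publish('iss', 'position.update', Buffer.from(JSON.stringify(message)));
  await channel.close();
  await conn.close();
}

publishIssPosition().catch(console.error);
```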
by Le Nguyen
**LeadBot Autopilot — Chat-to-Lead for Salesforce**

**Description — How It Works**
- Greets & Guides: Welcomes the visitor and collects info step-by-step — Full Name → Email → Mobile → Product Interest.
- Validates Inputs: Checks email/phone formats; politely re-asks if invalid (see the sketch below).
- De-dupe in Salesforce: Looks up by email; updates an existing lead if found.
- Create/Update Lead: Writes to Salesforce, including ProductInterest__c.
- Notify Instantly: Sends a Slack alert to your team and a personalized email to the prospect.
- Close the Loop: Confirms submission and ends the chat.

**Description — Set Up Steps (≈45–75 mins)**
- **Connect Credentials (20–30 mins):** Salesforce OAuth, OpenAI, Slack, SMTP.
- **Tune the Prompt (5–10 mins):** Greeting, field order, product options.
- **Map Fields (10–15 mins):** Name split, email/phone, ProductInterest__c.
- **Smoke Test (10–15 mins):** Run a full chat; verify de-dupe, Slack + email.
- **Go Live (5–10 mins):** Expose the webhook/chat entry point on your site.
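A minimal sketch of the kind of checks the "Validates Inputs" step performs; the regexes and field names here are assumptions for illustration, and the template's actual validation rules may differ.

```javascript
// Loose email/phone checks of the sort a chat-to-lead flow would run before
// writing to Salesforce. Not the template's exact logic.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;
const PHONE_RE = /^\+?[0-9\s().-]{7,15}$/; // roughly E.164-style, punctuation tolerated

function validateLead(input) {
  const errors = [];
  if (!EMAIL_RE.test(input.email ?? '')) {
    errors.push('Please re-enter a valid email address.');
  }
  if (!PHONE_RE.test(input.mobile ?? '')) {
    errors.push('Please re-enter a valid mobile number.');
  }
  return { valid: errors.length === 0, errors };
}

// Example:
// validateLead({ email: 'jane@acme.com', mobile: '+1 555 010 9999' })
// → { valid: true, errors: [] }
```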
by Angel Menendez
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Who's it for**
Community managers, content marketers, and builders who want a daily, skimmable update from a subreddit—automatically summarized, formatted, and cross-posted to DEV Community. Here is a link to the hackathon video detailing this build.

**What it does**
- Collects fresh posts from a subreddit (seeded via RSS).
- Uses the Bright Data node to batch-scrape each post for richer fields (upvotes, comment count, comments).
- Flattens comments/replies with JMESPath and trims payload size (see the sketch below).
- Summarizes into a Morning Brew–style brief (Top Stories, Quick Hits, Community Q&A, Comment Spotlight).
- Converts to clean Markdown and publishes to DEV with HTTP Request.
- Optional: emails the same digest via Gmail.

**How to set up**
- Trigger: Start with a Manual Trigger; swap to Cron (daily) when ready.
- RSS → URLs: Set the subreddit RSS feed of your choice; just add .rss to the end of the subreddit URL.
- Update the AI prompt to fit your needs.

**Requirements**
- DEV Community API key.
- Bright Data account + the Bright Data API key (found here).
- Optional: LLM provider credentials (OpenAI, Gemini).

**How to customize**
- Swap DEV publishing for email/Slack, or post to both.
- Add more subreddits and dedupe by URL.

**Best practices**
- **No hardcoded API keys**—use credentials.
- Pin long-running outputs while building to save credits.
- Only collect publicly available data with Bright Data.
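A minimal sketch of the "flatten comments/replies with JMESPath" step, using the jmespath npm package; the comment/reply field names below are assumptions for illustration, and the real Bright Data output will look different.

```javascript
const jmespath = require('jmespath');

// One scraped post (shape assumed for illustration).
const post = {
  title: 'Show and tell: my n8n digest bot',
  comments: [
    { text: 'Nice!', replies: [{ text: 'Agreed.' }] },
    { text: 'How do you dedupe posts?', replies: [] },
  ],
};

// `[]` is JMESPath's flatten projection, so nested reply lists come out flat.
const commentText = jmespath.search(post, 'comments[].text');
const replyText = jmespath.search(post, 'comments[].replies[].text');

// Trim the payload before it reaches the summarization prompt.
const flattened = [...commentText, ...replyText].slice(0, 50);
console.log(flattened); // ['Nice!', 'How do you dedupe posts?', 'Agreed.']
```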
by Rosh Ragel
**Automatically Send Monthly Sales Reports from Square via Outlook**

**What It Does**
This workflow automatically connects to the Square API and generates a monthly sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run monthly and pull the previous month's sales into a CSV file, which is then sent to a manager/finance team for analysis.

This workflow builds on my previous template, which allows users to automatically pull data from the Square API into n8n for processing. (See here: https://n8n.io/workflows/6358)

**Prerequisites**
To use this workflow, you'll need:
- A Square API credential (configured as a Header Auth credential)
- A Microsoft Outlook credential

**How to Set Up Square Credentials**
- Go to Credentials > Create New
- Choose Header Auth
- Set the Name to Authorization
- Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

**How It Works**
- Trigger: The workflow runs on the 1st of every month at 8:00 AM
- Fetch Locations: An HTTP request retrieves all Square locations linked to your account
- Fetch Orders: For each location, an HTTP request pulls completed orders for the previous calendar month
- Filter Empty Locations: Locations with no sales are ignored
- Aggregate Sales Data: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report
- Create CSV File: A CSV file is created containing the relevant data
- Send Email: An email is sent using Microsoft Outlook to the chosen third party

**Example Use Cases**
- Automatically send monthly Square sales data to management for forecasting and planning
- Automatically send data to an external third party, such as a landlord or agent, who is paid via commission
- Automatically send data to a bookkeeper for entry into QuickBooks

**How to Use**
- Configure both HTTP Request nodes to use your Square API credential
- Set the workflow to Active so it runs automatically
- Enter the email address of the person you want to send the report to and update the message body
- If you want to remove the n8n attribution, you can do so in the last node

**Customization Options**
- Add pagination to handle locations with more than 1,000 orders per month
- Adjust the date filters in the HTTP node to cover the full calendar month (e.g., use Luxon or JavaScript to calculate start_date and end_date), as sketched below

**Why It's Useful**
This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data — whether for operations, finance, or performance monitoring.
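A minimal Luxon sketch for computing the previous calendar month's boundaries (the start_date and end_date mentioned in the customization note). In an n8n Code node, Luxon's DateTime is available as a global; in plain Node.js you would `const { DateTime } = require('luxon')` first. The output field names are only a suggestion.

```javascript
// Previous calendar month, relative to whenever the trigger fires.
const start = DateTime.now().minus({ months: 1 }).startOf('month');
const end = start.endOf('month');

return [
  {
    json: {
      start_date: start.toISO(), // e.g. 2024-05-01T00:00:00.000+00:00
      end_date: end.toISO(),
    },
  },
];
```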
by Robert Breen
Replace YOUR_API_KEY with your actual SerpApi key.

**2️⃣ Set Up OpenAI Connection**
- Go to OpenAI Platform
- Navigate to Billing and ensure your account has credits/funding
- Copy your API Key into the OpenAI credentials in n8n

**🧠 Workflow Breakdown**
- Chat Trigger → User enters a financial question (e.g., "What's the current price of Tesla?").
- HTTP Request (SerpApi Finance Search) → Fetches the latest market data for the requested ticker or index (sketched below).
- OpenAI Node → Takes both the raw financial data and the user's query, then formulates a natural language response.
- Output → Returns a clear, conversational answer that can be displayed in chat, Slack, or another integration.

**🎛️ Customization Guidance**
- **Multiple Tickers**: Update the workflow to query multiple tickers (e.g., TSLA, AAPL, AMZN) and return a combined report.
- **Scheduling**: Add a Schedule Trigger to run this workflow every morning and send a market recap.
- **Delivery Channels**: Use Slack, Email, or Google Sheets nodes to distribute reports automatically.
- **Extended Data**: Adjust the SerpApi query to include more than prices — e.g., company info, market news, or related tickers.
- **Custom Prompts**: Change the OpenAI system prompt to make the chatbot more formal (for reporting) or casual (for quick insights).

**💬 Example Questions & Responses**
- Question: "What's the current price of the S&P 500?"
  Expected Response: "The S&P 500 (^GSPC) is currently trading at 4,725.13, down 0.8% today."
- Question: "Summarize the performance of Tesla and Apple today."
  Expected Response:
  - Tesla (TSLA): $238.45, up 1.5%
  - Apple (AAPL): $192.11, down 0.3%
- Question: "Give me a quick market recap."
  Expected Response: "Markets are mixed today — the S&P 500 is slightly down, while tech stocks like Tesla are showing gains. Apple dipped slightly after earnings news."

**📬 Contact**
Need help customizing this workflow (e.g., multiple tickers, daily summaries, or integrating into dashboards)?
📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
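A hedged sketch of the request behind the "HTTP Request (SerpApi Finance Search)" step. The google_finance engine and the "ticker:exchange" query format follow SerpApi's Google Finance API; the `summary` field used at the end is an assumption about the response shape, so adjust it to what your SerpApi plan actually returns.

```javascript
// Node 18+ (global fetch). The key is read from the environment, never hardcoded.
async function fetchQuote(ticker) {
  const params = new URLSearchParams({
    engine: 'google_finance',
    q: ticker,                        // e.g. 'TSLA:NASDAQ' or '.INX:INDEXSP'
    api_key: process.env.SERPAPI_KEY,
  });

  const res = await fetch(`https://serpapi.com/search.json?${params}`);
  const data = await res.json();
  return data.summary; // assumed quote block handed to the OpenAI node with the user's question
}

fetchQuote('TSLA:NASDAQ').then(console.log).catch(console.error);
```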
by Websensepro
**Automatically Generate AI Follow-Up Messages from Fireflies Transcripts**

This workflow automates creating personalized follow-up messages for your clients based on meeting transcripts fetched from Fireflies. It ensures the right guest information is captured, the transcript is processed by AI, and the output is stored neatly in Google Drive.

**What it Does**
- Triggers on New Appointment: The workflow starts when a new appointment is created in Google Calendar.
- Extracts Guest and Appointment Details: The Edit Fields node extracts the guest's email, appointment start/end time, status, and creator.
- Fetches Transcript from Fireflies: The GraphQL node queries Fireflies using the guest email to fetch the meeting transcript, including sentences, participants, and summary (see the sketch below).
- Skip If Empty: The Filter node skips passing the info to the AI Agent if there is no record in Fireflies.
- Generates Follow-Up Messages via AI: The AI Agent node (powered by Google Gemini) creates 12 personalized follow-up messages/emails for the guest. Messages are conversational, concise, and reference topics and pain points mentioned in the call. The messages are tailored to re-engage the client and guide them towards making a purchase.
- Stores Messages in Google Drive: The Google Drive node saves the AI-generated messages in a specific folder, named after the guest, for easy reference.

**Use Cases**
- **Missed Follow-Ups:** Automatically create personalized follow-ups after client calls without manual effort.
- **Sales & Customer Engagement:** Ensure every client gets context-specific messages, improving engagement and conversion.
- **Team Collaboration:** Messages are saved in Google Drive, making it easy for your team to review and send manually if needed.

**Customization**
- **Transcript Source:** The GraphQL node can be customized to fetch transcripts for specific guests or date ranges.
- **Message Personalization:** The AI prompt in the AI Agent can be updated to change the tone, style, or length of messages.
- **Storage Folder:** You can change the folder in the Google Drive node to organize messages per team, campaign, or client.

**Troubleshooting**
- **AI Messages Not Generated:** Verify the AI Agent node is connected to the Google Gemini Chat Model and has correct API credentials.
- **Messages Not Saved:** Check the Google Drive folder ID and access permissions.

**Requirements**
- An n8n instance (self-hosted or cloud).
- Google Gemini API credentials.
- Google Drive account with proper folder access.
- Fireflies API credentials with GraphQL access.

**How to Set Up**
- **Connect Credentials:** In the Google Calendar, GraphQL, AI Agent, and Google Drive nodes, select your credentials for Google Calendar, Fireflies, Google Gemini, and Google Drive.
- **Set Guest Details Extraction:** Verify the Edit Fields node extracts all required fields (first name, last name, email, appointment times, status).
- **Update GraphQL Query:** Ensure the query correctly fetches transcripts by guest email. Adjust if your Fireflies workspace uses different fields.
- **Customize AI Prompt:** Update the AI Agent with the exact instructions for message generation (number of messages, tone, context, platform).
- **Configure Google Drive Storage:** Select the proper folder to save messages, ideally using the guest name as the file name for easy reference.
- **Activate Workflow:** Save and activate the workflow.
- **Video Tutorial:** Step-by-step video instructions for beginners are available here: https://youtu.be/5t9xXCz4DzM
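A hedged sketch of the lookup the GraphQL node performs. The endpoint and Bearer auth follow Fireflies' GraphQL API; the field and argument names used here (participant_email, sentences, summary) are illustrative, so adjust them to the schema your workspace exposes, exactly as the setup steps advise.

```javascript
// Illustrative query; field names may differ from your Fireflies schema.
const query = `
  query TranscriptsByGuest($email: String) {
    transcripts(participant_email: $email) {
      title
      summary { overview }
      sentences { speaker_name text }
    }
  }`;

async function fetchTranscripts(email) {
  const res = await fetch('https://api.fireflies.ai/graphql', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.FIREFLIES_API_KEY}`,
    },
    body: JSON.stringify({ query, variables: { email } }),
  });
  const { data } = await res.json();
  return data?.transcripts ?? []; // an empty array is what the Filter node stops on
}
```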
by Rosh Ragel
**What It Does**
This workflow automatically connects to the Square API and generates a daily sales summary report for all your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to run daily and pull the previous day's sales into a Google Sheet for easy analysis and reporting.

This workflow builds on my previous template, which allows users to automatically pull data from the Square API into n8n for processing. (See here: https://n8n.io/workflows/6358)

**Prerequisites**
To use this workflow, you'll need:
- A Square API credential (configured as a Header Auth credential)
- A Google Sheets credential

**How to Set Up Square Credentials**
- Go to Credentials > Create New
- Choose Header Auth
- Set the Name to "Authorization"
- Set the Value to your Square Access Token (e.g., Bearer <your-api-key>)

**How It Works**
- Trigger – The workflow runs daily at 4:00 AM
- Fetch Locations – An HTTP request retrieves all Square locations linked to your account
- Fetch Orders – For each location, an HTTP request pulls completed orders for the specified report_date
- Filter Empty Locations – Locations with no sales are ignored
- Aggregate Sales Data – A Code node processes the order data and produces a summary identical to Square's Sales Summary report (see the sketch below)
- Append to Google Sheets – The data will automatically be appended to an existing Google Sheet

**Example Use Cases**
- Automatically store daily sales data in Google Sheets for analysis and historical tracking
- Automatically create charts or visualizations from the imported data
- Build weekly/monthly reports after running for multiple days
- Quickly calculate commissions or rent payments based on sales volume

**How to Use**
- Configure both HTTP Request nodes to use your Square API credential
- Set the workflow to Active so it runs automatically
- Select the Google Sheet you want to import data into, and map the data to your columns

**Customization Options**
- Add pagination to handle locations with more than 1,000 orders per day
- Expand the workflow to save or send the report output via other integrations (email, database, webhook, etc.)

**Why It's Useful**
This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data—whether for operations, finance, or performance monitoring.
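A minimal sketch of the kind of aggregation the Code node performs, written for an n8n Code node (Run Once for All Items mode). Field names follow Square's Order object, where total_money.amount is expressed in the currency's smallest unit; the template's real summary reproduces the full Sales Summary breakdown rather than this single total.

```javascript
// Sum one location's completed orders into a single total-collected figure.
const orders = $input.all().map((item) => item.json);

const totalCollected = orders.reduce(
  (sum, order) => sum + (order.total_money?.amount ?? 0),
  0
);

return [
  {
    json: {
      location_id: orders[0]?.location_id,
      order_count: orders.length,
      total_collected: totalCollected / 100, // cents to major currency units
    },
  },
];
```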
by Sk developer
**Sales Tax Calculator API Integration: Automate Tax Calculation with Google Sheets & RapidAPI**

Effortlessly calculate and store sales tax rates based on user address data using the Sales Tax Calculator API on RapidAPI. Automate the process, format the data, and store results in Google Sheets for easy access.

**Workflow Overview**
This automation workflow integrates the Sales Tax Calculator API from RapidAPI to calculate and store sales tax rates based on user-provided address information. The workflow is designed to automate tax calculation, streamline data processing, and save results in a Google Sheets document for future reference.

**Node-by-Node Explanation**
1. **On Form Submission:** Trigger: This node listens for form submissions, capturing the user's address data (street, city, state, zip).
2. **Calculate Sales Tax:** Action: Sends a POST request to the Sales Tax Calculator API (via RapidAPI) to fetch tax rates based on the submitted address data (sketched below).
3. **Reformat API Response:** Processing: Processes and reformats the data received from the API, structuring the tax agencies, rates, and total tax calculations into rows.
4. **Append to Google Sheets:** Store: Appends the reformatted tax information into a Google Sheets document for easy storage and future analysis.

**Use Case**
This workflow is ideal for businesses or individuals who need to automatically calculate sales tax based on customer-provided address information. It can be used in any e-commerce platform, accounting system, or sales management tool.

**Benefits**
- Automation: Streamline the tax calculation process by automatically calculating and storing tax rates based on user inputs.
- Real-Time Data: Ensure up-to-date tax rates are used for every transaction or form submission.
- Easy Data Access: Tax rates and details are stored in Google Sheets, providing easy access and better organization for future reference.
- Efficient Workflow: Saves time and reduces the possibility of human error by automating the entire process from data collection to storage.

**Integration with RapidAPI**
This workflow is powered by the Sales Tax Calculator API from RapidAPI, which ensures accurate and real-time tax calculations based on user addresses.

**Key Features of the Sales Tax Calculator API**
- Fetch tax rates based on various address details (street, city, state, zip).
- Reliable and fast service via RapidAPI, ensuring smooth API integrations.
- Provides tax rate data for multiple jurisdictions (states, cities, etc.).

Start using the Sales Tax Calculator API on RapidAPI today and streamline your sales tax process.

**🔑 How to Get API Key from RapidAPI Sales Tax Calculator**
Follow these steps to get your API key and start using it in your workflow:
1. **Visit the API Page:** 👉 Click here to open Sales Tax Calculator on RapidAPI
2. **Log in or Sign Up:** Use your Google, GitHub, or email account to sign in. If you're new, complete a quick sign-up.
3. **Subscribe to a Pricing Plan:** Go to the Pricing tab on the API page. Select a plan (free or paid, depending on your needs). Click Subscribe.
4. **Access Your API Key:** Navigate to the Endpoints tab. Look for the X-RapidAPI-Key under Request Headers. Copy the value shown — this is your API key.
5. **Use the Key in Your Workflow:** In your n8n workflow (HTTP Request node), replace `"x-rapidapi-key": "your key"` with `"x-rapidapi-key": "YOUR_ACTUAL_API_KEY"`.

Keywords: Sales Tax Calculator, Sales Tax API, RapidAPI, Tax Calculation, Google Sheets Integration, Automation, API Integration
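A hedged sketch of the "Calculate Sales Tax" request pattern. RapidAPI calls need both an x-rapidapi-key and an x-rapidapi-host header; the URL, host, and request body shape below are placeholders, so copy the exact values shown on the API's Endpoints tab.

```javascript
// Generic RapidAPI POST pattern; host and path are placeholders, not the real endpoint.
async function calculateSalesTax(address) {
  const res = await fetch('https://<host-from-endpoints-tab>/<endpoint-path>', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'x-rapidapi-key': process.env.RAPIDAPI_KEY, // your actual API key, kept out of the workflow JSON
      'x-rapidapi-host': '<host-from-endpoints-tab>',
    },
    body: JSON.stringify(address), // { street, city, state, zip } from the form submission
  });
  return res.json();
}

// Example (illustrative address):
// calculateSalesTax({ street: '1 Main St', city: 'Austin', state: 'TX', zip: '78701' });
```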
by Sk developer
**🌐 Bulk Domain Authority (DA/PA) Checker And Google Sheet Logging**

Easily check Domain Authority (DA) and Page Authority (PA) for multiple domains using this automated n8n workflow powered by the Bulk DA PA Checker API on RapidAPI. Simply submit your domains via a web form, and the workflow fetches detailed SEO metrics and logs the data into Google Sheets.

**🚀 What This Workflow Does**
This automation leverages the Bulk DA PA Checker API from RapidAPI to:
- Accept multiple domains via a user-friendly form
- Send bulk requests to the Bulk DA PA Checker API for fast SEO metric retrieval
- Process and reformat the API response for easy consumption
- Append the domain authority data directly into Google Sheets for tracking and analysis

Perfect for SEO pros, marketers, and agencies looking to streamline their domain analysis with the power of RapidAPI.

**⚙️ Workflow Highlights**

| 🧩 Node | 🔍 Description |
|--------|----------------|
| 📝 Form Trigger | User submits comma-separated domains through a simple form. |
| 🌐 Check DA PA Bulk (RapidAPI) | Sends a POST request to the Bulk DA PA Checker API to fetch DA/PA and related SEO metrics. |
| 🛠️ Re Format | Parses and extracts each domain's data from the API response. |
| 📊 Append in Google Sheets | Logs all metrics in a structured Google Sheet for easy review and reporting. |

**🧠 Key SEO Metrics Retrieved**
- Domain Authority
- Page Authority
- Spam Score
- HTTP Status Code
- Last Crawled Date
- External URLs and Redirects
- And many more from the Bulk DA PA Checker API response

**✅ Why Use This Workflow with the Bulk DA PA Checker API?**
- Bulk checking saves time compared to manual domain lookups
- Reliable data powered by a trusted RapidAPI service
- Seamless integration with Google Sheets for reporting
- Easily repeatable and scalable for large domain lists

**🔑 How to Get API Key from RapidAPI Bulk DA PA Checker API**
Follow these steps to get your API key and start using it in your workflow:
1. **Visit the API Page:** 👉 Click here to open Bulk DA PA Checker API on RapidAPI
2. **Log in or Sign Up:** Use your Google, GitHub, or email account to sign in. If you're new, complete a quick sign-up.
3. **Subscribe to a Pricing Plan:** Go to the Pricing tab on the API page. Select a plan (free or paid, depending on your needs). Click Subscribe.
4. **Access Your API Key:** Navigate to the Endpoints tab. Look for the X-RapidAPI-Key under Request Headers. Copy the value shown — this is your API key.
5. **Use the Key in Your Workflow:** In your n8n workflow (HTTP Request node), replace `"x-rapidapi-key": "your key"` with `"x-rapidapi-key": "YOUR_ACTUAL_API_KEY"`.
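Before the bulk request goes out, the comma-separated form input has to become a clean list of domains. Here is a small Code-node sketch of that preparation step; the form field name (`domains`) is an assumption, and the exact request body the API expects comes from its Endpoints tab.

```javascript
// Split, trim, lowercase, and drop empty entries from the submitted text.
const raw = $input.first().json.domains ?? '';

const domains = raw
  .split(',')
  .map((d) => d.trim().toLowerCase())
  .filter(Boolean);

return [{ json: { domains, count: domains.length } }];
```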
by Antxon Pous
**Overview ✨**
This template finds businesses on Google Maps → writes to Google Sheets → enriches + **verifies** email contacts so your outreach stays clean and deliverable. It includes Sticky Notes to explain the flow and setup.

**What it does 🔎➡️📄➡️✅**
- Search & save: Fetch places and append rows (Name, Address, Website, Rating, Phone) to your sheet.
- Enrich: Ask an LLM (Perplexity) for Email + a short company Background.
- Verify emails: For email verification we use VerificarEmails to validate every email address before writing back (fewer bounces, less blocking).

> ℹ️ VerificarEmails node: install the community node @verificaremails/n8n-nodes-verificaremails (Settings → Community Nodes) and use it until it's approved as a core node.

✨ YouTube Video

**Requirements 🔐**
- Use n8n Credentials for: Google Sheets, Perplexity (Bearer), and VerificarEmails.
- Keep prompts short and set temperature to 0 to lower cost.
- Create a Google Sheet with the following headers: UUID, Name, Address, Website, Rating, Email, Valid Email, Phone, Valid Phone, Summary

**How to run ▶️**
- Set your search params (query, locale, pages).
- Run the flow: it saves results → enriches → verifies emails → updates the sheet.
- If the read/update step doesn't run, ensure the finalization (Done) path and the correct sheet/tab are selected.

**Why verify? ✉️🧹**
Bad emails cause bounces, throttling, and blocks. Verifying first protects sender reputation and improves deliverability. Beyond emails, VerificarEmails also offers phone and name verification. Check the Email Validation API documentation to better understand verification results.
by Kaden Reese
**AI-Powered Local News Digest to Discord (or Slack/Telegram/WhatsApp)**

Stay on top of what's happening in your community without drowning in endless RSS feeds. This workflow pulls in local news sources daily, filters duplicates, and uses Google's Gemini API to rank the most relevant stories before sending a clean digest straight to Discord. (Easily adaptable for Slack, Telegram, WhatsApp, or email.)

**How it Works**
- Daily Trigger – Runs every morning at 8AM by default.
- Collect Feeds – Pulls in stories from multiple RSS feeds (customizable to your region).
- Deduplicate & Prepare – Removes repeats and cleans article data (see the sketch below).
- AI Scoring – Uses the Gemini API to score stories by relevance.
- Filter & Sort – Keeps only the top articles (default: 5).
- Deliver Digest – Formats and sends the summary to Discord (or another messaging app).

**Why Use This**
- **Save time** – Skip endless scrolling through local feeds.
- **Stay relevant** – AI ranks stories so you only get what matters most.
- **Be flexible** – Works with any RSS feed and can send to Discord, Slack, Telegram, WhatsApp, or email.
- **Newsletter/Blogs** – Get daily, relevant updates to share on social media, your newsletters, etc.

Perfect for individuals, teams, or community groups who want a daily, high-quality snapshot of local news. 📰⚡
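A minimal Code-node sketch of the "Deduplicate & Prepare" step: drop articles whose link has already been seen and keep only the fields the scoring prompt needs. The field names (title, link, contentSnippet, isoDate) reflect typical RSS Read node output in n8n; your feeds may expose slightly different fields.

```javascript
// Run Once for All Items: dedupe by link and slim each article down.
const seen = new Set();
const items = [];

for (const item of $input.all()) {
  const link = item.json.link;
  if (!link || seen.has(link)) continue;
  seen.add(link);

  items.push({
    json: {
      title: item.json.title,
      link,
      summary: item.json.contentSnippet ?? '',
      published: item.json.isoDate ?? item.json.pubDate ?? '',
    },
  });
}

return items;
```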
by Arunava
**Amazon Price Tracker & Competitor Monitoring Workflow (Apify + Google Sheets)**

This n8n workflow automates Amazon price tracking and competitor monitoring by scraping product pricing via Apify and updating your Google Sheet every day. It removes manual price checks, keeps your pricing data always fresh, and helps Amazon sellers stay ahead in competitive pricing, Buy Box preparation, and daily audits.

**💡 Use Cases**
- Automatically track prices of your Amazon products
- Monitor competitor seller prices across multiple URLs
- Maintain a daily pricing database for reporting and insights
- Catch sudden competitor undercutting or pricing changes
- Support Buy Box analysis by comparing seller prices
- Scale from 10 to 1000+ product URLs without manual effort

**🧠 How It Works**
- **Scheduled Trigger** runs the workflow every morning
- **Google Sheets node** loads all product rows with seller URLs
- **Loop node** processes each item one by one
- **Apify Actor node** triggers the Amazon scraper
- **HTTP Request node** fetches the scraped result from Apify
- **JavaScript node** extracts, cleans, and formats price data
- **Update Sheet node** writes the fresh prices back to the right row
- Supports additional price columns for more sellers or metrics

**➕ Adding New Competitor Columns (Step-by-Step)**

1. Add new columns in Google Sheets
Add two new columns: competitor_url_3 and price_comp_3.

2. Update the Apify Actor (inside n8n)
In the Apify Actor node, pass the new competitor URL:

```
"competitor_url_3": "{{ $json.competitor_url_3 }}"
```

This ensures Apify scrapes the additional competitor product page.

3. Update your Code (JavaScript) node
Inside the Code node, extract the new competitor's price from the Apify JSON and attach it to the output:

```javascript
const price_comp_3 = item?.offers?.[2]?.price || null;
item.price_comp_3 = price_comp_3;
return item;
```

(Adjust the index [2] based on the Apify output structure; a fuller Code-node version is sketched after this section.)

4. Update the Google Sheets "Update Row" node
To save the new values into your Sheet:
- Open your Google Sheets Update Row node
- Scroll to Field Mapping
- Map the columns with the new data
- Hit the "Save & Execute" button 🚀

**⚡ Requirements**
- Apify account (free tier is enough)
- Apify "Amazon Product Scraper" API (costs $40/month, 14-day free trial)
- Google Sheet containing product URLs
- Basic credentials setup inside n8n

**🙌 Want me to set it up for you?**
I'll configure the full automation — Apify scraper, n8n workflow, Sheets mapping, and error handling.
Email me at: imarunavadas@gmail.com

Automate the boring work and focus on smarter selling. 🚀
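A slightly fuller, hedged version of the Code-node snippet from step 3, written for an n8n Code node in "Run Once for Each Item" mode. The offers[2] index and field names mirror the snippet above; adjust them to the actual Apify output structure.

```javascript
// One scraped product from the Apify result for this loop iteration.
const item = $input.item.json;

// Pull the third competitor's price, falling back to null when absent.
const price_comp_3 = item?.offers?.[2]?.price ?? null;

return {
  json: {
    ...item,        // keep the original fields for the Update Row mapping
    price_comp_3,   // new value mapped to the price_comp_3 column
  },
};
```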