by Sk developer
## Automated DA PA Checker Workflow for SEO Analysis

### Description
This n8n workflow collects a website URL via form submission, retrieves SEO metrics such as Domain Authority (DA) and Page Authority (PA) using the Moz DA PA Checker API, and stores the results in Google Sheets for easy tracking and analysis.

### Node-by-Node Explanation
- **On form submission** – Captures the website input from the user to pass to the Moz DA PA Checker API.
- **DA PA API Request** – Sends the website to the Moz DA PA Checker API via RapidAPI to fetch DA, PA, spam score, DR, and organic traffic.
- **If** – Checks whether the API request returned a successful response.
- **Clean Output** – Extracts only the useful data from the API response for saving (see the sketch below).
- **Google Sheets** – Appends the cleaned SEO metrics to a Google Sheet for record-keeping.

### Use Cases
- **SEO Analysis** – Quickly evaluate a website's DA/PA metrics for optimization strategies.
- **Competitor Research** – Compare domain authority and organic traffic with competitors.
- **Link Building** – Identify high-authority domains for guest posting and backlinks.
- **Domain Purchase Decisions** – Check metrics before buying expired or auctioned domains.

### Benefits
- **Automated Workflow** – Runs from input to Google Sheets without manual intervention.
- **Accurate Metrics** – Uses the trusted Moz DA PA Checker API for DA, PA, spam score, DR, and traffic.
- **Instant Insights** – Get SEO scores in seconds for faster decision-making.
- **Easy Integration** – Seamless connection between RapidAPI and Google Sheets for data storage.
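A minimal sketch of what the Clean Output step might look like as an n8n Code node. The response field names are assumptions, since the exact keys depend on the Moz DA PA Checker API's response shape:

```javascript
// n8n Code node – keep only the metrics we want to store in the sheet.
// All right-hand field names below are assumptions; adjust to the real API response.
const r = $input.first().json;

return [{
  json: {
    website: r.url,                     // hypothetical key
    domainAuthority: r.da,              // hypothetical key
    pageAuthority: r.pa,                // hypothetical key
    spamScore: r.spam_score,            // hypothetical key
    domainRating: r.dr,                 // hypothetical key
    organicTraffic: r.organic_traffic,  // hypothetical key
    checkedAt: new Date().toISOString(),
  },
}];
```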
by Shelly-Ann Davy
Build an authentic Reddit presence and generate qualified leads through AI-powered community engagement that provides genuine value without spam or promotion.

## 🎯 What This Workflow Does
This intelligent n8n workflow monitors 9 targeted subreddits every 4 hours, uses AI to analyze posts for relevance and lead potential, generates authentic, helpful responses that add value to discussions, posts comments automatically, and captures high-quality leads (70%+ potential score) directly into your CRM, all while maintaining full Reddit compliance and looking completely human.

## ✨ Key Features
- **6 Daily Checks**: Monitors subreddits every 4 hours for fresh content
- **9 Subreddit Coverage**: Customizable list of target communities
- **AI Post Analysis**: Determines relevance, intent, and lead potential
- **Intelligent Engagement**: Only comments when you can add genuine value
- **Authentic Responses**: AI-generated comments that sound human, not promotional
- **Lead Scoring**: 0–1.0 scale identifies high-potential prospects (0.7+ captured; see the sketch below)
- **Automatic CRM Integration**: High-quality leads flow directly to Supabase
- **Rate Limit Protection**: 60-second delays ensure Reddit API compliance
- **Native Reddit Integration**: Official n8n Reddit node with OAuth2
- **Beginner-Friendly**: 14+ detailed sticky notes explaining every component

## 🎯 Target Subreddits (Customizable)
**Insurance & Claims:**
- r/Insurance – General insurance questions
- r/ClaimAdvice – Claim filing help
- r/AutoInsurance – Auto coverage discussions
- r/FloodInsurance – Flood damage queries
- r/PropertyInsurance – Property coverage

**Property & Home:**
- r/homeowners – Property issues and claims
- r/RoofingContractors – Roof damage discussions

**Financial & Legal:**
- r/PersonalFinance – Insurance decisions
- r/legaladvice – Legal aspects of claims

## 🤖 AI Analysis Components
**Post Evaluation:**
- Relevance score (0–100%)
- User intent detection
- Damage type identification (hail, water, fire, wind)
- Urgency level (low/medium/high)
- Lead potential score (0–1.0)
- Recommended services
- Engagement opportunity assessment

**Decision Criteria:**
- Should engage? (boolean)
- Can we add value? (quality check)
- Is this promotional? (avoid spam)
- Lead worth capturing? (70%+ threshold)

**Typical Engagement Rate:** 5–10% of analyzed posts (67–135 comments/day)

## 🔧 Technical Stack
- **Trigger**: Schedule (every 4 hours, 6x daily)
- **Reddit API**: Native n8n node with OAuth2
- **AI Analysis**: Supabase Edge Functions
- **Response Generation**: AI-powered contextual replies
- **Lead Capture**: Supabase CRM integration
- **Rate Limiting**: Wait node (60-second delays)
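A minimal sketch of the lead-capture gate as an n8n Code node, assuming the AI analysis step returns items with `leadScore` and `shouldEngage` fields (both field names are assumptions; match them to your Supabase Edge Function's output):

```javascript
// n8n Code node – pass through only posts worth capturing as leads.
// `leadScore` (0–1.0) and `shouldEngage` are assumed AI-analysis output fields.
const posts = $input.all();

return posts.filter((item) => {
  const { leadScore = 0, shouldEngage = false } = item.json;
  return shouldEngage && leadScore >= 0.7; // the 70%+ capture threshold
});
```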
by SIENNA
## Automated FTP/SFTP to MinIO Object Backup with Scheduling
→ Works with FTP/SFTP servers, such as the one hosting your WordPress website!

### What this workflow does
This workflow performs automated, periodic backups of files from an FTP or SFTP server directory to a MinIO S3 bucket running locally or on a dedicated container/VM/server. It also works if the MinIO bucket runs on a remote cloud provider's infrastructure; you just need to change the URL and keys.

### Who this is intended for
Storage administrators, cloud architects, or DevOps engineers who need a simple and scalable solution for retrieving data from a remote FTP or SFTP server. It is also practical for WordPress developers who need to back up data from a server hosting a WordPress website. In that case, just specify the folder you want to back up (one under wordpress/uploads, or even the root directory).

### How it works
This workflow uses commands to list and download files from a specific directory on an FTP/SFTP server, then sends them to MinIO using its implementation of the S3 API (see the sketch below). The source directory can be a specific folder or the entire server (the root directory).

### Requirements
None, apart from a source folder/directory on an FTP/SFTP server and a destination bucket on MinIO. You'll also need MinIO up and running.

Using Proxmox VE? Create a MinIO LXC container: https://community-scripts.github.io/ProxmoxVE/scripts?id=minio

### Need automated backups from another cloud storage provider?
→ Check out our templates: we've done it with AWS, Azure, and GCP, and we even have a version for FTP/SFTP servers!

These workflows can be integrated into bigger ones and modified to best suit your needs. You can, for example, replace the MinIO node with another S3 bucket from a different cloud storage provider (Backblaze, Wasabi, Scaleway, OVH, ...).
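Because MinIO speaks the S3 protocol, the upload step is equivalent to a plain S3 put against your MinIO endpoint. A minimal sketch using the MinIO JavaScript SDK, assuming a local MinIO instance; every endpoint value, key, bucket, and path is a placeholder:

```javascript
// Sketch using the MinIO JavaScript SDK (npm: minio); all values are placeholders.
import * as Minio from 'minio';

const minio = new Minio.Client({
  endPoint: 'localhost',   // or your cloud provider's S3-compatible endpoint
  port: 9000,
  useSSL: false,
  accessKey: 'YOUR_ACCESS_KEY',
  secretKey: 'YOUR_SECRET_KEY',
});

// Upload one file downloaded from the FTP/SFTP server into the destination bucket.
await minio.fPutObject('backups', 'wordpress/uploads/image.jpg', '/tmp/image.jpg');
```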
by SerpApi
## Sync Google Maps Reviews to Google Sheets for Any Google Maps Query

### How it works
This workflow accepts any query you might run on actual Google Maps to search for places. The search happens through SerpApi's Google Maps API. Once the workflow receives place results from Google Maps, it loops through each place, fetching reviews using SerpApi's Google Maps Reviews API (see the sketch below).

By default, the workflow fetches up to 50 reviews per place; this can be customized in the 'Set Review Limit' node. The first page of reviews for a place returns only 8 reviews, and all subsequent pages return up to 20 reviews. The fetched reviews are sent to a connected Google Sheet.

### How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help is available here: https://n8n.io/integrations/google-sheets/
4. Create a Google Sheet with these column headers: name, iso_date, rating, snippet
5. Connect your Google Sheet in the 'Append Reviews' Google Sheets node
6. Update the search query in the 'Search Google Maps' node to set your own query
7. (Optional) Update the review limit from the default 50 in the 'Set Review Limit' node. Set it to a very high number (e.g. 50000) to get all possible reviews.
8. Hit 'Test Workflow' to manually trigger the workflow.

### Limitations
Can only retrieve the top 20 results from Google Maps; it won't paginate to get more results. The workflow could be extended to support Google Maps pagination.

### Warning
Each request to SerpApi consumes 1 search credit. Be mindful of how many search credits your account has before requesting more reviews than your account supports. As an example, if a Google Maps query returns 20 results and you fetch the default limit of 50 reviews per place, this uses up to 61 SerpApi search credits: 1 credit for the Maps search plus up to 3 review-page requests for each of the 20 places (8 reviews on the first page, then up to 20 per page after that).

### Documentation
- Google Maps API
- Google Maps Reviews API
- SerpApi n8n Node Intro Guide
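A minimal sketch of the per-place review loop. Parameter and field names (`google_maps_reviews`, `place_id`, `next_page_token`, `serpapi_pagination`) follow SerpApi's public documentation, but treat them as assumptions to verify against the current API reference:

```javascript
// n8n Code node – fetch reviews for one place until the limit is reached.
// `placeId` and `limit` are assumed inputs from earlier nodes.
const { placeId, limit } = $input.first().json;
const API_KEY = 'YOUR_SERPAPI_KEY'; // use n8n credentials in the real workflow

let reviews = [];
let nextPageToken;

while (reviews.length < limit) {
  const params = new URLSearchParams({
    engine: 'google_maps_reviews',
    place_id: placeId,
    api_key: API_KEY,
  });
  if (nextPageToken) params.set('next_page_token', nextPageToken);

  const res = await fetch(`https://serpapi.com/search.json?${params}`);
  const data = await res.json();

  reviews = reviews.concat(data.reviews ?? []);
  nextPageToken = data.serpapi_pagination?.next_page_token;
  if (!nextPageToken) break; // no more pages of reviews
}

return reviews.slice(0, limit).map((r) => ({ json: r }));
```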
by KlickTipp
**Community Node Disclaimer:** This workflow uses KlickTipp community nodes.

### Introduction
This workflow automates Stripe checkout confirmations by capturing transaction data and syncing it into KlickTipp. Upon successful checkout, the contact's data is enriched with purchase details and tagged to trigger a personalized confirmation campaign in KlickTipp. Perfect for digital product sellers, course creators, and service providers seeking an end-to-end automated sales confirmation process.

### Benefits
- **Instant confirmation emails**: Automatically notify customers upon successful checkout, with no manual processing needed.
- **Structured contact data**: Order data (invoice link, amount, transaction ID, products) is stored in KlickTipp custom fields.
- **Smart campaign triggering**: Assign dynamic tags to start automated confirmation or fulfillment sequences.
- **Seamless digital delivery**: Ideal for pairing with tools like Memberspot or Mentortools to unlock digital products post-checkout.

### Key Features
- **Stripe Webhook Trigger**: Triggers on checkout.session.completed events and captures checkout data including product names, order number, and total amount.
- **KlickTipp Contact Sync**: Adds or updates contacts in KlickTipp, maps Stripe data into custom fields (see the sketch below), and assigns a tag such as Stripe Checkout to initiate a confirmation campaign.
- **Router Logic (optional)**: Branches logic based on product ID or Stripe payment link, enabling product-specific campaigns or follow-ups.

### Setup Instructions

#### KlickTipp Preparation
Create the following custom fields in your KlickTipp account:

| Field Name | Field Type |
|------------------------|----------------|
| Stripe \| Products | Text |
| Stripe \| Total | Decimal Number |
| Stripe \| Payment ID | Text |
| Stripe \| Receipt URL | URL |

Define a tag for each product or confirmation flow, e.g., Order: Course XYZ.

#### Credential Configuration
- Connect your Stripe account using an API key from the Stripe Dashboard.
- Authenticate your KlickTipp connection with username/password credentials (API access required).

#### Field Mapping and Workflow Alignment
- Map Stripe output fields to the KlickTipp custom fields.
- Assign the tag to trigger your post-purchase campaign.
- Ensure that required data like email and opt-in info are present for the contact to be valid.

#### Testing and Deployment
1. Click on Inactive to activate the scenario.
2. Perform a test payment using a Stripe product link.
3. Verify in KlickTipp:
   - The contact appears with email and opt-in status.
   - Custom fields for Stripe are filled.
   - The campaign tag is correctly applied and the confirmation email is sent.

⚠️ Note: Use real or test-mode API keys in Stripe depending on your testing environment. Stripe events may take a few seconds to propagate.

### Campaign Expansion Ideas
- Launch targeted upsell flows based on the product tag.
- Use confirmation placeholders like: [[Stripe | Products]], [[Stripe | Total]], [[Stripe | Payment ID]]
- Route customers to different product access portals (e.g., Memberspot, Mentortools).
- Send follow-up content over multiple days using KlickTipp sequences.

### Customization
You can extend the scenario using a Switch node to:
- Assign different tags per payment link used
- Branch into upsell or membership activation flows
- Chain additional automations like CRM entry, Slack notification, or invoice creation

### Resources
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
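A minimal sketch of mapping the Stripe webhook payload to the KlickTipp custom fields, assuming a standard checkout.session.completed event body. The left-hand KlickTipp field identifiers are hypothetical; copy the real ones from your KlickTipp account:

```javascript
// n8n Code node – map Stripe checkout data to KlickTipp custom fields.
const session = $input.first().json.data.object; // checkout.session.completed payload

return [{
  json: {
    email: session.customer_details?.email,
    fields: {
      // Left-hand keys are hypothetical KlickTipp field identifiers.
      'Stripe | Total': (session.amount_total ?? 0) / 100, // Stripe amounts are in cents
      'Stripe | Payment ID': session.payment_intent,
      // Product names and the receipt URL typically require extra Stripe calls
      // (list line items / retrieve the charge), handled in separate nodes.
    },
    tag: 'Stripe Checkout', // triggers the confirmation campaign
  },
}];
```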
by Yaron Been
## Generate Custom Text Content with IBM Granite 3.3 8B Instruct AI

This workflow connects to Replicate's API and uses the ibm-granite/granite-3.3-8b-instruct model to generate text.

### 🔵 SECTION 1: Trigger & Setup

**⚙️ Nodes**

1️⃣ **On clicking 'execute'**
- **What it does:** Starts the workflow manually when you hit **Execute**.
- **Why it's useful:** Perfect for testing text generation on demand.

2️⃣ **Set API Key**
- **What it does:** Stores your **Replicate API key** securely.
- **Why it's useful:** You don't hardcode credentials into HTTP nodes; just set them once here.
- **Beginner tip:** Replace YOUR_REPLICATE_API_KEY with your actual API key.

**💡 Beginner Benefit**
✅ No coding needed to handle authentication.
✅ You can reuse the same setup for other Replicate models.

### 🤖 SECTION 2: Model Request & Polling

**⚙️ Nodes**

3️⃣ **Create Prediction (HTTP Request)**
- **What it does:** Sends a **POST request** to Replicate's API to start a text generation job.
- **Parameters include:** temperature, max_tokens, top_k, top_p.
- **Why it's useful:** Controls how creative or focused the AI text output will be.

4️⃣ **Extract Prediction ID (Code)**
- **What it does:** Pulls the **prediction ID** and builds a URL for checking status.
- **Why it's useful:** Replicate jobs run asynchronously, so you need the ID to track progress.

5️⃣ **Wait**
- **What it does:** Pauses for **2 seconds** before checking the prediction again.
- **Why it's useful:** Prevents spamming the API with too many requests.

6️⃣ **Check Prediction Status (HTTP Request)**
- **What it does:** Polls the Replicate API for the **current status** (e.g., starting, processing, succeeded).
- **Why it's useful:** Lets you loop until the AI finishes generating text.

7️⃣ **Check If Complete (IF Condition)**
- **What it does:** If the status is **succeeded**, it goes to "Process Result." Otherwise, it loops back to **Wait** and retries.
- **Why it's useful:** Creates an automated polling loop without writing complex code (see the sketch at the end of this description).

**💡 Beginner Benefit**
✅ No need to manually refresh or check job status.
✅ Workflow keeps retrying until text is ready.
✅ Smart looping built in with Wait + If Condition.

### 🟢 SECTION 3: Process & Output

**⚙️ Nodes**

8️⃣ **Process Result (Code)**
- **What it does:** Collects the final **AI output**, status, metrics, and timestamps.
- **Adds info like:**
  - ✅ output → Generated text
  - ✅ model → ibm-granite/granite-3.3-8b-instruct
  - ✅ metrics → Performance data
- **Why it's useful:** Gives you a neat, structured JSON result that's easy to send to Sheets, Notion, or any app.

**💡 Beginner Benefit**
✅ Ready-to-use text output.
✅ Easy integration with any database or CRM.
✅ Transparent metrics (when it started, when it finished, etc.).

### ✨ FULL FLOW OVERVIEW

| Section | What happens |
| --- | --- |
| ⚡ Trigger & Setup | Start workflow + set Replicate API key. |
| 🤖 Model Request & Polling | Send request → get prediction ID → loop until job completes. |
| 🟢 Process & Output | Extract clean AI-generated text + metadata for storage or further workflows. |

### 📌 How You Benefit Overall
✅ No coding needed: just configure your API key.
✅ Reliable polling: the workflow waits until results are ready.
✅ Flexible: you can extend output to Google Sheets, Slack, Notion, or email.
✅ Beginner-friendly: clean separation of input, process, and output.

✨ With this workflow, you've turned Replicate's IBM Granite LLM into a no-code text generator, running entirely inside n8n! ✨
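A minimal standalone sketch of the create-and-poll pattern the workflow implements, using Replicate's predictions endpoint. The sampling values shown are placeholders, not the workflow's actual defaults:

```javascript
// Plain Node.js sketch of what the HTTP Request + Wait + If loop does in n8n.
const TOKEN = 'YOUR_REPLICATE_API_KEY'; // set via the 'Set API Key' node

async function generateText(prompt) {
  // 3️⃣ Create Prediction – start the text generation job.
  const create = await fetch(
    'https://api.replicate.com/v1/models/ibm-granite/granite-3.3-8b-instruct/predictions',
    {
      method: 'POST',
      headers: { Authorization: `Bearer ${TOKEN}`, 'Content-Type': 'application/json' },
      body: JSON.stringify({
        input: { prompt, temperature: 0.7, max_tokens: 512, top_k: 50, top_p: 0.9 },
      }),
    }
  );
  let prediction = await create.json();

  // 5️⃣–7️⃣ Wait 2 seconds, check status, and loop until the job finishes.
  while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
    await new Promise((resolve) => setTimeout(resolve, 2000));
    const poll = await fetch(prediction.urls.get, {
      headers: { Authorization: `Bearer ${TOKEN}` },
    });
    prediction = await poll.json();
  }

  // 8️⃣ Process Result – contains output, status, metrics, and timestamps.
  return prediction;
}
```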
by SerpApi
## Google Play Store App Rank and Rating Monitoring

### What this is and who it's for
This workflow is useful for anyone looking to do SEO tracking on the Google Play Store. It automates checking Google Play Store rank positions and average ratings for a list of app titles. The SerpApi component can also be modified to use other APIs for SEO tracking on any other search engine supported by SerpApi.

### How it works
This workflow takes in a list of keywords and app titles, identifies each app's rank in Google Play Store search results, and also grabs the app's average rating (see the sketch below). The search uses SerpApi's Google Play Store API.

The results are then synced to two different sheets in a Google Sheet. The first is a log of all past runs; the latest results are appended to the bottom of the log. The second updates a kind of "dashboard" showing the results from the latest run.

The workflow includes a Wait node that delays 4 seconds between each app title and keyword pair to prevent hitting the default Google Sheets API per-minute rate limit. You can delete this if you have a high enough custom rate limit on the Google Sheets API.

The Schedule Trigger is configured to run at 10 AM UTC every day.

### How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help is available here: https://n8n.io/integrations/google-sheets/
4. Copy this Google Sheet to your own Google account: https://docs.google.com/spreadsheets/d/1DiP6Zhe17tEblzKevtbPqIygH3dpPCW-NAprxup0VqA/edit?gid=1750873622#gid=1750873622
5. Set your own list of keywords and app titles to match in the 'Latest Run' sheet. This is the source list used to run the searches and must be set.
6. Connect your Google Sheet in the 'Get Keywords and Titles to Match' Google Sheets node
7. Connect your Google Sheet in the 'Update Rank & Rating Log' Google Sheets node
8. Connect your Google Sheet again in the 'Update Latest Run' Google Sheets node
9. (Optional) Update the schedule, or disable it to run only manually

### Documentation
- SerpApi Google Play Store API
- SerpApi n8n Node Intro Guide
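A minimal sketch of locating an app's rank and rating in the search results. The field names (`organic_results`, `items`, `title`, `rating`) follow SerpApi's Google Play Store API documentation, but treat them as assumptions to verify; `appTitle` and `keyword` are assumed inputs from the sheet:

```javascript
// n8n Code node – find the target app's rank and rating in Play Store results.
const input = $input.first().json;
const { appTitle, keyword } = input;

// Flatten all result sections into one ordered list of apps.
const apps = (input.organic_results ?? []).flatMap((section) => section.items ?? []);

const index = apps.findIndex((app) => app.title === appTitle);

return [{
  json: {
    keyword,
    appTitle,
    rank: index >= 0 ? index + 1 : 'not found', // 1-based position in results
    rating: index >= 0 ? apps[index].rating ?? null : null,
    checkedAt: new Date().toISOString(),
  },
}];
```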
by Kev
Automatically create and publish ready-to-post social media news updates, all powered by AI. This workflow turns any RSS feed into professional, branded posts, complete with visuals and captions. Use cases include automating news updates, sharing industry insights, or maintaining an active social presence without manual work.

## Good to know
- Fully automated end-to-end publishing, from RSS feed to social post
- Uses JsonCut for dynamic image composition (backgrounds, text overlays, logos)
- Publishes directly to Instagram (or other channels) via Blotato
- Utilizes OpenAI GPT-5 for post text and image prompt generation
- A polling mechanism checks the image job status every 3 seconds
- Setup time: under 10 minutes once credentials are in place

## How it works
1. The RSS Trigger monitors any RSS feed for new content.
2. OpenAI GPT-5 rewrites the headline and creates a short, social-friendly post caption.
3. An AI image prompt is generated to match the article's topic and mood.
4. JsonCut combines the background, logo, and headline text into a branded image.
5. Once the image is ready, Blotato uploads and publishes the post directly to Instagram (or other connected platforms).

The process runs completely automatically; no human input is required after setup.

## How to use
1. Import the workflow into your n8n instance.
2. Configure your RSS feed URL(s).
3. Add your JsonCut, Blotato, and OpenAI credentials.
4. Activate the workflow; it will automatically generate and post new content whenever your RSS source updates.

## Requirements
- Free account at jsoncut.com
- Account at blotato.com (paid service; can be replaced with any social media API or publishing platform)
- API keys for both services:
  - JsonCut API Key via app.jsoncut.com
  - Blotato API Key via www.blotato.com
- OpenAI credential (GPT-5 or compatible model)
- RSS feed URL (e.g. from a news site, blog, or press page)

## Setup steps
1. Sign up for a free account at app.jsoncut.com.
2. If you use Blotato, create an account at blotato.com and generate an API key.
3. In n8n, add:
   - JsonCut API Key (HTTP Header Auth, header: x-api-key)
   - Blotato API credential (optional; can be replaced)
   - OpenAI credential for GPT-5
4. Replace the example RSS URL in the RSS Feed Trigger node with your own.
5. Activate the workflow; it will start monitoring, generating, and posting automatically.

## Customising this workflow
You can easily adjust:
- The image layout and branding (in the "Create JsonCut Job" node)
- The tone or length of social captions (in the "Create Instagram Text" node prompt)
- The publishing platform: replace Blotato with another integration (e.g. Buffer, Hootsuite, or a native social API)
- Posting frequency via the RSS trigger interval

For advanced customization, check out:
- JsonCut Documentation
- JsonCut Image Generation Examples
- Blotato Website
- n8n Documentation
by Port IO
## Auto-resolve Jira tickets with coding agents

Coding agents can significantly speed up development, but crucial engineering context often gets lost in the process. This guide demonstrates how to use Port as a context lake in n8n workflows to automatically generate GitHub issues from Jira tickets with rich organizational context, ensuring that important information is preserved when assigning them to GitHub Copilot and linking pull requests back to Jira. This setup helps establish a seamless ticket-to-PR workflow, bridging the gap between Jira and GitHub while leveraging Port's comprehensive software catalog as a source of truth.

### How it works
The n8n workflow orchestrates the following steps:
1. **Jira trigger** – The workflow listens for Jira issue updates via webhook.
2. **Condition check** – Verifies that the issue status is "In Progress" and has the required label (e.g., "product_approved") without the "copilot_assigned" label (see the sketch below).
3. **Port context extraction** – Uses Port's n8n node to query your software catalog for relevant context about services, repositories, teams, dependencies, and documentation related to the Jira issue.
4. **Parse response** – Retrieves the AI-generated GitHub issue title and body from Port.
5. **Create GitHub issue** – Creates a new GitHub issue with the enriched context from Port.
6. **Assign to Copilot** – Adds a comment to the GitHub issue instructing Copilot to take ownership.
7. **Add issue link to Jira ticket** – Adds a comment to the Jira ticket with the GitHub issue URL, providing clear traceability.
8. **Mark ticket as assigned** – Updates the Jira ticket to add the "copilot_assigned" label, preventing duplicate processing.

### Setup
- [ ] Connect your Jira Cloud account and enable issue_updated events
- [ ] Register for free on Port.io
- [ ] Connect your Port.io account and add the API key
- [ ] Connect your GitHub account and select the target repository
- [ ] Ensure a Copilot bot or @copilot user has access to the repository
- [ ] Confirm the workflow webhook or Jira trigger URL is active
- [ ] Test by moving a product_approved ticket to In Progress
- [ ] You should be good to go!

### Prerequisites
- You have a Port account and have completed the onboarding process.
- Port's GitHub app is installed in your account.
- Port's Jira integration is installed in your account.
- You have a working n8n instance (Cloud or self-hosted) with Port's n8n custom node installed.
- Your GitHub organization has GitHub Copilot enabled, so Copilot can be automatically assigned to any issues created through this guide.

⚠️ This template is intended for self-hosted instances only.
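A minimal sketch of the condition-check step as an n8n Code node, assuming the standard Jira webhook payload shape (`issue.fields.status.name` and `issue.fields.labels`):

```javascript
// n8n Code node – decide whether this Jira update should be processed.
const issue = $input.first().json.issue;

const status = issue.fields.status?.name;
const labels = issue.fields.labels ?? [];

const shouldProcess =
  status === 'In Progress' &&
  labels.includes('product_approved') &&    // required approval label
  !labels.includes('copilot_assigned');     // prevents duplicate processing

return [{ json: { shouldProcess, issueKey: issue.key } }];
```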
by V3 Code Studio
## How it works
This workflow provides an API endpoint, /api/v1/get-companies, that retrieves company records directly from your Odoo database. It's built for teams who need to query or export company data, either as structured JSON for integrations or as an Excel (.xlsx) file for reporting.

When a request is made, the workflow:
1. Accepts query parameters (name, response_format).
2. Validates the name input (required for the company search).
3. Fetches all matching companies from Odoo, using a like filter for partial name matches.
4. Returns the results as a JSON response or an Excel file, depending on the response_format parameter.

This makes it ideal for quickly exporting or syncing company information with other tools.

## Setup steps
1. Open the Webhook node and note the endpoint /api/v1/get-companies.
2. Connect your Odoo API credentials in the Odoo node.
3. Optionally update the fieldsList in the Odoo node to include more company details (VAT, address, etc.).
4. Test using a browser or Postman (see the sketch below):
   - /api/v1/get-companies?name=Tech&response_format=json
   - /api/v1/get-companies?name=Tech&response_format=excel
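A sketch of consuming the JSON endpoint from another service. The host is a placeholder, and the response shape is hypothetical, since the actual fields depend on the fieldsList configured in the Odoo node:

```javascript
// Example: calling the endpoint from another service (host is a placeholder).
const res = await fetch(
  'https://YOUR_N8N_HOST/webhook/api/v1/get-companies?name=Tech&response_format=json'
);
const companies = await res.json();
// Hypothetical response shape – actual fields depend on the Odoo node's fieldsList:
// [{ id: 42, name: 'TechCorp Ltd', email: 'info@techcorp.example' }]
```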
by Francisco Rivera
## What this template does
Connect a Vapi AI voice agent to Google Calendar to capture contact details and auto-book appointments. The agent asks for name, address, service type, and a preferred time. The workflow checks availability and either proposes times or books the slot, with no code needed.

## How it works (node map)
- **Webhook**: Production URL = Vapi Server URL; receives tool calls from Vapi and returns results.
- **1. CONFIGURATION (EDIT ME)**: your timezone, work hours, meeting length, buffers, and cadence.
- **Route by Tool Name**: routes Vapi tool calls:
  - checkAvailability → calendar lookup path
  - bookAppointment → create event path
- **2. Get Calendar Events (EDIT ME)**: reads events for the requested day.
- **Calculate Potential Slots / Filter for Available Slots**: builds conflict-free options with buffers (see the sketch below).
- **Respond with Available Times**: returns formatted slots to Vapi.
- **3. Book Appointment in Calendar (EDIT ME)**: creates the calendar event with details.
- **Booking Confirmation**: returns success back to Vapi.

> Sticky notes in the canvas show exactly what to edit (required by n8n). No API keys are hardcoded; Google uses OAuth credentials.

## Requirements
- n8n (Cloud or self-hosted)
- Google account with Calendar (OAuth credential in n8n)
- Vapi account + one Assistant

## Setup (5 minutes)

### A) Vapi → n8n connection
1. Open the Webhook node and copy the Production URL.
2. In Vapi → Assistant → Messaging, set Server URL = that Production URL.
3. In Server Messages, enable only toolCalls.

### B) Vapi tools (names must match exactly)
Create two Custom Tools in Vapi and attach them to the assistant:

Tool 1: checkAvailability
- **Arguments**: initialSearchDateTime (string, ISO-8601 with timezone offset, e.g. 2025-09-09T09:00:00-05:00)

Tool 2: bookAppointment
- **Arguments**:
  - startDateTime (string, ISO-8601 with tz)
  - endDateTime (string, ISO-8601 with tz)
  - clientName (string)
  - propertyAddress (string)
  - serviceType (string)

> The Switch node routes based on the tool name.

### C) Configure availability
Open 1. CONFIGURATION (EDIT ME) and set your timezone, work hours, meeting length, buffers, and cadence.

### D) Connect Google Calendar
1. Open 2. Get Calendar Events (EDIT ME) → Credentials: select/create Google Calendar OAuth, then choose the calendar to check availability.
2. Open 3. Book Appointment in Calendar (EDIT ME) → use the same credential and same calendar to book.

### E) Activate & test
1. Toggle the workflow Active.
2. Call your Vapi number (or start a session) and book a test slot.
3. Verify the event appears with description fields (client, address, service type, call id).

## Customising
- Change the summary/description format in 3. Book Appointment.
- Add SMS/Email confirmations, CRM sync, rescheduling, or analytics as follow-ups (see sticky note "I'm a note").

## Troubleshooting
- **No response back to Vapi** → confirm Vapi is set to send toolCalls only and the Server URL matches the Production URL.
- **Switch doesn't route** → tool names must be exactly checkAvailability and bookAppointment.
- **No times returned** → ensure timezone + work hours + cadence generate at least one future slot; confirm the Google credential and calendar selection.
- **Event not created** → use the same Google credential & calendar in both nodes; check OAuth scopes/consent.

## Security & privacy
Google uses OAuth; credentials live in n8n. No API keys are hardcoded. The webhook receives only the fields needed to check times or book.
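A minimal sketch of the slot-building logic behind Calculate Potential Slots / Filter for Available Slots. All input field names (workStart, workEnd, slotMinutes, bufferMinutes, events) are illustrative stand-ins for what the CONFIGURATION and Get Calendar Events nodes actually provide:

```javascript
// n8n Code node – build conflict-free slots for one day.
const { workStart, workEnd, slotMinutes, bufferMinutes } = $input.first().json;
const events = $input.first().json.events ?? []; // from Get Calendar Events

const MS = 60_000;
const slots = [];
let cursor = new Date(workStart).getTime();
const dayEnd = new Date(workEnd).getTime();

while (cursor + slotMinutes * MS <= dayEnd) {
  const start = cursor;
  const end = cursor + slotMinutes * MS;

  // A slot conflicts if it overlaps any event, padded by the buffer.
  const conflict = events.some((ev) => {
    const evStart = new Date(ev.start).getTime() - bufferMinutes * MS;
    const evEnd = new Date(ev.end).getTime() + bufferMinutes * MS;
    return start < evEnd && end > evStart;
  });

  if (!conflict && start > Date.now()) {
    slots.push({
      start: new Date(start).toISOString(),
      end: new Date(end).toISOString(),
    });
  }
  cursor += slotMinutes * MS; // cadence = slot length in this simplified sketch
}

return [{ json: { slots } }];
```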
by Harshil Agrawal
This workflow automatically creates an event in PostHog when a request is made to a webhook URL (see the example below).

## Prerequisites
- A PostHog account and credentials

## Nodes
- **Webhook node**: triggers the workflow when the webhook URL is accessed.
- **PostHog node**: creates a new event in PostHog.
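A minimal example of triggering the workflow from any HTTP client. The webhook path shown is hypothetical; copy the real URL from the Webhook node:

```javascript
// Any request to the webhook URL creates a PostHog event.
await fetch('https://YOUR_N8N_HOST/webhook/posthog-event', { method: 'GET' });
```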