by Richard Uren
Shopify GraphQL cursor loop

Many Shopify GraphQL queries can return a cursor that you can loop over; however, the n8n GraphQL node does not natively fetch pages. This simple three-node workflow shows how to set up a cursor to fetch all items in a collection.

Note: The pageSize in the "Shopify, products" node is set to 5 to illustrate how querying by cursor works. In production, set this to a much larger value. Also, update the Endpoint in the GraphQL node to reflect your Shopify store.
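For reference, here is what the pagination pattern implemented by the three nodes looks like when written out as code; a minimal sketch assuming the standard Shopify Admin GraphQL products schema, where `runQuery` is a hypothetical helper standing in for the GraphQL node call:

```javascript
// Minimal sketch of cursor pagination against the Shopify Admin GraphQL API.
// In the actual workflow this loop is wired up with nodes (GraphQL node plus
// an IF node checking pageInfo.hasNextPage), not a single Code node.
const query = `
  query ($cursor: String) {
    products(first: 5, after: $cursor) {
      pageInfo { hasNextPage endCursor }
      edges { node { id title } }
    }
  }`;

let cursor = null;
let hasNext = true;
const products = [];

while (hasNext) {
  // runQuery is a hypothetical helper wrapping the GraphQL request
  const { products: page } = await runQuery(query, { cursor });
  products.push(...page.edges.map((e) => e.node));
  cursor = page.pageInfo.endCursor;
  hasNext = page.pageInfo.hasNextPage;
}

return products.map((p) => ({ json: p }));
```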
by Eric
This is a specific use case. The ElevenLabs guide for Cal.com bookings is comprehensive, but I was having trouble with the booking API request, so I built a simple workflow to validate the request and handle the booking creation.

Who's this for?

You have an ElevenLabs voice agent (or other external service) booking meetings in your Cal.com account, and you want more control over the book_meeting tool called by the voice agent.

How's it work?

- A request is received by the webhook trigger node, sent from the ElevenLabs voice agent or another source.
- The request body contains contact info for the user with whom a meeting will be booked in Cal.com.
- The workflow validates the input data for the fields required by Cal.com (a sketch of this validation step appears at the end of this section).
- If validation fails, a 400 Bad Request response is returned.
- If valid, the meeting is booked via the Cal.com API.

How do I use this?

Create a custom tool in the ElevenLabs agent setup and connect it to the webhook trigger in this workflow. Add authorization for security. Instruct your voice agent to call this tool after it has collected the required information from the user.

Expected input structure

Note: Modify this according to your needs, but be sure to reflect your changes in all following nodes. The requirements here depend on the required fields in your Cal.com event type. If you have multiple Cal.com event types with varying required fields, you'll need to handle that in this workflow and provide appropriate instructions in your *voice agent prompt*.

    "body": {
      "attendee_name": "Some Guy",
      "start": "2025-07-07T13:30:00Z",
      "attendee_phone": "+12125551234",
      "attendee_timezone": "America/New_York",
      "eventTypeId": 123456,
      "attendee_email": "someguy@example.com",
      "attendee_company": "Example Inc",
      "notes": "Discovery call to find synergies."
    }

Modifications

Note: ElevenLabs doesn't handle webhook response headers or body and only recognizes the response code. In other words, if the workflow responds with 400 Bad Request, that's the only information the voice agent gets back; it doesn't receive any details, e.g. "User email still needed".

You can modify the structure of the expected webhook request body, and then you should reflect that structure change in all following nodes in the workflow. I.e., if you change attendee_name to attendeeFirstName and attendeeLastName, then you need to make this change in the following nodes that use these properties. You can also require, or make optional, other user data for the Cal.com event type, which would reduce or increase the data the voice agent must collect from the user. You can modify the authorization of this webhook to meet your security needs. ElevenLabs has some limitations you should be mindful of, but it also offers a secret feature which proves useful.

An improvement to this workflow could include a GET request to a CRM or other database to get info on the user interacting with the voice agent. This could reduce some of the data collection needed from the voice agent, for example if you already have the user's email address. I believe you can also get the user's phone number if the voice agent is set up on a dial-in interface, so the agent wouldn't need to ask for it. This all depends on your use case. A savvy step might be prompting the voice agent to get an email, and using the email in this workflow to pull enrichment data from Apollo.io or similar ;-)
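As promised above, here is a minimal sketch of the validation step written as an n8n Code node; the field list mirrors the example input structure and is an assumption, so adjust it to the required fields of your Cal.com event type:

```javascript
// Sketch of the required-field check; field names follow the example body
// above and should be adapted to your own event type's requirements.
const body = $json.body ?? {};
const required = ['attendee_name', 'start', 'attendee_timezone', 'eventTypeId', 'attendee_email'];
const missing = required.filter((f) => body[f] === undefined || body[f] === '');

if (missing.length > 0) {
  // Route this item to the 400 Bad Request response branch.
  return [{ json: { valid: false, missing } }];
}
return [{ json: { valid: true, ...body } }];
```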
by David w/ SimpleGrow
This n8n workflow tracks user engagement in a specific WhatsApp group by capturing incoming messages via a Whapi webhook. It first filters messages to ensure they come from the correct group, then identifies the message type—text, emoji reaction, voice, or image. The workflow searches for the user in an Airtable database using their WhatsApp ID and increments their message count by one. It updates the Airtable record with the new count and the date of the last interaction. This automated process helps measure user activity and supports engagement initiatives like weekly raffles or rewards. The system is flexible and can be expanded to include more message types or additional actions. Overall, it provides a seamless way to encourage and track user participation in your WhatsApp community.
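To make the update step concrete, a hedged sketch of the counter increment as an n8n Code node follows; the Airtable field names "Message Count" and "Last Interaction" are assumptions, so adjust them to your base's schema:

```javascript
// Increment the user's message count and stamp the last interaction date.
// Field names are hypothetical; match them to your Airtable base.
const record = items[0].json; // record found by the Airtable search node

const count = (record.fields?.['Message Count'] ?? 0) + 1;

return [{
  json: {
    id: record.id,
    fields: {
      'Message Count': count,
      'Last Interaction': new Date().toISOString().slice(0, 10),
    },
  },
}];
```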
by David Olusola
Overview

A comprehensive educational workflow that demonstrates practical JavaScript usage in n8n's Code node through real-world business scenarios. Perfect for learning data manipulation, transformation, and automation patterns that you can immediately apply to client projects.

What This Template Teaches:

- Data Filtering & Transformation: Filter employees by age, calculate bonuses, format contact information
- Statistical Analysis: Generate team statistics, averages, role distributions, and KPIs
- Multi-Format Export: Create CSV files, email lists, and API-ready payloads from raw data
- n8n Best Practices: Proper JSON handling, return formats, and data flow patterns

How It Works:

1. Manual Trigger starts the workflow with sample employee data
2. Set Sample Data provides realistic business data (employees with roles, salaries, ages)
3. Three Code node examples process the same data differently:
   - Filter & Transform: Creates an adult employee list with calculated bonuses
   - Calculate Stats: Generates comprehensive team analytics and reports
   - Format for Export: Prepares data for external systems (APIs, emails, CSV)

Key Learning Points:

- Access input data using items[0].json.propertyName
- Return the proper n8n format with the [{ json: data }] structure
- Use JSON.parse() for string-to-object conversion
- Apply JavaScript array methods (filter, map, reduce) for data processing
- Handle multiple output scenarios and data aggregation

Perfect For:

- n8n beginners learning Code node fundamentals
- Developers transitioning to n8n automation
- Client demos showing data processing capabilities
- Team training and onboarding sessions
- Foundation for building custom business automation workflows

Business Use Cases:

Transform this template for lead qualification, customer segmentation, report generation, data enrichment, and API integrations. Each Code node pattern can be adapted for different industries and automation needs.
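As a taste of the first pattern, a Code node in the spirit of "Filter & Transform" might look like the sketch below; the field names (name, role, age, salary) follow the sample data described above, and the 10% bonus rule is illustrative:

```javascript
// Filter & Transform sketch: keep adult employees and compute a bonus.
const employees = items.map((item) => item.json);

const adults = employees
  .filter((employee) => employee.age >= 18)
  .map((employee) => ({
    name: employee.name,
    role: employee.role,
    bonus: Math.round(employee.salary * 0.1), // illustrative 10% bonus rule
  }));

// Return in n8n's expected format: one object per item, wrapped in { json }.
return adults.map((employee) => ({ json: employee }));
```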
by Baptiste Fort
Who is it for?

This workflow is perfect for marketers, sales teams, agencies, and local businesses who want to save time by automating lead generation from Google Maps. It's ideal for real estate agencies, restaurants, service providers, and any local niche that needs a clean database of fresh contacts, including emails, websites, and phone numbers.

✅ Prerequisites

Before starting, make sure you have:

- **Apify account** → to scrape Google Maps data
- **OpenAI API key** → for GPT-4 email extraction
- **Airtable account & base** → for structured lead storage
- **Gmail account with OAuth** → to send personalized outreach emails

Your Airtable base should have these columns:

| Title | Street | Website | Phone Number | Email | URL |
|--------------------------|-------------------------|--------------------|-----------------|------------------------|----------------------|
| Paris Real Estate Agency | 10 Rue de Rivoli, Paris | https://agency.fr | +33 1 23 45 67 | contact@agency.fr | maps.google.com/... |

🏡 Example Use Case

To keep things clear, we'll use real estate agencies in Paris as an example. But you can replace this with restaurants, plumbers, lawyers, or even hamster trainers (you never know).

🔄 How the workflow works

1. Scrape Google Maps leads with Apify
2. Clean & structure the data (name, phone, website)
3. Visit each website & extract emails with GPT-4
4. Save all leads into Airtable
5. Automatically send a personalized email via Gmail

This works for any industry, keyword, or location.

Step 1 – Scraping Google Maps with Apify

Start simply:

1. Open your n8n workflow and choose the trigger: "Execute Workflow" (manual trigger).
2. Add an HTTP Request node (POST method).
3. Head over to the Apify Google Maps Extractor and fill in the fields according to your needs:
   - Keyword: e.g., "real estate agency" (or restaurant, plumber...)
   - Location: "Paris, France"
   - Number of results: 50 (or more)
   - Optional: filters (with/without a website, by categories…)
4. Click Run to test the scraper.
5. Click API → select the API endpoints tab → choose "Run Actor synchronously and get dataset items".
6. Copy the URL, go back to n8n, and paste it into your HTTP Request node (URL field). Then enable:
   - Body Content Type → JSON
   - Specify Body Using JSON
7. Go back to Apify, click the JSON tab, copy everything, and paste it into the JSON field of your HTTP Request.

If you now run your workflow, you'll get a nice structured table filled with Google Maps data. Pretty magical already—but we're just getting started!

Step 2 – Cleaning Things Up (Edit Fields)

Raw data is cool, but messy. Add an Edit Fields node next, using Manual Mapping mode. Here's what you keep (copy-paste friendly):

- Title → {{ $json.title }}
- Address → {{ $json.address }}
- Website → {{ $json.website }}
- Phone → {{ $json.phone }}
- URL → {{ $json.url }}

Now you have a clean, readable table ready to use.

Step 3 – Handling Each Contact Individually (Loop Over Items)

Next, we process each contact one by one. Add the Loop Over Items node and set Batch Size to 20 or more, depending on your needs. This node is simple but crucial to avoid traffic jams in the automation.

Step 4 – Isolating Websites (Edit Fields again)

Add another Edit Fields node (Manual Mapping). This time, keep just:

- Website → {{ $json.website }}

We've isolated the websites for the next step: scraping them one by one.

Step 5 – Scraping Each Website (HTTP Request)

Now we send our little robot to visit each website automatically.
Add another HTTP Request node:

- Method: GET
- URL: {{ $json.website }} (from the previous node)

This returns the raw HTML content of each site. Yes, it's ugly, but we won't handle it manually. We'll leave the next step to AI!

Step 6 – Extracting Emails with ChatGPT

We now use OpenAI (Message a Model) to politely ask GPT to extract only the relevant email. Configure as follows:

- Model: GPT-4.1-mini or higher
- Operation: Message a Model
- Simplify Output: ON

Prompt to copy-paste:

    Look at this website content and extract only the email I can use to contact
    this business. In your output, provide only the email and nothing else.
    Ideally, this email should belong to the business owner, so if you have 2 or
    more options, go for the most authoritative one. If you don't find any email,
    output 'Null'. Example output: name@examplewebsite.com

    {{ $json.data }}

ChatGPT will kindly return the perfect email address (or 'Null' if none is found).

Step 7 – Neatly Store Everything in Airtable

Almost done! Add an Airtable → Create Record node and fill your Airtable fields like this:

| Airtable Field | Content | n8n Variable |
| ------------------ | ------------------------------- | ------------------------------------------ |
| Title | Business name | {{ $('Edit Fields').item.json.Title }} |
| Street | Full address | {{ $('Edit Fields').item.json.Address }} |
| Website | Website URL | {{ $('Edit Fields').item.json.Website }} |
| Phone Number | Phone number | {{ $('Edit Fields').item.json.Phone }} |
| Email | Email retrieved by the AI agent | {{ $json.message.content }} |
| URL | Google Maps link | {{ $('Edit Fields').item.json.URL }} |

Now you have a tidy Airtable database filled with fresh leads, ready for action.

Step 8 – Automated Email via Gmail (The Final Touch)

To finalize the workflow, add a Gmail → Send Email node after your Airtable node. Configure it using the data pulled directly from your Airtable base (from the previous step):

- Recipient (To): the email stored in Airtable ({{ $json.fields.Email }})
- Subject: use the company name stored in Airtable ({{ $json.fields.Title }}) to personalize the subject line
- Body: you can include several fields directly from Airtable, such as:
  - Company name: {{ $json.fields.Title }}
  - Website URL: {{ $json.fields.Website }}
  - Phone number: {{ $json.fields["Phone Number"] }}
  - Link to the Google Maps listing: {{ $json.fields.URL }}

All of this data is available in Airtable because it was automatically inserted in the previous step (Step 7). This ensures that each email sent is fully personalized and based on clear, reliable, and structured information.
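As an optional hardening step between Steps 6 and 7 (a hypothetical addition, not part of the original workflow), a small Code node could sanity-check the email GPT returned and normalize 'Null' before anything is written to Airtable:

```javascript
// Sanity-check the model's output: accept only plausible email addresses
// and map the literal 'Null' (or anything else) to null.
const raw = ($json.message?.content ?? '').trim();
const isEmail = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(raw);

return [{ json: { email: isEmail ? raw.toLowerCase() : null } }];
```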
by Jonathan
This workflow will take a timer entry from Clockify and submit it to a matching ticket in Syncro. It saves the time entry ID from Clockify and the time entry ID from Syncro into a Google Sheet. Then it checks whether a match already exists from a previous update and, if so, updates the same time entry whenever the description or time changes in Clockify. There is a Set node with the names and Syncro IDs of technicians. If you have multiple technicians with the same name, this won't work for you. Likewise, if the name in Clockify doesn't exactly match what you put in the Set node, it won't work. You also need to set up a webhook in Clockify triggered on "Time entry updated (anyone)" and pointed at your workflow. Configured this way, you can start and stop time entries at will, and nothing happens until you change the description. > This workflow is part of an MSP collection; the original can be found here: https://github.com/bionemesis/n8nsyncro
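To illustrate the name-matching constraint, here is a hedged sketch of the lookup the Set node enables; the names, IDs, and the payload path are hypothetical, so adjust them to your own Clockify webhook data:

```javascript
// Map Clockify user names to Syncro technician IDs. Names must match the
// Clockify webhook payload exactly; duplicates cannot be disambiguated.
const technicians = {
  'Jane Doe': 12345,   // Syncro technician ID (hypothetical)
  'John Smith': 67890,
};

const name = $json.user?.name; // adjust to where your payload carries the name
const syncroId = technicians[name];

if (!syncroId) {
  throw new Error(`No Syncro ID configured for Clockify user "${name}"`);
}
return [{ json: { ...$json, syncroId } }];
```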
by n8n Team
Who this template is for

This template is for everyone who needs to work with XML data a lot and wants to convert it to JSON instead.

Use case

Many products still use XML files as their main data format. Unfortunately, not all software still supports XML, as many products have switched to more modern formats such as JSON. This workflow handles the conversion of XML data to JSON format via a webhook call, with error handling and Slack notifications integrated into the process.

How this workflow works

- Triggering the workflow: The workflow initiates upon receiving an HTTP POST request at the webhook endpoint specified in the "POST" node, which can be accessed externally by sending a POST request to its URL.
- Data routing and processing: Upon receiving the POST request, the Switch node routes the workflow's path based on conditions determined by the content type of the incoming data or any encountered errors. The Extract From File and Edit Fields (Set) nodes manage XML input processing, adapting their actions according to the data's content type.
- XML to JSON conversion: The XML data extracted from the input is passed through the "XML" node, which converts it into JSON format.
- Response handling: If the XML-to-JSON conversion succeeds, a success response is sent back with a status of "OK" and the converted JSON data. If any errors occur during conversion, an error response is sent back with a status of "error" and an error message.
- Error handling: In case of an error during processing, the workflow sends a notification to a Slack channel designated for error reporting.

Set up steps

- Set up your own webhook URL in the Webhook node. While building or testing a workflow, use a test webhook URL; when your workflow is ready, switch to the production webhook URL.
- Set credentials for Slack.
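For intuition, here is a minimal sketch of what the conversion boils down to, assuming an xml2js-style parser is available (the XML node handles this for you, so no code is needed in the actual workflow):

```javascript
// Convert a small XML document to JSON the way the XML node does,
// using the xml2js library as an illustrative stand-in.
const { parseStringPromise } = require('xml2js');

const xml = '<order><id>42</id><status>shipped</status></order>';
const data = await parseStringPromise(xml, { explicitArray: false });
// data => { order: { id: '42', status: 'shipped' } }

// Shape of the success response this workflow sends back.
return [{ json: { status: 'OK', data } }];
```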
by Tom
This workflow shows a no-code approach to creating Salesforce accounts and contacts based on data coming from Google Sheets.

To run the workflow:

1. Make sure you have both Google Sheets and Salesforce authenticated with n8n.
2. Have a Google Sheet with contacts and their account names ready, and copy the respective sheet ID from the URL.
3. Add the sheet ID to the Google Sheets node of the workflow.
4. Hit Execute Workflow.

Here is how it works:

1. The workflow first searches for existing Salesforce accounts by name.
2. It then branches out depending on whether the account already exists in Salesforce or not.
3. If an account does not exist yet, it is created.
4. The data is then normalised before both branches converge again (see the sketch after this section).
5. Finally, the contacts are created or updated as needed in Salesforce.
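To illustrate the normalisation idea, a Code node grouping sheet rows by account could look like this; the column names "Account Name", "First Name", and "Last Name" are assumptions, so adjust them to your sheet:

```javascript
// Group contact rows by their account name so each account is handled once.
const rows = items.map((item) => item.json);

const byAccount = {};
for (const row of rows) {
  const account = row['Account Name'];
  (byAccount[account] ??= []).push({
    firstName: row['First Name'],
    lastName: row['Last Name'],
  });
}

return Object.entries(byAccount).map(([account, contacts]) => ({
  json: { account, contacts },
}));
```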
by bangank36
This workflow scrapes Trustpilot reviews for a given profile and saves them into Google Sheets.

How It Works

Clone this Google Sheets template, which includes two sheets:

- trustpilot: a raw collection of Trustpilot reviews. You can customize it as needed.
- helpfulcrowd: this sheet follows the format from this HelpfulCrowd guide, with a slight modification: an added review_id column to support the upsert process. Once the workflow is complete, export the sheet as a CSV and upload it to HelpfulCrowd. For detailed steps, see this post.

Running the Workflow

You can trigger the workflow on demand or schedule it to run at a set interval (see the sketch below for how the Global node's settings expand into page URLs).

Requirements

- **Trustpilot business name** (e.g., n8n.io in https://www.trustpilot.com/review/n8n.io). Update this name and the pagination settings in the Global node.
- **Google Sheets API credentials**

Check out my other templates: 👉 My n8n Templates
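As referenced above, a sketch of how the Global node's business name and pagination settings might expand into page URLs; it assumes Trustpilot's standard ?page=N pagination, and the values shown are illustrative:

```javascript
// Build one item per review page for the downstream HTTP Request node.
const businessName = 'n8n.io'; // from the Global node
const maxPage = 5;             // raise this (or loop until empty) for full scrapes

const pages = [];
for (let page = 1; page <= maxPage; page++) {
  pages.push({
    json: { url: `https://www.trustpilot.com/review/${businessName}?page=${page}` },
  });
}
return pages;
```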
by Rosh Ragel
Programmatically Pull Square Report Data into n8n

What It Does

This sub-workflow connects to the Square API and generates a daily sales summary report for all of your Square locations. The report matches the figures displayed in Square Dashboard > Reports > Sales Summary. It's designed to be reused in other workflows, ideal for reporting, data storage, accounting, or automation.

Prerequisites

To use this workflow, you'll need Square API credentials (configured as a Header Auth credential).

How to Set Up Square Credentials:

1. Go to Credentials > Create New
2. Choose Header Auth
3. Set the Name to "Authorization"
4. Set the Value to your Square access token (e.g., Bearer <your-api-key>)

How It Works

1. Trigger: The workflow is triggered as a sub-workflow, requiring a report_date input.
2. Fetch Locations: An HTTP request gets all Square locations linked to your account.
3. Fetch Orders: For each location, an HTTP request pulls completed orders for the specified report_date.
4. Filter Empty Locations: Locations with no sales are ignored.
5. Aggregate Sales Data: A Code node processes the order data and produces a summary identical to Square's built-in Sales Summary report.
6. Output: A cleaned, consistent summary that can be consumed by parent workflows or other nodes.

Example Use Cases

- Automatically store daily sales data in Google Sheets, MySQL, or PostgreSQL for analysis and historical tracking
- Automatically send daily email or Slack reports to managers or finance teams
- Build weekly/monthly reports by looping over multiple dates
- Push sales data into accounting software like QuickBooks or Xero for automated bookkeeping
- Calculate commissions or rent payments based on sales volume

How to Use

1. Configure both HTTP Request nodes to use your Square API credential.
2. If you are not in the Toronto/New York timezone, change the "start_at" and "end_at" parameters in the second HTTP node from "-05:00" to your local UTC offset (see the sketch below).
3. Use it as a sub-workflow inside a main workflow, passing a report_date (formatted as YYYY-MM-DD) when you call it.

Customization Options

- Add pagination to handle locations with more than 1,000 orders per day.
- Expand the workflow to save or send the report output via additional integrations (email, database, webhook, etc.).

Why It's Useful

This workflow saves time, reduces manual report pulling from Square, and enables smarter automation around sales data—whether for operations, finance, or performance monitoring.
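As referenced above, here is a hedged sketch of the order fetch for a single location; it assumes Square's POST /v2/orders/search endpoint and the template's Eastern (-05:00) offset, so adjust the offset and field paths to your setup:

```javascript
// Build the Search Orders request body from the report_date input.
const reportDate = $json.report_date; // "YYYY-MM-DD", passed by the parent workflow
const locationId = $json.locationId;  // from the "Fetch Locations" step (assumed path)

const searchBody = {
  location_ids: [locationId],
  query: {
    filter: {
      state_filter: { states: ['COMPLETED'] },
      date_time_filter: {
        closed_at: {
          start_at: `${reportDate}T00:00:00-05:00`,
          end_at: `${reportDate}T23:59:59-05:00`,
        },
      },
    },
    // Square requires the sort field to match the date_time_filter field.
    sort: { sort_field: 'CLOSED_AT' },
  },
};

return [{ json: searchBody }];
```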
by bangank36
This workflow automates the Mark as Fulfilled action in Squarespace for each order, ensuring a seamless fulfillment process without manual intervention.

How It Works

This workflow retrieves all pending Squarespace orders and processes their fulfillment automatically. It follows these steps:

1️⃣ Get all pending orders using the HTTP Request node (since Squarespace does not have an n8n node)
2️⃣ Create a fulfillment request using the Fulfill Order node

The Filter Orders node can be used to filter which pending orders are valid to process.

Step-by-step

The workflow can be run on demand or on a schedule. You can adjust these parameters within the Global and Filter Orders nodes:

Global node (API settings)

- **api-version** (string, required): the current API version (see the Squarespace Orders API documentation).
- **modifiedAfter**={a-datetime} (string, conditional): fetch orders modified after a specific date (ISO 8601 format).
- **modifiedBefore**={b-datetime} (string, conditional): fetch orders modified before a specific date (ISO 8601 format).
- **cursor**={c} (string, conditional): used for pagination; cannot be combined with other filters.
- **fulfillmentStatus**={status} (optional, enum): filter by fulfillment status: PENDING, FULFILLED, or CANCELED.
- **maxPage**: set to -1 to enable infinite pagination and fetch all available orders.

Filter Orders node

Order filtering ensures only valid orders are fulfilled, particularly useful if:

- You sell digital downloads or gift cards exclusively.
- You use third-party fulfillment services for all products.

Requirements

Credentials: to use this workflow, you need a Squarespace API key, retrieved from your Squarespace settings.

Who Is This For?

- Squarespace store owners looking to automate their fulfillment process.
- Merchants selling digital or personalized products who need instant fulfillment.

Explore More Templates

- Get all orders in Squarespace to Google Sheets
- Convert Squarespace Profiles to Shopify Customers in Google Sheets
- Fetch Squarespace Blog & Event Collections to Google Sheets

👉 Check out my other n8n templates
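To show how the parameters above fit together, here is a hedged sketch of the pending-orders fetch with cursor pagination; it assumes Squarespace's GET /1.0/commerce/orders endpoint, and `fetchJson` is a hypothetical helper standing in for the HTTP Request node:

```javascript
// Fetch all pending orders, following the pagination cursor until exhausted.
let cursor = null;
const orders = [];

do {
  const qs = cursor
    ? `cursor=${cursor}` // cursor cannot be combined with other filters
    : 'fulfillmentStatus=PENDING';
  const res = await fetchJson(`https://api.squarespace.com/1.0/commerce/orders?${qs}`);
  orders.push(...res.result);
  cursor = res.pagination?.hasNextPage ? res.pagination.nextPageCursor : null;
} while (cursor);

return orders.map((order) => ({ json: order }));
```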
by Yaron Been
This workflow provides automated access to the Jhonp4 Jhonpiedrahita_Ai01 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Jhonp4 Jhonpiedrahita_Ai01 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities

- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used

- n8n: the automation platform that orchestrates the workflow
- Replicate API: access to the Jhonp4/jhonpiedrahita_ai01 AI model
- Jhonp4 Jhonpiedrahita_Ai01: the core AI model for generation tasks
- Built-in error handling: automatic retry logic and comprehensive error management

How to Install

1. Import the workflow: download the .json file and import it into your n8n instance
2. Configure the Replicate API: add your Replicate API token to the 'Set API Token' node
3. Customize parameters: adjust the model parameters in the 'Set Other Parameters' node
4. Test the workflow: run the workflow with your desired inputs
5. Integrate: connect this workflow to your existing automation pipelines (a sketch of the underlying API call follows below)

Use Cases

- Specialized processing: handle specific AI tasks and workflows
- Custom automation: implement unique business logic and processing
- Data processing: transform and analyze various types of data
- AI integration: add AI capabilities to existing systems and workflows

Connect with Me

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
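As referenced in the install steps, here is a minimal sketch of the Replicate call this workflow wraps; it assumes the standard POST /v1/predictions endpoint, and the version hash and input parameters are placeholders that depend on the model's schema:

```javascript
// Create a prediction against the Replicate API.
const res = await fetch('https://api.replicate.com/v1/predictions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.REPLICATE_API_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    version: '<jhonp4/jhonpiedrahita_ai01 version id>', // placeholder
    input: { prompt: 'your input here' },               // depends on the model's schema
  }),
});

const prediction = await res.json();
// Poll prediction.urls.get until prediction.status is "succeeded" or "failed".
```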