by bangank36
This workflow retrieves all Shopify Customers and saves them into a Google Sheets spreadsheet using the Shopify Admin REST API. It uses pagination to ensure all customers are collected efficiently. n8n does not have built-in actions for Customers, so I built the workflow using an HTTP Request node.

**How It Works**

This workflow uses the HTTP Request node to fetch paginated chunks manually. Shopify uses cursor-based pagination (page_info) instead of traditional page numbers. Pagination data is stored in the response headers, so we need to enable Include Response Headers and Status in the HTTP Request node (see the example request at the end of this template). The workflow processes customer data, saves it to Google Sheets, and formats a compatible CSV for Squarespace Contacts import. This workflow can be run on demand or scheduled to keep your data up to date.

**Parameters**

You can adjust these parameters in the HTTP Request node:

- **limit** – The number of customers per request (default: 50, max: 250).
- **fields** – Comma-separated list of fields to retrieve.
- **page_info** – Used for pagination; only limit and fields are allowed when paginating.

📌 Note: When you query paginated chunks with page_info, only the limit and fields parameters are allowed.

**Credentials**

- **Shopify API Key** – Required for authentication.
- **Google Sheets API credentials** – Needed to insert data into the spreadsheet.

**Google Sheets Template**

Clone this spreadsheet: 📎 Google Sheets Template

According to Squarespace documentation, your spreadsheet can have up to three columns and must be arranged in this order (no header):

1. Email Address
2. First Name (optional)
3. Last Name (optional)
4. Shopify Customer ID (this field will be ignored)

**Exporting a Compatible CSV for Squarespace Contacts**

This workflow also generates a CSV file that can be imported into Squarespace Contacts.

How to Import the CSV to Squarespace:

1. Open the Lists & Segments panel and click on your mailing list.
2. Click Add Subscribers, then select Upload a list.
3. Click Add a CSV file and select the file to import.
4. Toggle These subscribers accept marketing to confirm permission.
5. Preview your list, then click Import.

**Who Is This For?**

- **Shopify store owners** who need to export all customers to Google Sheets.
- Anyone looking for a **flexible and scalable** Shopify customer extraction solution.
- **Squarespace website owners** who want to bulk-create their Contacts using CSV.

**Explore More Templates**

👉 Check out my other n8n templates
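For illustration, here is roughly what the paginated request looks like outside n8n, as a minimal curl sketch. The shop name, access token, and API version are placeholders you would replace with your own values:

```bash
# First page: request customers with a field filter and a page size of 250
curl -s -D headers.txt \
  -H "X-Shopify-Access-Token: YOUR_ADMIN_API_TOKEN" \
  "https://YOUR-SHOP.myshopify.com/admin/api/2024-01/customers.json?limit=250&fields=id,email,first_name,last_name"

# Shopify returns the cursor in the Link response header, for example:
# Link: <https://YOUR-SHOP.myshopify.com/admin/api/2024-01/customers.json?limit=250&page_info=abc123>; rel="next"

# Next page: pass page_info back (only limit and fields are allowed alongside it)
curl -s \
  -H "X-Shopify-Access-Token: YOUR_ADMIN_API_TOKEN" \
  "https://YOUR-SHOP.myshopify.com/admin/api/2024-01/customers.json?limit=250&page_info=abc123"
```

In the workflow, the HTTP Request node reads that Link header (which is why Include Response Headers and Status must be enabled) and loops until no rel="next" cursor is returned.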
by Julian Reich
This n8n workflow automates the transformation of press releases into polished articles. It converts the content of an email and its attachments (PDF or Word documents) into an AI-written article/blog post.

**What does it do?**

This workflow assists editors and journalists in managing incoming press releases from governments, companies, NGOs, or individuals. The result is a draft article that can easily be reviewed by the editor, who receives it in a reply email containing both the original input and the output, plus an AI-generated self-assessment. This self-assessment represents an additional feedback loop where the AI compares the input with the output to evaluate the quality and accuracy of its transformation.

**How does it work?**

Triggered by incoming emails in Gmail, it first filters attachments, retaining only Word and PDF files while removing other formats like JPGs. The workflow then follows one of three paths:

- If no attachments remain, it processes the inline email message directly.
- For PDF attachments, it uses an extractor to obtain the document content.
- For Word attachments, it extracts the text content via an HTTP request.

In each case, the extracted content is then passed to an AI agent that converts the press release into a well-structured article according to predefined prompts. A separate AI evaluation step provides a self-assessment by comparing the output with the original input to ensure quality and accuracy. Finally, the workflow generates a reply email to the sender containing three components: the original input, the AI-generated article, and the self-assessment. This streamlined process helps editors and journalists efficiently manage incoming press releases, delivering draft articles that require minimal additional editing.

**How to set it up**

1. Configure Gmail Connection:
   - Create or use an existing Gmail address
   - Connect it through the n8n credentials manager
   - Configure polling frequency according to your needs
   - Set the trigger event to "Message Received"
   - Optional: Filter incoming emails by specifying authorized senders
   - Enable the "Download Attachments" option

2. Set Up AI Integration:
   - Create an OpenAI account if you don't have one
   - Create a new AI assistant or use an existing one
   - Customize the assistant with specific instructions, style guidelines, or response templates
   - Configure your API credentials in n8n to enable the connection

3. Configure Google Drive Integration:
   - Connect your Google Drive credentials in n8n
   - Set the operation mode to "Upload"
   - Configure the input data field name as "data"
   - Set the file naming format to dynamic: {{ $json.fileName }}

4. Configure HTTP Request Node:
   - Set request method to "POST"
   - Enter the appropriate Google API endpoint URL
   - Include all required authorization headers
   - Structure the request body according to API specifications
   - Ensure proper error handling for API responses

5. Configure HTTP Request Node 2:
   - Set request method to "GET"
   - Enter the appropriate Google API endpoint URL
   - Include all required authorization headers
   - Configure query parameters as needed
   - Implement response validation and error handling

6. Configure Self-Assessment Node:
   - Set operation to "Message a Model"
   - Select an appropriate AI model (e.g., GPT-4, Claude)
   - Configure the following prompt in the Message field (for example):

     Please analyze and compare the following input and output content:

     Original Input:
     {{ $('HTTP Request3').item.json.data }}
     {{ $('Gmail Trigger').item.json.text }}

     Generated Output:
     {{ $json.output }}

     Provide a detailed self-assessment that evaluates:
     - Content accuracy and completeness
     - Structure and readability improvements
     - Tone and style appropriateness
     - Any information that may have been omitted or misrepresented
     - Overall quality of the transformation

7. Configure Reply Email Node:
   - Set operation to "Send" and select your Gmail account
   - Configure the "To" field to respond to the original sender: {{ $('Gmail Trigger').item.json.from }}
   - Set an appropriate subject line: RE: {{ $('Gmail Trigger').item.json.subject }}
   - Structure the email body with clear sections using the following template:

     *EDITED ARTICLE*
     {{ $('AI Article Writer 2').item.json.output }}

     *SELF-ASSESSMENT*
     Rating: 1 (poor) to 5 (excellent)
     {{ $json.message.content }}

     *ORIGINAL MESSAGE*
     {{ $('Gmail Trigger').item.json.text }}

     *ATTACHMENT CONTENT*
     {{ $('HTTP Request3').item.json.data }}

Note: Adjust the template fields according to the input source (PDF, Word document, or inline message). For inline messages, you may not need the "ATTACHMENT CONTENT" section.
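To make the Word-extraction path (steps 3 to 5) more concrete, here is one plausible way such a POST/GET pair against the Google Drive API could work: the uploaded .docx is copied as a Google Doc and then exported as plain text. This is a hedged sketch only; the author's actual endpoints and node configuration may differ, and FILE_ID, COPIED_FILE_ID, and the OAuth token are placeholders.

```bash
# POST: copy the uploaded .docx as a Google Doc so Drive can read its text
# (FILE_ID would come from the Google Drive upload node's output)
curl -X POST "https://www.googleapis.com/drive/v3/files/FILE_ID/copy" \
  -H "Authorization: Bearer $GOOGLE_OAUTH_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"mimeType": "application/vnd.google-apps.document"}'

# GET: export the converted document as plain text for the AI agent
curl "https://www.googleapis.com/drive/v3/files/COPIED_FILE_ID/export?mimeType=text/plain" \
  -H "Authorization: Bearer $GOOGLE_OAUTH_TOKEN"
```

The exported plain text is what the article-writing and self-assessment prompts receive as {{ $('HTTP Request3').item.json.data }}.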
by Samir Saci
Tags: Supply Chain, Logistics, Control Tower

**Context**

Hey! I’m Samir, a Supply Chain Engineer and Data Scientist from Paris, and the founder of LogiGreen Consulting. We design tools to help companies improve their logistics processes using data analytics, AI, and automation—to reduce costs and minimize environmental impact.

> Let’s use N8N to build smarter and more sustainable supply chains!

📬 For business inquiries, you can add me on LinkedIn

**Who is this template for?**

This workflow template is designed for logistics operations that need a monitoring solution for their distribution chains. Connected to your Transportation Management Systems, this AI agent can answer any question about the shipments handled by your distribution teams.

**How does it work?**

The workflow is connected to a Google BigQuery table that stores outbound order data (customer deliveries). Here’s what the AI agent does:

- 🤔 Receives a user question via chat.
- 🧠 Understands the request and generates the correct SQL query.
- ✅ Executes the SQL query using a BigQuery node.
- 💬 Responds to the user in plain English.

Thanks to the chat memory, users can ask follow-up questions to dive deeper into the data.

**What do I need to get started?**

This workflow requires no advanced programming skills. You’ll need:

- A Google BigQuery account with an SQL table storing transactional records.
- An OpenAI API key (GPT-4o) for the chat model.

**Next Steps**

Follow the sticky notes in the workflow to configure each node and start using AI to support your supply chain operations.

🎥 Watch My Tutorial

🚀 Curious how N8N can transform your logistics operations?

**Notes**

- The chat trigger can easily be replaced with Teams, Telegram, or Slack for a better user experience.
- You can also connect this to a customer chat window using a webhook.
- This workflow was built using N8N version 1.82.1

Submitted: March 24, 2025
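To make the "generates the correct SQL query" step concrete, here is the kind of query the agent might produce and the BigQuery node would run. This is a sketch only; the project, dataset, table, and column names are assumptions and will differ from your own schema.

```bash
# Example user question: "How many orders were shipped late to Germany last month?"
# A query the agent could generate (hypothetical table and columns), run here via the bq CLI:
bq query --use_legacy_sql=false '
SELECT COUNT(*) AS late_orders
FROM `your-project.logistics.outbound_orders`
WHERE ship_to_country = "Germany"
  AND delivery_date > promised_date
  AND DATE_TRUNC(order_date, MONTH) = DATE_TRUNC(DATE_SUB(CURRENT_DATE(), INTERVAL 1 MONTH), MONTH)
'
```

The chat model only writes the SQL; execution and result formatting happen in the BigQuery node and the agent's final plain-English reply.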
by ParquetReader
📄 Convert Parquet, Feather, ORC & Avro Files with ParquetReader

This workflow allows you to upload and inspect Parquet, Feather, ORC, or Avro files via the ParquetReader API. It instantly returns a structured JSON preview of your data — including rows, schema, and metadata — without needing to write any custom code.

✅ **Perfect For**

- Validating schema and structure before syncing or transformation
- Previewing raw columnar files on the fly
- Automating QA, ETL, or CI/CD workflows
- Converting Parquet, Avro, Feather, or ORC to JSON

⚙️ **Use Cases**

- Catch schema mismatches before pipeline runs
- Automate column audits in incoming data files
- Enrich metadata catalogs with real-time schema detection
- Integrate file validation into automated workflows

🚀 **How to Use This Workflow**

📥 Trigger via File Upload

You can trigger this flow by sending a POST request with a file using curl, Postman, or from another n8n flow.

🔧 Example (via curl):

    curl -X POST http://localhost:5678/webhook-test/convert \
      -F "file=@converted.parquet"

> Replace converted.parquet with your local file path. You can also send Avro, ORC or Feather files.

🔁 Reuse from Other Flows

You can reuse this flow by calling the webhook from another n8n workflow using an HTTP Request node. Make sure to send the file as form-data with the field name file.

🔍 **What This Flow Does**

1. Receives the uploaded file via webhook (file)
2. Sends it to https://api.parquetreader.com/parquet as multipart/form-data (field name: file)
3. Receives parsed data (rows), schema, and metadata in JSON format

🧪 **Example JSON Response from this flow**

    {
      "data": [
        {
          "full_name": "Pamela Cabrera",
          "email": "bobbyharrison@example.net",
          "age": "24",
          "active": "True",
          "latitude": "-36.1577385",
          "longitude": "63.014954",
          "company": "Carter, Shaw and Parks",
          "country": "Honduras"
        }
      ],
      "meta_data": {
        "created_by": "pyarrow",
        "num_columns": 21,
        "num_rows": 10,
        "serialized_size": 7598,
        "format_version": "0.12"
      },
      "schema": [
        { "column_name": "full_name", "column_type": "string" },
        { "column_name": "email", "column_type": "string" },
        { "column_name": "age", "column_type": "int64" },
        { "column_name": "active", "column_type": "bool" },
        { "column_name": "latitude", "column_type": "double" },
        { "column_name": "longitude", "column_type": "double" },
        { "column_name": "company", "column_type": "string" },
        { "column_name": "country", "column_type": "string" }
      ]
    }

🔐 **API Info**

- Authentication: None required
- Supported formats: .parquet, .avro, .orc, .feather
- Free usage: No signup needed; API is currently open to the public
- Limits: Usage and file size limits may apply in the future (TBD)
by Jonathan
This workflow is part of an MSP collection, which is publicly available on GitHub.

This workflow archives or unarchives a Clockify project, depending on the Syncro ticket status. Note that Syncro should be set up with a webhook via 'Notification Set for Ticket - Status was changed'. It doesn't handle merging of tickets, as Syncro doesn't support a 'Notification Set' for merged tickets, so you should change a ticket to 'Resolved' first before merging it.

**Prerequisites**

- A Clockify account and credentials

**Nodes**

- Webhook node triggers the workflow.
- IF node filters projects that don't have the status 'Resolved'.
- Clockify nodes get all projects that do (or don't) have the status 'Resolved', based on the IF route.
- HTTP Request nodes unarchive unresolved projects and archive resolved projects, respectively.
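For reference, the archive and unarchive calls made by the HTTP Request nodes look roughly like this. This is a sketch assuming Clockify's standard project-update endpoint; the workspace ID, project ID, and API key are placeholders.

```bash
# Archive a project once its Syncro ticket is resolved
curl -X PUT "https://api.clockify.me/api/v1/workspaces/WORKSPACE_ID/projects/PROJECT_ID" \
  -H "X-Api-Key: YOUR_CLOCKIFY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"archived": true}'

# Unarchive it again if the ticket moves back to an unresolved status
curl -X PUT "https://api.clockify.me/api/v1/workspaces/WORKSPACE_ID/projects/PROJECT_ID" \
  -H "X-Api-Key: YOUR_CLOCKIFY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"archived": false}'
```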
by Tom
This is the workflow powering the n8n demo shown at StrapiConf 2022. The workflow searches for matching Tweets every 30 minutes using the Interval node and listens for form submissions using the Webhook node. Sentiment analysis is handled by Google using the Google Cloud Natural Language node before the result is stored in Strapi using the Strapi node. (These were originally two separate workflows that have been combined into one to simplify sharing.)
by Airtop
**About The LinkedIn Profile Discovery Automation**

Are you tired of manually searching for LinkedIn profiles or paying expensive data providers for often outdated information? If you spend countless hours trying to find accurate LinkedIn URLs for your prospects or candidates, this automation will change your workflow forever. Just give this workflow the information you have about a contact, and it will automatically augment it with a LinkedIn profile.

**How to find a LinkedIn Profile Link**

In this guide, you'll learn how to automate LinkedIn profile link discovery using Airtop's built-in node in n8n. Using this automation, you'll have a fully automated workflow that saves you hours of manual searching while providing accurate, validated LinkedIn URLs.

**What You'll Need**

- A free Airtop API key
- A Google Workspace account. If you have a Gmail account, you’re all set

Estimated setup time: 10 minutes

**Understanding the Process**

This automation leverages the power of intelligent search algorithms combined with LinkedIn validation to ensure accuracy. Here's how it works:

1. Takes your input data (name, company, etc.) and constructs intelligent search queries (a sample query is sketched below)
2. Utilizes Google search to identify potential LinkedIn profile URLs
3. Validates the discovered URLs directly against LinkedIn to ensure accuracy
4. Returns confirmed, accurate LinkedIn profile URLs

**Setting Up Your Automation**

Getting started with this automation is straightforward:

1. Prepare Your Google Sheet
   - Create a new Google Sheet with columns for input data (name, company, domain, etc.)
   - Add columns for the output LinkedIn URL and validation status (see this example)
2. Configure the Automation
   - Connect your Google Workspace account to n8n if you haven't already
   - Add your Airtop API credentials
   - (Optionally) Configure your Airtop Profile and sign in to LinkedIn in order to validate profile URLs
3. Run Your First Test
   - Add a few test entries to your Google Sheet
   - Run the workflow
   - Check the results in your output columns

**Customization Options**

While the default setup uses Google Sheets, this automation is highly flexible:

- **Webhook Integration**: Perfect for integrating with tools like Clay, Instantly, or your custom applications
- **Alternatives**: Replace Google Sheets with Airtable, Notion, or any other tools you already use for more robust database capabilities
- **Custom Output Formatting**: Modify the output structure to match your existing systems
- **Batch Processing**: Configure for bulk processing of multiple profiles

**Real-World Applications**

This automation has the potential to transform how organizations handle profile enrichment.

Recruiting Firm Success Story: With this automation, a recruiting firm could save hundreds of dollars a month in data enrichment fees, achieve better accuracy, and eliminate subscription costs. They would also be able to process thousands of profiles weekly with near-perfect accuracy.

Sales Team Integration: A B2B sales team could integrate this automation with their CRM, automatically enriching new leads with validated LinkedIn profiles and saving their SDRs hours per week on manual research.
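As a purely illustrative sketch of the kind of search query such a workflow might construct from one row of input data (this is a hypothetical query shape, not necessarily what the Airtop node builds internally):

```bash
# Build a Google query that restricts results to LinkedIn profile pages
QUERY='site:linkedin.com/in "Jane Doe" "Acme Corp"'

# URL-encode it with jq and print the resulting search URL
echo "https://www.google.com/search?q=$(jq -rn --arg q "$QUERY" '$q|@uri')"
```

The discovered candidate URLs are then opened and checked against LinkedIn itself, which is why signing in with an Airtop Profile improves validation.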
**Best Practices**

To maximize the accuracy of your results:

- Always include company information (domain or company name) with your search queries
- Use full names rather than nicknames or initials when possible
- Consider including location data for more accurate results with common names
- Implement rate limiting to respect LinkedIn's usage guidelines
- Keep your input data clean and standardized for best results
- Use the integrated proxy to navigate more effectively through Google and LinkedIn

**What's Next?**

Now that you've automated LinkedIn profile discovery, consider exploring related automations:

- Automated lead scoring based on LinkedIn profile data
- Email finder automation using validated LinkedIn profiles
- Integration with your CRM for automated contact enrichment
by Airtop
**About The Airtop Automation**

Are you tired of being shocked by unexpectedly high energy bills? With this automation using Airtop and n8n, you can take control of your daily energy costs and ensure you’re always informed.

**How to monitor your daily energy consumption**

With this automation, we’ll walk you through setting up a workflow that retrieves your PG&E (Pacific Gas and Electric) energy usage data, calculates costs, and emails you the details—all without manual effort.

**What You’ll Need**

To get started, make sure you have the following:

- A free Airtop API Key
- PG&E Account Credentials - with minor adaptations, this will also work with other providers
- An Email Address - to receive the energy cost updates

Estimated setup time: 5 minutes

**Understanding the Process**

This automation works by:

1. Logging into your PG&E account using your credentials
2. Navigating to your energy usage data
3. Extracting relevant details about energy consumption and costs
4. Emailing the daily summary directly to your inbox

The automation is straightforward and ensures you have real-time insights into your energy usage, empowering you to adjust your habits and save money.

**Setting Up Your Automation**

We’ve created a step-by-step guide to help you set up this workflow. Here’s how:

1. Insert Your Credentials:
   - In the tools section, add your PG&E login details as variables
   - In Airtop, add your Airtop API Key
   - Configure your email address to receive the updates
2. Run the Automation:
   - Start the scenario, and watch as the automation retrieves your energy data and sends you a detailed email summary.

**Customization Options**

While the default setup works seamlessly, you can tweak it to suit your needs:

- Data Storage: Store energy usage data in a database for long-term tracking and analysis
- Visualization: Plot graphs of your energy usage trends over time for better insights
- Notifications: Change the automation to only send alerts on high usage instead of a daily email

**Real-World Applications**

This automation isn’t just about monitoring energy usage; it’s about taking control. Here are some practical applications:

- Daily Energy Management: Receive updates every morning and adjust your energy consumption based on costs
- Smart Home Integration: Use the data to automate appliances during off-peak hours
- Budgeting: Track energy expenses over weeks or months to plan your budget more effectively

Happy automating!
by Ghaith Alsirawan
🧠 This workflow is designed for one purpose only: to bulk-upload structured JSON articles from an FTP server into a Qdrant vector database for use in LLM-powered semantic search, RAG systems, or AI assistants. The JSON files are pre-cleaned and contain metadata and rich text chunks, ready for vectorization.

**This workflow handles**

- Downloading from FTP
- Parsing & splitting
- Embedding with OpenAI embeddings
- Storing in Qdrant for future querying

**JSON structure format for blog articles**

    {
      "id": "article_001",
      "title": "reseguider",
      "language": "sv",
      "tags": ["london", "resa", "info"],
      "source": "alltomlondon.se",
      "url": "https://...",
      "embedded_at": "2025-04-08T15:27:00Z",
      "chunks": [
        {
          "chunk_id": "article_001_01",
          "section_title": "Introduktion",
          "text": "Välkommen till London..."
        },
        ...
      ]
    }

🧰 **Benefits**

- ✅ Automated Vector Loading: Handles FTP → JSON → Qdrant in a hands-free pipeline.
- ✅ Clean Embedding Input: Supports pre-validated chunks with metadata: titles, tags, language, and article ID.
- ✅ AI-Ready Format: Perfect for Retrieval-Augmented Generation (RAG), semantic search, or assistant memory.
- ✅ Flexible Architecture: Modular and swappable. FTP can be replaced with GDrive/Notion/S3, and embeddings can switch to local models like Ollama.
- ✅ Community Friendly: This template helps others adopt best practices for vector DB feeding and LLM integration.
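To illustrate what "Storing in Qdrant" amounts to at the API level, here is a minimal sketch of an upsert request for one embedded chunk. The collection name and payload fields mirror the JSON structure above but are assumptions; in the workflow itself this step is handled by the Qdrant node, and the real OpenAI embedding vector has 1536 dimensions rather than the three shown here.

```bash
# Upsert one embedded chunk into a hypothetical "blog_articles" collection
curl -X PUT "http://localhost:6333/collections/blog_articles/points?wait=true" \
  -H "Content-Type: application/json" \
  -d '{
    "points": [
      {
        "id": 1,
        "vector": [0.021, -0.134, 0.087],
        "payload": {
          "chunk_id": "article_001_01",
          "article_id": "article_001",
          "title": "reseguider",
          "section_title": "Introduktion",
          "language": "sv",
          "tags": ["london", "resa", "info"]
        }
      }
    ]
  }'
```

Keeping the chunk metadata in the payload is what later lets a RAG query filter by language, tags, or source article.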
by Tom
This workflow shows a no-code approach to creating Salesforce accounts and contacts based on data coming from Excel 365 (the online version of Microsoft Excel). For a version working with regular Excel files, check out this workflow instead.

To run the workflow:

1. Make sure you have both Excel 365 and Salesforce authenticated with n8n.
2. Have a Microsoft Excel workbook with contacts and their account names ready.
3. Select the workbook and sheet in the Microsoft Excel node of the workflow, then configure the range to read data from.
4. Hit the Execute Workflow button at the bottom of the n8n canvas.

Here is how it works:

1. The workflow first searches for existing Salesforce accounts by name.
2. It then branches out depending on whether the account already exists in Salesforce or not.
3. If an account does not exist yet, it will be created.
4. The data is then normalised before both branches converge again.
5. Finally, the contacts are created or updated as needed in Salesforce.
by Samir Saci
Tags: Automation, AI, Marketing, Content Creation

**Context**

I’m a Supply Chain Data Scientist and content creator who writes regularly about data-driven optimization, logistics, and sustainability. Promoting blog articles on LinkedIn used to be a manual task — until I decided to automate it with N8N and GPT-4o. This workflow lets you automatically extract blog posts, clean the content, and generate a professional LinkedIn post using an AI Agent powered by GPT-4o — all in one seamless automation.

> Save hours of repetitive work and boost your reach with AI.

📬 For business inquiries, you can add me on LinkedIn

**Who is this template for?**

This template is perfect for:

- **Bloggers and writers** who want to promote their content on LinkedIn
- **Marketing teams** looking to automate professional post-generation
- **Content creators** using Ghost platforms

It generates polished LinkedIn posts with:

- A hook
- A quick summary
- A call-to-action
- A signature to drive readers to your contact page

**How does it work?**

This workflow runs in N8N and performs the following steps:

- 🚀 Triggers manually, or you can add a scheduler
- 📰 Pulls recent blog posts from your Ghost site (via API)
- 🧼 Cleans the HTML content for AI input
- 🤖 Sends content to GPT-4o with a tailored prompt to create a LinkedIn post
- 📄 Records all data (post content + LinkedIn output) in a Google Sheet

**What do I need to start?**

You don’t need to write a single line of code. Prerequisites:

- A Ghost CMS account with blog content
- A Google Sheet to store generated posts
- An OpenAI API Key
- **Google Sheets API** connected via OAuth2

**Next Steps**

Use the sticky notes in the workflow to understand how to:

- Add your Ghost API credentials
- Link your Google Sheet
- Customize the AI prompt (e.g., change the author name or tone)
- Optionally add auto-posting to LinkedIn using tools like Buffer or Make

🎥 Watch My Tutorial

🚀 Want to explore how automation can scale your brand or business?
📬 Let’s connect on LinkedIn

**Notes**

- You can adapt this template for Twitter, Facebook, or even email newsletters by adjusting the prompt and output channel.
- This workflow was built using n8n 1.85.4

Submitted: April 9th, 2025
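For reference, the "pulls recent blog posts from your Ghost site" step corresponds to a Ghost Content API call along these lines. This is a sketch; the domain and Content API key are placeholders, and the exact field selection is an assumption.

```bash
# Fetch the latest posts, including their HTML content, from Ghost's Content API
curl -s "https://your-ghost-site.com/ghost/api/content/posts/?key=YOUR_CONTENT_API_KEY&limit=5&fields=title,url,html,published_at" \
  -H "Accept-Version: v5.0"
```

The html field is what the workflow strips down to clean text before handing it to the GPT-4o prompt.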
by Yaron Been
**LinkedIn Hiring Signal Scraper — Jobs & Prospecting Using Bright Data**

Purpose: Discover recent job posts from LinkedIn using Bright Data's Dataset API, clean the results, and log them into Google Sheets — for both job hunting and identifying high-intent B2B leads based on hiring activity.

**Use Cases**

- **Job Seekers** – Spot relevant openings filtered by role, city, and country.
- **Sales & Prospecting** – Use job posts as buying signals. If a company is hiring for a role you support (e.g. marketers, developers, ops) — it's the perfect time to reach out and offer your services.

**Tools Needed**

n8n Nodes:

- Form Trigger
- HTTP Request
- Wait
- If
- Code
- Google Sheets
- Sticky Notes (for embedded guidance)

External Services:

- Bright Data (Dataset API)
- Google Sheets

**API Keys & Authentication Required**

- **Bright Data API Key** → Add in the HTTP Request headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY
- **Google Sheets OAuth2** → Connect your account in n8n to allow read/write access to the spreadsheet.

**General Guidelines**

- Use descriptive names for all nodes.
- Include retry logic in polling to avoid infinite loops.
- Flatten nested fields (like job_poster and base_salary).
- Strip out HTML tags from job descriptions for clean output.

**Things to be Aware Of**

- Bright Data snapshots take ~1–3 minutes — use a Wait node and polling.
- Form filters affect output significantly: 🔍 we recommend filtering by "Last 7 days" or "Past 24 hours" for fresher data.
- Avoid hardcoding values in the form — leave optional filters empty if unsure.

**Post-Processing & Outreach**

After data lands in Google Sheets, you can use it to:

- Personalize cold emails based on job titles, locations, and hiring signals.
- Send thoughtful LinkedIn messages (e.g., "Saw you're hiring a CMO...")
- Prioritize outreach to companies actively growing in your niche.

**Additional Notes**

📄 Copy the Google Sheet Template: Click here to make your copy → Rename for each campaign or client.

Form fields include:

- Job Location (city or region)
- Keyword (e.g., CMO, Backend Developer)
- Country (2-letter code, e.g., US, UK)

This workflow gives you a competitive edge —
📌 For candidates: Be first to apply.
📌 For sellers: Be first to pitch.
All based on live hiring signals from LinkedIn.

**STEP-BY-STEP WALKTHROUGH**

Step 1: Set up your Google Sheet

1. Open this template
2. Go to File → Make a copy
3. You'll use this copy as the destination for the scraped job posts

Step 2: Fill out the Input Form in n8n

The form allows you to define what kind of job posts you want to scrape. Fields:

- **Job Location** → e.g. New York, Berlin, Remote
- **Keyword** → e.g. CMO, AI Architect, Ecommerce Manager
- **Country Code (2-letter)** → e.g. US, UK, IL

💡 Pro Tip: For best results, set the filter inside the workflow to time_range = "Past 24 hours" or "Last 7 days". This keeps results relevant and fresh.

Step 3: Trigger Bright Data Snapshot

The workflow sends a request to Bright Data with your input.

Example API Call Body:

    [
      {
        "location": "New York",
        "keyword": "Marketing Manager",
        "country": "US",
        "time_range": "Past 24 hours",
        "job_type": "Part-time",
        "experience_level": "",
        "remote": "",
        "company": ""
      }
    ]

Bright Data will start preparing the dataset in the background.

Step 4: Wait for the Snapshot to Complete

The workflow includes a Wait node and polling loop that checks every few minutes until the data is ready. You don't need to do anything here — it's all automated.
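To make Steps 3 and 4 concrete, the trigger and polling requests the HTTP Request nodes send look roughly like this. This is a sketch assuming Bright Data's Dataset API v3 trigger and snapshot endpoints; the dataset ID, snapshot ID, and API key are placeholders.

```bash
# Step 3 – trigger a new snapshot for the LinkedIn jobs dataset
curl -X POST "https://api.brightdata.com/datasets/v3/trigger?dataset_id=YOUR_DATASET_ID&format=json" \
  -H "Authorization: Bearer YOUR_BRIGHTDATA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[{"location": "New York", "keyword": "Marketing Manager", "country": "US", "time_range": "Past 24 hours"}]'
# → returns something like {"snapshot_id": "s_abc123"}

# Step 4 – poll until the snapshot is ready, then download the rows
curl "https://api.brightdata.com/datasets/v3/snapshot/s_abc123?format=json" \
  -H "Authorization: Bearer YOUR_BRIGHTDATA_API_KEY"
```

The Wait and If nodes simply repeat the second request until Bright Data reports the snapshot as ready, which is why retry limits in the polling loop matter.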
Step 5: Clean Up the Results

Once Bright Data responds with the full job post list:

- ✔️ Nested fields like job_poster and base_salary are flattened
- ✔️ HTML in job descriptions is removed
- ✔️ Final data is formatted for export

Step 6: Export to Google Sheets

The final cleaned list is added to your Google Sheet (first tab). Each row = one job post, with columns like: job_title, company_name, location, salary_min, apply_link, job_description_plain

Step 7: Use the Data for Outreach or Research

Example for Job Seekers. You search for:

- Location: Berlin
- Keyword: Product Designer
- Country: DE
- Time range: Past 7 days

Now you've got a live list of roles — with salary, recruiter info, and apply links. → Use it to apply faster than others.

Example for Prospecting (Sales / SDR). You search for:

- Location: London
- Keyword: Growth Marketing
- Country: UK

And find companies hiring growth marketers. → That's your signal to offer help with media buying, SEO, CRO, or your relevant service.

Use the data to:

- Write personalized cold emails ("Saw you're hiring a Growth Marketer…")
- Start warm LinkedIn outreach
- Build lead lists of companies actively expanding in your niche

**API Credentials Required**

- **Bright Data API Key** – Used in HTTP headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY
- **Google Sheets OAuth2** – Allows n8n to read/write to your spreadsheet

**Adjustments & Customization Tips**

- Modify the HTTP Request body to add more filters (e.g. job_type, remote, company)
- Increase or reduce polling wait time depending on Bright Data speed
- Add scoring logic to prioritize listings based on title or location

**Final Notes**

- 📄 Google Sheet Template: Make your copy here
- ⚙️ Bright Data Dataset API: Visit BrightData.com
- 📬 Personalization works best when you act quickly. Use the freshest data to reach out with context — not generic pitches.

This workflow turns LinkedIn job posts into sales insights and job leads. All in one click. Fully automated. Ready for your next move.