by Jonathan
Task: Handle dates and times in your workflow

Why: Date and time formats can be hard to work with. n8n offers two main approaches that cover all the common needs.

Main use cases:
- Change date format
- Set custom dates (incl. now and today)
- Date math
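As a minimal sketch of all three use cases, here is a Code-node example using Luxon's `DateTime`, which n8n exposes inside Code nodes; the input date and target formats are placeholders:

```js
// n8n Code node sketch: the three date/time use cases via Luxon's DateTime.
const input = '2024-12-31'; // placeholder input date

// 1. Change date format: parse an ISO string, re-emit it in another format.
const reformatted = DateTime.fromISO(input).toFormat('dd/MM/yyyy');

// 2. Set custom dates, including "now" and "today".
const now = DateTime.now().toISO();
const today = DateTime.now().startOf('day').toISODate();

// 3. Date math: add 7 days; count the days remaining until the input date.
const nextWeek = DateTime.fromISO(input).plus({ days: 7 }).toISODate();
const daysLeft = Math.floor(DateTime.fromISO(input).diff(DateTime.now(), 'days').days);

return [{ json: { reformatted, now, today, nextWeek, daysLeft } }];
```

In expressions, the same is available through the built-in `$now` and `$today` variables, e.g. `{{ $now.plus({ days: 7 }).toFormat('yyyy-MM-dd') }}`.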
by siyad
Workflow Description:
This workflow automates the synchronization of product data from a Shopify store to a Google Sheets document, ensuring seamless management and tracking. It retrieves product details such as title, tags, description, and price from Shopify via GraphQL queries. The outcome is a comprehensive list of products neatly organized in Google Sheets for easy access and analysis.

Key Features:
- **Automated:** Runs on a schedule you define (e.g., daily, hourly) to keep your product data fresh.
- **Complete Product Details:** Retrieves titles, descriptions, variants, images, inventory, and more.
- **Cursor-Based Pagination:** Efficiently handles large product sets by navigating pages without starting from scratch (see the query sketch below).
- **Google Sheets Integration:** Writes product data directly to your designated sheets.

Set up Instructions:
1. Set up the GraphQL node with Header Authentication for Shopify.
2. Create Google Sheets credentials: follow this guide to set up your Google Sheets credentials for n8n: https://docs.n8n.io/integrations/builtin/credentials/google/
3. Choose your Google Sheet: select the sheet where you want product information written. For the setup, we need a document with two sheets: one for storing Shopify data, one for storing cursor details. Google Sheet template: https://docs.google.com/spreadsheets/d/1I6JnP8ugqmMD5ktJlNB84J1MlSkoCHhAEuCofSa3OSM
4. Schedule and run: decide how often you want the data refreshed (daily, hourly, etc.) and let n8n do its magic!
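For reference, the cursor-paginated query the GraphQL node sends looks roughly like the sketch below. The exact fields and page size are assumptions, so verify them against your Shopify Admin API version; `$cursor` starts empty and is refilled from `pageInfo.endCursor` (persisted in the second sheet) on each run:

```js
// Sketch of a cursor-paginated Shopify Admin GraphQL query.
// Field names such as priceRangeV2 are illustrative; check your API version.
const query = `
  query ($cursor: String) {
    products(first: 50, after: $cursor) {
      edges {
        node {
          title
          tags
          description
          priceRangeV2 { minVariantPrice { amount currencyCode } }
        }
      }
      pageInfo { hasNextPage endCursor }
    }
  }`;
```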
by Kevin Cole
How It Works
This workflow sends an HTTP request to OpenAI's Text-to-Speech (TTS) model and returns an .mp3 audio recording of the provided text. This template is meant to be adapted for your individual use case and requires a valid OpenAI credential.

Gotchas
Per OpenAI's Usage Policies, if you use this workflow to provide audio output to users, you must provide a clear disclosure to end users that the TTS voice they are hearing is AI-generated and not a human voice.
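As a rough sketch (not the exact node configuration), the HTTP Request node's call maps to something like the following; the model and voice values are placeholders:

```js
// Hedged sketch of the OpenAI TTS request the workflow's HTTP Request node makes.
const response = await fetch('https://api.openai.com/v1/audio/speech', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'tts-1',   // placeholder; pick the TTS model you use
    voice: 'alloy',   // placeholder voice
    input: 'Note: this voice is AI-generated, not a human voice.',
  }),
});
const mp3 = Buffer.from(await response.arrayBuffer()); // binary .mp3 payload
```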
by Eduard
🚀 Supercharge Your Website Indexing with This Powerful n8n Workflow! 🌐

Google page indexing too slow? Tired of manually clicking through each page in the Google Search Console? 😴 Say goodbye to that tedious process and hello to automation with this n8n workflow! 🎉

**NB: this workflow was tested with sitemap.xml generated by Ghost CMS and WordPress. Reach out to Eduard if you need help adapting this workflow to your specific use case!**

⚙️ How this automation works
- 📅 The workflow runs on a schedule or when you click "Test workflow".
- 🌐 It fetches the website's primary sitemap.xml and extracts all the content-specific sitemaps (this is a typical structure of the sitemap).
- 📑 Each content-specific sitemap is then parsed to retrieve the individual page data.
- 🔄 The extracted page data is converted to JSON format for easy manipulation.
- 🗃️ The lastmod (last modified date) and loc (page URL) fields are assigned to each page entry to ensure compliance with the Sitemap protocol.
- 🔀 The page entries are sorted by the lastmod field in descending order (newest to oldest).
- 🔁 The workflow then loops over each page entry and performs the following steps (sketched in code at the end of this description):
  - 🔍 Checks the URL metadata in the Google Indexing API.
  - ✅ If the page is new or has been updated since the last indexing request, it sends a request to the Google Indexing API to update the URL.
  - ⏳ Waits a second and moves on to the next page.

🌟 Benefits
- ⏰ Save time by automating the indexing process.
- 🎯 Ensure all your website pages are consistently indexed by Google.
- 🚀 Improve your website's visibility and search engine rankings.
- 🛠️ Customize the workflow to fit your specific CMS and requirements.

🔧 Getting started
To start using this powerful n8n workflow, follow these steps:
- ☑️ Make sure to verify the website ownership in the Google Search Console.
- 👨‍💻 Import the workflow JSON into your n8n instance. Edit the Get sitemap.xml node and update the URL with your website's valid sitemap.xml.
- 🔑 Set up the necessary credentials for the Google Indexing API.
- 🎚️ Adjust the schedule trigger to run the workflow at your desired frequency.
- 🎉 Sit back and let the workflow handle the indexing process for you!

Ready to take your website indexing to the next level? 🚀 Try this workflow now and see the difference it makes! 😊

⚠️ IMPORTANT NOTE 1
Need help with connecting Google Cloud Platform to n8n? Check out our article on connecting Google Sheets to n8n. The process is mainly the same. When activating Google APIs, make sure to add the Web Search Indexing API. Also, on the credential page in n8n, add the https://www.googleapis.com/auth/indexing scope. Check out Yulia's page for more n8n workflows!

⚠️ IMPORTANT NOTE 2
A free Google Cloud Platform account allows (re)indexing only 200 pages per day. If your website has more, the workflow will automatically fail on the quota limit ⛔. The next day it will skip the previously added items and continue with the remaining pages.

Example: assuming you have a free Google account and 500 pages on your website that don't change for 3 days:
- On the first day, 200 pages will be added for indexing and the workflow will fail due to quota limits.
- On the second day, the workflow will check the first 200 pages again and skip them (because the date of re-indexing is later than the page's last modified date). The next 200 pages will be added for indexing. The workflow will fail again due to quota limits.
- On the third day, 400 pages will be checked and skipped, the last 100 pages will be added for indexing, and the workflow finishes successfully.
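For orientation, here is a hedged sketch of the two Indexing API calls made per page; `page` (with `loc`/`lastmod` from the sitemap) and the OAuth2 `token` are assumed to come from earlier nodes:

```js
// Check a URL's indexing metadata, then publish an update only if the page
// changed after the last notification (per-page loop body, sketched).
const base = 'https://indexing.googleapis.com/v3';

const meta = await fetch(
  `${base}/urlNotifications/metadata?url=${encodeURIComponent(page.loc)}`,
  { headers: { Authorization: `Bearer ${token}` } }
).then((r) => r.json());

const lastNotified = meta.latestUpdate?.notifyTime ?? '1970-01-01T00:00:00Z';
if (new Date(page.lastmod) > new Date(lastNotified)) {
  await fetch(`${base}/urlNotifications:publish`, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({ url: page.loc, type: 'URL_UPDATED' }),
  });
}
```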
by Recrutei Automações
What This Workflow Does
This workflow automates the candidate nurturing process, solving the common problem of candidates losing interest or "ghosting" after an application. It keeps them engaged and informed by sending a personalized, multi-channel (WhatsApp & Gmail) sequence of follow-up messages over their first week.

The automation triggers when a new candidate is added to your ATS (e.g., via a Recrutei webhook). It then uses AI to generate a custom 3-part message (for Day 1, Day 3, and Day 7) tailored to the candidate's age and the specific job they applied for, ensuring a professional and empathetic experience that strengthens your employer brand.

How it Works
1. Trigger: A Webhook node captures the new candidate data from your Applicant Tracking System (ATS) or form.
2. Data Preparation: Two Code nodes clean the incoming data. The first (Separating information) extracts key fields and formats the phone number. The second (Extract age) calculates the candidate's age from their birthday to be used by the AI (a sketch of this step follows below).
3. AI Content Generation: The workflow sends the candidate's details (name, age, job title) to an AI model (AI Recruitment Assistant). The AI has a detailed system prompt to generate three distinct messages for Day 1 (Thank You), Day 3 (Friendly Reminder), and Day 7 (Final Reinforcement), adapting its tone based on the candidate's age.
4. Split Messages: A Code node (Separating messages per days) receives the single text block from the AI and splits it into three separate variables (day1, day3, day7).
5. Day 1 Send: The workflow immediately sends the day1 message via both Gmail and WhatsApp (configured for Evolution API).
6. Day 3 Send: A "Wait" node pauses the workflow for 2 days, after which it sends the day3 message.
7. Day 7 Send: Another "Wait" node pauses for 4 more days, then sends the final day7 message, completing the 7-day nurturing sequence.

Setup Instructions
This workflow is plug-and-play once you configure the following 5 steps:
1. Webhook Node: Copy the Test URL from the Webhook node and configure it in your ATS (e.g., Recrutei) or form builder to trigger whenever a new candidate is added. Run one test submission to make the data structure visible to n8n.
2. AI Credentials: In the AI Recruitment Assistant node, select or create your OpenAI API credential.
3. MCP Credential (Optional): If you use a Recrutei MCP, paste your endpoint URL into the MCP Recrutei node.
4. Gmail Credentials: In all three Message Gmail nodes (Day 1, 3, 7), select or create your Gmail (OAuth2) credential. Optional: in the same nodes, go to Options and change the Sender Name from your_company to your actual company name.
5. WhatsApp (Evolution API): This template is pre-configured for the Evolution API. In all three Message WhatsApp nodes (Day 1, 3, 7), you must:
   - URL: Replace {server-url} and {instance} with your Evolution API details.
   - Headers: In the "Header Parameters" section, replace your_api_key with your actual Evolution API key.
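A minimal sketch of what the Extract age Code node does, assuming the candidate's birthday arrives as an ISO date string (the field name is an assumption):

```js
// n8n Code node sketch: derive the candidate's age from their birthday.
const birthday = new Date($json.birthday); // field name assumed, e.g. "1995-06-20"
const now = new Date();

let age = now.getFullYear() - birthday.getFullYear();
// Subtract one year if this year's birthday hasn't happened yet.
const hadBirthdayThisYear =
  now.getMonth() > birthday.getMonth() ||
  (now.getMonth() === birthday.getMonth() && now.getDate() >= birthday.getDate());
if (!hadBirthdayThisYear) age -= 1;

return [{ json: { ...$json, age } }];
```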
by ConvertAPI
Who is this for?
For developers and organizations that need to convert DOCX files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
- Downloads the DOCX file from the web.
- Converts the DOCX file to PDF.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, additional Body Parameters can be added for the converter.
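For orientation, the HTTP Request node's call maps roughly to the sketch below (the body shape follows ConvertAPI's JSON request format; the secret and file URL are placeholders). The two sibling templates that follow swap only the conversion path, to /convert/jpg/to/pdf and /convert/pdf/to/pdfa:

```js
// Hedged sketch of a ConvertAPI DOCX-to-PDF conversion request.
const res = await fetch(
  'https://v2.convertapi.com/convert/docx/to/pdf?Secret=YOUR_SECRET',
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      Parameters: [
        { Name: 'File', FileValue: { Url: 'https://example.com/sample.docx' } },
      ],
    }),
  }
);
const result = await res.json(); // Files[0].FileData holds the base64-encoded PDF
```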
by ConvertAPI
Who is this for?
For developers and organizations that need to convert image files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
- Downloads the JPG file from the web.
- Converts the JPG file to PDF.
- Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, additional Body Parameters can be added for the converter.
by ConvertAPI
Who is this for?
For developers and organizations that need to convert PDF files to PDF/A for long-term archiving.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
- Downloads the PDF file from the web.
- Converts the PDF file to PDF/A.
- Stores the PDF/A file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
- Optionally, additional Body Parameters can be added for the converter.
by Oneclick AI Squad
AI-Driven Tax Compliance & Deadline Management System

Description
Automate tax deadline monitoring with AI-powered insights. This workflow checks your tax calendar daily at 8 AM, uses GPT-4 to analyze upcoming deadlines across multiple jurisdictions, detects overdue and critical items, and sends intelligent alerts via email and Slack only when immediate action is required. Perfect for finance teams and accounting firms who need proactive compliance management without manual tracking. 🏛️🤖📊

Good to Know
- **AI-Powered**: GPT-4 provides risk assessment and strategic recommendations
- **Multi-Jurisdiction**: Handles Federal, State, and Local tax requirements automatically
- **Smart Alerts**: Only notifies executives when deadlines are overdue or critical (≤3 days)
- **Priority Classification**: Categorizes deadlines as Overdue, Critical, High, or Medium priority
- **Dual Notifications**: Critical alerts to leadership + daily summaries to team channel
- **Complete Audit Trail**: Logs all checks and deadlines to Google Sheets for compliance records

How It Works
1. Daily Trigger - Runs at 8:00 AM every morning
2. Fetch Data - Pulls tax calendar and company configuration from Google Sheets
3. Analyze Deadlines - Calculates days remaining, filters by jurisdiction/entity type, categorizes by priority (see the sketch at the end of this description)
4. AI Analysis - GPT-4 provides strategic insights and risk assessment on upcoming deadlines
5. Smart Routing - Only sends alerts if overdue or critical deadlines exist
6. Critical Alerts - HTML email to executives + Slack alert for urgent items
7. Team Updates - Slack summary to finance channel with all upcoming deadlines
8. Logging - Records compliance check results to Google Sheets for audit trail

Requirements

Google Sheets Structure

Sheet 1: TaxCalendar
| DeadlineID | DeadlineName | DeadlineDate | Jurisdiction | Category | AssignedTo | IsActive |
|------------|--------------|--------------|--------------|----------|------------|----------|
| FED-Q1 | Form 1120 Q1 | 2025-04-15 | Federal | Income | John Doe | TRUE |

Sheet 2: CompanyConfig (single row)
| Jurisdictions | EntityType | FiscalYearEnd |
|---------------|------------|---------------|
| Federal, California | Corporation | 12-31 |

Sheet 3: ComplianceLog (auto-populated)
| Date | AlertLevel | TotalUpcoming | CriticalCount | OverdueCount |
|------|------------|---------------|---------------|--------------|
| 2025-01-15 | HIGH | 12 | 3 | 1 |

Credentials Needed
- Google Sheets - Service Account OAuth2
- OpenAI - API Key (GPT-4 access required)
- SMTP - Email account for sending alerts
- Slack - Bot Token with chat:write permission

Setup Steps
1. Import the workflow JSON into n8n
2. Add all 4 credentials
3. Replace these placeholders:
   - YOUR_TAX_CALENDAR_ID - Tax calendar sheet ID
   - YOUR_CONFIG_ID - Company config sheet ID
   - YOUR_LOG_ID - Compliance log sheet ID
   - C12345678 - Slack channel ID
   - tax@company.com - Sender email
   - cfo@company.com - Recipient email
4. Share all sheets with the Google service account email
5. Invite the Slack bot to channels
6. Test the workflow manually
7. Activate the trigger

Customizing This Workflow
- Change Alert Thresholds: Edit the "Analyze Deadlines" node:
  - Critical: Change <= 3 to <= 5 for a 5-day warning
  - High: Change <= 7 to <= 14 for a 2-week notice
  - Medium: Change <= 30 to <= 60 for a 2-month lookahead
- Adjust Schedule: Edit the "Daily Tax Check" trigger:
  - Change hour/minute for a different run time
  - Add multiple trigger times for tax season (8 AM, 2 PM, 6 PM)
- Add More Recipients: Edit the "Send Email" node:
  - To: cfo@company.com, director@company.com
  - CC: accounting@company.com
  - BCC: archive@company.com
- Customize Email Design: Edit the "Format Email" node to change colors, add a logo, or modify the layout
- Add SMS Alerts: Insert a Twilio node after "Is Critical" for emergency notifications
- Integrate Task Management: Add an HTTP Request node to create tasks in Asana/Jira for critical deadlines

Troubleshooting
| Issue | Solution |
|-------|----------|
| No deadlines found | Check date format (YYYY-MM-DD) and IsActive = TRUE |
| AI analysis failed | Verify OpenAI API key and account credits |
| Email not sending | Test SMTP credentials and check if critical condition met |
| Slack not posting | Invite bot to channel and verify channel ID format |
| Permission denied | Share Google Sheets with service account email |

📞 Professional Services
Need help with implementation or customization? Our team offers:
- 🎯 Custom workflow development
- 🏢 Enterprise deployment support
- 🎓 Team training sessions
- 🔧 Ongoing maintenance
- 📊 Custom reporting & dashboards
- 🔗 Additional API integrations

Discover more workflows – Get in touch with us
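The priority classification in the Analyze Deadlines node corresponds roughly to this Code-node sketch (thresholds match the defaults above; field names follow the TaxCalendar sheet):

```js
// n8n Code node sketch: compute days remaining and a priority bucket
// for each active deadline row from the TaxCalendar sheet.
const today = new Date();

return items
  .filter((item) => item.json.IsActive === 'TRUE')
  .map((item) => {
    const d = item.json;
    const days = Math.ceil((new Date(d.DeadlineDate) - today) / 86_400_000);

    let priority;
    if (days < 0) priority = 'Overdue';
    else if (days <= 3) priority = 'Critical';
    else if (days <= 7) priority = 'High';
    else if (days <= 30) priority = 'Medium';
    else priority = 'Low';

    return { json: { ...d, daysRemaining: days, priority } };
  });
```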
by Angel Menendez
Introducing the Qualys Scan Slack Report Subworkflow, a robust solution designed to automate the generation and retrieval of security reports from the Qualys API. This workflow is a subworkflow of the Qualys Slack Shortcut Bot workflow. It is triggered when someone fills out the modal popup in Slack generated by the Qualys Slack Shortcut Bot.

When deploying this workflow, use the Demo Data node to simulate the data that is input via the Execute Workflow Trigger. That data flows into the Global Variables node, which is then referenced by the rest of the workflow. The workflow includes nodes to fetch the report template IDs, launch a report, check the report status periodically, and download the completed report, which is then posted to Slack for easy access. For Security Operations Centers (SOCs), this workflow provides significant benefits by automating tedious tasks, ensuring timely updates, and facilitating efficient data handling.

How It Works
- **Fetch Report Templates:** The "Fetch Report IDs" node retrieves a list of available report templates from Qualys. This automated retrieval saves time and ensures that the latest templates are used, enhancing the accuracy and relevance of reports.
- **Convert XML to JSON:** The response is converted to JSON format for easier manipulation. This step simplifies data handling, making it easier for SOC analysts to work with the data and integrate it into other tools or processes.
- **Launch Report:** A POST request is sent to Qualys to initiate report generation using specified parameters like template ID and report title. Automating this step ensures consistency and reduces the chance of human error, improving the reliability of the reports generated.
- **Loop and Check Status:** The workflow loops every minute to check if the report generation is complete (sketched in code below). Continuous monitoring automates the waiting process, freeing up SOC analysts to focus on higher-priority tasks while ensuring they are promptly notified when reports are ready.
- **Download Report:** Once the report is ready, it is downloaded from Qualys. Automated downloading ensures that the latest data is always available without manual intervention, improving efficiency.
- **Post to Slack:** The final report is posted to a designated Slack channel for quick access. This integration with Slack ensures that the team can promptly access and review the reports, facilitating swift action and decision-making.

Get Started
- Ensure your Slack and Qualys integrations are properly set up.
- Customize the workflow to fit your specific reporting needs.
- Link to parent workflow
- Link to Vulnerability Scan Trigger

Need Help? Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your security report generation process, improve response times, and enhance the efficiency of your security operations.
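A hedged sketch of the launch-poll-download cycle against the Qualys Report API (paths follow the Qualys VM/PC v2 API docs; your platform URL, auth, and the XML parsing will differ, and n8n implements the pause with a Wait node in a loop rather than a timer):

```js
// Hedged sketch: launch a Qualys report, poll until finished, then fetch it.
const api = 'https://qualysapi.qualys.com/api/2.0/fo/report/';
const headers = { 'X-Requested-With': 'n8n', Authorization: basicAuth }; // auth assumed

// 1. Launch a report from a template; the response XML contains the report ID.
const launchXml = await fetch(
  `${api}?action=launch&template_id=${templateId}&report_title=SOC+Report`,
  { method: 'POST', headers }
).then((r) => r.text());
const reportId = /<VALUE>(\d+)<\/VALUE>/.exec(launchXml)?.[1];

// 2. Poll once a minute until the report state reads "Finished".
let state;
do {
  await new Promise((resolve) => setTimeout(resolve, 60_000));
  const listXml = await fetch(`${api}?action=list&id=${reportId}`, { headers })
    .then((r) => r.text());
  state = /<STATE>(\w+)<\/STATE>/.exec(listXml)?.[1];
} while (state !== 'Finished');

// 3. Download the completed report for posting to Slack.
const report = await fetch(`${api}?action=fetch&id=${reportId}`, { headers });
```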
by Jenny
Vector Database as a Big Data Analysis Tool for AI Agents

Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images, and hence to many production use cases:
1. Uploading (image) datasets to Qdrant
2. Setting up meta-variables for anomaly detection in Qdrant
3. Anomaly detection tool
4. KNN classifier tool

For anomaly detection
1. This is the first pipeline, to upload an image dataset to Qdrant.
2. The second pipeline sets up cluster (class) centres & cluster (class) threshold scores needed for anomaly detection.
3. The third is the anomaly detection tool, which takes any image as input and uses all the preparatory work done with Qdrant to detect whether it's an anomaly relative to the uploaded dataset.

For KNN (k nearest neighbours) classification
1. This is the first pipeline, to upload an image dataset to Qdrant.
2. The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.

To recreate both
You'll have to upload the crops and lands datasets from Kaggle to your own Google Cloud Storage bucket, and re-create the APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API & Google Cloud Storage.

[This workflow] Batch Uploading an Image Dataset to Qdrant
This template imports dataset images from Google Cloud Storage, creates Voyage AI embeddings for them in batches, and uploads them to Qdrant, also in batches. In this particular template, we work with the crops dataset. However, it's analogous to uploading the lands dataset, and in general, it's adaptable to any dataset consisting of image URLs (as the following pipelines are).

1. First, check for an existing Qdrant collection to use; otherwise, create it here. Additionally, when creating the collection, we'll create a payload index, which is required for a particular type of Qdrant request we will use later.
2. Next, import all (dataset) images from Google Cloud Storage but keep only non-tomato-related ones (for anomaly detection testing).
3. Create (per batch) embeddings for all imported images using the Voyage AI multimodal embeddings API.
4. Finally, upload the resulting embeddings and image descriptors to Qdrant via batch upload (sketched below).
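For the final step, the batch upload to Qdrant corresponds roughly to this sketch (Qdrant REST API; the collection name, ID scheme, and payload fields are assumptions based on the description above):

```js
// Hedged sketch: upsert one batch of Voyage AI embeddings into Qdrant.
const qdrantUrl = 'https://YOUR-CLUSTER.cloud.qdrant.io'; // placeholder cluster URL

await fetch(`${qdrantUrl}/collections/crops/points?wait=true`, {
  method: 'PUT',
  headers: {
    'api-key': process.env.QDRANT_API_KEY,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    points: batch.map((img) => ({
      id: img.id,                                           // point ID (UUID or integer)
      vector: img.embedding,                                // Voyage AI multimodal embedding
      payload: { image_url: img.url, crop: img.cropName },  // image descriptors (assumed fields)
    })),
  }),
});
```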
by darrell_tw
This workflow automates the process of fetching agricultural transaction data from the Taiwan Agricultural Products Open Data Platform and storing it in a Google Sheets document for further analysis.

Key Features
- Manual Trigger: Allows manual execution of the workflow to control when data is fetched.
- HTTP Request: Sends a request to the Open Data Platform's API to retrieve detailed transaction data, including:
  - Pricing (Upper, Middle, Lower, Average)
  - Transaction quantities
  - Crop and market details
- Split Out Node: Processes each record individually, ensuring accurate handling of every data entry.
- Google Sheets Integration: Appends the data into a structured Google Sheets document for easy access and analysis.

Node Configurations

1. Manual Trigger
- Purpose: Start the workflow manually.
- Configuration: No setup needed.

2. HTTP Request
- Purpose: Fetch agricultural data.
- Configuration:
  - URL: https://data.moa.gov.tw/api/v1/SheepQuotation
  - Query Parameters:
    - Start_time: 2024/12/01
    - End_time: 2024/12/31
    - MarketName: 台北二
    - api_key: <your_api_key>
  - Headers:
    - accept: application/json

3. Split Out
- Purpose: Split the API response data array into individual items.
- Configuration:
  - Field to Split Out: Data

4. Google Sheets
- Purpose: Append the data to Google Sheets.
- Configuration:
  - Operation: Append
  - Document ID: <your_document_id>
  - Sheet Name: Sheet1
  - Mapped Fields: TransDate, TcType, CropCode, CropName, MarketCode, MarketName, Upper_Price, Middle_Price, Lower_Price, Avg_Price, Trans_Quantity

Tip: use the Curl Import feature to set up the HTTP Request node, for example:

curl -X GET "https://data.moa.gov.tw/api/v1/AgriProductsTransType/?Start_time=114.01.01&End_time=114.01.01&MarketName=%E5%8F%B0%E5%8C%97%E4%BA%8C" -H "accept: application/json"

See the Agricultural Open Data Platform documentation for details.