by siyad
Workflow Description: This workflow automates the synchronization of product data from a Shopify store to a Google Sheets document, ensuring seamless management and tracking. It retrieves product details such as title, tags, description, and price from Shopify via GraphQL queries. The outcome is a comprehensive list of products neatly organized in Google Sheets for easy access and analysis.

Key Features:
- Automated: Runs on a schedule you define (e.g., daily, hourly) to keep your product data fresh.
- Complete Product Details: Retrieves titles, descriptions, variants, images, inventory, and more.
- Cursor-Based Pagination: Efficiently handles large product sets by resuming from the last saved cursor instead of starting from scratch (a sketch of this pattern follows below).
- Google Sheets Integration: Writes product data directly to your designated sheets.

Setup Instructions:
1. Set up the GraphQL node with Header Authentication for Shopify.
2. Create Google Sheets credentials. Follow this guide to set up your Google Sheets credentials for n8n: https://docs.n8n.io/integrations/builtin/credentials/google/
3. Choose your Google Sheet: select the document where you want product information written. The setup needs a document with two sheets: one for storing Shopify data and one for storing cursor details. Google Sheet template: https://docs.google.com/spreadsheets/d/1I6JnP8ugqmMD5ktJlNB84J1MlSkoCHhAEuCofSa3OSM
4. Schedule and run: decide how often you want the data refreshed (daily, hourly, etc.) and let n8n do its magic!
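For reference, here is a minimal sketch of the cursor-based pagination loop this workflow runs against Shopify's GraphQL Admin API. The shop domain, API version, token variable, and the exact set of fields queried are assumptions; adapt them to your store and needs.

```typescript
// Sketch: paginate Shopify products with cursors (assumed shop domain,
// API version, and token env var -- adjust to your setup).
const SHOP = "your-store.myshopify.com";       // assumption: your shop domain
const VERSION = "2024-01";                     // assumption: any current API version
const TOKEN = process.env.SHOPIFY_TOKEN ?? ""; // Admin API access token

const QUERY = `
  query Products($cursor: String) {
    products(first: 50, after: $cursor) {
      edges { node { title tags description } }
      pageInfo { hasNextPage endCursor }
    }
  }`;

async function fetchAllProducts(): Promise<any[]> {
  const products: any[] = [];
  // Persist this cursor (e.g., in the second sheet) to resume instead of restarting.
  let cursor: string | null = null;
  while (true) {
    const res = await fetch(`https://${SHOP}/admin/api/${VERSION}/graphql.json`, {
      method: "POST",
      headers: { "Content-Type": "application/json", "X-Shopify-Access-Token": TOKEN },
      body: JSON.stringify({ query: QUERY, variables: { cursor } }),
    });
    const { data } = await res.json();
    products.push(...data.products.edges.map((e: any) => e.node));
    if (!data.products.pageInfo.hasNextPage) break;
    cursor = data.products.pageInfo.endCursor;
  }
  return products;
}
```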
by Kevin Cole
How It Works
This workflow sends an HTTP request to OpenAI's Text-to-Speech (TTS) model and returns an .mp3 audio recording of the provided text; a minimal sketch of this request follows below. The template is meant to be adapted to your individual use case and requires a valid OpenAI credential.

Gotchas
Per OpenAI's Usage Policies, if you use this workflow to provide audio output to users, you must clearly disclose that the TTS voice they are hearing is AI-generated and not a human voice.
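A sketch of the underlying HTTP call, assuming the standard OpenAI speech endpoint with the `tts-1` model and `alloy` voice; the model, voice, and output filename are illustrative choices, not fixed by the template.

```typescript
// Sketch: call OpenAI's TTS endpoint and save the returned mp3 bytes.
import { writeFile } from "node:fs/promises";

async function textToSpeech(text: string): Promise<void> {
  const res = await fetch("https://api.openai.com/v1/audio/speech", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    // model/voice are assumptions; pick whichever TTS model and voice you prefer.
    body: JSON.stringify({ model: "tts-1", voice: "alloy", input: text }),
  });
  if (!res.ok) throw new Error(`TTS request failed: ${res.status}`);
  // The endpoint returns raw audio bytes (mp3 by default).
  await writeFile("speech.mp3", Buffer.from(await res.arrayBuffer()));
}

textToSpeech("This voice is AI-generated, not a human voice.").catch(console.error);
```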
by Eduard
Supercharge Your Website Indexing with This Powerful n8n Workflow!

Google page indexing too slow? Tired of manually clicking through each page in the Google Search Console? Say goodbye to that tedious process and hello to automation with this n8n workflow!

**NB: this workflow was tested with sitemap.xml generated by Ghost CMS and WordPress. Reach out to Eduard if you need help adapting this workflow to your specific use case!**

How this automation works
- The workflow runs on a schedule or when you click "Test workflow".
- It fetches the website's primary sitemap.xml and extracts all the content-specific sitemaps (this is a typical structure of the sitemap).
- Each content-specific sitemap is then parsed to retrieve the individual page data.
- The extracted page data is converted to JSON format for easy manipulation.
- The lastmod (last modified date) and loc (page URL) fields are assigned to each page entry to ensure compliance with the Sitemap protocol.
- The page entries are sorted by the lastmod field in descending order (newest to oldest).
- The workflow then loops over each page entry and performs the following steps:
  - Checks the URL metadata in the Google Indexing API.
  - If the page is new or has been updated since the last indexing request, it sends a request to the Google Indexing API to update the URL (sketched in code below).
  - Waits a second and moves on to the next page.

Benefits
- Save time by automating the indexing process.
- Ensure all your website pages are consistently indexed by Google.
- Improve your website's visibility and search engine rankings.
- Customize the workflow to fit your specific CMS and requirements.

Getting started
To start using this powerful n8n workflow, follow these steps:
1. Make sure to verify the website ownership in the Google Search Console.
2. Import the workflow JSON into your n8n instance.
3. Edit the Get sitemap.xml node and update the URL with your website's valid sitemap.xml.
4. Set up the necessary credentials for the Google Indexing API.
5. Adjust the schedule trigger to run the workflow at your desired frequency.
6. Sit back and let the workflow handle the indexing process for you!

Ready to take your website indexing to the next level? Try this workflow now and see the difference it makes!

IMPORTANT NOTE 1
Need help with connecting Google Cloud Platform to n8n? Check out our article on connecting Google Sheets to n8n; the process is mainly the same. When activating Google APIs, make sure to add the Web Search Indexing API. Also, on the credential page of n8n, add the https://www.googleapis.com/auth/indexing scope. Check out Yulia's page for more n8n workflows!

IMPORTANT NOTE 2
A free Google Cloud Platform account allows (re)indexing only 200 pages per day. If your website has more, the workflow will fail on the quota limit. The next day it will skip the previously added items and continue with the remaining pages.

Example: assuming you have a free Google account, 500 pages on your website, and they don't change for 3 days:
- On the first day, 200 pages are added for indexing and the workflow fails due to quota limits.
- On the second day, the workflow checks the first 200 pages again and skips them (because the re-indexing date is later than the page's last modified date). The next 200 pages are added for indexing, and the workflow fails again due to quota limits.
- On the third day, 400 pages are checked and skipped, the last 100 pages are added for indexing, and the workflow finishes successfully.
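A sketch of the two Indexing API calls the per-page loop performs, assuming you already hold an OAuth2 access token with the https://www.googleapis.com/auth/indexing scope (token acquisition is omitted here; n8n's Google credential handles it for you).

```typescript
// Sketch: check a URL's indexing metadata, then (re)submit it if stale.
const API = "https://indexing.googleapis.com/v3";

async function maybeReindex(pageUrl: string, lastmod: string, token: string) {
  const headers = { Authorization: `Bearer ${token}`, "Content-Type": "application/json" };

  // 1. Ask Google when this URL was last submitted (404 means never submitted).
  const meta = await fetch(
    `${API}/urlNotifications/metadata?url=${encodeURIComponent(pageUrl)}`,
    { headers },
  );
  const info = meta.ok ? await meta.json() : null;

  // Skip pages whose last submission is newer than their lastmod date.
  const lastNotify = info?.latestUpdate?.notifyTime;
  if (lastNotify && new Date(lastNotify) >= new Date(lastmod)) return;

  // 2. New or updated since the last request: ask Google to (re)index it.
  await fetch(`${API}/urlNotifications:publish`, {
    method: "POST",
    headers,
    body: JSON.stringify({ url: pageUrl, type: "URL_UPDATED" }),
  });
}
```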
by Yaron Been
This workflow automatically monitors social media advertising performance across platforms to track campaign effectiveness and ROI. It saves you time by eliminating the need to manually check multiple ad platforms and provides consolidated performance data for all your social media campaigns.

Overview
This workflow automatically scrapes social media advertising platforms to extract campaign performance metrics including impressions, clicks, conversions, and cost data. It uses Bright Data to access ad platforms without being blocked and AI to intelligently parse advertising data into structured performance reports.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping social ad platforms without being blocked
- **OpenAI**: AI agent for intelligent ad performance data extraction and analysis
- **Google Sheets**: For storing and organizing advertising performance data

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your ad performance tracking spreadsheet
5. Customize: Set target ad platform URLs and campaign monitoring parameters

Use Cases
- **Digital Marketing**: Track ROI and performance across all social media ad campaigns
- **Performance Analysis**: Identify top-performing ads and optimize underperforming campaigns
- **Budget Management**: Monitor ad spend and cost-per-acquisition metrics
- **Campaign Optimization**: Make data-driven decisions for ad creative and targeting

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #socialads #adperformance #brightdata #webscraping #digitalmarketing #n8nworkflow #workflow #nocode #adautomation #campaigntracking #socialmediamarketing #adanalytics #performancetracking #marketingautomation #admonitoring #campaignanalysis #socialadvertising #marketingdata #admetrics #digitaladvertising #adoptimization #campaignmonitoring #marketinganalysis #adinsights #socialmediaads #paidads #adcampaigns #marketingroi
by Recrutei Automações
What This Workflow Does
This workflow automates the candidate nurturing process, solving the common problem of candidates losing interest or "ghosting" after an application. It keeps them engaged and informed by sending a personalized, multi-channel (WhatsApp & Gmail) sequence of follow-up messages over their first week. The automation triggers when a new candidate is added to your ATS (e.g., via a Recrutei webhook). It then uses AI to generate a custom 3-part message (for Day 1, Day 3, and Day 7) tailored to the candidate's age and the specific job they applied for, ensuring a professional and empathetic experience that strengthens your employer brand.

How it Works
1. Trigger: A Webhook node captures the new candidate data from your Applicant Tracking System (ATS) or form.
2. Data Preparation: Two Code nodes clean the incoming data. The first (Separating information) extracts key fields and formats the phone number. The second (Extract age) calculates the candidate's age from their birthday to be used by the AI (a sketch of this step follows below).
3. AI Content Generation: The workflow sends the candidate's details (name, age, job title) to an AI model (AI Recruitment Assistant). The AI has a detailed system prompt to generate three distinct messages for Day 1 (Thank You), Day 3 (Friendly Reminder), and Day 7 (Final Reinforcement), adapting its tone based on the candidate's age.
4. Split Messages: A Code node (Separating messages per days) receives the single text block from the AI and splits it into three separate variables (day1, day3, day7).
5. Day 1 Send: The workflow immediately sends the day1 message via both Gmail and WhatsApp (configured for Evolution API).
6. Day 3 Send: A "Wait" node pauses the workflow for 2 days, after which it sends the day3 message.
7. Day 7 Send: Another "Wait" node pauses for 4 more days, then sends the final day7 message, completing the 7-day nurturing sequence.

Setup Instructions
This workflow is plug-and-play once you configure the following 5 steps:
1. Webhook Node: Copy the Test URL from the Webhook node and configure it in your ATS (e.g., Recrutei) or form builder to trigger whenever a new candidate is added. Run one test submission to make the data structure visible to n8n.
2. AI Credentials: In the AI Recruitment Assistant node, select or create your OpenAI API credential.
3. MCP Credential (Optional): If you use a Recrutei MCP, paste your endpoint URL into the MCP Recrutei node.
4. Gmail Credentials: In all three Message Gmail nodes (Day 1, 3, 7), select or create your Gmail (OAuth2) credential. Optional: In the same nodes, go to Options and change the Sender Name from your_company to your actual company name.
5. WhatsApp (Evolution API): This template is pre-configured for the Evolution API. In all three Message WhatsApp nodes (Day 1, 3, 7), you must:
   - URL: Replace {server-url} and {instance} with your Evolution API details.
   - Headers: In the "Header Parameters" section, replace your_api_key with your actual Evolution API key.
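A sketch of what the Extract age Code node might compute, assuming the webhook payload carries the birthday as an ISO date string; the field name `birthday` is a placeholder for whatever your ATS sends. n8n Code nodes run JavaScript, and the same logic works there unchanged.

```typescript
// Sketch: derive a candidate's age from their birthday.
function ageFrom(birthday: string, today: Date = new Date()): number {
  const born = new Date(birthday);
  let age = today.getFullYear() - born.getFullYear();
  // Subtract one if this year's birthday hasn't happened yet.
  const hadBirthday =
    today.getMonth() > born.getMonth() ||
    (today.getMonth() === born.getMonth() && today.getDate() >= born.getDate());
  return hadBirthday ? age : age - 1;
}

// In an n8n Code node you would typically map over $input.all() and attach
// the computed field, e.g. item.json.age = ageFrom(item.json.birthday).
console.log(ageFrom("1995-06-15"));
```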
by ConvertAPI
Who is this for?
For developers and organizations that need to convert DOCX files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
1. Downloads the DOCX file from the web.
2. Converts the DOCX file to PDF (see the sketch below).
3. Stores the PDF file in the local file system.

How to customize this workflow to your needs
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter.
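A rough sketch of the conversion call the HTTP Request node makes, assuming ConvertAPI's REST pattern `https://v2.convertapi.com/convert/{from}/to/{to}` with query-string auth; verify the exact endpoint, parameter names, and response shape against the endpoint list linked above. The same pattern applies to the JPG-to-PDF and PDF-to-PDFA templates below, with the format segments swapped.

```typescript
// Sketch: convert a remote DOCX to PDF via ConvertAPI and save it locally.
import { writeFile } from "node:fs/promises";

async function docxToPdf(fileUrl: string, secret: string): Promise<void> {
  // Assumed endpoint pattern; swap "docx"/"pdf" for other conversions.
  const endpoint = `https://v2.convertapi.com/convert/docx/to/pdf?Secret=${secret}`;
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // Assumed body shape: point the converter at a file by URL.
    body: JSON.stringify({ Parameters: [{ Name: "File", FileValue: { Url: fileUrl } }] }),
  });
  const result = await res.json();
  // Assumed response shape: converted files returned base64-encoded.
  const pdf = Buffer.from(result.Files[0].FileData, "base64");
  await writeFile("converted.pdf", pdf);
}
```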
by ConvertAPI
Who is this for?
For developers and organizations that need to convert image files to PDF.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
1. Downloads the JPG file from the web.
2. Converts the JPG file to PDF.
3. Stores the PDF file in the local file system.

How to customize this workflow to your needs
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter.
by ConvertAPI
Who is this for?
For developers and organizations that need to convert PDF files to PDF/A for long-term archiving.

What problem is this workflow solving?
The file format conversion problem.

What this workflow does
1. Downloads the PDF file from the web.
2. Converts the PDF file to PDF/A.
3. Stores the PDF/A file in the local file system.

How to customize this workflow to your needs
1. Open the HTTP Request node.
2. Adjust the URL parameter (all endpoints can be found here).
3. Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret.
4. Optionally, additional Body Parameters can be added for the converter.
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks local search trends and geographic-specific search patterns to optimize local SEO and marketing strategies. It saves you time by eliminating the need to manually research local search behavior and provides location-based insights for targeted marketing campaigns.

Overview
This workflow automatically scrapes local search results, geographic search trends, and location-based query data to understand regional search behavior and local market opportunities. It uses Bright Data to access location-specific search data and AI to intelligently analyze local trends and optimization opportunities.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping location-based search data without being blocked
- **OpenAI**: AI agent for intelligent local search trend analysis
- **Google Sheets**: For storing local search trend data and geographic insights

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your local trends tracking spreadsheet
5. Customize: Define target locations and local search monitoring parameters

Use Cases
- **Local SEO**: Optimize for location-specific search queries and trends
- **Regional Marketing**: Tailor campaigns to local search behavior and preferences
- **Multi-location Businesses**: Track search trends across different geographic markets
- **Market Expansion**: Identify new geographic opportunities based on search trends

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #localsearch #localseo #searchtrends #brightdata #webscraping #geographictrends #n8nworkflow #workflow #nocode #localmarketing #regionalseo #locationbased #localbusiness #searchgeography #localtrends #geoseo #localdata #regionalmarketing #localanalytics #geographicseo #localsearchdata #localoptimization #regionalsearch #locationmarketing #localsearchtrends #geomarketing #localinsights
by Yaron Been
CFO Forecasting Agent: Automated Revenue Predictions from Stripe Data

AI-Powered Financial Forecasting on Autopilot
Turn your Stripe sales data into intelligent revenue forecasts with this comprehensive CFO Forecasting Agent. This workflow automatically analyzes your transaction history, identifies trends, and generates professional 3-month revenue predictions using OpenAI's GPT-4.

What This Workflow Does:
- **Automated Data Collection**: Fetches and processes all Stripe charges daily (sketched below)
- **AI-Powered Analysis**: Uses OpenAI GPT-4 to analyze trends and predict future revenue
- **Structured Forecasting**: Generates monthly forecasts with confidence levels and insights
- **Multi-Platform Storage**: Saves results to both a Supabase database and Google Sheets
- **Scheduled Execution**: Runs automatically every day to keep forecasts current
- **Smart Context**: Optional Pinecone integration for historical context and improved accuracy

Key Features:
- **Daily automated execution** at 9 AM
- **Structured JSON output** with forecasts, trends, and confidence levels
- **Dual storage system** for data backup and easy reporting
- **RAG-enabled** for enhanced forecasting with historical context
- **Professional CFO-grade insights** and trend analysis

Prerequisites:
- Stripe account with API access
- OpenAI API key (GPT-4 recommended)
- Google Sheets API credentials
- Supabase account (optional)
- Pinecone account (optional, for enhanced context)

Perfect For:
- SaaS companies tracking subscription revenue
- E-commerce businesses needing sales forecasts
- Startups requiring investor-ready financial projections
- Finance teams automating reporting workflows

What You Get:
- Complete n8n workflow with all nodes configured
- Detailed documentation and setup instructions
- Sample data structure and output formats
- Ready-to-use Google Sheets template

Need Help or Want to Learn More?
Created by Yaron Been - Automation & AI Specialist
- Support: Yaron@nofluff.online
- YouTube Tutorials: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
Get more automation tips, tutorials, and advanced workflows on my channels!

Tags: AI, OpenAI, Stripe, Forecasting, Finance, CFO, Automation, Revenue, Analytics, GPT-4
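A sketch of the core fetch-and-forecast step, assuming the standard Stripe (`/v1/charges`) and OpenAI (`/v1/chat/completions`) endpoints. The forecast JSON schema, the `gpt-4o` model choice, and the 100-charge page are illustrative assumptions, not the workflow's exact configuration.

```typescript
// Sketch: pull recent Stripe charges, then ask an OpenAI model for a
// structured 3-month revenue forecast.
async function forecastRevenue(stripeKey: string, openaiKey: string) {
  // 1. Fetch recent charges (paginate with starting_after for full history).
  const charges = await fetch("https://api.stripe.com/v1/charges?limit=100", {
    headers: { Authorization: `Bearer ${stripeKey}` },
  }).then((r) => r.json());

  const history = charges.data
    .filter((c: any) => c.paid && !c.refunded)
    .map((c: any) => ({ amount: c.amount / 100, created: c.created }));

  // 2. Request a structured JSON forecast (schema here is an assumption).
  const completion = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${openaiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4o", // assumption: any JSON-mode-capable GPT-4-class model
      response_format: { type: "json_object" },
      messages: [
        {
          role: "system",
          content:
            "You are a CFO. Return JSON: {forecasts:[{month,revenue,confidence}],trends:[...]}",
        },
        { role: "user", content: `Charge history: ${JSON.stringify(history)}` },
      ],
    }),
  }).then((r) => r.json());

  return JSON.parse(completion.choices[0].message.content);
}
```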
by Angel Menendez
Introducing the Qualys Scan Slack Report Subworkflow: a robust solution designed to automate the generation and retrieval of security reports from the Qualys API. This workflow is a subworkflow of the Qualys Slack Shortcut Bot workflow. It is triggered when someone fills out the modal popup in Slack generated by the Qualys Slack Shortcut Bot. When deploying this workflow, use the Demo Data node to simulate the data that is input via the Execute Workflow Trigger. That data flows into the Global Variables node, which is then referenced by the rest of the workflow. The workflow includes nodes to fetch the report template IDs, launch a report, check the report status periodically, and download the completed report, which is then posted to Slack for easy access. For Security Operations Centers (SOCs), this workflow provides significant benefits by automating tedious tasks, ensuring timely updates, and facilitating efficient data handling.

How It Works
- **Fetch Report Templates:** The "Fetch Report IDs" node retrieves a list of available report templates from Qualys. This automated retrieval saves time and ensures that the latest templates are used, enhancing the accuracy and relevance of reports.
- **Convert XML to JSON:** The response is converted to JSON format for easier manipulation. This step simplifies data handling, making it easier for SOC analysts to work with the data and integrate it into other tools or processes.
- **Launch Report:** A POST request is sent to Qualys to initiate report generation using specified parameters like template ID and report title. Automating this step ensures consistency and reduces the chance of human error, improving the reliability of the reports generated.
- **Loop and Check Status:** The workflow loops every minute to check whether report generation is complete (a sketch of this launch-and-poll pattern follows below). Continuous monitoring automates the waiting process, freeing up SOC analysts to focus on higher-priority tasks while ensuring they are promptly notified when reports are ready.
- **Download Report:** Once the report is ready, it is downloaded from Qualys. Automated downloading ensures that the latest data is always available without manual intervention, improving efficiency.
- **Post to Slack:** The final report is posted to a designated Slack channel for quick access. This integration with Slack ensures that the team can promptly access and review the reports, facilitating swift action and decision-making.

Get Started
- Ensure your Slack and Qualys integrations are properly set up.
- Customize the workflow to fit your specific reporting needs.
- Link to parent workflow
- Link to Vulnerability Scan Trigger

Need Help? Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your security report generation process, improve response times, and enhance the efficiency of your security operations.
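A sketch of the launch-then-poll pattern the workflow implements, using the Qualys API v2 report actions (launch/list/fetch). The host, credentials, field names, and the `<VALUE>`/`<STATE>` XML parsing are assumptions to verify against your Qualys platform's API documentation.

```typescript
// Sketch: launch a Qualys report, poll until finished, then fetch it.
const BASE = "https://qualysapi.qualys.com/api/2.0/fo/report/"; // assumption: your platform's API host
const HEADERS = {
  "X-Requested-With": "n8n", // Qualys requires this header on API calls
  Authorization: "Basic " + Buffer.from("user:pass").toString("base64"),
  "Content-Type": "application/x-www-form-urlencoded",
};

async function launchAndDownload(templateId: string, title: string) {
  // 1. Launch the report with a template ID and title.
  const launch = await fetch(BASE, {
    method: "POST",
    headers: HEADERS,
    body: new URLSearchParams({
      action: "launch",
      template_id: templateId,
      report_title: title,
      output_format: "pdf",
    }),
  }).then((r) => r.text());
  // Assumed XML response shape: report ID inside a <VALUE> element.
  const id = /<VALUE>(\d+)<\/VALUE>/.exec(launch)?.[1];
  if (!id) throw new Error("Report launch failed");

  // 2. Poll once a minute until the report leaves the "Running" state.
  while (true) {
    const status = await fetch(`${BASE}?action=list&id=${id}`, { headers: HEADERS })
      .then((r) => r.text());
    if (!status.includes("<STATE>Running</STATE>")) break;
    await new Promise((r) => setTimeout(r, 60_000));
  }

  // 3. Fetch the finished report (binary body, ready to post to Slack).
  return fetch(`${BASE}?action=fetch&id=${id}`, { headers: HEADERS })
    .then((r) => r.arrayBuffer());
}
```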
by Jenny
Vector Database as a Big Data Analysis Tool for AI Agents

Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images, and hence to many production use cases:
1. Uploading (image) datasets to Qdrant
2. Setting up meta-variables for anomaly detection in Qdrant
3. Anomaly detection tool
4. KNN classifier tool

For anomaly detection
1. This is the first pipeline, to upload an image dataset to Qdrant.
2. The second pipeline sets up cluster (class) centres & cluster (class) threshold scores needed for anomaly detection.
3. The third is the anomaly detection tool, which takes any image as input and uses all the preparatory work done with Qdrant to detect whether it's an anomaly relative to the uploaded dataset.

For KNN (k nearest neighbours) classification
1. This is the first pipeline, to upload an image dataset to Qdrant.
2. The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.

To recreate both
You'll have to upload the crops and lands datasets from Kaggle to your own Google Cloud Storage bucket, and re-create APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API & Google Cloud Storage.

[This workflow] Batch Uploading an Images Dataset to Qdrant
This template imports dataset images from Google Cloud Storage, creates Voyage AI embeddings for them in batches, and uploads them to Qdrant, also in batches. In this particular template, we work with the crops dataset. However, it's analogous to uploading the lands dataset, and in general it's adaptable to any dataset consisting of image URLs (as the following pipelines are).
1. First, check for an existing Qdrant collection to use; otherwise, create it here. Additionally, when creating the collection, create a payload index, which is required for a particular type of Qdrant request we will use later.
2. Next, import all (dataset) images from Google Cloud Storage but keep only non-tomato-related ones (for anomaly detection testing).
3. Create (per batch) embeddings for all imported images using the Voyage AI multimodal embeddings API.
4. Finally, upload the resulting embeddings and image descriptors to Qdrant via batch upload (the Qdrant calls involved are sketched below).
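A sketch of the Qdrant REST calls this template relies on: creating a collection with a payload index, then batch-upserting points. The collection name, vector size, payload field name, and environment variables are illustrative assumptions; point them at your own Qdrant Cloud cluster and embedding dimensionality.

```typescript
// Sketch: set up a Qdrant collection + payload index, then batch-upsert points.
const QDRANT_URL = process.env.QDRANT_URL ?? "http://localhost:6333";
const HEADERS = {
  "Content-Type": "application/json",
  "api-key": process.env.QDRANT_API_KEY ?? "",
};

async function setupCollection(name: string): Promise<void> {
  // Create the collection (existence check omitted for brevity).
  // size 1024 is an assumption for the multimodal embedding dimensionality.
  await fetch(`${QDRANT_URL}/collections/${name}`, {
    method: "PUT",
    headers: HEADERS,
    body: JSON.stringify({ vectors: { size: 1024, distance: "Cosine" } }),
  });
  // Payload index on the field used later for filtered/grouped requests;
  // "crop_name" is a hypothetical field for the crops dataset.
  await fetch(`${QDRANT_URL}/collections/${name}/index`, {
    method: "PUT",
    headers: HEADERS,
    body: JSON.stringify({ field_name: "crop_name", field_schema: "keyword" }),
  });
}

async function batchUpsert(
  name: string,
  batch: { id: string; vector: number[]; payload: Record<string, unknown> }[],
): Promise<void> {
  // Upsert one batch of embeddings + image descriptors (payloads).
  await fetch(`${QDRANT_URL}/collections/${name}/points?wait=true`, {
    method: "PUT",
    headers: HEADERS,
    body: JSON.stringify({ points: batch }),
  });
}
```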