by Davide
This workflow automates the creation of AI-generated virtual try-on images for fashion eCommerce stores. Instead of relying on expensive and time-consuming photoshoots, it fetches product data from a Google Sheet, uses the Fal.ai Nano Banana model to generate a realistic image of a model wearing the clothing item, and then updates both the Google Sheet and the corresponding WooCommerce product with the final generated image.

**Advantages**

✅ **Cost Reduction:** Eliminates the need for professional photo shoots, saving on models, photographers, and studio expenses.
✅ **Time Efficiency:** Automates the entire workflow, from data input to product update, minimizing manual work.
✅ **Scalability:** Works seamlessly across large product catalogs, making it easy to update hundreds of products quickly.
✅ **Enhanced eCommerce Experience:** Provides shoppers with realistic previews of clothing on models, boosting trust and conversion rates.
✅ **Marketing Flexibility:** The generated images can also be repurposed for ads, social media, and promotional campaigns.
✅ **Centralized Management:** Google Sheets acts as the control center, making it easy to manage inputs and track results.

**How It Works**

The workflow processes multiple products from a spreadsheet in a sequential loop:

1. **Manual Trigger & Data Fetch:** The workflow starts manually (e.g., by clicking "Test workflow") and reads the rows of a specified Google Sheet whose "IMAGE RESULT" column is empty.
2. **Loop Processing:** It loops over each fetched row. Each row should contain URLs for a model image and a product image, along with a WooCommerce product ID.
3. **API Request to Generate Image:** For each item, the workflow sends a POST request to the Fal.ai Nano Banana API. The request includes the two image URLs and a prompt instructing the AI to create a photo of the model wearing the submitted clothing item.
4. **Polling for Completion:** The AI processing is asynchronous, so the workflow enters a polling loop: it waits 60 seconds, then checks the status of the request. If the status is not COMPLETED, it waits and checks again until the image is ready.
5. **Fetching and Storing the Result:** Once the status is COMPLETED, the workflow retrieves the URL of the generated image, downloads the image file, and uploads it to a designated folder in Google Drive.
6. **Updating Systems:** The workflow then writes the URL of the final generated image into the "IMAGE RESULT" column of the original Google Sheet row, and adds the image to the gallery of the corresponding WooCommerce product.
7. **Loop Continuation:** After processing one item, the workflow loops back to the next row in the Google Sheet until all items are complete.

**Set Up Steps**

Three main connections need to be configured:

**Step 1: Prepare the Google Sheet**
Create a Google Sheet with the columns IMAGE MODEL, IMAGE PRODUCT, PRODUCT ID, and IMAGE RESULT. Populate the first three columns for each product and leave IMAGE RESULT blank; the workflow fills it automatically. In the n8n workflow, point the "Google Sheets" node to your specific Google Sheet and worksheet.

**Step 2: Configure the Fal.ai API Key**
Create an account at fal.ai and obtain your API key. In the n8n workflow, locate the three "HTTP Request" nodes named "Get Url image", "Get status", and "Create Image". Edit their shared credentials (named "Fal.run API") and set the Value field of the Header Auth to `Key YOURAPIKEY` (replacing YOURAPIKEY with your actual key).
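The asynchronous polling step described above can be sketched in Python. This is a minimal sketch, not the exact node configuration: the header format follows the `Key YOURAPIKEY` Header Auth from Step 2, while the status check is left as an injected callable because the exact Fal.ai queue endpoints live inside the HTTP Request nodes.

```python
import time

def fal_headers(api_key):
    # Fal.ai Header Auth, as configured in Step 2: "Key <YOURAPIKEY>"
    return {"Authorization": f"Key {api_key}", "Content-Type": "application/json"}

def poll_until_complete(check_status, wait_seconds=60, max_attempts=30):
    """Mirror the workflow's polling loop: wait, check, repeat until COMPLETED."""
    for _ in range(max_attempts):
        if check_status() == "COMPLETED":
            return True
        time.sleep(wait_seconds)
    return False
```

In the workflow itself, the same loop is built from a Wait node plus an If node on the status field.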
**Step 3: Set Up WooCommerce API**
Ensure you have the REST API keys (Consumer Key and Consumer Secret) for your WooCommerce store. In the n8n workflow, locate the "WooCommerce" node, edit its credentials, and provide your store's URL and the API keys. This allows the workflow to authenticate and update your products.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
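The final WooCommerce update replaces the product's entire images array, so a request body has to carry the existing gallery along with the newly generated image. A sketch of the body builder, following the WooCommerce REST API v3 products endpoint (the helper only builds the request; the n8n WooCommerce node sends it):

```python
def woocommerce_gallery_update(product_id, existing_image_urls, new_image_url):
    # PUT /wp-json/wc/v3/products/<id> replaces the whole "images" array,
    # so keep the current gallery and append the generated try-on image.
    images = [{"src": url} for url in existing_image_urls] + [{"src": new_image_url}]
    return f"/wp-json/wc/v3/products/{product_id}", {"images": images}
```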
by Evoort Solutions
🚀 **Automated Keyword Difficulty & SERP Checker with Google Sheets Integration**

**Description:**
This n8n workflow automates keyword SEO analysis by collecting user input via a form, querying the Keyword Difficulty Checker API on RapidAPI to retrieve keyword difficulty and SERP data, and storing the results in Google Sheets for further SEO tracking and decision-making.

🔗 **Node-by-Node Breakdown**

1. 📝 **On form submission:** Triggers the workflow by capturing the keyword and country from a user-submitted form.
2. 🌐 **Keyword Difficulty Checker:** Makes a POST request to the Keyword Difficulty Checker API on RapidAPI to fetch the keyword difficulty index and SERP results.
3. 📦 **Reformat 1:** Extracts only the keywordDifficulty value from the API response JSON.
4. 📊 **Keyword Difficulty Checker1:** Appends the keyword and its difficulty index to the "backlink overflow" Google Sheet for structured keyword tracking.
5. 📦 **Reformat 2:** Extracts the serpResults list from the API response for additional ranking data.
6. 📄 **SERP Results:** Stores the extracted SERP data in the "backlinks" Google Sheet for ranking comparison and analysis.

✅ **Benefits of This Workflow**

- **Automation of SEO research:** Eliminates manual keyword analysis by integrating with the Keyword Difficulty Checker API on RapidAPI.
- **Real-time keyword tracking:** Automatically stores difficulty scores and SERP results in Google Sheets.
- **Scalable:** Easily extendable for bulk keyword analysis or reporting.
- **Reliable data source:** Uses a trusted third-party API (Keyword Difficulty Checker) for accurate and updated metrics.
- **No code:** Built with n8n, enabling low-code/no-code automation without writing backend services.

💡 **Use Cases**

- **Content Planning for SEO Teams:** Identify low-competition keywords using real-time difficulty scoring to prioritize blog content.
- **Client SEO Reporting:** Track and present SERP visibility and keyword trends in Google Sheets dashboards.
- **Keyword Competition Monitoring:** Periodically monitor keyword rankings and adjust your backlink strategy accordingly.
- **Freelance SEO Projects:** Save time by automating research tasks using the Keyword Difficulty Checker API on RapidAPI.

🔑 **How to Obtain Your API Key for the Keyword Difficulty Checker API**

1. **Sign Up or Log In:** Visit RapidAPI and create a free account using your email or social login.
2. **Go to the API Page:** Navigate to the Keyword Difficulty Checker API by PrineshPatel.
3. **Subscribe to the API:** Click Subscribe to Test, then choose a pricing plan that fits your needs (Free, Basic, Pro).
4. **Get Your API Key:** After subscribing, go to the Security tab on the API page to find your X-RapidAPI-Key.
5. **Use Your API Key:** Add the API key to your HTTP request headers: `X-RapidAPI-Key: YOUR_API_KEY`

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and grow your SEO presence effortlessly!
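The two Reformat nodes above are simple JSON reshaping steps, and the request authentication is the standard RapidAPI header pair. A minimal sketch in Python, assuming the response fields `keywordDifficulty` and `serpResults` named in the breakdown:

```python
def rapidapi_headers(api_key, host):
    # Standard RapidAPI authentication headers
    return {
        "X-RapidAPI-Key": api_key,
        "X-RapidAPI-Host": host,
        "Content-Type": "application/json",
    }

def split_response(data):
    # Reformat 1 keeps only the difficulty index; Reformat 2 keeps the SERP list
    difficulty_row = {"keywordDifficulty": data.get("keywordDifficulty")}
    serp_rows = data.get("serpResults", [])
    return difficulty_row, serp_rows
```

Each of the two outputs then feeds its own Google Sheets append node, exactly as in steps 4 and 6.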
by Dmytro
**Description**

This workflow turns any idea for a post into platform-specific content using AI. You simply provide the concept, topic, or description of a post, and the AI generates drafts adapted to multiple social media platforms: LinkedIn, Telegram, TikTok, YouTube, X/Twitter, Instagram, Bluesky, and Threads. Posts are created in PostPulse ready for review, scheduling, or publishing.

⚠️ **Disclaimer:** This workflow uses the community node @postpulse/n8n-nodes-postpulse. Make sure community nodes are enabled in your n8n instance before importing and using this template.
👉 To install it: Go to Settings → Community Nodes → Install and enter: "@postpulse/n8n-nodes-postpulse".
💡 For more details, see the n8n Integration Guide: PostPulse Developers – n8n Integration

**Who Is This For?**

- **Social media managers** who want to create content for multiple platforms quickly.
- **Content creators** who need posts automatically adapted to different platforms' character limits.
- **Agencies** managing multiple accounts who want to save time on copywriting and formatting.

**What Problem Does This Workflow Solve?**

Instead of manually writing, adapting, and publishing posts, you get:

- **AI-powered content creation:** Generate posts from any idea you provide.
- **Platform optimization:** Posts are automatically adapted to platform-specific character limits and formatting.
- **Seamless publishing:** Draft posts are sent to PostPulse for scheduling or immediate publishing.
- **Hashtag suggestions:** AI adds relevant hashtags for each platform.
- **Time saving:** Automates content generation and publishing, freeing you for more strategic tasks.

**How It Works**

This workflow takes your idea, generates platform-specific posts with AI, and sends them to PostPulse:

1. **Idea input:** Enter any post concept in the idea node.
2. **Setting Restrictions and Hashtags (optional):** Adjust character limits or the number of hashtags.
3. **AI Content Adapter:** Generates text for each platform based on the input idea.
4. **Unification of Platforms and Text + Merge:** Aligns AI-generated content with the correct platforms.
5. **Publish Post:** Creates draft posts in PostPulse ready for scheduling or publishing.

**Setup**

1. **Connect PostPulse to n8n:** Request your OAuth client key and secret from PostPulse support at support@post-pulse.com, then add your PostPulse account in the Credentials section in n8n.
2. **Enter an idea in the idea node:** Type any concept, topic, or description of a post.
3. **(Optional) Adjust restrictions in the Setting Restrictions and Hashtags node:** Change the maximum characters per platform or the number of hashtags if desired.
4. **Run the workflow:** AI generates platform-specific drafts and sends them to PostPulse as draft posts.

**Requirements**

- **Connected PostPulse accounts** (TikTok, Instagram, YouTube, LinkedIn, Telegram, Bluesky, X, Threads).
- **OAuth client key and secret** obtained from PostPulse.
- **An n8n instance** with community nodes enabled.

✨ With this workflow, PostPulse and n8n become your all-in-one automation hub for generating and publishing social media posts.

**How To Customize The Workflow**

This workflow is flexible and adaptable to your needs:

- **Character limits:** Adjust the maximum characters per platform while respecting platform limits.
- **Hashtags:** Modify the number of hashtags added by the AI.
- **AI prompt:** Change the text tone or style in the AI Content Adapter node.
- **Add platforms:** Extend supported platforms by updating the platform mappings in the workflow.
- **Scheduling:** Adjust scheduledTime in the Publish Post node for automated scheduling.

💡 Tip: Fully functional out of the box, but easily customizable to match your brand's tone, posting strategy, or any platform-specific rules.
by Rahul Joshi
**Description:**

Recover missed opportunities automatically with this n8n automation template. The workflow connects to Calendly, identifies no-show meetings, and instantly sends personalized Telegram messages encouraging leads to reschedule. It then notifies the assigned sales representative via email, ensuring timely human follow-up.

Perfect for sales teams, consultants, and customer success managers who want to minimize no-shows, improve conversion rates, and keep pipelines warm, all without manual tracking.

**What This Template Does (Step-by-Step)**

⏰ **Runs Every Hour:** Automatically triggers every hour to check your Calendly events for recently missed meetings.
📥 **Fetch Active Calendly Appointments:** Retrieves all scheduled events from Calendly using your user URI and event metadata.
🔍 **Filter for No-Shows (30+ Minutes Past):** Uses a built-in logic block to detect appointments that ended over 30 minutes ago and were not attended.
🎯 **Check Lead Intent:** Processes only leads tagged as "High Intent" in metadata to focus recovery efforts on qualified prospects.
💬 **Send Telegram Message to Lead:** Sends a personalized message to the lead's Telegram ID, including a direct reschedule link and a friendly tone from your sales team.
📧 **Notify Assigned Sales Rep via Email:** Alerts the relevant rep (from metadata) that the lead missed a meeting and has received an automated Telegram follow-up. Includes the contact name, status update, and meeting link for manual re-engagement.
🔁 **Continuous Follow-Up Automation:** Repeats hourly, ensuring no missed appointment goes unnoticed, even outside working hours.
**Key Features**

🤖 Smart detection of no-shows via the Calendly API
💬 Telegram message automation with personalization
📧 Sales rep email alerts with complete context
🎯 Filters by "High Intent" tag to focus efforts
⚙️ Easy setup with environment variables and credentials

**Use Cases**

📞 Automatically re-engage missed sales calls
📅 Reduce no-show rates for Calendly meetings
💬 Keep your sales pipeline active and responsive
📢 Notify sales reps in real time about recovery actions

**Required Integrations**

- Calendly API: to fetch scheduled events and meeting details
- Telegram API: to send automated reschedule messages
- SMTP or Gmail: to alert the assigned sales representative

**Why Use This Template?**

✅ Saves hours of manual follow-up effort
✅ Boosts the reschedule rate for missed meetings
✅ Keeps high-value leads warm and engaged
✅ Ensures your sales reps never miss a no-show
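The no-show filter and intent check described above can be sketched as a single predicate. The field names here (`end_time`, `attended`, `metadata.intent`) are assumptions about how your Calendly event payload and metadata are shaped; adjust them to match your actual data.

```python
from datetime import datetime, timedelta, timezone

def is_recoverable_no_show(event, now=None):
    """True when the event ended 30+ minutes ago, was not attended,
    and the lead is tagged "High Intent" (field names assumed)."""
    now = now or datetime.now(timezone.utc)
    end = datetime.fromisoformat(event["end_time"].replace("Z", "+00:00"))
    return (
        now - end > timedelta(minutes=30)
        and not event.get("attended", False)
        and event.get("metadata", {}).get("intent") == "High Intent"
    )
```

Only events passing this predicate reach the Telegram and email steps; everything else is silently skipped until the next hourly run.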
by Abbas Ali
This workflow is designed for teams or freelancers who want to auto-generate and send contracts based on information gathered from a Typeform (e.g., client name, project scope, deadlines). Perfect for HR onboarding, client agreements, or legal operations.

**Prerequisites**

To use this workflow, you'll need:

- A Typeform account and a published form
- Access to Google Docs (or a local document template)
- Gmail or SMTP email integration in n8n
- n8n Desktop or a hosted n8n instance

**How It Works**

1. **Trigger:** Listens for new Typeform submissions.
2. **Extract Data:** Parses the answers from the form.
3. **Generate Contract:** Fills a contract template using the form inputs.
4. **Create PDF:** Exports the filled contract as a PDF.
5. **Send Email:** Sends the PDF to the client's email address provided in the form.

**Nodes Used**

- **Typeform Trigger:** Triggers on form submission.
- **Set Node:** Maps form answers into variables.
- **Google Docs (or HTTP Request):** Uses a template to generate the contract.
- **Google Drive / PDF Converter:** Converts to PDF (if needed).
- **Email (Gmail/SMTP):** Sends the completed contract to the recipient.

**Tips**

- Replace the Google Docs template ID with your own.
- Ensure the variable placeholders (like {{client_name}}) match your document.
- Use the Cron node instead of the Typeform Trigger if you want to poll periodically.
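The Generate Contract step boils down to replacing {{placeholder}} tokens with the mapped form answers. The Google Docs node does this substitution server-side; the sketch below only illustrates the logic, using the {{client_name}} convention mentioned in the tips.

```python
import re

def fill_template(template, answers):
    # Replace {{client_name}}-style placeholders with form answers;
    # unknown placeholders are left intact so mismatches are easy to spot.
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: str(answers.get(m.group(1), m.group(0))),
        template,
    )
```

Leaving unknown placeholders untouched (rather than blanking them) makes it obvious when a document placeholder does not match a Set-node variable.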
by Rahul Joshi
**Description:**

Accelerate VIP support handling with this n8n workflow template that automatically identifies high-priority customers and ensures their tickets get instant attention. This automation pulls customer ticket data from Google Sheets, checks for the VIP tag, and seamlessly creates a priority task in ClickUp, while sending an instant Telegram alert to your team.

**What This Template Does:**

🔍 Detects VIP-tagged customers in real time
📝 Creates high-priority ClickUp tasks for their tickets
📱 Sends Telegram alerts with ticket details (subject + requester email)
📊 Organizes VIP workload for easy tracking in ClickUp

Built-in logic ensures:

• Only VIP-tagged customers are escalated
• Priority tasks are auto-assigned without delays
• Teams receive instant notifications for fast response
• No missed high-value customer interactions

**Requirements:**

• Google Sheets API credentials
• ClickUp API credentials
• Telegram Bot token & Chat ID
• n8n instance (self-hosted or cloud)

**Perfect For:** Customer support teams, sales organizations, and service-based businesses that need to prioritize VIP tickets instantly and streamline the customer experience.
by Sk developer
**Bilibili Video Downloader with Google Drive Upload & Email Notification**

Automate the downloading of Bilibili videos via the Bilibili Video Downloader API (RapidAPI), upload them to Google Drive, and notify users by email, all using n8n workflow automation.

🧠 **Workflow Overview**

This n8n automation allows users to:

1. Submit a Bilibili video URL.
2. Fetch download info from the Bilibili Video Downloader API (RapidAPI).
3. Automatically download and upload the video to Google Drive.
4. Share the file and send an email notification to the user.

⚙️ **Node-by-Node Explanation**

| Node | Function |
| --- | --- |
| On form submission | Triggers when a user submits the Bilibili video URL through the form. |
| Fetch Bilibili Video Info from API | Sends the video URL to the Bilibili Video Downloader API (RapidAPI) to retrieve download info. |
| Check API Response Status | Validates that the API returned a 200 success status before proceeding. |
| Download Video File | Downloads the actual video from the provided resource URL. |
| Upload Video to Google Drive | Uploads the downloaded video file to the user's connected Google Drive. |
| Google Drive Set Permission | Sets sharing permissions to make the uploaded video publicly accessible. |
| Success Notification Email with Drive Link | Sends the Google Drive link to the user via email upon successful upload. |
| Processing Delay | Adds a delay before executing error handling if something fails. |
| Failure Notification Email | Sends an error notification to the user if the download or upload fails. |

🧩 **How to Configure Google Drive in n8n**

1. In n8n, open Credentials → New → Google Drive.
2. Choose OAuth2 authentication.
3. Follow the on-screen instructions to connect your Google account.
4. Use the newly created credential in both the Upload Video and Set Permission nodes.
5. Test the connection to ensure access to your Drive.

🔑 **How to Obtain Your RapidAPI Key**

To use the Bilibili Video Downloader API (RapidAPI):

1. Visit the Bilibili Video Downloader API page on RapidAPI.
2. Click Subscribe to Test (you can choose a free or paid plan).
3. Copy your x-rapidapi-key from the "Endpoints" section.
4. Paste the key into the header of your n8n Fetch Bilibili Video Info from API node.

Example header:

```json
{
  "x-rapidapi-host": "bilibili-video-downloader.p.rapidapi.com",
  "x-rapidapi-key": "your-rapidapi-key-here"
}
```

💡 **Use Case**

This automation is ideal for:

- Content creators archiving Bilibili videos.
- Researchers collecting media resources.
- Teams that need centralized video storage in Google Drive.
- Automated content management workflows.

🚀 **Benefits**

✅ No manual downloads: fully automated.
✅ Secure cloud storage via Google Drive.
✅ Instant user notification on success or failure.
✅ Scalable for multiple users or URLs.
✅ Powered by the reliable Bilibili Video Downloader API (RapidAPI).

👥 **Who Is This For**

- **n8n developers** wanting to explore advanced workflow automations.
- **Content managers** handling large volumes of Bilibili content.
- **Digital archivists** storing video data in Google Drive.
- **Educators** sharing Bilibili educational videos securely.

🏁 **Summary**

With this n8n workflow, you can seamlessly integrate the Bilibili Video Downloader API (RapidAPI) into your automation stack, enabling effortless video downloading, Google Drive uploading, and user notifications in one unified system.
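The "Check API Response Status" branch above can be sketched as a small guard, plus a helper for deriving a safe Drive file name. The response field names (`status`, `url`) and the use of a `title` field are assumptions about the API's JSON shape; verify them against the Endpoints tab of the API page.

```python
import re

def should_proceed(response_json):
    # Mirrors the "Check API Response Status" node: continue only on a
    # 200 status and a usable download URL (field names assumed).
    return response_json.get("status") == 200 and bool(response_json.get("url"))

def safe_filename(title, ext="mp4"):
    # Strip characters that are awkward in file names before uploading to Drive
    stem = re.sub(r"[^\w\- ]", "", title or "").strip() or "bilibili_video"
    return f"{stem}.{ext}"
```

When `should_proceed` is false, the workflow falls through to the Processing Delay and Failure Notification Email path instead.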
by Philippe
**Summary**

This workflow enables the submission of business-critical URLs via the Google Indexing API and IndexNow.

**Why is this important for SEO?**

If your objective is visibility within AI-powered search and answer engines (such as Copilot, Perplexity, or OpenAI tools), the IndexNow integration is particularly relevant. IndexNow accelerates URL discovery for Bing and Yandex, which are key retrieval sources for several LLM-based platforms.

In parallel, Google remains the dominant search engine, representing ~80% of global search traffic. Gemini is deeply integrated into Google's ecosystem and, when grounding is enabled, can leverage Google Search as an external retrieval source. Ensuring fast and reliable indexation of critical URLs therefore remains a strategic foundation for both traditional SEO and AI-assisted search experiences.

**Description**

This workflow uses an OnCrawl API endpoint to automatically discover your sitemaps.xml and submit their latest updates to both the Google Indexing API and IndexNow. It includes two variations:

- Index orphan pages detected in sitemap.xml and submit them to Google and IndexNow.
- Index newly released pages by identifying indexable canonical URLs added between a pre-release crawl and a post-release crawl.

**How it works**

This workflow is for OnCrawl users with API access enabled in their plan. If you are not an OnCrawl user, please refer to: https://n8n.io/workflows/8778-workflow-for-submitting-changed-sitemap-urls-using-google-indexing-api-and-bing-indexnow/

To get an API key, go to your User Account profile > Tokens > + Add API access token:

- Description: any name
- Scope: select all checkboxes
- Click Create token and keep your API secret safe.

**Discover & parse sitemaps**

Create your first crawl by clicking Create configuration > choose a template > Automate > Webhook.

- **Webhook Node:** In n8n, copy-paste the Webhook callback URL into the OnCrawl Webhook section. At the end of the crawl, OnCrawl sends a POST HTTP request to n8n containing Workspace_ID, Project_ID, and Crawl_ID. More details in the Webhook documentation: https://developer.oncrawl.com/#notification
- **Discover_sitemaps endpoint** (documentation: https://developer.oncrawl.com/): this endpoint checks the sitemaps declared in your robots.txt file. You can filter the output to avoid duplicate sitemaps.
- **Config:** An initialization node that populates variables such as:
  - Crawl_ID: fetched from the Webhook node
  - SITE_URL: your site in the format https://your-site.com
  - SITEMAP_URL: for subdomain sitemaps, you can duplicate this field.
  - INDEXNOW_KEY: create it in Bing Webmaster Tools at https://www.bing.com/indexnow/getstarted
  - INDEXNOW_KEY_URL: usually your domain plus the INDEXNOW_KEY: www.example.com/<INDEXNOW_KEY>

Variables you can update depending on your needs:

- DAYS_BACK: 7 by default. Google checks the status of each page before submitting, but IndexNow will be asked to index all pages last updated within the last 7 days.
- BATCH_SIZE: 500, the default recommended by IndexNow.
- USE_GOOGLE, USE_INDEXNOW: true by default, meaning the process runs for both Google and IndexNow.

**Google nodes**

- **Check Status node (OAuth setup)** (documentation: https://developers.google.com/webmaster-tools/v1/urlInspection.index/inspect):
  - Create credentials at https://console.cloud.google.com/apis/credentials
  - Enable the Google Search Console API
  - Download the Client ID / Client Secret JSON
  - Connect n8n using: Client ID, Client Secret, Scopes, and your Google Search Console account
  - Tutorials: https://www.youtube.com/watch?v=HT56wExnN5k | https://www.youtube.com/watch?v=FBGtpWMTppw
  - Scopes reference: https://developers.google.com/identity/protocols/oauth2/scopes
- **Google Index API:**
  - Create a service account at https://console.cloud.google.com/iam-admin/serviceaccounts
  - Assign the role: Owner
  - Generate a JSON key (contains the email + private key)
  - For the two Google API nodes: Authentication: Predefined credential type; Credential Type: Google Service Account API; Credential configuration: Region (your project region) and Service Account Email / Private Key (from the JSON key); enable "Set up for use in HTTP Request node"; Scope: https://www.googleapis.com/auth/indexing
  - ⚠️ Important: once you have created a service account email, you need to add a user with this email and the "Owner" permission in your Google Search Console: https://search.google.com/search-console/users

**Other nodes**

- **Gate: Google:** Is USE_GOOGLE = true in Config?
- **Check status:** Gets the coverageState and lastCrawlTime of a given URL from Google Search Console.
- **Loop Over Items:** Prevents rate limiting.
- **Switch:**
  - Case coverageState = "Submitted and indexed" -> push to the Is New node
  - Case coverageState = "Crawled - currently not indexed" -> push to the Submit node
- **Is New:** Selects URLs from the sitemap whose last modification date is AFTER Google's last crawl date. If true, the URL is submitted to the Index API; if false, there is no need to push that URL for indexation. URL update docs: https://developers.google.com/search/apis/indexing-api/v3/using-api#gettinginfo. Endpoint: https://indexing.googleapis.com/v3/urlNotifications:publish (we call the URL update request).
- **Wait:** Generates a random delay between 0.30 and 1.50 seconds, rounded to 2 decimals.
- ⚠️ A Google alternative for batch-indexing URLs is to use a premium service that bypasses the URL inspection tool: https://fr.speedyindex.com/

**IndexNow** (auto-submission documentation: https://www.bing.com/indexnow/getstarted)

- **Gate: IndexNow:** Is USE_INDEXNOW = true in Config?
- **Split in Batches:** Splits URLs into batches of 500 max to avoid rate-limiting issues.
- **Build IndexNow payload:** Builds the submission body (described in the node name).
- **IndexNow Submit:** Submits the URLs to IndexNow.

**Variation A: Index orphan pages**

API documentation: https://developer.oncrawl.com/#Data-API

- **OQL definition:** Gets orphan pages for both sitemaps & logs.
- **Merge node:** Merges items with an inner join on the loc and url fields. This recovers the lastmod for orphan pages referenced in sitemaps, which can then be passed to the Google nodes. Input 1 should be the "Assign mandatory sitemap fields" node.
- **Next nodes:** Change the "Set Node" name in the script variables.

**Variation B: Index newly added pages between Crawl 1 & Crawl 2**

API documentation: https://developer.oncrawl.com/#Data-API

- **OQL definition:** Returns indexable canonical pages added in Crawl 2.
- **Merge node:** Merges items that match on the loc and url fields. This recovers the lastmod data for the Google nodes. Input 1 should be the "Assign mandatory sitemap fields" node.
- **Next nodes:** Change the "Set Node" name in the script variables.
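The Split in Batches and Build IndexNow payload steps follow the documented IndexNow bulk-submission format: a POST body with `host`, `key`, `keyLocation`, and `urlList`. A sketch of the payload builder, using the BATCH_SIZE of 500 from the Config node:

```python
def build_indexnow_payloads(host, key, key_location, urls, batch_size=500):
    # One JSON body per batch of at most batch_size URLs, matching the
    # IndexNow bulk format: {host, key, keyLocation, urlList}
    return [
        {
            "host": host,
            "key": key,
            "keyLocation": key_location,
            "urlList": urls[i:i + batch_size],
        }
        for i in range(0, len(urls), batch_size)
    ]
```

Each payload is then POSTed by the IndexNow Submit node; batching keeps every request under the recommended 500-URL limit.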
by Oneclick AI Squad
This workflow automates the process of receiving vendor quotations, extracting and summarizing their contents using AI, and logging the results for comparison. The system listens for new file uploads via webhook, processes each file with a summarization engine, and generates a well-formatted summary table that is stored in Google Sheets and emailed to stakeholders.

**Good to Know**

- **Saves hours of manual work** by auto-comparing multiple vendor quotations.
- **Uses AI summarization** to intelligently identify highlights and differences in each quote.
- **Supports structured summaries** for quick stakeholder decision-making.
- **Maintains a Google Sheets log** for historical comparison and auditing.
- **Email notifications** ensure stakeholders receive real-time updates.

**How It Works**

1. **Upload Quotes:** Webhook trigger that listens for uploaded vendor quotation files (PDF, Excel, or Docs).
2. **Extract File Data:** Parses the uploaded file and extracts relevant quote data (price, items, vendor name, etc.).
3. **AI Summarization:** Sends the extracted data to an AI API (Grok) to generate a human-readable comparison summary.
4. **Wait For Reply:** Pauses the workflow until the AI response is fully received.
5. **Format Summary:** Formats the AI-generated content into a structured summary (e.g., a table or comparison bullets).
6. **Log to Google Sheets:** Appends the formatted summary to a Google Sheet for tracking and reference.
7. **Send Email:** Emails the summary to predefined recipients (procurement, finance, etc.).

**Data Sources**

- **Uploaded Vendor Quotation Files:** Typically PDF, DOCX, or Excel files containing vendor proposals.
- **AI API (Grok):** Processes the quote data and returns a summarized comparison.

**How to Use**

1. Import the workflow into your n8n instance (self-hosted or cloud).
2. Configure the Webhook URL to receive file uploads.
3. Set up the file extraction logic in the "Extract File Data" node to match your file format.
4. Configure your Grok API credentials in the "AI Summarization" node.
5. Connect your Google Sheets account to the "Log to Google Sheets" node.
6. Customize the recipient email address in the "Send Email" node.
7. Test with sample quotation files to validate the entire flow.

**Requirements**

- **Self-hosted n8n instance** (if using community nodes).
- **API key for Grok** or another AI summarization service.
- **Google account access** to log summary data to Sheets.
- **Mail credentials** for sending automated emails (SMTP setup).
- **File parsing logic** (for PDF, DOCX, Excel) depending on your vendor formats.

**Customizing This Workflow**

- **Modify the Extract File Data node** to support additional quote formats or fields.
- **Enhance AI Summarization** with custom prompts or models for industry-specific terms.
- **Format the output as a PDF summary** or comparison chart if needed.
- **Add Slack/Teams integration** for real-time team alerts.
- **Apply filters** to compare only specific vendors or line items.
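The Format Summary step can be sketched as a small table builder. The field names here (`vendor`, `total`, `delivery`) are placeholder assumptions; map them to whatever fields your Extract File Data node actually produces.

```python
def format_summary(quotes):
    # Build a Markdown comparison table, one row per vendor quotation
    header = "| Vendor | Total | Delivery |"
    divider = "| --- | --- | --- |"
    rows = [
        f"| {q.get('vendor', '?')} | {q.get('total', '?')} | {q.get('delivery', '?')} |"
        for q in quotes
    ]
    return "\n".join([header, divider, *rows])
```

The same string can be appended as-is to Google Sheets (one cell per column) or dropped into the HTML body of the notification email.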
by Sk developer
**Automated Video Generation, Google Drive Upload, and Email Notification with Veo 3 Fast API**

This workflow automates the process of generating videos with the Veo 3 Fast API, uploading the video to Google Drive, and notifying the user via email. All tasks are executed seamlessly, ensuring a smooth user experience with automatic error handling.

**Node-by-Node Explanation**

- **On Form Submission:** Triggers the workflow when a user submits a form with a prompt.
- **Veo 3 Fast API Processor:** Sends the user's prompt to the Veo 3 Fast API to generate a video.
- **Wait for API Response:** Pauses the workflow for 35 seconds to allow the API to respond.
- **API Request: Check Task Status:** Sends a request to check the status of the video generation task.
- **Condition: Task Output Status:** Evaluates whether the task succeeded, is still processing, or failed.
- **Wait for Task to Complete:** Pauses the workflow for 30 seconds before rechecking the task status if processing is ongoing.
- **Send Email: API Error - Task Failed:** Sends an email if the task fails to generate the video.
- **Send Email: API Error - Task ID Missing:** Sends an email if the task ID is missing in the response.
- **Download Video:** Downloads the processed video from the provided output URL.
- **Upload File to Google Drive:** Uploads the processed video to the user's Google Drive.
- **Set Google Drive Permissions:** Sets the necessary sharing permissions for the uploaded video.
- **Send an Email: Video Link:** Sends an email with the link to the uploaded video.

**How to Obtain a RapidAPI Key**

1. Go to Veo 3 Fast on RapidAPI.
2. Create an account or log in.
3. Subscribe to the API plan that suits your needs.
4. After subscribing, find your API Key in the "Keys & Access" section.

**How to Configure the Google Drive API**

1. Go to the Google Cloud Console.
2. Create a new project or select an existing one.
3. Enable the Google Drive API for the project.
4. Go to Credentials and create OAuth 2.0 credentials.
5. Add the credentials to your n8n Google Drive node for seamless access to your Google Drive.
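The Condition: Task Output Status branching above can be sketched as a small dispatcher. The field names (`task_id`, `status`, `output_url`) and the status values are assumptions about the Veo 3 Fast API response shape; check them against the actual payload before relying on them.

```python
def next_action(task):
    # Mirrors the branching described above (field names and values assumed):
    # missing task ID -> error email; failed -> error email;
    # finished with an output URL -> download; otherwise wait and recheck.
    if not task.get("task_id"):
        return "email_missing_task_id"
    status = task.get("status")
    if status == "failed":
        return "email_task_failed"
    if status == "success" and task.get("output_url"):
        return "download_video"
    return "wait_and_recheck"
```

The "wait_and_recheck" branch corresponds to the 30-second Wait for Task to Complete node looping back into the status check.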
**Use Case**

A content creation team can automate the video production process, upload videos to Google Drive, and share them with stakeholders instantly after the task is complete.

**Benefits**

- **Efficiency:** Reduces manual tasks, saving time and effort by automating video creation and file management.
- **Error Handling:** Sends notifications for task failures or missing data, ensuring quick resolutions.
- **Seamless Integration:** Automatically uploads files to Google Drive and shares the link with users, streamlining the workflow.

**Who Is This For**

- **Content Creators:** Automates video creation and file management.
- **Marketing Teams:** Quick and easy video generation for campaigns.
- **Developers:** Can integrate with APIs and automate tasks.
- **Business Teams:** Save time by automating repetitive tasks like file uploads and email notifications.
by Sk developer
Automated IMDB Video Downloader: Download, Upload to Google Drive & Notify via Email

Easily download IMDB videos via a user-friendly form. Automatically fetch video links using the IMDB Downloader API, save videos to Google Drive, and notify users via email with shareable links or failure alerts. Perfect for content creators and marketers.

Node-by-Node Explanation

- **On form submission**: Triggers the workflow when a user submits an IMDB video URL via a form.
- **Fetch IMDB Video Info from API**: Sends the URL to the IMDB Downloader API to get video metadata and download links.
- **Check API Response Status**: Verifies that the API responded successfully (status code 200).
- **Download Video File**: Downloads the video from the provided media URL.
- **Upload Video to Google Drive**: Uploads the downloaded video file to a specified Google Drive folder.
- **Google Drive Set Permission**: Sets sharing permissions on the uploaded video for easy access.
- **Success Notification Email with Drive Link**: Emails the user the Google Drive link to access the video.
- **Processing Delay**: Adds a wait time before sending failure notifications.
- **Failure Notification Email**: Emails the user if the video download or processing fails.

How to Obtain Your RapidAPI Key

1. Go to RapidAPI's IMDB Downloader API page.
2. Sign up or log in to your RapidAPI account.
3. Subscribe to the IMDB Downloader API.
4. Find your unique x-rapidapi-key in the dashboard under the API keys section.
5. Replace "your key" in your workflow headers with this key to authenticate requests.

Use Cases & Benefits

Use Cases

- Content creators downloading trailers or clips quickly.
- Marketing teams preparing video content for campaigns.
- Educators sharing film excerpts.
- Social media managers sourcing videos efficiently.

Benefits

- Fully automates the video download and upload workflow.
- Seamless Google Drive integration with sharing.
- Instant user notifications on success or failure.
- User-friendly with simple URL form submission.

Who Is This For?

- **Content creators** looking for fast video downloads.
- **Marketers** needing instant access to IMDB clips.
- **Educators** requiring film excerpts for lessons.
- **Social media managers** preparing engaging content.
- Any user wanting hassle-free IMDB video downloads with cloud storage.
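The HTTP request behind the "Fetch IMDB Video Info from API" node can be sketched as follows. The host and endpoint path below are hypothetical placeholders (check the exact values on the API's RapidAPI page); only the x-rapidapi-key / x-rapidapi-host header pattern is the standard RapidAPI convention.

```javascript
// Sketch of the RapidAPI call behind "Fetch IMDB Video Info from API".
// NOTE: RAPIDAPI_HOST and the /download path are placeholders, not the
// real endpoint; the header names follow the standard RapidAPI pattern.
const RAPIDAPI_HOST = "imdb-downloader.example.p.rapidapi.com"; // placeholder

function buildFetchOptions(videoUrl, apiKey) {
  return {
    method: "GET",
    headers: {
      "x-rapidapi-key": apiKey,        // your key from the RapidAPI dashboard
      "x-rapidapi-host": RAPIDAPI_HOST,
    },
    // the IMDB page URL is passed along as a query parameter
    url: `https://${RAPIDAPI_HOST}/download?url=${encodeURIComponent(videoUrl)}`,
  };
}

// Mirrors the "Check API Response Status" node: only proceed on HTTP 200.
function isSuccess(statusCode) {
  return statusCode === 200;
}
```

In the n8n HTTP Request node, the same key goes into the request headers; the success/failure branch then routes to either the Google Drive upload or the failure email.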
by Philippe
Summary

This workflow enables the submission of business-critical URLs via the Google Indexing API and IndexNow.

Why is this important for SEO?

If your objective is visibility within AI-powered search and answer engines (such as Copilot, Perplexity, or OpenAI tools), the IndexNow integration is particularly relevant. IndexNow accelerates URL discovery for Bing and Yandex, which are key retrieval sources for several LLM-based platforms. In parallel, Google remains the dominant search engine, representing ~80% of global search traffic. Gemini is deeply integrated into Google’s ecosystem and, when grounding is enabled, can leverage Google Search as an external retrieval source. Ensuring fast and reliable indexation of critical URLs therefore remains a strategic foundation for both traditional SEO and AI-assisted search experiences.

Description

This workflow uses the Oncrawl API to automatically discover your sitemap.xml files and submit their latest updates to both the Google Indexing API and IndexNow. It includes two variations:

- Index orphan pages detected in sitemap.xml and submit them to Google and IndexNow.
- Index newly released pages by identifying indexable canonical URLs added between a pre-release crawl and a post-release crawl.

How it works

This workflow is for Oncrawl users with API access enabled in their plan. If you are not an Oncrawl user, please refer to: https://n8n.io/workflows/8778-workflow-for-submitting-changed-sitemap-urls-using-google-indexing-api-and-bing-indexnow/

To get an API key, go to your User Account profile > Tokens > + Add API access token:

- Description: any name
- Scope: select all checkboxes
- Click Create token. Keep your API secret safe.

Discover & parse sitemaps

Create your first crawl by clicking Create configuration > choose a template > Automate > Webhook.

- **Webhook node**: In n8n, copy-paste the webhook callback URL into the Oncrawl Webhook section.
At the end, Oncrawl sends a POST HTTP request to n8n containing: Workspace_ID, Project_ID, Crawl_ID. More details in the webhook documentation: https://developer.oncrawl.com/#notification

- **Discover_sitemaps endpoint**: documentation: https://developer.oncrawl.com/. This endpoint checks the sitemaps declared in your robots.txt file. You can filter the output to avoid duplicate sitemaps.
- **Config**: an initialization node that populates variables such as:
  - Crawl_ID: fetched from the Webhook node
  - SITE_URL: your site, in the format https://your-site.com
  - SITEMAP_URL: for subdomain sitemaps, you can duplicate this field.
  - INDEXNOW_KEY: you can create it in Bing Webmaster Tools here: https://www.bing.com/indexnow/getstarted
  - INDEXNOW_KEY_URL: usually your domain followed by the INDEXNOW_KEY: www.example.com/<INDEXNOW_KEY>

Variables you can update depending on your specs:

- DAYS_BACK: 7 by default.
- BATCH_SIZE: 500, the default recommended by IndexNow.
- USE_GOOGLE, USE_INDEXNOW: true by default, meaning the process runs for both Google and IndexNow.

Google Node

- **Check Status node (OAuth setup)**: documentation: https://developers.google.com/webmaster-tools/v1/urlInspection.index/inspect
  - Create credentials: https://console.cloud.google.com/apis/credentials
  - Enable the Google Search Console API
  - Download the Client ID / Client Secret JSON
  - Connect n8n using: Client ID, Client Secret, Scopes, Google Search Console account
  - All explanations are contained in these tutorials: https://www.youtube.com/watch?v=HT56wExnN5k | https://www.youtube.com/watch?v=FBGtpWMTppw
  - Scopes reference: https://developers.google.com/identity/protocols/oauth2/scopes
- **Google Index API**:
  - Create a service account here: https://console.cloud.google.com/iam-admin/serviceaccounts
  - Assign role: Owner
  - Generate a JSON key (contains email + private key)
  - For the two Google API nodes:
    - Authentication: Predefined credential type
    - Credential Type: Google Service Account API
  - Credential configuration:
    - Region: your project region
    - Service Account Email / Private Key: from the JSON key
    - Enable “Set up for use in HTTP Request node”
    - Scope: https://www.googleapis.com/auth/indexing

⚠️ Important: once you have created a service account email, you need to add a user with this email and the "Owner" permission in your Google Search Console: https://search.google.com/search-console/users

Other Nodes

- **Gate: Google**: Is USE_GOOGLE = true from Config?
- **Check status**: useful to get the coverageState and lastCrawlTime of a given URL, as reported by Google Search Console.
- **Loop Over Items**: prevents rate limiting.
- **Switch**:
  - Case: coverageState = “Submitted and indexed” -> push to the "Is New" node
  - Case: coverageState = “Crawled - currently not indexed” -> push to the "URL Updated" node
- **Is New**: URLs from the sitemap with a last modification date AFTER the Google last crawl date.
  - If true, we submit the URLs to the Index API.
  - If false, no need to push that URL for indexation.
- **URL Updates**: doc: https://developers.google.com/search/apis/indexing-api/v3/using-api#gettinginfo
  - Endpoint: https://indexing.googleapis.com/v3/urlNotifications:publish
  - We call the Update URL request.
- **Wait**: generates a random delay between 0.30 and 1.50 seconds, rounded to 2 decimals.

⚠️ An alternative for batch-indexing URLs on Google is to use a premium service to bypass the URL Inspection tool: https://fr.speedyindex.com/

IndexNow

Auto-submitting documentation: https://www.bing.com/indexnow/getstarted

- **Gate: IndexNow**: Is USE_INDEXNOW = true from Config?
- **Split in Batches**: splits the URLs into batches of max 500 to avoid rate-limiting issues.
- **Build IndexNow payload**: description in the node name.
- **IndexNow Submit**: submits the URLs to IndexNow.

VariationA: Index orphan pages

- API documentation: https://developer.oncrawl.com/#Data-API
- OQL definition: get orphan pages for both sitemaps & logs.
- **Merge node**: merges items with an inner join on the loc and url fields. This is useful to recover the lastmod from orphan pages referenced in sitemaps. This data can then be passed to the Google node.
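The data shapes involved in the steps above can be sketched as plain JavaScript, the language of n8n Code nodes. This is a sketch only: the variable names follow the Config node described earlier, the example.com values and key are placeholders, the `{url, type}` body matches the Google Indexing API's urlNotifications:publish endpoint, and the `host`/`key`/`keyLocation`/`urlList` fields follow the published IndexNow protocol.

```javascript
// Sketch of the Config node's defaults (names from the workflow description;
// example.com values and the key are placeholders).
function buildConfig(webhookBody) {
  return {
    CRAWL_ID: webhookBody.Crawl_ID,                      // sent by the Oncrawl webhook
    SITE_URL: "https://www.example.com",                 // placeholder
    INDEXNOW_KEY: "abc123",                              // created in Bing Webmaster Tools
    INDEXNOW_KEY_URL: "https://www.example.com/abc123",  // domain + key
    DAYS_BACK: 7,        // look-back window, 7 by default
    BATCH_SIZE: 500,     // IndexNow's recommended maximum per submission
    USE_GOOGLE: true,    // gate for the Google branch
    USE_INDEXNOW: true,  // gate for the IndexNow branch
  };
}

// Body for POST https://indexing.googleapis.com/v3/urlNotifications:publish
// (one notification per URL; URL_UPDATED covers new and updated pages).
function buildPublishNotification(url) {
  return { url, type: "URL_UPDATED" };
}

// "Wait" node: random delay between 0.30 and 1.50 seconds, 2 decimals.
function randomDelaySeconds() {
  return Math.round((0.3 + Math.random() * 1.2) * 100) / 100;
}

// "Split in Batches" + "Build IndexNow payload": chunk the URLs into groups
// of at most BATCH_SIZE and wrap each chunk in the IndexNow submission format.
function buildIndexNowBatches(urls, config) {
  const batches = [];
  for (let i = 0; i < urls.length; i += config.BATCH_SIZE) {
    batches.push({
      host: new URL(config.SITE_URL).host,
      key: config.INDEXNOW_KEY,
      keyLocation: config.INDEXNOW_KEY_URL,
      urlList: urls.slice(i, i + config.BATCH_SIZE),
    });
  }
  return batches;
}
```

Keeping these as pure payload builders (no HTTP calls) matches how the workflow separates the "Build IndexNow payload" node from the "IndexNow Submit" node.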
- Input 1 should be the "Assign mandatory sitemap fields" node.
- In the next nodes, change the "Set Node" name in the script variables.

VariationB: Index newly added pages between Crawl 1 & Crawl 2

- API documentation: https://developer.oncrawl.com/#Data-API
- OQL definition: returns indexable canonical pages added in Crawl 2.
- **Merge node**: merges items that match on the loc and url fields. This is useful to recover the lastmod data for the Google node.
- Input 1 should be the "Assign mandatory sitemap fields" node.
- In the next nodes, change the "Set Node" name in the script variables.
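The inner-join merge used in both variations can be sketched as follows: sitemap entries carry `loc` and `lastmod`, the Oncrawl crawl results carry `url`, and the join keeps only URLs present in both sources while attaching `lastmod` for the Google branch. A sketch, assuming these field names.

```javascript
// Inner join of sitemap entries (loc, lastmod) with Oncrawl page results (url),
// mirroring the Merge node: keep URLs present in both sources and recover
// lastmod so it can be compared against Google's lastCrawlTime downstream.
function mergeSitemapWithCrawl(sitemapEntries, crawlPages) {
  const byLoc = new Map(sitemapEntries.map((e) => [e.loc, e]));
  return crawlPages
    .filter((p) => byLoc.has(p.url))          // inner join: drop non-matches
    .map((p) => ({ ...p, lastmod: byLoc.get(p.url).lastmod }));
}
```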