by Oneclick AI Squad
## AI-Driven Tax Compliance & Deadline Management System

Automate tax deadline monitoring with AI-powered insights. This workflow checks your tax calendar daily at 8 AM, uses GPT-4 to analyze upcoming deadlines across multiple jurisdictions, detects overdue and critical items, and sends intelligent alerts via email and Slack only when immediate action is required. Perfect for finance teams and accounting firms that need proactive compliance management without manual tracking.

### Good to Know

- **AI-Powered**: GPT-4 provides risk assessment and strategic recommendations
- **Multi-Jurisdiction**: Handles Federal, State, and Local tax requirements automatically
- **Smart Alerts**: Only notifies executives when deadlines are overdue or critical (3 days or fewer remaining)
- **Priority Classification**: Categorizes deadlines as Overdue, Critical, High, or Medium priority
- **Dual Notifications**: Critical alerts to leadership plus daily summaries to the team channel
- **Complete Audit Trail**: Logs all checks and deadlines to Google Sheets for compliance records

### How It Works

1. **Daily Trigger** - Runs at 8:00 AM every morning
2. **Fetch Data** - Pulls the tax calendar and company configuration from Google Sheets
3. **Analyze Deadlines** - Calculates days remaining, filters by jurisdiction/entity type, and categorizes by priority
4. **AI Analysis** - GPT-4 provides strategic insights and risk assessment on upcoming deadlines
5. **Smart Routing** - Only sends alerts if overdue or critical deadlines exist
6. **Critical Alerts** - HTML email to executives plus a Slack alert for urgent items
7. **Team Updates** - Slack summary to the finance channel with all upcoming deadlines
8. **Logging** - Records compliance check results to Google Sheets for the audit trail

### Requirements

#### Google Sheets Structure

**Sheet 1: TaxCalendar**

| DeadlineID | DeadlineName | DeadlineDate | Jurisdiction | Category | AssignedTo | IsActive |
|---|---|---|---|---|---|---|
| FED-Q1 | Form 1120 Q1 | 2025-04-15 | Federal | Income | John Doe | TRUE |

**Sheet 2: CompanyConfig** (single row)

| Jurisdictions | EntityType | FiscalYearEnd |
|---|---|---|
| Federal, California | Corporation | 12-31 |

**Sheet 3: ComplianceLog** (auto-populated)

| Date | AlertLevel | TotalUpcoming | CriticalCount | OverdueCount |
|---|---|---|---|---|
| 2025-01-15 | HIGH | 12 | 3 | 1 |

#### Credentials Needed

- **Google Sheets** - Service Account OAuth2
- **OpenAI** - API key (GPT-4 access required)
- **SMTP** - Email account for sending alerts
- **Slack** - Bot token with `chat:write` permission

### Setup Steps

1. Import the workflow JSON into n8n
2. Add all 4 credentials
3. Replace these placeholders:
   - `YOUR_TAX_CALENDAR_ID` - Tax calendar sheet ID
   - `YOUR_CONFIG_ID` - Company config sheet ID
   - `YOUR_LOG_ID` - Compliance log sheet ID
   - `C12345678` - Slack channel ID
   - `tax@company.com` - Sender email
   - `cfo@company.com` - Recipient email
4. Share all sheets with the Google service account email
5. Invite the Slack bot to the channels
6. Test the workflow manually
7. Activate the trigger

### Customizing This Workflow

**Change Alert Thresholds** - Edit the "Analyze Deadlines" node:
- Critical: change `<= 3` to `<= 5` for a 5-day warning
- High: change `<= 7` to `<= 14` for a 2-week notice
- Medium: change `<= 30` to `<= 60` for a 2-month lookout

**Adjust Schedule** - Edit the "Daily Tax Check" trigger:
- Change the hour/minute for a different run time
- Add multiple trigger times for tax season (8 AM, 2 PM, 6 PM)

**Add More Recipients** - Edit the "Send Email" node:
- To: cfo@company.com, director@company.com
- CC: accounting@company.com
- BCC: archive@company.com

**Customize Email Design** - Edit the "Format Email" node to change colors, add a logo, or modify the layout.

**Add SMS Alerts** - Insert a Twilio node after "Is Critical" for emergency notifications.

**Integrate Task Management** - Add an HTTP Request node to create tasks in Asana/Jira for critical deadlines.

### Troubleshooting

| Issue | Solution |
|-------|----------|
| No deadlines found | Check the date format (YYYY-MM-DD) and that IsActive = TRUE |
| AI analysis failed | Verify the OpenAI API key and account credits |
| Email not sending | Test SMTP credentials and check whether the critical condition was met |
| Slack not posting | Invite the bot to the channel and verify the channel ID format |
| Permission denied | Share the Google Sheets with the service account email |
### Professional Services

Need help with implementation or customization? Our team offers:

- Custom workflow development
- Enterprise deployment support
- Team training sessions
- Ongoing maintenance
- Custom reporting & dashboards
- Additional API integrations

Discover more workflows, or get in touch with us.
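The alert thresholds described in the Customizing section can be sketched as a small Code-node-style function. This is a hypothetical sketch, not the template's actual node logic; the function names and the `Low` bucket for deadlines beyond the 30-day window are illustrative assumptions:

```javascript
// Hypothetical sketch of the "Analyze Deadlines" priority logic.
// Thresholds mirror the defaults above: <= 3 Critical, <= 7 High, <= 30 Medium.
function daysRemaining(deadlineDate, today = new Date()) {
  const msPerDay = 24 * 60 * 60 * 1000;
  return Math.ceil((new Date(deadlineDate) - today) / msPerDay);
}

function classifyPriority(days) {
  if (days < 0) return 'Overdue';
  if (days <= 3) return 'Critical';
  if (days <= 7) return 'High';
  if (days <= 30) return 'Medium';
  return 'Low'; // outside the 30-day lookout window (illustrative bucket)
}
```

Raising the Critical threshold to a 5-day warning, as suggested above, only means changing `days <= 3` to `days <= 5`.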
by Matthew
## AI-Powered Viral Video Factory

This workflow automates the entire process of creating short, cinematic, fact-based videos ready for social media. It takes a single concept, generates a script and visuals, creates video clips, adds a voiceover, and assembles a final video, which is then uploaded directly to your Google Drive. It's perfect for content creators and marketing agencies looking to scale video production with minimal manual effort.

### How It Works

1. **Generate a Viral Idea**: The workflow begins with the Create New Idea1 (OpenAI) node, which generates a viral-ready video concept, including a punchy title, hashtags, and a brief description based on a core theme (e.g., space, black holes). This idea is then logged in a Google Sheet.
2. **Create a Cinematic Script & Voiceover**: An OpenAI node (Generating scenes1) creates a detailed 12-scene script, outlining the visuals for a 60-second video. The script text for all scenes is combined and prepared for voiceover generation by another OpenAI node (Generate Voiceover).
3. **Generate Scene-by-Scene Visuals**: The workflow loops through each of the 12 scenes to create an animated clip:
   - **Image Generation**: An HTTP Request node sends the scene's prompt to the fal-ai/flux model to create a photorealistic still image.
   - **Animation Prompting**: The Video Prompts1 (OpenAI Vision) node analyzes the generated image and creates a new, specific prompt to animate it cinematically.
   - **Image-to-Video**: Another HTTP Request node uses the fal-ai/kling-video model to turn the still image into a 5-second animated video clip based on the new animation prompt.
4. **Assemble the Final Video**:
   - **Stitch Clips**: Once all 12 clips are generated, the Merge Clips node uses the fal-ai/ffmpeg-api to concatenate them into a single, seamless 60-second video.
   - **Add Audio**: The Combine Voice and Video node then layers the AI-generated voiceover onto the stitched video.
5. **Deliver to Google Drive**: Finally, the completed video is converted from a URL to a file and automatically uploaded to your specified Google Drive folder for easy access and publishing.

### Key Technologies Used

- **n8n**: For orchestrating the entire automated workflow.
- **OpenAI (GPT-4.1 & GPT-4o)**: For idea generation, scriptwriting, voiceover, and vision analysis.
- **Fal.ai**: For high-performance, API-based image generation (Flux), video animation (Kling), and video processing (FFMPEG API).
- **Google Drive & Sheets**: For logging ideas and storing the final video output.

### Setup Instructions

1. **Add Credentials**: In n8n, add your OpenAI API key and connect your Google account for Google Sheets and Google Drive access. You will also need a Fal.ai API key.
2. **Configure the Fal.ai API Key**: Crucially, you must replace the placeholder API key in all HTTP Request nodes that call the fal.run URL. Find the Authorization header in each of these nodes and replace the existing key with your own: `Key YOUR_FAL_AI_KEY_HERE`. Nodes to update: Create Images1, Get Images1, Create Video1, Get Video1, Merge Clips, Get Final video, Combine Voice and Video.
3. **Configure OpenAI Nodes**: Select each OpenAI node (e.g., Create New Idea1, Generating scenes1) and choose your OpenAI credential. You can customize the main prompt in the Create New Idea1 node to change the theme of the videos you want to generate.
4. **Configure Google Sheets & Drive**: In the Organise idea, caption etc1 node, select your Google Sheets credential and specify the spreadsheet and sheet ID you want to use for logging ideas. In the Upload file to drive node, select your Google Drive credential and choose the destination folder for your final videos.
by sato rio
This workflow streamlines the entire inventory replenishment process by leveraging AI for demand forecasting and intelligent logic for supplier selection. It aggregates data from multiple sources (POS systems, weather forecasts, SNS trends, and historical sales) to predict future demand. Based on these predictions, it calculates shortages, requests quotes from multiple suppliers, selects the optimal vendor based on cost and lead time, and executes the order automatically.

### Who is this for?

- **Retail & E-commerce Managers** aiming to minimize stockouts and reduce overstock.
- **Supply Chain Operations** looking to automate procurement and vendor selection.
- **Data Analysts** wanting to integrate external factors (weather, trends) into inventory planning.

### How it works

1. **Data Aggregation**: Fetches data from POS systems, MySQL (historical sales), OpenWeatherMap (weather), and SNS trend APIs.
2. **AI Forecasting**: Formats the data and sends it to an AI prediction API to forecast demand for the next 7 days.
3. **Shortage Calculation**: Compares the forecast against current stock and safety stock to determine the necessary order quantities.
4. **Supplier Optimization**: For items needing replenishment, the workflow requests quotes from multiple suppliers (A, B, C) in parallel. It selects the best supplier based on the lowest total cost within a 7-day lead time.
5. **Execution & Logging**: Places the order via API, updates the inventory system, and logs the transaction to MySQL.
6. **Anomaly Detection**: If the AI's confidence score is low, it skips the auto-order and sends an alert to Slack for manual review.

### Setup steps

1. **Configure Credentials**: Set up credentials for MySQL and Slack in n8n.
2. **API Keys**: You will need an API key for OpenWeatherMap (or a similar service).
3. **Update Endpoints**: The HTTP Request nodes use placeholder URLs (e.g., pos-api.example.com, ai-prediction-api.example.com). Replace these with your actual internal APIs, ERP endpoints, or AI service (like OpenAI).
4. **Database Prep**: Ensure your MySQL database has a table named `forecast_order_log` to store the order history.
5. **Schedule**: The workflow is set to run daily at 03:00. Adjust the Schedule Trigger node as needed.

### Requirements

- **n8n** (self-hosted or Cloud)
- **MySQL** database
- **Slack** workspace
- External APIs for POS, Inventory, and Supplier communication (or mock endpoints for testing)
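The shortage calculation and supplier selection described above can be sketched in Code-node-style JavaScript. This is an illustrative sketch under assumed field names (`forecast7d`, `leadDays`, `totalCost`), not the workflow's actual schema:

```javascript
// Hypothetical sketch of the shortage and supplier-selection logic described above.
// Field names are illustrative assumptions.
function orderQuantity(forecast7d, currentStock, safetyStock) {
  // Order enough to cover forecast demand plus the safety buffer.
  return Math.max(0, forecast7d + safetyStock - currentStock);
}

function pickSupplier(quotes, maxLeadDays = 7) {
  // Lowest total cost among suppliers that can deliver within the lead-time limit.
  const eligible = quotes.filter(q => q.leadDays <= maxLeadDays);
  if (eligible.length === 0) return null;
  return eligible.reduce((best, q) => (q.totalCost < best.totalCost ? q : best));
}
```

Returning `null` when no supplier meets the lead-time limit mirrors the workflow's manual-review path: the caller can route that case to a Slack alert instead of auto-ordering.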
by Ranjan Dailata
This n8n workflow automates backlink monitoring, analysis, and AI-driven interpretation for any domain or URL. It combines backlink intelligence from SE Ranking with structured reasoning and summarization powered by OpenAI GPT-4.1-mini. Instead of manually reviewing backlink reports, this workflow transforms raw backlink metrics into clear, human-readable SEO insights and persists them to multiple storage layers for reporting and tracking.

### Who is this for?

This workflow is ideal for:

- SEO professionals and technical SEO teams
- Digital marketing agencies managing multiple domains
- Growth and content teams tracking backlink quality
- Developers building SEO intelligence pipelines
- Data teams using n8n for enrichment and reporting

### What this workflow does

- Accepts a backlink query (domain, host, or URL)
- Uses multiple SE Ranking Backlinks API endpoints to retrieve:
  - Backlink summary metrics
  - Referring domains, IPs, and subnets
  - Authority and backlink quality indicators
  - Raw backlink lists
- Routes the data through an AI Agent powered by GPT-4.1-mini that:
  - Selects the appropriate backlink dataset automatically
  - Normalizes noisy SEO data
  - Generates structured summaries without subjective opinions
  - Produces a clean backlink intelligence summary
- Persists results to:
  - n8n Data Tables
  - Google Sheets
  - CSV / JSON exports

### Setup

If you are new to SE Ranking, sign up at https://seranking.com.

**Prerequisites**

- Active SE Ranking API access
- OpenAI API key with GPT-4.1-mini enabled
- n8n instance (self-hosted or cloud)
- Basic understanding of backlink and authority metrics

**Steps**

1. Import the workflow JSON into n8n.
2. Configure credentials:
   - **SE Ranking**: HTTP Header Authentication. The header value should contain `Token`, followed by a space, followed by your SE Ranking API key.
   - OpenAI API (GPT-4.1-mini)
   - Google Sheets OAuth (optional, for reporting)
3. Open the Set Input Fields node and define `query` (e.g. `Backlinks Summary for https://example.com`).
4. Verify the storage destinations:
   - Google Sheet ID and sheet name
   - n8n Data Table
   - File export nodes (CSV / JSON)
5. Click Execute Workflow.

### How to customize this workflow to your needs

You can easily extend or adapt this workflow by:

- Switching the analysis mode (domain, host, or URL)
- Adding historical backlink trend analysis
- Enhancing the AI prompt to generate:
  - Toxic backlink alerts
  - Link-building opportunities
  - Competitor backlink gap analysis
- Replacing storage with:
  - Databases or data warehouses
  - Slack / email notifications
  - BI dashboards
- Scheduling the workflow for continuous backlink monitoring

### Summary

This n8n template delivers an end-to-end backlink intelligence system, from raw backlink retrieval to AI-powered interpretation and structured storage. By combining SE Ranking's backlink data with OpenAI-driven reasoning, it eliminates manual SEO analysis and enables scalable, repeatable backlink monitoring.
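The header-authentication convention described in the setup steps can be sketched as a small request builder. The endpoint URL here is a hypothetical placeholder; the real routes come from SE Ranking's API documentation, and only the `Token <key>` header format is taken from the setup instructions above:

```javascript
// Hypothetical sketch of the SE Ranking header auth described above.
// ENDPOINT is a placeholder, not a documented SE Ranking route.
function buildSeRankingRequest(apiKey, endpoint) {
  return {
    url: endpoint,
    method: 'GET',
    headers: {
      // The template's convention: "Token", a space, then the API key.
      Authorization: `Token ${apiKey}`,
    },
  };
}
```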
by Chandan Singh
This workflow creates a daily, automated backup of all workflows in a self-hosted n8n instance and stores them in Google Drive. Instead of exporting every workflow on every run, it uses content hashing to detect meaningful changes and only updates backups when a workflow has actually been modified.

To keep Google Drive clean and predictable, the workflow intentionally deletes the existing backup file before uploading the updated version. This avoids duplicate files and ensures there is always one authoritative backup per workflow.

A Data Table is used as an index to track workflow IDs, hash values, and timestamps. This allows the workflow to quickly determine whether a workflow already exists, whether its content has changed, or whether it should be skipped entirely.

### How it works

1. Runs daily using a Cron Trigger.
2. Fetches all workflows from the n8n API.
3. Filters workflows based on the configured backup scope (`all` | `active` | `tagged`), backing up all workflows, only active workflows, or only workflows matching a specific tag. The scope filter is applied before hashing and comparison, ensuring only relevant workflows are processed.
4. Processes workflows one by one for reliability.
5. Generates a SHA-256 hash for each workflow.
6. Compares hashes against the stored Data Table.
7. Deletes the existing Google Drive backup when changes are detected.
8. Uploads updated workflows and skips unchanged ones.
9. Stores new or updated workflow details in the Data Table.

### Setup steps

1. **Set the Cron schedule**: Open the Cron Trigger node and choose the time you want the backup to run (for example, once daily during off-peak hours).

2. **Create a Data Table**: Create a new n8n Data Table with the title defined in `dataTableTitle`. This table stores `workflowId`, `workflowName`, `hashCode`, and `DriveFiveId`.

3. **Configure the Set node**: In the Set Backup Configuration node, provide the following values:

```json
{
  "n8nHost": "https://your-n8n-domain",
  "apiKey": "your-n8n-api-key",
  "backupFolder": "/n8n/workflow-backups",
  "hashAlgorithm": "sha256",
  "dataTableTitle": "n8n_workflow_backup_index",
  "backupScope": "",
  "requiredTag": ""
}
```

In the same node, choose how workflows should be selected for backup:

- `all` - backs up every workflow (default)
- `active` - backs up only enabled workflows
- `tagged` - backs up only workflows containing a specific tag

If using the `tagged` option, provide the tag name to match:

```json
{
  "backupScope": "tagged",
  "requiredTag": "production"
}
```

4. **Connect Google Drive credentials**: Authorize your Google Drive account and ensure the backup folder exists.

5. **Activate the workflow**: Once enabled, backups run automatically with no further action required.
by Janak Patel
### Who's it for

This template is ideal for YouTube video creators who spend a lot of time manually generating SEO assets like descriptions, tags, titles, keywords, and thumbnails. If you're looking to automate your YouTube SEO workflow, this is the solution for you.

### How it works / What it does

1. Connect a Google Sheet to n8n and pull in the Hindi script (or any language).
2. Use OpenAI to generate SEO content:
   - Video description
   - Tags
   - Keywords
   - Titles
   - Thumbnail titles, etc.
3. Use the generated description as input to create a thumbnail image using an image generation API.
4. Store all outputs in the same Google Sheet in separate columns.
5. Optionally, use tools like VidIQ or TubeBuddy to test the SEO strength of the generated titles, tags, and keywords.

> Note: This example uses Runway's image generation API, but you can plug in any other image-generation service of your choice.

### Requirements

- A Google Sheet with clearly named columns
- Hindi, English, or other language scripts in the sheet
- OpenAI API key
- Runway API key (or any other image generation API)

### How to set up

- You can set up this workflow in 15 minutes by following the pre-defined steps.
- Replace the manual Google Sheet trigger with a scheduled trigger for daily or timed automation.
- You may also swap Google Sheets for any database or data source of your choice. No Google Sheets API setup is required.
- Minimal JavaScript or Python knowledge is required for advanced customizations.
by rana tamure
This n8n workflow automates the creation of high-quality, SEO-optimized blog posts using AI. It pulls keyword data from Google Sheets, conducts research via Perplexity AI, generates structured content (title, introduction, key takeaways, body, conclusion, and FAQs) with OpenAI and Anthropic models, assembles the post, performs final edits, converts it to HTML, and publishes directly to WordPress. Ideal for content marketers, bloggers, or agencies looking to scale content production while maintaining relevance and engagement.

### Key Features

- **Keyword-Driven Generation**: Fetches primary keywords, search intent, and related terms from a Google Sheets spreadsheet to inform content strategy.
- **AI Research & Structuring**: Uses Perplexity for in-depth topic research and OpenAI/Anthropic for semantic analysis, outlines, and full content drafting.
- **Modular Content Creation**: Generates sections like introductions, key takeaways, outlines, body, conclusions, and FAQs with tailored prompts for tone, style, and SEO.
- **Assembly & Editing**: Combines sections into a cohesive Markdown post, adds internal/external links, and applies final refinements for readability and flow.
- **Publishing Automation**: Converts Markdown to styled HTML and posts drafts to WordPress.
- **Customization Points**: Easily adjust AI prompts, research depth, or output formats via Code and Set nodes.

### Requirements

- **Credentials**: OpenAI API (for GPT models), Perplexity API (for research), Google Sheets OAuth2 (for keyword input), WordPress API (for publishing).
- **Setup**: Configure your Google Sheet with columns like "keyword", "search intent", "related keyword", etc. Ensure the sheet is shared with your Google account.
- **Dependencies**: No additional packages needed; relies on n8n's built-in nodes for AI, HTTP, and data processing.

### How It Works

1. **Trigger & Input**: Start manually or on a schedule; pulls keyword data from Google Sheets.
2. **Research Phase**: Uses Perplexity to gather topic insights and citations from reputable sources.
3. **Content Generation**: AI nodes create the title, structure, intro, takeaways, outline, body, conclusion, and FAQs based on the research and SEO guidelines.
4. **Assembly & Refinement**: Merges sections, embeds links, edits for polish, and converts to HTML.
5. **Output**: Publishes as a WordPress draft or outputs the final HTML for manual use.

### Benefits

- **Time Savings**: Automate 80-90% of content creation, reducing manual writing from hours to minutes.
- **SEO Optimization**: Incorporates primary/related keywords naturally, aligns with search intent, and includes semantic structures for better rankings.
- **Scalability**: Process multiple keywords in batches; perfect for content calendars or high-volume blogging.
- **Quality Assurance**: Built-in editing ensures engaging, error-free content with real-world examples and data-backed insights.
- **Versatility**: Adaptable to any niche (e.g., marketing, tech, finance) by tweaking prompts or sheets.

### Potential Customizations

- Add more AI models (e.g., via custom nodes) for varied tones.
- Integrate image generation or social sharing for a full content pipeline.
- Filter sheets for specific topics or add notifications on completion.
by gotoHuman
Auto-detect news from n8n and turn it into a human-approved LinkedIn post. gotoHuman is used to keep a human in the loop: there you can manually edit the AI draft of the post or request to regenerate it.

### How it works

1. The workflow is triggered each day to fetch the latest version of https://blog.n8n.io.
2. It then fetches each article, checks if it was published in the last 24 hours, and uses an LLM to summarize it.
3. An LLM then drafts a related LinkedIn post, which is sent to gotoHuman for approval.
4. In gotoHuman, the reviewer can manually edit it or ask to regenerate it, with the option to even edit the prompt (retries loop back to the AI Draft LinkedIn Post node).
5. Approved posts are automatically published to LinkedIn.

### How to set up

1. Most importantly, install the gotoHuman node before importing this template! (Just add the node to a blank canvas before importing.)
2. Set up your credentials for gotoHuman, OpenAI, and LinkedIn.
3. In gotoHuman, select and create the pre-built review template "Blog scraper agent" or import the ID: sMxevC9tSAgdfWsr6XIW
4. Select this template in the gotoHuman node.

### Requirements

You need accounts for:

- gotoHuman (human supervision)
- OpenAI (summary, draft)
- LinkedIn

### How to customize

- Change the blog URL to monitor; adapt to its HTML structure.
- Provide the AI Draft LinkedIn Post node with examples of previous posts so it picks up your writing style (consider adding gotoHuman's dataset of approved examples).
- Use the workflow to target other publications, like your newsletter, blog, or other socials.
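The "published in the last 24 hours" check can be sketched as a small predicate. This is illustrative only; the template's filter node may be configured differently:

```javascript
// Hypothetical sketch of the 24-hour freshness check described above.
// Rejects articles older than the window, and future-dated timestamps.
function isFresh(publishedAt, now = Date.now(), windowHours = 24) {
  const ageMs = now - new Date(publishedAt).getTime();
  return ageMs >= 0 && ageMs <= windowHours * 60 * 60 * 1000;
}
```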
by Sk developer
## Automated Website Traffic Tracker with Google Sheets Logging

Track website traffic and backlinks effortlessly using the Website Traffic Checker - Ahref API. This n8n workflow automates data retrieval and logging into Google Sheets, making it perfect for SEO professionals and digital marketers.

### What This Workflow Does (Summary)

1. Accepts a domain via a simple web form.
2. Sends the domain to the Website Traffic Checker - Ahref API.
3. If successful: extracts backlink and traffic data and appends the results to two separate Google Sheets.
4. If failed: sends an email alert with the domain and status code.

### Node-by-Node Explanation

| Node | Purpose |
|------|---------|
| Form Trigger | Starts the workflow when a domain is submitted via the form. |
| Set Domain Value | Stores the submitted domain in a variable. |
| HTTP Request | Calls the Website Traffic Checker - Ahref API. |
| IF Node | Checks whether the API responded with statusCode = 200. |
| Email Node (Fail) | Sends an alert email if the API call fails. |
| Code (Backlink Info) | Extracts backlink data from the API response. |
| Google Sheet: Backlink Info | Appends backlink data to a sheet. |
| Code (Traffic Info) | Extracts traffic data from the API response. |
| Google Sheet: Traffic Data | Appends traffic metrics to another sheet. |

### Google Sheet Columns

**Backlink Info Sheet**

| Column | Description |
|--------|-------------|
| website | Domain submitted |
| ascore | Authority score |
| referring domain | Number of referring domains |
| total backlinks | Total backlinks |

**Traffic Data Sheet**

| Column | Description |
|--------|-------------|
| accuracy | Accuracy level of the traffic data |
| bounce_rate | Bounce rate percentage |
| desktop_share | Percentage of traffic from desktop devices |
| direct | Direct traffic sources |
| display_ad | Display ad traffic sources |
| display_date | Date when traffic data was captured |
| mail | Traffic from email campaigns |
| mobile_share | Percentage of traffic from mobile devices |
| pages_per_visit | Average number of pages per visit |
| paid | Paid traffic sources |
| prev_bounce_rate | Bounce rate in the previous period |
| prev_direct | Previous period's direct traffic |
| prev_display_ad | Previous period's display ad traffic |
| prev_mail | Previous period's email traffic |
| prev_pages_per_visit | Previous period's pages per visit |
| prev_referral | Previous period's referral traffic |
| prev_search_organic | Previous organic search traffic |
| prev_search_paid | Previous paid search traffic |
| prev_social_organic | Previous organic social traffic |
| prev_social_paid | Previous paid social traffic |
| prev_time_on_site | Previous time spent on site |
| prev_users | Number of users in the previous period |
| prev_visits | Visits in the previous period |
| rank | Global rank of the website |
| referral | Referral traffic |
| search | Total search traffic |
| search_organic | Organic search traffic |
| search_paid | Paid search traffic |
| social | Total social traffic |
| social_organic | Organic social traffic |
| social_paid | Paid social traffic |
| target | Targeted country or demographic |
| time_on_site | Average time spent on site |
| unknown_channel | Traffic from unknown sources |
| users | Number of unique users |
| visits | Total number of visits |

### How to Configure

**Get the API Key**

1. Go to the Website Traffic Checker - Ahref API on RapidAPI.
2. Sign in or create a free RapidAPI account.
3. Subscribe to the API plan.
4. Copy your `x-rapidapi-key` from the Endpoints tab.

**Add the Key in n8n**

1. Go to your HTTP Request node.
2. Under Headers, set:
   - `x-rapidapi-host` = website-traffic-checker-ahref.p.rapidapi.com
   - `x-rapidapi-key` = your API key

**Set Up Google Sheets in n8n**

1. Connect a Google account via Google Sheets credentials in n8n.
2. Use the full Google Sheet URL in the documentId field.
3. Set the correct sheet name or GID (e.g., "Traffic Data").
4. Use Auto Map or Custom Map to define columns.

> Make sure your Google Sheet has edit access and headers already created.

### Use Case & Benefits

**Ideal for:**

- SEO analysts
- Digital marketers
- Agencies managing multiple clients
- Web analytics consultants

**Benefits:**

- Fully automated data collection.
- No manual copy-paste from tools.
- Real-time insights delivered to Google Sheets.
- Easy monitoring of backlinks and traffic trends.
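The RapidAPI headers configured in the HTTP Request node can be sketched as a request builder. The path and query parameter below are hypothetical placeholders; use the actual route and parameters shown on the API's RapidAPI Endpoints tab:

```javascript
// Hypothetical sketch of the RapidAPI call configured in the HTTP Request node.
// The "/analysis" path and "domain" parameter are placeholders, not documented routes.
function buildTrafficRequest(domain, rapidApiKey) {
  return {
    url: `https://website-traffic-checker-ahref.p.rapidapi.com/analysis?domain=${encodeURIComponent(domain)}`,
    method: 'GET',
    headers: {
      'x-rapidapi-host': 'website-traffic-checker-ahref.p.rapidapi.com',
      'x-rapidapi-key': rapidApiKey,
    },
  };
}
```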
by Dataki
> **Disclaimer**: I am not a cybersecurity expert. This workflow was built through research and with the assistance of an LLM (Claude Opus 4.6). While it implements well-established security patterns (HMAC-SHA256, timing-safe comparison, replay protection, strict payload validation), please review the logic carefully and ensure it meets your own security requirements before deploying it in production.

### Who is this for?

This template is for anyone exposing an n8n workflow via webhook and wanting to ensure that only authenticated, untampered requests are processed.

### What problem does this solve?

Public webhooks are vulnerable by default. Without proper verification, anyone who discovers your URL can send forged requests, replay old ones, or inject unexpected parameters. While n8n's built-in Webhook authentication modes (Basic Auth, Header Auth, JWT) verify who is calling, they don't verify that the payload hasn't been altered, that the request is fresh, or that the data structure matches what you expect.

This template adds those missing layers:

- **Authentication**: Verifies the sender's identity through HMAC-SHA256 signature validation
- **Integrity**: Ensures the payload hasn't been modified by signing the raw body byte-for-byte
- **Replay protection**: Rejects requests with expired timestamps (configurable, default: 5 minutes)
- **Payload sanitization**: Strict whitelist filtering blocks unauthorized fields before they reach your logic

### What this workflow does

The workflow chains six security layers before any business logic runs:

1. **Webhook** receives the request with Header Auth + Raw Body enabled to preserve the original payload
2. **Extract rawBody** (Code node) decodes the binary into a UTF-8 string and extracts the security headers
3. **Crypto** computes the HMAC-SHA256 signature of `{timestamp}.{rawBody}` using your HMAC secret
4. **Timing-Safe HMAC Check** (Code node) validates the timestamp freshness and compares signatures using `crypto.timingSafeEqual()`
5. **Strict Payload Validation** (Code node) parses the JSON, checks required fields, and rejects any unexpected keys
6. **AI Agent** processes the prompt only after all checks pass

Invalid requests are immediately rejected with 403 Forbidden (signature/timestamp failure) or 400 Bad Request (payload validation failure), with no response body, to avoid leaking internal logic.

### Example use case

The included example protects an AI Agent endpoint that expects a simple `{"prompt": "..."}` payload. But this is just a starting point: replace the AI Agent with any node and adapt the payload validation to your own schema. Common adaptations:

- CRM or SaaS event callbacks
- CRUD operations on a database
- Third-party API integrations

### Setup

Prerequisites:

- An n8n instance (Cloud or self-hosted)
- A shared HMAC secret between the sender and this workflow. Keep it safe and never expose it in workflow logs or execution data.

### Going further

This workflow is a solid starting point, and more secure than a raw exposed webhook. However, it focuses on application-level security (authentication, integrity, replay protection, payload sanitization). For a production-grade setup, consider adding layers at the infrastructure level:

- **Rate limiting**
- **IP whitelisting**
- **Reverse proxy hardening**
by Shun Nakayama
Turn your favorite podcast episodes into engaging social media content automatically. This workflow fetches new episodes from an RSS feed, transcribes the audio using OpenAI Whisper, generates a concise summary using GPT-4o, and drafts a tweet. It then sends the draft to Slack for your review before posting it to X (Twitter).

### Who is this for

Content creators, social media managers, and podcast enthusiasts who want to share insights without manually listening to and typing out every episode.

### Key Features

- **Large File Support**: Includes custom logic to download audio in chunks, ensuring stability even with long episodes (preventing timeouts).
- **Human-in-the-Loop**: Nothing gets posted without your approval. You can review the AI-generated draft in Slack before it goes live.
- **High-Quality AI**: Uses OpenAI's Whisper for accurate transcription and GPT-4o for intelligent summarization.

### How it works

1. **Monitor**: Checks the podcast RSS feed daily for new episodes.
2. **Process**: Downloads the audio (handling large files via chunking) and transcribes it.
3. **Draft**: AI summarizes the transcript into bullet points and formats it for X (Twitter).
4. **Approve**: Sends the draft to a Slack channel.
5. **Publish**: Once approved by you, it posts the tweet to your X account.

### Requirements

- OpenAI API key
- Slack account & app (bot token)
- X (Twitter) developer account (OAuth2)

### Setup instructions

1. **RSS Feed**: The template defaults to "TED Talks Daily" for demonstration. Open the [Step 1] RSS node and replace the URL with your target podcast.
2. **Connect Credentials**: Set up your credentials for OpenAI, Slack, and X (Twitter) in the respective nodes.
3. **Slack Channel**: In the [Step 12] Slack node, select the Channel ID where you want to receive the approval request.
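The chunked-download approach mentioned under Large File Support can be sketched by planning the byte ranges up front. The range math below is illustrative (the template's actual chunking logic may differ); each planned range would then be fetched with an HTTP `Range: bytes=start-end` request header, which is inclusive on both ends:

```javascript
// Hypothetical sketch of chunked audio download planning, as described above.
// Splits a file of `totalBytes` into inclusive byte ranges for HTTP Range requests.
function planChunks(totalBytes, chunkBytes = 10 * 1024 * 1024) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += chunkBytes) {
    ranges.push({ start, end: Math.min(start + chunkBytes, totalBytes) - 1 });
  }
  return ranges;
}
```

Fetching many small ranges and concatenating the buffers is what keeps each individual HTTP request short enough to avoid the timeouts mentioned above.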
by NODA shuichi
Don't just get a recipe. Get a strategy (Speed / Healthy / Creative). This workflow solves the "What should I eat?" problem by using Google Gemini to generate 3 distinct recipe variations simultaneously based on your fridge leftovers. It demonstrates advanced n8n concepts like array processing and data aggregation.

### Key Features

- **Array Processing**: Demonstrates how to handle JSON lists (Gemini outputs an array, n8n splits it, and API calls run for each item).
- **Aggregation**: Shows how to combine processed items back into a single summary email.
- **Visual Enrichment**: Automatically searches for recipe images using Google Custom Search.

### How it works

1. **Input**: Enter ingredients via the Form Trigger.
2. **Generate**: Gemini creates 3 JSON objects: "Speed (5 min)", "Healthy", and "Creative".
3. **Process**: The workflow iterates through the 3 recipes, searching for images and logging data to Google Sheets.
4. **Aggregate**: The results are combined into one HTML comparison table.
5. **Deliver**: You receive an email with 3 options to choose from.

### Setup Requirements

- **Google Sheets**: Create a sheet named `Recipes` with headers: `date`, `ingredients`, `style`, `recipe_name`, `recipe_text`, `image_url`.
- **Credentials**: Google Gemini API, Google Custom Search (API key & engine ID), Gmail, Google Sheets.
- **Configuration**: Enter your IDs in the "1. Configuration" node.
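The array-processing pattern described under Key Features can be sketched as a Code-node-style transform (illustrative; the template's actual node may differ). In n8n, a Code node returns a list of `{ json: ... }` items, and downstream nodes then run once per item:

```javascript
// Hypothetical sketch of the array-splitting step described above:
// one JSON array from the model becomes one n8n item per recipe, so
// downstream nodes (image search, Sheets logging) run once per recipe.
function splitRecipes(modelOutput) {
  const recipes = JSON.parse(modelOutput);
  return recipes.map(recipe => ({ json: recipe }));
}
```

The Aggregation step is the inverse: the per-item results are collected back into a single item whose fields feed the one HTML comparison email.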