by Avkash Kakdiya
## How it works

This workflow runs every morning to collect news from multiple RSS feeds across global, business, finance, and tech categories. It processes and filters the most recent articles, then uses an AI model to select the most relevant global stories. The selected news is formatted into a clean digest. Finally, the digest is automatically delivered to a Slack channel.

## Step-by-step

**Trigger and setup feeds**
- **Schedule Trigger** – Runs the workflow daily at a fixed time.
- **Define News Categories** – Stores all RSS feed URLs in one place.
- **Prepare RSS Feed List** – Converts feed data into iterable items.

**Fetch and process articles**
- **Loop Through RSS Feeds** – Iterates through each RSS source.
- **Fetch RSS Articles** – Pulls articles from each feed.
- **Filter Latest 10 Articles** – Sorts and keeps the most recent items.
- **Merge All RSS Articles** – Combines all articles into one dataset.

**AI selection and delivery**
- **AI Select Top Global News** – Uses AI to pick the most important stories.
- **Groq Chat Model** – Provides the language model for analysis.
- **Format News For Slack Message** – Structures AI output into clean messages.
- **JSON Extraction Logic (sub-node)** – Parses the AI response into usable article data.
- **Post News Digest to Slack** – Sends the final formatted digest to Slack.

## Why use this?

- Automates end-to-end news aggregation and summarization
- Filters noise and highlights only high-impact global stories
- Keeps teams informed directly within Slack
- Reduces manual effort in tracking multiple news sources
- Easily scalable by adding more RSS feeds or categories
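The "Filter Latest 10 Articles" step can be sketched as plain JavaScript. The field names (`pubDate`, `title`) are assumptions about the RSS item shape, not the workflow's exact schema:

```javascript
// Keep only the 10 most recent items from one feed.
// Field names (pubDate, title) are assumptions about the RSS item shape.
function latestTen(articles) {
  return [...articles]
    .sort((a, b) => new Date(b.pubDate) - new Date(a.pubDate)) // newest first
    .slice(0, 10);
}

// Example: three items, newest first after sorting
const sample = [
  { title: "Old", pubDate: "2024-01-01T08:00:00Z" },
  { title: "New", pubDate: "2024-03-01T08:00:00Z" },
  { title: "Mid", pubDate: "2024-02-01T08:00:00Z" },
];
const top = latestTen(sample);
```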
by Jeremiah Wright
## Who's it for

Recruiters, freelancers, and ops teams who scan job briefs and want quick, relevant n8n template suggestions, saved in a Google Sheet for tracking.

## What it does

Parses any job text, extracts exactly 5 search keywords, queries the n8n template library, and appends the matched templates (ID, name, description, author) to Google Sheets, including the canonical template URL.

## How it works

1. The trigger receives a message or pasted-in job brief.
2. An LLM agent returns 5 concise search terms (JSON).
3. For each keyword, an HTTP request searches the n8n templates API.
4. Results are split and written to Google Sheets; the workflow builds the public URL from ID + slug.

## Set up

1. Add credentials for OpenAI (or swap the LLM node to your provider).
2. Create a Google Sheet with columns: Template ID, Name, User, Description, URL.
3. In the ⚙️ Config node, set: GOOGLE_SHEETS_DOC_ID, GOOGLE_SHEET_NAME, N8N_TEMPLATES_API_URL.

## Requirements

- n8n (cloud or self-hosted)
- OpenAI (or alternative LLM) credentials
- Google Sheets OAuth credentials

## Customize

- Change the model/system prompt to tailor keyword extraction.
- Swap Google Sheets for Airtable/Notion.
- Extend filters (e.g., only AI/CRM templates) before writing rows.
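The "builds the public URL from ID + slug" step could look roughly like this. The slugification rules here are an assumption for illustration, not n8n's exact algorithm, so verify the generated URLs against real template pages:

```javascript
// Build a public n8n template URL from a template's ID and name.
// The slugification rules are an assumption, not n8n's exact algorithm.
function templateUrl(id, name) {
  const slug = name
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // runs of non-alphanumerics become one hyphen
    .replace(/^-+|-+$/g, "");    // trim leading/trailing hyphens
  return `https://n8n.io/workflows/${id}-${slug}/`;
}

const url = templateUrl(1234, "Sync Jobs to Google Sheets!");
```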
by Shun Nakayama
This workflow implements cutting-edge concepts from Google DeepMind's OPRO (Optimization by PROmpting) and Stanford's DSPy to automatically refine AI prompts. It iteratively generates, evaluates, and optimizes responses against a ground truth, allowing you to "compile" your prompts for maximum accuracy.

## Why this is powerful

Instead of manually tweaking prompts (trial and error), this workflow treats prompt engineering as an optimization problem:

- **OPRO-style Optimization**: The "Optimizer" LLM analyzes past performance scores and reasons to mathematically deduce a better prompt.
- **DSPy-style Logic**: It separates the "Logic" (Workflow) from the "Parameters" (Prompts), allowing the system to self-correct until it matches the Ground Truth.

## How it works

1. **Define**: Set your initial prompt and a test case with the expected answer (Ground Truth).
2. **Generate**: The workflow generates a response using the current prompt.
3. **Evaluate**: An AI Evaluator scores the response (0–100) based on accuracy and format.
4. **Optimize**: If the score is low, the Optimizer AI analyzes the failure and rewrites the prompt.
5. **Loop**: The process repeats until the score reaches 95/100 or the loop limit is hit.

## Setup steps

1. **Configure OpenAI**: Ensure you have an OpenAI credential set up in the OpenAI Chat Model node.
2. **Customize**: Open the Define Initial Prompt & Test Data node and set your initial_prompt, test_input, and ground_truth.
3. **Run**: Execute the workflow and check the Manage Loop & State node output for the optimized prompt.
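The generate → evaluate → optimize loop can be sketched in a few lines. Here `generate`, `evaluate`, and `optimize` are stand-ins for the LLM calls; the 95-point target and loop limit mirror the description above:

```javascript
// Minimal sketch of the OPRO-style loop: generate, score, rewrite, repeat.
// generate/evaluate/optimize stand in for the workflow's LLM calls.
function optimizePrompt(initialPrompt, { generate, evaluate, optimize, maxLoops = 5 }) {
  let prompt = initialPrompt;
  let best = { prompt, score: -1 };
  for (let i = 0; i < maxLoops; i++) {
    const response = generate(prompt);
    const { score, reason } = evaluate(response);
    if (score > best.score) best = { prompt, score };
    if (score >= 95) break;                    // target reached: stop optimizing
    prompt = optimize(prompt, score, reason);  // OPRO step: rewrite using feedback
  }
  return best;
}

// Toy stand-ins: each optimize step improves the score by 30 points.
let quality = 40;
const result = optimizePrompt("v1", {
  generate: (p) => p,
  evaluate: () => ({ score: quality, reason: "too vague" }),
  optimize: (p) => { quality += 30; return p + "+"; },
});
```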
by Avkash Kakdiya
## How it works

This workflow automatically syncs new Productboard features into Linear as issues and notifies the team via Telegram. It starts on a schedule, fetches Productboard features through API requests, and transforms the raw data into clean, structured fields. Newly created features are filtered, then inserted into Linear, and a success message is sent to Telegram for confirmation.

## Step-by-step

1. **Trigger and fetch data**
   - **Schedule Trigger** – Starts the workflow at predefined intervals.
   - **HTTP Request to Productboard** – Pulls the latest features from the Productboard API.
2. **Transform and clean data**
   - **Code (Transform Features)** – Strips HTML, formats dates, and extracts clean fields like name, description, status, owner, and link.
3. **Filter for new items**
   - **If (Filter New Features)** – Compares createdAt with today's date, allowing only new features to proceed.
4. **Create issues in Linear**
   - **Create Linear Issue** – Opens a new Linear issue using the feature's name and description.
5. **Notify via Telegram**
   - **Success Notification (Telegram)** – Sends a confirmation message once the sync is successful.

## Why use this?

- Automates the sync of Productboard features into Linear without manual copying.
- Ensures only new features are captured, preventing duplicates.
- Keeps your team updated instantly through Telegram notifications.
- Saves time by standardizing data and formatting before inserting into Linear.
- Creates a smooth handoff from product planning to engineering execution.
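Logic similar to the "Transform Features" Code node might look like this. The input field names (`description`, `createdAt`, `links.html`) are assumptions about the Productboard response shape, so check them against your API payload:

```javascript
// Sketch of a "Transform Features" step: strip HTML from the description
// and normalize the created date. Field names are assumptions about the
// Productboard API response shape.
function transformFeature(raw) {
  return {
    name: raw.name,
    description: (raw.description || "")
      .replace(/<[^>]*>/g, "") // strip HTML tags
      .trim(),
    createdAt: raw.createdAt ? raw.createdAt.slice(0, 10) : null, // YYYY-MM-DD
    link: raw.links && raw.links.html,
  };
}

const feature = transformFeature({
  name: "Dark mode",
  description: "<p>Add a <b>dark</b> theme.</p>",
  createdAt: "2024-05-01T09:30:00Z",
  links: { html: "https://example.productboard.com/feature/1" },
});
```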
by Avkash Kakdiya
## How it works

This workflow automatically processes new GitHub issues and uses AI to classify them by type and priority. It extracts key issue data, sends it to an AI model for structured analysis, and formats the output for task creation. The workflow then creates a task in Linear and adds a comment back to the GitHub issue. This ensures consistent triage and faster issue handling without manual effort.

## Step-by-step

**Capture and filter new issues**
- **Github Trigger** – Listens for new issues or comments in the repository.
- **If** – Filters events to process only newly opened issues.
- **Edit Fields** – Extracts and structures the title, description, author, and URL.

**AI classification and formatting**
- **Information Extractor** – Sends issue data to AI for classification (type, priority, labels).
- **Code in JavaScript** – Cleans, validates, and formats AI output for consistent use.
- **OpenAI Chat Model (sub-node)** – Provides the language model powering the AI classification.
- **Merge** – Combines original issue data with AI-processed output.

**Task creation and feedback**
- **Create an issue** – Creates a structured task in Linear with priority and description.
- **Create a comment on an issue** – Posts a comment back to GitHub with classification results.

## Why use this?

- Eliminates manual issue triage and prioritization work
- Ensures consistent classification using AI-driven logic
- Speeds up response time for bugs, features, and questions
- Automatically syncs GitHub issues with Linear tasks
- Improves team visibility with instant feedback on each issue
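The cleaning-and-validating step could be sketched as below. The allowed type and priority values are assumptions for illustration; the point is clamping free-form AI output to a safe vocabulary before it reaches Linear:

```javascript
// Sketch of a validation step for AI classifier output: clamp type and
// priority to known values and coerce labels to a string array.
// The allowed values here are assumptions, not the workflow's exact lists.
const TYPES = ["bug", "feature", "question"];
const PRIORITIES = ["low", "medium", "high", "urgent"];

function normalizeClassification(ai) {
  const type = TYPES.includes(ai.type) ? ai.type : "question";
  const priority = PRIORITIES.includes(ai.priority) ? ai.priority : "medium";
  const labels = Array.isArray(ai.labels) ? ai.labels.map(String) : [];
  return { type, priority, labels };
}

const ok = normalizeClassification({ type: "bug", priority: "high", labels: ["ui"] });
const fallback = normalizeClassification({ type: "???", priority: null });
```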
by WeblineIndia
# Daily WooCommerce Sales Snapshot to Slack with Google Sheets Logging

This workflow automatically collects WooCommerce sales data every day, calculates key sales metrics, sends a clean summary to Slack, and logs the same data into Google Sheets for historical tracking. It helps teams stay informed about daily performance without manually checking dashboards or reports.

## Quick Implementation Steps (Get Started Fast)

1. Import the workflow JSON into n8n.
2. Connect your WooCommerce, Slack, and Google Sheets credentials.
3. Verify the Slack channel and Google Sheet selection.
4. Activate the workflow.
5. Receive daily sales updates automatically.

## What It Does

This workflow runs on a daily schedule and fetches all recent orders from a WooCommerce store. It filters the orders to include only paid ones (Processing and Completed) and further narrows them down to those created within the last 24 hours.

Using separate Code nodes, the workflow calculates essential sales metrics such as total revenue, number of orders, average order value (AOV), and the top-selling products. These metrics are merged into a single structured object for consistent downstream use.

Finally, the workflow sends a formatted sales summary to a Slack channel for quick visibility and appends the same data as a row in Google Sheets. This creates a reliable daily log that can be used for trend analysis and reporting.

## Who's It For

- WooCommerce store owners
- Sales and operations teams
- Marketing teams tracking daily performance
- Business managers who prefer Slack updates
- Analysts maintaining sales history in spreadsheets

## Requirements to Use This Workflow

- Active WooCommerce store with API access
- n8n instance (self-hosted or cloud)
- Slack workspace with permission to post messages
- Google Sheets document for logging data
- Valid credentials configured in n8n for WooCommerce, Slack, and Google Sheets

## How It Works

1. A Schedule Trigger runs the workflow once per day.
2. Orders are fetched from WooCommerce.
3. Only paid orders (Processing / Completed) are considered.
4. Orders from the last 24 hours are filtered.
5. Sales metrics are calculated: Total Revenue, Order Count, Average Order Value (AOV), Top Selling Products.
6. Metrics are merged into a single object.
7. A formatted summary is sent to Slack and appended or updated in Google Sheets.

## How To Set Up

1. Configure the Schedule Trigger time.
2. Add WooCommerce credentials.
3. Review the paid-order filtering logic.
4. Select the Slack channel.
5. Select the Google Sheet and worksheet.
6. Test the workflow.
7. Activate it.

## How To Customize Nodes

- Change the schedule time in the Schedule Trigger.
- Modify order statuses in the Filter Paid Orders node.
- Adjust the 24-hour window in the Code node.
- Increase or decrease the top-products count.
- Customize Slack message formatting.
- Add or remove Google Sheets columns.

## Add-ons (Optional Enhancements)

- Weekly or monthly summaries
- Revenue comparison (day-over-day / week-over-week)
- Revenue threshold alerts
- Multiple Slack channels
- Dashboard integrations from Google Sheets

## Use Case Examples

- Daily sales snapshot for store owners
- Morning updates for sales teams
- Automated sales logging for finance teams
- Performance tracking without dashboards
- Remote team visibility via Slack

Many more variations are possible depending on business needs.

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|------|---------------|----------|
| Slack message not received | Slack credentials or channel issue | Verify Slack API and channel |
| Google Sheet not updating | Incorrect sheet or mapping | Recheck sheet selection |
| Orders missing | Order status filter too strict | Update filter conditions |
| Revenue incorrect | Time filter issue | Verify last 24-hour logic |
| Workflow not running | Workflow inactive | Activate workflow |

## Need Help?

If you need help setting up, customizing, or extending this workflow, our n8n automation experts at WeblineIndia can assist. We specialize in:

- n8n automation workflows
- Business process automation
- Custom integrations and reporting

Contact WeblineIndia to build reliable and scalable automation tailored to your business.
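The metric calculations described above (revenue, order count, AOV, top products) can be sketched like this. The order shape mirrors WooCommerce's REST API (`total` as a string, `line_items` with name/quantity), but verify it against your store's actual payload:

```javascript
// Sketch of the metric Code nodes: revenue, order count, AOV, and top
// products from a list of paid orders. Order shape is an assumption based
// on the WooCommerce REST API (string totals, line_items array).
function salesMetrics(orders) {
  const revenue = orders.reduce((sum, o) => sum + parseFloat(o.total), 0);
  const count = orders.length;
  const aov = count ? revenue / count : 0;

  // Tally quantities per product name across all orders.
  const qty = {};
  for (const o of orders) {
    for (const item of o.line_items || []) {
      qty[item.name] = (qty[item.name] || 0) + item.quantity;
    }
  }
  const topProducts = Object.entries(qty)
    .sort((a, b) => b[1] - a[1])
    .slice(0, 3)
    .map(([name, quantity]) => ({ name, quantity }));

  return { revenue, count, aov, topProducts };
}

const metrics = salesMetrics([
  { total: "40.00", line_items: [{ name: "Mug", quantity: 2 }] },
  { total: "60.00", line_items: [{ name: "Shirt", quantity: 1 }, { name: "Mug", quantity: 1 }] },
]);
```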
by Oneclick AI Squad
Automatically detects new GitHub Pull Requests, analyzes changed code with AI, generates detailed review comments (quality, security, performance, best practices), posts suggestions back to the PR, stores results in a database, and sends notifications.

## Good to Know

- Triggers automatically on new/updated GitHub Pull Requests via webhook (or manual test)
- Fetches only changed files/diffs — no need to clone the full repo
- Uses AI (Grok, OpenAI, Claude, Gemini, etc.) to provide intelligent, context-aware feedback
- Covers multiple dimensions: code quality, bugs, security vulnerabilities, performance issues, maintainability, style/best practices
- Posts formatted review comments directly on the GitHub PR (with severity levels, suggestions, code snippets)
- Stores review history & scores in PostgreSQL (or another DB) for auditing, metrics, and team dashboards
- Sends real-time notifications (Slack, Discord, email, etc.) for high-severity findings
- Saves developers hours on initial reviews and catches issues early

## How It Works

### 1. Trigger PR Detection

- **GitHub Webhook** node — listens for pull_request events (opened, synchronize, reopened, ready_for_review)
- Optional: Filter node to ignore drafts, Dependabot PRs, or specific branches
- Manual trigger available for testing

### 2. Fetch & Analyze Code

- **GitHub** node — retrieves PR details (title, body, number, repo, base/head commits)
- **GitHub** or **HTTP Request** — fetches the list of changed files + diffs (using the GitHub API /pulls/{number}/files and diff content)
- **Merge PR Details & Extract Diffs** — combines metadata + code changes into a structured format
- Prepares the payload: file paths, diff hunks, full file content if needed (truncated for large files)

### 3. AI Review & Score

- Sends prepared diff data + context (language, repo conventions, custom guidelines) to the AI model
- Prompt engineering focuses on: code correctness & bugs; security vulnerabilities (OWASP, secrets, injection risks); performance optimizations; readability, maintainability, SOLID principles; best practices & style (specific to language/framework); refactoring suggestions with examples
- AI returns structured output: severity (low/medium/high/critical), category, comment text, suggested fix (with code block)
- Optional: Score node — assigns an overall PR quality score (0–100) based on findings

### 4. Post Review & Notify

- **Route** by severity / issue count (e.g. critical → immediate Slack)
- **GitHub** node — posts detailed review comments on the PR (as a bot user); supports threaded replies and line-specific comments (if hunk positions are available); adds a label, e.g. ai-reviewed or needs-changes
- **Store Results in PostgreSQL** — logs the full review (PR link, timestamp, AI output JSON, score, issues list)
- **Send Summary to Slack** (or Discord/Email/Telegram) — concise message with key findings, link to the PR, severity highlights
- **Log Completion** — records successful execution for monitoring

## Data Sources

- **GitHub** — Pull Requests, diffs, comments, labels (via webhook + API)
- **AI Model** — Grok (xAI), OpenAI GPT-4o / o1, Anthropic Claude, Google Gemini, or a local LLM
- **Storage** — PostgreSQL (recommended for structured querying), or Supabase, Airtable, Google Sheets
- **Notifications** — Slack, Discord, Microsoft Teams, Email (SMTP), Telegram

## How to Use

1. Import the workflow JSON into your n8n instance.
2. Configure credentials: GitHub OAuth / Personal Access Token (with repo scope), AI provider API key (Grok/OpenAI/etc.), PostgreSQL database connection, and Slack/Discord/Email credentials.
3. Set up the GitHub webhook: in repo Settings → Webhooks → Add webhook; Payload URL = your n8n webhook URL; Content type: application/json; Events: Pull requests.
4. Customize the AI prompt — add repo-specific rules, coding standards, ignored patterns.
5. Tune filters — minimum severity to post, files to skip (e.g. lock files, generated code).
6. Test — create/open a small PR or use Execute Workflow with a sample payload.
7. Activate — turn on the workflow and monitor Executions + Logs.

## Requirements

- n8n (self-hosted preferred for webhooks)
- GitHub repo with admin access to add a webhook & bot token
- AI API access with a sufficient token limit (large PRs = large prompts)
- PostgreSQL database (or alternative) for persistent storage
- Notification service account (Slack app, Discord bot, etc.)

## Customizing This Workflow

- **Add custom best practices** — load from Google Sheets/Notion/Airtable and inject into the prompt
- **Support multi-file analysis** — chunk very large PRs or summarize per-file first
- **Auto-approve low-risk PRs** — add an approval action if the score > 90 and there are no critical issues
- **Security focus** — integrate results from tools like Semgrep/Trivy
- **Comment on specific lines** — use GitHub API position/hunk data for inline comments
- **Team routing** — notify language-specific experts via Slack channels
- **Metrics dashboard** — connect the DB to Grafana/Metabase for review trends
- **Ignore patterns** — skip vendor/, node_modules/, tests/, etc.
- **Multiple AI models** — fallback or ensemble (e.g. Claude for reasoning + Grok for speed)
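The "Route by severity" step could be sketched as follows. The severity ordering matches the low/medium/high/critical scale described above, while the alert threshold and label names are assumptions for illustration:

```javascript
// Sketch of severity routing: everything is commented on the PR, but only
// findings at or above the alert threshold trigger a notification.
// The threshold and label names are assumptions, not the workflow's exact config.
const ORDER = { low: 0, medium: 1, high: 2, critical: 3 };

function routeFindings(findings, alertAt = "high") {
  const alert = findings.filter((f) => ORDER[f.severity] >= ORDER[alertAt]);
  return {
    comment: findings,  // all findings become PR comments
    notify: alert,      // only high/critical findings ping Slack
    label: alert.length ? "needs-changes" : "ai-reviewed",
  };
}

const routed = routeFindings([
  { severity: "low", category: "style" },
  { severity: "critical", category: "security" },
]);
```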
by Olivier
This template is a pattern library (one importable workflow) that shows a repeatable way to structure n8n automations so they remain easy to extend, cheaper to run, and safer to scale. It's intentionally opinionated and dry: the goal is not "plug & play", but a set of proven building blocks you can copy into your own workflows.

## Problems this framework solves

- **Spaghetti workflows that are hard to change**: A consistent split into Trigger → Manager → Function → Utility so changes don't ripple through everything.
- **Duplicate processing when runs overlap**: Uses "in progress / success / error" indicators so the trigger can skip items that are already being processed.
- **Unnecessary re-runs that keep failing**: Items that fail can be marked/parked, so you don't burn executions repeating the same error.
- **Execution costs exploding over time**: Offers polling + batching alternatives when "one event = one execution" becomes too expensive.
- **Rate limits and API throttling under load**: Includes rate-limited processing patterns (delays/throttling) to smooth spikes.
- **Missed items during downtime, deploys, or restarts**: Stores sync state (e.g., lastSync) in n8n Data Tables instead of relying on in-memory state.
- **Long-running pagination that becomes fragile**: Demonstrates manual "page-wise" pagination (fetch N → process N → checkpoint → repeat) to avoid huge in-memory batches.
- **Debugging incidents without visibility**: Includes an error workflow pattern (Error Trigger + notification) and structured error logging.

## What you get in this template

- Trigger patterns (simple and rate-limited)
- Polling / batching patterns (basic → more robust → fully configurable with pagination)
- A "manager" pattern for stateful processing and overlap protection
- Function + utility workflow examples for reusability
- Error logging to a Data Table and an example Telegram alert

## Requirements / setup

- An n8n version that includes the Data Table node
- Create/replace the Data Tables used in the template (e.g. Timestamps, Errors)
- Example nodes use ProspectPro, HubSpot, and Telegram (optional). Replace these with your own tools if you're not using them.

## Important notes

- This is not a finished automation. Import it, then choose the pattern(s) you need and swap the example "get items / process item" steps for your own logic.
- Some patterns include looping/recursion options; configure stop conditions carefully to avoid unintended infinite runs.
- This framework is one effective route to scalable n8n systems, not the only one.

Note: this is a living document that will be updated periodically.
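The page-wise pagination pattern (fetch N → process N → checkpoint → repeat) can be sketched like this. `fetchPage`, `processItem`, and `saveCheckpoint` stand in for your own nodes (HTTP Request, per-item processing, and a Data Table write), so treat this as a shape, not the template's exact code:

```javascript
// Sketch of page-wise pagination with checkpointing: each page is processed
// and the cursor persisted before fetching the next, so a restart resumes
// from the last checkpoint instead of re-reading everything.
function runPaginated({ fetchPage, processItem, saveCheckpoint, pageSize = 50 }) {
  let cursor = null;
  let processed = 0;
  while (true) {
    const page = fetchPage(cursor, pageSize);
    for (const item of page.items) {
      processItem(item);
      processed++;
    }
    saveCheckpoint(page.nextCursor); // persist progress (e.g. to a Data Table)
    if (!page.nextCursor) break;     // stop condition: configure carefully
    cursor = page.nextCursor;
  }
  return processed;
}

// Toy source: 5 items served in pages of 2.
const data = [1, 2, 3, 4, 5];
let lastCheckpoint;
const total = runPaginated({
  fetchPage: (cursor, n) => {
    const start = cursor || 0;
    const items = data.slice(start, start + n);
    return { items, nextCursor: start + n < data.length ? start + n : null };
  },
  processItem: () => {},
  saveCheckpoint: (c) => { lastCheckpoint = c; },
  pageSize: 2,
});
```

Persisting the cursor after each page (rather than once at the end) is what makes the pattern robust to downtime and deploys.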
by deAPI Team
## Who is this for?

- E-commerce store owners using Shopify
- Product managers who need consistent product imagery
- Marketing teams looking to automate visual content creation
- Dropshipping businesses needing quick product photos

## What problem does this solve?

Creating professional product images is time-consuming and expensive. This workflow eliminates the need for manual photo editing by automatically generating styled hero images and transparent PNGs ready for your product catalog.

## What this workflow does

1. Triggers when a new product is created in Shopify
2. Extracts the product title, description, category, and tags
3. An AI Agent analyzes the product data and uses the deAPI Prompt Booster tool to create an optimized image-generation prompt
4. Generates a professional product image using deAPI
5. Removes the background to create a transparent PNG version
6. Updates the Shopify product with both images (hero + transparent)

## Setup Requirements

- **n8n instance** (self-hosted or n8n Cloud)
- deAPI account for image generation and prompt boosting
- Shopify scopes: read_products, write_products, write_files
- Anthropic account for the AI Agent

## Installing the deAPI Node

- **n8n Cloud**: Go to Settings → Community Nodes and toggle the "Verified Community Nodes" option
- **Self-hosted**: Go to Settings → Community Nodes and install n8n-nodes-deapi

## Configuration

1. Add your deAPI credentials (API key + webhook secret)
2. Add your Shopify credentials (shop subdomain + app secret key + access token)
3. Add your Anthropic credentials (API key)
4. Ensure your n8n instance has an HTTPS webhook URL
5. Set up the webhook URL for the product-creation event in Shopify

## How to customize this workflow

- **Change the AI model**: Swap Anthropic for any other LLM provider
- **Adjust the prompt**: Modify the AI Agent system message for different photography styles
- **Add more processing**: Insert upscaling (e.g. deAPI Upscale an Image) or additional image transformations
- **Different e-commerce platform**: Replace the Shopify nodes with WooCommerce, BigCommerce, etc.
- **Add human review**: Insert a Wait node + Slack notification before uploading to Shopify
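The "extracts product title, description, category, and tags" step could look roughly like this. The field names follow Shopify's REST webhook shape (`body_html`, `product_type`, comma-separated `tags`), but verify them against your API version:

```javascript
// Sketch of extracting the AI Agent's input from a Shopify product-created
// webhook payload. Field names follow Shopify's REST webhook shape; verify
// against your API version.
function extractProductContext(payload) {
  return {
    title: payload.title,
    description: (payload.body_html || "").replace(/<[^>]*>/g, "").trim(), // strip HTML
    category: payload.product_type || "uncategorized",
    tags: typeof payload.tags === "string"
      ? payload.tags.split(",").map((t) => t.trim()).filter(Boolean)
      : [],
  };
}

const ctx = extractProductContext({
  title: "Ceramic Mug",
  body_html: "<p>Hand-glazed mug, 350 ml.</p>",
  product_type: "Kitchen",
  tags: "mug, ceramic",
});
```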
by Rahul Joshi
## 📘 Description

This workflow automates the complete DPDP-aligned Consent Manager Registration screening pipeline — from intake to eligibility evaluation and final compliance routing. Every incoming registration request is normalized, validated, logged, evaluated by an AI compliance engine (GPT-4o), and then routed into either approval or rejection flows.

It intelligently handles missing documentation (treated as a minor issue), evaluates financial/technical/operational capacity, generates structured eligibility JSON, updates registration records in Google Sheets, and sends outcome-specific emails to applicants and compliance teams. The workflow creates a full audit trail while reducing manual screening workload and ensuring consistent eligibility decisions.

## ⚙️ What This Workflow Does (Step-by-Step)

1. **▶️ Receive Consent Registration Event (Webhook)**: Collects incoming Consent Manager registration applications and triggers the processing pipeline.
2. **🧹 Extract & Normalize Registration Payload (Code Node)**: Cleans the body payload and extracts key fields: action, organizationName, applicationType, contactEmail, netWorth, technicalCapacity, operationalCapacity, documentAttached, submittedAt.
3. **🔍 Validate Registration Payload Structure (IF Node)**: Checks for the presence of mandatory fields. Valid → continue to eligibility evaluation; invalid → log in the audit sheet.
4. **📄 Log Invalid Registration Requests to Sheet (Google Sheets)**: Stores malformed or incomplete submissions for audit, follow-up, and retry handling.
5. **📝 Write Initial Registration Entry to Sheet (Google Sheets)**: Creates the initial intake row in the master registration sheet before applying eligibility logic.
6. **🧠 Configure GPT-4o — Eligibility Evaluation Model (Azure OpenAI)**: Prepares the AI model used to determine whether the applicant meets DPDP's eligibility criteria.
7. **🤖 AI Eligibility Evaluator (DPDP Compliance)**: Analyzes applicant data and evaluates eligibility based on financial capacity, technical capability, operational readiness, and documentation status. Missing documents are NOT a rejection condition. Returns strictly formatted JSON with: eligible, riskLevel, decisionReason, missingItems, recommendedNextSteps.
8. **🧼 Parse AI Eligibility JSON Output (Code Node)**: Converts AI output into valid JSON by removing markdown artifacts and ensuring safe parsing.
9. **🔎 Validate Eligibility Status (IF Node)**: Routes the outcome: eligible → approval workflow; ineligible → rejection email.
10. **📧 Send Rejection Email to Applicant (Gmail)**: Sends a structured rejection email listing issues and re-submission instructions.
11. **🔗 Merge Registration + Eligibility Summary (Code Node)**: Combines raw registration data with AI eligibility results into one unified JSON package.
12. **📬 Send Approval Email to Compliance Team (Gmail)**: Notifies compliance officers that an applicant passed eligibility and is ready for verification.
13. **🧩 Prepare Status Update Fields (Set Node)**: Constructs the final status value (e.g., "passed") for updating the database.
14. **📘 Update Registration Status in Sheet (Google Sheets)**: Updates the applicant's record using contactEmail as the key, marking the final eligibility status.

## 🧩 Prerequisites

- Azure OpenAI (GPT-4o) credentials
- Gmail OAuth connection
- Google Sheets OAuth connection
- Valid webhook endpoint for intake

## 💡 Key Benefits

✔ Fully automates DPDP Consent Manager registration screening
✔ AI-driven eligibility evaluation with standardized JSON output
✔ Smart handling of missing documents without unnecessary rejections
✔ Automatic routing to approval or rejection flows
✔ Complete audit logs for all submissions
✔ Reduces manual review time and improves consistency

## 👥 Perfect For

- DPDP compliance teams
- Regulatory operations units
- SaaS platforms handling consent manager onboarding
- Organizations managing structured eligibility workflows
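Logic like the "Parse AI Eligibility JSON Output" Code node, which strips markdown artifacts before parsing, can be sketched as below. The fallback object is an assumption about how the workflow handles unparseable output:

```javascript
// Sketch of safe AI-output parsing: strip a markdown code fence the model
// may wrap around its JSON answer, then parse with a conservative fallback.
const FENCE = "`".repeat(3); // the three-backtick fence, built at runtime

function parseAiJson(raw) {
  const cleaned = raw
    .replace(new RegExp("^" + FENCE + "(json)?\\s*", "i"), "") // leading fence
    .replace(new RegExp(FENCE + "\\s*$"), "")                  // trailing fence
    .trim();
  try {
    return JSON.parse(cleaned);
  } catch (err) {
    // Fallback shape is an assumption, not the workflow's exact behavior.
    return { eligible: false, riskLevel: "unknown", decisionReason: "unparseable AI output" };
  }
}

const parsed = parseAiJson(FENCE + 'json\n{"eligible": true, "riskLevel": "low"}\n' + FENCE);
const bad = parseAiJson("not JSON at all");
```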
by Adam ABDELMOUMNI
# Back up & restore n8n workflows with preserved folder structure via Google Drive

## A. Back up workflows to Google Drive

### ✅ What problem does this workflow solve?

If you're building and managing multiple automations with a well-organized nested folder structure, losing a workflow due to accidental deletion or misconfiguration can cost you hours of work and headaches. This can be even more impactful for self-hosted n8n instances.

This template solves that for any n8n setup (cloud or self-hosted) by exporting a perfect mirror of a whole n8n project, preserving the nested folder structure and the workflows within it. Everything is uploaded to Google Drive under one main backup folder per execution, so you can track different setup versions over time.

### 🧑‍💻 Who's it for?

This workflow is ideal for any n8n user, from beginner to advanced solo/team "flowgrammer", who wants a reliable, safe, automated, and easy-to-access backup solution that is a perfect mirror of their n8n setup.

### ✨ What it does

- **Scheduled execution**: The workflow runs automatically according to the schedule, up to X times a day (or can be triggered manually).
- **Creates a backup folder**: It creates a new top-level backup folder named "n8n_backup_folder_structure_DDMMYYYY_HHmmss" (e.g. "n8n_backup_folder_structure_02022026_123343"), where the whole n8n nested folder structure and the workflows (JSON files) are saved. For example, if an n8n instance looks like this, the same structure is preserved on Google Drive along with the workflows in each folder:

        projects-root-folder/
        └── Your-project-folder-name/
            ├── Utilities/
            │   ├── Error_management/
            │   │   └── error_alerting.json
            │   └── Log_analysis/
            ├── Reports/
            │   └── Clients/
            │       ├── Client_A/
            │       └── Client_B/
            │           └── client_reporting_standard.json
            ├── ...
            └── workflow_test.json

- **Uses the undocumented n8n API**: It connects to your n8n instance via the API used directly by the n8n UI, rather than the documented public API. This API is used instead of the native n8n node because it exposes extra features needed here, such as:
  - Retrieve a workflow's parentFolder name
  - Retrieve a workflow's description
  - Retrieve the n8n instance's projectId
  - Create an n8n data table
  - Get the properties of an n8n data table

### 🛠️ How to set up

1. **Configure credentials**: Make sure you have valid Google Drive credentials, to allow the workflow to create folders and upload files.
2. **Set your variables**: In the first Set node named "n8n instance/project access details":
   - n8n_instance_URL: Paste your n8n instance URL without a trailing "/". For example: https://myautomations.app.n8n.cloud
   - emailOrLdapLoginId: your login email address
   - password: your password
3. **Create your main backup folder**: Create the main folder on your Google Drive where all your backups will be uploaded (for example "my_n8n_backup"), then select it under "Parent Folder" in the node Create backup folder "n8n_backup_folder_structure_ddMMyyyy_HHmmss".

🚀 Activate the workflow, and you're all set!

## B. Restore backed-up workflows to n8n from Google Drive

Run it manually after entering, in the node "Search for the backup folder in Drive", the name of the specific Google Drive backup folder you want to restore to n8n. After running the workflow, you'll see the backup folder restored in n8n with all your saved folders/subfolders and associated workflows. Open that folder and you'll find your whole setup successfully restored.
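The "n8n_backup_folder_structure_DDMMYYYY_HHmmss" naming convention can be sketched as a small helper. UTC is used here as an assumption; the workflow may use the instance's local timezone:

```javascript
// Build the per-execution backup folder name in DDMMYYYY_HHmmss format.
// UTC is an assumption; the workflow may use the instance's local timezone.
function backupFolderName(date) {
  const pad = (n) => String(n).padStart(2, "0");
  return [
    "n8n_backup_folder_structure_",
    pad(date.getUTCDate()), pad(date.getUTCMonth() + 1), date.getUTCFullYear(),
    "_",
    pad(date.getUTCHours()), pad(date.getUTCMinutes()), pad(date.getUTCSeconds()),
  ].join("");
}

// 2 Feb 2026, 12:33:43 UTC reproduces the example name from the description.
const name = backupFolderName(new Date(Date.UTC(2026, 1, 2, 12, 33, 43)));
```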
by Dinakar Selvakumar
## 📌 Workflow Overview

This workflow enables multi-platform social media posting using Google Sheets as the control center. Whenever a new row is added to the sheet, the workflow automatically posts the content to Instagram, Facebook, and/or LinkedIn based on platform flags, then updates the post status to prevent duplicates.

### Supported Platforms

- Instagram (Business)
- Facebook Pages
- LinkedIn Pages

## 🧠 Key Concept

Google Sheets acts as a lightweight CMS and automation trigger. Each row represents one post, and simple TRUE/FALSE columns decide where that post should be published.

## 📄 Required Google Sheets Columns

The content sheet must include the following columns:

- **Content** – Text to publish
- **Instagram** – TRUE / FALSE
- **Facebook** – TRUE / FALSE
- **LinkedIn** – TRUE / FALSE
- **Status** – Updated after posting
- **Row Number** – Used for precise updates

## ⚙️ How This Workflow Works

### 1️⃣ Trigger: New Content Added

The workflow starts when a new row is added to Google Sheets. This allows near real-time publishing without manual execution.

### 2️⃣ Configuration Setup

Platform-specific values, such as the Instagram Business Account ID and Facebook Page ID, are defined once in a configuration node for easy reuse and maintenance.

### 3️⃣ Platform Routing Logic

IF nodes check each platform column:

- Instagram = TRUE → post to Instagram
- Facebook = TRUE → post to Facebook
- LinkedIn = TRUE → post to LinkedIn

One row can trigger posting to multiple platforms.

### 4️⃣ Platform Posting

Posts are published using:

- Facebook Graph API (Instagram + Facebook)
- LinkedIn API (LinkedIn Pages)

The Content column is used directly as the post body.

### 5️⃣ Status Update (Per Platform)

After posting, the workflow updates the same row using Row Number and marks the post as completed for that platform. This prevents duplicate or accidental re-posts.

## 🔄 Current Capabilities

- Multi-platform posting from one sheet
- Platform-specific routing logic
- Real-time execution on new content
- Safe status updates using row matching

## 🚀 Designed for Easy Expansion

This workflow is intentionally modular and can be extended with:

- Scheduled posting (date/time columns)
- Image & media handling
- AI-generated captions
- Hashtag optimization
- Engagement analytics
- Retry & error handling logic

## ✅ Best Practices

- Use TRUE / FALSE consistently in platform columns
- Keep Google Sheets as the single source of truth
- Add validation or approval columns if used by teams

## 📦 Ideal Use Cases

- Social media managers
- Marketing teams
- Founders & creators
- Agencies handling multiple platforms

This workflow provides a scalable foundation for social media automation while remaining simple, transparent, and easy to maintain.
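The platform routing logic can be sketched as follows. Column names match the required sheet layout; accepting lowercase "true" while treating anything else as false is an assumption about how loosely the flags should be read:

```javascript
// Sketch of the routing logic: turn one sheet row into the list of
// platforms to post to. Any casing of "TRUE" counts; everything else is false.
function targetPlatforms(row) {
  const flag = (v) => String(v).trim().toUpperCase() === "TRUE";
  return ["Instagram", "Facebook", "LinkedIn"].filter((p) => flag(row[p]));
}

const targets = targetPlatforms({
  Content: "Launch day!",
  Instagram: "TRUE",
  Facebook: "FALSE",
  LinkedIn: "true",
  Status: "",
});
```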