by Kev
Important: This workflow uses the Autype community node and requires a self-hosted n8n instance.

This workflow downloads a fillable PDF form from a URL, extracts all form field names and types using Autype, sends the field list to an AI Agent (OpenAI) together with applicant data, and uses the AI response to fill the form automatically. The AI is instructed to return raw JSON only, and a Code node validates the response before filling. The filled PDF is flattened (non-editable) and saved to Google Drive.

Who is this for?

Companies that regularly submit the same types of PDF form applications -- permit renewals, tax filings, compliance questionnaires, insurance claims, customs declarations, or any recurring government/regulatory paperwork. Instead of manually filling the same form fields every quarter or year, the AI reads the form structure and fills it with the correct data automatically.

Concrete example: A manufacturing company must renew its operating permit every year by submitting a multi-page PDF application to the local regulatory authority. The form asks for company name, registration number, address, contact person, business type, employee count, and more. With this workflow, the company stores its data once in the AI Agent prompt, and every renewal period they simply run the workflow to get a completed, flattened PDF ready for submission.

This also works as an additional skill for an AI agent. Instead of a manual trigger, connect the workflow to a webhook or chat trigger so an agent can call it when a user asks "fill out the permit renewal form for Q2 2026."

What this workflow does

On manual trigger, the workflow fetches a fillable PDF from a URL (e.g. a government portal, internal document server, or S3 bucket). It uploads the PDF to Autype and calls Get Form Fields to extract every field name, type (text, checkbox, dropdown, radio), current value, available options, and read-only status.
The field list is passed directly to an AI Agent via an inline expression (no separate prompt-building Code node needed). The AI's system message instructs it to return only raw JSON. A Code node validates and parses the response before Autype fills the form and flattens it, and the result is saved to Google Drive.

Showcase

How it works

- Run Workflow -- Manual trigger starts the pipeline.
- Download PDF Form -- An HTTP Request node fetches the fillable PDF from a URL (the sample uses a registration form with 7 fields).
- Upload PDF Form -- Uploads the PDF binary to Autype Tools to get a file ID.
- Get Form Fields -- Autype extracts all form fields and returns them as metadata. Each field includes: name, type (text/checkbox/dropdown/radio/optionlist), value (current), options (for dropdowns/radio), and isReadOnly. No output file is created.
- AI Agent -- Receives the field list and applicant data directly in its prompt via an n8n expression. The system message instructs the AI to return only a raw JSON object mapping field names to values (strings for text/dropdown/radio, booleans for checkboxes).
- Prepare Fill Data -- A Code node parses and validates the AI JSON response (strips markdown fences if present), then pairs it with the Autype file ID.
- Fill PDF Form -- Autype fills every form field with the AI-generated values. Fields are flattened (non-editable) so the output is a clean, final PDF.
- Save Filled PDF to Drive -- The completed form is uploaded to Google Drive as filled-form-YYYY-MM-DD.pdf.

Setup

- Install the Autype community node (n8n-nodes-autype) via Settings > Community Nodes.
- Create an Autype API credential with your API key from app.autype.com (see API Keys in Settings).
- Create an OpenAI API credential with your key from platform.openai.com.
- Create a Google Drive OAuth2 credential and connect your Google account.
- Import this workflow and assign your credentials to each node (including the OpenAI Chat Model sub-node). The sample form URL is pre-configured.
- To use your own form, replace the URL in the "Download PDF Form" node.
- Edit the applicant data directly in the AI Agent node prompt (the "Prompt (User Message)" field).
- Set YOUR_FOLDER_ID in the "Save Filled PDF to Drive" node to your target Google Drive folder.
- Click Test Workflow to run the pipeline.

Note: This is a community node, so you need a self-hosted n8n instance to use it.

Requirements

- Self-hosted n8n instance (community nodes are not available on n8n Cloud)
- Autype account with API key (free tier available)
- n8n-nodes-autype community node installed
- OpenAI API key (gpt-4o-mini or any chat model)
- Google Drive account with OAuth2 credentials (optional; can be replaced with another output)

How to customize

- **Change applicant data:** Edit the prompt text directly in the "AI Agent" node. Replace the example person/company info with your own.
- **Use a different AI model:** Swap the OpenAI Chat Model sub-node for Anthropic Claude, Google Gemini, or any LangChain-compatible chat model.
- **Connect to an AI agent:** Replace the Manual Trigger with a Webhook or Chat Trigger so an AI agent can call this workflow as a tool (e.g. "fill the Q2 permit renewal form").
- **Skip flattening:** Set flatten to false in the "Fill PDF Form" node if you want the fields to remain editable after filling.
- **Add watermark:** Insert an Autype Watermark step after Fill Form to stamp "DRAFT" or "SUBMITTED" on every page before saving.
- **Add password protection:** Insert an Autype Protect step after filling to encrypt the PDF before uploading to Drive.
- **Change output destination:** Replace the Google Drive node with Email (SMTP), S3, Slack, or any other n8n output node.
- **Pull data from a database:** Instead of hardcoding data in the AI Agent prompt, query a database (Postgres, MySQL, Airtable) or CRM (HubSpot, Salesforce) to dynamically fill different forms for different entities.
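The "Prepare Fill Data" Code node's parsing step could look roughly like this. This is a hedged sketch, not the template's exact code; the field names in the test data are illustrative:

```javascript
// Sketch of the "Prepare Fill Data" Code node: strip markdown fences
// from the AI response, parse it, and validate the field map before
// handing it to Autype's Fill Form step.
function prepareFillData(aiText) {
  // Remove fences like "```json ... ```" if the model wrapped its output
  const cleaned = aiText
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/\s*```$/, '')
    .trim();
  let fields;
  try {
    fields = JSON.parse(cleaned);
  } catch (e) {
    throw new Error('AI did not return valid JSON: ' + e.message);
  }
  if (typeof fields !== 'object' || fields === null || Array.isArray(fields)) {
    throw new Error('Expected a JSON object mapping field names to values');
  }
  // Values must be strings (text/dropdown/radio) or booleans (checkboxes)
  for (const [name, value] of Object.entries(fields)) {
    if (typeof value !== 'string' && typeof value !== 'boolean') {
      throw new Error('Field "' + name + '" has unsupported type ' + typeof value);
    }
  }
  return fields;
}
```

In the actual workflow, the returned object would then be paired with the Autype file ID before the Fill PDF Form node runs.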
by PretenderX
This template automates sending a DingTalk message on new Azure DevOps Pull Request Created events. It uses a MySQL database to store mappings between Azure users and DingTalk users, so the right users get notified.

Set up instructions

- Define your own path value in the ReceiveTfsPullRequestCreatedMessage Webhook node, then copy the webhook URL and create an Azure DevOps Service Hook that calls the webhook on the Pull Request Created event.
- To configure the LoadDingTalkAccountMap node, create a MySQL table as below:

|Name|Type|Length|Key|
|-|-|-|-|
|TfsAccount|varchar|255| |
|UserName|varchar|255| |
|DingTalkMobile|varchar|255| |

- You can customize the DingTalk message content by editing the BuildDingTalkWebHookData node.
- Set the URL of the SendDingTalkMessageViaWebHook HTTP Request node to your DingTalk group chat robot webhook URL.
- Send a test or production message from Azure DevOps to verify the setup.
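The mapping table above could be created with DDL along these lines. The column names match the workflow; the table name and primary-key choice are assumptions:

```sql
-- Illustrative DDL for the Azure-to-DingTalk user mapping table.
-- Table name and PRIMARY KEY choice are assumptions, not from the template.
CREATE TABLE DingTalkAccountMap (
  TfsAccount     VARCHAR(255) NOT NULL,
  UserName       VARCHAR(255) NOT NULL,
  DingTalkMobile VARCHAR(255) NOT NULL,
  PRIMARY KEY (TfsAccount)
);
```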
by Eduard
Are you a visual thinker working with n8n? View and understand workflow structures at a glance with this template! Built with mermaid.js, Bootstrap 5 and AJAX, it creates an interactive web page displaying n8n workflows as flowcharts. Perfect for documentation, presentations, or just getting a clearer picture of your automation processes. Need customization help? Reach out to Eduard!

Benefits

- Instant workflow visualization
- Responsive design
- Direct links to n8n workflows
- Special shapes for different node types
- Disabled node indication
- No external dependencies: just paste the workflow and call the webhook
- Easily customizable: enhance the JS script or add custom styling

Important note for cloud users

Since the cloud version doesn't support environment variables, please make the following changes in the CONFIG node:

- Update the instance_url variable: enter your n8n URL instead of {{$env["N8N_PROTOCOL"]}}://{{$env["N8N_HOST"]}}
- Change the webhook_path to simply "webhook" instead of {{$env["N8N_ENDPOINT_WEBHOOK"] || "webhook"}}

Examples

- Multiple flowcharts on a single page
- Several shapes for different nodes
- Langchain nodes with special connections styling
by Oneclick AI Squad
This workflow automates pre-dispatch customs document validation for international shipments. It ingests shipping document packages, extracts content from each file, uses Claude AI to cross-validate all documents for consistency, regulatory compliance, and HS code accuracy, then flags errors before goods are dispatched, preventing costly delays, fines, and rejected shipments at the border.

How it works

- Trigger: Webhook submission or watched Drive/S3 folder when new shipment docs are uploaded
- Register Shipment: Assigns shipment case ID, normalises metadata from payload
- Fetch Document Files: Downloads each document from Google Drive or URL
- Extract Text Content: Parses PDF/DOCX text from all documents
- Classify Document Types: Identifies invoice, packing list, bill of lading, COO, etc.
- Cross-Document Consistency Check: Detects mismatches across documents (values, weights, quantities)
- AI Compliance Validation: Claude AI validates each doc against destination country rules
- Aggregate Findings: Merges per-document results into a shipment-level report
- Route by Risk Level: Branches on CLEAR / HOLD / REJECT
- Notify Logistics Team: Slack alert with error summary and action items
- Email Exporter Report: Detailed validation report with fix instructions
- Update Shipment Tracker: Writes status back to Airtable / Google Sheets
- Create Compliance Ticket: Opens Jira issue for HOLD or REJECT shipments
- Return API Response: Structured JSON result to caller or TMS integration

Setup Steps

- Import workflow into n8n
- Configure credentials:
  - Anthropic API: Claude AI for compliance validation
  - Google Drive OAuth: Document intake and storage
  - Google Sheets OAuth: Shipment compliance audit log
  - Airtable: Shipment tracker CRM
  - Slack OAuth: Logistics team alerts
  - SendGrid / SMTP: Exporter notification emails
  - Jira API: Compliance issue tracking
- Set your Google Drive intake folder ID
- Configure destination country rules in the AI prompt node
- Set your Airtable base and shipment table IDs
- Activate the workflow

Sample Webhook Payload

```json
{
  "shipmentId": "SHP-2025-00392",
  "exporterEmail": "logistics@exportco.com",
  "originCountry": "CN",
  "destinationCountry": "AU",
  "incoterms": "FOB",
  "declaredValue": 48500,
  "currency": "USD",
  "goodsDescription": "Electronic Components",
  "documents": [
    { "name": "Commercial Invoice", "type": "commercial_invoice", "driveFileId": "1aBcD" },
    { "name": "Packing List", "type": "packing_list", "driveFileId": "2eFgH" },
    { "name": "Bill of Lading", "type": "bill_of_lading", "driveFileId": "3iJkL" },
    { "name": "Certificate of Origin", "type": "certificate_of_origin", "driveFileId": "4mNoP" }
  ]
}
```

Documents Supported

- Commercial Invoice
- Packing List
- Bill of Lading (B/L) / Airway Bill (AWB)
- Certificate of Origin (COO / Form D / EUR.1)
- Customs Entry / Import Declaration
- Dangerous Goods Declaration (DGD)
- Phytosanitary / Health Certificate
- Insurance Certificate
- Letter of Credit (L/C)
- Export Licence / Permit
- Material Safety Data Sheet (MSDS)
- Fumigation Certificate

AI Validation Checks

- **Field Completeness**: All mandatory fields present and populated
- **Cross-Document Consistency**: Values, weights, quantities, HS codes match across docs
- **HS Code Validation**: Correct classification for declared goods and destination
- **Incoterms Compliance**: Terms correctly applied across invoice and B/L
- **Valuation Rules**: Customs value method correct, currency declared
- **Country of Origin**: COO criteria met, preferential rates applicable
- **Restricted / Prohibited Goods**: Flags potential dual-use, CITES, or embargoed items
- **Sanction Screening**: Party names checked against common red flags
- **Date & Validity**: Document dates consistent, certificates not expired

Features

- Multi-document cross-validation in a single run
- AI-powered HS code verification and suggestion
- Destination-country-specific compliance rules
- Automatic HOLD/REJECT routing for high-risk findings
- Detailed error report with fix instructions per field
- Full audit trail in Google Sheets
- Jira ticket creation for escalated compliance issues

Explore More Automation: Contact us to design AI-powered lead nurturing, content engagement, and multi-platform reply workflows tailored to your growth strategy.
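The cross-document consistency check could be sketched as follows. This is an illustrative sketch under assumed document shapes (one object per extracted document, carrying a `type` and the field being compared), not the template's actual node code:

```javascript
// Sketch: compare one field (e.g. declared value or gross weight)
// across all extracted documents and flag mismatches for HOLD routing.
function checkConsistency(docs, field) {
  // Collect the field from every document that declares it
  const values = docs
    .filter(d => d[field] !== undefined)
    .map(d => ({ doc: d.type, value: d[field] }));
  const distinct = [...new Set(values.map(v => v.value))];
  // One distinct value (or none) means the documents agree
  return distinct.length <= 1
    ? { status: 'CLEAR', field }
    : { status: 'HOLD', field, values };
}
```

A real implementation would run this per field (value, weight, quantity, HS code) and merge the results into the shipment-level report.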
by CustomJS
This n8n workflow illustrates how to convert PDF files into text with the PDF Toolkit from www.customjs.space (@custom-js/n8n-nodes-pdf-toolkit).

Notice: Community nodes can only be installed on self-hosted instances of n8n.

What this workflow does

- **Change** the requested HTML to PDF.
- **Extract** text from the PDF.
- **Use** a Code node to handle URLs that point to PDF files.
- **Convert** the PDF to text.

Requirements

- **Self-hosted** n8n instance.
- **CustomJS API key** for converting PDF to text.
- **HTML** data to convert to PDF files.
- **Code node** for handling URLs that point to PDF files.

Workflow Steps

1. Manual Trigger: runs with user interaction.
2. HTML to PDF: request HTML data and convert it to PDF.
3. Convert PDF to Text: extract text from the generated PDF.

Usage

Get an API key from CustomJS:
- Sign up on the CustomJS platform.
- Navigate to your profile page.
- Press the "Show" button to get your API key.

Set credentials for the CustomJS API in n8n:
- Copy and paste the API key generated from CustomJS.

Design the workflow:
- A Manual Trigger for starting the workflow.
- HTTP Request nodes for downloading PDF files.
- A Code node for handling URLs that point to PDF files.
- Convert PDF to Text.

You can replace the logic for triggering and returning results. For example, you can trigger this workflow by calling a webhook and get the result as a response from the webhook. Simply replace the Manual Trigger and Write to Disk nodes.
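The Code node's URL handling might look something like this minimal sketch (illustrative only; the template's actual logic may differ):

```javascript
// Sketch: decide whether a URL likely points to a PDF file,
// so the workflow can route it to the PDF-to-text step.
function isPdfUrl(url) {
  try {
    // Parse so query strings and fragments don't confuse the check
    const { pathname } = new URL(url);
    return pathname.toLowerCase().endsWith('.pdf');
  } catch {
    return false; // not a valid URL at all
  }
}
```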
by Mauricio Perera
Overview

This workflow exposes an HTTP endpoint (webhook) that accepts a JSON definition of an n8n workflow, validates it, and, if everything is correct, dynamically creates that workflow in the n8n instance via its internal API. If any validation fails or the API call encounters an error, an explanatory message with details is returned.

Workflow Diagram

```
Webhook
   |
   v
Validate JSON --fails validation--> Validation Error
   |
   passes
   v
Validation Successful?
   |-- true --> Create Workflow --> API Successful?
   |                                  |-- true --> Success Response
   |                                  |-- false --> API Error
   |-- false --> Validation Error
```

Step-by-Step Details

1. Webhook
- **Type**: Webhook (POST)
- **Path**: /webhook/create-workflow
- **Purpose**: Expose a URL to receive a JSON definition of a workflow.
- **Expected Input**: JSON containing the main workflow fields (name, nodes, connections, settings).

2. Validate JSON
- **Type**: Code Node (JavaScript)
- **Validations Performed**:
  - Ensure that the payload exists and contains both name and nodes.
  - Verify that nodes is an array with at least one item.
  - Check that each node includes the required fields: id, name, type, position.
  - If missing, initialize connections, settings, parameters, and typeVersion.
- **Output if Error**:

```
{ "success": false, "message": "<error description>" }
```

- **Output if Valid**:

```
{
  "success": true,
  "apiWorkflow": {
    "name": payload.name,
    "nodes": payload.nodes,
    "connections": payload.connections,
    "settings": payload.settings
  }
}
```

3. Validation Successful?
- **Type**: IF Node
- **Condition**: $json.success === true
- **Branches**: true: proceed to Create Workflow; false: route to Validation Error

4. Create Workflow
- **Type**: HTTP Request (POST)
- **URL**: http://127.0.0.1:5678/api/v1/workflows
- **Authentication**: Header Auth with internal credentials
- **Body**: The apiWorkflow object generated earlier
- **Options**: continueOnFail: true (to handle failures in the next IF)
5. API Successful?
- **Type**: IF Node
- **Condition**: $response.statusCode <= 299
- **Branches**: true: proceed to Success Response; false: route to API Error

6. Success Response
- **Type**: SET Node
- **Output**:

```
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "{{ $json.data[0].id }}",
  "workflowName": "{{ $json.data[0].name }}",
  "createdAt": "{{ $json.data[0].createdAt }}",
  "url": "http://localhost:5678/workflow/{{ $json.data[0].id }}"
}
```

7. API Error
- **Type**: SET Node
- **Output**:

```
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{{ JSON.stringify($json) }}",
  "statusCode": "{{ $response.statusCode }}"
}
```

8. Validation Error
- **Type**: SET Node
- **Output**:

```
{ "success": false, "message": "{{ $json.message }}" }
```

Example Webhook Request

```bash
curl --location --request POST 'http://localhost:5678/webhook/create-workflow' \
  --header 'Content-Type: application/json' \
  --data-raw '{
    "name": "My Dynamic Workflow",
    "nodes": [
      {
        "id": "start-node",
        "name": "Start",
        "type": "n8n-nodes-base.manualTrigger",
        "typeVersion": 1,
        "position": [100, 100],
        "parameters": {}
      },
      {
        "id": "set-node",
        "name": "Set",
        "type": "n8n-nodes-base.set",
        "typeVersion": 1,
        "position": [300, 100],
        "parameters": {
          "values": {
            "string": [
              { "name": "message", "value": "Hello from a webhook-created workflow!" }
            ]
          }
        }
      }
    ],
    "connections": {
      "Start": {
        "main": [
          [ { "node": "Set", "type": "main", "index": 0 } ]
        ]
      }
    },
    "settings": {}
  }'
```

Expected Success Response

```json
{
  "success": "true",
  "message": "Workflow created successfully",
  "workflowId": "abcdef1234567890",
  "workflowName": "My Dynamic Workflow",
  "createdAt": "2025-05-31T12:34:56.789Z",
  "url": "http://localhost:5678/workflow/abcdef1234567890"
}
```

Validation Error Response

```json
{ "success": false, "message": "The 'name' field is required in the workflow" }
```

API Error Response

```json
{
  "success": "false",
  "message": "Error creating workflow",
  "error": "{ ...full API response details... }",
  "statusCode": 401
}
```
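The Validate JSON Code node described in step 2 could be sketched roughly as follows (an illustrative sketch matching the checks and outputs listed above, not the template's exact code):

```javascript
// Sketch of the "Validate JSON" Code node: enforce required fields,
// backfill optional ones, and emit the success/error shapes above.
function validateWorkflowPayload(payload) {
  if (!payload || !payload.name || !payload.nodes) {
    return { success: false, message: "The 'name' and 'nodes' fields are required in the workflow" };
  }
  if (!Array.isArray(payload.nodes) || payload.nodes.length === 0) {
    return { success: false, message: "'nodes' must be a non-empty array" };
  }
  for (const node of payload.nodes) {
    for (const field of ['id', 'name', 'type', 'position']) {
      if (!(field in node)) {
        return { success: false, message: "Node is missing required field '" + field + "'" };
      }
    }
    // Initialize optional per-node fields if missing
    node.parameters = node.parameters || {};
    node.typeVersion = node.typeVersion || 1;
  }
  return {
    success: true,
    apiWorkflow: {
      name: payload.name,
      nodes: payload.nodes,
      connections: payload.connections || {},
      settings: payload.settings || {},
    },
  };
}
```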
by Nikan Noorafkan
Template: Extract Ad Creatives from Google's Ads Transparency Center

This n8n workflow pulls ad creatives from Google's Ads Transparency Center using SerpApi, filtered by a specific domain and region. It extracts, filters, categorizes, and exports ads into neatly formatted CSV files for easy analysis.

Who's it for?

- **Marketing Analysts** researching competitive PPC strategies
- **Ad Intelligence Teams** monitoring creatives from specific brands
- **Digital Marketers** gathering visual and copy trends
- **Journalists & Watchdogs** reviewing ad activity transparency

Features

- **Fetch creatives** using SerpApi's google_ads_transparency_center engine
- **Filter results** to include only ads with an exact match to your target domain
- **Categorize** by ad format: text, image, or video
- **Export CSVs**: generates a downloadable file for each format under the /files/ directory

How to Use

1. Edit the "Set Domain & Region" node:
   - domain: e.g. example.com
   - region: SerpApi numeric region code (see codes)
2. Add your SerpApi API key in the "Get Ads Page 1" node's credentials section.
3. Run the workflow: click "Test workflow" to initiate the process.
4. Download your results: navigate to /files/ to find:
   - text_{domain}_ads.csv
   - image_{domain}_ads.csv
   - video_{domain}_ads.csv

Notes

- Only the first page (up to 50 creatives) is fetched; pagination is not included.
- Sticky Notes inside the workflow offer helpful internal annotations.
- CSV files include creative-level details: ad copy, images, video links, etc.
by Solomon
The Stripe API does not provide custom fields in invoice or charge data, so you have to get them from the Checkout Sessions endpoint. But that endpoint is not easy for beginners: it has dictionary parameters and pagination settings.

This workflow solves that problem with a preconfigured GET request that fetches all checkout sessions from the last 7 days. It then transforms the data to make it easier to work with and lets you filter by the custom_fields you want.

Want to generate Stripe invoices automatically? Open this workflow.

Check out my other templates: https://n8n.io/creators/solomon/
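Once the sessions are fetched, pulling a value out of Stripe's `custom_fields` array could look like this. A hedged sketch against the documented shape of that array (entries with a `key`, a `type`, and a value object named after the type); it is not the template's exact transformation code:

```javascript
// Sketch: extract one custom field's value from a Stripe Checkout
// Session object. custom_fields is an array like:
//   [{ key: 'vat_number', type: 'text', text: { value: 'DE123' } }, ...]
function getCustomField(session, key) {
  const field = (session.custom_fields || []).find(f => f.key === key);
  if (!field) return null;
  // The value lives under a property named after the field type
  // (text, dropdown, or numeric)
  return field[field.type] ? field[field.type].value : null;
}
```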
by explorium
Automatically enrich prospect data from HubSpot using Explorium and create leads in Salesforce

This n8n workflow streamlines the process of enriching prospect information by automatically pulling data from HubSpot, processing it through Explorium's AI-powered tools, and creating new leads in Salesforce with enhanced prospect details.

Credentials Required

To use this workflow, set up the following credentials in your n8n environment:

HubSpot
- **Type**: App Token (or OAuth2 for broader compatibility)
- **Used for**: triggering on new contacts, fetching contact data

Explorium API
- **Type**: Generic Header Auth
- **Header**: Authorization
- **Value**: Bearer YOUR_API_KEY
- Get an Explorium API key

Salesforce
- **Type**: OAuth2 or Username/Password
- **Used for**: creating new lead records

Go to Settings > Credentials, create these three credentials, and assign them in the respective nodes before running the workflow.

Workflow Overview

Node 1: HubSpot Trigger
This node listens for real-time events from the connected HubSpot account. Once triggered, the node passes metadata about the event to the next step in the flow.

Node 2: HubSpot
This node fetches contact details from HubSpot after the trigger event.
- **Credential**: Connected using a HubSpot App Token
- **Resource**: Contact
- **Operation**: Get Contact
- **Return All**: Disabled
This node retrieves the full contact details needed for further processing and enrichment.

Node 3: Match prospect
This node sends each contact's data to Explorium's AI-powered prospect matching API in real time.
- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/match
- **Authentication**: Generic Header Auth (using a configured credential)
- **Headers**: Content-Type: application/json
The request body is dynamically built from contact data, typically including full_name, company_name, email, phone_number, and linkedin. These fields are matched against Explorium's intelligence graph to return enriched or validated profiles.
Response output: total_matches, matched_prospects, and a prospect_id. Each response is used downstream to enrich, validate, or create lead information.

Node 4: Filter
This node filters the output of the Match prospect step so that only valid, matched results continue in the flow. Only records that contain at least one matched prospect with a non-null prospect_id are passed forward.
Status: currently deactivated (as shown by the "Deactivate" label).

Node 5: Extract Prospect IDs from Matched Results
This node extracts all valid prospect_id values from previously matched prospects and compiles them into a flat array. It loops over all matched items, extracts each prospect_id from the matched_prospects array, and returns a single object with an array of all prospect_ids.

Node 6: Explorium Enrich Contacts Information
This node performs bulk enrichment of contacts by querying Explorium with a list of matched prospect_ids.
Node configuration:
- **Method**: POST
- **Endpoint**: https://api.explorium.ai/v1/prospects/contacts_information/bulk_enrich
- **Authentication**: Header Auth (using saved credentials)
- **Headers**: Content-Type: application/json, Accept: application/json
Returns enriched contact information, such as:
- **emails**: professional/personal email addresses
- **phone_numbers**: mobile and work numbers
- professions_email, professional_email_status, mobile_phone

Node 7: Explorium Enrich Profiles
This additional enrichment node provides supplementary contact data enhancement, running in parallel with the primary enrichment process.

Node 8: Merge
This node combines multiple data streams from the parallel enrichment processes into a single output, allowing you to consolidate data from different Explorium enrichment endpoints. The "combine" setting means it merges the incoming data streams rather than overwriting them.

Node 9: Code - flatten
This custom code node processes and transforms the merged enrichment data before creating the Salesforce lead.
It can be used to:
- Flatten nested data structures
- Format data according to Salesforce field requirements
- Apply business logic or data validation
- Map Explorium fields to Salesforce lead properties
- Handle data type conversions

Node 10: Salesforce
This final node creates new leads in Salesforce using the enriched data returned by Explorium.
- **Credential**: Salesforce OAuth2 or Username/Password
- **Resource**: Lead
- **Operation**: Create Lead
The node creates new lead records with enriched information including contact details, company information, and professional data obtained through the Explorium enrichment process.

Workflow Flow Summary

1. Trigger: HubSpot webhook triggers on new/updated contacts
2. Fetch: Retrieve contact details from HubSpot
3. Match: Find prospect matches using Explorium
4. Filter: Keep only successfully matched prospects (currently deactivated)
5. Extract: Compile prospect IDs for bulk enrichment
6. Enrich: Parallel enrichment of contact information through multiple Explorium endpoints
7. Merge: Combine enrichment results
8. Transform: Flatten and prepare data for Salesforce (Code node)
9. Create: Create new lead records in Salesforce

This workflow ensures comprehensive data enrichment while maintaining data quality and provides a seamless integration between HubSpot prospect data and Salesforce lead creation. The parallel enrichment structure maximizes data collection efficiency before creating high-quality leads in your CRM system.
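The "Code - flatten" step (Node 9) could be sketched along these lines. The Salesforce field mapping and enriched-field names here are illustrative assumptions, not the template's actual code:

```javascript
// Sketch of the flatten step: collapse nested Explorium enrichment
// data into the flat fields a Salesforce "Create Lead" call expects.
// Input shape and output field names are assumptions.
function flattenForSalesforce(enriched) {
  const fullName = enriched.full_name || 'Unknown';
  return {
    // Take the first email/phone from the enrichment arrays
    Email: (enriched.emails && enriched.emails[0]) || null,
    Phone: (enriched.phone_numbers && enriched.phone_numbers[0]) || null,
    Company: enriched.company_name || 'Unknown',
    // Salesforce requires LastName on a Lead; use the final name token
    LastName: fullName.split(' ').slice(-1)[0],
  };
}
```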
by Miquel Colomer
Overview

This workflow transforms n8n into a smart real-estate concierge by combining an AI chat interface with Bright Data's marketplace datasets. Users interact via chat to specify city, price, bedrooms, and bathrooms, and receive a curated list of three homes for sale, complete with images and briefings.

Workflow in Action

Want to see this workflow in action? Play the video.

Key Features

- **AI-Powered Chat Trigger**: Instantly start conversations using LangChain's Chat Trigger node.
- **Contextual Memory**: Retain up to 30 recent messages for coherent back-and-forth.
- **Bright Data Integration**: Dynamically filter "FOR_SALE" properties by city, price, bedrooms, and bathrooms (limit = 3).
- **Automated Snapshot Retrieval**: Poll for dataset readiness and fetch full snapshot content.
- **HTML-Formatted Output**: Present results as a `<ul>` of `<li>` items, embedding property images.

How It Works (Step-by-Step)

Prerequisites:
- n8n v1.0 or later
- Community nodes: install n8n-nodes-brightdata (the unverified community node)
- API credentials: OpenAI, Bright Data
- Webhook endpoint to receive chat messages

Node Configuration:
- Chat Trigger: listens for incoming chat messages; shows a welcome screen.
- Memory Buffer: stores the last 30 messages for context.
- OpenAI Chat Model: uses GPT-4o-mini to interpret user intent.
- Real Estate AI Agent: orchestrates filtering logic, calls tools, and formats responses.
- Bright Data "Filter Dataset" Tool: applies user-defined filters plus homeStatus = FOR_SALE.
- Wait & Recover Snapshot: polls until the snapshot is ready, then fetches content.
- Get Snapshot Content: converts raw JSON into a structured list.

Workflow Logic:
1. The user sends search criteria; the agent validates the inputs.
2. The agent invokes "Filter Dataset" once all filters are present.
3. Upon dataset readiness, the snapshot is retrieved and parsed.
4. The final output is rendered as a bullet list with property images.

Testing & Optimization:
- Use the built-in Execute Workflow trigger for rapid dry runs.
- Inspect node outputs in n8n's UI; adjust filter defaults or snapshot limits.
- Tune OpenAI model parameters (e.g., maxIterations) for faster responses.

Deployment & Monitoring:
- Activate the main workflow and expose its webhook URL.
- Monitor executions in the "Executions" panel; set up alerts for errors.
- Archive or duplicate workflows as needed; update credentials via the credential manager.

Pre-requisites

- **Bright Data Account**: API key for marketplaceDataset.
- **OpenAI Account**: Access to the GPT-4o-mini model.
- **n8n Version**: v1.0 or later with community node support.
- **Permissions**: Webhook access, credential vault read/write.

Who Is This For?

- Real-estate agencies and brokers seeking to automate client queries.
- PropTech startups building conversational search tools.
- Data analysts who want on-demand property snapshots without manual scraping.

Benefits & Use Cases

- **Time Savings**: Replace manual MLS searches with an AI-driven chat.
- **Scalability**: Serve multiple clients simultaneously via webchat or an embedded widget.
- **Consistency**: Always report exactly three properties, ensuring concise results.
- **Engagement**: Visual listings with images boost user satisfaction and conversion.

Workflow created and verified by Miquel Colomer (https://www.linkedin.com/in/miquelcolomersalas/) and N8nHackers (https://n8nhackers.com).
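The HTML-formatted output step could be sketched as follows. A hedged illustration of building the `<ul>` of `<li>` items described above; the property field names are assumptions about the snapshot shape, not Bright Data's actual schema:

```javascript
// Sketch: render matched properties as the HTML list the agent
// returns to the chat (a <ul> of <li> items with embedded images).
// Property field names (imageUrl, address, price) are assumptions.
function renderListings(homes) {
  const items = homes.map(h =>
    '<li><img src="' + h.imageUrl + '" alt="' + h.address + '"/> ' +
    h.address + ' - $' + h.price + '</li>'
  );
  return '<ul>' + items.join('') + '</ul>';
}
```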
by Gavin
This workflow makes an HTTPS request to ConnectWise Manage through its REST API. It pulls all tickets in the "New" status (or whichever status you like) and notifies your dispatch team/personnel whenever a new ticket comes in, using Microsoft Teams.

Video explanation: https://youtu.be/yaSVCybSWbM
by Incrementors
This workflow automates the complete blog publishing process. It removes manual work from content creation, image generation, category management, and WordPress publishing by using AI and n8n. It helps agencies, SEO teams, and content creators manage blogs at scale.

Key Features

- Scheduled or manual blog publishing
- Automated topic research and content writing
- AI-generated featured and in-content images using Ideogram
- Dynamic WordPress category detection and creation
- Automatic media upload with SEO-friendly alt text
- Internal linking using sitemap data
- Google Sheets logging for published URLs
- Error notifications for failed executions

What This Workflow Does

Input
- Blog topics or keywords stored in Google Sheets
- Target WordPress site details
- Publishing rules and schedule

Processing
- Triggers the workflow on a schedule or manual run
- Fetches blog posting data from Google Sheets
- Validates active projects or websites
- Performs topic and SEO research
- Writes long-form, SEO-optimized blog content
- Generates image prompts and creates images using Ideogram
- Uploads images to WordPress with alt text
- Detects or creates blog categories dynamically
- Publishes the blog post to WordPress

Output
- Live published blog post URL
- Updated Google Sheet with publishing details
- Notification alerts if any step fails

Setup Instructions

Prerequisites
- n8n instance (cloud or self-hosted)
- WordPress site with REST API access
- Google Sheets access
- AI model credentials (Google Gemini, OpenAI, or DeepSeek)
- Ideogram API access
- Notification service (Discord or Slack)

Step 1: Import the Workflow
- Download or copy the workflow JSON
- In n8n, go to Workflows > Import from file / JSON
- Import the workflow

Step 2: Configure Credentials
Set up the required credentials inside n8n's credential manager:
- **Google Sheets OAuth**: For reading posting data and saving URLs
- **WordPress API**: For publishing posts and uploading media
- **AI Model**: Connect Google Gemini, OpenAI, or DeepSeek
- **Ideogram API**: For AI image generation
- **Discord/Slack Webhook**: For error notifications

Important: No credentials are hardcoded. All must be connected via n8n's credential manager.

Step 3: Configure Google Sheets
Prepare a Google Sheet containing:
- Blog topic or keyword
- Target website or domain
- Publishing status fields
- Domain ID for tracking
Update the Sheet ID inside the Get_Post_Data node after import.

Step 4: Configure Website Access
Update the PBN_Website_Access node with your WordPress site access endpoint or API. This node should return:
- Complete WordPress URL
- Basic authentication token
- Sitemap post URL

Step 5: Configure Publishing & Schedule
- Adjust the Schedule Trigger if auto-publishing is required
- Modify publishing frequency or time zone
- Review WordPress post status (draft or publish)

Step 6: Test & Activate
- Add one test row in Google Sheets
- Run the workflow manually
- Verify content creation, image generation, WordPress publishing, and sheet updates
- Activate the workflow

Usage Guide

Adding New Blog Posts
Add a new row in the connected Google Sheet with the required blog topic and website details. The workflow will automatically process and publish the post on the next execution.

Understanding the Output
After execution, the workflow:
- Publishes a complete blog post on WordPress
- Attaches featured and in-content images
- Assigns the correct category
- Logs the live URL back to Google Sheets

Workflow Node Breakdown

Get_Post_Data
Fetches blog posting details from Google Sheets based on the current day. It pulls keywords, landing pages, domain IDs, and posting websites.

get_client_status
Checks the client's project status from the project sheet. It verifies whether the client is active or inactive before proceeding further. This prevents publishing content for paused or stopped clients.

PBN_Website_Access
Fetches WordPress website access details such as site URL, authentication token, and sitemap URL. These details are required for publishing posts, uploading images, and managing categories.
Do the Research on the Topic
Performs deep SEO research on the target keyword. It analyzes search intent, content gaps, and audience needs. This ensures the generated content is informative, relevant, and SEO-optimized.

sitemap_crawl (internal_linking)
Crawls the website sitemap to collect internal URLs. These URLs are later used for internal linking inside the blog content. Internal links help improve SEO and site structure.

write_content
Uses AI to write an 800-1000 word SEO-optimized blog article based on the research data. The content includes proper HTML formatting, internal links, and anchor keyword placement.

extract_title_body
Separates the H1 title from the blog body content for proper WordPress publishing format.

classify_category
Automatically determines the most suitable category for the blog post by analyzing the blog title and content context. This keeps the website's category structure clean and relevant.

get_category & create_category
Checks whether the determined category exists in WordPress. If not, it creates a new category automatically.

generate_image_prompt
Analyzes the blog content and generates AI prompts for creating relevant images, including thumbnail and in-content images.

Thumbnail Image Generator & Blog Image Generator
Generate high-quality images using the Ideogram API based on AI-generated prompts. Images are created with proper resolution and rendering settings.

Thumbnail Uploading & Blog Image Uploading
Upload generated images to the WordPress media library and retrieve media IDs for post attachment.

Add Alt Text in Images
Adds SEO-friendly alt text to uploaded images to improve accessibility and search engine optimization.

Blog and Photo Merge
Merges the generated images into the blog content at appropriate positions within the article.

publish_blog
Publishes the complete blog post to WordPress with title, content, category, featured image, and publish status.
save_live_url
Saves the live published blog URL back into Google Sheets along with the keyword, website URL, and timestamp for tracking and reporting.

If Error Existed Then Get Notified
Sends instant Discord or Slack notifications when any error occurs during workflow execution, ensuring no failure goes unnoticed.

Customization Options

- Change blog length or tone in the content generation node
- Modify image style or resolution in the Ideogram nodes
- Add multi-site publishing using Switch nodes
- Replace the notification channel (Discord with Slack or email)
- Extend the workflow to social media posting

Troubleshooting

- Blog not published: check WordPress credentials and REST API permissions.
- Images not generated: verify Ideogram API credentials and prompt formatting.
- Sheet not updating: ensure the correct Sheet ID and OAuth permissions.
- Workflow stopped: review execution logs and error notification messages.

Use Cases

- SEO blog automation for agencies
- Content publishing for niche websites
- Scalable blog management
- AI-assisted content operations
- Hands-free WordPress publishing

Final Notes

This workflow is designed to be reusable, scalable, and creator-friendly. It follows n8n best practices, avoids hardcoded credentials, and is suitable for sharing as a public workflow template.

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/