by Ranjan Dailata
**Who this is for**
This workflow is built for recruiters, HR professionals, talent acquisition teams, and AI-powered recruitment startups who need to analyze LinkedIn profiles at scale. It's ideal for anyone looking to turn raw LinkedIn data into structured, ATS-ready candidate profiles and summarized professional insights automatically.

**What problem this workflow solves**
Recruiters spend countless hours manually reviewing LinkedIn profiles, extracting details such as experience, skills, and education, and crafting summaries for ATS or reporting. This process is time-consuming, inconsistent, and error-prone. This workflow automates the whole process, from profile scraping to data structuring and AI-driven summarization, allowing recruiters to instantly generate standardized talent profiles.

**What this workflow does**
The workflow integrates Decodo, Google Gemini, and Google Sheets to perform automated LinkedIn talent profiling. Here's how it works step by step:

1. **Input Setup** - The workflow begins when the user executes it manually or passes a LinkedIn profile URL. The input includes url (LinkedIn profile link) and geo (location, e.g., India).
2. **Profile Extraction with Decodo** - The Decodo node scrapes structured data from the LinkedIn profile (headline, experience, skills, education, etc.). Output: detailed text-based content of the LinkedIn profile.
3. **AI Processing and Enrichment (Google Gemini)** - The Structured Data Extractor node parses the scraped data into the JSON Resume Schema using Gemini AI. The Summarize Content node uses Gemini AI to produce a short, professional summary of the candidate's profile. Together, the two Gemini nodes ensure both structured and human-readable data formats are available.
4. **JSON Parsing & Merging** - The Code node cleans and parses the JSON output from the AI for reliable downstream use (a sketch appears at the end of this template). The Merge node combines the structured profile data with the AI-generated summary.
5. **Data Storage in Google Sheets** - The Google Sheets node appends or updates the record, storing the structured JSON and summary in a connected spreadsheet. This creates a live repository of candidate profiles with summaries for quick access or ATS integration.
6. **End Output** - A unified, machine-readable profile in JSON plus an executive-level summary suitable for HR review or downstream automation.

**Setup Instructions**

Prerequisites:
- **n8n account** with workflow editor access
- **Decodo API credentials** - register, log in, and obtain the Basic Authentication token via the Decodo dashboard
- **Google Gemini (PaLM) API access**
- **Google Sheets OAuth credentials**

Setup steps:
1. Import the workflow into your n8n instance.
2. Configure credentials: add your Decodo API credentials in the Decodo node, connect your Google Gemini (PaLM) credentials for both AI nodes, and authenticate your Google Sheets account.
3. Edit the input node: in the Set the Input Fields node, replace the default LinkedIn URL with your desired profile or a dynamic data source.
4. Run the workflow: trigger it manually or via webhook integration, then verify that the structured profile data and summary are written to the linked Google Sheet.

**How to customize this workflow to your needs**
- **Bulk Profile Input** - Connect the "Set Input" node to a Google Sheet or CSV input for batch LinkedIn URLs.
- **Alternate Output Format** - Instead of Google Sheets, connect to Notion, Airtable, or PostgreSQL for centralized profile databases.
- **Advanced Summaries** - Modify the Summarize Content Gemini prompt to generate more specialized summaries, e.g., "Leadership Potential Summary" or "Technical Fit Analysis".
- **Resume Comparison Feature** - Add another Gemini node to compare a candidate's profile against a job description and output a fit score or gap analysis.
- **Notification Integration** - Use Slack or Gmail nodes to send alerts when a new candidate summary is generated.
- **Language Localization** - Add a language detection step before summarization to support multilingual summaries.

**Summary**
The Automated LinkedIn Talent Profiling & Summary via Decodo + Google Gemini workflow streamlines recruitment intelligence by automating every step of LinkedIn profile research:
- Scraping (via Decodo)
- Structuring (via Gemini JSON extraction)
- Summarizing (via Gemini summarizer)
- Storing results (in Google Sheets)

This workflow empowers recruiters to analyze hundreds of profiles within minutes, ensuring data consistency, faster candidate evaluation, and smarter hiring decisions, powered by Decodo's scraping and Google Gemini's AI reasoning.
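For orientation, here is a minimal sketch of what the JSON-parsing Code node in step 4 might contain. It assumes the Gemini output arrives in an `output` field and may be wrapped in Markdown code fences; adjust the field name to match your workflow.

```javascript
// Hypothetical n8n Code node: strip Markdown fences from the Gemini output
// and parse it into a JSON Resume object. The "output" field name is an assumption.
const raw = $input.first().json.output ?? '';

// Gemini often wraps JSON in code fences; remove them if present.
const cleaned = raw
  .replace(/^\s*`{3}(?:json)?\s*/i, '')  // leading fence, if any
  .replace(/`{3}\s*$/, '')               // trailing fence, if any
  .trim();

let profile;
try {
  profile = JSON.parse(cleaned);
} catch (err) {
  // Surface a parse failure instead of passing malformed data downstream.
  throw new Error(`Could not parse Gemini output as JSON: ${err.message}`);
}

return [{ json: { profile, profileJson: JSON.stringify(profile) } }];
```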
by David Olusola
**GPT-4o Resume Screener with Error Handling - Google Sheets & Drive Pipeline**

**How it works**
Enterprise-grade resume screening automation built for production environments. This workflow combines intelligent AI analysis with comprehensive error handling to ensure reliable processing of candidate applications. Every potential failure point is monitored with automatic recovery and notification systems.

Core workflow steps:
1. Intelligent Email Processing - monitors Gmail with attachment validation and file type detection
2. Robust File Handling - multi-format support with upload verification and extraction validation
3. Quality-Controlled AI Analysis - GPT-4o evaluation with output validation and fallback mechanisms
4. Verified Data Extraction - contact and qualification extraction with data integrity checks
5. Dual Logging System - success tracking in the main dashboard, error logging in a separate audit trail

Error recovery features:
- Upload failure detection with retry mechanisms
- Text extraction validation with quality thresholds
- AI processing timeout protection and fallback responses
- Data validation before final logging
- Comprehensive error notification and tracking system

**Set up steps**
Total setup time: 25-35 minutes

Core Credentials Setup (8 minutes)
- Gmail OAuth2 with attachment permissions
- Google Drive API with folder creation rights
- Google Sheets API with read/write access
- OpenAI API key with GPT-4o model access

Primary Configuration (12 minutes)
1. Configure monitoring systems - set up the Gmail trigger with error detection
2. Establish the file processing pipeline - create Drive folders for resumes and configure upload validation
3. Deploy the dual spreadsheet system - set up the main tracking sheet and the error logging sheet
4. Initialize AI processing - configure GPT-4o with structured output parsing and timeout settings
5. Customize job requirements - update role specifications and scoring criteria

Error Handling Setup (10 minutes)
1. Configure error notifications - set an administrator email for failure alerts
2. Set up the error logging spreadsheet - create an audit trail for failed processing attempts
3. Customize timeout settings - adjust processing limits based on expected file sizes
4. Test error pathways - validate the notification system with sample failures

Advanced Customization (5 minutes)
- Modify validation thresholds for resume quality
- Adjust the AI prompt for industry-specific requirements
- Configure custom error messages and escalation rules
- Set up automated retry logic for transient failures

Production-ready features:
- Comprehensive logging for compliance and auditing
- Graceful degradation when services are temporarily unavailable
- Detailed error context for troubleshooting
- Scalable architecture for high-volume processing

**Template Features**

Enterprise Error Management
- Multi-layer validation at every processing stage
- Automatic error categorization and routing
- Administrative alerts with detailed context
- Separate error logging for audit compliance
- Timeout protection preventing workflow hangs

Advanced File Processing
- Upload success verification before processing
- Text extraction quality validation
- Resume content quality thresholds
- Corrupted file detection and handling
- Format conversion error recovery

Robust AI Integration
- GPT-4o processing with output validation
- Structured response parsing with error checking
- AI timeout protection and fallback responses
- Failed analysis logging with manual review triggers
- Retry logic for transient API failures

Production Monitoring
- Real-time error notifications via email
- Comprehensive error logging dashboard
- Processing success/failure metrics
- Failed resume tracking for manual review
- Audit trail for compliance requirements

Data Integrity Controls (a validation sketch appears at the end of this template)
- Pre-logging validation of all extracted data
- Missing information detection and flagging
- Contact information verification checks
- Score validation and boundary enforcement
- Duplicate detection and handling

Designed for HR departments and recruiting agencies that need reliable, scalable resume processing with enterprise-level monitoring and error recovery capabilities.
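As an illustration of the pre-logging validation described under Data Integrity Controls, a minimal Code node sketch could look like the following. The field names (name, email, score) are assumptions, not the template's exact schema.

```javascript
// Hypothetical n8n Code node: validate extracted candidate data before logging.
const item = $input.first().json;
const errors = [];

if (!item.name || item.name.trim().length === 0) errors.push('missing name');
if (!item.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(item.email)) errors.push('invalid email');

// Clamp the AI score into the expected 0-100 range and flag non-numeric values.
const score = Number(item.score);
if (Number.isNaN(score)) {
  errors.push('non-numeric score');
} else {
  item.score = Math.min(100, Math.max(0, score));
}

// Downstream, valid rows go to the main sheet and invalid ones to the error log.
return [{ json: { ...item, valid: errors.length === 0, validationErrors: errors } }];
```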
by Davide
This workflow is designed to automatically process AI news emails, extract and summarize articles, categorize them, and store the results in a structured Google Sheet for daily tracking and insights. It processes the daily AI newsletter from AlphaSignal, extracting individual articles, summarizing them, categorizing them, and saving the results to a Google Sheet.

**Key Features**
1. ✅ Fully Automated Daily News Pipeline - No manual work is required; the workflow runs autonomously every time a new email arrives. This eliminates repetitive human tasks such as opening, reading, and summarizing newsletters.
2. ✅ Cross-AI Model Integration - It combines multiple AI systems: **Google Gemini** and **OpenAI GPT-5 Mini** for natural language processing and categorization, and **Scrapegraph AI** for external web scraping and summarization. This multi-model approach enhances accuracy and flexibility.
3. ✅ Accurate Content Structuring - The workflow transforms unstructured email text into clean, structured JSON data, ensuring reliability and easy export or reuse.
4. ✅ Multi-Language Support - The summaries are generated in Italian, which is ideal for local or internal reporting, while the metadata and logic remain in English, enabling global adaptability.
5. ✅ Scalable and Extensible - New newsletters, categories, or destinations (such as Notion, Slack, or a database) can be added easily without changing the core logic.
6. ✅ Centralized Knowledge Repository - By appending to Google Sheets, the team can track daily AI developments at a glance, filter or visualize trends across categories, and use the dataset for further analysis or content creation.
7. ✅ Error-Resilient and Maintainable - The JSON validation and loop-based design ensure that if a single article fails, the rest continue to process smoothly.

**How it Works**
1. Email Trigger & Processing: The workflow is automatically triggered when a new email arrives from news@alphasignal.ai. It retrieves the full email content and converts its HTML body into clean Markdown for easier parsing.
2. Article Extraction & Scraping: A LangChain Agent, powered by Google Gemini, analyzes the newsletter's Markdown text. Its task is to identify and split the content into individual articles. For each article it finds, it outputs a JSON object containing the title, URL, and an initial summary. Crucially, the agent uses the "Scrape" tool to visit each article's URL and generate a more accurate summary in Italian based on the full page content.
3. Data Preparation & Categorization: The JSON output from the previous step is validated and split into individual data items, one per article (a sketch of this step appears at the end of this template). Each article is then processed in a loop: an OpenAI model analyzes the article's title and summary and assigns it to the most relevant pre-defined category (e.g., "LLM & Foundation Models", "AI Automation & WF"), and the article's link is sent to the CleanURI API to generate a shortened URL.
4. Data Storage: Finally, for each article, a new row is appended to a specified Google Sheet. The row includes the current date, the article's title, the shortened link, the Italian summary, and its assigned category.

**Set up Steps**
To implement this workflow, configure the following credentials and nodes in n8n:
- Email credentials: Set up a Gmail OAuth2 credential (named "Gmail account" in the workflow) to allow n8n to access and read emails from the specified inbox.
- AI model APIs: Configure the "Google Gemini(PaLM)" credential with a valid API key to power the initial article extraction and scraping agent, and the "OpenAi account (Eure)" credential with a valid API key to power the article categorization step.
- Scraping tool: Set up the ScrapegraphAI account credential with its required API key so the agent can access and scrape content from the article URLs.
- Google Sheets destination: Configure the "Google Sheets account" credential via OAuth2. You must also specify the exact Google Sheet ID and sheet name (tab) where the processed article data will be stored.
- Activation: Once all credentials are tested and correctly configured, activate the workflow. It will then run automatically upon receiving a new newsletter from the specified sender.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
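A minimal sketch of the validate-and-split step described in "Data Preparation & Categorization", assuming the agent's JSON lands in an `output` field; adapt the field names to your agent's actual output.

```javascript
// Hypothetical n8n Code node: validate the agent's output and split it into
// one item per article. The "output" field name is an assumption.
let articles;
try {
  const raw = $input.first().json.output;
  articles = typeof raw === 'string' ? JSON.parse(raw) : raw;
} catch (err) {
  throw new Error(`Agent output is not valid JSON: ${err.message}`);
}

if (!Array.isArray(articles)) articles = [articles];

// Keep only articles that have the fields the rest of the workflow relies on.
return articles
  .filter(a => a && a.title && a.url)
  .map(a => ({ json: { title: a.title, url: a.url, summary: a.summary ?? '' } }));
```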
by Pake.AI
**Overview**
This workflow converts a single topic into a full blog article through a structured multi-step process. Instead of generating everything in one pass, it breaks the task into clear stages to produce cleaner structure, better SEO consistency, and more predictable output quality.

**How this workflow differs from asking ChatGPT directly**
It does not produce an article in one step. It separates the process into two focused stages: outline generation and paragraph expansion. This approach gives you more control over tone, SEO, structure, and keyword placement.

**How it works**
1. Generate outline - The workflow sends your topic to an AI Agent, which returns a structured outline based on the topic, desired depth, language, and keyword focus.
2. Expand each subtopic - The workflow loops through each outline item. Every subtopic is expanded into a detailed, SEO-friendly paragraph. Output is consistent and optimized for readability.
3. Produce final outputs - Combines all expanded sections into a clean JSON object and a Markdown version ready for blogs or CMS (a sketch of this step appears at the end of this template). The JSON includes the title, HTML content, and Markdown content, so you can send it directly to REST APIs such as WordPress, Notion, or documentation platforms. Content is validated for readability and typically scores well in tools like Yoast SEO. Uses GPT-4o Mini by default, with average token usage between 2,000 and 3,000 depending on outline size.

**Use cases**
- Auto-generate long-form articles for blogs or content marketing.
- Turn Instagram or short-form scripts into complete SEO articles.
- Create documentation or educational content using consistent templates.

**Setup steps**
1. Prepare credentials - Add your OpenAI API key inside n8n's credential manager.
2. Adjust input parameters - Topic or main idea, number of outline items, language, primary keyword, and tone or writing style (optional).
3. Customize the workflow - Switch the model if you want higher quality or lower token usage, modify the prompt for the outline or paragraph generator to match your writing style, and add additional nodes if you want to auto-upload the final article to WordPress, Notion, or any API.
4. Run the workflow - Enter your topic, execute the workflow, and retrieve both JSON and Markdown outputs for immediate publishing.

If you need help expanding this into a full content pipeline or want to integrate it with other automation systems, feel free to customize further.
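A minimal sketch of the final merge step (step 3), assuming each loop item carries `heading` and `paragraph` fields and the first item holds the article title; your actual field names may differ.

```javascript
// Hypothetical n8n Code node: merge the expanded sections into the final
// JSON + Markdown outputs. Field names (articleTitle, heading, paragraph) are illustrative.
const sections = $input.all().map(i => i.json);
const title = sections[0]?.articleTitle ?? 'Untitled article';

// Markdown body: one H2 per outline item followed by its expanded paragraph.
const markdown = [`# ${title}`, ...sections.map(s => `## ${s.heading}\n\n${s.paragraph}`)].join('\n\n');

// Matching HTML body for CMSs that expect HTML (e.g., WordPress).
const html = sections
  .map(s => `<h2>${s.heading}</h2>\n<p>${s.paragraph}</p>`)
  .join('\n');

return [{ json: { title, markdown, html } }];
```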
by FlyCode
⚙️ Automated Stripe Failed Payment Recovery (with Postmark + AI Email Generator)
Recover failed Stripe subscription payments with AI-personalized emails sent via Postmark.

📝 Template Description
Recover failed subscription payments automatically with Stripe, Postmark, and AI. This workflow listens for Stripe invoice.payment_failed webhooks, checks that the event is related to an auto-charged subscription, and then automatically sends a personalized, AI-generated email to the customer. The email is polite and branded, but also urgent, encouraging the customer to pay quickly and avoid service cancellation.

🛠️ How it works
- 📣 Webhook - Listens for Stripe webhook events. Make sure to connect it in your Stripe dashboard (see setup below).
- 🧹 Filter (Guard) - Ensures the event is indeed an invoice event and filters out unrelated webhooks.
- 💡 Code Node - Extracts useful fields: firstName, lastName, customer email, amount, currency, invoice number, hosted invoice URL, subscription description, account name (a sketch appears at the end of this template).
- ✅ If Node - Verifies that the event type is invoice.payment_failed, the billing reason is subscription_cycle, and the collection method is charge_automatically. 👉 This ensures only recurring subscription invoices with auto-payment are processed.
- 🤖 AI Agent + OpenAI - Generates a ready-to-send email JSON (to, subject, HTML body) using the extracted Stripe data. ✍️ You can customize the prompt here to match your brand's tone of voice and style.
- 🧩 Code Parser - Parses the AI model's JSON output into fields (to_email, email_subject, email_body).
- 📧 HTTP Request (Postmark) - Sends the email using Postmark's API. You'll need your own Postmark Server Token, From address, and Message Stream.

🚀 Setup Instructions
1. Stripe Webhook - Go to Stripe Dashboard → Developers → Webhooks, click "+ Add endpoint", use your n8n Webhook URL (from the Webhook node) as the endpoint, select the event type invoice.payment_failed, then save and deploy. 👉 Example docs: Stripe: Listen to events with webhooks.
2. Disable Stripe's default failed payment emails - In Stripe, go to Billing → Settings → Customer emails → Manage failed payments and turn off "Failed payment" emails under the Revenue Recovery section. This prevents customers from receiving duplicate or conflicting emails.
3. Postmark setup - Create a Postmark account, add a Server, and copy the Server API Token. In n8n, add Postmark credentials with this token and configure From (your verified sending email, which must be verified in Postmark) and MessageStream (typically "outbound", or any custom stream you set up). Docs: Postmark API overview.
4. OpenAI setup - Add your OpenAI credentials in n8n and attach them to the OpenAI Chat Model node. You can modify the prompt in the AI Agent node to fit your company's style.

✨ Customization Tips
- Update the AI prompt with your brand's tone of voice (friendly, formal, playful, etc.).
- Adjust the HTML email design inside the prompt (button colors, footer, etc.).
- Add extra guard conditions (e.g., only trigger if invoice_amount > 0).
- Change the sending service: replace Postmark with Gmail, SMTP, or another provider.

💬 Or talk to our Billing Recovery Experts at flycode.com for hands-on help.

✅ Outcome
Whenever a customer's subscription payment fails, this workflow detects it instantly via Stripe, generates a polite but urgent recovery email, and sends it automatically via Postmark. Result: fewer cancellations, higher recovered revenue, and a smoother customer experience. 💸💌
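For orientation, a minimal sketch of the field-extraction Code node. It follows Stripe's standard invoice.payment_failed event envelope, but the exact mapping in the template may differ.

```javascript
// Hypothetical n8n Code node: pull the fields the email needs out of the
// Stripe invoice.payment_failed event delivered by the Webhook node.
const event = $input.first().json.body;
const invoice = event.data.object;

const [firstName, ...rest] = (invoice.customer_name ?? '').split(' ');

return [{ json: {
  firstName: firstName || 'there',
  lastName: rest.join(' '),
  customerEmail: invoice.customer_email,
  amount: (invoice.amount_due / 100).toFixed(2),  // Stripe amounts are in the smallest currency unit
  currency: (invoice.currency ?? '').toUpperCase(),
  invoiceNumber: invoice.number,
  hostedInvoiceUrl: invoice.hosted_invoice_url,
  billingReason: invoice.billing_reason,           // checked by the If node: subscription_cycle
  collectionMethod: invoice.collection_method,     // checked by the If node: charge_automatically
}}];
```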
by n8n Automation Expert | Template Creator | 2+ Years Experience
🔗 Automated Blockchain Transaction Audit System

Transform your blockchain compliance workflow with this enterprise-grade automation that monitors transactions across Ethereum and Solana networks, automatically generates professional audit reports, and maintains complete documentation trails.

🚀 What This Workflow Does
- 📊 **Multi-Chain Monitoring**: Real-time transaction tracking for Ethereum (via Alchemy API) and Solana networks
- 🤖 **AI-Powered Risk Analysis**: Intelligent scoring algorithm that evaluates transaction risk on a 0-100 scale (an illustrative scorer appears at the end of this template)
- 📄 **Automated PDF Generation**: Professional audit reports created instantly using APITemplate.io
- ☁️ **Cloud Storage Integration**: Seamless uploads to Google Drive with an organized folder structure
- 📋 **Database Management**: Automatic Notion database entries for complete audit trail tracking
- 📧 **Smart Notifications**: Multi-channel alerts to finance teams with detailed transaction summaries
- 🔒 **Compliance Verification**: Built-in KYC/AML checks and regulatory compliance monitoring

💼 Perfect For
- **FinTech companies** managing blockchain transactions
- **DeFi protocols** requiring audit documentation
- **Enterprise finance teams** handling crypto compliance
- **Blockchain auditors** automating report generation
- **Compliance officers** tracking regulatory requirements

🛠 Key Integrations
- **Alchemy API** - Ethereum transaction monitoring
- **Solana RPC** - Native Solana network access
- **APITemplate.io** - Professional PDF report generation
- **Google Drive** - Secure cloud document storage
- **Notion** - Comprehensive audit database
- **Email/SMTP** - Multi-recipient notification system
- **Etherscan/Solscan** - Smart contract verification

⚡ Technical Highlights
- **10 optimized nodes** with parallel processing capabilities
- **Sub-30-second processing** for complete audit cycles
- **Enterprise security** with credential management
- **Error handling** with automatic retry mechanisms
- **Scalable architecture** supporting 1000+ transactions/hour
- **Risk scoring algorithm** with customizable parameters

📊 Business Impact
- **80% cost reduction** in manual audit processes
- **95% error elimination** through automation
- **100% compliance coverage** with immutable audit trails
- **70% time savings** for finance teams

🔧 Setup Requirements
Before using this workflow, ensure you have:
- Alchemy API key for Ethereum monitoring
- APITemplate.io account with an audit report template
- Google Drive service account with folder permissions
- Notion workspace with a configured audit database
- SMTP credentials for email notifications
- Etherscan API key for contract verification

📈 Use Cases
- **Transaction compliance monitoring**: Automatic flagging of high-risk transactions
- **Regulatory reporting**: Scheduled audit report generation for authorities
- **Internal auditing**: Complete documentation for financial reviews
- **Risk management**: Real-time scoring and alert systems
- **Multi-chain portfolio tracking**: Unified reporting across blockchain networks

🎯 Why Choose This Workflow
This isn't just another blockchain monitor; it's a complete document management ecosystem that transforms raw blockchain data into professional, compliant documentation while maintaining enterprise-grade security and scalability. Perfect for organizations serious about blockchain compliance and audit trail management! 🚀

🔄 Workflow Process
1. Webhook Trigger receives the blockchain event
2. Parallel Monitoring queries the Ethereum and Solana networks
3. AI Processing analyzes the transaction data and calculates risk
4. Document Generation creates the professional PDF audit report
5. Multi-Channel Distribution uploads to Drive, logs in Notion, and sends notifications
6. Verification & Response confirms all processes completed successfully

Ready to automate your blockchain compliance? Import this workflow and transform your audit processes today! ✨
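The template's risk scoring parameters are customizable. As a purely illustrative sketch (not the published algorithm), a 0-100 scorer could look like the following; every threshold and input field here is an assumption.

```javascript
// Hypothetical n8n Code node: a simple 0-100 risk scorer for one transaction.
const tx = $input.first().json;
let score = 0;

// Large transfers carry more risk (valueUsd is an assumed, pre-computed field).
if (tx.valueUsd > 100000) score += 40;
else if (tx.valueUsd > 10000) score += 20;

// Interactions with unverified contracts add risk.
if (tx.contractVerified === false) score += 25;

// Brand-new counterparties add risk.
if (tx.counterpartyAgeDays !== undefined && tx.counterpartyAgeDays < 7) score += 20;

// Addresses on an internal watchlist add risk.
if (tx.onWatchlist) score += 15;

score = Math.min(100, score);
const riskLevel = score >= 70 ? 'high' : score >= 40 ? 'medium' : 'low';

return [{ json: { ...tx, riskScore: score, riskLevel } }];
```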
by Krishna Sharma
📄 Smart Lead Capture, Scoring & Slack Alerts

This workflow captures new leads from Typeform, checks for duplicates in HubSpot CRM, enriches and scores them, assigns priority tiers (Cold, Warm, Hot), and instantly notifies your sales team in Slack.

🔧 How It Works
1. Typeform Trigger → Monitors form submissions and passes lead details into the workflow.
2. HubSpot Deduplication → Searches HubSpot by email before creating a new record.
3. Conditional Routing → If no match is found, creates a new contact in HubSpot; if a match is found, updates the existing contact with fresh data.
4. Lead Scoring (Function Node) → Custom JavaScript assigns a score based on your rules (e.g., company email, job title, engagement signals, enrichment data).
5. Tier Assignment → Categorizes the lead as ❄️ Cold, 🌡 Warm, or 🔥 Hot based on score thresholds.
6. Slack Notification → Sends formatted lead alerts to a dedicated sales channel with priority indicators.

👤 Who Is This For?
- Sales teams who need to prioritize hot leads in real time.
- Marketing teams running inbound lead capture campaigns with Typeform.
- RevOps teams that want custom scoring beyond HubSpot defaults.
- Founders and SMBs looking to tighten the lead-to-revenue pipeline with automation.

💡 Use Case / Problem Solved
- ❌ Duplicate contacts clogging HubSpot CRM.
- ❌ Manual lead triage slows down response time.
- ❌ HubSpot's default scoring is rigid.
- ✅ Automates lead creation, scoring, and notification in one flow.
- ✅ Sales teams get immediate Slack alerts with context to act fast.

⚙️ What This Workflow Does
- Captures lead data directly from Typeform.
- Cleans and deduplicates contacts before pushing to HubSpot CRM.
- Scores and categorizes leads via custom logic.
- Sends structured lead alerts to Slack, tagged by priority.
- Provides a scalable foundation you can extend with data enrichment (e.g., Clearbit, Apollo).

🛠️ Setup Instructions

🔑 Prerequisites
- Typeform account with API access → Typeform Developer Docs
- HubSpot CRM account with API key or OAuth → HubSpot API Docs
- Slack workspace and API access → Slack API Docs
- (Optional) n8n automation platform to build and run → n8n Hub

📝 Steps to Configure
1. Typeform Node (Trigger) - Connect your Typeform account in n8n and select the form to track submissions. Fields typically include first name, last name, email, company, and phone.
2. HubSpot Node (Search Contact) - Configure a search by email and route the outcomes: Not Found → Create Contact, Found → Update Contact.
3. HubSpot Node (Create/Update Contact) - Map Typeform fields into HubSpot (email, name, phone, company). Ensure you capture both standard and custom properties.
4. Function Node (Lead Scoring) - Example JavaScript:

```javascript
// Simple lead scoring example
const email = $json.email || "";
let score = 0;
if (email.endsWith("@company.com")) score += 30;
if ($json.company && $json.company.length > 2) score += 20;
if ($json.phone) score += 10;

let tier = "❄️ Cold";
if (score >= 60) tier = "🔥 Hot";
else if (score >= 30) tier = "🌡 Warm";

return { ...$json, leadScore: score, leadTier: tier };
```

Customize the rules based on your GTM strategy. Reference → n8n Function Node Docs

5. Slack Node (Send Message) - Example Slack message template:

```
🚀 New Lead Alert!
👤 {{ $json.firstname }} {{ $json.lastname }}
📧 {{ $json.email }} | 🏢 {{ $json.company }}
📊 Score: {{ $json.leadScore }} — {{ $json.leadTier }}
```

Send to a dedicated #sales-leads channel. Reference → Slack Node in n8n

📌 Notes & Extensions
- 🔄 Add enrichment with Clearbit or Apollo.io before scoring.
- 📊 Use HubSpot workflows to trigger nurturing campaigns for ❄️ Cold leads.
- ⏱ For 🔥 Hot leads, auto-assign to an SDR using HubSpot deal automation.
- 🧩 Export data to Google Sheets or Airtable for analytics.
by Weiser22
Shopify Multilingual Product Copy with n8n & Gemini 2.5 Flash-Lite

Created by Weiser22 · Last update 2025-09-02
Categories: E-commerce, Product Content, Translation, Computer Vision

**Description**
Generate language-specific Shopify product copy (ES, DE, EN, FR, IT, PT) from each product's main image and metadata. The workflow performs a vision analysis to extract objective, verifiable details, then produces product names, descriptions, and handles per language, and stores the results in Google Sheets for review or publishing.

**Good to know**
- **Model:** models/gemini-2.5-flash-lite (supports image input). Confirm pricing/limits in your account before scaling.
- **Image requirement:** products should have images[0].src; add a fallback if some products lack a primary image.
- **Sheets mapping:** the sheet node uses Auto-map; ensure your matching column aligns with the field you emit (id vs product_id).
- **Strict output:** the Agent enforces a multilingual JSON contract (es, de, en, fr, it, pt), each with shopify_product_name, shopify_description, handle.

**How it works**
1. **Manual Trigger:** start a test run on demand.
2. **Get many products (Shopify):** fetch products and their images.
3. **Analyze image (Gemini Vision):** send images[0].src with an objective, 3-5 sentence prompt.
4. **AI Agent (Gemini Chat):** merge Shopify fields and vision text under anti-hallucination rules and a strict JSON schema.
5. **Structured Output Parser:** validates the exact JSON shape.
6. **Expand Languages & Sanitize (Code):** split into 6 items and normalize handles/HTML content as needed (a sketch appears at the end of this template).
7. **Append row in sheet (Google Sheets):** add one row per language to your spreadsheet.

**Requirements**
- Shopify Access Token with product read permissions.
- Google AI Studio (Gemini) API key for the Vision and Chat Model nodes.
- Google Sheets credentials (OAuth or Service Account) with access to the target spreadsheet.

**How to use**
1. Connect credentials: Shopify, Gemini (same key for Vision and Chat), and Google Sheets.
2. Configure nodes:
   - Get many products: adjust limit/filters.
   - Analyze image: verify ={{ $json.images[0].src }} resolves to a public image URL.
   - AI Agent & Parser: keep the strict JSON contract as provided.
   - Code (Expand & Sanitize): emits product_id, lang, handle, shopify_product_name, shopify_description, base_handle_es.
   - Google Sheets (Append): set documentId and tab name; confirm the matching column.
3. Run a test: execute the workflow and confirm six rows per product (one per language) appear in the sheet.

**Data contract (Agent output)**

```json
{
  "es": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "de": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "en": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "fr": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "it": {"shopify_product_name": "", "shopify_description": "", "handle": ""},
  "pt": {"shopify_product_name": "", "shopify_description": "", "handle": ""}
}
```

**Customising this workflow**
- **Publish to Shopify:** after review in Sheets, add a product.update step to write finalized copy/handles.
- **Handle policy:** tweak slug rules (diacritics, separators, max length) in the Code node to match store conventions.
- **No-image fallback:** add an IF/Switch to skip vision when images[0].src is missing and generate copy from title + body only.
- **Tone/length:** adjust temperature and token limits on the Chat Model for brand fit.
**Troubleshooting**
- **No rows in Sheets:** confirm the spreadsheet ID, tab name, Auto-map status, and that the matching column matches your emitted field.
- **Vision errors:** ensure images[0].src is reachable.
- **Parser failures:** the Agent must return **bare JSON** with the six root keys and three fields per language, with no extra text.
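A minimal sketch of the Expand Languages & Sanitize Code node, assuming the agent's JSON arrives in an `output` field; the template's actual slug and sanitization rules may differ.

```javascript
// Hypothetical n8n Code node: split the agent's six-language object into one
// item per language and normalize handles into Shopify-friendly slugs.
const agentOutput = $input.first().json.output;            // "output" field is an assumption
const productId = $('Get many products').first().json.id;

const slugify = (s) => s
  .normalize('NFD').replace(/[\u0300-\u036f]/g, '')  // strip diacritics
  .toLowerCase().replace(/[^a-z0-9]+/g, '-')         // non-alphanumerics -> hyphens
  .replace(/^-+|-+$/g, '')                           // trim leading/trailing hyphens
  .slice(0, 255);

const langs = ['es', 'de', 'en', 'fr', 'it', 'pt'];
return langs.map(lang => ({ json: {
  product_id: productId,
  lang,
  shopify_product_name: agentOutput[lang].shopify_product_name,
  shopify_description: agentOutput[lang].shopify_description,
  handle: slugify(agentOutput[lang].handle || agentOutput[lang].shopify_product_name),
  base_handle_es: slugify(agentOutput.es.handle || agentOutput.es.shopify_product_name),
}}));
```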
by Ali Amin
🎯 Accounting Alerts Automation

**Purpose:** Automatically track Companies House filing deadlines for UK accounting firms and prevent costly penalties (£150-£1,500 per missed deadline).

**How it works:**
- Daily automated checks pull live deadline data from the Companies House API
- Color-coded email alerts (Red/Orange/Yellow/Green) prioritize urgent deadlines
- Interactive "Yes/No" buttons let recipients confirm completion status
- All data syncs back to Google Sheets for a complete audit trail

**Value:** Saves 2-3 hours per week per firm while eliminating manual tracking errors.

⚙️ Daily Deadline Check & Alert System
Runs: every weekday at 5 PM (Mon-Fri)

What happens:
1. Read Company Database - Fetches all tracked companies from Google Sheets
2. Get Company Data - Pulls live filing deadlines from the Companies House API for each company (a sketch appears at the end of this template)
3. Update Due Dates - Syncs the latest deadline data back to the tracking sheet
4. Build Interactive Email - Creates an HTML email with color-coded urgency indicators (days remaining), a table sortable by due date, and clickable Yes/No confirmation buttons for each company
5. Send via Gmail - Delivers a consolidated report to the accounting team

Why automated: manually checking deadlines across 10-50+ companies is time-consuming and error-prone. This ensures nothing falls through the cracks.

✅ Email Response Handler (Webhook Flow)
Triggered when a recipient clicks the "Yes" or "No" button in the alert email.

What happens:
1. Webhook - Receives the confirmation status (company_number, company_name, yes/no)
2. Process Data - Extracts response details from the webhook payload
3. Update Sheet - Records the confirmation status in Google Sheets with a timestamp
4. Confirmation Page - Displays a success message to the user

Why this matters: provides instant feedback to the user and creates an audit trail of who confirmed what and when. No separate tracking system is needed; everything updates automatically in the same spreadsheet. Result: accountability without administrative burden.

📋 Setup Requirements

Google Sheets database structure - create a sheet with these columns:
- company_number (manually entered)
- company_name (manually entered)
- accounts_due (auto-updated)
- confirmation_due (auto-updated)
- confirmation_submitted (updated via email clicks)
- last_updated (auto-timestamp)

Required credentials:
- Google Sheets OAuth (for reading/writing data)
- Companies House API key (free from api.company-information.service.gov.uk)
- Gmail OAuth (for sending alerts)

Webhook configuration: update the webhook URL in the "Build Interactive Email" node to match your n8n instance.

Time to set up: ~15 minutes once credentials are configured.
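As a rough sketch of the per-company step, the following Code node turns a Companies House company profile response into due dates plus a color-coded urgency flag. The thresholds are illustrative, not the template's exact rules.

```javascript
// Hypothetical n8n Code node: process one company profile fetched from
// GET https://api.company-information.service.gov.uk/company/{company_number}
// (authenticated with the API key as the HTTP Basic username).
const profile = $input.first().json;

const accountsDue = profile.accounts?.next_due ?? null;
const confirmationDue = profile.confirmation_statement?.next_due ?? null;

const daysUntil = (dateStr) => {
  if (!dateStr) return null;
  return Math.ceil((new Date(dateStr) - new Date()) / (1000 * 60 * 60 * 24));
};

// Map days remaining onto the Red/Orange/Yellow/Green scheme (thresholds assumed).
const colorFor = (days) => {
  if (days === null) return 'grey';
  if (days <= 7) return 'red';
  if (days <= 14) return 'orange';
  if (days <= 30) return 'yellow';
  return 'green';
};

const accountsDays = daysUntil(accountsDue);

return [{ json: {
  company_number: profile.company_number,
  company_name: profile.company_name,
  accounts_due: accountsDue,
  confirmation_due: confirmationDue,
  days_remaining: accountsDays,
  urgency: colorFor(accountsDays),
  last_updated: new Date().toISOString(),
}}];
```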
by Sabrina Ramonov 🍄
**Description**
Fully automated pipeline: you send an email to yourself with a rough idea (subject contains "thread"), n8n's Gmail trigger picks it up, OpenAI ChatGPT rewrites it using a viral-thread template, and Blotato posts the long-form thread to X/Twitter, Bluesky, and Meta Threads (optionally scheduled, or with images/videos attached). The template is easily extensible to other social platforms.

**Who Is This For?**
Digital creators, content marketers, social media managers, agencies, entrepreneurs, and influencers who want fast, automated long-form thread posting.

📄 Documentation: Full Step-by-Step Tutorial

**How It Works**
1. Trigger: Gmail - Connect your Gmail account. n8n monitors emails sent from you and filters for subjects containing the word "thread".
2. AI Thread Writer: OpenAI ChatGPT - Connect your OpenAI account. Prompt ChatGPT to clean up your draft and format a long-form viral thread.
3. Publish to Social Media via Blotato - Connect your Blotato account and choose social accounts (X/Twitter, Threads, Bluesky). Schedule or post immediately. Supports optional image/video URLs via a mediaUrls array (publicly accessible URLs).

Example email to trigger the workflow:
Email subject: thread
Email body: I'm obsessed with voice AI apps. Super Whisper is my current favorite because it runs locally and keeps my voice data private. I talk to it instead of typing. Way faster.

**Setup & Required Accounts**
- Gmail account (used as the trigger) - n8n Gmail OAuth doc: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-single-service
- OpenAI Platform account (access to ChatGPT)
- Blotato account: https://blotato.com - sign in and generate a Blotato API key under Settings > API > Generate API Key (paid feature only); the API key is required for posting
- n8n: ensure "Verified Community Nodes" is enabled in your n8n Admin Panel, then install the "Blotato" community node and create Blotato credentials

**Optional: Media & Style Tweaks**
- Attach images/videos: insert publicly accessible URLs into the mediaUrls array (advanced).
- To emulate a specific tone/structure, give ChatGPT examples of your favorite viral threads or replace the example viral-thread prompt with your preferred example.
- Voice-to-text tip: record ideas (e.g., with Superwhisper) and send the transcript by email; ChatGPT will clean it up.

**Tips & Tricks**
- During testing, use "Scheduled Time" in Blotato instead of immediate posting to preview before going live.
- Start with a single social platform while testing.
- If your script is long or includes media, processing may take longer.
- Many users prefer speaking their ideas (voice notes) and then letting AI edit them; it's faster than typing.

**Troubleshooting**
- Check your Blotato API Dashboard to inspect each request, response, and error.
- Confirm API key validity, n8n node credentials, and that the emails you send have a subject containing "thread".

**Need Help?**
In the Blotato web app, click the orange support button in the bottom right to access Blotato support.
by Cadu | Ei, Doc!
This n8n template demonstrates how to automate blog post creation with AI and WordPress.

This workflow is designed for creators who want to maintain an active blog without spending hours writing, while still taking advantage of SEO benefits. It connects OpenAI and WordPress to help you schedule AI-generated posts or create content from simple one- or two-word prompts.

🧠 Good to know
- At the time of writing, each AI-generated post uses your OpenAI API credits according to your model and usage tier.
- This workflow requires an active WordPress site with API access and your OpenAI API key.
- Setup is quick: in less than 5 minutes, you can have everything running smoothly.

⚙️ How it works
- The workflow connects to your WordPress API and your OpenAI account.
- You can choose between two modes: scheduled mode, where AI automatically creates and publishes posts on your defined schedule, and prompt mode, where you enter a short phrase (one or two words) and let AI generate a complete SEO-optimized post.
- The generated content is formatted and published directly to your WordPress blog (a sketch of the post payload appears at the end of this template).
- You can easily customize prompts, post styles, or scheduling frequency to match your brand and goals.

🚀 How to use
- Start with the Manual Trigger node (as an example), or replace it with other triggers such as webhooks, cron jobs, or form submissions.
- Adjust your OpenAI prompts to fine-tune the tone, structure, or SEO focus of your posts.
- You can also extend this workflow to automatically share posts on social media or send notifications when new articles go live.

✅ Requirements
- Active **OpenAI API key**
- **WordPress site** with API access

🧩 Customising this workflow
AI-powered content creation can be adapted for many purposes. Try using it for:
- Automated content calendars
- Generating product descriptions
- Creating newsletter drafts
- Building SEO-focused blogs effortlessly
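A minimal sketch of the payload handed to WordPress, assuming your OpenAI step returns `title` and `html` fields (these names are assumptions about your prompt); the fields roughly mirror what the WordPress node and the REST API (`/wp-json/wp/v2/posts`) accept.

```javascript
// Hypothetical n8n Code node: shape the AI output into a WordPress post payload.
const ai = $input.first().json;

return [{ json: {
  title: ai.title,
  content: ai.html,          // SEO-optimized HTML body generated by OpenAI
  status: 'publish',         // use 'draft' while testing your prompts
  // Optional extras supported by the WordPress REST API:
  // excerpt: ai.metaDescription,
  // categories: [3],        // category IDs from your own WordPress site
}}];
```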
by Dahiana
Send personalized pet care tips from Google Sheets with AI

Automate weekly pet wellness emails with AI-generated, location- and age-specific advice.

**Who's it for**
Pet care businesses, veterinary clinics, pet subscription services, and animal shelters sending regular wellness content to pet owners.

**How it works**
1. Loads pet data from Google Sheets
2. Filters pets who haven't received an email in 7+ days
3. Calculates age from the birthdate, formatted as "2 years and 3 months" (a sketch appears at the end of this template)
4. AI generates a tip - GPT-4o-mini creates climate-aware, veterinary-aligned advice based on pet type, age, and location
5. Sends the email via Gmail or SendGrid
6. Updates the timestamp in the sheet to prevent duplicates
7. Logs the activity to a tracking sheet

**Requirements**
APIs:
- Google Sheets, Airtable, Typeform or similar
- OpenAI (GPT-4o-mini)
- Gmail OAuth2 OR SendGrid (you can also use Brevo, Mailchimp, or any other provider)

Google Sheet structure:

Sheet 1: Pets

| Email | Owner_Name | Pet_Name | Pet_Type | Date_of_Birth | Country (ISO) | Status | Last_Email_Sent |
|-------|------------|----------|----------|---------------|---------------|--------|-----------------|

Sheet 2: Email_Log

| Timestamp | Parent_Email | Pet_Name | Tip_Category | Status |
|-----------|--------------|----------|--------------|--------|

**How to set up**
1. Create a Google Sheet with the structure above and add 2-3 test pets.
2. Import the workflow and add credentials.
3. Update the nodes:
   - "Load Pet Info": set your Sheet ID
   - "Update Last_Email_Sent Date": set the Sheet ID
   - "Log to Email_Log Sheet": set the Sheet ID
4. Test manually with 1 active pet.
5. Enable the schedule (default: Mondays 9am).

**How to customize**
Switch email provider:
- Enable the "Send via SendGrid" node
- Disable the "Send Health Tip using Gmail" node
- Update the template ID

Modify the AI prompt:
- Edit the "Generate Personalized Tip" node
- Adjust the temperature
- Add or remove categories

**Use cases beyond pets**
The same workflow works for:
- **Plant care** (growth stage tips)
- **Baby milestones** (age-based parenting advice)
- **Fitness coaching** (experience level workouts)
- **Language learning** (study streak motivation)

Just update the sheet columns and the AI prompt.

**Notes**
- Choose only one mailing service.
- Country codes use ISO format (US, UK, AU, CA, etc.)
- The AI considers location for seasonal advice.
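A minimal sketch of the age-calculation step (step 3), assuming the sheet column is named Date_of_Birth as shown above.

```javascript
// Hypothetical n8n Code node: compute the pet's age from Date_of_Birth and
// format it as "2 years and 3 months" for the AI prompt.
const pet = $input.first().json;

const dob = new Date(pet.Date_of_Birth);
const now = new Date();

let months = (now.getFullYear() - dob.getFullYear()) * 12 + (now.getMonth() - dob.getMonth());
if (now.getDate() < dob.getDate()) months -= 1;   // don't count a month that isn't complete yet

const years = Math.floor(months / 12);
const remMonths = months % 12;

const plural = (n, word) => `${n} ${word}${n === 1 ? '' : 's'}`;
const ageText = years > 0
  ? `${plural(years, 'year')} and ${plural(remMonths, 'month')}`
  : plural(remMonths, 'month');

return [{ json: { ...pet, ageText } }];
```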