by Oneclick AI Squad
AI Daily Personal Briefing Engine

This workflow runs every morning to fetch your tasks, calendar events, emails, weather, and news — then uses AI to generate a clean, personalized daily briefing delivered via WhatsApp or email.

Who's it for

• Busy professionals who want a structured morning overview
• Remote workers managing multiple tools (Todoist, Calendar, Gmail)
• Anyone who wants an AI-powered daily digest

How it works / What it does

1. Triggers every morning at your chosen time
2. Fetches tasks from Todoist/Asana
3. Pulls today's calendar events
4. Reads unread/recent emails
5. Gets the current weather forecast
6. Fetches top news headlines
7. Aggregates all data into one context object (see the sketch at the end of this listing)
8. AI generates a concise, human-friendly briefing
9. Routes to WhatsApp or Email based on your preference
10. Logs the briefing summary to Google Sheets

How to set up

1. Import this workflow into n8n
2. Set up credentials: Todoist/Asana, Google Calendar, Gmail, OpenAI, SendGrid, Twilio (WhatsApp)
3. Replace placeholder API keys and IDs
4. Set your preferred delivery method in the Set node
5. Activate the workflow

Requirements

• Todoist or Asana account
• Google Calendar + Gmail
• OpenWeatherMap API key
• NewsAPI key
• OpenAI API key
• Twilio (WhatsApp) or SendGrid (Email)
• Google Sheets (for logging)

How to customize

• Change the cron time in the Schedule Trigger node
• Adjust AI tone/length in the AI Briefing node
• Add/remove data sources as needed
• Toggle the delivery method in the Set Delivery Preference node
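For reference, here is a minimal sketch of what the aggregation step (step 7 above) could look like as an n8n Code node. The node names and field shapes are assumptions; rename them to match your workflow.

```javascript
// Hypothetical n8n Code node: merge upstream sources into one context object.
// Node names ("Get Tasks", "Get Calendar", ...) are placeholders — adjust to
// match the actual node names in your workflow.
const tasks   = $('Get Tasks').all().map(i => i.json);
const events  = $('Get Calendar').all().map(i => i.json);
const emails  = $('Get Emails').all().map(i => i.json);
const weather = $('Get Weather').first().json;
const news    = $('Get News').all().map(i => i.json);

return [{
  json: {
    date: new Date().toISOString().slice(0, 10),
    tasks,                        // e.g. [{ content, due, priority }]
    events,                       // e.g. [{ summary, start, end }]
    emails: emails.slice(0, 10),  // cap the volume sent to the LLM
    weather,                      // e.g. { temp, description }
    headlines: news.slice(0, 5).map(n => n.title),
  },
}];
```

The single context object can then be stringified into the AI Briefing node's prompt in one expression.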
by PollupAI
Who’s it for

This workflow is built for B2B SaaS and CX teams that are drowning in unstructured customer feedback across tools. It’s ideal for Customer Success, Product and Support leaders who want a light “voice of customer engine” without rebuilding their stack: Gmail for interactions, Slack for conversations, Pipedrive for notes and Zendesk for tickets, plus Notion for follow-up tasks.

How it works / What it does

The workflow runs on a schedule or manual trigger and first sets the CSM’s email address. It then uses an AI “Data agent” to pull recent customer signals from multiple sources: Gmail messages, Slack messages, Pipedrive notes and Zendesk tickets. A “Signals agent” compresses each piece of feedback into a concise, neutral summary, which is then grouped by topic via a “Clustering agent”. Each cluster gets a label, count and examples. Finally, an “Action agent” routes clusters based on their label (see the routing sketch at the end of this listing):

- Create Zendesk tickets for product/performance issues
- Post to a dedicated Slack channel for billing/contract topics
- Create Notion tasks for sales-related feedback
- Send targeted Gmail messages to the CSM for high-risk or engagement-related items

How to set up

1. Import the workflow into n8n.
2. Connect credentials for Gmail, Slack, Pipedrive, Zendesk, Notion and OpenAI.
3. Update the CSM email in the “Set CSM email” node.
4. Adjust date filters, send-to addresses and Slack channel IDs as needed.
5. Enable the schedule trigger for weekly or daily digests.

Requirements

- Active accounts & credentials for Gmail, Slack, Pipedrive, Zendesk and Notion
- OpenAI (or compatible) API key for the LLM node
- At least one Slack channel for posting feedback (e.g. #billing-feedback)

How to customize the workflow

- Change the time window or filters (sender, channel, query) for each data source.
- Edit the clustering and routing prompts to match your own categories and teams.
- Add new destinations (e.g. Jira, HubSpot) by connecting more tools to the Action agent.
- Modify thresholds (e.g. minimum count) before a cluster triggers an action.
- Localize labels and email copy to your team’s language and tone.
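For clarity, here is an illustrative version of the label-based routing the Action agent performs. The label names and destinations are assumptions drawn from the description above; align them with your actual clustering prompt.

```javascript
// Illustrative routing logic for the Action agent step. The labels and
// destinations below are placeholders, not the template's exact values.
function routeCluster(cluster) {
  switch (cluster.label) {
    case 'product_issue':
    case 'performance':
      return { destination: 'zendesk', action: 'create_ticket' };
    case 'billing':
    case 'contract':
      return { destination: 'slack', channel: '#billing-feedback' };
    case 'sales_feedback':
      return { destination: 'notion', action: 'create_task' };
    case 'churn_risk':
    case 'engagement':
      return { destination: 'gmail', to: 'csm@example.com' };
    default:
      // Unrecognized labels fall back to the CSM so nothing is silently lost
      return { destination: 'gmail', to: 'csm@example.com' };
  }
}

// Example: { label: 'billing', count: 4, examples: [...] } → Slack post
```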
by Veena Pandian
Who is this for?

Founders, product managers, content strategists, indie hackers, and anyone who wants to automatically monitor tech industry trends across multiple sources — without manually browsing Hacker News and Product Hunt every day.

What this workflow does

This workflow scans public RSS feeds from Hacker News and Product Hunt daily, scores every item against configurable keyword groups (AI, SaaS, No-Code, Dev Tools, etc.), clusters the results into ranked themes, and delivers a prioritized intelligence report via Slack and email. All signals and themes are logged to Google Sheets for historical trend analysis.

How it works

1. Daily trigger fires on a configurable schedule (default: every 24 hours).
2. Fetches RSS feeds from Hacker News (posts with 50+ points) and Product Hunt in parallel.
3. Parses and normalizes all feed items — extracting titles, descriptions, URLs, and publish dates from RSS/Atom XML.
4. Scores each item against 7 weighted keyword groups. Title matches receive a bonus multiplier. Source weights (Hacker News 1.5x, Product Hunt 1.3x) amplify signals from higher-authority sources. A scoring sketch appears at the end of this listing.
5. Clusters into themes — groups scored items by primary category, calculates theme strength using source diversity and volume bonuses, and classifies each as VERY_STRONG, STRONG, MODERATE, or WEAK.
6. Builds an intelligence report with theme rankings, top 10 signals, and action items for surging topics. Generated in both plain text (Slack) and HTML (email).
7. Delivers and logs — posts to Slack, sends the HTML email, and appends both individual signals and theme summaries to separate Google Sheet tabs.

Setup steps

1. Connect Google Sheets OAuth2 credentials and update the Sheet ID in both "Log Signals to Sheet" and "Log Themes to Sheet" nodes.
2. Create a Google Sheet with two tabs:
   - signal — headers: date, title, source, score, category, url
   - themes — headers: date, category, signal_level, theme_strength, item_count, sources, top_keywords
3. Connect Slack OAuth2 credentials and configure your target channel in the "Post Report to Slack" node.
4. Connect Gmail OAuth2 credentials and update YOUR_EMAIL@EXAMPLE.COM in the "Email Daily Report" node.
5. Activate the workflow.

Requirements

- n8n instance (self-hosted or cloud)
- Google Cloud project with Sheets API enabled
- Slack workspace with a bot configured
- Gmail account with OAuth2 credentials (or swap for SMTP)
- No API keys needed for RSS feeds — they are publicly accessible

How to customize

- **Add more RSS feeds** — duplicate a feed node (e.g., TechCrunch, Reddit, Lobsters), connect it to the Merge node as an additional input, and add the parsing logic in the "Parse All RSS Feeds" code node.
- **Edit keyword groups** — modify the keywordGroups object in the "Score and Classify Signals" node. Add your industry-specific keywords, adjust weights, and rename categories.
- **Adjust source weights** — change the weight multipliers in the parser node to reflect which sources you trust most.
- **Theme thresholds** — modify the strength cutoffs (30 = VERY_STRONG, 15 = STRONG, 8 = MODERATE) in the "Aggregate Signals into Themes" node.
- **Schedule** — change from daily to hourly for real-time monitoring, or weekly for a digest format.
- **Add AI analysis** — insert an LLM node after the report builder to generate strategic commentary on detected trends.
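To make the scoring step concrete, here is a hedged sketch of the kind of logic the "Score and Classify Signals" node applies. The keyword groups, weights, and bonus values shown are placeholders, not the template's actual configuration.

```javascript
// Illustrative scoring logic, assuming the shapes described above. The real
// keyword groups live in the node's keywordGroups object.
const keywordGroups = {
  ai:       { weight: 3, terms: ['ai', 'llm', 'gpt', 'machine learning'] },
  saas:     { weight: 2, terms: ['saas', 'subscription', 'b2b'] },
  devtools: { weight: 2, terms: ['sdk', 'api', 'cli', 'framework'] },
  // ...four more groups in the actual node
};
const sourceWeights = { hackernews: 1.5, producthunt: 1.3 };
const TITLE_BONUS = 2; // title matches count extra (assumed multiplier)

function scoreItem(item) {
  const title = item.title.toLowerCase();
  const body = (item.description || '').toLowerCase();
  let score = 0, best = 0, primaryCategory = null;

  for (const [category, group] of Object.entries(keywordGroups)) {
    let catScore = 0;
    for (const term of group.terms) {
      if (title.includes(term)) catScore += group.weight * TITLE_BONUS;
      else if (body.includes(term)) catScore += group.weight;
    }
    if (catScore > best) { best = catScore; primaryCategory = category; }
    score += catScore;
  }
  score *= sourceWeights[item.source] ?? 1; // amplify higher-authority sources
  return { ...item, score, category: primaryCategory };
}
```

The primary category assigned here is what the theme-aggregation step later groups on, before applying the VERY_STRONG/STRONG/MODERATE cutoffs.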
by Samyotech
What this workflow does

This workflow implements a two-stage news automation system designed for reusable, topic-driven email delivery. News articles are continuously collected from multiple platforms using RSS feeds and stored in a vector database with semantic embeddings and category metadata.

Instead of fetching news on demand, the workflow separates daily ingestion from weekly delivery. This allows the same news data to be reused across different topics, audiences, or delivery schedules. On a weekly basis, relevant articles are retrieved from the vector store based on defined areas of interest and item limits. The selected news is then processed by an AI agent, which converts the raw articles into a structured, email-ready format before sending the final content to users.

How it works

1. News articles are collected daily from multiple RSS feeds
2. Articles are categorized and stored in a vector database
3. On a weekly trigger, topic preferences are evaluated
4. Relevant articles are retrieved using vector-based search (a retrieval sketch appears at the end of this listing)
5. An AI agent formats the content for email delivery
6. The email is sent to the user

Setup

To use this workflow, complete the following steps:

1. Add and configure your RSS feed sources
2. Connect a vector database and embedding model
3. Configure AI model credentials for content generation
4. Set up email service credentials
5. Define the weekly schedule and topic inputs
6. Test retrieval and email output

Customization

You can customize this workflow by:

- Adding or removing RSS feed sources
- Adjusting news categories or topic filters
- Changing the number of articles retrieved per topic
- Modifying the AI agent’s writing tone or structure
- Reusing the vector store for other content workflows
- Updating email frequency or delivery format

Requirements

- RSS feed URLs
- Vector database credentials
- AI model credentials
- Email service credentials
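A conceptual sketch of the weekly retrieval step is below. The vector-store client and its similaritySearch method are hypothetical stand-ins, since the template leaves the store choice open; adapt the calls to whichever database (Pinecone, Qdrant, pgvector, ...) you connect.

```javascript
// Conceptual sketch only — vectorStore and similaritySearch are hypothetical
// placeholders for your store's client and query method.
const topics = [
  { name: 'AI & ML',  query: 'artificial intelligence machine learning', limit: 5 },
  { name: 'Fintech',  query: 'payments banking fintech regulation',      limit: 3 },
];

async function retrieveWeeklyArticles(vectorStore) {
  const sections = [];
  for (const topic of topics) {
    // Semantic search: embeds the query and returns the nearest stored articles
    const hits = await vectorStore.similaritySearch(topic.query, topic.limit, {
      // Metadata filter keeping results inside the past week (assumed field)
      publishedAfter: Date.now() - 7 * 24 * 60 * 60 * 1000,
    });
    sections.push({ topic: topic.name, articles: hits });
  }
  return sections; // handed to the AI agent for email formatting
}
```

Because ingestion and retrieval are decoupled, the same stored articles can be queried again with different topic lists or limits without refetching any feeds.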
by Frederik Duchi
This n8n template demonstrates how to automatically create tasks (or, in general, records) in Baserow based on template or blueprint tables. The first blueprint table is the master table that holds the general information about the template, for example a standard procedure to handle incidents. The second table is the details table that holds multiple records for the template. Each record in that table is a specific task that needs to be assigned to someone with a certain deadline. This makes it easy to streamline task creation for recurring processes.

Use cases are many:

- Project management (generate tasks for employees based on a project template)
- HR & onboarding (generate tasks for employee onboarding based on a template)
- Operations (create checklists for maintenance, audits, or recurring procedures)

Good to know

- The Baserow template for handling Standard Operating Procedures works perfectly as a base schema to try out this workflow.
- Authentication is done through a database token. Check the documentation on how to create such a token.
- Tasks are inserted using the HTTP Request node instead of a dedicated Baserow node. This supports batch import instead of importing records one by one.

Requirements

- Baserow account (cloud or self-hosted)
- A Baserow database with at least the following tables:
  - Assignee / employee table. Required to assign someone to a task.
  - Master table with procedure or template information. Required to select a certain template.
  - Details table with all the steps associated with a procedure or template. Required to convert each step into a specific task. A step must have a field Days to complete with the number of days to complete the step. This field is used to calculate the deadline.
  - Tasks table that contains the actual tasks with an assignee and deadline.

How it works

**Trigger task creation (webhook)**
The automation starts when the webhook is triggered through a POST request. It should contain an assignee, template, date and note in the body of the request. It sends a success or failure response once all steps are completed.

**Configure settings and ids**
Stores the ids of the involved Baserow database and tables, together with the API credentials and the data from the webhook.

**Get all template steps**
Gets all the steps from the template Details table that are associated with the id of the Master template table. For example: the master template can have a record about handling customer complaints; the details table contains all the steps to handle this procedure.

**Calculate deadlines for each step**
Prepares the input of the tasks by using the same property names as the fields of the Tasks table. Adjust these names, and add or remove fields, if required for your database structure. The deadline of each step is calculated by adding the number of days a step can take to the deadline of the first step, via the Days to complete field in the template Details table. For example: if the schedule_date property in the webhook is set to 2025-10-01 and Days to complete for the step is 3, the deadline becomes 2025-10-04.

**Avoid scheduling during the weekend**
The calculated deadline may land on a Saturday or Sunday. This Code node moves those dates to the following Monday to avoid scheduling during the weekend. A sketch of both calculations follows below.
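A minimal sketch of the deadline calculation and weekend shift described above, assuming the schedule_date and Days to complete fields; the actual Code node may differ in naming and item shape.

```javascript
// Sketch: add the step's "Days to complete" to the schedule date, then shift
// weekend results to the following Monday. UTC methods avoid timezone drift.
function calculateDeadline(scheduleDate, daysToComplete) {
  const deadline = new Date(scheduleDate + 'T00:00:00Z');
  deadline.setUTCDate(deadline.getUTCDate() + daysToComplete);

  // Shift Saturday (6) or Sunday (0) to the following Monday
  const day = deadline.getUTCDay();
  if (day === 6) deadline.setUTCDate(deadline.getUTCDate() + 2);
  if (day === 0) deadline.setUTCDate(deadline.getUTCDate() + 1);

  return deadline.toISOString().slice(0, 10);
}

// Example from the description: 2025-10-01 + 3 days = 2025-10-04 (a Saturday),
// which the weekend rule then moves to Monday 2025-10-06.
```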
**Aggregate tasks for insert**
Aggregates the data from the previous nodes as an array in a property named items. This matches perfectly with the Baserow API for inserting new records in batch.

**Generate tasks in batch**
Calls the API endpoint /api/database/rows/table/{table_id}/batch/ to insert multiple records at once in the tasks table. Check the Baserow API documentation for further details. A request sketch appears at the end of this listing.

**Success / Error response**
Sends a simple text response to indicate the success or failure of the record creation. This offers feedback when triggering the automation from a Baserow application, but can be replaced with a JSON response.

How to use

1. Call the Trigger task creation node with the required parameters through a POST request. This can be done from any web application. For example: the application builder in Baserow supports an action to send an HTTP request; the Procedure details page in the Standard Operating Procedures template demonstrates this action. The following information is required in the body of the request to create the actual tasks:

```
{
  "assignee_id": integer referring to the id of the assignee in the database,
  "template_id": integer referring to the id of the template or procedure in the master table,
  "schedule_date": the date the tasks need to start scheduling,
  "note": text with an optional note about the tasks
}
```

2. Set the corresponding ids in the Configure settings and ids node.
3. Check the names of the properties in the Calculate deadlines for each step node. Make sure the names of those properties match the field names of your Tasks table.
4. You can replace the text message in the Success response and Failure response nodes with a more structured format if necessary in your application.

Customising this workflow

- Add support for public holidays (e.g., using an external calendar API).
- Modify the task assignment logic (e.g., pre-assign tasks in the details table).
- Combine with notifications (email, Slack, etc.) to alert employees when new tasks are generated.
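For reference, a sketch of the batch insert call. The request shape follows the Baserow batch endpoint mentioned above; verify the details (such as the user_field_names query parameter) against the Baserow API documentation.

```javascript
// Sketch of the batch insert, assuming a Baserow database token and the
// items array produced by the aggregation step.
async function insertTasksBatch(tableId, databaseToken, items) {
  const res = await fetch(
    `https://api.baserow.io/api/database/rows/table/${tableId}/batch/?user_field_names=true`,
    {
      method: 'POST',
      headers: {
        Authorization: `Token ${databaseToken}`,
        'Content-Type': 'application/json',
      },
      // e.g. items = [{ Name, Assignee, Deadline, Note }, ...]
      body: JSON.stringify({ items }),
    }
  );
  if (!res.ok) throw new Error(`Batch insert failed: ${res.status}`);
  return res.json(); // created rows, including their ids
}
```

Inserting all steps in a single request keeps the workflow fast and atomic compared with looping a create-row node once per task.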
by Oneclick AI Squad
This AI-powered workflow transcribes Zoom/Google Meet recordings, extracts decisions and tasks using AI, then creates tickets in Jira/ClickUp/Linear and assigns them to team members automatically.

How it works

1. Trigger - Receives the meeting recording URL via webhook or schedule
2. Download Recording - Fetches the audio/video file from Zoom/Google Meet
3. Audio Extraction - Converts video to audio if needed using FFmpeg
4. Transcription - Uses the Whisper API to transcribe the meeting audio
5. Wait & Process - Allows the transcription to complete
6. Parse Transcript - Cleans and formats the transcription text
7. AI Analysis - Claude extracts action items, decisions, and owners
8. Team Member Matching - Maps names to user IDs in your project tools (a matching sketch appears at the end of this listing)
9. Create Tasks - Generates tickets in Jira/ClickUp/Linear
10. Assign & Notify - Assigns tasks to team members and sends notifications
11. Meeting Summary - Saves the full summary to Google Drive/Notion
12. Response - Returns the processed action items and task links

Setup Steps

1. Import this workflow into your n8n instance
2. Configure credentials:
   - Zoom OAuth - For downloading Zoom recordings
   - Google OAuth - For Google Meet recordings and Drive storage
   - OpenAI API - For the Whisper transcription service
   - Anthropic API - For Claude AI analysis
   - Jira/ClickUp/Linear API - For task creation
   - Slack/Teams - For notifications (optional)
3. Set up team member mapping in the config node
4. Configure your project management tool preferences
5. Activate the workflow

Sample Trigger Payload

```json
{
  "meetingSource": "zoom",
  "recordingUrl": "https://zoom.us/rec/share/...",
  "meetingTitle": "Q1 Planning Meeting",
  "meetingDate": "2024-01-15",
  "attendees": ["alice@company.com", "bob@company.com", "charlie@company.com"],
  "projectKey": "PROJ-123",
  "taskTool": "jira",
  "defaultPriority": "medium",
  "autoAssign": true,
  "sendNotifications": true,
  "saveToNotion": false,
  "saveToDrive": true,
  "extractDecisions": true,
  "extractRisks": true,
  "dueDate": "2024-01-22"
}
```

Features

- **Multi-platform support** (Zoom, Google Meet, MS Teams recordings)
- **Accurate transcription** using the OpenAI Whisper API
- **AI-powered extraction** of action items, decisions, risks, and next steps
- **Automatic task creation** in Jira, ClickUp, or Linear
- **Smart assignment** - maps attendee names to task assignees
- **Meeting summaries** - saves comprehensive notes to Drive/Notion
- **Slack/Teams notifications** - alerts team members of new tasks
- **Duplicate detection** - prevents creating duplicate tickets
- **Priority detection** - AI assigns urgency levels to tasks
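As an illustration of the Team Member Matching step, a small sketch follows. The mapping table, names, and IDs are placeholders for whatever you configure in the config node.

```javascript
// Illustrative name-to-assignee matching. All aliases, emails, and IDs below
// are placeholders — populate them from your config node.
const teamMap = [
  { names: ['alice', 'alice johnson'], email: 'alice@company.com', jiraId: 'acc-123' },
  { names: ['bob', 'bob smith'],       email: 'bob@company.com',   jiraId: 'acc-456' },
];

function matchAssignee(extractedName) {
  const needle = extractedName.trim().toLowerCase();
  // Exact alias match first, then a loose "starts with" fallback
  return (
    teamMap.find(m => m.names.includes(needle)) ??
    teamMap.find(m => m.names.some(n => needle.startsWith(n))) ??
    null // unmatched → leave the task unassigned rather than guessing
  );
}
```

Returning null for unmatched names is a deliberate safety choice: a wrongly assigned ticket is harder to catch than an unassigned one.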
by Rahul Joshi
📊 Description

The scoreboard shows you what happened. This workflow tells you why it happened.

Every time an IPL match ends, this automation detects the completed result, fetches the full scorecard, and sends it to GPT-4o, which produces a detailed journalist-style post-match analysis — innings breakdowns, tactical decisions, key turning points, player of the match, and what the result means for both teams. Every Monday it also generates a weekly roundup digest covering all the week's matches in one beautifully designed email.

Built for sports media companies, IPL fan platforms, cricket newsletters, and automation agencies who want to produce expert-level match analysis at scale without a dedicated editorial team.

What This Workflow Does

⏰ Polls CricAPI every 30 minutes for recently completed IPL matches
📋 Checks the Match Log sheet to avoid analyzing the same match twice
🏏 Detects new completed IPL matches and saves them to the Match Log
🧮 Computes both innings' run rates and builds a structured analytical prompt
🤖 Sends full match context to GPT-4o, which generates a complete post-match analysis
📧 Assembles the analysis into a branded HTML email and sends it immediately after the match
📝 Logs every analysis to the Analysis Log sheet with match name, winner, and player of the match
📊 Every Monday, reads all analyses from the past 7 days and generates a weekly roundup
🤖 GPT-4o writes the weekly digest with match recaps, player of the week, and a next-week preview
📧 Sends the weekly roundup as a branded HTML email every Monday at 9AM

Key Benefits

✅ Fully automatic — detects match completion and triggers analysis without manual input
✅ Duplicate prevention — never analyzes the same match twice
✅ GPT-4o writes like a cricket journalist, not a data report
✅ Two email formats — an immediate post-match deep dive and a weekly roundup digest
✅ Complete audit trail across two Google Sheets
✅ Falls back to any completed T20 when IPL is off-season, so testing always works
✅ Clean termination on both IF nodes — no dangling branches

How It Works

SW1 — Match Completion Detector

Every 30 minutes the workflow fetches all current and recent matches from CricAPI and reads the Match Log sheet. The Code node filters for completed IPL T20 matches by checking that the match name contains IPL or Indian Premier League, the match type is T20, and both matchStarted and matchEnded are true. It then compares every completed match against the set of already-analyzed match IDs in the Match Log. If a new unanalyzed match is found, it is saved to the Match Log with analyzed set to false, and all scorecard data flows forward into the analysis engine. If no new match is found, an IF node stops the workflow cleanly.

SW2 — Deep Dive Analyzer

The match data flows directly from SW1 into the analysis prompt builder. The Code node computes run rates for both innings (see the sketch at the end of this listing) and assembles a structured prompt containing both innings' stats, the match result, and clear instructions for GPT-4o to act as a cricket journalist. GPT-4o returns a headline, a 3-4 sentence match summary, separate tactical breakdowns for each innings, three key moments that decided the match, an overall tactical assessment, player of the match with reasoning, and a one-sentence forward-looking note. The response is parsed and assembled into a branded HTML email with a dark blue header, score display, color-coded analysis sections, and a player of the match spotlight.
The email is sent immediately, and both Google Sheets are updated to record that the match has been analyzed and the email sent.

SW3 — Weekly Digest

Every Monday at 9AM the workflow reads all rows from the Analysis Log and filters for entries from the past 7 days. If matches exist, GPT-4o generates a weekly roundup covering the week's headline, individual one-liner recaps for each match, player of the week, the biggest talking point or controversy, and a preview of the upcoming week. The response is assembled into a branded weekly roundup email and sent. If no matches were analyzed in the past 7 days, the workflow stops cleanly without sending a blank email.

Features

- 30-minute polling for match completion detection
- Dynamic IPL match filtering — no hardcoded IDs
- Duplicate prevention via Match Log sheet lookup
- Both innings' run rate computation
- GPT-4o post-match analysis with 8 structured output fields
- Immediate post-match email delivery
- Weekly Monday digest with recaps, POTW, talking point, and preview
- Two branded HTML email templates with a dark blue cricket theme
- Two Google Sheets for match tracking and analysis history
- IF nodes with No Operation fallbacks on both SW1 and SW3
- Fallback to any completed T20 for off-season testing

Requirements

- CricAPI account and API key — free tier at cricapi.com
- OpenAI API key (GPT-4o access)
- Google Sheets OAuth2 connection
- Gmail OAuth2 connection

Setup Steps

1. Sign up at cricapi.com and get your free API key
2. Create a Google Sheet called "IPL Post Match Analyzer" with 2 sheets — Match Log and Analysis Log
3. Add the correct column headers to both sheets, then paste your Sheet ID into all Google Sheets nodes
4. Connect Google Sheets OAuth2 credentials
5. Add your OpenAI API key to both OpenAI nodes
6. Add your Gmail OAuth2 credentials and set your email in both Gmail nodes
7. Activate the workflow — the system runs itself from here

Target Audience

📺 Sports media companies automating post-match editorial content
🏏 IPL cricket newsletters and fan platforms delivering expert analysis
🤖 Automation agencies building cricket intelligence products for media and franchise clients
📱 Fan apps that want to surface match analysis without hiring a commentary team
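The run-rate computation referenced in SW2 can be sketched as follows. Cricket overs use ball notation (19.3 means 19 overs and 3 balls), so the fractional part converts to sixths; the exact CricAPI field names are assumptions.

```javascript
// Run-rate sketch: convert ball notation to true overs before dividing.
function runRate(runs, overs) {
  const whole = Math.floor(overs);
  const balls = Math.round((overs - whole) * 10); // digit after the decimal point
  const totalOvers = whole + balls / 6;
  return +(runs / totalOvers).toFixed(2);
}

// Example: 186 runs in 19.3 overs → 186 / 19.5 = 9.54 runs per over
console.log(runRate(186, 19.3)); // 9.54
```

Treating 19.3 as the decimal 19.3 instead of 19.5 overs would understate the rate, which is why the conversion step matters in the prompt builder.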
by Yassin Zehar
Description

Automatically triage product UAT feedback with AI, deduplicate it against your existing Notion backlog, create or update the right Notion item, and close the loop with the tester (Slack or email).

This workflow standardizes incoming UAT feedback, runs AI classification (type, severity, summary, suggested title, confidence), searches Notion to prevent duplicates, and upserts the roadmap entry for product review. It then confirms receipt to the tester and returns a structured webhook response.

Context

Feature requests often arrive unstructured and get lost across channels. Product teams waste time re-triaging the same ideas, creating duplicates, and manually confirming receipt. This workflow ensures:

- Faster feature request triage
- Fewer duplicates in your roadmap/backlog
- Consistent structure for every feedback item
- Automatic tester acknowledgement
- Full traceability via the webhook response

Who is this for?

- Product Managers running UAT or beta programs
- Product Ops teams managing a roadmap backlog
- Teams collecting feature requests via forms, Slack, or internal tools
- Anyone who wants AI speed with clean backlog hygiene

Requirements

- Webhook trigger (form / Slack / internal tool)
- OpenAI account (AI triage)
- Notion account (roadmap/backlog database)
- Slack and/or Gmail (tester notification)

How it works

1. Trigger: feedback received via webhook
2. Normalize & Clean: standardizes fields and cleans the message
3. AI Triage: returns structured JSON (type, severity, title, confidence…) — an illustrative output shape appears at the end of this listing
4. Notion Dedupe & Upsert: search by suggested title → update if found, else create
5. Closed Loop: notify the tester (Slack or email) + webhook response payload

What you get

- One workflow to capture and structure feature requests
- A clean Notion backlog without duplicates
- Automatic tester confirmation
- Structured output for downstream automation

About me: I’m Yassin, a Product Manager scaling tech products with a data-driven mindset.
📬 Feel free to connect with me on LinkedIn
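For orientation, here is an illustrative shape for the AI triage output. The field names echo the description above, but the exact keys depend on your prompt and parser.

```json
{
  "type": "feature_request",
  "severity": "medium",
  "suggested_title": "Allow CSV export of UAT results",
  "summary": "Tester wants to export test-run results as CSV for offline review.",
  "confidence": 0.87
}
```

The suggested_title field doubles as the dedupe key: the Notion search step looks for an existing page with that title before deciding between update and create.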
by Yuvraj Singh
Purpose

This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. It is a two-way sync with partial support for recurring tasks.

How it works

The realtime sync consists of two workflows, both triggered by a registered webhook from either Notion or Todoist. To avoid overwrites by late-arriving webhook calls, the current task is retrieved from both sides every time. Redis is used to prevent endless loops, since an update in one system would otherwise trigger another webhook call. Using the ID of the task, the trigger is locked for 80 seconds (see the sketch at the end of this listing). Depending on the detected changes, the other side is updated accordingly. Generally, Notion is treated as the main source. An "Obsolete" status guarantees that tasks never get deleted entirely by accident. The Todoist ID is stored in the Notion task, so the two stay linked together.

An additional full sync workflow runs daily and fixes any inconsistencies that occurred, since webhooks cannot be trusted entirely. Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook. Another tiny workflow helps generate a global config, which is used by all workflows for mapping purposes.

Mapping (Notion >> Todoist)

- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

Current limitations

- Changes to the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
- Subtasks are not linked automatically to their parent yet.
- Task names do not support URLs yet.

Credentials

Follow the videos to set up credentials for Notion (access token), Todoist (access token) and Redis:

- Todoist: follow this video to obtain the API token (Todoist Credentials.mp4).
- Notion: follow this video to get the Notion integration secret.
- Redis: follow this video to set up Redis.

Setup

The setup involves quite a lot of steps, yet many of them can be automated for business-internal purposes. Just follow the video or do the following:

1. Set up credentials for Notion (access token), Todoist (access token) and Redis - you can also create empty credentials and populate them later during further setup.
2. Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance - otherwise you need to map the credentials of many nodes.
3. Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

How to use

You can apply changes (create, update, delete) to tasks both in Notion and Todoist, which then get synced over within a couple of seconds (handled by the differential realtime sync). The daily running full sync resolves possible discrepancies in Todoist.

This workflow incorporates ideas and techniques inspired by Mario (https://n8n.io/creators/octionic/), whose expertise with specific nodes helped shape parts of this automation. Significant enhancements and customizations have been made to deliver a unique and improved solution.
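The 80-second lock can be sketched roughly as below, assuming node-redis style commands; the key naming is illustrative.

```javascript
// Loop-prevention sketch: lock per task ID for 80 seconds so the echo webhook
// fired by our own update is ignored instead of syncing back endlessly.
async function acquireSyncLock(redis, taskId) {
  // SET key value NX EX 80 — succeeds only if the key does not exist yet,
  // and expires automatically after 80 seconds.
  const result = await redis.set(`sync-lock:${taskId}`, '1', {
    NX: true,
    EX: 80,
  });
  return result === 'OK'; // null → a sync for this task is already in flight
}

// Usage inside the webhook branch:
// if (!(await acquireSyncLock(redis, task.id))) return; // skip echo updates
```

The atomic NX+EX combination is what makes this safe: there is no window between checking the key and setting it where a second webhook could slip through.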
by Itunu
Automatically detect unsubscribe replies in your outreach emails and clean your Google Sheets contact list, keeping your domain reputation and deliverability strong.

🎯 Who it’s for

This template is designed for freelancers, lead generation specialists, and outreach managers, particularly those running email outreach campaigns for clients or personal lead-gen projects. If you send cold emails, manage multiple lead lists, or handle outreach at scale, this workflow helps you automatically manage unsubscribe requests to maintain healthy email deliverability and protect your domain reputation.

⚙️ How it works

1. Trigger: The workflow starts when a new reply is received in your Gmail inbox.
2. Intent Detection: The email text is analyzed using OpenAI to detect unsubscribe intent (“unsubscribe”, “remove me”, “opt out”, etc.).
3. Normalization & Filtering: A Code node verifies the AI output for accuracy and ensures the result is standardized as either "unsubscribe" or "keep" (a sketch appears at the end of this listing).
4. Check & Update Sheets: If the contact requested removal, the workflow checks your Unsubscribe Sheet to see if they’re already listed. If not, the contact is added to the Unsubscribe Sheet and simultaneously removed from your Main Outreach Sheet.
5. Optional Gmail Label: Adds an “Unsubscribe” tag in Gmail for quick visual tracking (optional customization).

🧩 Requirements

To run this workflow, you’ll need:

- **Gmail Credentials** → for reading incoming replies and applying labels.
- **Google Sheets Credentials** → to manage both the “Main” and “Unsubscribe” spreadsheets.
- **OpenAI API Key** → used for detecting unsubscribe intent from message text.

All credentials can be added through the n8n Credentials Manager.

🧠 How to Customize

- **Polling Time:** Adjust how often the Gmail Trigger checks for new replies (default: every 5 minutes).
- **Sheets:** Replace the linked Google Sheets IDs with your own. You can change sheet names and columns freely.
- **Intent Rules:** Modify the Code node to include additional unsubscribe phrases or alternate keywords.
- **Optional Gmail Tagging:** Enable or disable tagging for unsubscribed messages.
- **Secondary Validation:** Enable the second “If” check after the OpenAI node to double-confirm unsubscribe intent before moving contacts.

💡 Why this workflow matters

By automatically managing unsubscribe requests, you:

- Respect recipients’ opt-out preferences
- Reduce spam complaints
- Protect your domain reputation and increase deliverability
- Save hours of manual list cleaning

This is a must-have automation for anyone running cold email outreach, especially freelancers managing multiple client inboxes.

🪄 Quick Setup Tips

1. Replace all "Gmail account" and "Google Service Account account" credential references with your actual credentials.
2. Ensure your sheet has an EMAIL column for lookup.
3. Test with a few mock replies before activating for production.
4. Optional: Add a time-based trigger to run the sheet cleanup periodically.
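A hedged sketch of the normalization Code node follows. The phrase list is illustrative; the template's actual node may verify differently.

```javascript
// Coerce the model's free-form answer to exactly "unsubscribe" or "keep".
const UNSUB_PHRASES = ['unsubscribe', 'remove me', 'opt out', 'opt-out', 'take me off'];

function normalizeIntent(aiOutput, emailText) {
  const answer = (aiOutput || '').trim().toLowerCase();
  if (answer === 'unsubscribe' || answer === 'keep') return answer;

  // Fallback: if the model returned anything other than the two expected
  // labels, verify directly against the raw email text
  const text = (emailText || '').toLowerCase();
  return UNSUB_PHRASES.some(p => text.includes(p)) ? 'unsubscribe' : 'keep';
}
```

Forcing a closed set of outputs here is what keeps the downstream If node and sheet updates deterministic, even when the LLM gets verbose.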
by Gilbert Onyebuchi
This workflow leverages n8n to automate LinkedIn content creation from start to finish. Upload an image and quote through a web form, and get a professionally designed post with AI-generated captions, ready to publish in seconds.

Features

- Randomly selects from 6 professional design templates for visual variety
- Converts HTML designs to high-quality images (90-95% JPEG quality)
- Generates engaging captions using OpenAI's GPT models
- Built-in caption editor for customization before posting
- Direct publishing to LinkedIn profiles or company pages
- Auto-compresses images for optimal LinkedIn upload

Prerequisites

- n8n Instance: A running n8n instance (cloud or self-hosted)
- OpenAI API: Active account with API access for caption generation
- LinkedIn Account: Profile or company page with API access
- Image Conversion API: HTML CSS to Image account
- Web Hosting: Platform to host the web form (Netlify, Vercel, or a custom server)

Setup Instructions

1. Deploy Web Form
   - Download the provided web form template
   - Host it on your preferred platform
   - Copy both webhook URLs from your n8n workflow
   - Update the form's webhook endpoints with your n8n URLs

2. Configure Image Conversion
   - Sign up at htmlcsstoimage.com
   - Get your API credentials (User ID + API Key)
   - Add them to the HTTP Request node as Basic Auth credentials (a request sketch appears at the end of this listing)

3. Connect OpenAI API
   - Create an API key at the OpenAI Platform
   - In the ChatGPT HTTP Request node, add a header parameter:
     Key: Authorization
     Value: Bearer YOUR_API_KEY
   - Recommended model: gpt-4 or gpt-3.5-turbo

4. Authenticate LinkedIn
   - Create a LinkedIn OAuth2 credential in n8n
   - Follow the authentication flow and grant the required permissions
   - Select the credential in the "Create a post" LinkedIn node
   - Choose the post destination (personal profile or company page)

5. Test the Workflow
   - Submit test data through the web form
   - Monitor the n8n execution panel for successful completion
   - Verify image generation, caption quality, and LinkedIn posting
   - Adjust settings as needed based on results

Notes

- Processing time averages 10-20 seconds from upload to preview
- All 6 design templates are fully responsive and LinkedIn-optimized
- The caption editor allows full customization before publishing to LinkedIn
- For questions or issues, please contact me for consulting and support: LinkedIn.

🔗 Test with sample data first. Access Web Form Template
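For step 2, the conversion call typically looks like the sketch below, following the HTML/CSS to Image API's documented pattern of POSTing to hcti.io/v1/image with Basic auth. Treat the exact endpoint and fields as assumptions to verify against htmlcsstoimage.com's current docs.

```javascript
// Sketch of the HTML-to-image conversion call. userId and apiKey come from
// your htmlcsstoimage.com account; html/css is the selected design template
// populated with the uploaded image and quote.
async function renderPostImage(userId, apiKey, html, css) {
  const res = await fetch('https://hcti.io/v1/image', {
    method: 'POST',
    headers: {
      Authorization: 'Basic ' + Buffer.from(`${userId}:${apiKey}`).toString('base64'),
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ html, css }),
  });
  if (!res.ok) throw new Error(`Image render failed: ${res.status}`);
  const data = await res.json();
  return data.url; // hosted image URL, ready for the LinkedIn upload step
}
```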
by Nabin Bhandari
Who’s it for

This template is for businesses, customer support teams, and professionals who want to deliver AI-powered WhatsApp assistance. It helps automate conversations, schedule meetings, answer FAQs, and send follow-up emails — all from WhatsApp.

How it works

A customer sends a WhatsApp message, which is captured by the Twilio Trigger. The incoming text is formatted and passed to the AI Support Agent. Based on the request, the agent can:

- Manage Google Calendar events (create, list, delete).
- Answer questions from your knowledge base (Supabase + embeddings).
- Draft and send emails via Gmail.
- Reply directly on WhatsApp with the appropriate response (a reply sketch appears at the end of this listing).

How to set up

1. Connect your Twilio account with WhatsApp enabled.
2. Add your OpenAI API key.
3. Connect Google Calendar.
4. Connect Gmail.
5. Configure Supabase for knowledge base storage.

Requirements

- Twilio account (with a WhatsApp number)
- OpenAI API key
- Google Calendar
- Gmail account
- Supabase project

How to customize

- Update the Set Fields node with your Twilio number, API keys, and Gmail details.
- Add custom documents to Supabase for domain-specific FAQs.
- Adjust the AI prompts for different roles (e.g., booking bot, HR assistant, customer support).
- Extend by adding more tools (CRM, Slack, Notion, etc.) as needed.
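As a reference for the WhatsApp reply step, here is a sketch using Twilio's standard Messages API shape; the account SID, auth token, and phone numbers are placeholders.

```javascript
// Send the agent's answer back over WhatsApp via Twilio's Messages API.
async function replyOnWhatsApp(accountSid, authToken, to, body) {
  const res = await fetch(
    `https://api.twilio.com/2010-04-01/Accounts/${accountSid}/Messages.json`,
    {
      method: 'POST',
      headers: {
        Authorization: 'Basic ' + Buffer.from(`${accountSid}:${authToken}`).toString('base64'),
        'Content-Type': 'application/x-www-form-urlencoded',
      },
      body: new URLSearchParams({
        From: 'whatsapp:+14155238886', // your Twilio WhatsApp number (placeholder)
        To: `whatsapp:${to}`,          // customer number, e.g. "+4915123456789"
        Body: body,                    // the AI Support Agent's answer
      }),
    }
  );
  if (!res.ok) throw new Error(`Twilio send failed: ${res.status}`);
  return res.json();
}
```

In the actual template this call is handled by the Twilio node; the sketch simply shows what that node does under the hood.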