by Miha
This n8n template turns raw call transcripts into clean HubSpot call logs and a single, actionable follow-up task, automatically. Paste a transcript and the contact’s email; the workflow finds the contact, summarizes the conversation in 120–160 words, proposes the next best action, and (optionally) updates missing contact fields. Perfect for reps and founders who want accurate CRM hygiene without the manual busywork.

## How it works

1. A form trigger collects two inputs: the contact email and a plain-text call transcript.
2. The workflow looks up the HubSpot contact by email to pull known properties.
3. An AI agent reads the transcript (plus known fields) to:
   - Extract participants, role, problem/opportunity, requirements, blockers, timeline, and metrics.
   - Write a 120–160 word recap a teammate can skim.
   - Generate one concrete follow-up task (title + body).
   - Suggest updates for missing contact properties (city, country, job title, job function).
4. The recap is logged to HubSpot as a completed Call engagement.
5. The follow-up is created in HubSpot as a Task with subject and body.
6. (Optional) The contact record is updated using AI-suggested values if the transcript clearly mentions them.

## How to use

1. Connect HubSpot (OAuth2) on all HubSpot nodes.
2. Connect OpenAI on the AI nodes.
3. Open **Form: Capture Transcript** and submit the email + transcript.
4. (Optional) In **AI: Summarize Call & Draft Task**, tweak prompt rules (word count, date normalization).
5. (Optional) In **Update Contact from Transcript**, review the mapped fields before enabling in production.
6. Activate the workflow and paste transcripts after each call.

## Requirements

- **HubSpot** (OAuth2) for contact search, call logging, and tasks
- **OpenAI** for summarization and task drafting

## Notes & customization ideas

- Swap the form for a Google Drive or S3 watcher to ingest saved transcripts.
- Add a speech-to-text step if you store audio recordings.
- Extend **Update Contact** to include additional fields (timezone, department, seniority).
- Post the summary to Slack or email the AE for quick handoffs.
- Gate updates with a confidence check, or route low-confidence changes for manual approval.
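One way to gate contact updates on confidence is a small Code node between the AI agent and the HubSpot update step. The sketch below is illustrative: the suggestion shape (`field`, `value`, `confidence`) is an assumption, not part of the template.

```javascript
// Hypothetical confidence gate: auto-apply AI-suggested contact updates
// above a threshold, route the rest to a manual-approval branch.
const THRESHOLD = 0.8;

function gateUpdates(suggestions) {
  const autoApply = [];
  const needsReview = [];
  for (const s of suggestions) {
    // Each suggestion: { field: 'city', value: 'Berlin', confidence: 0.93 }
    if (typeof s.confidence === 'number' && s.confidence >= THRESHOLD) {
      autoApply.push(s);
    } else {
      needsReview.push(s);
    }
  }
  return { autoApply, needsReview };
}

const result = gateUpdates([
  { field: 'city', value: 'Berlin', confidence: 0.93 },
  { field: 'job_title', value: 'CTO', confidence: 0.41 },
]);
// result.autoApply holds the city update; the job title goes to review.
```

Feed `needsReview` to a Slack approval message instead of the HubSpot node.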
by Asfandyar Malik
**Short description:** Automatically collect and analyze your competitors’ YouTube performance. This workflow extracts video titles, views, likes, and descriptions from any YouTube channel and saves the data to Google Sheets, helping creators spot viral trends and plan content that performs.

## Who’s it for

Content creators, YouTubers, and marketing teams who want to track what’s working for their competitors without manually checking their channels every day.

## How it works

This workflow automatically collects data from any YouTube channel you enter. You just write the channel name in the form; n8n fetches the channel ID, gets all recent video IDs, and extracts each video’s title, views, likes, and description. Finally, all the information is saved neatly into a connected Google Sheet for analysis.

## How to set up

1. Create a Google Sheet with columns for Title, Views, Likes, Description, and URL.
2. Connect your Google account to n8n.
3. Add your YouTube Data API key inside the HTTP Request nodes (use n8n credentials, not hardcoded keys).
4. Update your form submission or trigger node to match your input method.
5. Execute the workflow once to test and verify that data is flowing into your sheet.

## Requirements

- YouTube Data API key
- Google Sheets account
- n8n cloud or self-hosted instance

## How to customize

You can modify the JavaScript code node to include more metrics (like comments or publish date), filter by keywords, or change the output destination (e.g., Airtable or Notion).
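As a starting point for the keyword-filtering customization, here is a minimal Code-node sketch. The item shape (`json.title`, `json.views`) is an assumption; adjust it to match what your HTTP Request node actually outputs.

```javascript
// Keep only videos whose title matches one of your keywords.
const KEYWORDS = ['tutorial', 'review'];

function filterByKeywords(items, keywords) {
  return items.filter((item) => {
    const title = (item.json.title || '').toLowerCase();
    return keywords.some((kw) => title.includes(kw.toLowerCase()));
  });
}

const sample = [
  { json: { title: 'n8n Tutorial for Beginners', views: 1200 } },
  { json: { title: 'My Daily Vlog', views: 300 } },
];
const kept = filterByKeywords(sample, KEYWORDS);
// kept contains only the tutorial video
```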
by Rahul Joshi
**Description:** Transform your Jira project management workflow with this intelligent n8n automation template that continuously tracks, scores, and reports the health of Jira Epics. The automation runs every 6 hours, fetches all active Epics, analyzes linked issues for performance, quality, and stability metrics, and automatically flags at-risk Epics. It updates Jira fields, sends alerts to Slack, logs trends in Google Sheets, and syncs visibility with Monday.com, ensuring teams stay proactive, not reactive. Ideal for agile teams, project managers, and product owners looking to monitor delivery health, detect risks early, and maintain transparent reporting across tools.

## ✅ What This Template Does (Step-by-Step)

- ⏱ **Trigger Every 6 Hours:** automatically executes every six hours to keep health data updated in near real-time.
- 📥 **Fetch All Epics from Jira:** retrieves all Epics, their keys, and fields via the Jira API to establish a full analysis scope.
- 🔀 **Split Epics for Processing:** converts the batch of Epics into individual items, enabling sequential metric analysis.
- 🔗 **Fetch Linked Issues:** collects all issues linked to each Epic, capturing their types, statuses, cycle times, and labels for deeper health analysis.
- 📈 **Calculate Health Score:** computes a weighted score (0–1 scale) based on:
  - 40% average cycle time
  - 30% bug ratio
  - 20% churn (reopened issues)
  - 10% blocker ratio
  Scores above 0.6 indicate at-risk Epics.
- ⚖️ **Decision Gate (At-Risk or Healthy):** if the health score exceeds 0.6, the workflow automatically initiates corrective actions.
- 🔧 **Update Jira Epic:** updates Jira with the computed health score and adds an “At Risk” label for visibility in dashboards and filters.
- 🚨 **Send Slack Alerts:** notifies the #project-alerts channel with Epic details, health score, and direct Jira links for immediate attention.
- 📋 **Update Monday.com Pulse:** syncs health metrics and risk status back to your Monday board, maintaining cross-platform transparency.
- 📊 **Log to Google Sheets:** appends health-score logs with timestamps and Epic keys for trend analysis, audits, and dashboard creation.

## 🧠 Key Features

✔️ Automated Jira Epic health scoring (cycle time, churn, bugs, blockers)
✔️ Real-time risk flagging with Slack alerts
✔️ Integrated cross-tool visibility (Jira + Monday + Sheets)
✔️ Continuous trend tracking for performance improvement
✔️ Secure API-based automation

## 💼 Use Cases

💡 Track project delivery health and spot risks early
📈 Build executive dashboards showing team velocity and quality
🤝 Align product and engineering with shared visibility
🧾 Maintain a compliance audit trail of Epic health trends

## 📦 Required Integrations

- Jira Software Cloud API – for Epic and issue data
- Slack API – for real-time team alerts
- Monday.com API – for visual board updates
- Google Sheets API – for historical tracking and analytics

## 🎯 Why Use This Template?

✅ Prevents project delays by flagging risks early
✅ Provides automated, data-driven Epic health insights
✅ Connects your reporting ecosystem across platforms
✅ Perfect for Agile and DevOps teams driving continuous improvement
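The weighted health score described in the step list can be sketched as a small Code-node function. The weights and the 0.6 threshold come from the template; how each raw metric is normalized to a 0–1 scale is an assumption, and the workflow's own Code node may differ.

```javascript
// Weighted Epic health score: 40% cycle time, 30% bug ratio,
// 20% churn, 10% blocker ratio. Each input metric is assumed to be
// pre-normalized to 0–1, where higher means worse.
const WEIGHTS = { cycleTime: 0.4, bugRatio: 0.3, churn: 0.2, blockerRatio: 0.1 };
const AT_RISK_THRESHOLD = 0.6;

function healthScore(metrics) {
  const score =
    WEIGHTS.cycleTime * metrics.cycleTime +
    WEIGHTS.bugRatio * metrics.bugRatio +
    WEIGHTS.churn * metrics.churn +
    WEIGHTS.blockerRatio * metrics.blockerRatio;
  return { score, atRisk: score > AT_RISK_THRESHOLD };
}

const result = healthScore({ cycleTime: 0.9, bugRatio: 0.5, churn: 0.4, blockerRatio: 0.2 });
// 0.4*0.9 + 0.3*0.5 + 0.2*0.4 + 0.1*0.2 ≈ 0.61 → flagged as at risk
```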
by Trung Tran
# AWS IAM Inactive User Automation Alert Workflow

> Weekly job that finds IAM users with no activity for > 90 days and notifies a Slack channel.
> ⚠️ Important: AWS SigV4 for IAM must be scoped to us-east-1. Create the AWS credential in n8n with region us-east-1 (even if your other services run elsewhere).

## Who’s it for

- SRE/DevOps teams that want automated IAM hygiene checks.
- Security/compliance owners who need regular inactivity reports.
- MSPs managing multiple AWS accounts who need lightweight alerting.

## How it works / What it does

1. **Weekly scheduler** – kicks off the workflow (e.g., every Monday 09:00).
2. **Get many users** – lists IAM users.
3. **Get user** – enriches each user with details (password status, MFA, etc.).
4. **Filter bad data** – drops service-linked users or items without usable dates.
5. **IAM user inactive for more than 90 days?** – keeps users whose last activity is older than 90 days. Last activity is derived from any of:
   - `PasswordLastUsed` (console sign-in)
   - `AccessKeyLastUsed.LastUsedDate` (from `GetAccessKeyLastUsed`, if you add it)
   - Fallback to `CreateDate` if no usage data exists (optional)
6. **Send a message (Slack)** – posts an alert for each inactive user.
7. **No operation** – path for users that don’t match (do nothing).

## How to set up

**Credentials**

- AWS (Predefined → AWS)
  - Service: iam
  - Region: us-east-1 ← required for IAM
  - Access/Secret (or Assume Role) with read-only IAM permissions (see below).
- Slack OAuth (bot in your target channel).

## Requirements

- n8n (current version).
- **AWS IAM permissions** (minimum): `iam:ListUsers`, `iam:GetUser`. Optional, for higher fidelity: `iam:ListAccessKeys`, `iam:GetAccessKeyLastUsed`.
- Slack bot with permission to post in the target channel.
- Network egress to iam.amazonaws.com.

## How to customize the workflow

- **Change window:** set 60/120/180 days by adjusting `minus(N, 'days')`.
- **Audit log:** append results to Google Sheets/DB with UserName, Arn, LastActivity, CheckedAt.
- **Escalation:** if a user remains inactive for another cycle, mention @security or open a ticket.
- **Auto-remediation (advanced):** on a separate approval path, disable access keys or detach policies.
- **Multi-account / multi-region:** iterate a list of AWS credentials (one per account; IAM stays us-east-1).
- **Exclude list:** add a static list or tag-based filter to skip known service users.

## Notes & gotchas

- Many users never sign in; if you don’t pull `GetAccessKeyLastUsed`, they may look “inactive”. Add that call for accuracy.
- `PasswordLastUsed` is null if console login never happened.
- IAM returns timestamps in ISO or epoch; use `toDate`/`toDateTime` before comparisons.
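The inactivity check can be sketched as plain JavaScript, deriving last activity from `PasswordLastUsed`, an access-key date, and `CreateDate` as described in the workflow. The field names follow the IAM API; the exact item shape inside n8n is an assumption.

```javascript
// Derive last activity and test the "> 90 days inactive" condition.
const WINDOW_DAYS = 90;

function lastActivity(user) {
  const candidates = [
    user.PasswordLastUsed,
    user.AccessKeyLastUsedDate, // only present if you add GetAccessKeyLastUsed
    user.CreateDate,            // optional fallback
  ].filter(Boolean);
  if (candidates.length === 0) return null; // the "bad data" filter path
  // Use the most recent of the available timestamps.
  return new Date(Math.max(...candidates.map((d) => new Date(d).getTime())));
}

function isInactive(user, now = new Date()) {
  const last = lastActivity(user);
  if (last === null) return false; // already dropped earlier in the workflow
  const ageDays = (now - last) / (1000 * 60 * 60 * 24);
  return ageDays > WINDOW_DAYS;
}

const stale = isInactive(
  { PasswordLastUsed: '2024-01-01T00:00:00Z' },
  new Date('2024-12-01T00:00:00Z')
);
// roughly 335 days without activity → true
```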
by Yaron Been
# CDO Agent with Data Analytics Team

## Description

Complete AI-powered data analytics department with a Chief Data Officer (CDO) agent orchestrating specialized data team members for comprehensive data science, business intelligence, and analytics operations.

## Overview

This n8n workflow creates a comprehensive data analytics department using AI agents. The CDO agent analyzes data requests and delegates tasks to specialized agents for data science, business intelligence, data engineering, machine learning, data visualization, and data governance.

## Features

- Strategic CDO agent using OpenAI O3 for complex data strategy and decision-making
- Six specialized data analytics agents powered by GPT-4.1-mini for efficient execution
- Complete data analytics lifecycle coverage from collection to insights
- Automated data pipeline management and ETL processes
- Advanced machine learning model development and deployment
- Interactive data visualization and business intelligence reporting
- Comprehensive data governance and compliance frameworks

## Team Structure

- **CDO Agent**: data strategy leadership and team delegation (O3 model)
- **Data Scientist Agent**: statistical analysis, predictive modeling, machine learning algorithms
- **Business Intelligence Analyst Agent**: business metrics, KPI tracking, performance dashboards
- **Data Engineer Agent**: data pipelines, ETL processes, data warehousing, infrastructure
- **Machine Learning Engineer Agent**: ML model deployment, MLOps, model monitoring
- **Data Visualization Specialist Agent**: interactive dashboards, data storytelling, visual analytics
- **Data Governance Specialist Agent**: data quality, compliance, privacy, governance policies

## How to Use

1. Import the workflow into your n8n instance
2. Configure OpenAI API credentials for all chat models
3. Deploy the webhook for chat interactions
4. Send data analytics requests via chat (e.g., "Analyze customer churn patterns and create predictive models")
5. The CDO will analyze and delegate to appropriate specialists
6. Receive comprehensive data insights and deliverables

## Use Cases

- **Predictive Analytics**: customer behavior analysis, sales forecasting, risk assessment
- **Business Intelligence**: KPI tracking, performance analysis, strategic business insights
- **Data Engineering**: pipeline automation, data warehousing, real-time data processing
- **Machine Learning**: model development, deployment, monitoring, and optimization
- **Data Visualization**: interactive dashboards, executive reporting, data storytelling
- **Data Governance**: quality assurance, compliance frameworks, data privacy protection

## Requirements

- n8n instance with LangChain nodes
- OpenAI API access (O3 for the CDO, GPT-4.1-mini for specialists)
- Webhook capability for chat interactions
- Optional: integration with data platforms and analytics tools

## Cost Optimization

- O3 model used only for strategic CDO decisions and complex data strategy
- GPT-4.1-mini provides substantial cost reduction for specialist data tasks
- Parallel processing enables simultaneous agent execution
- Template libraries reduce redundant analytics development work

## Integration Options

- Connect to data platforms (Snowflake, BigQuery, Redshift, Databricks)
- Integrate with BI tools (Tableau, Power BI, Looker, Grafana)
- Link to ML platforms (AWS SageMaker, Azure ML, Google AI Platform)
- Export to business applications and reporting systems

*Disclaimer: This workflow is provided as a building block for your automation needs. Please review and customize the agents, prompts, and connections according to your specific data analytics requirements and organizational structure.*

## Contact & Resources

- **Website**: nofluff.online
- **YouTube**: @YaronBeen
- **LinkedIn**: Yaron Been

## Tags

#DataAnalytics #DataScience #BusinessIntelligence #MachineLearning #DataEngineering #DataVisualization #DataGovernance #PredictiveAnalytics #BigData #DataDriven #DataStrategy #AnalyticsAutomation #DataPipelines #MLOps #DataQuality #BusinessMetrics #KPITracking #DataInsights #AdvancedAnalytics #n8n #OpenAI #MultiAgentSystem #DataTeam #AnalyticsWorkflow #DataOperations
by Oneclick AI Squad
This automated n8n workflow streamlines invoice creation and payment reminders. It generates invoices on a monthly schedule, sends reminders for overdue payments, and updates records in Google Sheets.

## Good to Know

- Supports monthly invoice generation and daily overdue checks
- Integrates with Google Sheets for data management
- Uses email notifications for invoice delivery and reminders
- Includes logging for tracking and auditing
- Features multiple reminder types based on overdue duration

## How It Works

**Invoice creation flow:**

1. **Monthly Invoice Trigger** – initiates the workflow on a set monthly schedule
2. **Get Clients for Invoicing** – reads client data from the Google Sheet
3. **Filter Active Clients** – filters out inactive clients
4. **Generate Invoice Data** – creates invoice details in the required format
5. **Save Invoice to Google Sheets** – appends or updates the invoice record in the sheet
6. **Send Invoice Email** – sends the invoice to the client via email
7. **Log Invoice Creation** – logs invoice creation for records/auditing

**Reminder flow:**

1. **Daily Payment Reminder Check** – triggers the workflow daily to check overdue invoices
2. **Get Overdue Invoices** – reads overdue invoices from the Google Sheet
3. **Filter Overdue Invoices** – filters invoices that are still unpaid
4. **Calculate Reminder Type** – calculates how many days overdue each invoice is
5. **Switch Reminder Type** – decides which type of reminder to send
6. **Send Gentle / Follow-up / Urgent / Final Notice** – sends the respective reminder email
7. **Update Reminder Log** – updates the reminder status in the sheet

## How to Use

1. Import the workflow into n8n
2. Configure the Google Sheets API for data access
3. Set up an email service for notifications
4. Define the monthly schedule for the invoice trigger
5. Test with sample client data and monitor reminders
6. Adjust reminder thresholds as needed

## Requirements

- Access to the Google Sheets API
- Email service configuration
- Scheduled trigger setup in n8n

**Sheet columns:** Client Name, Invoice ID, Amount, Due Date, Status, Reminder Type, Last Updated

## Customizing This Workflow

- Modify the invoice generation schedule
- Adjust reminder email templates
- Configure custom Google Sheet columns
- Set custom overdue thresholds
- Integrate additional notification methods
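The "Calculate Reminder Type" step maps days overdue to one of the four reminder tiers. Here is a minimal sketch; the 7/14/30-day thresholds are assumptions (the template says thresholds are adjustable).

```javascript
// Map days overdue to a reminder tier: gentle → follow-up → urgent → final notice.
const TIERS = [
  { maxDays: 7, type: 'gentle' },
  { maxDays: 14, type: 'follow-up' },
  { maxDays: 30, type: 'urgent' },
  { maxDays: Infinity, type: 'final-notice' },
];

function reminderType(dueDate, now = new Date()) {
  const daysOverdue = Math.floor((now - new Date(dueDate)) / (1000 * 60 * 60 * 24));
  if (daysOverdue <= 0) return null; // not overdue yet
  return TIERS.find((t) => daysOverdue <= t.maxDays).type;
}

const t = reminderType('2024-01-01', new Date('2024-01-10'));
// 9 days overdue → 'follow-up'
```

The returned type can feed directly into the Switch node that picks the email template.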
by Rahul Joshi
## Description

This workflow automates the process of retrieving Stripe invoices, validating API responses, generating payment receipts, sending them via email, storing PDFs in Google Drive, and appending details to a Google Sheets ledger. It also includes an error-logging system to capture and record workflow issues, ensuring financial operations are both automated and reliable.

## What This Template Does (Step-by-Step)

- 📋 **Manual Trigger** – start the workflow manually by clicking Execute workflow.
- 🔗 **Fetch Invoices** – authenticates with Stripe and retrieves the 5 most recent invoices (includes customer info, amounts, statuses, and invoice URLs).
- ✅ **Check API Response** – ensures the Stripe API response contains a valid `data[]` array. If not, errors are logged.
- 📂 **Expand List** – splits Stripe’s bundled invoice list into individual invoice records for independent processing.
- 💳 **IF (Paid?)** – routes invoices based on payment status; only paid invoices move forward.
- 📧 **IF (Already Receipted?)** – skips invoices where a receipt has already been generated (`receipt_sent = true`).
- 📑 **Download File** – downloads the hosted invoice PDF from Stripe for use in emails and archiving.
- ✉️ **Send Receipt Email** – emails the customer a payment receipt with the PDF attached, using invoice details (number, amount, customer name).
- ☁️ **Upload Invoice PDF** – uploads the invoice PDF to a specific Google Drive folder, named by invoice number.
- 📊 **Append to Ledger** – updates a Google Sheet with invoice metadata (date, invoice number, Drive file ID, link, size).
- ⚠️ **Error Logging** – logs workflow issues (failed API calls, missing data, etc.) into a dedicated error-tracking sheet.
## Prerequisites

- Stripe API key (with invoice read permissions)
- Google Drive (destination folder for invoices)
- Google Sheets with:
  - Receipts Ledger sheet
  - Error Logging sheet
- Gmail OAuth2 account for sending receipts

## Key Benefits

✅ Automates customer receipt delivery with attached PDFs
✅ Builds a permanent ledger in Google Sheets for finance
✅ Archives invoices in Google Drive for easy retrieval
✅ Prevents duplicates by checking `receipt_sent` metadata
✅ Includes error logging for smooth monitoring and debugging

## Perfect For

- Finance/accounting teams needing automated receipt handling
- SaaS businesses managing recurring Stripe invoices
- Operations teams requiring error-proof automation
- Any business needing audit-ready receipts + logs
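The "Check API Response" and routing steps can be sketched as one small function: validate that the response carries a `data[]` array, then keep only paid invoices that have not yet been receipted. The `receipt_sent` metadata key is from the template; the rest of the shape mirrors Stripe's invoice list response.

```javascript
// Validate the Stripe list response and route invoices for receipting.
function routeInvoices(response) {
  if (!response || !Array.isArray(response.data)) {
    return { error: 'Invalid Stripe response: missing data[] array', invoices: [] };
  }
  const invoices = response.data.filter(
    (inv) => inv.status === 'paid' && inv.metadata?.receipt_sent !== 'true'
  );
  return { error: null, invoices };
}

const out = routeInvoices({
  data: [
    { id: 'in_1', status: 'paid', metadata: {} },
    { id: 'in_2', status: 'open', metadata: {} },
    { id: 'in_3', status: 'paid', metadata: { receipt_sent: 'true' } },
  ],
});
// only in_1 moves forward: in_2 is unpaid, in_3 is already receipted
```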
by Evoort Solutions
# 🔗 Automated Semrush Backlink Checker with n8n and Google Sheets

## 📘 Description

This n8n workflow automates backlink data extraction using the Semrush Backlink Checker API available on RapidAPI. By submitting a website via a simple form, the workflow fetches both backlink overview metrics and detailed backlink entries, saving the results directly into a connected Google Sheet. This is an ideal solution for SEO professionals who want fast, automated insights without logging into multiple tools.

## 🧩 Node-by-Node Explanation

- **On form submission** – starts the workflow when a user submits a website URL through a web form.
- **HTTP Request** – sends the URL to the **Semrush Backlink Checker API** using a POST request with headers and form data.
- **Reformat 1** – extracts high-level backlink overview data like total backlinks and referring domains.
- **Reformat 2** – extracts individual backlink records such as source URLs, anchors, and metrics.
- **Backlink overview** – appends overview metrics into the "backlink overview" tab of a Google Sheet.
- **Backlinks** – appends detailed backlink data into the main "backlinks" tab of the same Google Sheet.

## ✅ Benefits of This Workflow

- **No-code integration**: built entirely within n8n, no scripting required.
- **Time-saving automation**: eliminates the need to manually log in or export reports from Semrush.
- **Centralized results**: all backlink data is organized in Google Sheets for easy access and sharing.
- **Powered by RapidAPI**: uses the Semrush Backlink Checker API hosted on RapidAPI for fast, reliable access.
- **Easily extendable**: can be enhanced with notifications, dashboards, or additional data enrichment.

## 🛠️ Use Cases

- 📊 **SEO Audit Automation** – auto-generate backlink insights for multiple websites via form submissions.
- 🧾 **Client Reporting** – streamline backlink reporting for SEO agencies or consultants.
- 📥 **Lead Capture Tool** – offer a free backlink analysis tool on your site to capture leads while showcasing value.
- 🔁 **Scheduled Backlink Monitoring** – modify the trigger to run on a schedule for recurring reports.
- 📈 **Campaign Tracking** – monitor backlinks earned during content marketing or digital PR campaigns.

## 🔐 How to Get Your API Key for the Semrush Backlink Checker API

1. Go to 👉 Semrush Backlink Checker API - RapidAPI
2. Click "Subscribe to Test" (you may need to sign up or log in).
3. Choose a pricing plan (there’s a free tier for testing).
4. After subscribing, click on the "Endpoints" tab.
5. Your API key will be visible in the "x-rapidapi-key" header.
6. 🔑 Copy and paste this key into the HTTP Request node in your workflow.

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and keep your backlink reporting running effortlessly!
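The HTTP Request node's POST call can be understood as a form-encoded request with RapidAPI headers. The sketch below is hypothetical: the host and body field names are placeholders, so copy the exact values from the API's "Endpoints" tab, and keep the key in an n8n credential rather than hardcoding it.

```javascript
// Build the request shape the HTTP Request node would send.
// Host, path, and the `url` field name are illustrative placeholders.
function buildBacklinkRequest(targetUrl, apiKey, host) {
  return {
    url: `https://${host}/backlinks`,
    method: 'POST',
    headers: {
      'content-type': 'application/x-www-form-urlencoded',
      'x-rapidapi-key': apiKey, // from your n8n credential
      'x-rapidapi-host': host,
    },
    body: new URLSearchParams({ url: targetUrl }).toString(),
  };
}

const req = buildBacklinkRequest('https://example.com', 'MY_KEY', 'example-host.p.rapidapi.com');
// req.body === 'url=https%3A%2F%2Fexample.com'
```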
by Marth
## How It Works: The 5-Node Monitoring Flow

This concise workflow efficiently captures, filters, and delivers crucial cybersecurity-related mentions.

### 1. Monitor: Cybersecurity Keywords (X/Twitter Trigger)

This is the entry point of your workflow. It actively searches X (formerly Twitter) for tweets containing the specific keywords you define.

- **Function:** continuously polls X for tweets that match your specified queries (e.g., your company name, "Log4j", "CVE-2024-XXXX", "ransomware").
- **Process:** as soon as a matching tweet is found, it triggers the workflow to begin processing that information.

### 2. Format Notification (Code Node)

This node prepares the raw tweet data, transforming it into a clean, actionable message for your alerts.

- **Function:** extracts key details from the raw tweet and structures them into a clear, concise message.
- **Process:** it pulls out the tweet's text, the user's handle (@screen_name), and the direct URL to the tweet, then combines them into a user-friendly `notificationMessage`. You can also include basic filtering logic here if needed.

### 3. Valid Mention? (If Node)

This node acts as a quick filter to reduce noise and prevent irrelevant alerts from reaching your team.

- **Function:** serves as a simple conditional check to validate the mention's relevance.
- **Process:** it evaluates the `notificationMessage` against specific criteria (e.g., ensuring it doesn't contain common spam words like "bot"). If the mention passes this basic validation, the workflow continues; otherwise, it quietly ends for that particular tweet.

### 4. Send Notification (Slack Node)

This is the delivery mechanism for your alerts, ensuring your team receives instant, visible notifications.

- **Function:** delivers the formatted alert message directly to your designated communication channel.
- **Process:** the `notificationMessage` is sent straight to your specified **Slack channel** (e.g., #cyber-alerts or #security-ops).

### 5. End Workflow (No-Op Node)

This node simply marks the successful completion of the workflow's execution path.

- **Function:** indicates the end of the workflow's process for a given trigger.

## How to Set Up

Implementing this simple cybersecurity monitor in your n8n instance is quick and straightforward.

### 1. Prepare Your Credentials

Before building the workflow, ensure all necessary accounts are set up and their respective credentials are ready for n8n.

- **X (Twitter) API:** you'll need an X (Twitter) developer account to create an application and obtain your Consumer Key/Secret and Access Token/Secret. Use these to set up your **Twitter credential** in n8n.
- **Slack API:** set up your **Slack credential** in n8n. You'll also need the **Channel ID** of the Slack channel where you want your security alerts posted (e.g., #security-alerts or #it-ops).

### 2. Import the Workflow JSON

In your n8n instance, go to the "Workflows" section, click the "New" or "+" icon, then select "Import from JSON." Paste the workflow JSON into the import dialog and import the workflow.

### 3. Configure the Nodes

Customize the imported workflow to fit your specific monitoring needs.

- **Monitor: Cybersecurity Keywords (X/Twitter):** click on this node and select your newly created Twitter credential. **CRITICAL:** modify the "Query" parameter to include your specific brand names, relevant CVEs, or general cybersecurity terms, for example `"YourCompany" OR "CVE-2024-1234" OR "phishing alert"`. Use OR to combine multiple terms.
- **Send Notification (Slack):** click on this node, select your Slack credential, and replace "YOUR_SLACK_CHANNEL_ID" with the actual Channel ID you noted earlier for your security alerts.
- (Optional) Adjust the "Valid Mention?" node's condition if you find specific patterns of false positives in your search results that you want to filter out.

### 4. Test and Activate

Verify that your workflow is working correctly before setting it live.

- **Manual test:** click the "Test Workflow" button (usually in the top right corner of the n8n editor) to execute the workflow once.
- **Verify output:** check your specified Slack channel to confirm that any detected mentions arrive in the correct format. If no matching tweets are found, you won't see a notification, which is expected.
- **Activate:** once you're satisfied with the test results, toggle the "Active" switch to ON. The workflow will then automatically monitor X (Twitter) at the specified polling interval.
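The "Format Notification" Code node described above can be sketched like this. The field names (`text`, `user.screen_name`, `id_str`) follow X's classic API payload shape; adjust them if your trigger node outputs something different.

```javascript
// Build the notificationMessage from a tweet's fields.
function formatNotification(tweet) {
  const url = `https://x.com/${tweet.user.screen_name}/status/${tweet.id_str}`;
  return [
    '🚨 Security mention detected',
    `From: @${tweet.user.screen_name}`,
    `Text: ${tweet.text}`,
    `Link: ${url}`,
  ].join('\n');
}

const msg = formatNotification({
  id_str: '123',
  text: 'New CVE-2024-1234 exploit in the wild',
  user: { screen_name: 'secresearcher' },
});
// msg contains the handle, the tweet text, and a direct link
```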
by Avkash Kakdiya
## How it works

This workflow automates LinkedIn community engagement by monitoring post comments, filtering new ones, generating AI-powered replies, and posting responses directly on LinkedIn. It also logs all interactions into Google Sheets for tracking and analytics.

## Step-by-step

1. **Trigger & fetch** – a Schedule Trigger runs the workflow every 10 minutes. The workflow fetches the latest comments on a specific LinkedIn post using LinkedIn’s API with token-based authentication.
2. **Filter for new comments** – retrieves the timestamp of the last processed comment from Google Sheets and filters out previously handled comments, ensuring only fresh interactions are processed.
3. **AI-powered reply generation** – sends each new comment to OpenAI GPT-3.5 Turbo with a structured prompt. The AI generates a professional, concise, and engaging LinkedIn-appropriate reply (max 2–3 sentences).
4. **Post back to LinkedIn** – automatically posts the AI-generated reply under the original comment thread, maintaining consistent formatting and actor identity.
5. **Data logging** – appends the original comment, AI response, and metadata into Google Sheets, enabling tracking, review, and future engagement analysis.

## Benefits

- Saves time by automating LinkedIn comment replies.
- Ensures responses are timely, professional, and on-brand.
- Maintains authentic engagement without manual effort.
- Prevents duplicate replies by filtering with timestamps.
- Creates a structured log in Google Sheets for auditing and analytics.
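The timestamp-based duplicate filter can be sketched as a small Code-node function. The comment shape (`createdAt` as an ISO string) is an assumption; LinkedIn's API may return epoch milliseconds instead, in which case drop the `new Date(...)` wrapping.

```javascript
// Keep only comments newer than the last processed timestamp
// (read from Google Sheets), oldest first so replies stay in order.
function newComments(comments, lastProcessed) {
  const cutoff = new Date(lastProcessed).getTime();
  return comments
    .filter((c) => new Date(c.createdAt).getTime() > cutoff)
    .sort((a, b) => new Date(a.createdAt) - new Date(b.createdAt));
}

const fresh = newComments(
  [
    { id: 'a', createdAt: '2024-05-01T10:00:00Z' },
    { id: 'b', createdAt: '2024-05-01T12:30:00Z' },
  ],
  '2024-05-01T11:00:00Z'
);
// only comment "b" is new
```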
by M Sayed
Automate Egyptian gold and currency price monitoring with beautiful Telegram notifications! 🚀

This workflow scrapes live gold prices and official exchange rates from the Egyptian market every hour and sends professionally formatted updates to your Telegram channel or group.

## ✨ Features

- 🕐 **Smart scheduling:** runs hourly between 10 AM and 10 PM (Cairo timezone)
- 🥇 **Gold prices:** tracks different gold types with buy/sell rates
- 💱 **Currency rates:** official exchange rates (USD, EUR, SAR, AED, GBP, etc.)
- 🎨 **Beautiful formatting:** emoji-rich messages with proper Arabic text formatting
- ⚡ **Reliable:** built-in retry mechanisms and error handling
- 🇪🇬 **Localized:** tailored specifically for the Egyptian market
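If you ever need to enforce the 10 AM–10 PM Cairo window inside a Code node (for example, as a guard before the Telegram send step, rather than relying only on the schedule trigger), a sketch like this works. This helper is an assumption, not part of the template itself.

```javascript
// Check whether the current time falls in the 10:00–22:00 window
// in the Africa/Cairo timezone, using Intl for the zone conversion.
function withinCairoWindow(date = new Date()) {
  const hour = Number(
    new Intl.DateTimeFormat('en-GB', {
      timeZone: 'Africa/Cairo',
      hour: 'numeric',
      hour12: false,
    }).format(date)
  );
  return hour >= 10 && hour < 22;
}

// Example: 12:00 UTC in mid-January is early afternoon in Cairo → inside the window.
const daytime = withinCairoWindow(new Date('2024-01-15T12:00:00Z'));
```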
by Trung Tran
# AI-Powered AWS S3 Manager with Audit Logging in n8n (Slack/ChatOps Workflow)

> This n8n workflow empowers users to manage AWS S3 buckets and files using natural language via Slack or chat platforms. Equipped with an OpenAI-powered agent and integrated audit logging to Google Sheets, it supports operations like listing buckets, copying/deleting files, and managing folders, and it automatically records every action for compliance and traceability.

## 👥 Who’s it for

This workflow is built for:

- DevOps engineers who want to manage AWS S3 using natural chat commands.
- Technical support teams interacting with AWS via Slack, Telegram, etc.
- Automation engineers building ChatOps tools.
- Organizations that require audit logs for every cloud operation.

Users don’t need AWS Console or CLI access: just send a message like “Copy file from dev to prod”.

## ⚙️ How it works / What it does

This workflow turns natural chat input into automated AWS S3 actions using an OpenAI-powered AI Agent in n8n.

🔁 Workflow overview:

1. **Trigger:** a user sends a message in Slack, Telegram, etc.
2. **AI Agent:** interprets the message and calls one of 6 S3 tools:
   - ListBuckets
   - ListObjects
   - CopyObject
   - DeleteObject
   - ListFolders
   - CreateFolder
3. **S3 action:** performs the requested AWS S3 operation.
4. **Audit log:** logs the tool call to Google Sheets using AddAuditLog, including timestamp, tool used, parameters, prompt, reasoning, and user info.

## 🛠️ How to set up

1. **Webhook trigger** – Slack, Telegram, or a custom chat platform connects to n8n.
2. **OpenAI Agent**
   - Model: gpt-4 or gpt-3.5-turbo
   - Memory: Simple Memory node
   - Prompt: instructs the agent to always follow tool calls with an AddAuditLog call.
3. **AWS S3 nodes** – configure each tool with AWS credentials. Tools:
   - getAll: bucket
   - getAll: file
   - copy: file
   - delete: file
   - getAll: folder
   - create: folder
4. **Google Sheets node**
   - Sheet: AWS S3 Audit Logs
   - Operation: Append or Update Row
   - Columns (must match input keys): timestamp, tool, status, chat_prompt, parameters, user_name, tool_call_reasoning
5. **Agent tool definitions** – include AddAuditLog as a 7th tool. The agent calls it immediately after every S3 action (except when logging itself).

## ✅ Requirements

- [ ] n8n instance with the AI Agent feature
- [ ] OpenAI API key
- [ ] AWS IAM user with S3 access
- [ ] Google Sheet with the required columns
- [ ] Chat integration (Slack, Telegram, etc.)

## 🧩 How to customize the workflow

| Feature | Customization Tip |
|---------|-------------------|
| 🌎 Multi-region S3 | Let users include the region in the message or agent memory |
| 🛡️ Restricted actions | Use memory/user ID to limit delete/copy actions |
| 📁 Folder filtering | Extend ListObjects with prefix/suffix filters |
| 📤 Upload file | Add PutObject with pre-signed URL support |
| 🧾 Extra logging | Add IP, latency, and error traces to audit logs |
| 📊 Reporting | Link the Google Sheet to Looker Studio for audit dashboards |
| 🚨 Security alerts | Notify via Slack/email when DeleteObject is triggered |
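To make the audit-log contract concrete, here is a sketch of the row an AddAuditLog call would append. The column names match the Google Sheets columns listed in the setup (timestamp, tool, status, chat_prompt, parameters, user_name, tool_call_reasoning); the helper function itself is illustrative and not part of the template's JSON.

```javascript
// Build one audit-log row; complex parameters are JSON-stringified
// so they fit in a single sheet cell.
function buildAuditRow({ tool, status, prompt, parameters, userName, reasoning }) {
  return {
    timestamp: new Date().toISOString(),
    tool,
    status,
    chat_prompt: prompt,
    parameters: JSON.stringify(parameters),
    user_name: userName,
    tool_call_reasoning: reasoning,
  };
}

const row = buildAuditRow({
  tool: 'CopyObject',
  status: 'success',
  prompt: 'Copy file from dev to prod',
  parameters: { sourceBucket: 'dev', destBucket: 'prod', key: 'report.pdf' },
  userName: 'alice',
  reasoning: 'User asked to promote a file to production.',
});
// row is ready for the Google Sheets Append Row operation
```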