by Sk developer
This workflow fetches free Udemy courses hourly via the Udemy Coupons and Courses API on RapidAPI, filters them, and updates a Google Sheet. It sends alerts on errors for smooth monitoring.

## Node-by-Node Explanation
- **Schedule Trigger**: Runs the workflow every hour automatically.
- **Fetch Udemy Coupons**: Sends a POST request to the Udemy Coupons and Courses API on RapidAPI to get featured courses.
- **Check API Success**: Verifies whether the API response is successful and routes accordingly.
- **Filter Free Courses**: Selects only courses with a sale_price of zero (free courses).
- **Send Error Notification**: Emails the admin if the API fetch fails, for quick action.
- **Sync Courses to Google Sheet**: Appends or updates the filtered free courses in Google Sheets.

## Google Sheets Columns
id, name, price, sale_price, image, lectures, views, rating, language, category, subcategory, slug, store, sale_start

## Google Sheets Setup & Configuration Steps
- **Create Google Sheet**: Create or open a Google Sheet where you want to sync courses.
- **Set Headers**: Add column headers matching the synced fields (id, name, price, etc.).
- **Enable Google Sheets API**: In Google Cloud Console, enable the Google Sheets API for your project.
- **Create Service Account**: In Google Cloud Console, create a Service Account with editor access.
- **Download Credentials**: Download the JSON credentials file from the service account.
- **Share Sheet**: Share your Google Sheet with the Service Account email (found in the JSON file).
- **Configure n8n Google Sheets Node**: Use the service account credentials, set the operation to "Append or Update", and provide the Sheet URL and sheet name or gid.
- **Match Columns**: Map the course fields to your sheet columns and set id as the unique key for updates.

## How to Obtain a RapidAPI Key & Set Up the API Request
- **Sign up / Log in**: Visit the RapidAPI Udemy Coupons and Courses API page and create an account or log in.
- **Subscribe to the API**: Subscribe to the Udemy Coupons and Courses API plan (free or paid).
- **Get API Key**: Navigate to your dashboard and copy your x-rapidapi-key.
- **Configure HTTP Request**: In your workflow's HTTP Request node:
  - Set the method to POST.
  - URL: https://udemy-coupons-and-courses.p.rapidapi.com/featured.php
  - Add headers:
    - x-rapidapi-host: udemy-coupons-and-courses.p.rapidapi.com
    - x-rapidapi-key: your copied API key
  - Set the content type to multipart/form-data.
  - Add the body parameter page=1 (or as needed).
- **Test API**: Run the node to ensure the API responds with data successfully before continuing workflow setup.

## Use Cases & Benefits
- Automates daily updates of free Udemy courses in your sheet using the Udemy Coupons and Courses API on RapidAPI.
- Saves manual effort in tracking coupons and deals.
- Enables quick error alerts to maintain data accuracy.
- Ideal for course aggregators, affiliate marketers, or learning platforms needing fresh course data.

## Who This Workflow Is For
- Content curators and edtech platforms tracking free courses.
- Affiliate marketers promoting Udemy deals.
- Anyone needing real-time access to updated free Udemy coupons.
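The "Filter Free Courses" step can be sketched as a small function. This is an illustrative sketch only: the response field names follow the sheet columns listed above, and the exact API response shape is an assumption.

```javascript
// Hypothetical sketch of the "Filter Free Courses" node: keep only courses
// whose sale_price is zero. sale_price may arrive as a string, so it is
// coerced to a number before comparison.
function filterFreeCourses(courses) {
  return courses.filter((course) => Number(course.sale_price) === 0);
}

const sample = [
  { id: 1, name: "JS Basics", price: 19.99, sale_price: 0 },
  { id: 2, name: "Pro React", price: 49.99, sale_price: "9.99" },
  { id: 3, name: "SQL Intro", price: 29.99, sale_price: "0" },
];
console.log(filterFreeCourses(sample).map((c) => c.name)); // → [ 'JS Basics', 'SQL Intro' ]
```

In n8n this logic would live in a Code node or an IF/Filter node comparing `sale_price` against 0.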
by Cheng Siong Chin
## Introduction
Automate peer review assignment and grading with AI-powered evaluation. Designed for educators managing collaborative assessments efficiently.

## How It Works
A webhook receives assignments and distributes them, AI generates review rubrics, reviewers are emailed, responses are collected, scores are calculated, results are stored, reports are emailed, dashboards are updated, and analytics are posted to Slack.

## Workflow Template
Webhook → Store Assignment → Distribute → Generate Review Rubric → Notify Slack → Email Reviewers → Prepare Response → Calculate Score → Store Results → Check Status → Generate Report → Email Report → Update Dashboard → Analytics → Post to Slack → Respond to Webhook

## Workflow Steps
- **Receive & Store**: Webhook captures assignments and stores the data.
- **Distribute & Generate**: Assigns peer reviewers; AI creates rubrics.
- **Notify & Email**: Alerts via Slack and sends review requests.
- **Collect & Score**: Gathers responses and calculates peer scores.
- **Report & Update**: Generates reports, emails results, updates the dashboard.
- **Analyze & Alert**: Posts analytics to Slack and confirms completion.

## Setup Instructions
- **Webhook & Storage**: Configure the endpoint and set up the database.
- **AI Configuration**: Add your OpenAI key and customize the rubric prompts.
- **Communication**: Connect Gmail and Slack credentials.
- **Dashboard**: Link your analytics platform and configure metrics.

## Prerequisites
- OpenAI API key
- Gmail account
- Slack workspace
- Database or storage system
- Dashboard tool

## Use Cases
- University peer review assignments
- Corporate training evaluations
- Research paper assessments

## Customization
- Multi-round review cycles
- Custom scoring algorithms
- LMS integration (Canvas, Moodle)

## Benefits
- Eliminates manual distribution
- Ensures consistent evaluation
- Provides instant feedback and analytics
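The "Calculate Score" step above can be sketched as a simple aggregation. The input shape and the two-decimal rounding rule are assumptions for illustration, not taken from the workflow itself:

```javascript
// Hypothetical sketch of the "Calculate Score" step: average one submission's
// rubric scores across all of its peer reviewers, rounded to 2 decimals.
function calculatePeerScore(reviews) {
  if (reviews.length === 0) return null; // no reviews collected yet
  const total = reviews.reduce((sum, r) => sum + r.score, 0);
  return Math.round((total / reviews.length) * 100) / 100;
}

const reviews = [
  { reviewer: "alice", score: 85 },
  { reviewer: "bob", score: 90 },
  { reviewer: "carol", score: 78 },
];
console.log(calculatePeerScore(reviews)); // → 84.33
```

A custom scoring algorithm (see Customization) could replace the plain average with, say, a trimmed mean that discards outlier reviewers.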
by Jose Cuartas
# Sync Gmail emails to PostgreSQL with S3 attachment storage
Automated Gmail Email Processing System

## Who's it for
Businesses and individuals who need to:
- Archive email communications in a searchable database
- Back up email attachments to cloud storage
- Analyze email patterns and communication data
- Comply with data retention policies
- Integrate emails with other business systems

## What it does
This workflow automatically captures, processes, and stores Gmail emails in a PostgreSQL database while uploading file attachments to S3/MinIO storage. It handles both individual emails (via the Gmail Trigger) and bulk processing (via the Schedule Trigger).

Key features:
- Dual processing: real-time individual emails + scheduled bulk retrieval
- Complete email metadata extraction (sender, recipients, labels, timestamps)
- HTML to plain text conversion for searchable content
- Binary attachment processing with metadata extraction
- Organized S3/MinIO file storage structure
- UPSERT database operations to prevent duplicates

## How it works
1. **Email Capture**: The Gmail Trigger detects new emails; the Schedule Trigger fetches bulk emails from the last hour.
2. **Parallel Processing**: Emails with attachments go through binary processing; others go directly to transformation.
3. **Attachment Handling**: Extract metadata, upload to S3/MinIO, and create database references.
4. **Data Transformation**: Convert the Gmail API format to the PostgreSQL structure.
5. **Storage**: UPSERT emails into the database with linked attachment information.

## Requirements
Credentials needed:
- Gmail OAuth2 (gmail.readonly scope)
- PostgreSQL database connection
- S3/MinIO storage credentials

Database setup: Run the provided SQL schema to create the messages table with JSONB fields for flexible data storage.
## How to set up
1. **Gmail OAuth2**: Enable the Gmail API in Google Cloud Console and create OAuth2 credentials.
2. **PostgreSQL**: Create the database and run the SQL schema provided in the setup sticky note.
3. **S3/MinIO**: Create a bucket named "gmail-attachments" with proper upload permissions.
4. **Configure**: Update authenticatedUserEmail in the transform scripts to your email.
5. **Test**: Start with a single email before enabling bulk processing.

## How to customize
- **Email filters**: Modify the Gmail queries (in:sent, in:inbox) to target specific emails.
- **Storage structure**: Change the S3 file path format in the Upload node.
- **Processing schedule**: Adjust trigger frequencies based on email volume.
- **Database fields**: Extend the PostgreSQL schema for additional metadata.
- **Attachment types**: Add file type filtering in the binary processing logic.

Note: This workflow processes emails from the last hour to avoid overwhelming the system. Adjust timeframes based on your email volume and processing needs.
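The "Data Transformation" step (Gmail API format → PostgreSQL structure) can be sketched as a flattening function. The column names below are illustrative assumptions; the workflow's real schema comes from the SQL in its setup sticky note. The Gmail API fields used (`id`, `threadId`, `labelIds`, `snippet`, `payload.headers`) are real parts of the `users.messages` resource.

```javascript
// Hypothetical sketch: flatten a Gmail API message into a row shaped like a
// PostgreSQL "messages" table entry. Header lookup is case-insensitive
// because header casing varies between senders.
function toMessageRow(gmailMessage) {
  const headers = Object.fromEntries(
    gmailMessage.payload.headers.map((h) => [h.name.toLowerCase(), h.value])
  );
  return {
    id: gmailMessage.id,
    thread_id: gmailMessage.threadId,
    sender: headers["from"] ?? null,
    recipients: headers["to"] ?? null,
    subject: headers["subject"] ?? null,
    labels: gmailMessage.labelIds ?? [], // maps well to a JSONB column
    snippet: gmailMessage.snippet ?? "",
  };
}

const row = toMessageRow({
  id: "18f0",
  threadId: "18f0",
  snippet: "Quarterly report attached",
  labelIds: ["INBOX"],
  payload: {
    headers: [
      { name: "From", value: "alice@example.com" },
      { name: "To", value: "bob@example.com" },
      { name: "Subject", value: "Q3 report" },
    ],
  },
});
console.log(row.sender); // → alice@example.com
```

An UPSERT on the `id` column (`INSERT ... ON CONFLICT (id) DO UPDATE`) then keeps re-processed emails from creating duplicates.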
by Muhammad Bello
## Description
This n8n template demonstrates how to turn raw YouTube comments into research-backed content ideas complete with hooks and outlines.

Use cases include:
- Quickly mining a competitor's audience for video ideas.
- Generating hooks and outlines from your own channel's comments.
- Validating content opportunities with live audience feedback.

## Good to know
- Apify is used to scrape YouTube comments (requires an API token).
- GPT-4.1-mini is used for both filtering and content generation.
- Tavily provides fresh research to ground the AI's responses.
- All outputs are stored in Google Sheets, making it easy to manage and track ideas.

## How it works
1. **Trigger** – Paste a YouTube URL into the chat trigger.
2. **Scrape Comments** – Apify fetches all comments and metadata.
3. **Filter** – GPT-4.1-mini decides if each comment could inspire a content idea.
4. **Store** – Comments and "Yes/No" decisions are appended to Google Sheets.
5. **Research & Enrich** – For "Yes" comments, Tavily provides context, and GPT generates a topic, hook, and outline.
6. **Update Sheet** – The same row in Google Sheets is updated with the enriched fields.

## Google Sheets Setup
Your Google Sheet should include these columns (in this order):

id | text | author | likes | isIdea | topic | research | hook | outline

- **id** – unique identifier for each comment
- **text** – the full YouTube comment
- **author** – commenter's name/handle
- **likes** – number of likes on the comment
- **isIdea** – "Yes" or "No" depending on the GPT filter
- **topic** – extracted video topic
- **research** – 300–500 word background from Tavily
- **hook** – engaging opening sentence for a video
- **outline** – structured video outline

## Setup Steps
1. Connect your Apify, OpenAI, Tavily, and Google Sheets credentials in n8n.
2. Point the Google Sheets nodes to your own document and ensure the above headers exist.
3. Replace sample API keys with your own stored in n8n Credentials.

Time to set up: ~15–25 minutes for a first-time n8n user (less if you already have credentials handy).
## Customizing this workflow
- **Filter logic** – Loosen the GPT filter to allow borderline ideas, or tighten it to only accept the best ones.
- **Research depth** – Change Tavily's search depth (e.g., depth: basic vs depth: advanced) to control how detailed the background research is.
- **Notification channels** – Send new "Yes" ideas directly to **Slack** (#content-ideas), **Notion** (your content board), or **Email** (notify the content manager instantly).
- **Alternative outputs** – Instead of hooks/outlines, generate:
  - A script draft for YouTube Shorts.
  - Blog post angles based on the same audience comments.
  - A poll question for community engagement.
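The "Store" step can be sketched as a mapping from one scraped comment (plus the GPT verdict) onto the sheet columns listed above. Only the column order comes from the template; the Apify output field names here are assumptions:

```javascript
// Hypothetical sketch: build a Google Sheets row from a scraped comment and
// the GPT filter's Yes/No decision. Enrichment columns start empty and are
// filled later by the Research & Enrich step.
function toSheetRow(comment, isIdea) {
  return {
    id: comment.commentId,
    text: comment.text,
    author: comment.author,
    likes: comment.voteCount,
    isIdea: isIdea ? "Yes" : "No",
    topic: "",
    research: "",
    hook: "",
    outline: "",
  };
}

const row = toSheetRow(
  { commentId: "abc123", text: "Please cover n8n error handling!", author: "@dev", voteCount: 42 },
  true
);
console.log(row.isIdea); // → Yes
```

Keeping `id` as the match key lets the later "Update Sheet" step write the topic, research, hook, and outline back into the same row.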
by Ghufran Barcha
# Telegram AI Personal Assistant — Calendar & Email Manager
This workflow turns a Telegram bot into a fully functional personal AI assistant capable of handling your schedule and inbox through natural conversation. Send it a text message, record a voice note, or snap a photo — it understands all three and responds intelligently.

The assistant is powered by Claude Haiku (via OpenRouter) and comes with a built-in 30-message memory buffer, so it remembers context across a conversation just like a real assistant would. It has full read/write access to Google Calendar and Gmail, meaning it can book meetings, check your availability, send emails, reply to threads, and clean up your inbox — all from a single Telegram chat.

## What this workflow does

### Multi-modal input handling
- Text messages are processed directly.
- Voice notes are downloaded from Telegram and transcribed using OpenAI Whisper.
- Photos are downloaded and analyzed using GPT-4o vision, with any caption included as additional context.

All three input types are normalized into a unified context object before reaching the agent, so the AI always receives clean, structured input regardless of how the user communicated.

### Authorization layer
Only the allowlisted Telegram User ID can interact with the assistant. Any unauthorized message receives an instant rejection and the workflow stops — no agent calls are made.

### AI agent with tools
The LangChain agent receives the full context and decides autonomously whether to reply conversationally or invoke one of the connected tools. It uses the current date/time from n8n to handle scheduling requests accurately.

- **Google Calendar tools**: check availability, create events, list upcoming events, fetch a specific event, update event details, delete events.
- **Gmail tools**: send new emails, search the inbox, read a specific email, reply to a thread, delete messages.
### Persistent memory
Each user's conversation is tracked using a sliding window of the last 30 messages, keyed by their Telegram User ID. The assistant remembers what was said earlier in the same session without needing reminders.

## Example use cases
- "What do I have on Thursday?" → fetches and summarizes calendar events
- "Schedule a call with Ahmed tomorrow at 3pm" → creates a calendar event
- "Any emails from the client today?" → searches Gmail and summarizes results
- "Reply to John's last email and say I'll confirm by Friday" → reads the thread and sends a reply
- [sends a photo of a meeting invite] → extracts details from the image and creates a calendar event

## Setup instructions
1. **Telegram API** — create a bot via @BotFather and connect the token to the Telegram Trigger node and both send nodes.
2. **OpenAI API** — required for Whisper voice transcription and GPT-4o image analysis.
3. **OpenRouter API** — used to run Claude Haiku as the agent's language model. You can swap this for any OpenRouter-compatible model.
4. **Google Calendar OAuth2** — authorize your Google account and update the calendar ID (currently set to an example address) in all six calendar tool nodes.
5. **Gmail OAuth2** — authorize your Gmail account in all five Gmail tool nodes.
6. **User authorization** — open the If node and replace the placeholder value with your own Telegram numeric User ID. You can find this by messaging @userinfobot on Telegram.

## Customization tips
- To support multiple authorized users, replace the If node with a list-based check or a Code node that checks against an array of allowed IDs.
- To change the AI model, swap the OpenRouter Chat Model node — the agent prompt and all tools remain fully compatible.
- To adjust memory length, change the contextWindowLength value in the Simple Memory node (currently 30 messages).
- To modify the assistant's personality or add new instructions, edit the system prompt inside the MainAgent node.
- Additional tools (e.g. Notion, Slack, Airtable) can be connected to the MainAgent node as sub-nodes without changing any other part of the workflow.
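The multi-user check suggested in the customization tips can be sketched as a Code-node-style function. The allowlist values are placeholders, and the incoming message shape is a simplified version of what the Telegram Trigger emits:

```javascript
// Hypothetical sketch of a multi-user authorization check that could replace
// the single-ID If node. Replace the placeholder IDs with your own Telegram
// numeric User IDs (find yours via @userinfobot).
const ALLOWED_USER_IDS = [123456789, 987654321];

function isAuthorized(message) {
  return ALLOWED_USER_IDS.includes(message.from.id);
}

console.log(isAuthorized({ from: { id: 123456789 } })); // → true
console.log(isAuthorized({ from: { id: 555 } }));       // → false
```

Unauthorized messages would then follow the same rejection path as in the original workflow: reply once and stop before any agent call.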
by Oneclick AI Squad
This n8n workflow monitors and alerts you about new construction projects in specified areas, helping you track competing builders and identify business opportunities. The system automatically searches multiple data sources and sends detailed email reports with upcoming projects.

## Good to know
- Email parsing accuracy depends on the consistency of request formats — use the provided template for best results.
- The workflow includes fallback mock data for demonstration when external APIs are unavailable.
- Government data sources may have rate limits — the workflow includes proper error handling.
- Results are filtered to show only upcoming/recent projects (within 3 months).

## How it works
- **Email Trigger** – Detects new email requests with "Construction Alert Request" in the subject line.
- **Check Email Subject** – Validates that the email contains the correct trigger phrase.
- **Extract Location Info** – Parses the email body to extract area, city, state, and zip code information.
- **Search Government Data** – Queries government databases for public construction projects and permits.
- **Search Construction Sites** – Searches construction industry databases for private projects.
- **Process Construction Data** – Combines and filters results from both sources, removing duplicates.
- **Wait For Data** – Waits for the combined and filtered results.
- **Check If Projects Found** – Determines whether to send a results report or a no-results notification.
- **Generate Email Report** – Creates a professional HTML email with project details and summaries.
- **Send Alert Email** – Delivers the construction project report to the requester.
- **Send No Results Email** – Notifies the requester when no projects are found in the specified area.

The workflow also includes a Schedule Trigger that can run automatically on weekdays at 9 AM for regular monitoring.
## Email Format Examples

### Input Email Format
```
To: alerts@yourcompany.com
Subject: Construction Alert Request

Area: Downtown Chicago
City: Chicago
State: IL
Zip: 60601

Additional notes: Looking for commercial projects over $1M
```

Alternative format:
```
To: alerts@yourcompany.com
Subject: Construction Alert Request

Please search for construction projects in Miami, FL 33101
Focus on residential and mixed-use developments.
```

### Output Email Example
```
Subject: 🏗️ Construction Alert: 8 Projects Found in Downtown Chicago

🏗️ Construction Project Alert Report
Search Area: Downtown Chicago
Report Generated: August 4, 2024, 2:30 PM

📊 Summary
Total Projects Found: 8
Search Query: Downtown Chicago IL construction permits

🔍 Upcoming Construction Projects

New Commercial Complex - Downtown Chicago
📍 Location: Downtown Chicago | 📅 Start Date: March 2024 | 🏢 Type: Mixed Development
Description: Mixed-use commercial and residential development
Source: Local Planning Department

Office Building Construction - Chicago
📍 Location: Chicago, IL | 📅 Start Date: April 2024 | 🏢 Type: Commercial
Description: 5-story office building with retail space
Source: Building Permits

[Additional projects...]
```
### 💡 Next Steps
- Review each project for potential competition
- Contact project owners for partnership opportunities
- Monitor progress and timeline changes
- Update your competitive analysis

## How to use

### Setup Instructions
1. Import the workflow into your n8n instance.
2. Configure email credentials: set up IMAP credentials for receiving emails and SMTP credentials for sending alerts.
3. Test the workflow with a sample email.
4. Set up scheduling (optional) for automated daily checks.

### Sending Alert Requests
1. Send an email to your configured address.
2. Use "Construction Alert Request" in the subject line.
3. Include location details in the email body.
4. Receive detailed project reports within minutes.

## Requirements
- **n8n instance** (cloud or self-hosted)
- **Email account** with IMAP/SMTP access
- **Internet connection** for API calls to construction databases
- **Valid email addresses** for sending and receiving alerts

## API Integration Code Examples

### Government Data API Integration
```javascript
// Example API call to the USA.gov jobs API. Note that fetch() has no
// "params" option, so query parameters are built with URLSearchParams.
const searchGovernmentProjects = async (location) => {
  const params = new URLSearchParams({
    keyword: 'construction permit',
    location_name: location,
    size: '20',
  });
  const response = await fetch(`https://api.usa.gov/jobs/search.json?${params}`, {
    method: 'GET',
    headers: { 'Content-Type': 'application/json' },
  });
  return await response.json();
};
```

### Construction Industry API Integration
```javascript
// Example API call to construction databases
const searchConstructionProjects = async (area) => {
  const params = new URLSearchParams({
    q: `${area} construction projects`,
    type: 'projects',
    limit: '15',
  });
  const response = await fetch(`https://www.construction.com/api/search?${params}`, {
    method: 'GET',
    headers: {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept': 'application/json',
    },
  });
  return await response.json();
};
```

### Email Processing Function
```javascript
// Extract location fields from the email body, line by line
const extractLocationInfo = (emailBody) => {
  const lines = emailBody.split('\n');
  let area = '', city = '', state = '', zipcode = '';
  for (const line of lines) {
    if (line.toLowerCase().includes('area:')) { area = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('city:')) { city = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('state:')) { state = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('zip:')) { zipcode = line.split(':')[1]?.trim(); }
  }
  return { area, city, state, zipcode };
};
```

## Customizing this workflow

### Adding New Data Sources
1. Add HTTP Request nodes for additional APIs.
2. Update the Process Construction Data node to handle new data formats.
3. Modify the search parameters based on API requirements.

### Enhanced Email Parsing
```javascript
// Custom email parsing for different formats
const parseEmailContent = (emailBody) => {
  // Regex patterns for different email formats
  const patterns = {
    address: /(\d+\s+[\w\s]+,\s[\w\s]+,\s[A-Z]{2}\s*\d{5})/,
    coordinates: /(\d+\.\d+),\s*(-?\d+\.\d+)/,
    zipcode: /\b\d{5}(-\d{4})?\b/
  };
  // Extract using multiple patterns
  // Implementation details...
};
```

### Custom Alert Conditions
Modify the Check If Projects Found node to filter by:
- Project value/budget
- Project type (residential, commercial, etc.)
- Distance from your location
- Timeline criteria

### Advanced Scheduling
```javascript
// Set up multiple schedule triggers for different areas
const scheduleConfigs = [
  { area: "Downtown", cron: "0 9 * * 1-5" },   // Weekdays 9 AM
  { area: "Suburbs", cron: "0 14 * * 1,3,5" }, // Mon, Wed, Fri 2 PM
  { area: "Industrial", cron: "0 8 * * 1" }    // Monday 8 AM
];
```

### Integration with CRM Systems
Add HTTP Request nodes to automatically create leads in your CRM when high-value projects are found:
```javascript
// Example CRM integration
const createCRMLead = async (project) => {
  await fetch('https://your-crm.com/api/leads', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_TOKEN',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      name: project.title,
      location: project.location,
      value: project.estimatedValue,
      source: 'Construction Alert System'
    })
  });
};
```

## Troubleshooting
- **No emails received**: Check IMAP credentials and email filters.
- **Empty results**: Verify API endpoints and add fallback data sources.
- **Failed email delivery**: Confirm SMTP settings and recipient addresses.
- **API rate limits**: Implement delays between requests and error handling.
by Ninmegne Paul
## 🔧 How it works
1. **Scheduled Trigger** – The workflow is triggered automatically every day at 12:00 PM using a Cron node.
2. **RSS Feed Collection** – It fetches the latest content from multiple RSS feeds related to Technology, Manga, and Movies.
3. **Content Processing & Formatting** – The collected data is filtered and organized based on your interests, and a dynamic HTML email template is generated to present the content in a clean and readable layout.
4. **Email Delivery** – The final newsletter is sent directly to your inbox using the Send Email node.

## ⚙️ Set up steps
1. **Configure RSS Sources** – Update the RSS feed URLs inside the Set nodes to match your preferred sources.
2. **Set Email Recipient** – Replace the email address in the Send Email node with your own.
3. **Adjust Schedule** – Modify the execution time in the Cron Trigger node if you want the newsletter to be sent at a different time.
4. **Activate the Workflow** – Enable the workflow to start receiving your personalized daily newsletter automatically.
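The "Content Processing & Formatting" step can be sketched as a function that groups RSS items by category and renders a minimal HTML digest. The item shape (title, link, category) is an assumption; the workflow's real template is more elaborate:

```javascript
// Hypothetical sketch: turn a flat list of RSS items into sectioned HTML,
// one <h2> per category, ready for the Send Email node's HTML body.
function buildNewsletterHtml(items) {
  const sections = {};
  for (const item of items) {
    (sections[item.category] ??= []).push(item);
  }
  let html = "<h1>Daily Digest</h1>";
  for (const [category, entries] of Object.entries(sections)) {
    html += `<h2>${category}</h2><ul>`;
    for (const e of entries) {
      html += `<li><a href="${e.link}">${e.title}</a></li>`;
    }
    html += "</ul>";
  }
  return html;
}

const html = buildNewsletterHtml([
  { category: "Technology", title: "New n8n release", link: "https://example.com/a" },
  { category: "Manga", title: "Weekly chapter roundup", link: "https://example.com/b" },
]);
console.log(html.includes("<h2>Technology</h2>")); // → true
```

In a real newsletter, item titles from untrusted feeds should also be HTML-escaped before interpolation.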
by Carl Fung
## ✨ Intro
This workflow shows how to go beyond a "plain" AI chatbot by:
- 🧠 **Adding a Personality Layer** — Link an extra LLM to inject a custom tone and style. Here, it's Nova, a sassy, high-fashion assistant. You can swap in any personality without changing the main logic.
- 🎨 **Custom Styling with CSS** — Easily restyle the chatbot to match your brand or project theme.

Together, these make your bot smart, stylish, and uniquely yours.

## ⚙️ How it Works
- 📥 **Route Input** – The chat trigger sends messages to a Switch. If a Telegram video note exists → runs the audio path. Otherwise → runs the text path.
- 🎤 **Audio Path** – Telegram Get a File → OpenAI Speech-to-Text → pass the transcript to the agent.
- 💬 **Text Path** – Chat text is normalized and sent to the agent.
- 🛠 **Agent Brain** – Uses tools like Gmail 📧, Google Calendar 📅, Google Drive 📂, Airtable 📋, SerpAPI 🌐, Wikipedia 📚, Hacker News 📰, and Calculator ➗.
- 🧾 **Memory** – Keeps the last 20 messages for context-aware replies.
- 💅 **Optional Personality Polish** – An LLM Chain adds a witty or cheeky tone on top of the agent's response.

## 🛠 Setup Steps
- ⏱ **Time Required**: ~10–15 minutes (+5 minutes for each Google/Airtable connection).
- 🔑 **Connect Credentials**: OpenAI (and/or Anthropic), Telegram Bot, Gmail, Google Calendar, Google Drive, Airtable, SerpAPI.
- 📌 **Configure IDs**: Set the Airtable base/table, set the Calendar email, and adjust the Drive search query defaults if needed.
- 🎙 **Voice Optional**: Disable the Telegram + Transcribe nodes if you only want text chat.
- 🎭 **Choose Tone**: Edit the Chat Trigger's welcome text/CSS for a custom look, or disable the persona chain for a neutral voice.
- 🚀 **Publish**: Activate the workflow and share the chat URL.

💡 Detailed behavior notes are available as sticky notes inside the workflow.
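The personality layer can be sketched as a prompt-building step feeding the second LLM pass. The persona name "Nova" comes from the template, but the wording of this prompt is invented for illustration; the real prompt lives inside the LLM Chain node:

```javascript
// Hypothetical sketch of the personality-polish prompt: wrap the agent's raw
// answer in instructions for a rewrite pass that changes tone, not facts.
function buildPersonaPrompt(persona, rawAnswer) {
  return [
    `You are ${persona}, a sassy, high-fashion assistant.`,
    "Rewrite the answer below in your own voice without changing any facts:",
    "---",
    rawAnswer,
  ].join("\n");
}

const prompt = buildPersonaPrompt("Nova", "Your meeting is at 3pm.");
console.log(prompt.startsWith("You are Nova")); // → true
```

Swapping the persona is then a one-string change, which is why the main agent logic stays untouched.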
by Davide
This workflow automates the process of generating advertising (ADV) images from multiple reference images with Seedream v4 AI and publishing them directly to social media (Instagram and Facebook via Upload-Post). The AI image is generated from a user's text prompt and up to 6 reference images, submitted through a web form.

## Key Advantages
- ✅ **Automated Image Creation** – Generates high-quality, consistent visuals from multiple references without manual editing.
- ✅ **Seamless Social Media Publishing** – Automatically posts to Instagram and Facebook with minimal effort.
- ✅ **SEO-Optimized Titles** – Ensures your posts get better reach with AI-generated, keyword-friendly titles.
- ✅ **Scalable Workflow** – Can be triggered manually, on a schedule, or via form submissions.
- ✅ **Time-Saving** – Reduces manual steps from design to publishing, enabling faster content production.
- ✅ **Multi-Platform Support** – Easily extendable to other platforms (TikTok, LinkedIn, etc.) with the Upload-Post API.

## How It Works
1. **Form Trigger**: A user fills out a form with a "Prompt" (text description) and a list of "Reference images" (comma-separated URLs).
2. **Data Processing**: The workflow converts the submitted image URL string into a proper array for the AI API.
3. **AI Image Generation**: The workflow sends the prompt and image URLs to the fal.ai API (specifically, the ByteDance Seedream model) to generate a new, consistent image.
4. **Status Polling**: It periodically checks the status of the AI job until the image generation is COMPLETED.
5. **Result Retrieval**: Once complete, it fetches the URL of the generated image and downloads the image file itself.
6. **SEO Title Generation**: The original user prompt is sent to OpenAI's GPT-4o-mini model to generate an optimized, engaging social media title.
7. **Cloud Backup**: The generated image is uploaded to a specified Google Drive folder for storage.
**Social Media Posting**: Finally, the workflow posts the downloaded image file to both Instagram and Facebook via the Upload-Post.com API, using the AI-generated title.

## Set Up Steps
To make this workflow functional, you need to configure several third-party services and their corresponding credentials within n8n.

1. **Obtain a fal.ai API Key**: Create an account at fal.ai and locate your API key in your account settings. In the "Create Video" and "Get status" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Key YOUR_FAL_AI_API_KEY.
2. **Configure the Upload-Post.com API**: Create an account at Upload-Post.com and get your API key. Create a profile within the Upload-Post app (e.g., test1); this profile manages your social account connections. In both the "Post to Instagram" and "Post to Facebook" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Apikey YOUR_UPLOAD_POST_API_KEY. Crucially, in the same nodes, find the user parameter in the body and replace the placeholder YOUR_USERNAME with the profile name you created (e.g., test1).
3. **Configure OpenAI/OpenRouter (Optional, for Title Generation)**: The "Generate title" node uses an OpenAI-compatible API; the provided example uses OpenRouter. Ensure you have valid credentials (e.g., for OpenRouter or directly for OpenAI) configured in n8n and selected in this node.
4. **Configure Google Drive (Optional, for Backup)**: The "Upload Image" node requires Google OAuth credentials. Set up a Google Cloud project, enable the Drive API, and create OAuth 2.0 credentials in the n8n settings. Authenticate and select the desired destination folder in the node's parameters.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
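The "Data Processing" step that converts the submitted comma-separated URL string into an array can be sketched as follows. The field handling is illustrative; the workflow's actual node may differ in details:

```javascript
// Hypothetical sketch: split the form's "Reference images" field into the
// URL array the fal.ai API expects, trimming whitespace and dropping blanks.
function toImageUrlArray(referenceImages) {
  return referenceImages
    .split(",")
    .map((url) => url.trim())
    .filter((url) => url.length > 0)
    .slice(0, 6); // the workflow accepts up to 6 reference images
}

const urls = toImageUrlArray("https://example.com/a.jpg, https://example.com/b.jpg,");
console.log(urls.length); // → 2
```

Trimming and dropping empty entries guards against trailing commas and stray spaces in user-submitted form input.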
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🔗 Automated Blockchain Transaction Audit System
Transform your blockchain compliance workflow with this enterprise-grade automation that monitors transactions across the Ethereum and Solana networks, automatically generates professional audit reports, and maintains complete documentation trails.

## 🚀 What This Workflow Does
This comprehensive automation system provides:
- 📊 **Multi-Chain Monitoring**: Real-time transaction tracking for Ethereum (via the Alchemy API) and Solana networks
- 🤖 **AI-Powered Risk Analysis**: Intelligent scoring algorithm that evaluates transaction risk (0–100 scale)
- 📄 **Automated PDF Generation**: Professional audit reports created instantly using APITemplate.io
- ☁️ **Cloud Storage Integration**: Seamless uploads to Google Drive with an organized folder structure
- 📋 **Database Management**: Automatic Notion database entries for complete audit trail tracking
- 📧 **Smart Notifications**: Multi-channel alerts to finance teams with detailed transaction summaries
- 🔒 **Compliance Verification**: Built-in KYC/AML checks and regulatory compliance monitoring

## 💼 Perfect For
- **FinTech companies** managing blockchain transactions
- **DeFi protocols** requiring audit documentation
- **Enterprise finance teams** handling crypto compliance
- **Blockchain auditors** automating report generation
- **Compliance officers** tracking regulatory requirements

## 🛠 Key Integrations
- **Alchemy API** – Ethereum transaction monitoring
- **Solana RPC** – Native Solana network access
- **APITemplate.io** – Professional PDF report generation
- **Google Drive** – Secure cloud document storage
- **Notion** – Comprehensive audit database
- **Email/SMTP** – Multi-recipient notification system
- **Etherscan/Solscan** – Smart contract verification

## ⚡ Technical Highlights
- **10 optimized nodes** with parallel processing capabilities
- **Sub-30-second processing** for complete audit cycles
- **Enterprise security** with credential management
- **Error handling** with automatic retry mechanisms
- **Scalable architecture** supporting 1000+ transactions/hour
- **Risk scoring algorithm** with customizable parameters

## 📊 Business Impact
- **80% cost reduction** in manual audit processes
- **95% error elimination** through automation
- **100% compliance coverage** with immutable audit trails
- **70% time savings** for finance teams

## 🔧 Setup Requirements
Before using this workflow, ensure you have:
- An Alchemy API key for Ethereum monitoring
- An APITemplate.io account with an audit report template
- A Google Drive service account with folder permissions
- A Notion workspace with a configured audit database
- SMTP credentials for email notifications
- An Etherscan API key for contract verification

## 📈 Use Cases
- **Transaction Compliance Monitoring**: Automatic flagging of high-risk transactions
- **Regulatory Reporting**: Scheduled audit report generation for authorities
- **Internal Auditing**: Complete documentation for financial reviews
- **Risk Management**: Real-time scoring and alert systems
- **Multi-Chain Portfolio Tracking**: Unified reporting across blockchain networks

## 🎯 Why Choose This Workflow
This isn't just another blockchain monitor — it's a complete document management ecosystem that transforms raw blockchain data into professional, compliant documentation while maintaining enterprise-grade security and scalability. Perfect for organizations serious about blockchain compliance and audit trail management! 🚀

## 🔄 Workflow Process
1. **Webhook Trigger** receives the blockchain event
2. **Parallel Monitoring** queries the Ethereum and Solana networks
3. **AI Processing** analyzes transaction data and calculates risk
4. **Document Generation** creates professional PDF audit reports
5. **Multi-Channel Distribution** uploads to Drive, logs in Notion, and sends notifications
6. **Verification & Response** confirms all processes completed successfully

Ready to automate your blockchain compliance? Import this workflow and transform your audit processes today! ✨
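A 0–100 risk score with customizable parameters, like the one the AI Processing step computes, can be sketched as a weighted checklist. The factors and weights below are invented for illustration; the template's actual algorithm and its configurable parameters are defined inside the workflow:

```javascript
// Hypothetical sketch of a transaction risk score: additive weights for a
// few common red flags, capped at 100. All thresholds are placeholders.
function scoreTransaction(tx) {
  let score = 0;
  if (tx.valueUsd > 100_000) score += 40;  // unusually large transfer
  if (tx.counterpartyFlagged) score += 35; // known high-risk address
  if (tx.contractUnverified) score += 15;  // unverified smart contract
  if (tx.network === "ethereum" && tx.gasSpike) score += 10; // abnormal gas
  return Math.min(score, 100);
}

const risky = scoreTransaction({
  valueUsd: 250_000,
  counterpartyFlagged: true,
  contractUnverified: false,
  network: "ethereum",
  gasSpike: false,
});
console.log(risky); // → 75
```

Keeping the weights in one place makes them easy to expose as workflow parameters, which is what "customizable parameters" implies here.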
by Fahmi Fahreza
# Match Resumes to Jobs Automatically with Gemini AI and Decodo Scraping
Sign up for Decodo HERE for a discount.

This automation intelligently connects candidate profiles to job opportunities. It takes an intake form with a short summary, resume link, and optional LinkedIn profile, then enriches the data using Decodo and Gemini. The workflow analyzes skills, experience, and role relevance, ranks top matches, and emails a polished HTML report directly to your inbox — saving hours of manual review and matching effort.

## Who's it for?
This template is designed for recruiters, hiring managers, and talent operations teams who handle large candidate volumes and want faster, more accurate shortlisting. It's also helpful for job seekers or career coaches who wish to identify high-fit openings automatically using structured AI analysis.

## How it works
1. Receive an intake form containing a candidate's resume, summary, and LinkedIn URL.
2. Parse and summarize the resume with Gemini for core skills and experience.
3. Enrich the data using Decodo scraping to gather extra profile details.
4. Merge insights and rank job matches from Decodo's job data.
5. Generate an HTML shortlist and email it automatically through Gmail.

## How to set up
1. Connect credentials for Gmail, Google Gemini, and Decodo.
2. Update the Webhook path and test your form connection.
3. Customize variables such as location or role preferences.
4. Enable "Send as HTML" in the Gmail node for clean reports.
5. Publish as self-hosted if community nodes are included.
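The ranking step (step 4 above) can be sketched as a skill-overlap score. In the template the actual relevance analysis is done by Gemini; this plain set-overlap ratio is only an assumed stand-in to show the shape of the step:

```javascript
// Hypothetical sketch: rank jobs by the fraction of each job's required
// skills that the candidate possesses (case-insensitive match).
function rankJobs(candidateSkills, jobs) {
  const skills = new Set(candidateSkills.map((s) => s.toLowerCase()));
  return jobs
    .map((job) => {
      const required = job.requiredSkills.map((s) => s.toLowerCase());
      const matched = required.filter((s) => skills.has(s)).length;
      return { ...job, score: matched / required.length };
    })
    .sort((a, b) => b.score - a.score);
}

const ranked = rankJobs(
  ["Python", "SQL", "n8n"],
  [
    { title: "Data Analyst", requiredSkills: ["SQL", "Excel"] },
    { title: "Automation Engineer", requiredSkills: ["n8n", "Python"] },
  ]
);
console.log(ranked[0].title); // → Automation Engineer
```

The top-scored entries would then feed the HTML shortlist that the Gmail node sends.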
by Cheng Siong Chin
## How It Works
This workflow automates AI decision governance by tracing, assessing, and auditing automated decisions for risk and compliance. Designed for AI governance officers, compliance teams, and regulated industries, it addresses the critical need for explainability and accountability in AI-driven decisions.

A schedule trigger initiates a simulated decision request, which is processed by a Decision Trace Agent to extract metadata. A Governance Agent then delegates to Risk Assessment and Compliance Checker sub-agents. Decisions are routed by risk level — high-risk cases trigger Slack alerts and are stored separately — while all outcomes are merged into a governance report sent via email, with the full audit trail and explainability report stored for regulatory review.

## Setup Steps
1. Set the schedule trigger interval to match your governance audit frequency.
2. Add OpenAI API credentials to all OpenAI Model nodes.
3. Configure Slack credentials and set the high-risk alert channel.
4. Add Gmail/SMTP credentials to the Send Governance Report node.
5. Replace the simulated decision request with a live AI system webhook.

## Prerequisites
- Slack workspace with a bot token
- Gmail or SMTP credentials
- Google Sheets or a database for audit storage

## Use Cases
- Regulatory compliance auditing for AI-driven loan or insurance decisions
- Automated fairness and bias detection in HR or admissions systems

## Customization
- Swap the simulated input with a live AI system API or decision log feed
- Add sub-agents for fairness, bias, or sector-specific compliance checks

## Benefits
- Automates end-to-end AI decision auditing on a schedule
- Ensures high-risk decisions are flagged and stored instantly
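The risk-level routing described above can be sketched as a branching function. The threshold and field names are assumptions for illustration; in the workflow this decision is made by a Switch/IF node fed by the Risk Assessment sub-agent:

```javascript
// Hypothetical sketch of routing by risk level: high-risk decisions go to
// the Slack-alert + separate-storage path, everything else to the merge
// that feeds the governance report.
function routeDecision(decision, highRiskThreshold = 70) {
  if (decision.riskScore >= highRiskThreshold) {
    return { branch: "high-risk", actions: ["slack-alert", "store-separately"] };
  }
  return { branch: "standard", actions: ["merge-into-report"] };
}

console.log(routeDecision({ id: "d1", riskScore: 85 }).branch); // → high-risk
console.log(routeDecision({ id: "d2", riskScore: 20 }).branch); // → standard
```

Exposing the threshold as a parameter keeps the routing tunable when audit policies tighten or relax.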