by David Olusola
💰 Track Daily Fiat & Crypto Exchange Rates Report with ExchangeRate-API & CoinGecko

A simple, reliable workflow that emails you a beautiful HTML currency report every morning at 8:00 AM (your n8n server's timezone). It pulls USD→EUR and USD→NGN fiat rates and BTC/ETH prices (+ 24h % change), then formats a clean HTML email.

## 📌 What It Does

- ⏰ **Schedule:** Runs daily at 8:00 AM
- 🌍 **Fiat Rates:** USD→EUR and USD→NGN (via ExchangeRate-API, no key needed)
- ₿ **Crypto:** BTC & ETH prices + 24h change (via CoinGecko, no key needed)
- ✉️ **Email:** Sends a mobile-friendly HTML email with a plain-text fallback

## 🗺️ Node Map (At a Glance)

| # | Node Name | Type | Purpose |
|---|-----------|------|---------|
| 1 | Daily 8AM Trigger | Schedule Trigger | Fires every day at 08:00 |
| 2 | Get Fiat Exchange Rates | HTTP Request | https://api.exchangerate-api.com/v4/latest/USD |
| 3 | Get Crypto Prices | HTTP Request | CoinGecko simple price endpoint |
| 4 | Merge | Merge | Combines fiat + crypto responses |
| 5 | Format Email Content | Code (v2) | Builds HTML + text, sets subject & summary |
| 6 | Send Daily Currency Email | Email Send | Sends the HTML email via SMTP |

> 📝 Sticky Notes in the canvas explain each section. They're optional and safe to delete.

## ⚙️ Required Setup

### 1) Schedule Time

Open **Daily 8AM Trigger** → set the cron to 08:00 daily. Suggested cron: `0 8 * * *` (server local time; if you're in Lagos, ensure the server timezone is **Africa/Lagos** or adjust accordingly).

### 2) SMTP Credentials

Open **Send Daily Currency Email** → set:

- **From Email:** your sender (e.g. your-email@gmail.com)
- **To Email:** recipient address
- **Credentials:** select your SMTP account

**Gmail tip:** use an **App Password** (with 2FA enabled).

- Server: smtp.gmail.com
- Port: 587 (STARTTLS) or 465 (SSL)
- Auth: your full Gmail address + app password

### 3) API Access

Both endpoints are free and require no API key:

- Fiat (USD base): https://api.exchangerate-api.com/v4/latest/USD
- Crypto (BTC/ETH): https://api.coingecko.com/api/v3/simple/price?ids=bitcoin,ethereum&vs_currencies=usd&include_24hr_change=true

## 🧩 Input Order

The **Format Email Content** node auto-detects which input is fiat vs crypto, so the Merge order doesn't matter. A clean pattern is:

- **Get Crypto Prices** → **Merge** (Input 1)
- **Get Fiat Exchange Rates** → **Merge** (Input 2)
- **Merge** → **Format Email Content** → **Send Daily Currency Email**

## 🚀 Test It Quickly

1. Run **Get Fiat Exchange Rates** → verify `rates.EUR` and `rates.NGN` exist.
2. Run **Get Crypto Prices** → verify BTC/ETH `usd` and `usd_24h_change`.
3. Run **Format Email Content** → check it outputs `subject`, `html`, and `text`.
4. Run **Send Daily Currency Email** → confirm the styled report arrives.

## 🎛 Customize

- **Currencies:** Add more fiat codes from `rates` (e.g., GBP, ZAR) and extend the HTML template.
- **Coins:** Add `ids=` in CoinGecko (e.g., bitcoin,ethereum,solana) and render extra cards.
- **Send time:** Adjust the cron (e.g., `30 7 * * *` for 7:30 AM).
- **Branding:** Edit colors, fonts, and the header gradient in the HTML string.
- **Timezone stamp:** Change the display timezone inside the Code node if needed.

## 🧩 Common Pitfalls & Fixes

- **Email not styled:** Ensure the Email node is set to **HTML** format.
- **Gmail auth fails:** Use an **App Password** and port **587** with STARTTLS.
- **Empty values:** Run the two HTTP nodes once and confirm the responses contain data.
- **Rate limits:** If you increase the frequency, consider adding a short Wait node or caching.
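For reference, here is a simplified sketch of what the **Format Email Content** Code node does, using input shapes that match the two API responses (`rates.EUR`, `bitcoin.usd`, etc.). The real template's HTML is more elaborate; this version is illustrative only.

```javascript
// Simplified sketch of the "Format Email Content" logic.
// Input shapes follow the two APIs: ExchangeRate-API ({ rates: { EUR, NGN } })
// and CoinGecko ({ bitcoin: { usd, usd_24h_change }, ethereum: { ... } }).
function formatReport(fiat, crypto, now = new Date()) {
  const pct = (n) => `${n >= 0 ? '+' : ''}${n.toFixed(2)}%`;
  const html = [
    '<h2>💰 Daily Currency Report</h2>',
    `<p>USD→EUR: <b>${fiat.rates.EUR}</b> &nbsp; USD→NGN: <b>${fiat.rates.NGN}</b></p>`,
    `<p>BTC: $${crypto.bitcoin.usd} (${pct(crypto.bitcoin.usd_24h_change)})</p>`,
    `<p>ETH: $${crypto.ethereum.usd} (${pct(crypto.ethereum.usd_24h_change)})</p>`,
  ].join('\n');
  const text =
    `USD→EUR ${fiat.rates.EUR} | USD→NGN ${fiat.rates.NGN} | ` +
    `BTC $${crypto.bitcoin.usd} | ETH $${crypto.ethereum.usd}`;
  return { subject: `Daily Currency Report - ${now.toDateString()}`, html, text };
}
```

The returned `subject`, `html`, and `text` keys are the same three fields the quick-test checklist above tells you to verify.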
by Muh Resky Adiansyah
Salesforce Leads & Opportunities to PostgreSQL (Backfill & Incremental Sync ETL)

This workflow extracts Lead and Opportunity data from Salesforce, transforms and normalizes it, and loads it into PostgreSQL as a structured data bank for reporting and analytics. It is designed for scalable data ingestion and supports both historical backfill and incremental sync in a single workflow.

## Use Case

This workflow is suitable when you need to:

- Centralize Salesforce data into a database for reporting
- Build a data warehouse for BI tools (Looker Studio, Metabase, etc.)
- Track the lead-to-opportunity lifecycle
- Merge multiple Salesforce objects into a unified dataset
- Maintain a clean and normalized CRM data layer

## Two Input Modes

### 1. Historical Backfill (Manual Trigger)

Run once to populate historical data.

- Set `start_date` and `end_date` in the "Set Historical Date Range" node
- Data is split into 7-day batches
- Each batch is processed sequentially to reduce API load

### 2. Incremental Sync (Schedule Trigger)

Runs automatically (e.g. daily).

- The date range is generated dynamically using ISO datetimes
- Typically pulls data from yesterday until today
- No manual input required

## Batch Processing

Date ranges are processed in weekly batches. This helps:

- Prevent large API requests
- Reduce timeout risk
- Improve stability during backfill
- Keep memory usage efficient

## Core Workflow Logic

### 1. Data Extraction

- Fetch Lead records from Salesforce
- Fetch Opportunity records from Salesforce
- Filter using `CreatedDate` (`since_datetime` and `until_datetime`)

### 2. Phone-Based Routing

Records are split into two paths:

- **Records without a phone:** skip normalization but are still included in the final dataset
- **Records with a phone:** processed for normalization and used for merging

This ensures no data is lost even if the phone is missing.

### 3. Phone Normalization (+62)

Phone numbers are standardized into `+62XXXXXXXXXX`. Steps:

- Remove spaces and symbols
- Remove all non-digit characters
- Convert `0xxxx` → `62xxxx`
- Ensure the prefix is not duplicated (e.g. `6262`)
- Add the `+` prefix

This uses Indonesia's International Direct Dialing (IDD) code: +62.

### 4. Opportunity De-duplication

Duplicate opportunities are removed, based on the normalized phone key. This ensures clean merging and avoids duplicate enrichment.

### 5. Lead–Opportunity Merge

The merge uses the normalized phone fields:

- `body.nomorlead`
- `body.nomoroppty`

Behavior:

- Lead is the primary dataset
- Opportunity enriches the lead
- Records without a phone are still preserved, not removed

### 6. Data Standardization

All records are transformed into a unified schema: Source_Object, SF_Id, CreatedDate, CreatedById, Name, Phone, Clean_Phone, Email, LeadSource, Status, StageName, OwnerId, AccountId, Amount.

### 7. Upsert to PostgreSQL

- Uses UPSERT (insert or update)
- Matching key: `sf_id`
- New data → insert; existing data → update

This ensures no duplicate records and idempotent execution.

## Data Flow Summary

Salesforce (Lead + Opportunity) → Date Filtering → Batch Processing (weekly) → Phone Routing → Phone Normalization (+62) → Opportunity Deduplication → Lead–Opportunity Merge → Data Standardization → PostgreSQL (Upsert)

## Setup Requirements

Before using this workflow, prepare the following:

1. **Salesforce:** Salesforce OAuth2 credential, access to the Lead and Opportunity objects, and API access enabled.
2. **PostgreSQL:** An active PostgreSQL database, credentials configured in n8n, and the table created (see schema below).
3. **n8n Environment:** An n8n instance (cloud or self-hosted) with the Salesforce and PostgreSQL nodes configured.
4. **Date Configuration (Backfill):** Set `start_date` and `end_date` manually in the "Set Historical Date Range" node.
5. **Schedule Configuration (Incremental):** Configure the Schedule Trigger; daily execution during off-peak hours is recommended.

## Minimal PostgreSQL Table Schema

```sql
CREATE TABLE n8n_salesforce_data (
  sf_id TEXT PRIMARY KEY,
  Source_Object TEXT,
  CreatedDate TIMESTAMP,
  CreatedById TEXT,
  Name TEXT,
  Phone TEXT,
  Clean_Phone TEXT,
  Email TEXT,
  LeadSource TEXT,
  Status TEXT,
  StageName TEXT,
  OwnerId TEXT,
  AccountId TEXT,
  Amount NUMERIC,
  synced_at TIMESTAMP DEFAULT NOW()
);
```

## Important Notes

- `sf_id` is used as the unique key for the upsert
- `Clean_Phone` is recommended for indexing if used in analytics
- Data consistency depends on phone-normalization quality
- The schema must be updated manually if additional fields are added

## Known Limitations

- Phone-based matching may fail if phone numbers are inconsistent or missing in both Lead and Opportunity
- No deduplication for Leads (only Opportunities are handled)
- No retry logic for API failures (can be added)

## Recommended Improvements

- Add an index on `Clean_Phone` for faster queries
- Add a logging table for monitoring ETL runs
- Add retry and error-handling nodes
- Extend support for Contact, Account, and Campaign data

## Summary

This workflow provides a reliable and scalable way to extract Salesforce data, normalize and merge datasets, store structured data in PostgreSQL, and enable analytics and reporting pipelines. It is best suited for teams building a lightweight data warehouse layer on top of Salesforce.
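For reference, the +62 normalization described in step 3 can be sketched as a small Code-node helper. The function name is illustrative, not copied from the workflow:

```javascript
// Illustrative sketch of the +62 phone normalization described above.
function normalizePhone(raw) {
  if (!raw) return null;
  let digits = String(raw).replace(/\D/g, '');                  // drop spaces, symbols, non-digits
  if (digits.startsWith('0')) digits = '62' + digits.slice(1);  // 0xxxx → 62xxxx
  while (digits.startsWith('6262')) digits = digits.slice(2);   // collapse duplicated prefix
  if (!digits.startsWith('62')) digits = '62' + digits;         // ensure country code
  return '+' + digits;                                          // add "+" prefix
}
```

Because the same function is applied to both Lead and Opportunity phone fields before merging, inconsistent raw formats (`0812-...`, `+62 812 ...`, `62812...`) all collapse to one canonical key.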
by Roshan Ramani
## Who's it for

Business owners, marketers, and web developers who want to respond instantly to website contact form submissions and maintain organized lead records without manual monitoring.

## What it does

This workflow automatically processes contact form submissions from your website, sending immediate WhatsApp notifications with formatted lead details while simultaneously logging all data to Google Sheets for organized lead management and follow-up tracking.

## How it works

When someone submits your website contact form, the webhook instantly receives the data, formats it into a professional WhatsApp message with emojis and structure, sends the notification to your phone, and logs all details (name, email, phone, service, message, timestamp) to a Google Sheets database for permanent storage and analysis.

## Requirements

- WhatsApp Business API credentials
- Google Sheets API access with a spreadsheet containing these columns:
  - date (timestamp)
  - name (contact's full name)
  - email (contact's email address)
  - phone (contact's phone number)
  - service (requested service/interest)
  - message (contact's message/inquiry)
- A website contact form that can POST to the webhook URL with fields: name, email, phone, service, message
- n8n instance (self-hosted or cloud)

## Google Sheets Setup

Create a new Google Sheet with the following column headers in row 1:

- Column A: date
- Column B: name
- Column C: email
- Column D: phone
- Column E: service
- Column F: message

The workflow automatically populates these columns with each form submission and uses the date column for duplicate checking.

## How to set up

1. **Credentials:** Configure WhatsApp Business API credentials in the WhatsApp node, then set up the Google Sheets API connection and grant the necessary permissions.
2. **Configuration:** Update the recipient phone number in the WhatsApp node (format: +1234567890), replace the Google Sheets document ID with your spreadsheet ID, and ensure your Google Sheet has the column structure above.
3. **Integration:** Copy the webhook URL from the Contact Form Trigger node and configure your website form to POST data to this endpoint with field names: name, email, phone, service, message.
4. **Testing:** Submit a sample form entry, then verify the WhatsApp notification arrives and the data appears in Google Sheets.

## How to customize the workflow

- **Message Format:** Modify the WhatsApp message template in the Format Lead Data node
- **Additional Fields:** Add more form fields by updating both the Code node and the Google Sheets mapping
- **Email Notifications:** Add an Email node after the Format Lead Data node
- **Conditional Logic:** Set up different notifications for high-priority services or VIP customers
- **Data Validation:** Add filtering rules in the Code node to handle spam or invalid submissions
- **Multiple Recipients:** Configure the WhatsApp node to alert multiple team members
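As a rough sketch, the Format Lead Data Code node could look like this. The field names follow the form spec above; the exact emoji layout in the template may differ:

```javascript
// Builds the WhatsApp notification text and the Google Sheets row
// from a webhook form submission (name, email, phone, service, message).
function formatLead(body, now = new Date()) {
  const row = {
    date: now.toISOString(),   // used by the sheet's duplicate check
    name: body.name,
    email: body.email,
    phone: body.phone,
    service: body.service,
    message: body.message,
  };
  const whatsappText = [
    '🔔 *New Website Lead*',
    `👤 Name: ${row.name}`,
    `📧 Email: ${row.email}`,
    `📱 Phone: ${row.phone}`,
    `🛠 Service: ${row.service}`,
    `💬 Message: ${row.message}`,
  ].join('\n');
  return { row, whatsappText };
}
```

Note that `row` lists its keys in the same A-F column order as the sheet setup above, so the Sheets node can map fields one-to-one.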
by Cong Nguyen
## 📄 What this workflow does

This workflow automates finding LinkedIn leads and writing personalized outreach messages. It takes user input (keywords + purpose), generates a Boolean LinkedIn search query with Gemini, fetches up to 20 results via the Google Custom Search API, logs them into Google Sheets, and then drafts custom outreach messages for each lead. Finally, the workflow updates the sheet and optionally sends you an email notification with the results.

## 👤 Who is this for

- Sales and business development teams who want to automate LinkedIn prospecting
- Recruiters searching for candidates and generating outreach at scale
- Marketers or founders looking for potential partners, clients, or collaborators
- Anyone tired of manual LinkedIn searches and copy-pasting outreach messages

## ✅ Requirements

- Google Sheets account (with a sheet for storing LinkedIn leads + messages)
- Google Custom Search Engine (CSE) enabled with "Search the entire web" and a valid `cx`
- Gemini API access (for Boolean query generation + outreach message drafting)
- SMTP credentials for optional email notifications

## ⚙️ How to set up

1. Connect your Google Sheets account and select the sheet to store results.
2. Configure Gemini API credentials in n8n for both search-query and outreach-message generation.
3. Create a Google Custom Search Engine and note down the `key` and `cx`.
4. Update the HTTP Request node with your credentials (`key`, `cx`, `hl`, `gl`).
5. Set up SMTP credentials if you want email notifications.
6. Publish the Form trigger and test with sample keywords + purposes.

## 🔁 How it works

1. **Form Submit** → Collects user input: keywords + purpose of contact.
2. **Gemini (Boolean Generator)** → Creates a LinkedIn-specific search query (`site:linkedin.com`).
3. **Google Custom Search API** → Fetches up to 20 matching profiles or company pages.
4. **Append to Google Sheets** → Saves name, LinkedIn URL, and description.
5. **Split & Loop** → Processes each LinkedIn entry one by one.
6. **Gemini (Message Writer)** → Generates personalized outreach messages using the purpose + company info.
7. **Update Google Sheets** → Adds the outreach message to the matching LinkedIn row.
8. **Optional Email Notification** → Sends you a link to the updated sheet.

## 💡 About Margin AI

Margin AI is an AI-services agency that acts as your AI Service Companion. We design intelligent, human-centric automation solutions, turning your team's best practices into scalable, automated workflows and tools. Industries like marketing, sales, and operations benefit from our tailored AI consulting, automation tools, chatbot development, and more.
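For reference, the Google Custom Search request (step 3 of "How it works") can be sketched as below. The query parameters (`key`, `cx`, `q`, `num`, `start`, `hl`, `gl`) are real Custom Search API parameters; note the API returns at most 10 results per call, so fetching 20 means paging with `start=1` and `start=11`.

```javascript
// Builds a Google Custom Search API URL for a LinkedIn-scoped Boolean query.
function buildCseUrl(booleanQuery, key, cx, start = 1) {
  const params = new URLSearchParams({
    key,
    cx,
    q: `site:linkedin.com ${booleanQuery}`,
    num: '10',            // CSE maximum per request
    start: String(start), // 1 for results 1-10, 11 for results 11-20
    hl: 'en',
    gl: 'us',
  });
  return `https://www.googleapis.com/customsearch/v1?${params.toString()}`;
}
```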
by Cristian Tala Sánchez
## Who is this workflow for

This workflow is designed for busy professionals, productivity enthusiasts, and teams drowning in email overload. Whether you're a startup founder, operations manager, executive assistant, or team lead, this solution helps you bring structure and clarity to your inbox. If you struggle to identify which emails deserve immediate attention versus which can be scheduled, delegated, or ignored, this workflow is for you.

## What it does / How it works

This n8n automation integrates Gmail and OpenAI to apply the Eisenhower Matrix—a classic productivity framework—to incoming emails. It reads each new unread email from your inbox and automatically classifies it into one of four categories based on urgency and importance:

- **Urgent + Important:** Critical messages requiring immediate action (e.g., legal, financial, investor, or user-blocking issues).
- **Not Urgent + Important:** High-value strategic emails you should schedule time for (e.g., partnership opportunities, key coordination).
- **Urgent + Not Important:** Time-sensitive but delegable tasks (e.g., routine operations or technical updates).
- **Not Urgent + Not Important:** Low-value noise like spam or promotions that should be archived or batch-reviewed later.

The classification is powered by a GPT model with custom prompts tailored to understand email context and assign the right category with high accuracy. After classification, the workflow adds the appropriate label in Gmail and automatically archives emails marked as low priority. The result: your inbox becomes a dynamic priority system, helping you make faster, smarter decisions without the mental load of constant triage.

## How to set it up

1. **Create Gmail Labels:** Manually create these four labels in your Gmail account:
   - Urgent + Important
   - Not Urgent + Important
   - Urgent + Not Important
   - Not Urgent + Not Important
2. **Connect Accounts in n8n:** Set up Gmail OAuth2 credentials in your n8n instance to allow reading and labeling emails, and add your OpenAI API key to enable the AI classification.
3. **Update Label IDs:** In the Gmail nodes of the workflow, replace the label IDs (e.g., `Label_4335697733647997523`) with the IDs from your own Gmail account. You can find these by creating a dummy workflow with a "Gmail → Get All Labels" node.
4. **Test and Deploy:** Run the workflow manually with the "Execute Workflow" trigger, or let it run automatically via the Gmail Trigger, which polls every minute. Review your Gmail inbox to see how labels are applied and confirm the archive function works as expected.

## Requirements

- A free or paid n8n instance (self-hosted or cloud)
- Gmail account with OAuth2 access configured in n8n
- OpenAI API key (GPT-4.1-mini recommended for accuracy)
- Four predefined Gmail labels that match the Eisenhower Matrix
- Basic familiarity with editing Gmail node parameters in n8n

## How to customize the workflow

- **Tailor the AI prompts:** Update the Text Classifier node with your own examples or definitions of what counts as urgent or important in your business context.
- **Refine inputs:** Add filters to process only emails from specific senders or domains (e.g., VIP clients, your team).
- **Extend outcomes:** Trigger Slack alerts for urgent messages, auto-reply to certain senders, or sync scheduled items to your calendar.
- **Localization:** Adjust labels and prompts to match your preferred language or naming conventions.
- **Archive rules:** Modify the "Remove Labels" node to exclude specific categories from archiving or add additional cleanup actions.

## Why this improves productivity

This workflow removes decision fatigue from your email routine. By automatically labeling and sorting emails according to urgency and importance, you:

- Spend less time sorting emails manually
- Focus energy on what truly matters
- Schedule strategic tasks thoughtfully
- Delegate or ignore distractions confidently

Instead of reacting to your inbox, you take control of it—turning email chaos into a structured priority system that aligns with your goals.
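For reference, the contract between the classifier and the Gmail labeling/archiving steps can be sketched like this. The label IDs are placeholders you must replace with your own, as described in setup step 3:

```javascript
// Maps the classifier's quadrant name to a Gmail label ID and decides
// whether the email should also be archived (low-priority quadrant only).
const QUADRANT_LABELS = {
  'Urgent + Important':         'Label_REPLACE_ME_1',
  'Not Urgent + Important':     'Label_REPLACE_ME_2',
  'Urgent + Not Important':     'Label_REPLACE_ME_3',
  'Not Urgent + Not Important': 'Label_REPLACE_ME_4',
};

function resolveLabel(classifierOutput) {
  const category = classifierOutput.trim();
  if (!(category in QUADRANT_LABELS)) {
    throw new Error(`Unexpected category from model: ${category}`);
  }
  return {
    category,
    labelId: QUADRANT_LABELS[category],
    archive: category === 'Not Urgent + Not Important',
  };
}
```

Constraining the model to these four exact strings (and failing loudly on anything else) is what keeps the Gmail labeling deterministic even though the classification itself is probabilistic.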
by Razvan Bara
## How it works

This trip-weather-forecasting workflow is event-driven: it starts when a calendar event is created or updated, and provides timely weather alerts and forecasts tailored to your travel dates and locations. Overall, it efficiently integrates calendar travel plans with real-time and updated weather intelligence for ultimate travel preparedness and peace of mind.

## From the creator

If you're jetting off frequently, bouncing between time zones, juggling meetings, and squeezing every drop of life out of travel, you need this flow. This ain't your grandma's weather app. It's a bulletproof system that scans your calendar, mines your trips, and delivers laser-targeted weather intel and urgent alerts, right when you need it.

**No more surprises.** **No more scrambling.** Just real-time weather mastery that saves your schedule.

You're not just traveling: you're dominating. This flow makes sure the only thing you worry about is your next move, not whether the weather's gonna ruin it. Time to upgrade from a tourist to a boss.

## Step-by-step

- 📅 **Google Calendar Triggers (Event Created/Updated):** The workflow starts immediately when any calendar event is created or updated, enabling real-time detection of new or changed travel plans.
- ✈ **Identify Trips:** Filters calendar events to detect travel-related trips by matching keywords such as "trip," "flight," or "vacation" in titles or descriptions.
- 📍 **Extract Locations:** Parses each trip event's details to extract the start and end dates and the destination from the summary/description/location fields.
- 🌐 **Build interrogation URL:** Constructs a Visual Crossing API request URL dynamically based on the extracted trip location and dates, including daily forecasts and alerts, then fetches the detailed weather forecast and alert data for the trip location and duration right after detecting the event.
- 🌤️ **Format Forecast:** Formats the raw weather data into a readable summary 🌪🌀 including temperatures, precipitation probabilities, conditions, and any severe weather alerts.
- 📲📧 **Send Forecast:** Sends the forecast summary with alerts via Telegram to keep you informed instantly.
- ⌛ **One day before the trip:** Pauses the workflow until exactly one day before the trip start date, ensuring a timely second fetch when more accurate or updated weather data is available, and sends the updated forecast.

## Optional

You can replace the Telegram node with email, WhatsApp, Slack, or SMS notifications, or add multiple notification nodes to receive them across all desired channels.
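For reference, the "Build interrogation URL" step can be sketched as below. The endpoint shape follows Visual Crossing's Timeline API; treat the exact parameter set as an assumption and verify it against their documentation. The API key is a placeholder.

```javascript
// Builds a Visual Crossing Timeline API URL for a trip's location and dates.
// Dates are ISO strings (YYYY-MM-DD).
function buildWeatherUrl(location, startDate, endDate, apiKey) {
  const base =
    'https://weather.visualcrossing.com/VisualCrossingWebServices/rest/services/timeline';
  return `${base}/${encodeURIComponent(location)}/${startDate}/${endDate}` +
    `?unitGroup=metric&include=days%2Calerts&contentType=json&key=${apiKey}`;
}
```

The `include=days,alerts` parameter is what pulls both the daily forecast and any severe-weather alerts in a single request.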
by ScoutNow
Automated Daily Competitor Tweet Summarizer with X API, GPT-5-Nano, and Gmail

Stay on top of your competition with this powerful n8n workflow that automatically fetches and summarizes your competitors' latest tweets every day. Using the official X (formerly Twitter) API and OpenAI's GPT-5-Nano model, this template extracts insights from public tweets and sends concise summaries directly to your Gmail inbox.

Ideal for marketing teams, product managers, PR professionals, and competitive intelligence analysts, this solution turns noisy social feeds into clear, actionable summaries—automated and customized to your needs.

## Features

- **Daily automation:** Fetches competitor tweets every 24 hours using the X API
- **AI summarization:** Uses GPT-5-Nano to highlight key insights and themes
- **Smart filtering:** Cleans and filters tweets for relevance before summarizing
- **Email delivery:** Sends summaries to Gmail (or your team's inbox)
- **Fully customizable:** Modify schedules, accounts, and integrations as needed

## Setup Instructions

1. **Get API Keys:**
   - X API (Bearer Token) – from developer.x.com
   - OpenAI API Key – from platform.openai.com
   - Gmail OAuth2 credentials (via Google Cloud Console)
2. **Configure in n8n:**
   - Import the workflow
   - Add credentials under the "Credentials" tab
   - Set target X usernames and the schedule
3. **Customize Delivery (Optional):**
   - Set the email subject and recipients
   - Add additional integrations (e.g., Slack, Notion, Sheets)

## How It Works

1. **Trigger:** A daily cron node initiates the workflow.
2. **Fetch User ID:** The workflow uses the X API to retrieve the user ID for the provided username. This step is necessary because the tweet-retrieval endpoint requires a user ID, not a username.
3. **Fetch Tweets:** Using the retrieved user ID, the workflow queries the X API for recent tweets from the selected account.
4. **Clean Data:** Filters out replies, retweets, and any irrelevant content so only meaningful tweets are summarized.
5. **Summarize:** GPT-5-Nano processes the cleaned tweet content and generates a concise, insightful summary.
6. **Send Email:** The Gmail node sends the final summary to your inbox or chosen recipient.

## Use Cases

- Track competitor announcements and marketing messages
- Automate daily social media briefs for leadership
- Monitor trends in your industry effortlessly
- Keep your team aligned with market developments

## Requirements

- Valid X API credentials (Bearer token)
- OpenAI API key
- Gmail OAuth2 credentials
- Access to n8n (cloud or self-hosted)

## Delivery Options

While Gmail is the default, you can easily extend the workflow to integrate with Slack, Notion, Google Sheets, webhooks, or any supported n8n integration.

Automate your competitive intelligence process and stay informed—without lifting a finger.
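For reference, the "Clean Data" step can be sketched as follows, assuming the X API v2 response shape (a `data` array whose items may carry `referenced_tweets` entries for replies and retweets):

```javascript
// Keeps only original tweets: drops replies, retweets, and empty items.
function cleanTweets(apiResponse) {
  return (apiResponse.data || [])
    .filter((t) => {
      const refs = t.referenced_tweets || [];
      const isReply = refs.some((r) => r.type === 'replied_to');
      const isRetweet =
        refs.some((r) => r.type === 'retweeted') ||
        (t.text || '').startsWith('RT @'); // legacy retweet text form
      return !isReply && !isRetweet && Boolean(t.text);
    })
    .map((t) => t.text);
}
```

Quoted tweets are deliberately kept here, since a competitor quoting an announcement often carries real signal; tighten the filter if you want originals only.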
by ermanatalay
Create a powerful brand/company monitoring system that fetches news headlines, performs AI-powered sentiment analysis, and delivers witty, easy-to-read reports via email. This workflow turns brand mentions into a lively "personality analysis," making your reports not only insightful but also fun to read. Perfect for teams that want to stay informed and entertained.

## How it works

- **Data Collection:** A Google Sheets table captures the brand name and recipient email, which triggers the workflow.
- **News Aggregation:** The RSS Read node fetches recent news headlines from Google News based on the specified brand or company keyword.
- **Content Processing:** News headlines are aggregated and formatted for AI analysis.
- **AI Analysis:** The Gemini 2.5 Flash model plays the role of a brand analyst, writing reports as if the brand were a character in a story. It highlights strengths, quirks, and challenges in a witty, narrative-driven style, while still providing sentiment scores and action points.
- **Report Generation:** JavaScript code structures the AI response into well-formatted HTML paragraphs for a smooth email reading experience.
- **Automated Delivery:** Gmail integration sends the analysis report directly to the specified email address.

## How to use

1. Create a Google Sheets document with sheet name `page1`, cell A1 = `keyword`, and cell B1 = `email`. The system reads the keyword and email data when a new row is entered.
2. Paste the URL of your Google Sheets document into the first trigger node and select trigger on "row added" in the node.
3. Enter your credentials to connect a Gemini PaLM API account in Google's "message a model" node.
4. Enter your credentials to connect a Gmail account in the "send a message" node.
5. The workflow runs automatically when a new row is detected. Recipients receive comprehensive sentiment analysis reports within minutes!

## Requirements

- Google Sheets URL
- Google Gemini API credentials for AI analysis
- Gmail API credentials for email delivery
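The report-generation step (structuring the AI response into HTML paragraphs) could look roughly like this; the actual styling in the workflow may differ:

```javascript
// Splits the model's plain-text report into escaped HTML paragraphs.
function toHtmlParagraphs(aiText) {
  const escapeHtml = (s) =>
    s.replace(/&/g, '&amp;').replace(/</g, '&lt;').replace(/>/g, '&gt;');
  return aiText
    .split(/\n{2,}/)                 // blank lines separate paragraphs
    .map((p) => p.trim())
    .filter(Boolean)
    .map((p) => `<p style="margin:0 0 12px;line-height:1.5">${escapeHtml(p)}</p>`)
    .join('\n');
}
```

Escaping first matters because model output can legitimately contain `&`, `<`, or `>` (scores, comparisons) that would otherwise break the email markup.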
by Dahiana
This template demonstrates how to build an AI-powered name generator that creates realistic names, perfect for UX/UI designers, developers, and content creators.

**Use cases:** User persona creation, mockup development, prototype testing, customer testimonials, team member listings, app interface examples, website content, accessibility testing, and any scenario requiring realistic placeholder names.

## How it works

- **AI-Powered Generation:** Uses any LLM to generate names based on your specifications
- **Customizable Parameters:** Accepts gender preferences, name count, and optional reference names for style matching
- **UX/UI Optimized:** Names are specifically chosen to work well in design mockups and prototypes
- **Smart Formatting:** Returns clean JSON arrays ready for integration with design tools and applications
- **Reference Matching:** Can generate names similar in style to a provided reference name

## How to set up

1. Replace the "Dummy API" credentials with your preferred language model's API key
2. Update the webhook path and authentication as needed for your application
3. Test with different parameters: gender (masculine/feminine/neutral), count (1-20), reference_name (optional)
4. Integrate the webhook URL with your design tools, Bubble apps, or other platforms

## Requirements

- LLM API access (OpenAI, Claude, or another language model)
- n8n instance (cloud or self-hosted)
- A platform capable of making HTTP POST requests

## API Usage

POST to the webhook with a JSON body (`reference_name` is optional):

```json
{
  "gender": "masculine",
  "count": 5,
  "reference_name": "Alex Chen"
}
```

Response:

```json
{
  "success": true,
  "names": ["Marcus Johnson", "David Kim", "Sofia Rodriguez", "Chen Wei", "James Wilson"],
  "count": 5
}
```

## How to customize

- Modify the AI prompt for specific naming styles or regions
- Add additional parameters (age, profession, cultural background)
- Connect to databases for persistent name storage
- Integrate with design tool APIs (Figma, Sketch, Adobe XD)
- Create batch processing for large mockup projects
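A client-side sketch for calling the webhook, enforcing the documented parameter ranges before sending. The helper name and validation logic are illustrative, not part of the template:

```javascript
// Validates parameters per the docs above (gender enum, count 1-20,
// optional reference_name) and returns fetch() options for the POST.
function buildNameRequest({ gender = 'neutral', count = 5, reference_name } = {}) {
  if (!['masculine', 'feminine', 'neutral'].includes(gender)) {
    throw new RangeError(`Unsupported gender: ${gender}`);
  }
  if (!Number.isInteger(count) || count < 1 || count > 20) {
    throw new RangeError('count must be an integer between 1 and 20');
  }
  const body = { gender, count };
  if (reference_name) body.reference_name = reference_name;
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  };
}
```

Usage would be along the lines of `fetch(webhookUrl, buildNameRequest({ count: 3 }))`, where `webhookUrl` is the URL you published from the workflow.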
by Rahul Joshi
## 📘 Description

This workflow automates the incident response lifecycle — from creation to communication and archival. It instantly creates Jira tickets for new incidents, alerts the on-call Slack team, generates timeline reports, logs status in Google Sheets, and archives documentation to Google Drive — all automatically. It helps engineering and DevOps teams respond faster, maintain audit trails, and ensure no incident details are lost, even after Slack or Jira history expires.

## ⚙️ What This Workflow Does (Step-by-Step)

- 🟢 **Manual Trigger** – Starts the incident creation and alerting process manually, on demand.
- 🏷️ **Define Incident Metadata** – Sets up standardized incident data (Service, Severity, Description) used across Jira, Slack, and Sheets for consistent processing.
- 🎫 **Create Jira Incident Ticket** – Automatically creates a Jira task with service, severity, and description fields, returning a unique Jira key and link for tracking.
- ✅ **Validate Jira Ticket Creation Success** – Confirms the Jira ticket was created before continuing. True path: proceeds to the Slack alert and documentation flow. False path: logs the failure details to Google Sheets for debugging.
- 🚨 **Log Jira Creation Failures to Error Sheet** – Records any Jira API errors, permission issues, or timeouts to an error-log sheet for reliability monitoring.
- 🔗 **Combine Incident & Jira Data** – Merges the incident context with the Jira ticket data so all details are unified for downstream notifications.
- 💬 **Format Incident Alert for Slack** – Generates a rich Slack message containing the Jira key, service, severity, and description, with clickable Jira links.
- 📢 **Alert On-Call Team in Slack** – Posts the formatted message directly to the #oncall Slack channel to instantly notify engineers.
- 📋 **Generate Incident Timeline Report** – Parses the Slack message content to create a detailed incident timeline, including timestamps, service, severity, and placeholders for postmortem tracking.
- 📄 **Convert Timeline to Text File** – Converts the generated timeline into a structured .txt file for archival and compliance.
- ☁️ **Archive Incident Timeline to Drive** – Uploads the finalized incident report to Google Drive ("Incident Reports" folder) with timestamped filenames for traceability.
- 📊 **Log Incident to Status Tracking Sheet** – Appends the Jira key, service, severity, and timestamp to the "status update" Google Sheet to build a live incident dashboard and enable SLA tracking.

## 🧩 Prerequisites

- Jira account with API access
- Google Sheets for "status update" and "error log" tracking
- Slack workspace connected via API credentials
- Google Drive access for archival

## 💡 Key Benefits

- ✅ Instant Slack alerts for new incidents
- ✅ Centralized Jira ticketing and tracking
- ✅ Automated timeline documentation for audits
- ✅ Seamless Google Drive archival and status logging
- ✅ Reduced MTTR through faster communication

## 👥 Perfect For

- DevOps and SRE teams managing production incidents
- Engineering managers overseeing uptime and reliability
- Organizations needing automated post-incident documentation
- Teams focused on SLA adherence and compliance reporting
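For reference, the Slack-formatting step can be sketched as below. The Jira base URL and the incident field names are placeholders:

```javascript
// Formats the combined incident + Jira data into a Slack message with a
// clickable Jira link, using Slack's mrkdwn <url|label> link syntax.
function formatSlackAlert(incident, jira, jiraBaseUrl = 'https://your-domain.atlassian.net') {
  const link = `${jiraBaseUrl}/browse/${jira.key}`;
  return [
    `:rotating_light: *New Incident:* <${link}|${jira.key}>`,
    `*Service:* ${incident.service}`,
    `*Severity:* ${incident.severity}`,
    `*Description:* ${incident.description}`,
  ].join('\n');
}
```

Keeping the Jira key in the first line means the on-call channel's notification preview already shows which ticket to open.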
by Javier Rieiro
## Short description

Automates the collection, technical extraction, and automatic generation of Nuclei templates from public CVE PoCs. Converts verified PoCs into reproducible detection templates ready for testing and distribution.

## Purpose

Provide a reliable pipeline that turns public proof-of-concept data into usable detection artifacts, reducing the manual work involved in finding PoCs, extracting exploit details, validating sources, and building Nuclei templates.

## How it works (technical summary)

1. Runs a scheduled SSH job that executes vulnx with filters for recent, high-severity PoCs.
2. Parses the raw vulnx output and splits it into individual CVE entries.
3. Extracts structured fields: CVE ID, severity, title, summary, risk, remediation, affected products, PoCs, and references.
4. Extracts URLs from PoC sections using a regex.
5. Validates each URL with HTTP requests; invalid or unreachable links are logged and skipped.
6. Uses an AI agent (OpenAI via LangChain) to extract technical artifacts: exploit steps, payloads, endpoints, raw HTTP requests/responses, parameters, and reproduction notes. The prompt forces technical-only output.
7. Sends the extracted technical content to the ProjectDiscovery Cloud API to generate Nuclei templates.
8. Validates the AI and API responses; accepted templates are saved to a configured Google Drive folder.
9. Produces JSON records and logs for each processed CVE and URL.

## Output

- Nuclei templates in ProjectDiscovery format (YAML) stored in Google Drive
- Structured JSON per CVE with metadata and extracted technical details
- Validation logs for URL checks, AI extraction, and template generation

## Intended audience

- Bug bounty hunters
- Security researchers and threat intel teams
- Automation engineers who need reproducible detection templates

## Setup & requirements

- n8n instance with the workflow imported
- SSH access to a host with vulnx installed
- OpenAI API key for technical extraction
- ProjectDiscovery API key for template generation
- Google Drive OAuth2 credentials for storing templates
- Configure the schedule trigger and the target Google Drive folder ID

## Security and usage notes

- Performs static extraction and validation only; no active exploitation.
- Processes only PoCs that meet the configured filters (e.g., CVSS > 6).
- Use responsibly. Do not target systems you do not own or have explicit permission to test.
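The URL-extraction step (step 4 above) can be approximated with a simple regex pull; the production node may use a stricter pattern:

```javascript
// Extracts unique http(s) URLs from a CVE's PoC text section.
function extractPocUrls(pocText) {
  // Stop at whitespace, quotes, angle brackets, and closing brackets/parens
  // so trailing punctuation from prose doesn't stick to the URL.
  const matches = pocText.match(/https?:\/\/[^\s"'<>)\]]+/g) || [];
  return [...new Set(matches)]; // de-duplicate, preserving first-seen order
}
```

Each extracted URL then goes through the HTTP validation step, so an over-eager match here only costs one failed request, not a bad template.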
by Avkash Kakdiya
## How it works

This workflow starts whenever a new lead submits a Typeform. It captures the lead's details, checks their budget, and routes them based on priority and source. High-budget leads are pushed into HubSpot with a follow-up task for sales. Facebook leads are logged in Google Sheets for marketing, while SurveyMonkey leads are stored in Airtable for campaign tracking. Finally, every lead receives an automated Gmail acknowledgment to confirm receipt and set expectations.

## Step-by-step

1. **Capture Leads:** The workflow listens for new form responses from Typeform. Each lead's details — name, email, phone, budget, and message — are captured for processing.
2. **Prioritize High-Budget Leads:** The budget field is checked. If the budget is greater than $5,000, the lead is flagged as high priority, added or updated in HubSpot CRM, and a priority follow-up task is created in HubSpot for the sales team.
3. **Route by Lead Source:** If the source is Facebook, the lead is logged in a Google Sheet for marketing analysis. If the source is SurveyMonkey, the lead is stored in Airtable for structured campaign tracking.
4. **Send Auto-Response:** After storage, every lead receives an automated Gmail reply thanking them for their interest and assuring them that the sales team will follow up within 24 hours.

## Why use this?

- Captures and organizes leads from multiple channels in one workflow
- Flags and escalates high-budget leads instantly to sales
- Routes leads to the right system (HubSpot, Google Sheets, Airtable) based on source
- Automates acknowledgment emails, improving response time and customer experience
- Saves manual effort by centralizing lead capture, qualification, and routing in one place
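The routing rules above boil down to something like the following sketch; the helper name and return values are illustrative:

```javascript
// Routes a Typeform lead per the rules above: budget > $5,000 → HubSpot,
// otherwise by source (Facebook → Google Sheets, SurveyMonkey → Airtable).
function routeLead(lead) {
  // Tolerate formatted budget strings like "$7,500".
  const budget = Number(String(lead.budget).replace(/[^0-9.]/g, ''));
  if (budget > 5000) return 'hubspot';
  if (lead.source === 'Facebook') return 'google_sheets';
  if (lead.source === 'SurveyMonkey') return 'airtable';
  return 'unrouted';
}
```

The budget check runs before the source check, so a high-budget Facebook lead still lands in HubSpot first.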