by Yaron Been
# Create Viral LinkedIn Content with O3 & GPT-4.1-mini Multi-Agent Team

This n8n workflow is a multi-agent LinkedIn content factory. At its heart is the Content Director Agent (O3 model), which acts as the project manager. It listens for incoming chat messages, analyzes them, and coordinates a team of AI specialists (all powered by GPT-4.1-mini) to produce viral, engaging, and optimized LinkedIn content.

## 🟢 Section 1 – Workflow Entry & Strategy Layer

Nodes:

- 🔔 **When chat message received** → Captures LinkedIn requests (your idea, draft, or prompt).
- 🧠 **Content Director Agent (O3)** → Acts as the leader, deciding how the content should be structured and which specialists to call.
- 💡 **Think Node** → Helps the Director brainstorm and evaluate possible approaches before delegating.
- 🤖 **OpenAI Chat Model Director (O3)** → The Director's brain, providing strategic-level thinking.

✅ Beginner-friendly benefit: This section is the "command center." Any LinkedIn content request starts here and is transformed into a clear, strategic plan before moving to specialists.

## ✍️ Section 2 – Content Creation Specialists

Nodes:

- ✍️ **LinkedIn Copywriter** → Creates viral hooks, compelling posts, and platform-friendly messaging.
- 🎓 **Domain Expert** → Ensures technical accuracy and industry authority in the post.
- 📝 **Proofreader & Editor** → Polishes content for grammar, tone, and style.

Each agent connects to its own GPT-4.1-mini model for cost-efficient, specialized output.

✅ Beginner-friendly benefit: This section is your content writing team, from drafting, to adding expertise, to polishing for professional LinkedIn standards.

## 🚀 Section 3 – Engagement & Optimization Specialists

Nodes:

- 🚀 **Engagement Strategist** → Crafts hashtags, posting times, and audience growth strategies.
- 🎨 **Visual Content Strategist** → Designs carousels, infographics, and visual ideas.
- 📊 **Content Performance Analyst** → Tracks analytics, measures performance, and suggests improvements.
Each of these also relies on GPT-4.1-mini, keeping cost low while delivering specialized insights.

✅ Beginner-friendly benefit: This section is your growth & marketing team, ensuring your content doesn't just look good but also performs well and reaches the right audience.

## 📊 Summary Table

| Section | Key Nodes | Role | Beginner Benefit |
| --- | --- | --- | --- |
| 🟢 Entry & Strategy | Trigger, Director, Think, O3 Model | Strategy & planning | Turns your idea into a clear strategy |
| ✍️ Content Creation | Copywriter, Domain Expert, Proofreader | Writing & refinement | Produces expert-level, polished content |
| 🚀 Engagement & Optimization | Engagement, Visuals, Analytics | Growth & performance | Maximizes reach, visuals, and results |

## 🌟 Why This Workflow Rocks

- **All-in-one content team** → Strategy + Writing + Optimization
- **Low cost** → O3 only for strategy, GPT-4.1-mini for specialists
- **Parallel agents** → Work simultaneously for faster results
- **Scalable** → Reusable for any LinkedIn content need

👉 Even a beginner can use this workflow: just send a LinkedIn content idea (e.g., "Write a post on AI in finance"), and your AI team handles the rest: writing, polishing, visuals, and engagement tactics.
by Intuz
This n8n template from Intuz provides a complete and automated solution for deep-dive lead research and hyper-personalized email generation. It transforms a basic list of LinkedIn profiles into a campaign-ready database by first enriching contacts with detailed career data and then using AI to craft unique, context-aware emails based on each individual's professional journey.

## Who's this workflow for?

- Sales Development Representatives (SDRs)
- Account Executives (AEs)
- B2B Marketers & Growth Hackers
- Recruiters & Talent Acquisition Specialists
- Startup Founders

## How it works

1. **Scheduled Data Fetch:** The workflow runs automatically on a schedule, fetching a list of leads (containing LinkedIn URLs) from a Google Sheet.
2. **Enrich Profiles with Apify:** For each lead that hasn't been processed, it uses an Apify actor to scrape their LinkedIn profile, extracting key information like their "About" section and detailed work experience.
3. **Update Central Database:** The scraped career history is saved back into a "Profile Data" column in the original Google Sheet, creating a rich, centralized lead profile.
4. **AI Email Personalization:** The workflow sends the complete, enriched profile data to a Google Gemini AI model via LangChain, using a sophisticated prompt that instructs it to act as an expert B2B copywriter.
5. **Craft a Unique Hook:** The AI analyzes the lead's entire career journey to find unique "nuggets"—like long tenure, specific achievements, or unusual career paths—and uses them to write a compelling opening line.
6. **Save the Final Email Draft:** The AI-generated subject line and personalized email body are saved back into the Google Sheet, leaving you with a ready-to-send, hyper-personalized outreach campaign.

## Setup Instructions

1. **Google Sheets Setup:** Connect your Google Sheets account to n8n. In all three Google Sheets nodes, select your credentials and update the Document ID and Sheet Name to match your lead list.
   Populate your sheet with initial lead data, including at least their LinkedIn URL.
2. **Apify Connection:** Connect your Apify account in the Runs Profile Extraction Actor node.
3. **Google Gemini AI Connection:** Connect your Google Gemini (PaLM) API account in the Google Gemini Chat Model node.
4. **Configure the Schedule:** In the Schedule Trigger node, set the interval for how often you want the workflow to run and process new leads.
5. **Activate Workflow:** Save the workflow and toggle the "Active" switch to ON. Your automated research and personalization engine is now active.

## Key Requirements to Use This Template

- **n8n Instance:** An active n8n account (Cloud or self-hosted).
- **Google Account & Sheet:** A pre-made Google Sheet with columns for First Name, Last Name, LinkedIn, Profile Data, Subject, and Email Body.
- **Apify Account:** An active Apify account with a plan that supports the LinkedIn Profile Scraper actor.
- **Google Gemini AI Account:** A Google Cloud account with the Vertex AI API (for Gemini models) enabled and an associated API key.

## Connect with us

- Website: https://www.intuz.com/services
- Email: getstarted@intuz.com
- LinkedIn: https://www.linkedin.com/company/intuz
- Get Started: https://n8n.partnerlinks.io/intuz

For custom workflow automation, click here: Get Started
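The "hasn't been processed" check in step 2 can be sketched as an n8n Code node that keeps only rows whose Profile Data column is still empty. The column names follow the sheet layout listed above; treat the exact shape as an assumption about your sheet.

```javascript
// Sketch of the "not yet processed" filter, assuming the columns
// described above ("LinkedIn", "Profile Data"). In n8n this logic
// would live in a Code node receiving one item per sheet row.
function filterUnprocessedLeads(rows) {
  return rows.filter((row) =>
    row["LinkedIn"] &&                   // must have a profile URL
    !(row["Profile Data"] || "").trim()  // and no enrichment saved yet
  );
}

const rows = [
  { LinkedIn: "https://linkedin.com/in/a", "Profile Data": "" },
  { LinkedIn: "https://linkedin.com/in/b", "Profile Data": "10 yrs at Acme" },
  { LinkedIn: "", "Profile Data": "" },
];
console.log(filterUnprocessedLeads(rows).length); // 1
```

Filtering before the Apify call keeps re-runs cheap: already-enriched rows never hit the scraper again.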
by Salvador
Keep your Gmail inbox organized and stress-free with this AI-powered workflow. Ideal for freelancers, small business owners, and productivity enthusiasts who receive a high volume of mixed emails. This template automatically sorts messages into predefined labels and drafts a professional reply when follow-up is needed.

## How it works

- **Gmail Trigger** starts the workflow whenever a new email arrives.
- **AI Classifier** (Gemini or compatible model) analyzes the message, checks your previous conversations and sent emails, and assigns the correct Gmail label.
- **Smart rules** ensure irrelevant messages are deleted or archived, while actionable ones are prepared for a response.
- **AI Draft Node** creates a concise, friendly, and professional draft reply, stored safely in your Gmail drafts folder (never auto-sent).
- Optional tools like **CheckCalendar** can suggest time slots for meetings automatically.

Together, these steps make your inbox work for you: sorting, prioritizing, and drafting responses.

## Set up steps

1. Connect your Gmail account and ensure your label categories already exist.
2. Connect your Gemini (or other AI) credentials.
3. (Optional) Enable CheckCalendar for scheduling suggestions.
4. Adjust the labeling rules and prompt text to match your personal or business workflow.

## Requirements

- Gemini account for the LLM
- Google OAuth2 credentials
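The "smart rules" step can be sketched as a small mapping from the classifier's label to an inbox action. The label names below are purely illustrative; the template expects you to supply your own categories.

```javascript
// Illustrative routing rules: map an AI-assigned label to what the
// workflow should do with the message. Label names are examples only,
// not part of the template.
function routeEmail(label) {
  const rules = {
    Spam: "delete",
    Newsletter: "archive",
    Invoice: "label-only",
    "Client Inquiry": "draft-reply", // actionable → AI drafts a reply
  };
  return rules[label] || "label-only"; // safe default: keep and label
}

console.log(routeEmail("Client Inquiry")); // "draft-reply"
```

A conservative default ("label-only") means an unrecognized classification never deletes mail, which matches the template's never-auto-send philosophy.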
by Adnan Tariq
## What this template does

Batch-evaluates compliance controls from Google Sheets using the CyberPulse Compliance API. Each control is scored, mapped to selected frameworks, enriched with crosswalk mappings, and summarized with AI-generated findings and recommendations.

## How it works

Read from Sheets → Build control text (response_text + implementation_notes) → CyberPulse Compliance (scoring, mapping, AI summary) → Normalize → Append results to Sheets.

## Setup (5–10 min)

1. Add Google Sheets + CyberPulse HTTP Header Auth credentials.
2. Replace YOUR_SHEET_ID and sheet names.
3. Provide your Crosswalk JSON URL (raw GitHub or API endpoint), or use this URL: https://www.cyberpulsesolutions.com/xw.json
4. Select frameworks to evaluate against.
5. Run a small test, then the full batch.

## CyberPulse API (required for production)

- Use hosted scoring/mapping (no local ML code).
- Create a CyberPulse HTTP Header Auth credential with your API key.
- In the node: paste the Crosswalk URL, select frameworks, set the credential.
- For large sheets, add a short Wait or reduce the batch size.

## Input columns

control_id, control_description, response_text, implementation_notes, evidence_url_1 … evidence_url_4

## Output columns

status, evaluation, score, confidence, rationale, categories, evidence_count, mapped_count, mapping_flat, frameworks_selected, engine_version, ai_summary, ai_findings (3 per control), ai_recommendations (3 per control)

## Troubleshooting

- No rows → check the sheet ID and range.
- Empty mappings → verify the Crosswalk URL.
- Write errors → confirm the results sheet exists and permissions are correct.

Learn more about the CyberPulse Compliance Agent: https://www.cyberpulsesolutions.com/solutions/compliance-agent
Start free: https://www.cyberpulsesolutions.com/pricing
Email: info@cyberpulsesolutions.com
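The "Build control text" step in the flow above can be sketched as a Code node concatenating the two input columns (column names taken from the Input columns list):

```javascript
// Sketch of the "Build control text" step: combine response_text and
// implementation_notes into one evaluation string per control, skipping
// whichever field happens to be empty.
function buildControlText(row) {
  return [row.response_text, row.implementation_notes]
    .filter((part) => part && part.trim())
    .join("\n\n");
}

const row = {
  control_id: "AC-2",
  response_text: "Accounts are provisioned via SSO.",
  implementation_notes: "Quarterly access reviews are performed.",
};
console.log(buildControlText(row));
```

Sending both fields as one block gives the scoring API the full context for a control rather than the response text alone.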
by Fahmi Fahreza
# Sync QuickBooks Chart of Accounts to Google BigQuery

Keep a historical, structured copy of your QuickBooks Chart of Accounts in BigQuery. This n8n workflow runs weekly, syncing new or updated accounts for better reporting and long-term tracking.

## Who Is This For?

- **Data Analysts & BI Developers**: Build a robust financial model and analyze changes over time.
- **Financial Analysts & Accountants**: Track structural changes in your Chart of Accounts historically.
- **Business Owners**: Maintain a permanent archive of your financial structure for future reference.

## What the Workflow Does

1. **Extract**: Every Monday, fetch accounts created or updated in the past 7 days from QuickBooks.
2. **Transform**: Clean the API response, manage currencies, create stable IDs, and format the data.
3. **Format**: Convert cleaned data into an SQL insert-ready structure.
4. **Load**: Insert or update account records in BigQuery.

## Setup Steps

1. **Prepare BigQuery**: Create a table (e.g., quickbooks.accounts) with columns matching the final SQL insert step.
2. **Add Credentials**: Connect QuickBooks Online and BigQuery credentials in n8n.
3. **Configure the HTTP Node**: Open "1. Get Updated Accounts from QuickBooks". Replace the {COMPANY_ID} placeholder with your real Company ID. Press Ctrl + Alt + ? in QuickBooks to find it.
4. **Configure the BigQuery Node**: Open "4. Load Accounts to BigQuery". Select the correct project, and make sure your dataset and table name are correctly referenced in the SQL.
5. **Activate**: Save and activate the workflow. It will now run every week.

## Requirements

- QuickBooks Online account
- QuickBooks Company ID
- Google Cloud project with BigQuery and a matching table

## Customization Options

- **Change Sync Frequency**: Adjust the schedule node to run daily, weekly, etc.
- **Initial Backfill**: Temporarily update the API query to select * from Account for a full pull.
- **Add Fields**: Modify "2. Structure Account Data" to include or transform fields as needed.
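The Extract step's 7-day window can be sketched as a Code node that builds the query string the HTTP node sends to QuickBooks. The field name follows the QuickBooks Online query language (Metadata.LastUpdatedTime); verify it against your API version before relying on it.

```javascript
// Sketch: build a QuickBooks query selecting accounts created or
// updated in the last N days. Assumes the QuickBooks query-language
// filter on Metadata.LastUpdatedTime.
function buildAccountQuery(now = new Date(), days = 7) {
  const since = new Date(now.getTime() - days * 24 * 60 * 60 * 1000);
  return `select * from Account where Metadata.LastUpdatedTime > '${since.toISOString()}'`;
}

console.log(buildAccountQuery(new Date("2024-03-08T00:00:00Z")));
// select * from Account where Metadata.LastUpdatedTime > '2024-03-01T00:00:00.000Z'
```

For the Initial Backfill customization, the same helper degenerates to `select * from Account` by dropping the where clause.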
by Guillaume Duvernay
Create a Telegram bot that answers questions using Retrieval-Augmented Generation (RAG) powered by Lookio and an LLM agent (GPT-4.1). This template handles both text and voice messages (voice transcribed via a Mistral model by default), routes queries through an agent that can call a Lookio tool to fetch knowledge from your uploaded documents, and returns concise, Telegram-friendly replies. A security switch lets you restrict use to a single Telegram username for private testing, or remove the filter to make the bot public.

## Who is this for?

- **Internal teams & knowledge workers**: Turn your internal docs into an interactive Telegram assistant for quick knowledge lookups.
- **Support & ops**: Provide on-demand answers from your internal knowledge base without exposing full documentation.
- **Developers & automation engineers**: Use this as a reference for integrating agents, transcription, and RAG inside n8n.
- **No-code builders**: Quickly deploy a chat interface that uses Lookio for accurate, source-backed answers.

## What it does / What problem does this solve?

- **Provides accurate, source-backed answers**: Routes queries to Lookio so replies are grounded in your documents instead of generic web knowledge.
- **Handles voice & text transparently**: Accepts Telegram voice messages, transcribes them (via the Mistral API node by default), and treats transcripts the same as typed text.
- **Simple agent + tool architecture**: Uses a LangChain AI Agent with a Query knowledge base tool to separate reasoning from retrieval.
- **Privacy control**: Includes a Myself? filter to restrict access to a specific Telegram username for safe testing.

## How it works

1. **Trigger**: Telegram Trigger receives incoming messages (text or voice).
2. **Route**: Message Router detects voice vs. text. Voice files are fetched with Get Audio File.
3. **Transcribe**: Mistral transcribe receives the audio file and returns a transcript; the transcript or text is normalized into preset_user_message and consolidated in Consolidate user message.
4. **Agent**: AI Agent (configured with GPT-4.1-mini) runs with a system prompt that instructs it to call the Query knowledge base tool when domain knowledge is required.
5. **Respond**: The agent output is sent back to the user via Telegram answer.

## How to set up

1. **Create a Lookio assistant**: Sign up at https://www.lookio.app/, upload documents, and create an assistant.
2. **Add credentials in n8n**: Configure Telegram API, OpenAI (or your LLM provider), and Mistral Cloud credentials in n8n.
3. **Configure the Lookio tool**: In the Query knowledge base node, replace the <your-lookio-api-key> and <your-assistant-id> placeholders with your Lookio API Key and Assistant ID.
4. **Set Telegram privacy (optional)**: Edit the Myself? If node and replace <Replace with your Telegram username> with your username to restrict access. Remove the node to allow public use.
5. **Adjust transcription (optional)**: Swap the Mistral transcribe HTTP node for another provider (OpenAI, Whisper, etc.) and update its prompt to include your jargon list.
6. **Connect the LLM**: In the OpenAI Chat Model node, add your OpenAI API key (or configure another LLM node) and ensure the AI Agent node references this model.
7. **Activate the workflow**: Activate the workflow and test by messaging your bot in Telegram.

## Requirements

- An n8n instance (cloud or self-hosted)
- A Telegram bot token added in n8n credentials
- A Lookio account, API Key, and Assistant ID
- An LLM provider account (OpenAI or equivalent) for the OpenAI Chat Model node
- A Mistral API key (or other transcription provider) for voice transcription

## How to take it further

- **Add provenance & sources**: Parse Lookio responses and include short citations or source links in the agent replies.
- **Rich replies**: Use Telegram media (images, files) or inline keyboards to create follow-up actions (open docs, request feedback, escalate to humans).
- **Multi-user access control**: Replace the single-username filter with a list or role-based access system (Airtable or Google Sheets lookup) to allow multiple trusted users.
- **Logging & analytics**: Save queries and agent responses to Airtable or Google Sheets for monitoring, quality checks, and prompt improvement.
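The Message Router's voice-vs-text branch can be sketched against the Telegram Bot API message shape (an object with an optional `voice` or `text` field):

```javascript
// Sketch of the Message Router: decide whether an incoming Telegram
// message needs transcription or can pass straight through as text.
// Message shape follows the Telegram Bot API (`voice.file_id`, `text`).
function routeMessage(message) {
  if (message.voice && message.voice.file_id) {
    return { branch: "voice", fileId: message.voice.file_id };
  }
  if (typeof message.text === "string" && message.text.trim()) {
    return { branch: "text", preset_user_message: message.text };
  }
  return { branch: "unsupported" }; // stickers, photos, etc.
}

console.log(routeMessage({ text: "What is our refund policy?" }).branch); // "text"
console.log(routeMessage({ voice: { file_id: "abc123" } }).branch); // "voice"
```

Whichever branch fires, both paths converge on the same preset_user_message field, which is what lets the agent treat voice and text identically downstream.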
by Thomas Heal
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## Overview

This workflow is designed for teams or individuals who need to quickly and efficiently serve files, content, or documents via the web. It offers a straightforward approach while remaining flexible and adaptable for different branding needs. The HTML branding can be easily updated through an external LLM, making it possible to fully customize the look and feel without complex coding. You simply prompt the AI (using the JSON as a guide) to output your desired design. This template makes use of a powerful community node, which brings in the benefits of shared knowledge and collective improvement.

## Setup Instructions

1. Copy the JSON preset into your AI model (or use your own), along with your custom branding requirements.
2. Ask the model for an HTML response, then paste the output into the HTML preset.
3. Connect the JSON inputs to the relevant locations from the structured output parser.
4. Once complete, the static HTML can be served via AWS or another web server over HTTPS, ensuring secure and reliable delivery.

## Workflow Explanation

This AI agent takes a simple user input and transforms it into dynamic HTML. The structured JSON output enforces consistent formatting while giving you the creative flexibility to adjust visuals on demand. Since the output is relatively consistent, the workflow can produce repetitive business documents with accuracy.

## Requirements

- LLM account access
- AWS account (S3) or HTTPS equivalent
- Basic HTML/JSON knowledge
- PDF.co account
by Rahul Joshi
## Description

Automatically assigns new tasks from an Excel/Google Sheets source to the best-fit employee based on expertise, then creates issues in Jira. Gain fast, consistent routing that reduces manual triage and speeds delivery. 🧠📊➡️🗂️

## What This Template Does

- Fetches new task rows and related areas from Google Sheets/Excel.
- Analyzes each item with an AI Agent using Azure OpenAI.
- Selects the best-fit employee by matching the area to expertise stored in the sheet.
- Returns structured outputs (task, assignee, expertise, ID, bug/task) and creates the Jira issue.
- Applies rule-based handling for bugs vs. tasks via a Switch node.

## Key Benefits

- ⏱ Save time by automating task assignment from new entries.
- 🎯 Improve accuracy with expertise-based matching.
- 📋 Keep clean, structured outputs for downstream systems.
- 🔁 Seamless handoff from Sheets to Jira with no manual steps.

## Features

- **Google Sheets Trigger**: Reads the new task name and related area from the sheet.
- **AI Agent (Azure OpenAI)**: Evaluates expertise fit and decides the best assignee.
- **Structured Output Parser**: Returns exactly five fields: task_name, assignee_name, expertise, employee_id, item_type (bug/task).
- **Jira Create Issue**: Creates issues in Jira using the selected assignee and item type.
- **Switch (Rules)**: Routes logic for bugs vs. tasks for consistent categorization.

## Requirements

- **n8n instance**: Cloud or self-hosted.
- **Google Sheets access**: Sheet containing an employee roster with columns for Name, Expertise, and ID; connect credentials in n8n.
- **Azure OpenAI (GPT-4o-mini)**: Configure the Azure OpenAI Chat Model credentials for the AI Agent.
- **Jira credentials**: Authorized account with permission to create issues.
- **Output Parser setup**: Structured Output Parser configured to the five-field schema: task_name, assignee_name, expertise, employee_id, item_type.

## Target Audience

- 🧩 IT Support and Ops teams routing incoming work.
- 🧭 Project managers orchestrating assignments at scale.
- 🛠 Engineering managers seeking consistent triage.
- 📈 Business operations teams automating intake to delivery.

## Step-by-Step Setup Instructions

1. Connect Google Sheets credentials and map the task and area fields; ensure roster columns (Name, Expertise, ID) are present.
2. Add Jira credentials and set the Create Issue node to your target project and issue type.
3. Configure Azure OpenAI (GPT-4o-mini) for the AI Agent and provide credentials.
4. Import the workflow, assign all credentials, and align the Structured Output Parser to the five-field schema.
5. Run a test with sample rows; confirm assignee selection and Jira issue creation; then enable scheduling.

## Security Best Practices

- Use least-privilege API tokens for Google Sheets and Jira.
- Restrict sheet access to only the required users and service accounts.
- Validate and sanitize incoming task data before issue creation.
- Store credentials securely in n8n and rotate them regularly.
- Log only necessary fields; avoid sensitive data in workflow logs.
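The five-field contract enforced by the Structured Output Parser can be sketched as a small validator. A Code node like this is optional (the parser itself does the schema enforcement); it simply shows the shape that reaches the Jira step.

```javascript
// Sketch: check that the AI Agent's output matches the five-field
// schema (task_name, assignee_name, expertise, employee_id, item_type)
// before the Jira Create Issue step.
const REQUIRED_FIELDS = [
  "task_name",
  "assignee_name",
  "expertise",
  "employee_id",
  "item_type",
];

function validateAssignment(output) {
  const missing = REQUIRED_FIELDS.filter((f) => !(f in output));
  if (missing.length) return { ok: false, missing };
  if (!["bug", "task"].includes(output.item_type)) {
    return { ok: false, missing: [], reason: "item_type must be bug or task" };
  }
  return { ok: true };
}

const sample = {
  task_name: "Fix login redirect",
  assignee_name: "Priya",
  expertise: "Frontend",
  employee_id: "E-102",
  item_type: "bug",
};
console.log(validateAssignment(sample).ok); // true
```

The item_type check mirrors the Switch node: only "bug" or "task" should ever reach the routing step.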
by Naveen Choudhary
## Description

This workflow automatically monitors and tracks SEC Form D filings (private placement offerings) by fetching data from the SEC EDGAR database every 10 minutes during business hours and saving new filings to Google Sheets for analysis and tracking.

## Who's it for

- **Venture capitalists** tracking private funding rounds and market activity
- **Investment analysts** researching private placement trends and opportunities
- **Financial researchers** collecting data on private securities offerings
- **Business development teams** identifying potential partnership or acquisition targets
- **Compliance professionals** monitoring regulatory filings in their industry

## How it works

The workflow connects to the SEC EDGAR RSS feed to fetch the latest Form D filings, parses the XML data, extracts key information including CIK numbers and filing links, filters out duplicates from previous runs, and automatically saves new filings to a Google Sheets document for easy analysis and tracking.

## What it does

1. **Automated scheduling**: Runs every 10 minutes during business hours (6 AM–9 PM, Monday–Friday)
2. **Fetches SEC data**: Retrieves the 40 most recent Form D filings from the SEC EDGAR RSS feed
3. **Parses filing data**: Converts XML to structured data and extracts CIK numbers, titles, and links
4. **Filters duplicates**: Only processes new filings that haven't been seen in previous executions
5. **Saves to sheets**: Appends new filing data to Google Sheets with proper formatting

## Requirements

- **Google Sheets API access** with OAuth2 credentials configured
- **Google Sheets document**: Make a copy of this template sheet
- **n8n instance** running continuously for scheduled execution

## How to set up

1. Copy the template Google Sheet from the link above to your Google Drive
2. Configure Google Sheets OAuth2 authentication in n8n credentials
3. Update the Google Sheets document ID in the "Save to SEC Data Sheet" node to point to your copied sheet
4. Customize the User-Agent header in the HTTP Request node with your contact information (required by the SEC)
5. Activate the workflow; the schedule trigger will start monitoring automatically
6. Test manually by replacing the Schedule Trigger with a Manual Trigger for initial testing

## How to customize the workflow

- **Schedule frequency**: Modify the cron expression in the Schedule Trigger (default: every 10 minutes)
- **Business hours**: Adjust the time range (default: 6 AM–9 PM EST)
- **Working days**: Change from Monday–Friday to include weekends if needed
- **Filing count**: Modify the SEC URL to fetch more than 40 filings (change the count=40 parameter)
- **Form types**: Update the URL to track different SEC forms (change type=D to other form types)
- **Output format**: Customize the Google Sheets column mapping to include additional fields
- **Notifications**: Add Slack, email, or webhook nodes to get alerts for new filings

## Output data includes

- **CIK Number**: Central Index Key for the filing company
- **Company Title**: Name of the company making the filing
- **Form Type**: Type of SEC form (Form D for private placements)
- **HTML Filing Link**: Link to view the filing in the SEC EDGAR system
- **TXT Filing Link**: Direct link to the raw text version of the filing
- **Updated Date**: When the filing was submitted to the SEC

## Key features

- **Duplicate prevention**: Built-in deduplication ensures no filing is processed twice
- **Business hours scheduling**: Respects SEC server load by running only during business hours
- **SEC compliance**: Includes a proper User-Agent header as required by SEC guidelines
- **Automatic link generation**: Creates both HTML and TXT links for easy access to filings
- **CIK extraction**: Automatically extracts company CIK numbers from filing titles

Note: This workflow is designed for monitoring public SEC filings and complies with SEC EDGAR access guidelines. The User-Agent header must be updated with your contact information before use.
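The CIK-extraction step can be sketched with a regex over the feed's entry titles. EDGAR feed titles typically embed the 10-digit CIK in parentheses; the exact title shape shown here is an assumption, so adjust the pattern to match what your feed actually returns.

```javascript
// Sketch of CIK extraction from an EDGAR feed entry title, assuming a
// title shape like "D - Acme Ventures Fund LP (0001234567) (Filer)".
function extractCik(title) {
  const match = title.match(/\((\d{10})\)/); // first 10-digit parenthesized number
  return match ? match[1] : null;
}

const title = "D - Acme Ventures Fund LP (0001234567) (Filer)";
console.log(extractCik(title)); // "0001234567"
```

The extracted CIK also drives the automatic link generation, since EDGAR filing URLs are keyed by CIK and accession number.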
by Garri
## Description

This n8n workflow automates the process of retrieving images from a specific Google Drive folder, resizing them, and inserting them into a Google Docs document. It ensures images are processed in numeric order, automatically resized to fit the document, and uploaded in batches to prevent timeouts.

This template is designed for content creators, documentation teams, and businesses who need to automatically insert images (e.g., product photos, reports, or scanned documents) into Google Docs with minimal manual effort.

## How it works

1. Retrieves image files from a Google Drive folder.
2. Filters and sorts files based on the numeric order in the filename.
3. Generates direct image URIs and resizes them automatically (width & height).
4. Inserts the resized images into the target Google Docs document via API.
5. Uses a batch loop to avoid timeouts and ensure all images are uploaded successfully.

## Requirements / Pre-conditions

- An n8n instance (self-hosted or cloud).
- Connected Google Drive credential in n8n.
- Connected Google Docs credential in n8n.
- A target Google Drive folder containing supported image files.
- A Google Docs document ready to receive the images.
- Supported formats: PNG, JPG, JPEG, GIF, WEBP.

Error handling: If a file is not an image or exceeds Google Docs API limits, the workflow will skip it and continue processing the rest.

## Setup Steps

1. **Google Drive Credential**: Connect your Google Drive account in n8n to grant access to the folder containing the images.
2. **Google Docs Credential**: Connect your Google Docs account to allow image insertion into the document.
3. **Folder & File Filter**: In the Search File node, replace the placeholder {{YOUR_FOLDER_ID}} with your Google Drive folder ID.
4. **Google Docs Document ID**: In the Insert Image (HTTP Request) node, replace {{YOUR_DOCUMENT_ID}} with your target Google Docs document ID. (Make sure you rename this node to something descriptive, e.g., Insert Image to Google Doc.)
5. **Batch Loop**: The workflow includes a batch processing loop to prevent timeout errors when dealing with large sets of images. You can adjust the batch size if needed.
6. **Run the workflow**: Execute the workflow, and images will be automatically retrieved, resized, and inserted into the document.

## Customization

- **Resize Dimensions**: Adjust the width/height in the Image Resize node to fit your document's style.
- **Ordering Logic**: Modify the sorting step if you want alphabetical or upload-date order instead of numeric order.
- **Error Notifications**: Add an email or Slack node to notify you when an image fails to insert.
- **Image Placement**: By default, images are appended. You can adjust the insert logic (e.g., after specific headings).
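The numeric-ordering step matters because a plain alphabetical sort puts "img_10.png" before "img_2.png". A Code-node sketch of the sort (filenames here are examples):

```javascript
// Sketch of the numeric-ordering step: sort Drive files by the first
// number found in each filename, so "img_2.png" comes before
// "img_10.png" (alphabetical order would reverse them).
function sortByNumericName(files) {
  const num = (name) => {
    const m = name.match(/\d+/);
    return m ? parseInt(m[0], 10) : Infinity; // non-numbered files last
  };
  return [...files].sort((a, b) => num(a.name) - num(b.name));
}

const files = [{ name: "img_10.png" }, { name: "img_2.png" }, { name: "img_1.png" }];
console.log(sortByNumericName(files).map((f) => f.name));
// ["img_1.png", "img_2.png", "img_10.png"]
```

Swapping the comparator here is also where you would implement the alphabetical or upload-date ordering mentioned under Customization.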
by Jeffrey W.
# GitHub Bounty Issue Tracker & Alert System (Google Sheets + Email/WhatsApp)

## Overview

Looking for a way to track GitHub bounty issues automatically and get notified in real time? This GitHub Bounty Tracker workflow monitors repositories for issues labeled 💎 Bounty, logs them in Google Sheets, and sends instant alerts via email (HTML-styled) or WhatsApp. Perfect for developers, freelancers, and open-source contributors who want to discover and claim paid opportunities faster.

## What This Workflow Does

### 🔎 Automated Bounty Discovery

- Searches GitHub hourly for all open issues labeled with "💎 Bounty".
- Filters duplicates to avoid re-tracking the same issue.

### 📢 Smart Notifications

- Sends styled HTML email alerts with a GitHub-themed design.
- WhatsApp Business API integration (optional, disabled by default).
- Alerts only for bounties created within the last 5 days.

### 📊 Google Sheets Tracking

- **Sheet1**: Complete bounty list (all tracked issues).
- **Sheet2**: Recent notification log (for quick reference).

### 🔄 Status Updates

- Checks every 6 hours for issue changes (open/closed state, new comments).
- Includes the bounty amount, issue details, and direct GitHub links.

## Use Cases

- 👩‍💻 Freelance developers hunting for paid open-source work.
- 🛠 Development teams tracking bounty opportunities for their stack.
- 🌍 Community managers monitoring open-source bounty program engagement.
- 🤝 Open-source contributors looking for compensated tasks.

## Requirements

- GitHub Personal Access Token (with repo access).
- Google Sheets (2 sheets required: Sheet1 = bounties, Sheet2 = notifications).
- Gmail account (OAuth2 for sending email alerts).
- WhatsApp Business API credentials (optional).

## Configuration Notes

This workflow supports pagination for large result sets and includes filters to prevent duplicate notifications. You can customize the GitHub search query in the HTTP Request node to target:

- Specific repositories
- Custom labels
- Team/organization projects

## Frequently Asked Questions (FAQ)

**Q: How often will I get notifications?**
A: By default, new bounties trigger alerts once an hour. Updates on existing issues (status/comments) are checked every 6 hours.

**Q: Do I need WhatsApp integration?**
A: No, it's optional. Email alerts work out of the box.

**Q: Can I filter by bounty amount?**
A: Yes, the workflow extracts bounty details, and you can add filters in your Google Sheets or notification logic.

## Why Use This GitHub Bounty Tracker?

Unlike manual searches, this workflow ensures you never miss a paid GitHub issue. Whether you're a freelancer looking for income, a team seeking funded tasks, or a contributor wanting recognition and rewards, this system keeps you updated automatically.
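The "created within the last 5 days" alert filter can be sketched against the issue objects the GitHub search API returns (each carries an ISO `created_at` timestamp):

```javascript
// Sketch of the recency filter: alert only on bounty issues created
// within the last N days, using the GitHub API's `created_at` field.
function isRecentBounty(issue, now = new Date(), days = 5) {
  const created = new Date(issue.created_at);
  return now.getTime() - created.getTime() <= days * 24 * 60 * 60 * 1000;
}

const now = new Date("2024-06-10T00:00:00Z");
console.log(isRecentBounty({ created_at: "2024-06-08T12:00:00Z" }, now)); // true
console.log(isRecentBounty({ created_at: "2024-05-01T00:00:00Z" }, now)); // false
```

Older issues still get logged to Sheet1 for tracking; only the notification branch applies this cutoff.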
by InfyOm Technologies
## ✅ What problem does this workflow solve?

Managing inventory manually requires constant monitoring, manual purchase order creation, and back-and-forth communication with suppliers. This workflow automates the entire inventory replenishment cycle, from detecting low-stock items to generating purchase orders and emailing suppliers automatically. It ensures accurate stock levels, reduces manual work, and prevents stockouts.

## 💡 Main Use Cases

- 🔍 Identify low-stock items automatically based on thresholds
- 📊 Perform scheduled daily inventory checks
- 🧾 Auto-generate purchase orders for items that need replenishment
- ✉️ Email purchase orders directly to suppliers
- 📄 Update Google Sheets with order and inventory tracking information

## 🧠 How It Works – Step-by-Step

1. **⏰ Scheduled Trigger**: The workflow runs automatically every day (or any chosen interval) to begin inventory checks without manual involvement.
2. **📉 Get Low-Stock Items**: Reads your Google Sheets inventory file to identify items where current stock < minimum stock threshold.
3. **🧮 Process Each Low-Stock Item**: For every item below the threshold, calculates the required order quantity and generates purchase order details, including SKU / item name, quantity needed, supplier email, and stock levels.
4. **🔀 Conditional Flow**: For each low-stock item, creates a purchase order email using the generated details, sends the PO automatically to the supplier via Gmail, and logs the PO entry in Google Sheets with item details, order quantity, supplier, timestamp, and status ("PO Sent").
5. **📢 Notifications**: Sends purchase order emails directly to suppliers. (Optional) Internal notifications (Slack/email) can be added for procurement visibility.

## 📊 Logging & Reporting

All actions (PO creation, stock levels, supplier emails) are written back to Google Sheets for complete auditability and reporting.

## 👤 Who can use this?
Perfect for:

- Retail & eCommerce businesses
- Warehouse teams
- Procurement & purchasing departments
- Manufacturing operations
- Any business managing physical inventory

## 🚀 Benefits

- ⏱ Automated stock monitoring
- 📦 Prevents stockouts
- ✉️ Eliminates manual PO creation
- 📚 Creates a complete audit trail
- 🧠 Smart, rule-based reorder logic
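The low-stock detection and order-quantity calculation (steps 2–3 above) can be sketched as a single Code node. Column names like "Current Stock", "Min Stock", and "Reorder Qty" are illustrative; match them to your own sheet.

```javascript
// Sketch of steps 2-3: flag items below their minimum stock threshold
// and compute a reorder quantity. Column names are examples only.
function buildPurchaseOrders(rows) {
  return rows
    .filter((r) => r["Current Stock"] < r["Min Stock"])
    .map((r) => ({
      sku: r["SKU"],
      supplierEmail: r["Supplier Email"],
      // order enough to get back to minimum, plus any fixed reorder qty
      orderQty: (r["Min Stock"] - r["Current Stock"]) + (r["Reorder Qty"] || 0),
    }));
}

const rows = [
  { SKU: "A1", "Current Stock": 3, "Min Stock": 10, "Reorder Qty": 5, "Supplier Email": "a@x.com" },
  { SKU: "B2", "Current Stock": 20, "Min Stock": 10, "Supplier Email": "b@x.com" },
];
console.log(buildPurchaseOrders(rows));
// [{ sku: "A1", supplierEmail: "a@x.com", orderQty: 12 }]
```

Each object this produces carries everything the downstream Gmail and logging steps need: the item, the supplier address, and the quantity to order.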