by victor de coster
The template lets you make Dropcontact batch requests of up to 250 requests every 10 minutes (1,500/hour), which is valuable when high-volume email enrichment is expected. Dropcontact will look up the email and perform basic email qualification when first_name, last_name, and company_name are provided.

+++++++++++++++++++++++++++++++++++++++++

Step 1: Node "Profiles Query"

Connect your own source (Airtable, Google Sheets, Supabase, ...); the template uses Postgres by default.

Note I: Make sure your source returns a maximum of 250 items.

Note II: The next node uses the following variables; make sure you can map them from your source:
- first_name
- last_name
- website (company_name would work too)
- full_name (see note)

Note III: This template uses the Dropcontact Batch API, which works in a POST & GET setup, not a GET-only request to retrieve data, because Dropcontact needs to process the batch data load properly.

+++++++++++++++++++++++++++++++++++++++++

Step 2: Node "Data Transformation"

Transforms the input variables into the JSON format the Dropcontact API expects for a batch request. "full_name" is used as a custom identifier to match the returned email back to the proper contact in your source database. To make things easy, use a unique identifier in the full_name variable.

+++++++++++++++++++++++++++++++++++++++++

Step 3: Node "Bulk Dropcontact Requests"

Enter your Dropcontact credentials in the Bulk Dropcontact Requests node.

+++++++++++++++++++++++++++++++++++++++++

Step 4: Connect your output source by mapping the data you would like to use.

+++++++++++++++++++++++++++++++++++++++++

Step 5: Node "Slack" (OPTIONAL)

Connect your Slack account; if an error occurs, you will be notified.

TIP: Run the workflow with a batch of 10 (not 250) first, as it might need an initial run before you can map the data to your final destination. Once the data fields are properly mapped, adjust back to 250.
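The transformation in Step 2 can be sketched as a small Code-node function. The field names come from this template; the exact batch request schema (especially where the custom identifier goes) is an assumption to verify against the Dropcontact Batch API docs.

```javascript
// Sketch of the "Data Transformation" Code node (Step 2): reshape the rows
// coming from "Profiles Query" into the { data: [...] } body the Dropcontact
// batch POST expects. Verify field placement against the Dropcontact docs.
function buildDropcontactPayload(rows) {
  return {
    data: rows.map((row) => ({
      first_name: row.first_name,
      last_name: row.last_name,
      website: row.website, // company_name would work too
      // full_name is the custom identifier used to match the returned
      // email back to the right contact; keep it unique per row.
      full_name: row.full_name,
    })),
  };
}

// Example: two rows as they might arrive from "Profiles Query"
const payload = buildDropcontactPayload([
  { first_name: 'Ada', last_name: 'Lovelace', website: 'example.com', full_name: 'Ada Lovelace #1' },
  { first_name: 'Alan', last_name: 'Turing', website: 'example.org', full_name: 'Alan Turing #2' },
]);
```

Because full_name is echoed back by the batch GET, a unique value per row is what lets Step 4 map enriched emails to the correct source record.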
by Yaron Been
Description
This workflow automatically collects and organizes research papers from academic databases and journals into Google Sheets. It helps researchers and students save time by eliminating manual searches across multiple academic sources and centralizing research materials.

Overview
This workflow automatically scrapes research papers from academic databases and journals, then organizes them in Google Sheets. It uses Bright Data to access academic sources and extracts key information like titles, authors, abstracts, and citations.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping academic websites and research databases without getting blocked.
- **Google Sheets:** For organizing and storing research paper data.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Connect Google Sheets: Authenticate your Google account.
4. Customize: Specify research topics, journals, or authors to track.

Use Cases
- **Academic Researchers:** Stay updated on new papers in your field.
- **Students:** Collect research for literature reviews and dissertations.
- **Research Teams:** Collaborate on literature databases.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #research #academicpapers #brightdata #googlesheets #researchpapers #academicresearch #literaturesearch #scholarlyarticles #n8nworkflow #workflow #nocode #researchautomation #academicscraping #researchtools #papertracking #academicjournals #researchdatabase #literaturereview #academicwriting #datascraping #researchorganization #scholarlyresearch #citationmanagement #academicproductivity
by Rahul Joshi
Description
Turn incoming Gmail messages into structured Zendesk tickets, enriched by Azure OpenAI, and log key details to Google Sheets for tracking. Ideal for IT Support teams needing fast, consistent intake and documentation. ⚡

What This Template Does
- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes Gmail data and formats it for downstream steps.
- Enriches and structures content with the Azure OpenAI Chat Model and Output Parsers.
- Creates Zendesk tickets from the processed data. 🎫
- Appends or updates logs in Google Sheets for auditing and reporting. 📊

Key Benefits
- Saves time by automating ticket creation and logging. ⏱️
- Improves ticket quality with AI-driven normalization and structure.
- Ensures consistent records in Google Sheets for easy reporting.
- Reduces manual errors in IT Support intake. ✅

Features
- Gmail-triggered intake flow for new messages.
- AI enrichment using the Azure OpenAI Chat Model with parsing and memory tooling.
- Zendesk ticket creation (create: ticket) with structured fields.
- Google Sheets logging (appendOrUpdate: sheet).
- Modular design with Execute Workflow nodes for reuse and scaling.

Requirements
- n8n instance (Cloud or self-hosted).
- Gmail credentials configured in n8n for the Gmail Trigger.
- Zendesk credentials with permission to create tickets.
- Google Sheets credentials with access to the target spreadsheet (append/update enabled).
- Azure OpenAI credentials configured for the Azure OpenAI Chat Model and associated parsing.

Target Audience
- IT Support and Helpdesk teams handling email-based requests. 🛠️
- Operations teams standardizing inbound email workflows.
- Agencies and MSPs offering managed support intake.
- Internal automation teams centralizing ticket capture and logging.

Step-by-Step Setup Instructions
1. Connect Gmail credentials in n8n and select the inbox/label for the Gmail Trigger.
2. Add Zendesk credentials and confirm ticket creation permissions.
3. Configure Google Sheets credentials and select the target sheet for logs.
4. Add Azure OpenAI credentials to the Azure OpenAI Chat Model node and verify the parsing steps.
5. Import the workflow, assign credentials to each node, update any placeholders, and run a test.
6. Rename the final email/logging nodes descriptively (e.g., “Log to Support Sheet”) and schedule if needed.
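The Output Parser step enforces a structured shape before the Zendesk node runs. A minimal sketch of what that normalization might look like (field names here are illustrative, not the exact schema shipped with the template):

```javascript
// Hypothetical normalization between the AI output parser and the Zendesk
// node: clamp priority to Zendesk's allowed values and build the comment body.
function toZendeskTicket(aiOutput) {
  const { subject, priority, category, summary } = aiOutput;
  const allowed = ['low', 'normal', 'high', 'urgent'];
  return {
    subject: subject || '(no subject)',
    priority: allowed.includes(priority) ? priority : 'normal',
    comment: { body: `${category ? '[' + category + '] ' : ''}${summary || ''}` },
  };
}
```

Clamping AI-chosen enum values (like priority) to a known list is the kind of guardrail that keeps downstream API calls from failing on a creative model answer.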
by Rahul Joshi
Description
Process new resumes from Google Drive, extract structured candidate data with AI, save to Google Sheets, and auto-create a ClickUp hiring task. Gain a centralized, searchable candidate database and instant task kickoff with no manual data entry. 🚀

What This Template Does
- Watches a Google Drive folder for new resume PDFs and triggers the workflow. 📂
- Downloads the file and converts the PDF to clean, readable text. 📄
- Analyzes resume text with an AI Resume Analyzer to extract structured candidate info (name, email, phone, experience, skills, education). 🤖
- Cleans and validates the AI JSON output for reliability. 🧹
- Appends or updates a candidate row in Google Sheets and creates a ClickUp hiring task. ✅

Key Benefits
- Save hours with end-to-end, hands-off resume processing. ⏱️
- Never miss a candidate: every upload triggers automatically. 🔔
- Keep a single source of truth in Sheets, always up to date. 📊
- Kickstart hiring instantly with auto-created ClickUp tasks. 🗂
- Works with varied resume formats using AI extraction. 🧠

Features
- Google Drive “Watch for New Resumes” trigger (every minute). ⏲
- PDF-to-text extraction optimized for text-based PDFs. 📘
- AI-powered resume parsing into standardized JSON fields. 🧩
- JSON cleanup and validation for safe storage. 🧰
- Google Sheets append-or-update for a central candidate database. 📑
- ClickUp task creation with candidate-specific titles and assignment. 🎯

Requirements
- n8n instance (cloud or self-hosted); recommended n8n version 1.106.3 or higher. 🔧
- Google Drive access to a dedicated resumes folder (PDF resumes recommended). 📂
- Google Sheets credential with edit access to the candidate database sheet. 📈
- ClickUp workspace/project access to create tasks for hiring. 📌
- AI service credentials for the Resume Analyzer step (add in n8n Credentials). 🤖

Target Audience
- HR and Talent Acquisition teams needing faster screening. 👥
- Recruiters and staffing agencies handling high volumes. 🏢
- Startups and ops teams standardizing candidate intake. 🚀
- No-code/low-code builders automating hiring workflows. 🧩

Step-by-Step Setup Instructions
1. Connect Google Drive, Google Sheets, ClickUp, and your AI service in n8n Credentials. 🔐
2. Set the Google Drive “watched” folder (e.g., Resume_store). 📁
3. Import the workflow, assign credentials to all nodes, and map your Sheets columns. 🗂️
4. Adjust the ClickUp task details (title pattern, assignee, list). 📝
5. Run once with a sample PDF to test, then enable scheduling (every 1 minute). ▶️
6. Optionally rename the email/task nodes for clarity (e.g., “Create Hiring Task in ClickUp”). ✍️
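The "cleans and validates the AI JSON output" step matters because models often wrap JSON in markdown fences or add stray commentary. A minimal sketch of such a cleanup, with illustrative field names (match them to your sheet columns):

```javascript
// Sketch of AI-JSON cleanup: extract the outermost JSON object from the raw
// model reply, parse it, and require a minimal set of candidate fields.
function parseCandidate(raw) {
  const match = raw.match(/\{[\s\S]*\}/); // grab the outermost {...}, ignoring fences
  if (!match) return null;
  let data;
  try {
    data = JSON.parse(match[0]);
  } catch {
    return null; // malformed JSON: let the workflow route to an error branch
  }
  const required = ['name', 'email']; // illustrative; extend for your schema
  if (!required.every((k) => typeof data[k] === 'string' && data[k].trim())) return null;
  return data;
}
```

Returning `null` instead of throwing lets an IF node downstream decide whether to retry the AI call or flag the resume for manual review.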
by AI Incarnation
This n8n template empowers IT support teams by automating document ingestion and instant query resolution through conversational AI. It integrates Google Drive, Pinecone, and a chat AI agent (using Google Gemini/OpenRouter) to transform static support documents into an interactive, searchable knowledge base. With two interlinked workflows (one for processing support documents, one for handling chat queries), employees receive fast, context-aware answers directly from your support documentation.

Overview

Document Ingestion Workflow
- **Google Drive Trigger:** Monitors a specified folder for new file uploads (e.g., updated support documents).
- **File Download & Extraction:** Automatically downloads new files and extracts text content.
- **Data Cleaning & Text Splitting:** A Code node removes line breaks, trims extra spaces, and strips special characters, while a text splitter segments the content into manageable chunks.
- **Embedding & Storage:** Generates text embeddings using Google Gemini and stores them in a Pinecone vector store for rapid similarity search.

Chat Query Workflow
- **Chat Trigger:** Initiates when an employee sends a support query.
- **Vector Search & Context Retrieval:** Retrieves the top relevant document segments from Pinecone based on similarity scores.
- **Prompt Construction:** A Code node combines the retrieved document snippets with the user's query into a detailed prompt.
- **AI Agent Response:** The constructed prompt is sent to an AI agent (using the OpenRouter Chat Model) to generate a clear, step-by-step solution.

Key Benefits & Use Case
Imagine a large organization where every IT support document, from troubleshooting guides to system configurations, is stored in a single Google Drive folder. When an employee encounters an issue (e.g., “How do I reset my VPN credentials?”), they simply type the query into a chat interface. Instantly, the workflow retrieves the most relevant context from the ingested documents and provides a detailed, actionable answer. This process reduces resolution times, improves support consistency, and significantly lightens the load on IT staff.

Prerequisites
- A valid Google Drive account with access to the designated folder.
- A Pinecone account for storing and retrieving text embeddings.
- Google Gemini (or OpenRouter) credentials to power the chat AI agent.
- An operational n8n instance configured with the necessary nodes and credentials.

Workflow Details

1. Document Ingestion Workflow
- **Google Drive Trigger Node:** Listens for file creation events in the specified folder.
- **Google Drive Download Node:** Downloads the newly added file.
- **Extract from File Node:** Extracts text content from the downloaded file.
- **Code Node (Data Cleaning):** Cleans the extracted text by removing line breaks, trimming spaces, and eliminating special characters.
- **Recursive Text Splitter Node:** Segments the cleaned text into manageable chunks.
- **Pinecone Vector Store Node:** Generates embeddings (via Google Gemini) and uploads the chunks to Pinecone.

2. Chat Query Workflow
- **Chat Trigger Node:** Receives incoming user queries.
- **Pinecone Vector Store Node (Query):** Searches for relevant document chunks based on the query.
- **Code Node (Context Builder):** Sorts the retrieved documents by relevance and constructs a prompt merging the context with the query.
- **AI Agent Node:** Sends the prompt to the chat AI agent, which returns a detailed answer.

How to Use
1. Import the Template: Import the template into your n8n instance.
2. Configure the Google Drive Trigger: Set the folder ID (e.g., 1RQvAHIw8cQbtwI9ZvdVV0k0x6TM6H12P) and connect your Google Drive credentials.
3. Set Up Pinecone Nodes: Enter your Pinecone index details and credentials.
4. Configure the Chat AI Agent: Provide your Google Gemini (or OpenRouter) API credentials.
5. Test the Workflows: Validate the document ingestion workflow by uploading a sample support document, then validate the chat query workflow by sending a test query and verifying the returned support information.
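The Code node's data-cleaning step described above can be sketched as a small function. The exact character whitelist is an assumption; tune it to your documents' language and punctuation:

```javascript
// Sketch of the data-cleaning Code node: flatten line breaks, strip special
// characters (keeping basic punctuation), and collapse repeated whitespace
// before the recursive text splitter runs.
function cleanText(text) {
  return text
    .replace(/[\r\n]+/g, ' ')          // remove line breaks
    .replace(/[^\w\s.,:;?!()-]/g, '')  // strip special characters (assumed whitelist)
    .replace(/\s{2,}/g, ' ')           // trim extra spaces
    .trim();
}
```

Cleaning before splitting matters: stray line breaks and control characters otherwise end up inside chunks and degrade embedding similarity scores.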
Additional Notes
- Ensure all credentials (Google Drive, Pinecone, and chat AI) are correctly set up and tested before deploying the workflows in production.
- The template is fully customizable: adjust the text cleaning, splitting parameters, or the number of document chunks retrieved based on your support documentation's size and structure.
- This template not only enhances IT support efficiency but also offers a scalable solution for managing and leveraging growing volumes of support content.
by Lucas Walter
Overview & Setup
This n8n template demonstrates how to automatically generate authentic User-Generated Content (UGC) style marketing videos for eCommerce products using AI. Simply upload a product image, and the workflow creates multiple realistic influencer-style video ads complete with scripts, personas, and video generation.

Use cases
- Generate multiple UGC video variations for A/B testing
- Create authentic-looking product demonstrations for social media
- Produce influencer-style content without hiring creators
- Quickly test different marketing angles for new products
- Scale video content creation for eCommerce catalogs

Good to know
- Sora 2 video generation takes approximately 2-3 minutes per 12-second video
- Each video generation costs approximately $0.50-1.00 USD (check OpenAI pricing for current rates)
- The workflow generates multiple video variations from a single product image
- Videos are automatically uploaded to Google Drive upon completion
- Generated videos are in 720x1280 (9:16) format, optimized for social media

How it works
1. Product Analysis: OpenAI's vision API analyzes the uploaded product image to understand its features, benefits, and target audience
2. Persona Creation: The system generates a detailed profile of the ideal influencer/creator who would authentically promote this product
3. Script Generation: Gemini 2.5 Pro creates multiple authentic UGC video scripts (12 seconds each) with frame-by-frame breakdowns, natural dialogue, and camera movements
4. Frame Generation: For each script, Gemini generates a custom first frame that adapts the product image to match the UGC aesthetic and aspect ratio
5. Video Production: The Sora 2 API generates the actual video using the script and custom first frame as reference
6. Status Monitoring: The workflow polls the video generation status every 15 seconds until completion
7. Upload & Storage: Completed videos are automatically uploaded to Google Drive with organized naming

How to use
1. Click the form trigger URL to access the submission form
2. Upload your product image (works best with clean product shots on white/neutral backgrounds)
3. Enter the product name
4. Submit the form and wait for the workflow to complete
5. Find your generated UGC videos in the specified Google Drive folder; each run produces multiple video variations you can test

Requirements
- **OpenAI API** account with Sora 2 access for video generation and GPT-4 Vision
- **Google Gemini API** account for script generation and image adaptation
- **Google Drive** account for video storage
- Sufficient API credits for video generation (budget accordingly)

Customizing this workflow
- Adjust the video duration in the generate_video node (currently set to 12 seconds)
- Modify the persona prompt in the analyze_product node to target different audience demographics
- Change the script style in the set_build_video_prompts node for different UGC aesthetics (excited discovery, casual recommendation, etc.)
- Update the Google Drive folder in the upload_video node to organize videos by campaign
- Add additional processing nodes for video editing, subtitle generation, or thumbnail creation
- Modify the aspect ratio in the resize_image node for different platforms (1:1 for Instagram feed, 16:9 for YouTube, etc.)
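The 15-second status-polling loop in step 6 can be sketched as follows. The status values ("completed", "failed") are assumptions about the video API's response shape; check the actual Sora 2 job-status payload:

```javascript
// Sketch of the status-monitoring step: decide what to do with each poll
// response, and loop with a delay until the video is done or has failed.
function nextPollAction(statusResponse) {
  if (statusResponse.status === 'completed') return 'download';
  if (statusResponse.status === 'failed') return 'abort';
  return 'wait'; // still queued/processing: poll again in 15 s
}

async function pollUntilDone(fetchStatus, delayMs = 15000) {
  for (;;) {
    const res = await fetchStatus(); // caller supplies the HTTP call
    const action = nextPollAction(res);
    if (action !== 'wait') return { action, res };
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
}
```

In n8n this is typically built with an IF node plus a Wait node feeding back into the status request; the function above is the same control flow in plain code.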
by usamaahmed
🚀 HR Resume Screening Workflow — Smart Hiring on Autopilot 🤖

🎯 Overview:
This workflow builds an AI-powered resume screening system inside n8n. It begins with Gmail and Form triggers that capture incoming resumes, then uploads each file to Google Drive for storage. The resume is downloaded and converted into plain text, where two branches run in parallel: one extracts structured contact details, and the other uses an AI agent to summarize education, job history, and skills while assigning a suitability score. A cleanup step normalizes the data before merging both outputs, and the final candidate record is saved into Google Sheets and Airtable, giving recruiters a centralized dashboard to identify top talent quickly and consistently.

🔑 Prerequisites:
To run this workflow successfully, you'll need:
- **Gmail OAuth** → to read incoming resumes.
- **Google Drive OAuth** → to upload and download resume files.
- **Google Sheets OAuth** → to save structured candidate records.
- **Airtable Personal Access Token** → for dashboards and record-keeping.
- **OpenAI / OpenRouter API Key** → to run the AI summarizer and evaluator.

⚙️ Setup Instructions:
1. Import the Workflow: Clone or import the workflow into your n8n instance.
2. Add Credentials: Go to n8n → Credentials and connect Gmail, Google Drive, Google Sheets, Airtable, and OpenRouter/OpenAI.
3. Configure Key Nodes:
   - Gmail Trigger → Update filters.q with the job title you are hiring for (e.g., "Senior Software Engineer").
   - Google Drive Upload → Set the folderId where resumes will be stored.
   - Google Sheets Node → Link to your HR spreadsheet (e.g., “Candidates 2025”).
   - Airtable Node → Select the correct base & table schema for candidate records.
4. Test the Workflow: Send a test resume (via email or form), then check Google Sheets & Airtable for structured candidate data.
5. Go Live: Enable the workflow. It will now run continuously and process new resumes as they arrive.

📊 End-to-End Workflow Walkthrough:

🟢 Section 1 – Entry & Intake
Nodes:
- 📧 Gmail Trigger → Polls the inbox every minute, captures job application emails, and downloads resume attachments (CV0, CV1, …).
- 🌐 Form Trigger → Alternate entry for resumes submitted via a careers page or job portal.
✅ Quick Understanding: Think of this section as the front desk of recruitment: resumes arrive by email or online form, and the system immediately grabs them for processing.

📂 Section 2 – File Management
Nodes:
- ☁️ Upload File (Google Drive) → Saves the incoming resume into a structured Google Drive folder, naming it after the applicant.
- ⬇️ Download File (Google Drive) → Retrieves the stored resume file for further processing.
- 🔎 Extract from File → Converts the resume (PDF/DOC) into plain text so the AI and extractors can work with it.
✅ Quick Understanding: This is your digital filing room. Every resume is safely stored, then converted into a readable format for the hiring system.

🤖 Section 3 – AI Processing (Parallel Analysis)
Nodes:
- 🧾 Information Extractor → Pulls structured contact information (candidate name, email, and phone number) using regex validation and schema rules.
- 🤖 AI Agent (LangChain + OpenRouter) → Reads the full CV and outputs: 🎓 Educational Qualifications, 💼 Job History, 🛠 Skills Set, 📊 Candidate Evaluation Score (1–10), and 📝 Justification for the score.
✅ Quick Understanding: Imagine two assistants working in parallel: one quickly extracts basic contact info, while the other deeply reviews the CV and gives an evaluation.

🛠️ Section 4 – Data Cleanup & Merging
Nodes:
- ✏️ Edit Fields → Standardizes the AI Agent's output into a consistent field (output).
- 🛠 Code (JS Parsing & Cleanup) → Converts the AI's free-text summary into clean JSON fields (education, jobHistory, skills, score, justification).
- 🔗 Merge → Combines the structured contact info with the AI's evaluation into a single candidate record.
✅ Quick Understanding: This is like the data cleaning and reporting team, making sure all details are neat, structured, and merged into one complete candidate profile.

📊 Section 5 – Persistence & Dashboards
Nodes:
- 📑 Google Sheets (Append Row) → Saves candidate details into a Google Sheet for quick team access.
- 🗄 Airtable (Create Record) → Stores the same structured data in Airtable, enabling dashboards, analytics, and ATS-like tracking.
✅ Quick Understanding: Think of this as your HR dashboard and database. Every candidate record is logged in both Google Sheets and Airtable, ready for filtering, reporting, or further action.

📊 Workflow Overview Table:

| Section | Key Roles / Nodes | Model / Service | Purpose | Benefit |
| --- | --- | --- | --- | --- |
| 📥 Entry & Intake | Gmail Trigger, Form Trigger | Gmail API / Webhook | Capture resumes from email or forms | Resumes collected instantly from multiple sources |
| 📂 File Management | Google Drive Upload, Google Drive Download, Extract from File | Google Drive + n8n Extract | Store resumes & convert to plain text | Centralized storage + text extraction for processing |
| 🤖 AI Processing | Information Extractor, AI Agent (LangChain + OpenRouter) | Regex + OpenRouter AI {gpt-oss-20b (free)} | Extract contact info + AI CV analysis | Candidate details + score + justification generated automatically |
| 🛠 Data Cleanup & Merge | Edit Fields, Code (JS Parsing & Cleanup), Merge | n8n native + Regex Parsing | Standardize and merge outputs | Clean, structured JSON record with all candidate info |
| 📊 Persistence Layer | Google Sheets Append Row, Airtable Create Record | Google Sheets + Airtable APIs | Store structured candidate data | HR dashboards & ATS-ready records for easy review and analytics |
| 🔄 Execution Flow | All connected | Gmail + Drive + Sheets + Airtable + AI | End-to-end automation | Automated resume → structured record → recruiter dashboards |

📂 Workflow Output Overview:
Each candidate's data is standardized into the following fields:
- Candidate Name
- Candidate Email
- Contact Number
- Educational Qualifications
- Job History
- Skills Set
- AI Score (1–10)
- Justification

📌 Example (Google Sheet row):

📊 Benefits of This Workflow at a Glance:
- ⏱️ **Lightning-Fast Screening** → Processes hundreds of resumes in minutes instead of hours.
- 🤖 **AI-Powered Evaluation** → Automatically summarizes candidate education, work history, and skills, and gives a suitability score (1–10) with justification.
- 📂 **Centralized Storage** → Every resume is securely saved in Google Drive for easy access and record-keeping.
- 📊 **Data-Ready Outputs** → Structured candidate profiles go straight into Google Sheets and Airtable, ready for dashboards and analytics.
- ✅ **Consistency & Fairness** → Standardized AI scoring ensures every candidate is evaluated on the same criteria, reducing human bias.
- 🛠️ **Flexible Intake** → Works with both Gmail (email applications) and Form submissions (job portals or career pages).
- 🚀 **Recruiter Productivity Boost** → Frees HR teams from manual extraction and data entry, letting them focus on interviewing and hiring the best talent.

👉 Practical HR Use Case: “Screen resumes for a Senior Software Engineer role and shortlist top candidates.”
1. Gmail Trigger → Captures incoming job applications with CVs attached.
2. Google Drive → Stores resumes for record-keeping.
3. Extract from File → Converts CVs into plain text.
4. Information Extractor → Pulls candidate name, email, and phone number.
5. AI Agent → Summarizes education, job history, skills, and assigns a suitability score (1–10).
6. Code & Merge → Cleans and combines outputs into a structured candidate profile.
7. Google Sheets → Logs candidate data for quick HR review.
8. Airtable → Builds dashboards to filter and identify top-scoring candidates.

✅ Result: HR instantly sees structured candidate records, filters by score, and focuses interviews on the best talent.
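The "Code (JS Parsing & Cleanup)" node in Section 4 turns the AI's free-text summary into JSON fields. A minimal sketch, assuming the agent labels its sections with the headings listed above (match the labels to your actual prompt's output format):

```javascript
// Sketch of the JS Parsing & Cleanup node: pull labeled lines out of the
// AI agent's free-text summary into the fields the Merge node expects.
function parseEvaluation(text) {
  const grab = (label) => {
    const m = text.match(new RegExp(label + ':\\s*([^\\n]+)', 'i'));
    return m ? m[1].trim() : '';
  };
  return {
    education: grab('Educational Qualifications'),
    jobHistory: grab('Job History'),
    skills: grab('Skills Set'),
    score: Number(grab('Score')) || 0, // 1-10; defaults to 0 if missing
    justification: grab('Justification'),
  };
}
```

A stricter variant would ask the agent for JSON directly and fall back to this regex parsing only when JSON.parse fails.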
by Oneclick AI Squad
This workflow transforms traditional REST APIs into structured, AI-accessible MCP (Model Context Protocol) tools. It provides a unified gateway that allows Claude AI to interact with any business system (CRM, ERP, databases, SaaS) safely, granularly, and auditably, through a single MCP-compliant interface.

How it works
1. Receive MCP Tool Request - Webhook ingests the tool call from an AI agent or MCP client
2. Validate & Authenticate - Verifies the API key, checks the JWT token, and validates the MCP schema
3. Tool Registry Lookup - Resolves the requested tool name to a backend API config and permission scope
4. Claude AI Intent Verification - Confirms the tool call parameters are safe, well-formed, and within policy
5. Rate Limit & Quota Check - Enforces per-client tool call limits before execution
6. Execute Backend API Call - Routes to the correct business system API with mapped parameters
7. Normalize & Enrich Response - Standardizes the API response into the MCP tool result schema
8. Audit & Log - Writes an immutable access log for compliance and observability
9. Return MCP Tool Result - Delivers the structured response back to the AI agent

Setup Steps
1. Import the workflow into n8n
2. Configure credentials:
   - Anthropic API - Claude AI for intent verification and parameter validation
   - Google Sheets - Tool registry, rate limit tracking, and audit log
   - SMTP - Alert notifications for policy violations
3. Populate the Tool Registry sheet with your API endpoints
4. Set your MCP gateway API key in the validation node
5. Activate the workflow and point your MCP client to the webhook URL

Sample MCP Tool Call Payload

{
  "mcpVersion": "1.0",
  "clientId": "agent-crm-001",
  "apiKey": "mcp-key-xxxx",
  "toolName": "crm.get_customer",
  "parameters": {
    "customerId": "CUST-10042",
    "fields": ["name", "email", "tier"]
  },
  "requestId": "req-abc-123",
  "callerContext": "User asked: show me customer details"
}

Supported Tool Categories
- **CRM Tools** - get_customer, update_contact, list_deals
- **ERP Tools** - get_inventory, create_order, update_stock
- **Database Tools** - query_records, insert_record, update_record
- **Communication Tools** - send_email, post_slack, create_ticket
- **Analytics Tools** - run_report, fetch_metrics, export_data

Features
- **MCP-compliant schema** - works with any MCP-compatible AI agent
- **Granular permission scopes** - read/write/admin per tool per client
- **Claude AI intent guard** - blocks malformed or policy-violating calls
- **Rate limiting** - per-client quota enforcement
- **Full audit trail** - every tool call logged for SOC 2 / ISO 27001

Explore More Automation: Contact us to design AI-powered lead nurturing, content engagement, and multi-platform reply workflows tailored to your growth strategy.
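Step 2 ("Validate & Authenticate") applied to the sample payload above can be sketched like this. The required-field list mirrors the sample; the JWT and schema checks mentioned in the workflow would extend it:

```javascript
// Sketch of the gateway's first validation pass: reject requests that are
// missing required MCP fields or carry the wrong gateway API key, before
// any tool-registry lookup happens.
function validateMcpRequest(body, expectedApiKey) {
  const required = ['mcpVersion', 'clientId', 'apiKey', 'toolName', 'parameters', 'requestId'];
  const missing = required.filter((k) => !(k in body));
  if (missing.length) return { ok: false, error: `missing fields: ${missing.join(', ')}` };
  if (body.apiKey !== expectedApiKey) return { ok: false, error: 'invalid api key' };
  return { ok: true };
}
```

Failing fast here keeps bad requests out of the rate-limit and Claude intent-verification steps, so quota is only spent on well-formed calls.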
by Lakindu Siriwardana
🔧 Automated Video Generator (n8n Workflow)

🚀 Features
- End-to-end video creation from a user idea or transcript
- AI-powered scriptwriting using LLMs (e.g., DeepSeek via OpenRouter)
- Voiceover generation with customizable TTS voices
- Image scene generation using generative models like together.ai
- Clip creation & concatenation into a full video
- Dynamic caption generation with styling options
- Google Drive & Sheets integration for asset storage and progress tracking

⚙️ How It Works
1. User Submits Form with: the main topic or transcript, desired duration, TTS voice, visual style (e.g., Pixar, Lego, Cyberpunk), and image generation provider.
2. AI generates a script: a catchy title, description, hook, full script, and CTA using a language model.
3. Text-to-Speech (TTS): The script is turned into audio using the selected voice, with timestamped captions generated.
4. Scene Segmentation: The script is split into 5–6 second segments for visual storyboarding.
5. Image Prompt Creation: Each scene is converted into an image prompt in the selected style (e.g., "anime close-up of a racing car").
6. Image Generation: Prompts are sent to together.ai or fal.ai to generate scenes.
7. Clip Creation: Each image is turned into a short video clip (Ken Burns-style zoom) based on script timing.
8. Video Assembly: All clips are concatenated into a single video, and captions are overlaid using the earlier timestamps.
9. Final Output: The video is uploaded to Google Drive and Telegram, and links are saved in Google Sheets.

🛠 Initial Setup

🗣️ 1. Set Up TTS Voice (Text-to-Speech)
Run your TTS server locally using Docker.

🧰 2. Set Up NCA-Toolkit
The nca-toolkit is a custom video/image processing backend used via HTTP APIs:
- http://host.docker.internal:9090/v1/image/transform/video
- http://host.docker.internal:9090/v1/video/concatenate
- http://host.docker.internal:9090/v1/ffmpeg/compose

🔧 Steps:
1. Clone or build the nca-toolkit container (if it's a private tool) and ensure it exposes port 9090.
2. It should support endpoints for: image to video (zoom effect), video concatenation, audio + video merging, and caption overlay via FFmpeg.
3. Run it locally with Docker:
docker run -d -p 9090:80 your-nca-toolkit-image

🧠 3. Set Up together.ai (Image Generation) (optional: you can use the ChatGPT API instead)
This handles image generation using models like FLUX.1-schnell.

🔧 Steps:
1. Create an account at: https://www.together.ai
2. Generate your API key
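As an illustration of how n8n's HTTP Request nodes would call the toolkit, here is a sketch that only builds the request for the /v1/video/concatenate endpoint listed above. The body shape (video_urls with per-clip objects) is an assumption; check your nca-toolkit build's documentation:

```javascript
// Hypothetical request builder for the clip-concatenation step. Building the
// request as data (rather than sending it here) mirrors how an n8n HTTP
// Request node is configured: method, URL, headers, and JSON body.
function buildConcatenateRequest(clipUrls) {
  return {
    method: 'POST',
    url: 'http://host.docker.internal:9090/v1/video/concatenate',
    headers: { 'Content-Type': 'application/json' },
    body: {
      // Assumed schema: one object per clip, in playback order.
      video_urls: clipUrls.map((url) => ({ video_url: url })),
    },
  };
}
```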
by Abdulrahman Alhalabi
NGO TPM Request Management System

Benefits

For Beneficiaries:
- **24/7 Accessibility** - Submit requests anytime via the familiar Telegram interface
- **Language Flexibility** - Communicate in Arabic through text or voice messages
- **Instant Acknowledgment** - Receive immediate confirmation that requests are logged
- **No Technical Barriers** - Works on basic smartphones without special apps

For TPM Teams:
- **Centralized Tracking** - All requests automatically logged with timestamps and user details
- **Smart Prioritization** - AI categorizes issues by urgency and type for efficient response
- **Action Guidance** - Specific recommended actions generated for each request type
- **Performance Analytics** - Track response patterns and common issues over time

For NGO Operations:
- **Cost Reduction** - Automated intake reduces manual processing overhead
- **Data Quality** - Standardized categorization ensures consistent reporting
- **Audit Trail** - Complete record of all beneficiary interactions for compliance
- **Scalability** - Handle high volumes without proportional staff increases

How it Works
1. Multi-Input Reception - Accepts both text messages and voice recordings via Telegram
2. Voice Transcription - Uses OpenAI Whisper to convert Arabic voice messages to text
3. AI Categorization - GPT-4 analyzes requests and categorizes issues (aid distribution, logistics, etc.)
4. Action Planning - AI generates specific recommended actions for the TPM team in Arabic
5. Data Logging - Records all requests, categories, and actions in Google Sheets with user details
6. Confirmation Feedback - Sends an acknowledgment message back to users via Telegram

Set up Steps
Setup Time: ~20 minutes
1. Create Telegram Bot - Get a bot token from @BotFather and configure the webhook
2. Configure APIs - Set up OpenAI (transcription + chat) and Google Sheets credentials
3. Customize AI Prompts - Adjust system messages for your NGO's specific operations
4. Set Up Spreadsheet - Link Google Sheets for request tracking and reporting
5. Test Workflows - Verify both text and voice message processing paths

Detailed Arabic language configuration and TPM-specific categorization examples are included as sticky notes within the workflow.

What You'll Need:
- Telegram Bot Token (free from @BotFather)
- OpenAI API key (Whisper + GPT-4)
- Google Sheets API credentials
- Google Spreadsheet for logging requests
- Sample Arabic text/voice messages for testing

Key Features:
- Dual input support (text + voice messages)
- Arabic language processing and responses
- Structured data extraction (category + recommended action)
- Complete audit trail with user information
- Real-time confirmation messaging
- TPM team-specific workflow optimization
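The structured extraction (category + recommended action) benefits from a guardrail before the Google Sheets logging step. A minimal sketch, with an illustrative category list (replace it with your NGO's actual taxonomy):

```javascript
// Sketch of a normalization step between GPT-4 categorization and Sheets
// logging: coerce unknown categories to "other" and guarantee a non-empty
// recommended action, so every logged row is complete.
const CATEGORIES = ['aid_distribution', 'logistics', 'protection', 'feedback', 'other']; // illustrative

function normalizeRequest(aiResult) {
  const category = CATEGORIES.includes(aiResult.category) ? aiResult.category : 'other';
  const action = (aiResult.recommended_action || '').trim();
  return {
    category,
    recommended_action: action || 'Review manually', // fallback for empty AI output
  };
}
```

Standardizing categories at this point is what makes the "consistent reporting" benefit above hold even when the model improvises a new label.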
by Cheng Siong Chin
How It Works
Scheduled runs collect data from oil markets, global shipping movements, news sources, and official reports. The system performs statistical checks to detect anomalies and volatility shifts. An AI-driven geopolitical model evaluates emerging risks and assigns a crisis score. Based on severity thresholds, results are routed to the appropriate alert channels for rapid response.

Setup Steps
1. Data Sources: Connect the oil price API, OPEC report feeds, shipping databases, and news sources.
2. AI Model: Configure the OpenRouter ChatGPT model for geopolitical and risk analysis.
3. Alerts: Define severity rules and route alerts to Email, Slack, or Dashboard APIs.
4. Storage: Configure a database for historical records, audit logging, and trend tracking.

Prerequisites
Oil market API credentials; news feed access; OPEC data source; OpenRouter API key; Slack/email/dashboard integrations

Use Cases
Supply chain risk monitoring; energy market crisis detection; geopolitical threat assessment; trader decision support; operational risk management

Customization
Adjust risk thresholds; add market data sources; modify alert routing rules

Benefits
Reduces crisis detection lag by 90%; consolidates fragmented data; enables proactive response
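The severity-threshold routing described above can be sketched as a small function. The threshold values and the score scale (0-100) are illustrative defaults; the template's "adjust risk thresholds" customization is exactly about changing them:

```javascript
// Sketch of threshold-based alert routing: map the AI's crisis score to the
// set of alert channels (severity rules are assumed defaults on a 0-100 scale).
function routeAlert(crisisScore, thresholds = { critical: 80, high: 60, medium: 40 }) {
  if (crisisScore >= thresholds.critical) return ['email', 'slack', 'dashboard'];
  if (crisisScore >= thresholds.high) return ['slack', 'dashboard'];
  if (crisisScore >= thresholds.medium) return ['dashboard'];
  return []; // below all thresholds: store for trend tracking only
}
```

In n8n this maps naturally onto a Switch node with one output per severity band.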
by khaled
What Problem Does It Solve?
Business owners, managers, and accountants waste valuable time manually entering daily expenses, supplier payments, and employee advances into Odoo. Getting quick balance reports usually requires logging into the ERP, navigating multiple menus, and generating complex reports. Managing post-dated checks often relies on manual tracking, leading to missed due dates.

This workflow solves these problems by:
-- Allowing users to record financial transactions simply by sending a natural language message (e.g., via Telegram or Botpress).
-- Automatically fetching real-time account balances and supplier statements, returning them instantly in the chat.
-- Setting up automated calendar reminders for post-dated check due dates.
-- Handling the entire double-entry accounting process in the background without human intervention.

How to Configure It
**Chat Platform Setup**
-- Add the webhook URL from this workflow to your Telegram bot, Botpress, or preferred chat interface.
**Odoo Setup**
-- Connect your Odoo credentials in n8n.
-- Open the "Build [X] Entry" code nodes and replace the placeholder journal_id and currency_id with your actual Odoo system IDs.
**AI Setup**
-- Add your OpenAI API key (or swap the node for Google Gemini/Anthropic).
-- Open the "AI Financial Agent" node and update the # ACCOUNT MAPPING section with your specific Odoo Chart of Accounts codes.
**Calendar Setup (Optional)**
-- Connect your Google Calendar credentials if you want the workflow to automatically schedule reminders for check due dates.

How It Works
1. The Webhook catches the new text message from your chat platform.
2. An AI agent analyzes the Arabic natural-language message and extracts the intent, amount, date, check details, and specific account categories.
3. Routing:
-- For Expenses, Payments, or Advances → the workflow searches Odoo for the correct IDs, builds a balanced double-entry journal record, creates it, and posts it.
-- For Post-Dated Checks → it extracts the due date and creates a Google Calendar event before posting the entry to Odoo.
-- For Balance Inquiries → it fetches the relevant ledger lines, calculates total debits/credits, and formats a clean Arabic text summary.
4. A success confirmation or the requested financial report is instantly sent back to the user in the chat.

Customization Ideas
- Expand the AI prompt and routing switch to handle Customer Invoices or internal Petty Cash transfers.
- Add an approval step (e.g., a Slack/Email button) before the workflow officially posts large transactions in Odoo.
- Change the AI prompt to support multiple languages or different regional dialects.
- Log a backup of all financial chat requests into Google Sheets or a Notion database for auditing.

For more info, contact me.
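The "builds a balanced double-entry journal record" step can be sketched as follows. The `[0, 0, {...}]` triplets are Odoo's one2many create commands for `line_ids`; the account and journal ids are the placeholders the configuration section tells you to replace:

```javascript
// Sketch of a "Build [X] Entry" node: assemble an account.move payload with
// one debit and one credit line, and verify the entry balances before it is
// ever sent to Odoo. journal_id and account ids are your own Odoo ids.
function buildJournalEntry({ journalId, date, label, debitAccountId, creditAccountId, amount }) {
  const lines = [
    [0, 0, { account_id: debitAccountId, name: label, debit: amount, credit: 0 }],
    [0, 0, { account_id: creditAccountId, name: label, debit: 0, credit: amount }],
  ];
  // Double-entry invariant: total debits must equal total credits.
  const debits = lines.reduce((sum, [, , line]) => sum + line.debit, 0);
  const credits = lines.reduce((sum, [, , line]) => sum + line.credit, 0);
  if (debits !== credits) throw new Error('entry is not balanced');
  return { journal_id: journalId, date, ref: label, line_ids: lines };
}
```

Checking the invariant in the Code node means a malformed AI extraction fails loudly in n8n instead of creating an unbalanced draft entry in Odoo.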