by Artem Boiko
Upload a construction photo via web form → get a detailed cost estimate with work breakdown, resource costs, and a professional HTML report. Powered by GPT-4 Vision and the open-source DDC CWICR database (55,000+ work items).

## Who's it for

- **Site managers** who need quick estimates from mobile photos
- **Renovation contractors** evaluating project scope from an initial site visit
- **Real estate inspectors** estimating repair costs
- **Construction consultants** providing rapid ballpark figures
- **DIY enthusiasts** planning home improvement budgets

## What it does

1. Collects a photo + region/language via n8n Form
2. Analyzes the photo with GPT-4 Vision (room type, elements, dimensions)
3. Decomposes visible elements into construction work items
4. Searches the DDC CWICR vector database for matching rates
5. Generates a professional HTML report with a cost breakdown

Supports 9 regions: Berlin · Toronto · St. Petersburg · Barcelona · Paris · São Paulo · Shanghai · Dubai · Mumbai

## How it works

Web Form (photo + language) → Stage 1: GPT-4 Vision (identify room, elements, fixtures, dimensions) → Stage 4: Decompose (break down into 15-40 construction work items) → loop over work items → Qdrant vector DB → Stage 5: Parse + Score → Stage 7.5: Aggregate → HTML report response

Pipeline stages:

| Stage | Node | Description |
|-------|------|-------------|
| 1 | GPT-4 Vision | Analyzes photo: room type, elements, materials, dimensions |
| 4 | GPT-4 Decompose | Breaks elements into work items with quantities |
| 5 | Vector Search + Score | Finds matching rates in DDC CWICR, quality scoring |
| 7.5 | Aggregate & Validate | Sums costs, groups by phase, validates results |
| 9 | HTML Report | Generates professional estimate document |

## Prerequisites

| Component | Requirement |
|-----------|-------------|
| n8n | v1.30+ with Form Trigger support |
| OpenAI API | GPT-4 Vision + Embeddings access |
| Qdrant | Vector DB with DDC CWICR collections |
| DDC CWICR Data | github.com/datadrivenconstruction/DDC-CWICR |

## Setup

### 1. n8n Credentials (Settings → Credentials)

- **OpenAI API** (required): GPT-4 Vision + text-embedding-3-large
- **Qdrant API**: your Qdrant instance connection

### 2. Qdrant Collections

Load DDC CWICR embeddings for your target regions:

- DE_BERLIN_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- ENG_TORONTO_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- RU_STPETERSBURG_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- ES_BARCELONA_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- FR_PARIS_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- PT_SAOPAULO_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- ZH_SHANGHAI_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- AR_DUBAI_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR
- HI_MUMBAI_workitems_costs_resources_EMBEDDINGS_3072_DDC_CWICR

### 3. Activate Workflow

1. Import the JSON into n8n
2. Link OpenAI + Qdrant credentials to the respective nodes
3. Activate the workflow
4. Access the form at: https://your-n8n/form/photo-estimate-pro-v3

## Features

| Feature | Description |
|---------|-------------|
| Photo Analysis | GPT-4 Vision identifies room type, elements, fixtures |
| Dimension Estimation | Uses reference objects (doors, tiles) for sizing |
| Work Decomposition | Breaks down to 15-40 specific work items |
| Quality Scoring | Rates match quality (high/medium/low/not_found) |
| Phase Grouping | PREPARATION → MAIN → FINISHING → MEP |
| Cost Breakdown | Labor, materials, machines per item |
| Validation | Warns if <50% of rates are found or demolition is missing |
| 9 Languages | Full localization + regional pricing |

## Form Fields

| Field | Type | Options |
|-------|------|---------|
| Upload Photo | File | .jpg, .png, .webp |
| Region & Language | Dropdown | 9 regions with currencies |
| Work Type | Dropdown | New / Renovation / Repair / Auto |
| Description | Textarea | Optional context |

## Example Output

Input: bathroom photo (renovation)
Region: German - Berlin (EUR €)

Generated work items:

PREPARATION (3 items)
├── Demolition of wall tiles – 12 m² – €180
├── Demolition of floor tiles – 4.5 m² – €95
└── Disposal of construction waste – 0.8 m³ – €120

MAIN (8 items)
├── Floor waterproofing – 4.5 m² – €225
├── Wall waterproofing wet zone – 8 m² – €280
├── Floor screed – 4.5 m² – €135
├── Wall tiling – 22 m² – €880
├── Floor tiling – 4.5 m² – €225
├── Toilet installation – 1 pcs – €320
├── Sink installation – 1 pcs – €185
└── Shower cabin installation – 1 pcs – €450

FINISHING (3 items)
├── Ceiling painting – 4.5 m² – €68
├── Grouting – 26.5 m² – €133
└── Silicone sealing – 8 m – €48

MEP (4 items)
├── Socket installation – 2 pcs – €90
├── Light point installation – 2 pcs – €120
├── Mixer/faucet installation – 2 pcs – €160
└── Ventilation installation – 1 pcs – €85

─────────────────────────────
TOTAL: €3,799.00
Labor: €1,520 · Materials: €1,900 · Machines: €379
Quality: 78% high match · 18 work items

## Quality Scoring System

| Score | Level | Meaning |
|-------|-------|---------|
| 60-100 | 🟢 High | Exact match with resources |
| 40-59 | 🟡 Medium | Good match, minor differences |
| 20-39 | 🟠 Low | Partial match, review needed |
| 0-19 | 🔴 Not Found | No suitable rate found |

Scoring factors:

- Has price in database (+30)
- Has resources breakdown (+25)
- Unit matches expected (+20)
- Material keywords match (+15)
- Work type keywords match (+10)
- Vector similarity > 0.5 (+10)

## Notes & Tips

- **Best photo angles:** Capture the full room and include reference objects (doors, sockets)
- **Renovation mode:** The AI automatically adds demolition works
- **Validation warnings:** If <50% of rates are found, the estimate may need manual additions
- **Rate accuracy:** Depends on DDC CWICR coverage for your region
- **Extend:** Chain with PDF generation, email delivery, or CRM integration

## Categories

AI · Data Extraction · Document Ops · Files & Storage

## Tags

photo-analysis, gpt-4-vision, construction, cost-estimation, qdrant, vector-search, form-trigger, html-report, multilingual

## Author

DataDrivenConstruction.io
https://DataDrivenConstruction.io
info@datadrivenconstruction.io

## Consulting & Training

We help construction, engineering, and technology firms implement:

- AI-powered visual estimation systems
- CAD/BIM data processing pipelines
- Vector database integration for construction data
- Multilingual cost database solutions

Contact us to test the workflow with your data or adapt it to your project requirements.

## Resources

- **DDC CWICR Database:** GitHub
- **Qdrant Documentation:** qdrant.tech/documentation
- **n8n Form Trigger:** docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.formtrigger

⭐ Star us on GitHub! github.com/datadrivenconstruction/DDC-CWICR
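The quality scoring system described above is additive: each satisfied factor contributes a fixed number of points, and the total maps to a level. A minimal sketch of that logic (field names such as `hasPrice` and `similarity` are illustrative assumptions, not the workflow's actual data model):

```javascript
// Sketch of the additive quality-scoring logic from the template description.
// Input field names are assumptions for illustration only.
function scoreMatch(m) {
  let score = 0;
  if (m.hasPrice) score += 30;               // has price in database
  if (m.hasResources) score += 25;           // has resources breakdown
  if (m.unitMatches) score += 20;            // unit matches expected
  if (m.materialKeywordsMatch) score += 15;  // material keywords match
  if (m.workTypeKeywordsMatch) score += 10;  // work type keywords match
  if (m.similarity > 0.5) score += 10;       // vector similarity threshold
  return score;
}

// Map a numeric score to the level bands from the table above.
function scoreLevel(score) {
  if (score >= 60) return 'high';
  if (score >= 40) return 'medium';
  if (score >= 20) return 'low';
  return 'not_found';
}
```

For example, a rate with a price, a resources breakdown, and a matching unit scores 75 points and lands in the "high" band even if its vector similarity is below 0.5.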
by Vonn
This n8n template demonstrates how to fully automate the creation of UGC-style product videos using AI, starting from a simple Google Sheet. It transforms product data into AI-generated images, cinematic video scripts, and final videos, then uploads everything to Google Drive and updates your sheet automatically.

## Use cases

- Generate UGC ads at scale for e-commerce products
- Create TikTok / Reels content automatically
- Build content pipelines for agencies or creators
- Rapidly test different product angles, audiences, and messaging
- Automate creative production from structured data (Google Sheets)

## Good to know

- This workflow uses multiple AI services, so cost depends on usage: image generation (DALL·E) and video generation (Sora)
- Video generation is asynchronous and may take several minutes per item
- Some AI models (like Sora) may be region-restricted or limited-access
- Generated image URLs may expire, so storing them (as done here) is important

## How it works

1. Reads product data from Google Sheets and selects rows marked "Pending".
2. Creates a prompt and generates a product image.
3. Analyzes the image and turns it into a video script.
4. Sends the script to Sora and waits until the video is ready.
5. Uploads the video to Google Drive and updates the sheet.
6. Logs errors and marks the row as "Error".

## How to use

1. Add products to Google Sheets with name, description, and audience, and set status to "Pending".
2. Run the workflow or let the schedule trigger process items automatically.
3. The system generates image → script → video, uploads them, and updates your sheet.

## Requirements

- OpenAI, Google Sheets, and Google Drive accounts
- An n8n instance with credentials configured

## Customizing this workflow

- Replace the schedule trigger with a webhook or form for real-time use.
- Generate multiple videos per product.
- Send outputs to platforms like TikTok, Meta Ads, or CMS tools.
- Add voiceovers, captions, or permanent asset storage.
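The first step, selecting rows marked "Pending", could look like this inside an n8n Code node (a sketch; the `Status` column name is an assumption based on the description above):

```javascript
// Sketch: keep only sheet rows whose status is "Pending".
// The "Status" property name is an illustrative assumption about
// the Google Sheet's column header.
function selectPending(rows) {
  return rows.filter(
    (r) => String(r.Status || '').trim().toLowerCase() === 'pending'
  );
}
```

Normalizing case and whitespace makes the filter tolerant of values typed by hand into the sheet.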
by Guillaume Duvernay
This template introduces a revolutionary approach to automated web research. Instead of a rigid workflow that can only find one type of information, this system uses a "thinker" and "doer" AI architecture. It dynamically interprets your plain-English research request, designs a custom spreadsheet (CSV) with the perfect columns for your goal, and then deploys a web-scraping AI to fill it out. It's like having an expert research assistant who not only finds the data you need but also builds the perfect container for it on the fly.

Whether you're looking for sales leads, competitor data, or market trends, this workflow adapts to your request and delivers a perfectly structured, ready-to-use dataset every time.

## Who is this for?

- **Sales & marketing teams:** Generate targeted lead lists, compile competitor analysis, or gather market intelligence with a simple text prompt.
- **Researchers & analysts:** Quickly gather and structure data from the web for any topic without needing to write custom scrapers.
- **Entrepreneurs & business owners:** Perform rapid market research to validate ideas, find suppliers, or identify opportunities.
- **Anyone who needs structured data:** Transform unstructured, natural-language requests into clean, organized spreadsheets.

## What problem does this solve?

- **Eliminates rigid, single-purpose workflows:** This workflow isn't hardcoded to find just one thing. It dynamically adapts its entire research plan and data structure based on your request.
- **Automates the entire research process:** It handles everything from understanding the goal and planning the research to executing the web search and structuring the final data.
- **Bridges the gap between questions and data:** It translates your high-level goal (e.g., "I need sales leads") into a concrete, structured spreadsheet with all the necessary columns (Company Name, Website, Key Contacts, etc.).
- **Optimizes for cost and efficiency:** It intelligently combines deep-dive and standard web searches from **Linkup.so** to gather high-quality initial results and then enrich them cost-effectively.

## How it works (the "Thinker & Doer" method)

The process is split into two main phases.

**The "Thinker" (AI planner):**

1. You submit a research request via the built-in form (e.g., "Find 50 US-based fashion companies for a sales outreach campaign").
2. The first AI node acts as the "thinker." It analyzes your request and determines the optimal structure for your final spreadsheet.
3. It dynamically generates a plan, which includes a discoveryQuery to find the initial list, an enrichmentQuery to get details for each item, and the JSON schemas that define the exact columns for your CSV.

**The "Doer" (AI researcher):** The rest of the workflow executes the plan.

1. **Discovery:** It uses a powerful "deep search" with Linkup.so to execute the discoveryQuery and find the initial list of items (e.g., the 50 fashion companies).
2. **Enrichment:** It then loops through each item in the list, performing a fast, cost-effective "standard search" with Linkup to execute the enrichmentQuery and fill in all the detailed columns defined by the "thinker."
3. **Final output:** The workflow consolidates the enriched data and converts it into a final CSV file, ready for download or further processing.

## Setup

1. **Connect your AI provider:** In the OpenAI Chat Model node, add your AI provider's credentials.
2. **Connect your Linkup account:** In the two Linkup (HTTP Request) nodes, add your Linkup API key (free account at linkup.so). We recommend creating a "Generic Credential" of type "Bearer Token" for this. Linkup offers €5 of free credits monthly, enough for 1,000 standard searches or 100 deep queries.
3. **Activate the workflow:** Toggle the workflow to "Active." You can now use the form to submit your first research request!
## Taking it further

- **Add a custom dashboard:** Replace the form trigger and final CSV output with a more polished user experience, such as a simple web app where users can submit requests and download their completed research files.
- **Make it company-aware:** Modify the "thinker" AI's prompt to include context about your company, so it generates research plans automatically tailored to finding leads or data relevant to your specific products and services.
- **Add an AI summary layer:** After the CSV is generated, add a final AI node that reads the entire file and produces a high-level summary, such as "Here are the top 5 leads to contact first and why," turning the raw data into an instant, actionable report.
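The "thinker" hands the "doer" a plan object containing a discoveryQuery, an enrichmentQuery, and the column schemas. A minimal validation sketch for that shape (the two query fields come from the description above; the `columns` array is an illustrative assumption about how the schema might be represented):

```javascript
// Sketch: check that a "thinker" plan carries everything the "doer"
// phase needs before spending search credits on it.
// `columns` is an assumed field; the real plan uses JSON schemas.
function isValidPlan(plan) {
  return Boolean(
    plan &&
      typeof plan.discoveryQuery === 'string' && plan.discoveryQuery.length > 0 &&
      typeof plan.enrichmentQuery === 'string' && plan.enrichmentQuery.length > 0 &&
      Array.isArray(plan.columns) && plan.columns.length > 0
  );
}
```

Guarding on the plan shape early keeps a malformed LLM response from triggering a full (and billed) discovery search.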
by Olaf Titel
# Setup & Instructions – fluidX: Create Session, Analyze & Notify

**Goal:** This workflow demonstrates the full fluidX THE EYE integration: starting a live session, inviting both the customer (via SMS) and the service agent (via email), and then accessing the media (photos and videos) created during the session. Captured images are automatically analyzed with AI and uploaded to external storage (such as Google Drive), and a media summary for the session is generated at the end.

- The agent receives an email with a link to join the live session.
- The customer receives an SMS with a link to start sharing their camera.
- Once both are connected, the agent can view the live feed, and the system automatically stores uploaded images and videos in Google Drive.
- When the session ends, the workflow collects all media and creates a complete AI-powered session summary (stored and updated in Google Drive).

Below is an example screenshot from the customer's phone.

## Prerequisites

- **Developer account:** https://live.fluidx.digital (activate the **TEST plan**, €0)
- **API docs (Swagger):** fluidX.digital API

## Required Credentials

1. **fluidX API key (HTTP Header Auth)**
   - Credential name in n8n: fluidx API key
   - Header name: x-api-key
   - Header value: YOUR_API_KEY
2. **SMTP account (for outbound email)**
   - Credential name in n8n: SMTP account
   - Configure host, port, username, and password according to your provider
   - Enable TLS/SSL as required
3. **Google Drive account**
   - Used to store photos and videos and to automatically update the session summary files
4. **OpenAI API (for AI analysis & summary)**
   - Used in the Analyze Images (AI) and Generate Summary parts of the workflow
   - Credential type: OpenAI
   - Credential name (suggested): OpenAI account
   - API Key: your OpenAI API key
   - Model: e.g. gpt-4.1, gpt-4o, or similar (choose in the OpenAI node settings)

## Configuration (in the "Set Config" node)

- BASE_URL: https://live.fluidx.digital
- company / project / billingcode / sku: adjust as needed
- emailAgent: set before running (empty in template)
- phoneNumberUser: set before running (empty in template)

## Flow Overview

Form Trigger → Create Session → Set Session Vars → Send SMS (User) → Send Email (Agent) → Monitor Media → Analyze Images (AI) → Upload Files to Google Drive → Generate Summary → Update Summary File

The workflow starts automatically when a form submission is received. Users enter the customer's phone number and the agent's email, and the system creates a new fluidX THE EYE session. As media is uploaded during the session, the workflow automatically retrieves, stores, analyzes, and summarizes it, providing a complete end-to-end automation example for remote inspection, support, or field-service use cases.

## Notes

- Do not store real personal data inside the template.
- Manage API keys and secrets via n8n Credentials or environment variables.
- Log out of https://live.fluidx.digital in the agent's browser before testing, to ensure a clean invite flow and session creation.
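The credential setup above uses HTTP header auth with an `x-api-key` header against BASE_URL. A sketch of how such a request could be assembled (the `/sessions` path is a placeholder assumption, not the documented route; consult the Swagger docs for the real session-creation endpoint):

```javascript
// Sketch: assemble request options for a fluidX API call using the
// x-api-key header auth described above. The endpoint path is a
// placeholder assumption and must be taken from the Swagger docs.
function buildSessionRequest(baseUrl, apiKey, body) {
  return {
    method: 'POST',
    url: `${baseUrl}/sessions`, // hypothetical path, not confirmed
    headers: {
      'x-api-key': apiKey,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify(body),
  };
}
```

In n8n, the same thing is achieved by attaching the "fluidx API key" Header Auth credential to the Create Session HTTP Request node, so the key never appears in the workflow JSON.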
by Fabrice
This n8n template shows you how to automate document summarization while keeping full digital sovereignty. By combining Nextcloud for file storage with the IONOS AI Model Hub, your sensitive documents stay on European infrastructure, fully outside US CLOUD Act jurisdiction.

## Use cases

- **Daily briefings:** Automatically summarize lengthy reports as soon as your team uploads them.
- **Meeting notes:** Turn uploaded transcripts into clear, actionable bullet points, hands-free.
- **Research management:** Get instant summaries of academic papers or PDFs the moment they land in your research folder.

## How it works

1. A Schedule Trigger kicks off the workflow at regular intervals.
2. It calls the Nextcloud List a Folder node to retrieve all files in your chosen directory.
3. A Code node then acts as a filter, checking each file's last-modified timestamp and letting through only files changed within the past 24 hours.
4. Filtered files are pulled into the workflow via the Nextcloud Download a File node.
5. From there, the Summarization Chain takes over: a Default Data Loader and Token Splitter prepare the content, and the IONOS AI Model Hub Chat Model generates a concise, structured summary.
6. The final output is saved back to your Nextcloud instance as a note for your team.

## Good to know

- **Nextcloud setup:** Make sure your Nextcloud credentials include both read and write permissions: one to download source files, one to upload the generated summary.
- **Model selection:** The IONOS AI Model Hub currently offers several LLMs, including Llama 3.1 8B, Mistral Nemo 12B, Mistral Small 24B, Llama 3.3 70B, GPT-OSS 120B, and Llama 3.1 405B. For document summarization, Llama 3.3 70B strikes the best balance between output quality and speed. If you're processing high volumes or need faster turnaround, Mistral Nemo 12B is a leaner alternative.

## How to set it up

1. Set your folder path in the List a Folder node to the directory you want to monitor.
2. In the New Files Only code node, adjust the 24 value to change how far back the workflow looks.
3. Open the Summarization Chain and define your output format, for example: "Summarize this document in 3 bullet points."

## Requirements

- Nextcloud account for file hosting and retrieval
- IONOS Cloud account to access the AI Model Hub
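The New Files Only filter described above can be sketched like this (assuming each item exposes a `lastModified` timestamp; the exact field name depends on the Nextcloud node's output format):

```javascript
// Sketch of the "New Files Only" Code node: keep only files whose
// last-modified timestamp falls within the past `hours` hours.
// The `lastModified` field name is an assumption about the
// Nextcloud node's output.
function newFilesOnly(files, hours, now = Date.now()) {
  const cutoffMs = now - hours * 60 * 60 * 1000;
  return files.filter((f) => new Date(f.lastModified).getTime() >= cutoffMs);
}
```

Changing the `hours` argument is the same knob as the "24 value" mentioned in the setup steps.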
by vinci-king-01
# Meeting Notes Distributor – Mailchimp and MongoDB

This workflow automatically converts raw meeting recordings or written notes into concise summaries, stores them in MongoDB for future reference, and distributes the summaries to all meeting participants through Mailchimp. It is ideal for teams that want to keep everyone aligned without manual copy-and-paste or email chains.

## Pre-conditions/Requirements

### Prerequisites

- n8n instance (self-hosted or cloud)
- Audio transcription service or written notes available via an HTTP endpoint
- MongoDB database (cloud or self-hosted)
- Mailchimp account with an existing Audience list

### Required Credentials

- **MongoDB**: connection string with insert permission
- **Mailchimp API Key**: to send campaigns
- **(Optional) HTTP Service Auth**: if your transcription/notes endpoint is secured

### Specific Setup Requirements

| Component | Example Value | Notes |
|------------------|--------------------------------------------|-----------------------------------------------------|
| MongoDB Database | meeting_notes | Database in which summaries will be stored |
| Collection Name | summaries | Collection automatically created if it doesn't exist |
| Mailchimp List | Meeting Participants | Audience list containing participant email addresses |
| Notes Endpoint | https://example.com/api/meetings/{id} | Returns raw transcript or note text (JSON) |

## How it works

Key steps:

1. **Schedule Trigger**: Fires daily (or on demand) to check for new meeting notes.
2. **HTTP Request**: Downloads the raw notes or transcript from your endpoint.
3. **Code node**: Uses an AI or custom function to generate a concise summary.
4. **If node**: Skips processing if the summary already exists in MongoDB.
5. **MongoDB**: Inserts the new summary document.
6. **Split in Batches**: Splits participants into Mailchimp-friendly batch sizes.
7. **Mailchimp**: Sends personalized summary emails to each participant.
8. **Wait**: Ensures rate limits are respected between Mailchimp calls.
9. **Merge**: Consolidates success/failure results for logging or alerting.

## Set up steps

Setup time: 15-25 minutes

1. **Clone the workflow:** Import or copy the JSON into your n8n instance.
2. **Configure the Schedule Trigger:** Set the cron expression (e.g., every weekday at 18:00).
3. **Set the HTTP Request URL:** Replace the placeholder with your transcription/notes endpoint. Add auth headers if needed.
4. **Add MongoDB credentials:** Enter your connection string in the MongoDB node.
5. **Customize the summary logic:** Open the Code node to tweak summarization length, language, or model.
6. **Mailchimp credentials:** Supply your API key and select the correct Audience list.
7. **Map email fields:** Ensure participant emails are supplied from transcription metadata or an external source.
8. **Test run:** Execute once manually to verify the MongoDB insert and email delivery.
9. **Activate the workflow:** Enable it so it runs on its defined schedule.

## Node Descriptions

- **Schedule Trigger**: Initiates the workflow at predefined intervals.
- **HTTP Request**: Retrieves the latest meeting data (transcript or notes).
- **Code**: Generates a summarized version of the meeting content.
- **If**: Checks MongoDB for duplicates to avoid re-sending.
- **MongoDB**: Stores finalized summaries for archival and audit.
- **SplitInBatches**: Breaks the participant list into manageable chunks.
- **Mailchimp**: Sends summary emails via campaigns or transactional messages.
- **Wait**: Pauses between batches to honor Mailchimp rate limits.
- **Merge**: Aggregates success/failure responses for logging.
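The If-node duplicate check described above keys on the meeting ID. Its logic can be sketched in isolation like this (in the real workflow the lookup is a MongoDB query, not an in-memory set; this only illustrates the keying):

```javascript
// Sketch of the duplicate guard: a summary is "new" only if its
// meetingId has not been stored before. The real workflow performs
// this check with a MongoDB find on the summaries collection.
function isNewSummary(meetingId, existingIds) {
  return !new Set(existingIds).has(meetingId);
}
```

Using the meeting ID as the unique key is what prevents the workflow from re-sending the same recap when the schedule trigger fires again.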
## Data Flow

Schedule Trigger → HTTP Request → Code → If

- If the summary is new: MongoDB → SplitInBatches → Mailchimp → Wait
- Merge collates all results

## Customization Examples

**1. Change summary length**

```javascript
// Inside the Code node. summarize() is the author's placeholder
// helper for the actual summarization call.
const rawText = items[0].json.text;
const maxSentences = 5; // adjust to 3, 7, etc.
items[0].json.summary = summarize(rawText, maxSentences);
return items;
```

**2. Personalize the Mailchimp subject**

```javascript
// In the Set node before Mailchimp (note the template-literal backticks)
items[0].json.subject = `Recap: ${items[0].json.meetingTitle} - ${new Date().toLocaleDateString()}`;
return items;
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "meetingId": "abc123",
  "meetingTitle": "Quarterly Planning",
  "summary": "Key decisions on roadmap, budget approvals...",
  "participants": ["alice@example.com", "bob@example.com"],
  "mongoInsertId": "65d9278fa01e3f94b1234567",
  "mailchimpBatchIds": ["2024-01-01T12:00:00Z#1", "2024-01-01T12:01:00Z#2"]
}
```

## Troubleshooting

Common issues:

- **Mailchimp rate-limit errors**: Increase the Wait node delay or reduce the batch size.
- **Duplicate summaries**: Ensure the If node correctly queries MongoDB using the meeting ID as a unique key.

Performance tips:

- Keep batch sizes under 500 to stay well within Mailchimp limits.
- Offload AI summarization to external services if Code node execution time is high.

Pro tips:

- Store full transcripts in MongoDB GridFS for future reference.
- Use environment variables in n8n for all API keys to simplify workflow export/import.
- Add a notifier (e.g., a Slack node) after Merge to alert admins on failures.

This is a community template provided "as-is" without warranty. Always validate the workflow in a test environment before using it in production.
by JJ Tham
# Generate AI Voiceovers from Scripts and Upload to Google Drive

This is the final piece of the AI content factory. This workflow takes your text-based video scripts and automatically generates high-quality audio voiceovers for each one, turning your text into ready-to-use audio assets for your video ads. Go from a spreadsheet of text to a folder of audio files, completely on autopilot.

## ⚠️ Critical requirements (read first!)

This is an advanced, self-hosted workflow that requires specific local setup:

- **Self-hosted n8n only:** This workflow uses the Execute Command and Read/Write Files nodes, which require you to run your own instance of n8n. It will not work on n8n Cloud.
- **FFmpeg installation:** You must have FFmpeg installed on the same machine where your n8n instance is running. It is used to convert the audio files to a standard format.

## What it does

This is Part 3 of the AI marketing series. It connects to the Google Sheet where you generated your video scripts (in Part 2). For each script that hasn't been processed, it:

1. Uses the Google Gemini Text-to-Speech (TTS) API to generate a voiceover.
2. Saves the audio file to your local computer.
3. Uses FFmpeg to convert the raw audio into a standard .wav file.
4. Uploads the final .wav file to your Google Drive.
5. Updates the original Google Sheet with a link to the audio file in Drive and marks the script as complete.

## Who's it for

- **Video creators & marketers:** Mass-produce voiceovers for video ads, tutorials, or social media content without hiring voice actors.
- **Automation power users:** A powerful example of how n8n can bridge cloud APIs with local machine commands.
- **Agencies:** Drastically speed up the production of audio assets for client campaigns.

## How to set up

IMPORTANT: This workflow is Part 3 of a series and requires the output from Part 2. To find Part 1 ("Generate a Strategic Plan") and Part 2 ("Generate AI Video Ad Scripts"), please visit my n8n Creator Profile, where they are available for free.

1. **Connect to your scripts sheet:** In the "Getting Video Scripts" node, connect your Google Sheets account and provide the URL of the sheet containing your generated video scripts from Part 2.
2. **Configure AI voice generation (HTTP Request):** In the "HTTP Request To Generate Voice" node, go to the Query Parameters and replace INSERT YOUR API KEY HERE with your Google Gemini API key. In the JSON Body, you can customize the voice prompt (e.g., change <INSERT YOUR DESIRED ACCENT HERE>).
3. **Set your local file path:** In the first "Read/Write Files from Disk" node, update the File Name field to a valid directory on your local machine where n8n has permission to write files. Replace /Users/INSERT_YOUR_LOCAL_STORAGE_HERE/.
4. **Connect Google Drive:** In the "Uploading Wav File" node, connect your Google Drive account and choose the folder where your audio files will be saved.
5. **Update your tracking sheet:** In the final "Uploading Google Drive Link..." node, ensure it is connected to the same Google Sheet from step 1. This node will update your sheet with the results.
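The FFmpeg conversion step mentioned above runs through the Execute Command node. A sketch of how the command string could be assembled (the paths, sample rate, and channel flags are illustrative assumptions, not the template's exact command):

```javascript
// Sketch: build the shell command the Execute Command node could run
// to convert raw TTS audio into a standard .wav file.
// -y  overwrite output, -i input file, -ar sample rate, -ac channels.
// The specific flag values here are assumptions for illustration.
function buildFfmpegCommand(inputPath, outputPath) {
  return `ffmpeg -y -i "${inputPath}" -ar 44100 -ac 1 "${outputPath}"`;
}
```

Quoting both paths keeps the command safe for directories with spaces, which is common under /Users/... on macOS.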
by Amit Kumar
## Overview

This n8n template automates the entire process of generating short-form AI videos and publishing them across multiple social media platforms. It combines Google Gemini for structured prompt creation, KIE AI for video generation, and Blotato for centralized publishing. The result is a fully automated content pipeline ideal for creators, marketers, agencies, or anyone who wants consistent, hands-free content generation.

This workflow is especially useful for short-video creators, meme pages, educational creators, UGC teams, auto-posting accounts, and brands that want to maintain high-frequency posting without manual effort.

## Good to Know

- **API costs:** KIE AI generates videos using paid tokens/credits. Prices vary based on model, duration, and resolution (check KIE AI pricing).
- **Google Gemini model restrictions:** Certain Gemini models are geo-limited. If you receive "model not found," the model may not be available in your region.
- **Blotato publishing:** Blotato supports many platforms: YouTube, Instagram, Facebook, LinkedIn, TikTok, X, Bluesky, and more. Platform availability depends on your Blotato setup.
- **Runtime considerations:** Video generation can take time (10-60 seconds or more, depending on complexity).
- **Self-hosted requirement:** This workflow uses a community node (Blotato). Community nodes do not run on n8n Cloud, so a self-hosted instance is required.

## How It Works

1. **Scheduler Trigger** defines how frequently new videos should be created (e.g., every 12 hours).
2. **Random Template Selector**: a JavaScript node generates a random number to choose from multiple creative prompt templates.
3. **AI Agent (Google Gemini)** generates a JSON object containing a short title, a human-readable video description, and a detailed text-to-video prompt. The Structured Output Parser enforces a strict JSON shape.
4. **Video generation with KIE AI**: the prompt is sent to KIE AI's video generation API, which creates a synthetic AI video based on the description and your chosen parameters (aspect ratio, frames, watermark removal, etc.).
5. **Polling & retrieval**: the workflow waits until the video is fully rendered, then fetches the final video URL.
6. **Media upload to Blotato**: the generated video is uploaded into Blotato's media storage for publishing.
7. **Automatic posting to social platforms**: Blotato distributes the video to all connected platforms, for example YouTube, Instagram, Facebook, LinkedIn, Bluesky, TikTok, X, or any other platform supported by your Blotato account.

This results in a fully automated "idea → video → upload → publish" pipeline.

## How to Use

1. Start by testing the workflow manually to verify video generation and posting.
2. Adjust the Scheduler Trigger to fit your posting frequency.
3. Add your API credentials for Google Gemini, KIE AI, and Blotato.
4. Ensure your Blotato account has social channels connected.
5. Edit or expand the prompt templates for your content niche: comedy clips, educational videos, product demos, storytelling, pet videos, motivational content. The more template prompts you add, the more diverse your automated videos will be.

## Requirements

- **Google Gemini API key**: used for generating structured titles, descriptions, and video prompts.
- **KIE AI API key**: required for creating the actual AI-generated video.
- **Blotato account**: required for uploading media and automatically posting to platforms.
- **Self-hosted n8n instance**: needed because Blotato uses a community node, which n8n Cloud does not support.

## Limitations

- KIE AI models may output inconsistent results if prompts are vague.
- High-frequency scheduling may consume API credits quickly.
- Some platforms (e.g., TikTok or Facebook Pages) may require additional permissions or account-linking steps in Blotato.
- Video rendering time varies depending on prompt complexity.

## Customization Ideas

- Add more prompt templates to increase variety.
- Swap Gemini for an LLM of your choice (OpenAI, Claude, etc.).
- Add a Telegram, Discord, or Slack notification once posting is complete.
- Store all generated titles, descriptions, and video URLs in Google Sheets, Notion, Airtable, or Supabase.
- Add multi-language support using a translation node.
- Add an approval step where videos go to your team before publishing.
- Add analytics logging (impressions, views, etc.) using Blotato or another service.

## Troubleshooting

- **Video not generating?** Check whether your KIE AI model accepts your chosen parameters.
- **Model not found?** Switch to a Gemini model supported in your region.
- **Publishing fails?** Ensure your Blotato platform accounts are authenticated.
- **Workflow stops early?** Increase the wait timeout before polling KIE AI.

This template is designed for easy setup and high flexibility. All technical details, configuration steps, and workflow logic are included in sticky notes inside the workflow. Once configured, this pipeline becomes a hands-free, AI-powered content engine capable of generating and publishing content at scale.
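The Random Template Selector step can be sketched as plain JavaScript, similar to what an n8n Code node would run. This is a minimal illustration: the template strings below are placeholders, not the prompts shipped with the workflow.

```javascript
// Minimal sketch of the Random Template Selector logic.
// The template texts are illustrative placeholders.
const templates = [
  "A cinematic slow-motion shot of ocean waves at sunset",
  "A playful golden retriever puppy chasing bubbles in a park",
  "A cozy rainy-day scene at a coffee shop window",
];

function pickTemplate() {
  // Generate a random index into the template list.
  const index = Math.floor(Math.random() * templates.length);
  // In n8n, this object would be returned as the item's `json` payload
  // and passed on to the Gemini AI Agent node.
  return { templateIndex: index, promptTemplate: templates[index] };
}

console.log(pickTemplate());
```

Adding more entries to `templates` is all it takes to increase variety, which is why the template encourages expanding the prompt list.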
by vinci-king-01
# Error Alert Aggregator – Email and Jira

This workflow aggregates error logs arriving from multiple sources, deduplicates identical events within a configurable time window, and sends a single consolidated notification via Email and Jira. It prevents alert fatigue by batching similar errors and ensures that responsible teams are informed through both channels.

## Pre-conditions/Requirements

### Prerequisites

- n8n instance (self-hosted ≥ v1.0 or n8n.cloud account)
- Basic understanding of your log source's payload structure
- SMTP server or n8n Email credentials configured
- Jira Cloud or Jira Server account with API access

### Required Credentials

- **Email (SMTP/IMAP or n8n Email node credential)** - to dispatch alert emails
- **Jira** - to create issues automatically in the chosen project
- **HTTP Request Auth (optional)** - if your log endpoint requires authentication

### Specific Setup Requirements

| Setting | Recommended Value | Notes |
|-----------------------------|----------------------------------------|-----------------------------------------------------------|
| Batch window (Wait node) | 10 minutes | Time allowed to collect & deduplicate errors |
| Deduplication key (Code) | `error_id` or `message` field | Choose a unique attribute representing the same incident |
| Email recipients | Security & DevOps distribution list | Use semicolons for multiple addresses |
| Jira project key | SEC | Project where alert tickets should be filed |

## How it works

Key Steps:

- **Schedule Trigger**: Runs every X minutes to poll/collect new log items.
- **HTTP Request**: Pulls error events from your monitoring or log system.
- **IF Node**: Quickly filters out non-error or resolved events.
- **Code Node (Deduplicator)**: Hashes & stores unique error signatures, skipping already-seen items.
- **Wait Node**: Holds processing for the batching period (e.g., 10 min).
- **Merge Node**: Combines all unique errors gathered during the window.
- **Set Node**: Formats the consolidated message for Email & Jira.
- **Email Send**: Dispatches the summary email.
- **Jira Node**: Creates (or updates) an issue with the same summary.
- **Sticky Notes**: Provide inline documentation right inside the workflow for easier maintenance.

## Set up steps

Setup Time: 15-20 minutes

1. Import template: Download the JSON template and drag & drop it into your n8n editor.
2. Configure Schedule Trigger: Set the polling interval (e.g., every 5 minutes).
3. HTTP Request node: Enter the URL of your log endpoint and add authentication if required.
4. Adjust IF filter: Modify the condition to match your log's error severity field (e.g., `status === "error"`).
5. Customize Code node: Replace `error_id` with the field that uniquely identifies an error. Optionally tweak the deduplication TTL.
6. Wait node: Set the batch time (e.g., 600 seconds).
7. Set node: Edit the email subject/body and the Jira issue summary/description placeholders.
8. Credentials: Add or select your Email credential in Email Send, and your Jira credential in the Jira node.
9. Test run the workflow to verify that duplicate events are collapsed and that Email and Jira tickets show the combined information.
10. Activate the workflow to start production monitoring.

## Node Descriptions

Core Workflow Nodes:

- **Schedule Trigger** - Initiates the workflow on a fixed interval.
- **HTTP Request** - Retrieves fresh error logs from an external API.
- **IF** - Only lets true error events proceed.
- **Code (Deduplicator)** - Uses JavaScript to remove already-known errors via n8n static data.
- **Wait** - Creates a batching window for aggregation.
- **Merge (Queue mode)** - Joins events accumulated during the wait.
- **Set** - Crafts a human-readable report for Email & Jira.
- **Email Send** - Dispatches the consolidated message to stakeholders.
- **Jira** - Opens/updates an issue containing the same error digest.
- **Sticky Note** - Provides inline explanations for future maintainers.

Data Flow:

Schedule Trigger → HTTP Request → IF → Code
Code → Wait → Merge → Set
Set → Email Send & Jira

## Customization Examples

### Change Deduplication Strategy

```javascript
// Code Node snippet
// Use error 'stacktrace' + 'service' for uniqueness
const signature = `${item.json.stacktrace}_${item.json.service}`;
if ($workflow.staticData.signatureCache?.includes(signature)) {
  // duplicate, skip
  return [];
}
$workflow.staticData.signatureCache = [
  ...($workflow.staticData.signatureCache || []),
  signature,
];
return item;
```

### Update Existing Jira Issue Instead of Creating New

In the Jira node settings, search for an open ticket with the same summary; if one is found, add a comment instead of creating a new issue:

```json
{
  "operation": "comment",
  "issueKey": "={{$node['Set'].json['jiraIssueKey']}}",
  "comment": "New occurrences: {{$json.errorCount}}"
}
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "errors": [
    {
      "id": "ERR123",
      "message": "Database timeout",
      "count": 5,
      "firstSeen": "2024-03-14T10:12:00Z",
      "lastSeen": "2024-03-14T10:22:00Z"
    }
  ],
  "emailStatus": "success",
  "jiraStatus": "issue_created"
}
```

## Troubleshooting

Common Issues:

- **No data returned from HTTP Request** - Verify the endpoint URL, authentication headers, and that your monitoring tool actually has recent error events.
- **Duplicate alerts still coming through** - Increase the Wait node's batching window or refine the deduplication key in the Code node.

Performance Tips:

- Cache HTTP responses if the log API supports it to reduce bandwidth.
- Use selective fields in the HTTP Request's query parameters to limit payload size.

Pro Tips:

- Store a rolling hash list in external Redis or a DB for large-scale deduplication.
- Add a second IF branch to auto-resolve Jira tickets when an error disappears for X hours.
- Use Slack or Microsoft Teams nodes in parallel to broaden alert coverage.
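The signature-based deduplication above can be exercised outside n8n with plain JavaScript. In this sketch, a local `staticData` object stands in for n8n's workflow static data store, and the `stacktrace`/`service` field names follow the customization example:

```javascript
// Standalone sketch of signature-based deduplication.
// `staticData` emulates n8n's persistent workflow static data.
const staticData = { signatureCache: [] };

function dedupe(items) {
  const unique = [];
  for (const item of items) {
    const signature = `${item.json.stacktrace}_${item.json.service}`;
    if (staticData.signatureCache.includes(signature)) continue; // already seen
    staticData.signatureCache.push(signature);
    unique.push(item);
  }
  return unique;
}

const batch = [
  { json: { stacktrace: "db.js:42", service: "api" } },
  { json: { stacktrace: "db.js:42", service: "api" } }, // duplicate
  { json: { stacktrace: "auth.js:7", service: "auth" } },
];
console.log(dedupe(batch).length); // 2
```

Because the cache persists between runs, re-submitting the same batch later yields no new items, which is exactly what collapses repeated alerts within the batching window.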
This is a community-contributed n8n workflow template provided "as-is." Thoroughly test in a non-production environment before deploying to production.
by vinci-king-01
## How it works

This workflow automatically analyzes website visitors in real time, enriches their data with company intelligence, and provides lead scoring and sales alerts.

### Key Steps

1. **Webhook Trigger** - Receives visitor data from your website tracking system.
2. **AI-Powered Company Intelligence** - Uses ScrapeGraphAI to extract comprehensive company information from visitor domains.
3. **Visitor Enrichment** - Combines visitor behavior data with company intelligence to create detailed visitor profiles.
4. **Lead Scoring** - Automatically scores leads based on company size, industry, engagement, and intent signals.
5. **CRM Integration** - Updates your CRM with enriched visitor data and lead scores.
6. **Sales Alerts** - Sends real-time notifications to your sales team for high-priority leads.

## Set up steps

Setup time: 10-15 minutes

1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for company intelligence gathering.
2. **Set up HubSpot connection** - Connect your HubSpot CRM to automatically update contact records.
3. **Configure Slack integration** - Set up your Slack workspace and specify the sales alert channel.
4. **Customize lead scoring criteria** - Adjust the scoring algorithm to match your target customer profile.
5. **Set up website tracking** - Configure your website to send visitor data to the webhook endpoint.
6. **Test the workflow** - Verify all integrations are working correctly with a test visitor.

## Key Features

- **Real-time visitor analysis** with company intelligence enrichment
- **Automated lead scoring** based on multiple factors (company size, industry, engagement)
- **Intent signal detection** (pricing interest, demo requests, contact intent)
- **Priority-based sales alerts** with recommended actions
- **CRM integration** for seamless lead management
- **Deal size estimation** based on company characteristics
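A lead-scoring step like the one described can be sketched as follows. This is a hypothetical illustration: the factor weights, field names, and thresholds are example values, not the ones in the template's actual scoring node, and should be tuned to your target customer profile.

```javascript
// Hypothetical lead-scoring sketch. All weights and thresholds are
// illustrative and meant to be customized.
function scoreLead(visitor) {
  let score = 0;
  // Company size factor
  if (visitor.companySize >= 500) score += 30;
  else if (visitor.companySize >= 50) score += 15;
  // Industry fit factor
  if (["saas", "fintech"].includes(visitor.industry)) score += 20;
  // Engagement factor, capped so page views can't dominate
  score += Math.min(visitor.pageViews * 2, 20);
  // Intent signals
  if (visitor.visitedPricing) score += 20;
  if (visitor.requestedDemo) score += 10;
  return Math.min(score, 100);
}

const visitor = {
  companySize: 800,
  industry: "saas",
  pageViews: 12,
  visitedPricing: true,
  requestedDemo: false,
};
console.log(scoreLead(visitor)); // 30 + 20 + 20 + 20 = 90
```

The resulting 0-100 score can then drive the priority-based alerting, e.g. only notifying the sales channel for leads above a chosen cutoff.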
by Kevin Meneses
## What this workflow does

This workflow automatically monitors eBay Deals and sends Telegram alerts when relevant, high-quality deals are detected. It combines:

- **Web scraping with Decodo**
- **JavaScript pre-processing (no raw HTML sent to the LLM)**
- **AI-based product classification and deal scoring**
- **Rule-based filtering using price and score**

Only valuable deals reach the final notification.

## How it works (overview)

1. The workflow runs manually or on a schedule.
2. The eBay Deals page is scraped using Decodo (Decodo - Web Scraper for n8n), which handles proxies and anti-bot protections.
3. JavaScript extracts only the key product data (ID, title, price, URL, image).
4. An AI Agent classifies each product and assigns a deal quality score (0-10).
5. Price and score rules are applied.
6. Matching deals are sent to Telegram.

## How to configure it

### 1. Decodo

- Add your Decodo API credentials to the Decodo node.
- Optionally change the target eBay URL.

### 2. AI Agent

- Add your LLM credentials (e.g., Google Gemini).
- No HTML is sent to the model - only compact, structured data.

### 3. Telegram

- Add your Telegram Bot Token.
- Set your `chat_id` in the Telegram node.
- Customize the alert message if needed.

### 4. Filtering rules

- Adjust the price limits and minimum deal score in the IF node.
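The price/score rule applied in the IF node can be sketched in plain JavaScript. The thresholds below (`maxPrice`, `minScore`) are example values only; adjust them to match what you configure in the node:

```javascript
// Illustrative sketch of the rule-based filtering step.
// maxPrice and minScore are example thresholds, not template defaults.
const maxPrice = 100; // only alert on deals at or below this price
const minScore = 7;   // only alert on deals the AI scored 7/10 or higher

function isWorthAlerting(deal) {
  return deal.price <= maxPrice && deal.score >= minScore;
}

// Compact, structured items like those produced by the JS pre-processing step.
const deals = [
  { id: "1", title: "Wireless earbuds", price: 29.99, score: 8 },
  { id: "2", title: "Gaming laptop", price: 899.0, score: 9 },
  { id: "3", title: "Phone case", price: 4.5, score: 3 },
];
console.log(deals.filter(isWorthAlerting).map((d) => d.id)); // [ '1' ]
```

Only items passing both conditions continue to the Telegram node, which is what keeps noisy low-quality deals out of the alerts.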
by vinci-king-01
# Product Price Monitor with Mailchimp and Baserow

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow scrapes multiple e-commerce sites for product pricing data, stores the historical prices in Baserow, analyzes weekly trends, and emails a neatly formatted seasonal report to your Mailchimp audience. It is designed for retailers who need to stay on top of seasonal pricing patterns to make informed inventory and pricing decisions.

## Pre-conditions/Requirements

### Prerequisites

- Running n8n instance (self-hosted or n8n cloud)
- ScrapeGraphAI community node installed
- Mailchimp account with at least one audience list
- Baserow workspace with edit rights
- Product URLs or SKU list from target e-commerce platforms

### Required Credentials

| Credential | Used By | Scope |
|------------|---------|-------|
| ScrapeGraphAI API Key | ScrapeGraphAI node | Web scraping |
| Mailchimp API Key & Server Prefix | Mailchimp node | Sending emails |
| Baserow API Token | Baserow node | Reading & writing records |

### Baserow Table Setup

Create a table named `price_tracker` with the following fields:

| Field Name | Type | Example |
|------------|------|---------|
| product_name | Text | "Winter Jacket" |
| product_url | URL | https://example.com/winter-jacket |
| current_price | Number | 59.99 |
| scrape_date | DateTime | 2023-11-15T08:21:00Z |

## How it works

Key Steps:

- **Schedule Trigger**: Fires every week (or a custom CRON) to start the monitoring cycle.
- **Code (Prepare URLs)**: Loads or constructs the list of product URLs to monitor.
- **SplitInBatches**: Processes product URLs in manageable batches to avoid rate-limit issues.
- **ScrapeGraphAI**: Scrapes each product page and extracts the current price and name.
- **If (Price Found?)**: Continues only if scraping returns a valid price.
- **Baserow**: Upserts the scraped data into the `price_tracker` table.
- **Code (Trend Analysis)**: Aggregates weekly data to detect price increases, decreases, or stable trends.
- **Set (Mail Content)**: Formats the trend summary into an HTML email body.
- **Mailchimp**: Sends the seasonal price-trend report to the selected audience segment.
- **Sticky Note**: Documentation node explaining the business logic in-workflow.

## Set up steps

Setup Time: 10-15 minutes

1. Clone the template: Import the workflow JSON into your n8n instance.
2. Install ScrapeGraphAI: Add `n8n-nodes-scrapegraphai` via the Community Nodes panel.
3. Add credentials:
   a. ScrapeGraphAI API Key
   b. Mailchimp API Key & Server Prefix
   c. Baserow API Token
4. Configure the Baserow node: Point it to your `price_tracker` table.
5. Edit the product list: In the "Prepare URLs" Code node, replace the sample URLs with your own.
6. Adjust the schedule: Modify the Schedule Trigger CRON expression if weekly isn't suitable.
7. Test run: Execute the workflow manually once to verify credentials and data flow.
8. Activate: Turn on the workflow for automatic weekly monitoring.

## Node Descriptions

Core Workflow Nodes:

- **Schedule Trigger** - Initiates the workflow on a weekly CRON schedule.
- **Code (Prepare URLs)** - Generates an array of product URLs/SKUs to scrape.
- **SplitInBatches** - Splits the array into chunks of 5 URLs to stay within request limits.
- **ScrapeGraphAI** - Scrapes each URL, using XPath/CSS selectors to pull the price & title.
- **If (Price Found?)** - Filters out failed or empty scrape results.
- **Baserow** - Inserts or updates the price record in the database.
- **Code (Trend Analysis)** - Calculates week-over-week price changes and flags anomalies.
- **Set (Mail Content)** - Creates an HTML table with product, current price, and trend arrow.
- **Mailchimp** - Sends or schedules the email campaign.
- **Sticky Note** - Provides inline documentation and edit hints.

Data Flow:

Schedule Trigger → Code (Prepare URLs) → SplitInBatches
SplitInBatches → ScrapeGraphAI → If (Price Found?) → Baserow
Baserow → Code (Trend Analysis) → Set (Mail Content) → Mailchimp

## Customization Examples

### Change scraping frequency

```
// Schedule Trigger CRON for daily at 07:00 UTC
0 7 * * *
```

### Add competitor comparison column

```javascript
// Code (Trend Analysis)
item.competitor_price_diff = item.current_price - item.competitor_price;
return item;
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "product_name": "Winter Jacket",
  "product_url": "https://example.com/winter-jacket",
  "current_price": 59.99,
  "scrape_date": "2023-11-15T08:21:00Z",
  "weekly_trend": "decrease"
}
```

## Troubleshooting

Common Issues:

- **Invalid ScrapeGraphAI key** - Verify the API key and ensure your subscription is active.
- **Mailchimp "Invalid Audience" error** - Double-check the audience ID and that the API key has the correct permissions.
- **Baserow "Field mismatch"** - Confirm your table fields match the names/types in the workflow.

Performance Tips:

- Limit each SplitInBatches run to ≤10 URLs to reduce scraping timeouts.
- Enable caching in ScrapeGraphAI to avoid repeated requests to the same URL within short intervals.

Pro Tips:

- Use environment variables for all API keys to avoid hard-coding secrets.
- Add an extra If node to alert you if a product's price drops below a target threshold.
- Combine with n8n's Slack node for real-time alerts in addition to Mailchimp summaries.
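The week-over-week trend classification in the Code (Trend Analysis) node can be sketched like this. The field names follow the `price_tracker` table and the output format above, but the 2% "stable" band is an assumed example threshold, not a template default:

```javascript
// Illustrative sketch of week-over-week trend classification.
// The 2% stability band is an assumed example threshold.
function weeklyTrend(lastWeekPrice, currentPrice) {
  const change = (currentPrice - lastWeekPrice) / lastWeekPrice;
  if (change > 0.02) return "increase";
  if (change < -0.02) return "decrease";
  return "stable";
}

// "Winter Jacket" dropped from 64.99 to 59.99 week-over-week.
console.log(weeklyTrend(64.99, 59.99)); // "decrease"
console.log(weeklyTrend(59.99, 60.49)); // "stable"
```

The returned label maps directly onto the `weekly_trend` field in the output JSON and onto the trend arrow rendered by the Set (Mail Content) node.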