by Warren Gates
## What it does
Provides a web form for use with my personal property inventory workflow, letting you upload one or more images plus an optional description through a simple web interface.

## How it works
- Displays a web form allowing you to upload image(s) and an optional description
- Resizes the images and converts them to WebP format
- Posts the image(s) and description to the webhook of the personal property inventory workflow

## Requirements
- A running personal property inventory workflow

## How to use
1. Update the HTTP Request node's URL to point to your personal property inventory workflow.
2. Set the HTTP Request node's authentication to match that of the personal property inventory workflow's webhook.
by Madame AI
# Generate SEO articles from search queries to WordPress with BrowserAct

This workflow automates a programmatic SEO pipeline by turning a list of search queries into fully researched, authoritative blog posts. It scrapes search results (focusing on community insights such as Reddit) for real-world data, uses AI to draft comprehensive guides, and publishes them directly to your WordPress site.

## Target Audience
SEO specialists, content marketers, niche site builders, and editorial teams looking to scale content production with high-quality, researched articles.

## How it works
1. **Define Topics:** The workflow begins by defining a list of target keywords or questions in a Set node (e.g., "Best automation tools").
2. **Research:** It iterates through each query using a Loop node. For each item, BrowserAct scrapes search engine results to gather raw insights, discussions, and market consensus.
3. **Draft Content:** An AI Agent (acting as a "Senior Technical Editor") analyzes the raw data and synthesizes it into a structured, HTML-formatted article with tables, headers, and actionable advice.
4. **Publish:** The generated content is sent to WordPress to create a new post.
5. **Notify:** Once the entire batch is processed, a Slack message notifies the team.

## How to set up
1. **Configure Credentials:** Connect your BrowserAct, OpenRouter, WordPress, and Slack accounts in n8n.
2. **Prepare BrowserAct:** Ensure the Programmatic SEO Data Pipeline template is saved in your BrowserAct account.
3. **Set Keywords:** Open the Set queries node and update the Queries array with the list of topics you want to write about.
4. **Configure WordPress:** Open the Create a post node and ensure it is connected to your WordPress site.
5. **Configure Notification:** Open the Send completion notification node and select the Slack channel where you want to receive alerts.

## Requirements
- **BrowserAct** account with the **Programmatic SEO Data Pipeline** template
- **OpenRouter** account (or credentials for a specific LLM like GPT-4o or GPT-5)
- **WordPress** account
- **Slack** account

## How to customize the workflow
- **Adjust the Persona:** Modify the system prompt in the AI Agent node to change the writing style (e.g., from "Technical Editor" to "Casual Blogger" or "Sales Copywriter").
- **Add Visuals:** Insert an image generation node (like DALL-E or Stable Diffusion) before the WordPress node to create a unique featured image based on the article title.
- **Review Loop:** Instead of publishing directly, change the final step to add the draft to Google Docs or Notion for human approval.

## Need Help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

## Workflow Guidance and Showcase Video
Automated Content Factory: From Reddit Data to SEO Blog Posts with n8n
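The Set-node-plus-Loop pattern described above can be sketched as a tiny Code-node snippet. This is a minimal sketch, assuming a `query` field name (not the template's exact schema); in n8n, a Code node that returns an array of `{ json: ... }` objects emits one item per element, which the Loop node then processes independently.

```javascript
// Sketch of the "Set queries" → Loop hand-off: turn a plain array of
// topics into one n8n item per query. The `query` field name is an
// assumption for illustration.
const queries = [
  'Best automation tools',
  'How to scale content production',
];

function toItems(list) {
  // Returning an array of { json } objects emits one item each in n8n.
  return list.map(q => ({ json: { query: q } }));
}

const items = toItems(queries);
```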
by Shohei Sawada
This template gives your HR or operations team an AI-powered Slack bot that answers employee questions about internal policies — directly in DM, available to everyone in the workspace, with no per-user setup required. Employees simply send a direct message to the bot. It searches your Google Drive HR documents using RAG (Retrieval-Augmented Generation) via Pinecone and replies in the same thread using GPT-4.1 — based strictly on your documents, not general knowledge.

## Who this is for
HR, operations, and IT teams who want to reduce repetitive policy questions (leave, expenses, remote work, etc.) without building a custom chatbot from scratch.

## What's included
This template contains two workflows on a single canvas:
1. **HR Document Indexer** — Runs daily at 3:00 AM. Automatically indexes all documents in a specified Google Drive folder into Pinecone.
2. **HR QA Bot** — Listens for Slack DMs workspace-wide. Filters out bot messages to prevent infinite loops, retrieves relevant document chunks, generates an answer, and replies in-thread.

A built-in RAG quality evaluator automatically scores every response (faithfulness and answer relevancy) and logs results to Google Sheets for continuous quality monitoring.

## Key features
- Works for the entire Slack workspace with a single setup — no per-user configuration
- Strictly document-grounded answers — the bot will not hallucinate outside your documents
- Infinite-loop prevention via bot_id filtering
- Automatic daily re-indexing with duplicate prevention (namespace cleared before upsert)
- Built-in answer quality logging to Google Sheets

## Prerequisites
- Google Drive folder with HR documents (txt, pdf, docx)
- Pinecone account (index: dimension 1536, metric cosine)
- OpenAI API key
- Slack App with DM permissions (setup guide included in the workflow)
- Google Sheets (copy link provided in the workflow)
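The bot_id loop guard described above can be sketched as a small predicate. This is a sketch assuming the standard Slack Events API message payload, not the template's exact node configuration:

```javascript
// Loop-prevention sketch: accept only human direct messages, and drop
// anything carrying bot_id so the bot never reacts to its own replies.
// Assumes the standard Slack Events API message event shape.
function isHumanDm(event) {
  return event.type === 'message'
    && event.channel_type === 'im'   // direct message
    && !event.bot_id                 // posted by a human, not a bot
    && !event.subtype;               // skip edits, deletions, joins, etc.
}
```

Filtering on `bot_id` (rather than a specific user ID) is what makes the guard workspace-wide: any app or bot message is dropped, whatever account posted it.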
by GiovanniSegar
Super simple workflow to convert image URLs into uploaded attachments in Airtable. You'll need to adjust the field names to match your specific data, including in the filter formula where it says "Cover image URL": replace that with the name of the field where you store the image URL.
by mike
This is an example of how to make Merge by Key work. The "Data 1" and "Data 2" nodes simply provide mock data; you can replace them with your own data sources. The "Convert Data" nodes are the important part: they make sure that the individual array entries become separate items in n8n. After that, the Merge node combines the two streams into the merged data.
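What the "Convert Data" nodes do can be shown with a minimal Code-node sketch (the `data` field name is an assumption for illustration):

```javascript
// "Convert Data" sketch: one incoming item whose json holds an array is
// split into many items, one per element, so Merge by Key can match them.
// The `data` field name is an assumption, not the template's exact schema.
function splitIntoItems(item, key = 'data') {
  return item.json[key].map(entry => ({ json: entry }));
}

const mock = { json: { data: [{ id: 1, name: 'a' }, { id: 2, name: 'b' }] } };
const items = splitIntoItems(mock);
```

Without this step, the whole array travels as a single item and Merge by Key has nothing to match on.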
by Jitesh Dugar
# Consolidate and compress project archives for cost-optimized cloud storage

## 🎯 Description
Optimize your cloud storage costs by using this automation to intelligently compress and migrate aging project documentation. This workflow lets you implement a professional data lifecycle policy by identifying "stale" files in active storage, applying high-ratio PDF compression, and migrating them to cold storage while maintaining a searchable audit trail.

A critical technical feature of this template is the Luxon-based lifecycle logic. By utilizing `{{ $now.minus({ months: 6 }).toISODate() }}`, the workflow dynamically filters for files that haven't been modified in over half a year. It then generates a unique archive path using `{{ $now.toFormat('yyyy/MM_MMM') }}`, ensuring your cold storage bucket remains perfectly indexed by year and month without any manual folder creation or renaming.

## ✨ How to achieve automated storage optimization
You can build an enterprise-grade archiving system with the available tools:
1. **Monitor and age-gate** — Use the Google Drive node to list project files and a Code node to compare file metadata against a 6-month "hot storage" threshold.
2. **Compress and verify** — Pass identified files through the HTML to PDF compression engine to reduce file size by up to 80% while maintaining document readability.
3. **Migrate to cold storage** — Stream the compressed binary directly to AWS S3 (or a dedicated archive folder), using dynamic naming conventions for organized retrieval.
4. **Log and notify** — Automatically alert the IT team via Slack upon batch completion, with a report on the specific files migrated and the storage path used.

## 💡 Key features
- **Intelligent cost reduction** — Automatically targets large, old files for compression, significantly reducing long-term "cold storage" billing.
- **Dynamic indexing** — Uses **Luxon** to build a chronological folder structure in the cloud, making multi-year archives easy to navigate.
- **Integrity assurance** — The workflow ensures files meet specific age and type criteria before moving them, preventing accidental archival of active documents.

## 📦 What you will need
- **Google Drive** — Your "hot" storage where active project files are kept.
- **HTML to PDF Node** — Used here as the PDF compression and optimization engine.
- **AWS S3** — Your destination "cold" storage for long-term archiving.
- **Slack** — For automated reporting on storage optimization status.

Ready to optimize your cloud storage? Import this template, connect your credentials, and start saving on long-term data costs today.
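The two Luxon expressions above can be mirrored in plain JavaScript to see exactly what the age gate and archive path compute. This is a sketch of the logic only, not the workflow's actual Code node:

```javascript
// Plain-JS mirror of the workflow's two Luxon expressions:
// $now.minus({ months: 6 }) for the age gate, and
// $now.toFormat('yyyy/MM_MMM') for the archive path.
function isStale(modifiedIso, now = new Date()) {
  const cutoff = new Date(now);
  cutoff.setMonth(cutoff.getMonth() - 6);   // six-month "hot storage" window
  return new Date(modifiedIso) < cutoff;
}

function archivePath(now = new Date()) {
  const names = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun',
                 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];
  const mm = String(now.getMonth() + 1).padStart(2, '0');
  return `${now.getFullYear()}/${mm}_${names[now.getMonth()]}`;
}
```

Because the path is derived from the current date at run time, each monthly batch lands in its own `yyyy/MM_MMM` folder with no manual folder creation.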
by Jitesh Dugar
# 📦 Automated Instagram Product Drop via uploadtourl

Streamline your e-commerce marketing with this end-to-end Instagram publishing pipeline. This workflow automates the transition from a new product entry to a live social media announcement, ensuring your brand stays active and consistent across platforms.

## 🎯 What This Workflow Does
This template handles the complex multi-step process of social media publishing through a single trigger:

### 🔔 Smart Data Intake
The Webhook Trigger accepts data from Shopify, Airtable, or manual inputs. A normalization node then sanitizes the raw data, mapping nested fields to a consistent schema and applying fallback defaults for missing captions or hashtags.

### ☁️ Image Processing & Hosting
The system fetches your product image from any public source and passes the binary to the uploadtourl node. This converts your raw image into a public CDN URL, which is required for the Instagram Graph API to access the file.

### 📸 Instagram Publishing Flow
A dedicated Code node assembles a brand-ready caption with emojis, prices, and CTAs, automatically truncating text to 2,200 characters to meet API limits. The workflow then executes Instagram's two-step flow: creating a media container and publishing it after a safe 5-second buffer.

### 📊 Audit & Notification
Once the post is live, the workflow records the live post ID, product name, and timestamp in Airtable for future analytics. Finally, it sends a formatted alert to your team via Slack with a direct link to the new post.

## ✨ Key Features
- **Multi-Source Support:** Works natively with Shopify products/create webhooks and Airtable automations.
- **Automated CDN Hosting:** Bypasses manual file uploads by using uploadtourl to generate API-compatible links.
- **Safe Buffer Logic:** Built-in wait timers ensure Instagram finishes asynchronous image processing before the final publish call.
- **Brand Guardrails:** A centralized caption builder ensures every post follows your brand's emoji and hashtag style.

## 💼 Perfect For
- **SMM Managers:** Automating repetitive product announcement tasks.
- **E-commerce Owners:** Linking Shopify new arrivals directly to Instagram.
- **Creative Agencies:** Managing high-volume product drops for multiple clients.
- **Content Operations:** Creating an automated audit log of all social media activity.

## 🔧 What You'll Need
### Required Integrations
- **Instagram Graph API:** A Business or Creator account access token.
- **uploadtourl Account:** Credentials configured in n8n for media hosting.
- **Airtable & Slack:** (Optional) For logging and team notifications.

### Configuration Steps
1. **API Keys:** Connect your Instagram and uploadtourl credentials.
2. **Environment Variables:** Set your IG_ACCOUNT_ID, AIRTABLE_BASE_ID, and SLACK_CHANNEL_ID in n8n.
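The caption builder's truncation step might look like the sketch below. Field names and emoji style are assumptions; only the 2,200-character cap comes from the workflow description:

```javascript
// Caption-builder sketch: assemble a caption and cut it to Instagram's
// 2,200-character limit. Field names and emoji style are assumptions;
// only the character cap comes from the workflow description.
const IG_CAPTION_LIMIT = 2200;

function buildCaption(product, limit = IG_CAPTION_LIMIT) {
  const raw = [
    `🆕 ${product.name} is here!`,
    product.price ? `💰 ${product.price}` : '',
    product.description || '',
    product.hashtags || '',
  ].filter(Boolean).join('\n\n');
  return raw.length <= limit ? raw : raw.slice(0, limit - 1) + '…';
}
```

Truncating before the publish call avoids a hard API rejection when a long product description would push the caption over the limit.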
by Dhruv Dalsaniya
This workflow is designed for e-commerce and marketing teams, or creators, who want to automate the production of high-quality, AI-generated product visuals and ad creatives.

Here is what the workflow does:
1. It accepts a product description and other creative inputs through a web form.
2. It uses AI to transform your text input into a detailed, creative prompt.
3. This prompt is then used to generate a product image.
4. The workflow analyzes the generated image and creates a new prompt to generate a second image that includes a model, adding a human element to the visual.
5. A final prompt is created from the model image to generate a short, cinematic video.
6. All generated assets (images and video) are automatically uploaded to your specified hosting platform, providing you with direct URLs for immediate use.

This template is an efficient solution for scaling your content creation efforts, reducing time spent on manual design, and producing a consistent stream of visually engaging content for your online store, social media, and advertising campaigns.

## Prerequisites
- **OpenRouter account:** Required for the AI agents that generate image and video prompts.
- **GOAPI account:** Used for the final video generation process.
- **Media hosting platform:** A self-hosted service like MediaUpload, or any alternative like Google Drive or a similar service that can provide a direct URL for uploaded images and videos. This is essential for passing the visuals between different steps of the workflow.
by Jitesh Dugar
# Branded Social Proof Automation via Bannerbear and uploadtourl

Convert your customer satisfaction into high-converting social media content with this fully automated social proof pipeline. This workflow scans your database for top-tier reviews, generates a branded quote card, and publishes it directly to Instagram, ensuring a consistent stream of credibility for your brand.

## 🎯 What This Workflow Does
This template manages the entire lifecycle of a testimonial post, from data retrieval to final notification:

### 🔄 Review Dispatch Automation
- **Schedule Trigger:** Automatically fires daily at 10:00 AM; the cadence can be adjusted via cron expression.
- **Airtable — Fetch Review:** Retrieves the oldest 5-star, unposted record using a specific filter formula to prevent duplicates.
- **IF — Has Valid Review?:** Validates the data; the workflow exits gracefully if no new reviews are found and only proceeds when a 5-star review is ready.

### ✍️🎨 Dynamic Asset Generation
- **Code — Prepare Payload:** Formats review data into a JSON body, mapping fields like name and truncated text to Bannerbear layers while generating the final Instagram caption.
- **HTTP — Create Image Job:** Submits the request to the Bannerbear API and retrieves a unique job uid for asynchronous processing.

### 🔁 Status Verification & Media Hosting
- **HTTP — Poll Status:** Regularly checks the job status via the Bannerbear API to see if the rendering is finished.
- **IF — Image Ready?:** Confirms completion; if still processing, it triggers a "Wait 3s + re-poll" loop for up to 5 retries before passing the image_url forward.
- **uploadtourl Bridge:** Mandatory CDN step that uploads the rendered image binary and returns a stable public URL, which is required for Instagram's API to access the file.

### 📸 Instagram Publishing & Tracking
- **IG — Create & Publish:** Executes the two-step Instagram Graph API flow to create a media container and publish it to your feed after a safe 6-second buffer.
- **Airtable — Mark as Posted:** Updates the original record with the post ID and timestamp to prevent duplicate posting.
- **Slack Notification:** Sends a final team alert with a preview of the card and the live link.

## ✨ Key Features
- **Adaptive Polling:** Instead of a static wait time, the workflow intelligently polls Bannerbear until the image is confirmed ready.
- **Automated CDN Bridge:** Uses uploadtourl to bypass Instagram's rejection of base64/binary payloads by providing a direct public URL.
- **Intelligent Truncation:** Automatically shortens long reviews to 180 characters to ensure perfect readability on your branded quote card.
- **Full Audit Trail:** Every post is logged back to Airtable with its live Instagram ID and CDN URL for easy reporting.

## 💼 Perfect For
- **SaaS Companies:** Showcasing user feedback and "love letters" from customers.
- **E-commerce Brands:** Sharing 5-star product reviews to build buyer confidence.
- **Service Providers:** Highlighting client testimonials on a regular schedule.
- **Digital Marketers:** Automating the "social proof" pillar of a social media strategy.

## 🔧 What You'll Need
### Required Integrations
- **Bannerbear:** API key and a Template ID with layers named reviewer_name, review_text, and star_label.
- **Instagram Graph API:** A Business or Creator account access token.
- **uploadtourl:** Credentials configured in n8n for mandatory media hosting.
- **Airtable:** A base with a Reviews table containing fields for the name, text, and rating.
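The "Wait 3s + re-poll" loop can be sketched as follows. `fetchStatus` is a hypothetical stand-in for the workflow's poll-status HTTP request; the retry count and delay come from the description:

```javascript
// Polling sketch: re-check the Bannerbear job up to 5 times, waiting 3s
// between attempts, then pass image_url forward. fetchStatus is a
// hypothetical stand-in for the poll-status HTTP request node.
async function waitForImage(fetchStatus, { retries = 5, delayMs = 3000 } = {}) {
  for (let attempt = 1; attempt <= retries; attempt++) {
    const job = await fetchStatus();               // e.g. { status, image_url }
    if (job.status === 'completed') return job.image_url;
    await new Promise(resolve => setTimeout(resolve, delayMs));
  }
  throw new Error('Bannerbear render did not finish within the retry budget');
}
```

Capping the retries matters: without a bound, a stuck render would hold the workflow execution open indefinitely.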
by Tomoki
# Video Processing Pipeline with Thumbnail Generation and CDN Distribution

## Summary
Automated video processing system that monitors S3 for new uploads, generates thumbnails and preview clips, extracts metadata, transcodes to multiple formats, and distributes to CDN with webhook notifications.

## Detailed Description
A comprehensive video processing workflow that receives S3 events or manual triggers, validates video files, extracts metadata via FFprobe, generates thumbnails at key frames, creates animated GIF previews, transcodes to multiple resolutions, invalidates CDN cache, and sends completion notifications.

## Key Features
- **S3 Event Monitoring:** Automatic detection of new video uploads
- **Thumbnail Generation:** Multiple sizes at key frame intervals
- **Video Metadata:** FFprobe extraction of duration, resolution, and codec info
- **Preview GIF:** Animated preview clips for video galleries
- **Multi-Format Transcoding:** Convert to 1080p, 720p, 480p
- **CDN Distribution:** Cloudflare cache invalidation and signed URLs
- **Webhook Callbacks:** Notify the origin system on completion

## Use Cases
- Video hosting platforms
- Media asset management systems
- Content delivery networks
- Video streaming services
- Social media platforms
- E-learning video processing
- User-generated content platforms

## Required Credentials
- AWS S3 credentials (for video storage)
- FFmpeg API credentials (via HTTP)
- Cloudflare API token (for CDN)
- Slack bot token (for notifications)
- Google Sheets OAuth (for logging)

Node Count: 24 (19 functional + 5 sticky notes)

## Unique Aspects
- Uses a Webhook for S3 event notifications
- Uses Code nodes for S3 info extraction and URL generation
- Uses an If node for video format validation
- Uses HTTP Request nodes for the FFprobe, FFmpeg, and CDN APIs
- Uses an Aggregate node for collecting parallel processing results
- Uses Merge nodes to consolidate multiple workflow paths
- Implements parallel processing for thumbnails, GIF, and transcoding

## Workflow Architecture

```
[S3 Event Webhook]    [Manual Webhook]
          \                /
           [Merge Triggers]
                  |
         [Extract S3 Info] (Code)
                  |
         [Check Is Video] (If)
           /             \
         Yes              No
          |                |
[Get Video Metadata]  [Invalid Response]
     (FFprobe)
          |
[Parse Video Metadata] (Code)
     /        |        \
[Thumbnails] [GIF] [Transcode]
     \        |        /
     [Aggregate Results]
             |
   [Invalidate CDN Cache]
             |
   [Generate Signed URLs]
         /        \
  [Log Sheet]   [Slack]
         \        /
   [Merge Output Paths]
             |
     [Merge All Paths]
             |
   [Respond to Webhook]
```

## Configuration Guide
1. **S3 Event:** Configure the S3 bucket notification to send events to the webhook.
2. **FFmpeg API:** Use a hosted FFmpeg service (e.g., api.ffmpeg-service.com).
3. **Cloudflare:** Set the zone ID and API token for cache invalidation.
4. **Slack Channel:** Set #video-processing for notifications.
5. **Google Sheets:** Connect for processing metrics logging.

## Supported Video Formats

| Extension | MIME Type |
|-----------|-----------|
| .mp4 | video/mp4 |
| .mov | video/quicktime |
| .avi | video/x-msvideo |
| .mkv | video/x-matroska |
| .webm | video/webm |
| .m4v | video/x-m4v |

## Thumbnail Generation

| Size | Dimensions | Suffix |
|------|------------|--------|
| Large | 1280x720 | _large |
| Medium | 640x360 | _medium |
| Small | 320x180 | _small |

Thumbnails are generated at 10%, 30%, 50%, 70%, and 90% of the video duration.

## Transcoding Presets

| Preset | Resolution | Bitrate | Codec |
|--------|------------|---------|-------|
| 1080p | 1920x1080 | 5000k | H.264 |
| 720p | 1280x720 | 2500k | H.264 |
| 480p | 854x480 | 1000k | H.264 |

## Output Structure

```json
{
  "job_id": "job_1705312000_abc123",
  "status": "completed",
  "original": {
    "filename": "video.mp4",
    "resolution": "1920x1080",
    "duration": "00:05:30"
  },
  "thumbnails": {
    "large": "https://cdn/thumbnails/job_id/thumb_0_large.jpg",
    "medium": "https://cdn/thumbnails/job_id/thumb_0_medium.jpg",
    "small": "https://cdn/thumbnails/job_id/thumb_0_small.jpg"
  },
  "preview_gif": "https://cdn/previews/job_id/preview.gif",
  "transcoded": {
    "1080p": "https://cdn/transcoded/job_id/video_1080p.mp4",
    "720p": "https://cdn/transcoded/job_id/video_720p.mp4",
    "480p": "https://cdn/transcoded/job_id/video_480p.mp4"
  }
}
```
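The thumbnail schedule above (frames at 10% through 90% of the duration) reduces to a one-line computation. This is a sketch of the idea, not the workflow's exact Code node:

```javascript
// Compute thumbnail capture points from the FFprobe duration: one frame
// at 10%, 30%, 50%, 70%, and 90% of the video. A sketch of the idea,
// not the workflow's exact Code node.
function thumbnailTimestamps(durationSeconds, fractions = [0.1, 0.3, 0.5, 0.7, 0.9]) {
  return fractions.map(f => Math.round(durationSeconds * f));
}

// A 5:30 video (330 s) yields capture points spread across its length.
const points = thumbnailTimestamps(330);
```

Sampling at percentage offsets rather than fixed seconds keeps the thumbnails representative for both short clips and long recordings.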
by simonscrapes
## Overview
This n8n automation is a complete LinkedIn Content Engine that turns simple topic ideas into fully written, visual, and scheduled posts. It features a "Human-in-the-Loop" design, meaning AI handles the heavy lifting of writing and image creation, but nothing goes live until you manually approve it in Google Sheets.

## How It Works
The system runs two separate workflows in parallel:

### 1. The "Creator" Workflow
- **Input:** Detects when you add a new topic to your "Content Calendar" Google Sheet.
- **Brand Alignment:** Pulls your specific "Brand Voice" guidelines from a separate tab to ensure the AI sounds like you.
- **Creation:** Uses Gemini Flash 1.5 to write the post and DALL-E 3 to generate a matching professional image.
- **Drafting:** Uploads the image to ImgBB and saves the full draft back to your sheet with a status of "Draft".

### 2. The "Publisher" Workflow
- **Daily Scan:** Wakes up every morning to check your Content Calendar.
- **Verification:** Looks for posts that match two criteria: the Date Scheduled matches today's date, and the Status is marked as "Approved" (by you).
- **Publishing:** If both match, it automatically uploads the text and image to LinkedIn and updates the sheet status to "Posted".

**Tools Used:** n8n, Google Sheets, OpenRouter (Gemini / OpenAI), ImgBB.

## Connect & Learn More
- **YouTube Channel:** Simon Scrapes – More tutorials on AI & Automation.
- **Community:** Skool Community – Master AI & Automation with us.
- **Full Video Tutorial:** Watch the step-by-step build here.
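The Publisher's two-criteria check can be sketched as a simple row filter. Column names follow the description above as an assumption; your sheet may differ:

```javascript
// Publisher gate sketch: a sheet row is posted only when Date Scheduled
// equals today and Status is "Approved". Column names follow the
// description and are assumptions; adjust them to your own sheet.
function isPublishable(row, today = new Date().toISOString().slice(0, 10)) {
  return row['Date Scheduled'] === today && row['Status'] === 'Approved';
}
```

Keeping the date and the status as separate conditions is what makes the human-in-the-loop design work: a scheduled post stays untouched until you flip its status to "Approved".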