by Camille Roux
Create a reusable "photos to post" queue from your Lightroom Cloud album—ideal for Lightroom-to-Instagram automation with n8n. It discovers new photos, stores clean metadata in a Data Table, and generates AI alt text to power on-brand captions and accessibility. Use it together with "Lightroom Image Webhook (Direct JPEG for Instagram)" and "Instagram Auto-Publisher for Lightroom Photos (AI Captions)."

## What it's for
- Automate Lightroom to Instagram
- Centralize photo data for scheduled IG posting
- Prep AI-ready alt text and metadata for consistent, hands-free publishing

## Parameters to set
- Lightroom Cloud credentials (client/app + API key)
- Album/collection ID to monitor in Lightroom Cloud
- Data Table name for the posting queue (e.g., Photos)
- AI settings: language/tone for alt text (concise, brand-aware)
- Image analysis URL: public endpoint of Workflow 2 (Lightroom Image Webhook)

## Works best with
- Workflow 2: Lightroom Image Webhook (Direct JPEG for Instagram)
- Workflow 3: Instagram Auto-Publisher for Lightroom Photos (AI Captions)

## Learn more & stay in the loop
Want the full story (decisions, trade-offs, and tips) behind this Lightroom Cloud → Instagram automation?
👉 Read the write-up on my blog: camilleroux.com

If you enjoy street & urban photography or you're curious how I use these n8n workflows day-to-day:
👉 Follow my photo account on Instagram: @camillerouxphoto
👉 Follow me on other networks: links available on my site (X, Bluesky, Mastodon, Threads)
by SpaGreen Creative
# Shopify Auto Send WhatsApp Thank-You Messages & Loyalty Coupon Using Rapiwa API

## Who is this for?
This workflow is for Shopify store owners, marketers, and support teams who want to automatically message their high-value customers on WhatsApp when new discount codes are created.

## What this workflow does
- Fetches customer data from Shopify
- Filters customers where total_spent > 5000
- Cleans phone numbers (removes non-digit characters) and normalizes them to an international format
- Verifies numbers via the Rapiwa API (verify-whatsapp endpoint)
- Sends coupon or thank-you messages to verified numbers via the Rapiwa send-message endpoint
- Logs each send attempt to Google Sheets with status and validity
- Uses batching (SplitInBatches) and Wait nodes to avoid rate limits

## Key features
- Automated trigger: Shopify webhook (discounts/create) or manual trigger
- Targeted sending to high-value customers
- Pre-send verification to reduce failed sends
- Google Sheets logging and status updates
- Rate-limit protection using the Wait node

## How to use
### Step-by-step setup
1) Prepare a Google Sheet
   - Columns: name, number, status, validity, check (optional)
   - Example row: Abdul Mannan | 8801322827799 | not sent | unverified | check
2) Configure n8n credentials
   - Shopify: store access token (X-Shopify-Access-Token)
   - Rapiwa: Bearer token (HTTP Bearer credential)
   - Google Sheets: OAuth2 credentials and sheet access
3) Configure the nodes
   - Webhook/Trigger: Shopify discounts/create or Manual Trigger
   - HTTP Request (Shopify): /admin/api/<version>/customers.json
   - Code node: filter customers with total_spent > 5000 and map fields
   - SplitInBatches: batching/looping
   - Code (clean number): waNoStr.replace(/\D/g, "")
   - HTTP Request (Rapiwa verify): POST https://app.rapiwa.com/api/verify-whatsapp with body { number }
   - IF node: check data.exists to decide the branch
   - HTTP Request (Rapiwa send-message): POST https://app.rapiwa.com/api/send-message with body { number, message_type, message }
   - Google Sheets Append/Update: write status and validity
   - Wait: add a 2–5 second delay between sends
4) Test with a small batch
   - Run manually with 2–5 records first and verify results

### Google Sheet column structure
A Google Sheet formatted like this sample:

| Name | Number | Status | Validity |
| -------------- | ------------- | -------- | ---------- |
| Abdul Mannan | 8801322827798 | not sent | unverified |
| Abdul Mannan | 8801322827799 | sent | verified |

### Requirements
- Shopify Admin API access (store access token)
- Rapiwa account and Bearer token
- Google account and Google Sheet (OAuth2 setup)
- n8n instance (nodes used: HTTP Request, Code, SplitInBatches, IF, Google Sheets, Wait)

### Customization ideas
- Adjust the filter (e.g., order count, customer tags)
- Use message templates to insert the name and coupon code per customer
- Add an SMS or email fallback for unverified numbers
- Send a run summary to an admin (Slack / email)
- Store logs in a database for deeper analysis

### Important notes
- data.exists may be a boolean or a string — normalize it in a Code node before using it in an IF node
- Ensure Google Sheets column names match exactly
- Store Rapiwa and Shopify tokens securely in n8n credentials
- Start with small batches for testing and scale gradually

### Useful links
- **Dashboard:** https://app.rapiwa.com
- **Official website:** https://rapiwa.com
- **Documentation:** https://docs.rapiwa.com

### Support & help
- **WhatsApp**: Chat on WhatsApp
- **Discord**: SpaGreen Community
- **Facebook Group**: SpaGreen Support
- **Website**: https://spagreen.net
- **Developer Portfolio**: Codecanyon SpaGreen
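The number-cleaning and exists-normalization Code nodes described above can be sketched as two plain functions. The Bangladesh country code "880" is an assumption taken from the sample rows, and the string-vs-boolean handling for data.exists reflects the note above; adapt both to your market and to Rapiwa's actual response shape.

```javascript
// Strip every non-digit character, then prefix the country code if it is
// missing, so the number is in the international format Rapiwa expects.
function cleanNumber(waNoStr, countryCode = "880") {
  const digits = String(waNoStr).replace(/\D/g, "");
  return digits.startsWith(countryCode) ? digits : countryCode + digits;
}

// Rapiwa's verify endpoint may return data.exists as a boolean or a string,
// so normalize it before the IF node branches on it.
function numberExists(data) {
  return data.exists === true || data.exists === "true";
}

console.log(cleanNumber("+880 13-2282-7799")); // "8801322827799"
console.log(numberExists({ exists: "true" })); // true
```

In an n8n Code node you would apply these to each incoming item and return the cleaned items for the next node.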
by Nima Salimi
## Overview
This n8n workflow automatically retrieves Brevo contact reports and inserts summarized engagement data into NocoDB. It groups campaign activity by email, creating a clean, unified record that includes sent, delivered, opened, clicked, and blacklisted events. This setup keeps your CRM or marketing database synchronized with the latest Brevo email performance data.

## ✅ Tasks
- ⏰ Runs automatically on a schedule or manually
- 🌐 Fetches contact activity data from the Brevo API
- 🧩 Groups all campaign activity per email
- 💾 Inserts summarized data into NocoDB
- ⚙️ Keeps engagement metrics synced between Brevo and NocoDB

## 🛠 How to Use
1. 🧱 **Prepare your NocoDB table.** Create a table with fields for email, messagesSent, delivered, opened, clicked, done, and blacklisted.
2. 🔑 **Connect your Brevo credentials.** Add your Brevo API key in the HTTP Request node to fetch contact data securely.
3. 🧮 **Review the Code nodes.** These nodes group contact activity by email and prepare a single dataset for insertion.
4. 🚀 **Run or schedule the workflow.** Execute it manually or use a Schedule Trigger to automate the data sync.

## 📌 Notes
- 🗂 Make sure the field names in NocoDB match those used in the workflow.
- 🔐 Keep your Brevo API key secure and private.
- ⚙️ The workflow can be expanded to include additional fields or filters.
- 📊 Use the data for engagement analytics, segmentation, or campaign performance tracking.
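The grouping Code nodes boil down to collapsing per-event records into one summary row per email. This is an illustrative sketch, not the exact code in the workflow: the event names here simply mirror the NocoDB field names listed above, while Brevo's real API payload uses different field names that you would map first.

```javascript
// Collapse a flat list of { email, event } records into one counter row per
// email address, matching the NocoDB columns of this template.
function groupByEmail(events) {
  const summary = {};
  for (const ev of events) {
    if (!summary[ev.email]) {
      summary[ev.email] = {
        email: ev.email,
        messagesSent: 0, delivered: 0, opened: 0, clicked: 0, blacklisted: 0,
      };
    }
    const row = summary[ev.email];
    // Only increment known counter fields; ignore anything unexpected.
    if (typeof row[ev.event] === "number") row[ev.event] += 1;
  }
  return Object.values(summary);
}
```

Each resulting object maps directly onto one NocoDB insert.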
by Marth
# Workflow Description: Automated YouTube Short Viral History (Blotato + GPT-4.1)

This workflow is a powerful, self-sustaining, end-to-end content automation pipeline designed to feed your YouTube Shorts channel with consistent, high-quality, and highly engaging videos focused on "What if history..." scenarios. It completely eliminates manual intervention across the creative, production, and publishing stages, linking the creative power of a GPT-4o AI Agent with the video rendering capabilities of the Blotato API, all orchestrated by n8n.

## How It Works
The automation runs through a five-step, scheduled process:
1. **Trigger and idea generation.** The Schedule Trigger starts the workflow (default is 10:00 AM daily). The AI Agent (GPT-4o) acts as a copywriter/researcher, automatically brainstorming a random "What if history..." topic, researching relevant facts, and formulating a viral, hook-driven 60-second video script, along with a title and caption.
2. **Visual production request.** The formatted script is sent to the Blotato API via the Create Video node. Blotato begins rendering the text-to-video short based on the pre-set style parameters (cinematic style, specific voice ID, and AI models).
3. **Status check and wait.** The Wait node pauses the workflow, and the Get Video node repeatedly checks the Blotato system until the rendering status is confirmed as done.
4. **Media upload.** The completed video file is uploaded to the Blotato media library using an HTTP Request node, preparing it for publishing.
5. **Automated publishing.** The final YT Post node (another HTTP Request to the Blotato API) automatically publishes the video to your linked YouTube channel, using the video URL and the AI-generated title and short caption.

## Set Up Steps
To activate and personalize this content pipeline in n8n, follow these steps:
1. **OpenAI credential.** Ensure your OpenAI API key credential is created and connected to the Brainstorm Idea node (Language Model). The workflow uses GPT-4o by default.
2. **Blotato API key.** Obtain your Blotato API key, open the Prepare Video node, and insert it into the blotato_api_key field.
3. **YouTube account ID.** Find the account ID (or channel ID) of the YouTube channel you want to post to, open the Prepare for Publish node, and insert it into the youtube_id field.
4. **Customize video style (optional).** Adjust the visual aesthetic by modifying parameters in the Prepare Video node, such as voiceId (the video narrator), style (the visual theme, e.g., cinematic or documentary), and text_to_image_model / image_to_video_model (the underlying AI generation models).
5. **Activate workflow.** Save the workflow and toggle the main switch to Active. The first video will be created and published on the next scheduled run.
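The Wait/Get Video loop in step 3 is ordinary status polling. Here is a minimal sketch, written under the assumption that Blotato reports a status field that eventually becomes "done"; the real endpoint path and response shape are not shown in this template, so check Blotato's API documentation for the exact contract.

```javascript
// Poll a status function until it reports "done", with a bounded number of
// retries so a stuck render cannot hang the workflow forever.
async function waitUntilRendered(fetchStatus, { intervalMs = 5000, maxTries = 60 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const res = await fetchStatus();           // e.g. the Get Video HTTP request
    if (res.status === "done") return res.url; // rendered video URL
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Video did not finish rendering within the allowed time");
}
```

In n8n the same loop is built from the Wait node plus an IF node that routes back until the status check passes.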
by Anne Uy Gothong
This free n8n automation helps anyone—from busy parents to entrepreneurs—get a daily SMS summary of their calendar events. It's a personal assistant that gives you a heads-up on what your day will look like. Great for starting your day aware of its events, and especially helpful for neurodivergent users.

Example use case: a parent receives a text at 5 AM summarizing the Family Calendar events. It's a reminder of a child's doctor appointment at 1 PM and soccer practice at 4:30 PM, and the AI ends the message on an uplifting note.

## Good to Know
- Requires buying a Twilio phone number to send SMS from. Each message, at the time of writing, costs $0.083 CAD.
- Requires basic knowledge of Google Cloud Console to activate the Google Calendar API.

## How it Works
Every morning at 7 AM, the workflow checks Google Calendar for the day's events. It formats your schedule into a friendly, easy-to-read summary using your favorite AI model (any LLM works—Anthropic, OpenAI, Gemini, etc.). That summary is texted directly to your phone via Twilio as a personal daily reminder.

## How to Use
1. Copy this n8n workflow into your own instance.
2. Hook up your Google Calendar and Twilio accounts. You can choose the specific calendar in the Calendar node.
3. Choose or swap in any AI model you prefer to personalize your summaries. I find Claude sounds the most natural.
4. Enjoy your daily, cheerful calendar digest by SMS at 7 AM!

## Requirements
- A Google Cloud account (to activate the Calendar API and get credentials—see Google's documentation).
- A Twilio account (for sending SMS—get started with Twilio's easy setup).
- Any LLM API account (optional, but recommended for polite, friendly summaries).

## Customize this Flow
- Change the SMS time to fit your morning routine.
- Adjust message formatting for your style or brand.
- Swap LLM services, tweak prompts, or combine multiple calendars—whatever works for you.

Reach out anytime at ralleyreminders.com if you have questions or want to share ideas!
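The formatting step can be sketched as a plain function. The field names (start.dateTime, summary) follow the Google Calendar API events resource; the wording itself is normally produced by the LLM, so treat this as a fallback-style illustration.

```javascript
// Turn Google Calendar events into an SMS-friendly digest line by line.
function formatDigest(events) {
  if (events.length === 0) return "No events today. Enjoy the free time!";
  const lines = events.map((e) => {
    const start = e.start.dateTime
      ? new Date(e.start.dateTime).toLocaleTimeString([], { hour: "numeric", minute: "2-digit" })
      : "All day"; // all-day events carry start.date instead of start.dateTime
    return `- ${start}: ${e.summary}`;
  });
  return "Good morning! Today:\n" + lines.join("\n");
}
```

The resulting string is what gets handed to the Twilio node as the SMS body.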
by Calistus Christian
## What this workflow does
Provides the tools layer for the Parent agent to manage Google Calendar: Get (list events), Create, and Delete. Accepts text + sessionid from the Parent and uses an LLM with short-term memory to choose and run the correct tool.

Pipeline: Execute Workflow Trigger → Sub-Agent → (Get / Create / Delete) → Google Calendar

- Category: Productivity / Calendar / Agentic
- Time to set up: ~10 minutes
- Difficulty: Intermediate
- Cost: mostly free (n8n CE; OpenAI + Google Calendar usage as configured)

## What you'll need
- OpenAI credentials.
- Google Calendar OAuth2 credentials.
- A calendar ID (use a placeholder like your.calendar@example.com in the node and select your actual calendar at runtime).

## Set up steps
1. Import this Sub-Agent workflow.
2. Open the Google Calendar tool nodes (Get, Create, Delete) and select your OAuth2 credential and calendar.
3. Ensure the Execute Workflow Trigger exposes two inputs: text and sessionid.
4. Connect the Parent's toolWorkflow node to this workflow.

## Testing (direct call example)
From the Parent, send:
- "Schedule 'Team Sync' tomorrow 10:00–11:00" → the Sub-Agent should call Create.
- "List events next week" → Get with timeMin/timeMax.
- "Delete event 'Team Sync'" → Delete with eventId once matched.
by Yar Malik (Asfandyar)
## Who's it for
This workflow is designed for researchers, content creators, and AI agents who need to quickly scrape structured web data and capture full-page screenshots for further use. It's especially useful for automating competitive research, news monitoring, and content curation.

## How it works
The workflow uses the Firecrawl API integrated with n8n to perform web searches and return results in structured formats (Markdown and screenshots). It includes:
- A search agent that transforms natural language queries into Firecrawl-compatible search strings.
- HTTP requests that retrieve results from specific sites (e.g., YouTube, news outlets) or across the web.
- Automatic capture of full-page screenshots alongside structured text.
- Integration with the OpenAI Chat Model for enhanced query handling.

## How to set up
1. Import this workflow into your n8n instance.
2. Add and configure your Firecrawl API credentials.
3. Add your OpenAI credentials for natural language query parsing.
4. Trigger the workflow via the included chat input, or modify it to run on a schedule.

## Requirements
- A Firecrawl account with an active API key.
- An n8n self-hosted or cloud instance.
- An OpenAI account if you want to enhance search queries.

## How to customize the workflow
- Update the search queries to focus on your preferred sites or keywords.
- Adjust the number of results with the limit parameter.
- Extend the workflow to store screenshots in Google Drive, Notion, or your database.
- Replace the chat trigger with any other event trigger (webhook, schedule, etc.).
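For orientation, here is a sketch of the request the HTTP Request node sends. The endpoint path, body fields, and the "screenshot@fullPage" format follow Firecrawl's v1 search API as I understand it; verify them against the current Firecrawl documentation before relying on this shape.

```javascript
// Build the options for a Firecrawl search request that returns Markdown
// plus a full-page screenshot for each result.
function buildSearchRequest(apiKey, query, limit = 5) {
  return {
    url: "https://api.firecrawl.dev/v1/search",
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      query,  // e.g. "site:youtube.com n8n tutorial" from the search agent
      limit,  // maps to the limit parameter mentioned below
      scrapeOptions: { formats: ["markdown", "screenshot@fullPage"] },
    }),
  };
}
```

The search agent's job is to produce the query string; everything else stays fixed in the node configuration.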
by Barbora Svobodova
# Create LinkedIn Post from Telegram Voice or Text Message with AI Image

## Who's it for
This workflow is perfect for busy professionals, content creators, and marketers who want to publish polished LinkedIn posts without spending time on formatting or design. Send a quick text or voice message via Telegram, and get a fully formatted LinkedIn post with a relevant AI-generated image, published immediately to LinkedIn.

Example use cases:
- Entrepreneurs sharing business insights on the go without opening LinkedIn
- Marketers creating consistent content during commutes or between meetings
- Thought leaders turning quick voice notes into professional posts with visuals

## How it works / What it does
1. Receive text or voice messages through a Telegram bot.
2. Transcribe voice messages using OpenAI's audio transcription.
3. Transform raw input into a professional LinkedIn post using AI formatting (proper structure, tone, and character limits).
4. Generate a relevant image prompt based on the post content.
5. Create an AI image that matches the post topic.
6. Automatically publish the complete post (text + image) to LinkedIn.

## How to set up
1. Create a Telegram bot via @BotFather and obtain your API token.
2. For self-hosted n8n users: create a LinkedIn app at developer.linkedin.com to get OAuth credentials (Client ID and Client Secret).
3. Add the OpenAI API key, LinkedIn OAuth credentials, and Telegram API token to n8n.
4. Assign your credentials to the Telegram, OpenAI, and LinkedIn nodes.
5. Deploy and activate the workflow.
6. Send a text or voice message to your Telegram bot and watch it create and post to LinkedIn!

## Requirements
- Telegram bot token
- OpenAI API key
- LinkedIn OAuth credentials
- n8n instance (cloud or self-hosted)

## How to customize the workflow
- Modify the LinkedIn Post Text prompt to match your personal writing style or brand voice.
- Adjust image generation settings (model, size, style) in the Create Image node.
- Add approval steps by routing posts to Google Sheets, Airtable, or Notion before publishing.
- Create a second workflow to schedule approved posts for specific times.

## Limitations and Usage Tips
- **Input clarity**: Voice messages should be clear and well-articulated for accurate transcription.
- **LinkedIn character limits**: The AI formatter optimizes posts for 1,242–2,500 characters.
- **API costs**: Each post generation uses OpenAI API calls for transcription (if voice), text formatting, image prompt creation, and image generation. Monitor your usage to manage costs.
- **LinkedIn rate limits**: The LinkedIn API limits posting frequency. Avoid bulk posting in short time periods to prevent rate limiting.
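If you want a hard guard rather than trusting the AI formatter, a small Code node after the formatting step can enforce the character window mentioned above. The 1,242–2,500 range comes from this template; the bounds are just default parameters you can change.

```javascript
// Verify the generated post falls inside the target character window
// before handing it to the LinkedIn node.
function checkPostLength(text, min = 1242, max = 2500) {
  const len = [...text].length; // count code points so emoji are not double-counted
  if (len > max) return { ok: false, reason: `too long (${len} > ${max})` };
  if (len < min) return { ok: false, reason: `too short (${len} < ${min})` };
  return { ok: true, length: len };
}
```

On failure you could route the item back to the formatter with the reason appended to the prompt, instead of publishing.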
by Msaid Mohamed el hadi
# 🔍 AI-Powered Website Prompt Executor (Apify + OpenRouter)

This workflow combines the power of Apify and OpenRouter to scrape website content and execute any custom prompt using AI. You define what you want — whether it's extracting contact details, summarizing content, collecting job offers, or anything else — and the system intelligently processes the site to give you results.

## 🚀 Overview
This workflow allows you to:
- Input a URL and define a prompt.
- Scrape the specified number of pages from the website.
- Process each page's metadata and Markdown content.
- Use AI to interpret and respond to the prompt on each page.
- Aggregate and return structured output.

## 🧠 How It Works
Input example:

```json
{
  "enqueue": true,
  "maxPages": 5,
  "url": "https://apify.com",
  "method": "GET",
  "prompt": "collect all contact information available on this website"
}
```

Workflow steps:

| Step | Action |
| ---- | ------ |
| 1 | Triggered by another workflow with JSON input. |
| 2 | Calls the Apify actor firescraper-ai-website-content-markdown-scraper to scrape content. |
| 3 | Loops through the scraped pages. |
| 4 | AI analyzes each page based on the input prompt. |
| 5 | Aggregates AI outputs across all pages. |
| 6 | A final AI processing step returns a clean, structured result. |

## 🛠 Technologies Used
- **Apify** – Scrapes structured content and Markdown from websites.
- **OpenRouter** – Provides access to advanced AI models like Gemini.
- **LangChain** – Handles AI agent orchestration and prompt interpretation.

## 🔧 Customization
Customize the workflow via the following input fields:
- url: Starting point for scraping
- maxPages: Limit on the number of pages to crawl
- prompt: Any instruction (e.g., "summarize this website," "extract product data," "list all emails")

This allows dynamic, flexible use across various use cases.
## 📦 Output
The workflow returns a JSON result that includes:
- Processed prompt responses from each page
- Aggregated AI insights
- A structured, machine-readable format

## 🧪 Example Use Cases
- 🔍 Extracting contact information from websites
- 📄 Summarizing articles or company profiles
- 🛍️ Collecting product information
- 📋 Extracting job listings or news
- 📬 Generating outreach lists from public data
- 🤖 Serving as a tool within other AI agents for real-time web analysis
- 🧩 Integrating as an external tool in MCP (Model Context Protocol) servers to enhance AI capabilities

## 🔐 API Credentials Required
You will need:
- **Apify API token** – for running the scraper actor
- **OpenRouter API key** – for AI-powered prompt processing

Set these credentials in your environment or the n8n credential manager before running.
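The aggregation in steps 5–6 can be sketched as a plain function. The field names (url, aiResponse) are illustrative, not the exact shape the Apify actor or the AI nodes return.

```javascript
// Merge the per-page AI answers into a single structured object that the
// final AI processing step can consume.
function aggregateResults(pages) {
  return {
    pagesProcessed: pages.length,
    responses: pages.map((p) => ({ url: p.url, answer: p.aiResponse })),
  };
}
```

The final AI step then summarizes responses across pages into the clean result returned to the caller.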
by Chris Jadama
# Voice-to-Ideas: Auto-Transcribe Telegram Voice Notes to Google Sheets

## Who it's for
Creators, entrepreneurs, writers, and anyone who wants to capture ideas quickly without typing. This workflow is ideal for storing thoughts, content ideas, brainstorms, reminders, or voice memos on the go.

## What it does
This workflow listens for Telegram voice messages, sends the audio to OpenAI Whisper for transcription, and saves the raw text directly into a Google Sheet. No formatting or additional processing is applied. The exact transcription from the audio is stored as-is.

## How it works
1. A Telegram Trigger detects when you send a voice message to your bot.
2. The Telegram node downloads the audio file.
3. OpenAI Whisper transcribes the voice note into text.
4. The raw transcription is appended to Google Sheets along with the current date.

## Requirements
- Telegram bot token (created via BotFather)
- OpenAI API key with Whisper transcription enabled
- Google Sheets credentials connected in n8n
- A Google Sheet with two columns: Notes (stores the transcription text) and Date (timestamp of the voice note)

## Setup steps
1. Create a Telegram bot with BotFather and connect Telegram credentials in n8n.
2. Add your OpenAI API key to the OpenAI node.
3. Connect Google Sheets credentials in n8n.
4. Create a Google Sheet with two columns: Notes and Date.
5. Send a voice message to your Telegram bot to test the workflow.
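The mapping from the Whisper output to the two sheet columns is tiny; in n8n it usually lives directly in the Google Sheets node's field mapping, but as a plain function it looks like this (transcription.text matches the shape of OpenAI's transcription response):

```javascript
// Map a Whisper transcription result onto the Notes/Date columns.
function toSheetRow(transcription) {
  return {
    Notes: transcription.text,                   // raw transcription, stored as-is
    Date: new Date().toISOString().slice(0, 10), // current date as YYYY-MM-DD
  };
}
```

Appending one such object per voice note gives the two-column sheet described in the requirements.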
by Rahul Joshi
## Description
Guarantee that only fully compliant stories and tasks make it into your release with this n8n automation template. The workflow monitors Jira for issue updates and link changes, validates whether each story meets the Definition of Done (DoD), and automatically flags non-compliant items. It also creates a tracking record in Monday.com for unresolved blockers and sends Slack alerts summarizing readiness status for every version. Perfect for release managers, QA leads, and engineering teams who need an automated guardrail for production readiness.

## ✅ What This Template Does (Step-by-Step)
1. 🎯 **Jira Webhook Trigger**: Activates automatically when an issue is updated or linked in Jira — ideal for continuous readiness validation.
2. 📋 **Fetch Full Issue Details**: Retrieves the complete issue payload, including custom fields, status, and Definition of Done flags.
3. 🔄 **Batch Processing (1-by-1)**: Validates each issue individually, allowing precise error handling and clean audit trails.
4. ✅ **Check Definition of Done (DoD)**: Evaluates whether the customfield_DoD field is marked as true — a key signal of readiness for release.
5. ⚠️ **Flag Non-Compliant Issues**: If the DoD isn't met, marks the issue as "Non-Compliant" with the reason "Definition of Done not met."
6. 📊 **Create Tracking Record in Monday.com**: Logs non-compliant issues to a dedicated Release Issues board for visibility and coordination with cross-functional teams.
7. 📢 **Send Slack Notifications**: Posts to the #release-updates channel summarizing compliant vs. non-compliant items per version, helping the team take timely action.
## 🧠 Key Features
- 🚦 Real-time Jira readiness validation
- ✅ Automated DoD enforcement before release
- 📊 Monday.com tracker for all non-compliant issues
- 📢 Slack summary notifications for release teams
- ⚙️ Batch-wise validation for scalable QA

## 💼 Use Cases
- 🚀 Enforce the Definition of Done across linked Jira stories
- 📦 Automate pre-release checks for every version increment
- 🧩 Provide visibility into blockers via a Monday.com dashboard
- 📢 Keep engineering and QA teams aligned on release status

## 📦 Required Integrations
- Jira Software Cloud API – to monitor issue updates and retrieve details
- Monday.com API – to log and track non-compliant items
- Slack API – for real-time release alerts

## 🎯 Why Use This Template?
- ✅ Eliminates manual pre-release validation
- ✅ Reduces release delays due to missed criteria
- ✅ Keeps all stakeholders aligned on readiness status
- ✅ Creates a transparent audit trail of compliance
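The DoD check and flagging steps can be sketched in a few lines. Note that customfield_DoD is the placeholder name used in this template; in a real Jira instance the field has a numeric ID such as customfield_10042, so substitute your own.

```javascript
// Classify a Jira issue as Compliant or Non-Compliant based on its
// Definition of Done custom field.
function classifyIssue(issue) {
  const done = issue.fields && issue.fields.customfield_DoD === true;
  if (done) return { key: issue.key, status: "Compliant" };
  return {
    key: issue.key,
    status: "Non-Compliant",
    reason: "Definition of Done not met",
  };
}
```

Non-compliant results feed the Monday.com tracking record and the Slack summary; compliant ones only count toward the summary.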
by Vadim
This workflow automates the process of generating stylized product photos for e-commerce by combining real product shots with creative templates. It enables the creation of a complete set of images for an SKU from a single product photo and a set of reusable templates. The workflow uses Google Gemini (Nano Banana) for image editing and Airtable as the data source.

Example use case: an apparel brand can use this workflow to turn plain product photos (e.g., socks on a white background) into lifestyle images that match their brand aesthetic. By combining each product photo with predefined templates and reference images, the workflow generates a variety of stylized results automatically - ready for marketing or online stores.

## How it works
This workflow expects the following Airtable table setup:
- **"Product Images"** - contains original product photos, one per record.
- **"Reference Images"** - contains reference images for templates, one per record.
- **"Templates"** - contains reusable generation templates. Each template includes a text prompt and up to three reference images.
- **"Jobs"** - contains batch generation jobs. Each job references multiple product images and multiple templates.
- **"Results"** - contains the generated outputs. Each result includes a generated image; references to the job, product image, and template; and a status field (pending, approved, rejected).

The workflow is triggered by a webhook that receives a job ID from Airtable. It then:
1. Fetches the job record.
2. Retrieves the associated product images and templates (each with its text prompt and reference images).
3. Downloads all required product and reference images.
4. For each product-template combination, sends these images and the prompt to Google Gemini to generate new AI-edited images.
5. Saves the generated images back into Airtable.

NOTE: A separate workflow should handle the human-in-the-loop approval process and any regeneration of rejected results.
## Requirements
- Airtable Personal Access Token
- Google Gemini API key

## Setup
1. Ensure all required Airtable tables exist.
2. Configure parameters in the parameters node:
   - Set the Airtable Base ID
   - Set the ID of the attachment field in the "Results" table (where the generated images will be uploaded)
3. Configure credentials for all Airtable nodes.
4. Set the Google Gemini API key for the "Generate..." nodes.
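The fan-out in step 4 is a simple cross product: one Gemini request per product-template combination. The record shapes below are simplified illustrations, not the exact Airtable schema.

```javascript
// Build one generation request per product-template pair.
function buildGenerationJobs(productImages, templates) {
  const jobs = [];
  for (const product of productImages) {
    for (const template of templates) {
      jobs.push({
        productId: product.id,
        templateId: template.id,
        prompt: template.prompt,
        // product photo first, then up to three template reference images
        images: [product.url, ...template.referenceUrls],
      });
    }
  }
  return jobs;
}
```

A job with 2 product images and 3 templates therefore yields 6 generated results, each written back to the "Results" table with status pending.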