by Jay Emp0
## Prompt-to-Image Generator & WordPress Uploader (n8n Workflow)

This workflow generates high-quality AI images from text prompts using Leonardo AI, then automatically uploads the result to your WordPress media library and returns the final image URL. It functions as a Model Context Protocol (MCP) tool, ideal for AI agents or workflows that need to dynamically generate and store visual assets on demand.

### Features
- **AI-Powered Generation**: Uses Leonardo AI to create 1472x832px images from any text prompt, with enhanced contrast and a style UUID preset.
- **WordPress Media Upload**: Uploads the image as an attachment to your connected WordPress site via the REST API.
- **Twitter Media Upload**: Uploads the image to Twitter so you can post it to X.com later using the returned media_id.
- **Returns Final URL**: Outputs the publicly accessible image URL for immediate use in websites, blogs, or social media posts.
- **Workflow-Callable (MCP Compatible)**: Can be executed standalone or triggered by another workflow. Acts as an image-generation microservice for larger automation pipelines.

### Use Cases
**For AI Agents (MCP)**
- Plug this into multi-agent systems as the "image generation module"
- Generate blog thumbnails, product mockups, or illustrations
- Return a clean image_url for content embedding or post-publishing

**For Marketers / Bloggers**
- Automate visual content creation for articles
- Scale image generation for SEO blogs or landing pages
- Supports media upload for Twitter

**For Developers / Creators**
- Integrate with other n8n workflows
- Pass prompt and slug as inputs from any external trigger (e.g., webhook, Discord, Airtable, etc.)

### Inputs

| Field  | Type   | Description                           |
|--------|--------|---------------------------------------|
| prompt | string | Text prompt for image generation      |
| slug   | string | Filename identifier (e.g. hero-image) |

Example:

```json
{ "prompt": "A futuristic city skyline at night", "slug": "futuristic-city" }
```

### Output

```
{
  "public_image_url": "https://your.wordpress.com/img-id",
  "wordpress": {...obj},
  "twitter": {...obj}
}
```

### Workflow Summary
1. **Receive Prompt & Slug**: via manual trigger or parent workflow execution
2. **Generate Image**: POST to Leonardo AI's API with the prompt and config
3. **Wait & Poll**: delays 1 minute, then fetches the final image metadata
4. **Download Image**: GET request to retrieve the generated image
5. **Upload to WordPress**: uses the WordPress REST API with proper headers
6. **Upload to Twitter**: uses the Twitter media upload API to get the media_id in case you want to post the image to Twitter
7. **Return Result**: outputs a clean public_image_url JSON object along with the WordPress and Twitter media objects

### Requirements
- Leonardo AI account and API key
- WordPress site with API credentials (media write permission)
- Twitter / X.com OAuth API (optional)
- n8n instance (self-hosted or cloud)
- This credential setup:
  - httpHeaderAuth for Leonardo headers
  - httpBearerAuth for the Leonardo bearer token
  - wordpressApi for the upload

### Node Stack
- Execute Workflow Trigger / Manual Trigger
- Code (Input Parser; see the sketch at the end of this entry)
- HTTP Request: Leonardo image generation
- Wait: 1 min delay
- HTTP Request: poll generation result
- HTTP Request: download image
- HTTP Request: upload to WordPress
- Code: return final image URL

### Example Prompt

```json
{ "prompt": "Batman typing on a laptop", "slug": "batman-typing-on-a-laptop" }
```

Will return:

```json
{ "public_image_url": "https://articles.emp0.com/wp-content/uploads/2025/07/img-batman-typing-on-a-laptop.jpg" }
```

### Integrate with AI Agents
This workflow is MCP-compliant. Plug it into:
- Research-to-post pipelines
- Blog generators
- Carousel builders
- Email visual asset creators

Trigger it from any parent AI agent that needs to generate an image based on a given idea, post, or instruction.
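A minimal sketch of what the Input Parser Code node could contain; the validation and the `img-<slug>.jpg` filename pattern are assumptions inferred from the example output URL, not the template's exact code:

```javascript
// n8n Code node: validate inputs and derive the upload filename.
// Assumes the trigger passes { prompt, slug } as described in the Inputs table.
const { prompt, slug } = $input.first().json;

if (!prompt || !slug) {
  throw new Error('Both "prompt" and "slug" are required');
}

return [{
  json: {
    prompt,
    slug,
    // Filename pattern mirrors the example output URL (img-<slug>.jpg).
    fileName: `img-${slug}.jpg`,
  },
}];
```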
by Marth
## How It Works: The 5-Node Monitoring Flow

This concise workflow efficiently captures, filters, and delivers crucial cybersecurity-related mentions.

### 1. Monitor: Cybersecurity Keywords (X/Twitter Trigger)
This is the entry point of your workflow. It actively searches X (formerly Twitter) for tweets containing the specific keywords you define.
- **Function:** Continuously polls X for tweets that match your specified queries (e.g., your company name, "Log4j", "CVE-2024-XXXX", "ransomware").
- **Process:** As soon as a matching tweet is found, it triggers the workflow to begin processing that information.

### 2. Format Notification (Code Node)
This node prepares the raw tweet data, transforming it into a clean, actionable message for your alerts (see the sketch at the end of this entry).
- **Function:** Extracts key details from the raw tweet and structures them into a clear, concise message.
- **Process:** It pulls out the tweet's text, the user's handle (@screen_name), and the direct URL to the tweet, then combines them into a user-friendly notificationMessage. You can also include basic filtering logic here if needed.

### 3. Valid Mention? (If Node)
This node acts as a quick filter to reduce noise and prevent irrelevant alerts from reaching your team.
- **Function:** Serves as a simple conditional check to validate the mention's relevance.
- **Process:** It evaluates the notificationMessage against specific criteria (e.g., ensuring it doesn't contain common spam words like "bot"). If the mention passes this basic validation, the workflow continues; otherwise, it quietly ends for that particular tweet.

### 4. Send Notification (Slack Node)
This is the delivery mechanism for your alerts, ensuring your team receives instant, visible notifications.
- **Function:** Delivers the formatted alert message directly to your designated communication channel.
- **Process:** The notificationMessage is sent straight to your specified **Slack channel** (e.g., #cyber-alerts or #security-ops).

### 5. End Workflow (No-Op Node)
This node simply marks the successful completion of the workflow's execution path.
- **Function:** Indicates the end of the workflow's process for a given trigger.

## How to Set Up

Implementing this simple cybersecurity monitor in your n8n instance is quick and straightforward.

### 1. Prepare Your Credentials
Before building the workflow, ensure all necessary accounts are set up and their respective credentials are ready for n8n.
- **X (Twitter) API:** You'll need an X (Twitter) developer account to create an application and obtain your Consumer Key/Secret and Access Token/Secret. Use these to set up your **Twitter credential** in n8n.
- **Slack API:** Set up your **Slack credential** in n8n. You'll also need the **Channel ID** of the Slack channel where you want your security alerts to be posted (e.g., #security-alerts or #it-ops).

### 2. Import the Workflow JSON
Get the workflow structure into your n8n instance.
- **Import:** In your n8n instance, go to the "Workflows" section. Click the "New" or "+" icon, then select "Import from JSON." Paste the provided JSON code into the import dialog and import the workflow.

### 3. Configure the Nodes
Customize the imported workflow to fit your specific monitoring needs.
- **Monitor: Cybersecurity Keywords (X/Twitter):** Click on this node and select your newly created Twitter credential. CRITICAL: modify the "Query" parameter to include your specific brand names, relevant CVEs, or general cybersecurity terms, for example: "YourCompany" OR "CVE-2024-1234" OR "phishing alert". Use OR to combine multiple terms.
- **Send Notification (Slack):** Click on this node and select your Slack credential. Replace "YOUR_SLACK_CHANNEL_ID" with the actual Channel ID you noted earlier for your security alerts. (Optional: you can adjust the "Valid Mention?" node's condition if you find specific patterns of false positives in your search results that you want to filter out.)

### 4. Test and Activate
Verify that your workflow is working correctly before setting it live.
- **Manual Test:** Click the "Test Workflow" button (usually in the top right corner of the n8n editor). This will execute the workflow once.
- **Verify Output:** Check your specified Slack channel to confirm that any detected mentions are sent as notifications in the correct format. If no matching tweets are found, you won't see a notification, which is expected.
- **Activate:** Once you're satisfied with the test results, toggle the "Active" switch (usually in the top right corner of the n8n editor) to ON. Your workflow will then automatically monitor X (Twitter) at the specified polling interval.
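A minimal sketch of the Format Notification Code node, assuming the trigger outputs classic Twitter API fields (`text`, `id_str`, `user.screen_name`); adjust the property paths to whatever your X trigger actually returns:

```javascript
// n8n Code node: build a notificationMessage from each raw tweet.
// Field names below are assumptions based on the classic Twitter API shape.
return $input.all().map((item) => {
  const tweet = item.json;
  const handle = tweet.user?.screen_name ?? 'unknown';
  const url = `https://x.com/${handle}/status/${tweet.id_str}`;

  return {
    json: {
      notificationMessage: `Security mention by @${handle}: "${tweet.text}" ${url}`,
      handle,
      url,
    },
  };
});
```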
by Muhammad Abrar
This n8n template demonstrates how to automate the scraping of posts, comments, and sub-comments from a Facebook Group and store the data in a Supabase database.

Use cases are many: gather user engagement data for analysis, archive posts and comments for research, or monitor community sentiment by collecting feedback across discussions!

### Good to know
At the time of writing, this workflow requires an Apify API key for scraping and Supabase credentials for database storage.

### How it works
- Facebook Group posts are retrieved using an Apify scraper node.
- For each post, comments and sub-comments are collected recursively to capture all levels of engagement.
- The data is then structured and stored in Supabase, creating records for posts, comments, and sub-comments (see the sketch at the end of this entry).
- The workflow includes options to adjust how often it scrapes and which group to target, making it easy to automate collection on a schedule.

### How to use
The workflow is triggered manually in the example, but you can replace this with other triggers such as webhooks or scheduled workflows, depending on your needs. The workflow also captures rich data points, such as user interactions and media attached to posts.

### Requirements
- Apify account and API key
- Supabase account for data storage

### Customizing this workflow
This template is ideal for gathering and analyzing community feedback, tracking discussions over time, or archiving group content for future use.
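A minimal sketch of how a Code node might flatten one scraped post with nested comments into flat rows for Supabase inserts; the field names (`id`, `text`, `comments`, `replies`) are illustrative assumptions, not the actual Apify output schema:

```javascript
// n8n Code node: flatten a post + nested comments into row objects.
// Field names are assumptions; map them to your Apify scraper's output.
const rows = [];

function walk(comment, postId, parentId, depth) {
  rows.push({
    json: {
      table: depth === 1 ? 'comments' : 'sub_comments',
      post_id: postId,
      parent_id: parentId,
      comment_id: comment.id,
      text: comment.text,
    },
  });
  for (const reply of comment.replies ?? []) {
    walk(reply, postId, comment.id, depth + 1);
  }
}

for (const item of $input.all()) {
  const post = item.json;
  rows.push({ json: { table: 'posts', post_id: post.id, text: post.text } });
  for (const comment of post.comments ?? []) {
    walk(comment, post.id, null, 1);
  }
}

return rows;
```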
by Atik
Simplify expense tracking with AI-powered automation that extracts receipt data and organizes it instantly.

### What this workflow does
- Watches Google Drive for new receipt uploads (images/PDFs)
- Automatically downloads and prepares files for processing
- Parses key details using the verified VLM Run node (merchant, customer, amount, currency, date)
- Stores structured records in Airtable for real-time tracking

### Setup
**Prerequisites:** Google Drive and Airtable accounts, VLM Run API credentials, and an n8n instance.

Install the verified VLM Run node by searching for "VLM Run" in the node list, then clicking Install. Once installed, you can start using it in your workflows.

**Quick setup:**
1. Configure Google Drive OAuth2 and create a receipt upload folder
2. Add VLM Run API credentials
3. Create an Airtable base with fields: Customer, Merchant, Amount, Currency, Date
4. Update the folder/base IDs in the workflow nodes
5. Test and activate

### How to customize this workflow to your needs
Extend functionality by:
- Adding categories, budgets, or approval steps
- Syncing with accounting tools (QuickBooks, Xero)
- Sending Slack or email alerts for processed receipts
- Enabling error handling and duplicate checks

This workflow eliminates manual data entry and creates a seamless, automated system that saves time and improves accuracy.
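A hypothetical example of the structured record this workflow writes to Airtable for one receipt, matching the five fields listed above (the exact JSON keys returned by the VLM Run node may differ; the values here are invented):

```json
{
  "Customer": "Jane Doe",
  "Merchant": "Blue Bottle Coffee",
  "Amount": 12.50,
  "Currency": "USD",
  "Date": "2025-07-14"
}
```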
by Abhiman G S
### Short description
Transcribe Telegram voice/audio messages to text using Groq's OpenAI-compatible Whisper endpoint. Replies are delivered either as a Telegram message or as a downloadable .txt file. Plug-and-play for n8n with minimal setup.

### Who's it for / Uses
- Educators, podcasters, interviewers, and support teams who need quick voice-to-text conversions.
- Automating meeting notes, voice feedback, voicemail transcription, or speech logging.
- Useful when you want transcripts pushed to chat or saved as files for archiving.

### How it works (overview)
1. **Telegram Trigger**: the workflow starts on an incoming message.
2. **Switch (Audio/Voice)**: detects voice or audio. If neither, replies "Wrong file type" and stops.
3. **Telegram Download**: downloads the audio using the file_id and outputs the file path/binary.
4. **Set node (credentials + options)**: stores Groq_API and Telegram_access_token (required) and transcript_output_format (message or file).
5. **HTTP Request to Groq (Whisper)**: uploads the audio (multipart/form-data) to Groq's transcription endpoint and receives the text (see the sketch at the end of this entry).
6. **Reply Switch**: routes to either:
   - Message branch: send the transcribed text as a Telegram message.
   - File branch: convert the transcript to .txt and send it as a document.

### Requirements
- n8n instance (cloud or self-hosted) with internet access
- Telegram bot token (create via BotFather)
- Groq API key (create at https://console.groq.com/keys)
- Basic n8n nodes: Telegram Trigger, Switch, Telegram Download, Set, HTTP Request, Convert to File, Telegram Send Message/Document

### Important setup & notes
- **Mandatory:** Add Groq_API and Telegram_access_token in the **Set** node (or use n8n Credentials). The workflow will fail without them.
- **Do not hardcode** keys in HTTP node fields that will be exported or shared. Use Set fields or n8n Credentials.
- Include sticky notes explaining each node (a yellow note with a full description is recommended). Sticky notes should show setup steps and required fields.
- Before publishing: remove personal IDs and secrets, test with sample voice messages, and verify the Groq response schema to map the transcript field correctly.

### Security & best practices
- Use n8n Credentials or environment variables in production.
- Rotate API keys if they become exposed.
- Keep the Set node private when sharing templates; instruct users to replace keys with their own.
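A minimal sketch of the transcription call, written as standalone JavaScript rather than n8n HTTP Request node settings; the endpoint follows Groq's OpenAI-compatible audio API, while the model name is an assumption to check against Groq's current model list:

```javascript
// Standalone sketch of the Whisper transcription request to Groq.
// In the workflow this is performed by the HTTP Request node instead.
import { readFile } from 'node:fs/promises';

async function transcribe(audioPath, groqApiKey) {
  const form = new FormData();
  form.append('file', new Blob([await readFile(audioPath)]), 'voice.ogg');
  form.append('model', 'whisper-large-v3'); // assumed model name

  const res = await fetch('https://api.groq.com/openai/v1/audio/transcriptions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${groqApiKey}` },
    body: form,
  });
  const data = await res.json();
  return data.text; // transcript field per the OpenAI-compatible schema
}
```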
by Nima Salimi
## Automated SEO Keyword and SERP Analysis with DataForSEO for High-Converting Content (n8n Workflow Template)

### Overview
This is a complete SEO automation workflow built for professionals who want to manage all their DataForSEO operations inside n8n, no coding required.

You can easily choose your operator (action), such as:
- **SERP Analysis**: get ranking data for specific keywords
- **Keyword Data**: retrieve search volume, CPC, and trends
- **Competitor Research**: analyze which domains dominate target queries

Once the workflow runs, it automatically creates a new Google Sheet (if it doesn't exist) and appends the results, including metrics like keyword, rank, domain, and date, to keep a growing historical record of your SEO data.

Ideal for SEO specialists, agencies, and growth teams who want a single automation to handle all keyword and ranking data pipelines using DataForSEO + Google Sheets + n8n.

Example: related-keyword sheet. Each operator (SERP, Keywords Data, Competitors) automatically creates a separate Google Sheet.

### Who's it for?
- SEO specialists who need accurate keyword and SERP insights daily
- Content marketers planning new blog posts or landing pages
- Digital marketing teams tracking top-performing keywords and competitors
- Agencies managing multiple websites or niches with automated reports
- AI-driven SEOs building GPT-powered content strategies using live ranking data

### How It Works
1. **Trigger & Input Setup**
   - Start the workflow manually or schedule it to run daily/weekly
   - Import a keyword list from Google Sheets, NocoDB, or an internal database
2. **Keyword Data Retrieval (DataForSEO Keyword API)**
   - Sends requests to the keywords_data endpoint of DataForSEO
   - Gathers search volume, CPC, competition level, and trend data
   - Identifies the most promising keywords for conversion-focused content
3. **SERP Analysis (DataForSEO SERP API)**
   - Fetches the top organic results for each keyword
   - Extracts domains, titles, snippets, and ranking positions
   - Highlights which competitors dominate the search landscape
4. **Data Enrichment & Filtering**
   - Uses Code nodes to clean and normalize the DataForSEO JSON output (see the sketch at the end of this entry)
   - Filters out low-intent or irrelevant keywords automatically
   - Optionally integrates OpenAI or GPT nodes for insight generation
5. **Store & Visualize**
   - Saves results into Google Sheets, Airtable, or NocoDB for tracking
   - Each run adds fresh data, building a performance history over time
6. **Optional AI Layer (Advanced)**
   - Use an OpenAI Chat Model to summarize SERP insights, e.g.: "Top 3 competitors for cloud storage pricing focus on cost transparency; recommend including pricing tables."
   - Automatically generate content briefs or keyword clusters

### Workflow Highlights
- Multiple DataForSEO endpoints supported (keywords_data, serp, competitors)
- Automated scheduling for daily/weekly updates
- Data normalization for clean, structured SEO metrics
- Easy export to Google Sheets or NocoDB
- Expandable design: integrate GPT, Google Search Console, or Analytics
- Multi-language and multi-location support via language_code and location_code

### Example Output (Google Sheets)

| keyword | rank | domain | volume | cpc | competition | date |
|---------|------|--------|--------|-----|-------------|------|
| cloud hosting | 1 | cloud.google.com | 18,100 | $2.40 | 0.62 | 2025-10-25 |
| cloud server | 3 | aws.amazon.com | 12,900 | $3.10 | 0.75 | 2025-10-25 |
| hybrid cloud | 5 | vmware.com | 9,800 | $2.90 | 0.58 | 2025-10-25 |

Each run appends new keyword metrics for trend and performance tracking.

### Pro Tips
- Combine this workflow with Google Search Console for even richer insights
- Adjust location_code and language_code for local SEO targeting
- Add a Slack or Gmail alert to receive weekly keyword opportunity reports
- Extend with OpenAI to automatically create content briefs or topic clusters

### Integrations Used
- **DataForSEO API**: keyword & SERP data source
- **Google Sheets / Airtable / NocoDB**: storage and visualization
- **OpenAI Chat Model (optional)**: insight generation and summarization
- **Code nodes**: JSON parsing and custom data processing

### Features
- **Choose from 100+ locations**: Select your target country, region, or city using the location_code parameter. Perfect for local SEO tracking or multi-market analysis.
- **Choose from 50+ languages**: Define the language_code to get accurate, language-specific keyword and SERP data. Supports English (en), Spanish (es), French (fr), German (de), and more.
- **Auto-creates Google Sheets for you**: No need to manually set up a spreadsheet; the workflow automatically creates a new Google Sheet (if it doesn't exist) and structures it with the right columns (query, rank, domain, date, etc.).
- **Appends new data automatically**: Every run adds fresh SEO metrics to your sheet, building a continuous daily or weekly ranking history.
- **Flexible operator selection**: Choose which DataForSEO operator (action) you want to run: keywords_data, serp, or competitors. Each operator retrieves a different type of SEO insight.
- **Fully expandable**: Add Slack alerts, Airtable sync, or AI summaries using OpenAI, all within the same workflow.

### How to Set Up
1. **Add DataForSEO credentials**: Get your API login from dataforseo.com and add it under HTTP Request Basic Auth in n8n.
2. **Connect Google Sheets**: Authorize your Google account; the workflow will auto-create the sheet if it doesn't exist.
3. **Choose an operator (action)**: Pick one of serp, keywords_data, or competitors. Each operator runs a different SEO analysis.
4. **Set location & language**: For example, location_code: 2840 (US) and language_code: en.
5. **Run or schedule**: Trigger manually or set a daily schedule. New results will append to your Google Sheet automatically.

### Check Out My Channel
Learn more about SEO automation, n8n, and AI-powered content workflows. Connect with me on LinkedIn: Nima Salimi. Follow for more templates, AI workflows, and SEO automation tutorials.
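A minimal sketch of the normalization Code node mentioned in step 4, assuming DataForSEO's usual `tasks[].result[].items[]` response shape; verify the field names (`rank_absolute`, `domain`) against the live serp endpoint response before relying on them:

```javascript
// n8n Code node: flatten a DataForSEO SERP response into sheet rows.
// Response shape and field names are assumptions; check the API docs.
const rows = [];
const today = new Date().toISOString().slice(0, 10);

for (const item of $input.all()) {
  for (const task of item.json.tasks ?? []) {
    for (const result of task.result ?? []) {
      for (const entry of result.items ?? []) {
        if (entry.type !== 'organic') continue; // keep organic results only
        rows.push({
          json: {
            keyword: result.keyword,
            rank: entry.rank_absolute,
            domain: entry.domain,
            date: today,
          },
        });
      }
    }
  }
}

return rows;
```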
by Yaron Been
This workflow automates the entire text-to-video generation process using OpenAI's SORA-2 model via the Replicate API. Simply provide a seed image, a text prompt describing your desired scene, and a video duration (4, 8, or 12 seconds), and the workflow handles video creation, status monitoring, and delivery of your AI-generated content, typically ready in 2-5 minutes.

### Tools Used
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: powers the OpenAI SORA-2 model for text-to-video generation
- **OpenAI API**: required for SORA-2 model authentication
- **Status monitoring**: automated checking system with intelligent retry logic (see the sketch at the end of this entry)
- **Error handling**: built-in resilience with comprehensive error management
- **Batch processing**: optional bulk generation for multiple videos

### How to Install
1. **Import the workflow**: Download the .json file and import it into your n8n instance
2. **Get API keys**: Sign up at replicate.com and openai.com for your API tokens
3. **Configure API tokens**: Replace the placeholders in the "Set API Token" node with your keys
4. **Add a seed image**: Update the image URL in the "Add Seed Image, Prompt and amount of seconds" node (1280x720 recommended)
5. **Write your prompt**: Describe your desired video scene and set the duration, then run the workflow

### Use Cases
- **Ad agencies**: generate multiple video variations for A/B testing campaigns
- **E-commerce brands**: create product demonstration videos from a single photo
- **Content creators**: produce unique video content without filming equipment
- **Marketing teams**: rapidly iterate on video ad concepts for social media
- **Startups**: create professional video content without expensive production costs
- **Social media managers**: generate platform-specific video content at scale

### Connect with Me
- **Email**: Yaron@nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **ROASPIG**: Check out ROASPIG.com for scalable media generation and automation solutions

#n8n #automation #ai #sora2 #texttovideo #openai #replicate #contentcreation #videomarketing #adcreatives
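For reference, a sketch of the create-then-poll pattern this workflow implements against Replicate's predictions API (in the workflow itself this is done by HTTP Request and Wait nodes); the model version placeholder and the input field names (`prompt`, `image`, `seconds`) are assumptions to verify against the actual SORA-2 model page on Replicate:

```javascript
// Sketch of Replicate's create-then-poll pattern.
// Input field names and the version ID are placeholders/assumptions.
async function generateVideo(token, prompt, imageUrl, seconds) {
  const create = await fetch('https://api.replicate.com/v1/predictions', {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: JSON.stringify({
      version: 'MODEL_VERSION_ID', // placeholder: the SORA-2 version on Replicate
      input: { prompt, image: imageUrl, seconds },
    }),
  });
  let prediction = await create.json();

  // Poll until Replicate reports a terminal status.
  while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
    await new Promise((r) => setTimeout(r, 10_000));
    const poll = await fetch(prediction.urls.get, {
      headers: { Authorization: `Bearer ${token}` },
    });
    prediction = await poll.json();
  }
  return prediction.output; // video URL(s) on success
}
```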
by KlickTipp
Community Node Disclaimer: This workflow uses KlickTipp community nodes.

### Introduction
This workflow automates the end-to-end integration between Zoom and KlickTipp. It listens to Zoom webinar events (specifically meeting.ended), validates incoming webhooks, retrieves participant data from Zoom, and applies segmentation in KlickTipp by subscribing and tagging participants based on their attendance duration. This enables precise, automated campaign targeting without manual effort.

### How It Works
1. **Zoom webhook listener**
   - Captures meeting.ended events from Zoom.
   - Validates the initial webhook registration via HMAC before processing.
2. **Webhook response handling**
   - Distinguishes between Zoom's URL validation requests and actual event data.
   - Sends the appropriate response (plainToken + encryptedToken for validation, or a simple status: ok for regular events).
3. **Data retrieval**
   - Waits briefly (1 second) to ensure meeting data is available.
   - Pulls the participant list from Zoom's past_meetings/{uuid}/participants endpoint.
4. **Participant processing**
   - Splits the list into individual participant items.
   - Filters out internal users (such as the host).
   - Routes participants based on the meeting topic (e.g., an "Anfänger" vs. an "Experten" webinar, i.e. beginner vs. expert).
5. **Attendance segmentation**
   - Subscribes each participant to KlickTipp with mapped fields (first name, last name, email).
   - Uses conditions to check attendance thresholds (see the sketch at the end of this entry):
     - ≥ 90% of the total meeting duration: full attendance
     - Otherwise: general attendance
   - Applies the corresponding KlickTipp tags per meeting type.

### Key Features
- Webhook validation and security with HMAC (SHA-256).
- Automated attendance calculation using participant duration vs. meeting duration.
- Dynamic routing by meeting topic for multiple webinars.
- KlickTipp integration with subscriber creation or update, plus tagging for full vs. general attendance.
- Scalable structure: add more webinars by extending the Switch and tagging branches.

### Setup Instructions
**Zoom setup**
- Enable Zoom API access and OAuth2 app credentials.
- Configure the webhook event meeting.ended.
- Grant scopes: meeting:read:meeting, meeting:read:list_past_participants.

**KlickTipp setup**
- Prepare custom fields:
  - Zoom | meeting selection (Text)
  - Zoom | meeting start (Date & Time)
  - Zoom | Join URL (URL)
  - Zoom | Registration ID (Text)
  - Zoom | Duration meeting (Text)
- Create tags for each meeting variation: attended, attended fully, not attended per meeting name.

**n8n setup**
- Add the Zoom webhook node (Listen to ending Zoom meetings).
- Configure the validation nodes (Crypto, Build Validation Body).
- Set up the HTTP Request node with Zoom OAuth2 credentials.
- Connect the KlickTipp nodes with your KlickTipp API credentials.

### Testing & Deployment
- End a test Zoom meeting connected to this workflow.
- Verify that:
  - The webhook triggers correctly.
  - The participant list is fetched.
  - Internal users are excluded.
  - Participants are subscribed and tagged in KlickTipp.
- Check contact records in KlickTipp for tag and field updates.

Pro tip: use test emails and manipulate duration values to confirm the segmentation logic.

### Customization Ideas
- Adjust attendance thresholds (e.g., 80% instead of 90%).
- Add additional meeting topics via the Switch node.
- Trigger email campaigns in KlickTipp based on attendance tags.
- Expand segmentation with more granular ranges (e.g., 0-30%, 30-60%, 60-90%).
- Add error handling for missing Zoom data or API failures.

### Resources
- Use KlickTipp Community Node in n8n
- Automate Workflows: KlickTipp Integration in n8n
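A minimal sketch of the attendance check, assuming Zoom's past-meeting participant objects expose `duration` in seconds and that the total meeting duration (reported in minutes) is carried over from the meeting.ended payload; both assumptions should be verified against your actual webhook and API responses:

```javascript
// n8n Code node: classify each participant by attendance share.
// Field paths are assumptions; verify against the real Zoom payloads.
const meetingDurationSec = $('Listen to ending Zoom meetings').first()
  .json.payload.object.duration * 60; // assumes Zoom reports minutes here

return $input.all().map((item) => {
  const p = item.json;
  const share = p.duration / meetingDurationSec; // participant seconds / total
  return {
    json: {
      email: p.user_email,
      name: p.name,
      attendance: share >= 0.9 ? 'attended fully' : 'attended',
      attendanceShare: Math.round(share * 100),
    },
  };
});
```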
by Milan Vasarhelyi - SmoothWork
Video Introduction. Want to automate your inbox or need a custom workflow? Book a Call | DM me on LinkedIn

### Overview
This workflow automatically exports customer balance data from QuickBooks to Google Sheets on a monthly basis. It eliminates manual data entry and creates a historical record of customer balances that updates automatically, making it easy to track payment trends, identify outstanding balances, and monitor customer financial health over time.

### Key Features
- **Automated monthly reporting**: runs on the first day of each month to capture a snapshot of all customer balances
- **Clean data structure**: extracts only the essential fields (Customer ID, Balance, Email, and Period) for easy analysis
- **Historical tracking**: each monthly run appends new data to your Google Sheet, building a timeline of customer balances
- **No manual work**: once configured, the workflow runs completely hands-free

### Common Use Cases
- Track customer payment patterns and identify accounts with growing balances
- Create monthly reports for management or finance teams
- Build dashboards and visualizations from historical QuickBooks data
- Monitor customer account health without logging into QuickBooks

### Setup Requirements
1. **QuickBooks developer account**: Register at developer.intuit.com and create a new app in the App Dashboard. Select the 'Accounting' scope for permissions. You'll receive a Client ID and Client Secret to configure your n8n credentials.
2. **Credentials**: Set up QuickBooks OAuth2 credentials in n8n using your app's Client ID and Client Secret. Use the 'Sandbox' environment for testing or 'Production' for live data (requires Intuit app approval). Also connect your Google Sheets account.
3. **Google Sheet**: Create a spreadsheet with column headers matching the workflow output: Period, Id, Balance, and Email.

### Configuration
- **Schedule**: The workflow runs monthly on the first day at 8 AM. Modify the Schedule Trigger to change the timing or frequency.
- **Spreadsheet URL**: Update the 'Export to Google Sheets' node with your destination spreadsheet URL.
- **Data fields**: Customize the 'Prepare Customer Data' node to extract different customer fields if needed (see the sketch below).
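A minimal sketch of what the 'Prepare Customer Data' Code node might contain, assuming the QuickBooks node returns standard Customer objects with `Id`, `Balance`, and `PrimaryEmailAddr` fields (verify against your node's actual output):

```javascript
// n8n Code node: reduce each QuickBooks customer to the four sheet columns.
// Field names follow the standard QuickBooks Customer object.
const period = new Date().toISOString().slice(0, 7); // e.g. "2025-07"

return $input.all().map((item) => ({
  json: {
    Period: period,
    Id: item.json.Id,
    Balance: item.json.Balance,
    Email: item.json.PrimaryEmailAddr?.Address ?? '',
  },
}));
```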
by Alexandru Burca
AI-powered automation that rewrites, enhances, and publishes Telegram RSS content directly to your WordPress news site, including image/video upload and category mapping.

### Who's it for
This workflow is designed for content creators, news publishers, and social media managers who share updates on Telegram and want to automatically republish them as formatted articles on WordPress. It's ideal for news portals, agencies, or blogs that manage content across multiple channels.

### How it works / What it does
1. Fetches new posts from a Telegram channel feed (via RSS).
2. Uses OpenAI to rewrite the text into a polished news-style article.
3. Detects whether the content includes images or videos and downloads them.
4. Uploads the media to WordPress and links it to the post (see the sketch at the end of this entry).
5. Automatically publishes the formatted article to WordPress with the correct category and excerpt.

### Set up steps
Setup takes around 10-15 minutes.
1. You'll need API keys for OpenAI and a WordPress Application Password.
2. Add your Telegram RSS feed URL and WordPress site URL in the relevant nodes.
3. (Optional) Adjust the tone or rewrite style in the OpenAI node and the category mapping in the Switch node.
4. All configuration details are included in sticky notes inside the workflow.

### Requirements
- WordPress site with REST API access and an Application Password
- OpenAI API key
- Telegram channel's RSS URL

### How to customize the workflow
You can easily adjust the writing style in the OpenAI node, change categories in the Switch node, or change how often the workflow checks Telegram for new posts.
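A minimal sketch of the publish call that the final WordPress step performs, using the standard REST API with Application Password Basic auth; the field values and IDs are placeholders rather than the template's exact node configuration:

```javascript
// Sketch of publishing a post via the WordPress REST API.
// Uses Basic auth with an Application Password; values are placeholders.
async function publishPost(siteUrl, user, appPassword, article) {
  const res = await fetch(`${siteUrl}/wp-json/wp/v2/posts`, {
    method: 'POST',
    headers: {
      Authorization: 'Basic ' + Buffer.from(`${user}:${appPassword}`).toString('base64'),
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      title: article.title,
      content: article.html,
      excerpt: article.excerpt,
      categories: [article.categoryId], // ID mapped by the Switch node
      featured_media: article.mediaId,  // attachment uploaded earlier
      status: 'publish',
    }),
  });
  return res.json();
}
```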
by Grace Gbadamosi
### How it works
This workflow automatically monitors your Google Business Profile for new reviews and uses AI to generate personalized response suggestions. When a review is detected, the system formats the review data, generates an appropriate AI response based on the rating and content, sends differentiated Slack notifications (urgent alerts for negative reviews, celebration messages for positive ones), and logs everything to Google Sheets for tracking and analysis.

### Who is this for
Local business owners, restaurant managers, retail store operators, service providers, and reputation management teams who want to stay on top of customer feedback and respond promptly with thoughtful, AI-generated responses. Perfect for businesses that receive regular reviews and want to maintain consistent, professional customer engagement without manually monitoring multiple platforms.

### Requirements
- **Google Business Profile**: active business profile with review monitoring enabled
- **Google API credentials**: service account with access to the Business Profile API and Sheets API
- **Slack webhook**: incoming webhook URL for team notifications
- **Google Sheets**: spreadsheet with a "Reviews" sheet for logging review data
- **Environment variables**: setup for secure credential storage
- **Basic n8n knowledge**: understanding of triggers, expressions, and credential management

### How to set up
1. **Configure the Google Business Profile API**: Create a Google Cloud project, enable the Business Profile API, set up service account credentials, and add your Business Account ID and Location ID to environment variables.
2. **Prepare the Google Sheets integration**: Create a Google Sheet with a "Reviews" sheet, add the required headers, set the GOOGLE_SHEET_ID environment variable, and ensure the service account has edit access.
3. **Set up Slack notifications**: Create a Slack webhook in your workspace and set the SLACK_WEBHOOK_URL environment variable.
4. **Customize business settings**: Update the Business Configuration node with your business name and adjust the AI response tone preferences.

### How to customize the workflow
Modify the Business Configuration node to change your business name, adjust the AI response tone (professional, friendly, casual), customize the Slack notification messages in the HTTP Request nodes, or add additional review sources by duplicating the trigger structure.
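A minimal sketch of the routing logic that decides between an urgent alert and a celebration message, assuming the Business Profile API's `starRating` enum values (`"ONE"` through `"FIVE"`); the three-star cutoff is an illustrative choice, not the template's fixed rule:

```javascript
// n8n Code node: classify a review and draft the Slack text.
// starRating enum and the 3-star threshold are assumptions.
const STARS = { ONE: 1, TWO: 2, THREE: 3, FOUR: 4, FIVE: 5 };

return $input.all().map((item) => {
  const review = item.json;
  const stars = STARS[review.starRating] ?? 0;
  const negative = stars <= 3; // treat 3 stars and below as needing attention

  return {
    json: {
      stars,
      negative,
      slackText: negative
        ? `:rotating_light: ${stars}-star review needs attention: "${review.comment}"`
        : `:tada: New ${stars}-star review: "${review.comment}"`,
    },
  };
});
```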
by inderjeet Bhambra
This workflow automates AI-powered image and video generation using MagicHour.ai's API, enhanced by GPT-4.1 for intelligent prompt optimization. It processes webhook requests, refines prompts using AI, generates media content, and returns the final output.

### Who's it for
Content creators, marketers, social media managers, and developers who need automated AI media generation at scale. Perfect for teams building applications that require on-demand image or video creation without manual intervention.

### How it works
The workflow receives a webhook POST request containing generation parameters (type, orientation, style, duration). GPT-4.1 analyzes and optimizes the user's prompt based on the request type (image or video), then sends it to MagicHour.ai's API. The workflow monitors the generation status through polling loops, downloads the completed media, and returns it via the webhook response. Error handling ensures failed requests are captured and reported.

### Requirements
- **n8n instance** (self-hosted or cloud)
- **MagicHour.ai account** with API access (Bearer token)
- **OpenAI API account** for GPT-4.1 access
- Basic understanding of webhooks and JSON

### How to set up
1. **Configure credentials:**
   - Add the MagicHour.ai Bearer token in the HTTP Request nodes (ai-image-generator, text-to-video, Get Image Details, Get Video Details)
   - Add OpenAI API credentials in both the Generate Image Prompt and Generate video Prompt nodes
2. **Activate the workflow:**
   - Enable the workflow to activate the webhook endpoint
   - Copy the webhook URL from the Webhook trigger node
3. **Test the workflow:**
   - Download the n8n-magichour HTML tester (linked in the template)
   - For image generation, send a POST request with this structure:

```json
{
  "action": "generate",
  "type": "image",
  "parameters": {
    "name": "My Image",
    "image_count": 1,
    "orientation": "landscape",
    "style": {
      "prompt": "A serene mountain landscape at sunset",
      "tool": "realistic"
    }
  }
}
```

For video generation, use:

```json
{
  "action": "generate",
  "type": "video",
  "parameters": {
    "name": "My Video",
    "end_seconds": 5,
    "orientation": "landscape",
    "resolution": "1080p",
    "style": {
      "prompt": "A dog running through a field"
    }
  }
}
```

### How to customize the workflow
- **Adjust AI prompt optimization**: Modify the system prompts in the Generate Image Prompt or Generate video Prompt nodes to change how GPT-4.1 refines user inputs. The current prompts enforce strict character limits and avoid unauthorized content.
- **Change polling intervals**: Modify the Wait nodes to adjust how frequently the workflow checks generation status (useful for longer video renders).
- **Modify the response format**: Update the Respond to Webhook node to customize the output structure sent back to the caller.
- **Add multiple output formats**: Extend the Download Image/Video nodes to save files to cloud storage (Google Drive, S3) instead of only returning them via webhook.
- **Implement queue management**: Add a database node before the MagicHour.ai calls to queue requests and prevent API rate limiting.
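A hypothetical example of calling the activated webhook from a script, using the image payload documented above; the webhook URL is a placeholder from your own n8n instance:

```javascript
// Trigger the workflow from any HTTP client; the URL is a placeholder.
const res = await fetch('https://your-n8n.example.com/webhook/magichour', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    action: 'generate',
    type: 'image',
    parameters: {
      name: 'My Image',
      image_count: 1,
      orientation: 'landscape',
      style: { prompt: 'A serene mountain landscape at sunset', tool: 'realistic' },
    },
  }),
});
console.log(await res.json()); // generated media (or its URL) per the Respond node
```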