by Lakshit Ukani
# Automated Instagram Posting with Facebook Graph API and Content Routing

## Who is this for?
This workflow is perfect for social media managers, content creators, digital marketing agencies, and small business owners who need to automate their Instagram posting process. Whether you're managing multiple client accounts or maintaining consistent personal branding, this template streamlines your social media operations.

## What problem is this workflow solving?
Manual Instagram posting is time-consuming and prone to inconsistency. Content creators struggle with:
- Remembering to post at optimal times
- Managing different content types (images, videos, reels, stories, carousels)
- Maintaining posting schedules across multiple accounts
- Ensuring content is properly formatted for each post type

This workflow eliminates manual posting, reduces human error, and ensures consistent content delivery across all Instagram format types.

## What this workflow does
The workflow automatically publishes content to Instagram using Facebook's Graph API, with intelligent routing based on content type. It handles image posts, video stories, Instagram reels, carousel posts, and story content. The system creates media containers, monitors processing status, and publishes content when ready. It supports both HTTP requests and Facebook SDK methods for maximum reliability and includes automatic retry mechanisms for failed uploads.

## Setup
1. Connect your Instagram Business Account to a Facebook Page.
2. Configure Facebook Graph API credentials with instagram_basic permissions.
3. Update the "Configure Post Settings" node with your Instagram Business Account ID.
4. Set media URLs and captions in the configuration section.
5. Choose a post type (http_image, fb_reel, http_carousel, etc.).
6. Test the workflow with sample content before going live.

## How to customize this workflow to your needs
- Modify the post_type variable to control content routing:
  - Use http_* prefixes for direct API calls
  - Use fb_* prefixes for Facebook SDK calls
- **Use both HTTP and Facebook SDK nodes as fallback mechanisms** - if one method fails, automatically try the other for maximum success rate.
- Add scheduling by connecting a Cron trigger node.
- Integrate with Google Sheets or Airtable for content management.
- Connect webhook triggers for automated posting from external systems.
- Customize wait times based on your content file sizes.
- **Set up error handling** to switch between HTTP and Facebook SDK methods when API limits are reached.
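For orientation, here is a minimal JavaScript sketch of the container-then-publish sequence that the HTTP routing nodes wrap. The endpoints follow Meta's documented Graph API publishing flow, but the API version (v19.0) and the placeholder values are assumptions to replace with your own:

```javascript
// A sketch of Instagram's two-step publish flow (create container, poll, publish).
// igUserId/accessToken come from your own setup; v19.0 is an example API version.
const BASE = "https://graph.facebook.com/v19.0";

async function publishImage(igUserId, accessToken, imageUrl, caption) {
  // Step 1: create a media container for the image.
  const createParams = new URLSearchParams({
    image_url: imageUrl,
    caption,
    access_token: accessToken,
  });
  const container = await fetch(`${BASE}/${igUserId}/media?${createParams}`, {
    method: "POST",
  }).then((r) => r.json());

  // Step 2: poll the container until processing finishes (videos take longer than images;
  // this is what the workflow's wait/retry nodes handle).
  let statusCode = "IN_PROGRESS";
  while (statusCode === "IN_PROGRESS") {
    await new Promise((resolve) => setTimeout(resolve, 5000));
    const status = await fetch(
      `${BASE}/${container.id}?fields=status_code&access_token=${accessToken}`
    ).then((r) => r.json());
    statusCode = status.status_code;
  }
  if (statusCode !== "FINISHED") throw new Error(`Container not ready: ${statusCode}`);

  // Step 3: publish the finished container.
  const publishParams = new URLSearchParams({
    creation_id: container.id,
    access_token: accessToken,
  });
  return fetch(`${BASE}/${igUserId}/media_publish?${publishParams}`, {
    method: "POST",
  }).then((r) => r.json());
}
```

The same create/poll/publish pattern applies to reels and carousels; only the container parameters differ per post type, which is what the post_type routing selects.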
by Alex Kim
# Weather via Slack 🌦️💬

## Overview
This workflow provides real-time weather updates via Slack using a custom Slack command: `/weather [cityname]`. Users can type this command in Slack (e.g., `/weather New York`), and the workflow will fetch and post the latest forecast, including temperature, wind conditions, and a short weather summary.

While this workflow is designed for Slack, users can modify it to send weather updates via email, Discord, Microsoft Teams, or any other communication platform.

## How It Works
1. **Webhook Trigger** – The workflow is triggered when a user runs `/weather [cityname]` in Slack.
2. **Geocoding with OpenStreetMap** – The city name is converted into latitude and longitude coordinates.
3. **Weather Data from NOAA** – The coordinates are used to retrieve detailed weather data from the National Weather Service (NWS) API.
4. **Formatted Weather Report** – The workflow extracts relevant weather details, such as:
   - Temperature (°F/°C)
   - Wind speed and direction
   - Short forecast summary
5. **Slack Notification** – The weather forecast is posted back to the Slack channel in a structured format.

## Requirements
- A custom Slack app with:
  - The ability to create a Slash Command (`/weather`)
  - OAuth permissions to post messages in Slack
- An n8n instance to host and execute the workflow

## Customization
- Replace Slack messaging with email, Discord, Microsoft Teams, Telegram, or another service.
- Modify the weather data format for different output preferences.
- Set up scheduled weather updates for specific locations.

## Use Cases
- Instantly check the weather for any location directly in Slack.
- Automate weather reports for team members or projects.
- Useful for remote teams, outdoor event planning, or general weather tracking.

## Setup Instructions
1. Create a custom Slack app: go to api.slack.com/apps and create a new app.
2. Add a Slash Command (`/weather`) with the webhook URL from n8n.
3. Enable OAuth scopes for sending messages.
4. Deploy the webhook and ensure it can receive and process Slack commands.
5. Run the workflow: type `/weather [cityname]` in Slack and receive instant weather updates.
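As a reference, here is a minimal sketch of the geocode-then-forecast chain the workflow's HTTP nodes perform, using the public Nominatim and NWS endpoints named above. The "n8n-weather-demo" User-Agent string is a placeholder (Nominatim requires an identifying one):

```javascript
// Sketch of the OpenStreetMap → NWS lookup chain, runnable in an n8n Code node.
async function getForecast(city) {
  // 1. Geocode the city with OpenStreetMap Nominatim.
  const geo = await fetch(
    `https://nominatim.openstreetmap.org/search?format=json&q=${encodeURIComponent(city)}`,
    { headers: { "User-Agent": "n8n-weather-demo" } } // placeholder identifier
  ).then((r) => r.json());
  const { lat, lon } = geo[0];

  // 2. Resolve the NWS grid point for those coordinates.
  const point = await fetch(`https://api.weather.gov/points/${lat},${lon}`)
    .then((r) => r.json());

  // 3. Fetch the forecast URL the points endpoint hands back and format the first period.
  const forecast = await fetch(point.properties.forecast).then((r) => r.json());
  const p = forecast.properties.periods[0];
  return `${p.name}: ${p.temperature}°${p.temperatureUnit}, ` +
         `wind ${p.windSpeed} ${p.windDirection} - ${p.shortForecast}`;
}
```

Note the NWS API only covers US locations; for international cities you would swap step 2-3 for another weather provider.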
by Niranjan G
## How it works
This workflow acts like your own personal AI assistant, automatically fetching and summarizing the most relevant Security, Privacy, and Compliance news from curated RSS feeds. It processes only the latest articles (past 24 hours), organizes them by category, summarizes key insights using AI, and delivers a clean HTML digest straight to your inbox, saving you time every day.

## Key Highlights
- Handles three independent tracks: Security, Privacy, and Compliance
- Processes content from customizable RSS sources (add/remove easily)
- Filters fresh articles, removes duplicates, and sorts by recency
- Uses AI to summarize and format insights in a digestible format
- Sends polished HTML digests via Gmail, one per category
- Fully modular and extensible, so you can adapt it to your needs

## Personalization
You can easily tailor the workflow:
- 🎯 **Customize feeds:** Add or remove sources in the following Code nodes: Fetch Security RSS, Fetch Privacy Feeds, and Fetch Compliance Feeds
- 🔧 **Modify logic:** Adjust filters, sorting, formatting, or even AI prompts as needed
- 🧠 **Bring your own LLM:** Works with Gemini, but easily swappable for other LLM APIs

## Setup Instructions
- Requires Gmail and LLM (e.g., Gemini) credentials
- Prebuilt with placeholders for RSS feeds and email output
- Designed to be readable, maintainable, and fully adaptable
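A minimal sketch of the freshness filter, dedupe, and sort described above, written for an n8n Code node (Run Once for All Items). The pubDate and link field names follow typical RSS Read node output and may need adjusting for your feeds:

```javascript
// Keep only articles from the past 24 hours, drop duplicate links, sort newest-first.
const cutoff = Date.now() - 24 * 60 * 60 * 1000;
const seen = new Set();

return $input.all()
  .filter((item) => new Date(item.json.pubDate).getTime() >= cutoff)
  .filter((item) => {
    if (seen.has(item.json.link)) return false; // duplicate across feeds
    seen.add(item.json.link);
    return true;
  })
  .sort((a, b) => new Date(b.json.pubDate) - new Date(a.json.pubDate));
```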
by Lucas Peyrin
## How it works
This workflow automates the process of checking for and applying updates to a self-hosted n8n instance running on Docker. It runs on a schedule, checks for new versions, summarizes the release notes with AI, and asks for your approval via Telegram before updating.

1. **Scheduled Check:** The workflow runs hourly, triggered by a Schedule node.
2. **Version Discovery:** It first confirms it's running in a Docker environment. It uses SSH to connect to the host machine and inspects the running n8n container to find its current version tag (e.g., latest or next). It then queries the Docker Hub API to compare the image digest (a unique ID for an image version) of the running version against the latest available version for that tag.
3. **Update Detection:** If the digests do not match, a new image has been pushed for your version tag (e.g., a new latest image is available), and an update is needed.
4. **AI-Powered Release Notes:** It fetches the official release notes for the new version from the GitHub API. An AI model (LLM) summarizes these technical notes into a concise, human-readable overview of the key features and fixes.
5. **Manual Approval:** It sends a message to a Telegram chat with the AI-generated summary and two buttons: "✅ Update" and "❌ Ignore". The workflow then pauses and waits for your response.
6. **Execute Update:** If you approve the update, the workflow uses SSH to run a docker compose command on your server, which pulls the new image, stops the old containers, and starts the new ones.

## Set up steps
Setup time: ~5-10 minutes

1. **SSH Credentials:** Go to Credentials and create a new SSH credential with the username, host, and password/private key for the server where your n8n Docker instance is running. Select this credential in the Get n8n Current Version and Update Docker nodes.
2. **Telegram Bot Credentials:** Create a Telegram bot and get its API token. Go to Credentials and create a new Telegram credential with your bot's token. Select this credential in the Send a text message node.
3. **AI Model Credentials:** Ensure you have credentials for an AI provider (like Google AI, OpenAI, etc.) set up. Select your desired credential in the Google Gemini Chat Model node (or replace it with your preferred LLM node).
4. **Configure Paths and Commands:** Open the Docker Path node. Set the docker_path to the absolute path of your docker-compose.yml file on the server (e.g., /root/n8n). If you use workers, adjust the worker_command to include the correct --scale argument for your setup. If not, you can leave it blank.
5. **Set Your Chat ID:** Open the Approve Update Telegram node and enter your personal Telegram Chat ID in the Chat ID field. This ensures the approval message is sent to you.
6. **Activate the workflow.** It will now check for updates every hour.

To enable fully automatic updates (without manual approval): delete the nodes from Get n8n Releases to Approved ?, and connect the Needs Update ? node directly to the Update Docker node.
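A sketch of the digest comparison in step 2-3, assuming the official n8nio/n8n image on Docker Hub. The tags endpoint is Docker Hub's public v2 repository API, but treat the exact response shape (images[].digest) as an assumption to verify; localDigest would come from the SSH docker inspect output:

```javascript
// Compare the locally running image digest against what Docker Hub serves for the tag.
async function needsUpdate(tag, localDigest) {
  const res = await fetch(
    `https://hub.docker.com/v2/repositories/n8nio/n8n/tags/${tag}`
  ).then((r) => r.json());

  // A tag can point at several architecture-specific images, so collect every digest.
  const remoteDigests = (res.images || []).map((img) => img.digest);

  // If the local digest is not among them, a newer image has been pushed for this tag.
  return !remoteDigests.includes(localDigest);
}
```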
by Falk
## How it works
1. Collects articles from your preferred RSS feeds.
2. Rates and tags each article using an AI model (e.g., QWEN 14B-s4), filtering for relevance and quality.
3. Summarizes high-rated articles with a language model (e.g., Gemma3 4B) for quick, digestible reading.
4. Checks for duplicates to avoid sending the same article twice.
5. Formats and sends the top articles as an HTML newsletter via Gmail, using OAuth2 authentication.
6. Stores records in a Postgres database, tracking which articles have been sent and their ratings.

## Requirements
- Postgres account
- AI models (if you work locally, use Ollama; in the cloud, replace the Ollama node with your preferred model node)
- RSS feeds of your choice
- Google OAuth2 credentials, if you want to use Gmail

## Recommendations
Use a local n8n instance for this workflow. More information is available here: https://github.com/falks-ai-workbench/n8n_newsletter
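A hypothetical Code-node sketch of the duplicate check in step 4: compare fresh RSS items against the URLs already recorded in Postgres. "Get Sent Articles" is a placeholder node name, and the url/link field names are assumptions to match against your own schema:

```javascript
// Build a lookup of already-sent article URLs from the Postgres node's output,
// then keep only RSS items whose link has not been sent before.
const sent = new Set(
  $("Get Sent Articles").all().map((row) => row.json.url)
);
return $input.all().filter((item) => !sent.has(item.json.link));
```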
by Jimleuk
This n8n template shows how anyone can build a simple newsletter-like subscription service where users can enrol themselves to receive messages/content on a regular basis. It uses n8n forms for data capture, Airtable as the database, AI for content generation, and Gmail for email sending.

## How it works
- An n8n form is set up to allow users to subscribe with a desired topic and the interval at which to receive messages; each submission is added to Airtable.
- A scheduled trigger executes every morning and searches for subscribers who are due messages based on their desired intervals.
- Once found, subscribers are sent to a subworkflow which performs the text content generation via an AI agent and also uses a vision model to generate an image. Both are attached to an email which is sent to the subscriber. This email also includes an unsubscribe link.
- The unsubscribe flow works similarly via an n8n form interface which, when submitted, disables further scheduled emails to the user.

## How to use
- Make a copy of the sample Airtable here: https://airtable.com/appL3dptT6ZTSzY9v/shrLukHafy5bwDRfD
- Make sure the workflow is activated and the forms are available and reachable by your audience.

## Requirements
- Airtable for the database
- OpenAI for the LLM (but compatible with others)
- Gmail for email (but can be replaced with others)

## Customising this workflow
This simple use case can be extended to deliver any type of content, such as your company newsletter, promotions, or social media posts. It doesn't have to be limited to email: try social messaging via WhatsApp, Telegram, and others.
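A minimal sketch of the "who is due today?" check the scheduled trigger feeds into, written for an n8n Code node. The intervalDays and lastSentAt field names are hypothetical Airtable fields, not the template's actual column names:

```javascript
// Keep only subscribers whose interval has elapsed since their last email.
const today = new Date();

return $input.all().filter((sub) => {
  const { intervalDays, lastSentAt } = sub.json; // hypothetical field names
  if (!lastSentAt) return true; // never emailed yet, so they are due
  const elapsedDays = (today - new Date(lastSentAt)) / (1000 * 60 * 60 * 24);
  return elapsedDays >= intervalDays;
});
```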
by Fabrizio Terzi
# AI-Driven Handbook Generator with Multi-Agent Orchestration (Pyragogy AI Village)

This n8n workflow is a modular, multi-agent AI orchestration system designed for the collaborative generation of Markdown-based handbooks. Inspired by peer learning and open publishing workflows, it simulates a content pipeline where specialized AI agents act in defined roles, enabling true AI–human co-creation and iterative refinement.

This project is a core component of Pyragogy, an open framework dedicated to ethical cognitive co-creation, peer AI–human learning, and human-in-the-loop automation for open knowledge systems. It implements the master orchestration architecture for the Pyragogy AI Village, managing a complex sequence of AI agents to process input, perform review, synthesis, and archiving, with a crucial human oversight step for final approval.

## How It Works: A Deep Dive into the Workflow's Architecture
The workflow orchestrates a sophisticated content generation and review process, ideal for creating AI-driven knowledge bases or handbooks with human oversight.

- **Webhook Trigger & Input:** The process begins when the workflow receives a JSON input via a **Webhook** (specifically at /webhook/pyragogy/process). This input typically includes details like the handbook's title, initial text, and relevant tags.
- **Database Verification:** It first verifies the connection to a **PostgreSQL database** to ensure data persistence.
- **Meta-Orchestrator:** A powerful **Meta-Orchestrator** (powered by gpt-4o from OpenAI) analyzes the initial request. Its role is to dynamically determine and activate the optimal sequence of specialized AI agents required to fulfill the input, ensuring tasks are dynamically routed and assigned based on each agent's responsibility.
- **Agent Execution & Iteration:** Each activated agent executes its step using OpenAI or custom endpoints. This involves:
  - **Content Generation:** Agents like the Summarizer and the Synthesizer generate new content or refine existing text.
  - **Peer Review Board:** A crucial aspect is the Peer Review Board, comprised of AI agents like the Peer Reviewer, the Sensemaking Agent, and the Prompt Engineer. This board evaluates the output for quality, coherence, and accuracy.
  - **Reprocessing & Redrafting:** If the review agents flag a major_issue, they trigger redrafting loops by generating specific feedback for the Synthesizer. This mechanism ensures iterative refinement until the content meets the required standards. (A sketch of this routing decision appears at the end of this section.)
- **Human-in-the-Loop (HITL) Review:** For final approval, particularly for the Archivist agent's output, a **human review process** is initiated. An email is sent to a human reviewer, prompting them to approve, reject, or comment via a "Wait for Webhook" node. This ensures **human oversight** and quality control.
- **Content Persistence & Versioning:** If the content is approved by the human reviewer:
  - It's saved to a PostgreSQL database (specifically to the handbook_entries and agent_contributions tables).
  - Optionally, the content can be committed to a GitHub repository for version control, provided the necessary environment variables are configured.
- **Notifications:** The final output and the sequence of executed agents can be sent as a notification to **Slack**, if configured.

Observe the dynamic loop: orchestrate → assign → generate → review (AI/human) → store.

## Included AI Agents
This workflow leverages a suite of specialized AI agents, each with a distinct role in the content pipeline:

- **Meta-Orchestrator:** Determines the optimal sequence of agents to execute based on the input.
- **Summarizer Agent:** Summarizes text into key points (e.g., 3 key points).
- **Synthesizer Agent:** Synthesizes new text and effectively incorporates reprocessing feedback from review agents.
- **Peer Reviewer Agent:** Reviews generated text, highlighting strengths, weaknesses, and suggestions, and sets major_issue flags.
- **Sensemaking Agent:** Analyzes input within existing context, identifying patterns, gaps, and areas for improvement.
- **Prompt Engineer Agent:** Refines or generates prompts for subsequent agents, optimizing their output.
- **Onboarding/Explainer Agent:** Provides explanations of the process or offers guidance to users.
- **Archivist Agent:** Prepares content for the handbook, manages the human review process, and handles archiving to the database and GitHub.

## Setup Steps & Prerequisites
To get this powerful workflow up and running, follow these steps:

1. **Import the Workflow:** Import the pyragogy_master_workflow.json (or generate-collaborative-handbooks-with-gpt4o-multi-agent-orchestration-human-review.json) into your n8n instance.
2. **Connect Credentials:**
   - **Postgres:** Set up a Postgres Pyragogy DB credential (ID: pyragogy-postgres).
   - **OpenAI:** Configure an OpenAI Pyragogy credential (ID: pyragogy-openai) for all OpenAI agents. GPT-4o is highly suggested for optimal performance.
   - **Email Send:** Set up a configured email credential (e.g., for sending human review requests).
3. **Define Environment Variables:** Define essential environment variables (an .env.template is included in the repository). These include:
   - API base for OpenAI.
   - Database connection details.
   - (Optional) **GitHub:** For content persistence and versioning, configure GITHUB_ACCESS_TOKEN, GITHUB_REPOSITORY_OWNER, and GITHUB_REPOSITORY_NAME.
   - (Optional) **Slack:** For notifications, configure SLACK_WEBHOOK_URL.
4. Send a sample payload to your webhook URL (/webhook/pyragogy/process):

```json
{
  "title": "History of Peer Learning",
  "text": "Peer learning is an educational approach where students learn from and with each other...",
  "tags": ["education", "pedagogy"],
  "requireHitl": true
}
```

## Ideal For
This workflow is perfectly suited for:
- Educators and researchers exploring AI-assisted publishing and co-authoring with AI.
- Knowledge teams looking to automate content pipelines for internal or external documentation.
- Anyone building collaborative Markdown-driven tools or AI-powered knowledge bases.

## Documentation & Contributions: An Open Source and Collaborative Project
This workflow is an open-source project and community-driven. Its development is transparent and open to everyone. We warmly invite you to:

- **Review it:** Contribute your analysis, identify potential improvements, or report issues.
- **Remix it:** Adapt it to your specific needs, integrate new features, or modify it for a different use case.
- **Improve it:** Propose and implement changes that enhance its efficiency, robustness, or capabilities.
- **Share it back:** Return your contributions to the community, either through pull requests or by sharing your implementations.

Every contribution is welcome and valued! All relevant information for verification, improvement, and collaboration can be found in the official repository:
🔗 GitHub – pyragogy-handbook-n8n-workflow
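As referenced in the Reprocessing & Redrafting step above, here is a hypothetical Code-node sketch of the redraft routing decision. The exact JSON keys (major_issue, suggestions, draft) mirror the review structure described in this template but are assumptions, not the workflow's literal field names:

```javascript
// Parse the Peer Reviewer's verdict and decide whether to loop back to the Synthesizer.
const review = JSON.parse($input.first().json.reviewOutput);

if (review.major_issue) {
  // Redraft loop: hand the reviewer's feedback back to the Synthesizer.
  return [{ json: { route: "synthesizer", feedback: review.suggestions } }];
}

// Otherwise proceed toward the Archivist and the human-in-the-loop review step.
return [{ json: { route: "archivist", content: $input.first().json.draft } }];
```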
by Romain Jouhannet
This workflow imports Productboard data into Snowflake, automating data extraction, mapping, and updates for features, companies, and notes. It supports scheduled weekly updates, data cleansing, and Slack notifications summarizing the latest insights.

## Features
- Fetches data from Productboard (features, companies, notes).
- Maps and processes data for Snowflake tables.
- Automates table creation, truncation, and updates.
- Summarizes new and unprocessed notes.
- Sends weekly Slack notifications with key insights.

## Setup
1. Configure Productboard and Snowflake credentials in n8n.
2. Update the Snowflake table schemas to match your setup.
3. Replace the Slack channel ID and dashboard URL in the notification node.
4. Activate the workflow and set the desired schedule.
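To illustrate the mapping step, here is a speculative Code-node sketch that flattens a Productboard note into a row shape for a Snowflake table. Field names on both sides (id, title, company, state, and the NOTES columns) are assumptions to align with your actual API response and schema:

```javascript
// Map each Productboard note item onto column names matching a Snowflake NOTES table.
return $input.all().map((note) => ({
  json: {
    NOTE_ID: note.json.id,
    TITLE: note.json.title,
    COMPANY_ID: note.json.company?.id ?? null, // notes may lack a linked company
    CREATED_AT: note.json.created_at,
    PROCESSED: note.json.state === "processed",
  },
}));
```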
by Francis Njenga
# Workflow Documentation: HR Job Posting and Evaluation with AI

## Detailed Description
The HR Job Posting and Evaluation with AI workflow is designed to streamline and enhance recruitment for technical roles, such as Automation Specialists. By automating key stages in the hiring process, this workflow ensures a seamless experience for both candidates and HR teams. From collecting applications to evaluating candidates using AI and scheduling interviews, this workflow provides an end-to-end solution for recruitment challenges.

## Who is this for?
This workflow is ideal for:
- **HR Professionals:** Managing multiple job postings and candidates efficiently.
- **Recruitment Teams:** Handling large volumes of applications for technical positions.
- **Hiring Managers:** Ensuring structured and objective candidate evaluations.

## What problem does this workflow solve?
- **Time-Consuming Processes:** Automates repetitive tasks like data entry, CV management, and scheduling.
- **Fair Candidate Evaluation:** Leverages AI to provide objective insights based on resumes and job descriptions.
- **Streamlined Communication:** Ensures timely and personalized candidate interactions, improving their experience.

## What this workflow does
This workflow automates the following steps:
1. **Form Submission:** Collects candidate information via a structured application form.
2. **Data Storage:** Stores applicant details in Airtable for centralized tracking.
3. **CV Management:** Automatically uploads resumes to Google Drive for easy access and organization.
4. **AI-Powered Candidate Evaluation:** Scores candidates based on their resumes and job descriptions using OpenAI, providing actionable insights.
5. **Interview Scheduling:** Automates scheduling based on candidate and interviewer availability.
6. **Communication:** Sends customized emails to candidates for interview invitations and feedback.

## Setup

### Prerequisites
To use this workflow, you'll need:
- **n8n Account:** To create and run the workflow.
- **Airtable Account:** For managing applicant data.
- **Google Drive Account:** For storing candidate CVs.
- **OpenAI API Key:** For AI-powered candidate scoring.
- **SMTP Email Account:** For sending candidate communications.

### Setup Process
1. **Airtable Configuration:** Create a base in Airtable with tables for Applicants and Job Positions.
2. **Google Drive Setup:** Create a folder for CV storage and ensure you have write permissions.
3. **Integrate Airtable in n8n:** Use the Airtable API key to connect Airtable to n8n.
4. **Integrate Google Drive in n8n:** Authorize Google Drive to enable CV storage automation.
5. **OpenAI Integration:** Add your OpenAI API key to n8n for candidate scoring.
6. **Email Configuration:** Set up your SMTP email account in n8n for sending notifications and invitations.

## How to customize this workflow
Tailor the workflow to fit your unique recruitment needs:
- **Edit Job Descriptions:** Adjust the form parameters to match the specific role and qualifications.
- **Refine AI Evaluation Criteria:** Modify OpenAI prompts to reflect the skills and competencies for the desired position.
- **Personalize Email Templates:** Update email content to match your organization's tone and branding.
- **Add New Features:** Incorporate additional steps like feedback collection or integration with other HR tools.

## Conclusion
The HR Job Posting and Evaluation with AI workflow simplifies and automates the recruitment process, enabling HR teams to focus on engaging with candidates rather than handling administrative tasks. With its powerful integrations and customization options, this workflow helps organizations hire efficiently while improving the candidate experience.
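One practical detail worth noting for the AI evaluation step: LLM output is text, so it pays to request strict JSON and guard against malformed replies before writing scores back to Airtable. A hypothetical Code-node sketch, where the field path (message.content) and evaluation keys (score, strengths, concerns) are assumptions about your OpenAI node's output:

```javascript
// Defensively parse the AI's JSON evaluation; fall back to a flagged record on failure.
const raw = $input.first().json.message?.content ?? "{}";

let evaluation;
try {
  evaluation = JSON.parse(raw);
} catch {
  evaluation = { score: null, strengths: [], concerns: ["Could not parse AI output"] };
}

// Merge the parsed evaluation onto the candidate record for the Airtable update node.
return [{ json: { ...$input.first().json, ...evaluation } }];
```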
by Geekaz / Kazen
## Who is this for?
This template is designed for social media managers, content creators, data analysts, and anyone who wants to automatically save and analyze their Meta Threads posts in Notion. It's particularly useful for:
- Building a personal archive of your Threads content.
- Training AI models using your social media data.
- Tracking your online presence and engagement.

## What this workflow does
This workflow uses the Meta Threads API to automatically retrieve your posts and import them into a Notion database. It retrieves the post content, date, and time, and stores them in designated properties within your Notion database.

## Setup
1. **Get Threads Access Token and ID:** Obtain a long-lived access token and your Threads ID from the Meta Threads developer platform. This token auto-refreshes, ensuring uninterrupted workflow operation.
2. **Configure Credentials and Date Range:** In the "Set Credentials" node (using edit fields), enter your token and ID. Set the since and until parameters in the "Set Date Range" node to specify the post import period. (A sketch of this step appears below.)
3. **Connect to Notion and Create a Database:** Connect to your Notion workspace and create a database with these properties (customize with the "Create Properties" node):
   a. **Title:** Threads post URL (Notion entry title).
   b. **Threads ID:** Unique post ID (prevents duplicate imports).
   c. **Username:** Post author (for future multi-account/source management).
   d. **Post Date:** Original post date.
   e. **Source (Multi-Select):** "Threads" tag (for future multi-platform import and filtering).
   f. **Created:** Import date and time.
   g. **Import Check (Optional):** For use with a separate post-categorization workflow.
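As referenced in step 2, here is a sketch of the date-range setup plus the Threads fetch it feeds. The graph.threads.net endpoint and field list follow Meta's Threads API documentation, but verify the exact fields against your app's permissions; the two placeholder constants are yours to supply:

```javascript
// Placeholders from the "Set Credentials" step.
const THREADS_USER_ID = "<your-threads-user-id>";
const ACCESS_TOKEN = "<your-long-lived-token>";

// "Set Date Range": Unix timestamps for the past 7 days (adjust to taste).
const until = Math.floor(Date.now() / 1000);
const since = until - 7 * 24 * 60 * 60;

// Fetch the posts in range, with the fields the Notion properties above consume.
const url =
  `https://graph.threads.net/v1.0/${THREADS_USER_ID}/threads` +
  `?fields=id,text,permalink,username,timestamp` +
  `&since=${since}&until=${until}&access_token=${ACCESS_TOKEN}`;

const posts = await fetch(url).then((r) => r.json());
return posts.data.map((p) => ({ json: p }));
```

The id field maps to the Threads ID property, which is what lets the workflow skip posts already imported into Notion.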
by Matthieu
# LinkedIn Profile Tracker Automation

## Who is this for?
This template is ideal for sales teams, recruiters, business development professionals, and relationship managers who need to monitor changes in their network's LinkedIn profiles. Perfect for agencies tracking client personnel changes, HR teams monitoring talent movements, sales professionals staying updated on prospect job changes, and content teams tracking influencer activity.

## What problem does this workflow solve?
Manually checking LinkedIn profiles for updates like job changes, status modifications, profile edits, or latest posts is extremely time-consuming, and changes are easy to miss. This automation eliminates the need for constant manual monitoring while ensuring you never miss important changes that could signal new business opportunities, relationship updates, or content engagement opportunities.

## What this workflow does
This workflow automatically monitors a list of LinkedIn profiles on a weekly schedule and detects changes in:
- **Personal information** (name, headline, summary)
- **Job status** (hiring/open-to-work flags)
- **Latest work experience** (new positions, company changes)
- **Recent posts** (latest content activity)

When changes are detected, it immediately sends Slack notifications with before/after comparisons and updates your tracking database to maintain historical records of all profile evolution.

## Setup
1. Create a Ghost Genius API account and get your API key for LinkedIn profile scraping.
2. Configure the HTTP Request nodes with Header Auth credentials using your Ghost Genius API key.
3. Set up your Google Sheets database with these columns: Firstname, Lastname, LinkedIn URL, ID, Tagline, Summary, Latest experience, Open to work?, Hiring?, Latest post.
4. Configure the Slack webhook integration for real-time notifications.
5. Set up credentials for Google Sheets and Slack following the n8n documentation.
6. Add LinkedIn profile URLs to your Google Sheet to start monitoring.

## How to customize this workflow
- **Modify the schedule trigger** to check profiles daily, bi-weekly, or monthly based on your monitoring needs.
- **Customize Slack notification messages** to include additional context, mentions, or custom formatting.
- **Add email notifications** alongside Slack alerts for critical changes like job transitions.
- **Set up filtered notifications** to only alert on specific types of changes (e.g., job changes only, posts from key influencers).
- **Add post content analysis** to detect mentions of your company or competitors.
- **Integrate with CRM systems** to automatically update lead records when profile changes occur.
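An illustrative Code-node sketch of the change detection at the heart of this workflow: compare the freshly scraped profile against the row stored in Google Sheets and collect human-readable diffs for the Slack message. It assumes a prior Merge node paired each sheet row (stored) with the fresh scrape (scraped); the tracked field names mirror the sheet columns above but remain assumptions:

```javascript
// Diff the stored sheet row against the newly scraped profile, field by field.
const tracked = ["Tagline", "Summary", "Latest experience", "Open to work?", "Hiring?", "Latest post"];
const { stored, scraped } = $input.first().json;
const changes = [];

for (const field of tracked) {
  if (stored[field] !== scraped[field]) {
    // Slack-friendly before/after line for this field.
    changes.push(`*${field}* changed:\n  was: ${stored[field]}\n  now: ${scraped[field]}`);
  }
}

return [{ json: { hasChanges: changes.length > 0, message: changes.join("\n") } }];
```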
by Mohan Gopal
# 🎥 AI Tour Video Generator with GPT-4o, RunwayML & ElevenLabs for Social Media

This n8n workflow generates 20-second faceless videos for social media by combining AI-generated images, audio, and video clips for a given tour destination. The output is a ready-to-publish video file, which can be pushed to social platforms and logged in a tracking sheet.

## ⚙️ Workflow Overview
This system is divided into 4 main sections:
1. 🧠 Generate Image Prompts
2. 🎨 Generate Media (Images, Videos, Audio)
3. 🛠️ Render & Upload
4. 📈 Future Enhancements

## 🔌 Integration Setup Table

| Integration | Service Used | Setup Instruction |
|----------------|----------------------------|---------------------------------------------------------|
| OpenAI | GPT-4o (Prompt Generation) | Get API key and configure in n8n |
| Google Sheet | Idea I/O tracking | Connect Google account with OAuth/credentials in n8n |
| Piapia API | AI Image Generation | Sign up at piapia.ai and get API key |
| Runway API | AI Video Generation | Register at runwayml.com for access |
| ElevenLabs | AI Voice Generation | Sign up at elevenlabs.io for API key |
| CreateMate API | Render Final Video | Visit createmate.ai to access API |
| Google Drive | Upload/Share Final Video | Use the n8n Google Drive node to configure credentials |

## ✅ Required Services & Tools
Ensure you have active accounts with the following tools and services:
- ✅ OpenAI (GPT-4o + Embeddings)
- ✅ Google Sheets (for destination ideas and tracking)
- ✅ Piapia API (image generation)
- ✅ RunwayML API (video generation)
- ✅ ElevenLabs API (voiceover TTS)
- ✅ Google Drive (storage & sharing)
- ✅ CreateMate (video rendering)
- ✅ Social media scheduler (optional: Zapier, Buffer, Make.com)

## 🧠 1. Generate Image Prompts
> Purpose: Prepares the content idea and generates visual prompts.

| Step | Node Name | Function |
|----------------|--------------------|------------------------------------------------|
| 🔁 Trigger | Schedule or Manual | Starts the workflow |
| 📥 Grab Idea | Read Sheet | Pulls destination idea from Google Sheet |
| ✍️ Set Content | Manual Input | Adds structure/narrative to the idea |
| 🔀 Split | Split Out | Breaks input into chunks |
| 🤖 GPT Agent | Image Prompt Agent | Uses GPT-4o to generate creative image prompts |
| 🧹 Clean | Remove \n | Cleans up formatting (see the sketch at the end of this description) |
| 📌 Save | Set Prompts | Finalizes prompts for next stage |

## 🖼️ 2. Generate Media

### 🎨 Generate Images

| Step | Function |
|----------------|--------------------------------------------|
| Generate Image | Calls Piapia API with AI-generated prompts |
| Wait | Adds delay for rendering (90 sec) |
| Get Images | Retrieves final images for video |

### 🎥 Generate Videos

| Step | Function |
|----------------|----------------------------------------------------------|
| Generate Video | Calls RunwayML to generate video clips from the prompts |
| Wait | 2-minute delay to allow video generation |
| Get Videos | Fetches completed video clips |

### 🔊 Generate Audio

| Step | Function |
|-----------------|------------------------------------------|
| Update Status | Logs progress in Google Sheet |
| Sound Agent | Gemini or GPT generates narration text |
| Set Audio | Formats narration for voice synthesis |
| Generate Audio | Uses ElevenLabs for realistic voiceover |
| Upload to Drive | Saves final audio to Google Drive |
| Share File | Creates shareable URL for audio file |

## 🛠️ 3. Render & Upload
> Purpose: Combines all elements (image, video, audio) into a single output and prepares for social media.
| Step | Function |
|-----------------|-----------------------------------------------------------------|
| Merge | Combines images, videos, and audio |
| Split Out Parts | Breaks content for rendering |
| Render Video | Uses CreateMate to render the final 20-second video |
| Wait | Short delay to complete rendering |
| Download Video | Saves output video locally or on Drive |
| Update Sheet | Logs final video URL/status in Google Sheet |
| Social Upload | (Coming soon) Post to Instagram, YouTube Shorts, TikTok, etc. |

## 🧩 Pre-Conditions
Before running the workflow:
- ✅ A Google Sheet should be created with destination ideas
- ✅ All API keys must be configured in n8n
- ✅ A Google Drive folder must exist for output videos
- ✅ Sufficient credit/quota must be available on the AI platforms
- ✅ Internet access must be stable for external API calls

## 🚀 Outcome
- A polished 20-second travel destination video
- Combines AI visuals, short clips, and AI narration
- Ready for instant social media upload
- **Fully automated** from idea to video file

## 🧠 Tech Stack Summary

| Component | Tools Used |
|-----------------|----------------------------------|
| Language Model | GPT-4o (OpenAI), Gemini (Google) |
| Image Generator | Piapia API |
| Video Generator | RunwayML |
| Audio Generator | ElevenLabs |
| Storage | Google Drive |
| Video Composer | CreateMate API |
| Orchestration | n8n |

## 📈 Future Enhancements

### ✅ Smart Enhancements
- Dynamic hashtags & captions via AI
- Auto-post to TikTok, Instagram, YouTube via Buffer/Zapier
- Scene detection + matching B-roll
- Multilingual narration (e.g., Arabic, French, Malay)
- A/B testing of video versions to analyze performance

### 🧪 Testing Add-ons
- Add a preview screen before upload
- Error tracking & retry flow
- Manual override before publishing

## 🧰 Customization Guide

| Element | How to Customize |
|----------------------|----------------------------------------------------------------------|
| ✏️ Prompt Format | Change the structure inside Set Content or Prompt Agent |
| 🌍 Destination Ideas | Modify the Google Sheet for different destinations/categories |
| 🎨 Image Style | Customize the prompt to Piapia (e.g., "in Pixar style", "3D render") |
| 🎙️ Voiceover Script | Adjust tone/structure in the Sound Agent |
| 📆 Posting Schedule | Use Zapier/Buffer for timed posting |
| 🎯 Target Duration | Adjust the number of clips or frame duration |

## 🙌 Community Value
This workflow is ideal for:
- 📸 Travel content creators
- 🌍 Destination marketers
- 🏛️ Tourism boards
- 🧳 Travel SMEs looking for automation

Feel free to fork, remix, or request a JSON export in the comments below!
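As referenced in the Generate Image Prompts table, here is a minimal sketch of the "Clean / Remove \n" step as an n8n Code node: it strips newlines and surplus whitespace from the GPT-generated prompts before they reach the image API. The prompt field name is an assumption to match your Set node:

```javascript
// Collapse newlines and trim whitespace in each generated image prompt.
return $input.all().map((item) => ({
  json: {
    ...item.json,
    prompt: String(item.json.prompt ?? "").replace(/\s*\n\s*/g, " ").trim(),
  },
}));
```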