by David Ashby
Complete MCP server exposing 9 NPR Listening Service API operations to AI agents.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

Import this workflow into your n8n instance
Add NPR Listening Service credentials
Activate the workflow to start your MCP server
Copy the webhook URL from the MCP trigger node
Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the NPR Listening Service API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://listening.api.npr.org
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (9 total)
🔧 V2 (9 endpoints)
• GET /v2/aggregation/{aggId}/recommendations: Get a set of recommendations for an aggregation independent of the user's lis...
• GET /v2/channels: List Available Channels
• GET /v2/history: Get User Ratings History
• GET /v2/organizations/{orgId}/categories/{category}/recommendations: Get a list of recommendations from a category of content from an organization
• GET /v2/organizations/{orgId}/recommendations: Get a variety of details about an organization including various lists of rec...
• GET /v2/promo/recommendations: Get Recent Promo Audio
• POST /v2/ratings: Submit Media Ratings
• GET /v2/recommendations: Get User Recommendations
• GET /v2/search/recommendations: Get Search Recommendations

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native NPR Listening Service API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
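As a sketch of what the HTTP Request nodes do under the hood, here is how a request URL for one of the listed operations could be assembled. In the workflow the parameter values come from `$fromAI()` placeholders; the `searchTerms` query parameter used below is an assumption for illustration, not taken from the workflow.

```python
from urllib.parse import urlencode

# API base used by the workflow's HTTP Request nodes
BASE_URL = "https://listening.api.npr.org"

def build_request_url(endpoint: str, **query) -> str:
    """Assemble a full request URL for an NPR Listening Service endpoint.

    In n8n these values are filled in by $fromAI() expressions; here they
    are plain keyword arguments.
    """
    url = f"{BASE_URL}{endpoint}"
    if query:
        url += "?" + urlencode(query)
    return url

# e.g. the "Get Search Recommendations" operation with an assumed query parameter
print(build_request_url("/v2/search/recommendations", searchTerms="politics"))
```

The MCP trigger routes an agent's tool call to the matching HTTP Request node, which performs a request against a URL shaped like the one above.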
by Pixril
Who is this for?
This workflow is designed for social media managers, marketing agencies, and business owners who want to automate their Facebook and Instagram posting without losing quality control. It is perfect if you manage content in Google Sheets and want AI to write your captions, but you still need a human to review the final post before it goes live to your audience.

What this workflow does
This workflow acts as a complete social media auto-poster. It reads your product inventory from Google Sheets, finds items that haven't been promoted recently, and uses OpenAI to write a professional, engaging caption. Before publishing, the workflow pauses and sends you a simple web form. Once you click "Approve," it automatically publishes the post to Facebook and Instagram, then logs the successful post back in your Google Sheet.

Key Features
**Smart Google Sheets Tracking:** Uses your spreadsheet as an inventory tracker to ensure you never over-promote the same product twice.
**AI Copywriting:** Automatically drafts platform-optimized, emoji-free captions based on your product data.
**Approval Gate:** Pauses the automation to let you manually review and approve the AI-generated content via an n8n web form.
**Multi-Platform Auto-Posting:** Connects directly to the Facebook Graph API to publish to your Facebook Page and Instagram Business Account.

How it works
Trigger: Open the n8n form and choose to promote a specific product or a general agency pitch.
Selection: The workflow scans your Google Sheet and picks the item with the lowest "Times Posted" count.
Generation: OpenAI writes a tailored caption for the selected item.
Review: You receive an interactive prompt to review the text and image URL.
Publishing: Upon approval, it posts instantly to Meta (Facebook/Instagram) and increments the post count in your Google Sheet.

Set up steps
Estimated time: 10 minutes
Google Sheets: Connect your Google credential to the Sheets nodes and add your Spreadsheet ID.
OpenAI Key: Add your API key to the "AI Copywriter" node.
Meta API: Connect your Facebook Graph API credentials to the FB/IG nodes and ensure your Page ID is correct.
Run: Click "Test URL" on the Pixril Dispatcher form to trigger your first post!

About the Creator
Built by Pixril. We specialize in building advanced, production-ready AI workflows and automation templates for n8n. Find more professional workflows in our shop: https://pixril.etsy.com
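For orientation, the Graph API call behind the Facebook publishing step can be sketched roughly as follows. The endpoint version and field names below follow the public Graph API photos edge and are assumptions for illustration, not values taken from this workflow; verify them against your Graph API version.

```python
import json

def build_photo_post(page_id: str, image_url: str, caption: str, access_token: str):
    """Sketch of the request a Facebook Page photo post typically needs.

    Returns (endpoint, payload); the n8n node sends this as a POST request.
    All values here are illustrative placeholders.
    """
    endpoint = f"https://graph.facebook.com/v19.0/{page_id}/photos"
    payload = {
        "url": image_url,            # publicly reachable image URL
        "caption": caption,          # the approved AI-generated caption
        "access_token": access_token,
    }
    return endpoint, payload

endpoint, payload = build_photo_post(
    "1234567890", "https://example.com/product.jpg", "New arrival!", "PAGE_TOKEN"
)
print(endpoint)
print(json.dumps(payload, indent=2))
```

The Instagram publishing leg works differently (a two-step container/publish flow), which is why the workflow uses separate FB and IG nodes.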
by Matthew
Automated Cold Email Personalization

This workflow automates the creation of highly personalized cold outreach emails by extracting lead data, scraping company websites, and leveraging AI to craft unique email components. This is ideal for sales teams, marketers, and business development professionals looking to scale their outreach efforts while maintaining a high degree of personalization.

How It Works
Generate Batches: The workflow starts by generating a sequence of numbers, defining how many leads to process in batches.
Scrape Lead Data: It uses an external API (Apify) to pull comprehensive lead information, including contact details, company data, and social media links.
Fetch Client Data: The workflow then retrieves relevant client details from your Google Sheet based on the scraped data.
Scrape Company Website: The lead's company website is automatically scraped to gather content for personalization.
Summarize Prospect Data: An OpenAI model analyzes both the scraped website content and the individual's profile data to create concise summaries and identify unique angles for outreach.
Craft Personalized Email: A more advanced OpenAI model uses these summaries and specific instructions to generate the "icebreaker," "intro," and "value proposition" components of a personalized cold email.
Update Google Sheet: Finally, these generated email components are saved back into your Google Sheet, enriching your lead records for future outreach.
Google Sheet Structure
Your Google Sheet must have the following exact column headers to ensure proper data flow:
**Email** (unique identifier for each lead)
**Full Name**
**Headline**
**LinkdIn**
**cityName**
**stateName**
**company/cityName**
**Country**
**Company Name**
**Website**
**company/businessIndustry**
**Keywords**
**icebreaker** (will be populated by the workflow)
**intro** (will be populated by the workflow)
**value_prop** (will be populated by the workflow)

Setup Instructions
Add Credentials: In n8n, add your OpenAI API key via the Credentials menu. Connect your Google account via the Credentials menu for Google Sheets access. You will also need an Apify API key for the Scraper node.
Configure Google Sheets Nodes: Select the Client data and Add email data to sheet nodes. For each, choose your Google Sheets credential, select your spreadsheet, and the specific sheet name. Ensure all column mappings are correct according to the "Google Sheet Structure" section above.
Configure Apify Scraper Node: Select the Scraper node. Update the Authorization header with your Apify API token (Bearer KEY). In the JSON Body, set the searchUrl to your Apollo link (or equivalent source URL for lead data).
Configure OpenAI Nodes: Select both Summarising prospect data and Creating detailed email nodes. Choose your OpenAI credential from the dropdown. In the Creating detailed email node's prompt, replace PUT YOUR COMPANY INFO HERE with your company's context and verify the target sector for the email generation.
Verify Update Node: On the final Add email data to sheet node, ensure the Operation is set to Append Or Update and the Matching Columns field is set to Email.

Customization Options
💡 **Trigger:** Change the When clicking 'Execute workflow' node to an automatic trigger, such as a **Cron node** for daily runs, or a Google Sheets trigger when new rows are added.
**Lead Generation:** Modify the **Code node** to change the number of leads processed per run (currently set to 50).
**Scraping Logic:** Adjust the Scraper node's parameters (e.g., count) or replace the Apify integration with another data source if needed.
**AI Prompting:** Experiment with the prompts in the Summarising prospect data and Creating detailed email OpenAI nodes to refine the tone, style, length, or content focus of the generated summaries and emails.
**AI Models:** Test different OpenAI models (e.g., gpt-3.5-turbo, gpt-4o) in the OpenAI nodes to find the optimal balance between cost, speed, and output quality.
**Data Source/CRM:** Replace the Google Sheets nodes with integrations for your preferred CRM (e.g., HubSpot, Salesforce) or a database (e.g., PostgreSQL, Airtable) to manage your leads.
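The "Generate Batches" step above can be sketched as a small function that emits batch boundaries, mirroring the role of the Code node that controls how many leads are processed per run (50 in the template). The batch size used here is illustrative; the actual node may emit items rather than index pairs.

```python
def generate_batches(total_leads: int = 50, batch_size: int = 10):
    """Yield (start, end) index pairs covering `total_leads` leads.

    Mirrors the Code node's role in the workflow; numbers are illustrative.
    """
    for start in range(0, total_leads, batch_size):
        yield start, min(start + batch_size, total_leads)

print(list(generate_batches(50, 20)))  # [(0, 20), (20, 40), (40, 50)]
```

Changing `total_leads` here corresponds to the "Lead Generation" customization option described above.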
by Madame AI
Automate product creation from links to WordPress & WooCommerce using Telegram & BrowserAct This workflow is a powerful e-commerce assistant that takes a raw product link (from any online store) and automatically creates optimized listings for your own platforms. It uses AI to write persuasive sales copy and blog articles, generates SEO-friendly image metadata, and syncs everything directly to WooCommerce and WordPress. Target Audience Dropshippers, affiliate marketers, and e-commerce store owners who want to scale their product catalog and content marketing effortlessly. How it works Analyze Intent: The workflow receives a message via Telegram. An AI Agent classifies it to see if it's a product link or a casual chat. Scrape Details: If a link is detected, BrowserAct executes a background task to scrape the product's title, price, description, images, and reviews. AI Strategy: A "Senior Copywriter" AI (using OpenAI/Gemini) processes the raw data. It writes a high-converting WooCommerce description (with HTML formatting), drafts an engaging blog post review, and generates SEO filenames/alt-text for all images. Sync to Store: WooCommerce: Creates the product, sets the price, and updates the description. WordPress: Publishes the AI-written blog article reviewing the product. Optimize Images: The workflow loops through every product image, updating them in WooCommerce with the new SEO-friendly metadata. How to set up Configure Credentials: Connect your Telegram, WooCommerce, WordPress, BrowserAct, Google Gemini, and OpenRouter accounts in n8n. Prepare BrowserAct: Ensure the WordPress & WooCommerce Product Management template is saved in your BrowserAct account. Configure Telegram: Create a bot via BotFather and add the API token to your Telegram credentials. Activate: Turn on the workflow. Test: Send a product link (e.g., from AliExpress or Amazon) to your bot to see it magically appear in your store and blog. 
Requirements
**BrowserAct** account with the **WordPress & WooCommerce Product Management** template.
**WooCommerce** account (Consumer Key/Secret).
**WordPress** account (Application Password).
**Telegram** account (Bot Token).
**Google Gemini** & **OpenRouter** accounts.

How to customize the workflow
Change Blog Tone: Modify the system prompt in the Generate response agent to change the writing style (e.g., "Professional Reviewer" vs. "Hypebeast").
Add Social Media: Add a Twitter or LinkedIn node at the end to automatically tweet the new blog post link.
Price Markup: Add a Code node before the WooCommerce step to automatically increase the scraped price by a percentage (margin).

Need Help?
How to Find Your BrowserAct API Key & Workflow ID
How to Connect n8n to BrowserAct
How to Use & Customize BrowserAct Templates

Workflow Guidance and Showcase Video
Automate WooCommerce: Auto-Import Products & Write Blog Posts with n8n 🚀
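The "Price Markup" customization suggested above amounts to one line of logic in a Code node. A minimal sketch (the 30% default is purely illustrative):

```python
def apply_markup(scraped_price: float, markup_pct: float = 30.0) -> float:
    """Return the sale price after adding a percentage margin.

    Equivalent logic to the suggested Code node placed before the
    WooCommerce step; the default markup is an example value.
    """
    return round(scraped_price * (1 + markup_pct / 100), 2)

print(apply_markup(19.99, 30))  # 25.99
```

In n8n the same expression would run over each item's scraped price field before the product-creation request is sent.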
by Jitesh Dugar
Turn your blog into a self-driving social media machine. This workflow monitors your RSS feed, extracts new content, and uses AI to craft platform-perfect posts for LinkedIn and Twitter/X, complete with hosted images.

🎯 What This Workflow Does
This workflow automates the transition from "published on blog" to "live on social" in three primary stages:

🔁 Step 1 — RSS Trigger & Filter
**RSS Feed Trigger:** Polls your blog every 15 minutes to detect new articles
**Validation:** Ensures each item has a title and cover image before proceeding

☁️ Step 2 — Media Hosting Bridge
**Fetch Binary:** Downloads the blog's cover image
**UploadToURL:** Uploads the image to a public CDN and returns a stable URL for social platforms

🤖 Step 3 — AI Multi-Platform Posting
**OpenAI Captions:** Generates: LinkedIn → professional, long-form post; Twitter/X → short, punchy tweet with hashtags
**Parallel Publishing:** Posts simultaneously to both platforms
**Success Logging:** Tracks hosted image URLs and caption details

✨ Key Features
**Built-in Deduplication:** RSS ensures no duplicate posts
**Clean Data Processing:** Strips HTML for better AI output
**Fallback Logic:** Uses default caption if AI fails
**Reliable Media Hosting:** UploadToURL ensures public image access

🔧 Setup Requirements
Required Credentials
**OpenAI:** API key
**LinkedIn:** OAuth2 credentials
**Twitter/X:** OAuth1 credentials
**UploadToURL:** API key

Configuration
Add your blog's RSS Feed URL
Adjust polling interval (default: 15 minutes)

Ready to boost your blog's reach? Import this template and automate your social presence instantly!
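The "Clean Data Processing" feature (stripping HTML so the AI sees plain text) can be sketched with the standard-library HTML parser. This is one possible implementation of the idea, not the exact code inside the workflow's node:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the text content of an HTML fragment."""
    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def strip_html(html: str) -> str:
    """Return the visible text of an HTML fragment, whitespace-normalized."""
    extractor = TextExtractor()
    extractor.feed(html)
    return " ".join(p.strip() for p in extractor.parts if p.strip())

print(strip_html("<h1>My Post</h1><p>Hello <b>world</b>!</p>"))
```

Feeding the caption model cleaned text like this avoids tags leaking into the generated LinkedIn and Twitter/X copy.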
by Khairul Muhtadin
Turn unstructured pitch decks and investment memos into polished Due Diligence PDF reports automatically. This n8n workflow handles everything from document ingestion to final delivery, combining internal document analysis with live web research to produce analyst-grade output in minutes.

The Problem It Solves
Reviewing a single deal manually (reading the deck, cross-checking claims online, formatting the summary) easily takes half a day. Multiply that by 10–20 inbound deals per week, and your team is buried in low-leverage work before any real analysis begins. This workflow compresses that cycle into a single automated pipeline.

How It Works
Upload: Send a PDF, DOCX, or PPTX to the webhook endpoint.
Parse: LlamaParse extracts clean Markdown from complex layouts, preserving tables and financial data.
Enrich: The workflow identifies the target company, then pulls supplementary data from the open web (corporate pages, risk signals) using Decodo's search and scraping APIs to verify and contextualize claims made in the source documents.
Analyze: An AI Agent runs six targeted retrieval queries against the combined dataset: revenue history, key risks, business model, competitive landscape, management profile, and deal terms.
Deliver: Results render into a branded HTML template, convert to PDF via Puppeteer, upload to Cloudflare R2, and return a download link.

Each deal gets a unique namespace in Pinecone, so documents are isolated and repeat uploads skip redundant parsing.

What You Need

| Service | Role |
| --- | --- |
| n8n | Workflow orchestration |
| LlamaIndex Cloud | Document parsing (LlamaParse) |
| Pinecone | Vector storage & retrieval |
| OpenAI API | Embeddings (text-embedding-3-small) & LLM analysis (GPT-5.4) |
| Decodo API | Web search & page scraping |
| Cloudflare R2 | Report file storage (S3-compatible) |

Quick Start
Import the workflow JSON into your n8n instance.
Add credentials for OpenAI, Pinecone, LlamaIndex (Header Auth), Decodo, and Cloudflare R2 (S3-compatible).
Update the R2 base URL in the "Build Public Report URL" node.
Fire a test POST with a sample deck to the webhook.

Customization Ideas
Swap the HTML template to match your firm's branding and report structure.
Extend the AI Agent prompt to cover additional dimensions like ESG scoring or technical debt.
Route the finished PDF to Slack, email, or your CRM instead of (or alongside) R2.

Troubleshooting

| Symptom | Likely Fix |
| --- | --- |
| Parsing times out | Increase the Wait node duration; check file size against LlamaParse limits |
| Thin or generic analysis | Verify the source PDF is text-based, not a scanned image; enable OCR if needed |
| Broken PDF layout | Simplify CSS in the HTML render node; older Puppeteer builds handle basic layouts better |

Created by: Khmuhtadin
Category: Business Intelligence | Tags: AI, RAG, Due Diligence, Decodo
Portfolio • Store • LinkedIn • Medium • Threads
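One way to get the "unique namespace per deal, repeat uploads skip parsing" behavior described above is to derive the Pinecone namespace from a hash of the uploaded file, so the same document always maps to the same namespace. This is a sketch under that assumption; the workflow's actual naming scheme may differ.

```python
import hashlib

def deal_namespace(content: bytes) -> str:
    """Derive a stable per-deal Pinecone namespace from the uploaded file bytes.

    Identical uploads hash to the same namespace, so a lookup on the
    namespace can decide whether parsing can be skipped.
    """
    digest = hashlib.sha256(content).hexdigest()[:16]
    return f"deal-{digest}"

print(deal_namespace(b"%PDF-1.7 sample deck bytes"))
```

Before calling LlamaParse, the workflow could check whether vectors already exist in that namespace and short-circuit straight to the analysis step.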
by Firecrawl
What this does Uses Firecrawl to scrape any company website and extract structured business signals from it. The enriched profile is automatically saved to Supabase. A self-hosted, free alternative to paid enrichment APIs like Apollo or Clay, powered by Firecrawl. How it works Webhook receives a POST request with a url field (bare domain or full URL) Verify URL node validates and normalizes the domain Firecrawl scrapes the target website and searches for additional company data AI Agent (OpenRouter) extracts structured business signals from the scraped content Structured Output Parser formats the result into a clean JSON profile Supabase checks for duplicates before inserting, then saves the enriched profile Respond to Webhook returns the enriched result (or a 422 error if the URL was invalid) Business signals extracted Company name, industry, pricing model, free trial availability, employee size signal, funding stage, tech stack and integrations detected, target customer profile, trust signals (certifications, reviews, customer count), hiring status and open roles count. 
Requirements
Firecrawl API key
OpenRouter API key (or swap for any OpenAI-compatible model)
Supabase project (setup SQL provided below)

Setup
Create a Supabase project and run the following SQL in the SQL editor:

```sql
CREATE TABLE lead_enrichment (
  id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  created_at TIMESTAMPTZ DEFAULT now(),
  updated_at TIMESTAMPTZ DEFAULT now(),
  domain TEXT NOT NULL UNIQUE,
  company_name TEXT,
  industry TEXT,
  pricing_model TEXT,
  has_free_trial BOOLEAN,
  employee_signal TEXT,
  funding_stage TEXT,
  tech_stack TEXT[],
  integrations TEXT[],
  target_customer TEXT,
  trust_signals TEXT[],
  hiring BOOLEAN,
  open_roles_count INT,
  raw_scraped_text TEXT,
  enrichment_source TEXT DEFAULT 'firecrawl'
);

CREATE OR REPLACE FUNCTION update_updated_at()
RETURNS TRIGGER AS $$
BEGIN
  NEW.updated_at = now();
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER set_updated_at
BEFORE UPDATE ON lead_enrichment
FOR EACH ROW EXECUTE FUNCTION update_updated_at();
```

Add your Firecrawl API key as a credential in n8n
Add your OpenRouter API key as a credential (or swap for any OpenAI-compatible provider)
Add your Supabase credentials (project URL + service role key)
Activate the workflow

How to use
Send a POST request to the webhook URL:

```shell
curl -X POST https://your-n8n-instance/webhook/your-id \
  -H "Content-Type: application/json" \
  -d '{"url": "firecrawl.dev"}'
```
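The Verify URL node's job (accept a bare domain or a full URL, normalize it, and reject invalid input with a 422) can be sketched like this. The exact validation rules in the workflow may differ; this is an illustrative approximation.

```python
from urllib.parse import urlparse

def normalize_domain(raw: str):
    """Accept a bare domain or full URL; return (domain, canonical_url),
    or None for invalid input (the workflow answers with a 422 in that case).
    """
    raw = raw.strip()
    if not raw:
        return None
    if "://" not in raw:
        raw = "https://" + raw          # bare domain -> assume https
    parsed = urlparse(raw)
    host = parsed.netloc.lower().removeprefix("www.")
    if "." not in host:
        return None                     # no TLD -> treat as invalid
    return host, f"https://{host}"

print(normalize_domain("firecrawl.dev"))
print(normalize_domain("https://www.example.com/pricing"))
```

The normalized `domain` is what gets stored in the `domain TEXT NOT NULL UNIQUE` column above, which is also what makes the duplicate check work.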
by Incrementors
Financial Insight Automation: Market Cap to Telegram via Bright Data

📊 Description
An automated n8n workflow that scrapes financial data from Yahoo Finance using Bright Data, processes market cap information, generates visual charts, and sends comprehensive financial insights directly to Telegram for instant notifications.

🚀 How It Works
This workflow operates through a simple three-zone process:
1. Data Input & Trigger: User submits a keyword (e.g., "AI", "Crypto", "MSFT") through a form trigger that initiates the financial data collection process.
2. Data Scraping & Processing: Bright Data API discovers and scrapes comprehensive financial data from Yahoo Finance, including market cap, stock prices, company profiles, and financial metrics.
3. Visualization & Delivery: The system generates interactive market cap charts, saves data to Google Sheets for record-keeping, and sends visual insights to Telegram as PNG images.

⚡ Setup Steps
> ⏱️ Estimated Setup Time: 15-20 minutes

Prerequisites
Active n8n instance (self-hosted or cloud)
Bright Data account with Yahoo Finance dataset access
Google account for Sheets integration
Telegram bot token and chat ID

Step 1: Import the Workflow
Copy the provided JSON workflow code
In n8n: Go to Workflows → + Add workflow → Import from JSON
Paste the JSON content and click Import

Step 2: Configure Bright Data Integration
Set up Bright Data Credentials:
In n8n: Navigate to Credentials → + Add credential → HTTP Header Auth
Add Authorization header with value: Bearer BRIGHT_DATA_API_KEY
Replace BRIGHT_DATA_API_KEY with your actual API key
Test the connection to ensure it works properly

> Note: The workflow uses dataset ID gd_lmrpz3vxmz972ghd7 for Yahoo Finance data. Ensure you have access to this dataset in your Bright Data dashboard.
Step 3: Set up Google Sheets Integration
Create a Google Sheet:
Go to Google Sheets and create a new spreadsheet
Name it "Financial Data Tracker" or similar
Copy the Sheet ID from the URL
Configure Google Sheets credentials:
In n8n: Credentials → + Add credential → Google Sheets OAuth2 API
Complete OAuth setup and test connection
Update the workflow:
Open the "📊 Filtered Output & Save to Sheet" node
Replace YOUR_SHEET_ID with your actual Sheet ID
Select your Google Sheets credential

Step 4: Configure Telegram Bot
Set up Telegram Integration:
Create a Telegram bot using @BotFather
Get your bot token and chat ID
In n8n: Credentials → + Add credential → Telegram API
Enter your bot token
Update the "📤 Send Chart on Telegram" node with your chat ID
Replace YOUR_TELEGRAM_CHAT_ID with your actual chat ID

Step 5: Test and Activate
Test the workflow:
Use the form trigger with a test keyword (e.g., "AAPL")
Monitor the execution in n8n
Verify data appears in Google Sheets
Check for chart delivery on Telegram
Activate the workflow:
Turn on the workflow using the toggle switch
The form trigger will be accessible via the provided webhook URL

📋 Key Features
🔍 Keyword-Based Discovery: Search companies by keyword, ticker, or industry
💰 Comprehensive Financial Data: Market cap, stock prices, earnings, and company profiles
📊 Visual Charts: Automatic generation of market cap comparison charts
📱 Telegram Integration: Instant delivery of insights to your mobile device
💾 Data Storage: Automatic backup to Google Sheets for historical tracking
⚡ Real-time Processing: Fast data retrieval and processing with Bright Data

📊 Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Company Name | Full company name | "Apple Inc." |
| Stock Ticker | Trading symbol | "AAPL" |
| Market Cap | Total market capitalization | "$2.89T" |
| Current Price | Latest stock price | "$189.25" |
| Exchange | Stock exchange | "NASDAQ" |
| Sector | Business sector | "Technology" |
| PE Ratio | Price to earnings ratio | "28.45" |
| 52 Week Range | Annual high and low prices | "$164.08 - $199.62" |

🔧 Troubleshooting
Common Issues
Bright Data Connection Failed:
Verify your API key is correct and active
Check dataset permissions in Bright Data dashboard
Ensure you have sufficient credits
Google Sheets Permission Denied:
Re-authenticate Google Sheets OAuth
Verify sheet sharing settings
Check if the Sheet ID is correct
Telegram Not Receiving Messages:
Verify bot token and chat ID
Check if bot is added to the chat
Test Telegram credentials manually

Performance Tips
Use specific keywords for better data accuracy
Monitor Bright Data usage to control costs
Set up error handling for failed requests
Consider rate limiting for high-volume usage

🎯 Use Cases
**Investment Research:** Quick financial analysis of companies and sectors
**Market Monitoring:** Track market cap changes and stock performance
**Competitive Analysis:** Compare financial metrics across companies
**Portfolio Management:** Monitor holdings and potential investments
**Financial Reporting:** Generate automated financial insights for teams

🔗 Additional Resources
n8n Documentation
Bright Data Datasets
Google Sheets API
Telegram Bot API

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
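To build the market cap comparison chart, abbreviated Yahoo-style values like "$2.89T" have to be turned into plain numbers first. A minimal sketch of that conversion (the suffix handling here is an assumption about the data format, matching the examples in the table above):

```python
def parse_market_cap(value: str) -> float:
    """Convert abbreviated market-cap strings (e.g. "$2.89T") to a number."""
    multipliers = {"T": 1e12, "B": 1e9, "M": 1e6, "K": 1e3}
    value = value.strip().lstrip("$").replace(",", "")
    if value and value[-1].upper() in multipliers:
        return float(value[:-1]) * multipliers[value[-1].upper()]
    return float(value)

print(parse_market_cap("$2.89T"))
print(parse_market_cap("$500M"))
```

The resulting numeric values can then be fed directly into the chart-generation step that produces the PNG sent to Telegram.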
by Lucas Peyrin
How it works This workflow is an interactive, hands-on tutorial designed to teach you the absolute basics of JSON (JavaScript Object Notation) and, more importantly, how to use it within n8n. It's perfect for beginners who are new to automation and data structures. The tutorial is structured as a series of simple steps. Each node introduces a new, fundamental concept of JSON: Key/Value Pairs: The basic building block of all JSON. Data Types: It then walks you through the most common data types one by one: String (text) Number (integers and decimals) Boolean (true or false) Null (representing "nothing") Array (an ordered list of items) Object (a collection of key/value pairs) Using JSON with Expressions: The most important step! It shows you how to dynamically pull data from a previous node into a new one using n8n's expressions ({{ }}). Final Exam: A final node puts everything together, building a complete JSON object by referencing data from all the previous steps. Each node has a detailed sticky note explaining the concept in simple terms. Set up steps Setup time: 0 minutes! This is a tutorial workflow, so there is no setup required. Simply click the "Execute Workflow" button to run it. Follow the instructions in the main sticky note: click on each node in order, from top to bottom. For each node, observe the output in the right-hand panel and read the sticky note next to it to understand what you're seeing. By the end, you'll have a solid understanding of what JSON is and how to work with it in your own n8n workflows.
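The concepts the tutorial walks through can be summarized in one snippet: a single object using every JSON data type, round-tripped through a serializer, and then accessed the way an n8n expression would access it. (Python's `json` module is used here for illustration; in n8n itself you would use `{{ }}` expressions instead.)

```python
import json

# One object that uses every JSON data type covered in the tutorial.
profile = {
    "name": "n8n",             # string
    "version": 1.5,            # number
    "open_source": True,       # boolean
    "deprecated_field": None,  # null
    "tags": ["automation", "low-code"],        # array (ordered list)
    "author": {"first": "Jan", "last": "O."},  # object (nested key/value pairs)
}

text = json.dumps(profile)   # Python dict -> JSON string
parsed = json.loads(text)    # JSON string -> Python dict

# The n8n expression {{ $json.author.first }} corresponds to:
print(parsed["author"]["first"])  # Jan
```

The last line mirrors the "Using JSON with Expressions" step: a dotted path in an n8n expression is just successive key lookups into nested objects.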
by DuyTran
Description

Overview
This workflow generates automated revenue and expense comparison reports from a structured Google Sheet. It enables users to compare financial data across the current period, last month, and last year, then uses an AI agent to analyze and summarize the results for business reporting.

Prerequisites
A connected Google Sheets OAuth2 credential.
A valid DeepSeek API key (or replace it with another chat model).
A sub-workflow (child workflow) that handles the processing logic.
Properly structured Google Sheets data (see below).

Required Google Sheet Structure
Column headers must include at least: Date, Amount, Type.

Setup Steps
Import the workflow into your n8n instance.
Connect your Google Sheets and DeepSeek API credentials.
Update the Sheet ID and Tab Name (already embedded in the Get revenual from google sheet node) and the custom sub-workflow ID (in the Call n8n Workflow Tool node).
Optionally configure the chatbot webhook in the When chat message received node.

What the Workflow Does
Accepts date inputs via an AI chat interface (ChatTrigger + AI Agent).
Fetches raw transaction data from Google Sheets.
Segments and pivots revenue by classification for: the current period, last month, and last year.
Aggregates totals and applies custom titles for comparison.
Merges all summaries into a final unified JSON report.

Customization Options
Replace DeepSeek with OpenAI or other LLMs.
Change the date fields or cycle comparisons (e.g., quarterly, weekly).
Add more AI analysis steps such as sentiment scoring or forecasting.
Modify the pivot logic to suit specific KPI tags or labels.

Troubleshooting Tips
If the Google Sheets fetch fails: ensure the document is shared with your n8n Google credential.
If parsing errors occur: verify that all dates follow the expected format.
The sub-workflow must be active and configured to accept the correct inputs (6 dates).
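The six dates the sub-workflow expects map naturally onto start/end boundaries for the three compared periods. A sketch of how they could be derived from a reference date (the exact boundary convention, month-based with exclusive end dates, is an assumption based on the description):

```python
from datetime import date

def comparison_periods(ref: date):
    """Return (start, end) pairs for the current month, the previous month,
    and the same month last year: the six dates the sub-workflow expects.
    End dates are exclusive (first day of the following month).
    """
    def month_bounds(year, month):
        start = date(year, month, 1)
        end = date(year + 1, 1, 1) if month == 12 else date(year, month + 1, 1)
        return start, end

    current = month_bounds(ref.year, ref.month)
    prev_y, prev_m = (ref.year - 1, 12) if ref.month == 1 else (ref.year, ref.month - 1)
    last_month = month_bounds(prev_y, prev_m)
    last_year = month_bounds(ref.year - 1, ref.month)
    return current, last_month, last_year

print(comparison_periods(date(2024, 3, 15)))
```

Switching to quarterly or weekly comparisons, as suggested under Customization Options, only requires replacing `month_bounds` with the corresponding boundary function.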
by Cheng Siong Chin
How It Works This workflow automates cross-platform content distribution from Instagram to YouTube with intelligent AI enhancement. Designed for content creators, social media managers, and digital marketers who need to maximize their content reach across platforms efficiently. The template solves the challenge of manual video repurposing by automating the entire process from content retrieval to optimized publishing. It retrieves Instagram videos on schedule, generates engaging metadata using dual AI models (Anthropic Claude for creative titles/descriptions), uploads to YouTube, logs performance metrics to Google Sheets, and sends WhatsApp notifications upon completion. The workflow intelligently routes tasks between AI providers: Claude's language capabilities create compelling and platform-optimized content. This dual-model approach delivers superior results compared to single-AI solutions, combining creativity with precision for maximum engagement. Setup Steps Configure Instagram credentials Add Anthropic API key for Claude model in AI nodes Connect YouTube account and configure upload settings Link Google Sheets with target spreadsheet ID for logging Add WhatsApp Business API credentials Prerequisites Instagram Business/Creator account with API access Use Cases Social media agencies managing multiple client accounts Customization Modify AI prompts for brand-specific tone, adjust scheduling frequency Benefits Saves 2-3 hours daily on manual uploads, ensures consistent posting schedules
by Davide
This workflow automates the creation and publishing of AI-generated motion videos for TikTok. The process starts with an image and a reference motion video. Using the Kling v2.6 Motion Control AI model, the workflow generates a new animated video where the character from the image replicates the movements from the reference video. Once the AI-generated video is produced, the workflow automatically retrieves the result, uploads it to Postiz, and publishes it directly to TikTok with a predefined caption. Start: Watch the starting video Result: Watch the final video Key Advantages 1. ✅ Full Automation The workflow automates the entire pipeline from AI video generation to social media publishing, eliminating manual steps. 2. ✅ AI-Powered Content Creation By leveraging Kling Motion Control, the system creates dynamic animated content from a static image and motion reference video. 3. ✅ Scalable Content Production This setup enables rapid production of multiple AI-generated videos, making it ideal for automated social media content strategies. 4. ✅ Efficient Asynchronous Processing The workflow uses webhooks and wait nodes to handle long-running AI jobs efficiently without blocking the workflow. 5. ✅ Seamless Social Media Integration Direct integration with Postiz and TikTok allows automatic publishing, streamlining the content distribution process. 6. ✅ Modular and Customizable Each step (AI generation, parsing, upload, publishing) is modular, allowing easy modification for: different AI models other social platforms different prompts or media inputs 7. ✅ Reduced Manual Work Content creators can generate and publish AI-based videos with a single workflow execution. How it works Trigger & Input: The workflow is started manually. The initial "Set params" node defines the key inputs: an image_url, a video_url, and a tiktok_desc (caption). AI Video Generation: The "Run Kling v2.6 Motion Control" node sends a request to the Kie.ai API. 
It instructs the AI to make the character in the static image follow the movements from the reference video. Crucially, it includes a callBackUrl (the n8n webhook URL from the Wait node) so the API can notify the workflow when the video is ready. The workflow then pauses at the "Wait" node, holding its execution until it receives the callback from Kie.ai. Retrieve Result: Once the AI finishes processing, it sends a request to the "Wait" node's webhook, which resumes the workflow. The "Result" node then fetches the details of the completed job, including a link to the newly generated video (resultUrl). Process for Posting: The "Parsing" node extracts the resultUrl from the API's JSON response. A "Get ResulUrl" Code node formats this data to be passed to the next step. The "Get File Video" node uses the resultUrl to download the actual video file from the temporary URL. Upload & Schedule: The "Upload Video to Postiz" node takes the downloaded video file and uploads it to the Postiz platform using a multipart/form-data request. The final "TikTok" (Postiz) node creates a new post. It uses the video ID returned from the upload and the tiktok_desc from the initial parameters to schedule the post to the specified TikTok integration. Setup steps To make this workflow work for you, you need to configure the following: Set Input Parameters: In the "Set params" node, replace the example image_url, video_url, and tiktok_desc with your own values. image_url: Direct URL to the static character image. video_url: Direct URL to the reference movement video. tiktok_desc: The caption you want for the final TikTok post. Kie.ai API Credentials: Locate the "Run Kling v2.6 Motion Control" and "Result" nodes. You will need to provide credentials for httpBearerAuth. Replace the existing credential ID with your own Kie.ai API credentials. Ensure the credential is configured with a valid API Bearer Token. Postiz API Credentials: Locate the "Upload Video to Postiz" node. 
Provide credentials for httpHeaderAuth. Replace the existing credential ID with your own Postiz API key. This key must be set as a header for authentication. Postiz Integration ID: In the final "TikTok" (Postiz) node, look for the field integrationId inside the posts.post.value object. Replace the placeholder "XXX" with the actual Integration ID for your TikTok account from Postiz. 👉 Subscribe to my new YouTube channel. Here I’ll share videos and Shorts with practical tutorials and FREE templates for n8n. Need help customizing? Contact me for consulting and support or add me on Linkedin.
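The "Parsing" step described above (extracting `resultUrl` from the Kie.ai callback JSON) can be sketched as follows. Only the `resultUrl` field name comes from the workflow; the surrounding payload shape is an assumption for illustration.

```python
import json

# Illustrative callback payload; field names other than "resultUrl" are assumed.
callback_body = json.dumps(
    {"data": {"state": "success", "resultUrl": "https://cdn.example.com/video.mp4"}}
)

def extract_result_url(body: str):
    """Pull resultUrl out of the callback JSON, or None if it is absent."""
    payload = json.loads(body)
    return payload.get("data", {}).get("resultUrl")

print(extract_result_url(callback_body))  # https://cdn.example.com/video.mp4
```

Downstream, the "Get File Video" node downloads this URL and the Postiz upload node forwards the file as multipart/form-data.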