by Harshil Agrawal
This is an example that gets the logo, icon, and information of a company and stores it in Airtable. You can set the values that you want to store in the Set node. If you want to store the data in a different database (Google Sheets, Postgres, MongoDB, etc.), replace the Airtable node with that node. You can refer to the documentation to learn how to build this workflow from scratch.
by Eduard
Create, iterate, and share! Transform a single image through multiple scenes while maintaining consistency.

✨ What this workflow does
This template showcases FLUX.1 Kontext - Black Forest Labs' in-context image generation model that excels at maintaining character features across multiple transformations. Combined with the Upload Post community node for effortless multi-platform social media posting, you can create and share compelling visual stories instantly.

The workflow demonstrates FLUX Kontext's core strength: character consistency across multiple image generations. Starting with a single input image, it:
🖼️ Loads an initial character image (example: a cute animal mascot)
📝 Defines multiple scene transformation prompts
🔄 Iteratively generates new scenes while preserving exact character features
🎯 Maintains visual consistency by reusing binary data from previous generations
📱 Auto-posts the complete transformation series to multiple social platforms simultaneously

🚀 Key Features: The Consistency Advantage
**Character Preservation**: FLUX Kontext's signature feature - maintains character features and style across transformations (requires specific prompting techniques)
**Iterative Context Building**: Each generation uses the previous image as context, creating visual continuity (see the sketch after this description)
**Binary Data Reuse**: Smart workflow design that feeds the output of one generation as input to the next
**Multi-Scene Storytelling**: Transform your character across different environments while keeping them recognizable
**One-Click Multi-Platform Posting**: Upload Post\* eliminates the tedious process of posting to each platform individually

📱 Why use Upload Post?
Posting the same content to TikTok, Instagram, LinkedIn, YouTube, Facebook, X (Twitter), and Threads individually is time-consuming and error-prone. The Upload Post service\* simplifies this process:
✅ Connect once, post everywhere: Link all your social media accounts to Upload Post
✅ Single API call: Post to multiple platforms with one simple node
✅ No more platform juggling: Skip the endless switching between apps and dashboards
✅ Consistent timing: All platforms get your content simultaneously
✅ Trusted by 3,751+ users: Proven solution for content creators and marketers
Instead of spending 30+ minutes manually posting to each platform, Upload Post does it all in seconds with a single n8n node!

🛠️ Prerequisites
Required accounts:
- Black Forest Labs API: Create an account at dashboard.bfl.ai and get your API key for FLUX Kontext Pro access.
- Upload Post account: Sign up at upload-post.com\*, connect your social media profiles (TikTok, Instagram, LinkedIn, YouTube, Facebook, X/Twitter, Threads), and get API credentials for automated posting. Free tier available: 10 uploads/month.

💡 Perfect For:
**Character Designers** maintaining brand character integrity across scenes
**Social Media Managers** creating engaging visual story series without manual posting
**Brand Marketers** ensuring character consistency across campaigns
**Storytellers** building visual narratives with consistent protagonists
**Agencies** managing multiple client accounts efficiently

🔧 Customization Options:
**Modify transformation prompts** to create your own character journey
**Adjust iteration steps**
**Change initial character image**
**Configure social platform targeting** (choose which platforms to post to)
**Customize post content** and formatting
**Experiment with different consistency scenarios**

\* Affiliate link
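For readers who want to see the iterative context-building in concrete terms, here is a minimal JavaScript sketch of the loop the workflow implements with HTTP Request nodes. The endpoint path, field names, and polling shape are assumptions based on BFL's dashboard documentation, so verify them at dashboard.bfl.ai before relying on this:

```javascript
// Minimal sketch (Node.js 18+, global fetch) of iterative context building:
// each generation receives the previous image as input, which is what keeps
// the character consistent across scenes.
const API_KEY = process.env.BFL_API_KEY; // from dashboard.bfl.ai

async function generateScene(prompt, inputImageBase64) {
  // Submit a generation request; FLUX Kontext takes the prior image as context.
  const res = await fetch("https://api.bfl.ai/v1/flux-kontext-pro", {
    method: "POST",
    headers: { "x-key": API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({ prompt, input_image: inputImageBase64 }),
  });
  const { id } = await res.json();

  // Poll until the async job finishes (simplified; no error handling).
  while (true) {
    await new Promise((r) => setTimeout(r, 2000));
    const poll = await fetch(`https://api.bfl.ai/v1/get_result?id=${id}`, {
      headers: { "x-key": API_KEY },
    });
    const body = await poll.json();
    if (body.status === "Ready") return body.result.sample; // image URL
  }
}

// Iterate: feed each output back in as the next input ("binary data reuse").
async function run(initialImageBase64, prompts) {
  let context = initialImageBase64;
  const scenes = [];
  for (const prompt of prompts) {
    const url = await generateScene(prompt, context);
    const bytes = await (await fetch(url)).arrayBuffer();
    context = Buffer.from(bytes).toString("base64"); // becomes next input
    scenes.push(url);
  }
  return scenes;
}
```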
by Harshil Agrawal
This example workflow allows you to create, update, and get a document in Google Cloud Firestore. The workflow uses the Set node to set the data; however, you might receive data from a different source. Add the node that receives the data before the Set node, and set the values you want to insert into a document in the Set node. Also, update the Columns / Attributes fields in the Google Cloud Firestore node.
by Harshil Agrawal
Based on your use case, you might want to trigger a workflow when new data gets added to your database. This workflow sends a message to Mattermost when new data gets added to Google Sheets. The Interval node triggers the workflow every 45 minutes; you can modify the timing based on your use case, or use the Cron node to trigger the workflow instead. If you wish to fetch new Tweets from Twitter, replace the Google Sheets node with the respective node and update the Function node accordingly (a sketch of the Function node's filtering logic follows below).
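The template doesn't show the Function node's contents. A minimal sketch of one way it could detect new rows, using n8n's workflow static data (the row-count heuristic is an assumption, not necessarily the template's exact code):

```javascript
// Function node sketch: emit only the rows added since the last run.
// Assumes remembering the last processed row count is an acceptable
// "new data" heuristic. Note: static data only persists between
// automatic (activated) runs, not manual test executions.
const staticData = getWorkflowStaticData('global');
const lastCount = staticData.lastRowCount || 0;

// Everything past the previously seen row count is treated as new.
const newItems = items.slice(lastCount);

staticData.lastRowCount = items.length;
return newItems;
```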
by Harshil Agrawal
This workflow allows you to get the analytics of a website and store them in Airtable. In this workflow, we get the analytics for sessions grouped by country. Based on your use case, you can select different Dimensions and set different Metrics. You can use the Cron node or the Interval node to trigger the workflow at a particular interval and fetch the analytics data regularly. If you'd rather store the data returned by Google Analytics in a database or a Google Sheet, replace the Airtable node with the appropriate node.
by Harshil Agrawal
This workflow allows you to receive updates about the position of the ISS and add them to a table in TimescaleDB.

Cron node: Triggers the workflow every minute. You can configure the timing based on your use case.
HTTP Request node: Makes an HTTP request to an API that returns the position of the ISS. Based on your use case you may want to fetch data from a different URL; enter the URL in the URL field.
Set node: Sets the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude, we set these here (see the sketch below). If you need other information, you can set it in this node.
TimescaleDB node: Stores the information in a table named iss. You can use a different table as well.
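The template doesn't name the API it calls. As an illustration, here is a sketch assuming the public wheretheiss.at endpoint, showing the three fields the Set node keeps:

```javascript
// Sketch of what the HTTP Request + Set pair produces, assuming the public
// wheretheiss.at API (an assumption; the template doesn't name its endpoint).
const res = await fetch("https://api.wheretheiss.at/v1/satellites/25544");
const iss = await res.json();

// Keep only the three fields the iss table needs, mirroring the Set node.
const row = {
  timestamp: iss.timestamp, // Unix epoch seconds
  latitude: iss.latitude,
  longitude: iss.longitude,
};
console.log(row); // e.g. { timestamp: 1700000000, latitude: 12.3, longitude: -45.6 }
```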
by Shashikanth
Source code: I maintain this workflow here.

Usage Guide
This workflow backs up all workflows as JSON files named in the [workflow_name].json format.

Steps
1. Create GitHub repository: Skip this step if using an existing repository.
2. Add GitHub credentials: In Credentials, add the GitHub credential for the repository owner.
3. Download and import workflow: Import this workflow into n8n.
4. Set global values: In the Globals node, set the following:
   - repo.owner: GitHub username of the repository owner.
   - repo.name: Name of the repository for backups.
   - repo.path: Path to the folder within the repository where workflows will be saved.
5. Configure GitHub nodes: Edit each GitHub node in the workflow to use the added credentials.

Workflow Logic
Each workflow run handles files based on their status (a sketch of this decision follows below):
- New workflow: a new file is created in the repository.
- Unchanged workflow: the item is skipped.
- Changed workflow: the corresponding file in the repository is updated.

Current Limitations / Needs Work
- Renamed workflows: If a workflow is renamed or deleted in n8n, the old file remains in the repository.
- Deleted workflows: Workflows deleted in n8n are not removed from the repository.
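A minimal sketch of how the new/changed/unchanged decision can be made, assuming the repository file arrives base64-encoded (as the GitHub contents API returns it); names are illustrative, not the template's exact code:

```javascript
// Classify a workflow against its backed-up copy in the repository.
function classify(workflowJson, existingFileBase64) {
  if (existingFileBase64 === undefined) return "new"; // no file in repo yet

  // Compare the serialized workflow with the stored file contents.
  const current = JSON.stringify(workflowJson, null, 2);
  const stored = Buffer.from(existingFileBase64, "base64").toString("utf8");

  return current === stored ? "unchanged" : "changed";
}
```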
by ConvertAPI
Who is this for? For developers and organizations that need to convert XLSX files to PDF. What problem is this workflow solving? The file format conversion problem. What this workflow does Downloads the XLSX file from the web. Converts the XLSX file to PDF. Stores the PDF file in the local file system. How to customize this workflow to your needs Open the HTTP Request node. Adjust the URL parameter (all endpoints can be found here). Add your secret to the Query Auth account parameter. Please create a ConvertAPI account to get an authentication secret. Optionally, additional Body Parameters can be added for the converter.
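For orientation, here is a minimal JavaScript sketch of the same download-convert-store sequence outside n8n. The endpoint shape follows ConvertAPI's documented xlsx-to-pdf pattern, but treat the exact URL and parameter names as assumptions and confirm them in the ConvertAPI documentation:

```javascript
// Minimal sketch (Node.js 18+, ESM). The source URL is hypothetical.
import { writeFile } from "node:fs/promises";

const SECRET = process.env.CONVERTAPI_SECRET; // from your ConvertAPI account
const XLSX_URL = "https://example.com/report.xlsx"; // hypothetical source file

// ConvertAPI can fetch the source file itself via a Url parameter, so one
// request covers both the "download" and "convert" steps.
const res = await fetch(
  `https://v2.convertapi.com/convert/xlsx/to/pdf?Secret=${SECRET}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      Parameters: [{ Name: "File", FileValue: { Url: XLSX_URL } }],
    }),
  }
);
const body = await res.json();

// Converted files come back base64-encoded; write the first one to disk.
await writeFile("report.pdf", Buffer.from(body.Files[0].FileData, "base64"));
```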
by Muzaffer AKYIL
Docker Registry Cleanup Template
This template is designed to automatically clean up old image tags in the Docker registry and perform garbage collection.

Features
- List all images in the registry
- Preserve the last 10 tags for each image (the latest tag is always preserved)
- Delete old tags
- Email notification for successful/skipped deletions
- Registry garbage collection automation
- Failure notification in error conditions

Prerequisites
- Docker Registry v2 API access
- Basic Authentication credentials
- SMTP email settings (for notifications)
- SSH node installed on n8n (for garbage collection)

Installation
1. Credentials
Add the following credentials in n8n:
- **HTTP Basic Auth**: For Registry access
- **SSH Private Key**: For the garbage collection command
- **Email SMTP**: For notifications

2. Set Variables
Replace your-registry-url with your actual registry URL on all nodes, e.g. "url": "https://your.registry.com/v2/_catalog".

Customisation
- Retention Policy: Set the number of tags to be retained by changing the slice(0, 10) value in the Identify Tags to Remove node (a sketch of this selection follows below)
- Schedule: Change the frequency of operation at the Trigger node
- Notification Content: Customise email templates according to your needs

Notes
- Check DELETE operations in a test environment before running
- Make sure that the registry is not in read-only mode
- The registry may need to be put into maintenance mode for garbage collection

Step Details
- Retrieving image information: The workflow starts by fetching a list of images and their associated tags from the Docker registry.
- Filtering and sorting: The retrieved tags are then filtered and sorted based on specific criteria, such as creation date and tag name.
- Deleting old tags: The workflow identifies old or unused tags and attempts to delete them from the registry.
- Sending notifications: The workflow sends email notifications to inform the user about the status of the cleanup process, including any errors or successes.
- Executing additional cleanup tasks: Finally, the workflow executes an SSH command on the Docker registry server to perform additional cleanup tasks, such as garbage collection.

TL;DR
In summary, this n8n template provides a robust and automated solution for managing and cleaning up Docker registries. By running this workflow regularly, users can keep their registry organized and efficient and avoid running out of storage space.
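A minimal sketch of the retention selection described above, assuming tags are already sorted newest-first (the sort key and variable names are illustrative):

```javascript
// Mirror of the "Identify Tags to Remove" logic: keep the 10 newest tags
// per image, always preserve "latest", and return the rest for deletion.
const KEEP = 10; // change slice(0, 10) in the real node to adjust retention

function tagsToDelete(sortedTags) {
  // "latest" is always preserved regardless of its position.
  const deletable = sortedTags.filter((t) => t !== "latest");
  const keep = new Set(deletable.slice(0, KEEP));
  return deletable.filter((t) => !keep.has(t));
}

// Example: 13 tags, newest first.
// tagsToDelete(["v13", "v12", ..., "v1"]) -> ["v3", "v2", "v1"]
// Note: the registry's DELETE endpoint takes a manifest digest, not a tag
// name, so each tag must first be resolved to its digest.
```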
by AiAgent
What It Does
This intelligent workflow simplifies the complex task of determining whether a website is legitimate or potentially a scam. By simply submitting a URL through a form, the system initiates a multi-agent evaluation process. Four dedicated AI agents, each powered by GPT-4o and connected to SerpAPI, analyze different dimensions of the website: domain and technical details, search engine signals, product and pricing patterns, and on-site content analysis. Their findings are then passed to a fifth AI agent, the Analyzer, powered by GPT-4o mini, which consolidates the data, scores the site on a scale of 1-10 for scam likelihood, and presents the findings in a clear, structured format for the user.

Who It's For
This workflow is ideal for anyone who needs to quickly and reliably assess the trustworthiness of a website. Whether you're a consumer double-checking a store before making a purchase, a small business owner validating supplier sites, a cybersecurity analyst conducting threat assessments, or a developer building fraud detection into your platform, this tool offers fast, AI-powered insights without the need for manual research or technical expertise. It's designed for both individuals and teams who value accurate, scalable scam detection.

How It Works
The process begins with a simple form submission where the user enters the URL of the website they want to investigate. Once submitted, the workflow activates four specialized AI agents, each powered by GPT-4o and connected to SerpAPI, to independently analyze the site from different angles:
- Agent 1 examines domain age, SSL certificates, and TLD trustworthiness.
- Agent 2 reviews search engine results, forum mentions, and public scam reports.
- Agent 3 analyzes product pricing patterns and brand authenticity.
- Agent 4 assesses on-site content quality, grammar, legitimacy of claims, and presence of business info.
Each agent returns its findings, which are then aggregated and passed to a fifth AI agent, the Analyzer. This final agent, powered by GPT-4o mini, evaluates all the input, assigns a scam likelihood score from 1 to 10, and compiles a neatly formatted summary with organized insights and a disclaimer for context.

Set Up
1. Obtain an OpenAI API key from platform.openai.com/api-keys.
2. Connect this key to the OpenAI Chat Model for each of the tool agents (Analyzer, Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis).
3. Fund your OpenAI account. GPT-4o costs ~$0.01 to run the workflow.
4. Create a SerpAPI account at https://serpapi.com/users/sign_up and obtain a SerpAPI key.
5. Use this key to connect the SerpAPI tool for each of the tool agents (Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis).

Tip: SerpAPI allows 100 free searches each month, and this workflow uses ~5-15 SerpAPI searches per run. If you would like to use the workflow more than that each month, create multiple SerpAPI accounts with an API key for each; when you use up all 100 free searches for an account, switch to another account's API key within the workflow.

Disclaimer
This tool is designed to assist in evaluating the potential risk of websites using AI-generated insights.
The scam likelihood score and analysis provided are based on publicly available information and should not be considered a definitive or authoritative assessment. This tool does not guarantee the accuracy, safety, or legitimacy of any website. Users should perform their own due diligence and use independent judgment before engaging with any site. N8N, OpenAI, its affiliates, and the creators of this workflow are not responsible for any loss, damages, or consequences arising from the use of this tool or the actions taken based on its results.
by Raz Hadas
Description
Transform your investment strategy with a fully automated, AI-driven trading bot. This workflow bridges the gap between AI-powered market insights and real-world trading by executing buy and sell orders directly through the Alpaca paper trading API. Designed to work in tandem with the Automated Stock Sentiment Analysis workflow, this solution takes the top-performing stocks based on daily news sentiment and automatically rebalances your portfolio. It's perfect for algorithmic traders, data-driven investors, and n8n enthusiasts who want to see their AI analysis translate into tangible actions, all while maintaining a comprehensive log of every transaction in Google Sheets.

Key Features & Benefits
**Automated Trading Execution**: Automatically places buy and sell orders on the Alpaca paper trading platform without manual intervention.
**Sentiment-Driven Decisions**: Leverages the output from the sentiment analysis workflow to make informed decisions, selling positions with waning sentiment and buying into those with high positive sentiment.
**Dynamic Portfolio Rebalancing**: Intelligently calculates which positions to close and how to allocate the resulting funds into new, high-potential stocks.
**Paper Trading Ready**: Safely test and refine your trading strategies in a risk-free environment using Alpaca's paper trading API.
**Daily Performance Tracking**: Automatically logs your account equity and daily percentage change to a Google Sheet, giving you a clear view of your portfolio's performance.
**Detailed Trade Logging**: Every buy and sell order is meticulously recorded in a Google Sheet for easy review and historical analysis.
**Scheduled and Autonomous**: The entire process runs on a daily schedule, making it a "set and forget" solution for systematic trading.

How It Works
This workflow executes a sophisticated, automated trading strategy in a few key stages:
1. Daily Kick-off & Snapshot: The workflow triggers on a daily schedule, first fetching your current Alpaca account balance and logging it to a Google Sheet to track daily performance.
2. Strategy Formulation: It then reads the daily sentiment scores produced by the accompanying "Stock Sentiment Analysis" workflow. A Code node filters these results to identify the top four stocks with the highest positive sentiment.
3. The Decision Engine: The core of the workflow is a custom Code node that acts as the trading brain (see the sketch after this section). It:
   - Retrieves your currently open positions from Alpaca.
   - Compares your holdings against the day's top four sentiment stocks.
   - Generates a "sell list" of positions you hold that are no longer in the top four.
   - Generates a "buy list" of top-sentiment stocks that you don't yet own.
   - Calculates the total cash value of the "sell list" and determines the exact notional value to invest in each stock on the "buy list".
4. Trade Execution: The workflow first iterates through the "sell list" and executes a DELETE request to Alpaca for each, closing the positions. A Wait node pauses the workflow for two minutes to ensure the sell orders are filled and the account balance is updated. It then iterates through the "buy list", executing POST requests to Alpaca to purchase the new assets with the calculated funds.
5. Record Keeping: All executed orders (both buys and sells) are merged and logged in a dedicated Google Sheet, giving you a permanent and detailed transaction history.
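A minimal sketch of the Decision Engine logic described above. The input shapes follow Alpaca's positions API, but variable names are illustrative, not the template's exact code:

```javascript
// Plan the day's trades from current holdings and the sentiment top four.
function planTrades(openPositions, topFour) {
  // openPositions: [{ symbol, market_value }] as returned by Alpaca's
  // GET /v2/positions; topFour: ["AAPL", ...] from the sentiment workflow.
  const held = new Set(openPositions.map((p) => p.symbol));
  const wanted = new Set(topFour);

  // Sell anything we hold that dropped out of the top four.
  const sellList = openPositions.filter((p) => !wanted.has(p.symbol));

  // Buy any top-four stock we don't own yet.
  const buyList = topFour.filter((s) => !held.has(s));

  // Split the freed-up cash evenly across the new buys (notional orders).
  const cash = sellList.reduce((sum, p) => sum + Number(p.market_value), 0);
  const notionalPerBuy = buyList.length ? cash / buyList.length : 0;

  return {
    sell: sellList.map((p) => p.symbol), // -> DELETE /v2/positions/{symbol}
    buy: buyList.map((symbol) => ({ symbol, notional: notionalPerBuy })), // -> POST /v2/orders
  };
}
```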
Nodes Used
- Schedule Trigger
- HTTP Request (Alpaca API)
- Google Sheets
- Code (JavaScript)
- Split Out
- Wait
- Merge

This workflow is the perfect next step for anyone looking to take their AI analysis to the next level. Take the emotion out of your trading and let this bot systematically execute your data-driven strategy.
by Evoort Solutions
TikTok Transcript Generator

Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with the transcripts. It handles batches, errors, and rate limits to provide a seamless transcription process.

Key Features
- **Batch processing**: Reads and processes multiple TikTok video URLs from Google Sheets.
- **Automatic transcript generation**: Uses the TikTok Transcript Generator API on RapidAPI.
- **Clean subtitle output**: Removes timestamps and headers for clear transcripts.
- **Error handling**: Marks videos with no available transcript.
- **Rate limiting**: Implements wait times to avoid API throttling on RapidAPI.
- **Seamless Google Sheets integration**: Updates the same sheet with transcript results and statuses.

API Used
TikTok Transcript Generator API

Google Sheet Columns
| Column Name | Description |
|----------------|-----------------------------------------|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by workflow) |
| Generated Date | Date when the transcript was generated (YYYY-MM-DD) |

Workflow Nodes Explanation
| Node Name | Type | Purpose |
|--------------------------|-----------------------|-------------------------------------------------------------------|
| When clicking ‘Execute workflow’ | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps (see the sketch below). |
| Google Sheets | Google Sheets (Update)| Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update)| Updates the sheet with a "No transcription available" message on error. |
| Wait | Wait | Adds a delay between batches to avoid API rate limits on RapidAPI. |

Challenges Resolved
- **Manual transcription effort**: Eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- **API rate limits**: Introduces batching and wait periods to avoid exceeding API usage limits on RapidAPI, ensuring smooth execution.
- **Incomplete or missing data**: Filters out videos already transcribed and handles missing transcripts gracefully by logging appropriate messages.
- **Data formatting issues**: Cleans raw subtitle data to provide readable, timestamp-free transcripts.
- **Data synchronization**: Updates transcripts back into the same Google Sheet row, maintaining data consistency and ease of access.

Use Cases
- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into workflows.

How to Use
1. Prepare a Google Sheet with the columns Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View the transcripts and generated dates directly in your Google Sheet.
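For reference, the Code node's cleanup step could look like this minimal sketch, assuming the API returns WebVTT-style subtitles (the exact response format is an assumption):

```javascript
// Strip the WEBVTT header, timestamp lines, and blank cue separators,
// leaving only the spoken text joined into one readable transcript.
function cleanSubtitles(raw) {
  return raw
    .split("\n")
    // Drop the header and timestamp lines like "00:00:01.000 --> 00:00:03.500".
    .filter((line) => !line.startsWith("WEBVTT") && !line.includes("-->"))
    .map((line) => line.trim())
    .filter((line) => line.length > 0) // drop blank cue separators
    .join(" ");
}

// cleanSubtitles("WEBVTT\n\n00:00:01.000 --> 00:00:03.500\nHello there\n")
//   -> "Hello there"
```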
Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n