by Yaron Been
This workflow provides automated access to the Alitas126 Alitas2 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with the model and provides seamless integration of generation tasks into your n8n automation workflows.

Overview
This workflow handles the complete generation process using the Alitas126 Alitas2 model: API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- n8n: the automation platform that orchestrates the workflow
- Replicate API: access to the Alitas126/alitas2 AI model
- Alitas126 Alitas2: the core AI model
- Built-in error handling: automatic retry logic and comprehensive error management

How to Install
1. Import the workflow: download the .json file and import it into your n8n instance.
2. Configure the Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize parameters: adjust the model parameters in the 'Set Other Parameters' node.
4. Test the workflow: run it with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases
- Specialized processing: handle specific AI tasks and workflows
- Custom automation: implement unique business logic and processing
- Data processing: transform and analyze various types of data
- AI integration: add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Anthony
Disclaimer: This template currently works only on self-hosted n8n instances, as it uses a community node.

Use Case
Web scrapers often break when a web page's layout changes. This workflow mitigates that problem by auto-generating the web scraping extractor code with an LLM.

How It Works
This workflow leverages the ScrapeNinja n8n community node to:
1. scrape the webpage HTML,
2. feed it into an LLM (Google Gemini), asking it to write a JS extractor function,
3. execute the generated JS extractor against the scraped HTML to pull useful data from the page (the code is safely executed in a sandbox).

Installation
To install the ScrapeNinja n8n node on your self-hosted instance, go to Settings -> Community Nodes, enter "n8n-nodes-scrapeninja", and install. Make sure you are using at least v0.3.0.

See this in action: https://www.linkedin.com/feed/update/urn:li:activity:7289659870935490560/
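To make the idea concrete, here is a hypothetical example of the kind of extractor function the LLM might generate. Real extractors produced by the workflow would be tailored to the scraped page (and may use a DOM parsing library inside ScrapeNinja's sandbox); this sketch uses plain regexes so it runs standalone, and the selector pattern is an assumption.

```javascript
// Hypothetical LLM-generated extractor: pulls product titles out of raw HTML.
// Uses a regex instead of a DOM library so the sketch is dependency-free.
function extract(html) {
  const items = [];
  // Capture the text of every <h2 class="title">…</h2> element.
  const re = /<h2 class="title">([^<]+)<\/h2>/g;
  let m;
  while ((m = re.exec(html)) !== null) {
    items.push({ title: m[1].trim() });
  }
  return { items };
}

// Example run against a tiny HTML snippet:
const sampleHtml = '<h2 class="title">Widget A</h2><h2 class="title">Widget B</h2>';
console.log(extract(sampleHtml));
// → { items: [ { title: 'Widget A' }, { title: 'Widget B' } ] }
```

Because the LLM regenerates this function from the current HTML, a layout change produces a new extractor instead of a broken scraper.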
by Muhammad Zeeshan Ahmad
Platform: n8n (Telegram bot integration)
Purpose: Let users fetch top meme coin prices in real time using a simple /memecoin Telegram command

How It Works (Logic Breakdown)
This flow listens for a Telegram command and fetches data from the CoinGecko API to respond with live memecoin prices.

1. Telegram Trigger node
Listens for incoming Telegram messages from users. Activated when a message is sent in a Telegram chat connected to the bot. Passes the raw message (e.g., /memecoin) to the next node.

2. IF node: check if the message is /memecoin
Condition: {{$json["message"]["text"]}} === "/memecoin"
If true, continue to fetch data from CoinGecko; if false, nothing happens.

3. HTTP Request: fetch meme coins from CoinGecko
API: https://api.coingecko.com/api/v3/coins/markets?...category=meme-token
Fetches the top 5 meme tokens by market cap. The data includes: name, symbol, current price (USD), and coin ID (for URL linking).

4. Function node: format the message
Parses the JSON response from CoinGecko and builds a clean message like:
🐶 Dogecoin (DOGE)
💰 Price: $0.123
🔗 More: https://www.coingecko.com/en/coins/dogecoin
It loops through the top 5 meme coins and adds line breaks.

5. Telegram Send node: reply to the user
Sends the formatted message to the original chat, using the chat_id from the trigger so the correct user receives it.

Sample User Flow
1. User types /memecoin in the Telegram bot.
2. The bot fetches meme coin prices.
3. The bot replies with live prices and links.
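The Function node's formatting logic can be sketched as a plain function. The field names (`name`, `symbol`, `current_price`, `id`) match CoinGecko's `/coins/markets` response; the emoji decoration is illustrative.

```javascript
// Takes the CoinGecko /coins/markets response array and builds the
// Telegram reply for the top 5 meme coins.
function formatMemecoinMessage(coins) {
  return coins
    .slice(0, 5) // top 5 (the API call already sorts by market cap)
    .map(c =>
      `🐶 ${c.name} (${c.symbol.toUpperCase()})\n` +
      `💰 Price: $${c.current_price}\n` +
      `🔗 More: https://www.coingecko.com/en/coins/${c.id}`
    )
    .join('\n\n'); // blank line between coins
}

// Example with a single coin:
const sampleCoins = [
  { name: 'Dogecoin', symbol: 'doge', current_price: 0.123, id: 'dogecoin' },
];
console.log(formatMemecoinMessage(sampleCoins));
```

In the actual workflow this would live in the Function node, reading the HTTP Request node's output and returning the message string for the Telegram Send node.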
by Harshil Agrawal
This workflow appends, looks up, updates, and reads data from a Google Sheets spreadsheet.

Set node: The Set node generates the data that we want to add to Google Sheets. Depending on your use case, the data might come from a different source; for example, you might be fetching it from a webhook call. Add the node that fetches the data you want to add, then use the Set node to shape that data for Google Sheets.

Google Sheets node: This node adds the data from the Set node as a new row in the Google Sheet. You will have to enter the Spreadsheet ID and the Range to specify which sheet you want to add the data to.

Google Sheets1 node: This node looks for a specific value in the Google Sheet and returns all the rows that contain it. In this example, we look for the value Berlin. To look for a different value, enter it in the Lookup Value field and specify the column in the Lookup Column field.

Set1 node: This Set node increases the rent by $100 for the houses in Berlin and passes the new data to the next nodes in the workflow.

Google Sheets2 node: This node updates the rent for the houses in Berlin with the new rent set in the previous node. We map the rows by their ID. Depending on your use case, you might want to map the values with a different column; to do so, enter the column name in the Key field.

Google Sheets3 node: This node returns the information from the Google Sheet. You can specify the columns that should be returned in the Range field. Currently, the node fetches data for columns A to D; to fetch only columns A to C, set the range to A:C.

This workflow can be broken down into separate workflows, each with its own use case. For example, one workflow could append new data to a Google Sheet, while another looks up a certain value and returns it.
You can learn to build this workflow on the documentation page of the Google Sheets node.
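The Set1 step above (raising the rent for the looked-up Berlin rows) can be sketched as a plain function, equivalent to what a Code node would do. The column names (`id`, `city`, `rent`) are assumptions based on the description.

```javascript
// For each row returned by the lookup, add $100 to the rent while
// keeping the id so the update node can match rows back to the sheet.
function raiseRent(rows, amount = 100) {
  return rows.map(row => ({ ...row, rent: Number(row.rent) + amount }));
}

// Example with one looked-up Berlin row:
const berlinRows = [{ id: 2, city: 'Berlin', rent: 1200 }];
console.log(raiseRent(berlinRows));
// → [ { id: 2, city: 'Berlin', rent: 1300 } ]
```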
by Jonathan | NEX
Effortlessly integrate the NixGuard API into your n8n workflows for real-time security insights using your API key. This connector enables seamless interaction with Nix, providing rapid Retrieval-Augmented Generation (RAG) event knowledge with Wazuh integration, completely free and set up in under 5 minutes!

Features:
- Query NixGuard's AI-driven security insights via API authentication
- Real-time security event knowledge integration
- Plug-and-play workflow trigger for effortless automation
- Wazuh compatibility for full security visibility

How to Use:
1. Add your API key to authenticate with NixGuard.
2. Integrate with your existing n8n workflows using the workflow trigger (enabled by default).
3. (Optional) Activate the chat trigger to streamline security queries via chat-based inputs.
4. Run the workflow and get instant security intelligence!

Perfect for: startup CTOs, SOC teams, security engineers, and developers needing real-time security automation within their infrastructure.

Learn more about NixGuard: thenex.world
Get started with a free security subscription: thenex.world/security/subscribe
by n8n Team
This workflow demonstrates how to connect an open-source model to a Basic LLM node. The workflow is triggered when a new manual chat message appears. The message is then run through a Language Model Chain that is set up to process text with a specific prompt to guide the model's responses. Note that open-source LLMs with a small number of parameters require slightly different prompting with more guidance to the model. You can change the default Mistral-7B-Instruct-v0.1 model to any other LLM supported by HuggingFace. You can also connect other nodes, such as Ollama. Note that to use this template, you need to be on n8n version 1.19.4 or later.
by Oneclick AI Squad
This automated n8n workflow qualifies B2B leads via voice calls using the VAPI API and integrates the collected data into Google Sheets. It triggers when a new lead's phone number is added, streamlining lead qualification and data capture.

What is VAPI?
VAPI is an API service that enables voice call automation, used here to qualify leads by capturing structured data through interactive calls.

Good to Know
- VAPI API calls may incur costs based on usage; check VAPI pricing for details.
- Ensure Google Sheets access is properly authorized to avoid data issues.
- Use credential fields for the HTTP Request node's Bearer token instead of hardcoding it.
- Use a placeholder Google Sheet document ID (e.g., "your-sheet-id-placeholder") to avoid leaking private data.

How It Works
1. Detect when a new phone number is added for a lead using the New Lead Captured node.
2. Use the Receive Lead Details from VAPI node to capture structured data (name, company, challenges) via a POST request.
3. Trigger an outbound VAPI call to qualify the lead with the Initiate Voice Call (VAPI) node.
4. Store the collected data in a Google Sheet using the Save Qualified Lead to CRM Sheet node.
5. Send a success response back to VAPI with the Send Call Data Acknowledgement node.

How to Use
1. Import the workflow into n8n.
2. Configure VAPI API credentials in the HTTP Request node using credential fields.
3. Set up Google Sheets API access and authorize the app.
4. Create a Google Sheet with the following columns: Name (text), Company (text), Challenges (text), Date (date).
5. Test with a sample lead phone number to verify call initiation and data storage.
6. Adjust the workflow as needed and retest.

Requirements
- VAPI API credentials
- Google Sheets API access

Customizing This Workflow
Modify the Receive Lead Details from VAPI node to capture additional lead fields, or adjust call scripts for specific industries.
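The step that receives lead details and prepares the sheet row can be sketched as a plain function. The payload shape (`name`, `company`, `challenges` at the top level) is an assumption; the exact structure depends on how the VAPI assistant is configured to post its results.

```javascript
// Validate the structured data posted back by VAPI and map it onto the
// sheet's columns (Name, Company, Challenges, Date).
function extractLead(body) {
  const { name, company, challenges } = body;
  if (!name || !company) {
    throw new Error('Incomplete lead payload from VAPI');
  }
  return {
    Name: name,
    Company: company,
    Challenges: challenges ?? '',
    Date: new Date().toISOString().slice(0, 10), // YYYY-MM-DD for the Date column
  };
}

// Example:
const leadRow = extractLead({
  name: 'Ada', company: 'Acme', challenges: 'Scaling outreach',
});
console.log(leadRow.Name); // Ada
```

Validating before the Google Sheets node keeps half-filled calls from polluting the CRM sheet.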
by Yaron Been
This workflow provides automated access to the Black Forest Labs Flux Krea Dev AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides seamless integration of image generation tasks into your n8n automation workflows.

Overview
This workflow handles the complete image generation process using the Black Forest Labs Flux Krea Dev model: API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: An opinionated text-to-image model from Black Forest Labs, in collaboration with Krea, that excels at photorealism and creates images that avoid the oversaturated "AI look".

Key Capabilities
- High-quality image generation from text prompts
- Advanced AI-powered visual content creation
- Customizable image parameters and styles
- Text-to-image transformation capabilities

Tools Used
- n8n: the automation platform that orchestrates the workflow
- Replicate API: access to the black-forest-labs/flux-krea-dev model
- Black Forest Labs Flux Krea Dev: the core AI model for image generation
- Built-in error handling: automatic retry logic and comprehensive error management

How to Install
1. Import the workflow: download the .json file and import it into your n8n instance.
2. Configure the Replicate API: add your Replicate API token to the 'Set API Token' node.
3. Customize parameters: adjust the model parameters in the 'Set Image Parameters' node.
4. Test the workflow: run it with your desired inputs.
5. Integrate: connect this workflow to your existing automation pipelines.

Use Cases
- Content creation: generate unique images for blogs, social media, and marketing materials
- Design prototyping: create visual concepts and mockups for design projects
- Art and creativity: produce artistic images for personal or commercial use
- Marketing materials: generate eye-catching visuals for campaigns and advertisements

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #imagegeneration #aiart #texttoimage #visualcontent #aiimages #generativeart #flux #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
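For orientation, the request the workflow sends to Replicate looks roughly like the sketch below. The helper name is hypothetical, and the input parameter names (`prompt`, `aspect_ratio`) are assumptions to be checked against the model's schema on Replicate; only `prompt` is certain to exist for a text-to-image model.

```javascript
// Build the JSON body for a Replicate prediction request.
// Parameter names beyond `prompt` are assumptions; consult the model page.
function buildPrediction(prompt, options = {}) {
  return {
    input: {
      prompt,
      aspect_ratio: options.aspectRatio ?? '1:1',
    },
  };
}

// The workflow's HTTP node would POST this with the API token, e.g.:
// fetch('https://api.replicate.com/v1/models/black-forest-labs/flux-krea-dev/predictions', {
//   method: 'POST',
//   headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
//   body: JSON.stringify(buildPrediction('a misty forest at dawn')),
// });
console.log(buildPrediction('a misty forest at dawn'));
```

The workflow then polls the returned prediction until its status is final and retrieves the output image URL.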
by Rahul Joshi
Description
This powerful n8n automation template enables seamless synchronization between Zoho Inventory and Supabase, keeping your product database up to date with zero manual effort. Whether you're running an eCommerce store, inventory dashboard, or product catalog app, this workflow ensures your data pipeline stays clean, consistent, and fully automated.

What This Template Does:
- Runs on a schedule to fetch inventory data from Zoho
- Authenticates via OAuth using a refresh token for secure API access
- Fetches products and variants with complete metadata
- Splits each item and maps it into Supabase row by row
- Pushes rich product data, including name, SKU, unit, tags, stock levels, dimensions, and up to 3 custom attributes

Fields Included in Sync:
- Product ID, Variant ID, Variant Name, Brand, SKU
- Returnability, Item Type, Unit, Attributes (1-3)
- Tags, Stock on Hand, UPC/EAN/ISBN, Status
- Reorder Level, Dimensions, Created Time, and more

Requirements:
- Zoho Inventory API access (with refresh token)
- Supabase account and API key
- Target table (e.g., Fairy Frills) set up in Supabase
- Optional: custom field mapping for additional use cases

Perfect For:
- Inventory managers syncing Zoho to custom dashboards
- D2C brands and eCommerce platforms powered by Supabase
- Internal tooling teams needing a real-time product database sync
- Startups replacing spreadsheets with a production-grade backend
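The per-item mapping step can be sketched as a plain function: one Zoho Inventory item/variant becomes one Supabase row. The Zoho field names below are assumptions inferred from the field list above; adjust them to match your actual Zoho payload and Supabase column names.

```javascript
// Map a single Zoho Inventory item onto a Supabase row. A subset of the
// synced fields is shown; extend the object for the full list.
function mapToSupabaseRow(item) {
  return {
    product_id: item.group_id,        // assumed Zoho field for the parent product
    variant_id: item.item_id,
    variant_name: item.name,
    sku: item.sku,
    unit: item.unit,
    stock_on_hand: item.stock_on_hand,
    status: item.status,
    created_time: item.created_time,
  };
}

// Example:
const supaRow = mapToSupabaseRow({
  group_id: 'g1', item_id: 'i1', name: 'Tutu - Pink / S', sku: 'FF-001',
  unit: 'pcs', stock_on_hand: 12, status: 'active', created_time: '2024-01-01',
});
console.log(supaRow.sku); // FF-001
```

In the workflow, a Split Out node feeds each Zoho item through this mapping before the Supabase insert/upsert.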
by Kev
Important: This workflow uses the Autype community node and requires a self-hosted n8n instance. This workflow downloads a fillable PDF form from a URL, extracts all form field names and types using Autype, sends the field list to an AI Agent (OpenAI) together with applicant data, and uses the AI response to fill the form automatically. The AI is instructed to return raw JSON only, and a Code node validates the response before filling. The filled PDF is flattened (non-editable) and saved to Google Drive. Who is this for? Companies that regularly submit the same types of PDF form applications -- permit renewals, tax filings, compliance questionnaires, insurance claims, customs declarations, or any recurring government/regulatory paperwork. Instead of manually filling the same form fields every quarter or year, the AI reads the form structure and fills it with the correct data automatically. Concrete example: A manufacturing company must renew its operating permit every year by submitting a multi-page PDF application to the local regulatory authority. The form asks for company name, registration number, address, contact person, business type, employee count, and more. With this workflow, the company stores its data once in the AI Agent prompt, and every renewal period they simply run the workflow to get a completed, flattened PDF ready for submission. This also works as an additional skill for an AI agent. Instead of a manual trigger, connect the workflow to a webhook or chat trigger so an agent can call it when a user asks "fill out the permit renewal form for Q2 2026." What this workflow does On manual trigger, the workflow fetches a fillable PDF from a URL (e.g. a government portal, internal document server, or S3 bucket). It uploads the PDF to Autype and calls Get Form Fields to extract every field name, type (text, checkbox, dropdown, radio), current value, available options, and read-only status. 
The field list is passed directly to an AI Agent via an inline expression (no separate prompt-building Code node needed). The AI's system message instructs it to return only raw JSON. A Code node validates and parses the response before Autype fills the form, flattens it, and the result is saved to Google Drive. Showcase How it works Run Workflow -- Manual trigger starts the pipeline. Download PDF Form -- An HTTP Request node fetches the fillable PDF from a URL (the sample uses a registration form with 7 fields). Upload PDF Form -- Uploads the PDF binary to Autype Tools to get a file ID. Get Form Fields -- Autype extracts all form fields and returns them as metadata. Each field includes: name, type (text/checkbox/dropdown/radio/optionlist), value (current), options (for dropdowns/radio), and isReadOnly. No output file is created. AI Agent -- Receives the field list and applicant data directly in its prompt via an n8n expression. The system message instructs the AI to return only a raw JSON object mapping field names to values (strings for text/dropdown/radio, booleans for checkboxes). Prepare Fill Data -- A Code node parses and validates the AI JSON response (strips markdown fences if present), then pairs it with the Autype file ID. Fill PDF Form -- Autype fills every form field with the AI-generated values. Fields are flattened (non-editable) so the output is a clean, final PDF. Save Filled PDF to Drive -- The completed form is uploaded to Google Drive as filled-form-YYYY-MM-DD.pdf. Setup Install the Autype community node (n8n-nodes-autype) via Settings > Community Nodes. Create an Autype API credential with your API key from app.autype.com. See API Keys in Settings. Create an OpenAI API credential with your key from platform.openai.com. Create a Google Drive OAuth2 credential and connect your Google account. Import this workflow and assign your credentials to each node (including the OpenAI Chat Model sub-node). The sample form URL is pre-configured. 
To use your own form, replace the URL in the "Download PDF Form" node. Edit the applicant data directly in the AI Agent node prompt (the "Prompt (User Message)" field). Set YOUR_FOLDER_ID in the "Save Filled PDF to Drive" node to your target Google Drive folder. Click Test Workflow to run the pipeline. Note: This is a community node, so you need a self-hosted n8n instance to use community nodes. Requirements Self-hosted n8n instance (community nodes are not available on n8n Cloud) Autype account with API key (free tier available) n8n-nodes-autype community node installed OpenAI API key (gpt-4o-mini or any chat model) Google Drive account with OAuth2 credentials (optional, can replace with other output) How to customize Change applicant data:** Edit the prompt text directly in the "AI Agent" node. Replace the example person/company info with your own. Use a different AI model:** Swap the OpenAI Chat Model sub-node for Anthropic Claude, Google Gemini, or any LangChain-compatible chat model. Connect to an AI agent:** Replace the Manual Trigger with a Webhook or Chat Trigger so an AI agent can call this workflow as a tool (e.g. "fill the Q2 permit renewal form"). Skip flattening:** Set flatten to false in the "Fill PDF Form" node if you want the fields to remain editable after filling. Add watermark:** Insert an Autype Watermark step after Fill Form to stamp "DRAFT" or "SUBMITTED" on every page before saving. Add password protection:** Insert an Autype Protect step after filling to encrypt the PDF before uploading to Drive. Change output destination:** Replace the Google Drive node with Email (SMTP), S3, Slack, or any other n8n output node. Pull data from a database:** Instead of hardcoding data in the AI Agent prompt, query a database (Postgres, MySQL, Airtable) or CRM (HubSpot, Salesforce) to dynamically fill different forms for different entities.
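The "Prepare Fill Data" Code node described above can be sketched as a plain function: strip any markdown fences the model wrapped around its output despite instructions, parse the JSON, and validate that the result is a flat object of field-name/value pairs before handing it to Autype.

```javascript
// Parse and validate the AI's JSON response for form filling.
function parseFillData(aiText) {
  // Remove leading ```json / trailing ``` fences if the model added them.
  const cleaned = aiText.trim()
    .replace(/^```(?:json)?\s*/i, '')
    .replace(/```$/, '')
    .trim();
  const data = JSON.parse(cleaned);
  if (typeof data !== 'object' || data === null || Array.isArray(data)) {
    throw new Error('AI response is not a JSON object of field values');
  }
  return data;
}

// Example with a fenced response (fence built dynamically for readability):
const fence = '`'.repeat(3);
const raw = fence + 'json\n{"full_name": "Jane Doe", "subscribe": true}\n' + fence;
console.log(parseFillData(raw)); // { full_name: 'Jane Doe', subscribe: true }
```

Throwing on malformed output stops the run before Autype fills the form with garbage, which is exactly the validation role the node plays in the pipeline.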
by Automate With Marc
Automated Daily Firecrawl Scraper with Telegram Alerts

Get structured insights scraped daily from the web using Firecrawl's AI extraction engine, then send them directly to your Telegram chat.

What this workflow does:
This workflow automatically scrapes specific structured data from any webpage every day at a scheduled time using the Firecrawl API, checks whether results are returned, and then sends the formatted results to Telegram.

For step-by-step video tutorials of n8n builds, check out my channel: https://www.youtube.com/@Automatewithmarc

How It Works:
1. Schedule Trigger (daily at 6 PM): starts the workflow every day at a set time.
2. Firecrawl POST request: sends a custom extraction prompt and schema to Firecrawl, targeting any list of URLs you provide.
3. 30-second wait: gives Firecrawl enough time to complete processing.
4. GET Firecrawl result: fetches the extraction results using the request ID.
5. Loop with IF node: checks whether data is returned; if not, waits another 15 seconds and retries.
6. Format and clean (Set node): prepares and formats the extracted result into a readable message.
7. Telegram Message node: delivers the structured data directly to your Telegram channel or group.

Requirements:
- Firecrawl API key (header auth)
- Telegram bot token and chat ID

Use Cases:
- Extract structured data (like product info or events) from niche websites
- Automate compliance monitoring or intelligence gathering
- Create market alert bots with real-time info delivery

Customization Ideas:
- Swap Telegram for Gmail, Discord, or Slack
- Expand the schema to include more complex nested fields
- Add a Google Sheets node to log daily scraped data
- Integrate a summarizer or language model for intelligent summaries

Ready to automate your web intelligence gathering? Let Firecrawl do the scraping, and let this workflow do the rest.
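The IF-node check and the Set-node formatting can be sketched as two plain functions. The response shape (a `status` field and a `data` array of extracted records) is an assumption about Firecrawl's extraction result; verify the field names against your actual response before wiring this in.

```javascript
// Decide whether the Firecrawl job has finished and produced data
// (the IF node's condition).
function isReady(result) {
  return result.status === 'completed'
    && Array.isArray(result.data)
    && result.data.length > 0;
}

// Flatten the extracted records into a Telegram-ready message
// (the Set node's formatting).
function formatMessage(result) {
  return result.data
    .map(item => Object.entries(item).map(([k, v]) => `${k}: ${v}`).join('\n'))
    .join('\n---\n'); // separator between records
}

// Example:
const fcResult = { status: 'completed', data: [{ title: 'Launch event', date: '2025-06-01' }] };
console.log(isReady(fcResult)); // true
console.log(formatMessage(fcResult));
```

When `isReady` returns false, the workflow loops back through the 15-second Wait node and polls again.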
by Sebastien
How to use
1. Get a .csv file with your contacts (you can download this from any contact manager app).
2. Set up credentials for the Google Drive API and Notion (you need to create a "connection" in Notion).
3. Create a database for your contacts in Notion.
4. Choose which properties to extract from the .csv and pass them into the Notion database. Right now, it transfers four pieces of information: full name, email, phone, and company.
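The mapping from one CSV row to Notion database properties can be sketched as follows. The CSV column names and Notion property names are assumptions matching the four fields transferred; the property value shapes (`title`, `email`, `phone_number`, `rich_text`) follow Notion's pages API.

```javascript
// Map one parsed CSV row onto the Notion database's property objects.
function toNotionProperties(row) {
  return {
    Name: { title: [{ text: { content: row.full_name } }] },
    Email: { email: row.email },
    Phone: { phone_number: row.phone },
    Company: { rich_text: [{ text: { content: row.company } }] },
  };
}

// Example:
const props = toNotionProperties({
  full_name: 'Jane Doe', email: 'jane@example.com',
  phone: '+1 555 0100', company: 'Acme',
});
console.log(props.Email.email); // jane@example.com
```

Adding more contact fields means adding one more line to this mapping and a matching property in the Notion database.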