by kreonovo
What this does: This automation dynamically creates a channel on your Discord server for each of your Webflow forms, then sends formatted form submissions as messages in those channels. This is useful because Webflow will only notify a single email address of a form submission. By using this workflow you can enhance your Webflow form management by receiving submissions in Discord. This is great if you need to notify multiple team members or communities of your form submissions.

Usage guide
Full written and video guide

Simply create credentials for Webflow and Discord and connect them to the nodes. The video guide demonstrates a real-world use case using a Webflow template and breaks down each node in detail, explaining how it works.
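For orientation, here is a minimal Node.js sketch of the two Discord REST calls this kind of automation relies on: creating a text channel named after a form, then posting a submission message into it. The bot token, guild ID, and permission setup are assumptions for illustration; the template's own nodes handle these details through your Discord credentials.

```javascript
// Hedged illustration of the Discord API calls behind this automation.
const BOT_TOKEN = 'YOUR_DISCORD_BOT_TOKEN'; // assumption: bot with Manage Channels permission
const GUILD_ID = '123456789012345678';      // placeholder guild ID

async function createChannelAndPost(formName, submissionText) {
  const headers = {
    'Authorization': `Bot ${BOT_TOKEN}`,
    'Content-Type': 'application/json',
  };

  // Create a text channel named after the Webflow form (type 0 = guild text channel).
  const channelRes = await fetch(
    `https://discord.com/api/v10/guilds/${GUILD_ID}/channels`,
    { method: 'POST', headers, body: JSON.stringify({ name: formName, type: 0 }) }
  );
  const channel = await channelRes.json();

  // Post the formatted form submission into the channel.
  await fetch(`https://discord.com/api/v10/channels/${channel.id}/messages`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ content: submissionText }),
  });
}
```

In practice the workflow would first list existing channels (GET /guilds/{guild.id}/channels) so repeat submissions reuse a form's channel instead of creating duplicates.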
by jason
If you have made some investments in cryptocurrency, this workflow will let you create an Airtable base that updates the value of your portfolio every hour. You can then track how well your investments are doing. You can check out my Airtable base to see how it works, or even copy my base so that you can customize this workflow for yourself. To implement this workflow, you will need to update the Airtable nodes with your own credentials and make sure that they are pointing to your Airtable base.
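The template's price source isn't spelled out above, so as an illustration, here is a Node.js sketch of the hourly valuation step, assuming CoinGecko's public price endpoint as a stand-in. The holdings are placeholders; the output maps naturally onto Airtable fields.

```javascript
// Sketch of the hourly portfolio valuation, assuming CoinGecko as the price
// source (the actual template may use a different market-data API).
const holdings = [
  { id: 'bitcoin', quantity: 0.5 },   // placeholder holdings
  { id: 'ethereum', quantity: 4.2 },
];

const ids = holdings.map(h => h.id).join(',');
const res = await fetch(
  `https://api.coingecko.com/api/v3/simple/price?ids=${ids}&vs_currencies=usd`
);
const prices = await res.json(); // e.g. { bitcoin: { usd: 64000 }, ethereum: { usd: 3100 } }

// One record per holding, ready to map onto Airtable fields.
const records = holdings.map(h => ({
  coin: h.id,
  quantity: h.quantity,
  value_usd: h.quantity * prices[h.id].usd,
}));
console.log(records);
```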
by Yaron Been
Luma Photon Flash Image Generator

Description
Accelerated variant of Photon prioritizing speed while maintaining quality.

Overview
This n8n workflow integrates with the Replicate API to use the luma/photon-flash model. This powerful AI model can generate high-quality image content based on your inputs.

Features
- Easy integration with Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

Parameters

Required Parameters
- **prompt** (string): Text prompt for image generation

Optional Parameters
- **seed** (integer, default: None): Random seed. Set for reproducible generation
- **aspect_ratio** (string, default: 16:9): Aspect ratio of the generated image
- **image_reference** (string, default: None): Reference image to guide generation
- **style_reference** (string, default: None): Style reference image to guide generation
- **character_reference** (string, default: None): Character reference image to guide generation
- **image_reference_url** (string, default: None): Deprecated: use image_reference instead
- **style_reference_url** (string, default: None): Deprecated: use style_reference instead
- **image_reference_weight** (number, default: 0.85): Weight of the reference image. Larger values will make the reference image have a stronger influence on the generated image.
- **style_reference_weight** (number, default: 0.85): Weight of the style reference image
- **character_reference_url** (string, default: None): Deprecated: use character_reference instead

How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate image content
4. Access the generated output from the final node

API Reference
- Model: luma/photon-flash
- API Endpoint: https://api.replicate.com/v1/predictions

Requirements
- Replicate API key
- n8n instance
- Basic understanding of image generation parameters
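As a reference for what the workflow's HTTP calls amount to, here is a minimal Node.js sketch that creates a prediction for this model. The model-scoped endpoint shown is one documented way to run official Replicate models; the generic /v1/predictions endpoint listed above also works if you supply a version field. Prompt values are placeholders.

```javascript
// Minimal sketch: create a prediction for luma/photon-flash via Replicate.
// Assumes REPLICATE_API_TOKEN is set in the environment.
const res = await fetch(
  'https://api.replicate.com/v1/models/luma/photon-flash/predictions',
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.REPLICATE_API_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      input: {
        prompt: 'a lighthouse at dusk, cinematic lighting', // placeholder prompt
        aspect_ratio: '16:9',
        image_reference_weight: 0.85,
      },
    }),
  }
);
const prediction = await res.json();
console.log(prediction.id, prediction.status); // e.g. "starting"
```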
by Yaron Been
Fire V Sekai.mediapipe Labeler Image Generator

Description
Mediapipe Blendshape Labeler - predicts the blend shapes of an image.

Overview
This n8n workflow integrates with the Replicate API to use the fire/v-sekai.mediapipe-labeler model. This powerful AI model can generate high-quality image content based on your inputs.

Features
- Easy integration with Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

Parameters

Required Parameters
- **media_path** (string): Input image, video, or training zip file

Optional Parameters
- **test_mode** (boolean, default: False): Enable test mode for quick verification
- **max_people** (integer, default: 100): Maximum number of people to detect (1-100)
- **export_train** (boolean, default: True): Export training zip containing JSON annotations and frame PNGs
- **aligned_media** (string, default: None): Optional video that is aligned with the input video's annotations
- **frame_sample_rate** (integer, default: 1): Process every nth frame for video input

How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate image content
4. Access the generated output from the final node

API Reference
- Model: fire/v-sekai.mediapipe-labeler
- API Endpoint: https://api.replicate.com/v1/predictions

Requirements
- Replicate API key
- n8n instance
- Basic understanding of image generation parameters
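The "automated status checking and result retrieval" feature listed above boils down to polling the prediction until it reaches a terminal state. A minimal sketch, assuming the same Bearer token used for the creation call:

```javascript
// Poll a Replicate prediction until it settles; the prediction ID comes from
// the creation response.
async function waitForPrediction(id, token, intervalMs = 2000) {
  while (true) {
    const res = await fetch(`https://api.replicate.com/v1/predictions/${id}`, {
      headers: { 'Authorization': `Bearer ${token}` },
    });
    const prediction = await res.json();

    // Terminal states in the Replicate API.
    if (['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
      return prediction; // prediction.output holds the result on success
    }
    await new Promise(r => setTimeout(r, intervalMs)); // retry after a pause
  }
}
```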
by Baptiste Fort
🎯 Workflow Goal

Still manually checking form responses in your inbox? What if every submission landed neatly in Airtable, and you got a clean Slack message instantly? That's exactly what this workflow does. No code, no delay, just a smooth automation to keep your team in the loop: Tally → Airtable → Slack.

Build an automated flow that:
- receives Tally form submissions,
- cleans up the data into usable fields,
- stores the results in Airtable, and
- automatically notifies a Slack channel.

Step 1 – Connect Tally to n8n

What we're setting up: a Webhook node in POST mode.

Technical
1. Add a Webhook node.
2. Set it to POST.
3. Copy the generated URL.
4. In Tally → Integrations → Webhooks → paste this URL.
5. Submit a test response on your form to capture a sample structure.

Step 2 – Clean the data

After connecting Tally, you now receive raw data inside a fields[] array. Let's convert that into something clean and structured.

Goal: extract key info like Full Name, Email, and Phone into simple keys.

What we're doing: adding a Set node to remap and clean the fields (an alternative Code-node approach is sketched after this guide).

Technical
1. Add a Set node right after the Webhook.
2. Add new values (String type) manually:
   - Name: Full Name → Value: {{$json["fields"][0]["value"]}}
   - Name: Email → Value: {{$json["fields"][1]["value"]}}
   - Name: Phone → Value: {{$json["fields"][2]["value"]}}
   (Adapt the indexes based on your form structure.)
3. Use the data preview in the Webhook node to check the correct order.

Output
You now get clean data like:
{
  "Full Name": "Jane Doe",
  "Email": "jane@example.com",
  "Phone": "+123456789"
}

Step 3 – Send to Airtable ✅

Once the data is cleaned, let's store it in Airtable automatically.

Goal: create one new Airtable row for each form submission.

What we're setting up: an Airtable – Create Record node.

Technical
1. Add an Airtable node.
2. Authenticate or connect your API token.
3. Choose the base and table.
4. Map the fields:
   - Name: {{$json["Full Name"]}}
   - Email: {{$json["Email"]}}
   - Phone: {{$json["Phone"]}}

Output
Each submission creates a clean new row in your Airtable table.

Step 4 – Add a delay ⌛

After saving to Airtable, it's a good idea to insert a short pause; this prevents actions like Slack messages from stacking too fast.

Goal: wait a few seconds before sending a Slack notification.

What we're setting up: a Wait node for X seconds. ✅

Technical
1. Add a Wait node.
2. Choose Wait for X seconds.

Step 5 – Send a message to Slack 💬

Now that the record is stored, let's send a Slack message to notify your team.

Goal: automatically alert your team in Slack when someone fills in the form.

What we're setting up: a Slack – Send Message node.

Technical
1. Add a Slack node.
2. Connect your account.
3. Choose the target channel, like #leads.
4. Use this message format:

New lead received!
Name: {{$json["Full Name"]}}
Email: {{$json["Email"]}}
Phone: {{$json["Phone"]}}

Output
Your Slack team is notified instantly, with all lead info in one clean message.

Workflow Complete

Your automation now looks like this: Tally → Clean → Airtable → Wait → Slack. Every submission turns into clean data, gets saved in Airtable, and alerts your team on Slack: fully automated, no extra work.
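As an optional hardening step (not part of the five steps above), a Code node can map Tally fields by label instead of positional index, so reordering questions in Tally doesn't silently break the mapping. This sketch assumes the common Tally payload shape with data.fields entries carrying label and value; verify against your own Webhook data preview.

```javascript
// Alternative to the Set node: map Tally fields by label, not position.
// Assumption: payload looks like { data: { fields: [{ label, value, ... }] } }.
const payload = $input.first().json;
const fields = payload.data?.fields ?? payload.fields ?? [];
const byLabel = Object.fromEntries(fields.map(f => [f.label, f.value]));

return [{
  json: {
    'Full Name': byLabel['Full Name'],
    'Email': byLabel['Email'],
    'Phone': byLabel['Phone'],
  },
}];
```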
by Paul Taylor
📩 Gmail → GPT → Supabase | Task Extractor

This n8n workflow automates the extraction of actionable tasks from unread Gmail messages using OpenAI's GPT API, stores the resulting task metadata in Supabase, and avoids re-processing previously handled emails.

✅ What It Does
1. Triggers on a schedule to check for unread emails in your Gmail inbox.
2. Loops through each email individually using SplitInBatches.
3. Checks Supabase to see if the email has already been processed.
4. If it's a new email:
   - Formats the email content into a structured GPT prompt
   - Calls ChatGPT-4o to extract structured task data
   - Inserts the result into your emails table in Supabase

🧰 Prerequisites
Before using this workflow, you must have:
- An active n8n Cloud or self-hosted instance
- A connected Gmail account with OAuth credentials in n8n
- A Supabase project with an emails table and:

  ALTER TABLE emails ADD CONSTRAINT unique_email_id UNIQUE (email_id);

- An OpenAI API key with access to GPT-4o or GPT-3.5-turbo

🔐 Required Credentials

| Name           | Type   | Description                     |
|----------------|--------|---------------------------------|
| Gmail OAuth    | Gmail  | To pull unread messages         |
| OpenAI API Key | OpenAI | To generate task summaries      |
| Supabase API   | HTTP   | For inserting rows via REST API |

🔁 Environment Variables or Replacements
- Supabase_TaskManagement_URI → e.g., https://your-project.supabase.co
- Supabase_TaskManagement_ANON_KEY → Your Supabase anon key

These are used in the HTTP request to Supabase; a sketch of that request appears at the end of this template.

⏰ Scheduling / Trigger
- Triggered using a Schedule node
- Default: every X minutes (adjust to your preference)
- Uses a Gmail API filter: unread emails with label = INBOX

🧠 Intended Use Case
> Designed for productivity-minded professionals who want to extract, summarize, and store actionable tasks from incoming email — without processing the same email twice or wasting GPT API credits.

This is part of a larger system integrating GPT, calendar scheduling, and optional task platforms (like ClickUp).

📦 Output (Stored in Supabase)
Each processed email includes:
- email_id
- subject
- sender
- received_at
- body (email snippet)
- gpt_summary (structured task)
- requires_deep_work (from GPT logic)
- deleted (initially false)
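For readers wiring the Supabase HTTP request by hand, here is a minimal sketch following standard Supabase REST conventions. The conflict handling shown (on_conflict plus the Prefer header) is one way to exploit the unique_email_id constraint; the template may instead rely purely on its upstream "already processed" check. All values are placeholders.

```javascript
// Sketch of the Supabase insert performed by the HTTP Request node.
const SUPABASE_URI = 'https://your-project.supabase.co'; // Supabase_TaskManagement_URI
const ANON_KEY = 'YOUR_ANON_KEY';                        // Supabase_TaskManagement_ANON_KEY

await fetch(`${SUPABASE_URI}/rest/v1/emails?on_conflict=email_id`, {
  method: 'POST',
  headers: {
    'apikey': ANON_KEY,
    'Authorization': `Bearer ${ANON_KEY}`,
    'Content-Type': 'application/json',
    'Prefer': 'resolution=ignore-duplicates', // skip rows that already exist
  },
  body: JSON.stringify({
    email_id: 'msg-123',                       // placeholder row values
    subject: 'Quarterly report',
    sender: 'alice@example.com',
    received_at: new Date().toISOString(),
    body: 'Email snippet...',
    gpt_summary: 'Prepare Q3 figures by Friday',
    requires_deep_work: true,
    deleted: false,
  }),
});
```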
by Daniel Ng
Restore All n8n Workflows from Google Drive Backups

Restoring multiple n8n workflows manually, especially when migrating your n8n instance to another host or server, can be an incredibly daunting and time-consuming task. Imagine having to individually export and then manually import hundreds of workflows; it's a recipe for errors and significant downtime.

This workflow provides a streamlined way to restore all your n8n workflows from backup JSON files stored in a designated Google Drive folder. It's an essential tool for disaster recovery, migrating workflows to a new n8n instance, or recovering from accidental deletions, ideally used in conjunction with a backup solution like our "Auto Backup Workflows To Google Drive" template.

For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

Who is this for?
This template is intended for:
- **n8n Users and Administrators:** who have previously backed up their n8n workflows as JSON files to Google Drive.
- **Anyone needing to recover their n8n setup:** whether due to system failure, data corruption, accidental deletions, or during an instance migration.

What problem is this workflow solving? / use case
Restoring multiple n8n workflows manually can be a slow and error-prone process. This workflow solves that by:
- **Automating Bulk Restore:** quickly re-imports all workflows from a specified Google Drive backup folder, drastically cutting down on manual effort.
- **Disaster Recovery:** enables rapid recovery of your automation environment, minimizing downtime after a system failure or data corruption.
- **Simplified Instance Migration:** makes the process of transferring your entire workflow suite to a new n8n server significantly more manageable and less error-prone compared to manual imports.
- **Data Integrity:** helps restore workflows to a known good state from your backups, ensuring consistency after a recovery or migration.

What this workflow does
1. Manual Trigger: you initiate the workflow manually whenever a restore operation is needed.
2. List Backup Files: the workflow accesses a specific Google Drive folder (which you must configure) and lists all the files within it. It assumes these are your n8n workflow JSON backup files.
3. Iterate and Process: it then loops through each file found in the Google Drive folder:
   - Download Workflow: downloads the individual workflow JSON file from Google Drive.
   - Extract Content: parses the downloaded file to extract the JSON data representing the workflow.
   - Import to n8n: uses the n8n API to create a new workflow (or update an existing one if an ID match is found) in your current n8n instance using the extracted JSON data (see the API sketch after this template).
   - Wait Step: pauses for 3 seconds after attempting to create each workflow to help manage system load and avoid potential API rate-limiting issues.

Step-by-step setup
1. Import Template: upload the provided JSON file into your n8n instance.
2. Configure Credentials:
   - Google Drive nodes: you will need to create or select existing Google Drive OAuth2 API credentials for these nodes.
   - n8n node: configure your n8n API credentials to allow the workflow to create/update workflows in your instance.
3. Specify Google Drive Backup Folder (CRITICAL):
   - Open the "Google Drive Get All Workflows" node.
   - Locate the "Filter" section, and within it, the "Folder ID" parameter.
   - The default value is a placeholder URL. You MUST change this URL to the direct URL of the Google Drive folder that contains your n8n workflow .json backup files. This would typically be one of the hourly folders (e.g., n8n_backup_YYYY-MM-DD_HH) created by the companion backup workflow.
4. Activate Workflow: although manually triggered, the workflow needs to be active in your n8n instance to be runnable.

How to customize this workflow to your needs
- **Selective Restore:**
  - Option 1 (Manual): before running the workflow, manually move only the specific workflow JSON files you want to restore into the source Google Drive folder configured in the "Google Drive Get All Workflows" node.
  - Option 2 (Automated Filter): insert an "Edit Fields" or "Filter" node after the "Google Drive Get All Workflows" node to programmatically select which files (e.g., based on filename patterns) should proceed to the "Loop Over Items" node for restoration.
- **Adjust Wait Time:** the "Wait" node is set to 3 seconds. You can increase this if you have a very large number of workflows or if your n8n instance requires more time between API calls. Conversely, for smaller batches on powerful instances, you might decrease it.
- **Error Handling:** for enhanced robustness, consider adding error handling branches (e.g., using "Error Trigger" nodes or "Continue on Fail" settings within nodes) to log or send notifications if a specific workflow fails to import.

Important Considerations
- **Workflow Overwriting/Updating:** if a workflow with the same id as one in a backup JSON file already exists in your n8n instance, this restore process will typically update/overwrite that existing workflow with the version from the backup. If the id from the backup file does not correspond to any existing workflow, a new workflow will be created.
- **Idempotency:** running this workflow multiple times on the exact same backup folder will cause the workflow to re-process all files. This means workflows will be updated/overwritten again if they exist, or created if they don't. Ensure this is the intended behavior.
- **Companion Backup Workflow:** this restore workflow is ideally paired with backups created by a process like our "Auto Backup Workflows To Google Drive" template, which saves workflows in the expected JSON format.
- **Test Safely:** it's highly recommended to test this workflow on a non-production or development n8n instance first, especially when restoring a large number of critical workflows or if you're unsure about the overwrite behavior in your specific n8n setup.
- **Source Folder Content:** ensure the specified Google Drive folder only contains n8n workflow JSON files that you intend to restore. Other file types may cause errors in the "Extract from File" node.
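For context, the import step corresponds to calls like the following against n8n's public REST API; the template's built-in n8n node performs the equivalent operations through your configured API credentials. Base URL and key are placeholders.

```javascript
// Sketch of the "Import to n8n" step via the public REST API (v1).
const N8N_URL = 'https://your-n8n-host'; // placeholder instance URL
const API_KEY = 'YOUR_N8N_API_KEY';      // placeholder API key

async function importWorkflow(wf) {
  const headers = { 'X-N8N-API-KEY': API_KEY, 'Content-Type': 'application/json' };
  const body = JSON.stringify({
    name: wf.name,
    nodes: wf.nodes,
    connections: wf.connections,
    settings: wf.settings ?? {},
  });

  // Try to update an existing workflow with the same ID; otherwise create new.
  if (wf.id) {
    const res = await fetch(`${N8N_URL}/api/v1/workflows/${wf.id}`, {
      method: 'PUT', headers, body,
    });
    if (res.ok) return res.json();
  }
  return (await fetch(`${N8N_URL}/api/v1/workflows`, {
    method: 'POST', headers, body,
  })).json();
}
```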
by Daniel Ng
This n8n workflow template uses community nodes and is only compatible with the self-hosted version of n8n.

Auto Backup n8n Credentials to Google Drive

This workflow automates the backup of all your n8n credentials. It can be triggered manually for on-demand backups or will run automatically on a schedule (defaulting to daily execution). It executes a command to export decrypted credentials, formats them into a JSON file, and then uploads this file to a specified Google Drive folder. This process is essential for creating secure backups of your sensitive credential data, facilitating instance recovery or migration.

We recommend you use this backup workflow in conjunction with a restore solution like our "Restore Credentials from Google Drive Backups" template. For more powerful n8n templates, visit our website or contact us at AI Automation Pro. We help your business build custom AI workflow automation and apps.

Who is this for?
This workflow is designed for n8n administrators and users who require a reliable method to back up their n8n credentials. It is particularly beneficial for those managing self-hosted n8n instances, where direct server access allows for command-line operations.

What problem is this workflow solving? / use case
Managing and backing up n8n credentials manually can be a tedious task, susceptible to errors and often overlooked. This workflow solves the problem by providing an automated, secure, and consistent way to back up all credential data. The primary use case is to ensure that a recovery point for credentials exists, safeguarding against data loss, assisting in instance migrations, or for general disaster recovery preparedness, ideally on a regular, automated basis.

What this workflow does
1. Triggers: the workflow includes two types of triggers:
   - Manual Trigger: an "On Click Trigger" allows for on-demand execution whenever needed.
   - Scheduled Trigger: a "Schedule Trigger" is included, designed for automated daily backups.
2. Export Credentials: an "Execute Command" node runs the shell command npx n8n export:credentials --all --decrypted. This command exports all credentials from the n8n instance in a decrypted JSON format.
3. Format JSON Data: the output from the command is processed by a "Code" node ("JSON Formatting Data"). This node extracts, parses, and formats the JSON to ensure it is well-structured (a sketch follows this template).
4. Aggregate Credentials: an "Aggregate" node ("Aggregate Cridentials") combines individual credential entries into a single JSON array.
5. Convert to File: the "Convert To File" node transforms the aggregated JSON array into a binary file, preparing it as n8n_backup_credentials.json.
6. Upload to Google Drive: the "Google Drive Upload File" node uploads the generated JSON file to a specified folder in Google Drive.

Step-by-step setup
To use this workflow, you'll need to configure a few things:
1. n8n Instance Environment: the n8n instance must have access to the npx command and the n8n-cli package, and the "Execute Command" node must be able to run shell commands on the server where n8n is hosted.
2. Google Drive Credentials: in the "Google Drive Upload File" node, select or create your Google Drive OAuth2 API credentials.
3. Google Drive Folder ID: update the folderId parameter in the "Google Drive Upload File" node with the ID of your desired Google Drive folder.
4. File Name (Optional): the backup file will be named n8n_backup_credentials.json. You can customize this in the "Google Drive Upload File" node.
5. Configure Schedule Trigger: the workflow includes a "Schedule Trigger". Review its configuration to ensure it runs daily at your preferred time.

How to customize this workflow to your needs
- **Adjust Schedule:** fine-tune the "Schedule Trigger" for different intervals (e.g., weekly, hourly) or specific days/times as per your requirements.
- **Notifications:** add notification nodes (e.g., Slack, Email, Discord) after the "Google Drive Upload File" node to receive alerts upon successful backup or in case of failures.
- **Enhanced Error Handling:** incorporate error handling branches using "Error Trigger" nodes or conditional logic to manage potential failures.
- **Client-Side Encryption (Advanced):** if your security policy requires the backup file itself to be encrypted at rest in Google Drive, you can add a step before uploading. Insert a "Code" node or use an "Execute Command" node with an encryption utility (like GPG) to encrypt the n8n_backup_credentials.json file. Remember that you would then need a corresponding decryption process.
- **Dynamic File Naming:** modify the "Google Drive Upload File" node to include a timestamp in the filename (e.g., n8n_backup_credentials_{{$now.toFormat('yyyyMMddHHmmss')}}.json) to keep multiple versions of backups.

Important Note on Credential Security
To simplify the setup and use of this backup workflow, the exported credentials are stored in the resulting JSON file in a decrypted state. This means the backup file itself is not further encrypted by this workflow. Consequently, it is critically important to:
- Ensure the Google Drive account used for backups is highly secure (e.g., strong password, two-factor authentication).
- Restrict access to the Google Drive folder where these backups are stored to only authorized personnel.
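As a sketch of what the "JSON Formatting Data" Code node has to do: take the stdout of the export command and turn it into n8n items. The template's exact field handling may differ; this version is written for a Code node in "Run Once for All Items" mode and is defensive about log lines preceding the JSON.

```javascript
// Parse the stdout of `npx n8n export:credentials --all --decrypted`.
// Assumption: the Execute Command node exposes the output in `stdout`
// and the command prints a JSON array of credential objects.
const raw = $input.first().json.stdout ?? '';

// The CLI may print log lines before the JSON; keep everything from the
// first '[' onward as a defensive measure.
const jsonText = raw.slice(raw.indexOf('['));
const credentials = JSON.parse(jsonText);

// One n8n item per credential, ready for the Aggregate node.
return credentials.map(c => ({ json: c }));
```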
by Fenngbrotalk
n8n Workflow: AI-Powered Stock Chart Analysis Bot for Telegram

This is a powerful n8n automation workflow that integrates a Telegram bot with OpenAI's multimodal large language model (GPT-4 Vision) to provide users with real-time stock chart analysis.

Workflow Breakdown
- **Receive Image:** the workflow is initiated by a Telegram Trigger. It activates whenever a user sends an image (e.g., a stock's candlestick chart) to a designated Telegram chat, automatically downloading the file.
- **Image Pre-processing:** to optimize the AI's performance and efficiency, the Edit Image node resizes the incoming image to a standard 512x512 pixel format.
- **AI Vision Analysis:** the processed image is then passed to a LangChain Chain, which utilizes the OpenAI GPT-4 Vision model. A sophisticated system prompt instructs the AI to act as a professional stock analyst.
- **Intelligent Interpretation:** the AI analyzes the image to identify the stock's name, price trend (uptrend, downtrend, or sideways), key support/resistance levels, and volume changes. It then generates a comprehensive analysis report combining technical indicators and market sentiment.
- **Structured Output:** to ensure reliability and consistency, the AI's output is parsed into a specific JSON format. This structure includes a search_word (for the industry/sector) and the main content (the analysis text). A parsing sketch follows below.
- **Send Response:** finally, the workflow extracts the content field from the JSON output and uses the Telegram node to send this professional analysis back to the user as a text message in the same chat.

Key Features
- **User-Friendly:** users simply send an image to get an analysis, requiring no complex commands.
- **Instant & Efficient:** the entire analysis and response process is fully automated and completed within moments.
- **Professional-Grade Analysis:** leverages the advanced image recognition and reasoning capabilities of GPT-4 Vision to deliver insights comparable to those of a human analyst.
- **Reliable & Consistent:** the use of structured output ensures that the format of the response is always consistent and easy to read or process further.
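Here is a sketch of what the structured-output parsing can look like in a Code node. The two field names (search_word, content) come from the description above; the input field names and validation logic are assumptions for illustration.

```javascript
// Parse and validate the LLM's structured JSON output.
function parseAnalysis(raw) {
  // The model is prompted to return JSON like:
  // { "search_word": "semiconductors", "content": "The chart shows an uptrend..." }
  const parsed = typeof raw === 'string' ? JSON.parse(raw) : raw;
  if (typeof parsed.search_word !== 'string' || typeof parsed.content !== 'string') {
    throw new Error('LLM response missing search_word or content');
  }
  return parsed;
}

// Assumption: the chain's output lands in `output` or `text` on the item.
const item = $input.first().json;
const { content } = parseAnalysis(item.output ?? item.text);

return [{ json: { content } }]; // content is what the Telegram node sends back
```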
by isa024787bel
This n8n workflow automates sending out SMS notifications via Vonage that include new tech-related vocabulary every day. To build this handy vocabulary improver, you'll need the following:
- n8n – you can find details on how to install n8n on the Quickstart page.
- LingvaNex account – you can create a free account here. Up to 200,000 characters are included in the free plan when you generate your API key.
- Airtable account – you can register for free.
- Vonage account – you can sign up free of charge if you aren't already registered.
by scrapeless official
> ⚠️ Disclaimer: This workflow uses Scrapeless and Claude AI via community nodes, which require a self-hosted n8n instance to work properly.

🔁 How It Works
This intelligent B2B lead generation workflow combines search automation, website crawling, AI analysis, and multi-channel output:
1. It starts by using Scrapeless's Deep SERP API to find company websites from targeted Google Search queries.
2. Each result is then individually crawled using Scrapeless's Crawler module, retrieving key business information from pages like /about, /contact, /services.
3. The raw web content is processed via a Code node to clean, extract, and prepare structured data.
4. The cleaned data is passed to Claude Sonnet (Anthropic), which analyzes and qualifies the lead based on content richness, contact data, and relevance.
5. A filter step ensures only high-quality leads (e.g. lead score ≥ 6) are kept.
6. Qualified leads are sent via Discord webhook for real-time notification (this can be replaced with Slack, email, or CRM tools).

> 📌 The result is a fully automated system that finds, qualifies, and organizes B2B leads with high efficiency and minimal manual input.

✅ Pre-Conditions
Before using this workflow, make sure you have:
- An n8n self-hosted instance
- A Scrapeless account and API key (get it here)
- An Anthropic Claude API key
- A configured Discord webhook URL (or alternative notification service)

⚙️ Workflow Overview
Manual Trigger → Scrapeless Google Search → Item Lists → Scrapeless Crawler → Code (Data Cleaning) → Claude Sonnet → Code (Response Parser) → Filter → Discord Notification

🔨 Step-by-Step Breakdown
1. Manual Trigger – for testing purposes (can be replaced with Cron or Webhook)
2. Scrapeless Google Search – queries target B2B topics via Scrapeless's Deep SERP API
3. Item Lists – splits search results into individual items
4. Scrapeless Crawler – visits each company domain and scrapes structured content
5. Code Node (Data Cleaner) – extracts and formats content for LLM input
6. Claude Sonnet (via HTTP Request) – evaluates lead quality, relevance, and contact info
7. Code Node (Parser) – parses Claude's JSON response (sketched after this template)
8. IF Filter – filters leads based on score threshold
9. Discord Webhook – sends formatted message with company info

🧩 Customization Guidance
You can easily adjust the workflow to match your needs:
- **Lead Criteria:** modify the Claude prompt and scoring logic in the Code node
- **Output Channels:** replace the Discord webhook with Slack, Email, Airtable, or any CRM node
- **Search Topics:** change your query in the Scrapeless SERP node to find leads in different niches or countries
- **Scoring Threshold:** adjust the filter logic (lead_score >= 6) to match your quality tolerance

🧪 How to Use
1. Insert your Scrapeless and Claude API credentials in the designated nodes
2. Replace or configure the Discord webhook (or alternative outputs)
3. Run the workflow manually (or schedule it)
4. View qualified leads directly in your chosen notification channel

📦 Output Example
Each qualified lead includes:
- 🏢 Company Name
- 🌐 Website
- ✉️ Email(s)
- 📞 Phone(s)
- 📍 Location
- 📈 Lead Score
- 📝 Summary of relevant content

👥 Ideal Users
This workflow is perfect for:
- **AI SaaS companies** targeting mid-market & enterprise leads
- **Marketing agencies** looking for B2B-qualified leads
- **Automation consultants** building scraping solutions
- **No-code developers** working with n8n, Make, Pipedream
- **Sales teams** needing enriched prospecting data
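Here is a hedged sketch of the parser-plus-filter stage (steps 7 and 8 above), written for an n8n Code node. Claude's exact response shape and the lead field names are assumptions; only the lead_score >= 6 threshold is taken from the template.

```javascript
// Parse Claude's reply and keep only high-quality leads.
// Assumption: the Anthropic Messages API reply shape (content[0].text),
// with a fallback for other field names.
const item = $input.first().json;
const reply = item.content?.[0]?.text ?? item.text ?? '';

// Claude is prompted to answer with JSON; strip any surrounding prose.
const match = reply.match(/\{[\s\S]*\}/);
const lead = match ? JSON.parse(match[0]) : null;

// Mirror the IF Filter node: drop leads below the score threshold.
if (!lead || Number(lead.lead_score) < 6) {
  return []; // drop this item
}

return [{
  json: {
    company: lead.company_name, // field names are illustrative
    website: lead.website,
    emails: lead.emails,
    lead_score: lead.lead_score,
    summary: lead.summary,
  },
}];
```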
by Nicolas Le Gallo
Who is this template for?
Basically anyone involved in recurring recruiting processes and looking to save a considerable amount of time and energy (talent acquisition managers, recruiting consultants, hiring managers, founders, etc.).

What it does:
It takes a messy, raw transcript from an "intake meeting" between a recruiter and a hiring manager and turns it into a clean and exhaustive brief, plus scorecard templates for each interview round. It does this in under 1 MINUTE, while the usual manual process typically takes several hours.

How to customize this workflow to your needs
- Google Docs is the default choice because it allows easy modification of the output, but you can output this in any format and/or store it wherever you want.
- I strongly suggest choosing one of the latest LLM models for better output quality.
- Both LLM prompts can be revised to better match your expectations.