by Joachim Hummel
This workflow automates the daily backup of all your n8n workflows to a designated folder in Nextcloud. It keeps the last 7 days of backups available and automatically deletes older ones to save space.

**🔧 Features**
- **Scheduled Trigger:** runs automatically once per day (can also be executed manually).
- **Directory Management:** creates the `/N8N-Backup` directory in Nextcloud if it doesn't already exist.
- **Backup Collection:** retrieves all workflows from the n8n instance.
- **JSON Conversion:** converts each workflow into a JSON file.
- **Upload to Nextcloud:** saves each backup file into the specified backup directory.
- **Retention Control:** keeps only the latest 7 backups and deletes the rest from Nextcloud.

**📌 Notes**
- Make sure to manually create the `/N8N-Backup` directory in your Nextcloud account before using this flow.
- Update the Backup Path node if you wish to change the upload directory.
- Ideal for teams running self-hosted n8n instances that need offsite backups via Nextcloud.

**🔒 Requirements**
- An n8n instance with access to the Nextcloud node.
- Valid credentials for your Nextcloud account with API access.

**Update (2025-08-11): "Backup Flows to Nextcloud" import format fixed**

Summary: the workflow now exports one clean JSON object per workflow (no arrays, no backup/meta fields), so files can be imported 1:1 via the n8n UI.

What changed:
- Switched from "Convert to File" to a Set node that builds the JSON in binary data (see the sketch below).
- Enabled `filters.include = "all"` on Get many workflows to include nodes, connections, settings, pinData, and tags.
- Sanitized filenames and removed IDs/metadata that can break UI imports.
- Fixed the Nextcloud path and the binary property mapping (`data`).

Verification: generated multiple backups and imported each via the UI ("Import from file") without errors. Each file begins with `{` (a single object) and loads with the full workflow structure.

Notes: keep "Binary Property" set to `data` in the Nextcloud node. Filenames are sanitized to avoid special-character issues.
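To illustrate the approach, here is a minimal sketch of the same step written as a Code node instead of a Set node. The field whitelist is an assumption about what the n8n UI importer accepts, so adjust it to your instance:

```javascript
// Minimal Code-node sketch (assumed equivalent of the Set node described above).
const results = [];
for (const item of $input.all()) {
  const wf = item.json;
  // Keep only the fields the n8n UI importer expects; drop backup metadata and IDs.
  const clean = {
    name: wf.name,
    nodes: wf.nodes,
    connections: wf.connections,
    settings: wf.settings ?? {},
    pinData: wf.pinData ?? {},
  };
  // Sanitize the filename so special characters can't break the upload or re-import.
  const fileName = `${String(wf.name).replace(/[^a-zA-Z0-9-_]+/g, '_')}.json`;
  results.push({
    json: { fileName },
    binary: {
      // "data" must match the Binary Property configured in the Nextcloud node.
      data: await this.helpers.prepareBinaryData(
        Buffer.from(JSON.stringify(clean, null, 2)),
        fileName,
        'application/json'
      ),
    },
  });
}
return results;
```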
by Lucas Walter
**AI News Scraping System**

This n8n workflow automates the process of pulling in breaking AI-related headlines from curated RSS feeds, scraping their full content, and saving readable Markdown versions directly to Google Drive.

Use cases include:
- Creating a personal newsletter curation system
- Automating blog post research workflows
- Archiving news content for later summarization or AI use

**How it Works**
- **Scheduled Triggers:** the workflow runs every 3–4 hours using multiple Schedule Trigger nodes. Each trigger targets a different news source (e.g., Google News, OpenAI Blog, Hugging Face).
- **Fetch and Parse Feeds:** RSS feeds are fetched via the HTTP Request node. Items from the feed are split into individual entries using the Split Out node.
- **Scrape Article Content:** each article URL is sent to the Firecrawl API with a prompt to extract only the main content in Markdown. The scraping skips navigation, headers, footers, and ads (see the request sketch below).
- **Convert and Save:** the extracted Markdown is converted into a `.md` file using the Convert to File node. The file is then uploaded to a Google Drive folder.

**Good to Know**
- This workflow uses the Firecrawl API for web scraping. Be sure to configure a Generic HTTP Header credential with your API key.
- Output files are saved in Markdown format.
- You can add more Schedule Trigger + HTTP Request pairs to extend this workflow to additional feeds.

**Requirements**
- Firecrawl API account for scraping
- Google Drive account (OAuth2 credentials must be configured in n8n)
- n8n instance (self-hosted or cloud)

**Customization Ideas**
- Replace or extend the RSS feeds with sources relevant to your niche
- Load scraped news stories into a prompt to create new content like TikToks and Reels
- Add a summarization step using an LLM like GPT or Claude
- Send the Markdown files to Notion, Slack, or a blog CMS

**Example Feeds**

| Feed Name | URL |
|------------------|----------------------------------------------------------------------|
| Google News (AI) | https://rss.app/feeds/v1.1/AkOariu1C7YyUUMv.json |
| OpenAI Blog | https://rss.app/feeds/v1.1/xNVg2hbY14Z7Gpva.json |
| Hugging Face | https://rss.app/feeds/v1.1/sgHcE2ehHQMTWhrL.json |
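For reference, the Firecrawl request behind the scraping step looks roughly like the sketch below. The endpoint and field names follow Firecrawl's public scrape API, but treat them as assumptions and verify against the current docs:

```javascript
// Sketch of the Firecrawl scrape call made by the HTTP Request node.
const articleUrl = 'https://example.com/ai-news-article'; // one feed item's link

const res = await fetch('https://api.firecrawl.dev/v1/scrape', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.FIRECRAWL_API_KEY}`, // the Generic Header credential
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    url: articleUrl,
    formats: ['markdown'],  // ask for Markdown output only
    onlyMainContent: true,  // skip navigation, headers, footers, and ads
  }),
});

const { data } = await res.json();
// data.markdown holds the article body passed to the Convert to File node.
```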
by Eric
This is a specific use case. The ElevenLabs guide for Cal.com bookings is comprehensive, but I was having trouble with the booking API request. So I built a simple workflow to validate the request and handle the booking creation.

**Who's this for?**

You have an ElevenLabs voice agent (or other external service) booking meetings in your Cal.com account, and you want more control over the book_meeting tool called by the voice agent.

**How's it work?**
- A request is received by the webhook trigger node, sent from the ElevenLabs voice agent or another source.
- The request body contains contact info for the user with whom a meeting will be booked in Cal.com.
- The workflow validates the input data against the required fields in Cal.com.
- If validation fails, a 400 Bad Request response is returned.
- If valid, the meeting is booked via the Cal.com API (see the sketch below).

**How do I use this?**

Create a custom tool in the ElevenLabs agent setup and connect it to the webhook trigger in this workflow. Add authorization for security. Instruct your voice agent to call this tool after it has collected the required information from the user.

**Expected input structure**

Note: modify this according to your needs, but be sure to reflect your changes in all following nodes. Requirements here depend on the required fields in your Cal.com event type. If you have multiple event types in Cal.com with varying required fields, you'll need to handle that in this workflow and provide appropriate instructions in your *voice agent prompt*.

```json
{
  "body": {
    "attendee_name": "Some Guy",
    "start": "2025-07-07T13:30:00Z",
    "attendee_phone": "+12125551234",
    "attendee_timezone": "America/New_York",
    "eventTypeId": 123456,
    "attendee_email": "someguy@example.com",
    "attendee_company": "Example Inc",
    "notes": "Discovery call to find synergies."
  }
}
```

**Modifications**

Note: ElevenLabs doesn't handle webhook response headers or body, and only recognizes the response code. In other words, if the workflow responds with 400 Bad Request, that's the only info the voice agent gets back; it doesn't receive any details, e.g. "User email still needed."

- You can modify the structure of the expected webhook request body, but you should then reflect that change in all following nodes in the workflow. I.e., if you change attendee_name to attendeeFirstName and attendeeLastName, you need to make this change in the following nodes that use these properties.
- You can also require, or make optional, other user data for the Cal.com event type, which would reduce or increase the data the voice agent must collect from the user.
- You can modify the authorization of this webhook to meet your security needs. ElevenLabs has some limitations you should be mindful of, but it also offers a secret feature which proves useful.
- An improvement to this workflow could include a GET request to a CRM or other database to fetch info on the user interacting with the voice agent. This could reduce some of the data collection needed from the voice agent, e.g. if you already have the user's email address. I believe you can also get the user's phone number if the voice agent is set up on a dial-in interface, so the agent wouldn't need to ask for it. This all depends on your use case. A savvy step might be prompting the voice agent to get an email, and using the email in this workflow to pull enrichment data from Apollo.io or similar ;-)
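For illustration, the final booking step maps the webhook body onto a Cal.com bookings request. This sketch assumes Cal.com's v2 bookings API shape; verify the exact fields against your Cal.com version and event type:

```javascript
// Sketch of the booking request built from the validated webhook body.
// The nested "attendee" shape follows Cal.com's v2 bookings API; treat it
// as an assumption and compare with your event type's required fields.
const body = $json.body; // validated payload from the webhook trigger

const bookingRequest = {
  start: body.start,               // ISO 8601, e.g. "2025-07-07T13:30:00Z"
  eventTypeId: body.eventTypeId,
  attendee: {
    name: body.attendee_name,
    email: body.attendee_email,
    timeZone: body.attendee_timezone,
    phoneNumber: body.attendee_phone,
  },
  metadata: {
    company: body.attendee_company, // optional extras travel in metadata
    notes: body.notes,
  },
};
// POST bookingRequest to https://api.cal.com/v2/bookings
// with your Cal.com API key in the Authorization header.
```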
by lin@davoy.tech
Are you looking to create a counseling chatbot that provides emotional support and mental health guidance through the LINE messaging platform? This guide will walk you through connecting LINE with powerful AI language models like GPT-4 to build a chatbot that supports users in navigating their emotions, offering 24/7 conversational therapy and accessible mental health resources.

By leveraging LINE's webhook integration and Azure OpenAI, this template allows you to design a chatbot that is both empathetic and efficient, ensuring users receive timely and professional responses. Whether you're a developer, counselor, or business owner, this guide will help you create a customizable counseling chatbot tailored to your audience's needs.

**Who Is This Template For?**
- **Developers** who want to integrate AI-powered chatbots into the LINE platform for mental health applications.
- **Counselors & Therapists** looking to expand their reach and provide automated emotional support to clients outside of traditional sessions.
- **Businesses & Organizations** focused on improving mental health accessibility and offering innovative solutions to their users.
- **Educators & Nonprofits** seeking tools to provide free or low-cost counseling services to underserved communities.

**How does it work?**
1. A LINE webhook receives the new message.
2. A loading animation is sent in LINE.
3. The workflow checks whether the input is text.
4. The text is sent as a prompt to the chat model (GPT-4o).
5. The reply is sent back to the user (you'll need an Edit Fields node to format it before replying; see the reply sketch below).

**Pre-Requisites**
- Access to the LINE Developers Console.
- An Azure OpenAI account with the necessary credentials.

**Set-up**

To receive messages from LINE, configure your webhook:
- Set up a webhook in the LINE Developers Console.
- Copy the Webhook URL from the Line Chatbot node and paste it into the LINE Console. Be sure to remove any 'test' part of the URL when moving to production.
- The loading animation reassures users that the system is processing their request. Authorize it using header authorization.

**Message Handling**

Use the Check Message Type IsText? node to verify whether the incoming message is text. If it is, proceed with ChatGPT processing; otherwise, send a reply indicating that non-text inputs are not supported.

**AI Agent Configuration**

Define the system message within the AI Agent node to guide the conversation based on the desired interaction principles. Connect the Azure OpenAI Chat Model to the AI Agent.

**Formatting Responses**

Ensure responses are properly formatted before sending them back to the user.

**Reply Message**

Use the ReplyMessage - Line node to send the formatted response. Ensure proper header authorization using Bearer tokens.
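For reference, the reply step boils down to one call to LINE's documented reply endpoint. A minimal sketch, where the token and reply text are placeholders, and the replyToken path assumes the n8n Webhook node's default output:

```javascript
// Minimal sketch of the call made by the "ReplyMessage - Line" node.
const CHANNEL_ACCESS_TOKEN = 'YOUR_LINE_CHANNEL_ACCESS_TOKEN'; // placeholder

await fetch('https://api.line.me/v2/bot/message/reply', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${CHANNEL_ACCESS_TOKEN}`, // header authorization
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    replyToken: $json.body.events[0].replyToken, // from the incoming webhook event
    messages: [
      { type: 'text', text: 'Formatted counseling response goes here.' },
    ],
  }),
});
```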
by Samir Saci
**Tags**: Sustainability, Business Travel, Carbon Emissions, Flight Tracking, Carbon Interface API

**Context**

Hi! I'm Samir, a Supply Chain Engineer and Data Scientist based in Paris, and founder of LogiGreen Consulting. I help companies monitor and reduce their environmental footprint by combining AI automation, carbon estimation APIs, and workflow automation. This workflow is part of our sustainability reporting initiative, allowing businesses to track the CO₂ emissions of employee flights.

> Automate carbon tracking for your business travel with AI-powered workflows in n8n!

📬 For business inquiries, feel free to connect with me on LinkedIn.

**Who is this template for?**

This workflow is designed for travel managers, sustainability teams, or finance teams who need to measure and report on emissions from business travel.

Let's imagine your company receives a flight confirmation email: the AI Agent reads the email and extracts structured data, such as flight dates, airport codes, and number of passengers. Then the Carbon Interface API is called to estimate CO₂ emissions, which are stored in a Google Sheet for sustainability reporting.

**How does it work?**

This workflow automates the end-to-end process of tracking flight emissions, from email to CO₂ estimation:
- 📨 A Gmail Trigger captures booking confirmations
- 🧠 An AI Agent extracts structured data (airports, dates, flight numbers)
- ✈️ Each flight leg is processed individually
- 🌍 The Carbon Interface API returns distance and carbon emissions (see the request sketch below)
- 📄 A second Google Sheets node appends the emission data for reporting

Steps:
1. 💌 Trigger on a new flight confirmation email
2. 🧠 Extract structured trip data using the AI Agent (flights, airports, dates)
3. 📑 Store flight metadata in Google Sheets
4. 🧭 For each leg, call the Carbon Interface API
5. 📥 Append distance, CO₂ in kg, and timestamp to the flight row

**What do I need to get started?**

You'll need:
- A Gmail account receiving SAP Concur or travel confirmation emails
- A Google Sheet to record trip metadata and CO₂ emissions
- A free Carbon Interface API key
- Access to OpenAI for parsing the email via the AI Agent
- A few sample flight confirmation emails to test

**Next Steps**

🗒️ Use the sticky notes in the n8n canvas to:
- Add your Gmail and Carbon Interface credentials
- Send a sample booking email to your inbox
- Verify that emissions and distances are correctly added to your sheet

This template was built using n8n v1.93.0.
Submitted: June 7, 2025
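For reference, a per-leg estimate request to Carbon Interface looks roughly like this sketch, based on their public v1 estimates API (the airport codes are just examples):

```javascript
// Sketch of the per-leg estimate call to Carbon Interface.
const res = await fetch('https://www.carboninterface.com/api/v1/estimates', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.CARBON_INTERFACE_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    type: 'flight',
    passengers: 1,
    legs: [
      { departure_airport: 'CDG', destination_airport: 'JFK' }, // IATA codes from the AI Agent
    ],
  }),
});

const { data } = await res.json();
// data.attributes.carbon_kg and data.attributes.distance_value
// feed the Google Sheet row for reporting.
```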
by Baptiste Fort
Still reminding people about their tasks manually every morning? Let's be honest: who wants to start the day chasing teammates about what they need to do? What if Slack could do it for you, automatically, at 9 a.m. every day, without missing anything and without you lifting a finger?

In this tutorial, you'll build a simple automation with n8n that checks Airtable for active tasks and sends reminders in Slack, daily. Here's the flow you'll build:

Schedule Trigger → Search Records (Airtable) → Send Message (Slack)

**STEP 1: Set up your Airtable base**

1. Create a new base called Tasks
2. Add a table (for example: Projects, To-Do, or anything relevant)
3. Add the following fields:

| Field | Type | Example |
| -------- | ----------------- | ------------------------------------------- |
| Title | Text | Finalize quote for Client A |
| Assignee | Text | Baptiste Fort |
| Email | Email | claire@email.com |
| Status | Single select | In Progress / Done |
| Due Date | Date (dd/mm/yyyy) | 05/07/2025 |

4. Add a few sample tasks with the status In Progress so you can test your workflow later.

**STEP 2: Create the trigger in n8n**

In n8n, add a Schedule Trigger node and set it to run every day at 9:00 a.m.:
- Trigger Interval: Days
- Days Between Triggers: 1
- Trigger at Hour: 9
- Trigger at Minute: 0

This is the node that kicks off the workflow every morning.

**STEP 3: Search for active tasks in Airtable**

This step is all about connecting n8n to your Airtable base and pulling the tasks that are still marked as "In Progress".

1. Add the Airtable node

In your n8n workflow, add a node called Airtable → Search Records. You can find it by typing "airtable" in the node search.

2. Create your Airtable Personal Access Token

If you haven't already created your Airtable token, here's how:

🔗 Go to: https://airtable.com/create/tokens

Then:
- Name your token something like TASKS
- Under Scopes, check: ✅ data.records:read
- Under Access, select only the base you want to use (e.g. "Tasks")
- Click "Save token" and copy the personal token

3. Set up the Airtable credentials in n8n

In the Airtable node:
- Click on the Credentials field
- Select: Airtable Personal Access Token
- Click Create New
- Paste your token
- Give it a name like: My Airtable Token
- Click Save

4. Configure the node

Now fill in the parameters:
- Base: Tasks
- Table: Projects or To-Do (whatever you called it)
- Operation: Search
- Filter By Formula: {Status} = "In Progress"
- Return All: ✅ Yes (make sure it's enabled)
- Output Format: Simple

5. Test the node

Click "Execute Node". You should now see all tasks with Status = "In Progress" show up in the output, on the right-hand side of your screen.

**STEP 4: Send each task to Slack**

Now that we've fetched all the active tasks from Airtable, let's send them to Slack, one by one, using a loop.

1. Add the Slack node

Drag a new node into your n8n workflow and select Slack → Message. Name it something like Send Slack Message. You can find it quickly by typing "Slack" into the node search bar.

2. Connect your Slack account

If you haven't already connected your Slack credentials:
- Go to n8n → Credentials
- Select Slack API
- Click Create new
- Paste your Slack Bot Token (from your Slack App OAuth settings)
- Give it a clear name like Slack Bot n8n
- Choose the workspace and save

Then, in the Slack node, choose this credential from the dropdown.

3. Configure the message

Set these parameters:
- Operation: Send
- Send Message To: Channel
- Channel: your Slack channel (e.g. #tous-n8n)
- Message Type: Simple Text Message

4. Message template

Paste the following inside the Message Text field:

New task for {{ $json["Assignee"] }}: {{ $json["Title"] }}
👉 Deadline: {{ $json["Due Date"] }}

Example output:

New task for Jeremy: Follow up with supplier X
👉 Deadline: 2025-07-04

5. Test it

Click Execute Node to verify the message is correctly sent in Slack. If a field comes back empty, compare your expressions against the item sketch below. If the formatting works, you're ready to run it on schedule 🚀
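With Output Format set to Simple, one item from the Airtable Search node should look roughly like this sketch (field names assume the table from STEP 1):

```javascript
// Approximate shape of one simplified Airtable search result item.
const exampleItem = {
  id: 'recXXXXXXXXXXXXXX', // Airtable record ID
  Title: 'Finalize quote for Client A',
  Assignee: 'Baptiste Fort',
  Email: 'claire@email.com',
  Status: 'In Progress',
  'Due Date': '2025-07-05',
};
// The Slack template reads these properties directly:
// {{ $json["Title"] }}, {{ $json["Assignee"] }}, {{ $json["Due Date"] }}
```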
by David w/ SimpleGrow
This n8n workflow tracks user engagement in a specific WhatsApp group by capturing incoming messages via a Whapi webhook. It first filters messages to ensure they come from the correct group, then identifies the message type: text, emoji reaction, voice, or image.

The workflow searches for the user in an Airtable database using their WhatsApp ID and increments their message count by one. It then updates the Airtable record with the new count and the date of the last interaction.

This automated process helps measure user activity and supports engagement initiatives like weekly raffles or rewards. The system is flexible and can be expanded to include more message types or additional actions. Overall, it provides a seamless way to encourage and track user participation in your WhatsApp community.
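A minimal sketch of the increment step, assuming a Code node between the Airtable search and the update; the column names "Message Count" and "Last Interaction" are assumptions to match to your own base:

```javascript
// Sketch: take the record found by the WhatsApp ID lookup,
// bump its message count, and stamp the last interaction date.
const record = $input.first().json;

return [{
  json: {
    id: record.id, // Airtable record ID, needed by the update operation
    'Message Count': (record['Message Count'] ?? 0) + 1,
    'Last Interaction': new Date().toISOString().slice(0, 10), // e.g. "2025-07-04"
  },
}];
```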
by Agent Circle
This n8n template demonstrates how to crawl comments from YouTube videos and collect all the results in a linked Google Sheet.

Use cases are many: whether you're a YouTube creator trying to understand your audience, a marketer running sample analysis, a data analyst compiling engagement metrics, or part of a growth team tracking YouTube or social media campaign performance, this workflow helps you extract real, actionable insights from YouTube video comments at scale.

**How It Works**
1. The workflow starts when you manually click Test Workflow or Execute Workflow in n8n.
2. It reads the list of YouTube video URLs from the Video URLs tab in the connected YouTube – Get Video Comments Google Sheet. Only the URLs marked with the Ready status will be processed.
3. The tool loops through each video and sends an HTTP request to the YouTube API to fetch comment data (see the request sketch at the end of this section). It then checks whether the request was successful before continuing.
4. If comments are found, they are split and processed. Each comment is then inserted into the Results tab of the connected YouTube – Get Video Comments Google Sheet.
5. Once a URL has been finished, its status in the Video URLs tab of the YouTube – Get Video Comments Google Sheet is updated to Finished.

**How To Use**
1. Download the workflow package.
2. Import the workflow package into your n8n interface.
3. Duplicate the "YouTube - Get Video Comments" Google Sheet template into your Google Sheets account.
4. Set up Google Cloud Console credentials in the following nodes in n8n, ensuring enabled access and suitable rights to the Google Sheets and YouTube services:
   - For Google Sheets access, ensure each node is properly connected to the correct tab in your connected Google Sheet template:
     - Node Google Sheets - Get Video URLs → connected to the Video URLs tab
     - Node Google Sheets - Insert/Update Comment → connected to the Results tab
     - Node Google Sheets - Update Status → connected to the Video URLs tab
   - For YouTube access: set up a GET method in Node HTTP Request - Get Comments.
5. Open the template in your Google Sheets account. In the Video URLs tab, fill in the video URLs you want to crawl in Column B and update the status for each row in Column A to Ready.
6. Return to the n8n interface and click Execute Workflow.
7. Check the results in the Results tab of the template; the collected comments will appear there.

**Requirements**
- Basic setup in Google Cloud Console (OAuth or API Key method enabled) with access to the YouTube and Google Sheets services.

**How To Customize**

By default, the workflow is manually triggered in n8n. However, you can automate the process by adding a Google Sheets trigger that monitors new entries in your connected YouTube – Get Video Comments template and starts the workflow automatically.

**Need Help?**

Join our community on different platforms for support, inspiration and tips from others.
- Website: https://www.agentcircle.ai/
- Etsy: https://www.etsy.com/shop/AgentCircle
- Gumroad: http://agentcircle.gumroad.com/
- Discord Global: https://discord.gg/d8SkCzKwnP
- FB Page Global: https://www.facebook.com/agentcircle/
- FB Group Global: https://www.facebook.com/groups/aiagentcircle/
- X: https://x.com/agent_circle
- YouTube: https://www.youtube.com/@agentcircle
- LinkedIn: https://www.linkedin.com/company/agentcircle
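As referenced in How It Works, the request behind Node HTTP Request - Get Comments is a standard YouTube Data API v3 call. A rough sketch, with example parameter values:

```javascript
// Sketch of the commentThreads request for one video.
const url = new URL('https://www.googleapis.com/youtube/v3/commentThreads');
url.searchParams.set('part', 'snippet');          // top-level comment data
url.searchParams.set('videoId', 'dQw4w9WgXcQ');   // parsed from the sheet's URL column
url.searchParams.set('maxResults', '100');        // API maximum per page
url.searchParams.set('key', 'YOUR_API_KEY');      // or authenticate via OAuth instead

const res = await fetch(url);
const { items, nextPageToken } = await res.json();
// Each items[n].snippet.topLevelComment.snippet holds the author, text,
// like count, and publish date written to the Results tab.
// Pass nextPageToken back as "pageToken" to fetch the next page of comments.
```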
by Jaruphat J.
**Who's it for**

This template is perfect for content creators, AI enthusiasts, marketers, and developers who want to automate the generation of cinematic videos using Google Vertex AI's Veo 3 model. It's also ideal for anyone experimenting with generative AI video in n8n.

**What it does**

This workflow:
1. Accepts a text prompt and a GCP access token via form.
2. Sends the prompt to the Veo 3 (preview) model using Vertex AI's predictLongRunning endpoint (see the request sketch below).
3. Waits for the video rendering to complete.
4. Fetches the final result and converts the base64-encoded video to a file.
5. Uploads the resulting .mp4 to your Google Drive.

**How to set up**
1. Enable the Vertex AI API in your GCP project: https://console.cloud.google.com/marketplace/product/google/aiplatform.googleapis.com
2. Authenticate with GCP using Cloud Shell or a local terminal:

```
gcloud auth login
gcloud config set project [YOUR_PROJECT_ID]
gcloud auth application-default set-quota-project [YOUR_PROJECT_ID]
gcloud auth print-access-token
```

3. Copy the token and use it in the form when running the workflow. ⚠️ This token lasts ~1 hour; regenerate it as needed.
4. Connect your Google Drive OAuth2 credentials to allow file upload.
5. Import this workflow into n8n and execute it via the form trigger.

**Requirements**
- n8n (v1.94.1+)
- A Google Cloud project with:
  - Vertex AI API enabled
  - Billing enabled
  - A way to get an access token
- A Google Drive OAuth2 credential connected to n8n

**How to customize the workflow**
- You can modify the request body in the HTTP node to match your use case.
- Replace the Google Drive upload node with alternatives like Dropbox, S3, or YouTube upload.
- Extend the workflow to add subtitles, audio dubbing, or LINE/Slack alerts.

Step-by-step for each major node: Prompt Input → Vertex Predict → Wait → Fetch Result → Convert to File → Upload

**Best Practices Followed**
- No hardcoded API tokens
- Secure: the GCP token is input via form, not stored in the workflow
- All nodes are renamed with a clear purpose
- All editable config is grouped in a Set node

**External References**
- GCP Veo API Docs: https://cloud.google.com/vertex-ai/docs/generative-ai/video/overview

**Disclaimer**

This workflow uses official Google Cloud APIs and requires a valid GCP project. The access token should be generated securely using the gcloud CLI. Do not embed tokens in the workflow itself.

**Notes on GCP Access Token**

To use the Vertex AI API in n8n securely, run the following on your local machine or in GCP Cloud Shell:

```
gcloud auth login
gcloud config set project your-project-id
gcloud auth print-access-token
```

Paste the token in the workflow form field when submitting. Do not hardcode the token into HTTP nodes or Set nodes; input it each time or use a secure credential vault.
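As referenced in What it does, the Veo request follows Vertex AI's predictLongRunning pattern. The sketch below is assumption-heavy: the model ID and parameter names in particular should be checked against the current Veo documentation:

```javascript
// Assumption-heavy sketch of the predictLongRunning call made by the HTTP node.
const accessToken = 'ya29.EXAMPLE';       // pasted from `gcloud auth print-access-token`
const project = 'your-project-id';
const location = 'us-central1';
const model = 'veo-3.0-generate-preview'; // hypothetical model ID

const url = `https://${location}-aiplatform.googleapis.com/v1/projects/${project}` +
  `/locations/${location}/publishers/google/models/${model}:predictLongRunning`;

const res = await fetch(url, {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${accessToken}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    instances: [{ prompt: 'A cinematic drone shot over a misty forest at dawn' }],
    parameters: { durationSeconds: 8, aspectRatio: '16:9' }, // assumed parameter names
  }),
});

const { name } = await res.json();
// "name" is the long-running operation ID; the Wait + Fetch Result nodes
// poll it until the base64-encoded video is available.
```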
by Davide
**How it Works**

This workflow automates the process of handling job applications by extracting relevant information from submitted CVs, analyzing the candidate's qualifications against a predefined profile, and storing the results in a Google Sheet. Here's how it operates:

Data Collection and Extraction:
- The workflow begins with a form submission (On form submission node), which triggers the extraction of data from the uploaded CV file using the Extract from File node.
- Two Information Extractor nodes (Qualifications and Personal Data) parse specific details such as educational background, work history, skills, city, birthdate, and telephone number from the text content of the CV (see the sketch below).

Processing and Evaluation:
- A Merge node combines the extracted personal and qualification data into a single output.
- The merged data is then passed through a Summarization Chain that generates a concise summary of the candidate's profile.
- An HR Expert chain evaluates the candidate against a desired profile (Profile Wanted), assigning a score and providing considerations for hiring.
- Finally, all collected and processed data, including the evaluation results, are appended to a Google Sheets document via the Google Sheets node for further review or reporting.

**Set Up Steps**

To replicate this workflow within your own n8n environment, follow these steps:

Configuration:
- Begin by setting up an n8n instance if you haven't already; you can sign up directly on their website or self-host the application.
- Import the provided JSON configuration into your n8n workspace.
- Ensure that all necessary credentials (e.g., Google Drive, Google Sheets, OpenAI API keys) are correctly configured under the Credentials section, since some nodes require external service integrations like the Google APIs and OpenAI for language processing tasks.

Customization:
- Adjust the parameters of each node according to your specific requirements. For example, modify the fields in the form trigger node to match the kind of information you wish to collect from applicants.
- Customize the prompts given to the AI models in nodes like Qualifications, Summarization Chain, and HR Expert so they align with the type of analysis you want performed on the candidates' profiles.
- Update the destination settings in the Google Sheets node to point to your own spreadsheet where you would like the final outputs recorded.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
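As referenced above, here is an illustrative sketch of what the attribute lists for the two Information Extractor nodes might look like; the names and descriptions are assumptions to adapt to your own form and CVs:

```javascript
// Hypothetical attribute definitions for the two extractor nodes.
const personalDataAttributes = [
  { name: 'city', description: 'City of residence' },
  { name: 'birthdate', description: 'Date of birth' },
  { name: 'telephone', description: 'Phone number' },
];

const qualificationsAttributes = [
  { name: 'education', description: 'Educational background: degrees, schools, years' },
  { name: 'work_history', description: 'Previous roles, employers, and durations' },
  { name: 'skills', description: 'Technical and soft skills mentioned in the CV' },
];
// The Merge node then combines both outputs into one candidate object
// before summarization and HR evaluation.
```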
by Baptiste Fort
Still manually copy-pasting your Tally form responses? What if every submission went straight into Airtable, and the user got an automatic email right after? That's exactly what this workflow does. No code, no headache: just a simple and fast automation, Tally → Airtable → Gmail.

**STEP 1 — Capture Tally Form Responses**

Goal: trigger the workflow automatically every time someone submits your Tally form.

What we're setting up: a webhook that catches form responses and kicks off the rest of the flow.

Steps to follow:
1. Add a Webhook node with these parameters:
   - Method: POST
   - Path: formulaire-tally
   - Authentication: None
   - Respond: Immediately
2. Save the workflow. This will generate a URL like: https://your-workspace.n8n.cloud/webhook-test/formulaire-tally
   💡 Use the Test URL first (found under Parameters > Test URL).
3. Head over to Tally: go to your form → Form Settings > Integrations > Webhooks, paste the Test URL into the Webhook field, and enable the webhook. ✅
4. Submit a test entry. Tally won't send anything until a real submission is made, and this step is required for n8n to capture the structure.

Expected output: n8n receives a JSON object containing general info (IDs, timestamps, etc.) and a fields[] array with all the form inputs (name, email, etc.). Each field is nicely structured with a label, key, type, and most importantly, a value (see the payload sketch at the end of this section). A perfect foundation for the next step: data cleanup.

**STEP 2 — Clean and Structure the Form Data (Set node)**

Goal: take the raw data sent by Tally and turn it into clean, readable JSON that's easy to use in the rest of the workflow.

Tally sends the responses inside a big array called fields. Can you grab a field directly with something like {{ $json["fields"][0]["value"] }}? Yes. But a good workflow is like a sock drawer: when everything's folded and labeled, life's just easier. So we're going to clean it up using a Set node.

Steps to follow:
1. Add a Set node right after the Webhook.
2. Enable the "Keep Only Set" option.
3. Define the following fields in the Set node (field name: expression):
   - full_name: {{ $json["fields"][0]["value"] }}
   - company_name: {{ $json["fields"][1]["value"] }}
   - job_title: {{ $json["fields"][2]["value"] }}
   - email: {{ $json["fields"][3]["value"] }}
   - phone_number: {{ $json["fields"][4]["value"] ?? "" }}
   - submission_date: {{ $now.toISOString() }}

⚠️ The order of fields[] depends on your Tally form. If you change the question order, make sure to update these indexes accordingly.

Expected output: a clean, structured JSON with just those six properties. Now your data is clear, labeled, and ready for the rest of your workflow.

**STEP 3 — Save Data in Airtable**

Goal: every time someone submits your Tally form, their info is automatically added to an Airtable base. No more copy-pasting: everything lands right where it should.

Steps to follow:
1. Create your Airtable base. Start by creating a base named Leads (or whatever you prefer), with a table called Form Submissions. Add columns matching the Set node fields above, in the same order, so everything maps correctly later.
2. Generate an Airtable token so n8n can send data into your base:
   - Go to 👉 https://airtable.com/create/tokens
   - Click Create token
   - Give it a name (e.g. Tally Automation)
   - Check the following permissions: data.records:read, data.records:write, schema.bases:read
   - Under Base access, either choose your base manually or select "All current and future bases"
   - Click Create token and copy the generated key
3. Add and configure the Airtable node in n8n:
   - Node: Airtable
   - Operation: Create
   - Authentication: Personal Access Token (paste your token)
   - n8n will suggest your base and table (or you can manually grab the IDs from the URL: https://airtable.com/appXXXXXXXX/tblYYYYYYYY/...)
4. Map your fields. Inside the Airtable node, map each column to the matching property from the Set node (full_name, company_name, job_title, email, phone_number, submission_date).

Every new Tally form submission now automatically creates a new row in your Airtable base.

**STEP 4 — Send an Automatic Confirmation Email**

Goal: send a professional email as soon as a form is completed.

Steps to follow:
1. Add a Wait node. You don't want the email to go out instantly; it feels cold and robotic. So add a Wait node right after Airtable:
   - Mode: Wait for a period of time
   - Delay: 5 to 10 minutes
   - Unit: Minutes
2. Add a Gmail > Send Email node:
   - Authentication: OAuth2
   - Connect a Gmail account (business or test)
   - ⚠️ No API keys here: Gmail requires OAuth.
3. Configure the Send Email node:
   - Credential to connect with: Gmail account via OAuth2
   - Resource: Message
   - Operation: Send
   - To: {{ $json.fields["email"] }}
   - Subject: Thanks for reaching out!
   - Email Type: HTML
   - Message: your email body (do the mapping using the node's Input data so each lead receives their name correctly)

**End of the Workflow**

And that's it: your automation is live! Your lead fills out the Tally form → the info goes to Airtable → they get a clean, professional email without you doing a thing.
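As promised in STEP 2, here is roughly what a Tally webhook payload looks like. The exact envelope can vary, so inspect a real test submission; the labels and keys below are hypothetical:

```javascript
// Approximate shape of one Tally submission as received by the Webhook node.
// The Set node expressions index into the fields[] array by position.
const tallyPayload = {
  eventId: 'evt_123',                 // general info: IDs, timestamps, etc.
  createdAt: '2025-07-01T09:00:00Z',
  fields: [
    { label: 'Full name',    key: 'question_1', type: 'INPUT_TEXT',  value: 'Claire Dupont' },   // [0]
    { label: 'Company name', key: 'question_2', type: 'INPUT_TEXT',  value: 'Acme Inc' },        // [1]
    { label: 'Job title',    key: 'question_3', type: 'INPUT_TEXT',  value: 'CMO' },             // [2]
    { label: 'Email',        key: 'question_4', type: 'INPUT_EMAIL', value: 'claire@email.com' },// [3]
    { label: 'Phone number', key: 'question_5', type: 'INPUT_PHONE_NUMBER', value: null },       // [4]
  ],
};
// Example: {{ $json["fields"][3]["value"] }} → "claire@email.com"
```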
by Yaron Been
This workflow provides automated access to the Stability AI Stable Video Diffusion model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for video generation tasks within your n8n automation workflows.

**Overview**

This workflow automatically handles the complete video generation process using the Stability AI Stable Video Diffusion model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model description: an advanced AI model by stability-ai for automated processing tasks.

**Key Capabilities**
- AI-powered video generation and processing
- High-quality video synthesis from inputs
- Advanced video manipulation capabilities

**Tools Used**
- **n8n**: the automation platform that orchestrates the workflow
- **Replicate API**: access to the stability-ai/stable-video-diffusion AI model
- **Stability AI Stable Video Diffusion**: the core AI model for video generation
- **Built-in Error Handling**: automatic retry logic and comprehensive error management

**How to Install**
1. Import the Workflow: download the .json file and import it into your n8n instance
2. Configure Replicate API: add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: adjust the model parameters in the 'Set Video Parameters' node (see the request sketch at the end of this section)
4. Test the Workflow: run the workflow with your desired inputs
5. Integrate: connect this workflow to your existing automation pipelines

**Use Cases**
- **Video Content Creation**: generate videos for social media, marketing, and presentations
- **Animation & Motion Graphics**: create animated content and visual effects
- **Video Editing**: enhance and transform existing video content
- **Educational Content**: produce instructional and explainer videos

**Connect with Me**
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #videogeneration #aivideo #videoai #motion #videoautomation #videocreation #stablediffusion #diffusion #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
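For reference, the underlying call this workflow automates is Replicate's standard predictions endpoint. A rough sketch, where the version hash and input field names are placeholders to copy from the model's page on Replicate:

```javascript
// Sketch of the Replicate predictions call this workflow wraps.
const res = await fetch('https://api.replicate.com/v1/predictions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.REPLICATE_API_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    version: 'MODEL_VERSION_HASH', // stability-ai/stable-video-diffusion version
    input: {
      input_image: 'https://example.com/first-frame.png', // assumed input field name
    },
  }),
});

const prediction = await res.json();
// Poll prediction.urls.get until status is "succeeded";
// the output then contains the generated video URL.
```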