by n8n Team
This workflow generates CSV files containing a list of 10 random users with specific characteristics using OpenAI's GPT-4 model. It then splits the data into batches, converts it to CSV format, and saves it to disk for further use. Execution begins when the workflow is triggered manually.

"OpenAI" node. Uses the OpenAI API to generate random user data. The input to the API is a fixed prompt asking for a list of 10 random users with specific attributes: a name and surname starting with the same letter, a subscription status, and a subscription date (if they are subscribed). The prompt also includes a short example of the expected JSON object structure; this technique is called one-shot prompting.

"Split In Batches" node. Handles the OpenAI responses one by one.

"Parse JSON" node. Converts the content of the message received from the OpenAI node (which is in string format) into a JSON object.

"Make JSON Table" node. Converts the JSON data into a tabular format, which is easier to handle for further data processing.

"Convert to CSV" node. Converts the table produced by the "Make JSON Table" node into CSV format and assigns a file name.

"Save to Disk" node. Saves the CSV generated in the previous node to the ".n8n" directory on disk.

The workflow is designed in a circular manner: after saving the file to disk, it returns to the "Split In Batches" node to process the next OpenAI output, until all batches are processed.
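For orientation, here is a minimal sketch of what the "Parse JSON" step does in an n8n Code node, assuming the standard chat-completion response shape; the actual node in the template may read a different path:

```javascript
// Minimal sketch of the "Parse JSON" step: turn the OpenAI message content
// (a JSON string) into real objects, one n8n item per generated user.
// The response path below assumes the standard chat-completion shape;
// adjust it to match what your OpenAI node actually returns.
const content = $input.first().json.choices[0].message.content;
const users = JSON.parse(content); // e.g. [{ name, surname, subscribed, subscriptionDate }, ...]

return users.map(user => ({ json: user }));
```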
by Kyle Morse
Takes your raw, unpolished voice transcripts and transforms them into well-structured LinkedIn posts using AI. Perfect for when you have good ideas but they come out as rambling thoughts.

The Problem: You record voice memos with great ideas, but when you read the transcript, it's full of "ums," incomplete sentences, and scattered thoughts. Turning that into a professional LinkedIn post takes forever.

The Solution: Email your raw transcript to this workflow. It combines your unpolished content with examples from your inspiration document (posts you've saved that match your desired style), then uses AI to create a clean, engaging LinkedIn post.

What actually happens:
- You email a raw voice transcript to your workflow email
- The workflow pulls style examples from your Google Doc
- AI reformats your scattered thoughts into a coherent 150-300 word LinkedIn post
- You get an email back with the polished content plus a suggested image description
- Copy, paste, and post to LinkedIn

You provide: the raw transcript (from your phone's voice recorder or any transcription tool) and a Google Doc with LinkedIn posts you admire for style reference.

You get: professional LinkedIn content that sounds like you, but organized and polished.

Technical requirements: Anthropic API, email account, Google Doc with example posts.

This is basically an AI writing assistant that knows your voice and preferred style, turning your brain dumps into professional content.
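As a rough illustration of the reformatting step, a Code node could assemble the AI prompt like this; the node names and field names ("Email Trigger", "Google Docs", transcript, content) are assumptions for the sketch, not the template's exact identifiers:

```javascript
// Hypothetical prompt assembly: combine the emailed transcript with the
// style examples pulled from the Google Doc before calling the AI model.
const transcript = $('Email Trigger').first().json.transcript;
const styleExamples = $('Google Docs').first().json.content;

const prompt = [
  'Rewrite the raw voice transcript below as a 150-300 word LinkedIn post.',
  'Match the tone and structure of these example posts:',
  styleExamples,
  '--- RAW TRANSCRIPT ---',
  transcript,
  'Also suggest a short image description for the post.',
].join('\n\n');

return [{ json: { prompt } }];
```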
by David Olusola
How it works:
- You fill out the form with business challenges and requirements
- GPT-4 analyzes the input and generates a customized proposal using your template
- The system automatically creates a Google Slides presentation with personalized content
- A professional proposal email is sent directly to the prospect with the presentation link

Set up steps (estimated time: 15-20 minutes):
1. Connect your OpenAI API key for GPT-4 access
2. Link your Google account for Slides and Gmail integration
3. Create your proposal template in Google Slides with placeholder variables
4. Customize the AI prompt and email template with your branding
5. Test with sample data and activate the workflow
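To make "placeholder variables" concrete, here is a hedged sketch of how the replacement could work via the Slides API batchUpdate call; the placeholder tokens and $json field names are made up for illustration and must match your own template:

```javascript
// Sketch: build replaceAllText requests for the Slides API batchUpdate call.
// Placeholder tokens like {{client_name}} must match the text boxes in your
// own template; the $json field names here are illustrative.
const replacements = {
  '{{client_name}}': $json.clientName,
  '{{challenge}}': $json.challenge,
  '{{proposed_solution}}': $json.aiProposal,
};

const requests = Object.entries(replacements).map(([text, replaceText]) => ({
  replaceAllText: {
    containsText: { text, matchCase: true },
    replaceText,
  },
}));

// POST these to:
// https://slides.googleapis.com/v1/presentations/{presentationId}:batchUpdate
return [{ json: { requests } }];
```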
by Yaron Been
Description
This workflow automatically scans food delivery platforms and restaurant websites to find the best deals and discounts. It helps you save money on meals by aggregating special offers and promotions in one place.

Overview
This workflow automates finding the best food deals and discounts from various websites. It uses Bright Data to scrape deal information and can be configured to send you notifications or save the deals to a spreadsheet.

Tools Used
- **n8n:** The automation platform that orchestrates the workflow.
- **Bright Data:** For scraping food deal websites without getting blocked.
- **(Optional) Google Sheets/Discord/Telegram:** To store or get notified about the deals.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications (Optional): Configure the nodes for Google Sheets, Discord, or Telegram if you want to receive notifications.
4. Customize: Specify the websites you want to scrape for deals.

Use Cases
- **Foodies:** Always be the first to know about the best restaurant deals in your city.
- **Students:** Save money by finding cheap eats and special offers.
- **Families:** Plan your meals around the best grocery and restaurant discounts.

Connect with Me
- **Website:** https://www.nofluff.online
- **YouTube:** https://www.youtube.com/@YaronBeen/videos
- **LinkedIn:** https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data:** https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #fooddeals #brightdata #webscraping #discounts #fooddiscounts #mealdeals #restaurantdeals #savemoney #foodoffers #n8nworkflow #workflow #nocode #foodtech #dealfinder #specialoffers #fooddelivery #budgetmeals #foodsavings #dealhunting #foodautomation #moneysaving #foodhacks #bestdeals #foodscraping
by Miquel Colomer
Do you want to know where a web visitor lives? This workflow enriches any lead by IP address using the uProc.io "Location By IP address" tool and sends an email in Spanish or English, depending on the detected visitor country.

You need to add your uProc credentials (the real email and API key found in the Integration section) to n8n.

The "Create IP and Email Item" node can be replaced by any other supported service that provides IP and Email values, like Mailchimp, Calendly, or MySQL.

The "uProc" node returns the location of the provided IP address.

The "If" node checks whether the visitor's country code belongs to Spain (isocode "ES"). If it does, we use Spanish in our emails; otherwise, we use English.

Depending on the detected country code, the Amazon SES node sends a customized email adapted to the right language.
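The country check is simple enough to sketch; assuming the uProc node exposes an isocode field (verify this against your actual node output), the branching logic amounts to:

```javascript
// Pick the email language from the country isocode returned by uProc.
// The field name "isocode" is assumed from the description above.
const isocode = $json.isocode; // e.g. "ES" for Spain
const language = isocode === 'ES' ? 'es' : 'en';

return [{ json: { ...$json, language } }];
```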
by TheUnknownEntity
I'm currently trialing a 4-day work week for all staff at my company, and one of the major impacts on productivity is interruptions. As such, I opted to use n8n to create a workflow that monitors my Google Calendar and, when an event starts, updates my Slack status with an emote and the title of the calendar task. Additionally, I opted to change the colour of a Philips Hue lamp located in the living room where my wife is currently working, so she knows whether she can interrupt me or not.

My calendar is built on the theory behind the Diary Detox system, and as such the Slack status reflects the colours involved. This was achieved using the emote aliases for the relevant colour circles.

The Philips Hue lamp status is changed via the local API with Home Assistant. This is a very similar process to controlling it with something like the Streamdeck, but the workflow calls the webhook instead of the Streamdeck. This process can be found in lots of YouTube videos such as this. It gives my wife a very quick and easy way to know whether she can interrupt me in my office (when the lights are green or blue) or when I'm busy (red).

Please note: the above images are not intended to be an incentive to create your own Squid Games.

Additionally, when integrating Slack with n8n, there are two APIs which can be used. Typically the Bot User OAuth Token is used; however, in order for your status to be updated, the User OAuth Token must be used, with the users.profile:read and users.profile:write permissions enabled.

For clarity, I have removed the webhooks from the workflow, as they would allow any person to control my lights. These can be inserted in the HTTP Request nodes. Each node corresponds to a different automation within the Home Assistant infrastructure.

Acknowledgement: I would also like to credit Jon (Discord) aka 8668 (Workflows) for writing the Function node which turns the ColorID into a named variable.
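For reference, a Function node along the lines of the one credited above might look like this; the mapping below is Google Calendar's standard event colour palette, which is my assumption rather than Jon's exact code, so trim or rename it to match your Diary Detox scheme:

```javascript
// Sketch: map the calendar event's colorId to a named colour so later
// nodes can pick the matching Slack emote and Hue webhook.
const names = {
  '1': 'Lavender', '2': 'Sage', '3': 'Grape', '4': 'Flamingo',
  '5': 'Banana', '6': 'Tangerine', '7': 'Peacock', '8': 'Graphite',
  '9': 'Blueberry', '10': 'Basil', '11': 'Tomato',
};

return items.map(item => {
  item.json.colorName = names[item.json.colorId] ?? 'Unknown';
  return item;
});
```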
by Yaron Been
Replace manual task prioritization with intelligent AI reasoning that thinks like a Chief Operating Officer. This workflow automatically fetches your Asana tasks every morning, analyzes them using advanced AI models, and delivers the single most critical task with detailed reasoning - ensuring your team always focuses on what matters most.

✨ What This Workflow Does:
- 📋 **Automated Task Collection**: Fetches all assigned Asana tasks daily at 9 AM
- 🤖 **AI-Powered Analysis**: Uses OpenAI GPT-4 to evaluate urgency, impact, and strategic importance
- 🎯 **Smart Prioritization**: Identifies the #1 most critical task with detailed reasoning
- 🧠 **Contextual Memory**: Leverages a vector database for historical context and pattern recognition
- 💾 **Structured Storage**: Saves prioritized tasks to PostgreSQL with a full audit trail
- 🔄 **Continuous Learning**: Builds organizational knowledge over time for better decisions

🔧 Key Features:
- **Daily automation** with zero manual intervention
- **Context-aware AI** that learns from past prioritization decisions
- **Strategic reasoning** explaining why each task is prioritized
- **Vector-powered memory** using Pinecone for intelligent context retrieval
- **Clean structured output** with task names, priority levels, and detailed justifications
- **Database integration** for reporting and historical analysis

📋 Prerequisites:
- Asana account with API access
- OpenAI API key (GPT-4 recommended)
- PostgreSQL database
- Pinecone account (for vector storage and context)

🎯 Perfect For:
- Operations teams managing multiple competing priorities
- Startups needing systematic task management
- Project managers juggling complex workflows
- Leadership teams requiring strategic focus
- Any organization wanting AI-driven operational intelligence

💡 How It Works:
1. Morning Automation: Triggers every day at 9 AM
2. Data Collection: Pulls all relevant tasks from Asana
3. AI Analysis: Evaluates each task using COO-level strategic thinking
4. Context Retrieval: Searches the vector database for similar past tasks
5. Smart Prioritization: Identifies the single most important task
6. Structured Output: Delivers the priority level with detailed reasoning
7. Data Storage: Saves results for reporting and continuous improvement

📦 What You Get:
- Complete n8n workflow with all AI components configured
- PostgreSQL database schema for task storage
- Vector database setup for contextual intelligence
- Comprehensive documentation and setup guide
- Sample task data and output examples

💡 Need Help or Want to Learn More?
Created by Yaron Been - Automation & AI Specialist
- 📧 Support: Yaron@nofluff.online
- 🎥 YouTube Tutorials: https://www.youtube.com/@YaronBeen/videos
- 💼 LinkedIn: https://www.linkedin.com/in/yaronbeen/

Discover more advanced automation workflows and AI integration tutorials on my channels!

🏷️ Tags: AI, OpenAI, Asana, Task Management, COO, Prioritization, Automation, Vector Database, Operations, GPT-4
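As a sketch of the "clean structured output" idea, a Code node between the AI analysis and the PostgreSQL insert might validate the model's JSON like this; the field names are illustrative, not the template's exact schema:

```javascript
// Hypothetical guard: parse the model's structured output and fail loudly
// if it is incomplete, before anything reaches the database.
const output = JSON.parse($json.text); // model response as a JSON string

if (!output.task_name || !output.priority || !output.reasoning) {
  throw new Error('Model returned an incomplete prioritization object');
}

return [{
  json: {
    task_name: output.task_name, // e.g. "Close Q3 vendor contract"
    priority: output.priority,   // e.g. "critical"
    reasoning: output.reasoning, // the COO-style justification
    prioritized_at: new Date().toISOString(),
  },
}];
```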
by Friedemann Schuetz
Welcome to my Automated Image Metadata Tagging Workflow!

This workflow automatically analyzes the image content with the help of AI and writes the result directly back into the image file as keywords.

The workflow runs in the following sequence:
- Google Drive trigger (scans for new files added to a specific folder)
- Download the added image file
- Analyse the content of the image and extract the file as Base64 code
- Merge the metadata and the Base64 code
- Code node to write the keywords into the metadata (dc:subject)
- Convert to file and update the original file in the Google Drive folder

The following access is required for the workflow:
- Google Drive: Documentation
- AI API access (e.g. via OpenAI, Anthropic, Google or Ollama)

You can contact me via LinkedIn if you have any questions: https://www.linkedin.com/in/friedemann-schuetz
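To make the Code-node step concrete, here is a minimal sketch of building an XMP fragment that stores the AI keywords under dc:subject; the "keywords" input field is an assumption, and the template's actual node may embed the packet into the file differently:

```javascript
// Sketch: build an XMP packet whose dc:subject bag holds the AI keywords.
// The "keywords" field is assumed to come from the AI analysis step.
const keywords = $json.keywords; // e.g. ["beach", "sunset", "surfer"]

const bagItems = keywords
  .map(k => `      <rdf:li>${k}</rdf:li>`)
  .join('\n');

const xmp = `<x:xmpmeta xmlns:x="adobe:ns:meta/">
  <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
    <rdf:Description xmlns:dc="http://purl.org/dc/elements/1.1/">
      <dc:subject><rdf:Bag>
${bagItems}
      </rdf:Bag></dc:subject>
    </rdf:Description>
  </rdf:RDF>
</x:xmpmeta>`;

return [{ json: { ...$json, xmp } }];
```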
by Friedemann Schuetz
Welcome to my Automated Image Metadata Tagging Workflow!

DISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-exif-data Community Node!

This workflow automatically analyzes the image content with the help of AI and writes the result directly back into the image file as keywords (https://n8n.io/workflows/2995).

The workflow has the following steps:
- Google Drive trigger (scans for new files added to a specific folder)
- Download the added image file
- Analyse the content of the image
- Merge the metadata and the image file
- Write the keywords into the metadata (dc:subject/keywords) and create a new image file
- Update the original file in the Google Drive folder

The following access is required for the workflow:
- **You have to install the n8n-nodes-exif-data Community Node**
- Google Drive: Documentation
- AI API access (e.g. via OpenAI, Anthropic, Google or Ollama)

You can contact me via LinkedIn if you have any questions: https://www.linkedin.com/in/friedemann-schuetz
by Yaron Been
Bytedance Seededit 3.0 Image Generator

Description
Text-guided image editing model that preserves original details while making targeted modifications like lighting changes, object removal, and style conversion.

Overview
This n8n workflow integrates with the Replicate API to use the bytedance/seededit-3.0 model. This powerful AI model can generate high-quality image content based on your inputs.

Features
- Easy integration with the Replicate API
- Automated status checking and result retrieval
- Support for all model parameters
- Error handling and retry logic
- Clean output formatting

Parameters
Required:
- **prompt** (string): Text prompt for image generation
- **image** (string): Input image to edit

Optional:
- **seed** (integer, default: None): Random seed. Set for reproducible generation.
- **guidance_scale** (number, default: 5.5): Prompt adherence. Higher = more literal.

How to Use
1. Set up your Replicate API key in the workflow
2. Configure the required parameters for your use case
3. Run the workflow to generate image content
4. Access the generated output from the final node

API Reference
- Model: bytedance/seededit-3.0
- API Endpoint: https://api.replicate.com/v1/predictions

Requirements
- Replicate API key
- n8n instance
- Basic understanding of image generation parameters
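Under the hood, the create-then-poll pattern the workflow automates looks roughly like this; a sketch, not the template's exact nodes, with the prompt, image URL, and token handling as placeholders:

```javascript
// Sketch: create a prediction for bytedance/seededit-3.0 on Replicate,
// then poll the prediction's status URL until it settles.
const headers = {
  Authorization: `Bearer ${process.env.REPLICATE_API_TOKEN}`, // your API key
  'Content-Type': 'application/json',
};

const create = await fetch(
  'https://api.replicate.com/v1/models/bytedance/seededit-3.0/predictions',
  {
    method: 'POST',
    headers,
    body: JSON.stringify({
      input: {
        prompt: 'change the lighting to golden hour', // example edit
        image: 'https://example.com/source.jpg',      // image to edit
        guidance_scale: 5.5,
      },
    }),
  }
);

let prediction = await create.json();

// Poll until the prediction succeeds, fails, or is canceled.
while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
  await new Promise(r => setTimeout(r, 2000));
  const poll = await fetch(prediction.urls.get, { headers });
  prediction = await poll.json();
}

return [{ json: { status: prediction.status, output: prediction.output } }];
```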
by Antonio Cheong
Run Apache Airflow DAG and Retrieve XCom Value

What this workflow does
This workflow integrates the Apache Airflow API (DAGRun and XCom). It enables n8n to trigger Airflow DAGs and retrieve the execution results.

Preparation
1. Update the Airflow API link prefix
   - Navigate to the airflow-api node.
   - Update the prefix of the Airflow API link in the format http(s)://ip:port, e.g. https://airflow.example.com.
2. Configure authentication
   - Go to the Airflow: dag_run node.
   - Update the Basic Auth credentials with your Airflow username and password.
   - Repeat this step for the Airflow: dag_run - state and Airflow: dag_run - get result nodes.
   - Security note: Basic Authentication requires storing credentials in plaintext. If possible, consider using API keys or tokens for enhanced security. An example is setting Airflow's API authentication to basic_auth; choose other authentication methods if needed.
   - Ensure the user account has the following permissions: can create on DAG Runs, can read on DAG Runs, can read on XComs, can edit on DAGs, and can read on DAGs.

How to use
Execute this workflow with the Execute Sub-workflow node and the following input parameters:
- dag_id: The DAG ID (name) in Airflow that you want to trigger.
- task_id: The Task ID (name) from which you want to retrieve the XCom return_value.
- conf: Input data for the Airflow DAG run.
- wait: Delay (in seconds) between each Airflow: dag_run - state check.
- wait_time: The maximum time (in seconds) to wait for Airflow: dag_run - state before returning an error.

Output
The workflow returns the XCom result from Airflow: dag_run - get result. The XCom return_value is stored in the value field.
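For readers who want to see the underlying calls, the three nodes map onto Airflow's stable REST API roughly like this; BASE, DAG_ID, TASK_ID, and the credentials are placeholders:

```javascript
// Sketch of the three Airflow stable REST API calls the workflow chains:
// trigger the DAG, check the run state, then fetch the task's XCom.
const BASE = 'https://airflow.example.com/api/v1'; // your Airflow prefix
const DAG_ID = 'my_dag';    // placeholder
const TASK_ID = 'my_task';  // placeholder
const headers = {
  Authorization: 'Basic ' + Buffer.from('user:password').toString('base64'),
  'Content-Type': 'application/json',
};

// 1. Airflow: dag_run - trigger a new run, passing conf as input data
const run = await fetch(`${BASE}/dags/${DAG_ID}/dagRuns`, {
  method: 'POST',
  headers,
  body: JSON.stringify({ conf: { key: 'value' } }),
}).then(r => r.json());

// 2. Airflow: dag_run - state - check the run until it finishes
const state = await fetch(`${BASE}/dags/${DAG_ID}/dagRuns/${run.dag_run_id}`, {
  headers,
}).then(r => r.json()); // state.state: "queued" | "running" | "success" | "failed"

// 3. Airflow: dag_run - get result - read the task's XCom return_value
const xcom = await fetch(
  `${BASE}/dags/${DAG_ID}/dagRuns/${run.dag_run_id}/taskInstances/${TASK_ID}/xcomEntries/return_value`,
  { headers }
).then(r => r.json());

return [{ json: { state: state.state, value: xcom.value } }];
```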
by Baptiste Fort
🎯 Workflow Goal
Still manually checking form responses in your inbox? What if every submission landed neatly in Airtable, and you got a clean Slack message instantly? That's exactly what this workflow does. No code, no delay: just a smooth automation to keep your team in the loop.

Tally → Airtable → Slack

Build an automated flow that:
- receives Tally form submissions,
- cleans up the data into usable fields,
- stores the results in Airtable, and
- automatically notifies a Slack channel.

Step 1 – Connect Tally to n8n
What we're setting up: a Webhook node in POST mode.
Technical:
- Add a Webhook node.
- Set it to POST.
- Copy the generated URL.
- In Tally → Integrations → Webhooks → paste this URL.
- Submit a test response on your form to capture a sample structure.

Step 2 – Clean the data
After connecting Tally, you now receive raw data inside a fields[] array. Let's convert that into something clean and structured.
Goal: extract key info like Full Name, Email, and Phone into simple keys.
What we're doing: adding a Set node to remap and clean the fields.
Technical:
- Add a Set node right after the Webhook.
- Add new values (String type) manually:
  - Name: Full Name → Value: {{$json["fields"][0]["value"]}}
  - Name: Email → Value: {{$json["fields"][1]["value"]}}
  - Name: Phone → Value: {{$json["fields"][2]["value"]}}
- (Adapt the indexes based on your form structure.)
- Use the data preview in the Webhook node to check the correct order.
Output: you now get clean data like:
```json
{
  "Full Name": "Jane Doe",
  "Email": "jane@example.com",
  "Phone": "+123456789"
}
```

Step 3 – Send to Airtable ✅
Once the data is cleaned, let's store it in Airtable automatically.
Goal: create one new Airtable row for each form submission.
What we're setting up: an Airtable – Create Record node.
Technical:
- Add an Airtable node.
- Authenticate or connect your API token.
- Choose the base and table.
- Map the fields:
  - Name: {{$json["Full Name"]}}
  - Email: {{$json["Email"]}}
  - Phone: {{$json["Phone"]}}
Output: each submission creates a clean new row in your Airtable table.

Step 4 – Add a delay ⌛
After saving to Airtable, it's a good idea to insert a short pause; this prevents actions like Slack messages from stacking too fast.
Goal: wait a few seconds before sending the Slack notification.
What we're setting up: a Wait node for X seconds.
Technical:
- Add a Wait node.
- Choose "Wait for X seconds".

Step 5 – Send a message to Slack 💬
Now that the record is stored, let's send a Slack message to notify your team.
Goal: automatically alert your team in Slack when someone fills out the form.
What we're setting up: a Slack – Send Message node.
Technical:
- Add a Slack node.
- Connect your account.
- Choose the target channel, like #leads.
- Use this message format:
  New lead received!
  Name: {{$json["Full Name"]}}
  Email: {{$json["Email"]}}
  Phone: {{$json["Phone"]}}
Output: your Slack team is notified instantly, with all lead info in one clean message.

Workflow Complete
Your automation now looks like this: Tally → Clean → Airtable → Wait → Slack. Every submission turns into clean data, gets saved in Airtable, and alerts your team on Slack, fully automated, with no extra work.
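One optional hardening step, offered as a sketch rather than part of the original tutorial: replace the index-based Set node in Step 2 with a Code node that maps Tally's fields[] by label, so reordering form questions doesn't silently break the mapping. The payload path and label names below are assumptions; check them against your webhook's data preview:

```javascript
// Map the Tally fields[] array by label instead of by numeric index.
// Tally webhooks may nest the array under body.data, hence the fallbacks.
const fields = $json.fields ?? $json.body?.data?.fields ?? [];
const byLabel = Object.fromEntries(fields.map(f => [f.label, f.value]));

return [{
  json: {
    'Full Name': byLabel['Full Name'],
    'Email': byLabel['Email'],
    'Phone': byLabel['Phone'],
  },
}];
```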