by Friedemann Schuetz
Welcome to my Simple OpenAI Image Generator Workflow! This workflow creates an image with the new OpenAI image model "GPT-Image-1" based on a form input.

The workflow has the following sequence:
1. Form trigger (image prompt and image size input)
2. Generate the image via the OpenAI API (a hedged sketch of this call follows below)
3. Return the image to the input form for download

The following access is required for the workflow:
- OpenAI API access (Documentation / Instructions): link your OpenAI Platform account in the "OpenAI Image Generation" node ("Credential Type")

You can contact me via LinkedIn if you have any questions: https://www.linkedin.com/in/friedemann-schuetz
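For reference, here is a minimal sketch of the image request the workflow makes, assuming the standard OpenAI Images endpoint and that the `prompt` and `size` values come from the form trigger (the concrete values shown are only examples):

```javascript
// Minimal sketch: generate an image with gpt-image-1 (assumes OPENAI_API_KEY is set).
// In the real workflow, prompt and size come from the form trigger.
const response = await fetch('https://api.openai.com/v1/images/generations', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-image-1',
    prompt: 'A watercolor fox in a snowy forest', // from the form input
    size: '1024x1024',                            // from the form input
  }),
});

const result = await response.json();
// gpt-image-1 returns the image as base64; decode it so it can be offered for download.
const imageBuffer = Buffer.from(result.data[0].b64_json, 'base64');
```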
by Fady Bekkar
This automation allows you to track feature requests in Notion, create GitHub issues automatically, and notify your team via email based on issue status. It's ideal for technical and functional teams who collaborate on project delivery using Notion and GitHub.

🔹 SECTION 1: Detect and Sort Issues from Notion
Combining: Schedule Trigger + Notion Database + Field Mapping + Status Routing

⏰ 1. Schedule Trigger
🔧 Node Type: Schedule Trigger (you can use a webhook trigger if you are on a Notion paid plan)
💬 Description: Triggers the workflow every X minutes to check for new or updated Notion database pages.

📑 2. Get Many Database Pages (Notion)
🔧 Node Type: Notion → Get All Database Pages
📋 What it does: Fetches all rows (pages) from a Notion database that represents tasks or feature requests.

✏️ 3. Sort Issues Fields
🔧 Node Type: Set
📋 Goal: Restructures or cleans data fields such as Title, Status, Labels, and Repository (a hedged sketch of this mapping follows after this description).

🔀 4. Switch: Issue Status Decision
🔧 Node Type: Switch
🎯 What it does: Separates logic based on the Status of the Notion item:
- If status is "To develop" → proceed to create the issue
- Else → send a notification to the team

🔹 SECTION 2: GitHub Issue Creation (IF "To develop")
Combining: GitHub Node + Notion Update

🐙 5. Create an Issue (GitHub)
🔧 Node Type: GitHub → Create Issue
⚙️ What it does: Creates a new issue on the GitHub repo defined in the Notion row.
📥 Inputs: Uses dynamic fields: Title, Description, Labels, Repository.

🧩 6. Set Status and Issue URL (Notion Update)
🔧 Node Type: Notion → Update Database Page
🧠 Role: Updates the status of the issue in Notion to In progress and stores the created GitHub Issue URL.

🔹 SECTION 3: Notify Team on Already In-Progress Items (IF NOT "To develop")
Combining: Notion Users + Filtering + Email Grouping + Gmail

👥 7. Get Many Users (Notion Users)
🔧 Node Type: Notion → Get All Users
📥 What it does: Retrieves the list of team members (to be notified).

🧠 8. Map Notion Users
🔧 Node Type: Set
📋 Role: Maps and formats data for each user (e.g., Name, Email, Role).

🧹 9. Exclude Bot
🔧 Node Type: Switch
🚫 What it does: Excludes automation/bot users (e.g., notifications@noreply).

🧮 10. Group Recipients
🔧 Node Type: Aggregate
🎯 Goal: Collects all user emails into a single array to send one email to all recipients.

📬 11. Send a Message (Gmail)
🔧 Node Type: Gmail → Send Email
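As a rough illustration of the "Sort Issues Fields" step, here is what the field mapping could look like if written as an n8n Code node instead of a Set node. The Notion property names (Name, Status, Labels, Repository) are assumptions; adjust them to your own database schema and to the actual shape of the Notion node's output:

```javascript
// Sketch of restructuring Notion pages into flat issue fields for the GitHub node.
// Property names below are assumptions for a typical feature-request database.
return items.map((item) => {
  const page = item.json;
  return {
    json: {
      notionPageId: page.id,
      title: page.Name,
      description: page.Description,
      status: page.Status,
      labels: page.Labels,
      repository: page.Repository,
    },
  };
});
```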
by Yaron Been
This workflow provides automated access to the IBM Granite 3.3 8B Instruct AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for text generation tasks within your n8n automation workflows.

Overview
This workflow automatically handles the complete text generation process using the IBM Granite 3.3 8B Instruct model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation. A hedged sketch of the underlying request/poll cycle follows this description.

Model Description: Granite-3.3-8B-Instruct is an 8-billion-parameter, 128K-context-length language model fine-tuned for improved reasoning and instruction-following capabilities.

Key Capabilities
- Advanced text generation and processing
- Natural language understanding and generation
- Intelligent text manipulation and analysis

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Replicate API: Access to the ibm-granite/granite-3.3-8b-instruct AI model
- IBM Granite 3.3 8B Instruct: The core AI model for text generation
- Built-in Error Handling: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Text Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- Content Writing: Generate articles, blogs, and marketing copy
- Code Generation: Assist with programming and code documentation
- Text Analysis: Process and analyze large volumes of text data
- Automated Communication: Generate responses and communication templates

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #textgeneration #nlp #aiwriting #textai #contentgeneration #aitext #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
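A minimal sketch of the create-then-poll cycle the workflow performs against the Replicate API, written as plain JavaScript rather than the exact n8n nodes. The input fields (here just `prompt`) are assumptions; check the model page for the full input schema, and note that community models may require `POST /v1/predictions` with an explicit version id instead of the versionless endpoint shown here:

```javascript
// Create a prediction on Replicate and poll until it finishes.
// Assumes REPLICATE_API_TOKEN is set in the environment.
const headers = {
  Authorization: `Bearer ${process.env.REPLICATE_API_TOKEN}`,
  'Content-Type': 'application/json',
};

const created = await fetch(
  'https://api.replicate.com/v1/models/ibm-granite/granite-3.3-8b-instruct/predictions',
  {
    method: 'POST',
    headers,
    body: JSON.stringify({ input: { prompt: 'Summarize the benefits of workflow automation.' } }),
  }
).then((r) => r.json());

// Simple retry loop: poll the prediction URL until it succeeds, fails, or is canceled.
let prediction = created;
while (!['succeeded', 'failed', 'canceled'].includes(prediction.status)) {
  await new Promise((resolve) => setTimeout(resolve, 2000));
  prediction = await fetch(created.urls.get, { headers }).then((r) => r.json());
}

console.log(prediction.status === 'succeeded' ? prediction.output : prediction.error);
```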
by Nisarag
This workflow will allow you to get the latest Twitter mentions and send those mentions to Rocket.Chat. To ensure that we don't resend the same tweets, we use the Function node and getWorkflowStaticData() to persist the IDs of the tweets that have already been sent and filter them out, leaving only the newest tweets (a sketch of that filtering follows below).
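Here is a minimal sketch of that de-duplication step, roughly as it could appear in the Function node. getWorkflowStaticData() is n8n's built-in helper for persisting small amounts of data between executions of an active workflow; the key name `sentTweetIds` is only illustrative:

```javascript
// Keep track of tweet IDs that were already forwarded to Rocket.Chat.
const staticData = getWorkflowStaticData('global');
const seenIds = staticData.sentTweetIds || [];

// Keep only tweets we have not sent before.
const newTweets = items.filter((item) => !seenIds.includes(item.json.id));

// Remember the new IDs for the next run (static data persists only for active workflows).
staticData.sentTweetIds = seenIds.concat(newTweets.map((item) => item.json.id));

return newTweets;
```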
by damien
This workflow reports the main stats of our Instagram account. It's the first one I'm sharing with the community, as I am a beginner :) Every morning it tells me how many followers we have and how many posts have been made, and the results are posted to Mattermost. I use a fantastic tool, https://socialblade.com, to retrieve the Instagram data, together with https://martechwithme.com/monitoring-youtube-channels-subscribers-with-google-sheets/ to send it to Google Sheets. I hope this can help :) It can certainly be improved, so I will be very pleased to have your comments. Thanks!
by Yaron Been
This workflow provides automated access to the 0Xdino Cyberrealistic Pony Semireal V36 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview
This workflow automatically handles the complete generation process using the 0Xdino Cyberrealistic Pony Semireal V36 model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Replicate API: Access to the 0Xdino/cyberrealistic-pony-semireal-v36 AI model
- 0Xdino Cyberrealistic Pony Semireal V36: The core AI model for generation tasks
- Built-in Error Handling: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- Specialized Processing: Handle specific AI tasks and workflows
- Custom Automation: Implement unique business logic and processing
- Data Processing: Transform and analyze various types of data
- AI Integration: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation
by Maximiliano Rojas-Delgado
Turn Your Ideas into Videos, Right from Google Sheets!

This workflow helps you make cool 8-second videos using Fal.AI and Veo 3, just by typing your idea into a Google Sheet. You can even choose whether you want your video to have sound. It's super easy, no tech skills needed!

Why use this?
- Just type your idea in a sheet, no fancy tools or uploads.
- Get a video link back in the same sheet.
- Works with or without sound, your choice!

How does it work?
1. You write your idea, pick the video shape, and say whether you want sound (true or false) in the Google Sheet.
2. n8n reads your idea and asks Fal.AI to make your video (a hedged sketch of that request follows below).
3. When your video is ready, the link shows up in your sheet.

What do you need?
- A Google account and Google Sheets connected with a service account (check this link for reference)
- A copy of the following Google Spreadsheet: Spreadsheet to copy
- An OpenAI API key
- A Fal.AI account with some money in it

That's it! Just add your ideas and let the workflow make the videos for you. Have fun creating! If you have any questions, just contact me at max@nervoai.com
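A rough sketch of the request n8n could send to Fal.AI for one sheet row. The endpoint path, the model id (fal-ai/veo3), and the input field names (prompt, aspect_ratio, generate_audio) are all assumptions here; check the Fal.AI documentation and the workflow's HTTP Request node for the exact schema:

```javascript
// Submit one video job to the Fal.AI queue API (assumes FAL_KEY is set).
const row = { idea: 'A dog surfing a big wave', shape: '16:9', sound: true }; // from the Google Sheet

const submitted = await fetch('https://queue.fal.run/fal-ai/veo3', {
  method: 'POST',
  headers: {
    Authorization: `Key ${process.env.FAL_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: row.idea,
    aspect_ratio: row.shape,
    generate_audio: row.sound,
  }),
}).then((r) => r.json());

// The queue responds with a request id; the workflow later polls for the finished
// video URL and writes it back into the same sheet row.
console.log(submitted);
```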
by sthosstudio
This workflow automatically tracks changes on specific websites, typically in e-commerce where you want to get information about price changes.

Prerequisites
- Basic knowledge of HTML and JavaScript

Nodes
- Execute Command nodes create a file named kopacky.json in the /data/ folder (make sure that n8n has the permissions to make changes to the folder in your setup) and clean data.
- Cron node triggers the workflow at regular intervals (default is 15 minutes), depending on how often you want to crawl the URLs of your watchers.
- Function Item node (Change me) adds the URL watchers. You can add as many URLs (watchers) as you want by changing the JavaScript code in the node (see the sketch after this description). There are four properties for each watcher:

|Property|Meaning|
|-|-|
|slug|Unique identifier for the watcher.|
|link|URL of the website where you want to track changes.|
|selector|CSS selector of the HTML tag where your price is placed. You can use the browser's web tools to get a specific selector.|
|currency|Currency code in which your price is set.|

- Function Item node (Init item) saves all required data from each watcher to the kopacky.json file.
- HTTP Request node fetches data from the website.
- HTML Extract node extracts the required information from the webpage.
- Send Email nodes (NotifyBetterPrice) send you an email when there is an issue with getting the price, and when a better price is available (this could happen if the website is down, the product you are tracking is no longer available, or the owner of the website changed the selector or HTML).
- IF nodes filter the incoming data and route the workflow.
- Move Binary Data nodes convert the JSON file to binary data.
- Write Binary File nodes write the product prices to the file.

NOTE: This is the first (beta) version of this workflow, so it could have some issues. For example, there is an issue with getting the content of websites whose owners block calls from unknown foreign services - a typical protection against crawlers.
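A minimal sketch of how the watcher list might be defined in the "Change me" node (shown in the Function/Code node style of returning one item per watcher). The concrete slug, link, selector, and currency values are only examples; replace them with the products you want to track:

```javascript
// Define one output item per watcher; downstream nodes fetch each link and
// extract the price with the given CSS selector.
const watchers = [
  {
    slug: 'example-shoes',
    link: 'https://example.com/products/shoes',
    selector: '#product-price',
    currency: 'EUR',
  },
  // Add as many watchers as you like here.
];

return watchers.map((watcher) => ({ json: watcher }));
```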
by n8n Team
This workflow automatically downloads a CSV from the web and parses it into a format that n8n can access. It then ensures that the data from the CSV is matched to the names of the columns in the database, and inserts this data as new rows in Snowflake.

Prerequisites
- A CSV with data
- A Snowflake account and credentials
- A Snowflake database to upload your CSV to

Nodes
- An HTTP Request node to download the CSV file
- A Spreadsheet File node to access the data from the CSV
- A Set node to ensure the data from the CSV is mapped to the column names of the Snowflake database (a hedged sketch of this mapping follows below)
- A Snowflake node to insert these new rows into the database
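For illustration, here is the kind of column mapping the Set node performs, written as an n8n Code node. The CSV headers ("First Name", "Email") and the Snowflake column names (FIRST_NAME, EMAIL) are assumptions; rename them to match your own file and table:

```javascript
// Rename CSV fields so each output item's keys match the Snowflake column names.
return items.map((item) => ({
  json: {
    FIRST_NAME: item.json['First Name'],
    EMAIL: item.json['Email'],
  },
}));
```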
by n8n Team
The workflow is an automated process designed for incident management and tracking, specifically by integrating Splunk alerts with a Jira ticketing system using n8n.

The initial step in the workflow is a Webhook Trigger, which is set up to receive POST requests with data from Splunk to initiate the workflow. Once the workflow is triggered, the "Set Host Name" node cleans up the hostname received from Splunk, ensuring that it is alphanumeric for consistency and security purposes. Subsequently, the "Search Ticket" node interacts with Jira through a Jira Query Language (JQL) request to locate any existing issues that match the sanitized hostname (a hedged sketch of this sanitization and search follows below).

The workflow splits at the "IF Ticket Not Exists" node, which checks for the presence of a key indicating a matching issue. If an issue exists, the workflow adds a comment to the identified issue; if not, it creates a new Jira issue. On the false path, the "Add Ticket Comment" node appends a new comment to the existing Jira issue, encapsulating details from the Splunk alert, such as the timestamp and the alert description.
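A small sketch of the hostname cleanup and the JQL it could feed into the "Search Ticket" node, written as an n8n Code node. The Splunk payload field name (`host`) and the JQL details (project key, summary match) are assumptions; adjust them to your Splunk alert and Jira project:

```javascript
// Sanitize the hostname from the Splunk webhook payload and build a JQL search string.
const rawHost = $json.body.host || 'web-01.example.com'; // payload field name is assumed
const hostName = rawHost.replace(/[^a-zA-Z0-9]/g, '');   // keep alphanumeric characters only

// JQL passed to the Jira node to find an existing ticket for this host.
const jql = `project = OPS AND summary ~ "${hostName}" ORDER BY created DESC`;

return [{ json: { hostName, jql } }];
```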
by n8n Team
This n8n workflow provides a comprehensive automation solution for processing email attachments, specifically targeting enhanced security protocols for organizations that use platforms like Outlook. It starts with the IMAP node, which ingests emails and identifies those with .eml attachments. Once an email with an attachment is ingested, the workflow moves to a conditional operation that checks for the presence of attachments. If an attachment is found, the binary data is moved and converted to JSON format, preparing it for further analysis. This careful approach to detecting attachments is crucial for maintaining a robust security posture, allowing potentially malicious content to be identified and handled proactively.

In the next stage, the workflow leverages Sublime Security to analyze the email attachment. The binary file is scanned for threats, and the analysis results are split into matched and unmatched data (a hedged sketch of that split follows below). This keeps the threat detection step fast and produces output that downstream systems such as Slack can consume directly, with minimal user involvement, strengthening the organization's defense against cyber threats.

The final phase of the workflow prepares the output for a Slack report. Whether or not a threat is detected, n8n ensures that stakeholders are informed immediately by dispatching comprehensive reports or notifications to Slack channels, promoting a culture of transparency and prompt action within the team.
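A rough sketch of splitting the analysis result into matched and unmatched detections before building the Slack report, written as an n8n Code node. The response shape assumed here (a `rule_results` array with a `matched` flag and `name` per rule) is purely illustrative; inspect the actual Sublime Security node output in your instance and adjust the field names:

```javascript
// Separate matched detection rules from unmatched ones and summarize for Slack.
const ruleResults = $json.rule_results || [];

const matched = ruleResults.filter((rule) => rule.matched === true);
const unmatched = ruleResults.filter((rule) => rule.matched !== true);

return [
  {
    json: {
      threatDetected: matched.length > 0,
      matchedRules: matched.map((rule) => rule.name),
      unmatchedCount: unmatched.length,
    },
  },
];
```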
by Mutasem
Use case
If you have a form where potential leads reach out, you probably want to analyze those leads and send a notification when certain requirements are met, e.g. the employee count is high enough. MadKudu is built exactly to solve this problem. We use it along with Hunter and Gmail to get an email alert for high-quality leads.

How to set up
1. Add your MadKudu, Hunter, and Gmail credentials
2. Set the email to send to
3. Click the Test Workflow button, enter your email, and check your email
4. Activate the workflow and use the form trigger production URL to collect your leads in a smart way

How to adjust this template
You may want to raise or lower the threshold for your leads, as you see fit (a small sketch of such a threshold check follows below).
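An illustrative sketch of what a lead-quality threshold check could look like in an n8n Code node. The field names (`score`, `employees`) and the threshold values are assumptions, not the template's exact logic; tune them to the data MadKudu and Hunter return in your setup:

```javascript
// Only pass leads that meet the quality thresholds on to the Gmail alert.
const MIN_SCORE = 80;      // assumed MadKudu fit-score threshold
const MIN_EMPLOYEES = 50;  // assumed minimum company size worth alerting on

const qualified = items.filter((item) => {
  const lead = item.json;
  return (lead.score || 0) >= MIN_SCORE && (lead.employees || 0) >= MIN_EMPLOYEES;
});

return qualified;
```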