by Marth
How It Works ⚙️

This workflow acts as a communication bridge for your candidate pipeline:

1. Webhook Trigger (Status Update) 🚀: The workflow activates when it receives data indicating a candidate's status has changed. This data could come from an internal form, a custom script, or a webhook from a basic Applicant Tracking System (ATS).
2. Extract & Prepare Data (Function) 🧹: This node processes the incoming data. It extracts key information such as the candidate's name, the position they applied for, their previous status (if available), and their new status. It then formats this information into a clear, concise message suitable for a notification.
3. Send Slack Notification 📢: The prepared message is sent to a designated Slack channel (e.g., #recruitment-updates). This provides instant, real-time updates to your team, ensuring everyone is on the same page.
   - Alternative (Send Email Notification): this node can easily be swapped with a Gmail or SendGrid node to send email notifications to a predefined list of recipients instead of Slack.

How to Set Up 🛠️

Follow these steps carefully to get your "Automated Candidate Status Notifier" workflow up and running:

1. Import Workflow JSON: Open your n8n instance, click 'Workflows' in the left sidebar, then click the '+' button or 'New' to create a new workflow. Click the '...' (More Options) icon in the top right, select 'Import from JSON', and paste the entire JSON code for this workflow.
2. Configure Webhook Trigger (Status Update): Locate the 'Webhook Trigger (Status Update)' node (1. Webhook Trigger) and activate the workflow; n8n will provide a unique 'Webhook URL'. Crucial step: configure your data-sending system (e.g., a form submission, an ATS's webhook settings, or your custom script) to send candidate status update data (preferably in JSON format via POST request) to this n8n Webhook URL.
3. Configure Extract & Prepare Data (Function): Locate the 'Extract & Prepare Data' node (2. Extract & Prepare Data). Adjust field names: review the functionCode inside this node. You MUST adjust the variable assignments (e.g., inputData.candidateName, inputData.position) to accurately match the exact field names your sending system uses for candidate name, position, new status, old status, and notes. Use the 'Test Workflow' feature after sending a test webhook to inspect the incoming items[0].json.body data structure. The node automatically formats messages for Slack and Email (a hedged sketch of this node follows at the end of this section).
4. Configure Send Slack Notification: Locate the 'Send Slack Notification' node (3. Send Slack Notification). Credentials: select your existing Slack API credential or click 'Create New' to set one up, and replace YOUR_SLACK_CREDENTIAL_ID with the actual ID or name of your credential from your n8n credentials. Channel: replace YOUR_SLACK_CHANNEL_ID_OR_NAME with the exact ID or name of the Slack channel where you want to receive notifications (e.g., #recruitment-updates).
5. OPTIONAL: Switch to Email Notification (Gmail/SendGrid/etc.): Delete the 'Send Slack Notification' node and add a new 'Gmail' or 'SendGrid' (or your preferred email service) node. Configure its credentials, set the 'To Email' field (e.g., your-team-email@example.com), set the 'Subject' to ={{ $json.emailSubject }}, set the 'HTML' body to ={{ $json.emailBody }}, and connect it from the 'Extract & Prepare Data' node.
6. Review and Activate: Thoroughly review all node configurations and ensure all placeholder values (like YOUR_...) are replaced and settings are correct. Click the 'Save' button in the top right corner, then toggle the 'Inactive' switch to 'Active' to enable your workflow. 🟢

Your automated candidate status notifier is now live, keeping your team updated in real-time!
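For orientation, here is a minimal, hypothetical version of what the 'Extract & Prepare Data' Function node can contain. The incoming field names (candidateName, position, newStatus, oldStatus, notes) are assumptions: map them to whatever your form or ATS actually sends, after inspecting items[0].json.body with 'Test Workflow'.

```js
// Hypothetical 'Extract & Prepare Data' Function node; adjust field names to your payload.
const inputData = items[0].json.body || items[0].json;

const candidateName = inputData.candidateName || 'Unknown candidate'; // assumed field name
const position = inputData.position || 'Unknown position';           // assumed field name
const newStatus = inputData.newStatus || 'Unknown status';           // assumed field name
const oldStatus = inputData.oldStatus || 'N/A';                      // assumed field name
const notes = inputData.notes || '';                                 // assumed field name

// Slack-friendly message
const slackMessage =
  `📢 *Candidate Update*: ${candidateName} (${position}) moved from *${oldStatus}* to *${newStatus}*.` +
  (notes ? `\nNotes: ${notes}` : '');

// Subject and HTML body consumed by the optional Gmail/SendGrid node
const emailSubject = `Candidate status update: ${candidateName} → ${newStatus}`;
const emailBody =
  `<p><strong>${candidateName}</strong> (${position}) changed status from ${oldStatus} to <strong>${newStatus}</strong>.</p>` +
  (notes ? `<p>Notes: ${notes}</p>` : '');

return [{ json: { candidateName, position, oldStatus, newStatus, slackMessage, emailSubject, emailBody } }];
```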
by Usman Liaqat
This workflow listens for incoming WhatsApp messages that contain media (e.g., images) and automatically downloads the media file using WhatsApp's private media URL.

- The trigger node activates when a WhatsApp message with media is received.
- The media ID is extracted from the message payload.
- A private media URL is retrieved using the media ID.
- The media file is downloaded using an authenticated HTTP request (see the sketch at the end of this listing).

Ideal for:
- Archiving WhatsApp media to external systems.
- Triggering further automations based on received media.
- Integrating with cloud storage like Google Drive, Dropbox, or Amazon S3.

Set up steps:
1. Connect your WhatsApp Business API account.
2. Add HTTP credentials for downloading media via private URL.
3. Set up the webhook in your WhatsApp Business account.
4. Extend the workflow as needed for your use case (e.g., file storage, alerts).
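For reference, the two HTTP calls this workflow performs can be sketched as follows, written as an n8n Code node purely for illustration. It assumes the WhatsApp Business Cloud API on graph.facebook.com; the payload path, Graph API version, and access token are placeholders to adapt.

```js
// Conceptual sketch of the media lookup; assumes the WhatsApp Business Cloud API.
const mediaId = items[0].json.messages?.[0]?.image?.id; // assumed location of the media ID in the trigger payload
const token = 'YOUR_WHATSAPP_ACCESS_TOKEN';              // placeholder credential

// Step 1: exchange the media ID for a short-lived private download URL
const meta = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://graph.facebook.com/v19.0/${mediaId}`,
  headers: { Authorization: `Bearer ${token}` },
  json: true,
});

// Step 2 is handled by the next HTTP Request node in the workflow:
// GET meta.url with the same Bearer token to receive the binary file.
return [{ json: { mediaUrl: meta.url, mimeType: meta.mime_type } }];
```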
by Yaron Been
Description

This workflow automatically searches multiple freelance platforms for new gigs matching your skills and requirements. It saves you time by eliminating the need to manually check multiple job boards and sends you alerts for relevant opportunities.

Overview

This workflow automatically scrapes freelance job boards and platforms for new gigs matching your skills and requirements. It uses Bright Data to access job listings and can notify you of new opportunities or save them to a database.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow.
- **Bright Data**: For scraping freelance platforms like Upwork, Fiverr, Freelancer, etc. without getting blocked.
- **(Optional) Email/Slack/Database**: For notifications or data storage.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: Configure how you want to receive job alerts.
4. Customize: Add your skills, rate requirements, and other filters (a hypothetical filter sketch follows at the end of this listing).

Use Cases
- **Freelancers**: Get notified of new gigs matching your skills.
- **Agencies**: Monitor job boards for potential client opportunities.
- **Remote Workers**: Track new remote job postings across multiple platforms.

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #freelance #brightdata #webscraping #remotework #freelancejobs #gigeconomy #upwork #fiverr #freelancer #jobsearch #remotejobs #freelanceopportunities #n8nworkflow #workflow #nocode #jobhunting #freelancegigs #jobscraper #jobnotifications #freelancecareer #digitalnomad #workfromhome #jobopportunities #freelancetools #jobmonitoring #freelancesuccess
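If you want a starting point for the 'Customize' step, the snippet below is a hypothetical n8n Code node that filters scraped listings by skill keywords and a minimum rate. The field names (title, description, hourlyRate) are assumptions about what your Bright Data node returns; adjust them to the real output.

```js
// Hypothetical filter step: keep only gigs that mention your skills and meet your rate.
const MY_SKILLS = ['n8n', 'automation', 'python']; // example skill keywords
const MIN_HOURLY_RATE = 40;                        // example minimum rate

const matches = items.filter(item => {
  const job = item.json;
  const text = `${job.title || ''} ${job.description || ''}`.toLowerCase(); // assumed fields
  const skillMatch = MY_SKILLS.some(skill => text.includes(skill.toLowerCase()));
  const rateOk = !job.hourlyRate || job.hourlyRate >= MIN_HOURLY_RATE;       // assumed field
  return skillMatch && rateOk;
});

return matches;
```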
by Yaron Been
Description

This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It helps crypto traders and investors identify buying opportunities without constantly watching the markets.

Overview

This workflow monitors Bitcoin prices across multiple exchanges and sends you alerts when significant price drops occur. It uses Bright Data to scrape real-time price data and can be configured to notify you through various channels.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow.
- **Bright Data**: For scraping cryptocurrency exchange data without getting blocked.
- **Notification Services**: Email, SMS, Telegram, or other messaging platforms.

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance.
2. Configure Bright Data: Add your Bright Data credentials to the Bright Data node.
3. Set Up Notifications: Configure your preferred notification method.
4. Customize: Set your price thresholds, monitoring frequency, and which exchanges to track (a threshold sketch follows at the end of this listing).

Use Cases
- **Crypto Traders**: Get notified of buying opportunities during price dips.
- **Investors**: Monitor your crypto investments and make informed decisions.
- **Financial Analysts**: Track Bitcoin price movements for market analysis.

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #bitcoin #cryptocurrency #brightdata #pricealerts #cryptotrading #bitcoinalerts #cryptoalerts #cryptomonitoring #n8nworkflow #workflow #nocode #cryptoinvesting #bitcoinprice #cryptomarket #tradingalerts #cryptotools #bitcointrading #pricemonitoring #cryptoautomation #bitcoininvestment #cryptotracker #marketalerts #tradingopportunities #cryptoprices
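As a starting point for the price-threshold step, here is a hypothetical n8n Code node that flags significant drops. The field names (currentPrice, previousPrice, exchange) are assumptions about the scraped data; adjust them to what your Bright Data node actually returns.

```js
// Hypothetical threshold check: flag items whose price dropped by more than N percent.
const DROP_THRESHOLD_PERCENT = 5; // example: alert when the price drops 5% or more

return items.map(item => {
  const { currentPrice, previousPrice, exchange } = item.json; // assumed field names
  const dropPercent = ((previousPrice - currentPrice) / previousPrice) * 100;
  return {
    json: {
      exchange,
      currentPrice,
      dropPercent: Number(dropPercent.toFixed(2)),
      alert: dropPercent >= DROP_THRESHOLD_PERCENT, // downstream IF node can route on this flag
    },
  };
});
```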
by Friedemann Schuetz
Welcome to my Airbnb Telegram Agent Workflow! This workflow creates an intelligent Telegram bot that helps users search and find Airbnb accommodations using natural language queries and voice messages.

DISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-mcp-client Community Node!

What this workflow does

This workflow processes incoming Telegram messages (text or voice) and provides personalized Airbnb accommodation recommendations. The AI agent understands natural language queries, searches through Airbnb data using MCP tools, and returns mobile-optimized results with clickable links, prices, and key details.

Key Features:
- Voice message support (speech-to-text and text-to-speech)
- Conversation memory for context-aware responses
- Mobile-optimized formatting for Telegram
- Real-time Airbnb data access via MCP integration

This workflow has the following sequence:
1. Telegram Trigger - Receives incoming messages from users.
2. Text or Voice Switch - Routes based on message type (a hedged routing sketch follows at the end of this listing).
3. Voice Processing (if applicable) - Downloads and transcribes voice messages.
4. Text Preparation - Formats text input for the AI agent.
5. Airbnb AI Agent - Core logic that lists available MCP tools for Airbnb data, executes searches with parsed parameters, and formats results for mobile display.
6. Response Generation - Sends the formatted text response.
7. Voice Response (optional) - Creates and sends an audio summary.

Requirements:
- **Telegram Bot API** (Documentation): Create a bot via @BotFather on Telegram, get the bot token, and configure the webhook.
- **OpenAI API** (Documentation): Used for speech transcription (Whisper), chat completion (GPT-4), and text-to-speech generation.
- **MCP Community Client Node** (Documentation): Custom integration for Airbnb data; requires an MCP server setup with an Airbnb/Airtable connection and provides tools for accommodation search and details.

Important: You need to set up an MCP server with Airbnb data access. The workflow uses MCP tools to retrieve real accommodation data, so ensure your MCP server is properly configured with the Airtable/Airbnb integration.

Configuration Notes:
- Update the Telegram chat ID in the trigger for your specific bot.
- Modify the system prompt in the Airbnb Agent for different use cases.
- The workflow supports individual users and can be extended for group chats.

Feel free to contact me via LinkedIn, if you have any questions!
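For reference, the routing decision behind the 'Text or Voice Switch' can be sketched as a Code node like the one below. It relies on the Telegram Bot API convention that voice messages carry a message.voice object with a file_id; your trigger output may nest the fields differently.

```js
// Hypothetical text-vs-voice routing logic (the actual workflow uses a Switch node).
const msg = items[0].json.message || {};

if (msg.voice) {
  // Voice branch: pass the file_id on so the next step can download and transcribe it with Whisper.
  return [{ json: { type: 'voice', fileId: msg.voice.file_id, chatId: msg.chat.id } }];
}

// Text branch: hand the raw text straight to the AI agent.
return [{ json: { type: 'text', text: msg.text || '', chatId: msg.chat.id } }];
```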
by Yaron Been
This workflow provides automated access to the Izzaanel Betia AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Izzaanel Betia model. It manages API authentication, parameter configuration, request processing, and result retrieval with built-in error handling and retry logic for reliable automation.

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Izzaanel/betia AI model
- **Izzaanel Betia**: The core AI model for generation tasks
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs (a hedged request sketch follows at the end of this listing)
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
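For illustration, the snippet below sketches the kind of request the workflow's HTTP nodes send to Replicate, written as an n8n Code node. It assumes the standard Replicate predictions endpoint; the version hash and input fields are placeholders you must copy from the izzaanel/betia model page.

```js
// Minimal sketch of creating a Replicate prediction; not the workflow's exact node configuration.
const token = 'YOUR_REPLICATE_API_TOKEN'; // placeholder, normally held in the 'Set API Token' node

const prediction = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.replicate.com/v1/predictions',
  headers: {
    Authorization: `Bearer ${token}`,
    'Content-Type': 'application/json',
  },
  body: {
    version: 'IZZAANEL_BETIA_VERSION_HASH',  // placeholder: the izzaanel/betia version id
    input: { prompt: 'your input here' },    // placeholder: actual parameters depend on the model
  },
  json: true,
});

// The response carries an id and a status ("starting"/"processing"); poll the prediction
// until the status becomes "succeeded" to retrieve the output.
return [{ json: prediction }];
```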
by Thibaud
Title: Automatic Strava Titles & Descriptions Generation with AI

Description: This n8n workflow connects your Strava account to an AI to automatically generate personalized titles and descriptions for every new cycling activity. It leverages the native Strava trigger to detect new activities, extracts and formats ride data, then queries an AI agent (OpenRouter, ChatGPT, etc.) with an optimized prompt to get a catchy title and inspiring description. The workflow then updates the Strava activity in real time, with zero manual intervention. A hedged sketch of the data-preparation step follows at the end of this listing.

Key Features:
- Secure connection to the Strava API (OAuth2)
- Automatic triggering for every new activity
- Intelligent data preparation and formatting
- AI-powered generation of personalized content (title + description)
- Instant update of the activity on Strava

Use Cases:
- Cyclists wanting to automatically enhance their Strava rides
- Sports content creators
- Community management automation for sports groups

Prerequisites:
- Strava account
- Strava OAuth2 credentials set up in n8n
- Access to a compatible AI agent (OpenRouter, ChatGPT, etc.)

Benefits:
- Saves time
- Advanced personalization
- Boosts the appeal of every ride to your community
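A hypothetical version of the 'prepare ride data' step is shown below: it condenses raw Strava activity fields into a prompt for the AI agent. The field names (distance, moving_time, total_elevation_gain) follow the Strava API, with distances in meters and times in seconds; the prompt wording is just an example.

```js
// Hypothetical data-preparation step: turn raw Strava activity fields into a compact prompt.
const a = items[0].json;

const km = (a.distance / 1000).toFixed(1);              // Strava reports distance in meters
const minutes = Math.round(a.moving_time / 60);          // moving_time is in seconds
const elevation = Math.round(a.total_elevation_gain);    // elevation gain in meters

const prompt =
  `Write a catchy title and an inspiring 2-3 sentence description for this ride: ` +
  `${km} km in ${minutes} minutes with ${elevation} m of climbing.`;

// activityId is kept so the final node can update the right activity on Strava
return [{ json: { activityId: a.id, prompt } }];
```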
by JustinLee
This workflow demonstrates a simple Retrieval-Augmented Generation (RAG) pipeline in n8n, split into two main sections:

🔹 Part 1: Load Data into Vector Store
- Reads files from disk (or Google Drive).
- Splits content into manageable chunks using a recursive text splitter.
- Generates embeddings using the Cohere Embedding API.
- Stores the vectors into an In-Memory Vector Store (for simplicity; can be replaced with Pinecone, Qdrant, etc.).

🔹 Part 2: Chat with the Vector Store
- Takes user input from a chat UI or trigger node.
- Embeds the query using the same Cohere embedding model.
- Retrieves similar chunks from the vector store via similarity search (an illustrative sketch follows at the end of this listing).
- Uses Groq-hosted LLM to generate a final answer based on the context.

🛠️ Technologies Used:
- 📦 Cohere Embedding API
- ⚡ Groq LLM for fast inference
- 🧠 n8n for orchestrating and visualizing the flow
- 🧲 In-Memory Vector Store (for prototyping)

🧪 Usage:
1. Upload or point to your source documents.
2. Embed them and populate the vector store.
3. Ask questions through the chat trigger node.
4. Receive context-aware responses based on retrieved content.
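For intuition, the similarity search the vector store performs boils down to cosine similarity plus a top-k sort, as in this illustrative JavaScript sketch. The In-Memory Vector Store node does this for you; the { text, embedding } shape is an assumption made only for the example.

```js
// Illustrative similarity search over stored { text, embedding } pairs.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Rank stored chunks against the embedded user query and keep the top k as context for the LLM.
function topKChunks(queryEmbedding, storedChunks, k = 4) {
  return storedChunks
    .map(chunk => ({ ...chunk, score: cosineSimilarity(queryEmbedding, chunk.embedding) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}
```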
by iamvaar
⚠️ RUN THE FIRST WORKFLOW ONLY ONCE: it converts your content into embeddings, saves them in the DB, and is then ready for the RAG chat.

FIRST WORKFLOW EXPLANATION:

📌 Telegram Trigger
- Type: telegramTrigger
- Purpose: Waits for new Telegram messages to trigger the workflow.
- Note: Currently disabled.

📄 Content for the Training
- Type: googleDocs
- Purpose: Fetches document content from Google Docs using its URL.
- Details: Uses Service Account authentication.

✂️ Splitting into Chunks
- Type: code
- Purpose: Splits the fetched document text into smaller chunks (1000 chars each) for processing.
- Logic: Loops over the text and slices it (a minimal sketch is shown at the end of this listing).

🧠 Embedding Uploaded Document
- Type: httpRequest
- Purpose: Calls the Together AI embedding API to get vector embeddings for each text chunk.
- Details: Sends JSON with the model name and the chunk as input.

🛢 Save the embedding in DB
- Type: supabase
- Purpose: Saves each text chunk and its embedding vector into the Supabase embed table.

SECOND WORKFLOW EXPLANATION:

💬 When chat message received
- Type: chatTrigger
- Purpose: Starts the workflow when a user sends a chat message.
- Details: Sends an initial greeting message to the user.

🧩 Embend User Message
- Type: httpRequest
- Purpose: Generates an embedding for the user's input message.
- Details: Calls the Together AI embeddings API.

🔍 Search Embeddings
- Type: httpRequest
- Purpose: Searches the Supabase DB for the top 5 most similar text chunks based on the generated embedding.
- Details: Calls the Supabase RPC function matchembeddings1.

📦 Aggregate
- Type: aggregate
- Purpose: Combines all retrieved text chunks into a single aggregated context for the LLM.

🧠 Basic LLM Chain
- Type: chainLlm
- Purpose: Passes the user's question plus the aggregated context to the LLM to generate a detailed answer.
- Details: Contains a prompt instructing the LLM to answer only based on the context.

🤖 OpenRouter Chat Model
- Type: lmChatOpenRouter
- Purpose: Provides the actual AI language model that processes the prompt.
- Details: Uses the qwen/qwen3-8b:free model via OpenRouter; you can use any model of your choice.
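A minimal sketch of the 'Splitting into Chunks' Code node is shown below. The input field name (content) is an assumption: check what your Google Docs node actually outputs before reusing it.

```js
// Minimal chunking sketch: slice the fetched document text into 1000-character pieces.
const text = items[0].json.content || ''; // assumed field holding the Google Docs text
const CHUNK_SIZE = 1000;

const chunks = [];
for (let i = 0; i < text.length; i += CHUNK_SIZE) {
  chunks.push(text.slice(i, i + CHUNK_SIZE));
}

// One n8n item per chunk, ready for the Together AI embedding HTTP request
return chunks.map(chunk => ({ json: { chunk } }));
```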
by Yaron Been
This workflow provides automated access to the Jhonp4 Jhonpiedrahita_Ai01 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Jhonp4 Jhonpiedrahita_Ai01 model. It manages API authentication, parameter configuration, request processing, and result retrieval with built-in error handling and retry logic for reliable automation (a hedged polling sketch for retrieving results follows at the end of this listing).

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Jhonp4/jhonpiedrahita_ai01 AI model
- **Jhonp4 Jhonpiedrahita_Ai01**: The core AI model for generation tasks
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
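For illustration, the snippet below shows how the result-retrieval part can look as an n8n Code node: it polls a Replicate prediction until it finishes. It assumes the standard GET /v1/predictions/{id} endpoint; the token is a placeholder, and the fixed 2-second delay is a simplification of the workflow's retry logic.

```js
// Illustrative polling loop for a Replicate prediction; not the workflow's exact node configuration.
const token = 'YOUR_REPLICATE_API_TOKEN'; // placeholder credential
const predictionId = items[0].json.id;    // id returned when the prediction was created

let prediction;
for (let attempt = 0; attempt < 30; attempt++) {
  prediction = await this.helpers.httpRequest({
    method: 'GET',
    url: `https://api.replicate.com/v1/predictions/${predictionId}`,
    headers: { Authorization: `Bearer ${token}` },
    json: true,
  });
  if (prediction.status === 'succeeded' || prediction.status === 'failed') break;
  await new Promise(resolve => setTimeout(resolve, 2000)); // wait 2s between polls
}

return [{ json: prediction }];
```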
by Davide
This workflow automates the process of converting images from JPG/PNG format to WEBP using the APYHub API. It retrieves image URLs from a Google Sheet, converts the images, and uploads the converted files to Google Drive. This workflow is a powerful tool for automating image conversion tasks, saving time and ensuring that images are efficiently converted and stored in the desired format.

Using WebP images on a website provides several SEO benefits:
- Faster Loading Speed – WebP files are smaller than JPG and PNG, reducing page load times and improving user experience.
- Better Core Web Vitals – Google prioritizes websites with good performance metrics like LCP (Largest Contentful Paint).
- Improved Mobile Performance – Smaller images consume less bandwidth, enhancing mobile usability.
- Higher Search Rankings – Faster sites tend to rank better on Google due to improved user experience.
- Reduced Server Load – Lighter images lower hosting and CDN costs while improving site efficiency.

Below is a breakdown of the workflow:

1. How It Works

The workflow is designed to convert images from JPG/PNG to WEBP format and manage the converted files. Here's how it works:
- Manual Trigger: The workflow starts with a Manual Trigger node, which initiates the process when the user clicks "Test workflow."
- Set API Key: The Set API KEY node defines the API key required to access the APYHub API.
- Get Images: The Get Images node retrieves a list of image URLs from a Google Sheet. The sheet contains columns for the original image URL (FROM), the converted image URL (TO), and a status flag (DONE).
- Get Extension: The Get Extension node extracts the file extension (JPG, JPEG, or PNG) from the image URL and adds it to the JSON data (a hedged sketch of this step follows at the end of this listing).
- Determine Image Type: The JPG or PNG? node checks the file extension and routes the workflow to the appropriate conversion node: JPG/JPEG routes to the From JPG to WEBP node; PNG routes to the PNG to WEBP node.
- Convert Image: The From JPG to WEBP and PNG to WEBP nodes send POST requests to the APYHub API to convert the images to WEBP format. The API returns the URL of the converted image.
- Update Google Sheet: The Update Sheet node updates the Google Sheet with the URL of the converted image and marks the row as done (DONE).
- Get Converted Image: The Get File Image node downloads the converted WEBP image from the URL provided by the APYHub API.
- Upload to Google Drive: The Upload Image node uploads the converted WEBP image to a specified folder in Google Drive.

2. Set Up Steps

To set up and use this workflow in n8n, follow these steps:
- APYHub API Key: Obtain an API key from APYHub and define it in the Set API KEY node.
- Google Sheets Integration: Set up Google Sheets credentials in n8n for the Get Images and Update Sheet nodes. Create a Google Sheet with columns for FROM (original image URL), TO (converted image URL), and DONE (status flag). Provide the Document ID and Sheet Name in the Get Images node.
- Google Drive Integration: Set up Google Drive credentials in n8n for the Upload Image node. Specify the folder ID in Google Drive where the converted images will be uploaded.
- Test the Workflow: Click the "Test workflow" button in n8n to trigger the workflow. The workflow will retrieve image URLs from the Google Sheet, convert the images to WEBP format using the APYHub API, update the Google Sheet with the converted image URLs, and upload the converted images to Google Drive.
- Optional Customization: Modify the workflow to include additional features, such as adding more image formats for conversion, sending notifications when the conversion is complete, or integrating with other storage services (e.g., Dropbox, AWS S3).
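As a reference for the 'Get Extension' step, here is a hypothetical Code-node version of the logic: it pulls the file extension out of the FROM URL so the 'JPG or PNG?' node can route the item. The column name FROM matches the Google Sheet described above; anything else about the item shape is an assumption.

```js
// Hypothetical 'Get Extension' logic: derive the extension from the source image URL.
const url = items[0].json.FROM || ''; // FROM column from the Google Sheet

// Strip any query string, then take whatever follows the last dot.
const extension = url.split('?')[0].split('.').pop().toLowerCase();

return [{
  json: {
    ...items[0].json,
    extension,                                      // "jpg", "jpeg" or "png"
    isJpg: extension === 'jpg' || extension === 'jpeg', // convenient flag for the routing node
  },
}];
```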
by Yaron Been
This workflow provides automated access to the Settyan Flash V2.0.0 Beta.4 AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models and provides a seamless integration for generation tasks within your n8n automation workflows.

Overview

This workflow automatically handles the complete generation process using the Settyan Flash V2.0.0 Beta.4 model. It manages API authentication, parameter configuration, request processing, and result retrieval with built-in error handling and retry logic for reliable automation (a hedged retry sketch follows at the end of this listing).

Model Description: Advanced AI model for automated processing and generation tasks.

Key Capabilities
- Specialized AI model with unique capabilities
- Advanced processing and generation features
- Custom AI-powered automation tools

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Replicate API**: Access to the Settyan/flash-v2.0.0-beta.4 AI model
- **Settyan Flash V2.0.0 Beta.4**: The core AI model for generation tasks
- **Built-in Error Handling**: Automatic retry logic and comprehensive error management

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Replicate API: Add your Replicate API token to the 'Set API Token' node
3. Customize Parameters: Adjust the model parameters in the 'Set Other Parameters' node
4. Test the Workflow: Run the workflow with your desired inputs
5. Integrate: Connect this workflow to your existing automation pipelines

Use Cases
- **Specialized Processing**: Handle specific AI tasks and workflows
- **Custom Automation**: Implement unique business logic and processing
- **Data Processing**: Transform and analyze various types of data
- **AI Integration**: Add AI capabilities to existing systems and workflows

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Replicate API: https://replicate.com (Sign up to access powerful AI models)

#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #automation #digitalart #contentcreation #productivity #innovation
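To illustrate the 'built-in error handling and retry logic' idea, the sketch below wraps a Replicate request in a simple retry loop, written as an n8n Code node. The endpoint, version hash, and input fields are placeholders; copy the real values from the settyan/flash-v2.0.0-beta.4 model page and tune the retry counts to your needs.

```js
// Illustrative retry wrapper around a Replicate call; a simplification of the workflow's error handling.
async function withRetry(fn, attempts = 3, delayMs = 3000) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      await new Promise(resolve => setTimeout(resolve, delayMs)); // back off before retrying
    }
  }
  throw lastError;
}

const token = 'YOUR_REPLICATE_API_TOKEN'; // placeholder credential

const prediction = await withRetry(() =>
  this.helpers.httpRequest({
    method: 'POST',
    url: 'https://api.replicate.com/v1/predictions',
    headers: { Authorization: `Bearer ${token}` },
    body: {
      version: 'SETTYAN_FLASH_VERSION_HASH',  // placeholder version id
      input: { prompt: 'your input here' },   // placeholder parameters
    },
    json: true,
  })
);

return [{ json: prediction }];
```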