by Anderson Adelino
## Who's it for

This template is perfect for community managers, business owners, and WhatsApp group administrators who want to create a welcoming experience for new members. Whether you're running a support group, managing a business community, or moderating a hobby group, this automation ensures every new member feels valued from the moment they join.

## How it works

The workflow automatically detects when someone joins your WhatsApp group and sends them a personalized welcome message directly to their private chat. It uses Evolution API to interface with WhatsApp Business and includes a natural delay to make the interaction feel more human. The entire process is hands-off once configured, ensuring consistent engagement with new members 24/7.

## What it does

- **Monitors group activity** - Receives real-time notifications when members join or leave
- **Filters for your specific group** - Ensures messages are only sent for your designated group
- **Validates new joins** - Confirms the event is a member joining (not leaving)
- **Adds natural timing** - Waits a customizable period before sending the message
- **Sends private welcome** - Delivers your welcome message directly to the new member's chat

## Requirements

- **Evolution API instance** (self-hosted or cloud service)
- **WhatsApp Business account** connected to Evolution API
- **Group admin permissions** for the WhatsApp group
- **n8n instance** (self-hosted or cloud)

## How to set up

1. Import the workflow into your n8n instance
2. Configure the Set Variables node with:
   - Your WhatsApp group ID (format: xxxxxxxxxxxxx@g.us)
   - Evolution API key
   - Instance name from Evolution API
   - Evolution API URL
   - Custom welcome message
   - Delay time in minutes
3. Copy the webhook URL from the Webhook node
4. Configure Evolution API to send group notifications to your webhook URL
5. Test the workflow by having someone join your group
6. Activate the workflow for continuous operation

For a detailed video tutorial on setting up this workflow, check out: https://youtu.be/WO2MJoQqLvo

## How to customize the workflow
- **Welcome message**: Edit the message in the Set Variables node to match your group's tone
- **Timing**: Adjust the wait time to send messages immediately or after several minutes
- **Multiple groups**: Duplicate the workflow and change the group ID for each group
- **Rich media**: Extend the HTTP Request node to send images or documents with the welcome
- **Conditional messages**: Add IF nodes to send different messages based on time of day or member count
- **Follow-up sequence**: Chain additional HTTP Request nodes to create a welcome series
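For reference, the private send in the final step can be sketched outside n8n as a plain HTTP call. This is a minimal sketch assuming Evolution API's `sendText` message route and a flat `number`/`text` body (recent API versions; older versions wrap the text differently) — the base URL, instance name, and key below are placeholders, not values from the template.

```python
import json
import urllib.request

# Placeholder values -- replace with your own Evolution API details.
API_URL = "https://your-evolution-api.example.com"
API_KEY = "YOUR_EVOLUTION_API_KEY"
INSTANCE = "my-instance"

def build_welcome_request(member_jid: str, message: str) -> urllib.request.Request:
    """Build the HTTP request the workflow's final node sends.

    The endpoint path and body shape follow Evolution API's sendText
    route; verify both against the docs for your API version.
    """
    url = f"{API_URL}/message/sendText/{INSTANCE}"
    body = {
        "number": member_jid.split("@")[0],  # bare phone number of the new member
        "text": message,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"apikey": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

req = build_welcome_request("5511999999999@s.whatsapp.net", "Welcome to the group!")
# urllib.request.urlopen(req)  # uncomment to actually send
```

Inside n8n the same request is made by the HTTP Request node, with these values read from the Set Variables node.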
by Gloria
# Premium n8n Workflow: DataForSEO + Airtable Keyword Research

This premium n8n workflow harnesses the power of DataForSEO's API combined with Airtable's relational database capabilities to transform your keyword research process, providing deeper insights for content creation without the hefty price tag of traditional SEO tools.

## 🚀 Features

- 🔍 **Comprehensive Data**: Extracts related keywords, search volume 📈, keyword difficulty 📊, search intent 🤔, and more directly from DataForSEO's powerful API 🚀.
- 💰 **Cost-Effective**: Leverages DataForSEO's pay-as-you-go model 💸, making it budget-friendly.
- 🏗️ **Airtable Integration**: Organizes your data in a powerful relational database for advanced filtering, sorting, and visualization capabilities.
- 🔄 **Cross-Reference Capabilities**: Create relationships between keyword sets to identify content opportunities traditional tools miss.
- 🤖 **Fully Automated**: Set up once and run keyword research with a single click.
- ⚙️ **Efficient & Scalable**: Handles large keyword lists with Airtable's robust data management system.

## 👥 This Workflow Is Perfect For:

- Content creators 📝
- Bloggers 💻
- YouTubers 🎥
- Small business owners 💼
- Digital marketers 📊
- SEO professionals 🔍
- Entrepreneurs 🚀
- E-commerce website owners 💻

Stop overspending on expensive SEO tools and start generating actionable keyword insights with a professional-grade database.

## 📝 What's Included?

- ⚙️ **n8n Workflow Template**: Ready-to-use workflow with pre-configured DataForSEO API endpoints for comprehensive keyword data collection.
- 📊 **Airtable Database Structure**: Pre-built tables and fields specifically designed for SEO keyword analysis.
- 🔌 **DataForSEO Integration**: Complete setup for pulling multiple data types (related keywords, suggestions, people also ask, subtopics) from DataForSEO's API.
- 🔄 **Automated Data Processing**: Logic to clean, format, and structure raw API data into usable insights.
- 📋 **Documentation**: Step-by-step instructions for connecting your DataForSEO account and configuring the workflow.
## 🏆 Why Choose the Airtable Version?

- 📱 **Access Anywhere**: Review your keyword research on any device through Airtable's apps.
- 🤝 **Team Collaboration**: Share your keyword database for collaborative planning.
- 🔄 **Data Relationships**: Connect keywords, content ideas, and publishing schedules in one place.
- 🔌 **Extensibility**: Integrate with other tools via Airtable's ecosystem.
- 🎯 **Content Planning**: Use Airtable as a complete content management system, from research to publication tracking.

## 🛠️ How It Works

1. Import the provided n8n workflow into your n8n instance 📥.
2. Configure your DataForSEO API credentials and Airtable connections ⚙️.
3. Input your target keywords and desired parameters 📝.
4. Trigger the workflow: n8n automatically gathers and organizes your keyword research in Airtable 🤖.
5. Use Airtable's interface to analyze relationships, identify opportunities, and plan your strategy 📊.

Additional detailed instructions are provided in the workflow.

## 🏁 What You Need to Get Started

- 🔹 Access to n8n (self-hosted or cloud) ☁️
- 🔹 A DataForSEO account with API credentials 🔑
- 🔹 An Airtable account (free tier works, Pro recommended for advanced features) 📊
- 🔹 Basic understanding of API usage and n8n workflows 🧠

💡 You can also connect this workflow with my SEO-Based Keyword Categorization & Clustering Strategy Workflow with Airtable and my Multi-Agent SEO Optimized Blog Writing System with Hyperlinks for E-Commerce, both available on my profile, to build a fully automated, end-to-end SEO content machine.
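To illustrate what the pre-configured HTTP Request nodes post, here is a hedged sketch of one DataForSEO Labs task. DataForSEO uses HTTP Basic auth and expects a JSON array of task objects; the field names below follow the `related_keywords` live endpoint, and the credentials are placeholders — confirm both against the current API docs before relying on them.

```python
import base64
import json

# Placeholder credentials -- use your DataForSEO API login and password.
LOGIN, PASSWORD = "login@example.com", "password"

def build_related_keywords_task(keyword: str, limit: int = 100):
    """Build the auth header and POST body for one DataForSEO Labs task.

    The body is an array because DataForSEO accepts batched tasks;
    location_code 2840 is the United States in their location list.
    """
    token = base64.b64encode(f"{LOGIN}:{PASSWORD}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps([{
        "keyword": keyword,
        "language_name": "English",
        "location_code": 2840,
        "limit": limit,
    }])
    return headers, body

headers, body = build_related_keywords_task("keyword research")
# POST to https://api.dataforseo.com/v3/dataforseo_labs/google/related_keywords/live
```

The workflow then flattens the returned task items before writing rows into Airtable.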
by Alexandru Florea
This workflow automates the backup of decrypted n8n credentials from a self-hosted Docker instance to Google Drive. It lets you export credentials on n8n versions 2.x.x (where old CLI commands may not work) without manually accessing the server terminal.

## How it works

- **Configuration**: Defines the Docker container name and file paths using a centralized variables node.
- **SSH Execution**: Connects to the host machine via SSH and executes the `n8n export:credentials` command inside the specified Docker container.
- **File Retrieval**: Reads the newly created decrypted JSON file from the host filesystem.
- **Cloud Upload**: Uploads the JSON file to a specified folder in Google Drive with a timestamped filename.

## Set up steps

1. **Configure Variables**: Open the "Variables" node and enter your Docker container name (usually n8n or an ID).
2. **SSH Connection**: Configure the "Execute a command" (SSH) node with your host machine's IP, username, and SSH key/password.
3. **Google Drive Auth**: Authenticate the "Google Drive Upload File" node with your Google credentials.
4. **Select Folder**: In the "Google Drive Upload File" node, select the folder on your Drive where backups should be saved.
5. **Schedule**: (Optional) Adjust the "Schedule Trigger" to your preferred backup frequency (the default runs periodically).
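The command the SSH node runs can be sketched as follows. `n8n export:credentials --all --decrypted --output=...` is the real n8n CLI invocation; the container name and output directory are placeholders you would take from the Variables node.

```python
from datetime import datetime, timezone

CONTAINER = "n8n"               # your Docker container name or ID
EXPORT_DIR = "/home/node/.n8n"  # path inside the container; adjust to your setup

def build_export_command(now: datetime) -> str:
    """Build the shell command the SSH node executes on the host.

    It wraps the n8n CLI export inside `docker exec` and timestamps the
    output file so successive backups never overwrite each other.
    """
    stamp = now.strftime("%Y-%m-%d_%H-%M-%S")
    outfile = f"{EXPORT_DIR}/credentials_{stamp}.json"
    return (
        f"docker exec {CONTAINER} "
        f"n8n export:credentials --all --decrypted --output={outfile}"
    )

cmd = build_export_command(datetime.now(timezone.utc))
```

Because `--decrypted` writes credentials in plain text, treat the resulting file (and the Drive folder it lands in) as sensitive.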
by Spiritech Studio
This workflow automates the extraction of detailed LinkedIn profile information through the TexAU API. It is designed for lead enrichment, prospect research, hiring workflows, and any automation requiring structured profile data from LinkedIn.

## How it Works

1. The workflow is triggered manually or through another workflow.
2. A request is sent to TexAU's LinkedIn Profile Scraper task, providing the target LinkedIn profile URL or ID.
3. TexAU starts a profile-scraping job in the background.
4. The workflow waits briefly to allow processing time.
5. A follow-up request retrieves the completed results from TexAU's results endpoint.
6. The workflow outputs a structured JSON representation of the profile, typically including name, headline, role, location, experience, skills, and education.

## Setup Steps

1. Add your TexAU API key to the HTTP Request node.
2. Supply the LinkedIn profile URL dynamically or manually.
3. Adjust the wait time depending on TexAU's queue speed.
4. Connect your CRM, database, or AI logic to process the profile results.

## Good to Know

- Data availability depends on what is publicly visible.
- Processing time varies based on TexAU queue load.
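The submit-wait-retrieve pattern used here (and in the other TexAU workflows below) can be sketched generically. The status values and the simulated fetcher are illustrative, not TexAU's exact response schema — in the workflow the "fetch" role is played by the HTTP Request node hitting the results endpoint after the Wait node.

```python
import time

def poll_for_result(fetch, attempts: int = 5, delay: float = 2.0):
    """Poll a results endpoint until a background job finishes.

    `fetch` is any callable returning the job's current state as a dict.
    Retries up to `attempts` times, sleeping `delay` seconds between
    tries, and raises if the job never completes.
    """
    for _ in range(attempts):
        result = fetch()
        if result.get("status") == "completed":
            return result.get("data")
        time.sleep(delay)
    raise TimeoutError("job did not finish within the polling window")

# Simulated fetcher standing in for the TexAU results call:
states = iter([
    {"status": "running"},
    {"status": "completed", "data": {"name": "Jane Doe"}},
])
profile = poll_for_result(lambda: next(states), delay=0.0)
```

A fixed Wait node works for most queues, but a loop like this is more robust when TexAU's processing time varies.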
by Spiritech Studio
This workflow performs automated LinkedIn people search using the TexAU API. It is ideal for prospecting, recruitment, lead generation, and workflows where structured people-search results from LinkedIn are needed.

## How it Works

1. The workflow is triggered manually or by another automation.
2. Search inputs such as name, role, keywords, or location are sent to TexAU's People Search automation.
3. TexAU starts a background job that executes the search on LinkedIn.
4. The workflow waits a short period to allow TexAU to finish processing.
5. It then polls TexAU's results endpoint and retrieves the completed dataset.
6. The workflow outputs a structured list of matching profiles, typically including name, headline, job title, profile URL, and relevance indicators.

## Setup Steps

1. Add your TexAU API key to the HTTP Request node.
2. Supply search parameters manually or dynamically.
3. Adjust the wait time based on TexAU's queue load.
4. Route results to your CRM, AI agent, or lead pipeline.

## Good to Know

- Search results depend on publicly visible LinkedIn data.
- TexAU processing time varies by query complexity.
by LeeWei
# ⚙️ Analyze Video Workflow - Automates Video Analysis with Google Gemini

## 🚀 Steps to Connect

1. **Google Gemini API Key**
   - Go to Google AI and sign up to get your free API key.
   - Create your query authorization credential using that API key in the HTTP Request nodes (e.g., "Upload File", "Get Analysis") that require a Google Gemini credential.
2. **Form Trigger Setup**
   - Ensure the On form submission node is linked and ready to accept video uploads.
   - No additional configuration is needed; the node is pre-set for video input.
3. **YouTube Video Analysis (Optional)**
   - Update the YouTube Video node's jsonBody field with your desired YouTube URL (e.g., replace https://youtu.be/gwCQF--cARA?si=uCbaUnoRlEjHO50a with your video link).
   - Keep the prompt as "Please summarize the video in 3 sentences" or modify it as needed.

## Overview of the n8n Workflow

This n8n workflow automates the analysis of uploaded videos or YouTube links using Google Gemini, providing detailed descriptions or summaries of the content. It processes video uploads, extracts the analysis, and stores the results, with options for real-time polling and YouTube integration. The workflow includes sticky notes with setup instructions and editable fields, formatted in Markdown for clarity, as seen in "Example Output.txt".

## How it Works

1. Uploads a video file or links a YouTube URL for analysis.
2. Processes the video through Google Gemini to generate a detailed description or summary.
3. Stores the analysis results and optionally polls for updates with a delay.

## Set up Steps

Setup time: approximately 10-15 minutes. Detailed instructions are available in the sticky notes within the workflow.

## Editable Fields

- **YouTube Video node (jsonBody)**:
  - Update the file_uri field with your YouTube video URL (e.g., replace https://youtu.be/gwCQF--cARA?si=uCbaUnoRlEjHO50a with your link).
  - Modify the text prompt if you want a different analysis (e.g., "Describe the video in detail" instead of "Please summarize the video in 3 sentences").
- **On form submission node**: No edits needed; pre-configured for video uploads and ready to use as is.
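The jsonBody the YouTube Video node sends can be sketched as below. The structure follows the Gemini generateContent API (a `file_data` part carrying the video URI plus a `text` part carrying the prompt); the model name and example video link are placeholders, so verify the field names and supported models against Google's current documentation.

```python
import json

MODEL = "gemini-2.0-flash"  # placeholder; use whichever Gemini model your key supports

def build_video_analysis_body(video_url: str, prompt: str) -> str:
    """Build the JSON body posted to Gemini's generateContent endpoint.

    Pairs a file_data part (the video reference) with a text part
    (the analysis instruction) inside a single content turn.
    """
    return json.dumps({
        "contents": [{
            "parts": [
                {"file_data": {"file_uri": video_url}},
                {"text": prompt},
            ]
        }]
    })

body = build_video_analysis_body(
    "https://youtu.be/dQw4w9WgXcQ",  # hypothetical video link
    "Please summarize the video in 3 sentences",
)
# POST to https://generativelanguage.googleapis.com/v1beta/models/{MODEL}:generateContent?key=API_KEY
```

Swapping the `text` part is all it takes to change the analysis, which is why the prompt is listed as an editable field above.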
by s3110
# LINE x Google Account Linking Workflow

This workflow automates the process of linking a new user on your LINE Official Account to their Google Account. When a user adds your LINE account as a friend, the workflow automatically sends them a message with a unique authentication link. After the user approves the connection, their Google profile information is fetched and a confirmation message is sent, completing the loop.

## Prerequisites

Before you begin, ensure you have the following:

- **An n8n instance**: either on n8n.cloud or a self-hosted environment.
- **A LINE Developers account**: a Messaging API channel and your Channel Access Token (long-lived).
- **A Google Cloud Platform (GCP) account**: a configured OAuth consent screen, plus an OAuth 2.0 Client ID and Client Secret.

## Setup Instructions

Follow these steps to configure the workflow.

### Step 1: Configure LINE Developers Console

1. Log in to the LINE Developers Console.
2. Navigate to your provider and select your Messaging API channel.
3. Go to the Messaging API tab.
4. Issue a Channel access token (long-lived) and copy the value.
5. In the Webhook URL field, paste the Test URL from the LINE Webhook node in your n8n workflow.
6. Enable Use webhook.

### Step 2: Configure Google Cloud Platform (GCP)

1. Log in to the Google Cloud Console and select your project.
2. Navigate to APIs & Services > OAuth consent screen. Configure it if you haven't already, ensuring you add your own Google account as a test user.
3. Go to APIs & Services > Credentials.
4. Click + CREATE CREDENTIALS and select OAuth 2.0 Client ID.
5. For Application type, choose Web application.
6. Under Authorized redirect URIs, click + ADD URI and paste the Test URL from the Google Auth Callback node in your n8n workflow.
7. Click Create, then copy your Client ID and Client Secret.

### Step 3: Configure the n8n Workflow

Import the workflow JSON into your n8n canvas and follow these steps to set it up.

#### 1. Configure n8n Credentials

First, set up the credentials that the HTTP Request nodes will use.
- **For the LINE Messaging API**:
  1. In n8n, go to Credentials > Add credential.
  2. Search for and select Header Auth.
  3. Set Name to Authorization.
  4. Set Value to Bearer YOUR_LINE_CHANNEL_ACCESS_TOKEN (replace with the token from Step 1).
  5. Save the credential with a memorable name like "LINE Messaging API Auth".
- **For the Google API (Dynamic Token)**:
  1. Create another Header Auth credential.
  2. Set Name to Authorization.
  3. For Value, enter a placeholder like Bearer dummy_token. This will be replaced dynamically by the workflow.
  4. Save the credential with a name like "Google API Dynamic Token".

#### 2. Update Node Parameters

Now, update the parameters in the following nodes:

- **Create Google Auth URL node**: In the value field, replace YOUR_N8N_WEBHOOK_URL_FOR_GOOGLE with the webhook URL of the Google Auth Callback node, and replace YOUR_GOOGLE_CLIENT_ID with the Client ID from GCP (Step 2).
- **Get Google Access Token node**: In the jsonBody field, replace YOUR_GOOGLE_CLIENT_ID, YOUR_GOOGLE_CLIENT_SECRET, and YOUR_N8N_WEBHOOK_URL_FOR_GOOGLE with your actual GCP credentials and callback URL.
- **Get Google User Info node**: For Authentication, select Header Auth. For Credential for Header Auth, choose the "Google API Dynamic Token" credential you created. Important: click Add Option > Header To Append, set the Name to Authorization, and set the Value to the following expression so the token from the previous step is used: `Bearer {{ $node["Get Google Access Token"].json["access_token"] }}`.
- **Send Auth Link to LINE and Send Completion Message to LINE nodes**: For Credential for Header Auth, choose the "LINE Messaging API Auth" credential.
- **Redirect to LINE OA node**: In the redirectURL parameter, replace YOUR_LINE_OFFICIAL_ACCOUNT_ID with your LINE OA's ID (e.g., @123abcde).

### Step 4: Activate and Test

1. Save the workflow by clicking the Save button.
2. Activate the workflow using the toggle in the top-right corner.
3. On your phone, add your LINE Official Account as a friend.
4. You should receive a message with a link.
5. Follow the link to authorize with your Google account.
6. After successful authorization, you should receive a completion message in LINE and be redirected.

> Note: When you are ready for production, remember to replace the "Test" webhook URLs in the LINE and GCP consoles with the "Production" URLs from the n8n webhook nodes.
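The authentication link built by the Create Google Auth URL node is a standard Google OAuth 2.0 authorization URL. A minimal sketch, assuming the LINE user ID is carried in the `state` parameter so the callback can match the user (the client ID and redirect URI below are placeholders):

```python
from urllib.parse import urlencode

# Placeholders -- fill in from GCP and your Google Auth Callback webhook node.
CLIENT_ID = "YOUR_GOOGLE_CLIENT_ID"
REDIRECT_URI = "https://your-n8n.example.com/webhook/google-auth-callback"

def build_google_auth_url(line_user_id: str) -> str:
    """Build the consent URL sent to the new LINE friend.

    Uses Google's standard OAuth 2.0 authorization endpoint; after the
    user approves, Google redirects to REDIRECT_URI with a `code` and
    echoes `state` back so the workflow knows which LINE user it was.
    """
    params = {
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",
        "scope": "openid email profile",
        "access_type": "offline",
        "state": line_user_id,
    }
    return "https://accounts.google.com/o/oauth2/v2/auth?" + urlencode(params)

url = build_google_auth_url("U1234567890abcdef")
```

The Get Google Access Token node then exchanges the returned `code` at Google's token endpoint, which is what produces the `access_token` used in the Header To Append expression above.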
by Spiritech Studio
This workflow automates the process of retrieving mutual LinkedIn connections using the TexAU API. It is ideal for enrichment, lead qualification, and relationship-mapping tasks where understanding shared connections provides valuable context.

## How it Works

1. The workflow is triggered manually for testing or can be linked to other workflows.
2. A request is sent to TexAU's LinkedIn Mutual Connections automation, passing the target LinkedIn profile URL or ID.
3. TexAU begins processing the task in the background.
4. The workflow waits briefly to allow TexAU to complete the job.
5. A follow-up request retrieves the final results from TexAU's results endpoint.
6. The mutual-connection data is returned in structured form for downstream use such as enrichment, CRM updates, scoring, or AI workflows.

## Setup Steps

1. Add your TexAU API key to the HTTP Request node headers.
2. Specify the input LinkedIn profile URL.
3. Adjust the wait duration depending on TexAU's processing speed.
4. Connect any downstream CRM, database, or AI components as needed.

## Good to Know

- TexAU execution time varies based on task queue load.
- Returned fields may include connection names, profile URLs, job titles, and shared links.
by Viktor Klepikovskyi
# No-Code: Convert Multiple Binary Files to Base64

## Introduction

This template provides a robust, purely no-code solution to a common integration challenge: converting multiple binary files contained within a single n8n item (e.g., after unzipping an archive) into a structured JSON array of Base64-encoded strings.

## Purpose

Many external APIs, especially those handling batch file uploads or complex data structures, require files to be submitted as a single JSON payload. This payload typically needs an array containing two elements for each file: the reconstructed file path/name and the Base64-encoded content. This template automatically handles the file isolation, encoding, path reconstruction, and final JSON aggregation, replacing the need for complex custom JavaScript Code nodes.

## Configuration Steps

1. **Input**: Connect your binary data source (e.g., an HTTP Request followed by a Compression node) to the first node in this template.
2. **Split Out**: This node automatically separates the multiple binary files into individual items.
3. **Extract From File**: This node uses the dynamic expression `{{ $binary.keys()[0] }}` to ensure the correct binary file is targeted and converted to Base64.
4. **Set**: This node uses a conditional expression to reconstruct the full path (including the directory, if present) for each file.
5. **Aggregate**: The final node merges all individual items into a single, clean JSON item containing a top-level `files` array, ready for your final API call.

For a detailed walkthrough, including the explanation behind the dynamic expressions and why this approach is superior to the custom code solution, check out the full blog post: The No-Code Evolution: Base64 Encoding Multiple Files in n8n (Part 2).
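The end-to-end transformation the five nodes perform is easy to state in code. This sketch reproduces the described output shape: one JSON object with a top-level `files` array of `[path, base64_content]` pairs. The input dict stands in for the multiple binary properties on a single n8n item.

```python
import base64
import json

def files_to_payload(binaries: dict) -> str:
    """Convert {path: raw_bytes} into the template's aggregated payload.

    Mirrors the Split Out -> Extract From File -> Set -> Aggregate chain:
    each file becomes a [path, base64_content] pair inside `files`.
    """
    files = [
        [path, base64.b64encode(data).decode("ascii")]
        for path, data in binaries.items()
    ]
    return json.dumps({"files": files})

payload = files_to_payload({
    "docs/readme.txt": b"hello",
    "logo.png": b"\x89PNG...",
})
```

The value of the no-code version is that this same result is produced without maintaining a Code node; the snippet is just a reference for what the final API receives.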
by Michael Gullo
# Outlook to OneDrive

This workflow automates saving binary attachments from Outlook emails into newly created folders in OneDrive. It's ideal for users who regularly receive files and need them organized into separate folders without manual intervention. Each folder is automatically named from the email subject and the current timestamp, and all attachments from that email are stored inside the corresponding folder.

This is particularly useful for streamlining document workflows, improving file traceability, and reducing the time spent on repetitive tasks like organizing client submissions, invoices, or internal reports. The configuration and setup of the workflow can be customized to meet the business or personal needs of the user.

## Overview

1. **Microsoft Outlook Trigger** - Monitors your inbox for new emails.
2. **Filter** - Ensures only emails with binary attachments proceed.
3. **Get Outlook Message** - Retrieves the full email and downloads attachments.
4. **Create Folder** - Makes a new folder in OneDrive based on the email subject and time.
5. **Split Out** - Extracts each binary attachment.
6. **Merge** - Combines folder and file data before upload.
7. **Upload File OneDrive** - Uploads each binary file into the new folder.

## Need Help? Have Questions?

For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.
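The Create Folder step combines the subject and a timestamp into a folder name. The exact naming expression is the workflow author's choice, so this is only a hedged sketch of one reasonable implementation; it also strips characters that OneDrive rejects in item names (`\ / : * ? " < > |`), which the description does not mention but a robust setup needs.

```python
import re
from datetime import datetime

def folder_name(subject: str, now: datetime) -> str:
    """Derive a OneDrive-safe folder name from an email subject.

    Removes characters OneDrive disallows in file and folder names,
    then appends a timestamp so two emails with the same subject
    still get distinct folders. The pattern itself is an assumption;
    adapt it to match your workflow's expression.
    """
    safe = re.sub(r'[\\/:*?"<>|]', "", subject).strip()
    return f"{safe} - {now.strftime('%Y-%m-%d %H%M%S')}"

name = folder_name("Invoice: Q3/2024", datetime(2024, 7, 1, 9, 30, 0))
```

Whatever pattern you choose, keeping the timestamp in the name is what makes every email's attachments land in their own folder.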
by DataForSEO
This workflow helps you discover new SEO content opportunities by automatically identifying keyword gaps between your website and a competing domain.

On each manual run, the workflow retrieves the top 100 organically ranked keywords for your domain and for a selected competitor using the DataForSEO Labs API. It then compares both keyword lists to find valuable search terms where your competitor already has visibility on Google but your website doesn't. For every missing keyword, the workflow adds useful SEO metrics, including search volume, competition level, competitor ranking position, and the competitor's exact URL that ranks on Google.

All identified keywords are stored in a structured Notion database, making it easy to review, prioritize, and turn them into actionable content ideas. Once the keywords are saved, you can use Notion AI to instantly generate a content plan based on these opportunities. Recommended prompt:

> "Analyse the keywords present in this table - this is the keyword gap between my website's page {{my url}} and the competitor's URL listed in the table, and based on this data, build a content strategy for me."

## Who's it for

This workflow is ideal for SEO specialists, content marketers, and website owners who want to discover new content ideas, improve organic visibility, and systematically close the keyword gap with competitors.

## What it does

This workflow automatically identifies keywords that your competitor ranks for on Google but your domain does not, and saves them in Notion with key SEO context for content planning.

## How it works

1. Triggers manually whenever you want to perform the research.
2. Fetches the top 100 organically ranked keywords for your website via the DataForSEO Labs API.
3. Fetches the competitor's top-ranked organic keywords from the same source.
4. Compares both datasets to find keywords missing for your domain.
5. Extracts key metrics, including search volume, competition, ranking position, and ranking URL.
6. Writes all keyword gap opportunities into a Notion database for review and planning.
7. Allows you to generate a content roadmap using Notion AI.

## Requirements

- DataForSEO account and API credentials
- Notion account with a prepared database (matching the required column structure)
- Notion integration connected to n8n

## Customization

You can easily customize this workflow by increasing the number of keywords analyzed, filtering by specific countries or languages, adding wider SEO context (CPC, keyword difficulty, SERP features), exporting results to other tools, or automating content brief creation instead of only storing keywords.
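The comparison step at the heart of the workflow is a set difference over keyword strings. This sketch shows that logic on simplified rows; the field names (`keyword`, `search_volume`, `position`) are illustrative, not DataForSEO's exact response schema.

```python
def keyword_gap(my_keywords: list[dict], competitor_keywords: list[dict]) -> list[dict]:
    """Keep only competitor keywords that your domain does not rank for.

    Builds a set of your keyword strings for O(1) membership checks,
    then filters the competitor rows, preserving their SEO metrics
    so they can be written to Notion as-is.
    """
    mine = {row["keyword"] for row in my_keywords}
    return [row for row in competitor_keywords if row["keyword"] not in mine]

mine = [{"keyword": "seo tools"}, {"keyword": "keyword research"}]
theirs = [
    {"keyword": "seo tools", "search_volume": 5400, "position": 3},
    {"keyword": "keyword gap analysis", "search_volume": 880, "position": 7},
]
gap = keyword_gap(mine, theirs)
```

Note this matches on exact keyword strings, as the workflow's comparison does; near-duplicates ("seo tool" vs. "seo tools") count as separate keywords unless you normalize them first.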