by Olek
**How it works**
This workflow activates and deactivates another workflow of your choice on a schedule.

> ⚠️ Warning!
> This approach won't work for trial users, as it requires the n8n API, which is not available on trial plans.
> See https://docs.n8n.io/api/ for details.

**Set up steps**
1. Adjust the activation/deactivation schedule to your needs. A custom (cron) interval is the recommended approach.
2. Set the target Workflow ID. You will find it in the URL of the workflow you want to manage.
3. Set up n8n API credentials:
   - Create an API key: how to
   - Create n8n credentials using the API key: how to

This workflow uses the n8n node.

#DevOps #workflow-management

**Other useful stuff**
Need a universal Error workflow to catch both execution and trigger errors? Here you go: Error handling: Send email via Gmail on execution or trigger-level errors. More stuff by Olek. And do not forget to back up your workflows often by automating it.
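For orientation, the underlying n8n public API calls look roughly like this. A minimal sketch in plain JavaScript; the instance URL, API key, and workflow ID are placeholders for your own values:

```javascript
// Minimal sketch: toggling another workflow via the n8n public API (v1).
// Base URL, API key, and workflow ID are placeholders; use your own.
const N8N_BASE_URL = "https://your-instance.example.com";
const N8N_API_KEY = process.env.N8N_API_KEY ?? "";
const TARGET_WORKFLOW_ID = "abc123"; // taken from the target workflow's URL

async function setWorkflowActive(active) {
  const action = active ? "activate" : "deactivate";
  const res = await fetch(
    `${N8N_BASE_URL}/api/v1/workflows/${TARGET_WORKFLOW_ID}/${action}`,
    { method: "POST", headers: { "X-N8N-API-KEY": N8N_API_KEY } },
  );
  if (!res.ok) throw new Error(`n8n API responded with ${res.status}`);
}

await setWorkflowActive(true); // the scheduled trigger decides when this runs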
by David Olusola
**⚡️ How It Works**
This workflow captures form submissions from your website, formats the data, and automatically creates a new entry in your Notion CRM database. It eliminates manual copy-pasting and keeps your leads or requests organised in one place.

**🛠 Setup Steps**
1. Webhook Node
• Create a webhook in n8n.
• Connect your website form to POST submissions to this webhook URL.
2. Code Node
• Formats the incoming data to match your Notion database structure.
• You can customise the fields in the code to suit your specific form inputs (see the sketch below).
3. Notion Node (Create Page)
• Connect your Notion account.
• Choose your target database.
• Map each field from the Code node output to your Notion database properties.
4. Test
• Submit a test form entry.
• Confirm the data appears correctly in Notion.

**👥 Who It's For**
✅ Freelancers collecting project inquiries
✅ Agencies managing client onboarding forms
✅ Business owners wanting organised lead capture
✅ Teams that use Notion as their central CRM or task manager
✅ Anyone tired of manually transferring form data into Notion
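As a concrete starting point for the Code node in step 2, here is a minimal sketch. The field names (name, email, message) are assumptions; replace them with your form's actual inputs:

```javascript
// Code node sketch: flatten the webhook payload into fields the Notion
// node can map. Field names are assumptions; match your actual form.
return $input.all().map((item) => {
  const body = item.json.body ?? item.json; // webhook payload location can vary
  return {
    json: {
      Name: body.name ?? "",
      Email: body.email ?? "",
      Message: body.message ?? "",
      SubmittedAt: new Date().toISOString(),
    },
  };
});
```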
by Jay Hartley
**What this template does**
This workflow uses the Amadeus API to check daily for bargain flights matching an itinerary and price target of your choice. It then automatically emails you once it finds a match.

**Setup**
1. Create an API account on https://developers.amadeus.com/
2. In Amadeus Flight Search, connect to the OAuth2 API:
   - Grant Type: Client Credentials
   - Access Token URL: https://test.api.amadeus.com/v1/security/oauth2/token
   - Client ID/Secret: from your account
3. Set your details in Gmail
4. Set your desired origin/destination airports in FromTo
5. Set how many days ahead you wish to search in Get Dates (default is 7 days and 14 days)
6. Set the price target in Under Price

**How to test it**
After completing the setup steps above, just hit 'Test workflow'!
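For reference, the two calls behind the search look roughly like this. A minimal sketch against the Amadeus test environment; the route, date, and price cap are illustrative placeholders:

```javascript
// Sketch of the two Amadeus test-API calls the workflow makes.
// CLIENT_ID/CLIENT_SECRET come from your developer account; the route,
// date, and maxPrice below are illustrative placeholders.
const tokenRes = await fetch(
  "https://test.api.amadeus.com/v1/security/oauth2/token",
  {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({
      grant_type: "client_credentials",
      client_id: process.env.CLIENT_ID ?? "",
      client_secret: process.env.CLIENT_SECRET ?? "",
    }),
  },
);
const { access_token } = await tokenRes.json();

const params = new URLSearchParams({
  originLocationCode: "LHR",
  destinationLocationCode: "JFK",
  departureDate: "2025-07-01",
  adults: "1",
  maxPrice: "300",
});
const offers = await fetch(
  `https://test.api.amadeus.com/v2/shopping/flight-offers?${params}`,
  { headers: { Authorization: `Bearer ${access_token}` } },
).then((r) => r.json());
console.log(offers.data?.length ?? 0, "offers under the price target");
```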
by simonscrapes
**Use Case**
Generate accurate search volume data for SEO keyword research. This workflow is for you if:
- You have a list of potential keywords to target for your website SEO but don't know their actual search volume
- You need historical data to identify seasonal trends in keyword popularity
- You want to assess keyword difficulty to prioritize your content strategy
- You need data-driven insights for planning your SEO campaigns

**What this Workflow Does**
The workflow connects to Google's Keyword Planner API to retrieve keyword metrics for your SEO research:
- Fetches monthly search volume for each keyword
- Provides historical trends data for the past 12 months
- Calculates keyword difficulty scores
- Delivers competition metrics from Google Ads

**Setup**
1. Fill the Set 20 Keywords node with up to 20 keywords of your choosing as an array, e.g. ["keyword 1", "keyword 2", ...] (see the sketch below)
2. Create a Google Ads API account and add the credentials to the Get Search Data node
3. Replace the Connect to your own database node with your own database for the output

**How to Adjust it to Your Needs**
- Change the Set 20 Keywords node input to a source of your choosing, e.g. an Airtable database with 20 keywords
- Connect the output to a destination of your choosing

More templates and n8n workflows >>> @simonscrapes
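A minimal sketch of the input shape for step 1, written as an n8n Code node; the keywords themselves are placeholders:

```javascript
// Sketch: emit up to 20 keywords in the array shape the workflow expects.
// Replace the placeholder keywords with your own list.
const keywords = ["keyword 1", "keyword 2", "keyword 3"];
return [{ json: { keywords: keywords.slice(0, 20) } }];
```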
by Henry
**Who is this for?**
This workflow is ideal for social media managers, content creators, marketers, and small businesses who want to automate Instagram Carousel posts using Google Sheets and Google Drive. It is also suitable for anyone looking to streamline repetitive Instagram publishing tasks with n8n, Cloudinary, and the Instagram Graph API.

**What problem is this workflow solving? / Use case**
Managing and publishing Instagram Carousel posts manually can be time-consuming, especially when handling multiple accounts or campaigns. This workflow solves that by automatically fetching scheduled posts from Google Sheets, uploading images from Google Drive to Cloudinary, and publishing them to Instagram, saving time and reducing the risk of errors.

**What this workflow does**
This n8n workflow checks a Google Sheet every 5 minutes for new Carousel posts marked as "ToDo." When found, it uploads images from a specified Google Drive folder to Cloudinary, prepares the media on Instagram using the Graph API, and publishes the Carousel post with the given caption.

**Setup**
1. Prepare a Google Sheet to track posts and image folder URLs. Example: https://docs.google.com/spreadsheets/d/1WEUHeQXFMYsWVAW3DykWwpANxxD3DxH-S6c0i06dW1g/edit?usp=sharing
2. Upload post images to a dedicated Google Drive folder.
3. Set up a Cloudinary account and gather API credentials.
4. Obtain an Instagram access_token and ig_business_id for API publishing.
5. Configure the n8n workflow with the required credentials and your custom intervals.

**How to customize this workflow to your needs**
- Adjust the schedule trigger interval to fit your publishing frequency.
- Expand the Google Sheet with additional metadata as required.
- Modify the filter logic to support different content types or statuses.
- Add extra automation steps, such as sending notifications after publishing.
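For context, carousel publishing on the Instagram Graph API is a three-call sequence. A minimal sketch, assuming the image URLs are the Cloudinary URLs produced earlier; the token, business ID, and caption are placeholders:

```javascript
// Sketch: the three Graph API calls behind a carousel post.
// Token, business ID, image URLs, and caption are placeholders.
const ACCESS_TOKEN = process.env.IG_ACCESS_TOKEN ?? "";
const IG_BUSINESS_ID = "17841400000000000";
const imageUrls = ["https://res.cloudinary.com/demo/image/upload/a.jpg"];
const caption = "Scheduled via n8n";
const base = `https://graph.facebook.com/v19.0/${IG_BUSINESS_ID}`;

async function post(path, params) {
  const qs = new URLSearchParams({ ...params, access_token: ACCESS_TOKEN });
  return (await fetch(`${base}/${path}?${qs}`, { method: "POST" })).json();
}

// 1. Create one media container per image
const children = [];
for (const url of imageUrls) {
  const { id } = await post("media", { image_url: url, is_carousel_item: "true" });
  children.push(id);
}
// 2. Create the carousel container with the caption
const { id: creationId } = await post("media", {
  media_type: "CAROUSEL",
  children: children.join(","),
  caption,
});
// 3. Publish it
await post("media_publish", { creation_id: creationId });
```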
by sateshcharan
**Who is this template for?**
This workflow template is designed for DevOps, Engineering, and Managed Service Provider professionals who want alerts on various channels, with each channel chosen based on the severity of the event.

**How it works**
- Each time a new event occurs, the workflow runs (powered by TwentyCRM's native Webhooks feature).
- After filtering for the required data from the webhook, the filtered data is logged to Google Sheets.
- Based on the eventType from the webhook, a predefined messaging channel is selected and updates or alerts are sent through it (see the sketch below).

**Set up instructions**
1. Complete the Set up credentials step when you first open the workflow. You'll need Google OAuth2.0 credentials with Gmail API and Google Sheets scopes, and Slack OAuth2.0 credentials with the chat:write scope.
2. Set up the Webhook in TwentyCRM, linking the On new TwentyCRM event Trigger with your TwentyCRM App.
3. Set the correct channel to send to in the Post message in channel step.
4. After testing your workflow, swap the Test URL for the Production URL in TwentyCRM and activate your workflow.

Template was created in n8n v1.63.4.
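The routing step could look like the following Code node sketch; the eventType values and channel names are assumptions to match against the webhook payloads your TwentyCRM instance actually sends:

```javascript
// Sketch: pick a channel by event severity. Event names and channels
// are assumptions; align them with your TwentyCRM webhook payloads.
return $input.all().map((item) => {
  const eventType = item.json.eventType ?? "";
  const route = eventType.includes("deleted")
    ? { via: "slack", channel: "#alerts-critical" }
    : eventType.includes("updated")
      ? { via: "slack", channel: "#crm-updates" }
      : { via: "gmail", to: "ops@example.com" };
  return { json: { ...item.json, ...route } };
});
```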
by Mario
**Purpose**
This workflow allows you to transfer credentials from one n8n instance to another.

**How it works**
- A multi-form setup guides you through the entire process.
- You first choose one of the remote instances predefined in the Settings node.
- All credentials of the current instance are then retrieved using the Execute Command node.
- On the next form page you can select one of the credentials by name and initiate the transfer.
- Finally, the credential is created on the remote instance using the n8n API. A final form ending indicates whether that action succeeded or not.

**Setup**
1. Select your credentials in the nodes which require them.
2. Configure your remote instance(s) in the Settings node. Every instance is defined as an object with the keys name, apiKey and baseUrl; those instances are then wrapped inside an array (see the sketch below). You can find an example described within a note on the workflow canvas.

**How to use**
1. Grab the (production) URL of the Form from the first node.
2. Open the URL and follow the instructions given in the multi-form.

**Disclaimer**
Please note that this workflow can only run on self-hosted n8n instances, since it requires the Execute Command node.
Security: beware that all credentials are decrypted and processed within the workflow, and the API keys to other n8n instances are stored within the workflow. This solution is primarily meant for transferring data between testing environments. For production use, consider the n8n Enterprise edition, which provides a reliable way to manage credentials across different environments.
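A minimal sketch of what the Settings node value might look like; the instance names, URLs, and keys below are placeholders:

```javascript
// Sketch: remote instances as objects with the keys name, apiKey, and
// baseUrl, wrapped in an array. All values below are placeholders.
const instances = [
  { name: "staging", apiKey: "n8n_api_xxxxxxxx", baseUrl: "https://staging.example.com" },
  { name: "qa", apiKey: "n8n_api_yyyyyyyy", baseUrl: "https://qa.example.com" },
];
return [{ json: { instances } }];
```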
by Lucas Walter
**Who's it for**
Content creators, social media managers, and marketing teams who want to automatically extract the most engaging clips from long-form YouTube videos and identify content with high viral potential.

**What it does**
This workflow analyzes any YouTube video using Vizard AI's clipping technology and automatically generates up to 8 short clips with viral score ratings. It then filters for the highest-scoring clips (9/10 or above) and posts them to a designated Slack channel for team review and distribution.

**How it works**
- Video submission: Enter a YouTube URL through a user-friendly form
- AI analysis: Submits the video to Vizard AI for automated clipping and viral score analysis
- Smart polling: Waits for processing completion and retrieves results
- Quality filtering: Only surfaces clips with viral scores of 9/10 or higher
- Team notification: Posts results to Slack with clip titles, scores, and download links

**Requirements**
- Vizard AI API credentials (sign up at vizard.ai)
- Slack workspace with OAuth app configured

**How to set up**
1. Configure Vizard AI credentials: Add your Vizard AI API key to the HTTP Request nodes
2. Set up Slack integration: Configure the Slack OAuth2 credentials and select your target channel
3. Customize filtering: Adjust the viral score threshold in the filter node (currently set to 9/10)
4. Test the workflow: Submit a test YouTube URL to ensure everything works properly

**How to customize the workflow**
- **Adjust clip quantity**: Modify the maxClipNumber parameter (currently 8) in the initial API request
- **Change viral score threshold**: Update the filter condition to match your quality standards
- **Extend with automation**: Connect to social media posting tools or caption generation workflows for full automation
- **Add scheduling**: Integrate with webhook triggers, scheduled triggers, or RSS feeds for batch processing videos
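If you move the threshold check into a Code node, it can be as small as this sketch; the viralScore field name is an assumption, so check the actual property in the Vizard response first:

```javascript
// Sketch: keep only clips scoring 9/10 or above. The viralScore field
// name is an assumption; verify it against the Vizard API response.
const MIN_SCORE = 9; // raise or lower to match your quality bar
return $input.all().filter((item) => (item.json.viralScore ?? 0) >= MIN_SCORE);
```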
by Sira Ekabut
This workflow automates the collection of comments from posts on a Facebook Page, providing clean, structured data for analysis or further automation.

**What this workflow does**
- Fetches recent posts from a Facebook Page.
- Retrieves comments for each post.
- Outputs structured data of comments and posts for further use.

**Setup**
- Facebook Graph API: Connect your Access Token with the required permissions (pages_read_engagement, pages_read_user_content).
- Workflow: Set the Page ID and the number of posts to fetch in the "Set Number of Latest Posts to Fetch" node.
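Under the hood these are two Graph API reads. A minimal sketch, with the Page ID, token, and post limit as placeholders:

```javascript
// Sketch: list recent page posts, then pull comments per post.
// PAGE_ID, TOKEN, and LIMIT are placeholders for your own values.
const PAGE_ID = "1234567890";
const TOKEN = process.env.FB_ACCESS_TOKEN ?? "";
const LIMIT = 5;

const posts = await fetch(
  `https://graph.facebook.com/v19.0/${PAGE_ID}/posts?` +
    new URLSearchParams({ limit: String(LIMIT), access_token: TOKEN }),
).then((r) => r.json());

for (const post of posts.data ?? []) {
  const comments = await fetch(
    `https://graph.facebook.com/v19.0/${post.id}/comments?` +
      new URLSearchParams({ access_token: TOKEN }),
  ).then((r) => r.json());
  console.log(post.id, (comments.data ?? []).length, "comments");
}
```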
by Mark Shcherbakov
**Video Guide**
I prepared a detailed guide that shows the whole process of building this AI-powered database agent.

**Who is this for?**
This workflow is ideal for developers, data analysts, and business owners who want to enable conversational interactions with their database. It's particularly useful for cases where users need to extract, analyze, or aggregate data without writing SQL queries manually.

**What problem does this workflow solve?**
Accessing and analyzing database data often requires SQL expertise or dedicated reports, which can be time-consuming. This workflow empowers users to interact with a database conversationally through an AI-powered agent. It dynamically generates SQL queries based on user requests, streamlining data retrieval and analysis.

**What this workflow does**
This workflow integrates OpenAI with a Supabase database, enabling users to interact with their data via an AI agent. The agent can:
- Retrieve records from the database.
- Extract and analyze JSON data stored in tables.
- Provide summaries, aggregations, or specific data points based on user queries.

Key capabilities:
- Dynamic SQL querying: The agent uses user prompts to create and execute SQL queries on the database.
- Understanding JSON structure: The workflow identifies the JSON schema from sample records, enabling the agent to parse and analyze JSON fields effectively.
- Database schema exploration: It provides the agent with tools to retrieve table structures, column details, and relationships for precise query generation (see the sketch below).

**Setup**
Preparation
1. Create accounts:
   - n8n: For workflow automation.
   - Supabase: For database hosting and management.
   - OpenAI: For building the conversational AI agent.
2. Configure the database connection:
   - Set up a PostgreSQL database in Supabase.
   - Use the appropriate credentials (username, password, host, and database name) in your workflow.

N8N workflow: AI agent with tools
- Code Tool: Execute SQL queries based on user input.
- Database Schema Tool: Retrieve a list of all tables in the database. Use a predefined SQL query to fetch table definitions, including column names, types, and references.
- Table Definition: Retrieve a list of columns with types for one table.
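The schema-exploration query could be as simple as the following sketch (Postgres/Supabase, wrapped here as an n8n Code tool string). It is simplified and omits foreign-key references, which would need an extra join:

```javascript
// Sketch: the kind of information_schema query the Database Schema Tool
// can run against Supabase Postgres. Simplified: foreign-key references
// would need an extra join on table_constraints/key_column_usage.
const schemaQuery = `
  SELECT table_name, column_name, data_type
  FROM information_schema.columns
  WHERE table_schema = 'public'
  ORDER BY table_name, ordinal_position;
`;
return [{ json: { query: schemaQuery.trim() } }];
```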
by scrapeless official
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Prerequisites**
- An n8n account (free trial available)
- A Scrapeless account and API key
- A Google account to access Google Sheets

**🛠️ Step-by-Step Setup**
1. Create a New Workflow in n8n
Start by creating a new workflow in n8n. Add a Manual Trigger node to begin.
2. Add the Scrapeless Node
- Add the Scrapeless node and choose the Scrape operation
- Paste in your API key
- Set your target website URL
- Execute the node to fetch data and verify results
3. Clean Up the Data
Add a Code node to clean and format the scraped data, focusing on extracting key fields like Title, Description, and URL (see the sketch below).
4. Set Up Google Sheets
- Create a new spreadsheet in Google Sheets
- Name the sheet (e.g., Business Leads)
- Add columns like Title, Description, and URL
5. Connect Google Sheets in n8n
- Add the Google Sheets node
- Choose the operation Append or update row
- Select the spreadsheet and worksheet
- Manually map each column to the cleaned data fields
6. Run and Test the Workflow
- Click "Execute Workflow" in n8n
- Check your Google Sheet to confirm the data is properly inserted

**Results**
With this automated workflow, you can continuously extract business lead data, clean it, and push it directly into a spreadsheet, perfect for outbound sales, lead lists, or internal analytics.

**How to Use**
- ⚙️ Open the Variables node and plug in your Scrapeless credentials.
- 📄 Confirm the Google Sheets node points to your desired spreadsheet.
- ▶️ Run the workflow manually from the Start node.

**Perfect For:**
- Sales teams doing outbound prospecting
- Marketers building lead lists
- Agencies running data aggregation tasks
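A minimal sketch of the clean-up Code node in step 3; the input property names are assumptions, so inspect the Scrapeless node's actual output and adjust the paths:

```javascript
// Sketch: keep just Title, Description, and URL from the scraped data.
// Input field names are assumptions; adjust to the real Scrapeless output.
return $input.all().map((item) => ({
  json: {
    Title: item.json.title ?? "",
    Description: item.json.description ?? "",
    URL: item.json.url ?? "",
  },
}));
```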
by Extruct AI
**Who’s it for**
Sales teams, marketers, and analysts who need to quickly access all the social media and public profile links for any company.

**How it works / What it does**
When you enter a company into the form, this workflow automatically searches for and collects all available links to the company’s social media accounts, review sites, and public profiles from sources like Crunchbase and Zoominfo. All discovered URLs are added directly to your Google Sheet.

**How to set up**
1. Create an Extruct account at www.extruct.ai/.
2. Open the Extruct table template, find the table ID in your browser’s address bar, and copy it.
3. Make a copy of the provided Google Sheets template to your own Google Drive.
4. In n8n, paste the table ID into the variables node of your flow.
5. Set up Bearer authentication in every HTTP Request node using your Extruct API token (found on the API page in Extruct); see the sketch below for the header shape.
6. In the Google Sheets node, paste the link to your copied template and connect your Google account.
7. Run the flow once to load the fields, then map the output fields to the correct columns in your sheet.
8. Activate the flow and start adding companies via the form.

**Requirements**
- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

**How to customize the workflow**
You can add your own columns to the Extruct table and your Google Sheet. Just add the new column in both places and map it in the Google Sheets node in n8n.
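Step 5 only concerns the Authorization header, sketched below. The endpoint path is purely hypothetical (Extruct's real routes are documented on their API page); only the Bearer header shape is the point:

```javascript
// Sketch: Bearer authentication as each HTTP Request node sends it.
// The endpoint path is hypothetical; only the header shape matters here.
const TABLE_ID = "your-table-id";
const EXTRUCT_API_TOKEN = process.env.EXTRUCT_API_TOKEN ?? "";

const res = await fetch(`https://api.extruct.ai/tables/${TABLE_ID}`, {
  headers: { Authorization: `Bearer ${EXTRUCT_API_TOKEN}` },
});
console.log(res.status);
```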