by Audun
Send structured logs to BetterStack from any workflow using HTTP Request

**Who is this for?**
This workflow is perfect for automation builders, developers, and DevOps teams using n8n who want to send structured log messages to BetterStack Logs. Whether you're monitoring mission-critical workflows or simply want centralized visibility into process execution, this reusable log template makes integration easy.

**What problem is this workflow solving?**
Logging failures or events across multiple workflows typically requires duplicated logic. This workflow solves that by acting as a shared log sender, letting you forward consistent log entries from any other workflow using the Execute Workflow node.

**What this workflow does**
- Accepts level (e.g., "info", "warn", "error") and message fields via the Execute Workflow Trigger
- Sends the structured log to your BetterStack ingestion endpoint via HTTP Request
- Uses HTTP Header Auth for secure delivery
- Includes a manual trigger for testing and a sample call to demonstrate usage
- Comes with clear sticky notes to help you get started

**Setup**
1. Copy your BetterStack Logs ingestion URL.
2. Create a Header Auth credential in n8n with Authorization: Bearer YOUR_API_KEY.
3. Replace the URL in the HTTP Request node with your BetterStack endpoint.
4. Optionally modify the test data or log levels for custom scenarios.
5. Use Execute Workflow in any of your workflows to send logs here.
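Under the hood, the HTTP Request node boils down to a call like this minimal JavaScript sketch. The ingestion host and token are placeholders; BetterStack shows the exact URL for your log source:

```javascript
// Minimal sketch of the log-sending call. The ingestion URL below is a
// placeholder assumption -- use the exact endpoint from your BetterStack source.
const INGEST_URL = 'https://in.logs.betterstack.com';
const API_KEY = 'YOUR_API_KEY';

async function sendLog(level, message) {
  const res = await fetch(INGEST_URL, {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_KEY}`,
      'Content-Type': 'application/json',
    },
    // BetterStack accepts structured JSON; extra fields become searchable metadata
    body: JSON.stringify({ level, message, dt: new Date().toISOString() }),
  });
  if (!res.ok) throw new Error(`Log delivery failed: ${res.status}`);
}

await sendLog('info', 'Workflow X finished successfully');
```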
by Cameron Wills
**Who is this for?**
Content creators, social media managers, digital marketers, and researchers who need to download original TikTok videos without watermarks for analysis, repurposing, or archiving purposes.

**What problem does this workflow solve?**
Downloading TikTok videos without watermarks typically requires using questionable third-party websites that may have limitations, ads, or privacy concerns. This workflow provides a clean, automated solution that can be integrated into your own systems and processes.

**What this workflow does**
This workflow automates the process of downloading TikTok videos without watermarks in three simple steps:
1. Fetch the TikTok video page by providing the video URL
2. Extract the raw video URL from the page's HTML data
3. Download the original video file without watermark
4. (Optional) Upload to Google Drive with public sharing link generation

The workflow uses web scraping techniques to extract the original video source directly from TikTok's own servers, maintaining the highest possible quality without any added watermarks or branding.

**Setup (Est. time: 5-10 minutes)**
Before getting started, you'll need:
- An n8n installation
- The URL of a TikTok video you want to download
- (Optional) Google Drive API enabled in Google Cloud Console with OAuth Client ID and Client Secret credentials if you want to use the upload feature

**How to customize this workflow to your needs**
- Replace the example TikTok URL with your desired video links
- Modify the file naming convention for downloaded videos
- Integrate with other nodes to process videos after downloading
- Create a webhook to trigger the workflow from external applications
- Set up a schedule to regularly download videos from specific accounts

This workflow can be extended to support various use cases like trending content analysis, competitor research, creating compilation videos, or building a content library for inspiration. It provides a foundation that can be customized to fit into larger automated workflows for content creation and social media management.
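For illustration only, the extraction step could look roughly like the Code-node sketch below. TikTok's markup changes frequently, so the script id and JSON path here are assumptions, not a stable contract:

```javascript
// Illustrative sketch of the URL-extraction idea (n8n Code node style).
// The script id and JSON path are assumptions -- inspect the page HTML
// yourself, as TikTok changes its embedded state format over time.
const html = $input.first().json.data; // HTML from the HTTP Request node

// Look for the embedded state JSON that TikTok hydrates the page with
const match = html.match(/<script id="SIGI_STATE"[^>]*>(.*?)<\/script>/s);
if (!match) throw new Error('Embedded state JSON not found - page layout may have changed');

const state = JSON.parse(match[1]);
// Hypothetical path to the watermark-free source URL
const item = Object.values(state.ItemModule ?? {})[0];
const videoUrl = item?.video?.playAddr;

return [{ json: { videoUrl } }];
```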
by Khaled
Web Server Monitor & Alert System

This automation pings web servers at regular intervals, logs their status, and sends email alerts if a server goes down. It's perfect for maintaining visibility over server uptime without complex monitoring tools.

**How It Works**
This workflow performs minute-by-minute checks on all listed servers in a Google Sheet and:
- Logs all reachable servers in an "Alive" log.
- Sends an email alert if a server is unreachable.
- Logs failed servers in a "Down" sheet with timestamps.

**Key Components**
1. **Schedule Trigger** - Runs the workflow every minute for real-time monitoring.
2. **Web Servers List (Google Sheets)** - Pulls server IPs or hostnames from a Google Sheet named Server_List. Each row = one server to monitor. This makes adding/removing servers effortless: just update the sheet.
3. **Servers Alive Check (HTTP Request)** - Performs an HTTP GET request to each server (e.g., http://your-server.com). If the request fails, it automatically triggers the error path (handled via continueOnFail).
4. **Web Server Alive Log (Google Sheets)** - Records successful pings in Server_Status_Alive with: Timestamp, Server IP, Status = Alive. This log can be used for uptime reports or audits.
5. **Server Down Notification (Gmail)** - If a server fails, this node sends an email to the admin. It includes: server address, timestamp, and suggested action.
6. **Web Server Down Log (Google Sheets)** - Logs failed pings in a separate sheet for historical tracking and debugging.

**Main Advantages**
- **Live Server Monitoring** - Stay informed about server health in near real-time.
- **No-Code Configuration** - Add/remove servers from the Google Sheet; no need to touch the workflow.
- **Email Alerts on Failure** - Proactively notifies you before users report the issue.
- **Audit-Ready Logging** - Maintains logs for both healthy and failed checks for documentation or reporting.
- **Flexible & Scalable** - Monitor 1 or 100 servers with the same template; just scale the list.

**Setup Steps**

Prerequisites:
- Google Sheet with server list (column name = "Server")
- Gmail OAuth2 connection for alerts
- n8n instance running regularly

Configuration:
- **Google Sheets** - Sheet 1 (Server_List): your list of servers. Sheet 2 (Server_Status_Alive): log for reachable servers. Sheet 3 (Server_Status_Down): log for unreachable servers.
- **Gmail Integration** - Connect your Gmail account in the Server Down Notification node. Edit the recipient email and message content as needed.
- **HTTP Check** - Adjust the HTTP request URL template if using port numbers or paths (e.g., http://{{Server}}:8080/status).
- **Schedule** - Default is every 1 minute. Change via the Schedule Trigger if needed.

**Testing**
1. Input a reachable server (e.g., example.com) and an unreachable IP.
2. Run the workflow manually or wait for the next scheduled run.
3. Check: the Alive log updates correctly, the Down log records failures, and the email alert is received.

**Deployment**
Activate the workflow, and it will quietly run in the background, notifying you of any server downtime instantly while keeping logs for future review.
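The branch logic amounts to: request succeeded, append to the Alive sheet; request failed, send the Gmail alert and append to the Down sheet. A sketch of the row-building step as it might look in a Code node (the column names are assumptions mirroring the sheets described above):

```javascript
// Build the log row for each server check (n8n Code node sketch).
// Column names ("Timestamp", "Server", "Status") mirror the sheets
// described above and are assumptions about your exact layout.
return $input.all().map((item) => {
  // With continueOnFail, a failed HTTP request surfaces as an error on
  // the item instead of stopping the workflow (adjust to your n8n version)
  const failed = item.json.error !== undefined;
  return {
    json: {
      Timestamp: new Date().toISOString(),
      Server: item.json.Server,
      Status: failed ? 'Down' : 'Alive',
    },
  };
});
```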
by Msaid Mohamed el hadi
Instagram Full Profile Scraper with Apify and Google Sheets

This n8n workflow automates the process of scraping full Instagram profiles using a custom Apify actor, and logs the results into a Google Sheet. It is designed to run at scheduled intervals and process a list of usernames by calling the API, appending the results, and marking them as processed.

**Features**
- Scheduled Execution - Runs automatically every few minutes.
- Google Sheets Integration - Reads a list of Instagram usernames and updates the same sheet.
- Apify Actor - Fetches full public Instagram profile data.
- Aggregation - Batches usernames for bulk scraping.
- Data Logging - Appends scraped data to a second sheet.
- Tracking - Marks usernames as processed once scraped.

**Workflow Structure**

```mermaid
graph TD;
  ScheduleTrigger --> GetUsernames;
  GetUsernames --> LimitItems;
  LimitItems --> AggregateUsernames;
  AggregateUsernames --> CallApifyActor;
  CallApifyActor --> AppendToSheet;
  CallApifyActor --> MarkAsScraped;
```

**Setup**

Google Sheet
Create a Google Sheet with:
- Sheet 1 named Usernames (GID: 0) with columns: username, scraped
- Sheet 2 named fullprofiles (GID: 458127000)

Sample sheet: Instagram Profile Sheet

n8n Configuration
1. Import this workflow into your n8n instance.
2. Set up your Google Sheets credentials (googleSheetsOAuth2Api).
3. Replace the apify_api_your_token placeholder in the HTTP Request node with your Apify API token.

**Required Credentials**
- **Google Sheets OAuth2** - For reading and writing sheet data.
- **Apify API Token** - To call the custom actor for profile scraping.

**Sheets Used**

| Sheet Name   | Purpose                          |
| ------------ | -------------------------------- |
| Usernames    | Source of usernames to scrape    |
| fullprofiles | Destination of full profile data |

**Apify Actor Info**

> Instagram Full Profile Scraper
> This actor fetches extended profile information from public Instagram profiles.
> View on Apify

**Workflow Nodes Overview**

| Node                     | Purpose                                                            |
| ------------------------ | ------------------------------------------------------------------ |
| Schedule Trigger         | Triggers the workflow periodically.                                |
| Get Usernames            | Reads usernames from the Usernames sheet.                          |
| Limit                    | Limits processing to 20 usernames per run.                         |
| Aggregate                | Groups usernames into a batch for the API call.                    |
| Call Apify Actor         | Sends the usernames to the Apify actor and receives profile data.  |
| Append Full Profiles     | Appends the scraped data to the fullprofiles sheet.                |
| Mark Username as Scraped | Marks the processed usernames as scraped = TRUE.                   |
| Sticky Note              | Provides a reference link to the Apify actor used.                 |

**Example Sheet Structure**

Usernames sheet:

| username     | scraped |
| ------------ | ------- |
| exampleuser1 |         |
| exampleuser2 | TRUE    |

fullprofiles sheet:

| username | full_name | biography | follower_count | ... |
| -------- | --------- | --------- | -------------- | --- |

**Security & Notes**
- This workflow does not bypass any Instagram privacy restrictions. It works only with public Instagram profiles.
- You are responsible for ensuring that scraping complies with Instagram's terms of service and any applicable laws.

**Support**
For any issues, feel free to reach out:
- @mohamedgb00714
- mohamedgb00714@gmail.com
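For reference, the Call Apify Actor node reduces to Apify's run-sync endpoint. A sketch follows; the actor id and the input field name are placeholders to take from your Apify console and the actor's input schema:

```javascript
// Sketch of the "Call Apify Actor" HTTP request. The actor id and the
// "usernames" input key are assumptions -- check the actor's page on Apify.
const APIFY_TOKEN = 'apify_api_your_token';
const ACTOR_ID = 'someuser~instagram-full-profile-scraper'; // hypothetical id

const res = await fetch(
  `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // Batched usernames coming from the Aggregate node
    body: JSON.stringify({ usernames: ['exampleuser1', 'exampleuser2'] }),
  }
);
const profiles = await res.json(); // one object per scraped profile
```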
by Marcelo Abreu
**Who is this workflow for?**
If you're using Meta Ads to generate new leads for your sales pipeline, this workflow is for you!

**What this workflow does**
- Triggers every time you have a new calendar event on a chosen Google Account
- Filters only events with the same name as your "Schedule a demo" event
- Formats and sends the event to the Meta Conversion API

**What events can I send?**
Any event you'd like! It's preconfigured with the "Schedule" event, but you can change it to "Purchase", "InitiateCheckout", "Lead", or custom events.

**Setup Guide**
1. Connect Google OAuth2 to n8n
2. Get your Pixel ID and Access Token from Meta
3. Set your configuration node with Pixel ID, Access Token, source_url and event_name

**Requirements**
- Meta Access Token + Pixel ID (via Meta Conversion API): Documentation
- Google Access (via OAuth2): Documentation

This free template was created by pdforge. Feel free to contact us via the founder's LinkedIn if you have any questions!
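For orientation, the Conversion API call the workflow formats looks roughly like this sketch (the Graph API version and field values are illustrative; Meta requires user_data fields to be SHA-256 hashed):

```javascript
// Sketch of a Conversions API event post. Version and values are
// illustrative -- Meta's docs describe the full user_data options.
const crypto = require('crypto');
const sha256 = (v) => crypto.createHash('sha256').update(v.trim().toLowerCase()).digest('hex');

const PIXEL_ID = 'YOUR_PIXEL_ID';
const ACCESS_TOKEN = 'YOUR_ACCESS_TOKEN';

const payload = {
  data: [{
    event_name: 'Schedule',                       // or Purchase, Lead, a custom event...
    event_time: Math.floor(Date.now() / 1000),
    action_source: 'website',
    event_source_url: 'https://your-site.com/demo', // your source_url
    user_data: { em: [sha256('attendee@example.com')] }, // hashed email from the event
  }],
};

await fetch(`https://graph.facebook.com/v19.0/${PIXEL_ID}/events?access_token=${ACCESS_TOKEN}`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload),
});
```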
by Oneclick AI Squad
This automated n8n workflow processes student applications on a scheduled basis, validates data, updates databases, and sends welcome communications to students and guardians.

**Main Components**
- **Trigger at Every Day 7 am** - Scheduled trigger that runs the workflow daily
- **Read Student Data** - Reads pending applications from Excel/database
- **Validate Application Data** - Checks data completeness and format
- **Process Application Data** - Processes validated applications
- **Update Student Database** - Updates records in the student database
- **Prepare Welcome Email** - Creates personalized welcome messages
- **Send Email** - Sends welcome emails to students/guardians
- **Success Response** - Confirms successful processing
- **Error Response** - Handles any processing errors

**Essential Prerequisites**
- Excel file with student applications (student_applications.xlsx)
- Database access for student records
- SMTP server credentials for sending emails
- File storage access for reading application data

**Required Excel File Structure (student_applications.xlsx)**

```
Application ID | First Name | Last Name | Email | Phone
Program Interest | Grade Level | School | Guardian Name | Guardian Phone
Application Date | Status | Notes
```

**Expected Input Data Format**

```json
{
  "firstName": "John",
  "lastName": "Doe",
  "email": "john.doe@example.com",
  "phone": "+1234567890",
  "program": "Computer Science",
  "gradeLevel": "10th Grade",
  "school": "City High School",
  "guardianName": "Jane Doe",
  "guardianPhone": "+1234567891"
}
```

**Key Features**
- **Scheduled Processing:** Runs daily at 7 AM automatically
- **Data Validation:** Ensures application completeness
- **Database Updates:** Maintains student records
- **Auto Emails:** Sends welcome messages
- **Error Handling:** Manages processing failures

**Quick Setup**
1. Import the workflow JSON into n8n
2. Configure the schedule trigger (default: 7 AM daily)
3. Set the Excel file path in the "Read Student Data" node
4. Configure the database connection in the "Update Student Database" node
5. Add SMTP settings in the "Send Email" node
6. Test with sample data
7. Activate the workflow

**Parameters to Configure**
- excel_file_path: Path to student applications file
- database_connection: Student database credentials
- smtp_host: Email server address
- smtp_user: Email username
- smtp_password: Email password
- admin_email: Administrator notification email
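The validation step can be as simple as the following Code-node sketch. The required fields mirror the input format shown above; adjust them to your sheet's columns:

```javascript
// "Validate Application Data" sketch (n8n Code node). Required fields
// follow the expected input format above -- adapt to your columns.
const REQUIRED = ['firstName', 'lastName', 'email', 'phone', 'program',
                  'gradeLevel', 'school', 'guardianName', 'guardianPhone'];

return $input.all().map((item) => {
  const app = item.json;
  const missing = REQUIRED.filter((f) => !app[f]);
  const emailOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(app.email ?? '');
  return {
    json: {
      ...app,
      valid: missing.length === 0 && emailOk,
      errors: [
        ...missing.map((f) => `Missing ${f}`),
        ...(emailOk ? [] : ['Invalid email format']),
      ],
    },
  };
});
```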
by InfraNodus
**Teach your AI agent HOW to think, not WHAT to think**

This workflow demonstrates how you can build an AI agent in n8n that uses the reasoning logic you define. The LLM learns a way of thinking, which you can then apply to multiple problems:
- Make an AI chatbot that knows how to convince anybody using the "Getting to Yes" method
- Build an LLM workflow that uses Ray Dalio's principles to spot investment opportunities
- Create an AI agent crew of interdisciplinary thinkers: e.g. a specialist in psychology who gives advice on education programmes

**How it works**
This template uses the n8n AI agent node as an orchestrating agent that has access to reasoning logic defined by an InfraNodus knowledge graph. This graph contains a list of reasoning rules (an ontology), which is extracted to provide advice relevant to the original prompt. It uses GraphRAG under the hood to traverse the parts of the graph relevant to the query. The advice and the extracted reasoning logic are then used by the AI agent to generate a response that is relevant to the user's query but follows the reasoning logic provided through the graph.

Here's a step-by-step description:
1. The user submits a question using the AI chatbot (n8n interface; in this case, a web form that can be embedded into any website, or a webhook that can be connected to a Telegram / WhatsApp bot)
2. The AI agent node accesses the Reasoning Logic HTTP InfraNodus nodes. The descriptions of the AI agent and of the reasoning InfraNodus node give the agent an understanding of how to rephrase the original question to retrieve relevant reasoning logic.
3. The request is sent to the InfraNodus node, which returns a response containing the reasoning logic needed to answer the question.
4. This reasoning logic is then sent back to an LLM along with the original query to produce the response.

InfraNodus uses GraphRAG under the hood:
- convert the user query into a graph
- find the overlap with the reasoning graph (using n=1 or more hops to include more relations)
- use similarity search to get additional parts of the graph
- generate a response based on this intersection as well as the context provided
- provide information about the underlying structure

**How to use**
You need an InfraNodus account to use this workflow.
1. Create an InfraNodus account
2. Get the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.
3. Create a separate knowledge graph for the reasoning logic. Use the AI ontology creator to generate an ontology for a certain topic or text using AI, then augment it with your own data. See our help article on creating ontologies for detailed instructions.
4. For each graph, go to the workflow and paste the name of the graph into the request JSON body name field.
5. Change the system prompt in the AI agent node to reflect the nature of your reasoning logic. For instance, if it's an expert in interactions, you specify that; if it's a psychology expert, you specify that as well.
6. Change the description of the reasoning node (HTTP tool). Use the InfraNodus summary and Project Notes > RAG prompt buttons to generate a description for the reasoning logic, which you can then reuse in your workflow.
7. Add the LLM key to the OpenAI node (or the model of your choice) and launch the workflow.

**Requirements**
- An InfraNodus account and API key
- An OpenAI (or any other LLM) API key

**Customizing this workflow**
You can use this same workflow with a Telegram bot, so you can interact with it using Telegram.
There are many more customizations available. Check out the complete guide at https://support.noduslabs.com/hc/en-us/articles/21429518472988-Using-Knowledge-Graphs-as-Reasoning-Experts and also check out the video tutorial with a demo.
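As a rough sketch, the reasoning-logic HTTP tool is a Bearer-authenticated POST whose JSON body names the graph to query. The endpoint path and response field below are assumptions; take the exact ones from the InfraNodus API docs at https://infranodus.com/api-access:

```javascript
// Hedged sketch of the reasoning-logic request. The endpoint path and the
// response field name are assumptions -- copy the real ones from the
// InfraNodus API documentation.
const res = await fetch('https://infranodus.com/api/v1/graphAndAdvice', { // hypothetical path
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_INFRANODUS_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    name: 'your_reasoning_graph',      // the graph holding your ontology
    prompt: 'rephrased user question', // what the agent wants advice on
  }),
});
const { advice } = await res.json(); // fed back to the LLM with the original query
```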
by Ria
This workflow demonstrates how to use the workflowStaticData() function to set any type of variable that will persist across workflow executions: https://docs.n8n.io/code/cookbook/builtin/get-workflow-static-data/

This can be useful, for example, when working with access tokens that expire after a certain time period. Using static data we can keep a record of that access token and its expiry time, and build our workflow logic around it.

**Important**
Static data only persists across production executions, i.e. executions triggered by Webhooks or Schedule Triggers (not manual executions!). For this, the workflow has to be activated.

**Setup**
1. Configure the HTTP Request node to fetch an access token from your API (optional)
2. Activate the workflow
3. Test the workflow with the webhook production link
4. Check how the static data is populated in the individual executions

**Feedback**
If you found this useful or want to report some missing information, I'd be happy to hear from you at ria@n8n.io
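A minimal Code-node sketch of the token-caching pattern this template builds (the token endpoint is a placeholder for your API's real auth call):

```javascript
// Cache an access token in workflow static data and refresh it only when
// expired. Remember: this persists only in production executions
// (webhook/schedule triggers on an activated workflow), not manual runs.
const staticData = $getWorkflowStaticData('global');

if (!staticData.accessToken || Date.now() > (staticData.expiresAt ?? 0)) {
  // Placeholder token endpoint -- swap in your API's real auth request
  const res = await fetch('https://api.example.com/oauth/token', { method: 'POST' });
  const { access_token, expires_in } = await res.json();
  staticData.accessToken = access_token;
  staticData.expiresAt = Date.now() + expires_in * 1000;
}

return [{ json: { token: staticData.accessToken } }];
```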
by Niklas Hatje
**Use Case**
In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible to allow everyone to add their ideas to some formatted database - it should be somewhere where everyone is all the time and could add a new idea without much extra effort. Since we're using Slack, this seemed to be the perfect place to easily add ideas and collect them in Notion.

**What this workflow does**
This workflow waits for a webhook call within Slack, which gets fired when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea.

**Creating your Slack bot**
1. Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
2. Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes
3. Add the chat:write scope
4. Head over to Slash Commands and click on Create New Command
5. Use /idea as the command
6. Copy the test URL from the Webhook node into Request URL
7. Add whatever feels best to the description and usage hint
8. Go to Install App and click install

**Setup**
1. Add a database in Notion with the columns Name and Creator
2. Add your Notion credentials and add the integration to your Notion page
3. Fill the setup node below
4. Create your Slack app (see other sticky)
5. Click Test workflow and use the /idea command in Slack
6. Activate the workflow and exchange the Request URL with the production URL from the webhook

**How to adjust it to your needs**
- You can adjust the table in Notion and, for example, add different types of ideas or areas that they impact
- You might want to add different templates in Notion to make it easier for users to fill their ideas with details
- Rename the Slack command as it works best for you

**How to enhance this workflow**
At n8n we use this workflow in combination with some others. E.g. we have the following things on top:
- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team. (see this template and this template)
- We also added other types, like /pain, to be less solution-driven
- To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here)
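For orientation, here is what Slack posts to the webhook for a slash command, and the check the workflow performs. The fields are Slack's standard slash-command payload; the body shape is a sketch assuming n8n's default webhook output:

```javascript
// What Slack POSTs (form-encoded) for "/idea Build a dark mode", as exposed
// by n8n's Webhook node -- standard Slack slash-command fields.
const body = $input.first().json.body;
// body.command   => "/idea"
// body.text      => "Build a dark mode"
// body.user_name => "niklas"

if (body.command !== '/idea') {
  return []; // not our command, do nothing
}

// Shape the Notion row (the "Name" and "Creator" columns from the setup)
return [{ json: { Name: body.text, Creator: body.user_name } }];
```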
by Browser Use
A sample demo showing how to integrate the Browser Use Cloud API with n8n workflows. This template demonstrates AI-powered web research automation by collecting competitor intelligence and delivering formatted results to Slack.

**How It Works**
1. Form trigger accepts competitor name input
2. Browser Use Cloud API performs automated web research
3. Webhook processes completion status and retrieves structured data
4. JavaScript code formats results into a readable Slack message
5. HTTP request sends the final report to Slack

**Integration Pattern**
This workflow showcases key cloud API integration techniques:
- REST API authentication with bearer tokens
- Webhook-based status monitoring for long-running tasks
- JSON data parsing and transformation
- Conditional logic for processing different response states

**Setup Required**
- Browser Use API key (sign up at cloud.browser-use.com)
- Slack webhook URL

A perfect demo for learning Browser Use Cloud API integrations and building automated research workflows.
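A sketch of the kick-off request follows; treat the exact path and field names as assumptions to verify against the cloud API docs at cloud.browser-use.com:

```javascript
// Hedged sketch of starting a Browser Use Cloud research task. Verify the
// endpoint path and body fields against the official API reference.
const res = await fetch('https://api.browser-use.com/api/v1/run-task', {
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_BROWSER_USE_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    task: 'Research competitor "Acme Inc": pricing, positioning, recent news',
  }),
});
const { id } = await res.json(); // poll or receive a webhook with this task id
```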
by Fahmi Oktafian
**Who's it for**
This workflow is perfect for SEO specialists, marketers, bloggers, and content creators who want to automate keyword research using Google Sheets, Google Suggest, and Google Custom Search. Ideal for those building content pipelines, researching trends, or powering AI content generation with fresh search data.

**What it does**
This workflow automates the process of discovering a new keyword daily. It:
- Rotates through a keyword list in Google Sheets
- Selects one keyword per day
- Fetches autocomplete suggestions from Google Suggest
- Queries the Google Custom Search API for top results
- Returns structured JSON containing titles, links, and snippets

**How it works** (see the sketch after this section)
1. Manual Trigger - Initiates the workflow manually
2. Google Sheets - Reads keywords from a sheet (column: Title or Keyword)
3. Code Node - Selects a daily keyword based on the number of days since July 4, 2025
4. Set Node - Saves the selected keyword as seed_keyword
5. HTTP Request - Fetches autocomplete suggestions from the Google Suggest API
6. Function Node - Parses suggestions into usable items
7. HTTP Request - Calls the Google Custom Search API for each suggestion
8. Code Node - Formats the search results into JSON

**How to set up**
1. Connect your Google Sheets OAuth2 credentials in n8n
2. Use credential variables for Google Custom Search (do not hardcode your key and cx)
3. Replace the sample sheet ID with your own
4. Run the workflow manually or schedule it daily

**Requirements**
- Google account
- Enabled Custom Search JSON API on Google Cloud
- Google Sheet with a column labeled Title or Keyword
- n8n instance (cloud or self-hosted)

**How to customize**
- Change the start date to control the keyword rotation cycle
- Randomize keyword selection instead of rotating
- Enrich results using tools like Ahrefs or SEMrush
- Push final output to Telegram, Notion, Slack, or Airtable
- Add filtering logic based on CPC, volume, or duplicates

**Example Sheet**
Click here to access the example Google Sheet. The sheet must contain a column Title or Keyword in the first row:

| Title         |
| ------------- |
| teknologi AI  |
| berita viral  |
| tren startup  |
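The rotation plus suggestion fetch boils down to a few lines. A sketch using Google's public autocomplete endpoint and the start date described above:

```javascript
// Daily keyword rotation + Google Suggest call (sketch). The keyword list
// stands in for the rows read from the Google Sheet.
const keywords = ['teknologi AI', 'berita viral', 'tren startup']; // from the sheet

// One keyword per day, rotating since July 4, 2025
const daysSinceStart = Math.floor((Date.now() - Date.parse('2025-07-04')) / 86400000);
const seed = keywords[daysSinceStart % keywords.length];

// Google's public autocomplete API (client=firefox returns plain JSON)
const res = await fetch(
  `https://suggestqueries.google.com/complete/search?client=firefox&q=${encodeURIComponent(seed)}`
);
const [, suggestions] = await res.json(); // ["suggestion 1", "suggestion 2", ...]

return suggestions.map((s) => ({ json: { seed_keyword: seed, suggestion: s } }));
```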
by Monospace Design
**What is this workflow doing?**
This simple workflow pulls the latest Euro foreign exchange reference rates from the European Central Bank and responds with the requested values to an incoming HTTP request (GET) via a Webhook trigger node.

**Setup**
- **no authentication** needed; the workflow is ready to use
- **test** the workflow template by hitting the test workflow button and calling the URL in the webhook node
- optional: choose your own webhook listening path in the Webhook trigger node

**Usage**
There are two possible usage scenarios:
1. get all Euro exchange rates as an array of objects
2. get only a specific currency exchange rate as a single object

**All available rates**
If no query is provided, all available rates are returned. Response example:

```json
[{"currency":"USD","rate":"1.0852"},{"currency":"JPY","rate":"163.38"},{"currency":"BGN","rate":"1.9558"},{"currency":"CZK","rate":"25.367"},{"currency":"DKK","rate":"7.4542"},{"currency":"GBP","rate":"0.85495"},{"currency":"HUF","rate":"389.53"},{"currency":"PLN","rate":"4.3053"},{"currency":"RON","rate":"4.9722"},{"currency":"SEK","rate":"11.1675"},{"currency":"CHF","rate":"0.9546"},{"currency":"ISK","rate":"149.30"},{"currency":"NOK","rate":"11.4285"},{"currency":"TRY","rate":"33.7742"},{"currency":"AUD","rate":"1.6560"},{"currency":"BRL","rate":"5.4111"},{"currency":"CAD","rate":"1.4674"},{"currency":"CNY","rate":"7.8100"},{"currency":"HKD","rate":"8.4898"},{"currency":"IDR","rate":"16962.54"},{"currency":"ILS","rate":"3.9603"},{"currency":"INR","rate":"89.9375"},{"currency":"KRW","rate":"1444.46"},{"currency":"MXN","rate":"18.5473"},{"currency":"MYR","rate":"5.1840"},{"currency":"NZD","rate":"1.7560"},{"currency":"PHP","rate":"60.874"},{"currency":"SGD","rate":"1.4582"},{"currency":"THB","rate":"38.915"},{"currency":"ZAR","rate":"20.9499"}]
```

**Single exchange rate**
Using the HTTP query ?foreign=USD (where USD is one of the available currency symbols) will return only that specifically requested rate. Response example:

```json
{"currency":"USD","rate":"1.0852"}
```

**Further info**
Read more about Euro foreign exchange reference rates here.
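The response branch reduces to a small filter. A Code-node sketch, assuming the rates were already parsed into {currency, rate} objects earlier in the workflow:

```javascript
// Return one object when ?foreign=XXX is given, otherwise the full array.
// Assumes an upstream node produced one item per {currency, rate} pair.
const rates = $input.all().map((i) => i.json); // [{currency, rate}, ...]
const wanted = $('Webhook').first().json.query?.foreign;

if (wanted) {
  const match = rates.find((r) => r.currency === wanted.toUpperCase());
  return [{ json: match ?? { error: `Unknown currency: ${wanted}` } }];
}
return rates.map((r) => ({ json: r }));
```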