by Ria
This is a very simple workflow that lets you subscribe to any GitHub repository and get notified about the latest release (using n8n as the example).

How it works:
- Polls the GitHub repository daily for the latest (stable) release of n8n
- Parses the release content to HTML
- Sends the result as a Gmail message

Setup steps:
- Add your Gmail credentials (or use another email node of your choice)
- Change the URL to the GitHub repository you want to check regularly
- Change the "To" email address to the address that should receive the updates

Feedback & Questions
If you have any questions or feedback about this workflow, feel free to get in touch at ria@n8n.io
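For reference, GitHub's REST API exposes the latest published (non-pre-release) release at /repos/{owner}/{repo}/releases/latest, which is the kind of call the HTTP Request node makes here. A minimal TypeScript sketch of the daily check, assuming the n8n repository as the target (the Gmail step is omitted):

```typescript
// Minimal sketch: fetch the latest stable release of a GitHub repository.
// The n8n workflow does this with an HTTP Request node on a daily schedule.
interface GitHubRelease {
  tag_name: string;
  name: string;
  html_url: string;
  published_at: string;
  body: string; // release notes in Markdown
}

async function fetchLatestRelease(owner: string, repo: string): Promise<GitHubRelease> {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}/releases/latest`, {
    headers: { Accept: "application/vnd.github+json" },
  });
  if (!res.ok) throw new Error(`GitHub API returned ${res.status}`);
  return (await res.json()) as GitHubRelease;
}

// Example: the template checks the n8n repository by default.
fetchLatestRelease("n8n-io", "n8n").then((release) => {
  console.log(`${release.name} (${release.tag_name}) - ${release.html_url}`);
});
```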
by Cameron Wills
Who is this for?
Content creators, social media managers, digital marketers, and researchers who need to download original TikTok videos without watermarks for analysis, repurposing, or archiving purposes.

What problem does this workflow solve?
Downloading TikTok videos without watermarks typically requires using questionable third-party websites that may have limitations, ads, or privacy concerns. This workflow provides a clean, automated solution that can be integrated into your own systems and processes.

What this workflow does
This workflow automates the process of downloading TikTok videos without watermarks in three simple steps:
- Fetch the TikTok video page by providing the video URL
- Extract the raw video URL from the page's HTML data
- Download the original video file without watermark
- (Optional) Upload to Google Drive with public sharing link generation

The workflow uses web scraping techniques to extract the original video source directly from TikTok's own servers, maintaining the highest possible quality without any added watermarks or branding.

Setup (Est. time: 5-10 minutes)
Before getting started, you'll need:
- n8n installation
- The URL of a TikTok you want to download
- (Optional) Google Drive API enabled in Google Cloud Console with OAuth Client ID and Client Secret credentials if you want to use the upload feature

How to customize this workflow to your needs
- Replace the example TikTok URL with your desired video links
- Modify the file naming convention for downloaded videos
- Integrate with other nodes to process videos after downloading
- Create a webhook to trigger the workflow from external applications
- Set up a schedule to regularly download videos from specific accounts

This workflow can be extended to support various use cases like trending content analysis, competitor research, creating compilation videos, or building a content library for inspiration. It provides a foundation that can be customized to fit into larger automated workflows for content creation and social media management.
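As a rough illustration of the scraping approach only (not the workflow's exact nodes), a hedged TypeScript sketch is shown below. The regex and the playAddr field name are assumptions about the JSON embedded in TikTok's page HTML, and the page structure changes often, so take the real extraction path from the workflow itself:

```typescript
// Illustrative sketch only: the real workflow uses n8n's HTTP Request and HTML nodes.
async function downloadTikTok(videoUrl: string): Promise<ArrayBuffer> {
  const page = await fetch(videoUrl, {
    headers: { "User-Agent": "Mozilla/5.0" }, // a browser-like UA is usually required
  });
  const html = await page.text();

  // Assumption: the watermark-free source URL appears as a "playAddr"-style field
  // inside the JSON embedded in the page. Adjust the pattern to the current markup.
  const match = html.match(/"playAddr":"(https:[^"]+)"/);
  if (!match) throw new Error("Could not locate a raw video URL in the page HTML");

  const rawUrl = JSON.parse(`"${match[1]}"`); // decodes \u0026 and similar escapes
  const video = await fetch(rawUrl);
  return video.arrayBuffer();
}
```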
by Airtop
Automating LinkedIn Company URL Verification

Use Case
This automation verifies that a given LinkedIn URL actually belongs to a company by comparing the website listed on their LinkedIn page against the expected company domain. It is essential for ensuring data accuracy in lead qualification, enrichment, and CRM updates.

What This Automation Does
Input Parameters
- **Company LinkedIn**: The LinkedIn URL to be verified.
- **Company Domain**: The expected domain (e.g., example.com) for validation.
- **Airtop Profile (connected to LinkedIn)**: Airtop Profile with LinkedIn authentication.

Output
- Confirmation whether the LinkedIn page corresponds to the provided domain.
- Returns the verified LinkedIn URL if the match is confirmed.

How It Works
- Extracts the website URL from the specified LinkedIn company profile.
- Compares the extracted URL with the provided company domain.
- If the domain is contained in the extracted website, the LinkedIn profile is confirmed as valid.
- Returns the original LinkedIn URL if the match is successful.

Setup Requirements
- Airtop API Key
- LinkedIn-authenticated Airtop Profile

Next Steps
- **Use for LinkedIn Discovery Validation**: Ensure correctness after automated LinkedIn page discovery.
- **Combine with CRM Updates**: Prevent incorrect LinkedIn links from being stored in CRM.
- **Automate in Data Pipelines**: Use this as a validation gate before enrichment or scoring steps.
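The comparison step can be pictured as a small helper like the following sketch. The normalisation details (stripping the protocol and a leading www.) are illustrative assumptions; the containment check itself mirrors the logic described above:

```typescript
// Sketch of the domain comparison: does the website listed on the LinkedIn page
// match the expected company domain?
function domainsMatch(extractedWebsite: string, expectedDomain: string): boolean {
  try {
    // Normalise both sides: strip protocol, "www." and any path, then compare hostnames.
    const host = new URL(
      extractedWebsite.startsWith("http") ? extractedWebsite : `https://${extractedWebsite}`
    ).hostname.replace(/^www\./, "").toLowerCase();
    const expected = expectedDomain.replace(/^www\./, "").toLowerCase();
    return host === expected || host.endsWith(`.${expected}`);
  } catch {
    return false; // unparsable website value -> treat as no match
  }
}

// domainsMatch("https://www.example.com/about", "example.com") -> true
```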
by Agent Studio
Overview
This workflow answers user requests sent via Mac Shortcuts. Several Shortcuts call the same webhook, with a query and a type of query.

Types of query are:
- translate to English
- translate to Spanish
- correct grammar (without changing the actual content)
- make content shorter
- make content longer

How it works
- Select a text you are writing
- Launch the shortcut
- The text is sent to the webhook
- Depending on the type of request, a different prompt is used
- Each request is sent to an OpenAI node
- The workflow responds to the request with the response from GPT
- The Shortcut replaces the selected text with the new one

For a demo and setup instructions:

How to use it
- Activate the workflow
- Download this Shortcut template
- Install the shortcut
- In step 2 of the shortcut, change the URL of the Webhook
- In Shortcut details, "Add Keyboard Shortcut" with the key you want to use to launch the shortcut
- Go to Settings, Advanced, and check "Allow Running Scripts"
- You are ready to use the shortcut. Select a text and hit the keyboard shortcut you just defined
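For anyone wiring up their own client instead of the provided Shortcut, the request can be sketched as follows. The JSON field names query and type are assumptions to verify against the Webhook node's configuration:

```typescript
// Sketch of the request a Shortcut (or any client) sends to the n8n webhook.
type RequestType =
  | "translate_to_english"
  | "translate_to_spanish"
  | "correct_grammar"
  | "make_shorter"
  | "make_longer";

async function rewriteText(webhookUrl: string, query: string, type: RequestType): Promise<string> {
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, type }), // field names are assumptions
  });
  // The workflow responds with the text produced by the OpenAI node,
  // which the Shortcut then pastes over the selected text.
  return res.text();
}
```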
by Khaled
Web Server Monitor & Alert System

This automation pings web servers at regular intervals, logs their status, and sends email alerts if a server goes down. It's perfect for maintaining visibility over server uptime without complex monitoring tools.

How It Works
This workflow performs minute-by-minute checks on all listed servers in a Google Sheet and:
- Logs all reachable servers in an "Alive" log.
- Sends an email alert if a server is unreachable.
- Logs failed servers in a "Down" sheet with timestamps.

Key Components
1. Schedule Trigger
Runs the workflow every minute for real-time monitoring.
2. Web Servers List (Google Sheets)
Pulls server IPs or hostnames from a Google Sheet named Server_List. Each row = one server to monitor. This makes adding/removing servers effortless: just update the sheet.
3. Servers Alive Check (HTTP Request)
Performs an HTTP GET request to each server (e.g., http://your-server.com). If the request fails, it automatically triggers the error path (handled via continueOnFail).
4. Web Server Alive Log (Google Sheets)
Records successful pings in Server_Status_Alive with: Timestamp, Server IP, Status = Alive. This log can be used for uptime reports or audits.
5. Server Down Notification (Gmail)
If a server fails, this node sends an email to the admin. It includes: server address, timestamp, suggested action.
6. Web Server Down Log (Google Sheets)
Logs failed pings in a separate sheet for historical tracking and debugging.

Main Advantages
- Live Server Monitoring: Stay informed about server health in near real-time.
- No-Code Configuration: Add/remove servers from the Google Sheet, no need to touch the workflow.
- Email Alerts on Failure: Proactively notifies you before users report the issue.
- Audit-Ready Logging: Maintains logs for both healthy and failed checks for documentation or reporting.
- Flexible & Scalable: Monitor 1 or 100 servers with the same template, just scale the list.

Setup Steps
Prerequisites
- Google Sheet with server list (column name = "Server")
- Gmail OAuth2 connection for alerts
- n8n instance running regularly

Configuration
- Google Sheets
  - Sheet 1 (Server_List): Your list of servers.
  - Sheet 2 (Server_Status_Alive): Log for reachable servers.
  - Sheet 3 (Server_Status_Down): Log for unreachable servers.
- Gmail Integration
  - Connect your Gmail account in the Server Down Notification node.
  - Edit recipient email and message content as needed.
- HTTP Check
  - Adjust the HTTP request URL template if using port numbers or paths (e.g., http://{{Server}}:8080/status).
- Schedule
  - Default is every 1 minute. Change via the Schedule Trigger if needed.

Testing
- Input a reachable server (e.g., example.com) and an unreachable IP.
- Run the workflow manually or wait for the next scheduled run.
- Check that the Alive log updates correctly, the Down log records failures, and the email alert is received.

Deployment
Activate the workflow, and it will quietly run in the background, notifying you of any server downtime instantly while keeping logs for future review.
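A minimal sketch of the per-server check (what the HTTP Request node with continueOnFail does), assuming a plain HTTP GET and a five-second timeout:

```typescript
// Sketch of the check the workflow performs for each row of Server_List every minute.
interface CheckResult {
  server: string;
  status: "Alive" | "Down";
  timestamp: string;
}

async function checkServer(server: string): Promise<CheckResult> {
  const timestamp = new Date().toISOString();
  try {
    const res = await fetch(`http://${server}`, { signal: AbortSignal.timeout(5000) });
    return { server, status: res.ok ? "Alive" : "Down", timestamp };
  } catch {
    // Request failed entirely (DNS error, timeout, refused connection):
    // this is the error path that triggers the email alert and the Down log.
    return { server, status: "Down", timestamp };
  }
}
```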
by Msaid Mohamed el hadi
Instagram Full Profile Scraper with Apify and Google Sheets

This n8n workflow automates the process of scraping full Instagram profiles using a custom Apify actor, and logs the results into a Google Sheet. It is designed to run at scheduled intervals and process a list of usernames by calling the API, appending the results, and marking them as processed.

Features
- Scheduled Execution: Runs automatically every few minutes.
- Google Sheets Integration: Reads a list of Instagram usernames and updates the same sheet.
- Apify Actor: Fetches full public Instagram profile data.
- Aggregation: Batches usernames for bulk scraping.
- Data Logging: Appends scraped data to a second sheet.
- Tracking: Marks usernames as processed once scraped.

Workflow Structure
graph TD;
  ScheduleTrigger --> GetUsernames;
  GetUsernames --> LimitItems;
  LimitItems --> AggregateUsernames;
  AggregateUsernames --> CallApifyActor;
  CallApifyActor --> AppendToSheet;
  CallApifyActor --> MarkAsScraped;

Setup
Google Sheet
Create a Google Sheet with:
- Sheet 1 named Usernames (GID: 0), columns: username, scraped
- Sheet 2 named fullprofiles (GID: 458127000)
Sample sheet: Instagram Profile Sheet

n8n Configuration
- Import this workflow into your n8n instance.
- Set up your Google Sheets credentials (googleSheetsOAuth2Api).
- Replace the apify_api_your token placeholder in the HTTP Request node with your Apify API token.

Required Credentials
- Google Sheets OAuth2: For reading and writing sheet data.
- Apify API Token: To call the custom actor for profile scraping.

Sheets Used
| Sheet Name | Purpose |
| --- | --- |
| Usernames | Source of usernames to scrape |
| fullprofiles | Destination of full profile data |

Apify Actor Info
> Instagram Full Profile Scraper
> This actor fetches extended profile information from public Instagram profiles.
View on Apify

Workflow Nodes Overview
| Node | Purpose |
| --- | --- |
| Schedule Trigger | Triggers the workflow periodically. |
| Get Usernames | Reads usernames from the Usernames sheet. |
| Limit | Limits processing to 20 usernames per run. |
| Aggregate | Groups usernames into a batch for the API call. |
| Call Apify Actor | Sends the usernames to the Apify actor and receives profile data. |
| Append Full Profiles | Appends the scraped data to the fullprofiles sheet. |
| Mark Username as Scraped | Marks the processed usernames as scraped = TRUE. |
| Sticky Note | Provides a reference link to the Apify actor used. |

Example Sheet Structure
Usernames Sheet
| username | scraped |
| --- | --- |
| exampleuser1 | |
| exampleuser2 | TRUE |

fullprofiles Sheet
| username | full_name | biography | follower_count | ... |
| --- | --- | --- | --- | --- |

Security & Notes
- This workflow does not bypass any Instagram privacy restrictions.
- It works only with public Instagram profiles.
- You are responsible for ensuring that scraping complies with Instagram's terms of service and any applicable laws.

Support
For any issues, feel free to reach out:
@mohamedgb00714
mohamedgb00714@gmail.com
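If you want to reproduce the Call Apify Actor step outside n8n, Apify's run-sync-get-dataset-items endpoint is one way to do it. The actor ID and the usernames input field below are placeholders; take the real values from the workflow's HTTP Request node and the actor's input schema:

```typescript
// Sketch of the batched Apify call. ACTOR_ID_PLACEHOLDER stands in for the
// Instagram Full Profile Scraper actor referenced in the workflow's sticky note.
async function scrapeProfiles(usernames: string[], apifyToken: string): Promise<unknown[]> {
  const actorId = "ACTOR_ID_PLACEHOLDER";
  const res = await fetch(
    `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${apifyToken}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ usernames }), // input field name is an assumption
    }
  );
  if (!res.ok) throw new Error(`Apify returned ${res.status}`);
  // One item per scraped profile; the workflow appends these to the fullprofiles sheet.
  return res.json();
}
```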
by Oneclick AI Squad
This n8n template demonstrates how to create an automated customer feedback collection system for restaurants. The workflow triggers when new customer emails are added to an Excel sheet, automatically sends personalized feedback forms, and stores all responses in a separate Excel tracking sheet. Perfect for restaurants wanting to systematically gather customer insights and improve service quality.

Good to know
- Each feedback form is personalized with the customer's name and email
- All responses are automatically timestamped and organized in Excel sheets
- The system handles form validation and ensures complete data capture
- Email notifications keep your team updated on new feedback submissions

How it works
Email Distribution Workflow
- New customer entries are detected in Excel Sheet-1 (customer database) containing customer names and email addresses
- The system automatically generates personalized feedback forms for each new customer
- Customized feedback emails are sent with embedded forms tailored to restaurant experience evaluation
- Wait nodes ensure proper processing timing before sending emails

Feedback Collection Workflow
- Customer form submissions trigger the data collection process
- All feedback responses are captured, including ratings, comments, and contact information
- Data is automatically appended to Excel Sheet-2 (feedback responses) with complete timestamps
- The system handles multiple concurrent submissions without data loss

Excel Sheet Structure
Sheet-1 (Customer Database)
- Name - Customer's full name
- Email - Customer's email address for form distribution

Sheet-2 (Feedback Responses)
- Timestamp - Date and time of form submission
- Name - Customer's full name
- E-Mail - Customer's email address
- Contact Number - Customer's phone number
- How was the cleanliness of the dining area? - Cleanliness rating/feedback
- Did you like the taste of the food? - Food taste evaluation
- What dish did you enjoy the most? - Favorite dish identification
- Was your order accurate and timely? - Service accuracy rating
- Was our staff polite and helpful? - Staff service evaluation
- Was the food presentation appealing? - Food presentation rating
- How would you rate your overall dining experience? - Overall experience score
- Any additional comments or suggestions? - Open-ended feedback field

How to use
- Import the workflow into your n8n instance and configure Excel integration
- Set up Sheet-1 with customer names and emails for feedback distribution
- Configure the feedback form with your restaurant's specific questions and branding
- Add new customer entries to Sheet-1 to automatically trigger feedback emails
- Monitor Sheet-2 for incoming responses and analyze customer satisfaction trends
- The system scales automatically with your customer database growth

Requirements
- Google Sheets account for data storage and management
- Email service integration (Gmail, SMTP, or similar)
- n8n instance with Google Sheets and email connectors

Customising this workflow
- Customer feedback automation can be adapted for different restaurant types and service models
- Try popular use-cases such as post-dining follow-ups, seasonal menu feedback, or special event evaluations
- The workflow can be extended to include automated response analysis, sentiment scoring, and management dashboard integration
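For reference, each Sheet-2 row can be modelled as a simple record; the property names below are illustrative shorthands for the column headers listed above:

```typescript
// Sketch of the record appended to Sheet-2 for every form submission.
interface FeedbackRow {
  timestamp: string;      // date and time of form submission
  name: string;
  email: string;
  contactNumber: string;
  cleanliness: string;    // "How was the cleanliness of the dining area?"
  foodTaste: string;      // "Did you like the taste of the food?"
  favoriteDish: string;   // "What dish did you enjoy the most?"
  orderAccuracy: string;  // "Was your order accurate and timely?"
  staffService: string;   // "Was our staff polite and helpful?"
  presentation: string;   // "Was the food presentation appealing?"
  overallRating: string;  // "How would you rate your overall dining experience?"
  comments: string;       // "Any additional comments or suggestions?"
}
```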
by Monospace Design
What is this workflow doing?
This simple workflow pulls the latest Euro foreign exchange reference rates from the European Central Bank and returns the requested values in response to an incoming HTTP GET request via a Webhook trigger node.

Setup
- **No authentication** needed; the workflow is ready to use
- **Test** the workflow template by hitting the "Test workflow" button and calling the URL in the Webhook node
- Optional: choose your own Webhook listening path in the Webhook trigger node

Usage
There are two possible usage scenarios:
- get all Euro exchange rates as an array of objects
- get only a specific currency exchange rate as a single object

Single exchange rate
Using the HTTP query ?foreign=USD (where USD is one of the available currency symbols) will provide only that specifically requested rate. Response example: {"currency":"USD","rate":"1.0852"}

All available rates
If no query is provided, all available rates are returned. Response example: [{"currency":"USD","rate":"1.0852"},{"currency":"JPY","rate":"163.38"},{"currency":"BGN","rate":"1.9558"},{"currency":"CZK","rate":"25.367"},{"currency":"DKK","rate":"7.4542"},{"currency":"GBP","rate":"0.85495"},{"currency":"HUF","rate":"389.53"},{"currency":"PLN","rate":"4.3053"},{"currency":"RON","rate":"4.9722"},{"currency":"SEK","rate":"11.1675"},{"currency":"CHF","rate":"0.9546"},{"currency":"ISK","rate":"149.30"},{"currency":"NOK","rate":"11.4285"},{"currency":"TRY","rate":"33.7742"},{"currency":"AUD","rate":"1.6560"},{"currency":"BRL","rate":"5.4111"},{"currency":"CAD","rate":"1.4674"},{"currency":"CNY","rate":"7.8100"},{"currency":"HKD","rate":"8.4898"},{"currency":"IDR","rate":"16962.54"},{"currency":"ILS","rate":"3.9603"},{"currency":"INR","rate":"89.9375"},{"currency":"KRW","rate":"1444.46"},{"currency":"MXN","rate":"18.5473"},{"currency":"MYR","rate":"5.1840"},{"currency":"NZD","rate":"1.7560"},{"currency":"PHP","rate":"60.874"},{"currency":"SGD","rate":"1.4582"},{"currency":"THB","rate":"38.915"},{"currency":"ZAR","rate":"20.9499"}]

Further info
Read more about Euro foreign exchange reference rates here.
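Calling the endpoint from code follows directly from the two scenarios above. A short TypeScript sketch, where the webhook URL is a placeholder for the one shown on your Webhook trigger node:

```typescript
// Sketch of calling the exchange-rate webhook, matching the response examples above.
const WEBHOOK_URL = "https://your-n8n-instance/webhook/exchange-rates"; // placeholder

async function main(): Promise<void> {
  // Single rate: pass the ?foreign= query parameter.
  const usd = await fetch(`${WEBHOOK_URL}?foreign=USD`).then((r) => r.json());
  console.log(usd); // e.g. { currency: "USD", rate: "1.0852" }

  // All rates: call the webhook without a query.
  const all = await fetch(WEBHOOK_URL).then((r) => r.json());
  console.log(`${all.length} rates returned`);
}

main();
```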
by Niklas Hatje
Use Case
In most companies, employees have a lot of great ideas. That was the same for us at n8n. We wanted to make it as easy as possible for everyone to add their ideas to some formatted database - it should be somewhere where everyone is all the time and could add a new idea without much extra effort. Since we're using Slack, this seemed to be the perfect place to easily add ideas and collect them in Notion.

What this workflow does
This workflow waits for a webhook call within Slack, which gets fired when users use the /idea command on a bot that you will create as part of this template. It then checks the command, adds the idea to Notion, and notifies the user about the newly added idea as you can see below:

Creating your Slack bot
- Visit https://api.slack.com/apps, click on New App and choose a name and workspace.
- Click on OAuth & Permissions and scroll down to Scopes -> Bot Token Scopes
- Add the chat:write scope
- Head over to Slash Commands and click on Create New Command
- Use /idea as the command
- Copy the test URL from the Webhook node into Request URL
- Add whatever feels best to the description and usage hint
- Go to Install App and click Install

Setup
- Add a database in Notion with the columns Name and Creator
- Add your Notion credentials and add the integration to your Notion page
- Fill the setup node below
- Create your Slack app (see other sticky)
- Click Test workflow and use the /idea command in Slack
- Activate the workflow and exchange the Request URL with the production URL from the webhook

How to adjust it to your needs
- You can adjust the table in Notion and, for example, add different types of ideas or areas that they impact
- You might want to add different templates in Notion to make it easier for users to fill their ideas with details
- Rename the Slack command as it works best for you

How to enhance this workflow
At n8n we use this workflow in combination with some others. E.g. we have the following things on top:
- We additionally have a /bug Slack command that adds a new bug to Linear. Here we're using AI to classify the bugs and move them to the right team. (see this template and this template)
- We also added other types, like /pain, to be less solution-driven
- To make it easier for everyone to give input, we added a Votes column that allows everyone to vote on ideas/pain points in the list
- We're also running a workflow once a week that highlights the most popular new ideas and the most active voters (see here)
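For context, Slack delivers slash commands to the Request URL as an application/x-www-form-urlencoded POST. A sketch of the fields this template relies on, assuming the command text maps to the Name column and the Slack username to the Creator column:

```typescript
// What the n8n Webhook node receives when someone runs "/idea Build a dark mode".
// Only the fields relevant to this template are listed here.
interface SlackSlashCommandPayload {
  command: string;      // "/idea"
  text: string;         // "Build a dark mode"
  user_name: string;    // the Slack user who submitted the idea
  response_url: string; // used to notify the user about the newly added idea
}

// Assumption: the workflow maps the payload onto the Notion columns like this.
function toNotionProperties(payload: SlackSlashCommandPayload) {
  return { Name: payload.text, Creator: payload.user_name };
}
```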
by Guillaume Duvernay
Description
This template provides a simple and powerful backend for adding speech-to-text capabilities to any application. It creates a dedicated webhook that receives an audio file, transcribes it using OpenAI's gpt-4o-mini model, and returns the clean text. To help you get started immediately, you'll find a complete, ready-to-use HTML code example right inside the workflow in a sticky note. This code creates a functional recording interface you can use for testing or as a foundation for your own design.

Who is this for?
- **Developers:** Quickly add a transcription feature to your application by calling this webhook from your existing frontend or backend code.
- **No-code/Low-code builders:** Embed a functional audio recorder and transcription service into your projects by using the example code found inside the workflow.
- **API enthusiasts:** A lean, practical example of how to use n8n to wrap a service like OpenAI into your own secure and scalable API endpoint.

What problem does this solve?
- **Provides a ready-made API:** Instantly gives you a secure webhook to handle audio file uploads and transcription processing without any server setup.
- **Decouples frontend from backend:** Your application only needs to know about one simple webhook URL, allowing you to change the backend logic in n8n without touching your app's code.
- **Offers a clear implementation pattern:** The included example code provides a working demonstration of how to send an audio file from a browser and handle the response, a pattern you can replicate in any framework.

How it works
This solution works by defining a clear API contract between your application (the client) and the n8n workflow (the backend).

The client-side technique:
- Your application's interface records or selects an audio file.
- It then makes a POST request to the n8n webhook URL, sending the audio file as multipart/form-data.
- It waits for the response from the webhook, parses the JSON body, and extracts the value of the Transcript key.
- You can see this exact pattern in action in the example code provided in the workflow's sticky note.

The n8n workflow (backend):
- The Webhook node catches the incoming POST request and grabs the audio file.
- The HTTP Request node sends this file to the OpenAI API.
- The Set node isolates the transcript text from the API's response.
- The Respond to Webhook node sends a clean JSON object ({"Transcript": "your text here..."}) back to your application.

Setup
Configure the n8n workflow:
- In the Transcribe with OpenAI node, add your OpenAI API credentials.
- Activate the workflow to enable the endpoint.
- Click the "Copy" button on the Webhook node to get your unique Production Webhook URL.

Integrate with the frontend:
- Inside the workflow, find the sticky note labeled "Example Frontend Code Below".
- Copy the complete HTML from the note below it.
- Important: In the code you just copied, find the line const WEBHOOK_URL = 'YOUR WEBHOOK URL'; and replace the placeholder with the Production Webhook URL from n8n.
- Save the code as an HTML file and open it in your browser to test.

Taking it further
- **Save transcripts:** Add an Airtable or Google Sheets node to log every transcript that comes through the workflow.
- **Error handling:** Enhance the workflow to catch potential errors from the OpenAI API and respond with a clear error message.
- **Analyze the transcript:** Add a Language Model node after the transcription step to summarize the text, classify its sentiment, or extract key entities before sending the response.
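Beyond the provided HTML example, the same client-side contract can be sketched in a few lines of TypeScript. The form field name audio is an assumption to match against the sticky-note code; the Transcript response key comes from the contract described above:

```typescript
// Minimal client sketch: send the audio as multipart/form-data and read the
// "Transcript" key from the JSON response returned by the n8n webhook.
async function transcribe(webhookUrl: string, audio: Blob): Promise<string> {
  const form = new FormData();
  form.append("audio", audio, "recording.webm"); // field name is an assumption

  const res = await fetch(webhookUrl, { method: "POST", body: form });
  if (!res.ok) throw new Error(`Webhook returned ${res.status}`);

  const body = (await res.json()) as { Transcript: string };
  return body.Transcript;
}
```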
by Wyeth
Encode JSON to Base64 String in n8n

This example workflow demonstrates how to convert a JSON object into a base64-encoded string using n8n's built-in file processing capabilities. This is a common requirement when working with APIs, webhooks, or SaaS integrations that expect payloads to be base64-encoded.

> Tip: The three green-highlighted nodes (Stringify → Convert to File → Extract from File) can be wrapped in a Subworkflow to create a reusable Base64 encoder in your own projects.

Requirements
- Any running n8n instance (local or cloud)
- No credentials or external services required

What This Workflow Does
- Generates example JSON data
- Converts the JSON to a string
- Saves the string as a binary file
- Extracts the file's contents as a base64 string
- Outputs the base64 string on the final node

Step-by-Step Setup
1. Manual Trigger: Start the workflow using the Manual Execution node. This is useful for testing and development.
2. Create JSON Data: The Create Json Data node uses raw mode to construct a sample object with all major JSON types: strings, numbers, booleans, nulls, arrays, nested objects, etc.
3. Convert to String: The Convert to String node uses the expression ={{ JSON.stringify($json) }} to flatten the object into a single string field named json_text.
4. Convert to File: The Convert to File node takes the json_text value and saves it to a UTF-8 encoded binary file in the property encoded_text.
5. Extract from File: This node takes the binary file and extracts its contents as a base64-encoded string. The result is saved in the base64_text field.

Customization Tips
- Replace the sample JSON in the Create Json Data node with your own payload structure.
- To make this reusable, extract the three core nodes into a Subworkflow or wrap them in a custom Function.
- Use the base64_text output field to post to APIs, store in databases, or include in webhook responses.
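Outside n8n, the same stringify-to-base64 round trip collapses to a couple of lines in Node.js, which can be handy for verifying the workflow's base64_text output:

```typescript
// Equivalent of the Stringify -> Convert to File -> Extract from File chain in Node.js.
const payload = { hello: "world", count: 3, nested: { ok: true } };

// Encode: stringify the object, treat it as UTF-8 bytes, emit base64.
const base64Text = Buffer.from(JSON.stringify(payload), "utf-8").toString("base64");
console.log(base64Text); // the base64-encoded JSON string

// Decode again to verify the round trip matches the original payload.
const decoded = JSON.parse(Buffer.from(base64Text, "base64").toString("utf-8"));
console.log(decoded);
```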
by Tenkay
This workflow compares two lists of objects (List A and List B) using a user-specified key (e.g. email, id, domain) and returns:
- Items common to both lists (based on the key)
- Items only in List A
- Items only in List B

How it works:
- Accepts a JSON input containing:
  - listA: the first list of items
  - listB: the second list of items
  - key: the field name to use for comparison
- Performs a field-based comparison using the specified key
- Returns a structured output:
  - common: items with matching keys (only one version retained)
  - onlyInA: items found only in List A
  - onlyInB: items found only in List B

Example Input:
{
  "key": "email",
  "listA": [
    { "email": "alice@example.com", "name": "Alice" },
    { "email": "bob@example.com", "name": "Bob" }
  ],
  "listB": [
    { "email": "bob@example.com", "name": "Bobby" },
    { "email": "carol@example.com", "name": "Carol" }
  ]
}

Output:
- common: [ { "email": "bob@example.com", "name": "Bob" } ]
- onlyInA: [ { "email": "alice@example.com", "name": "Alice" } ]
- onlyInB: [ { "email": "carol@example.com", "name": "Carol" } ]

Use Cases:
- Deduplicate data between two sources
- Find overlapping records
- Identify new or missing entries across systems

This workflow is useful for internal data auditing, list reconciliation, transaction reconciliation, or pre-processing sync jobs.
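A plain TypeScript equivalent of the comparison, useful for checking the expected output before wiring the workflow into a pipeline:

```typescript
// Sketch of the key-based list comparison the workflow performs.
function compareLists<T extends Record<string, unknown>>(
  listA: T[],
  listB: T[],
  key: string
): { common: T[]; onlyInA: T[]; onlyInB: T[] } {
  const keysA = new Set(listA.map((item) => String(item[key])));
  const keysB = new Set(listB.map((item) => String(item[key])));

  return {
    // Only one version of each common item is retained (here, the List A version).
    common: listA.filter((item) => keysB.has(String(item[key]))),
    onlyInA: listA.filter((item) => !keysB.has(String(item[key]))),
    onlyInB: listB.filter((item) => !keysA.has(String(item[key]))),
  };
}

// compareLists(listA, listB, "email") reproduces the example output above.
```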