by Zacharia Kimotho
This workflow generates prompts for AI agents and stores them in Airtable. It starts by receiving a chat message, processes it into a structured prompt, categorizes the prompt, and finally stores it in Airtable.

Setup Instructions

Prerequisites
- **AI model** (e.g., Google Gemini, OpenAI)
- **Airtable base and table**, or another storage tool

Step-by-Step Guide
1. **Clone the workflow**: Copy the provided workflow JSON and import it into your n8n instance.
2. **Configure credentials**: Set up the Google Gemini (PaLM) API credentials and the Airtable Personal Access Token credentials.
3. **Map the Airtable base and table**: Create a copy of the Prompt Library in Airtable, then map the base and table in the Airtable node.
4. **Customize the prompt template**: Edit the 'Create prompt' node to customize the prompt template as needed.

Configuration Options
- **Prompt template**: Customize the prompt template in the 'Create prompt' node to fit your specific use case.
- **Airtable mapping**: Ensure the Airtable base and table are correctly mapped in the Airtable node.

Use Case Examples
This workflow is particularly useful wherever you want to automate the generation and management of AI agent prompts:
- **Rapid prototyping of AI agents**: Quickly generate and test different prompts for AI agents in various applications.
- **Content creation**: Generate prompts for AI models that create blog posts, articles, or social media content.
- **Customer service automation**: Develop prompts for AI-powered chatbots that handle customer inquiries and support requests.
- **Educational tools**: Create prompts for AI tutors or learning assistants.

Industries/Professionals
- **Software development**: Developers building AI-powered applications.
- **Marketing**: Marketers automating content creation and social media management.
- **Customer service**: Customer service managers implementing AI-driven chatbots.
- **Education**: Educators creating AI-based learning tools.

Practical Value
- **Time savings**: Automates the prompt generation process, saving significant time and effort.
- **Improved prompt quality**: Leverages Google Gemini and structured prompt-engineering principles to generate more effective prompts.
- **Centralized prompt management**: Stores prompts in Airtable for easy access, organization, and reuse.

Running and Troubleshooting
- **Running the workflow**: Activate the workflow in n8n, then send a chat message to the webhook URL configured in the "When chat message received" node.
- **Monitoring execution**: Check the execution log in the n8n editor to see the data flowing through each node and identify any errors.
- **Checking for successful completion**: Verify that a new record is created in your Airtable base with the generated prompt, name, and category, and confirm that the "Return results" node sends confirmation of the prompt back to the chat interface.
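For reference, a successful run should produce an Airtable record shaped roughly like this (the column names here are assumptions; match them to your copy of the Prompt Library):

```json
{
  "Name": "Customer Support Agent",
  "Category": "Customer Service",
  "Prompt": "You are a helpful support agent. Greet the customer, identify their issue, and respond concisely..."
}
```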
Troubleshooting Tips
- **API issues**: Ensure that the AI model and Airtable credentials are correctly configured.
- **Data mapping**: Verify that the Airtable base and table are correctly mapped.
- **Prompt template**: Check the prompt template for any errors or inconsistencies.
- **Error: 400 Bad Request in the Google Gemini nodes**
  - Cause: Invalid API key or insufficient permissions.
  - Solution: Double-check your Google Gemini API key and ensure that the API is enabled for your project.
- **Error: the Airtable node fails to create a record**
  - Cause: Invalid Airtable credentials, an incorrect Base ID or Table ID, or mismatched column names.
  - Solution: Verify your Airtable API key, Base ID, Table ID, and column names. Ensure that the data types in n8n match the data types in your Airtable columns.

Follow me on LinkedIn for more.
by Aditya Gaur
Who is this template for?
This template is designed for developers, DevOps engineers, and automation enthusiasts who want to streamline their GitLab merge request process using n8n, a low-code workflow automation tool. It eliminates manual intervention by automating the merging of GitLab branches through API calls.

How it works
1. **Trigger the workflow**: The workflow can be triggered by a webhook, a scheduled event, or a GitLab event (e.g., a new merge request is created or approved).
2. **Fetch merge request details**: n8n makes an API call to GitLab to retrieve the merge request details.
3. **Check merge conditions**: The workflow validates whether the merge request meets predefined conditions (e.g., approvals met, CI/CD pipelines passed).
4. **Perform the merge**: If all conditions are met, n8n sends a request to the GitLab API to merge the branch automatically.

Setup Steps
1. Prerequisites
- An n8n instance (self-hosted or Cloud)
- A GitLab personal access token with API access
- A GitLab repository with merge requests enabled

2. Create the n8n Workflow
- **Set up a trigger**: Choose a trigger node (Webhook, Cron, or GitLab Trigger).
- **Fetch merge request details**: Add an HTTP Request node to call GET /merge_requests/:id from the GitLab API.
- **Validate conditions**: Check whether the merge request has the necessary approvals and ensure CI/CD pipelines have passed.
- **Merge the request**: Use an HTTP Request node to call the PUT /merge_requests/:id/merge API.

3. Test the Workflow
- Create a test merge request.
- Check whether the workflow triggers and merges automatically.
- Debug using n8n logs if needed.

4. Deploy and Monitor
- Deploy the workflow in production.
- Use n8n's monitoring features to track execution.

This template enables seamless GitLab merge automation, improving efficiency and reducing manual work! Note: never hard-code an API token or secret into your HTTP requests; use n8n credentials instead.
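Outside n8n, the two API calls the HTTP Request nodes make look roughly like this (a minimal sketch, assuming a project ID, merge request IID, and a token with API scope; your workflow's exact condition checks may differ):

```js
// Minimal sketch of the GitLab calls behind the two HTTP Request nodes.
const GITLAB = 'https://gitlab.com/api/v4';

async function mergeIfReady(projectId, mrIid, token) {
  const headers = { 'PRIVATE-TOKEN': token };

  // 1. Fetch merge request details (GET /projects/:id/merge_requests/:iid)
  const mr = await fetch(`${GITLAB}/projects/${projectId}/merge_requests/${mrIid}`, { headers })
    .then((r) => r.json());

  // 2. Validate conditions: GitLab reports mergeability and pipeline status on the MR object.
  if (mr.merge_status !== 'can_be_merged' || mr.head_pipeline?.status !== 'success') {
    return { merged: false, reason: mr.merge_status };
  }

  // 3. Perform the merge (PUT /projects/:id/merge_requests/:iid/merge)
  return fetch(`${GITLAB}/projects/${projectId}/merge_requests/${mrIid}/merge`, {
    method: 'PUT',
    headers,
  }).then((r) => r.json());
}
```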
by bangank36
This workflow converts an exported CSV of Squarespace profiles into a Shopify-compatible format for customer import.

How It Works
Clone this Google Sheets template, which includes two sheets:
- **Squarespace Profiles (Input)**: Go to Squarespace Dashboard → Contacts, click the three-dot icon, and select Export all Contacts.
- **Shopify Customers (Output)**: This sheet formats the data to match Shopify's customer import CSV. Import it via Shopify Dashboard → Customers → Import customers by CSV.

The workflow can run on demand or be triggered via webhook.

Via webhook
Set up the Webhook node to expect a POST request, then trigger it with code like this (pseudo-code; replace {webhook-url} with the actual URL):

```js
const csvText = '...'; // contents of profiles_export.csv
const blob = new Blob([csvText], { type: 'text/csv' });

const formData = new FormData();
formData.append('file', blob, 'profiles_export.csv'); // add the file to the FormData

fetch('{webhook-url}', { // replace with your webhook URL
  method: 'POST',
  mode: 'no-cors',
  body: formData,
});
```

The data is processed into the Shopify Customers sheet.

Manual trigger
1. Import the Squarespace profiles into the sheet.
2. Run the workflow to convert and populate the Shopify Customers sheet.
3. Once the workflow is done, export the Shopify Customers sheet to CSV and import it into Shopify customers.

Requirements
To use this template, you need:
- Google Sheets API credentials

Google Sheets Setup
Use this sample Google Sheets template to get started quickly.

Who Is This For?
Anyone looking to automate Squarespace contact exports into a Shopify-compatible format, with no more manual conversion!

Explore More Templates
Check out my other n8n templates: 👉 n8n.io/creators/bangank36
by Danger
Ok Google, download "movie name"
I developed this automation to improve my quality of life when handling torrents on my media center.

Goal
Automate the search for a movie based on its name and trigger a download via your transmission-daemon.

Setup
Prerequisites
- Transmission daemon up and running, along with its authentication method
- n8n configured self-hosted, with the ability to add npm packages (best done with docker-compose.yaml)
- Telegram bot credentials [optional]

Configuration
Create a folder where your docker-compose.yaml belongs (n8n_dir) and install the node package:

```bash
cd ~/n8n_dir
npm i torrent-search-api
```

Then configure your docker-compose.yaml file as shown below. You must include all the dependencies of torrent-search-api; this lets you run the new torrent-search node presented in this workflow.

```yaml
version: '3.3'
services:
  n8n:
    container_name: n8n
    ports:
      - '5678:5678'
    restart: always
    volumes:
      - '~/n8n_dir/.n8n:/home/node/.n8n'
      - '~/n8n_dir/node_modules/@tootallnate:/usr/local/lib/node_modules/@tootallnate'
      - '~/n8n_dir/node_modules/accepts:/usr/local/lib/node_modules/accepts'
      - '~/n8n_dir/node_modules/agent-base:/usr/local/lib/node_modules/agent-base'
      - '~/n8n_dir/node_modules/ajv:/usr/local/lib/node_modules/ajv'
      - '~/n8n_dir/node_modules/ansi-styles:/usr/local/lib/node_modules/ansi-styles'
      - '~/n8n_dir/node_modules/asn1:/usr/local/lib/node_modules/asn1'
      - '~/n8n_dir/node_modules/assert:/usr/local/lib/node_modules/assert'
      - '~/n8n_dir/node_modules/assert-plus:/usr/local/lib/node_modules/assert-plus'
      - '~/n8n_dir/node_modules/ast-types:/usr/local/lib/node_modules/ast-types'
      - '~/n8n_dir/node_modules/asynckit:/usr/local/lib/node_modules/asynckit'
      - '~/n8n_dir/node_modules/aws-sign2:/usr/local/lib/node_modules/aws-sign2'
      - '~/n8n_dir/node_modules/aws4:/usr/local/lib/node_modules/aws4'
      - '~/n8n_dir/node_modules/base64-js:/usr/local/lib/node_modules/base64-js'
      - '~/n8n_dir/node_modules/batch:/usr/local/lib/node_modules/batch'
      - '~/n8n_dir/node_modules/bcrypt-pbkdf:/usr/local/lib/node_modules/bcrypt-pbkdf'
      - '~/n8n_dir/node_modules/bluebird:/usr/local/lib/node_modules/bluebird'
      - '~/n8n_dir/node_modules/boolbase:/usr/local/lib/node_modules/boolbase'
      - '~/n8n_dir/node_modules/brotli:/usr/local/lib/node_modules/brotli'
      - '~/n8n_dir/node_modules/bytes:/usr/local/lib/node_modules/bytes'
      - '~/n8n_dir/node_modules/caseless:/usr/local/lib/node_modules/caseless'
      - '~/n8n_dir/node_modules/chalk:/usr/local/lib/node_modules/chalk'
      - '~/n8n_dir/node_modules/cheerio:/usr/local/lib/node_modules/cheerio'
      - '~/n8n_dir/node_modules/cloudscraper:/usr/local/lib/node_modules/cloudscraper'
      - '~/n8n_dir/node_modules/co:/usr/local/lib/node_modules/co'
      - '~/n8n_dir/node_modules/color-convert:/usr/local/lib/node_modules/color-convert'
      - '~/n8n_dir/node_modules/color-name:/usr/local/lib/node_modules/color-name'
      - '~/n8n_dir/node_modules/combined-stream:/usr/local/lib/node_modules/combined-stream'
      - '~/n8n_dir/node_modules/component-emitter:/usr/local/lib/node_modules/component-emitter'
      - '~/n8n_dir/node_modules/content-disposition:/usr/local/lib/node_modules/content-disposition'
      - '~/n8n_dir/node_modules/content-type:/usr/local/lib/node_modules/content-type'
      - '~/n8n_dir/node_modules/cookiejar:/usr/local/lib/node_modules/cookiejar'
      - '~/n8n_dir/node_modules/core-util-is:/usr/local/lib/node_modules/core-util-is'
      - '~/n8n_dir/node_modules/css-select:/usr/local/lib/node_modules/css-select'
      - '~/n8n_dir/node_modules/css-what:/usr/local/lib/node_modules/css-what'
      - '~/n8n_dir/node_modules/dashdash:/usr/local/lib/node_modules/dashdash'
      - '~/n8n_dir/node_modules/data-uri-to-buffer:/usr/local/lib/node_modules/data-uri-to-buffer'
      - '~/n8n_dir/node_modules/debug:/usr/local/lib/node_modules/debug'
      - '~/n8n_dir/node_modules/deep-is:/usr/local/lib/node_modules/deep-is'
      - '~/n8n_dir/node_modules/degenerator:/usr/local/lib/node_modules/degenerator'
      - '~/n8n_dir/node_modules/delayed-stream:/usr/local/lib/node_modules/delayed-stream'
      - '~/n8n_dir/node_modules/delegates:/usr/local/lib/node_modules/delegates'
      - '~/n8n_dir/node_modules/depd:/usr/local/lib/node_modules/depd'
      - '~/n8n_dir/node_modules/destroy:/usr/local/lib/node_modules/destroy'
      - '~/n8n_dir/node_modules/dom-serializer:/usr/local/lib/node_modules/dom-serializer'
      - '~/n8n_dir/node_modules/domelementtype:/usr/local/lib/node_modules/domelementtype'
      - '~/n8n_dir/node_modules/domhandler:/usr/local/lib/node_modules/domhandler'
      - '~/n8n_dir/node_modules/domutils:/usr/local/lib/node_modules/domutils'
      - '~/n8n_dir/node_modules/ecc-jsbn:/usr/local/lib/node_modules/ecc-jsbn'
      - '~/n8n_dir/node_modules/ee-first:/usr/local/lib/node_modules/ee-first'
      - '~/n8n_dir/node_modules/emitter-component:/usr/local/lib/node_modules/emitter-component'
      - '~/n8n_dir/node_modules/enqueue:/usr/local/lib/node_modules/enqueue'
      - '~/n8n_dir/node_modules/enstore:/usr/local/lib/node_modules/enstore'
      - '~/n8n_dir/node_modules/entities:/usr/local/lib/node_modules/entities'
      - '~/n8n_dir/node_modules/error-inject:/usr/local/lib/node_modules/error-inject'
      - '~/n8n_dir/node_modules/escape-html:/usr/local/lib/node_modules/escape-html'
      - '~/n8n_dir/node_modules/escape-string-regexp:/usr/local/lib/node_modules/escape-string-regexp'
      - '~/n8n_dir/node_modules/escodegen:/usr/local/lib/node_modules/escodegen'
      - '~/n8n_dir/node_modules/esprima:/usr/local/lib/node_modules/esprima'
      - '~/n8n_dir/node_modules/estraverse:/usr/local/lib/node_modules/estraverse'
      - '~/n8n_dir/node_modules/esutils:/usr/local/lib/node_modules/esutils'
      - '~/n8n_dir/node_modules/extend:/usr/local/lib/node_modules/extend'
      - '~/n8n_dir/node_modules/extsprintf:/usr/local/lib/node_modules/extsprintf'
      - '~/n8n_dir/node_modules/fast-deep-equal:/usr/local/lib/node_modules/fast-deep-equal'
      - '~/n8n_dir/node_modules/fast-json-stable-stringify:/usr/local/lib/node_modules/fast-json-stable-stringify'
      - '~/n8n_dir/node_modules/fast-levenshtein:/usr/local/lib/node_modules/fast-levenshtein'
      - '~/n8n_dir/node_modules/file-uri-to-path:/usr/local/lib/node_modules/file-uri-to-path'
      - '~/n8n_dir/node_modules/forever-agent:/usr/local/lib/node_modules/forever-agent'
      - '~/n8n_dir/node_modules/form-data:/usr/local/lib/node_modules/form-data'
      - '~/n8n_dir/node_modules/format-parser:/usr/local/lib/node_modules/format-parser'
      - '~/n8n_dir/node_modules/formidable:/usr/local/lib/node_modules/formidable'
      - '~/n8n_dir/node_modules/fs-extra:/usr/local/lib/node_modules/fs-extra'
      - '~/n8n_dir/node_modules/ftp:/usr/local/lib/node_modules/ftp'
      - '~/n8n_dir/node_modules/get-uri:/usr/local/lib/node_modules/get-uri'
      - '~/n8n_dir/node_modules/getpass:/usr/local/lib/node_modules/getpass'
      - '~/n8n_dir/node_modules/graceful-fs:/usr/local/lib/node_modules/graceful-fs'
      - '~/n8n_dir/node_modules/har-schema:/usr/local/lib/node_modules/har-schema'
      - '~/n8n_dir/node_modules/har-validator:/usr/local/lib/node_modules/har-validator'
      - '~/n8n_dir/node_modules/has-flag:/usr/local/lib/node_modules/has-flag'
      - '~/n8n_dir/node_modules/htmlparser2:/usr/local/lib/node_modules/htmlparser2'
      - '~/n8n_dir/node_modules/http-context:/usr/local/lib/node_modules/http-context'
      - '~/n8n_dir/node_modules/http-errors:/usr/local/lib/node_modules/http-errors'
      - '~/n8n_dir/node_modules/http-incoming:/usr/local/lib/node_modules/http-incoming'
      - '~/n8n_dir/node_modules/http-outgoing:/usr/local/lib/node_modules/http-outgoing'
      - '~/n8n_dir/node_modules/http-proxy-agent:/usr/local/lib/node_modules/http-proxy-agent'
      - '~/n8n_dir/node_modules/http-signature:/usr/local/lib/node_modules/http-signature'
      - '~/n8n_dir/node_modules/https-proxy-agent:/usr/local/lib/node_modules/https-proxy-agent'
      - '~/n8n_dir/node_modules/iconv-lite:/usr/local/lib/node_modules/iconv-lite'
      - '~/n8n_dir/node_modules/inherits:/usr/local/lib/node_modules/inherits'
      - '~/n8n_dir/node_modules/ip:/usr/local/lib/node_modules/ip'
      - '~/n8n_dir/node_modules/is-browser:/usr/local/lib/node_modules/is-browser'
      - '~/n8n_dir/node_modules/is-typedarray:/usr/local/lib/node_modules/is-typedarray'
      - '~/n8n_dir/node_modules/is-url:/usr/local/lib/node_modules/is-url'
      - '~/n8n_dir/node_modules/isarray:/usr/local/lib/node_modules/isarray'
      - '~/n8n_dir/node_modules/isobject:/usr/local/lib/node_modules/isobject'
      - '~/n8n_dir/node_modules/isstream:/usr/local/lib/node_modules/isstream'
      - '~/n8n_dir/node_modules/jsbn:/usr/local/lib/node_modules/jsbn'
      - '~/n8n_dir/node_modules/json-schema:/usr/local/lib/node_modules/json-schema'
      - '~/n8n_dir/node_modules/json-schema-traverse:/usr/local/lib/node_modules/json-schema-traverse'
      - '~/n8n_dir/node_modules/json-stringify-safe:/usr/local/lib/node_modules/json-stringify-safe'
      - '~/n8n_dir/node_modules/jsonfile:/usr/local/lib/node_modules/jsonfile'
      - '~/n8n_dir/node_modules/jsprim:/usr/local/lib/node_modules/jsprim'
      - '~/n8n_dir/node_modules/koa-is-json:/usr/local/lib/node_modules/koa-is-json'
      - '~/n8n_dir/node_modules/levn:/usr/local/lib/node_modules/levn'
      - '~/n8n_dir/node_modules/lodash:/usr/local/lib/node_modules/lodash'
      - '~/n8n_dir/node_modules/lodash.assignin:/usr/local/lib/node_modules/lodash.assignin'
      - '~/n8n_dir/node_modules/lodash.bind:/usr/local/lib/node_modules/lodash.bind'
      - '~/n8n_dir/node_modules/lodash.defaults:/usr/local/lib/node_modules/lodash.defaults'
      - '~/n8n_dir/node_modules/lodash.filter:/usr/local/lib/node_modules/lodash.filter'
      - '~/n8n_dir/node_modules/lodash.flatten:/usr/local/lib/node_modules/lodash.flatten'
      - '~/n8n_dir/node_modules/lodash.foreach:/usr/local/lib/node_modules/lodash.foreach'
      - '~/n8n_dir/node_modules/lodash.map:/usr/local/lib/node_modules/lodash.map'
      - '~/n8n_dir/node_modules/lodash.merge:/usr/local/lib/node_modules/lodash.merge'
      - '~/n8n_dir/node_modules/lodash.pick:/usr/local/lib/node_modules/lodash.pick'
      - '~/n8n_dir/node_modules/lodash.reduce:/usr/local/lib/node_modules/lodash.reduce'
      - '~/n8n_dir/node_modules/lodash.reject:/usr/local/lib/node_modules/lodash.reject'
      - '~/n8n_dir/node_modules/lodash.some:/usr/local/lib/node_modules/lodash.some'
      - '~/n8n_dir/node_modules/lru-cache:/usr/local/lib/node_modules/lru-cache'
      - '~/n8n_dir/node_modules/media-typer:/usr/local/lib/node_modules/media-typer'
      - '~/n8n_dir/node_modules/methods:/usr/local/lib/node_modules/methods'
      - '~/n8n_dir/node_modules/mime:/usr/local/lib/node_modules/mime'
      - '~/n8n_dir/node_modules/mime-db:/usr/local/lib/node_modules/mime-db'
      - '~/n8n_dir/node_modules/mime-types:/usr/local/lib/node_modules/mime-types'
      - '~/n8n_dir/node_modules/monotonic-timestamp:/usr/local/lib/node_modules/monotonic-timestamp'
      - '~/n8n_dir/node_modules/ms:/usr/local/lib/node_modules/ms'
      - '~/n8n_dir/node_modules/negotiator:/usr/local/lib/node_modules/negotiator'
      - '~/n8n_dir/node_modules/netmask:/usr/local/lib/node_modules/netmask'
      - '~/n8n_dir/node_modules/nth-check:/usr/local/lib/node_modules/nth-check'
      - '~/n8n_dir/node_modules/oauth-sign:/usr/local/lib/node_modules/oauth-sign'
      - '~/n8n_dir/node_modules/object-assign:/usr/local/lib/node_modules/object-assign'
      - '~/n8n_dir/node_modules/on-finished:/usr/local/lib/node_modules/on-finished'
      - '~/n8n_dir/node_modules/optionator:/usr/local/lib/node_modules/optionator'
      - '~/n8n_dir/node_modules/pac-proxy-agent:/usr/local/lib/node_modules/pac-proxy-agent'
      - '~/n8n_dir/node_modules/pac-resolver:/usr/local/lib/node_modules/pac-resolver'
      - '~/n8n_dir/node_modules/parseurl:/usr/local/lib/node_modules/parseurl'
      - '~/n8n_dir/node_modules/performance-now:/usr/local/lib/node_modules/performance-now'
      - '~/n8n_dir/node_modules/prelude-ls:/usr/local/lib/node_modules/prelude-ls'
      - '~/n8n_dir/node_modules/process-nextick-args:/usr/local/lib/node_modules/process-nextick-args'
      - '~/n8n_dir/node_modules/promise-polyfill:/usr/local/lib/node_modules/promise-polyfill'
      - '~/n8n_dir/node_modules/proxy-agent:/usr/local/lib/node_modules/proxy-agent'
      - '~/n8n_dir/node_modules/proxy-from-env:/usr/local/lib/node_modules/proxy-from-env'
      - '~/n8n_dir/node_modules/psl:/usr/local/lib/node_modules/psl'
      - '~/n8n_dir/node_modules/punycode:/usr/local/lib/node_modules/punycode'
      - '~/n8n_dir/node_modules/qs:/usr/local/lib/node_modules/qs'
      - '~/n8n_dir/node_modules/querystring:/usr/local/lib/node_modules/querystring'
      - '~/n8n_dir/node_modules/raw-body:/usr/local/lib/node_modules/raw-body'
      - '~/n8n_dir/node_modules/readable-stream:/usr/local/lib/node_modules/readable-stream'
      - '~/n8n_dir/node_modules/request:/usr/local/lib/node_modules/request'
      - '~/n8n_dir/node_modules/request-promise:/usr/local/lib/node_modules/request-promise'
      - '~/n8n_dir/node_modules/request-promise-core:/usr/local/lib/node_modules/request-promise-core'
      - '~/n8n_dir/node_modules/request-x-ray:/usr/local/lib/node_modules/request-x-ray'
      - '~/n8n_dir/node_modules/safe-buffer:/usr/local/lib/node_modules/safe-buffer'
      - '~/n8n_dir/node_modules/safer-buffer:/usr/local/lib/node_modules/safer-buffer'
      - '~/n8n_dir/node_modules/selectn:/usr/local/lib/node_modules/selectn'
      - '~/n8n_dir/node_modules/setprototypeof:/usr/local/lib/node_modules/setprototypeof'
      - '~/n8n_dir/node_modules/sliced:/usr/local/lib/node_modules/sliced'
      - '~/n8n_dir/node_modules/smart-buffer:/usr/local/lib/node_modules/smart-buffer'
      - '~/n8n_dir/node_modules/socks:/usr/local/lib/node_modules/socks'
      - '~/n8n_dir/node_modules/socks-proxy-agent:/usr/local/lib/node_modules/socks-proxy-agent'
      - '~/n8n_dir/node_modules/source-map:/usr/local/lib/node_modules/source-map'
      - '~/n8n_dir/node_modules/sshpk:/usr/local/lib/node_modules/sshpk'
      - '~/n8n_dir/node_modules/statuses:/usr/local/lib/node_modules/statuses'
      - '~/n8n_dir/node_modules/stealthy-require:/usr/local/lib/node_modules/stealthy-require'
      - '~/n8n_dir/node_modules/stream-to-string:/usr/local/lib/node_modules/stream-to-string'
      - '~/n8n_dir/node_modules/string-format:/usr/local/lib/node_modules/string-format'
      - '~/n8n_dir/node_modules/string_decoder:/usr/local/lib/node_modules/string_decoder'
      - '~/n8n_dir/node_modules/superagent:/usr/local/lib/node_modules/superagent'
      - '~/n8n_dir/node_modules/superagent-proxy:/usr/local/lib/node_modules/superagent-proxy'
      - '~/n8n_dir/node_modules/supports-color:/usr/local/lib/node_modules/supports-color'
      - '~/n8n_dir/node_modules/toidentifier:/usr/local/lib/node_modules/toidentifier'
      - '~/n8n_dir/node_modules/torrent-search-api:/usr/local/lib/node_modules/torrent-search-api'
      - '~/n8n_dir/node_modules/tough-cookie:/usr/local/lib/node_modules/tough-cookie'
      - '~/n8n_dir/node_modules/tslib:/usr/local/lib/node_modules/tslib'
      - '~/n8n_dir/node_modules/tunnel-agent:/usr/local/lib/node_modules/tunnel-agent'
      - '~/n8n_dir/node_modules/tweetnacl:/usr/local/lib/node_modules/tweetnacl'
      - '~/n8n_dir/node_modules/type-check:/usr/local/lib/node_modules/type-check'
      - '~/n8n_dir/node_modules/type-is:/usr/local/lib/node_modules/type-is'
      - '~/n8n_dir/node_modules/universalify:/usr/local/lib/node_modules/universalify'
      - '~/n8n_dir/node_modules/unpipe:/usr/local/lib/node_modules/unpipe'
      - '~/n8n_dir/node_modules/uri-js:/usr/local/lib/node_modules/uri-js'
      - '~/n8n_dir/node_modules/util:/usr/local/lib/node_modules/util'
      - '~/n8n_dir/node_modules/util-deprecate:/usr/local/lib/node_modules/util-deprecate'
      - '~/n8n_dir/node_modules/uuid:/usr/local/lib/node_modules/uuid'
      - '~/n8n_dir/node_modules/vary:/usr/local/lib/node_modules/vary'
      - '~/n8n_dir/node_modules/verror:/usr/local/lib/node_modules/verror'
      - '~/n8n_dir/node_modules/word-wrap:/usr/local/lib/node_modules/word-wrap'
      - '~/n8n_dir/node_modules/wrap-fn:/usr/local/lib/node_modules/wrap-fn'
      - '~/n8n_dir/node_modules/x-ray:/usr/local/lib/node_modules/x-ray'
      - '~/n8n_dir/node_modules/x-ray-crawler:/usr/local/lib/node_modules/x-ray-crawler'
      - '~/n8n_dir/node_modules/x-ray-parse:/usr/local/lib/node_modules/x-ray-parse'
      - '~/n8n_dir/node_modules/x-ray-scraper:/usr/local/lib/node_modules/x-ray-scraper'
      - '~/n8n_dir/node_modules/xregexp:/usr/local/lib/node_modules/xregexp'
      - '~/n8n_dir/node_modules/yallist:/usr/local/lib/node_modules/yallist'
      - '~/n8n_dir/node_modules/yieldly:/usr/local/lib/node_modules/yieldly'
    image: 'n8nio/n8n:latest-rpi'
    environment:
      - N8N_BASIC_AUTH_ACTIVE=true
      - N8N_BASIC_AUTH_USER=username
      - N8N_BASIC_AUTH_PASSWORD=your_secret_n8n_password
      - EXECUTIONS_DATA_PRUNE=true
      - EXECUTIONS_DATA_MAX_AGE=120
      - EXECUTIONS_TIMEOUT=300
      - EXECUTIONS_TIMEOUT_MAX=500
      - GENERIC_TIMEZONE=Europe/Berlin
      - NODE_FUNCTION_ALLOW_EXTERNAL=torrent-search-api
```

Once configured this way, run n8n and create a new workflow, copying the one proposed.

Configure the workflow
Transmission
To send commands to Transmission you must pass its Basic Auth. To do so, open the Start download node and edit the Credentials, then choose the same credentials in the Start download new token node. This automation calls Transmission twice because of a security mechanism in Transmission that prevents single-shot commands from being triggered; performing the request twice bypasses this CSRF protection (https://en.wikipedia.org/wiki/Cross-site_request_forgery). We use the X-Transmission-Session-Id header returned by the first request to authenticate the second request (a standalone sketch of this handshake appears at the end of this section).

Telegram
For the workflow to work as expected, create a Telegram bot and configure the nodes (Torrent not found and Telegram1) to send your message once the workflow completes. Here's an easy guide to follow: https://docs.n8n.io/nodes/n8n-nodes-base.telegram/
In those nodes you should also configure the Chat ID; you may use your Telegram username, or use a bot to retrieve your ID. You can chat with useridinfobot, which sends you your ID.

Ok Google automation
Since we do not yet have an n8n mobile client that can trigger automations via Google Assistant, I use an IFTTT automation to trigger the webhook. Connect your IFTTT account with Google Assistant and pick the trigger "Say a phrase with a text ingredient", as in the picture below, and configure it with phrases such as:
- scarica $ -> download $
- metti in download $ -> put in download $
or any other trigger phrase you want. Then configure your server to trigger the n8n webhook.
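For reference, the Transmission CSRF handshake described above looks roughly like this outside n8n (a minimal sketch; the host, port, and credentials are assumptions to adapt to your daemon):

```js
// Sketch of Transmission's two-request CSRF handshake (assumes the daemon at
// localhost:9091 with Basic Auth enabled).
const RPC_URL = 'http://localhost:9091/transmission/rpc';
const auth = 'Basic ' + Buffer.from('user:password').toString('base64');

async function addTorrent(magnetLink) {
  const body = JSON.stringify({
    method: 'torrent-add',
    arguments: { filename: magnetLink },
  });

  // The first request is rejected (HTTP 409) but returns the session id header.
  const first = await fetch(RPC_URL, {
    method: 'POST',
    headers: { Authorization: auth },
    body,
  });
  const sessionId = first.headers.get('X-Transmission-Session-Id');

  // The second request succeeds with the session id attached.
  const second = await fetch(RPC_URL, {
    method: 'POST',
    headers: { Authorization: auth, 'X-Transmission-Session-Id': sessionId },
    body,
  });
  return second.json();
}
```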
Conclusion
In conclusion, this is a fully working automation that integrates a node library into n8n and provides an easy trigger for a complex operation.

Security concerns
Giving anyone the ability to trigger a download may be problematic, since it opens the door to unwanted torrent or malware downloads. You may decide to authenticate the webhook request by passing another field in the body with a token shared between the two endpoints. Moreover, the torrent-search-api library and its dependencies have some vulnerabilities that you may want to keep off your media center; these will hopefully be patched in a future release of the library. This is just an interesting proof of concept.

Quality of the download
You may want to introduce another block between the webhook trigger and the torrent search to sanity-check the movie name detected by Google Assistant; it sometimes misinterprets speech, and you could end up downloading potentially copyrighted material. Please use this automation only for free and open-source movies and music.
by Roger Filomeno
Introduction
This workflow template helps you determine whether a Twitch user's stream is currently live or offline.

Setup Instructions
The Document node holds the sample Twitch username you wish to check. Adapt it in your workflow by replacing it with a chain that contains the Twitch username you want to check. This value is passed to the GraphQL node query as $('Document').item.json.twitch, so make sure to change this based on your workflow.

How it Works
The important nodes here are the GraphQL and IF nodes. The GraphQL node queries the Twitch API, and the output returns a document with the stream property. The IF node then checks whether this property has a value: if it is null, the user is offline; otherwise, the user is live.

Common Use Cases
You can combine this with other workflow templates to post live-stream alerts to Twitter/X, Bluesky, and Discord via webhooks, notifying your community to join your stream. You may also use an LLM node to write a custom alert based on the value of the title property.

How to adjust this template
If you want to check a list of Twitch channels, simply exchange the Document set node at the beginning with your list of channels. For more information on the GraphQL output, please see the official Twitch API documentation: Get Streams
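As an illustration of the query mentioned above, one common shape returns a stream object only while the channel is live (a sketch only; the exact fields and endpoint depend on the Twitch API your GraphQL node targets, so mirror the query already in the template):

```graphql
query {
  user(login: "some_twitch_username") {
    stream {
      id
      title
      type
    }
  }
}
```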
by ist00dent
This n8n template enables you to instantly retrieve detailed geolocation information for any given IP address by simply sending a webhook request. Leverage the power of IP-API.com to gain insights into user locations, personalize experiences, or enhance security protocols within your automated workflows.

🔧 How it works
- **Receive IP Webhook**: This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing an ip property with the IP address you wish to look up.
- **Get IP Geolocation**: This node makes an HTTP GET request to the IP-API.com service, passing the IP address from your webhook. The API responds with a comprehensive JSON object detailing the IP's location (country, city, region), ISP, organization, and more.
- **Respond with Geolocation Data**: This node sends the full geolocation data received from IP-API.com back to the service that initiated the webhook.

👤 Who is it for?
This workflow is ideal for:
- **Marketing & Sales Teams**: Personalize website content, offers, or ads based on a visitor's geographic location. Tailor email campaigns by region.
- **Customer Support**: Quickly identify a customer's location to provide more localized or relevant assistance.
- **Security & Fraud Detection**: Analyze incoming connection IPs to identify suspicious activity, block known malicious regions, or flag potential fraud.
- **Analytics & Reporting**: Augment your analytics data with geographical insights about your users or traffic.
- **Developers & Integrators**: Easily add IP lookup functionality to custom applications, internal tools, or monitoring systems.
- **Content Delivery Networks (CDNs)**: Route users to the closest servers for faster content delivery (though advanced CDNs usually handle this automatically).

📑 Data Structure
When you trigger the webhook, send a POST request with a JSON body structured as follows (replace the value with the IP address you want to look up):

```json
{ "ip": "8.8.8.8" }
```

The workflow will return a JSON response similar to this (data will vary based on the IP):

```json
{
  "status": "success",
  "country": "United States",
  "countryCode": "US",
  "region": "VA",
  "regionName": "Virginia",
  "city": "Ashburn",
  "zip": "20149",
  "lat": 39.0437,
  "lon": -77.4875,
  "timezone": "America/New_York",
  "isp": "Google LLC",
  "org": "Google Public DNS",
  "as": "AS15169 Google LLC",
  "query": "8.8.8.8"
}
```

⚙️ Setup Instructions
1. **Import Workflow**: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path**: Double-click the Receive IP Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /ip-lookup).
3. **Activate Workflow**: Save and activate the workflow.

📝 Tips
This workflow, while simple, is a powerful building block. Here's how you can make it even more useful:
- **Conditional Logic**: Add IF nodes after "Get IP Geolocation" to create conditional branches. For example: if countryCode is 'CN' or 'RU', send an alert to your security team; if city is 'New York', route the request to a specific sales representative.
- **Data Enrichment**: Integrate this workflow into larger automations. For instance, when a new sign-up occurs, pass the IP address to this workflow, then save the returned geolocation data (country, city, ISP) alongside the user profile in your CRM or database. For e-commerce, use the location data to pre-fill shipping fields or suggest local currency/language.
- **Logging & Analytics**: Store the lookup results in a spreadsheet (Google Sheets), database (PostgreSQL, Airtable), or logging service. This can help you track where your users are coming from over time.
- **Rate Limiting**: IP-API.com has rate limits on its free tier. If you anticipate high usage, consider adding a Delay node or implementing a caching mechanism with a Cache node to avoid hitting limits. For heavy use, you might need to upgrade to a paid plan.
- **Dynamic Response**: Instead of returning the full JSON, you could use a Function node to extract only specific pieces of information (e.g., just the country and city) and return a more concise response.
- **Input Validation**: For robust production use, add a Function node after the webhook to validate that the incoming ip value is indeed a valid IP address. If it's not, you can return an error message to the caller.
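A minimal sketch of that validation step in an n8n Function/Code node (the field location body.ip is an assumption based on the webhook payload above):

```js
// n8n Code node sketch: validate the incoming `ip` field before the lookup.
const ip = $json.body?.ip ?? $json.ip;

// Simple IPv4 check: four dot-separated octets, each 0-255.
const octets = typeof ip === 'string' ? ip.split('.') : [];
const isValidIPv4 =
  octets.length === 4 &&
  octets.every((o) => /^\d+$/.test(o) && Number(o) <= 255);

if (!isValidIPv4) {
  // Surface a clear error instead of querying IP-API with junk input.
  throw new Error(`Invalid IPv4 address: ${ip}`);
}

return [{ json: { ip } }];
```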
by Promovaweb
When you collect leads from a form, you need to format the incoming data, such as the lead's name, and also apply basic validation to the email entered. Luckily for us, n8n offers all of these features through simple expressions that can easily be applied to the data. This workflow shows how you can process your lead data before saving it in Mautic.

How it Works
1. The workflow receives data from a WordPress form.
2. It applies name formatting and basic validation to the email.
3. It creates the contact in Mautic.
4. If the email is invalid, it adds the lead to the Do Not Contact list.

Setup Steps
1. Set up credentials when you first open the workflow. You'll need a Mautic account.
2. Configure a form in WordPress (Elementor, WPForms, etc.) and point it at the n8n webhook address.
3. Map the fields you need to apply formatting and validation to.
4. After testing your workflow, swap the Test URL for the Production URL in your form integration and activate your workflow.
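As an illustration of the expressions mentioned above, n8n's built-in string helpers can handle both steps (a sketch; the field names body.name and body.email are assumptions that depend on your form's payload):

```js
// Capitalize the lead's name:
{{ $json.body.name.trim().toTitleCase() }}

// Validate the email; returns true/false, which an IF node can use
// to route invalid emails to the Do Not Contact branch:
{{ $json.body.email.trim().isEmail() }}
```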
by Dmytro
AI-Powered Product Assistant for E-commerce
Transform your online store's customer service with an intelligent AI assistant that automatically processes customer inquiries, searches your product database, and provides personalized responses about product availability, pricing, and specifications. Perfect for shoe stores, fashion retailers, and any business with an extensive product catalog, this workflow eliminates manual customer service while increasing response speed and accuracy.

How it works
1. The customer sends a product inquiry via webhook (Instagram DM, website chat, or messaging app).
2. AI extracts the key product details (brand, model, size, color) from the natural-language text.
3. The system searches your Google Sheets product database with smart filtering.
4. AI generates a friendly, personalized response with availability, pricing, and stock information.
5. An automatic response is sent back to the customer with product details or alternatives.

Example exchange:
- Customer inquiry: "Do you have Nike Air Max 40 size?"
- AI response: "Nike Air Max 90, size 40 - in stock 3 pieces, price 120$"

Set up steps
1. **Prepare your product database**: Create a Google Sheet with the columns Brand, Model, Size, Color, Price, Quantity.
2. **Configure AI settings**: Connect the OpenAI API for natural language processing.
3. **Set up the webhook endpoint**: Configure the trigger for your messaging platform (Instagram, Telegram, website chat).
4. **Test with sample inquiries**: Verify the AI correctly parses requests and finds products.
5. **Deploy and monitor**: Launch your automated assistant and track performance.

Time investment: 30-45 minutes of setup; works immediately with any product catalog up to 1000+ items.
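For the example inquiry above, the extraction step (step 2 of "How it works") would produce a structured object along these lines (a sketch; the exact keys depend on how you configure the AI node):

```json
{
  "brand": "Nike",
  "model": "Air Max",
  "size": "40",
  "color": null
}
```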
by Agent Studio
Overview
This workflow provides Retell agent builders with a simple way to populate dynamic variables using n8n. The workflow fetches user information from a Google Sheet based on the caller's phone number and sends it back to Retell. It is based on Retell's Inbound Webhook Call.

Retell is a service that lets you create Voice Agents that handle voice calls simply, based on a prompt or a conversational flow builder.

Who is it for
Builders of Retell Voice Agents who want to make their agents more personalized.

Prerequisites
- Have a Retell AI account.
- Create a Retell agent.
- Purchase a phone number and associate it with your agent.
- Create a Google Sheet; for example, make a copy of this one. Your Google Sheet must have at least one column with the phone number. The remaining columns will be used to populate your Retell agent's dynamic variables.

All fields are returned to Retell as strings (variables are replaced as text).

How it works
1. The webhook call is received from Retell. We filter the call using their whitelisted IP address.
2. It extracts data from the webhook call and uses it to retrieve the user from Google Sheets.
3. It formats the data in the response to match Retell's expected format.
4. Retell uses this data to replace dynamic variables in the prompts.

How to use it
See the description for screenshots!
1. Set the webhook name (keep it as POST).
2. Copy the Webhook URL (e.g., https://your-instance.app.n8n.cloud/webhook/retell-dynamic-variables) and paste it into Retell's interface: navigate to "Phone Numbers", click on the phone number, and enable "Add an inbound webhook".
3. In your prompt (e.g., the "welcome message"), use the variable with this syntax: {{variable_name}} (see Retell's documentation). These variables will be dynamically replaced by the data in your Google Sheet.

Notes
- In Google Sheets, the phone number must start with '+'.
- Phone numbers must be formatted like the example: with the +, extension, and no spaces.
- You can use any database; just replace Google Sheets with your own, making sure to keep the phone number formatting consistent.
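For reference, the response sent back to Retell's inbound call webhook is shaped roughly like this (a sketch based on Retell's documented format; the variable names are assumptions matching your sheet's columns):

```json
{
  "call_inbound": {
    "dynamic_variables": {
      "customer_name": "Jane Doe",
      "last_order_status": "shipped"
    }
  }
}
```

👉 Reach out to us if you're interested in analysing your Retell Agent conversations.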
by Ai Lin ⌘
🎯 What It Does
This project lets you talk to Siri (via Apple Shortcuts) to record or query your daily spending. The Shortcut sends your message to an n8n Webhook, which uses AI to decide whether it should write or read finance data, then replies with a human-friendly message, all powered by n8n + AI + Google Sheets.

🌐 PART 1: n8n Setup
1. Create a Webhook Trigger in n8n
- Add a node: Webhook
- Set HTTP Method: POST
- Set Path: siri-finance
- Enable "Respond to Webhook" = ✅

2. Add an AI Agent node (e.g., OpenAI, Ollama, Gemini)
- Use a system prompt like: You are a finance assistant. Decide if the user wants to record or read transactions. If it's recording, return a JSON object with date, type, name, amount, and expense/income. If it's reading, return a date range and type (Expense/Income). Always reply with a human-friendly summary.
- Input: {{ $json.text }} (from the webhook)
- Output: structured json.output

3. (Optional) Add logic to write to a DB / Supabase / Google Sheets
- Append tool: adds a new row
- Read tool: queries past data

Now your n8n flow is ready!

📱 PART 2: iOS Shortcut Setup
1. Create a new Shortcut
- Name it: 記帳助理 ("Bookkeeping Assistant", or Finance Bot)
- Add Action: Ask for Input
  - Prompt: "Please say what you want to record" (請說出你的記帳內容)
  - Input Type: Text
- Add Action: Get Contents of URL
  - Method: POST
  - URL: https://your-n8n-domain/webhook/siri-finance
  - Headers: Content-Type: application/json
  - Request Body: { "text": "Provided Input" }
  - Replace "Provided Input" with the magic variable → Input Result

2. Show Result
- Add Action: Show Result
- Content: Get Contents of URL

3. Optional: Add "Speak Text"
- If you want Siri to speak the reply back, add Speak Text after Show Result.

✅ Example Usage
- You: "Hey Siri, expense $50, breakfast" (開支$50 早餐)
- Siri: "Expense recorded: item Breakfast, amount $50, saved." (已記錄支出:項目 早餐,金額 $50,已寫入)

Or
- You: "How much did I spend over the past 7 days?" (查一下我過去7日用了幾多錢)
- Siri: "Your total spending over the past 7 days was $7684.64, including: ..."

📦 Files to Share
You can package the following:
- A .shortcut file export
- A sample n8n workflow .json
- An optional Supabase schema / Google Sheet template

💡 Tips for Newcomers
- Keep your Webhook public, but protect it with a token if needed.
- Make sure you handle emoji and newlines safely for iOS compatibility.
- Add logging nodes in n8n to help debug Siri messages.

🗣️ Optional Project Name
"Siri 記帳助理" / "Finance VoiceBot": a simple voice-first way to manage your daily expenses.
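As a closing illustration of the agent output from PART 1, the recording case could return a structured object like this (a sketch only; the field names follow the system prompt above, and the date is whatever the agent infers from the message):

```json
{
  "action": "record",
  "date": "2025-01-15",
  "type": "Expense",
  "name": "Breakfast",
  "amount": 50,
  "reply": "Expense recorded: item Breakfast, amount $50, saved."
}
```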
by CustomJS
This n8n template demonstrates how to download multiple PDF files from public URLs and merge them into a single PDF using the PDF Toolkit from www.customjs.space (@custom-js/n8n-nodes-pdf-toolkit).

Notice
Community nodes can only be installed on self-hosted instances of n8n.

What this workflow does
- **Downloads** each PDF using an HTTP Request node.
- **Populates** the files into an array with n8n's Merge node.
- **Merges** all downloaded PDFs using the Merge PDF node from @custom-js/n8n-nodes-pdf-toolkit.
- **Writes** the final merged PDF to disk.

Requirements
- **Self-hosted** n8n instance
- **CustomJS API key** for merging multiple PDF files
- **PDF files to be merged**, available at public URLs

Workflow Steps
1. **Manual Trigger**: Runs on user interaction.
2. **HTTP Request nodes for PDF download**: Pass the URLs of the PDF files to merge.
3. **Merge node for array population**: Populates the two downloaded files into an array.
4. **Merge PDF files**: Uses the CustomJS node to merge the incoming PDF files into a single PDF. If the size of the PDF files exceeds 6 MB, you can simply pass an array of URLs for the PDF files instead.
5. **Write to Disk**: Saves the merged PDF file.

Usage
Get an API key from CustomJS:
1. Sign up to the CustomJS platform.
2. Navigate to your profile page.
3. Press the "Show" button to get your API key.

Set credentials for the CustomJS API in n8n: copy and paste the API key generated from CustomJS.

Design of the workflow:
- A Manual Trigger to start the workflow.
- Two HTTP Request nodes to download the PDF files.
- A Merge node to populate the files as an array.
- A Merge PDFs node to merge the files.
- A Write to Disk node to save the merged PDF file.

You can replace the trigger and output logic. For example, you can trigger this workflow by calling a webhook and return the result as the webhook response: simply replace the Manual Trigger and Write to Disk nodes.

Perfect for
- Bundling reports or invoices.
- Generating document sets from external sources.
- Automating PDF handling without writing custom code.
by Calistus Christian
Overview
Receive a URL via Webhook, submit it to urlscan.io, wait ~30 seconds for artifacts (e.g., the screenshot), then email a clean summary with links to the result page, screenshot, and API JSON.

What this template does
1. Ingests a URL from a POST request.
2. Submits the URL to urlscan.io and captures the scan UUID.
3. **Waits 30s** to give urlscan time to generate the screenshot and result artifacts.
4. Sends a formatted HTML email via Gmail with all relevant links.

Nodes used
- **Webhook** (POST /urlscan)
- **urlscan.io → Perform a scan**
- **Wait** (30 seconds; configurable)
- **Gmail → Send a message**

Input

```json
{ "url": "https://example.com" }
```
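To trigger the workflow, a POST request like this works (a sketch; replace the host with your n8n instance's URL, keeping the /urlscan path configured above):

```js
// Sketch: submit a URL to the workflow's webhook for scanning.
fetch('https://your-n8n-instance/webhook/urlscan', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ url: 'https://example.com' }),
});
```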