by Jimleuk
Tired of being let down by the Google Drive Trigger? Rather not exhaust system resources by polling every minute? Then this workflow is for you!
Google Drive is a great storage option for automation thanks to its relative simplicity, low cost and readily available integrations. Using Google Drive as a trigger is the next logical step, but many n8n users quickly realise the built-in Google Drive trigger just isn't that reliable. Disaster! The typical workaround is to poll the Google Drive search API at short intervals, but the trade-off is wasted server resources during inactivity. The ideal solution is, of course, push notifications, but they seem quite complicated to implement... or are they? This template demonstrates that setting up Google push notifications for Google Drive file changes actually isn't that hard! With this approach, Google sends a POST request every time something in a drive changes, which solves both reliability of events and efficiency of resources.
How it works
We begin by registering a notification channel (webhook) with the Google Drive API. The two key pieces of information are (a) the webhook URL that notifications will be pushed to and (b) the driveId, because we want to scope to a single location. Good to know: you can register as many channels as you like using HTTP calls, but you have to manage them yourself; there's no Google dashboard for notification channels! The registration data, along with the startPageToken, is saved in workflowStaticData, a convenient persistence mechanism we can use to hold small bits of data between executions.
Now, whenever files or folders are created or updated in our target Google Drive, Google will send push notifications to the webhook trigger in this template. Once triggered, we still need to call Google Drive's Changes.list to get the actual change events that were detected; we can do this with the HTTP Request node (see the sketch at the end of this description). The Changes API also returns the nextPageToken, a marker establishing where to pick up the next batch of changes. It's important that we use this token the next time we request from the Changes API, so we update workflowStaticData with this new value.
Unfortunately, the Changes.list API can't filter change events by folder or action, so be sure to add your own filtering steps to get the files you want. Finally, with the valid change events, optionally fetch the file metadata, which gives you more attributes to play with. For example, you may want to know whether the change event was triggered by n8n itself, in which case you'll want to check the "modifiedByMe" value.
How to use
Start with Step 1: fill in the "Set Variables" node and click the manual Execute Trigger. This creates a single Google Drive notification channel for a specific drive.
Activate the workflow to start receiving events from Google Drive.
To test, perform an action (e.g. create a file) on the target drive. Watch the webhook calls come pouring in!
Once you have the desired events, finish off this template to do something with the changed files.
Requirements
Google Drive credentials. Note this workflow also works on Shared Drives.
Optimising This Workflow
With bulk actions, you'll notice that Google gradually starts to send increasingly large numbers of push notifications, sometimes numbering in the hundreds! For cloud plan users, this could easily exhaust execution limits if lots of changes are made to the same drive daily.
One approach is to implement a throttling mechanism externally to batch events before sending them to n8n. This throttling mechanism is outside the scope of this template but quite easy to achieve with something like Supabase Edge Functions.
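For reference, here is a minimal sketch of the two Drive API calls described above, written in n8n Code node style and assuming you already have an OAuth access token. All constants are placeholders you would supply from the "Set Variables" node; check the current Drive v3 docs before relying on the exact parameters.

```javascript
// Minimal sketch: register a channel, then list changes. Placeholders only.
const { randomUUID } = require('crypto');

const ACCESS_TOKEN = '<oauth-access-token>';
const WEBHOOK_URL = 'https://your-n8n/webhook/drive-changes';
const DRIVE_ID = '<drive-id>';
const BASE = 'https://www.googleapis.com/drive/v3';
const auth = { Authorization: `Bearer ${ACCESS_TOKEN}` };

// 1. Register a notification channel scoped to one shared drive.
//    The returned channel details plus the startPageToken go into
//    workflowStaticData for later runs.
const startPageToken = '<from GET /changes/startPageToken>';
const channel = await fetch(
  `${BASE}/changes/watch?pageToken=${startPageToken}&driveId=${DRIVE_ID}` +
  '&includeItemsFromAllDrives=true&supportsAllDrives=true',
  {
    method: 'POST',
    headers: { ...auth, 'Content-Type': 'application/json' },
    body: JSON.stringify({ id: randomUUID(), type: 'web_hook', address: WEBHOOK_URL }),
  }
).then(r => r.json());

// 2. When the webhook fires, fetch the actual change events.
const changes = await fetch(
  `${BASE}/changes?pageToken=${startPageToken}&driveId=${DRIVE_ID}` +
  '&includeItemsFromAllDrives=true&supportsAllDrives=true',
  { headers: auth }
).then(r => r.json());
// Persist changes.nextPageToken (or newStartPageToken) back into
// workflowStaticData for the next request.
```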
by Gaetano Castaldo
Web-to-Odoo Lead Funnel (UTM-ready)
Create crm.lead records in Odoo from any webform via a secure webhook. The workflow validates required fields, resolves UTMs by name (source, medium, campaign) and writes standard lead fields in Odoo. Clean, portable, and production-ready.
Key features
✅ Secure Webhook with Header Auth (x-webhook-token)
✅ Required fields validation (firstname, lastname, email)
✅ UTM lookup by name (utm.source, utm.medium, utm.campaign)
✅ Clean consolidation before create (name, contact_name, email_from, phone, description, type, UTM IDs)
✅ Clear HTTP responses: 200 success / 400 bad request
Prerequisites
Odoo with Leads enabled (CRM → Settings → Leads)
**Odoo API Key** for your user (use it as the password)
n8n Odoo credentials: URL, DB name, Login, API Key
**Public URL** for the webhook (ngrok/Cloudflare/reverse proxy). Ensure WEBHOOK_URL / N8N_HOST / N8N_PROTOCOL / N8N_PORT are consistent
**Header Auth secret** (e.g., x-webhook-token: <your-secret>)
How it works
Ingest – The Webhook receives a POST at /webhook(-test)/lead-webform with Header Auth.
Validate – An IF node checks required fields; if any are missing, it responds with 400 Bad Request.
UTM lookup – Three Odoo getAll queries fetch IDs by name: utm.source → source_id, utm.medium → medium_id, utm.campaign → campaign_id. If a record is not found, the corresponding ID remains null.
Consolidate – Merge + Code nodes produce a single clean object (see the sketch at the end of this description): { name, contact_name, email_from, phone, description, type: "lead", campaign_id, source_id, medium_id }
Create in Odoo – Odoo node (crm.lead → create) writes the lead with standard fields + UTM Many2one IDs.
Respond – Success node returns 200 with { status: "ok", lead_id }.
Payload (JSON)
Required: firstname, lastname, email
Optional: phone, notes, source, medium, campaign
{ "firstname": "John", "lastname": "Doe", "email": "john.doe@example.com", "phone": "+393331234567", "notes": "Wants a demo", "source": "Ads", "medium": "Website", "campaign": "Spring 2025" }
Quick test
curl -X POST "https://<host>/webhook-test/lead-webform" \
 -H "Content-Type: application/json" \
 -H "x-webhook-token: <secret>" \
 -d '{"firstname":"John","lastname":"Doe","email":"john@ex.com", "phone":"+39333...", "notes":"Demo", "source":"Ads","medium":"Website","campaign":"Spring 2025"}'
Notes
Recent Odoo versions do not use the mobile field on leads/partners: use phone instead.
Keep secrets and credentials out of the template; users set their own after import.
If you want to auto-create missing UTM records, add an IF after each getAll and a create on utm.*.
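A minimal sketch of the consolidation Code node, assuming the webhook body matches the payload above. The lookup node names ('Lookup utm.source', etc.) are illustrative; rename them to match your own workflow.

```javascript
// Consolidation Code node: build the single clean object for crm.lead → create.
const body = $('Webhook').first().json.body;

return [{
  json: {
    name: `Web lead: ${body.firstname} ${body.lastname}`,
    contact_name: `${body.firstname} ${body.lastname}`,
    email_from: body.email,
    phone: body.phone ?? null,
    description: body.notes ?? '',
    type: 'lead',
    // Null when the getAll lookup found no match by name.
    source_id: $('Lookup utm.source').first()?.json?.id ?? null,
    medium_id: $('Lookup utm.medium').first()?.json?.id ?? null,
    campaign_id: $('Lookup utm.campaign').first()?.json?.id ?? null,
  },
}];
```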
by Atta
Stop manually searching Google for sales leads. Start listening to the internet. This advanced workflow automatically identifies, qualifies, and enriches high-value leads by searching for their digital footprints (i.e., specific technology use or public directories). It uses a robust Find/Create/Conditional Update database pattern to prevent duplicates and ensure you only spend credits on enriching incomplete records. The workflow provides a fully persistent lead record, updating the same Airtable row as new data is found across multiple search steps.
✨ Key Features
Persistent Data Integrity: Uses a dedicated Loop Over Items structure to run the Find/Create/Update logic sequentially for every lead, guaranteeing no data is lost or duplicated.
Conditional Enrichment: A smart gate checks the Airtable record: if the high-value email field is empty, the workflow proceeds to the expensive scraping steps. If it is already complete, it skips the scrape.
Targeted Scraping: Executes precise Google Dorks (via Decodo) to find initial leads, then targets the specific Contact Us page for deep email extraction.
Database-as-a-State-Machine: Airtable acts as the single source of truth, logging the initial lead status and updating the same row across several enrichment phases.
Final Output: Delivers the fully enriched lead data (Domain, Primary Email, Contact Page URL) to a final notification channel.
⚙️ How it Works (The Find/Create/Update Loop)
Search & Filter: The workflow is manually triggered and uses the Config variables to execute a wide-scope Google Search via Decodo. The results are filtered into a clean array of unique domains (see the sketch at the end of this description).
Loop & Check: The Loop Over Items node starts. Inside, the Airtable Read node checks the database for the current lead's domain.
Create/Update: If the lead is NEW, the workflow creates a record (Airtable: Create Lead). If the lead EXISTS, the record is updated (Airtable: Update Lead).
Data Merger: The Data Merger: ID Finalizer node consolidates the workflow, ensuring the unique Airtable Record ID is passed to the next step, regardless of whether the lead was created or updated.
Conditional Enrichment: The If: Enrichment Needed? node checks the existing Primary Email status. If it's empty, the item proceeds to the deep scraping pipeline (Decodo: Email Search → Decodo: Scrape Contact Page).
Final Update: The final node updates the Airtable record with the high-quality email address found by the deep scrape.
📥 Decodo Node Installation
The Decodo node is used three times in this workflow for precision scraping and searching.
Find the Node: Click the + button on your n8n canvas.
Search: Search for the Decodo node and select it.
Credentials: When configuring the first Decodo node, use your API key (obtained with the 80% discount coupon).
🎁 Exclusive Deal for n8n Users
To run this workflow, you'll need a robust scraping provider. We have secured a massive discount for Decodo users:
Get 80% OFF the 23k Advanced Scraping API plan.
Coupon Code: ATTAN8N
Sign Up Here: Claim 80% Discount on Decodo
🛠️ Setup Instructions
This template requires specific node configuration and Airtable fields.
Credentials: Obtain API keys for Decodo (using the coupon above) and Airtable.
Airtable Setup (Schema): Create an Airtable base with a 'Leads' table.
It must include these fields for mapping:
Domain (Single Line Text - Primary Field)
Primary Email (Email)
Contact Page URL (URL)
Source URL (URL)
Lead Type (Single Select: Paid Ad Lead, Organic Lead)
Status (Single Select: New Lead, Updated, Enrichment Complete)
Global Configuration: Open the Config: Set Search Params node. Customize the following fields:
tech_footprint: e.g., "We use Klaviyo"
target_industry: e.g., site:promarketer.ca
➕ How to Adapt the Template
**Change Database:** Replace the Airtable nodes with Postgres, Notion, or Google Sheets for logging, adapting the field mappings.
**Final Notification:** Add a Slack or Gmail node to alert the sales team immediately upon successful enrichment.
**Multi-Step Enrichment:** Integrate a service like Hunter.io or Clearbit to find key employee names and titles before the final database update.
**Adjust Scoring:** Add an If node after the deep scrape to set a Lead Score based on whether a direct email (sales@) was found versus a general contact page link.
**Add AI Lead Scoring:** Integrate a Gemini or OpenAI node after the deep scraping step to assign an "AI Score" (1-100) based on lead quality (e.g., domain authority, quality of the extracted email), before the final update.
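A minimal sketch of the Search & Filter step's domain extraction, assuming each Decodo search result arrives as an item with a URL at `json.url` (adjust the path to match the actual response shape):

```javascript
// Reduce raw search results to a clean array of unique domains.
const domains = new Set();

for (const item of $input.all()) {
  try {
    const host = new URL(item.json.url).hostname.replace(/^www\./, '');
    domains.add(host);
  } catch (e) {
    // Skip malformed URLs rather than failing the whole run.
  }
}

return [...domains].map(domain => ({ json: { domain } }));
```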
by Sènato Julien KOUKPONOU
Who’s it for
This workflow is ideal for community managers, event organizers, and businesses that regularly manage multiple WhatsApp groups. If you have a growing list of invitation codes stored in Google Sheets, this automation helps you automatically join groups, update statuses, and track results without manual work.
How it works / What it does
The workflow connects Google Sheets with WhatsApp through an automation sequence:
Reads the list of invitation codes from a Google Sheet.
Processes the first 50 unused codes per run (see the sketch at the end of this description).
Validates group links via a Fetch groups node.
Attempts to join each group using the Join group node.
Updates the sheet with the join status (success or failure).
Logs successful joins in a tracking list for easy follow-up.
This gives you a fully automated way to manage WhatsApp group invitations while keeping your data organized in Google Sheets.
How to set up
Prepare a Google Sheet with invitation codes and a status column.
Configure the Google Sheets node with read and write access.
Set up your fetch-groups and join-group credentials.
Adjust the Schedule Trigger to define how often the workflow should run.
Test with a few sample codes before scaling.
Requirements
n8n (self-hosted or cloud).
Google Sheets API credentials.
WhatsApp integration via Evolution API or another community node (self-hosted only).
How to customize the workflow
Change the batch size (default: 50 codes per run).
Add error handling or retry logic for invalid links.
Send real-time notifications (Slack, email, or Telegram) after each join.
Extend your Google Sheet schema with extra details (e.g., group category, campaign, date joined).
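A minimal sketch of the batching step, assuming each sheet row arrives as an item with `code` and `status` fields (rename to match your own columns):

```javascript
// Take only the first 50 codes that have not been used yet.
const BATCH_SIZE = 50;

const pending = $input.all()
  .filter(item => !item.json.status || item.json.status === 'unused');

return pending.slice(0, BATCH_SIZE);
```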
by Artem Makarov
About this template
This template demonstrates how to trace the observations per execution ID in Langfuse via the ingestion API.
Good to know
Endpoint: https://cloud.langfuse.com/api/public/ingestion
Auth is a Generic Credential Type with Basic Auth: username = your_public_key, password = your_secret_key.
How it works
**Trigger**: the workflow is executed by another workflow after an AI run finishes (input parameter execution_id).
**Remove duplicates**: ensures we only process each execution_id once (optional but recommended).
**Wait to get execution data**: delays 60-80 seconds so totals and per-step metrics are available.
**Get execution**: fetches workflow metadata and token totals.
**Code: structure execution data**: normalizes your run into an array of perModelRuns with model, tokens, latency, and text previews.
**Split Out** → **Loop Over Items**: iterates each run step.
**Code: prepare JSON for Langfuse**: builds a batch with:
trace-create (stable id trace-<executionId>, grouped into session-<workflowId>)
generation-create (model, input/output, usage, timings from latency)
A sketch of this batch payload follows at the end of this description.
**HTTP Request to Langfuse**: posts the batch. Optional short Wait between sends.
Requirements
Langfuse Cloud project and API keys
n8n instance with the HTTP Request node
Customizing
Add span-create and set parentObservationId on the generation to nest it under spans.
Add scores or feedback later via score-create.
Replace the sessionId strategy (per workflow, per user, etc.).
If some steps don't produce tokens, compute and set usage yourself before sending.
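A sketch of the "prepare JSON for Langfuse" Code node: one trace-create event plus one generation-create event per model run. The event shapes follow the Langfuse ingestion batch format as I understand it; verify field names against the current API docs, and adjust the perModelRuns field names to your own structuring node.

```javascript
// Build the ingestion batch for one run step.
const { randomUUID } = require('crypto');

const executionId = $json.execution_id;
const run = $json; // one perModelRuns entry: { model, input, output, tokens, latency }
const now = new Date();

const batch = [
  {
    id: randomUUID(),
    type: 'trace-create',
    timestamp: now.toISOString(),
    body: {
      id: `trace-${executionId}`,            // stable per execution
      sessionId: `session-${$workflow.id}`,  // grouped per workflow
      name: $workflow.name,
    },
  },
  {
    id: randomUUID(),
    type: 'generation-create',
    timestamp: now.toISOString(),
    body: {
      traceId: `trace-${executionId}`,
      name: run.model,
      model: run.model,
      input: run.input,
      output: run.output,
      usage: { input: run.tokens?.prompt, output: run.tokens?.completion },
      startTime: new Date(now.getTime() - (run.latency ?? 0)).toISOString(),
      endTime: now.toISOString(),
    },
  },
];

return [{ json: { batch } }];
```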
by Nexio_2000
This n8n template demonstrates how to export all icon metadata from an Iconfinder account into an organized format with previews, names, iconset names and tags. It generates HTML and CSV outputs.
Good to know
Iconfinder does not provide a built-in feature for contributors to export all icon data at once, which motivated the creation of this workflow.
The workflow exports all iconsets for the selected user account and can handle large collections.
Preview image URLs are extracted at a consistent size (e.g., 128x128) for easy viewing.
Basic icon metadata, including tags and iconset names, is included for reference or further automation.
How it works
The workflow fetches all iconsets from your Iconfinder account.
It loops through all your iconsets, handling pagination automatically if an iconset contains more than 100 icons.
Each icon is processed to retrieve its metadata, including name, tags, preview image URLs, and the name of the iconset it belongs to.
An HTML file with a preview table and a CSV file with all icon details are generated (see the sketch at the end of this description).
How to use
Retrieve your User ID – a dedicated node in the workflow fetches your Iconfinder user ID. This ensures the workflow knows which contributor account to access.
Set up API access – the workflow includes a setup node where you provide your Iconfinder API key. This node passes the authorization token to all subsequent HTTP Request nodes, so you don't need to enter it manually multiple times.
Trigger the workflow – start it manually or attach a different trigger, such as a webhook or schedule.
Export outputs – the workflow generates an HTML file with preview images and a CSV file containing all metadata. Both files are ready for download or further processing.
Requirements
Iconfinder account with an API key.
Customising this workflow
You can adjust the preview size or choose which metadata to include in the HTML and CSV outputs.
Combine with other workflows to automate asset cataloging.
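A minimal sketch of the CSV-building step, one row per icon. The field names (name, tags, previewUrl, iconsetName) are illustrative; map them to whatever your metadata-extraction node actually outputs.

```javascript
// Build a CSV from the collected icon metadata items.
const escape = v => `"${String(v ?? '').replace(/"/g, '""')}"`;

const header = ['Name', 'Iconset', 'Tags', 'Preview URL'].join(',');
const rows = $input.all().map(({ json }) =>
  [json.name, json.iconsetName, (json.tags ?? []).join('; '), json.previewUrl]
    .map(escape)
    .join(',')
);

const csv = [header, ...rows].join('\n');
return [{ json: { csv } }];
```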
by Maksudur Rahman
How it works
Trigger: when a new meeting is booked in Cal.com.
Date Check: the workflow calculates how many days remain before the meeting date (see the sketch at the end of this description).
Email Scheduling: depending on the time left, it sends a series of pre-written "warm-up" emails using Gmail, designed to set expectations and build interest in your offering.
Timing Control: emails are automatically spaced out to ensure natural engagement before the meeting.
How to set up
Connect your Cal.com API key to authenticate and trigger on new bookings.
Connect your Google account to enable Gmail email sending.
Customize the email messages in the Set or Send Email nodes to match your brand voice and tone.
Test with internal bookings to confirm correct timing and delivery before activating for clients.
Requirements
Cal.com account with API access.
Google account connected to the Gmail node.
Active n8n instance (self-hosted or cloud).
How to customize
Adjust email spacing or timing by modifying the Wait nodes.
Edit the email copy for different purposes (e.g., sales, onboarding, consultation).
Add conditional logic to send different warm-up sequences for specific meeting types or durations.
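A minimal sketch of the date check, assuming the Cal.com trigger exposes the meeting start as an ISO string at `json.payload.startTime` (adjust the path to your trigger's actual output):

```javascript
// Days remaining until the booked meeting.
const start = new Date($json.payload.startTime);
const msPerDay = 24 * 60 * 60 * 1000;

const daysUntilMeeting = Math.floor((start - new Date()) / msPerDay);

return [{ json: { ...$json, daysUntilMeeting } }];
```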
by Paul Abraham
This n8n template demonstrates how to automatically generate accurate subtitles from any video and optionally translate them into other languages. By combining FFmpeg, OpenAI Whisper, and LibreTranslate, this workflow turns video audio into ready-to-use .srt subtitle files that can be delivered via email.
Use cases
Auto-generate subtitles for training or educational videos
Translate videos into multiple languages for global reach
Create accessibility-friendly content with minimal effort
Build a backend for media platforms to process subtitles automatically
Good to know
This workflow requires a self-hosted n8n instance since it uses the Execute Command node.
FFmpeg is used for audio extraction and must be installed on the host machine.
OpenAI Whisper (Local) is used for transcription, providing highly accurate speech-to-text results.
LibreTranslate is used for translating subtitles into other languages.
How it works
Webhook Trigger – starts when a video URL is received.
Download Video – fetches the video file from the provided link.
Extract Audio (FFmpeg) – separates the audio from the video file.
Run Whisper (Local) – transcribes the extracted audio into text subtitles (see the command sketch at the end of this description).
Read SRT File – loads the generated .srt subtitle file.
Merge Paths – combines the original and translated subtitle flows.
Translate Subtitles (LibreTranslate) – translates the .srt file into the target language.
Write Translated SRT – creates a translated .srt file for delivery.
Send a Message (Gmail) – sends the final subtitle file (original or translated) via email.
How to use
Clone this workflow into your self-hosted n8n instance.
Ensure FFmpeg and Whisper are installed and available via your server's shell path.
Add your LibreTranslate service credentials for translation.
Configure Gmail (or another email service) to send subtitle files.
Trigger the workflow by sending a video URL to the webhook, and receive subtitle files in your inbox.
Requirements
Self-hosted n8n instance
FFmpeg installed and available on the server
OpenAI Whisper (Local) installed and callable via the command line
LibreTranslate service with API credentials
Gmail (or any email integration) for delivery
Customising this workflow
Replace Gmail with Slack, Telegram, or Drive uploads for flexible delivery.
Switch LibreTranslate for DeepL or Google Translate for higher-quality translations.
Add post-processing steps such as formatting .srt files or embedding subtitles back into the video.
Use the workflow as a foundation for a multi-language subtitle automation pipeline.
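A sketch of the two Execute Command steps, shown here as a standalone Node.js script. The file paths and Whisper model size are assumptions; in the workflow these are plain shell commands inside Execute Command nodes.

```javascript
// Extract audio with FFmpeg, then transcribe it with the openai-whisper CLI.
const { execSync } = require('child_process');

const video = '/tmp/input.mp4';
const audio = '/tmp/audio.wav';

// Extract mono 16 kHz audio, a format Whisper handles well.
execSync(`ffmpeg -y -i ${video} -vn -ac 1 -ar 16000 ${audio}`);

// Transcribe to an .srt file next to the audio.
execSync(`whisper ${audio} --model small --output_format srt --output_dir /tmp`);
// Result: /tmp/audio.srt, ready for the Read SRT File step.
```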
by Easy8.ai
Sync New Calendly Bookings to Easy Redmine CRM with Comment, Sales Activity, and Outlook Email
Intro/Overview
This workflow automatically syncs newly scheduled Calendly meetings into Easy Redmine CRM. When a meeting is booked via Calendly, it finds the associated lead in Easy Redmine CRM using the invitee's email address and logs both a comment and a sales activity in the lead's record. In addition, the workflow sends an email via Microsoft Outlook, ensuring the account manager knows about the meeting.
Ideal for sales teams and customer success managers, this workflow keeps your CRM and calendar updated with every new meeting, with no manual entry required.
How it works
Calendly Trigger
Watches for invitee.created events (newly booked meetings)
Authenticates with Calendly via OAuth2
Get ID from email
Sends a GET request to Easy Redmine CRM to search for a lead using the invitee's email
Extracts the lead ID, required for further CRM actions
Add Comment
Posts a comment to the lead's record in Easy Redmine CRM
The comment includes the invitee's full name, the scheduled date/time of the meeting, and the meeting description
Sales Activity POST
Creates a new sales activity in Easy Redmine CRM under the same lead
Includes structured details: the invitee's name, the meeting date and time, and a description of the meeting
Send an email
Sends a notification email via Microsoft Outlook
Maps the name, start time, and description using dynamic values from the Calendly booking
Ensures the account manager is informed about the scheduled meeting
How to Use
Import the workflow
Copy or download the JSON of this workflow
In your n8n editor, click Import Workflow and paste/upload the JSON
Set up credentials
Calendly API: use OAuth2 credentials and ensure proper scopes for reading event data. Follow n8n's setup guide: https://docs.n8n.io/integrations/builtin/credentials/calendly/
Easy Redmine CRM: add credentials using an API token or OAuth2, depending on your CRM setup
Microsoft Outlook: set up OAuth2 credentials and connect your Outlook email
Assign and configure nodes
Calendly Trigger: confirm it's set to listen for invitee.created
Get ID from email: update the GET request URL to match your Easy Redmine CRM lead search endpoint, and ensure it uses the invitee's email from the trigger
Add Comment & Sales Activity POST: use your Easy Redmine CRM's correct endpoints and map dynamic fields like lead ID, name, meeting time, and description
Send email notification: set up your preferred Outlook email and map meeting details into the email body (e.g. name, time, description)
Timezone and formatting
If needed, use a Code node to convert the meeting time if your CRM or Outlook expects a specific format or timezone (a sketch follows at the end of this description)
Test the workflow
Schedule a test meeting via Calendly using an email tied to an existing CRM lead
Check Easy Redmine CRM to confirm: a comment was added, and a sales activity was created under the correct lead
Check Outlook to confirm an email notification was received
Example Use Cases
Sales Enablement: log every discovery or demo call automatically, helping reps track all activity in one place, and block time in calendars without double-booking risk
Customer Success: keep onboarding and check-in meetings logged, improving visibility across the team, and ensure calendar events are created for every scheduled session
Lead Qualification: identify engaged leads based on bookings and trigger follow-up sequences
Requirements
Calendly account with API access
Easy Redmine CRM user with API access
Microsoft Outlook account with email access
Customization (Optional)
Add Filters: use IF or Switch nodes to handle only specific event types (e.g., demo calls vs intro calls)
Extend Comments: include additional data from Calendly, like the meeting location or answers to custom questions
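A minimal sketch of the timezone/format conversion Code node, assuming the Calendly trigger exposes the meeting start as an ISO string at `json.payload.scheduled_event.start_time` (adjust the path and target timezone to your setup):

```javascript
// Convert the Calendly start time to a human-readable local string.
const startIso = $json.payload.scheduled_event.start_time;

const formatted = new Date(startIso).toLocaleString('en-GB', {
  timeZone: 'Europe/Prague', // whatever your CRM/Outlook expects
  dateStyle: 'medium',
  timeStyle: 'short',
});

return [{ json: { ...$json, meetingTimeFormatted: formatted } }];
```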
by Adrian Bent
Part two of the Indeed Job Scraper, Filter, and Enrichment workflow. This workflow takes information about the scraped and filtered Indeed job listings (collected via Apify and stored in Google Sheets) and generates a customized, five-line email icebreaker that implies the rest of the email is personalized. Personalized IJSFE (Indeed Job Scraper For Enrichment). ++ I am an engineering student, so I love my acronyms.
Benefits
Instant Icebreaker Generation - convert hours of research, copywriting and personalization into seconds, automatically
Live Integration - generate and send personalized icebreakers whenever, wherever
Virtually Complete Automation - from research into the company and job description to a personalized response, this workflow does it in a click
Professional Presentation - because the chatbot has context about the company and the job listing, it generates an icebreaker that makes the reader think real research was done
How It Works
Google Sheets Search: the Google Sheets node fetches all rows where the icebreaker column is empty. Each row is returned as an item containing information about the company and the job listing.
AI Personalization: uses sophisticated GPT-4 prompting to convert a bundle of information about a job posting and company into a customized, five-line personalized email icebreaker, applying a consistent, casual tone to seem more human-written.
Database Update: updates all rows fetched in the search, writing only the icebreaker column with the new personalized icebreaker. Each item is returned as a row containing information about the company, the job listing and the icebreaker.
Required Template Setup
Google Sheets Template: create a Google Sheet with these columns:
jobUrl - unique identifier for job listings
title - position title
descriptionText - description of the job listing
hiringDemand/isHighVolumeHiring - are they hiring at high volume?
hiringDemand/isUrgentHire - are they hiring with high urgency?
isRemote - is this job remote?
jobType/0 - job type: in person, remote, part-time, etc.
companyCeo/name - CEO name collected from Tavily's search
icebreaker - AI-generated icebreaker for each job listing
scrapedCeo - CEO name collected from the Apify scraper
email - email listed on the job listing
companyName - name of the company that posted the job
companyDescription - description of the company that posted the job
companyLinks/corporateWebsite - website of the company that posted the job
companyNumEmployees - number of employees the company listed
location/country - location where the job takes place
salary/salaryText - salary on the job listing
Setup Instructions:
Google Sheets Search & Update Setup:
Create a new Google Sheet with these column headers in the first row. Name the sheet whatever you please.
Connect your Google Sheets OAuth credentials in n8n.
Update the document ID in the workflow nodes.
The search logic in the first Google Sheets node relies on the ID column for icebreaker generation, so this structure is essential for the workflow to function correctly.
Feel free to reach out for additional help or clarification at my Gmail: terflix45@gmail.com, and I'll get back to you as soon as I can.
AI Icebreaker Generation Setup:
Configure the OpenAI API for sophisticated proposal writing.
Implement example-based training with input/output pairs for more consistent output.
Set up JSON formatting for structure (personally, I think JSON is easier to handle as an output; see the sketch at the end of this description).
Setup Steps:
Search & Fetch Rows Setup
Create a Google Sheets database with the provided column structure.
Connect Google Sheets OAuth credentials.
Configure the filter on the get rows node to only include rows with empty icebreaker columns.
Set up AI Personalization
Add OpenAI API credentials for personalized icebreaker generation.
Customize the AI prompts for your specific niche, requirements or interests.
Update Google Sheets Setup
Remember to map all items to their respective columns based on the row number.
All fields in the update sheets node should have the same value as in the first sheets node, with the icebreaker field taking the ChatGPT output as its value.
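A minimal sketch of parsing the model's JSON-formatted output before the sheet update. It assumes the prompt asks the model for {"icebreaker": "..."} and that the OpenAI node output exposes the text at `json.message.content`; both are assumptions to adjust to your node version.

```javascript
// Parse the JSON icebreaker out of the model response, with a raw-text fallback.
return $input.all().map(item => {
  let icebreaker = '';
  try {
    icebreaker = JSON.parse(item.json.message.content).icebreaker ?? '';
  } catch (e) {
    // Model returned non-JSON; keep the raw text so the row isn't lost.
    icebreaker = item.json.message?.content ?? '';
  }
  return { json: { ...item.json, icebreaker } };
});
```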
by Lucas Peyrin
How it works
This workflow is your automated sales assistant, designed to intelligently qualify incoming leads and route them to the most appropriate follow-up channel. It uses the powerful BANT (Budget, Authority, Need, Timing) framework, powered by Google Gemini AI, to score leads as 'hot', 'mid', or 'cold', ensuring your sales team focuses on the most promising opportunities.
Here's a step-by-step breakdown:
Lead Capture: a public Form Trigger collects essential lead information, including their name, email, what they want to build, their budget, desired start time, and job role. These questions are specifically designed to gather BANT data.
AI Lead Scoring: the collected data is sent to Google Gemini. A detailed prompt instructs the AI to act as a Lead Scoring Expert, evaluating each BANT component individually and then assigning an overall 'hot', 'mid', or 'cold' score based on predefined criteria.
Intelligent Routing: a Switch node acts as the central router. Based on the AI's 'hot', 'mid', or 'cold' score, the workflow directs the lead down one of three distinct follow-up paths (a sketch of a defensive score check appears at the end of this description).
Hot Leads (Calendar Booking): for highly qualified 'hot' leads, the workflow immediately redirects them to your calendar booking link, making it easy for them to schedule a direct conversation.
Mid Leads (WhatsApp Engagement): for 'mid' priority leads, Google Gemini generates a personalized, pre-filled WhatsApp message summarizing their inquiry. The lead is then redirected to a WhatsApp chat with your sales team, allowing for quick, informal engagement.
Cold Leads (Nurturing Email): for 'cold' leads who might be in an early research phase, Google Gemini crafts a helpful, non-salesy follow-up email. This email provides valuable resources (like templates or community links) and is sent via Gmail, keeping them engaged without pressure.
Set up steps
Setup time: ~10-15 minutes
This workflow requires connecting your Google AI and Gmail accounts, and customizing several nodes to fit your sales process.
Get Your Google AI API Key:
Visit Google AI Studio at aistudio.google.com/app/apikey.
Click "Create API key in new project" and copy the key.
In the workflow, select the Score Lead node. Click the Credential dropdown and select + Create New Credential. Paste your key into the API Key field and Save.
Repeat this for the Write Placeholder WA Message and Write Follow up Email nodes, selecting the credential you just created.
Connect Your Gmail Account:
Select the Send Follow up Email with Gmail node.
Click the Credential dropdown and select + Create New Credential to connect your Google account. Follow the prompts to grant n8n access.
Customize Lead Scoring Criteria:
Go to the Score Lead node. In the Text parameter, carefully review and adapt the BANT Criteria Mapping and Scoring Logic to align with your specific sales process and ideal customer profile. This is crucial for accurate lead qualification.
Configure Follow-up Channels:
Hot Leads: select the Calendar Booking Link node. Update the redirectUrl parameter with your personal or team's calendar booking link (e.g., Calendly, Chili Piper).
Mid Leads: select the Phone Number node. Set the whatsapp_phone value to your company's WhatsApp phone number (e.g., +15551234567). You can also customize the pre-filled WhatsApp message by adjusting the prompt in the Write Placeholder WA Message node.
Cold Leads: select the Redirect to Website node.
Update the redirectUrl parameter to your company's main website or a relevant resource page (e.g., a free templates page, a blog post). You can also customize the email content and resources shared by adjusting the prompt in the Write Follow up Email node.
Activate and Test:
Activate the workflow using the toggle at the top right.
Go to the Lead Contact Form node and click the "Open Form URL" button.
Submit several test applications with different answers (e.g., one "hot" lead, one "mid", one "cold") to ensure the AI scores them correctly and they are routed to the appropriate follow-up action.
Start qualifying and engaging your leads more effectively!
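A minimal sketch of a defensive normalization step you could place between the AI scorer and the Switch node: coerce whatever Gemini returns into exactly 'hot', 'mid' or 'cold', defaulting to 'cold' so unexpected output lands in the nurturing path rather than being dropped. The input field name (`json.text`) is an assumption.

```javascript
// Normalize the AI's free-text score for the Switch node.
const raw = String($json.text ?? '').toLowerCase();

const score = ['hot', 'mid', 'cold'].find(s => raw.includes(s)) ?? 'cold';

return [{ json: { ...$json, score } }];
```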
by Masaki Go
About This Template
This workflow automatically fetches the Nikkei 225 closing price every weekday and sends a formatted message to a list of users on LINE. This is perfect for individuals or teams who need to track the market's daily performance without manual data checking.
How It Works
Schedule Trigger: runs the workflow automatically every weekday at 4 PM JST (Tokyo time), just after the market closes.
Get Data: an HTTP Request node fetches the latest Nikkei 225 data (closing price, change, %) from a data API.
Prepare Payload: a Code node formats this data into a user-friendly message and prepares the JSON payload for the LINE Messaging API, including a list of user IDs (see the sketch at the end of this description).
Send to LINE: an HTTP Request node sends the formatted message to all specified users via the LINE multicast API endpoint.
Who It’s For
Anyone who wants to receive daily stock market alerts.
Teams that need to share financial data internally.
Developers looking for a simple example of an API-to-LINE workflow.
Requirements
An n8n account.
A LINE Official Account & Messaging API access token.
An API endpoint for Nikkei 225 data. (The one in the template is a temporary example.)
Setup Steps
Add LINE Credentials: in the "Send to LINE via HTTP" node, edit the "Authorization" header to include your own LINE Messaging API Bearer token.
Add User IDs: in the "Prepare LINE API Payload" (Code) node, edit the userIds array to add all the LINE user IDs you want to send messages to.
Update Data API: the URL in the "Get Nikkei 225 Data" node is a temporary example. Replace it with your own persistent API URL (e.g., from a public provider or your own server).
Customization Options
**Change Schedule:** edit the "Every Weekday at 4 PM JST" node to run at a different time. (Note: 4 PM JST is 07:00 UTC, which is what the cron expression 0 7 * * 1-5 means.)
**Change Message Format:** edit the message variable inside the "Prepare LINE API Payload" (Code) node to change the text of the LINE message.
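A minimal sketch of the "Prepare LINE API Payload" Code node. The data field names (close, change, changePercent) are placeholders for whatever your Nikkei data API actually returns; the body shape follows the LINE multicast format (`to` array of user IDs, `messages` array).

```javascript
// Build the LINE multicast payload from the fetched Nikkei data.
const data = $json; // output of the Get Nikkei 225 Data node

const userIds = ['U1234...', 'U5678...']; // replace with real LINE user IDs

const message =
  `Nikkei 225 close: ${data.close}\n` +
  `Change: ${data.change} (${data.changePercent}%)`;

return [{
  json: {
    to: userIds,
    messages: [{ type: 'text', text: message }],
  },
}];
```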