by Le Nguyen
PDF Invoice Extractor (AI)

**End-to-end pipeline:** Watch Drive ➜ Download PDF ➜ OCR text ➜ AI normalize to JSON ➜ Upsert Buyer (Account) ➜ Create Opportunity ➜ Map Products ➜ Create OLI via Composite API ➜ Archive to OneDrive.

Node by node (what it does & key setup)

**1) Google Drive Trigger**
- **Purpose:** Fire when a new file appears in a specific Google Drive folder.
- **Key settings:** Event: fileCreated · Folder ID: your Google Drive folder ID · Polling: everyMinute · Creds: googleDriveOAuth2Api
- **Output:** Metadata `{ id, name, ... }` for the new file.

**2) Download File From Google**
- **Purpose:** Get the file binary for processing and archiving.
- **Key settings:** Operation: download · File ID: `={{ $json.id }}` · Creds: googleDriveOAuth2Api
- **Output:** Binary (default key: data) and the original metadata.

**3) Extract from File**
- **Purpose:** Extract text from the PDF (OCR as needed) for AI parsing.
- **Key settings:** Operation: pdf · OCR: enable for scanned PDFs (in options)
- **Output:** JSON with OCR text at `{{ $json.text }}`.

**4) Message a model (AI JSON Extractor)**
- **Purpose:** Convert OCR text into a strict, normalized JSON array (invoice schema).
- **Key settings:** Node: @n8n/n8n-nodes-langchain.openAi · Model: gpt-4.1 (or gpt-4.1-mini) · Message role: system (the strict prompt; references `{{ $json.text }}`) · jsonOutput: true · Creds: openAiApi
- **Output (per item):** `$.message.content` → the parsed JSON (ensure it is an array).

**5) Create or update an account (Salesforce)**
- **Purpose:** Upsert the Buyer as an Account using an external ID.
- **Key settings:** Resource: account · Operation: upsert · External Id Field: tax_id__c · External Id Value: `={{ $json.message.content.buyer.tax_id }}` · Name: `={{ $json.message.content.buyer.name }}` · Creds: salesforceOAuth2Api
- **Output:** Account record (captures Id) for the downstream Opportunity.

**6) Create an opportunity (Salesforce)**
- **Purpose:** Create an Opportunity linked to the Buyer (Account).
- **Key settings:** Resource: opportunity · Name: `={{ $('Message a model').item.json.message.content.invoice.code }}` · Close Date: `={{ $('Message a model').item.json.message.content.invoice.issue_date }}` · Stage: Closed Won · Amount: `={{ $('Message a model').item.json.message.content.summary.grand_total }}` · AccountId: `={{ $json.id }}` (from the Upsert Account output) · Creds: salesforceOAuth2Api
- **Output:** Opportunity Id for OLI creation.

**7) Build SOQL (Code / JS)**
- **Purpose:** Collect unique product codes from the AI JSON and build a SOQL query for PricebookEntry by Pricebook2Id.
- **Key settings:** pricebook2Id (hardcoded in the script), e.g. 01sxxxxxxxxxxxxxxx · Source lines: `$('Message a model').first().json.message.content.products`
- **Output:** `{ soql, codes }`

**8) Query PricebookEntries (Salesforce)**
- **Purpose:** Fetch PricebookEntry.Id for each Product2.ProductCode.
- **Key settings:** Resource: search · Query: `={{ $json.soql }}` · Creds: salesforceOAuth2Api
- **Output:** Items with Id and Product2.ProductCode (used for mapping).

**9) Code in JavaScript (Build OLI payloads)**
- **Purpose:** Join invoice lines with the PBE results and the Opportunity Id ➜ build OpportunityLineItem payloads (see the sketch at the end of this section).
- **Inputs:** OpportunityId: `={{ $('Create an opportunity').first().json.id }}` · Lines: `={{ $('Message a model').first().json.message.content.products }}` · PBE rows: from the previous node's items
- **Output:** `{ body: { allOrNone: false, records: [{ OpportunityLineItem... }] } }`
- **Notes:** Converts discount_total ➜ per-unit if needed (currently commented out for standard pricing). Throws on a missing PBE mapping or empty lines.

**10) Create Opportunity Line Items (HTTP Request)**
- **Purpose:** Bulk-create OLIs via the Salesforce Composite API.
- **Key settings:** Method: POST · URL: `https://<your-instance>.my.salesforce.com/services/data/v65.0/composite/sobjects` · Auth: salesforceOAuth2Api (predefined credential) · Body (JSON): `={{ $json.body }}`
- **Output:** Composite API results (per-record statuses).

**11) Update File to One Drive**
- **Purpose:** Archive the original PDF in OneDrive.
- **Key settings:** Operation: upload · File Name: `={{ $json.name }}` · Parent Folder ID: your OneDrive folder ID · Binary Data: true (from the Download node) · Creds: microsoftOneDriveOAuth2Api
- **Output:** Uploaded file metadata.

Data flow (wiring)
- Google Drive Trigger → Download File From Google
- Download File From Google → Extract from File → Update File to One Drive
- Extract from File → Message a model
- Message a model → Create or update an account
- Create or update an account → Create an opportunity
- Create an opportunity → Build SOQL
- Build SOQL → Query PricebookEntries
- Query PricebookEntries → Code in JavaScript
- Code in JavaScript → Create Opportunity Line Items

Quick setup checklist
- 🔐 **Credentials:** Connect Google Drive, OneDrive, Salesforce, OpenAI.
- 📂 **IDs:** Drive Folder ID (watch) · OneDrive Parent Folder ID (archive) · Salesforce Pricebook2Id (in the JS SOQL builder)
- 🧠 **AI prompt:** Use the strict system prompt; jsonOutput = true.
- 🧾 **Field mappings:** Buyer tax id/name → Account upsert fields · Invoice code/date/amount → Opportunity fields · Product name must equal your Product2.ProductCode in SF.
- ✅ **Test:** Drop a sample PDF → verify: AI returns array-only JSON · Account/Opportunity created · OLI records created · PDF archived to OneDrive.

Notes & best practices
- If PDFs are scans, enable OCR in Extract from File.
- If the AI returns non-JSON, keep "Return only a JSON array" as the last line of the prompt and keep jsonOutput enabled.
- Consider adding validation on parsing.warnings to gate Salesforce writes.
- For discounts/taxes in OLIs: standard OLI fields don't support per-line discount amounts directly; model them in UnitPrice or custom fields.
- Replace the Composite API URL with your org's domain, or use the Salesforce node's Bulk Upsert for simplicity.
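For orientation, here is a minimal sketch of the logic behind nodes 7 and 9, shown as one combined snippet for readability (in the workflow they are two separate Code nodes, with the Salesforce search running in between). The AI output field names (`name`, `quantity`, `unit_price`) are assumptions inferred from the mappings above; adjust them to your actual invoice schema.

```javascript
// --- Node 7: build SOQL from the unique product codes ---
const pricebook2Id = '01sxxxxxxxxxxxxxxx'; // your org's Pricebook2Id
const lines = $('Message a model').first().json.message.content.products;
if (!lines || !lines.length) throw new Error('No invoice lines to process');

const codes = [...new Set(lines.map(l => l.name))];
const quoted = codes.map(c => `'${String(c).replace(/'/g, "\\'")}'`).join(',');
const soql = `SELECT Id, Product2.ProductCode FROM PricebookEntry ` +
             `WHERE Pricebook2Id = '${pricebook2Id}' AND Product2.ProductCode IN (${quoted})`;

// --- Node 9 (runs after the Salesforce search): join PBE rows to the lines ---
const pbeItems = $input.all(); // items returned by "Query PricebookEntries"
const pbeByCode = Object.fromEntries(
  pbeItems.map(i => [i.json.Product2.ProductCode, i.json.Id])
);

const opportunityId = $('Create an opportunity').first().json.id;
const records = lines.map(l => {
  const pbeId = pbeByCode[l.name];
  if (!pbeId) throw new Error(`No PricebookEntry for product code: ${l.name}`);
  return {
    attributes: { type: 'OpportunityLineItem' },
    OpportunityId: opportunityId,
    PricebookEntryId: pbeId,
    Quantity: l.quantity,
    UnitPrice: l.unit_price, // fold per-line discounts in here if needed
  };
});

// Body consumed by the Composite API HTTP Request node
return [{ json: { body: { allOrNone: false, records } } }];
```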
by DuyTran
Description

Overview
This workflow generates automated revenue and expense comparison reports from a structured Google Sheet. It lets users compare financial data across the current period, last month, and last year, then uses an AI agent to analyze and summarize the results for business reporting.

Prerequisites
- A connected Google Sheets OAuth2 credential.
- A valid DeepSeek API key (or swap in another chat model).
- A sub-workflow (child workflow) that handles the processing logic.
- Properly structured Google Sheets data (see below).

Required Google Sheet structure
- Column headers must include at least: Date, Amount, Type.

Setup steps
1. Import the workflow into your n8n instance.
2. Connect your Google Sheets and DeepSeek API credentials.
3. Update: the Sheet ID and tab name (already embedded in the node "Get revenual from google sheet") and the custom sub-workflow ID (in the "Call n8n Workflow Tool" node).
4. Optionally configure the chatbot webhook in the "When chat message received" node.

What the workflow does
- Accepts date inputs via an AI chat interface (ChatTrigger + AI Agent).
- Fetches raw transaction data from Google Sheets.
- Segments and pivots revenue by classification for the current period, last month, and last year (see the sketch below).
- Aggregates totals and applies custom titles for comparison.
- Merges all summaries into a final unified JSON report.

Customization options
- Replace DeepSeek with OpenAI or another LLM.
- Change the date fields or cycle comparisons (e.g., quarterly, weekly).
- Add more AI analysis steps, such as sentiment scoring or forecasting.
- Modify the pivot logic to suit specific KPI tags or labels.

Troubleshooting tips
- If the Google Sheets fetch fails, ensure the document is shared with your n8n Google credential.
- If you hit parsing errors, verify that all dates follow the expected format.
- The sub-workflow must be active and configured to accept the correct inputs (6 dates).
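The pivot step is the heart of the report. A minimal Code-node sketch of the idea, assuming rows with the Date, Amount, and Type columns listed above (the period bounds are placeholders; in the workflow they come from the chat input):

```javascript
// n8n Code node sketch: pivot transactions by Type for one period.
const start = new Date('2024-05-01'); // placeholder period bounds
const end   = new Date('2024-05-31');

const rows = $input.all().map(i => i.json);
const inPeriod = rows.filter(r => {
  const d = new Date(r.Date);
  return d >= start && d <= end;
});

// Pivot: one running total per Type (e.g., revenue vs. expense classifications)
const totals = {};
for (const r of inPeriod) {
  totals[r.Type] = (totals[r.Type] || 0) + Number(r.Amount);
}

// One summary object per period; a merge step later unifies the three periods.
return [{ json: { title: 'Current period', totals } }];
```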
by vinci-king-01
How it works
This workflow automatically scrapes commercial real estate listings from LoopNet and sends opportunity alerts to Telegram while logging data to Google Sheets.

Key steps
- **Scheduled Trigger** – Runs every 24 hours to collect fresh CRE market data
- **AI-Powered Scraping** – Uses ScrapeGraphAI to extract property information from LoopNet
- **Market Analysis** – Analyzes listings for opportunities and generates market insights
- **Smart Notifications** – Sends Telegram alerts only when investment opportunities are found (see the sketch at the end of this section)
- **Data Logging** – Stores daily market metrics in Google Sheets for trend analysis

Set up steps
Setup time: 10–15 minutes
1. **Configure ScrapeGraphAI credentials** – Add your ScrapeGraphAI API key for web scraping
2. **Set up the Telegram connection** – Connect your Telegram bot and specify the target channel
3. **Configure Google Sheets** – Set up the Google Sheets integration for data logging
4. **Customize the LoopNet URL** – Update the URL to target specific CRE markets or property types
5. **Adjust the schedule** – Modify the trigger timing based on your market-monitoring needs

Keep detailed configuration notes in sticky notes inside your workflow.
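The "alert only on opportunities" gate can be as simple as a threshold filter before the Telegram node. A hypothetical Code-node sketch; the field names and thresholds are placeholders, since they depend on what your ScrapeGraphAI prompt actually extracts:

```javascript
// Keep only listings that look like opportunities before alerting.
// pricePerSqFt / capRate are hypothetical fields from the scraping prompt.
const listings = $input.all().map(i => i.json);

const opportunities = listings.filter(l => {
  const price = Number(l.pricePerSqFt);
  const cap = Number(l.capRate);
  return (price && price < 150) || (cap && cap > 7.5); // example thresholds
});

// Emitting zero items stops the branch, so no alert fires on quiet days.
return opportunities.map(o => ({ json: o }));
```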
by Karol
Who's it for
This template is designed for small and medium businesses, startups, and agencies that want to automate customer inquiries, provide instant support, and capture leads without losing valuable conversations. It's especially useful for teams that get many repetitive questions about products, services, or locations but don't want to miss out on collecting contact details for follow-up.

What it does / How it works
The workflow creates a 24/7 AI-powered chatbot that answers company-related questions and collects customer information. It uses:
• GPT-4o for natural conversations
• Pinecone Vector Store for Retrieval-Augmented Generation (RAG) with your company knowledge base
• Google Sheets to store structured lead data
• Telegram to instantly notify your team

When a customer asks about products, services, or hours, the AI answers using the Pinecone database (see the sketch at the end of this section). Afterwards, it politely asks for their name, email, phone number, and interest. The details are saved to Google Sheets and your team receives a Telegram message with a summary.

How to set up
1. Connect your OpenAI account.
2. Create a Pinecone index with company FAQs, documents, or policies.
3. Link your Google Sheet with the columns: Name, Email, Phone, Interested in.
4. Add your Telegram bot token and chat/group ID.
5. Replace [INSERT_YOUR_COMPANY_NAME_HERE] in the system prompt with your company name.

Requirements
• OpenAI API key
• Pinecone account
• Google Sheets access
• Telegram bot & chat ID

How to customize
• Change the system prompt to match your brand's tone.
• Update the Pinecone namespace and embeddings model if needed.
• Add extra fields in Google Sheets (e.g., "Budget" or "Preferred product").
• Extend the flow with CRM integrations or automated email follow-ups.

With this setup, you get a smart, RAG-powered chatbot that not only answers questions but also turns every conversation into a potential lead.
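Under the hood, the RAG step amounts to "embed the question, query Pinecone, hand the matches to the model". A minimal standalone sketch of that pattern; the index name and metadata field are assumptions, and in practice n8n's Pinecone and agent nodes do this wiring for you:

```javascript
import OpenAI from 'openai';
import { Pinecone } from '@pinecone-database/pinecone';

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment
const pc = new Pinecone({ apiKey: process.env.PINECONE_API_KEY });
const index = pc.index('company-knowledge'); // assumed index name

export async function answer(question) {
  // 1) Embed the customer question
  const emb = await openai.embeddings.create({
    model: 'text-embedding-3-small',
    input: question,
  });

  // 2) Retrieve the closest knowledge-base chunks
  const res = await index.query({
    vector: emb.data[0].embedding,
    topK: 4,
    includeMetadata: true,
  });
  const context = res.matches.map(m => m.metadata?.text ?? '').join('\n---\n');

  // 3) Answer grounded in the retrieved context
  const chat = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [
      { role: 'system', content: `Answer using only this context:\n${context}` },
      { role: 'user', content: question },
    ],
  });
  return chat.choices[0].message.content;
}
```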
by Omer Fayyaz
Efficient loop-less n8n Workflow Backup Automation to Google Drive

This workflow eliminates traditional loop-based processing entirely, delivering strong performance and reliability even when the number of workflows to be processed is large.

What makes this different
- **No SplitInBatches node** – Traditional workflows use loops to process workflows one by one
- **No individual file uploads** – No more multiple Google Drive API calls
- **No batch error handling** – Eliminates complex loop-iteration error management
- **All workflows processed simultaneously** – A single-operation approach
- **Single compressed archive** – One ZIP file containing all workflows
- **One Google Drive upload** – Maximum efficiency, minimum API usage

Key benefits of the non-loop architecture
- **3–5x faster execution** – Eliminates loop overhead and multiple API calls
- **Reduced API costs** – A single upload instead of dozens of individual operations
- **Higher reliability** – Fewer failure points, centralized error handling
- **Better scalability** – Performance doesn't degrade with workflow count
- **Large workflow support** – Efficiently handles hundreds of workflows without performance degradation
- **Easier maintenance** – Simpler workflow structure, easier debugging
- **Cleaner monitoring** – A single success/failure point instead of loop iterations

Who's it for
This template is designed for n8n users, DevOps engineers, system administrators, and IT professionals who need reliable, automated backup solutions for their n8n workflows. It's perfect for businesses and individuals who want to ensure their workflow automation investments are safely backed up with intelligent scheduling, compression, and cloud storage integration.

How it works / What it does
This workflow creates an intelligent, automated backup system that transforms n8n workflow backups from inefficient multi-file operations into streamlined single-archive automation. The system:
1. Triggers automatically every 4 hours or manually on demand
2. Creates timestamped folders in Google Drive for organized backup storage
3. Retrieves all n8n workflows via the n8n API in a single operation
4. Converts workflows to JSON and aggregates binary data efficiently
5. Compresses all workflows into a single ZIP archive (eliminating the need for loops)
6. Uploads the compressed backup to Google Drive in one operation
7. Provides real-time Slack notifications for monitoring and alerting

**Key innovation: no loops required** – Unlike traditional backup workflows that use SplitInBatches or loops to process workflows individually, this system processes all workflows simultaneously and creates a single compressed archive (see the sketch at the end of this section), dramatically improving performance and reliability.

How to set up
1. **Configure Google Drive API credentials** – Set up Google Drive OAuth2 API credentials, ensure the service account has access to create folders and upload files, and update the parent folder ID where backup folders will be created.
2. **Configure n8n API access** – Set up internal n8n API credentials for workflow retrieval, ensure the API has permission to read all workflows, and configure retry settings for reliability.
3. **Set up Slack notifications** – Configure Slack API credentials for the notification channel, update the channel ID where backup notifications will be sent, and customize notification messages as needed.
4. **Schedule configuration** – The workflow automatically runs every 4 hours; manual execution is available for immediate backups. Adjust the schedule in the Schedule Trigger node as needed.
5. **Test the integration** – Run a manual backup to verify all components work correctly, check Google Drive for the created backup folder and ZIP file, and verify Slack notifications are received.

Requirements
- **n8n instance** (self-hosted or cloud) with API access
- **Google Drive account** with API access and sufficient storage
- **Slack workspace** for notifications (optional but recommended)
- **n8n workflows** that need regular backup protection

How to customize the workflow
- **Modify backup frequency** – Adjust the Schedule Trigger node for different intervals (hourly, daily, weekly); add multiple schedule triggers for different backup types; implement conditional scheduling based on workflow changes.
- **Enhance the storage strategy** – Add multiple Google Drive accounts for redundancy; implement backup rotation and retention policies; add compression options (ZIP, TAR, 7Z) for different use cases.
- **Expand the notification system** – Add email notifications for critical backup events; integrate with monitoring systems (PagerDuty, OpsGenie); add backup success/failure metrics and reporting.
- **Security enhancements** – Implement backup encryption before upload; add backup verification and integrity checks; set up access logging and audit trails.
- **Performance optimizations** – Add parallel processing for large workflow collections; implement incremental backup strategies; add backup size monitoring and alerts.

Key features
- **Zero-loop architecture** – Processes all workflows simultaneously without batch processing
- **Intelligent compression** – A single ZIP archive instead of multiple individual files
- **Automated scheduling** – Runs every 4 hours with manual override capability
- **Organized storage** – Timestamped folders with clear naming conventions
- **Real-time monitoring** – Slack notifications for all backup events
- **Error handling** – Centralized error management with graceful failure handling
- **Scalable design** – Handles any number of workflows efficiently

Technical architecture highlights

Eliminated inefficiencies
- **No SplitInBatches node** – Replaced with direct workflow processing
- **No individual file uploads** – A single compressed-archive upload
- **No loop iterations** – All workflows processed in one operation
- **No batch error handling** – Centralized error management

Performance improvements
- **Faster execution** – Eliminates loop overhead and multiple API calls
- **Reduced API quota usage** – A single Google Drive upload per backup
- **Better resource utilization** – Efficient memory and processing usage
- **Improved reliability** – Fewer points of failure in the workflow

Data flow optimization
- **Parallel processing** – Folder creation and workflow retrieval happen simultaneously
- **Efficient aggregation** – A Code node processes all binaries at once
- **Smart compression** – A single ZIP with all workflows included
- **Streamlined upload** – One file transfer instead of multiple operations

Use cases
- **Production n8n instances** requiring reliable backup protection
- **Development teams** needing workflow version control and recovery
- **DevOps automation** requiring disaster-recovery capabilities
- **Business continuity** planning for critical automation workflows
- **Compliance requirements** for workflow documentation and backup
- **Team collaboration** with shared workflow backup access

Business value
- **Risk mitigation** – Protects valuable automation investments
- **Operational efficiency** – Faster, more reliable backup processes
- **Cost reduction** – Lower storage costs and API usage
- **Compliance support** – Organized backup records for audits
- **Team productivity** – Reduced backup-management overhead
- **Scalability** – Handles growth without performance degradation

This template revolutionizes n8n workflow backup by eliminating the complexity and inefficiency of traditional loop-based approaches, providing a robust, scalable solution that grows with your automation needs while maintaining the highest levels of reliability and performance.
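The loop-less trick hinges on one Code node turning every workflow into a binary JSON file on a single item, so the Compression node can zip them all in one pass. A minimal sketch of that aggregation, assuming the n8n API node emits one item per workflow:

```javascript
// n8n Code node (Run Once for All Items): collect every workflow
// into one item carrying N binary properties — one .json file each.
const binary = {};

for (const [i, item] of $input.all().entries()) {
  const wf = item.json;
  const safeName = (wf.name || `workflow-${i}`).replace(/[^\w.-]+/g, '_');
  binary[`file_${i}`] = {
    data: Buffer.from(JSON.stringify(wf, null, 2)).toString('base64'),
    fileName: `${safeName}.json`,
    mimeType: 'application/json',
  };
}

// Downstream: the Compression node zips all binary properties into one
// archive, and a single Google Drive upload ships it.
return [{ json: { count: Object.keys(binary).length }, binary }];
```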
by KlickTipp
Community Node Disclaimer
This workflow uses KlickTipp community nodes, available for self-hosted n8n instances only.

Who's it for
Digital marketers, social media managers, and coaches who engage leads through Instagram DMs and want to automate personalized outreach, lead enrichment, and segmentation in KlickTipp — without manual follow-ups or data entry.

How it works
This workflow creates a complete Instagram-to-email enrichment loop — starting with personalized DM outreach, capturing responses via JotForm, enriching profile data, and syncing everything with KlickTipp. When a workflow trigger or campaign action occurs, it:
1. Sends a personalized Instagram DM inviting the user to fill out a JotForm.
2. Listens for form submissions in real time.
3. Retrieves the lead's Instagram profile data via the Facebook Graph API (see the sketch at the end of this section).
4. Matches the username to the Instagram DM ID in a Google Sheet.
5. Generates AI-powered marketing insights using OpenAI.
6. Subscribes or updates the lead in KlickTipp, mapping enriched fields and tags.

The result: every DM-initiated lead is captured, analyzed, and segmented — ready for intelligent follow-ups and personalized campaigns.

How to set up
1. Connect accounts for KlickTipp, JotForm, Google Sheets, the Facebook Graph API, and OpenAI.
2. Set up a KlickTipp tag or campaign trigger to initiate the DM sending.
3. Create KlickTipp fields for Instagram data (e.g., Bio, Follower count, Insights).
4. Add the tags: Instagram | Outreach, Instagram | Enrichment, Instagram | Username found.
5. Test a sample flow: send a DM → fill the JotForm → verify data enrichment in KlickTipp.

💡 Pro tip: Personalize the DM message template and test both personal and business accounts to ensure optimal engagement and AI-insight quality.

Requirements
- Meta (Instagram) Business Account
- Facebook Graph API with instagram_basic and pages_show_list permissions
- KlickTipp account with API access
- OpenAI connection (gpt-4.1-mini model)
- (Optional) Active Instagram Page connected to your Facebook App for DM messaging

How to customize
- Adjust DM content and message timing for different campaigns or audiences.
- Edit tags and field mappings in KlickTipp to match your segmentation logic.
- Modify the AI prompt to emphasize tone, purchase intent, or niche interests.
- Add conditional logic (e.g., followers > 1,000 → influencer tag).
- Extend the flow to LinkedIn, website tracking, or CRM syncing for multi-channel enrichment.
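The profile-enrichment call typically goes through the Graph API's business_discovery edge, which lets a connected Instagram business account look up another business/creator account by username. A hedged sketch of what that request can look like; the API version, field list, and environment-variable names are assumptions:

```javascript
// Look up public profile data for an Instagram business/creator account.
// IG_USER_ID is the ID of your own connected IG business account.
const username = 'lead_username'; // matched from the Google Sheet
const fields =
  `business_discovery.username(${username}){biography,followers_count,media_count,website}`;

const url = `https://graph.facebook.com/v19.0/${process.env.IG_USER_ID}` +
            `?fields=${encodeURIComponent(fields)}&access_token=${process.env.FB_TOKEN}`;

const res = await fetch(url);
const data = await res.json();
// data.business_discovery carries the bio, follower count, etc. that the
// workflow maps into KlickTipp custom fields.
```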
by Rahul Joshi
Description
Automate B2B order invoicing by fetching orders from Airtable, validating paid B2B entries, creating Stripe customers and invoices, finalizing invoices, and logging structured invoice data into Google Sheets. This workflow ensures seamless B2B billing and centralized record-keeping, and reduces manual errors in financial operations. ⚡💳📊

What this template does
1. Triggers hourly to check for new B2B orders. ⏱️
2. Fetches order data from Airtable (Orders table). 📥
3. Filters only paid orders with the "B2B" tag. ✅
4. Creates a corresponding Stripe customer from the order details. 👤
5. Processes order line items for invoicing. 📦
6. Creates a Stripe invoice with a due date and payment terms. 🧾
7. Finalizes the invoice automatically (see the Stripe sketch at the end of this section). ✔️
8. Formats invoice details (totals, due dates, customer info, links). 🔄
9. Logs structured invoice data into Google Sheets for tracking. 📊

Key benefits
- Fully automates the B2B invoicing workflow from orders to finalized invoices. 🔄
- Ensures all invoices are linked, structured, and logged in Sheets. 🧾
- Reduces manual effort and eliminates data-entry errors. ⚡
- Maintains centralized invoice tracking for finance teams. 📂
- Creates a consistent billing flow integrated with Stripe. 💳

Features
- **Hourly Trigger** – Continuously monitors Airtable for new/updated orders.
- **Airtable Integration** – Fetches order details automatically.
- **Conditional Filter** – Processes only paid "B2B" orders.
- **Stripe Customer Creation** – Automatically creates customers in Stripe.
- **Line Item Processor** – Handles Shopify/order line items or test data.
- **Stripe Invoice Creation** – Generates draft invoices with due dates.
- **Invoice Finalization** – Auto-finalizes and prepares invoices for payment.
- **Data Formatter** – Structures invoice info (totals, links, dates, status).
- **Google Sheets Integration** – Logs all invoice data for reporting.

Requirements
- n8n instance (cloud or self-hosted).
- Airtable Personal Access Token with read access to the Orders table.
- Stripe API credentials with customer and invoice permissions.
- Google Sheets OAuth2 credentials with read/write access.

Target audience
- Finance/ops teams handling B2B customer invoicing. 💼
- SaaS or eCommerce businesses with B2B order flows. 🛍️
- Startups needing automated billing and centralized reporting. 🚀
- Teams tracking Stripe invoices inside Google Sheets. 📊

Step-by-step setup instructions
1. Connect Airtable credentials and replace with your base/table IDs. 🔑
2. Configure Stripe API credentials for invoice and customer creation. 💳
3. Link Google Sheets credentials and update the target sheet ID. 📊
4. Adjust order-filtering conditions (tags, payment status) as needed. ⚙️
5. Test with sample data to validate that invoices are created and logged. ✅
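For reference, the Stripe leg of the flow maps to four API calls. A condensed sketch with the official Stripe Node SDK; the amounts, names, and 30-day terms are placeholders, and in the workflow the Stripe nodes perform this same sequence:

```javascript
import Stripe from 'stripe';
const stripe = new Stripe(process.env.STRIPE_SECRET_KEY);

// 1) Customer from the Airtable order's billing details
const customer = await stripe.customers.create({
  name: 'Acme GmbH',               // placeholder order fields
  email: 'billing@acme.example',
});

// 2) One invoice item per order line (amount is in cents)
await stripe.invoiceItems.create({
  customer: customer.id,
  amount: 12500,
  currency: 'usd',
  description: 'Widget – qty 5',
});

// 3) Draft invoice with payment terms
const invoice = await stripe.invoices.create({
  customer: customer.id,
  collection_method: 'send_invoice',
  days_until_due: 30,
});

// 4) Finalize: Stripe assigns the invoice number and hosted payment link,
//    which the workflow then logs to Google Sheets.
const finalized = await stripe.invoices.finalizeInvoice(invoice.id);
console.log(finalized.hosted_invoice_url);
```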
by Madame AI
Product Review Analysis with BrowserAct & Gemini-Powered Recommendations

This n8n template demonstrates how to perform product review sentiment analysis and generate improvement recommendations using an AI Agent. This workflow is perfect for e-commerce store owners, product managers, or marketing teams who want to automate the process of collecting feedback and turning it into actionable insights.

How it works
1. The workflow is triggered manually.
2. An HTTP Request node initiates a web scraping task with the BrowserAct API to collect product reviews.
3. A series of If and Wait nodes check the status of the scraping task. If the task is not yet complete, the workflow pauses and retries until it receives the full dataset (see the sketch at the end of this section).
4. An AI Agent node, powered by Google Gemini, then processes the scraped review summaries. It analyzes the sentiment of each review and generates actionable improvement recommendations.
5. Finally, the workflow sends these detailed recommendations via a Telegram message and an email to the relevant stakeholders.

Requirements
- **BrowserAct** API account for web scraping
- BrowserAct **"Product Review Sentiment Analysis"** template
- **Gemini** account for the AI Agent
- **Telegram** and **SMTP** credentials for sending messages

Need help?
- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

Workflow guidance and showcase
- How to INSTANTLY Get Product Improvement Ideas from Amazon Reviews | BrowserAct + n8n + Gemini
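The If/Wait retry loop implements a classic poll-until-done pattern. Sketched below as plain code for clarity; the endpoint path and status field names are assumptions (see the BrowserAct docs linked above for the real shapes), and on the n8n canvas the same logic is expressed with If and Wait nodes:

```javascript
// Poll a scraping task until it finishes, waiting between attempts.
async function waitForTask(taskId, apiKey, { intervalMs = 15000, maxTries = 40 } = {}) {
  for (let attempt = 0; attempt < maxTries; attempt++) {
    const res = await fetch(`https://api.browseract.com/v1/tasks/${taskId}`, { // assumed endpoint
      headers: { Authorization: `Bearer ${apiKey}` },
    });
    const task = await res.json();
    if (task.status === 'completed') return task.result; // assumed field names
    if (task.status === 'failed') throw new Error('Scrape task failed');
    await new Promise(r => setTimeout(r, intervalMs));   // the Wait node's job
  }
  throw new Error('Timed out waiting for scrape task');
}
```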
by Trung Tran
AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)

Watch the demo video below:

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging, just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video's SEO boost.

Who's it for
- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

How it works / What it does
1. **Weekly Schedule Trigger** – Runs the workflow once per week.
2. **Get all videos uploaded last week** – Queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail** – Retrieves each video's title, description, and ID.
4. **YouTube Video Auto Tagging Agent (LLM)** – Inputs: video.title, video.description, channelName. Uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags** – Writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message** – Posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

How to set up
1. **YouTube connection** – Create a Google Cloud project and enable YouTube Data API v3. Configure an OAuth client (Web app / Desktop as required). Authorize with the Google account that manages the channel. In your automation platform, add the YouTube credential and grant scopes (see Requirements).
2. **Slack connection** – Create or use an existing Slack app/bot. Install it to your workspace and capture the Bot Token. Add the Slack credential in your automation platform.
3. **LLM / chat model** – Select your model (e.g., OpenAI GPT). Paste the system prompt (SEO expert) and the user prompt template. Inputs: {{video_title}}, {{video_description}}, {{channel_name}}. Output: a comma-separated list of 15–20 tags (no #, no duplicates).
4. **Node configuration** – Weekly Schedule Trigger: choose day/time (e.g., Mondays 09:00 local). Get all videos uploaded last week: date filter = now() - 7 days. Get video detail: map each video ID from the previous node. Agent node: map fields to the prompt variables. Update video: map the agent's tag string to the YouTube tags field. Slack message: The video "{{video_title}} - {{video_id}}" has been auto-tagged successfully. Tags: {{tags}}
5. **Test run** – Manually run the workflow with one recent video. Verify the tags appear in YouTube Studio and that the Slack message posts.

Requirements
- **YouTube Data API v3 scopes:** youtube.readonly (to list videos / details); youtube or youtube.force-ssl (to update video metadata, including tags)
- **Slack Bot Token scopes:** chat:write (post messages); channels:read or groups:read if selecting channels dynamically (optional)
- **Platform:** access to a chat/LLM provider (e.g., OpenAI); outbound HTTPS allowed.
- **Rate limits & quotas:** YouTube updates consume quota, and tag updates are write operations, so avoid re-writing unchanged tags. Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

How to customize the workflow
- **Schedule:** switch to daily, or run on publish events instead of weekly.
- **Filtering:** process only videos matching rules (e.g., title contains "tutorial", or missing tags).
- **Prompt tuning:** add brand keywords to always include (e.g., "WiseStack AI"); constrain to a language (e.g., "Vietnamese tags only"); enforce a max of 500 chars total for tags if you want a stricter cap.
- **Safety guardrails:** validate the model output: split by comma, trim whitespace, dedupe, drop empty/over-long tags (a sketch appears at the end of this section). If the agent fails, fall back to a heuristic generator (title/keyword extraction).
- **Change log:** write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** send tags to Slack as buttons ("Apply / Edit / Skip") before updating YouTube.
- **Multi-channel support:** loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** add error Slack messages for failed API calls; summarize weekly results.

Tip: Keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
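The guardrail step is easy to make concrete. A minimal sketch of the sanitizer to run right after the agent node, enforcing trim, dedupe, and the 500-character total budget mentioned above (the per-tag cap is an assumption you can tune):

```javascript
// Turn the model's comma-separated output into a clean, capped tag list.
function sanitizeTags(raw, { maxTagLen = 100, maxTotal = 500 } = {}) {
  const seen = new Set();
  const tags = [];
  let total = 0;

  for (const t of raw.split(',')) {
    const tag = t.trim().replace(/^#/, '');      // trim + drop stray '#'
    const key = tag.toLowerCase();               // case-insensitive dedupe
    if (!tag || tag.length > maxTagLen || seen.has(key)) continue;
    if (total + tag.length > maxTotal) break;    // respect the total budget
    seen.add(key);
    tags.push(tag);
    total += tag.length;
  }
  return tags;
}

// e.g. sanitizeTags('AI, ai , #automation,, n8n tutorial')
// -> ['AI', 'automation', 'n8n tutorial']
```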
by Vadim
What it does
This workflow is an AI agent in the form of a Telegram bot. Its main purpose is to capture contact information and store it in a CRM. The agent supports multi-modal inputs and can extract contact details from text messages, voice recordings, and images (like photos of business cards).

The bot guides the user through data collection via a natural conversation, asks clarifying questions for missing information, and summarizes the extracted data for confirmation before saving. It also checks for duplicate contacts by email and gives users the choice to either create a new contact or update an existing one (see the sketch at the end of this section).

For simplicity, this example uses a Google Sheets document to store collected contacts. It can easily be replaced by a real CRM like HubSpot, Pipedrive, Monday, etc.

How to use the bot
- Send contact details via text or voice, or upload a photo of a business card.
- The bot will show the extracted information and ask questions when needed.
- Once the bot confirms saving of the current contact, you can send the next one.
- Use the /new command at any moment to discard the previous conversation and start from scratch.

Requirements
- A Telegram bot access token
- Google Gemini API key
- Google Sheets credentials

How to set up
1. Create a new Telegram bot (see the n8n docs and Telegram Bot API docs for details).
2. Take the webhook URL from the Telegram Trigger node (WEBHOOK_URL) and your bot's access token (TOKEN), and run: curl -X POST "https://api.telegram.org/bot{TOKEN}/setWebhook?url={WEBHOOK_URL}"
3. Create a new Google Sheets document with "Full name", "Email", "Phone", "Company", "Job title" and "Meeting notes" columns.
4. Configure parameters in the parameters node: set the ID of the Google Sheets document and the sheet name ("Sheet1" by default).
5. Configure Google Sheets credentials for the AI Agent's tools: Search for contact, Create new contact, and Update existing contact.
6. Add the Google Gemini API key for the models ("AI Agent", "Transcribe audio", "Analyze image" nodes).
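The duplicate check boils down to "search by email, then branch". A minimal sketch of the decision the agent's tools support, assuming the sheet columns listed above (the `rows` argument stands in for the Search for contact tool's result):

```javascript
// Decide between "create" and "update" based on an email match.
function planSave(extracted, rows) {
  const email = (extracted.Email || '').trim().toLowerCase();
  const existing = rows.find(
    r => (r.Email || '').trim().toLowerCase() === email
  );

  if (email && existing) {
    // The bot then asks: update this contact, or create a new one anyway?
    return { action: 'confirm-update', existing };
  }
  return { action: 'create' };
}
```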
by Interlock GTM
Summary
Turns a plain name + email into a fully-enriched HubSpot contact by matching the person in Apollo, pulling their latest LinkedIn activity, summarising the findings with GPT-4o, and upserting the clean data into HubSpot.

Key use-cases
- SDRs enriching inbound demo requests before routing
- RevOps teams keeping executive records fresh
- Marketers building highly-segmented email audiences

Inputs

| Field | Type   | Example         |
|-------|--------|-----------------|
| name  | string | "Jane Doe"      |
| email | string | "jane@acme.com" |

Required credentials

| Service | Node | Notes |
|---------|------|-------|
| Apollo.io API key | HTTP Request – "Enrich with Apollo" | Set in header x-api-key |
| RapidAPI key (Fresh-LinkedIn-Profile-Data) | "Get recent posts" | Header x-rapidapi-key |
| OpenAI | 3 LangChain nodes | Supply an API key; default model gpt-4o-mini |
| HubSpot OAuth2 | "Enrich in HubSpot" | Add/create any custom contact properties referenced |

High-level flow
1. **Trigger** – Runs when another workflow passes name & email.
2. **Clean** – A JS Code node normalises & deduplicates emails (see the sketch at the end of this section).
3. **Apollo match** – Queries /people/match; skips if no person is found.
4. **LinkedIn fetch** – Grabs up to 3 original posts from the last 30 days.
5. **AI summary chain** – OpenAI → structured/auto-fixing parsers; produces a strict JSON block with job title, location, summaries, etc.
6. **HubSpot upsert** – Maps every key (plus five custom properties) into the contact record.

Sticky notes annotate the canvas; error-prone parts have retry logic.
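The Clean step is a small but important gate before the Apollo call. A sketch of one plausible normalisation, with the Gmail dot-stripping flagged as an optional assumption (the actual node may do less or more):

```javascript
// Normalise and deduplicate incoming emails before the Apollo match.
function cleanEmails(items) {
  const seen = new Set();
  const out = [];

  for (const { name, email } of items) {
    let e = String(email || '').trim().toLowerCase();
    // Optional (assumption): treat gmail dot-variants as the same inbox.
    const [local, domain] = e.split('@');
    if (domain === 'gmail.com') e = `${local.replace(/\./g, '')}@${domain}`;
    if (!e || !e.includes('@') || seen.has(e)) continue; // drop invalid + dupes
    seen.add(e);
    out.push({ name: String(name || '').trim(), email: e });
  }
  return out;
}
```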
by moosa
Daily Tech & Startup Digest: Notion-Powered News Curation

Description
This n8n workflow automates the curation of a daily tech and startup news digest from articles stored in a Notion database. It filters articles from the past 24 hours, refines them using keyword matching and LLM classification, aggregates them into a single Markdown digest with categorized summaries, and publishes the result as a Notion page. Designed for manual testing or daily scheduled runs, it includes sticky notes (as required by the n8n creator page) to document each step clearly. This original workflow is for educational purposes, showcasing Notion integration, AI classification, and Markdown-to-Notion conversion.

Data in Notion

Workflow overview

Triggers
- **Manual Trigger** – Tests the workflow (When clicking 'Execute workflow').
- **Schedule Trigger** – Runs daily at 8 PM (Schedule Trigger, disabled by default).

Article filtering
- **Fetch articles** – Queries the Notion database (Get many database pages) for articles from the last 24 hours using a date filter.
- **Keyword filtering** – JavaScript code (Code in JavaScript) filters articles containing tech/startup keywords (e.g., "tech," "AI," "startup") in the title, summary, or full text.
- **LLM classification** – Uses OpenAI's gpt-4.1-mini (OpenAI Chat Model) with a text classifier (Text Classifier) to categorize articles as "Tech/Startup" or "Other," keeping only relevant ones.

Digest creation
- **Aggregate articles** – Combines filtered articles into a single object (Code in JavaScript1) for processing.
- **Generate digest** – An AI agent (AI Agent) with OpenAI's gpt-4.1-mini (OpenAI Chat Model1) creates a Markdown digest with an intro paragraph, categorized article summaries (e.g., AI & Developer Tools, Startups & Funding), clickable links, and a closing note.

Notion publishing
- **Format for Notion** – JavaScript code (Code in JavaScript2) converts the Markdown digest into a Notion-compatible JSON payload, supporting headings, bulleted lists, and links, with a title like "Tech & Startup Daily Digest – YYYY-MM-DD" (see the sketch at the end of this section).
- **Create Notion page** – Sends the payload via HTTP request (HTTP Request) to the Notion API to create a new page.

Credentials
Uses Notion API and OpenAI API credentials.

Notes
- This workflow is for educational purposes, demonstrating Notion database querying, AI classification, and Markdown-to-Notion publishing.
- Enable and adjust the schedule trigger (e.g., 8 PM daily) for production use to create daily digests.
- Set up Notion and OpenAI API credentials in n8n before running.
- The date filter can be modified (e.g., hours instead of days) to adjust the article-selection window.
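The Format for Notion step maps Markdown lines to Notion block objects. A trimmed sketch of that conversion, covering headings, bullets, and paragraphs (the workflow's node also parses [text](url) links into rich_text link objects, which is omitted here for brevity):

```javascript
// Convert a Markdown digest into Notion API blocks for the pages endpoint.
function markdownToBlocks(md) {
  const text = (content) => [{ type: 'text', text: { content } }];

  return md.split('\n').filter(l => l.trim()).map(line => {
    if (line.startsWith('## '))
      return { object: 'block', type: 'heading_2',
               heading_2: { rich_text: text(line.slice(3)) } };
    if (line.startsWith('# '))
      return { object: 'block', type: 'heading_1',
               heading_1: { rich_text: text(line.slice(2)) } };
    if (line.startsWith('- '))
      return { object: 'block', type: 'bulleted_list_item',
               bulleted_list_item: { rich_text: text(line.slice(2)) } };
    return { object: 'block', type: 'paragraph',
             paragraph: { rich_text: text(line) } };
  });
}

// Payload for POST https://api.notion.com/v1/pages:
// { parent: { database_id }, properties: { title: ... },
//   children: markdownToBlocks(digest) }
```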