by Mohan Gopal
This workflow automates the process of reading EDI files generated by Sabre, parsing them with an AI Agent, and producing structured accounting reports such as:

- Accounts Receivable (AR) Summary
- Tax and Surcharges Report

It also uses Retrieval-Augmented Generation (RAG) to vectorize the Sabre Interface User Record (IUR), a 154-page technical document, so that the AI agent can reference it whenever clarification is needed while generating reports.

Tools & Integrations Used

| Component | Tool/Service | Purpose |
|---|---|---|
| Workflow Engine | n8n | Automation & orchestration |
| LLM Model | OpenAI GPT-4 / Chat Model | Natural language understanding and parsing |
| Embeddings Model | OpenAI Embeddings | Convert text into semantic vector format |
| Vector Database | Pinecone | Store and retrieve document chunks semantically |
| Storage | Google Drive | Source of raw EDI text files and PDF documentation |
| Data Loader + Splitter | n8n Node + Recursive Splitter | Loads and prepares documents for embedding |
| AI Agents | n8n AI Agent Node | Runs context-aware prompts and parses reports |

Workflow Breakdown

1. Vectorizing the Sabre IUR Document (RAG Setup)

Objective: enable the AI Agent to refer to the IUR document (154 pages) for detailed explanations of EDI terms, formats, and rules.

Flow steps:
- Google Drive Search + Download: find and pull the IUR PDF file.
- Default Data Loader: load the file and preprocess it for semantic splitting.
- Recursive Character Splitter: break large pages into meaningful chunks.
- OpenAI Embeddings: vectorize each chunk.
- Pinecone Vector Store: save into a Pinecone namespace for future retrieval.

Result: the IUR is now searchable via semantic queries from the AI Agent.

2. Reading and Extracting Data from EDI Files

Objective: parse raw EDI files for financial records and summaries.

Flow steps:
- Trigger: manual or scheduled execution of the workflow.
- Google Drive Search: finds all new .edi or .txt files.
- Download File Contents: loads the content of each file into memory.
- Extract from File: raw text extraction (a pre-structuring sketch follows at the end of this section).

3. Report Generation Using AI Agents

Objective: AI Agents parse the extracted data to generate structured accounting reports.

a. Accounts Receivable Report Agent
- The extracted text is passed to an AI Agent.
- The model is connected to the OpenAI Chat Model (LLM) and the Pinecone Vector DB (IUR reference).
- Outputs a structured AR Summary Report.

b. Tax and Surcharges Report Agent
- Same steps as above, with prompts adjusted to extract taxes, fees, surcharges, and amounts.

Output format: can be mapped to columns and inserted into a Google Sheet, or exported as CSV/JSON.

Sample Reports You Can Build

Already implemented:
- Accounts Receivable (AR) Summary Report
- Tax and Surcharges Report

Can be extended to:
- Accounts Payable (AP)
- Passenger Revenue
- Daily Sales
- Commission Report
- Net Profit Margin (if supplier cost + commission is available)

Key Advantages

- No-code automation with n8n
- Semantic reasoning using AI + Vector DB (RAG)
- Works with various Sabre outputs without manual parsing
- Modular: easy to add new report types
- Cloud-integrated (Drive, Pinecone, OpenAI)

Potential Improvements

| Area | Suggestions |
|---|---|
| Testing | Add a "Preview" step to validate extracted data before writing |
| Scalability | Batch mode + Google Sheet batching for multiple reports |
| Audit Trail | Log every file name, timestamp, and report type in a Google Sheet |
| Notification | Send a Slack/Email alert when a new report is generated |
| Multi-model support | Add a Claude/Gemini fallback if the OpenAI usage limit is hit |
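To ground the report agents, it can help to pre-structure the raw EDI text with a Code node before it reaches the LLM. Below is a minimal sketch of that idea; the `M0`/`T0` record tags and the fixed column positions are hypothetical placeholders, since the real record layouts are defined in the IUR document itself.

```js
// n8n Code node - minimal sketch of pre-structuring raw EDI text before the AI Agent.
// The record tags (M0, T0) and the fixed slice positions are HYPOTHETICAL; consult
// the IUR documentation for the real layouts used by your Sabre output.
const results = [];

for (const item of $input.all()) {
  const raw = item.json.data || '';
  const record = { fares: [], taxes: [] };

  for (const line of raw.split(/\r?\n/)) {
    const tag = line.slice(0, 2); // leading record identifier (assumed)
    if (tag === 'M0') {
      record.fares.push({ amount: line.slice(2, 13).trim(), currency: line.slice(13, 16) });
    } else if (tag === 'T0') {
      record.taxes.push({ code: line.slice(2, 4), amount: line.slice(4, 15).trim() });
    }
  }
  results.push({ json: record });
}

return results;
```

Passing the agent pre-parsed fields like these, alongside the raw text, tends to make the AR and tax summaries easier to verify.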
by Shiv Gupta
Pinterest Keyword-Based Content Scraper with AI Agent & BrightData Automation

Overview

This n8n workflow automates Pinterest content scraping based on user-provided keywords using BrightData's API and a Claude Sonnet 4 AI agent. The system processes keywords, initiates scraping jobs, monitors progress, and formats the extracted data into structured outputs.

Architecture Components

AI-Powered Controller
- **Claude Sonnet 4 Model**: Processes and understands keywords before initiating the scrape
- **AI Agent**: Acts as the intelligent controller coordinating all scraping steps

Data Input
- **Form Trigger**: User-friendly keyword input interface
- **Keywords Field**: Required input field for Pinterest search terms

Scraping Pipeline
1. Launch Scraping Job: sends keywords to the BrightData API
2. Status Monitoring: continuously checks scraping progress
3. Data Retrieval: downloads the completed scraped content
4. Data Processing: formats and structures the raw data
5. Storage: saves results to Google Sheets

Workflow Nodes

1. Pinterest Keyword Input
- **Type**: Form Trigger
- **Purpose**: Entry point for user keyword submission
- **Configuration**: Form title "Pinterest"; required field "Keywords"

2. Anthropic Chat Model
- **Type**: Language Model (Claude Sonnet 4)
- **Model**: claude-sonnet-4-20250514
- **Purpose**: AI-powered keyword processing and workflow orchestration

3. Keyword-based Scraping Agent
- **Type**: AI Agent
- **Purpose**: Orchestrates the entire scraping process
- **Instructions**: Initiates Pinterest scraping with the provided keywords, monitors scraping status until completion, downloads the final scraped data, and presents the raw scraped data as output

4. BrightData Pinterest Scraping
- **Type**: HTTP Request Tool
- **Method**: POST
- **Endpoint**: https://api.brightdata.com/datasets/v3/trigger
- **Parameters**: dataset_id: gd_lk0sjs4d21kdr7cnlv, include_errors: true, type: discover_new, discover_by: keyword, limit_per_input: 2
- **Purpose**: Creates a new scraping snapshot based on the keywords

5. Check Scraping Status
- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- **Purpose**: Monitors scraping job progress
- **Returns**: Status values such as "running" or "ready"

6. Fetch Pinterest Snapshot Data
- **Type**: HTTP Request Tool
- **Method**: GET
- **Endpoint**: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- **Purpose**: Downloads the completed scraped data
- **Trigger**: Executes when status is "ready"

7. Format & Extract Pinterest Content
- **Type**: Code Node (JavaScript)
- **Purpose**: Parses and structures the raw scraped data (a sketch follows at the end of this section)
- **Extracted Fields**: URL, Post ID, Title, Content, Date Posted, User, Likes & Comments, Media, Image URL, Categories, Hashtags

8. Save Pinterest Data to Google Sheets
- **Type**: Google Sheets Node
- **Operation**: Append
- **Mapped Columns**: Post URL, Title, Content, Image URL

9. Wait for 1 Minute (Disabled)
- **Type**: Code Tool
- **Purpose**: Adds a delay between status checks (currently disabled)
- **Duration**: 60 seconds

Setup Requirements

Required Credentials
- Anthropic API: Credential ID ANTHROPIC_CREDENTIAL_ID, required for Claude Sonnet 4 access
- BrightData API: API key BRIGHT_DATA_API_KEY, required for the Pinterest scraping service
- Google Sheets OAuth2: Credential ID GOOGLE_SHEETS_CREDENTIAL_ID, required for data storage

Configuration Placeholders

Replace the following placeholders with actual values:
- WEBHOOK_ID_PLACEHOLDER: Form trigger webhook ID
- GOOGLE_SHEET_ID_PLACEHOLDER: Target Google Sheets document ID
- WORKFLOW_VERSION_ID: n8n workflow version
- INSTANCE_ID_PLACEHOLDER: n8n instance identifier
- WORKFLOW_ID_PLACEHOLDER: Unique workflow identifier

Data Flow

User Input (Keywords) → AI Agent Processing (Claude) → BrightData Scraping Job Creation → Status Monitoring Loop → Data Retrieval (when ready) → Content Formatting & Extraction → Google Sheets Storage

Output Data Structure

Each scraped Pinterest pin contains:
- **URL**: Direct link to the Pinterest pin
- **Post ID**: Unique Pinterest identifier
- **Title**: Pin title/heading
- **Content**: Pin description text
- **Date Posted**: Publication timestamp
- **User**: Pinterest username
- **Engagement**: Likes and comments count
- **Media**: Media type information
- **Image URL**: Direct image link
- **Categories**: Pin categorization tags
- **Hashtags**: Associated hashtags
- **Comments**: User comment text

Usage Instructions

1. Initial Setup: configure all required API credentials, replace placeholder values with actual IDs, and create the target Google Sheets document.
2. Running the Workflow: access the form trigger URL, enter the desired Pinterest keywords, and submit the form to initiate scraping.
3. Monitoring Progress: the AI agent handles status monitoring automatically; no manual intervention is required during scraping.
4. Accessing Results: structured data is saved to Google Sheets automatically; each run appends new data to the existing sheet.

Technical Notes

- **Rate Limiting**: The BrightData API has built-in rate limiting
- **Data Limits**: The current configuration limits results to 2 pins per keyword
- **Status Polling**: Automatic status checking until completion
- **Error Handling**: Errors are captured in the scraping requests (include_errors: true)
- **Async Processing**: Supports long-running scraping jobs

Customization Options

- **Adjust Data Limits**: Modify the limit_per_input parameter
- **Enable Wait Timer**: Activate the disabled wait node for longer jobs
- **Custom Data Fields**: Modify the formatting code for additional fields
- **Alternative Storage**: Replace Google Sheets with other storage options

Sample Google Sheets Template

Create a copy of the sample sheet structure: https://docs.google.com/spreadsheets/d/SAMPLE_SHEET_ID/edit

Required columns: Post URL, Title, Content, Image URL

Troubleshooting

- **Authentication Errors**: Verify all API credentials are correctly configured
- **Scraping Failures**: Check BrightData API status and rate limits
- **Data Formatting Issues**: Review the JavaScript formatting code for parsing errors
- **Google Sheets Errors**: Ensure proper OAuth2 permissions and sheet access

For any questions or support, please contact: Email or fill out this form
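As an illustration of node 7, the Code node might look something like the sketch below. The field names on the BrightData snapshot records (post_id, date_posted, and so on) are assumptions; inspect a real snapshot payload and adjust the mapping to match.

```js
// n8n Code node - sketch of the "Format & Extract Pinterest Content" step.
// Field names on the BrightData snapshot records are ASSUMPTIONS; check one
// real payload from node 6 and rename accordingly.
const pins = $input.all().flatMap(item =>
  Array.isArray(item.json) ? item.json : [item.json]
);

return pins.map(pin => ({
  json: {
    url: pin.url ?? '',
    postId: pin.post_id ?? '',
    title: pin.title ?? '',
    content: pin.content ?? pin.description ?? '',
    datePosted: pin.date_posted ?? '',
    user: pin.user_name ?? '',
    likes: pin.likes ?? 0,
    comments: pin.comments_count ?? 0,
    imageUrl: pin.image_url ?? '',
    hashtags: Array.isArray(pin.hashtags) ? pin.hashtags.join(', ') : '',
  },
}));
```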
by Oriol SeguΓ
This n8n workflow allows you to automatically monitor the status of multiple URLs in a simple and efficient way. You just need to enter the URLs you want to scan and run the workflow (either manually or scheduled). For each URL, an availability check is performed. The results are logged in a Google Sheet, clearly distinguishing between successful checks and failures (downtime). If any URL fails, the system filters these errors and automatically sends an email alert notifying you of the detected outages. The workflow includes help messages in both English and Spanish, integrates with Google Sheets and Gmail, and is suitable for both one-off tasks and scheduled monitoring.

For Who?

- Webmasters
- SEO & Marketing Teams
- SysAdmins
- Anyone needing automated website uptime monitoring

How it works?

1. Enter the URLs to scan in the "URLs" field.
2. Trigger the workflow manually or schedule it to run automatically.
3. For each URL, the workflow checks whether the URL is online or down and logs the status (success or error) in a Google Sheet.
4. At the end, it filters the failed URLs (crashes) and sends an email alert.
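A Code-node sketch of the status classification step might look like the following. It assumes the preceding HTTP Request node is configured not to abort on error responses and to include the status code in its output; the exact field name depends on your node settings.

```js
// n8n Code node - sketch of classifying each HTTP check result as up/down.
// Assumes the HTTP Request node runs with "Never Error" (or onError: continue)
// so failures arrive as items, and that statusCode is included in the output.
return $input.all().map(item => {
  const status = item.json.statusCode ?? 0;
  const up = status >= 200 && status < 400; // treat redirects as "up"
  return {
    json: {
      url: item.json.url,
      status,
      result: up ? 'success' : 'error',
      checkedAt: new Date().toISOString(),
    },
  };
});
```

A downstream Filter node on `result == 'error'` then feeds the Gmail alert branch.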
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform resume submissions into comprehensive candidate profiles with AI-powered parsing, GitHub analysis, and instant team notifications.

What this workflow does

- Monitors Gmail for incoming resume attachments
- Extracts structured data using VLM Run AI document parsing
- Analyzes GitHub profiles with deep repository intelligence (30+ frameworks detected)
- Creates comprehensive candidate profiles with technical skills assessment
- Delivers multi-channel notifications via Google Sheets, Slack, and candidate emails

Setup

Prerequisites: Gmail account, VLM Run API credentials, Google Sheets access, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

Quick Setup:
1. Configure Gmail OAuth2 for resume monitoring
2. Add VLM Run API credentials for document parsing
3. Create the Google Sheets candidate database
4. Set up Slack integration for team notifications
5. Update spreadsheet/channel IDs in the workflow nodes
6. Test with a sample resume and activate

Perfect for

- HR departments and technical recruiting teams
- Startup hiring and talent acquisition agencies
- Developer assessment and skill evaluation
- Remote team hiring and candidate screening
- Any organization seeking data-driven hiring decisions

Key Benefits

- **Eliminates manual data entry** - AI extracts all contact info, skills, and experience
- **GitHub intelligence engine** - Analyzes repositories, calculates experience, detects technologies
- **Comprehensive skill assessment** - Identifies programming languages, frameworks, and project metrics
- **Professional candidate experience** - Automated acknowledgment emails with personalized touches
- **Instant team collaboration** - Rich Slack notifications with GitHub profile highlights
- **Structured data storage** - Searchable candidate database with 20+ data columns
- **Saves hours per candidate** - Transforms 30-minute manual reviews into instant insights

How to customize

Extend by adding:
- Integration with ATS systems (Greenhouse, Lever, BambooHR)
- LinkedIn profile analysis and social media insights
- Automated interview scheduling based on qualifications
- Skills-based candidate scoring and ranking algorithms
- Integration with code assessment platforms
- Multi-language resume support and translation
- Custom evaluation criteria and filtering rules
- Advanced GitHub metrics (code quality, contribution patterns)

This workflow revolutionizes technical hiring by combining AI-powered resume parsing with deep GitHub analysis, delivering comprehensive candidate intelligence that empowers data-driven hiring decisions while maintaining a professional candidate experience.
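For a sense of what the GitHub analysis step can do, here is a minimal Code-node sketch that counts languages across a candidate's public repositories using GitHub's REST API. It assumes your n8n version exposes this.helpers.httpRequest inside the Code node and that a githubUsername field was already extracted from the resume; unauthenticated GitHub requests are rate-limited, so production use would add a token.

```js
// n8n Code node - sketch of GitHub profile analysis via the public REST API.
// Assumes $json.githubUsername was parsed from the resume in an earlier node.
const username = $json.githubUsername; // e.g. "octocat"

const repos = await this.helpers.httpRequest({
  url: `https://api.github.com/users/${username}/repos?per_page=100&sort=updated`,
  headers: { 'User-Agent': 'n8n-workflow' }, // GitHub requires a User-Agent
  json: true,
});

// Tally primary languages across repositories
const languages = {};
for (const repo of repos) {
  if (repo.language) languages[repo.language] = (languages[repo.language] || 0) + 1;
}

// Rough "years active" estimate from the oldest repo creation date
const oldest = repos.reduce(
  (min, r) => (r.created_at < min ? r.created_at : min),
  new Date().toISOString()
);
const yearsActive = (Date.now() - new Date(oldest)) / (365 * 24 * 3600 * 1000);

return [{
  json: {
    username,
    repoCount: repos.length,
    languages,
    yearsActive: Math.round(yearsActive * 10) / 10,
  },
}];
```

Framework detection as described above would go a level deeper (e.g. inspecting package manifests), but the language tally already gives a useful skills signal.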
by lin@davoy.tech
The YogiAI workflow automates sending daily yoga pose reminders and related information via Line Push Messages. This automation leverages data from a Google Sheets database containing yoga pose details such as names, image URLs, and links to ensure users receive personalized and engaging content every day.

Purpose

- Provide users with daily yoga pose suggestions tailored to their practice.
- Deliver visually appealing and informative content through Line's Flex Messages, including images and clickable links.
- Log user interactions and preferences back into Google Sheets to refine future recommendations.

Key Features

- Automated Daily Reminders: sends a curated list of yoga poses at a scheduled time (21:30 Bangkok time).
- Dynamic Content Generation: uses AI to rewrite and format messages in a user-friendly manner, complete with emojis and clear instructions.
- Integration with Google Sheets: pulls data from a predefined Google Sheet and logs interactions for continuous improvement.
- Customizable Messaging: ensures JSON outputs are properly formatted for Line's Flex Message API, allowing for interactive and visually rich content.

Data Source

The workflow relies on a Google Sheet structured as follows:
- PoseName: the name of the yoga pose.
- uri: the image URL representing the pose.
- url: a clickable link directing users to more information about the pose.

Sample data layout:

| PoseName | uri | url |
|---|---|---|
| Supine Angle | https://example.com/SupineAngle-tn146.png | https://example.com/pose/SupineAngle |
| Warrior II | https://example.com/WarriorII-tn146.png | https://example.com/pose/WarriorII |

Note: ensure that you update the Google Sheet with your own data. Refer to the sample sheet for reference.

Scheduled Trigger

The workflow is triggered daily at 21:30 (9:30 PM) Bangkok time (Asia/Bangkok). This ensures timely delivery of reminders to users, keeping them engaged with their yoga practice.

Workflow Process

1. Data Retrieval - Node: Get PoseName fetches yoga pose details from the specified range in the Google Sheet.
2. Content Generation - Node: WritePosesToday uses Azure OpenAI to craft user-friendly text, complete with emojis and clear instructions. Node: RewritePosesToday formats the AI-generated text specifically for Line messaging, ensuring compatibility and visual appeal.
3. JSON Formatting - Node: WriteJSONflex generates the JSON structures required for Line's Flex Messages, enabling carousel displays of yoga pose images and links (a sketch of this structure follows at the end of this section). Node: Fix JSON ensures all JSON outputs are correctly formatted before being sent via Line.
4. Message Delivery - Node: Line Push with Flex Bubble sends the final message, including both text and a Flex Message carousel, directly to users via Line Push Messages.
5. Logging Interactions - Nodes: YogaLog & YogaLog2 log each interaction back into Google Sheets to track which poses were sent and how often they appear, refining future recommendations.

Setup Prerequisites

- Google Sheets account: set up a Google Sheet with the required structure and populate it with your yoga pose data.
- Line Developer account: create a Line channel to obtain the credentials needed for sending push messages.
- Azure OpenAI account: configure access to Azure OpenAI services for generating and formatting content.

Intended Audience

This workflow is ideal for:
- Yoga Instructors seeking to engage students with daily pose suggestions.
- Fitness Enthusiasts looking to maintain consistency in their yoga practice.
- Content Creators interested in automating personalized and visually appealing content distribution.
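For reference, the WriteJSONflex step produces something in the shape of the sketch below, which builds a Flex carousel (Line allows up to 12 bubbles) from the sheet rows. Treat it as a minimal sketch and verify the bubble layout against Line's Flex Message documentation.

```js
// n8n Code node - sketch of building the Flex Message carousel from the
// sheet rows (PoseName, uri, url) for the Line Push API.
const bubbles = $input.all().slice(0, 12).map(({ json: pose }) => ({
  type: 'bubble',
  hero: { type: 'image', url: pose.uri, size: 'full', aspectRatio: '1:1' },
  body: {
    type: 'box',
    layout: 'vertical',
    contents: [{ type: 'text', text: pose.PoseName, weight: 'bold', wrap: true }],
  },
  footer: {
    type: 'box',
    layout: 'vertical',
    contents: [{ type: 'button', action: { type: 'uri', label: 'Learn more', uri: pose.url } }],
  },
}));

return [{
  json: {
    messages: [{
      type: 'flex',
      altText: 'Your yoga poses for today',
      contents: { type: 'carousel', contents: bubbles },
    }],
  },
}];
```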
by Joseph LePage
WordPress + AI Content Creator

This workflow automates the creation and publishing of multi-reading-level content for WordPress blogs. It leverages AI to generate optimized articles, automatically creates featured images, and produces versions of the content at different reading levels (Grade 2, 5, and 9).

How It Works

Content Generation & Processing
- Starts with a manual trigger and a user-defined blog topic
- Uses AI to create a structured blog post with proper HTML formatting
- Separates and validates the title and content components
- Saves a draft version to Google Drive for backup

Multi-Reading-Level Versions

Automatically rewrites the content for different reading levels:
- Grade 9: sophisticated language with appropriate metaphors
- Grade 5: simplified, with light humor and age-appropriate examples
- Grade 2: basic language with simple metaphors and child-friendly explanations

WordPress Integration
- Creates a draft post in WordPress with the Grade 9 version (a sketch of this API call appears at the end of this section)
- Generates a relevant featured image using Pollinations.ai
- Automatically uploads and sets the featured image
- Sends success/error notifications via Telegram

Setup Steps

1. Configure API Credentials
- Set up the WordPress API connection
- Configure OpenAI API access
- Set up the Google Drive integration
- Add Telegram bot credentials for notifications

2. Customize Content Parameters
- Adjust reading-level prompts as needed
- Modify image generation settings
- Set WordPress post parameters

3. Test and Deploy
- Run a test with a sample topic
- Verify all reading-level versions
- Check WordPress draft creation
- Confirm the notification system

This workflow is perfect for content creators who need to maintain a consistent blog presence while catering to different audience reading levels. It's especially useful for educational content, news sites, or any platform that needs to communicate complex topics to diverse audiences.
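For context on the WordPress integration step, the draft-creation call boils down to a single request against the WordPress REST API, which the n8n WordPress node handles for you. A minimal sketch as a plain fetch, with SITE_URL and the application password as placeholders:

```js
// Sketch of the WordPress draft call (Node 18+, global fetch available).
// SITE_URL, admin, and APP_PASSWORD are placeholders; WordPress "application
// passwords" are used for Basic auth against the REST API.
const res = await fetch('https://SITE_URL/wp-json/wp/v2/posts', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Basic ' + Buffer.from('admin:APP_PASSWORD').toString('base64'),
  },
  body: JSON.stringify({
    title: 'Grade 9 version title',
    content: '<p>Generated HTML content...</p>',
    status: 'draft', // keep it unpublished for human review
  }),
});
const post = await res.json();
console.log('Draft created with id', post.id);
```

Keeping `status: 'draft'` is what lets you review the Grade 9 version before anything goes live.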
by Dr. Firas
Google Maps Data Extraction Workflow for Lead Generation

This workflow is ideal for sales teams, marketers, entrepreneurs, and researchers looking to efficiently gather detailed business information from Google Maps for:
- Lead generation
- Market analysis
- Competitive research

Who Is This Workflow For?

- **Sales professionals** aiming to build targeted contact lists
- **Marketers** looking for localized business data
- **Researchers** needing organized, comprehensive business information

Problem This Workflow Solves

Manually gathering business contact details from Google Maps is tedious, error-prone, and time-consuming. This workflow automates data extraction to increase efficiency, accuracy, and productivity.

What This Workflow Does

- Automates extraction of business data (name, address, phone, email, website) from Google Maps
- Crawls and extracts additional website content
- Integrates OpenAI to enhance data processing
- Stores structured results in Google Sheets for easy access and analysis
- Uses the Google Search API to fill in missing information

Setup

1. Import the provided n8n workflow JSON into your n8n instance.
2. Set your OpenAI and Google Sheets API credentials.
3. Provide your Google Maps Scraper and Website Content Crawler API keys.
4. Ensure SerpAPI is configured to enhance data completeness.

Customizing This Workflow to Your Needs

- Adjust scraping parameters: location, business category, country code
- Customize the Google Sheets output format to fit your current data structure
- Integrate additional AI processing steps or APIs for richer data enrichment

Final Notes

This structured approach ensures:
- **Accurate and compliant data extraction** from Google Maps
- Streamlined lead generation
- Actionable and well-organized data ready for business use

Documentation: Notion Guide

Demo video - watch the full tutorial here: YouTube Demo
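One practical addition before the Google Sheets write is a normalization and de-duplication pass, so repeated scrapes of the same area do not pile up duplicate leads. A minimal Code-node sketch follows; the input field names (name, phone, website) are assumptions based on typical scraper output, not the exact schema of this workflow.

```js
// n8n Code node - sketch of normalizing and de-duplicating scraped business
// records before appending them to Google Sheets. Field names are ASSUMED.
const seen = new Set();
const out = [];

for (const { json: biz } of $input.all()) {
  const phone = (biz.phone || '').replace(/[^\d+]/g, ''); // keep digits and '+'
  const key = `${(biz.name || '').toLowerCase().trim()}|${phone}`;
  if (seen.has(key)) continue; // skip duplicate business entries
  seen.add(key);
  out.push({
    json: { ...biz, phone, website: (biz.website || '').replace(/\/$/, '') },
  });
}

return out;
```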
by Ranjan Dailata
This workflow automates the process of querying Bing's Copilot Search, extracting structured data from the results, summarizing the information, and sending a notification via webhook. It leverages Microsoft Copilot to retrieve search results and integrates AI-powered tools for data extraction and summarization.

Who is this for?

- Data Analysts and Researchers who need to gather and summarize information from Bing search results efficiently.
- Developers and Engineers looking to integrate Bing search data into applications or services.
- Digital Marketers and SEO Specialists interested in monitoring search engine results for specific keywords or topics.

What problem is this workflow solving?

Manually extracting and summarizing information from search engine results can be time-consuming and error-prone.

What this workflow does

This workflow automates the process by:
- Performing Bing searches using Bright Data's Bing Search API.
- Extracting structured data from the search results.
- Summarizing the extracted information using AI tools.
- Sending the summarized data to a specified endpoint via webhook.

Setup

1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The Value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by your Web Unlocker token (a request sketch follows at the end of this section).
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Perform a Bing Copilot Request node with the prompt you wish to search for.
6. Update the Structured Data Webhook Notifier node with the webhook endpoint of your choice.
7. Update the Summary Webhook Notifier node with the webhook endpoint of your choice.

How to customize this workflow to your needs

- Modify search queries: adjust the search terms to target different topics or keywords.
- Change the data extraction logic: customize the extraction process to capture specific data points from the search results.
- Alter summarization techniques: integrate different AI models or adjust parameters to change how summaries are generated.
- Update webhook endpoints: direct the summarized data to different endpoints as required.
- Schedule workflow runs: set up automated triggers to run the workflow at desired intervals.
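To make step 3 of the setup concrete, the request the workflow sends through the Web Unlocker looks roughly like the sketch below. The zone name and the exact body fields are assumptions to verify against your Bright Data dashboard; only the Bearer header matches the setup instructions above.

```js
// Sketch of a Bing search routed through Bright Data's Web Unlocker
// (Node 18+, global fetch). Zone name and body shape are ASSUMPTIONS.
const res = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Bearer XXXXXXXXXXXXXX', // the Web Unlocker token from setup
  },
  body: JSON.stringify({
    zone: 'web_unlocker1', // assumed zone name from your dashboard
    url: 'https://www.bing.com/search?q=' + encodeURIComponent('n8n automation'),
    format: 'raw', // return the raw HTML for downstream extraction
  }),
});
const html = await res.text(); // fed to the extraction / summarization nodes
```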
by Jimleuk
This n8n template demonstrates the beginnings of building your own n8n-powered WhatsApp chatbot! Under the hood, it utilises n8n's powerful AI features to handle different message types and uses an AI agent to respond to the user. A powerful tool for any use case!

How it works

- The incoming WhatsApp Trigger provides a way to get messages into the workflow.
- The received message is extracted and sent through 1 of 4 branches for processing (a routing sketch follows at the end of this section).
- Each processing branch uses AI to analyse, summarize, or transcribe the message so that the AI agent can understand it. The supported types are text, image, audio (voice notes), and video.
- The AI Agent generates a response and uses a Wikipedia tool for more complex queries.
- Finally, the response message is sent back to the WhatsApp user using the WhatsApp node.

How to use

Once you have set up and configured your WhatsApp account, you'll need to activate your workflow to start processing messages. Good to know: large media files may negatively impact workflow performance.

Requirements

- WhatsApp Business account
- Google Gemini for the LLM. Gemini is used specifically because it can accept audio and video files, whereas at the time of writing many other providers, such as OpenAI's GPT, do not.

Customising this workflow

- For performance reasons, consider detecting large audio and video files before sending them to the LLM. Pre-processing such files may allow your agent to perform better.
- Go beyond text: create rich, engaging customer experiences by responding with images, audio, and video instead of just text!
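The routing into the four branches can be pictured as a small classification step. The sketch below follows the WhatsApp Cloud API webhook shape (messages[0].type is one of text, image, audio, or video); the n8n WhatsApp Trigger may flatten the payload differently, so treat the paths as assumptions.

```js
// n8n Code node - sketch of routing an incoming WhatsApp message to a branch.
// Payload paths follow the WhatsApp Cloud API webhook shape (ASSUMED here);
// the n8n WhatsApp Trigger may expose a flattened structure instead.
const msg = $json.messages?.[0] ?? $json;
const type = msg.type; // 'text' | 'image' | 'audio' | 'video' per the Cloud API

const supported = ['text', 'image', 'audio', 'video'];
if (!supported.includes(type)) {
  return [{ json: { branch: 'unsupported', type } }];
}

return [{
  json: {
    branch: type,
    // text carries the body directly; media types carry a media id to download
    body: type === 'text' ? msg.text?.body : msg[type]?.id,
  },
}];
```

In the actual template a Switch node plays this role, but the same branching logic applies.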
by Sebastian/OptiLever
Tired of spending HOURS writing product descriptions that don't rank or convert? This could be your solution.

This free Product Description Writer workflow for n8n uses a multi-agent AI system to turn your product list into conversion-focused, SEO-ready copy. It analyzes your product images, identifies key features, and writes optimized titles and descriptions for platforms like Shopify and Google Shopping. It can process your entire catalog in minutes, saving you countless hours of manual work.

This workflow is perfect for:
- Shopify stores
- Etsy sellers
- Product managers
- Digital marketers
- Anyone who hates writing product copy manually!

How it works

This workflow automates the entire product description process in a few high-level steps:
1. Reads Your Products: The workflow starts by reading product data from your specified Google Sheet, including the product name, an image URL, and optional fields like brand voice or target market.
2. Analyzes Product Images: It downloads each product image and uses an AI vision model (GPT-4o-mini) to perform a detailed visual analysis, extracting objective information like materials, colors, features, and structure.
3. Writes Optimized Copy: The visual analysis and your original data are passed to two specialized AI agents. The first drafts a Shopify-optimized title and description, while the second refines it and generates additional SEO-focused copy for Google Merchant Center.
4. Updates Your Spreadsheet: The final, optimized product titles and descriptions for both Shopify and Google are automatically written back to the original Google Sheet.

Set up steps

Setting up this workflow takes only a few minutes. You will need to configure credentials for the following services:
- **Google Sheets**: To allow the workflow to read your product list and write back the results.
- **OpenAI**: To power the AI agents that analyze images and generate the copy.

Detailed instructions and customization tips are included in the sticky notes inside the workflow itself.

Benefits

- **Automated Vision-Based Copywriting**: Reduces manual description-writing time.
- **Multi-Channel Ready**: Outputs are optimized for both Shopify and Google Merchant Center standards.
- **Brand Alignment**: Uses optional user-provided draft descriptions and brand voice to maintain brand tone.
- **SEO and Conversion Focus**: Titles and descriptions are optimized for both search engines and consumer engagement.
- **Image-Centric Accuracy**: Uses actual product images for accurate attribute extraction, minimizing errors from missing or vague text data.

Tips & Customization

- To adjust brand voice or tone, modify the system prompts in the Shopify and GMC AI agents.
- To extend the workflow for scheduled runs, add a cron trigger or a Google Sheets "status column" filter (a sketch of such a filter follows at the end of this section).
- For QA/debugging, consider adding logging nodes to Slack or Discord, or export AI outputs to a review sheet before updating the main sheet.
- To improve Shopify or GMC field mappings, edit the column settings of the final Google Sheets update node.
- For speed optimization, the batch size in the Loop Over Items node can be adjusted, but be mindful of API rate limits.
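The status-column filter mentioned in the tips could be as small as the sketch below, run between the sheet read and the loop. The column names are assumptions; match them to your own sheet.

```js
// n8n Code node - sketch of a "status column" filter: only pass rows whose
// Shopify Description is still empty, so reruns skip finished products.
// Column names ('Image URL', 'Shopify Description') are ASSUMPTIONS.
return $input.all().filter(({ json: row }) =>
  (row['Image URL'] || '').trim() !== '' &&        // must have an image to analyze
  (row['Shopify Description'] || '').trim() === '' // not yet written
);
```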
by Agentick AI
This n8n workflow automates the process of collecting job and decision-maker data, crafting AI-generated referral messages, and drafting them in Gmail, all using a combination of Apify, Google Sheets, LLMs, and email APIs.

Use cases

- Auto-sourcing job postings from LinkedIn via Apify
- Identifying decision-makers at relevant companies
- Auto-drafting custom referral request messages using AI
- Exporting structured data to Google Sheets and drafting Gmail messages for outreach

Good to know

- You can customize the filtering logic to target specific cities or companies.
- Message creation uses the Gemini 2.0 Flash model and LangChain's output parser for structured messages.
- Email data is fetched using Anymailfinder, but this can be replaced with other providers like Hunter.io.
- The Gmail API drafts the message, but you need to enable Gmail API access in your Google Cloud console.

How it works

1. Trigger: a Schedule Trigger runs the automation daily.
2. Job Data Extraction: Apify pulls job listings using a predefined actor. The HTTP response is split and structured using the Split Out node.
3. Store Job Data: job listings are saved to a Google Sheet. The node maps key fields like title, company, location, and poster info.
4. Decision-Maker Discovery: another Apify actor pulls decision-maker data from LinkedIn. This is split and filtered (e.g., by city or company name).
5. Store Contacts: contact details (name, title, location, etc.) are appended to another Google Sheet (n8n-sheet).
6. Message Generation: an LLM Chain uses Gemini 2.0 Flash to generate short, custom LinkedIn messages. The messages respect rules like tone, length (<100 words), and personalization.
7. Parse & Merge AI Output: the output is structured using the Structured Output Parser and merged with the contact data.
8. Save Final Messages: the final headline and body are stored back into Google Sheets (n8n-sheet).
9. Email Discovery: the Get Email IDs node hits the Anymailfinder API using the LinkedIn profile link.
10. Draft in Gmail: using the Gmail API, the message is drafted in your inbox with the subject and body auto-filled (a sketch of this call follows at the end of this section).

How to use

- Update the Apify actor inputs to specify roles, companies, or locations.
- Replace the Schedule Trigger with a webhook or form input if desired.
- Update the Google Sheets document and sheet name in the relevant nodes.
- Add your Gmail and Anymailfinder credentials in the n8n settings.

Requirements

- Google Sheets API access
- Gmail API access
- Apify account
- Gemini API key (via Google AI Studio)
- Anymailfinder (or an alternate email discovery API)

Customizing this workflow

This framework is highly modular. You can:
- Add more filters for company size, role, or hiring urgency
- Use alternate LLMs (OpenAI, Claude, etc.)
- Switch output channels (Slack, WhatsApp, etc.)
- Plug in different CRM tools for follow-ups
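For context on the final step, creating a Gmail draft reduces to one call against the Gmail REST API, which the n8n Gmail node wraps for you. A minimal sketch, where ACCESS_TOKEN stands in for an OAuth2 token with the gmail.compose scope and the address and message text are placeholders:

```js
// Sketch of creating a Gmail draft via the REST API (Node 18+, global fetch).
// Gmail expects the full RFC 822 message, base64url-encoded, under message.raw.
const rfc822 = [
  'To: contact@example.com',
  'Subject: Referral request',
  '',
  'Hi, I came across your profile while applying to ...',
].join('\r\n');

const raw = Buffer.from(rfc822).toString('base64url');

const res = await fetch('https://gmail.googleapis.com/gmail/v1/users/me/drafts', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer ACCESS_TOKEN', // OAuth2 token, gmail.compose scope
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({ message: { raw } }),
});
console.log('Draft id:', (await res.json()).id);
```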
by Dhruv from Saleshandy
How it works

This workflow automates QA review of Intercom support conversations by:
1. Triggering on conversation.admin.closed events via a webhook
2. Fetching the full conversation data using the Intercom API
3. Structuring and summarizing the conversation into a readable transcript (a sketch of this step follows at the end of this section)
4. Using GPT to evaluate: response time, clarity, tone & behavior, urgency handling, ownership & resolution
5. Logging structured QA scores in a Google Sheet
6. Providing coaching-style feedback if the rating is 3 or below

Set up steps

- Configure your Intercom and OpenAI credentials in n8n
- Set up the webhook in Intercom to post on conversation close
- Use your OpenAI API key for the GPT-based nodes
- Connect your Google Sheet (or replace it with another data sink)
- Add your own filtering logic for spam/promotional tickets if needed

Note: this workflow contains sticky notes that explain each step inside the n8n canvas.
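The transcript-building step can be sketched as a small Code node over the conversation object returned by Intercom's GET /conversations/{id} endpoint, whose reply parts live under conversation_parts.conversation_parts. A minimal version with a naive HTML stripper; for production, a proper HTML-to-text library would be safer.

```js
// n8n Code node - sketch of flattening a fetched Intercom conversation into
// the transcript handed to GPT. Follows the conversation_parts shape from
// Intercom's REST API; stripHtml is a deliberately naive tag remover.
const conv = $json;
const stripHtml = (s) => (s || '').replace(/<[^>]+>/g, ' ').replace(/\s+/g, ' ').trim();

const lines = [`Customer: ${stripHtml(conv.source?.body)}`];
for (const part of conv.conversation_parts?.conversation_parts ?? []) {
  if (!part.body) continue; // skip assignment/close events that carry no text
  const who = part.author?.type === 'admin' ? 'Agent' : 'Customer';
  const when = new Date(part.created_at * 1000).toISOString(); // unix seconds
  lines.push(`${who} (${when}): ${stripHtml(part.body)}`);
}

return [{ json: { transcript: lines.join('\n') } }];
```

Including the timestamps in each line is what lets the GPT evaluator score response time alongside tone and resolution.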