by Oneclick AI Squad
This n8n workflow automates the process of scraping LinkedIn profiles using the Apify platform and organizing the extracted data into Google Sheets for easy analysis and follow-up.

**Use Cases**
- **Lead Generation**: Extract contact information and professional details from LinkedIn profiles
- **Recruitment**: Gather candidate information for talent acquisition
- **Market Research**: Analyze professional networks and industry connections
- **Sales Prospecting**: Build targeted prospect lists with detailed professional information

**How It Works**

1. Workflow Initialization & Input
- **Webhook Start Scraper**: Triggers the entire scraping workflow
- **Read LinkedIn URLs**: Retrieves LinkedIn profile URLs from Google Sheets
- **Schedule Scraper Trigger**: Sets up automated scheduling for regular scraping

2. Data Processing & Extraction
- **Data Formatting**: Prepares and structures the LinkedIn URLs for processing
- **Fetch Profile Data**: Makes HTTP requests to the Apify API with profile URLs (see the request sketch below)
- **Run Scraper Actor**: Executes the Apify LinkedIn scraper actor
- **Get Scraped Results**: Retrieves the extracted profile data from Apify

3. Data Storage & Completion
- **Save to Google Sheets**: Stores the scraped profile data in an organized spreadsheet format
- **Update Progress Tracker**: Updates workflow status and progress tracking
- **Process Complete Wait**: Ensures all operations finish before the final steps
- **Send Success Notification**: Alerts users when scraping completes successfully

**Requirements**

Apify Account
- Active Apify account with sufficient credits
- API token for authentication
- Access to the LinkedIn Profile Scraper actor

Google Sheets
- Google account with Sheets access
- Properly formatted input sheet with LinkedIn URLs
- Credentials configured in n8n

n8n Setup
- HTTP Request node credentials for Apify
- Google Sheets node credentials
- Webhook endpoint configured

**How to Use**

Step 1: Prepare Your Data
- Create a Google Sheet with LinkedIn profile URLs
- Ensure the sheet has a column named `linkedin_url`
- Add any additional columns for metadata (name, company, etc.)

Step 2: Configure Credentials
- Set up Apify API credentials in n8n
- Configure Google Sheets authentication
- Update the webhook endpoint URL

Step 3: Customize Settings
- Adjust scraping parameters in the Apify node
- Modify the data fields to extract based on your needs
- Set up notification preferences

Step 4: Execute Workflow
- Trigger via webhook or manual execution
- Monitor progress through the workflow
- Check Google Sheets for scraped data
- Review completion notifications

**Good to Know**
- **Rate Limits**: LinkedIn scraping is subject to rate limits. The workflow includes delays to respect these limits.
- **Data Quality**: Results depend on profile visibility and LinkedIn's anti-scraping measures.
- **Costs**: Apify charges based on compute units used. Monitor your usage to control costs.
- **Compliance**: Ensure your scraping activities comply with LinkedIn's Terms of Service and applicable laws.
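To make the Fetch Profile Data / Run Scraper Actor steps concrete, here is a minimal sketch of the HTTP call the workflow performs against Apify. The actor ID and the `profileUrls` input field are assumptions — match them to the input schema of the LinkedIn scraper actor you actually use.

```javascript
// Hedged sketch: run an Apify actor synchronously and fetch its dataset items.
// ACTOR_ID and the `profileUrls` field are placeholders, not confirmed values.
const APIFY_TOKEN = 'YOUR_APIFY_TOKEN';
const ACTOR_ID = 'your-username~linkedin-profile-scraper';

const res = await fetch(
  `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
  {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      profileUrls: ['https://www.linkedin.com/in/example-profile/'],
    }),
  }
);
const profiles = await res.json(); // array of scraped profile objects
```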
**Customizing This Workflow**

Enhanced Data Processing
- Add data enrichment steps to append additional information
- Implement duplicate detection and merge logic (see the sketch after this section)
- Create data validation rules for quality control

Advanced Notifications
- Set up Slack or email alerts for different scenarios
- Create detailed reports with scraping statistics
- Implement error recovery mechanisms

Integration Options
- Connect to CRM systems for automatic lead creation
- Integrate with marketing automation platforms
- Export data to analytics tools for further analysis

**Troubleshooting**

Common Issues
- **Apify Actor Failures**: Check API limits and actor status
- **Google Sheets Errors**: Verify permissions and sheet structure
- **Rate Limiting**: Implement longer delays between requests
- **Data Quality Issues**: Review scraping parameters and target profiles

Best Practices
- Test with small batches before scaling up
- Monitor Apify credit usage regularly
- Keep backup copies of your data
- Regularly validate the accuracy of scraped information
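For the duplicate-detection customization mentioned above, a minimal sketch for an n8n Code node might look like this, assuming each incoming item carries the `linkedin_url` column from the sheet:

```javascript
// Sketch: drop items whose linkedin_url has already been seen in this run.
// Adjust the field name if your sheet uses a different column.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const url = (item.json.linkedin_url || '').trim().toLowerCase();
  if (!url || seen.has(url)) continue; // skip blanks and duplicates
  seen.add(url);
  unique.push(item);
}

return unique;
```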
by Luciano Gutierrez
**Instagram Auto-Comment Responder with AI Agent Integration**
Version: 1.1.0 | n8n Version: 1.88.0+ | License: MIT

A fully automated workflow for managing and responding to Instagram comments using AI agents. Designed to improve engagement and save time, this system listens for new Instagram comments, verifies and filters them, fetches relevant post data, processes valid messages with a natural language AI, and posts context-aware replies directly on the original post.

**Key Features**
- **AI-Driven Engagement**: Intelligent responses to comments via a GPT-powered agent.
- **Webhook Verification**: Handles the Instagram webhook handshake to ensure secure integration.
- **Data Extraction**: Maps incoming payload fields (user ID, username, message text, media ID) for processing.
- **Self-Comment Filtering**: Automatically skips comments made by the account owner to prevent loops.
- **Post Data Retrieval**: Fetches the media's id and caption from the Graph API (v22.0) before generating a reply.
- **Natural Language Processing**: Uses a custom system prompt to maintain brand tone and context.
- **Automated Replies**: Posts the AI-generated message back to the comment thread using Instagram's API.
- **Modular Architecture**: Clear separation of steps via sticky notes and dedicated HTTP Request and Agent nodes.

**Use Cases**
- **Social Media Automation**: Keep followers engaged 24/7 with instant, relevant replies.
- **Community Building**: Maintain a consistent voice and tone across all interactions.
- **Brand Reputation Management**: Ensure no valid comment goes unanswered.
- **AI Customer Support**: Triage simple questions and direct followers to resources or support.

**Technical Implementation**

Webhook Verification
- Node: Webhook + Respond to Webhook
- Echoes hub.challenge to confirm subscription and secure incoming events (see the sketch at the end of this section).

Data Extraction
- Node: Set
- Maps payload fields into structured variables: conta.id, usuario.id, usuario.name, usuario.message.id, usuario.message.text, usuario.media.id, endpoint.

User Validation
- Node: Filter
- Skips processing if conta.id equals usuario.id (self-comments).

Post Data Retrieval
- Node: HTTP Request (Get post data)
- GET https://graph.instagram.com/v22.0/{{ $json.usuario.media.id }}?fields=id,caption&access_token={{ credentials }}
- Captures the media's caption for richer context in replies.

AI Response Generation
- Nodes: AI Agent + OpenRouter Chat Model
- Uses a detailed system prompt with: a profile persona (expert in AI & automations, friendly tone); input data (username, comment text, post caption); filtering logic (spam, praise, questions, vague comments).
- Returns either the reply text or [IGNORE] for irrelevant content.

Posting the Reply
- Node: HTTP Request (Post comment)
- POST {{ $json.endpoint }}/{{ $json.usuario.message.id }}/replies with message={{ $json.output }}
- Sends the AI answer back under the original comment.

**Instructions for Setup**
1. Import Workflow: In n8n > Workflows > Import from File, upload the provided .json template.
2. Configure Credentials: Instagram Graph API (Header Auth or FacebookGraphApi) with the instagram_basic and instagram_manage_comments scopes; OpenRouter/OpenAI API key for the AI agent.
3. Customize System Prompt: Edit the AI Agent's prompt to adjust brand tone, language (Brazilian Portuguese), length, or emoji usage.
4. Test & Activate: Publish a test comment on an Instagram post and verify each node's execution, ensuring the webhook, filter, data extraction, HTTP requests, and AI Agent respond as expected.
5. Extend & Monitor: Add sentiment analysis or lead capture nodes as needed, and monitor execution logs for errors or rate-limit events.

Tags: Social Media • Instagram Automation • Webhook Verification • AI Agent • HTTP Request • Auto Reply • Community Management
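For reference, here is a minimal sketch of the webhook-handshake logic described under Webhook Verification, written as it might appear in an n8n Code node. Meta sends hub.mode, hub.verify_token, and hub.challenge as query parameters; the VERIFY_TOKEN value is an assumption you define yourself in your Meta app settings.

```javascript
// Handshake sketch: echo hub.challenge back when the verify token matches.
// In the actual workflow this is handled by the Webhook + Respond to Webhook nodes.
const query = $input.first().json.query || {};
const VERIFY_TOKEN = 'my-secret-verify-token'; // assumption: set in your Meta app

if (query['hub.mode'] === 'subscribe' && query['hub.verify_token'] === VERIFY_TOKEN) {
  // Respond with the raw challenge string so Instagram confirms the subscription.
  return [{ json: { challenge: query['hub.challenge'] } }];
}

return [{ json: { error: 'Verification failed' } }];
```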
by Robert Breen
This n8n workflow reads emails from your Outlook inbox, drafts AI-powered replies using OpenAI, and routes them through the gotoHuman node for human approval before replying automatically.

**Key Features**
- **Reads Outlook emails** from today only (excluding those from your own address).
- **AI-generated replies** crafted using OpenAI based on the subject and body of the email.
- **Community node integration**: Uses the gotoHuman node for human review and approval of replies before sending.
- **Safe sending**: Only approved responses are automatically sent back via Outlook.
- **Expandable**: Can easily be modified to send drafts instead of full replies, include additional email filters, or trigger at intervals or via webhook.

**Nodes Used**
- Microsoft Outlook – Fetch and reply to emails
- OpenAI – Generates smart reply text
- gotoHuman – Human-in-the-loop approval system
- Loop Over Items, IF, Code, and Set nodes for processing logic
- Manual Trigger – For testing

**Setup Instructions**

1. Connect APIs

Outlook OAuth2:
- Go to the Azure Portal and register an app
- Add the Mail.Read and Mail.Send scopes
- Set the redirect URI: https://api.n8n.cloud/oauth2-credential/callback
- Paste the credentials into the n8n credential manager

OpenAI API:
- Create an account at OpenAI
- Create an API key
- Add it to n8n credentials

gotoHuman API:
- Go to https://gotoHuman.ai and sign in
- Create a review template (e.g., "Email Responses")
- Copy the Template ID and API key into n8n credentials

**Workflow Steps Overview**

1. Trigger
Use the Manual Trigger to test, or schedule execution with a Cron node.

2. Filter Emails from Today
A Code node outputs today's date in the proper yyyy-mm-dd format:

const today = new Date();
today.setHours(0, 0, 0, 0);
return [{ json: { searchQuery: `received:${today.toISOString().split('T')[0]}` } }];

3. Search and Filter Outlook Messages
Uses the Outlook node with a search query like:
received:2025-08-06 -from:rbreen@ynteractive.com
(Update this to your own email address; a combined sketch follows at the end of this section.)

4. Generate AI Response
Text prompt to OpenAI:
subject: {{ $json.subject }}
body: {{ $json.body.content }}

System prompt:
> You are a personal assistant helping respond to emails. I am an AI automation expert specializing in helping small and medium-size businesses automate processes. Create a short response to the email. Sign the email as Robert Breen.

5. Review with gotoHuman
Submit the AI output for human approval using the gotoHuman node. The output schema should match the Review Template fields (e.g., "email", "OriginalEmail").

6. IF Node Decision
- If the status is approved, send the reply
- If not, return to the loop for revision or skip

**Customization Ideas**
- Send only drafts by skipping the "reply" step and storing results.
- Schedule the workflow with a Cron trigger for automation.
- Add label filters or subject keywords for advanced targeting.

**External Links**
- gotoHuman Community Node
- OpenAI
- Microsoft Outlook API Setup

**Need More Help?**
If you'd like help customizing this or building similar automations, reach out:
Robert Breen – AI & Automation Consultant
Website: https://ynteractive.com
Email: robert.j.breen@gmail.com
LinkedIn
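Following up on steps 2 and 3 above, the date filter and the self-address exclusion can be assembled in a single Code node. This is a sketch; the MY_EMAIL value is a placeholder to replace with your own address.

```javascript
// Sketch: build the Outlook search query for "received today, not from me".
const MY_EMAIL = 'you@example.com'; // placeholder – use your own address

const today = new Date();
today.setHours(0, 0, 0, 0);
const dateStr = today.toISOString().split('T')[0]; // yyyy-mm-dd

return [{ json: { searchQuery: `received:${dateStr} -from:${MY_EMAIL}` } }];
```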
by Ron
**Objective**
In industry and production, machine data is sometimes available in databases. That might be sensor data like temperature or pressure, or just binary information. This sample flow reads machine data and sends an alert to your SIGNL4 team when the machine is down. When the machine is up again, the alert in SIGNL4 is closed automatically.

**Setup**
We simulate the machine data using a Notion table. When we un-check the Up box, we simulate a machine-down event. At regular intervals, n8n checks the database for down items. If such an item is found, an alert is sent using SIGNL4 and the item in Notion is updated (so it is not read again). Status updates from SIGNL4 (acknowledgement, close, annotation, escalation, etc.) are received via webhook, and we update the Notion item accordingly.

This is what the alert looks like in the SIGNL4 app. The flow can easily be adapted to other database monitoring scenarios. A sketch of the alert call follows below.
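As a reference, here is a hedged sketch of the SIGNL4 alert call, assuming the standard SIGNL4 inbound webhook format; the team secret, machine name, and field values are placeholders, so verify them against your SIGNL4 documentation.

```javascript
// Sketch: raise a machine-down alert via the SIGNL4 inbound webhook.
const TEAM_SECRET = 'your-team-secret'; // placeholder

await fetch(`https://connect.signl4.com/webhook/${TEAM_SECRET}`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    Title: 'Machine down: Press 01',
    Message: 'Machine reported down in the Notion machine table.',
    // The external ID ties the alert to the machine, so a later call with
    // "X-S4-Status": "resolved" and the same ID closes it automatically.
    'X-S4-ExternalID': 'press-01',
    'X-S4-Status': 'new',
  }),
});
```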
by Yang
**Who is this for?**
This workflow is ideal for virtual assistants, researchers, developers, automation specialists, and data analysts who need to regularly extract and organize structured product information (like books) from a website. It's especially useful for those working with catalog-based websites who want to automate the extraction and delivery of clean, sorted data.

**What problem is this solving?**
Manually copying product listings like book titles and prices from a website into a spreadsheet is slow and repetitive. This automation solves that problem by scraping content using Dumpling AI, extracting the right data using CSS selectors, and formatting it into a clean CSV file that is sent to your email – all triggered automatically when a new URL is added to Google Sheets.

**What this workflow does**
This template automates an entire content scraping and delivery process:
- Watches a Google Sheet for new URLs
- Scrapes the HTML content of the given webpage using Dumpling AI
- Uses CSS selectors in the HTML node to extract each book from the page
- Splits the HTML array into individual items
- Extracts the book title and price from each HTML block
- Sorts the books in descending order based on price
- Converts the sorted data to a CSV file
- Sends the CSV via email using Gmail

**Setup**

Google Sheets
- Create a sheet titled something like URLs
- Add your product listing URLs (e.g., http://books.toscrape.com)
- Connect the Google Sheets trigger node to your sheet
- Ensure you have proper credentials connected

Dumpling AI
- Create an account at Dumpling AI
- Generate your API key
- Set the HTTP method to POST and pass the URL dynamically from the Google Sheet
- Use Header Auth to include your API key in the request header
- Make sure "cleaned": "True" is included in the body for optimized HTML output

HTML Node
- The first HTML node extracts the main book container blocks using: .row > li
- The second HTML node parses out the individual fields: title: h3 > a (via the title attribute); price: .price_color

Sort Node
- Sorts books by price in descending order
- Note: the price is extracted as a string; ensure it's parsable if you plan to use numeric filtering later (see the sketch after this section)

Convert to CSV
- The JSON data is passed into a Convert node and transformed into a CSV file

Gmail
- Sends the CSV as an attachment to a designated email

**How to customize this workflow**
- **Extract more data**: Add more CSS selectors in the second HTML node to pull fields like author, availability, or product links
- **Switch destinations**: Replace Gmail with Slack, Google Drive, Dropbox, or another platform
- **Adjust sorting**: Sort alphabetically or based on another extracted value
- **Use a different source**: As long as the site structure is consistent, this can scrape any listing-like page
- **Trigger differently**: Use a webhook, form submission, or schedule trigger instead of Google Sheets

**Dependencies and Notes**
- This workflow uses Dumpling AI to perform the web scraping. This requires an API key and uses credits per request.
- The HTML node depends on valid CSS selectors. If the site layout changes, the selectors may need to be updated.
- Ensure you're not scraping content from websites that prohibit automated scraping.
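Picking up the Sort Node note above: a small Code node can normalize the scraped price strings before sorting. This sketch assumes the `price` field produced by the second HTML node.

```javascript
// Sketch: turn scraped price strings (e.g. "£51.77") into numbers so that
// sorting and numeric filtering behave correctly.
return $input.all().map((item) => {
  const raw = String(item.json.price || '');
  // Strip currency symbols and anything that isn't a digit, dot, or minus sign.
  const numeric = parseFloat(raw.replace(/[^0-9.\-]/g, ''));
  return { json: { ...item.json, price: isNaN(numeric) ? null : numeric } };
});
```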
by William Lettieri
**Overview**
Transform your LLM into a powerful GitHub automation specialist with this n8n workflow template. In a world where multiple MCP servers can overwhelm LLMs with context, this streamlined solution provides a dedicated GitHub Agent that handles all GitHub API operations through a single, specialized tool.

When you need GitHub operations like creating repositories, managing issues, or handling pull requests, your LLM can make one simple call to the GitHub Agent. This agent specializes exclusively in GitHub MCP server operations, offloading all contextual complexity and providing clean, efficient GitHub automation.

**Features**
- **Single MCP Server Trigger** – One tool and one parameter to handle all GitHub API interactions
- **Specialized GitHub Agent** – Dedicated AI agent with a direct GitHub MCP Server connection
- **Self-Executing Workflow** – The "When Executed by Another Workflow" trigger enables seamless workflow chaining
- **Scalable Architecture** – Ready to integrate with unlimited GitHub tools and operations
- **Context Optimization** – Reduces LLM token usage by delegating GitHub complexity to a specialized agent
- **Flexible Request Processing** – Handles any GitHub operation through natural language requests

**Use Cases**
- **Repository Management** – Create, clone, and manage repositories programmatically
- **Issue Tracking** – Automate issue creation, updates, and management workflows
- **Pull Request Automation** – Streamline code review and merge processes
- **GitHub Actions Integration** – Trigger and monitor CI/CD workflows
- **Team Collaboration** – Automate notifications and team management tasks
- **Documentation Updates** – Automatically update README files and documentation

**Workflow Architecture**

Node breakdown:
1. MCP Server Trigger – Receives requests with GitHub operation parameters
2. Set GitHub Username – Configures the GitHub user context for API calls
3. OpenAI Chat Model – Powers the intelligent GitHub agent with contextual understanding
4. Simple Memory – Maintains conversation context and operation history
5. GitHub AI Agent – Specialized Tools Agent with direct GitHub MCP Server access

[MCP Server Trigger] → [Set GitHub Username] → [GitHub AI Agent] → [OpenAI Chat Model] → [Simple Memory] → [GitHub API Operations]

**Requirements**

Essential prerequisites:
- OpenAI API Key – For AI Agent and Chat Model functionality
- GitHub Username Configuration – Edit the "Set GitHub Username" node with your GitHub username for API calls
- n8n Version – Compatible with n8n 2024+ releases
- MCP Server Setup – Existing GitHub MCP server configuration

Recommended setup:
- GitHub Personal Access Token with appropriate permissions
- Basic understanding of n8n workflow configuration
- Familiarity with GitHub API operations

**Setup Instructions**

Step 1: Import and Configure
- Import the workflow template into your n8n instance
- Navigate to the Set GitHub Username node
- Replace the placeholder with your actual GitHub username

Step 2: API Keys Setup
- Configure your OpenAI API key in the Chat Model node
- Ensure your GitHub credentials are properly configured in n8n
- Test the connection to verify API access

Step 3: MCP Server Integration
- Connect your existing GitHub MCP server to the workflow
- Verify the MCP Server Trigger is properly configured
- Test with a simple GitHub operation (e.g., "List my repositories"); see the call sketch below

Step 4: Deploy and Test
- Activate the workflow in your n8n instance
- Test with various GitHub operations to ensure functionality
- Monitor execution logs for any configuration issues

**Customization Options**

Agent Behavior
- **Modify the Chat Model prompt** to adjust agent personality and response style
- **Configure memory settings** to control conversation context retention
- **Adjust timeout settings** for long-running GitHub operations

GitHub Operations
- **Extend supported operations** by adding new GitHub API endpoints
- **Configure repository filters** to limit the scope of operations
- **Set up notification preferences** for important GitHub events

Integration Points
- **Webhook triggers** for real-time GitHub event processing
- **Scheduled operations** for regular repository maintenance
- **Cross-workflow triggers** for complex automation chains

**Pro Tips**
- **Start Simple**: Begin with basic operations like repository listing before attempting complex workflows
- **Monitor Token Usage**: The specialized agent approach significantly reduces OpenAI API costs
- **Batch Operations**: Group related GitHub operations into single requests for efficiency
- **Error Handling**: The agent provides detailed error messages for troubleshooting

**Support and Community**
- **Documentation**: Official n8n Documentation
- **Community Forum**: n8n Community
- **Issues & Contributions**: Feel free to suggest improvements or report issues

**License**
This workflow template is provided under the MIT License. You're free to use, modify, and redistribute with attribution.

Created by: William Lettieri
Version: 1.0
Last Updated: May 28, 2025
Compatibility: n8n 2024+
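As a reference for the Step 3 test call above, here is a hedged sketch of the single-parameter request a parent workflow might pass to the agent via Execute Workflow. The `request` field name is an assumption — match it to the parameter your MCP Server Trigger actually exposes.

```javascript
// Sketch: the one natural-language parameter the GitHub Agent consumes.
return [{
  json: {
    request: 'Create a private repository named "automation-sandbox" and open an issue titled "Initial setup checklist".',
  },
}];
```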
by Ranjan Dailata
**Who this is for**
Extract & Summarize Indeed Company Info is an automated workflow that extracts Indeed company profile information using Bright Data Web Unlocker, transforms it using Google Gemini's LLM, and forwards the transformed response with the summary to a specified webhook for downstream use.

This workflow is tailored for:
- Recruiters and HR teams looking to assess companies quickly during talent sourcing.
- Job seekers researching potential employers and needing summarized company insights.
- Market researchers and analysts monitoring competitor or industry players.

**What problem is this workflow solving?**
Searching and evaluating company profiles on Indeed manually can be time-consuming and inefficient, especially when dealing with large volumes of companies. Manually browsing, copying, and summarizing company descriptions, reviews, and ratings from Indeed hinders productivity and limits real-time insights.

This workflow solves this by:
- Automating the extraction of company details from Indeed using Bright Data Web Unlocker.
- Summarizing the raw data using Google Gemini's language model for a quick, human-readable overview.
- Sending the transformed response with the summary to a chosen endpoint, like Slack, Notion, Airtable, or a custom webhook.

**What this workflow does**
This automated pipeline does the following:
- Scrapes Indeed company profile pages (e.g., ratings, description, reviews) using Bright Data's Web Unlocker (see the request sketch below).
- Transforms the scraped content into structured JSON using n8n's built-in tools.
- Summarizes and extracts meaningful insights using Google Gemini's large language model.
- Forwards the summarized, formatted response to a specified webhook or app for real-time access, storage, or analysis.

**Setup**
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication).
- In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
- Update the search query and Bright Data zone by navigating to the Set Indeed Search Query node.
- Update the Webhook Notifier with the webhook endpoint of your choice.

**How to customize this workflow to your needs**
This workflow is built to be flexible – whether you're a recruiter, market researcher, entrepreneur, or data analyst. Here's how you can adapt it to fit your specific use case:
- **Changing the data source**: Replace the Indeed search input with other job or business listing platforms if needed (e.g., Glassdoor, Crunchbase).
- **Refining the LLM prompt**: Tailor the Gemini prompt to transform or summarize the Indeed company information in a specific format.
- **Routing the output to different destinations**: Send summaries or the transformed response to Google Sheets, Airtable, or CRMs like HubSpot or Salesforce.
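For orientation, here is a hedged sketch of the Web Unlocker call behind the scraping step, assuming Bright Data's /request endpoint; the zone name, token, and target URL are placeholders to verify against your own zone configuration.

```javascript
// Sketch: fetch an Indeed company page through Bright Data Web Unlocker.
const res = await fetch('https://api.brightdata.com/request', {
  method: 'POST',
  headers: {
    Authorization: 'Bearer YOUR_BRIGHT_DATA_TOKEN', // placeholder
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    zone: 'web_unlocker1',                         // your Web Unlocker zone
    url: 'https://www.indeed.com/cmp/Example-Co',  // company profile to fetch
    format: 'raw',                                 // return the raw HTML
  }),
});
const html = await res.text(); // passed on to the transform + Gemini steps
```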
by Amjid Ali
**Automate Digital Delivery After PayPal Purchase Using n8n**
A Complete Step-by-Step Guide to Seamless Template Delivery
Built by Amjid Ali – SyncBricks

Deliver personalized files instantly after PayPal transactions using n8n – without writing a single backend line.

**What This n8n Workflow Does**
This automation template helps you automatically deliver a digital product (such as an n8n template or JSON file) to customers who pay via PayPal – within seconds. You can:
- Automatically extract customer info
- Identify what was purchased
- Send a clean, branded email with the product file
- Promote your other courses, books, and tools

**Use Case Example**
Product: AI-Powered Social Media Content Generator & Publisher
When a customer buys this product through PayPal, this automation:
- Listens for a successful payment event
- Fetches order details via API
- Sends an HTML email with the template attached
- Promotes your other offerings with embedded links

**Prerequisites**
You'll need:
- An n8n instance (self-hosted or n8n Cloud)
- A PayPal developer account
- PayPal OAuth2 credentials configured in n8n
- Your product hosted as a downloadable .json file (Oracle, Dropbox, GitHub, etc.)
- SMTP email credentials in n8n

**Step-by-Step Setup**

1. Webhook Trigger
Node: Webhook
Listens for a POST request from PayPal's webhook for PAYMENT.CAPTURE.COMPLETED events.
Add the webhook to your PayPal Developer App > Webhooks.

2. Wait
Node: Wait
Adds a brief delay to ensure the payment is completely processed before continuing.

3. Filter Event Type
Node: Switch
Processes only when the event is PAYMENT.CAPTURE.COMPLETED.

4. Fetch Order Details
Node: HTTP Request
Retrieves the order information from PayPal's Orders API.
URL format: https://api.paypal.com/v2/checkout/orders/{{ order_id }}

5. Extract Email & Product Info
Node: Set
Extracts the first name, last name, email address, and the purchased item name.

6. Identify Product Purchased
Node: Switch
Checks if the product is "AI-Powered Social Media Content Generator & Publisher".

7. Download Workflow File
Node: HTTP Request
Fetches the hosted workflow JSON from object storage (Oracle in this case).

8. Convert to Downloadable File
Node: Code
Converts the JSON content into a binary file and attaches it (see the sketch below).

9. Send Custom Email
Node: Send Email
Sends a rich HTML email to the buyer with their name, the file attachment, the product name, and helpful resource links:
- Mastering n8n Course on Udemy
- Step-by-Step Guide (n8n Book)
- n8n Video Tutorials (Free Course)
- Sign up for n8n Cloud – Use code AMJID10
- YouTube Video Walkthrough

**Additional Learning Resources**
Explore more and master n8n with these resources:
- Mastering n8n (Full Udemy Course)
- Get Your Step-by-Step Guide (n8n Book)
- Get Step-by-Step Tutorials (Video Course)
- Sign up for n8n Cloud
- Templates, Tools, and More
- YouTube Channel – SyncBricks

**Need Help or Customization?**
Reach out!
Email: amjid@amjidali.com
LinkedIn: linkedin.com/in/amjidali
Website: syncbricks.com
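Picking up step 8 above, a minimal sketch of the "Convert to Downloadable File" Code node might look like this, assuming the previous HTTP Request returned the workflow JSON in the item's json property; the file name is a placeholder.

```javascript
// Sketch: convert the fetched workflow JSON into a binary email attachment.
const workflowJson = JSON.stringify($input.first().json, null, 2);

const binary = await this.helpers.prepareBinaryData(
  Buffer.from(workflowJson, 'utf-8'),
  'social-media-content-generator.json', // placeholder file name
  'application/json'
);

// Expose the binary under the "data" property for the Send Email node.
return [{
  json: { fileName: 'social-media-content-generator.json' },
  binary: { data: binary },
}];
```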
by Don Jayamaha Jr
This advanced agent analyzes long-term price action in the Binance Spot Market using 1-day candles. It calculates key macro indicators like RSI, MACD, BBANDS, EMA, SMA, and ADX to identify high-confidence trend setups and market momentum. Used by the Quant AI system for directional bias and macro-level signal validation.

Watch Tutorial:

**Purpose**
- Detect major trend reversals, consolidation zones, and macro bias
- Support long-term swing trading decisions
- Provide reliable 1-day signals for downstream agents

**Core Features**

| Feature | Description |
| --- | --- |
| Trigger | Called by parent workflows via Execute Workflow |
| Input Format | { "message": "MATICUSDT", "sessionId": "telegram_id" } |
| Webhook Call | Sends request to internal 1d indicators webhook |
| Technical Indicators | RSI, MACD, BBANDS, EMA, SMA, ADX (based on 40 daily candles) |
| GPT (gpt-4.1-mini) Agent | Interprets numerical data into human-readable trend signals |
| Output | Summary suitable for Telegram or further agent consumption |

**External Tools Called**
https://treasurium.app.n8n.cloud/webhook/1d-indicators
Sends: { "symbol": "SOLUSDT" }

**Indicator Calculations**

| Indicator | Purpose |
| --- | --- |
| RSI (14) | Overbought / Oversold Signals |
| MACD (12,26,9) | Trend Reversals / Momentum |
| BBANDS (20, 2) | Volatility Expansion |
| EMA (20) | Short-Term Trend Confirmation |
| SMA (20) | Macro-Level Support/Resistance |
| ADX (14) | Trend Strength + Directional DI |

**Setup**
1. Import the JSON into n8n.
2. Add your OpenAI API credentials.
3. Ensure the /1d-indicators webhook is connected and working.
4. Use this agent as a sub-workflow in: Binance SM Financial Analyst Tool, Binance Spot Market Quant AI Agent.

**Output Example**
1D Overview – MATICUSDT
- RSI: 71 → Overbought
- MACD: Bearish Cross forming
- BBANDS: Widening Volatility
- EMA < SMA → Downtrend Momentum
- ADX: 33 → High Trend Strength

**Notes**
- Not user-facing – outputs are structured JSON or Telegram-style summaries.
- Pairs well with shorter timeframe tools (15m–4h) for confidence stacking.

**Licensing & Attribution**
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.

Need help? Reach out on LinkedIn – Don Jayamaha
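To ground the indicator table, here is an illustrative sketch of the RSI(14) computation the 1d-indicators webhook presumably performs over the 40 daily closes, using Wilder's smoothing. This is an assumption about the backend, not a confirmed implementation.

```javascript
// Sketch: Wilder-smoothed RSI over an array of daily closing prices.
function rsi(closes, period = 14) {
  let avgGain = 0;
  let avgLoss = 0;

  // Seed the averages from the first `period` price changes.
  for (let i = 1; i <= period; i++) {
    const change = closes[i] - closes[i - 1];
    if (change > 0) avgGain += change; else avgLoss -= change;
  }
  avgGain /= period;
  avgLoss /= period;

  // Wilder smoothing over the remaining candles.
  for (let i = period + 1; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    avgGain = (avgGain * (period - 1) + Math.max(change, 0)) / period;
    avgLoss = (avgLoss * (period - 1) + Math.max(-change, 0)) / period;
  }

  if (avgLoss === 0) return 100;
  return 100 - 100 / (1 + avgGain / avgLoss);
}

// e.g. rsi(last40DailyCloses) ≈ 71 would read as "Overbought" in the summary.
```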
by Amit Mehta
**How it Works**
This workflow automates the complete newsletter management process from content creation to client delivery, using Google Sheets, AI content generation, Google Drive, and Gmail. Whether you're a content creator, marketing agency, or small business owner, this workflow helps you automate newsletter creation and manage client communications with built-in approval workflows – all triggered from a simple spreadsheet.

**Use Case**
Ideal for:
- **Marketing Teams** streamlining newsletter distribution
- **Agencies** managing multiple client newsletters
- **Content Creators** automating regular communications
- **Small Businesses** maintaining customer engagement

**Setup Instructions**

1. Upload the Spreadsheet
- File name: Newsletter_Management
- Sheet structure: | ID | Topic | Client Name | Client Email | Status | Created Date | Send Date |
- Add newsletter topics and set their Status as Pending

2. Configure Google Sheets Nodes
Connect your Google account to:
- Get topic from newsletter sheet
- Pick records to send email to client
- Get Client email address
- Update Status as Generated
- Update status as Sent

3. Add API Credentials
- **OpenAI API Key** – for AI content generation
- **Google Drive Access** – for document storage
- **Gmail Account** – for sending newsletters and notifications

4. Activate the Workflow
Once live, the workflow will:
- Manual path: generate newsletter content from pending topics
- Scheduled path: send approved newsletters to clients automatically
- Track status updates throughout the entire process
- Store generated content in Google Drive
- Send admin notifications and client emails

**Workflow Logic**

Main Workflow (Content Generation)
1. Trigger: Manual activation for newsletter creation
2. Retrieve: Pending topics from Google Sheets
3. Validate: Status confirmation (Pending only; sketched after this section)
4. Generate: AI-powered HTML newsletter content
5. Store: Upload to Google Drive
6. Notify: Send completion email to admin
7. Update: Mark status as "Generated"

Scheduled Workflow (Client Distribution)
1. Trigger: Schedule-based activation
2. Retrieve: Approved newsletters from Google Sheets
3. Validate: Status confirmation (Approved only)
4. Lookup: Client email addresses
5. Loop: Process multiple recipients
6. Send: Personalized newsletters via Gmail
7. Update: Mark status as "Sent"

**Node Descriptions**

| Node Name | Description |
|-----------|-------------|
| When clicking 'Test workflow' | Manual trigger to start newsletter generation |
| Get topic from newsletter sheet | Retrieves pending newsletter topics from Google Sheets |
| Validate Status as Pending | Checks whether status is 'Pending' for processing |
| Create HTML for Newsletter | AI-powered content generation using OpenAI |
| Prepare Data to create word doc | Formats generated content for document creation |
| Upload doc to google drive | Stores completed newsletters in Google Drive |
| Send an email to admin | Notifies administrators of completion |
| Update Status as Generated | Marks processed items as 'Generated' |
| Schedule Trigger | Automated trigger for client email distribution |
| Pick records to send email to client | Retrieves approved newsletters for sending |
| Validate Status as Approved | Ensures only approved content is processed |
| Get Client email address | Fetches client contact information |
| Loop Over Items | Processes multiple newsletter recipients |
| Send email to client | Delivers personalized newsletters via Gmail |
| Update status as Sent | Marks newsletters as successfully delivered |

**Customization Tips**
- Modify AI prompts for different content styles and tones
- Add Slack notifications instead of or alongside Gmail
- Export to different formats (PDF, Word, etc.)
- Schedule multiple sending times for different client segments
- Add approval workflows with webhook triggers
- Integrate with CRM systems for client management

**Suggested Sticky Notes for Workflow**

| Node/Section | Sticky Note Content |
|--------------|---------------------|
| Manual Trigger | "Click to start newsletter generation process" |
| AI Content Generation | "Customize prompts here for different newsletter styles" |
| Google Drive Upload | "Organized storage - change folder structure as needed" |
| Gmail Admin Notification | "Update admin email addresses and notification templates" |
| Schedule Trigger | "Set optimal sending times for your audience" |
| Client Email Loop | "Handles bulk sending - monitors for delivery errors" |
| Status Updates | "Maintains audit trail - prevents duplicate processing" |

**Required Files**

| File Name | Purpose |
|-----------|---------|
| Newsletter_Management.xlsx | Google Sheet to manage topics, clients, and status tracking |
| Client_Database.xlsx | Client contact information and preferences |
| Newsletter_Workflow.json | Main n8n workflow export for this automation |

**Testing Tips**
- Add one test topic with status = Pending and run the manual trigger
- Verify AI content generation produces quality HTML
- Check Google Drive upload and folder organization
- Test admin email delivery and formatting
- Add a test client with a valid email for the scheduled workflow
- Monitor workflow logs for API responses and errors
- Confirm status updates occur at each step

**Suggested Tags & Categories**
#Newsletter #EmailMarketing #ContentGeneration #ClientCommunication #Automation #GoogleWorkspace #AIContent #MarketingAutomation #WorkflowManagement #BusinessProcess

**Prerequisites**
- Google Workspace account (Sheets, Drive, Gmail)
- OpenAI API account with GPT-4 access
- n8n instance (Cloud or self-hosted)
- Basic understanding of Google Sheets and email marketing

**Expected Performance**
- **Setup Time**: 30-45 minutes
- **Monthly Executions**: 100-500 (varies by newsletter frequency)
- **Processing Time**: 2-5 minutes per newsletter
- **Scalability**: Handles 100+ clients efficiently

**Important Notes**
- Ensure proper Google API permissions are configured
- Monitor OpenAI API usage and rate limits
- Set up error handling for failed email deliveries
- Regularly back up your Google Sheets data
- Test thoroughly before production deployment

**Advanced Features**
- **Approval Workflows**: Add manual approval steps between generation and sending
- **A/B Testing**: Create multiple versions and track performance
- **Analytics Integration**: Connect with Google Analytics for tracking
- **Multi-language Support**: Generate content in different languages
- **Dynamic Personalization**: Use client data for personalized content
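As sketched here, the Pending-status validation from the Workflow Logic section could be expressed as an n8n Code node, assuming each row from Google Sheets arrives with a json.Status column matching the sheet structure above:

```javascript
// Sketch: keep only rows still marked Pending so the workflow never
// regenerates newsletters that were already Generated, Approved, or Sent.
const pending = $input.all().filter(
  (item) => String(item.json.Status || '').trim().toLowerCase() === 'pending'
);

return pending;
```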
by MattF
This workflow helps SEO teams catch top movers in Google Search Console by comparing daily performance across keyword segments like brand, nonbrand, and content categories. Rather than serving as a routine check, it highlights the queries and pages with the biggest jumps or drops, making it ideal for spotting wins, losses, or unexpected shifts early.

**How It Works**
- Runs daily on a scheduled trigger (e.g. every morning).
- Pulls GSC data for the prior two days (e.g. yesterday vs. the day before).
- Segments traffic by keyword type or URL pattern (e.g. brand, nonbrand, recipes, blogs, etc.).
- Calculates changes in clicks, impressions, CTR, and average position (see the sketch below).
- Flags top movers with the biggest positive or negative deltas.
- Sends structured reports via Slack or email, grouped by segment and sorted by impact.

**Setup Steps**
- Connect your Google Search Console account and optionally Gmail or Slack.
- Swap in your own domain(s) and customize the segmentation logic (e.g. brand terms, path filters).
- By default, the workflow includes Slack alerts, but these can easily be switched to or combined with email, webhook, or other channels.
- Full setup takes around 15–20 minutes with working GSC credentials.

Note: The "recipes" segment is included as an example of how to segment content. This can be changed to match blog, FAQ, product pages, or any other category.
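A minimal sketch of the delta calculation for an n8n Code node, assuming two arrays of GSC rows keyed by query — `yesterday` and `dayBefore`, each row shaped like { query, clicks, impressions, ctr, position } — produced by the two date-filtered GSC pulls:

```javascript
// Sketch: compute per-query deltas and surface the biggest movers first.
const byQuery = new Map(dayBefore.map((r) => [r.query, r]));

const movers = yesterday
  .map((row) => {
    const prev = byQuery.get(row.query) || { clicks: 0, impressions: 0, ctr: 0, position: 0 };
    return {
      query: row.query,
      clicksDelta: row.clicks - prev.clicks,
      impressionsDelta: row.impressions - prev.impressions,
      ctrDelta: row.ctr - prev.ctr,
      positionDelta: row.position - prev.position, // negative = improved rank
    };
  })
  // Sort by absolute click movement so big gains and big losses both surface.
  .sort((a, b) => Math.abs(b.clicksDelta) - Math.abs(a.clicksDelta));

return movers.slice(0, 20).map((m) => ({ json: m }));
```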
by CustomJS
**n8n Workflow: Invoice PDF Generator**
This n8n workflow captures invoice data and generates a PDF invoice, ready to be sent or saved. It uses a webhook to trigger the process, preprocesses the invoice data, and converts it to a PDF using HTML and custom styling. It relies on the @custom-js/n8n-nodes-pdf-toolkit community package.

**Features**
- **Webhook Trigger**: Receives incoming data, including invoice details.
- **Preprocessing**: Transforms the invoice data into HTML format.
- **HTML to PDF Conversion**: Converts the preprocessed HTML into a styled PDF document.
- **Response**: Sends the generated PDF back in the webhook response.

**Notice**
Community nodes can only be installed on self-hosted instances of n8n.

**Requirements**
- A **self-hosted** n8n instance
- A CustomJS API key
- **Invoice data** for PDF generation

**Workflow Steps**
1. Webhook Trigger: Accepts incoming data (e.g., invoice number, recipient details, itemized list). This data is passed to the next node for processing.
2. Set Data Node: Configures initial values for the invoice, including the recipient, sender, invoice number, and the items on the invoice. The invoice details include information like description, unit price, and quantity.
3. Preprocess Node: Processes the raw data to format it correctly for HTML. This includes splitting addresses and converting the items into an HTML table format (see the sketch after this section).
4. HTML to PDF Conversion: Converts the generated HTML into a PDF document. The HTML includes a header, a detailed invoice table, and a footer with contact information.
5. Respond to Webhook: Returns the generated PDF as a response to the initial webhook request.

**Setup Guide**

1. Configure CustomJS API
- Sign up at CustomJS.
- Retrieve your API key from the profile page.
- Add your API key as n8n credentials.

2. Design Workflow
- Create a Webhook: Set up a webhook to trigger the workflow when invoice data is received.
- Prepare Data: Ensure the incoming request contains fields like "Invoice No", "Bill To", "From", and "Details" (a list of items with price and quantity).
- Customize the HTML: The HTML template for the invoice includes custom styling to give the invoice a professional look.
- Convert to PDF: The HTML to PDF node is configured with the data generated in the preprocessing step to convert the invoice HTML to PDF format.

Example invoice data:

{
  "Invoice No": "1",
  "Bill To": "John Doe\n1234 Elm St, Apt 567\nCity, Country, 12345",
  "From": "ABC Corporation\n789 Business Ave\nCity, Country, 67890",
  "Details": [
    { "description": "Web Hosting", "price": 150, "qty": 2 },
    { "description": "Domain", "price": 15, "qty": 5 }
  ],
  "Email": "support@mycompany.com"
}

Result: a PDF file returned in the webhook response.
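A minimal sketch of the Preprocess node described in step 3, assuming the example payload above: it splits the multi-line addresses and renders Details as HTML table rows for the PDF template. The output field names are illustrative.

```javascript
// Sketch: prepare invoice data for the HTML-to-PDF template.
const data = $input.first().json;

const billToLines = String(data['Bill To'] || '').split('\n');
const fromLines = String(data['From'] || '').split('\n');

// Render each line item as a table row: description, qty, unit price, total.
const rows = (data.Details || [])
  .map(
    (item) =>
      `<tr><td>${item.description}</td><td>${item.qty}</td>` +
      `<td>${item.price.toFixed(2)}</td><td>${(item.price * item.qty).toFixed(2)}</td></tr>`
  )
  .join('');

return [{
  json: {
    invoiceNo: data['Invoice No'],
    billToHtml: billToLines.join('<br>'),
    fromHtml: fromLines.join('<br>'),
    itemsTableRows: rows, // inserted into the HTML template before conversion
    email: data.Email,
  },
}];
```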