by Onur
# Lead Sourcing by Job Posts for Outreach with Scrape.do API, OpenAI & Google Sheets

## Overview

This n8n workflow automates the complete lead generation process by scraping job postings from Indeed, enriching company data via Apollo.io, identifying decision-makers, and generating personalized LinkedIn outreach messages using OpenAI. It integrates with Scrape.do for reliable web scraping, Apollo.io for B2B data enrichment, OpenAI for AI-powered personalization, and Google Sheets for centralized data storage.

Perfect for: sales teams, recruiters, business development professionals, and marketing agencies looking to automate their outbound prospecting pipeline.

## Workflow Components

### 1. Schedule Trigger

| Property | Value |
|----------|-------|
| Type | Schedule Trigger |
| Purpose | Automatically initiates the workflow on a recurring schedule |
| Frequency | Weekly (every Monday) |
| Time | 00:00 UTC |

Function: Ensures consistent, hands-off lead generation by running the pipeline automatically without manual intervention.

### 2. Scrape.do Indeed API

| Property | Value |
|----------|-------|
| Type | HTTP Request (GET) |
| Purpose | Scrapes job listings from Indeed via the Scrape.do proxy API |
| Endpoint | https://api.scrape.do |
| Output Format | Markdown |

Request parameters:

| Parameter | Value | Description |
|-----------|-------|-------------|
| token | API Token | Scrape.do authentication |
| url | Indeed Search URL | Target job search page |
| super | true | Uses residential proxies |
| geoCode | us | US-based content |
| render | true | JavaScript rendering enabled |
| device | mobile | Mobile viewport for cleaner HTML |
| output | markdown | Lightweight text output |

Function: Fetches Indeed job listings with anti-bot bypass, returning clean markdown for easy parsing.

### 3. Parse Indeed Jobs

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Extracts structured job data from markdown |
| Mode | Run once for all items |

Extracted fields:

| Field | Description | Example |
|-------|-------------|---------|
| jobTitle | Position title | "Senior Data Engineer" |
| jobUrl | Indeed job link | "https://indeed.com/viewjob?jk=abc123" |
| jobId | Indeed job identifier | "abc123" |
| companyName | Hiring company | "Acme Corporation" |
| location | City, State | "San Francisco, CA" |
| salary | Pay range | "$120,000 - $150,000" |
| jobType | Employment type | "Full-time" |
| source | Data source | "Indeed" |
| dateFound | Scrape date | "2025-01-15" |

Function: Parses the markdown using regex patterns, filters invalid entries, and deduplicates by company name.

### 4. Add New Company (Google Sheets)

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores parsed job postings for tracking |
| Operation | Append rows |
| Target Sheet | "Add New Company" |

Function: Creates a historical record of all discovered job postings and companies for pipeline tracking.
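To make step 3 concrete, here is a minimal, hedged sketch of what the "Parse Indeed Jobs" Code node could look like. The markdown field name (`data`) and the line pattern are assumptions about Scrape.do's output; adjust the regex to the markdown your search page actually returns.

```javascript
// Hedged sketch of the "Parse Indeed Jobs" Code node (mode: run once for all items).
// Assumes the Scrape.do markdown arrives in $input.first().json.data and that each
// listing looks roughly like:
//   [Senior Data Engineer](https://indeed.com/viewjob?jk=abc123) - Acme Corporation - San Francisco, CA
const markdown = $input.first().json.data ?? '';
const listing = /\[([^\]]+)\]\((https?:\/\/[^)]*viewjob\?jk=([a-z0-9]+)[^)]*)\)\s*-\s*([^-\n]+)-\s*([^\n]+)/gi;

const seen = new Set();
const jobs = [];
for (const m of markdown.matchAll(listing)) {
  const companyName = m[4].trim();
  if (!companyName || seen.has(companyName)) continue; // filter invalid rows, dedupe by company
  seen.add(companyName);
  jobs.push({
    json: {
      jobTitle: m[1].trim(),
      jobUrl: m[2],
      jobId: m[3],
      companyName,
      location: m[5].trim(),
      source: 'Indeed',
      dateFound: new Date().toISOString().slice(0, 10),
    },
  });
}
return jobs;
```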
### 5. Apollo Organization Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Enriches company data via the Apollo.io API |
| Endpoint | https://api.apollo.io/v1/organizations/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "q_organization_name": "Company Name",
  "page": 1,
  "per_page": 1
}
```

Response fields:

| Field | Description |
|-------|-------------|
| id | Apollo organization ID |
| name | Official company name |
| website_url | Company website |
| linkedin_url | LinkedIn company page |
| industry | Business sector |
| estimated_num_employees | Company size |
| founded_year | Year established |
| city, state, country | Location details |
| short_description | Company overview |

Function: Retrieves comprehensive company intelligence including LinkedIn profiles, industry classification, and employee count.

### 6. Extract Apollo Org Data

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Parses the Apollo response and merges it with the original data |
| Mode | Run once for each item |

Function: Extracts relevant fields from the Apollo API response and combines them with the job posting data for downstream processing.

### 7. Apollo People Search

| Property | Value |
|----------|-------|
| Type | HTTP Request (POST) |
| Purpose | Finds decision-makers at target companies |
| Endpoint | https://api.apollo.io/v1/mixed_people/search |
| Authentication | HTTP Header Auth (x-api-key) |

Request body:

```json
{
  "organization_ids": ["apollo_org_id"],
  "person_titles": [
    "CTO",
    "Chief Technology Officer",
    "VP Engineering",
    "Head of Engineering",
    "Engineering Manager",
    "Technical Director",
    "CEO",
    "Founder"
  ],
  "page": 1,
  "per_page": 3
}
```

Response fields:

| Field | Description |
|-------|-------------|
| first_name | Contact first name |
| last_name | Contact last name |
| title | Job title |
| email | Email address |
| linkedin_url | LinkedIn profile URL |
| phone_number | Direct phone |

Function: Identifies key stakeholders and decision-makers based on configurable title filters.

### 8. Format Leads

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Structures lead data for outreach |
| Mode | Run once for all items |

Function: Combines person data with company context, creating comprehensive lead profiles ready for personalization.

### 9. Generate Personalized Message (OpenAI)

| Property | Value |
|----------|-------|
| Type | OpenAI Node |
| Purpose | Creates custom LinkedIn connection messages |
| Model | gpt-4o-mini |
| Max Tokens | 150 |
| Temperature | 0.7 |

System prompt:

> You are a professional outreach specialist. Write personalized LinkedIn connection request messages. Keep messages under 300 characters. Be friendly, professional, and mention a specific reason for connecting based on their role and company.

User prompt variables:

| Variable | Source |
|----------|--------|
| Name | $json.fullName |
| Title | $json.title |
| Company | $json.companyName |
| Industry | $json.industry |
| Job Context | $json.jobTitle |

Function: Generates unique, contextual outreach messages that reference specific hiring activity and company details.

### 10. Merge Lead + Message

| Property | Value |
|----------|-------|
| Type | Code Node (JavaScript) |
| Purpose | Combines lead data with the generated message |
| Mode | Run once for each item |

Function: Merges the OpenAI response with the lead profile, creating the final enriched record.
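As an illustration of step 10, here is a minimal Code-node sketch that merges the lead profile with the generated message. The node name reference and the OpenAI output field names are assumptions; check the actual output shapes in your n8n version.

```javascript
// Hedged sketch of the "Merge Lead + Message" Code node (mode: run once for each item).
const lead = $('Format Leads').item.json;   // the lead profile built in step 8
const ai = $input.item.json;                // the OpenAI node output (shape is an assumption)
const personalizedMessage = ai.message?.content ?? ai.text ?? '';

return {
  json: {
    ...lead,
    personalizedMessage: personalizedMessage.trim().slice(0, 300), // enforce the LinkedIn limit
    dateAdded: new Date().toISOString(),
    source: 'Indeed + Apollo',
  },
};
```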
### 11. Save Leads to Sheet

| Property | Value |
|----------|-------|
| Type | Google Sheets Node |
| Purpose | Stores final lead data with personalized messages |
| Operation | Append rows |
| Target Sheet | "Leads" |

Data mapping:

| Column | Data |
|--------|------|
| First Name | Lead's first name |
| Last Name | Lead's last name |
| Title | Job title |
| Company | Company name |
| LinkedIn URL | Profile link |
| Country | Location |
| Industry | Business sector |
| Date Added | Timestamp |
| Source | "Indeed + Apollo" |
| Personalized Message | AI-generated outreach text |

Function: Creates an actionable lead database ready for outreach campaigns.

## Workflow Flow

```
Schedule Trigger
      │
      ▼
Scrape.do Indeed API ──► Fetches job listings with JS rendering
      │
      ▼
Parse Indeed Jobs ──► Extracts company names, job details
      │
      ▼
Add New Company ──► Saves to Google Sheets (Companies)
      │
      ▼
Apollo Org Search ──► Enriches company data
      │
      ▼
Extract Apollo Org Data ──► Parses API response
      │
      ▼
Apollo People Search ──► Finds decision-makers
      │
      ▼
Format Leads ──► Structures lead profiles
      │
      ▼
Generate Personalized Message ──► AI creates custom outreach
      │
      ▼
Merge Lead + Message ──► Combines all data
      │
      ▼
Save Leads to Sheet ──► Final storage (Leads)
```

## Configuration Requirements

### API Keys & Credentials

| Credential | Purpose | Where to Get |
|------------|---------|--------------|
| Scrape.do API Token | Web scraping with anti-bot bypass | scrape.do/dashboard |
| Apollo.io API Key | B2B data enrichment | apollo.io/settings/integrations |
| OpenAI API Key | AI message generation | platform.openai.com |
| Google Sheets OAuth2 | Data storage | n8n credentials setup |

### n8n Credential Setup

| Credential Type | Configuration |
|-----------------|---------------|
| HTTP Header Auth (Apollo) | Header: x-api-key, Value: your Apollo API key |
| OpenAI API | API Key: your OpenAI API key |
| Google Sheets OAuth2 | Complete the OAuth flow with Google |

## Key Features

### Intelligent Job Scraping

- **Anti-Bot Bypass:** Residential proxy rotation via Scrape.do
- **JavaScript Rendering:** Full headless browser for dynamic content
- **Mobile Optimization:** Cleaner HTML with mobile viewport
- **Markdown Output:** Lightweight, easy-to-parse format

### B2B Data Enrichment

- **Company Intelligence:** Industry, size, location, LinkedIn
- **Decision-Maker Discovery:** Title-based filtering
- **Contact Information:** Email, phone, LinkedIn profiles
- **Real-Time Data:** Fresh information from Apollo.io

### AI-Powered Personalization

- **Contextual Messages:** References specific hiring activity
- **Character Limit:** Optimized for LinkedIn (300 chars)
- **Variable Temperature:** Balanced creativity and consistency
- **Role-Specific:** Tailored to the recipient's title and company

### Automated Data Management

- **Dual Sheet Storage:** Companies + Leads separation
- **Timestamp Tracking:** Historical records
- **Deduplication:** Prevents duplicate entries
- **Ready for Export:** CSV-compatible format

## Use Cases

### Sales Prospecting

- Identify companies actively hiring in your target market
- Find decision-makers at companies investing in growth
- Generate personalized cold outreach at scale
- Track the pipeline from discovery to contact

### Recruiting & Talent Acquisition

- Monitor competitor hiring patterns
- Identify companies building specific teams
- Connect with hiring managers directly
- Build talent pipeline relationships

### Market Intelligence

- Track industry hiring trends
- Monitor competitor expansion signals
- Identify emerging market opportunities
- Benchmark salary ranges by role
### Partnership Development

- Find companies investing in complementary areas
- Identify potential integration partners
- Connect with technical leadership
- Build a strategic relationship pipeline

## Technical Notes

| Specification | Value |
|---------------|-------|
| Processing Time | 2-5 minutes per run (depending on job count) |
| Jobs per Run | ~25 unique companies |
| API Calls per Run | 1 Scrape.do + 25 Apollo Org + 25 Apollo People + ~75 OpenAI |
| Data Accuracy | 90%+ for company matching |
| Success Rate | 99%+ with proper error handling |

### Rate Limits to Consider

| Service | Free Tier Limit | Recommendation |
|---------|-----------------|----------------|
| Scrape.do | 1,000 credits/month | ~40 runs/month |
| Apollo.io | 100 requests/day | Add Wait nodes if needed |
| OpenAI | Based on usage | Monitor costs (~$0.01-0.05/run) |
| Google Sheets | 300 requests/minute | No issues expected |

## Setup Instructions

### Step 1: Import Workflow

1. Copy the JSON workflow configuration
2. In n8n: Workflows → Import from JSON
3. Paste the configuration and save

### Step 2: Configure Scrape.do

1. Sign up at scrape.do
2. Navigate to Dashboard → API Token
3. Copy your token
4. The token is embedded as a URL query parameter (already configured)

To customize the search, change the url parameter in the "Scrape.do Indeed API" node:

- q=data+engineer (search term)
- l=Remote (location)
- fromage=7 (last 7 days)

### Step 3: Configure Apollo.io

1. Sign up at apollo.io
2. Go to Settings → Integrations → API Keys
3. Create a new API key
4. In n8n: Credentials → Add Credential → Header Auth (Name: x-api-key, Value: your Apollo API key)
5. Select this credential in both Apollo HTTP nodes

### Step 4: Configure OpenAI

1. Go to platform.openai.com
2. Create a new API key
3. In n8n: Credentials → Add Credential → OpenAI, then paste the API key
4. Select the credential in the "Generate Personalized Message" node

### Step 5: Configure Google Sheets

1. Create a new Google Spreadsheet with two sheets:
   - Sheet 1: "Add New Company" with columns: companyName | jobTitle | jobUrl | location | salary | source | postedDate
   - Sheet 2: "Leads" with columns: First Name | Last Name | Title | Company | LinkedIn URL | Country | Industry | Date Added | Source | Personalized Message
2. Copy the Sheet ID from the URL
3. In n8n: Credentials → Add Credential → Google Sheets OAuth2
4. Update both Google Sheets nodes with your Sheet ID

### Step 6: Test and Activate

1. Manual test: click the "Execute Workflow" button
2. Verify each node: check outputs step by step
3. Review data: confirm data appears in Google Sheets
4. Activate: toggle the workflow to "Active"

## Error Handling

### Common Issues

| Issue | Cause | Solution |
|-------|-------|----------|
| "Invalid character" | Empty/malformed company name | Check the Parse Indeed Jobs output |
| "Node does not have credentials" | Credential not linked | Open the node → select the credential |
| Empty parse results | Indeed HTML structure changed | Check the Scrape.do raw output |
| Apollo rate limit (429) | Too many requests | Add a 5-10s Wait node between calls |
| OpenAI timeout | Too many tokens | Reduce batch size or max_tokens |
| "Your request is invalid" | Malformed JSON body | Verify expression syntax in the HTTP nodes |

### Troubleshooting Steps

1. Verify credentials: test each credential individually
2. Check node outputs: use "Execute Node" for debugging
3. Monitor API usage: check the Apollo and OpenAI dashboards
4. Review logs: check the n8n execution history for details
5. Test with a sample: use a known company name to verify Apollo

### Recommended Error Handling Additions

For production use, consider adding:

- An IF node after Apollo Org Search to handle empty results
- An Error Workflow trigger for notifications
- Wait nodes between API calls for rate limiting
- Retry logic for transient failures

## Performance Specifications

| Metric | Value |
|--------|-------|
| Execution Time | 2-5 minutes per scheduled run |
| Jobs Discovered | ~25 per Indeed page |
| Leads Generated | 1-3 per company (based on title matches) |
| Message Quality | Professional, contextual, <300 chars |
| Data Freshness | Real-time from Indeed + Apollo |
| Storage Format | Google Sheets (unlimited rows) |

## API Reference

### Scrape.do API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| https://api.scrape.do | GET | Direct URL scraping |

Documentation: scrape.do/documentation

### Apollo.io API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/organizations/search | POST | Company lookup |
| /v1/mixed_people/search | POST | People search |

Documentation: apolloio.github.io/apollo-api-docs

### OpenAI API

| Endpoint | Method | Purpose |
|----------|--------|---------|
| /v1/chat/completions | POST | Message generation |

Documentation: platform.openai.com
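To complement the error-handling recommendations earlier in this template, here is a hedged sketch of retry-with-backoff logic for Apollo's 429s, written as an n8n Code node. It assumes a global fetch (Node 18+ runtimes) and an APOLLO_KEY environment variable, both of which are assumptions; a pair of Wait nodes achieves the same effect without code.

```javascript
// Hedged sketch: retry a transient Apollo request with exponential backoff.
async function withRetry(fn, attempts = 3, baseDelayMs = 5000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err;
      // Backoff: 5s, 10s, 20s - mirrors the "add a 5-10s Wait node" advice above.
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}

const res = await withRetry(async () => {
  const r = await fetch('https://api.apollo.io/v1/organizations/search', {
    method: 'POST',
    headers: { 'x-api-key': $env.APOLLO_KEY, 'Content-Type': 'application/json' },
    body: JSON.stringify({ q_organization_name: $json.companyName, page: 1, per_page: 1 }),
  });
  if (r.status === 429) throw new Error('rate limited'); // retry on 429
  return r.json();
});

return { json: { apollo: res } };
```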
by Dr. Firas
# Generate UGC Promo Videos with Blotato and Sora 2 for eCommerce

## Who is this for?

This workflow is perfect for eCommerce brands, content creators, and marketing teams who want to automatically generate short, eye-catching videos from their product images, without editing software or manual work.

## What problem does this workflow solve?

Creating engaging promotional videos manually can be time-consuming and expensive. This automation eliminates that friction by combining Blotato, Sora 2, and AI scripting to turn static product images into dynamic UGC-style videos ready for TikTok, Instagram Reels, and YouTube Shorts.

## What this workflow does

This workflow:

1. Receives a product image directly from Telegram or another input source.
2. Analyzes the image with OpenAI Vision to understand the product's features and audience.
3. Generates a natural, short UGC-style script using GPT-based AI.
4. Sends the image and script to Sora 2 via the Fal API to generate a vertical promotional video.
5. Monitors the video status every 15 seconds until completion.
6. Downloads or automatically publishes the final video to your social platforms.

## Setup

1. Create a Fal.ai API key and set it in your n8n credentials (Authorization: Key YOUR_FAL_KEY).
2. Connect your Telegram, OpenAI, and HTTP Request nodes as shown in the workflow.
3. Make sure the Build Public Image URL node outputs a valid, public image link.
4. In the HTTP Request node for Sora 2, set:
   - Method: POST
   - URL: https://fal.run/fal-ai/sora-2/image-to-video
   - Headers: Authorization: Key YOUR_FAL_KEY, Content-Type: application/json
   - Body: raw JSON with parameters like prompt, image_url, duration, and aspect_ratio.
5. Run the workflow and monitor the execution logs for your video URL.
6. Blotato - API key for social media publishing.

## How to customize this workflow to your needs

- **Change the video tone:** Edit the OpenAI prompt to produce educational, emotional, or luxury-style scripts.
- **Adjust duration or format:** Use Sora 2's supported durations (4, 8, or 12 seconds) and aspect ratios (e.g., 9:16 for social media).
- **Auto-publish your videos:** Connect the TikTok, Instagram, or YouTube upload nodes for full automation.
- **Add branding:** Include overlays, logos, or end screens via CapCut or an external API integration.

## Watch This Tutorial

Need help or want to customize this?

- Contact: LinkedIn
- YouTube: @DRFIRASS
- Workshops: Mes Ateliers n8n
- Documentation: Notion Guide

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube / Mes Ateliers n8n
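For reference, the Sora 2 request from step 4 of the setup could look like the Code-node sketch below. The input field names (script, publicImageUrl) and the exact response shape are assumptions; confirm them against Fal's documentation for the fal-ai/sora-2/image-to-video endpoint.

```javascript
// Hedged sketch of the Sora 2 image-to-video call described in the setup section.
const body = {
  prompt: $json.script,              // the UGC script generated earlier (assumed field)
  image_url: $json.publicImageUrl,   // from the "Build Public Image URL" node (assumed field)
  duration: 8,                       // supported values: 4, 8, or 12 seconds
  aspect_ratio: '9:16',              // vertical, for TikTok/Reels/Shorts
};

const res = await fetch('https://fal.run/fal-ai/sora-2/image-to-video', {
  method: 'POST',
  headers: {
    Authorization: `Key ${$env.FAL_KEY}`, // prefer an n8n credential over a hardcoded key
    'Content-Type': 'application/json',
  },
  body: JSON.stringify(body),
});

return { json: await res.json() }; // poll/download the video URL from this response
```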
by Li CHEN
# AWS News Analysis and LinkedIn Automation Pipeline

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

## Who's it for

This template is perfect for:

- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

## How it works

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

### Flow 1: News Collection and Analysis

1. **Scheduled RSS Monitoring:** Automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. **AI-Powered Analysis:** Uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - Professional summary
   - Key themes and keywords
   - Importance rating (Low/Medium/High)
   - Business impact assessment
3. **Structured Data Storage:** Saves analyzed news to Feishu Bitable with approval status tracking

### Flow 2: LinkedIn Content Generation

1. **Manual Approval Trigger:** A Feishu automation sends approved news items to the webhook
2. **AI Content Creation:** AWS Bedrock generates professional LinkedIn posts with:
   - Attention-grabbing headlines
   - Technical insights from a Solutions Architect perspective
   - Business impact analysis
   - Call-to-action engagement
3. **Automated Publishing:** Posts directly to LinkedIn with relevant hashtags

## How to set up

### Prerequisites

- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

### Detailed Configuration Steps

#### 1. AWS Bedrock Setup

**Step 1: Enable Claude 3 Sonnet Model**

1. Log into your AWS Console
2. Navigate to AWS Bedrock
3. Go to Model access in the left sidebar
4. Find Anthropic Claude 3 Sonnet and click Request model access
5. Fill out the access request form (usually approved within minutes)
6. Once approved, verify the model appears in your Model access list

**Step 2: Create IAM User and Credentials**

1. Go to the IAM Console
2. Click Users → Create user
3. Name: n8n-bedrock-user
4. Attach policy: AmazonBedrockFullAccess (or create a custom policy with minimal permissions)
5. Go to the Security credentials tab → Create access key
6. Choose Application running outside AWS
7. Download the credentials CSV file

**Step 3: Configure in n8n**

1. In n8n, go to Credentials → Add credential
2. Select the AWS credential type
3. Enter your Access Key ID and Secret Access Key
4. Set Region to your preferred AWS region (e.g., us-east-1)
5. Test the connection

Useful links:

- AWS Bedrock Documentation
- Claude 3 Sonnet Model Access
- AWS Bedrock Pricing

#### 2. Feishu Bitable Configuration

**Step 1: Create Feishu Account and App**

1. Sign up at Feishu International
2. Create a new Bitable (multi-dimensional table)
3. Go to Developer Console → Create App
4. Enable Bitable permissions in your app
5. Generate an App Token and App Secret

**Step 2: Create Bitable Structure**

Create a new Bitable with these columns:

- title (Text)
- pubDate (Date)
- summary (Long Text)
- keywords (Multi-select)
- rating (Single Select: Low, Medium, High)
- link (URL)
- approval_status (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:

- App Token: found in the app settings
- Table ID: found in the Bitable URL (tbl...)
**Step 3: Set Up Automation**

1. In your Bitable, go to Automation → Create automation
2. Trigger: When field value changes → select the approval_status field
3. Condition: approval_status equals "Approved"
4. Action: Send HTTP request
   - Method: POST
   - URL: your n8n webhook URL (from Flow 2)
   - Headers: Content-Type: application/json
   - Body: {{record}}

**Step 4: Configure Feishu Credentials in n8n**

1. Install the Feishu Lite community node (self-hosted only)
2. Add a Feishu credential with your App Token and App Secret
3. Test the connection

Useful links:

- Feishu Developer Documentation
- Bitable API Reference
- Feishu Automation Guide

#### 3. LinkedIn Company Account Setup

**Step 1: Create LinkedIn App**

1. Go to the LinkedIn Developer Portal
2. Click Create App
3. Fill in the app details:
   - App name: AWS News Automation
   - LinkedIn Page: select your company page
   - App logo: upload your logo
   - Legal agreement: accept the terms

**Step 2: Configure OAuth2 Settings**

1. In your app, go to the Auth tab
2. Add the redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback
3. Request these scopes:
   - w_member_social (post on behalf of members)
   - r_liteprofile (read basic profile)
   - r_emailaddress (read email address)

**Step 3: Get Company Page Access**

1. Go to your LinkedIn Company Page
2. Navigate to Admin tools → Manage admins
3. Ensure you have the Content admin or Super admin role
4. Note your Company Page ID (found in the page URL)

**Step 4: Configure LinkedIn Credentials in n8n**

1. Add a LinkedIn OAuth2 credential
2. Enter your Client ID and Client Secret
3. Complete the OAuth2 flow by clicking Connect my account
4. Select your company page for posting

Useful links:

- LinkedIn Developer Portal
- LinkedIn API Documentation
- LinkedIn OAuth2 Guide

#### 4. Workflow Activation

Final setup steps:

1. Import the workflow JSON into n8n
2. Configure all credential connections:
   - AWS Bedrock credentials
   - Feishu credentials
   - LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

## Requirements

### Service Requirements

- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and the approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn company account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes:** Feishu Lite node (self-hosted only)

### Technical Requirements

- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

### Cost Considerations

- **AWS Bedrock:** ~$0.01-0.05 per news analysis
- **Feishu:** free tier available, paid plans for advanced features
- **LinkedIn:** free API access with rate limits
- **n8n:** self-hosted (free) or cloud subscription

## How to customize the workflow

### Content Customization

- **Modify AI prompts** in the AI Agent nodes to change tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

### Integration Options

- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM** systems to track content performance
- **Add content calendar** integration for better planning

### Advanced Features

- **Multi-language support** by modifying AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

### Technical Modifications

- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls

## Important Notes

### Service Limitations

- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting - adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations:** each AI analysis costs approximately $0.01-0.05 USD per news item

### Troubleshooting Common Issues

AWS Bedrock issues:

- **Model not found:** ensure Claude 3 Sonnet access is approved in your region
- **Access denied:** verify IAM permissions include Bedrock service access
- **Rate limiting:** implement retry logic or reduce analysis frequency

Feishu integration issues:

- **Authentication failed:** check that the App Token and App Secret are correct
- **Table not found:** verify the Table ID matches your Bitable URL
- **Automation not triggering:** ensure the webhook URL is accessible and returns a 200 status

LinkedIn posting issues:

- **OAuth2 errors:** re-authenticate the LinkedIn credentials
- **Posting failed:** verify company page admin permissions
- **Rate limits:** LinkedIn has daily posting limits for company pages

### Security Best Practices

- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
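As a starting point for Flow 2, here is an illustrative Code node that shapes the Feishu webhook record into a LinkedIn post body. The record field names (title, summary, link) are assumptions; align them with your Bitable column names.

```javascript
// Hedged sketch: build the LinkedIn post text from the Feishu {{record}} payload.
const record = $json.body ?? $json;   // webhook payload from the Feishu automation (assumed shape)

const hashtags = ['#AWS', '#CloudComputing', '#SolutionsArchitecture'];
const post = [
  record.title,
  '',
  record.summary,
  '',
  `Read more: ${record.link}`,
  hashtags.join(' '),
].join('\n');

return { json: { post: post.slice(0, 3000) } }; // simple post-length guardrail
```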
by Club de Inteligencia Artificial Politécnico CIAP
# Telegram Appointment Scheduling Bot with n8n

## Description

Tired of managing appointments manually? This template transforms your Telegram account into a smart virtual assistant that handles the entire scheduling process for you, 24/7.

This workflow allows you to deploy a fully functional Telegram bot that not only schedules appointments but also checks real-time availability in your Google Calendar, logs a history in Google Sheets, and allows your clients to cancel or view their upcoming appointments. It's the perfect solution for professionals, small businesses, or anyone looking to automate their booking system professionally and effortlessly.

## Key Features

- **Complete Appointment Management:** Allows users to schedule, cancel, and list their future appointments.
- **Conflict Prevention:** Integrates with Google Calendar to check availability before confirming a booking, eliminating the risk of double-booking.
- **Automatic Logging:** Every confirmed appointment is saved to a row in Google Sheets, creating a perfect database for tracking and analysis.
- **Smart Interaction:** The bot handles unrecognized commands and guides the user, ensuring a smooth experience.
- **Easy to Adapt:** Connect your own accounts, customize messages, and tailor it to your business needs in minutes.

## Setup

Follow these steps to deploy your own instance of this bot:

### 1. Prerequisites

- An n8n instance (Cloud or self-hosted).
- A Telegram account.
- A Google account.

### 2. Telegram Bot

1. Talk to @BotFather on Telegram.
2. Create a new bot using /newbot.
3. Give it a name and a username.
4. Copy and save the API token it provides.

### 3. Google Cloud & APIs

1. Go to the Google Cloud Console.
2. Create a new project.
3. Enable the Google Calendar API and Google Sheets API.
4. Create OAuth 2.0 Client ID credentials. Make sure to add your n8n instance's OAuth redirect URL.
5. Save the Client ID and Client Secret.

### 4. Google Sheets

1. Create a new spreadsheet in Google Sheets.
2. Define the column headers in the first row. For example: id, Client Name, Date and Time, ISO Date.

### 5. n8n

1. Import the workflow JSON file into your n8n instance.
2. Set up the credentials:
   - Telegram: create a new credential and paste your bot's token.
   - Google Calendar & Google Sheets (OAuth2): create a new credential and paste the Client ID and Client Secret from the Google Cloud Console.
3. Review the Google Calendar and Google Sheets nodes to select your correct calendar and spreadsheet.
4. Activate the workflow!

## Usage

Once the bot is running, you can interact with it using the following commands in Telegram:

- **To start the bot:** /start
- **To schedule a new appointment:** agendar YYYY-MM-DD HH:MM Your Full Name
- **To cancel an existing appointment:** cancelar YYYY-MM-DD HH:MM Your Full Name
- **To view your future appointments:** mis citas Your Full Name

## Authors

Jaren Pazmiño, President of the Polytechnic Artificial Intelligence Club (CIAP)
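For anyone adapting the command grammar listed under Usage, here is a minimal sketch of the parsing step as an n8n Code node. The Telegram field path is the standard trigger output; the routing values feeding a downstream Switch node are illustrative.

```javascript
// Hedged sketch: parse "agendar/cancelar YYYY-MM-DD HH:MM Full Name" and "mis citas Name".
const text = ($json.message?.text ?? '').trim();

const m = text.match(/^(agendar|cancelar)\s+(\d{4}-\d{2}-\d{2})\s+(\d{2}:\d{2})\s+(.+)$/i);
if (m) {
  const [, command, date, time, clientName] = m;
  return {
    json: {
      command: command.toLowerCase(),
      clientName,
      isoDate: `${date}T${time}:00`, // feeds the Google Calendar availability check
    },
  };
}
if (/^mis citas\s+/i.test(text)) {
  return { json: { command: 'list', clientName: text.replace(/^mis citas\s+/i, '') } };
}
// Unrecognized commands fall through to the guidance reply branch.
return { json: { command: 'unknown', reply: 'Command not recognized. Try /start for help.' } };
```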
by Fahmi Fahreza
# Weekly SEO Watchlist Audit to Google Sheets (Gemini + Decodo)

Sign up for Decodo HERE for a discount.

Automatically fetches page content, generates a compact SEO audit (score, issues, fixes), and writes both a per-URL summary and a normalized "All Issues" table to Google Sheets - great for weekly monitoring and prioritization.

## Who's it for?

Content/SEO teams that want lightweight, scheduled audits of key pages with actionable next steps and spreadsheet reporting.

## How it works

1. A weekly trigger loads the Google Sheet of URLs.
2. Split in Batches processes each URL.
3. Decodo fetches the page content (markdown + status).
4. Gemini produces a strict JSON audit via the AI Chain + Output Parser.
5. Code nodes flatten the data for the two tabs.
6. Google Sheets nodes append Summary and All Issues rows.
7. Split in Batches continues to the next URL.

## How to set up

1. Add credentials for Google Sheets, Decodo, and Gemini.
2. Set sheet_id and the sheet GIDs in the Set node.
3. Ensure the input sheet has a URL column.
4. Configure your Google Sheets tabs with headers matching each field being appended (e.g., URL, Decodo Score, Priority, etc.).
5. Adjust the schedule as needed.
6. Activate the workflow.
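A hedged sketch of the flattening step (item 5 under "How it works"): one Code node that turns the audit JSON into "All Issues" rows. The audit shape (score plus issues with title/severity/fix) is an assumption based on the score/issues/fixes fields described above.

```javascript
// "Run once for all items" Code node: expand each audited URL into one row per issue.
const rows = [];
for (const item of $input.all()) {
  const { url, audit } = item.json;     // audit shape is an assumption
  for (const issue of audit?.issues ?? []) {
    rows.push({
      json: {
        URL: url,
        'Decodo Score': audit.score,
        Issue: issue.title,
        Priority: issue.severity,        // e.g. High / Medium / Low
        'Suggested Fix': issue.fix,
        'Audited At': new Date().toISOString(),
      },
    });
  }
}
return rows;
```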
by Santhej Kallada
## Who is this for?

- Creators, designers, and developers exploring AI-powered image generation.
- Automation enthusiasts who want to integrate image creation into n8n workflows.
- Telegram bot builders looking to add visual AI capabilities.
- Marketers or freelancers automating creative content workflows.

## What problem is this workflow solving?

Creating AI images usually requires multiple tools and manual setup. This workflow removes the complexity by:

- Connecting Nano Banana (AI image model) directly to n8n.
- Allowing image generation via a Telegram chatbot.
- Providing a no-code setup that is fully automated and scalable.

## What this workflow does

This workflow demonstrates how to generate AI images using Nano Banana and n8n, with an integrated Telegram chatbot interface. The process includes:

1. Connecting Gemini Nano Banana to n8n.
2. Automating image generation requests triggered from Telegram.
3. Returning AI-generated images back to the user.
4. Allowing customization of prompts and styles dynamically.

By the end, you'll have a fully functional automation to generate and send AI-created images through Telegram - no coding required.

## Setup

1. Create accounts: sign up on n8n.io and ensure you have Telegram Bot API access. Connect your Nano Banana or Gemini API endpoint.
2. Set up your Telegram bot: use BotFather to create a new bot and get the token. Add the "Telegram Trigger" node in n8n.
3. Configure the Nano Banana connection: add an HTTP Request node for the Nano Banana API. Insert your API key and prompt parameters.
4. Handle responses: parse the AI-generated image output. Send the image file back to the Telegram user.
5. Test and deploy: run a sample image prompt. Verify that Telegram returns the correct generated image.

## How to customize this workflow to your needs

- Modify prompts or styles to fit different artistic use cases.
- Add conditional logic for image size, aspect ratio, or filters.
- Integrate with Google Drive or Notion for image storage.
- Schedule automatic image generation for campaigns or content creation.
- Expand with OpenAI or Stability AI for hybrid workflows.

## Notes

- The Nano Banana API may have rate limits depending on usage.
- Ensure your Telegram bot has permission to send files and images.
- You can host this workflow on n8n Cloud or self-hosted setups.

Want a video tutorial on how to set up this automation? https://youtu.be/0s6ZdU1fjc4
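If the image API returns a base64 payload, the "handle responses" step (setup item 4) could be sketched as the Code node below. The response field name is an assumption; inspect your HTTP Request node's actual output first.

```javascript
// Hedged sketch: convert a base64 image from the API response into n8n binary data
// so the Telegram node can send it as a photo.
const b64 = $json.data?.imageBase64 ?? ''; // assumed response field; adjust to your API

const binaryData = await this.helpers.prepareBinaryData(
  Buffer.from(b64, 'base64'),
  'generated.png',
  'image/png',
);

// The Telegram "Send Photo" operation can then read the "data" binary property.
return [{ json: { caption: $json.prompt ?? 'AI-generated image' }, binary: { data: binaryData } }];
```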
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

# Website sitemap generator and visual tree creator

## Who's it for

Web developers, SEO specialists, UX designers, and digital marketers who need to analyze website structure, create visual sitemaps, or audit site architecture for optimization purposes.

## What it does

This workflow automatically generates a comprehensive sitemap from any website URL and creates an organized hierarchical structure in Google Sheets. It follows the website's sitemap to discover all pages, then organizes them by navigation levels (Level 1, Level 2, etc.) with proper parent-child relationships. The output can be further processed to create visual tree diagrams and mind maps.

## How it works

The workflow follows a five-step automation process:

1. URL Input: accepts a website URL via the chat interface
2. Site Crawling: uses Firecrawl to discover all pages, following the website's sitemap only
3. Success Validation: checks whether crawling succeeded (some sites block external crawlers)
4. Hierarchical Organization: processes URLs into a structured tree with proper level relationships
5. Google Sheets Export: creates a formatted spreadsheet with the complete site architecture

The system respects robots.txt and follows only sitemap-declared pages to ensure ethical crawling.

## Requirements

- Firecrawl API key (for website crawling and sitemap discovery)
- Google Sheets access
- Google Drive access (for template duplication)

## How to set up

### Step 1: Prepare your template (recommended)

It's recommended to create your own copy of the base template:

1. Access the base Google Sheets template
2. Make a copy for your personal use
3. Update the workflow's "Copy template" node with your template's file ID (replace the default ID: 12lV4HwgudgzPPGXKNesIEExbFg09Tuu9gyC_jSS1HjI)

This ensures you have control over the template formatting and can customize it as needed.

### Step 2: Configure API credentials

Set up the following credentials in n8n:

- Firecrawl API: for crawling websites and discovering sitemaps
- Google Sheets OAuth2: for creating and updating spreadsheets
- Google Drive OAuth2: for duplicating the template file

### Step 3: Configure Firecrawl settings (optional)

The workflow uses optimized Firecrawl settings:

- ignoreSitemap: false - respects the website's sitemap
- sitemapOnly: true - only crawls URLs listed in sitemap files

These settings ensure ethical crawling and faster processing.

### Step 4: Access the workflow

The workflow uses a chat trigger interface, so no manual configuration is needed. Simply provide the website URL you want to analyze when prompted.

## How to use the workflow

### Basic usage

1. Start the chat: access the workflow via the chat interface
2. Provide a URL: enter the website URL you want to analyze (e.g., "https://example.com")
3. Wait for processing: the system will crawl, organize, and export the data
4. Receive your results: get a direct clickable link to your generated Google Sheets, with no need to search for the file

### Error handling

- Invalid URLs: if the provided URL is invalid or the website blocks crawling, you'll receive an immediate error message
- Graceful failure: the workflow stops without creating unnecessary files when errors occur
- Common causes: incorrect URL format, robots.txt restrictions, or site security settings

### File organization

- Automatic naming: generated files follow the pattern "[Website URL] - n8n - Arborescence"
- Google Drive storage: files are automatically organized in your Google Drive
- Instant access: a direct link is provided immediately upon completion
## Advanced processing for visual diagrams

### Step 1: Copy sitemap data

Once your Google Sheets is ready:

1. Copy all the hierarchical data from the generated spreadsheet
2. Prepare it for AI processing

### Step 2: Generate ASCII tree structure

Use any AI model with this prompt:

> Create a hierarchical tree structure from the following website sitemap data. Return ONLY the tree structure using ASCII tree formatting with ├── and └── characters. Do not include any explanations, comments, or additional text - just the pure tree structure. The tree should start with the root domain and show all pages organized by their hierarchical levels. Use proper indentation to show parent-child relationships.
>
> Here is the sitemap data: [PASTE THE SITEMAP DATA HERE]
>
> Requirements:
> - Use ASCII tree characters (├── └── │)
> - Show clear hierarchical relationships
> - Include all pages from the sitemap
> - Return ONLY the tree structure, no other text
> - Start with the root domain as the top level

### Step 3: Create visual mind map

1. Visit the Whimsical Diagrams GPT
2. Request a mind map creation using your ASCII tree structure
3. Get a professional visual representation of your website architecture

## Results interpretation

### Google Sheets output structure

The generated spreadsheet contains:

- Niv 0 to Niv 5: hierarchical levels (0 = homepage, 1-5 = navigation depth)
- URL column: complete URLs for reference
- Hyperlinked structure: clickable links organized by hierarchy
- Multi-domain support: handles subdomains and different domain structures

### Data organization features

- Automatic sorting: pages organized by navigation depth and alphabetical order
- Parent-child relationships: clear hierarchical structure maintained
- Domain separation: main domains and subdomains processed separately
- Clean formatting: URLs decoded and formatted for readability

## Workflow limitations

- Sitemap dependency: only discovers pages listed in the website's sitemap
- Crawling restrictions: some websites may block external crawlers
- Level depth: limited to 5 hierarchical levels for clarity
- Rate limits: respects Firecrawl API limitations
- Template dependency: requires access to the base template for duplication

## Use cases

- SEO audits: analyze site structure for optimization opportunities
- UX research: understand navigation patterns and user paths
- Content strategy: identify content gaps and organizational issues
- Site migrations: document existing structure before redesigns
- Competitive analysis: study competitor site architectures
- Client presentations: create visual site maps for stakeholder reviews
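For readers curious how the "Niv 0-5" levels can be derived, here is a minimal Code-node sketch based on URL path depth. The input shape (one item per crawled URL with a url field) is an assumption about the Firecrawl output.

```javascript
// Hedged sketch of the hierarchical organization step (one item per crawled URL).
return $input.all().map((item) => {
  const url = new URL(item.json.url);
  // "/" is level 0 (homepage); "/a/b" is level 2; capped at 5 for clarity, as above.
  const segments = url.pathname.split('/').filter(Boolean);
  const level = Math.min(segments.length, 5);
  return {
    json: {
      url: url.href,
      level,
      parentPath: '/' + segments.slice(0, -1).join('/'), // parent-child relationship
      label: decodeURIComponent(segments.at(-1) ?? url.hostname),
    },
  };
});
```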
by Sona Labs
Automatically enrich company records with comprehensive firmographic data by pulling domains from Google Sheets, setting up custom HubSpot fields, enriching through the Sona API, and syncing complete profiles to HubSpot CRM with custom property mapping.

Import company domains from a Google Sheet, configure custom HubSpot fields for Sona data, automatically enrich domains with detailed firmographic intelligence, and create fully populated company records in HubSpot - so you can build rich prospect databases without manual research.

## How it works

### Step 1: Get Company List

- Reads company domains from your Google Sheet
- Aggregates all domains into a single array
- Prepares data for batch processing

### Step 2: Setup HubSpot Fields

- Creates custom Sona fields in HubSpot CRM
- Defines all enrichment data fields needed
- Ensures proper field mapping for incoming data

### Step 3: Prepare for Processing

- Converts aggregated domains into individual items
- Sets data for batch loop processing
- Readies each company for enrichment

### Step 4: Enrich & Sync to HubSpot

- Loops through each company domain
- Calls the Sona API for enrichment data
- Creates the company in HubSpot with standard fields
- Formats and updates custom Sona properties
- Combines firmographics + tech data in one profile
- Includes a 2-second wait between operations for rate limiting

## What you'll get

The workflow enriches each company record with:

- **Firmographic Data:** company size, employee count, revenue estimates, headquarters location, and founding year
- **Contact Information:** phone numbers, social media profiles, and timezone details
- **Business Intelligence:** company descriptions and industry positioning
- **Custom HubSpot Properties:** all Sona data mapped to dedicated custom fields
- **Organized CRM Records:** all data automatically synced to HubSpot for immediate use
- **Domain Tracking:** companies linked to their websites for future reference

## Why use this

- **Eliminate manual research:** save 10-15 minutes per company by automating firmographic lookups
- **Build rich databases:** transform basic domain lists into comprehensive company profiles
- **Custom field management:** automatically creates and populates HubSpot custom properties
- **Improve targeting:** segment and prioritize accounts based on size, location, and other firmographics
- **Keep data current:** run scheduled enrichments to maintain up-to-date company information
- **Scale your prospecting:** process hundreds of companies in minutes instead of days
- **Better lead qualification:** make informed decisions with complete company intelligence
- **Streamlined workflow:** one-click enrichment from spreadsheet to CRM with custom field setup

## Setup instructions

Before you start, you'll need:

1. A Google Sheet with a column named website_Domain containing company domains (e.g., example.com)
2. A HubSpot account and app token - get an app token by creating a legacy app:
   - Go to HubSpot Settings → Integrations → Legacy Apps
   - Click Create Legacy App
   - Select Private (for one account)
   - In the scopes section, enable the following permissions:
     - crm.schemas.companies.write
     - crm.objects.companies.write
     - crm.schemas.companies.read
     - crm.objects.companies.read
   - Click Create
   - Copy the access token from the Auth tab
3. A Sona API key (for company enrichment)
   - Sign up at https://app.sonalabs.com
   - A free tier is available for testing

Configuration steps:

1. Prepare your data: create a Google Sheet with a "website_Domain" column and add 2-3 test companies (e.g., example.com, anthropic.com)
2. Connect Google Sheets: in the "Get Company List from Sheet" node, authenticate with Google and select your spreadsheet and sheet name
3. Configure HubSpot field creation: in the "Create Custom HubSpot Fields" node (Step 2), authenticate with your HubSpot access token and review the custom Sona fields that will be created
4. Add Sona credentials: in the "Sona Enrich" node, authenticate with your Sona API key
5. Connect HubSpot for company creation: in the "Create HubSpot Company" and "Update Company with AI Data" nodes, authenticate using your HubSpot access token
6. Test with sample data: run the workflow with 2-3 test companies and verify:
   - Custom fields are created in HubSpot
   - Company records appear correctly in HubSpot
   - All firmographic data is populated in the custom properties
7. Add error handling: configure notifications for failed enrichments or API errors (optional but recommended)
8. Scale and automate: process your full company list, then optionally add a Schedule Trigger for automatic daily or weekly enrichment to keep your CRM data fresh
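To illustrate the custom property mapping in Step 4 of "How it works", here is a hedged Code-node sketch. Both the Sona response fields and the HubSpot property names are assumptions; mirror whatever your "Create Custom HubSpot Fields" node actually defines.

```javascript
// Hedged sketch: map Sona enrichment output onto the custom HubSpot properties.
const sona = $json; // output of the "Sona Enrich" node (field names are assumptions)

return {
  json: {
    properties: {
      sona_employee_count: sona.employeeCount,
      sona_revenue_estimate: sona.revenueEstimate,
      sona_headquarters: sona.headquarters,
      sona_founded_year: sona.foundedYear,
      sona_description: sona.description,
    },
    domain: sona.domain, // used to match the HubSpot company record
  },
};
```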
by Đỗ Thành Nguyên
# Automated Facebook Page Story Video Publisher (Google Drive → Facebook → Google Sheet)

> Recommended: self-hosted via tino.vn/vps-n8n?affid=388, use code VPSN8N for up to 39% off.

This workflow is an automated solution for publishing video content from Google Drive to your Facebook Page Stories, using Google Sheets as a posting queue manager.

## What This Workflow Does

This automation orchestrates a complete multi-step process for uploading and publishing videos to Facebook Stories:

1. **Queue Management:** Every 2 hours and 30 minutes, the workflow checks a Google Sheet (Get Row Sheet node) to find the first video whose Stories column is empty, meaning it hasn't been posted yet.
2. **Conditional Execution:** An If node confirms that the video's File ID exists before proceeding.
3. **Video Retrieval:** Using the File ID, the workflow downloads the video from Google Drive (Google Drive node) and calculates its binary size (Set to the total size in bytes node).
4. **Facebook 3-Step Upload:** It performs the Facebook Graph API's three-step upload process through HTTP Request nodes:
   - Step 1 - Initialize Session: starts an upload session and retrieves the upload_url and video_id.
   - Step 2 - Upload File: uploads the binary video data to the provided upload_url.
   - Step 3 - Publish Video: finalizes and publishes the uploaded video as a Facebook Story.
5. **Status Update:** Once completed, the workflow updates the same row in Google Sheets (Update upload status in sheet node) using the row_number to mark the video as processed.

## Prerequisites (What You Need Before Running)

### 1. n8n Instance

> Recommended: self-hosted via tino.vn/vps-n8n?affid=388, use code VPSN8N for up to 39% off.

### 2. Google Services

- **Google Drive credentials:** OAuth2 credentials for Google Drive so n8n can download video files.
- **Google Sheets credentials:** OAuth2 credentials for Google Sheets to read the posting queue and update statuses.
- **Google Sheet:** A spreadsheet (ID: 1RnE5O06l7W6TLCLKkwEH5Oyl-EZ3OE-Uc3OWFbDohYI) containing:
  - File ID: the video's unique ID in Google Drive.
  - Stories: posting status column (leave empty for pending videos).
  - row_number: used for updating the correct row after posting.

### 3. Facebook Setup

- **Page ID:** Your Facebook Page ID (currently hardcoded as 115432036514099 in the info node).
- **Access Token:** A **Page Access Token** with permissions such as pages_manage_posts and pages_read_engagement. This token is hardcoded in the info node and again in the "Step 3. Post video" node.

## Usage Guide and Implementation Notes

### How to Use

1. Queue videos: add video entries to your Google Sheet. Each entry must include a valid Google Drive File ID. Leave the Stories column empty for videos that haven't been posted.
2. Activate: save and activate the workflow. The Schedule Trigger will automatically handle new uploads every 2 hours and 30 minutes.

### Implementation Notes

- **Token security:** Hardcoding your **Access Token** inside the info node is **not recommended**. Tokens expire and expose your Page to risk if leaked.
  Action: replace the static token with a secure credential setup that supports token rotation.
- **Loop efficiency:** The **"false"** output of the If node currently loops back to the Get Row Sheet node, which creates unnecessary cycles when no videos are found.
  Action: disconnect that branch so the workflow stops gracefully when no unposted videos remain.
- **Status updates:** To prevent re-posting the same video, the final Update upload status in sheet node must update the **Stories** column (e.g., write "POSTED").
  Action: add this mapping explicitly to your Google Sheets node.
- **Automated File ID sync:** This workflow assumes that the Google Sheet already contains valid File IDs.
  You can build a secondary workflow (using Schedule Trigger1 → Search files and folders → Append or update row in sheet) to automatically populate new video File IDs from your Google Drive.

## Result

Once active, this workflow automatically:

- pulls pending videos from your Google Sheet,
- uploads them to Facebook Stories, and
- marks them as posted, all without manual intervention.
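For orientation, Step 1 of the three-step upload could be sketched as below. The video_stories edge and the start phase mirror the workflow description, but parameter names should be confirmed against Meta's current Graph API docs, and the token should come from a credential rather than being hardcoded.

```javascript
// Hedged sketch of Step 1 (initialize the Stories upload session) as a Code node.
const pageId = $json.pageId;        // your Facebook Page ID
const token = $env.FB_PAGE_TOKEN;   // prefer a credential over a hardcoded token

const res = await fetch(`https://graph.facebook.com/v19.0/${pageId}/video_stories`, {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ upload_phase: 'start', access_token: token }),
});

const { video_id, upload_url } = await res.json();
return { json: { video_id, upload_url } }; // Step 2 uploads the binary to upload_url
```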
by WeblineIndia
# APK Security Scanner & PDF Report Generator

This workflow automatically analyzes any newly uploaded APK file and produces a clean, professional PDF security report. When an APK appears in Google Drive, the workflow downloads it, sends it to MobSF for security scanning, summarizes the results, generates an HTML report using AI, converts it into a PDF via PDF.co, and finally saves the PDF back to Google Drive.

## Quick Start: Fastest Way to Use This Workflow

1. Set up a Google Drive folder for uploading APKs.
2. Install MobSF using Docker and copy your API key.
3. Add credentials for Google Drive, MobSF, OpenAI, and PDF.co in n8n.
4. Import the workflow JSON.
5. Update the node credentials.
6. Upload an APK to the watched folder and let the automation run.

## What It Does

This workflow provides a complete automated pipeline for analyzing Android APK files. It removes the manual process of scanning apps, extracting security insights, formatting reports, and distributing results. Each step is designed to streamline application security checks for development teams, QA engineers, and product managers.

Once the workflow detects a new APK in Google Drive, it passes the file to MobSF for a detailed static analysis. The workflow extracts the results, transforms them into a clear and well-structured HTML report using AI, and then converts the report into a PDF. This ensures the end user receives a polished, audit-ready security document with zero manual involvement.

## Who's It For

This workflow is ideal for:

- Mobile development teams performing security checks on apps.
- QA and testing teams validating APK builds before release.
- DevSecOps engineers needing automated, repeatable security audits.
- Software companies generating compliance and audit documentation.
- Agencies reviewing client apps for vulnerabilities.

## Requirements to Use This Workflow

- An n8n instance (self-hosted or cloud)
- A Google Drive account with a folder for APK uploads
- Docker installed to run MobSF locally
- MobSF API key
- OpenAI API key
- PDF.co API key
- Basic understanding of n8n nodes and credentials setup

## How It Works & Setup Instructions

### Step 1 - Prepare Google Drive

Create a folder specifically for APK uploads. Configure the Watch APK Uploads (Google Drive) node to monitor this folder for new files.

### Step 2 - Install and Run MobSF Using Docker

Install Docker and run:

```
docker run -it --rm -p 8000:8000 \
  -v $(pwd)/mobsf:/home/mobsf/.MobSF \
  opensecurity/mobile-security-framework-mobsf
```

Open MobSF at http://localhost:8000 and copy your API key.

### Step 3 - Add Credentials in n8n

Add credentials for:

- Google Drive
- MobSF (API key in headers)
- OpenAI
- PDF.co

### Step 4 - Configure Malware Scanning

- **Upload APK to Analyzer (MobSF Upload API)** sends the file.
- **Start Security Scan (MobSF Scan API)** triggers the vulnerability scan.

### Step 5 - Summarize & Generate HTML Report

- **Summarize MobSF Report (JS Code)** extracts key vulnerabilities.
- **Generate HTML Report (GPT Model)** formats them into a structured report.
- **Clean HTML Output (JS Code)** removes escaped characters.

### Step 6 - Convert HTML to PDF

Use Generate PDF (PDF.co API) to convert the HTML to PDF.

### Step 7 - Save Final Report

Download using Download Generated PDF, then upload via Upload PDF to Google Drive.

## How To Customize Nodes

- **Google Drive Trigger:** Change the folder ID to watch a different upload directory.
- **MobSF API nodes:** Update the URLs if MobSF runs on another port or server.
- **AI Report Generator:** Modify the prompt instructions to change the writing style or report template.
- **PDF Generation:** Edit margins, page size, or the output filename in the PDF.co node.
- **Save Location:** Change the Google Drive folder where the final PDF is stored.

## Add-Ons

You can extend this workflow with:

- **Slack or email notifications** when a report is ready
- **Automatic naming conventions** (e.g., report-{{date}}-{{app_name}}.pdf)
- **Saving reports into Airtable or Notion**
- **Multi-file batch scanning**
- **VirusTotal scan integration** before generating the PDF

## Use Case Examples

- Automated security scanning for every new build generated by CI/CD.
- Pre-release vulnerability checks for client-delivered APKs.
- Compliance documentation generation for internal security audits.
- Bulk scanning of legacy APKs for modernization projects.
- Creating professional PDF security reports for customers.

(Many more use cases can be built on the same workflow foundation.)

## Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| MobSF API call fails | Wrong API key or URL | Check that MobSF is running and the API key is correct. |
| PDF not generated | Invalid HTML or PDF.co key | Validate the HTML output and verify the PDF.co credentials. |
| Workflow not triggering | Wrong Google Drive folder | Reconfigure the Drive Trigger node with the correct folder ID. |
| APK upload fails | File not in binary mode | Ensure the HTTP upload node is using "Binary Data" correctly. |
| Scan returns empty data | MobSF not fully started | Wait for the full MobSF startup logs before scanning. |

## Need Help?

If you need assistance setting up this workflow, customizing it, or adding advanced features such as Slack alerts, CI/CD integration, or bulk scanning, our n8n workflow development team at WeblineIndia can help. We specialize in building secure, scalable, automation-driven workflows on n8n for businesses of all sizes. Contact us anytime for support or to build custom workflow automation solutions.
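As a reference for Step 5, here is a hedged sketch of the "Summarize MobSF Report" Code node. The report fields used (appsec, permissions) are assumptions; match them to the report version your MobSF instance returns.

```javascript
// Hedged sketch: extract key findings from the MobSF JSON report for the AI step.
const report = $json; // MobSF report JSON (shape is an assumption)

const highFindings = (report.appsec?.high ?? []).map((f) => ({
  severity: 'high',
  title: f.title,
  description: f.description,
}));

const dangerousPermissions = Object.entries(report.permissions ?? {})
  .filter(([, meta]) => meta.status === 'dangerous')
  .map(([name]) => name);

return {
  json: {
    appName: report.app_name,
    packageName: report.package_name,
    securityScore: report.appsec?.security_score,
    highFindings,
    dangerousPermissions,
  },
};
```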
by Deborah
Want to learn the basics of n8n? Our comprehensive quickstart tutorial is here to guide you through the basics of n8n, step by step. Designed with beginners in mind, this tutorial provides a hands-on approach to learning n8n's basic functionalities.
by Jitesh Dugar
# Newsletter Sign-up with Email Verification & Welcome Email Automation

## Description

A complete, production-ready newsletter automation workflow that validates email addresses, sends personalized welcome emails, and maintains comprehensive logs in Google Sheets. Perfect for marketing teams, content creators, and businesses looking to build high-quality email lists with minimal manual effort.

## Key Features

### Email Verification

- **Real-time validation** using the Verifi Email API
- Checks email format (RFC compliance)
- Verifies domain existence and MX records
- Detects disposable/temporary email addresses
- Identifies potential spoofed emails

### Automated Welcome Emails

- **Personalized HTML emails** with the subscriber's first name
- Beautiful, mobile-responsive design with gradient headers
- Branded confirmation and unsubscribe links
- Sent via Gmail (or SMTP) automatically to valid subscribers

### Smart Data Handling

- **Comprehensive logging** to Google Sheets with three separate tabs
- Handles incomplete submissions gracefully
- Preserves original user data throughout the verification process
- Tracks source attribution for multi-channel campaigns

### Error Management

- Automatic retry logic on API failures
- Separate logging for different error types
- Detailed technical reasons for invalid emails
- No data loss, with direct webhook referencing

## Use Cases

- **Newsletter sign-ups** on websites and landing pages
- **Lead generation** forms with quality control
- **Marketing campaigns** requiring verified email lists
- **Community building** with automated onboarding
- **SaaS product launches** with email collection
- **Content creator** audience building
- **E-commerce** customer list management

## What Gets Logged

### Master Log (All Subscribers)

- Timestamp, name, email, verification result
- Verification score and email sent status
- Source tracking, disposable status, domain info

### Invalid Emails Log

- Detailed rejection reasons
- Technical diagnostic information
- MX record status, RFC compliance
- Provider information for troubleshooting

### Invalid Submissions Log

- Incomplete form data
- Missing required fields
- Timestamp for follow-up

## Technical Stack

- Trigger: Webhook (POST endpoint)
- Email verification: Verifi Email API
- Email sending: Gmail OAuth2 (or SMTP)
- Data storage: Google Sheets (3 tabs)
- Processing: JavaScript code nodes for data formatting

## Setup Requirements

1. Google account - for Sheets and Gmail integration
2. Verifi Email API key - (https://verifi.email)
3. Google Sheets - pre-configured with 3 tabs (template provided)
4. 5-10 minutes - quick setup with step-by-step instructions included

## Benefits

- **Improve Email Deliverability** - remove invalid emails before sending campaigns
- **Reduce Bounce Rates** - only send to verified, active email addresses
- **Save Money** - don't waste email credits on invalid addresses
- **Better Analytics** - track conversion rates by source
- **Professional Onboarding** - personalized welcome experience
- **Scalable Solution** - handles high-volume sign-ups automatically
- **Data Quality** - build a clean, high-quality subscriber list

## Customization Options

- **Email Template** - fully customizable HTML design
- **Verification Threshold** - adjust score requirements
- **Brand Colors** - match your company branding
- **Confirmation Flow** - add double opt-in if desired
- **Multiple Sources** - track different signup forms
- **Language** - easily translate email content

## What's Included

- Complete n8n workflow JSON (ready to import)
- Google Sheets template structure
- Responsive HTML email template
- Setup documentation with screenshots
- Troubleshooting guide
- Customization examples
## Privacy & Compliance

- GDPR-compliant with unsubscribe links
- Secure data handling via OAuth2
- No data shared with third parties
- Audit trail in Google Sheets
- Easy data deletion/export

## Quick Stats

- **12 Nodes** - fully automated workflow
- **3 Data Paths** - valid, invalid, and incomplete submissions
- **100% Uptime** - when properly configured
- **Instant Processing** - real-time email verification
- **Unlimited Scale** - based on your API limits

## Perfect For

- Marketing agencies
- SaaS companies
- Content creators
- E-commerce stores
- Community platforms
- Educational institutions
- Membership sites
- Newsletter publishers

## Why Use This Workflow?

Instead of manually verifying emails or dealing with bounce complaints, this workflow automates the entire process from sign-up to welcome email. Save hours of manual work, improve your email deliverability, and create a professional first impression with every new subscriber. Start building a high-quality email list today!
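To make the three data paths concrete, here is an illustrative Code node that classifies each submission before the Google Sheets logging. The Verifi response field names and the 0.8 threshold are assumptions; tune them to the actual API response and your quality bar.

```javascript
// Hedged sketch: route each webhook submission to valid / invalid / incomplete.
const form = $json.body ?? {};               // webhook payload (assumed shape)
const verification = $json.verification ?? {}; // Verifi result (assumed field names)

if (!form.email || !form.firstName) {
  return { json: { path: 'incomplete', ...form, timestamp: new Date().toISOString() } };
}

const isValid =
  verification.result === 'deliverable' &&
  !verification.disposable &&
  (verification.score ?? 0) >= 0.8;          // adjustable verification threshold

return {
  json: {
    path: isValid ? 'valid' : 'invalid',
    email: form.email,
    firstName: form.firstName,
    score: verification.score,
    reason: verification.reason ?? null,     // logged to the Invalid Emails tab
  },
};
```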