by BluePro
This workflow monitors targeted subreddits for potential sales leads using Reddit's API, AI content analysis, Supabase, and Google Sheets. It is built to discover posts from Reddit users who may benefit from a particular product or service, and it can be easily customized for any market.

🔍 Features

- **Targeted Subreddit Monitoring:** Searches multiple niche subreddits (smallbusiness, startup, sweatystartup, etc.) using relevant keywords.
- **AI-Powered Relevance Scoring:** Uses OpenAI GPT to analyze each post and determine whether it was written by someone who might benefit from your product, returning a simple "yes" or "no."
- **Duplicate Lead Filtering with Supabase:** Ensures you don't email the same lead more than once by storing already-processed Reddit post IDs in a Supabase table.
- **Content Filtering:** Filters out posts with no body text or no upvotes so only high-quality content is processed.
- **Lead Storage in Google Sheets:** Saves qualified leads into a connected Google Sheet with key data (URL, post content, subreddit, and timestamp).
- **Email Digest Alerts:** Compiles relevant leads and sends a daily digest of matched posts to your team's inbox for review or outreach.
- **Manual or Scheduled Trigger:** Can be run manually or on a schedule via the built-in Schedule Trigger node.

⚙️ Tech Stack

- **Reddit API** – for post discovery
- **OpenAI Chat Model** – for AI-based relevance filtering
- **Supabase** – for lead de-duplication
- **Google Sheets** – for storing lead details
- **Gmail API** – for sending email alerts

🔧 Customization Tips

- **Adjust Audience:** Modify the subreddits and keywords in the initial Code node to match your market.
- **Change the AI Prompt:** Customize the prompt in the "Analysis Content by AI" node to describe your product or service.
- **Search Comments Instead:** To monitor comments instead of posts, change `type=link` to `type=comment` in the Reddit Search node.
- **Change Email Recipients:** Edit the Gmail node to direct leads to a different email address or format.
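The subreddit and keyword configuration lives in the workflow's initial Code node. A minimal sketch of what such a node might return — the subreddit and keyword lists here are illustrative placeholders, not the template's actual values:

```javascript
// Sketch of the initial Code node's logic: emit one work item per
// subreddit/keyword pair so a downstream Reddit Search node can query
// each combination. Lists below are placeholders for your own market.
function buildSearchItems(subreddits, keywords) {
  const items = [];
  for (const subreddit of subreddits) {
    for (const keyword of keywords) {
      items.push({ json: { subreddit, keyword } });
    }
  }
  return items;
}

// Inside an n8n Code node you would end with something like:
// return buildSearchItems(['smallbusiness', 'startup'], ['bookkeeping']);
```

Editing the two arrays is all that's needed to retarget the workflow at a different market.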
by Yang
🛍️ Pick Best-Value Products from Any Website Using Dumpling AI, GPT-4o, and Google Sheets

Who's it for

This workflow is for eCommerce researchers, affiliate marketers, and anyone who needs to compare product listings across sites like Amazon. It's perfect for quickly identifying top product picks based on delivery speed, free shipping, and price.

What it does

Just submit a product listing URL. The workflow crawls it with Dumpling AI, takes screenshots of the pages, and passes them to GPT-4o to extract up to 3 best-value picks. It analyzes the screenshots visually—no HTML scraping needed. Each result includes:

- product name
- price
- review count
- free delivery date (if available)

How it works

1. 📝 Receives a URL through a web form
2. 🧠 Uses Dumpling AI to crawl the website
3. 📸 Takes screenshots of each product listing
4. 🔍 GPT-4o analyzes each image to pick top products
5. 🔧 A Code node parses and flattens the output
6. 📊 Google Sheets stores the result
7. 📧 Sends the spreadsheet link via email

Requirements

- **Dumpling AI token**
- **OpenAI key** (GPT-4o)
- **Google Sheet** with columns: product name, price, reviews no., free_delivery_date

> You can customize the AI prompt to extract other visual insights (e.g., ratings, specs).
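The Code node that parses and flattens GPT-4o's output might look like the sketch below. The AI response shape (`picks` array and its field names) is an assumption for illustration; only the output column names come from the sheet structure above:

```javascript
// Sketch of the flattening Code node: turn GPT-4o's JSON answer into
// one item per product, keyed by the Google Sheet column names.
// The { picks: [...] } input shape is an assumed example, not the
// template's guaranteed schema.
function flattenPicks(aiOutput) {
  return (aiOutput.picks || []).slice(0, 3).map((p) => ({
    json: {
      'product name': p.name,
      price: p.price,
      'reviews no.': p.reviews,
      free_delivery_date: p.freeDeliveryDate || '',
    },
  }));
}
```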
by Luis Hernandez
Overview

This comprehensive n8n workflow automates the generation and distribution of detailed monthly technical support reports from GLPI (IT Service Management platform). The workflow intelligently calculates SLA compliance, analyzes technician performance, and delivers professionally formatted HTML reports via email.

✨ Key Features

Intelligent SLA Calculation

- **Business Hours Tracking:** Automatically calculates resolution time considering only working hours (excludes weekends and lunch breaks)
- **Configurable Schedule:** Customizable work hours (default: 8 AM - 12 PM, 1 PM - 6 PM)
- **Dynamic SLA Monitoring:** Real-time compliance tracking with configurable thresholds (default: 24 hours)
- **Visual Indicators:** Color-coded alerts for critical SLA breaches and high-volume warnings

Comprehensive Reporting

- **General Summary:** Total cases, open, in-progress, resolved, and closed tickets
- **Performance Metrics:** Total and average resolution hours in both decimal and formatted (hours/minutes) display
- **Technician Breakdown:** Individual performance analysis per technician, including case distribution and SLA compliance
- **Smart Alerts:** Automatic warnings for high case volumes (>100 in-progress) and critical SLA levels (<50%)

Professional Email Delivery

- **Responsive HTML Design:** Mobile-optimized email templates with elegant styling
- **Dynamic Content:** Conditional formatting based on performance metrics
- **Automatic Scheduling:** Monthly execution on the 6th day to ensure accurate SLA measurement

💼 Business Benefits

Time Savings

- **Eliminates Manual Work:** Saves 2-4 hours per month previously spent compiling reports by hand
- **Automated Data Collection:** No more exporting CSVs or copying data between systems
- **One-Click Setup:** Configure once and receive reports automatically every month

Improved Decision Making

- **Real-Time Insights:** Identify bottlenecks and performance issues immediately
- **Technician Accountability:** Clear visibility into individual and team performance
- **SLA Compliance Tracking:** Proactively manage service level agreements before they become critical

Enhanced Communication

- **Stakeholder Ready:** Professional reports suitable for management presentations
- **Consistent Format:** Standardized metrics ensure month-over-month comparability
- **Instant Distribution:** Automatic email delivery to relevant stakeholders

🔧 Technical Specifications

Requirements

- n8n instance (self-hosted or cloud)
- GLPI server with API access enabled
- Gmail account (or any SMTP-compatible email service)
- GLPI API credentials (App-Token and user credentials)

Configuration Points

- **Variables Node:** Server URL, API tokens, entity name, work hours, SLA limits
- **Schedule Trigger:** Monthly execution timing (default: 6th of each month)
- **Email Recipient:** Target email address for report delivery
- **Date Range Logic:** Automatic previous-month calculation

Data Processing

- Retrieves up to 999 tickets per execution (configurable)
- Filters by entity and date range
- Excludes weekends and non-business hours from calculations
- Groups data by technician for detailed analysis

📋 Setup Instructions

Prerequisites

1. **GLPI Configuration:** Enable the API and configure the Tickets panel with the required fields (ID, Title, Status, Opening Date, Closing Date, Resolution Date, Priority, Requester, Assigned To)
2. **API Credentials:** Create Basic Auth credentials in n8n for GLPI API access
3. **Email Authentication:** Set up Gmail OAuth2 or SMTP credentials in n8n

Implementation Steps

1. Import the workflow JSON into your n8n instance
2. Configure the Variables node with your GLPI server details and business hours
3. Set up GLPI API credentials in the HTTP Request nodes
4. Configure email credentials in the Gmail node
5. Update the recipient email address
6. Test the workflow manually before enabling the schedule
7. Activate the workflow for automatic monthly execution

🎯 Use Cases

- **IT Support Teams:** Track helpdesk performance and SLA compliance
- **Service Managers:** Monitor team productivity and identify training needs
- **Executive Reporting:** Provide high-level summaries to stakeholders
- **Resource Planning:** Identify workload distribution and capacity issues
- **Compliance Auditing:** Maintain historical records of SLA performance

📈 ROI Impact

- **Time Savings:** 24-48 hours of manual reporting eliminated annually
- **Error Reduction:** Eliminates human calculation errors in SLA tracking
- **Faster Response:** Early alerts enable proactive issue resolution
- **Better Visibility:** Data-driven insights improve team management
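The business-hours SLA calculation can be sketched as follows. This is an illustrative implementation, not the workflow's actual code: it steps through whole hours (a simplification) and uses the template's default shifts of 8-12 and 13-18, Monday through Friday:

```javascript
// Hypothetical sketch of business-hours resolution time: count only
// hours falling inside the configured shifts on weekdays.
function businessHours(start, end) {
  const shifts = [[8, 12], [13, 18]]; // defaults; configurable via the Variables node
  let hours = 0;
  const d = new Date(start);
  d.setMinutes(0, 0, 0); // step in whole hours (a simplification)
  for (; d < end; d.setHours(d.getHours() + 1)) {
    const day = d.getDay();
    if (day === 0 || day === 6) continue; // skip weekends
    const h = d.getHours();
    if (shifts.some(([from, to]) => h >= from && h < to)) hours += 1;
  }
  return hours;
}
```

For example, a ticket opened Monday 8 AM and resolved Monday 6 PM counts 9 business hours, because the 12-1 lunch hour is excluded.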
by vinci-king-01
How it works

This workflow automatically analyzes website visitors in real time, enriches their data with company intelligence, and provides lead scoring and sales alerts.

Key Steps

1. **Webhook Trigger** - Receives visitor data from your website tracking system.
2. **AI-Powered Company Intelligence** - Uses ScrapeGraphAI to extract comprehensive company information from visitor domains.
3. **Visitor Enrichment** - Combines visitor behavior data with company intelligence to create detailed visitor profiles.
4. **Lead Scoring** - Automatically scores leads based on company size, industry, engagement, and intent signals.
5. **CRM Integration** - Updates your CRM with enriched visitor data and lead scores.
6. **Sales Alerts** - Sends real-time notifications to your sales team for high-priority leads.

Set up steps

Setup time: 10-15 minutes

1. **Configure ScrapeGraphAI credentials** - Add your ScrapeGraphAI API key for company intelligence gathering.
2. **Set up HubSpot connection** - Connect your HubSpot CRM to automatically update contact records.
3. **Configure Slack integration** - Set up your Slack workspace and specify the sales alert channel.
4. **Customize lead scoring criteria** - Adjust the scoring algorithm to match your target customer profile.
5. **Set up website tracking** - Configure your website to send visitor data to the webhook endpoint.
6. **Test the workflow** - Verify all integrations are working correctly with a test visitor.

Key Features

- **Real-time visitor analysis** with company intelligence enrichment
- **Automated lead scoring** based on multiple factors (company size, industry, engagement)
- **Intent signal detection** (pricing interest, demo requests, contact intent)
- **Priority-based sales alerts** with recommended actions
- **CRM integration** for seamless lead management
- **Deal size estimation** based on company characteristics
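A lead-scoring step of the kind described can be sketched like this. The weights, thresholds, and signal names below are assumptions for illustration — the template's actual algorithm is what you customize in step 4:

```javascript
// Illustrative lead-scoring sketch combining company size, industry,
// engagement, and intent signals into a 0-100 score.
// All weights and field names are hypothetical.
function scoreLead(visitor) {
  let score = 0;
  if (visitor.companySize >= 200) score += 30;       // company size
  else if (visitor.companySize >= 50) score += 20;
  if (['saas', 'fintech'].includes(visitor.industry)) score += 20; // industry fit
  score += Math.min(visitor.pageViews * 2, 20);      // engagement, capped
  if (visitor.visitedPricing) score += 15;           // intent: pricing interest
  if (visitor.requestedDemo) score += 15;            // intent: demo request
  return Math.min(score, 100);
}
```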
by David Olusola
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

WordPress to Blotato Social Publisher

Overview: This automation monitors your WordPress site for new posts and automatically creates platform-specific social media content using AI, then posts to Twitter, LinkedIn, and Facebook via Blotato.

What it does:

1. Monitors your WordPress site for new posts every 30 minutes
2. Filters posts published in the last hour to avoid duplicates
3. Processes each new post individually
4. AI generates optimized content for each social platform (Twitter, LinkedIn, Facebook)
5. Extracts platform-specific content from the AI response
6. Publishes to all three social media platforms via the Blotato API

Setup Required:

WordPress Connection

- Configure WordPress credentials in the "Check New Posts" node
- Enter your WordPress site URL, username, and password/app password

Blotato Social Media API Setup

- Get your Blotato API key from your Blotato account
- Configure API credentials in the Blotato connection node
- Map each platform (Twitter, LinkedIn, Facebook) to the correct Blotato channel

AI Configuration

- Set up Google Gemini API credentials
- Connect the Gemini model to the "AI Social Content Creator" node

Customization Options

- **Posting Frequency:** Modify the Schedule Trigger (default: every 30 minutes)
- **Content Tone:** Adjust the AI system message for different writing styles
- **Post Filtering:** Change the time window in the WordPress node (default: last hour)
- **Platform Selection:** Remove any social media platforms you don't want to use

Testing

- Run the workflow manually to test connections
- Verify posts appear correctly on all platforms
- Monitor for API rate-limit issues

Features:

- Platform-optimized content (hashtags, character limits, professional tone)
- Duplicate prevention system
- Batch processing for multiple posts
- Featured image support
- Customizable posting frequency

Customization:

- Change monitoring frequency
- Adjust AI prompts for different tones
- Add/remove social platforms
- Modify hashtag strategies

Need Help? For n8n coaching or one-on-one consultation
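The last-hour post filter can be sketched as below — a hypothetical helper, not the template's actual node code, assuming each WordPress item carries a `date` field:

```javascript
// Sketch of the post-filtering step: keep only posts published within
// the template's default one-hour window. The `date` field name is an
// assumption based on the WordPress REST API's post shape.
function recentPosts(posts, now = new Date()) {
  const cutoff = new Date(now.getTime() - 60 * 60 * 1000);
  return posts.filter((p) => new Date(p.date) > cutoff);
}
```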
by Michael Gullo
Workflow Purpose

The workflow is designed to scan submitted URLs using urlscan.io and VirusTotal, combine the results into a single structured summary, and send the report via Telegram. I built this workflow for people who primarily work from their phones and receive a constant stream of emails throughout the day. If a user gets an email asking them to sign a document, review a report, or take any action where the link looks suspicious, they can simply open the Telegram bot and quickly check whether the URL is safe before clicking it.

Key Components

1. **Input / Trigger** - Accepts URLs that need to be checked and initiates requests to VirusTotal and urlscan.io.
2. **VirusTotal Scan** - Always returns results if the URL is reachable. Provides reputation, malicious/clean flags, and scan metadata.
3. **urlscan.io Scan** - Returns details on how the URL behaves when loaded (domains, requests, resources, etc.). Sometimes fails due to blocks or restrictions.
4. **Error Handling with Code Node** - Checks whether urlscan.io responded successfully and ensures the workflow always produces a summary, even if urlscan.io fails.
5. **Summary Generation** - If both scans succeed, summarizes the combined findings from VirusTotal and urlscan.io. If urlscan.io fails, the summary states clearly: "urlscan.io scan was blocked/failed. Relying on VirusTotal results." Either way, the user still gets a complete security report.
6. **Telegram Output** - The final formatted summary is delivered to a Telegram chat via the bot. (A chat ID issue was fixed after the Code node restructuring.)

Outcome

The workflow guarantees a consistent, user-friendly summary regardless of urlscan.io failures. It leverages VirusTotal as the fallback source of truth, and the Telegram bot provides real-time alerts with clear indications of scan success or failure.

Prerequisites

Telegram

1. In Telegram, start a chat with @BotFather.
2. Send /newbot, then pick a name and a unique username.
3. Copy the HTTP API token BotFather returns (store it securely).
4. Start a DM with your bot and send any message.
5. Call getUpdates and read the chat.id.

urlscan.io

1. Create or log into your urlscan.io account.
2. Go to Settings & API → New API key and generate a key.
3. (Recommended) In Settings & API, set Default Scan Visibility to Unlisted to avoid exposing PII in public scans.
4. Save the key securely (env var or n8n Credentials).

Rate limits note: urlscan.io enforces per-minute/hour/day quotas; exceeding them returns HTTP 429. You can view your personal quotas on their dashboard/quotas endpoint.

VirusTotal

1. Sign up / sign in to VirusTotal Community.
2. Open My API key (Profile menu) and copy your Public API key.
3. Store it securely (env var or n8n Credentials).

For a more reliable connection with VirusTotal and improved scanning results, enable the Header section in the node settings. Add a header parameter with a clear name (e.g., x-apikey) and paste your API key into the Value field.

Rate limits (Public API): 4 requests/minute, 500/day; not for commercial workflows. Consider Premium if you'll exceed this.

How to Customize the Workflow

This workflow is designed to be highly customizable. Additional malicious-website scanners can be integrated through HTTP Request nodes; to make this work, simply update the Merge node so that all information flows correctly through the workflow. You can also connect Gmail or Outlook nodes to automatically test URLs, binary attachments, and other information received via email—helping you evaluate data before opening it. Finally, you can customize how reports are delivered: results can be sent through Telegram (the default), Slack, or Microsoft Teams, or saved to Google Drive or a Google Sheet for recordkeeping and audit purposes.

For consulting and support, or if you have questions, please feel free to connect with me on LinkedIn or via email.
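The error-handling Code node's fallback logic can be sketched like this. The result field names (`stats`, `verdict`, `domains`) are assumptions for illustration; the fallback wording comes from the workflow description:

```javascript
// Sketch of the error-handling Code node: always produce one summary
// object, falling back to VirusTotal alone when urlscan.io failed.
// Input field names are hypothetical, not the APIs' exact schemas.
function buildSummary(vtResult, urlscanResult) {
  const summary = {
    url: vtResult.url,
    virusTotal: {
      malicious: vtResult.stats.malicious,
      harmless: vtResult.stats.harmless,
    },
  };
  if (urlscanResult && !urlscanResult.error) {
    summary.urlscan = {
      verdict: urlscanResult.verdict,
      domains: urlscanResult.domains,
    };
  } else {
    summary.note = 'urlscan.io scan was blocked/failed. Relying on VirusTotal results.';
  }
  return summary;
}
```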
by Jitesh Dugar
Revolutionize your recruitment process with intelligent AI-driven candidate screening that evaluates resumes, scores applicants, and automatically routes them based on fit - saving 10-15 hours per week on initial screening.

🎯 What This Workflow Does

Transforms your hiring pipeline from manual resume review to intelligent automation:

1. 📝 **Captures Applications** - Jotform intake with resume upload
2. 🤖 **AI Resume Analysis** - OpenAI parses skills, experience, education, and red flags
3. 🎯 **Intelligent Scoring** - Evaluates candidates against job requirements with structured scoring (0-100)
4. 🚦 **Smart Routing** - Automatically routes based on AI recommendation:
   - Strong Yes (85-100): Instant Slack alert → Interview invitation
   - Maybe/Yes (60-84): Manager review → Approval workflow
   - No (<60): Polite rejection email
5. 📊 **Analytics Tracking** - All data logged to Google Sheets for hiring insights

✨ Key Features

- **AI Resume Parsing:** Extracts structured data from any resume format
- **Intelligent Scoring System:** Multi-dimensional evaluation (skills match, experience quality, cultural fit)
- **Structured Output:** Consistent JSON schema ensures reliable data for decision-making
- **Automated Communication:** Personalized emails for every candidate outcome
- **Human-in-the-Loop:** Manager approval for borderline candidates
- **Comprehensive Analytics:** Track conversion rates, average scores, and hiring metrics
- **Customizable Job Requirements:** Easy prompt editing to match any role

💼 Perfect For

- **Startups & Scale-ups** processing 50+ applications per week
- **HR Teams** wanting to reduce time-to-hire by 40-60%
- **Technical Recruiters** screening engineering, product, or design roles
- **Growing Companies** scaling hiring without scaling headcount

🔧 What You'll Need

Required Integrations

- **Jotform** - Application intake form (free tier works); create your form for free on Jotform using this link
- **OpenAI API** - GPT-4o-mini for cost-effective AI analysis (~$0.15 per candidate)
- **Gmail** - Automated candidate communication
- **Google Sheets** - Hiring database and analytics

Optional Integrations

- **Slack** - Instant alerts for hot candidates
- **Linear/Asana** - Task creation for interview scheduling
- **Calendar APIs** - Automated interview booking

🚀 Quick Start

1. **Import Template** - Copy the JSON and import it into n8n
2. **Create Jotform** - Use the provided field structure (name, email, resume upload, etc.)
3. **Add API Keys** - OpenAI, Jotform, Gmail, Google Sheets
4. **Customize Job Requirements** - Edit the AI screening prompt with your role details
5. **Personalize Emails** - Update templates with your company branding
6. **Test & Deploy** - Submit a test application and verify all nodes

🎨 Customization Options

- **Adjust Scoring Thresholds:** Change the routing logic based on your needs
- **Multiple Positions:** Clone the workflow for different roles with unique requirements
- **Add Technical Assessments:** Integrate HackerRank, CodeSignal, or custom tests
- **Interview Scheduling:** Connect Calendly or Google Calendar for auto-booking
- **ATS Integration:** Push data to Lever, Greenhouse, or BambooHR
- **Diversity Tracking:** Add demographic fields and analytics
- **Reference Checking:** Automate reference request emails

📈 Expected Results

- **90% reduction** in manual resume review time
- **24-hour response time** to all candidates
- **Zero missed applications** - every candidate gets feedback
- **Data-driven hiring** - track what works with comprehensive analytics
- **Better candidate experience** - fast, professional communication
- **Consistent evaluation** - eliminate unconscious bias with structured AI scoring

🏆 Use Cases

- **Technology Companies:** Screen 100+ engineering applications per week, identify the top 10% instantly, schedule interviews same-day.
- **Agencies & Consultancies:** Evaluate consultant candidates across multiple skill dimensions, route to appropriate practice areas.
- **High-Volume Hiring:** Process retail, customer service, or sales applications at scale with consistent quality.
- **Remote-First Teams:** Evaluate global candidates 24/7, respond instantly regardless of timezone.

💡 Pro Tips

- **Train Your AI:** After 50+ applications, refine prompts based on false positives/negatives
- **A/B Test Thresholds:** Experiment with score cutoffs to optimize for your needs
- **Build a Talent Pipeline:** Keep "maybe" candidates in your CRM for future roles
- **Track Source Effectiveness:** Add UTM parameters to measure which job boards deliver the best candidates
- **Continuous Improvement:** Review AI assessments weekly to calibrate accuracy

🎓 Learning Resources

This workflow demonstrates:

- AI Agents with structured output
- Multi-stage conditional routing
- Human-in-the-loop automation
- Binary data processing (resume files)
- Email automation with HTML templates
- Real-time notifications
- Analytics and data logging

Perfect for learning advanced n8n automation patterns!

Ready to transform your hiring process? Import this template and start screening candidates intelligently in under 30 minutes. Questions or customization needs? The workflow includes detailed sticky notes explaining each section.
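The smart-routing step reduces to a three-way branch on the AI's 0-100 score. A sketch using the template's stated bands (the outcome labels are illustrative names, not the workflow's node names):

```javascript
// Sketch of the routing logic: Strong Yes (85-100) → interview invite
// with Slack alert; Maybe/Yes (60-84) → manager review; No (<60) →
// polite rejection email.
function routeCandidate(score) {
  if (score >= 85) return 'interview_invite'; // + instant Slack alert
  if (score >= 60) return 'manager_review';   // approval workflow
  return 'rejection_email';
}
```

Adjusting the two cutoffs here is exactly the "Adjust Scoring Thresholds" customization described above.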
by Rohit Dabra
Jira MCP Server Integration with n8n

Overview

Transform your Jira project management with the power of AI and automation! This n8n workflow template demonstrates how to create a seamless integration between chat interfaces, AI processing, and Jira Software using MCP (Model Context Protocol) server architecture.

What This Workflow Does

- **Chat-Driven Automation:** Trigger Jira operations through simple chat messages
- **AI-Powered Issue Creation:** Automatically generate detailed Jira issues with descriptions and acceptance criteria
- **Complete Jira Management:** Get issue status, changelogs, and comments, and perform full CRUD operations
- **Memory Integration:** Maintain context across conversations for smarter automations
- **Zero Manual Entry:** Eliminate repetitive data entry and human errors

Key Features

✅ **Natural Language Processing:** Use Google Gemini to understand and process chat requests
✅ **MCP Server Integration:** Secure, efficient communication with Jira APIs
✅ **Comprehensive Jira Operations:** Create, read, update, and delete issues and comments
✅ **Smart Memory:** Context-aware conversations for better automation
✅ **Multi-Action Workflow:** Handle multiple Jira operations from a single trigger

Demo Video

🎥 Watch the complete demo: Automate Jira Issue Creation with n8n & AI | MCP Server Integration

Prerequisites

Before setting up this workflow, ensure you have:

- **n8n instance** (cloud or self-hosted)
- **Jira Software** account with appropriate permissions
- **Google Gemini API** credentials
- **MCP Server** configured and accessible
- Basic understanding of n8n workflows

Setup Guide

Step 1: Import the Workflow

1. Copy the workflow JSON from this template
2. In your n8n instance, click Import > From Text
3. Paste the JSON and click Import

Step 2: Configure Google Gemini

1. Open the Google Gemini Chat Model node
2. Add your Google Gemini API credentials
3. Configure the model parameters:
   - Model: gemini-pro (recommended)
   - Temperature: 0.7 for balanced creativity
   - Max tokens: as per your requirements

Step 3: Set Up the MCP Server Connection

1. Configure the MCP Client node:
   - Server URL: your MCP server endpoint
   - Authentication: add the required credentials
   - Timeout: set appropriate timeout values
2. Ensure your MCP server supports the needed Jira operations:
   - Issue creation and retrieval
   - Comment management
   - Status updates
   - Changelog access

Step 4: Configure Jira Integration

1. Set up Jira credentials in n8n:
   - Go to Credentials > Add Credential
   - Select Jira Software API
   - Add your Jira instance URL, email, and API token
2. Configure each Jira node:
   - Get Issue Status: set the project key and filters
   - Create Issue: define the issue type and required fields
   - Manage Comments: set permissions and content rules

Step 5: Memory Configuration

Configure the Simple Memory node:

- Set the memory key for conversation context
- Define the memory retention duration
- Configure the memory scope (user/session level)

Step 6: Chat Trigger Setup

Configure the When Chat Message Received trigger:

- Set up the webhook URL or chat platform integration
- Define message filters if needed
- Test the trigger with sample messages

Usage Examples

Creating a Jira Issue

Chat input: "Can you create an issue in Jira for Login Page with detailed description and acceptance criteria?"

Expected output:

- New Jira issue created with a structured description
- Automatically generated acceptance criteria
- Proper labeling and categorization

Getting Issue Status

Chat input: "What's the status of issue PROJ-123?"

Expected output:

- Current issue status
- Last-updated information
- Assigned user details

Managing Comments

Chat input: "Add a comment to issue PROJ-123: 'Ready for testing in staging environment'"

Expected output:

- Comment added to the specified issue
- Notification sent to relevant team members

Customization Options

Extending Jira Operations

- Add more Jira operations (transitions, watchers, attachments)
- Implement custom field handling
- Create multi-project workflows

AI Enhancement

- Fine-tune Gemini prompts for better issue descriptions
- Add custom validation rules
- Implement approval workflows

Integration Expansion

- Connect to Slack, Discord, or Teams
- Add email notifications
- Integrate with time-tracking tools

Troubleshooting

Common Issues

- **MCP Server Connection Failed:** Verify the server URL and credentials, check network connectivity, and ensure the MCP server is running and accessible.
- **Jira API Errors:** Validate Jira credentials and permissions, check project access rights, and verify issue type and field configurations.
- **AI Response Issues:** Review Gemini API quotas and limits, adjust prompt engineering for better results, and check model parameters and settings.

Performance Tips

- Optimize memory usage for long conversations
- Implement rate limiting for API calls
- Use error handling and retry mechanisms
- Monitor workflow execution times

Best Practices

- **Security:** Store all credentials securely using n8n's credential system
- **Testing:** Test each node individually before running the complete workflow
- **Monitoring:** Set up alerts for workflow failures and API limits
- **Documentation:** Keep track of custom configurations and modifications
- **Backup:** Regularly back up workflow configurations and credentials

Happy Automating! 🚀

This workflow template is designed to boost productivity and eliminate manual Jira management tasks. Customize it according to your team's specific needs and processes.
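For reference, the Create Issue operation ultimately assembles a request body in the shape the Jira Cloud REST API v3 expects — a sketch below, with the project key, issue type, and text purely illustrative:

```javascript
// Illustrative Jira Cloud REST v3 create-issue body (POST /rest/api/3/issue).
// Project key, issue type, and text are example values; v3 descriptions
// use the Atlassian Document Format (ADF).
const createIssueBody = {
  fields: {
    project: { key: 'PROJ' },
    issuetype: { name: 'Task' },
    summary: 'Login Page',
    description: {
      type: 'doc',
      version: 1,
      content: [
        {
          type: 'paragraph',
          content: [
            { type: 'text', text: 'Detailed description and acceptance criteria.' },
          ],
        },
      ],
    },
  },
};
```

The Jira node and MCP server handle this serialization for you; the sketch only shows what "define the issue type and required fields" maps to on the wire.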
by Li CHEN
AWS News Analysis and LinkedIn Automation Pipeline Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows. Who's it for This template is perfect for: Cloud architects and DevOps engineers** who want to stay current with AWS developments Content creators** looking to automate their AWS news coverage Marketing teams** needing consistent, professional AWS content Technical leaders** who want to share industry insights on LinkedIn AWS consultants** building thought leadership through automated content How it works This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows: Flow 1: News Collection and Analysis Scheduled RSS Monitoring: Automatically fetches latest AWS news from the official AWS RSS feed daily at 8 PM AI-Powered Analysis: Uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting: Professional summary Key themes and keywords Importance rating (Low/Medium/High) Business impact assessment Structured Data Storage: Saves analyzed news to Feishu Bitable with approval status tracking Flow 2: LinkedIn Content Generation Manual Approval Trigger: Feishu automation sends approved news items to the webhook AI Content Creation: AWS Bedrock generates professional LinkedIn posts with: Attention-grabbing headlines Technical insights from a Solutions Architect perspective Business impact analysis Call-to-action engagement Automated Publishing: Posts directly to LinkedIn with relevant hashtags How to set up Prerequisites AWS Bedrock access** with Claude 3 Sonnet model enabled Feishu account** with Bitable access LinkedIn company account** with posting permissions n8n instance** (self-hosted or cloud) Detailed Configuration Steps 1. 
AWS Bedrock Setup Step 1: Enable Claude 3 Sonnet Model Log into your AWS Console Navigate to AWS Bedrock Go to Model access in the left sidebar Find Anthropic Claude 3 Sonnet and click Request model access Fill out the access request form (usually approved within minutes) Once approved, verify the model appears in your Model access list Step 2: Create IAM User and Credentials Go to IAM Console Click Users → Create user Name: n8n-bedrock-user Attach policy: AmazonBedrockFullAccess (or create custom policy with minimal permissions) Go to Security credentials tab → Create access key Choose Application running outside AWS Download the credentials CSV file Step 3: Configure in n8n In n8n, go to Credentials → Add credential Select AWS credential type Enter your Access Key ID and Secret Access Key Set Region to your preferred AWS region (e.g., us-east-1) Test the connection Useful Links: AWS Bedrock Documentation Claude 3 Sonnet Model Access AWS Bedrock Pricing 2. Feishu Bitable Configuration Step 1: Create Feishu Account and App Sign up at Feishu International Create a new Bitable (multi-dimensional table) Go to Developer Console → Create App Enable Bitable permissions in your app Generate App Token and App Secret Step 2: Create Bitable Structure Create a new Bitable with these columns: title (Text) pubDate (Date) summary (Long Text) keywords (Multi-select) rating (Single Select: Low, Medium, High) link (URL) approval_status (Single Select: Pending, Approved, Rejected) Get your App Token and Table ID: App Token: Found in app settings Table ID: Found in the Bitable URL (tbl...) 
Step 3: Set Up Automation In your Bitable, go to Automation → Create automation Trigger: When field value changes → Select approval_status field Condition: approval_status equals "Approved" Action: Send HTTP request Method: POST URL: Your n8n webhook URL (from Flow 2) Headers: Content-Type: application/json Body: {{record}} Step 4: Configure Feishu Credentials in n8n Install Feishu Lite community node (self-hosted only) Add Feishu credential with your App Token and App Secret Test the connection Useful Links: Feishu Developer Documentation Bitable API Reference Feishu Automation Guide 3. LinkedIn Company Account Setup Step 1: Create LinkedIn App Go to LinkedIn Developer Portal Click Create App Fill in app details: App name: AWS News Automation LinkedIn Page: Select your company page App logo: Upload your logo Legal agreement: Accept terms Step 2: Configure OAuth2 Settings In your app, go to Auth tab Add redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback Request these scopes: w_member_social (Post on behalf of members) r_liteprofile (Read basic profile) r_emailaddress (Read email address) Step 3: Get Company Page Access Go to your LinkedIn Company Page Navigate to Admin tools → Manage admins Ensure you have Content admin or Super admin role Note your Company Page ID (found in page URL) Step 4: Configure LinkedIn Credentials in n8n Add LinkedIn OAuth2 credential Enter your Client ID and Client Secret Complete OAuth2 flow by clicking Connect my account Select your company page for posting Useful Links: LinkedIn Developer Portal LinkedIn API Documentation LinkedIn OAuth2 Guide 4. 
## Workflow Activation

Final setup steps:
1. Import the workflow JSON into n8n
2. Configure all credential connections: AWS Bedrock credentials, Feishu credentials, LinkedIn OAuth2 credentials
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify that the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

## Requirements

### Service Requirements
- **AWS Bedrock** with Claude 3 Sonnet model access
  - AWS account with the Bedrock service enabled
  - IAM user with Bedrock permissions
  - Model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and the approval workflow
  - Feishu account (International or Lark)
  - Developer app with Bitable permissions
  - Automation capabilities for webhook triggers
- **LinkedIn Company Account** for automated posting
  - LinkedIn company page with admin access
  - LinkedIn Developer app with posting permissions
  - OAuth2 authentication setup
- **n8n community nodes**: Feishu Lite node (self-hosted only)

### Technical Requirements
- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

### Cost Considerations
- **AWS Bedrock**: ~$0.01-0.05 per news analysis
- **Feishu**: free tier available; paid plans for advanced features
- **LinkedIn**: free API access with rate limits
- **n8n**: self-hosted (free) or cloud subscription

## How to Customize the Workflow

### Content Customization
- **Modify AI prompts** in the AI Agent nodes to change tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

### Integration Options
- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM** systems to track content performance
- **Add content calendar** integration for better planning

### Advanced Features
- **Multi-language support** by modifying AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

### Technical Modifications
- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls

## Important Notes

### Service Limitations
- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting; adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations**: each AI analysis costs approximately $0.01-0.05 USD per news item

### Troubleshooting Common Issues
AWS Bedrock:
- **Model not found**: ensure Claude 3 Sonnet access is approved in your region
- **Access denied**: verify that IAM permissions include Bedrock service access
- **Rate limiting**: implement retry logic or reduce analysis frequency

Feishu integration:
- **Authentication failed**: check that the App Token and App Secret are correct
- **Table not found**: verify that the Table ID matches your Bitable URL
- **Automation not triggering**: ensure the webhook URL is accessible and returns a 200 status

LinkedIn posting:
- **OAuth2 errors**: re-authenticate the LinkedIn credentials
- **Posting failed**: verify company page admin permissions
- **Rate limits**: LinkedIn enforces daily posting limits for company pages

### Security Best Practices
- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
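The "implement retry logic for failed API calls" recommendation can be sketched as a snippet for an n8n Code node; this is a minimal, generic helper (the function names and defaults are illustrative, not part of the template):

```javascript
// Generic retry helper with exponential backoff for failed API calls.
// Drop into an n8n Code node; `attempt` is any async function.
async function withRetry(attempt, { retries = 3, baseDelayMs = 500 } = {}) {
  let lastError;
  for (let i = 0; i <= retries; i++) {
    try {
      return await attempt();
    } catch (err) {
      lastError = err;
      if (i === retries) break;
      // Back off: baseDelayMs, then 2x, 4x, ... before each retry.
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** i));
    }
  }
  throw lastError;
}

// Example: a call that fails twice before succeeding.
let attempts = 0;
withRetry(async () => {
  attempts++;
  if (attempts < 3) throw new Error("transient API failure");
  return "posted";
}, { retries: 3, baseDelayMs: 10 }).then((result) => {
  console.log(result, attempts); // "posted" 3
});
```

Wrapping the Feishu and LinkedIn HTTP calls this way smooths over transient rate-limit errors without re-running the whole workflow.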
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🚀 Transform Your Job Hunt with an AI-Powered Telegram Bot

Turn job searching into a conversational experience! This intelligent Telegram bot automatically scrapes job postings from LinkedIn, Indeed, and Monster, filters for sales and marketing positions, and delivers personalized results directly to your chat.

## ✨ Key Features
- **Interactive Telegram Commands**: simple `/jobs [keyword] [location]` searches
- **Multi-Platform Scraping**: simultaneous data collection from 3 major job boards
- **AI-Powered Filtering**: smart relevance detection and experience-level classification
- **Real-Time Notifications**: instant job alerts delivered to Telegram
- **Automated Data Storage**: saves results to Google Sheets and Airtable
- **Duplicate Removal**: advanced deduplication across platforms
- **Mobile-First Experience**: full job search functionality through Telegram

## 🎯 Perfect For
- **Sales Professionals**: account managers, sales representatives, business development
- **Marketing Experts**: digital marketers, marketing managers, growth specialists
- **Recruiters**: streamlined candidate sourcing and job market analysis
- **Job Seekers**: hands-free job discovery with instant notifications

## 🛠️ Setup Requirements

Required credentials:
- **Telegram Bot Token**: create a bot via @BotFather
- **Bright Data API**: professional web scraping service (LinkedIn/Indeed datasets)
- **Google Sheets OAuth2**: for spreadsheet integration
- **Airtable Token**: database storage and management

Prerequisites:
- n8n instance with HTTPS enabled (required for Telegram webhooks)
- Valid domain name with SSL certificate
- Basic understanding of Telegram bot commands

## 🔧 How It Works

User experience:
1. Send `/start` to activate the bot and see available commands
2. Use `/jobs sales manager New York` to search for specific positions
3. Receive formatted job results instantly in Telegram
4. Click "Apply Now" links to go directly to job postings
5. All jobs are automatically saved to your connected spreadsheets

Behind the scenes:
1. **Command Processing**: the bot parses user input for keywords and location
2. **Parallel Scraping**: simultaneous API calls to LinkedIn, Indeed, and Monster
3. **AI Processing**: intelligent filtering, experience-level detection, remote-work identification
4. **Data Enhancement**: salary extraction, duplicate removal, relevance scoring
5. **Multi-Format Storage**: automatic saving to Google Sheets, Airtable, and JSON export
6. **Real-Time Response**: formatted results delivered back to the Telegram chat

## 🎨 Telegram Bot Commands
- `/start` - welcome message and command overview
- `/jobs [keyword] [location]` - search for jobs (e.g., `/jobs marketing manager remote`)
- `/help` - show detailed help information
- `/status` - check bot status and recent activity

## 📊 Sample Output

The bot delivers beautifully formatted job results:

🎯 Job Search Results 🎯
Found 7 relevant opportunities
Platforms: linkedin, indeed, monster
Remote jobs: 3
───────────────────
💼 Senior Sales Manager
🏢 TechCorp Industries
📍 New York, NY
💰 $80,000 - $120,000
🌐 Remote Available
📊 senior level
🔗 Apply Now

## 🔒 Security & Best Practices
- **Rate Limiting**: built-in Telegram API compliance (30 requests/second)
- **Error Handling**: graceful failure recovery with user-friendly messages
- **Input Validation**: sanitized user input to prevent injection attacks
- **Credential Management**: secure API key storage using the n8n credentials system
- **HTTPS Enforcement**: required for production Telegram webhook integration

## 📈 Benefits & ROI
- **95% Time Reduction**: automated job discovery vs. manual searching
- **Multi-Source Coverage**: access 3 major job platforms simultaneously
- **Mobile Accessibility**: search jobs anywhere using the Telegram mobile app
- **Real-Time Alerts**: never miss new opportunities with instant notifications
- **Data Organization**: automatic spreadsheet management for job tracking
- **Market Intelligence**: comprehensive job market analysis and trends

## 🚀 Advanced Customization
- **Custom Keywords**: modify filtering logic for specific industries
- **Location Targeting**: adjust geographic search parameters
- **Experience Levels**: fine-tune senior/mid/entry-level detection
- **Additional Platforms**: easily add more job boards via HTTP requests
- **Notification Scheduling**: set up periodic automated job alerts
- **Team Integration**: deploy for multiple users or team channels

## 💡 Use Cases
- **Individual Job Seekers**: personal job hunting assistant
- **Recruitment Agencies**: streamlined candidate sourcing
- **Sales Teams**: territory-specific opportunity monitoring
- **Marketing Departments**: industry trend analysis and competitor tracking
- **Career Coaches**: client job market research and opportunity identification

Ready to revolutionize your job search? Deploy this workflow and start receiving personalized job opportunities directly in Telegram!
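The cross-platform duplicate removal step could be sketched as an n8n Code-node snippet along these lines; the field names (`title`, `company`, `source`) are assumptions about the scraped job objects, not guaranteed by the template:

```javascript
// Deduplicate jobs scraped from several boards by a normalized
// title + company key; the first occurrence wins.
function dedupeJobs(jobs) {
  const seen = new Set();
  const unique = [];
  for (const job of jobs) {
    const key = `${job.title}|${job.company}`
      .toLowerCase()
      .replace(/\s+/g, " ")
      .trim();
    if (!seen.has(key)) {
      seen.add(key);
      unique.push(job);
    }
  }
  return unique;
}

const scraped = [
  { title: "Senior Sales Manager", company: "TechCorp Industries", source: "linkedin" },
  { title: "senior sales  manager", company: "TechCorp Industries", source: "indeed" },
  { title: "Marketing Lead", company: "Acme", source: "monster" },
];
console.log(dedupeJobs(scraped).length); // 2
```

Normalizing case and whitespace before comparing catches the same posting syndicated to multiple boards with slightly different formatting.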
by vinci-king-01
# Sales Pipeline Automation Dashboard with AI Lead Intelligence

## 🎯 Target Audience
- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

## 🚀 Problem Statement
Manual lead qualification and sales pipeline management are inefficient and often lead to missed opportunities or poor lead prioritization. This template solves the challenge of automatically scoring, qualifying, and routing leads using AI-powered intelligence to maximize conversion rates and sales team productivity.

## 🔧 How It Works
This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

Key components:
1. **Dual Trigger System** - scheduled monitoring and webhook triggers for real-time lead processing
2. **AI-Powered Lead Intelligence** - advanced scoring algorithm based on 7 key factors
3. **Multi-Source Data Enrichment** - LinkedIn and Crunchbase integration for comprehensive lead profiles
4. **Automated Sales Actions** - intelligent routing, task creation, and follow-up sequences
5. **Multi-Platform Integration** - HubSpot CRM, Slack notifications, and Google Sheets dashboard

## 📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheet:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |

## 🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

Prerequisites:
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

Step-by-step configuration:

1. **Install community nodes**

   ```bash
   npm install n8n-nodes-scrapegraphai
   npm install n8n-nodes-slack
   ```

2. **Configure ScrapeGraphAI credentials**
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. **Set up the HubSpot CRM integration**
   - Add HubSpot API credentials
   - Grant the necessary permissions for contacts, deals, and tasks
   - Configure custom properties for lead scoring and qualification
   - Test the connection to ensure it's working
4. **Set up the Google Sheets connection**
   - Add Google Sheets OAuth2 credentials
   - Grant the necessary permissions for spreadsheet access
   - Create a new spreadsheet for sales pipeline data
   - Configure the sheet name (default: "Sales Pipeline")
5. **Configure lead scoring parameters**
   - Update the lead scoring weights in the Code node
   - Customize the ideal customer profile criteria
   - Set automation trigger thresholds
   - Adjust the sales rep assignment logic
6. **Set up notification channels**
   - Configure Slack webhook or API credentials
   - Set up email service credentials for welcome emails
   - Define notification preferences for different lead grades
   - Test notification delivery
7. **Configure triggers**
   - Set up the webhook endpoint for real-time lead capture
   - Configure the scheduled trigger for periodic monitoring
   - Choose appropriate time zones for your business hours
   - Test both trigger mechanisms
8. **Test and validate**
   - Run the workflow manually with sample lead data
   - Check HubSpot for proper contact and deal creation
   - Verify the Google Sheets data formatting
   - Test all notification channels

## 🔄 Workflow Customization Options

Modify the lead scoring algorithm:
- Adjust scoring weights for different factors
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

Extend data enrichment:
- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent-signal detection

Customize sales automation:
- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

Customize output:
- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion-rate tracking and analysis

## 📈 Use Cases
- **Lead Qualification**: automatically score and qualify incoming leads
- **Sales Pipeline Management**: streamline the entire sales process
- **Lead Routing**: intelligently assign leads to appropriate sales reps
- **Follow-up Automation**: ensure consistent and timely follow-up
- **Sales Intelligence**: provide comprehensive lead insights
- **Performance Tracking**: monitor sales team and pipeline performance

## 🚨 Important Notes
- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

## 🔧 Troubleshooting

Common issues:
- **ScrapeGraphAI connection errors**: verify the API key and account status
- **HubSpot API errors**: check the API key and permissions
- **Google Sheets permission errors**: check the OAuth2 scope and permissions
- **Lead scoring errors**: review the Code node's JavaScript logic
- **Rate limiting**: adjust request frequency and implement delays

Support resources:
- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
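The "lead scoring weights in the Code node" step might look like the following sketch. The seven factors, their weights, and the grade cutoffs are illustrative assumptions, not the template's actual configuration:

```javascript
// Weighted lead score on a 0-100 scale; factors and weights are
// illustrative, not the template's actual values.
const WEIGHTS = {
  jobTitle: 0.2,     // decision-making authority
  companySize: 0.15,
  industryFit: 0.15,
  fundingStage: 0.15,
  engagement: 0.15,
  emailDomain: 0.1,  // corporate vs. free email provider
  leadSource: 0.1,
};

// `factors` maps each factor name to a 0-100 sub-score.
function scoreLead(factors) {
  let total = 0;
  for (const [name, weight] of Object.entries(WEIGHTS)) {
    total += (factors[name] ?? 0) * weight;
  }
  return Math.round(total);
}

// Hypothetical grade cutoffs matching the A+/A/B+/B/C+ scale above.
function gradeLead(score) {
  if (score >= 90) return "A+";
  if (score >= 80) return "A";
  if (score >= 70) return "B+";
  if (score >= 60) return "B";
  return "C+";
}

const score = scoreLead({
  jobTitle: 90, companySize: 80, industryFit: 100, fundingStage: 70,
  engagement: 60, emailDomain: 100, leadSource: 50,
});
console.log(score, gradeLead(score)); // 80 "A"
```

Keeping the weights in a single object at the top of the Code node makes the "adjust scoring weights" customization a one-line change per factor.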
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## How It Works
The Content Strategy AI Pipeline is an intelligent, multi-stage content creation system that transforms simple user prompts into polished, ready-to-publish content. The system extracts platform requirements, audience insights, and brand tone from user requests, then develops strategic reasoning and emotional-connection strategies before crafting compelling content outlines and final publication-ready posts or articles. It supports both social media platforms (Instagram, LinkedIn, X, Facebook, TikTok) and blog content.

Key differentiators: strategic thinking approach, emotional intelligence integration, platform-native optimization, zero-editing-required output, and professional content-strategist-level quality through multi-model AI orchestration.

## Technical Points
- Multi-model AI orchestration for specialized tasks
- Emotional psychology integration for audience connection
- Built-in platform algorithm optimization
- Industry-standard content strategy methodology, automated
- Enterprise-grade reliability with session management and memory
- API-ready architecture for integration into existing workflows

## Test Inputs
Sample request: "Create an Instagram post for a fitness coach targeting busy moms, tone should be motivational and relatable"

Expected flow: Platform: Instagram → Niche: Fitness → Audience: Busy Moms → Tone: Motivational → Output: 125-150 word post with hashtags
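Assuming the extraction stage hands off a structured object to the later outline and writing stages (the field names below are illustrative, not the workflow's actual schema), the expected flow for the sample request could be represented and sanity-checked like this:

```javascript
// Illustrative shape of the extraction stage's output for the sample
// request; field names are assumptions, not the workflow's schema.
const extracted = {
  platform: "Instagram",
  niche: "Fitness",
  audience: "Busy Moms",
  tone: "Motivational",
  contentType: "post",
};

// Minimal validation before handing off to the outline/writing stages.
function validateExtraction(data) {
  const required = ["platform", "niche", "audience", "tone"];
  const missing = required.filter((key) => !data[key]);
  if (missing.length > 0) {
    throw new Error(`Extraction incomplete, missing: ${missing.join(", ")}`);
  }
  return true;
}

console.log(validateExtraction(extracted)); // true
```

Validating the extracted fields early keeps a vague prompt from propagating through the multi-stage pipeline and producing off-target content.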