by Shahzaib Anwar
📌 Overview

This workflow automatically processes incoming Shopify/Gmail leads and pushes them into HubSpot as both Contacts and Deals. It helps sales and marketing teams capture leads instantly, enrich CRM data, and avoid missed opportunities.

⚡ How it works

- Trigger: watches for new emails in Gmail.
- Extract Data: parses the email body (Name, Email, City, Phone, Message, Product URL/Title).
- Condition: checks that the sender is Shopify before processing.
- HubSpot: creates or updates a Contact with the customer details, then creates a Deal associated with that Contact.

🎯 Benefits

- 📥 Automates lead capture → CRM
- 🚫 Eliminates manual copy-paste from Gmail
- 🔄 Real-time sync between Gmail and HubSpot
- 📈 Improves sales follow-up speed and accuracy

🛠 Setup Steps

1. Import this workflow into your n8n instance.
2. Connect your Gmail and HubSpot credentials.
3. Replace the HubSpot Deal Stage ID with your own pipeline stage.
4. (Optional) Adjust the Code node regex if your email format differs (a sketch follows the sticky notes below).
5. Activate the workflow and test with a sample lead email.

📝 Example Email Format

Name: John Doe
Email: john@example.com
City: London
Phone: +44 7000 000000
Body: Interested in product
Product Url: https://example.com/product
Product Title: Sample Product

Sticky notes

- Gmail Trigger: 📧 Watches for new emails in Gmail. Polls every minute and passes email data into the flow.
- Get a Message: 📩 Fetches the full Gmail message content (body + metadata) for parsing.
- Extract From Email: 🔍 Extracts the sender's email address from Gmail to identify the source.
- If Sender is Shopify: ✅ Condition node that ensures only Shopify-originated emails/leads are processed.
- Code Node (Regex Parser): 🧾 Parses the email body using regex to extract Name, Email, City, Phone, Message, Product URL, and Title.
- Edit Fields (Set Node): 📝 Cleans and structures the extracted fields into proper JSON before sending to HubSpot.
- HubSpot → Create/Update Contact: 👤 Creates or updates a HubSpot Contact with the extracted lead details.
- HubSpot → Create Deal: 💼 Creates a HubSpot Deal linked to the Contact, including campaign/product information.
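For step 4 of the setup, here is a minimal sketch of what the Code node's regex parser could look like, assuming the email body arrives in a `text` field (the actual property name depends on the Gmail node's output); the labels match the example email format above.

```javascript
// Minimal sketch of the Code node's regex parser (n8n Code node, JavaScript).
// Assumes the email body is in `json.text`; adjust to your Gmail node's output.
const body = $input.first().json.text ?? '';

const grab = (label) => {
  // Capture everything after "Label:" up to the end of that line.
  const match = body.match(new RegExp(`^${label}:\\s*(.+)$`, 'mi'));
  return match ? match[1].trim() : null;
};

return [{
  json: {
    name: grab('Name'),
    email: grab('Email'),
    city: grab('City'),
    phone: grab('Phone'),
    message: grab('Body'),
    productUrl: grab('Product Url'),
    productTitle: grab('Product Title'),
  },
}];
```

If your store's notification template uses different labels, only the `grab(...)` calls need to change.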
by Dr. Firas
💥 Generate UGC Promo Videos with Blotato and Sora 2 for eCommerce

🧩 Who is this for?

This workflow is perfect for eCommerce brands, content creators, and marketing teams who want to automatically generate short, eye-catching videos from their product images, without editing software or manual work.

🚀 What problem does this workflow solve?

Creating engaging promotional videos manually can be time-consuming and expensive. This automation eliminates that friction by combining Blotato, Sora 2, and AI scripting to turn static product images into dynamic UGC-style videos ready for TikTok, Instagram Reels, and YouTube Shorts.

⚙️ What this workflow does

1. Receives a product image directly from Telegram or another input source.
2. Analyzes the image with OpenAI Vision to understand the product's features and audience.
3. Generates a natural, short UGC-style script using GPT-based AI.
4. Sends the image and script to Sora 2 via the Fal API to generate a vertical promotional video.
5. Monitors the video status every 15 seconds until completion.
6. Downloads or automatically publishes the final video to your social platforms.

🧠 Setup

1. Create a Fal.ai API key and set it in your n8n credentials (Authorization: Key YOUR_FAL_KEY).
2. Connect your Telegram, OpenAI, and HTTP Request nodes as shown in the workflow.
3. Make sure the Build Public Image URL node outputs a valid, public image link.
4. In the HTTP Request node for Sora 2, set:
   - Method: POST
   - URL: https://fal.run/fal-ai/sora-2/image-to-video
   - Headers: Authorization: Key YOUR_FAL_KEY and Content-Type: application/json
   - Body: raw JSON with parameters such as prompt, image_url, duration, and aspect_ratio (see the sketch below).
5. Add your Blotato API key for social media publishing.
6. Run the workflow and monitor the execution logs for your video URL.

🎨 How to customize this workflow to your needs

- 🧾 Change the video tone: edit the OpenAI prompt to produce educational, emotional, or luxury-style scripts.
- 🎬 Adjust duration or format: use Sora 2's supported durations (4, 8, or 12 seconds) and aspect ratios (e.g., 9:16 for social media).
- 📲 Auto-publish your videos: connect the TikTok, Instagram, or YouTube upload nodes for full automation.
- ✨ Add branding: include overlays, logos, or end screens via CapCut or an external API integration.

🎥 Watch This Tutorial

👋 Need help or want to customize this?

- 📩 Contact: LinkedIn
- 📺 YouTube: @DRFIRASS
- 🚀 Workshops: Mes Ateliers n8n
- 📄 Documentation: Notion Guide
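For reference, here is a hedged sketch of the HTTP call the Sora 2 node performs; the prompt and image URL are placeholders, and the exact value types for duration and aspect_ratio should be verified against the Fal documentation.

```javascript
// Sketch of the Sora 2 image-to-video request (values are illustrative).
const response = await fetch('https://fal.run/fal-ai/sora-2/image-to-video', {
  method: 'POST',
  headers: {
    Authorization: 'Key YOUR_FAL_KEY',
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: 'UGC-style script generated in the previous step',
    image_url: 'https://your-public-host.example/product.jpg',
    duration: 8,            // supported durations: 4, 8, or 12 seconds
    aspect_ratio: '9:16',   // vertical format for TikTok/Reels/Shorts
  }),
});
const result = await response.json(); // then poll the job status every 15 seconds
```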
by vinci-king-01
Sales Pipeline Automation Dashboard with AI Lead Intelligence

🎯 Target Audience

- Sales managers and team leads
- Business development representatives
- Marketing teams managing lead generation
- CRM administrators and sales operations
- Account executives and sales representatives
- Sales enablement professionals
- Revenue operations (RevOps) teams

🚀 Problem Statement

Manual lead qualification and sales pipeline management is inefficient and often leads to missed opportunities or poor lead prioritization. This template solves that challenge by automatically scoring, qualifying, and routing leads with AI-powered intelligence to maximize conversion rates and sales team productivity.

🔧 How it Works

This workflow automatically processes new leads using AI-powered intelligence, scores and qualifies them based on multiple factors, and automates the entire sales pipeline from lead capture to deal creation.

Key Components

- Dual Trigger System: scheduled monitoring and webhook triggers for real-time lead processing
- AI-Powered Lead Intelligence: advanced scoring algorithm based on 7 key factors
- Multi-Source Data Enrichment: LinkedIn and Crunchbase integration for comprehensive lead profiles
- Automated Sales Actions: intelligent routing, task creation, and follow-up sequences
- Multi-Platform Integration: HubSpot CRM, Slack notifications, and Google Sheets dashboard

📊 Google Sheets Column Specifications

The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the lead was processed | "2024-01-15T10:30:00Z" |
| lead_id | String | Unique lead identifier | "LEAD-2024-001234" |
| first_name | String | Lead's first name | "John" |
| last_name | String | Lead's last name | "Smith" |
| email | String | Lead's email address | "john@company.com" |
| company_name | String | Company name | "Acme Corp" |
| job_title | String | Lead's job title | "Marketing Director" |
| lead_score | Number | AI-calculated score (0-100) | 85 |
| grade | String | Lead grade (A+, A, B+, B, C+) | "A+" |
| category | String | Lead category | "Enterprise" |
| priority | String | Priority level | "Critical" |
| lead_source | String | How the lead was acquired | "Website Form" |
| assigned_rep | String | Assigned sales representative | "Senior AE" |
| company_size | String | Company employee count | "201-500 employees" |
| industry | String | Company industry | "Technology" |
| funding_stage | String | Company funding stage | "Series B" |
| estimated_value | String | Estimated deal value | "$50K-100K" |

🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

Prerequisites

- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- HubSpot CRM account with API access
- Google Sheets account with API access
- Slack workspace for notifications (optional)
- Email service for welcome emails (optional)

Step-by-Step Configuration

1. Install Community Nodes

Install the required community nodes:

```
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

2. Configure ScrapeGraphAI Credentials
- Navigate to Credentials in your n8n instance
- Add new ScrapeGraphAI API credentials
- Enter your API key from the ScrapeGraphAI dashboard
- Test the connection to ensure it's working

3. Set up HubSpot CRM Integration
- Add HubSpot API credentials
- Grant necessary permissions for contacts, deals, and tasks
- Configure custom properties for lead scoring and qualification
- Test the connection to ensure it's working

4. Set up Google Sheets Connection
- Add Google Sheets OAuth2 credentials
- Grant necessary permissions for spreadsheet access
- Create a new spreadsheet for sales pipeline data
- Configure the sheet name (default: "Sales Pipeline")

5. Configure Lead Scoring Parameters
- Update the lead scoring weights in the Code node (see the sketch at the end of this template)
- Customize ideal customer profile criteria
- Set automation trigger thresholds
- Adjust sales rep assignment logic

6. Set up Notification Channels
- Configure Slack webhook or API credentials
- Set up email service credentials for welcome emails
- Define notification preferences for different lead grades
- Test notification delivery

7. Configure Triggers
- Set up the webhook endpoint for real-time lead capture
- Configure the scheduled trigger for periodic monitoring
- Choose appropriate time zones for your business hours
- Test both trigger mechanisms

8. Test and Validate
- Run the workflow manually with sample lead data
- Check HubSpot for proper contact and deal creation
- Verify Google Sheets data formatting
- Test all notification channels

🔄 Workflow Customization Options

Modify Lead Scoring Algorithm
- Adjust scoring weights for different factors
- Add new scoring criteria (geographic location, technology stack, etc.)
- Customize ideal customer profile parameters
- Implement industry-specific scoring models

Extend Data Enrichment
- Add more data sources (ZoomInfo, Apollo, etc.)
- Include social media presence analysis
- Add technographic data collection
- Implement intent signal detection

Customize Sales Automation
- Modify follow-up sequences for different lead categories
- Add more sophisticated sales rep assignment logic
- Implement territory-based routing
- Add automated meeting scheduling

Output Customization
- Add data visualization and reporting features
- Implement sales pipeline analytics
- Create executive dashboards with key metrics
- Add conversion rate tracking and analysis

📈 Use Cases

- **Lead Qualification**: automatically score and qualify incoming leads
- **Sales Pipeline Management**: streamline the entire sales process
- **Lead Routing**: intelligently assign leads to appropriate sales reps
- **Follow-up Automation**: ensure consistent and timely follow-up
- **Sales Intelligence**: provide comprehensive lead insights
- **Performance Tracking**: monitor sales team and pipeline performance

🚨 Important Notes

- Respect LinkedIn and Crunchbase terms of service and rate limits
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your lead scoring parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Ensure GDPR compliance for lead data processing

🔧 Troubleshooting

Common Issues:
- ScrapeGraphAI connection errors: verify API key and account status
- HubSpot API errors: check API key and permissions
- Google Sheets permission errors: check OAuth2 scope and permissions
- Lead scoring errors: review the Code node's JavaScript logic
- Rate limiting: adjust request frequency and implement delays

Support Resources:
- ScrapeGraphAI documentation and API reference
- HubSpot API documentation and developer resources
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Sales automation best practices and guidelines
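As a starting point for step 5, here is an illustrative sketch of a scoring Code node; the seven factor names, weights, and thresholds are assumptions (the template only fixes a 0-100 score and the A+ to C+ grade scale), so replace them with your own ideal-customer-profile criteria.

```javascript
// Illustrative lead-scoring sketch for the Code node.
// Weights and thresholds are placeholders; tune them to your ICP.
const lead = $input.first().json;

const weights = {
  jobTitle: 25, companySize: 20, industry: 15, fundingStage: 15,
  leadSource: 10, emailDomain: 10, engagement: 5,
};

let score = 0;
if (/director|vp|head|chief/i.test(lead.job_title ?? '')) score += weights.jobTitle;
if ((lead.company_size ?? '').startsWith('201')) score += weights.companySize;
if ((lead.industry ?? '') === 'Technology') score += weights.industry;
if ((lead.funding_stage ?? '').startsWith('Series')) score += weights.fundingStage;
// ...the remaining factors follow the same pattern.

const grade =
  score >= 85 ? 'A+' :
  score >= 75 ? 'A'  :
  score >= 65 ? 'B+' :
  score >= 50 ? 'B'  : 'C+';

return [{ json: { ...lead, lead_score: score, grade } }];
```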
by n8n Automation Expert | Template Creator | 2+ Years Experience
🚀 Transform Your Job Hunt with AI-Powered Telegram Bot

Turn job searching into a conversational experience! This intelligent Telegram bot automatically scrapes job postings from LinkedIn, Indeed, and Monster, filters for sales and marketing positions, and delivers personalized results directly to your chat.

✨ Key Features

- **Interactive Telegram Commands**: simple /jobs [keyword] [location] searches
- **Multi-Platform Scraping**: simultaneous data collection from 3 major job boards
- **AI-Powered Filtering**: smart relevance detection and experience level classification
- **Real-Time Notifications**: instant job alerts delivered to Telegram
- **Automated Data Storage**: saves results to Google Sheets and Airtable
- **Duplicate Removal**: advanced deduplication across platforms (see the sketch at the end of this template)
- **Mobile-First Experience**: full job search functionality through Telegram

🎯 Perfect For

- **Sales Professionals**: account managers, sales representatives, business development
- **Marketing Experts**: digital marketers, marketing managers, growth specialists
- **Recruiters**: streamlined candidate sourcing and job market analysis
- **Job Seekers**: hands-free job discovery with instant notifications

🛠️ Setup Requirements

Required Credentials:
- **Telegram Bot Token**: create a bot via @BotFather
- **Bright Data API**: professional web scraping service (LinkedIn/Indeed datasets)
- **Google Sheets OAuth2**: for spreadsheet integration
- **Airtable Token**: database storage and management

Prerequisites:
- n8n instance with HTTPS enabled (required for Telegram webhooks)
- Valid domain name with SSL certificate
- Basic understanding of Telegram bot commands

🔧 How It Works

User Experience:
1. Send /start to activate the bot and see available commands
2. Use /jobs sales manager New York to search for specific positions
3. Receive formatted job results instantly in Telegram
4. Click "Apply Now" links to go directly to job postings
5. All jobs are automatically saved to your connected spreadsheets

Behind the Scenes:
1. Command Processing: the bot parses user input for keywords and location
2. Parallel Scraping: simultaneous API calls to LinkedIn, Indeed, and Monster
3. AI Processing: intelligent filtering, experience level detection, remote work identification
4. Data Enhancement: salary extraction, duplicate removal, relevance scoring
5. Multi-Format Storage: automatic saving to Google Sheets, Airtable, and JSON export
6. Real-Time Response: formatted results delivered back to the Telegram chat

🎨 Telegram Bot Commands

- /start - welcome message and command overview
- /jobs [keyword] [location] - search for jobs (e.g., /jobs marketing manager remote)
- /help - show detailed help information
- /status - check bot status and recent activity

📊 Sample Output

The bot delivers formatted job results like this:

🎯 Job Search Results 🎯
Found 7 relevant opportunities
Platforms: linkedin, indeed, monster
Remote jobs: 3
───────────────────
💼 Senior Sales Manager
🏢 TechCorp Industries
📍 New York, NY
💰 $80,000 - $120,000
🌐 Remote Available
📊 senior level
🔗 Apply Now

🔒 Security & Best Practices

- **Rate Limiting**: built-in Telegram API compliance (30 requests/second)
- **Error Handling**: graceful failure recovery with user-friendly messages
- **Input Validation**: sanitized user input to prevent injection attacks
- **Credential Management**: secure API key storage using the n8n credentials system
- **HTTPS Enforcement**: required for production Telegram webhook integration

📈 Benefits & ROI

- **95% Time Reduction**: automated job discovery vs. manual searching
- **Multi-Source Coverage**: access 3 major job platforms simultaneously
- **Mobile Accessibility**: search jobs anywhere using the Telegram mobile app
- **Real-Time Alerts**: never miss new opportunities with instant notifications
- **Data Organization**: automatic spreadsheet management for job tracking
- **Market Intelligence**: comprehensive job market analysis and trends

🚀 Advanced Customization

- **Custom Keywords**: modify filtering logic for specific industries
- **Location Targeting**: adjust geographic search parameters
- **Experience Levels**: fine-tune senior/mid/entry level detection
- **Additional Platforms**: easily add more job boards via HTTP requests
- **Notification Scheduling**: set up periodic automated job alerts
- **Team Integration**: deploy for multiple users or team channels

💡 Use Cases

- **Individual Job Seekers**: personal job hunting assistant
- **Recruitment Agencies**: streamlined candidate sourcing
- **Sales Teams**: territory-specific opportunity monitoring
- **Marketing Departments**: industry trend analysis and competitor tracking
- **Career Coaches**: client job market research and opportunity identification

Ready to revolutionize your job search? Deploy this workflow and start receiving personalized job opportunities directly in Telegram!
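As referenced in the feature list, cross-platform deduplication can be as simple as keying each job on a normalized title-plus-company pair; the field names below are assumptions about the scraper output.

```javascript
// Sketch of cross-platform duplicate removal: keep the first occurrence
// of each normalized (title, company) pair. Field names are assumptions.
const seen = new Set();
const unique = [];

for (const item of $input.all()) {
  const { title = '', company = '' } = item.json;
  const key = `${title}|${company}`.toLowerCase().replace(/\s+/g, ' ').trim();
  if (!seen.has(key)) {
    seen.add(key);
    unique.push(item);
  }
}

return unique;
```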
by inderjeet Bhambra
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works

The Content Strategy AI Pipeline is an intelligent, multi-stage content creation system that transforms simple user prompts into polished, ready-to-publish content. The system extracts platform requirements, audience insights, and brand tone from user requests, then develops strategic reasoning and emotional connection strategies before crafting compelling content outlines and final publication-ready posts or articles. It supports both social media platforms (Instagram, LinkedIn, X, Facebook, TikTok) and blog content.

Key Differentiators: strategic thinking approach, emotional intelligence integration, platform-native optimization, zero-editing-required output, and professional content-strategist-level quality through multi-model AI orchestration.

Technical Points

- Multi-model AI orchestration for specialized tasks
- Emotional psychology integration for audience connection
- Platform algorithm optimization built in
- Industry-standard content strategy methodology, automated
- Enterprise-grade reliability with session management and memory
- API-ready architecture for integration into existing workflows

Test Inputs

Sample Request: "Create an Instagram post for a fitness coach targeting busy moms, tone should be motivational and relatable"

Expected Flow: Platform: Instagram → Niche: Fitness → Audience: Busy Moms → Tone: Motivational → Output: 125-150 word post with hashtags
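To make the extraction stage concrete, here is a hypothetical JSON schema a structured output parser could enforce on the first AI stage; the field names mirror the expected flow above, and the enum values are assumptions drawn from the supported platforms.

```javascript
// Hypothetical schema for the request-analysis stage (fields match the
// expected flow: Platform → Niche → Audience → Tone).
const extractionSchema = {
  type: 'object',
  properties: {
    platform: {
      type: 'string',
      enum: ['Instagram', 'LinkedIn', 'X', 'Facebook', 'TikTok', 'Blog'],
    },
    niche: { type: 'string' },    // e.g. "Fitness"
    audience: { type: 'string' }, // e.g. "Busy Moms"
    tone: { type: 'string' },     // e.g. "Motivational"
  },
  required: ['platform', 'niche', 'audience', 'tone'],
};
```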
by Meak
LinkedIn Job-Based Cold Email System

Most outreach tools rely on generic lead lists and recycled contact data. This workflow builds a live, personalized lead engine that scrapes new LinkedIn job posts, finds company decision-maker emails, and generates custom cold emails using GPT, all fully automated through n8n.

Benefits

- Automated daily scraping of "Marketing Manager" jobs in Belgium
- Real-time leads from companies currently hiring for marketing roles
- Filters out HR and staffing agencies to keep only real businesses
- Enriches each company with verified CEO, Sales, and Marketing emails
- Generates unique, human-like cold emails and subject lines with GPT-4o
- Saves clean data to Google Sheets and drafts personalized Gmail messages

How It Works

1. Schedule Trigger runs every morning at 08:00.
2. Apify LinkedIn Scraper collects new "Marketing Manager" jobs in Belgium.
3. Remove Duplicates ensures each company appears only once.
4. Filter Staffing excludes recruiters, HR agencies, and interim firms.
5. Save Useful Infos extracts core company data: name, domain, size, description.
6. Filter Domain & Size keeps valid websites and companies under 100 employees.
7. Anymailfinder API looks up CEO, Sales, and Marketing decision-maker emails.
8. Merge + If Node validates email results and removes invalid entries.
9. Split Out + Deduplicate ensures unique, verified contacts.
10. Extract Lead Name (Code Node) separates first and last names (see the sketch below).
11. Google Sheets Node appends all enriched lead data to your master sheet.
12. GPT-4o (LangChain) writes a 100-120 word personalized cold email.
13. GPT-4o (LangChain) creates a short, casual subject line.
14. Gmail Draft Node builds a ready-to-send email using both outputs.
15. Wait Node loops until all leads are processed.

Who Is This For

- B2B agencies targeting Belgian SMEs
- Outbound marketers using job postings as purchase-intent signals
- Freelancers or founders running lean, automated outreach systems
- Growth teams building scalable cold email engines

Setup

- **Apify**: use the curious_coder~linkedin-jobs-scraper actor + API token
- **Anymailfinder**: header auth with decision-maker categories (ceo, sales, marketing)
- **Google Sheets**: connect a sheet named "LinkedIn Job Scraper" and map columns
- **OpenAI (GPT-4o)**: insert your API key into both LangChain nodes
- **Gmail**: OAuth2 connection; resource set to draft
- **n8n**: store all credentials securely; set HTTP nodes to continue on error

ROI & Results

- Save 1-3 hours per day on manual research and outreach prep
- Contact active hiring companies when they need marketing help most
- Scale to multiple industries or regions by changing search URLs
- Outperform paid lead databases with fresh, verified data

Strategy Insights

- Add funding or tech-stack data for better lead scoring
- A/B test GPT subject lines and log open rates in Sheets
- Schedule GPT follow-ups 3 and 7 days later for full automation
- Push all enriched data to your CRM for advanced segmentation
- Use hiring signals to trigger ad audiences or retargeting campaigns

Check Out My Channel

For more advanced automation workflows that generate real client results, check out my YouTube channel, where I share the exact systems I use to automate outreach, scale agency pipelines, and close deals faster.
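Here is a minimal sketch of the Extract Lead Name step: splitting each contact's full name on whitespace, with the first token as the first name and the rest as the last name. The input field name is an assumption about the enrichment output.

```javascript
// Sketch of the "Extract Lead Name" Code node.
return $input.all().map((item) => {
  const fullName = (item.json.contact_name ?? '').trim(); // field name is an assumption
  const parts = fullName.split(/\s+/);
  return {
    json: {
      ...item.json,
      first_name: parts[0] ?? '',
      last_name: parts.slice(1).join(' '),
    },
  };
});
```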
by Li CHEN
AWS News Analysis and LinkedIn Automation Pipeline

Transform AWS industry news into engaging LinkedIn content with AI-powered analysis and automated approval workflows.

Who's it for

This template is perfect for:
- **Cloud architects and DevOps engineers** who want to stay current with AWS developments
- **Content creators** looking to automate their AWS news coverage
- **Marketing teams** needing consistent, professional AWS content
- **Technical leaders** who want to share industry insights on LinkedIn
- **AWS consultants** building thought leadership through automated content

How it works

This workflow creates a comprehensive AWS news analysis and content generation pipeline with two main flows:

Flow 1: News Collection and Analysis
1. Scheduled RSS Monitoring: automatically fetches the latest AWS news from the official AWS RSS feed daily at 8 PM
2. AI-Powered Analysis: uses AWS Bedrock (Claude 3 Sonnet) to analyze each news item, extracting:
   - Professional summary
   - Key themes and keywords
   - Importance rating (Low/Medium/High)
   - Business impact assessment
3. Structured Data Storage: saves analyzed news to Feishu Bitable with approval status tracking (a mapping sketch appears at the end of this template)

Flow 2: LinkedIn Content Generation
1. Manual Approval Trigger: Feishu automation sends approved news items to the webhook
2. AI Content Creation: AWS Bedrock generates professional LinkedIn posts with:
   - Attention-grabbing headlines
   - Technical insights from a Solutions Architect perspective
   - Business impact analysis
   - Call-to-action engagement
3. Automated Publishing: posts directly to LinkedIn with relevant hashtags

How to set up

Prerequisites
- **AWS Bedrock access** with the Claude 3 Sonnet model enabled
- **Feishu account** with Bitable access
- **LinkedIn company account** with posting permissions
- **n8n instance** (self-hosted or cloud)

Detailed Configuration Steps

1. AWS Bedrock Setup

Step 1: Enable Claude 3 Sonnet Model
- Log into your AWS Console and navigate to AWS Bedrock
- Go to Model access in the left sidebar
- Find Anthropic Claude 3 Sonnet and click Request model access
- Fill out the access request form (usually approved within minutes)
- Once approved, verify the model appears in your Model access list

Step 2: Create IAM User and Credentials
- Go to the IAM Console
- Click Users → Create user
- Name: n8n-bedrock-user
- Attach policy: AmazonBedrockFullAccess (or create a custom policy with minimal permissions)
- Go to the Security credentials tab → Create access key
- Choose Application running outside AWS
- Download the credentials CSV file

Step 3: Configure in n8n
- In n8n, go to Credentials → Add credential
- Select the AWS credential type
- Enter your Access Key ID and Secret Access Key
- Set Region to your preferred AWS region (e.g., us-east-1)
- Test the connection

Useful Links: AWS Bedrock Documentation, Claude 3 Sonnet Model Access, AWS Bedrock Pricing

2. Feishu Bitable Configuration

Step 1: Create Feishu Account and App
- Sign up at Feishu International
- Create a new Bitable (multi-dimensional table)
- Go to Developer Console → Create App
- Enable Bitable permissions in your app
- Generate an App Token and App Secret

Step 2: Create Bitable Structure
Create a new Bitable with these columns:
- title (Text)
- pubDate (Date)
- summary (Long Text)
- keywords (Multi-select)
- rating (Single Select: Low, Medium, High)
- link (URL)
- approval_status (Single Select: Pending, Approved, Rejected)

Get your App Token and Table ID:
- App Token: found in app settings
- Table ID: found in the Bitable URL (tbl...)

Step 3: Set Up Automation
- In your Bitable, go to Automation → Create automation
- Trigger: When field value changes → select the approval_status field
- Condition: approval_status equals "Approved"
- Action: Send HTTP request
  - Method: POST
  - URL: your n8n webhook URL (from Flow 2)
  - Headers: Content-Type: application/json
  - Body: {{record}}

Step 4: Configure Feishu Credentials in n8n
- Install the Feishu Lite community node (self-hosted only)
- Add a Feishu credential with your App Token and App Secret
- Test the connection

Useful Links: Feishu Developer Documentation, Bitable API Reference, Feishu Automation Guide

3. LinkedIn Company Account Setup

Step 1: Create LinkedIn App
- Go to the LinkedIn Developer Portal and click Create App
- Fill in the app details:
  - App name: AWS News Automation
  - LinkedIn Page: select your company page
  - App logo: upload your logo
  - Legal agreement: accept the terms

Step 2: Configure OAuth2 Settings
- In your app, go to the Auth tab
- Add redirect URL: https://your-n8n-instance.com/rest/oauth2-credential/callback
- Request these scopes:
  - w_member_social (post on behalf of members)
  - r_liteprofile (read basic profile)
  - r_emailaddress (read email address)

Step 3: Get Company Page Access
- Go to your LinkedIn Company Page
- Navigate to Admin tools → Manage admins
- Ensure you have the Content admin or Super admin role
- Note your Company Page ID (found in the page URL)

Step 4: Configure LinkedIn Credentials in n8n
- Add a LinkedIn OAuth2 credential
- Enter your Client ID and Client Secret
- Complete the OAuth2 flow by clicking Connect my account
- Select your company page for posting

Useful Links: LinkedIn Developer Portal, LinkedIn API Documentation, LinkedIn OAuth2 Guide

4. Workflow Activation

Final Setup Steps:
1. Import the workflow JSON into n8n
2. Configure all credential connections: AWS Bedrock, Feishu, and LinkedIn OAuth2
3. Update the webhook URL in the Feishu automation to match your n8n instance
4. Activate the scheduled trigger (daily at 8 PM)
5. Test with a manual webhook trigger using sample data
6. Verify the Feishu Bitable receives data
7. Test the approval workflow and LinkedIn posting

Requirements

Service Requirements
- **AWS Bedrock** with Claude 3 Sonnet model access: an AWS account with the Bedrock service enabled, an IAM user with Bedrock permissions, and model access approval for Claude 3 Sonnet
- **Feishu Bitable** for news storage and approval workflow: a Feishu account (International or Lark), a developer app with Bitable permissions, and automation capabilities for webhook triggers
- **LinkedIn Company Account** for automated posting: a LinkedIn company page with admin access, a LinkedIn Developer app with posting permissions, and OAuth2 authentication setup
- **n8n community nodes**: Feishu Lite node (self-hosted only)

Technical Requirements
- **n8n instance** (self-hosted recommended for community nodes)
- **Webhook endpoint** accessible from the Feishu automation
- **Internet connectivity** for API calls and RSS feeds
- **Storage space** for workflow execution logs

Cost Considerations
- **AWS Bedrock**: ~$0.01-0.05 per news analysis
- **Feishu**: free tier available, paid plans for advanced features
- **LinkedIn**: free API access with rate limits
- **n8n**: self-hosted (free) or cloud subscription

How to customize the workflow

Content Customization
- **Modify AI prompts** in the AI Agent nodes to change tone, focus, or target audience
- **Adjust hashtags** in the LinkedIn posting node for different industries
- **Change scheduling** frequency by modifying the Schedule Trigger settings

Integration Options
- **Replace LinkedIn** with Twitter/X, Facebook, or other social platforms
- **Add Slack notifications** for approved content before posting
- **Integrate with CRM** systems to track content performance
- **Add content calendar** integration for better planning

Advanced Features
- **Multi-language support** by modifying AI prompts for different regions
- **Content categorization** by adding tags for different AWS services
- **Performance tracking** by integrating analytics platforms
- **Team collaboration** by adding approval workflows with multiple reviewers

Technical Modifications
- **Change RSS sources** to monitor other AWS blogs or competitor news
- **Adjust AI models** to use different Bedrock models or external APIs
- **Add data validation** nodes for better error handling
- **Implement retry logic** for failed API calls

Important Notes

Service Limitations
- This template uses community nodes (Feishu Lite) and requires self-hosted n8n
- **Geo-restrictions** may apply to AWS Bedrock models in certain regions
- **Rate limits** may affect high-frequency posting; adjust scheduling accordingly
- **Content moderation** is recommended before automated posting
- **Cost considerations**: each AI analysis costs approximately $0.01-0.05 USD per news item

Troubleshooting Common Issues

AWS Bedrock Issues:
- **Model not found**: ensure Claude 3 Sonnet access is approved in your region
- **Access denied**: verify IAM permissions include Bedrock service access
- **Rate limiting**: implement retry logic or reduce analysis frequency

Feishu Integration Issues:
- **Authentication failed**: check that the App Token and App Secret are correct
- **Table not found**: verify the Table ID matches your Bitable URL
- **Automation not triggering**: ensure the webhook URL is accessible and returns a 200 status

LinkedIn Posting Issues:
- **OAuth2 errors**: re-authenticate the LinkedIn credentials
- **Posting failed**: verify company page admin permissions
- **Rate limits**: LinkedIn has daily posting limits for company pages

Security Best Practices
- **Never hardcode credentials** in workflow nodes
- **Use environment variables** for sensitive configuration
- **Regularly rotate API keys** and access tokens
- **Monitor API usage** to prevent unexpected charges
- **Implement error handling** for failed API calls
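As referenced in Flow 1, here is a hedged sketch of shaping one analyzed RSS item into the Bitable row structure defined above; the shape of the AI output (`analysis.*`) is an assumption, while the column names match the Bitable schema.

```javascript
// Sketch: map one analyzed news item onto the Bitable columns.
const item = $input.first().json;

return [{
  json: {
    title: item.title,
    pubDate: item.pubDate,
    summary: item.analysis?.summary ?? '',
    keywords: item.analysis?.keywords ?? [], // multi-select column
    rating: item.analysis?.rating ?? 'Low',  // Low / Medium / High
    link: item.link,
    approval_status: 'Pending',              // waits for manual approval in Bitable
  },
}];
```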
by Club de Inteligencia Artificial Politécnico CIAP
Telegram Appointment Scheduling Bot with n8n

📃 Description

Tired of managing appointments manually? This template transforms your Telegram account into a smart virtual assistant that handles the entire scheduling process for you, 24/7.

This workflow lets you deploy a fully functional Telegram bot that not only schedules appointments but also checks real-time availability in your Google Calendar, logs a history in Google Sheets, and allows your clients to cancel or view their upcoming appointments. It's the perfect solution for professionals, small businesses, or anyone looking to automate their booking system professionally and effortlessly.

✨ Key Features

- **Complete Appointment Management:** allows users to schedule, cancel, and list their future appointments.
- **Conflict Prevention:** integrates with Google Calendar to check availability before confirming a booking, eliminating the risk of double-booking.
- **Automatic Logging:** every confirmed appointment is saved to a row in Google Sheets, creating a perfect database for tracking and analysis.
- **Smart Interaction:** the bot handles unrecognized commands and guides the user, ensuring a smooth experience.
- **Easy to Adapt:** connect your own accounts, customize messages, and tailor it to your business needs in minutes.

🚀 Setup

Follow these steps to deploy your own instance of this bot:

1. Prerequisites
- An n8n instance (Cloud or self-hosted).
- A Telegram account.
- A Google account.

2. Telegram Bot
- Talk to @BotFather on Telegram.
- Create a new bot using /newbot, then give it a name and a username.
- Copy and save the API token it provides.

3. Google Cloud & APIs
- Go to the Google Cloud Console and create a new project.
- Enable the Google Calendar API and Google Sheets API.
- Create OAuth 2.0 Client ID credentials. Make sure to add your n8n instance's OAuth redirect URL.
- Save the Client ID and Client Secret.

4. Google Sheets
- Create a new spreadsheet in Google Sheets.
- Define the column headers in the first row. For example: id, Client Name, Date and Time, ISO Date.

5. n8n
- Import the workflow JSON file into your n8n instance.
- Set up the credentials:
  - Telegram: create a new credential and paste your bot's token.
  - Google Calendar & Google Sheets (OAuth2): create a new credential and paste the Client ID and Client Secret from the Google Cloud Console.
- Review the Google Calendar and Google Sheets nodes to select your correct calendar and spreadsheet.
- Activate the workflow!

💬 Usage

Once the bot is running, you can interact with it using the following commands in Telegram (a parsing sketch appears at the end of this template):

- **To start the bot:** /start
- **To schedule a new appointment:** agendar YYYY-MM-DD HH:MM Your Full Name
- **To cancel an existing appointment:** cancelar YYYY-MM-DD HH:MM Your Full Name
- **To view your future appointments:** mis citas Your Full Name

👥 Authors

Jaren Pazmiño, President of the Polytechnic Artificial Intelligence Club (CIAP)
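For illustration, here is a minimal sketch of how a Code node could parse the usage commands above before routing to the calendar nodes; the incoming message path follows Telegram's standard update shape, and the output field names are assumptions.

```javascript
// Sketch: parse "agendar"/"cancelar"/"mis citas" commands from a Telegram message.
const text = ($input.first().json.message?.text ?? '').trim();

const booking = text.match(/^(agendar|cancelar)\s+(\d{4}-\d{2}-\d{2})\s+(\d{2}:\d{2})\s+(.+)$/i);
const listing = text.match(/^mis citas\s+(.+)$/i);

let command;
if (booking) {
  command = {
    action: booking[1].toLowerCase(), // "agendar" or "cancelar"
    date: booking[2],
    time: booking[3],
    clientName: booking[4],
  };
} else if (listing) {
  command = { action: 'mis citas', clientName: listing[1] };
} else {
  command = { action: 'unknown' }; // the bot replies with guidance in this case
}

return [{ json: command }];
```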
by Santhej Kallada
Who is this for?

- Creators, designers, and developers exploring AI-powered image generation.
- Automation enthusiasts who want to integrate image creation into n8n workflows.
- Telegram bot builders looking to add visual AI capabilities.
- Marketers or freelancers automating creative content workflows.

What problem is this workflow solving?

Creating AI images usually requires multiple tools and manual setup. This workflow removes the complexity by:
- Connecting Nano Banana (AI image model) directly to n8n.
- Allowing image generation via a Telegram chatbot.
- Providing a no-code setup that is fully automated and scalable.

What this workflow does

This workflow demonstrates how to generate AI images using Nano Banana and n8n, with an integrated Telegram chatbot interface. The process includes:
- Connecting Gemini Nano Banana to n8n.
- Automating image generation requests triggered from Telegram.
- Returning AI-generated images back to the user.
- Allowing customization of prompts and styles dynamically.

By the end, you'll have a fully functional automation to generate and send AI-created images through Telegram, no coding required.

Setup

1. Create accounts: sign up on n8n.io and ensure you have Telegram Bot API access. Connect your Nano Banana or Gemini API endpoint.
2. Set up your Telegram bot: use BotFather to create a new bot and get the token. Add the "Telegram Trigger" node in n8n.
3. Configure the Nano Banana connection: add an HTTP Request node for the Nano Banana API. Insert your API key and prompt parameters (see the sketch below).
4. Handle responses: parse the AI-generated image output and send the image file back to the Telegram user.
5. Test and deploy: run a sample image prompt and verify that Telegram returns the correct generated image.

How to customize this workflow to your needs

- Modify prompts or styles to fit different artistic use cases.
- Add conditional logic for image size, aspect ratio, or filters.
- Integrate with Google Drive or Notion for image storage.
- Schedule automatic image generation for campaigns or content creation.
- Expand with OpenAI or Stability AI for hybrid workflows.

Notes

- The Nano Banana API may have rate limits depending on usage.
- Ensure your Telegram bot has permission to send files and images.
- You can host this workflow on n8n Cloud or self-hosted setups.

Want a video tutorial on how to set up this automation? https://youtu.be/0s6ZdU1fjc4
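For step 3, here is a hypothetical sketch of the HTTP Request node's call. "Nano Banana" is the nickname for Google's Gemini image model, so the endpoint, model name, and response shape below are assumptions to verify against the current Gemini API documentation.

```javascript
// Hypothetical Gemini ("Nano Banana") image-generation request.
// Verify the model name and endpoint against the Gemini API docs before use.
const response = await fetch(
  'https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash-image:generateContent',
  {
    method: 'POST',
    headers: {
      'x-goog-api-key': 'YOUR_GEMINI_API_KEY',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      contents: [{ parts: [{ text: 'A watercolor fox in a misty forest' }] }],
    }),
  },
);
const data = await response.json();
// The generated image is expected base64-encoded inside the response parts;
// decode it and pass it to the Telegram "send photo" step.
```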
by Growth AI
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Website sitemap generator and visual tree creator

Who's it for

Web developers, SEO specialists, UX designers, and digital marketers who need to analyze website structure, create visual sitemaps, or audit site architecture for optimization purposes.

What it does

This workflow automatically generates a comprehensive sitemap from any website URL and creates an organized hierarchical structure in Google Sheets. It follows the website's sitemap to discover all pages, then organizes them by navigation levels (Level 1, Level 2, etc.) with proper parent-child relationships. The output can be further processed to create visual tree diagrams and mind maps.

How it works

The workflow follows a five-step automation process:
1. URL Input: accepts a website URL via the chat interface
2. Site Crawling: uses Firecrawl to discover all pages, following the website's sitemap only
3. Success Validation: checks whether crawling was successful (some sites block external crawlers)
4. Hierarchical Organization: processes URLs into a structured tree with proper level relationships (a rough sketch appears at the end of this template)
5. Google Sheets Export: creates a formatted spreadsheet with the complete site architecture

The system respects robots.txt and follows only sitemap-declared pages to ensure ethical crawling.

Requirements

- Firecrawl API key (for website crawling and sitemap discovery)
- Google Sheets access
- Google Drive access (for template duplication)

How to set up

Step 1: Prepare your template (recommended)
It's recommended to create your own copy of the base template:
- Access the base Google Sheets template and make a copy for your personal use
- Update the workflow's "Copy template" node with your template's file ID (replace the default ID: 12lV4HwgudgzPPGXKNesIEExbFg09Tuu9gyC_jSS1HjI)
- This ensures you control the template formatting and can customize it as needed

Step 2: Configure API credentials
Set up the following credentials in n8n:
- Firecrawl API: for crawling websites and discovering sitemaps
- Google Sheets OAuth2: for creating and updating spreadsheets
- Google Drive OAuth2: for duplicating the template file

Step 3: Configure Firecrawl settings (optional)
The workflow uses optimized Firecrawl settings:
- ignoreSitemap: false - respects the website's sitemap
- sitemapOnly: true - only crawls URLs listed in sitemap files
These settings ensure ethical crawling and faster processing.

Step 4: Access the workflow
The workflow uses a chat trigger interface, so no manual configuration is needed; simply provide the website URL you want to analyze when prompted.

How to use the workflow

Basic usage
1. Start the chat: access the workflow via the chat interface
2. Provide the URL: enter the website URL you want to analyze (e.g., "https://example.com")
3. Wait for processing: the system will crawl, organize, and export the data
4. Receive your results: get a direct, clickable link to your generated Google Sheets file, with no need to search for it

Error handling
- Invalid URLs: if the provided URL is invalid or the website blocks crawling, you'll receive an immediate error message
- Graceful failure: the workflow stops without creating unnecessary files when errors occur
- Common causes: incorrect URL format, robots.txt restrictions, or site security settings

File organization
- Automatic naming: generated files follow the pattern "[Website URL] - n8n - Arborescence"
- Google Drive storage: files are automatically organized in your Google Drive
- Instant access: a direct link is provided immediately upon completion

Advanced processing for visual diagrams

Step 1: Copy sitemap data
Once your Google Sheets file is ready, copy all the hierarchical data from the generated spreadsheet and prepare it for AI processing.

Step 2: Generate ASCII tree structure
Use any AI model with this prompt:

```
Create a hierarchical tree structure from the following website sitemap data. Return ONLY the tree structure using ASCII tree formatting with ├── and └── characters. Do not include any explanations, comments, or additional text - just the pure tree structure. The tree should start with the root domain and show all pages organized by their hierarchical levels. Use proper indentation to show parent-child relationships.

Here is the sitemap data:
[PASTE THE SITEMAP DATA HERE]

Requirements:
- Use ASCII tree characters (├── └── │)
- Show clear hierarchical relationships
- Include all pages from the sitemap
- Return ONLY the tree structure, no other text
- Start with the root domain as the top level
```

Step 3: Create visual mind map
- Visit the Whimsical Diagrams GPT
- Request a mind map creation using your ASCII tree structure
- Get a professional visual representation of your website architecture

Results interpretation

Google Sheets output structure
The generated spreadsheet contains:
- Niv 0 to Niv 5: hierarchical levels (0 = homepage, 1-5 = navigation depth)
- URL column: complete URLs for reference
- Hyperlinked structure: clickable links organized by hierarchy
- Multi-domain support: handles subdomains and different domain structures

Data organization features
- Automatic sorting: pages organized by navigation depth and alphabetical order
- Parent-child relationships: clear hierarchical structure maintained
- Domain separation: main domains and subdomains processed separately
- Clean formatting: URLs decoded and formatted for readability

Workflow limitations
- Sitemap dependency: only discovers pages listed in the website's sitemap
- Crawling restrictions: some websites may block external crawlers
- Level depth: limited to 5 hierarchical levels for clarity
- Rate limits: respects Firecrawl API limitations
- Template dependency: requires access to the base template for duplication

Use cases
- SEO audits: analyze site structure for optimization opportunities
- UX research: understand navigation patterns and user paths
- Content strategy: identify content gaps and organizational issues
- Site migrations: document existing structure before redesigns
- Competitive analysis: study competitor site architectures
- Client presentations: create visual site maps for stakeholder reviews
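To illustrate the hierarchical-organization step, here is a rough sketch of turning a flat URL list into the Niv 0-5 level columns described under Results interpretation; the input field name and labeling rules are assumptions, since the actual node also handles hyperlinks and subdomain separation.

```javascript
// Sketch: derive Niv 0-5 columns from URL path depth (0 = homepage).
const rows = $input.all().map((item) => {
  const url = new URL(item.json.url); // input field name is an assumption
  const segments = url.pathname.split('/').filter(Boolean);
  const level = Math.min(segments.length, 5); // capped at 5 levels for clarity
  const row = { URL: url.href, level };
  for (let i = 0; i <= 5; i++) {
    row[`Niv ${i}`] = i === level ? (segments[level - 1] ?? url.hostname) : '';
  }
  return { json: row };
});

// Mirror the "automatic sorting" feature: by depth, then alphabetically.
rows.sort((a, b) => a.json.level - b.json.level || a.json.URL.localeCompare(b.json.URL));
return rows;
```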
by Đỗ Thành Nguyên
Automated Facebook Page Story Video Publisher (Google Drive → Facebook → Google Sheet)

> Recommended: self-hosted via tino.vn/vps-n8n?affid=388 (use code VPSN8N for up to 39% off).

This workflow is an automated solution for publishing video content from Google Drive to your Facebook Page Stories, while using Google Sheets as a posting queue manager.

What This Workflow Does (Workflow Function)

This automation orchestrates a complete multi-step process for uploading and publishing videos to Facebook Stories:

1. Queue Management: every 2 hours and 30 minutes, the workflow checks a Google Sheet (Get Row Sheet node) to find the first video whose Stories column is empty, meaning it hasn't been posted yet.
2. Conditional Execution: an If node confirms that the video's File ID exists before proceeding.
3. Video Retrieval: using the File ID, the workflow downloads the video from Google Drive (Google Drive node) and calculates its binary size (Set to the total size in bytes node).
4. Facebook 3-Step Upload: it performs the Facebook Graph API's three-step upload process through HTTP Request nodes (a sketch appears at the end of this template):
   - Step 1 - Initialize Session: starts an upload session and retrieves the upload_url and video_id.
   - Step 2 - Upload File: uploads the binary video data to the provided upload_url.
   - Step 3 - Publish Video: finalizes and publishes the uploaded video as a Facebook Story.
5. Status Update: once completed, the workflow updates the same row in Google Sheets (Update upload status in sheet node) using the row_number to mark the video as processed.

Prerequisites (What You Need Before Running)

1. n8n Instance
- An n8n instance (see the hosting recommendation above).

2. Google Services
- **Google Drive credentials:** OAuth2 credentials for Google Drive so n8n can download video files.
- **Google Sheets credentials:** OAuth2 credentials for Google Sheets to read the posting queue and update statuses.
- **Google Sheet:** a spreadsheet (ID: 1RnE5O06l7W6TLCLKkwEH5Oyl-EZ3OE-Uc3OWFbDohYI) containing:
  - File ID - the video's unique ID in Google Drive.
  - Stories - posting status column (leave empty for pending videos).
  - row_number - used for updating the correct row after posting.

3. Facebook Setup
- **Page ID:** your Facebook Page ID (currently hardcoded as 115432036514099 in the info node).
- **Access Token:** a **Page Access Token** with permissions such as pages_manage_posts and pages_read_engagement. This token is hardcoded in the info node and again in the Step 3. Post video node.

Usage Guide and Implementation Notes

How to Use
1. Queue videos: add video entries to your Google Sheet. Each entry must include a valid Google Drive File ID. Leave the Stories column empty for videos that haven't been posted.
2. Activate: save and activate the workflow. The Schedule Trigger will automatically handle new uploads every 2 hours and 30 minutes.

Implementation Notes

⚠️ **Token security:** hardcoding your **Access Token** inside the info node is **not recommended**. Tokens expire and expose your Page to risk if leaked.
👉 Action: replace the static token with a secure credential setup that supports token rotation.

**Loop efficiency:** the **"false"** output of the If node currently loops back to the Get Row Sheet node. This creates unnecessary cycles if no videos are found.
👉 Action: disconnect that branch so the workflow stops gracefully when no unposted videos remain.

**Status updates:** to prevent re-posting the same video, the final Update upload status in sheet node must update the **Stories** column (e.g., write "POSTED").
👉 Action: add this mapping explicitly to your Google Sheets node.

**Automated File ID sync:** this workflow assumes that the Google Sheet already contains valid File IDs.
👉 You can build a secondary workflow (using Schedule Trigger1 → Search files and folders → Append or update row in sheet) to automatically populate new video File IDs from your Google Drive.

✅ Result

Once active, this workflow automatically:
- pulls pending videos from your Google Sheet,
- uploads them to Facebook Stories, and
- marks them as posted, all without manual intervention.
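For reference, here is a hedged sketch of the three HTTP Request calls described in the 3-step upload; it follows the flow above (start session → upload binary → finish), but the exact parameter and header names should be verified against the current Graph API video_stories documentation.

```javascript
// Hedged sketch of the Facebook Story 3-step upload.
// Parameter names are assumptions based on the flow described above.
const PAGE_ID = 'YOUR_PAGE_ID';
const TOKEN = 'YOUR_PAGE_ACCESS_TOKEN';
const GRAPH = 'https://graph.facebook.com/v19.0';

async function publishStory(videoBuffer) {
  // Step 1: initialize the upload session; returns video_id and upload_url.
  const start = await fetch(`${GRAPH}/${PAGE_ID}/video_stories`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ upload_phase: 'start', access_token: TOKEN }),
  }).then((r) => r.json());

  // Step 2: upload the binary video data (downloaded from Google Drive).
  await fetch(start.upload_url, {
    method: 'POST',
    headers: {
      Authorization: `OAuth ${TOKEN}`,
      offset: '0',
      file_size: String(videoBuffer.length), // the total size in bytes from the Set node
    },
    body: videoBuffer,
  });

  // Step 3: finalize the session, publishing the video as a Story.
  return fetch(`${GRAPH}/${PAGE_ID}/video_stories`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ upload_phase: 'finish', video_id: start.video_id, access_token: TOKEN }),
  }).then((r) => r.json());
}
```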
by Max Tkacz
Easily generate images with Black Forest Labs' Flux Text-to-Image AI models using Hugging Face's Inference API. This template serves a webform where you can enter prompts and select predefined visual styles that are customizable with no code. The workflow integrates seamlessly with Hugging Face's free tier, and it's easy to modify for any Text-to-Image model that supports API access.

Try it

Curious what this template does? Try a public version here: https://devrel.app.n8n.cloud/form/flux

Set Up

Watch this quick set-up video 👇

Accounts required:
- Huggingface.co account (free)
- Cloudflare.com account (free; used for storage, but can be swapped easily, e.g., for Google Drive)

Key Features:
- **Text-to-Image Creation**: generates unique visuals based on your prompt and style.
- **Hugging Face Integration**: utilizes Hugging Face's Inference API for reliable image generation.
- **Customizable Visual Styles**: select from preset styles or easily add your own.
- **Adaptable**: swap in any Hugging Face Text-to-Image model that supports API calls.

Ideal for:
- **Creators**: rapidly create visuals for projects.
- **Marketers**: prototype campaign visuals.
- **Developers**: test different AI image models effortlessly.

How It Works:

You submit an image prompt via the webform and select a visual style, which appends style instructions to your prompt. The Hugging Face Inference API then generates and returns the image, which gets hosted on Cloudflare S3. The workflow can be easily adjusted to use other models and styles for complete flexibility.
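Under the hood, the workflow's HTTP call reduces to a single Inference API request; here is a minimal sketch, where the model ID and the appended style suffix are illustrative, and any API-accessible Text-to-Image model can be swapped in.

```javascript
// Minimal sketch of the Hugging Face Inference API call behind the form.
const prompt = 'a lighthouse at dawn';
const style = ', watercolor illustration, soft pastel palette'; // appended style instructions

const response = await fetch(
  'https://api-inference.huggingface.co/models/black-forest-labs/FLUX.1-schnell',
  {
    method: 'POST',
    headers: {
      Authorization: 'Bearer YOUR_HF_TOKEN',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ inputs: prompt + style }),
  },
);
const imageBytes = await response.arrayBuffer(); // raw image, ready to upload to Cloudflare S3
```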