by Sulieman Said
## How it Works

This workflow automates discovering companies in different cities, extracting their contact data, and storing it in Airtable.

**City Loop (Airtable → Google Maps API)**
- Reads a list of cities from Airtable.
- Combines each city with a search term (e.g., "SEO Agency, Berlin") to query Google Maps.
- Marks processed cities as "checked" to allow safe restarts if interrupted.

**Business Discovery & Deduplication**
- Searches for businesses via Google Maps Text Search (see the request sketch at the end of this section).
- Checks Airtable to avoid scraping the same company multiple times.
- Fetches detailed info for each business via the Google Maps Place Details API.

**Impressum Extraction (Website → HTML Parsing)**
- Builds an Impressum page URL for each business.
- Requests the HTML and cleans out ads, headers, footers, etc.
- Extracts relevant contact info using an AI extractor (OpenAI node).

**Contact Information Extraction**
Pulls out:
- Decision Maker (name + position in one string, if available)
- Email address (must be valid and contain @)
- Phone number (international format if possible)

Filters out incomplete results (e.g., empty email).

**Database Storage**
Writes company data back into Airtable: company name, address, website, email, phone number, decision maker (name + position), and the search term & city used.

## Setup Steps

**1. Prerequisites**
- Google Maps API key with access to the Places API (Text Search + Place Details)
- Airtable base with at least two tables:
  - Cities (columns: ID, City, Country, Status)
  - Companies (for scraped results)
- OpenAI API key (for decision-maker and contact extraction)

**2. Authentication**
- Configure your Airtable API credentials in n8n.
- Set up HTTP Query Auth with your Google Maps API key.
- Add your OpenAI API key in the OpenAI Chat node.

**3. Configuration**
- In the Airtable "Cities" table, list all cities you want to scrape.
- Define your search term in the "Execute Workflow" node (e.g., SEO Agency).
- Adjust the batch sizes and wait intervals for faster or slower scraping (the Google API has strict rate limits).

**4. Execution**
- Start manually or from another workflow.
- The workflow scrapes all companies in each city step by step.
- It can be safely stopped and resumed — cities already marked as processed will be skipped.

**5. Results**
An enriched company dataset stored in Airtable, ready for CRM import, lead generation, or further automation.

## Tips & Notes
- Always respect GDPR and local laws when handling scraped data.
- The workflow is modular → you can swap Airtable with Google Sheets, Notion, or a database of your choice.
- Add custom filters to limit results (e.g., only companies with websites).
- Use the sticky notes inside the workflow to understand each step (mandatory for template publishing).
- Keep an eye on **Google Places API costs** — queries are billed after the free quota. If you are still within the first 2 months of the Google Cloud Developer free trial, you can benefit from free credits.

Questions or custom requests? 📩 suliemansaid.business@gmail.com
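For reference, a minimal sketch of the Places Text Search call the City Loop performs. The query string and environment variable are placeholder assumptions for illustration:

```javascript
// Hedged sketch: query Google Maps Text Search for one city + search term.
// GOOGLE_MAPS_API_KEY is an assumed environment variable, not part of the template.
const query = encodeURIComponent('SEO Agency, Berlin');
const url = `https://maps.googleapis.com/maps/api/place/textsearch/json?query=${query}&key=${process.env.GOOGLE_MAPS_API_KEY}`;

const response = await fetch(url);
const data = await response.json();

// Each result exposes a place_id, which the workflow passes to Place Details.
const placeIds = data.results.map(r => r.place_id);
```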
by Jose Cuartas
# Sync Gmail emails to PostgreSQL with S3 attachment storage

Automated Gmail Email Processing System

## Who's it for
Businesses and individuals who need to:
- Archive email communications in a searchable database
- Back up email attachments to cloud storage
- Analyze email patterns and communication data
- Comply with data retention policies
- Integrate emails with other business systems

## What it does
This workflow automatically captures, processes, and stores Gmail emails in a PostgreSQL database while uploading file attachments to S3/MinIO storage. It handles both individual emails (via the Gmail Trigger) and bulk processing (via the Schedule Trigger).

Key features:
- Dual processing: real-time individual emails + scheduled bulk retrieval
- Complete email metadata extraction (sender, recipients, labels, timestamps)
- HTML-to-plain-text conversion for searchable content
- Binary attachment processing with metadata extraction
- Organized S3/MinIO file storage structure
- UPSERT database operations to prevent duplicates (see the sketch at the end of this section)

## How it works
1. **Email Capture**: the Gmail Trigger detects new emails; the Schedule Trigger gets bulk emails from the last hour
2. **Parallel Processing**: emails with attachments go through binary processing; others go directly to transformation
3. **Attachment Handling**: extract metadata, upload to S3/MinIO, create database references
4. **Data Transformation**: convert the Gmail API format to the PostgreSQL structure
5. **Storage**: UPSERT emails to the database with linked attachment information

## Requirements
Credentials needed:
- Gmail OAuth2 (gmail.readonly scope)
- PostgreSQL database connection
- S3/MinIO storage credentials

Database setup: run the provided SQL schema to create the messages table with JSONB fields for flexible data storage.

## How to set up
1. **Gmail OAuth2**: enable the Gmail API in Google Cloud Console and create OAuth2 credentials
2. **PostgreSQL**: create the database and run the SQL schema provided in the setup sticky note
3. **S3/MinIO**: create a bucket named "gmail-attachments" with proper upload permissions
4. **Configure**: update authenticatedUserEmail in the transform scripts to your email
5. **Test**: start with a single email before enabling bulk processing

## How to customize
- **Email filters**: modify the Gmail queries (in:sent, in:inbox) to target specific emails
- **Storage structure**: change the S3 file path format in the Upload node
- **Processing schedule**: adjust trigger frequencies based on email volume
- **Database fields**: extend the PostgreSQL schema for additional metadata
- **Attachment types**: add file-type filtering in the binary processing logic

Note: This workflow processes emails from the last hour to avoid overwhelming the system. Adjust the timeframe based on your email volume and processing needs.
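To illustrate the duplicate-safe storage step, a hedged sketch of the kind of UPSERT the Postgres node runs. The table and column names here are assumptions based on the description, not the template's actual schema:

```javascript
// Assumed shape of the "messages" table; adjust to the SQL schema
// provided in the setup sticky note.
const upsertQuery = `
  INSERT INTO messages (gmail_id, thread_id, sender, subject, body_text, labels, attachments)
  VALUES ($1, $2, $3, $4, $5, $6::jsonb, $7::jsonb)
  ON CONFLICT (gmail_id) DO UPDATE
    SET labels = EXCLUDED.labels,
        attachments = EXCLUDED.attachments;
`;
```

With a conflict target on the Gmail message ID, re-running the bulk trigger over the same hour updates existing rows instead of inserting duplicates.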
by Muhammad Bello
## Description
This n8n template demonstrates how to turn raw YouTube comments into research-backed content ideas, complete with hooks and outlines.

Use cases include:
- Quickly mining a competitor's audience for video ideas.
- Generating hooks and outlines from your own channel's comments.
- Validating content opportunities with live audience feedback.

## Good to know
- Apify is used to scrape YouTube comments (requires an API token).
- GPT-4.1-mini is used for both filtering and content generation.
- Tavily provides fresh research to ground the AI's responses.
- All outputs are stored in Google Sheets, making it easy to manage and track ideas.

## How it works
1. **Trigger** – Paste a YouTube URL into the chat trigger.
2. **Scrape Comments** – Apify fetches all comments and metadata.
3. **Filter** – GPT-4.1-mini decides if each comment could inspire a content idea.
4. **Store** – Comments and "Yes/No" decisions are appended to Google Sheets.
5. **Research & Enrich** – For "Yes" comments, Tavily provides context, and GPT generates a topic, hook, and outline.
6. **Update Sheet** – The same row in Google Sheets is updated with the enriched fields.

## Google Sheets Setup
Your Google Sheet should include these columns (in this order):

id | text | author | likes | isIdea | topic | research | hook | outline

- **id** – unique identifier for each comment
- **text** – the full YouTube comment
- **author** – commenter's name/handle
- **likes** – number of likes on the comment
- **isIdea** – "Yes" or "No" depending on the GPT filter
- **topic** – extracted video topic
- **research** – 300–500 word background from Tavily
- **hook** – engaging opening sentence for a video
- **outline** – structured video outline

## Setup Steps
1. Connect your Apify, OpenAI, Tavily, and Google Sheets credentials in n8n.
2. Point the Google Sheets nodes to your own document and ensure the headers above exist.
3. Replace the sample API keys with your own, stored in n8n Credentials.

Time to set up: ~15–25 minutes for a first-time n8n user (less if you already have credentials handy).

## Customizing this workflow
- **Filter logic** – Loosen the GPT filter to allow borderline ideas, or tighten it to only accept the best ones.
- **Research depth** – Change Tavily's search depth (e.g., depth: basic vs. depth: advanced) to control how detailed the background research is (see the sketch below).
- **Notification channels** – Send new "Yes" ideas directly to Slack (#content-ideas), Notion (your content board), or Email (notify the content manager instantly).
- **Alternative outputs** – Instead of hooks/outlines, generate:
  - A script draft for YouTube Shorts.
  - Blog post angles based on the same audience comments.
  - A poll question for community engagement.
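As a reference for the research-depth tip, a hedged sketch of a Tavily search request. The query text is a placeholder; check Tavily's documentation for the current parameter names:

```javascript
// Assumed request body for Tavily's search endpoint.
const res = await fetch('https://api.tavily.com/search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    api_key: process.env.TAVILY_API_KEY, // assumed env var
    query: 'audience questions about home espresso setups', // placeholder topic
    search_depth: 'advanced', // or 'basic' for faster, shallower research
    max_results: 5,
  }),
});
const { results } = await res.json();
```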
by Ghufran Barcha
# Telegram AI Personal Assistant — Calendar & Email Manager

This workflow turns a Telegram bot into a fully functional personal AI assistant capable of handling your schedule and inbox through natural conversation. Send it a text message, record a voice note, or snap a photo — it understands all three and responds intelligently.

The assistant is powered by Claude Haiku (via OpenRouter) and comes with a built-in 30-message memory buffer, so it remembers context across a conversation just like a real assistant would. It has full read/write access to Google Calendar and Gmail, meaning it can book meetings, check your availability, send emails, reply to threads, and clean up your inbox — all from a single Telegram chat.

## What this workflow does

**Multi-modal input handling**
- Text messages are processed directly.
- Voice notes are downloaded from Telegram and transcribed using OpenAI Whisper.
- Photos are downloaded and analyzed using GPT-4o vision, with any caption included as additional context.

All three input types are normalized into a unified context object before reaching the agent, so the AI always receives clean, structured input regardless of how the user communicated.

**Authorization layer**
Only the allowlisted Telegram User ID can interact with the assistant. Any unauthorized message receives an instant rejection and the workflow stops — no agent calls are made.

**AI agent with tools**
The LangChain agent receives the full context and decides autonomously whether to reply conversationally or invoke one of the connected tools. It uses the current date/time from n8n to handle scheduling requests accurately.
- Google Calendar tools: check availability, create events, list upcoming events, fetch a specific event, update event details, delete events.
- Gmail tools: send new emails, search the inbox, read a specific email, reply to a thread, delete messages.

**Persistent memory**
Each user's conversation is tracked using a sliding window of the last 30 messages, keyed by their Telegram User ID. The assistant remembers what was said earlier in the same session without needing reminders.

## Example use cases
- "What do I have on Thursday?" → fetches and summarizes calendar events
- "Schedule a call with Ahmed tomorrow at 3pm" → creates a calendar event
- "Any emails from the client today?" → searches Gmail and summarizes results
- "Reply to John's last email and say I'll confirm by Friday" → reads the thread and sends a reply
- [sends a photo of a meeting invite] → extracts details from the image and creates a calendar event

## Setup instructions
1. **Telegram API** — create a bot via @BotFather and connect the token to the Telegram Trigger node and both send nodes.
2. **OpenAI API** — required for Whisper voice transcription and GPT-4o image analysis.
3. **OpenRouter API** — used to run Claude Haiku as the agent's language model. You can swap this for any OpenRouter-compatible model.
4. **Google Calendar OAuth2** — authorize your Google account and update the calendar ID (currently set to an example address) in all six calendar tool nodes.
5. **Gmail OAuth2** — authorize your Gmail account in all five Gmail tool nodes.
6. **User authorization** — open the If node and replace the placeholder value with your own Telegram numeric User ID. You can find this by messaging @userinfobot on Telegram.

## Customization tips
- To support multiple authorized users, replace the If node with a list-based check or a Code node that checks against an array of allowed IDs (see the sketch below).
- To change the AI model, swap the OpenRouter Chat Model node — the agent prompt and all tools remain fully compatible.
- To adjust memory length, change the contextWindowLength value in the Simple Memory node (currently 30 messages).
- To modify the assistant's personality or add new instructions, edit the system prompt inside the MainAgent node.
- Additional tools (e.g., Notion, Slack, Airtable) can be connected to the MainAgent node as sub-nodes without changing any other part of the workflow.
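A minimal sketch of the multi-user check mentioned above, written for an n8n Code node. The IDs are placeholders; replace them with your own Telegram numeric User IDs:

```javascript
// Hypothetical replacement for the If node: allow a list of users.
const ALLOWED_IDS = [123456789, 987654321]; // placeholder IDs

const userId = $json.message?.from?.id;

// Returning no items stops this branch for unauthorized senders.
if (!ALLOWED_IDS.includes(userId)) {
  return [];
}
return [{ json: $json }];
```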
by Oneclick AI Squad
This n8n workflow monitors and alerts you about new construction projects in specified areas, helping you track competing builders and identify business opportunities. The system automatically searches multiple data sources and sends detailed email reports with upcoming projects.

## Good to know
- Email parsing accuracy depends on the consistency of request formats - use the provided template for best results.
- The workflow includes fallback mock data for demonstration when external APIs are unavailable.
- Government data sources may have rate limits - the workflow includes proper error handling.
- Results are filtered to show only upcoming/recent projects (within 3 months).

## How it works
1. **Email Trigger** - detects new email requests with "Construction Alert Request" in the subject line
2. **Check Email Subject** - validates that the email contains the correct trigger phrase
3. **Extract Location Info** - parses the email body to extract area, city, state, and zip code information
4. **Search Government Data** - queries government databases for public construction projects and permits
5. **Search Construction Sites** - searches construction industry databases for private projects
6. **Process Construction Data** - combines and filters results from both sources, removing duplicates
7. **Wait For Data** - waits until the combined, filtered results are ready
8. **Check If Projects Found** - determines whether to send a results report or a no-results notification
9. **Generate Email Report** - creates a professional HTML email with project details and summaries
10. **Send Alert Email** - delivers the construction project report to the requester
11. **Send No Results Email** - notifies the requester when no projects are found in the specified area

The workflow also includes a Schedule Trigger that can run automatically on weekdays at 9 AM for regular monitoring.

## Email Format Examples

**Input email format:**

```
To: alerts@yourcompany.com
Subject: Construction Alert Request

Area: Downtown Chicago
City: Chicago
State: IL
Zip: 60601

Additional notes: Looking for commercial projects over $1M
```

**Alternative format:**

```
To: alerts@yourcompany.com
Subject: Construction Alert Request

Please search for construction projects in Miami, FL 33101
Focus on residential and mixed-use developments.
```

**Output email example:**

```
Subject: 🏗️ Construction Alert: 8 Projects Found in Downtown Chicago

🏗️ Construction Project Alert Report
Search Area: Downtown Chicago
Report Generated: August 4, 2024, 2:30 PM

📊 Summary
Total Projects Found: 8
Search Query: Downtown Chicago IL construction permits

🔍 Upcoming Construction Projects

New Commercial Complex - Downtown Chicago
📍 Location: Downtown Chicago | 📅 Start Date: March 2024 | 🏢 Type: Mixed Development
Description: Mixed-use commercial and residential development
Source: Local Planning Department

Office Building Construction - Chicago
📍 Location: Chicago, IL | 📅 Start Date: April 2024 | 🏢 Type: Commercial
Description: 5-story office building with retail space
Source: Building Permits

[Additional projects...]

💡 Next Steps
• Review each project for potential competition
• Contact project owners for partnership opportunities
• Monitor progress and timeline changes
• Update your competitive analysis
```

## How to use

**Setup instructions:**
1. Import the workflow into your n8n instance
2. Configure email credentials:
   - Set up IMAP credentials for receiving emails
   - Set up SMTP credentials for sending alerts
3. Test the workflow with a sample email
4. Set up scheduling (optional) for automated daily checks

**Sending alert requests:**
1. Send an email to your configured address
2. Use "Construction Alert Request" in the subject line
3. Include location details in the email body
4. Receive detailed project reports within minutes

## Requirements
- **n8n instance** (cloud or self-hosted)
- **Email account** with IMAP/SMTP access
- **Internet connection** for API calls to construction databases
- **Valid email addresses** for sending and receiving alerts

## API Integration Code Examples

**Government data API integration:**

```javascript
// Example API call to the USA.gov jobs API.
// Note: fetch has no `params` option, so query parameters
// are built into the URL explicitly.
const searchGovernmentProjects = async (location) => {
  const params = new URLSearchParams({
    keyword: 'construction permit',
    location_name: location,
    size: 20,
  });
  const response = await fetch(`https://api.usa.gov/jobs/search.json?${params}`, {
    method: 'GET',
    headers: { 'Content-Type': 'application/json' },
  });
  return await response.json();
};
```

**Construction industry API integration:**

```javascript
// Example API call to a construction database.
const searchConstructionProjects = async (area) => {
  const params = new URLSearchParams({
    q: `${area} construction projects`,
    type: 'projects',
    limit: 15,
  });
  const response = await fetch(`https://www.construction.com/api/search?${params}`, {
    method: 'GET',
    headers: {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept': 'application/json',
    },
  });
  return await response.json();
};
```

**Email processing function:**

```javascript
// Extract location fields from the email content.
const extractLocationInfo = (emailBody) => {
  const lines = emailBody.split('\n');
  let area = '', city = '', state = '', zipcode = '';
  for (const line of lines) {
    if (line.toLowerCase().includes('area:')) area = line.split(':')[1]?.trim();
    if (line.toLowerCase().includes('city:')) city = line.split(':')[1]?.trim();
    if (line.toLowerCase().includes('state:')) state = line.split(':')[1]?.trim();
    if (line.toLowerCase().includes('zip:')) zipcode = line.split(':')[1]?.trim();
  }
  return { area, city, state, zipcode };
};
```

## Customizing this workflow

**Adding new data sources:**
1. Add HTTP Request nodes for additional APIs
2. Update the Process Construction Data node to handle the new data formats
3. Modify the search parameters based on each API's requirements

**Enhanced email parsing:**

```javascript
// Custom email parsing for different formats.
const parseEmailContent = (emailBody) => {
  // Regex patterns for different email formats
  const patterns = {
    address: /(\d+\s+[\w\s]+,\s[\w\s]+,\s[A-Z]{2}\s*\d{5})/,
    coordinates: /(\d+\.\d+),\s*(-?\d+\.\d+)/,
    zipcode: /\b\d{5}(-\d{4})?\b/
  };
  // Extract using multiple patterns
  // Implementation details...
};
```

**Custom alert conditions:**
Modify the Check If Projects Found node to filter by:
- Project value/budget
- Project type (residential, commercial, etc.)
- Distance from your location
- Timeline criteria

**Advanced scheduling:**

```javascript
// Set up multiple schedule triggers for different areas.
const scheduleConfigs = [
  { area: "Downtown", cron: "0 9 * * 1-5" },   // Weekdays 9 AM
  { area: "Suburbs", cron: "0 14 * * 1,3,5" }, // Mon, Wed, Fri 2 PM
  { area: "Industrial", cron: "0 8 * * 1" }    // Monday 8 AM
];
```

**Integration with CRM systems:**
Add HTTP Request nodes to automatically create leads in your CRM when high-value projects are found:

```javascript
// Example CRM integration.
const createCRMLead = async (project) => {
  await fetch('https://your-crm.com/api/leads', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_TOKEN',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      name: project.title,
      location: project.location,
      value: project.estimatedValue,
      source: 'Construction Alert System'
    })
  });
};
```

## Troubleshooting
- **No emails received**: check IMAP credentials and email filters
- **Empty results**: verify API endpoints and add fallback data sources
- **Failed email delivery**: confirm SMTP settings and recipient addresses
- **API rate limits**: implement delays between requests and add error handling
by Ninmegne Paul
## 🔧 How it works
1. **Scheduled Trigger**: the workflow is triggered automatically every day at 12:00 PM using a Cron node.
2. **RSS Feed Collection**: it fetches the latest content from multiple RSS feeds related to technology, manga, and movies.
3. **Content Processing & Formatting**: the collected data is filtered and organized based on your interests, and a dynamic HTML email template is generated to present the content in a clean, readable layout.
4. **Email Delivery**: the final newsletter is sent directly to your inbox using the Send Email node.

## ⚙️ Set up steps
1. **Configure RSS sources**: update the RSS feed URLs inside the Set nodes to match your preferred sources (see the sketch below).
2. **Set email recipient**: replace the email address in the Send Email node with your own.
3. **Adjust schedule**: modify the execution time in the Cron Trigger node if you want the newsletter sent at a different time.
4. **Activate the workflow**: enable the workflow to start receiving your personalized daily newsletter automatically.
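As orientation for steps 1 and 3, a hedged sketch of what the feed list and schedule might look like. The URLs are placeholders, not the template's actual sources:

```javascript
// Placeholder feed URLs grouped by interest, as configured in the Set nodes.
const feeds = {
  technology: ['https://example.com/tech/rss.xml'],
  manga: ['https://example.com/manga/rss.xml'],
  movies: ['https://example.com/movies/rss.xml'],
};

// Daily at 12:00 PM, written as a standard cron expression for the Cron node.
const schedule = '0 12 * * *';
```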
by Davide
This workflow automates generating advertising (ADV) images from multiple reference images and publishing them directly to social media (Instagram and Facebook via Upload-Post) using the Seedream v4 AI model. A user submits a web form with a text prompt and up to 6 reference images, and the workflow generates a new AI image from them.

## Key Advantages
- ✅ **Automated Image Creation** – generates high-quality, consistent visuals from multiple references without manual editing.
- ✅ **Seamless Social Media Publishing** – automatically posts to Instagram and Facebook with minimal effort.
- ✅ **SEO-Optimized Titles** – AI-generated, keyword-friendly titles give your posts better reach.
- ✅ **Scalable Workflow** – can be triggered manually, on a schedule, or via form submissions.
- ✅ **Time-Saving** – reduces the manual steps from design to publishing, enabling faster content production.
- ✅ **Multi-Platform Support** – easily extendable to other platforms (TikTok, LinkedIn, etc.) with the Upload-Post API.

## How It Works
1. **Form Trigger**: a user fills out a form with a "Prompt" (text description) and a list of "Reference images" (comma-separated URLs).
2. **Data Processing**: the workflow converts the submitted image URL string into a proper array for the AI API.
3. **AI Image Generation**: the workflow sends the prompt and image URLs to the fal.ai API (specifically, the ByteDance Seedream model) to generate a new, consistent image (a request sketch appears at the end of this section).
4. **Status Polling**: it periodically checks the status of the AI job until the image generation is COMPLETED.
5. **Result Retrieval**: once complete, it fetches the URL of the generated image and downloads the image file.
6. **SEO Title Generation**: the original user prompt is sent to OpenAI's GPT-4o-mini model to generate an optimized, engaging social media title.
7. **Cloud Backup**: the generated image is uploaded to a specified Google Drive folder for storage.
8. **Social Media Posting**: finally, the workflow posts the downloaded image file to both Instagram and Facebook via the Upload-Post.com API, using the AI-generated title.

## Set Up Steps
To make this workflow functional, you need to configure several third-party services and their corresponding credentials within n8n.

1. **Obtain a fal.ai API key**
   - Create an account at fal.ai and locate your API key in your account settings.
   - In the "Create Video" and "Get status" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Key YOUR_FAL_AI_API_KEY.
2. **Configure the Upload-Post.com API**
   - Create an account at Upload-Post.com and get your API key.
   - Create a profile within the Upload-Post app (e.g., test1); this profile manages your social account connections.
   - In both the "Post to Instagram" and "Post to Facebook" nodes, edit the HTTP Header Auth credentials: set the Header Name to Authorization and the Value to Apikey YOUR_UPLOAD_POST_API_KEY.
   - Crucially, in the same nodes, find the user parameter in the body and replace the placeholder YOUR_USERNAME with the profile name you created (e.g., test1).
3. **Configure OpenAI/OpenRouter (optional, for title generation)**
   - The "Generate title" node uses an OpenAI-compatible API; the provided example uses OpenRouter.
   - Ensure you have valid credentials (e.g., for OpenRouter or directly for OpenAI) configured in n8n and selected in this node.
4. **Configure Google Drive (optional, for backup)**
   - The "Upload Image" node requires Google OAuth credentials.
   - Set up a Google Cloud project, enable the Drive API, and create OAuth 2.0 credentials in the n8n settings.
   - Authenticate and select the desired destination folder in the node's parameters.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
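For orientation, a hedged sketch of the kind of request the "Create Video" node sends to fal.ai. The exact model path and field names are assumptions; consult fal.ai's Seedream documentation for the real endpoint:

```javascript
// Assumed queue-based fal.ai call; model path and body fields are illustrative.
const response = await fetch('https://queue.fal.run/fal-ai/bytedance/seedream/v4/edit', {
  method: 'POST',
  headers: {
    'Authorization': `Key ${process.env.FAL_AI_API_KEY}`, // assumed env var
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    prompt: 'Product shot of a perfume bottle on a marble table', // placeholder
    image_urls: ['https://example.com/ref1.jpg', 'https://example.com/ref2.jpg'],
  }),
});

// The returned request_id is what the "Get status" node polls until COMPLETED.
const { request_id } = await response.json();
```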
by n8n Automation Expert | Template Creator | 2+ Years Experience
# 🔗 Automated Blockchain Transaction Audit System

Transform your blockchain compliance workflow with this enterprise-grade automation that monitors transactions across the Ethereum and Solana networks, automatically generates professional audit reports, and maintains complete documentation trails.

## 🚀 What This Workflow Does
- 📊 **Multi-Chain Monitoring**: real-time transaction tracking for Ethereum (via the Alchemy API) and Solana networks
- 🤖 **AI-Powered Risk Analysis**: an intelligent scoring algorithm that evaluates transaction risk on a 0–100 scale
- 📄 **Automated PDF Generation**: professional audit reports created instantly using APITemplate.io
- ☁️ **Cloud Storage Integration**: seamless uploads to Google Drive with an organized folder structure
- 📋 **Database Management**: automatic Notion database entries for complete audit trail tracking
- 📧 **Smart Notifications**: multi-channel alerts to finance teams with detailed transaction summaries
- 🔒 **Compliance Verification**: built-in KYC/AML checks and regulatory compliance monitoring

## 💼 Perfect For
- **FinTech companies** managing blockchain transactions
- **DeFi protocols** requiring audit documentation
- **Enterprise finance teams** handling crypto compliance
- **Blockchain auditors** automating report generation
- **Compliance officers** tracking regulatory requirements

## 🛠 Key Integrations
- **Alchemy API** – Ethereum transaction monitoring
- **Solana RPC** – native Solana network access
- **APITemplate.io** – professional PDF report generation
- **Google Drive** – secure cloud document storage
- **Notion** – comprehensive audit database
- **Email/SMTP** – multi-recipient notification system
- **Etherscan/Solscan** – smart contract verification

## ⚡ Technical Highlights
- **10 optimized nodes** with parallel processing capabilities
- **Sub-30-second processing** for complete audit cycles
- **Enterprise security** with credential management
- **Error handling** with automatic retry mechanisms
- **Scalable architecture** supporting 1000+ transactions/hour
- **Risk-scoring algorithm** with customizable parameters

## 📊 Business Impact
- **80% cost reduction** in manual audit processes
- **95% error elimination** through automation
- **100% compliance coverage** with immutable audit trails
- **70% time savings** for finance teams

## 🔧 Setup Requirements
Before using this workflow, ensure you have:
- An Alchemy API key for Ethereum monitoring
- An APITemplate.io account with an audit report template
- A Google Drive service account with folder permissions
- A Notion workspace with a configured audit database
- SMTP credentials for email notifications
- An Etherscan API key for contract verification

## 📈 Use Cases
- **Transaction compliance monitoring**: automatic flagging of high-risk transactions
- **Regulatory reporting**: scheduled audit report generation for authorities
- **Internal auditing**: complete documentation for financial reviews
- **Risk management**: real-time scoring and alert systems
- **Multi-chain portfolio tracking**: unified reporting across blockchain networks

## 🎯 Why Choose This Workflow
This isn't just another blockchain monitor - it's a complete document management ecosystem that transforms raw blockchain data into professional, compliant documentation while maintaining enterprise-grade security and scalability. Perfect for organizations serious about blockchain compliance and audit trail management! 🚀

## 🔄 Workflow Process
1. **Webhook Trigger** receives the blockchain event
2. **Parallel Monitoring** queries the Ethereum & Solana networks
3. **AI Processing** analyzes the transaction data and calculates risk (see the sketch below)
4. **Document Generation** creates professional PDF audit reports
5. **Multi-Channel Distribution** uploads to Drive, logs in Notion, and sends notifications
6. **Verification & Response** confirms all processes completed successfully

Ready to automate your blockchain compliance? Import this workflow and transform your audit processes today! ✨
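An illustrative sketch of a 0–100 risk scorer of the kind described above. The signals and weights here are assumptions for illustration, not the template's actual algorithm:

```javascript
// Hypothetical transaction risk scoring; tune weights to your compliance policy.
function scoreTransaction(tx) {
  let score = 0;
  if (tx.valueUsd > 100000) score += 40;    // unusually large transfer
  if (tx.counterpartyIsNew) score += 25;    // first-seen address
  if (tx.contractUnverified) score += 20;   // unverified smart contract
  if (tx.crossChain) score += 15;           // bridged funds carry extra risk
  return Math.min(score, 100);              // clamp to the 0-100 scale
}

// Example: a large transfer to a new address from an unverified contract.
const risk = scoreTransaction({
  valueUsd: 250000,
  counterpartyIsNew: true,
  contractUnverified: true,
  crossChain: false,
}); // 85
```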
by Sabrina Ramonov 🍄
## Description
A fully automated pipeline: you send an email to yourself with a rough idea (subject containing "thread"), n8n's Gmail trigger picks it up, OpenAI ChatGPT rewrites it using a viral-thread template, and Blotato posts the long-form thread to X/Twitter, Bluesky, and Meta Threads (optionally scheduled or with images/videos). The template is easily extensible to other social platforms.

## Who Is This For?
Digital creators, content marketers, social media managers, agencies, entrepreneurs, and influencers who want fast, automated long-form thread posting.

## 📄 Documentation
Full Step-by-Step Tutorial

## How It Works
1. **Trigger: Gmail**
   - Connect your Gmail account. n8n monitors emails sent from you and filters for subjects containing the word "thread".
2. **AI Thread Writer: OpenAI ChatGPT**
   - Connect your OpenAI account. Prompt ChatGPT to clean up your draft and format it as a long-form viral thread.
3. **Publish to Social Media via Blotato**
   - Connect your Blotato account and choose social accounts (X/Twitter, Threads, Bluesky).
   - Schedule or post immediately.
   - Supports optional image/video URLs via a mediaUrls array (publicly accessible URLs; see the sketch below).

Example email to trigger the workflow:

```
Email Subject: thread

Email Body:
I'm obsessed with voice AI apps. Super Whisper is my current favorite because it runs locally and keeps my voice data private. I talk to it instead of typing. Way faster.
```

## Setup & Required Accounts
- Gmail account (used as trigger)
  - n8n Gmail OAuth doc: https://docs.n8n.io/integrations/builtin/credentials/google/oauth-single-service
- OpenAI Platform account (access to ChatGPT)
- Blotato account: https://blotato.com
  - Generate a Blotato API key: Settings > API > Generate API Key (paid feature only)
  - Sign in to Blotato and create an API key (required for posting)
- n8n
  - Ensure "Verified Community Nodes" is enabled in your n8n Admin Panel
  - Install the "Blotato" community node and create Blotato credentials

## Optional: Media & Style Tweaks
- Attach images/videos: insert publicly accessible URLs into the mediaUrls array (advanced).
- To emulate a specific tone/structure, provide ChatGPT with examples of your favorite viral threads, or replace the example viral-thread prompt with your preferred example.
- Voice-to-text tip: record ideas (e.g., Superwhispr) and send the transcript by email — ChatGPT will clean it up.

## Tips & Tricks
- During testing, use "Scheduled Time" in Blotato instead of immediate posting to preview before going live.
- Start with a single social platform while testing.
- If your script is long or includes media, processing may take longer.
- Many users prefer speaking their ideas (voice notes) and then letting AI edit — faster than typing.

## Troubleshooting
- Check your Blotato API Dashboard to inspect each request, response, and error.
- Confirm your API key is valid, the n8n node credentials are set, and the emails you send have a subject containing "thread".

## Need Help?
In the Blotato web app, click the orange support button in the bottom right to access Blotato support.
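A hedged sketch of how media might be attached via the mediaUrls array mentioned above. The surrounding field names are assumptions; check the Blotato node's parameters for the exact structure:

```javascript
// Illustrative post payload fragment; only mediaUrls is documented above.
const post = {
  text: threadText, // the ChatGPT-formatted thread
  mediaUrls: [
    'https://example.com/chart.png', // must be publicly accessible
    'https://example.com/demo.mp4',
  ],
};
```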
by Fahmi Fahreza
# Match Resumes to Jobs Automatically with Gemini AI and Decodo Scraping

Sign up for Decodo HERE for a discount.

This automation intelligently connects candidate profiles to job opportunities. It takes an intake form with a short summary, a resume link, and an optional LinkedIn profile, then enriches the data using Decodo and Gemini. The workflow analyzes skills, experience, and role relevance, ranks the top matches, and emails a polished HTML report directly to your inbox—saving hours of manual review and matching effort.

## Who's it for?
This template is designed for recruiters, hiring managers, and talent operations teams who handle large candidate volumes and want faster, more accurate shortlisting. It's also helpful for job seekers or career coaches who wish to identify high-fit openings automatically using structured AI analysis.

## How it works
1. Receive an intake form containing a candidate's resume, summary, and LinkedIn URL.
2. Parse and summarize the resume with Gemini for core skills and experience.
3. Enrich the data using Decodo scraping to gather extra profile details.
4. Merge the insights and rank job matches from Decodo's job data.
5. Generate an HTML shortlist and email it automatically through Gmail.

## How to set up
1. Connect credentials for Gmail, Google Gemini, and Decodo.
2. Update the Webhook path and test your form connection.
3. Customize variables such as location or role preferences.
4. Enable "Send as HTML" in the Gmail node for clean reports.
5. Publish as self-hosted if community nodes are included.
by Oneclick AI Squad
Automatically discovers trending topics in your niche and generates ready-to-use content ideas with AI.

## 🎯 How It Works

**1. Multi-Source Trend Monitoring**
- Twitter/X trending topics and hashtags
- Reddit hot posts from niche subreddits
- Google Trends daily search trends
- Runs every 2 hours for fresh opportunities

**2. Smart Filtering & Scoring**
- Filters by your niche keywords
- Removes duplicates across sources
- Calculates a viral potential score (0–100)
- Ranks by engagement, recency, and relevance
- Prevents suggesting already-covered topics

**3. AI Content Generation**
- Uses Claude AI to analyze each trend
- Generates 5 unique content ideas per trend
- Provides hooks, key points, and platform recommendations
- Explains why each idea has viral potential

**4. Comprehensive Delivery**
- Beautiful HTML email digest with all opportunities
- Slack summary for quick review
- Database logging for tracking
- Research links for deeper investigation

## ⚙️ Configuration Guide

**Step 1: Configure Your Niche**
Edit the "Load Niche Config" node:

```javascript
niche: 'AI & Technology',      // Your industry
keywords: [                    // Topics to track
  'artificial intelligence',
  'machine learning',
  'AI tools',
  // Add your keywords
],
subreddits: 'artificial+machinelearning', // Relevant subreddits
thresholds: {
  minTwitterLikes: 1000,       // Minimum engagement
  minRedditUpvotes: 500,
  minComments: 50
}
```

**Step 2: Connect Data Sources**

Twitter/X API:
1. Sign up for a Twitter Developer Account
2. Get API credentials (OAuth 2.0)
3. Add the credentials to the "Fetch Twitter/X Trends" node

Reddit API:
1. Create a Reddit app: https://www.reddit.com/prefs/apps
2. Get OAuth credentials
3. Add the credentials to the "Fetch Reddit Hot Topics" node

Google Trends:
- No authentication needed (public API)
- Already configured in the workflow

**Step 3: Configure AI Integration**

Anthropic Claude API:
1. Get an API key from https://console.anthropic.com/
2. Add the credentials to the "AI - Generate Content Ideas" node
3. Alternative: use OpenAI GPT-4 by modifying the node

**Step 4: Setup Notifications**

Email:
1. Configure SMTP in the "Send Email Digest" node
2. Update the recipient email address
3. Customize the HTML template if desired

Slack:
1. Create an incoming webhook: https://api.slack.com/messaging/webhooks
2. Add the webhook URL to the "Send Slack Summary" node
3. Customize the channel name

**Step 5: Database (Optional)**
1. Create a PostgreSQL database with the schema below
2. Add the credentials to the "Log to Content Database" node
3. Skip this step if you don't need database tracking

Database schema:

```sql
CREATE TABLE content.viral_opportunities (
  id SERIAL PRIMARY KEY,
  opportunity_id VARCHAR(255) UNIQUE,
  detected_at TIMESTAMP,
  topic TEXT,
  source VARCHAR(50),
  source_url TEXT,
  engagement BIGINT,
  viral_score INTEGER,
  opportunity_level VARCHAR(20),
  niche VARCHAR(100),
  content_ideas JSONB,
  research_links JSONB,
  urgency TEXT,
  status VARCHAR(50),
  created_at TIMESTAMP DEFAULT NOW()
);
```

## 🎨 Customization Options

**Adjust Scan Frequency**
Edit the "Every 2 Hours" trigger:
- More frequent: every 1 hour
- Less frequent: every 4–6 hours
- Consider API rate limits

**Tune the Viral Score Algorithm**
Edit the "Calculate Viral Potential Score" node (see the sketch after this section):
- Adjust the engagement weight (currently 40%)
- Change the recency importance (currently 30%)
- Modify the threshold in "Filter High Potential Only" (currently 40)

**Customize Content Ideas**
Modify the AI prompt in "AI - Generate Content Ideas":
- Change the number of ideas (currently 5)
- Add specific format requirements
- Include brand voice guidelines
- Target specific platforms

## 📊 Expected Results
A typical scan (every 2 hours) finds:
- **5–15 opportunities** per scan
- **3–5 HIGH priority** (score 75+)
- **25+ content ideas** generated
- **Email sent** with the full digest
- **Slack alert** for quick review

## 💡 Pro Tips
1. Timing matters: create content within 24–48 hours of detection
2. High priority first: focus on opportunities scoring 75+
3. Platform match: choose platforms where your audience is active
4. Add your voice: use AI ideas as starting points, not final copy
5. Track performance: note which opportunity types perform best
6. Refine keywords: regularly update your niche keywords based on results
7. Mix formats: try different content formats for the same trend

## 🚨 Important Notes

⚠️ API rate limits:
- Twitter: monitor rate limits closely
- Reddit: 60 requests per minute
- Claude AI: tier-based limits
- Consider caching results

💰 Cost considerations:
- Twitter API: may require a paid tier
- Reddit API: free for reasonable use
- Claude AI: ~$0.50–1.00 per scan
- Total: ~$15–30/month estimated

🎯 Best practices:
- Start with 1–2 sources, add more later
- Test with broader keywords initially
- Review the first few reports to tune scoring
- Don't create content for every opportunity
- Quality over quantity

## 🔄 What Happens Next?
1. The workflow runs every 2 hours
2. Scans Twitter, Reddit, and Google Trends
3. Filters by your keywords
4. Scores viral potential
5. Generates AI content ideas
6. Sends the digest to email + Slack
7. Logs to the database
8. Marks topics as suggested
9. Repeat!
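An illustrative version of the scoring weights described in the customization section: roughly 40% engagement, 30% recency, and the remainder for keyword relevance. The exact formula in the "Calculate Viral Potential Score" node may differ:

```javascript
// Hypothetical 0-100 viral score; field names are assumptions for illustration.
function viralScore(item) {
  const engagement = Math.min(item.engagement / 10000, 1) * 40; // 40% weight
  const hoursOld = (Date.now() - item.detectedAt) / 3.6e6;
  const recency = Math.max(0, 1 - hoursOld / 48) * 30;          // 30% weight
  const relevance = Math.min(item.keywordMatches * 10, 30);     // remaining 30%
  return Math.round(engagement + recency + relevance);
}
```

Items scoring below 40 would then be dropped by the "Filter High Potential Only" step.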
by Krishna Sharma
# 📄 Smart Lead Capture, Scoring & Slack Alerts

This workflow captures new leads from Typeform, checks for duplicates in HubSpot CRM, enriches and scores them, assigns priority tiers (Cold, Warm, Hot), and instantly notifies your sales team in Slack.

## 🔧 How It Works
1. **Typeform Trigger** → monitors form submissions and passes lead details into the workflow.
2. **HubSpot Deduplication** → searches HubSpot by email before creating a new record.
3. **Conditional Routing** →
   - If no match → creates a new contact in HubSpot.
   - If a match is found → updates the existing contact with fresh data.
4. **Lead Scoring (Function Node)** → custom JavaScript assigns a score based on your rules (e.g., company email, job title, engagement signals, enrichment data).
5. **Tier Assignment** → categorizes the lead as ❄️ Cold, 🌡 Warm, or 🔥 Hot based on score thresholds.
6. **Slack Notification** → sends a formatted lead alert to a dedicated sales channel with priority indicators.

## 👤 Who Is This For?
- Sales teams that need to prioritize hot leads in real time.
- Marketing teams running inbound lead capture campaigns with Typeform.
- RevOps teams that want custom scoring beyond HubSpot defaults.
- Founders/SMBs looking to tighten the lead-to-revenue pipeline with automation.

## 💡 Use Case / Problem Solved
- ❌ Duplicate contacts clogging HubSpot CRM.
- ❌ Manual lead triage slows down response time.
- ❌ HubSpot's default scoring is rigid.
- ✅ Automates lead creation + scoring + notification in one flow.
- ✅ Sales teams get immediate Slack alerts with context to act fast.

## ⚙️ What This Workflow Does
- Captures lead data directly from Typeform.
- Cleans & deduplicates contacts before pushing them to HubSpot CRM.
- Scores and categorizes leads via custom logic.
- Sends structured lead alerts to Slack, tagged by priority.
- Provides a scalable foundation you can extend with data enrichment (e.g., Clearbit, Apollo).

## 🛠️ Setup Instructions

**🔑 Prerequisites**
- Typeform account with API access → Typeform Developer Docs
- HubSpot CRM account with an API key or OAuth → HubSpot API Docs
- Slack workspace & API access → Slack API Docs
- (Optional) n8n automation platform to build & run → n8n Hub

**📝 Steps to Configure**

1. **Typeform Node (Trigger)**
   - Connect your Typeform account in n8n.
   - Select the form to track submissions.
   - Fields typically include: first name, last name, email, company, phone.
2. **HubSpot Node (Search Contact)**
   - Configure a search by email.
   - Route outcomes: Not Found → Create Contact; Found → Update Contact.
3. **HubSpot Node (Create/Update Contact)**
   - Map Typeform fields into HubSpot (email, name, phone, company).
   - Ensure you capture both standard and custom properties.
4. **Function Node (Lead Scoring)**: example JavaScript:

```javascript
// Simple lead scoring example
const email = $json.email || "";
let score = 0;
if (email.endsWith("@company.com")) score += 30;
if ($json.company && $json.company.length > 2) score += 20;
if ($json.phone) score += 10;

let tier = "❄️ Cold";
if (score >= 60) tier = "🔥 Hot";
else if (score >= 30) tier = "🌡 Warm";

return { ...$json, leadScore: score, leadTier: tier };
```

   Customize the rules based on your GTM strategy. Reference → n8n Function Node Docs

5. **Slack Node (Send Message)**: example Slack message template:

```
🚀 New Lead Alert!
👤 {{ $json.firstname }} {{ $json.lastname }}
📧 {{ $json.email }} | 🏢 {{ $json.company }}
📊 Score: {{ $json.leadScore }} — {{ $json.leadTier }}
```

   Send to a dedicated #sales-leads channel. Reference → Slack Node in n8n

## 📌 Notes & Extensions
- 🔄 Add enrichment with Clearbit or Apollo.io before scoring.
- 📊 Use HubSpot workflows to trigger nurturing campaigns for ❄️ Cold leads.
- ⏱ For 🔥 Hot leads, auto-assign to an SDR using HubSpot deal automation.
- 🧩 Export data to Google Sheets or Airtable for analytics.