by Michael Taleb
**How it works**
- Watches a Google Drive folder for new (scanned) invoices. Each new file automatically triggers the workflow.
- Downloads and processes each invoice through OCR.Space to extract the text.
- Extracts the company name (e.g. from the "billed to" field) and uses an AI agent to cross-reference it against a database in Google Sheets.
- If a match is found, retrieves the correct recipient email and sends the invoice as an attachment.
- If no match is found or an error occurs, the workflow alerts an operator by email for manual review.

**Setting up the workflow**

Connect Google Drive
• In n8n, connect your Google Drive account.
• Create or select a folder where you will upload scanned invoices.

Connect Gmail (or another email service)
• Add your Gmail account as a credential in n8n.
• This account will be used to send the processed invoice to the correct recipient.

Set up OCR.Space
• Create a free OCR.Space account: https://ocr.space
• In n8n, create a Generic Credential (Header Auth).
• Use apikey as the name and your OCR API key as the value.

Connect the AI Agent
• Add your OpenAI API key as a credential in n8n.
• The AI Agent will extract the company name from the invoice text and match it against your database.
• If a match is found, it retrieves the correct email.

Prepare the Google Sheet database
• Make a copy of the database sheet: Google Sheet Template
• Fill it with company names and recipient emails.
• Connect your Google account to n8n and link this sheet to the workflow.

Run the workflow
• When a new invoice is uploaded to your Google Drive folder, the workflow will:
  • Extract the text with OCR.Space
  • Use the AI Agent to identify the company name
  • Cross-reference it with your Google Sheet database
  • Send the invoice automatically to the correct recipient via Gmail
• If no match is found, an error email is sent to you for manual review.
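For reference, here is a minimal sketch of the kind of request the OCR step makes, assuming OCR.Space's public https://api.ocr.space/parse/image endpoint and the `apikey` Header Auth credential described above. Field names such as `ParsedResults` and `ParsedText` follow OCR.Space's documented response shape, but treat this as an illustration rather than the workflow's exact node configuration:

```javascript
// Minimal sketch: send a file URL to OCR.Space and read back the extracted text.
// Assumes the public https://api.ocr.space/parse/image endpoint and an "apikey" header,
// matching the Header Auth credential configured above.
async function extractInvoiceText(fileUrl, apiKey) {
  const body = new URLSearchParams({
    url: fileUrl,              // publicly reachable link to the scanned invoice
    language: 'eng',
    isOverlayRequired: 'false',
  });

  const response = await fetch('https://api.ocr.space/parse/image', {
    method: 'POST',
    headers: { apikey: apiKey },
    body,
  });

  const result = await response.json();
  if (result.IsErroredOnProcessing) {
    throw new Error(`OCR failed: ${result.ErrorMessage}`);
  }
  // ParsedResults is an array (one entry per page/image).
  return result.ParsedResults.map((page) => page.ParsedText).join('\n');
}
```

The extracted text is then handed to the AI agent, which matches the company name against the Google Sheet database.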
by Dean Pike
LinkedIn URL → Scrape → Match → Screen → Decide, all automated

This workflow automatically processes candidate LinkedIn profiles shared via Telegram, intelligently matches them to job descriptions, performs AI-powered screening analysis, and sends actionable summaries to your team in Telegram.

**Good to know**
- Handles LinkedIn profile scraping via the Apify API (extracts full profile data including experience, education, skills)
- Built-in spam prevention: limits users to 3 LinkedIn profile submissions
- Two-stage JD matching: prioritizes the role mentioned in the candidate's Telegram message, falls back to LinkedIn profile analysis if needed
- Uses the Google Gemini API for AI screening (generous free tier and rate limits, typically enough to avoid paying for API requests - check latest pricing at Google AI Pricing and the rate limits documentation)
- Automatic polling mechanism checks Apify extraction status up to 10 times (15-second intervals)
- Complete audit trail logged in Google Sheets with unique submission IDs

**Who's it for**
Hiring teams and recruiters who want to streamline first-round screening for candidates who share LinkedIn profiles directly. Perfect for companies accepting applications via messaging platforms (Telegram, WhatsApp, etc.), especially useful for tech-savvy audiences and remote/global hiring.

**How it works**
1. Telegram bot receives a message containing a LinkedIn profile URL from a candidate
2. Validates the URL format and checks spam prevention (max 3 submissions per Telegram username)
3. Sends a confirmation message to the candidate and notifies the internal talent team via a Telegram group
4. Extracts the clean LinkedIn URL and initiates an Apify scraping job
5. Polls the Apify API up to 10 times (15-second intervals) until profile extraction completes
6. AI agent matches the candidate to the best-fit job description by analyzing the Telegram message context first (if the candidate mentioned a role), or the LinkedIn profile content as fallback (selects up to 3 potential JD matches)
7. If multiple JDs matched, a second AI agent selects the single best fit based on detailed profile analysis
8. AI recruiter agent analyzes the LinkedIn profile against the selected JD and generates a structured screening report (strengths, weaknesses, risk/reward factors, overall fit score 0-10 with justification)
9. Logs the complete analysis to the Google Sheets tracker with a unique submission ID
10. Sends a formatted summary to the Telegram group with candidate details, matched JD, and overall fit score

**Requirements**
- Telegram Bot Token (create a bot via @BotFather)
- Apify account with API token (sign up for the free tier)
- Google Drive account (OAuth2)
- Google Sheets account (OAuth2)
- Google Gemini API key (get a free key here)
- Google Drive folder for Job Descriptions (as PDFs or Google Docs)
- Telegram group for internal talent team notifications

**How to set up**
1. Create a Telegram bot and an internal Telegram chat group with the new bot:
   - Message @BotFather on Telegram
   - Send /newbot and follow the instructions to create your bot
   - Save the API token provided
   - Create a Telegram group chat and invite your new bot + the @GetIDs bot
   - Note down the group chat ID (How to get group chat ID)
2. Set up Apify:
   - Sign up at Apify
   - Get your API token from Settings
   - Note: the free tier includes sufficient scraping credits for testing and production ($0.01 per successful LinkedIn profile enriched, with a free monthly limit of $5.00) - see the LinkedIn profile scraper "actor" details
3. Create the Google Sheet:
   - Create a new sheet named "LinkedIn Profile AI Candidate Screening"
   - Add columns: Submission ID, Date, LinkedIn Profile URL, First Name, Last Name, Email (if known), Telegram Username, Strengths, Weaknesses, Risk Factor, Reward Factor, JD Match, Overall Fit, Justification
   - Copy the spreadsheet ID from the URL
4. Set up the Google Drive folder:
   - Create a folder named "Job Descriptions"
   - Upload your JD files (PDFs or Google Docs) with clear, descriptive filenames
   - Copy the folder ID from the URL
5. Configure workflow nodes:
   - In "Receive Telegram Msg to Recruiter Bot" node: Add Telegram API credentials
   - In "Extract LinkedIn Profile Information" node: Replace YOUR_APIFY_API_TOKEN with your Apify token
   - In "Check LinkedIn Profile Extraction Status" node: Replace YOUR_APIFY_API_TOKEN with your Apify token
   - In "Get Fully Extracted LinkedIn Profile Data" node: Replace YOUR_APIFY_API_TOKEN with your Apify token
   - In "Access JD Files" node: Update the folder ID to your "Job Descriptions" folder
   - In "Get All Rows Matching Telegram Username" node: Select your Google Sheet
   - In "Add Candidate Analysis in GSheet" node: Select your Google Sheet and verify column mappings
   - In "Send Msg to Internal Talent Group" node: Update the chat ID to your Telegram group chat ID
   - In "Send Review Completed Msg to Talent Group" node: Update the chat ID and Google Sheet URL
6. Add your company description:
   - In "JD Matching Agent" system message: Replace the company description with your details
   - In "Detailed JD Matching Agent" system message: Replace the company description with your details
   - In "Recruiter Scoring Agent" system message: Update the company description
7. Test the workflow:
   - Send a LinkedIn profile URL to your bot from Telegram
   - Monitor the execution to ensure all nodes run successfully
   - Check Google Sheets for logged results
8. Activate the workflow

**Customizing this workflow**
- Change spam limits: Edit the "Spam Check: Sent <4 LinkedIn Profiles?" node to adjust the maximum number of submissions (currently 3)
- Adjust polling attempts: Edit the "Checked 10x for LinkedIn Profile Data?" node to change the maximum polling attempts (currently 10) or modify the wait time in the "Wait for LinkedIn Profile" node (currently 15 seconds)
- Change JD matching logic: Edit the "JD Matching Agent" node prompt to adjust how LinkedIn profiles are matched to roles (e.g., weight current role vs. overall experience)
- Modify screening criteria: Edit the "Recruiter Scoring Agent" node system message to focus on specific qualities (culture fit, leadership potential, technical depth, industry experience, etc.)
- Add more messaging platforms: Add nodes to support WhatsApp, Discord, or other messaging platforms using similar URL-based triggers
- Customize Telegram messages: Edit the notification nodes to change formatting, add emojis, or include additional candidate data
- Auto-proceed logic: Add an IF node after screening to auto-proceed candidates with a fit score above a threshold (e.g., 8+/10) and trigger different notification paths
- Add candidate responses: Connect nodes to automatically message candidates back via Telegram (confirmation, rejection, interview invite)
- Add interview scheduling: For approved candidates, send a Telegram message with a Cal.com or Calendly link so they can book their interview
- Enrich with additional data: Add nodes to cross-reference candidate data with other sources (GitHub, Twitter/X, company websites)
- Multi-language support: Add translation nodes to support candidates submitting profiles in different languages
- Add a human approval step: Create buttons in Telegram group messages for instant Approve/Reject decisions that update Google Sheets

Pro tip: Add your Telegram bot to your company's careers page with instructions like: "Want fast-track screening? Share your LinkedIn profile with our AI recruiter: @YourBotName"

**Troubleshooting**
- Telegram bot not responding: Ensure the bot token is correct in the "Receive Telegram Msg to Recruiter Bot" node, and that users have sent /start to your bot at least once
- "LinkedIn profile URL invalid" error: Check that candidates are sending full URLs in the format https://www.linkedin.com/in/username (not shortened links or text without a URL)
- Apify extraction failing: Verify the Apify API token is correctly set in all three HTTP Request nodes ("Extract LinkedIn Profile Information", "Check LinkedIn Profile Extraction Status", "Get Fully Extracted LinkedIn Profile Data")
- LinkedIn extraction timeout: Increase polling attempts in the "Checked 10x for LinkedIn Profile Data?" node (currently 10) or increase the wait time in the "Wait for LinkedIn Profile" node (currently 15 seconds)
- Spam check blocking valid users: Check that the "Get All Rows Matching Telegram Username" node points to the correct Google Sheet, and adjust the limit in the "Spam Check: Sent <4 LinkedIn Profiles?" node if needed
- JD matching returns no results: Check that the "Access JD Files" node folder ID points to your Job Descriptions folder, and that JD files are named clearly (e.g., "Marketing Director JD.pdf")
- JD matching is not relevant for my company: Update the "Company Description" in the system messages of all three AI agent nodes ("JD Matching Agent", "Detailed JD Matching Agent", "Recruiter Scoring Agent")
- "Can't find matching JD": Ensure the candidate's Telegram message mentions a role name OR their LinkedIn profile clearly indicates relevant experience for the available JDs
- Google Sheets errors: Verify the sheet name is "LinkedIn Profile AI Candidate Screening" and the column headers exactly match workflow expectations (Submission ID, Date, LinkedIn Profile URL, First Name, Last Name, etc.)
- Telegram group notifications not appearing: Verify the chat ID is correct in the "Send Msg to Internal Talent Group" and "Send Review Completed Msg to Talent Group" nodes (use a negative number for group chats, e.g., -4954246611)
- Missing candidate data in Google Sheets: The LinkedIn profile may be incomplete - verify Apify successfully extracted data by checking the "Get Fully Extracted LinkedIn Profile Data" node output
- Loop counter not working: Check that the "Restore Loop Counter" code node references the correct node names ("Checked 10x for LinkedIn Profile Data?" and "Initialize Loop Counter to Poll for Completion")
- 401/403 API errors: Re-authorize all OAuth2 credentials (Google Drive, Google Sheets) and verify the Apify and Telegram API tokens are valid
- AI analysis quality issues: Edit the system prompts in the "JD Matching Agent", "Detailed JD Matching Agent", and "Recruiter Scoring Agent" nodes to refine screening criteria and provide more context about your hiring needs
- Gemini API rate limit errors: Check your usage at Google AI Studio and consider upgrading to the paid tier if you exceed free tier limits (see the rate limits documentation)

**Sample Outputs**
- Google Sheets - LinkedIn AI Candidate Screening - sample
- Telegram messages between the AI recruiter bot and a job applicant
- Telegram messages from the AI recruiter bot in the internal group chat
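As an illustration of the validation and spam-check step described in "How it works" above, here is a minimal JavaScript sketch. The helper and field names (messageText, previousRows) are hypothetical; the actual workflow implements this logic across its IF and Code nodes:

```javascript
// Hypothetical sketch of the URL validation + spam check performed by the workflow's
// IF/Code nodes. Field names (messageText, previousRows) are illustrative.
const LINKEDIN_PROFILE_RE = /https?:\/\/(www\.)?linkedin\.com\/in\/[A-Za-z0-9\-_%]+\/?/;

function validateSubmission(messageText, previousRows, telegramUsername) {
  // 1. Extract a clean LinkedIn profile URL from the Telegram message.
  const match = messageText.match(LINKEDIN_PROFILE_RE);
  if (!match) {
    return { ok: false, reason: 'LinkedIn profile URL invalid' };
  }

  // 2. Spam prevention: count earlier Google Sheet rows for this Telegram username.
  const submissions = previousRows.filter(
    (row) => row['Telegram Username'] === telegramUsername,
  ).length;
  if (submissions >= 3) {
    return { ok: false, reason: 'Submission limit reached (max 3 profiles)' };
  }

  return { ok: true, linkedinUrl: match[0] };
}
```

If you raise the spam limit in the "Spam Check: Sent <4 LinkedIn Profiles?" node, the equivalent threshold in this logic is the `>= 3` comparison.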
by Hichul
This workflow is an autonomous AI research assistant that transforms a single topic into a comprehensive, multi-chapter report. Designed for researchers, students, and content creators, it automates the entire process from planning and online research to writing, formatting, and delivering a final PDF report directly to your email.

**How it works**
1. Trigger: The workflow begins when a user submits a research topic and an email address through a form.
2. AI Planning: An AI agent breaks the main topic into five focused subtopics. It then generates an engaging report title, a detailed introduction, and chapter headings, saving this initial structure to a Google Sheet.
3. Parallel Research and Writing: The workflow then splits into five parallel paths, one for each subtopic. In each path, it uses the Tavily Search API to gather real-time information from the web. A dedicated AI writer then synthesizes this research into a complete, well-formatted HTML chapter with inline citations.
4. Content Aggregation: As each chapter is completed, its content, sources, and section titles are saved to the central Google Sheet.
5. Final Assembly and Delivery: Once all chapters are written, the workflow compiles the title, introduction, a newly generated table of contents, all chapters, and a complete list of sources into a single HTML document. This document is sent to APITemplate.io to be converted into a professionally formatted PDF, which is then emailed as an attachment to the address provided in the form.

**Set up steps**
1. Google Sheets: Make a copy of the provided Google Sheet Template. Connect your Google account in the credentials menu. Update all Google Sheets nodes to use your copied sheet by selecting it from the list.
2. AI Language Model (Google Gemini): Sign up for an API key from the Google AI Platform. Connect your account in the Google Gemini Chat Model nodes. This template is pre-configured for the affordable gemini-pro-flash-lite model.
3. Tavily Search API: Sign up for a free account at Tavily and get an API key. In the Tavily HTTP Request nodes, create a new Header Auth credential. For the Name, enter X-Tavily-API-Key and for the Value, paste your Tavily API key.
4. APITemplate.io: Sign up for a free account at APITemplate.io and get an API key. Connect your account in the Generate PDF node's credentials.
5. Gmail: Connect the Gmail account you want to send the final report from in the Send Report node's credentials.
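The final assembly step described above is essentially string composition. Here is a minimal sketch of how the title, introduction, table of contents, chapters, and sources could be stitched into one HTML document before it is sent to APITemplate.io; the Google Sheet column names (sectionTitle, chapterHtml, sources) are assumptions, not the template's exact schema:

```javascript
// Hypothetical sketch of the final-assembly step. Rows come from the Google Sheet,
// one per chapter, with illustrative column names (sectionTitle, chapterHtml, sources).
function assembleReport({ title, introduction }, chapterRows) {
  const toc = chapterRows
    .map((row, i) => `<li>${i + 1}. ${row.sectionTitle}</li>`)
    .join('');

  const chapters = chapterRows
    .map((row) => `<section><h2>${row.sectionTitle}</h2>${row.chapterHtml}</section>`)
    .join('');

  const sources = chapterRows
    .flatMap((row) => row.sources)          // each row carries its own citation list
    .map((src) => `<li><a href="${src}">${src}</a></li>`)
    .join('');

  return `<html><body>
    <h1>${title}</h1>
    <p>${introduction}</p>
    <h2>Table of Contents</h2><ol>${toc}</ol>
    ${chapters}
    <h2>Sources</h2><ul>${sources}</ul>
  </body></html>`;
}
```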
by Raymond Camden
**How it works**
This n8n template demonstrates using Foxit's Extraction API to get information from an incoming document, then using Diffbot's APIs to turn the text into a list of organizations mentioned in the document and to create a summary.

1. Listen for a new file added to a Google Drive folder.
2. When executed, the file's bits are downloaded.
3. Upload the bits to Foxit.
4. Call the Extract API to get the text contents of the document.
5. Poll the API to see if it's done, and when it is, grab the text.
6. Send the text to the Diffbot API to get a list of entities mentioned in the doc as well as the summary.
7. Use a code step to filter the entities returned from Diffbot to ones that are organizations, keeping only those with a high confidence score.
8. Use another code step to make an HTML string from the previous data.
9. Email it using the Gmail node.

**Requirements**
- A Google account for Google Drive and Gmail
- Foxit developer account (https://developer-api.foxit.com)
- Diffbot developer account (https://app.diffbot.com/get-started)

**Next Steps**
This workflow assumes PDF input, but Foxit has APIs to convert Office docs to PDF, and that step could be added before the Extract API is called. Diffbot returns an incredible set of information, and more of it could be used in the email. Instead of emailing, you could sort documents into new folders by organization.
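A minimal sketch of the two code steps described above. The exact shape of Diffbot's entity objects may differ (the `entities`, `allTypes`, and `confidence` field names here are assumptions), but the idea is to keep only entities typed as organizations above a confidence threshold and then render them as HTML:

```javascript
// Hypothetical sketch of the "filter entities" code step. Field names (entities,
// allTypes, confidence) are assumptions about the Diffbot response shape.
function organizationsFromDiffbot(diffbotResponse, minConfidence = 0.9) {
  return (diffbotResponse.entities || [])
    .filter((entity) =>
      (entity.allTypes || []).some((t) => t.name === 'organization') &&
      entity.confidence >= minConfidence,
    )
    .map((entity) => entity.name);
}

// Second code step: turn the filtered list plus the summary into an HTML string for Gmail.
function buildEmailHtml(summary, organizations) {
  const items = organizations.map((name) => `<li>${name}</li>`).join('');
  return `<h2>Summary</h2><p>${summary}</p><h2>Organizations</h2><ul>${items}</ul>`;
}
```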
by Kev
Generate ready-to-publish short-form videos from text prompts using AI

Click on the image to see the example output in Google Drive.

Transform simple text concepts into professional short-form videos complete with AI-generated visuals, narrator voice, background music, and dynamic text overlays - all automatically generated and ready for Instagram, TikTok, or YouTube Shorts.

This workflow demonstrates a cost-effective approach to video automation by combining AI-generated images with audio composition instead of expensive AI video generation. Processing takes 1-2 minutes and outputs professional 9:16 vertical videos optimized for social platforms. The template serves as both a showcase and a building block for larger automation systems, with sticky notes providing clear guidance for customization and extension.

**Who's it for**
Content creators, social media managers, and marketers who need consistent, high-quality video content without manual production work. Perfect for motivational content, storytelling videos, educational snippets, and brand campaigns.

**How it works**
The workflow uses a form trigger to collect the video theme, setting, and style preferences. ChatGPT generates cohesive scripts and image prompts, while Google Gemini creates themed background images and OpenAI TTS produces the narrator audio. Background music is sourced from Openverse for CC-licensed tracks. All assets are uploaded to the JsonCut API, which composes the final video with synchronized overlays, transitions, and professional audio mixing. Results are stored in NocoDB for management.

**How to set up**
1. JsonCut API: Sign up at jsoncut.com and create an API key at app.jsoncut.com. Configure an HTTP Header Auth credential in n8n with the header name x-api-key.
2. OpenAI API: Set up credentials for script generation and text-to-speech.
3. Google Gemini API: Configure access for Imagen 4.0 image generation.
4. NocoDB (optional): Set up an instance for video storage and configure database credentials.

**Requirements**
- JsonCut free account with API key
- OpenAI API access for GPT and TTS
- Google Gemini API for image generation
- NocoDB (optional) for result storage

**How to customize the workflow**
This template is designed as a foundation for larger automation systems. The modular structure allows easy modification of AI prompts for different content niches (business, wellness, education), replacement of the form trigger with RSS feeds or database triggers for automated content generation, integration with social media APIs for direct publishing, and customization of visual branding through the JsonCut configuration. The workflow can be extended for bulk processing, A/B testing multiple variations, or integration with existing content management systems. Sticky notes throughout the workflow provide detailed guidance for common customizations and scaling options.
by gclbck
Analyze YouTube videos for virality with an AI-powered report

This workflow automates the discovery and analysis of potentially viral YouTube videos. It searches for recent, popular videos based on a keyword, calculates a unique "Algorithmic Lift Score" to measure virality, and uses an AI agent to generate an insightful summary report that is sent directly to your email.

**What it does**
This workflow identifies videos that are outperforming their channel's baseline, a key indicator of viral potential. It operates in several stages:
1. Searches YouTube: It finds recent, top-performing videos based on your specified keyword and timeframe.
2. Gathers Data: For each video found, it fetches detailed statistics for both the video (views, likes, comments) and its channel (subscriber count, total views).
3. Calculates Virality Score: It calculates an "Algorithmic Lift Score" for each video. This custom metric prioritizes videos that achieve high view counts and engagement relative to their channel's subscriber base.
4. Analyzes with AI: The top 5 videos, sorted by their virality score, are sent to an AI agent (pre-configured for OpenAI). The AI generates a concise summary highlighting trends, top performers, and other noteworthy patterns.
5. Sends Email Report: The final AI-generated analysis is converted to HTML and emailed to you, providing a ready-to-read report on what's trending in your niche.

**Who it's for**
This workflow is perfect for:
- **Content Creators** looking for trending topics and content ideas.
- **Digital Marketers** conducting competitor analysis or market research.
- **Social Media Managers** wanting to understand what content resonates on YouTube.
- **Data Analysts** who need to automate the collection and analysis of YouTube trends.

**Requirements**
- A Google API Key with the "YouTube Data API v3" enabled.
- An OpenAI API Key (or another compatible AI model credential).
- A connected Gmail account in n8n to send the final report.

**How to set up**
1. Configure the Setup Node: Click on the "Setup" node and fill in the values:
   - query: The keyword you want to search for (e.g., "AI tools").
   - GoogleAPIkey: Your Google API key.
   - daysback: How many days in the past to search for new videos.
   - maxResult: The number of videos to analyze (e.g., 20).
   - email: The email address where the report will be sent.
2. Set AI Credentials: Click the "OpenAI Chat Model" node and add your OpenAI API key to the credentials.
3. Set Gmail Credentials: Click the "Send_Report" node and connect your Gmail account to the credentials.
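The exact "Algorithmic Lift Score" formula lives inside the workflow's code node and is not reproduced here; as an illustration of the idea (views and engagement normalized by the channel's subscriber base), a hypothetical scoring function might look like this, using the standard YouTube Data API v3 statistics fields:

```javascript
// Illustrative only: the template's actual "Algorithmic Lift Score" formula is defined
// inside the workflow. This sketch just shows the idea of normalizing a video's
// performance by its channel's baseline; the weights below are made up.
function algorithmicLiftScore(video, channel) {
  const views = Number(video.statistics.viewCount);
  const likes = Number(video.statistics.likeCount || 0);
  const comments = Number(video.statistics.commentCount || 0);
  const subscribers = Math.max(Number(channel.statistics.subscriberCount), 1);

  const viewLift = views / subscribers;                 // reach beyond the existing audience
  const engagementRate = (likes + comments) / Math.max(views, 1);

  return viewLift * (1 + 10 * engagementRate);          // engagement amplifies the lift
}
```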
by Harry Siggins
Research meeting attendees and prepare daily agenda in Slack

This workflow automatically researches your meeting attendees every morning and sends you a comprehensive brief in Slack with context about who you're meeting, their company, and key talking points.

**Who's it for**
- Sales professionals who need quick context before meetings
- Executives with packed calendars who need meeting preparation
- Customer success teams managing multiple client relationships
- Account managers preparing for client calls
- Business development teams researching prospects
- Anyone who wants to be better prepared for their daily meetings

**How it works**
1. Daily Trigger: Runs every weekday morning at 6 AM (customizable) to analyze your Google Calendar
2. Calendar Analysis: Fetches all meetings scheduled for today and filters for external meetings (those with attendees other than yourself)
3. AI-Powered Research: For each external meeting, an AI agent researches attendees using multiple sources:
   - Searches your CRM (Attio) for existing contact information
   - Queries Gmail history for past email interactions
   - Searches past calendar events for previous meetings with attendees
   - Performs web searches for recent news about attendees and their companies
   - Retrieves company data from Apollo.io, including industry, size, and technologies
4. CRM Updates: Automatically creates new contact records in Attio for unknown attendees and adds meeting preparation notes to existing contacts
5. Brief Generation: Compiles all research into a scannable, actionable meeting brief with key talking points
6. Slack Delivery: Sends the formatted brief to your designated Slack channel for easy mobile access

**Setup requirements**
- **Google Calendar** OAuth2 connection (for fetching meetings)
- **Slack** workspace with bot permissions (for receiving briefs)
- **Gmail** OAuth2 connection (for email history search)
- **OpenRouter** API key (for AI processing)
- **Attio CRM** account and API token (optional - for contact management)
- **Apollo.io** API key (optional - for company research)
- **Anthropic** API key (optional - for advanced web search)

**How to customize**
- Adjust Schedule: Modify the Schedule Trigger node to run at your preferred time - change from 6 AM to whenever works best for your schedule
- Customize Research Sources:
  - Remove the CRM integration if you don't use Attio
  - Remove Apollo.io if you don't need company research
  - Add additional research tools as needed
- Modify Output Format: Edit the prompt in the "Format Daily Meeting Brief" node to change how the information is structured and presented
- Change Delivery Method:
  - Replace Slack with Microsoft Teams, email, or Discord
  - Add multiple delivery channels if needed
  - Send to different channels based on meeting type
- Filter Meetings: Adjust the filtering logic to include/exclude certain types of meetings based on keywords, attendees, or calendar properties (a sketch of the external-meeting filter appears below)

**Advanced customization**
- **Add VIP alerts**: Create special handling for meetings with executives or key clients
- **Include preparation documents**: Automatically attach relevant files from Google Drive
- **Time zone handling**: Adjust for meetings across different time zones
- **Language support**: Modify prompts to generate briefs in different languages
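A minimal sketch of the external-meeting filter mentioned above, assuming the calendar items expose a Google Calendar-style attendees array (where the `self` flag marks your own entry). The domain constant and field names are assumptions; adapt them to the node's actual output:

```javascript
// Hypothetical sketch of the "external meetings only" filter. Assumes each calendar
// event exposes a Google Calendar-style attendees array with email/self fields.
const INTERNAL_DOMAIN = 'yourcompany.com'; // assumption: replace with your own domain

function isExternalMeeting(event) {
  const attendees = event.attendees || [];
  return attendees.some(
    (a) => !a.self && !(a.email || '').endsWith(`@${INTERNAL_DOMAIN}`),
  );
}

// Example: keep only today's meetings that include at least one outside attendee.
function filterExternalMeetings(events) {
  return events.filter(isExternalMeeting);
}
```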
by Growth AI
**Who's it for**
Marketing teams, business intelligence professionals, competitive analysts, and executives who need consistent industry monitoring with AI-powered analysis and automated team distribution via Discord.

**What it does**
This intelligent workflow automatically monitors multiple industry topics, scrapes and analyzes relevant news articles using Claude AI, and delivers professionally formatted intelligence reports to your Discord channel. The system provides weekly automated monitoring cycles with personalized bot communication and comprehensive content analysis.

**How it works**
The workflow follows a sophisticated 7-phase automation process:
1. Scheduled Activation: Triggers weekly monitoring cycles (default: Mondays at 9 AM)
2. Query Management: Retrieves monitoring topics from a centralized Google Sheets configuration
3. News Discovery: Executes comprehensive Google News searches using SerpAPI for each configured topic
4. Content Extraction: Scrapes full article content from the top 3 sources per topic using Firecrawl
5. AI Analysis: Processes scraped content using Claude 4 Sonnet for intelligent synthesis and formatting
6. Discord Optimization: Automatically segments content to comply with Discord's 2000-character message limit (see the sketch at the end of this section)
7. Automated Delivery: Posts formatted intelligence reports to the Discord channel with the branded "Claptrap" bot personality

**Requirements**
- Google Sheets account for query management
- SerpAPI account for Google News access
- Firecrawl account for article content extraction
- Anthropic API access for Claude 4 Sonnet
- Discord bot with proper channel permissions
- Scheduled execution capability (cron-based trigger)

**How to set up**

Step 1: Configure Google Sheets query management
- Create the monitoring sheet: Set up a Google Sheets document with a "Query" sheet
- Add search topics: Include industry keywords, competitor names, and relevant search terms
- Sheet structure: Simple column format with a "Query" header containing the search terms
- Access permissions: Ensure n8n has read access to the Google Sheets document

Step 2: Configure API credentials
Set up the following credentials in n8n:
- Google Sheets OAuth2: For accessing the query configuration sheet
- SerpAPI: For Google News search functionality with proper rate limits
- Firecrawl API: For reliable article content extraction across various websites
- Anthropic API: For Claude 4 Sonnet access with sufficient token limits
- Discord Bot API: With message posting permissions in the target channel

Step 3: Customize scheduling settings
- Cron expression: Default set to "0 9 * * 1" (Mondays at 9 AM)
- Frequency options: Adjust for daily, weekly, or custom monitoring cycles
- Timezone considerations: Configure according to the team's working hours
- Execution timing: Ensure adequate processing time for multiple topics

Step 4: Configure Discord integration
Set up Discord delivery settings:
- Guild ID: Target Discord server (currently: 919951151888236595)
- Channel ID: Specific monitoring channel (currently: 1334455789284364309)
- Bot permissions: Message posting, embed suppression capabilities
- Brand personality: Customize the "Claptrap" bot messaging style and tone

Step 5: Customize content analysis
Configure AI analysis parameters:
- Analysis depth: Currently processes the top 3 articles per topic
- Content format: Structured markdown format with consistent styling
- Language settings: Currently configured for French output (easily customizable)
- Quality controls: Error handling for inaccessible articles and content

**How to customize the workflow**

Query management expansion
- Topic categories: Organize queries by industry, competitor, or strategic focus areas
- Keyword optimization: Refine search terms based on result quality and relevance
- Dynamic queries: Implement time-based or event-triggered query modifications
- Multi-language support: Add international keyword variations for global monitoring

Advanced content processing
- Article quantity: Modify from 3 to more articles per topic based on analysis needs
- Content filtering: Add quality scoring and relevance filtering for article selection
- Source preferences: Implement preferred publisher lists or source quality weighting
- Content enrichment: Add sentiment analysis, trend identification, or competitive positioning

Discord delivery enhancements
- Rich formatting: Implement Discord embeds, reactions, or interactive elements
- Multi-channel distribution: Route different topics to specialized Discord channels
- Alert levels: Add priority-based messaging for urgent industry developments
- Archive functionality: Create searchable message threads or database storage

Integration expansions
- Slack compatibility: Replace or supplement Discord with Slack notifications
- Email reports: Add formatted email distribution for executive summaries
- Database storage: Implement persistent storage for historical analysis and trending
- API endpoints: Create webhook endpoints for third-party system integration

AI analysis customization
- Analysis templates: Create topic-specific analysis frameworks and formatting
- Competitive focus: Enhance competitor mention detection and analysis depth
- Trend identification: Implement cross-topic trend analysis and strategic insights
- Summary levels: Create executive summaries alongside detailed technical analysis

**Advanced monitoring features**

Intelligent content curation
The system provides sophisticated content management:
- Relevance scoring: Automatic ranking of articles by topic relevance and publication authority
- Duplicate detection: Prevents redundant coverage of the same story across different sources
- Content quality assessment: Filters low-quality or promotional content automatically
- Source diversity: Ensures coverage from multiple perspectives and publication types

Error handling and reliability
- Graceful degradation: Continues processing even if individual articles fail to scrape
- Retry mechanisms: Automatic retry logic for temporary API failures or network issues
- Content fallbacks: Uses article snippets when full content extraction fails
- Notification continuity: Ensures Discord delivery even with partial content processing

**Results interpretation**

Intelligence report structure
Each monitoring cycle delivers:
- Topic-specific summaries: Individual analysis for each configured search query
- Source attribution: Complete citation with publication date, source, and URL
- Structured formatting: Consistent presentation optimized for quick scanning
- Professional analysis: AI-generated insights maintaining factual accuracy and business context

Performance analytics
Monitor system effectiveness through:
- Processing metrics: Track successful article extraction and analysis rates
- Content quality: Assess relevance and usefulness of delivered intelligence
- Team engagement: Monitor Discord channel activity and report utilization
- System reliability: Track execution success rates and error patterns

**Use cases**

Competitive intelligence
- Market monitoring: Track competitor announcements, product launches, and strategic moves
- Industry trends: Identify emerging technologies, regulatory changes, and market shifts
- Partnership tracking: Monitor alliance formations, acquisitions, and strategic partnerships
- Leadership changes: Track executive movements and organizational restructuring

Strategic planning support
- Market research: Continuous intelligence gathering for strategic decision-making
- Risk assessment: Early warning system for industry disruptions and regulatory changes
- Opportunity identification: Spot emerging markets, technologies, and business opportunities
- Brand monitoring: Track industry perception and competitive positioning

Team collaboration enhancement
- Knowledge sharing: Centralized distribution of relevant industry intelligence
- Discussion facilitation: Provide a common information baseline for strategic discussions
- Decision support: Deliver timely intelligence for business planning and strategy sessions
- Competitive awareness: Keep teams informed about competitive landscape changes

**Workflow limitations**
- Language dependency: Currently optimized for French analysis output (easily customizable)
- Processing capacity: Limited to 3 articles per query (configurable based on API limits)
- Platform specificity: Configured for Discord delivery (adaptable to other platforms)
- Scheduling constraints: Fixed weekly schedule (customizable via cron expressions)
- Content access: Dependent on article accessibility and website compatibility with Firecrawl
- API dependencies: Requires active subscriptions and proper rate limit management for all integrated services
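Phase 6 above (Discord optimization) boils down to splitting the long analysis text into pieces that stay under Discord's 2000-character limit. A minimal sketch of such a splitter, preferring paragraph boundaries; this is an illustration, not the workflow's exact code node:

```javascript
// Illustrative sketch of the Discord segmentation step: split a long report into
// messages of at most 2000 characters, preferring paragraph boundaries.
const DISCORD_LIMIT = 2000;

function splitForDiscord(report, limit = DISCORD_LIMIT) {
  const chunks = [];
  let current = '';

  const flush = () => {
    if (current) chunks.push(current);
    current = '';
  };

  for (const paragraph of report.split('\n\n')) {
    if (paragraph.length > limit) {
      // Oversized paragraph: flush what we have, then hard-split it.
      flush();
      for (let i = 0; i < paragraph.length; i += limit) {
        chunks.push(paragraph.slice(i, i + limit));
      }
    } else if ((current ? current.length + 2 : 0) + paragraph.length > limit) {
      flush();
      current = paragraph;
    } else {
      current = current ? `${current}\n\n${paragraph}` : paragraph;
    }
  }
  flush();
  return chunks;
}
```

Each returned chunk is then posted as a separate Discord message by the delivery node.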
by inderjeet Bhambra
**Who's it for**
Content creators, trainers, and educators who need to convert lengthy documents into digestible micro-learning experiences.

**How it works**
This workflow takes your source content (PDFs, articles, handbooks) and uses GPT-4 to intelligently break it into 2-3 minute learning modules. Each module includes a key concept, explanation, practical example, and knowledge check question.

**How to set up**
1. Configure OpenAI credentials with GPT-4 access
2. Connect your Slack workspace (optional)
3. Set up the Google Docs integration
4. Optionally, send content via webhook or paste it directly

**Requirements**
- OpenAI API key with GPT-4 access
- Google Docs account (for document creation)
- Slack workspace (optional, for notifications)

**How to customize the workflow**
- Adjust module count and length in the AI prompts
- Modify output formats (email, mobile, Slack)
- Change document structure and styling
- Add custom delivery channels

Perfect for converting employee handbooks, training materials, and documentation into engaging micro-learning courses that people actually complete.
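As a rough illustration of the module structure described above, one generated micro-learning module might look like the following when the GPT-4 step returns structured output. The field names and content here are hypothetical, not the template's actual schema:

```javascript
// Hypothetical example of one micro-learning module as structured output.
// Field names are illustrative; the template's actual schema may differ.
const exampleModule = {
  title: 'Expense Reporting Basics',
  estimatedMinutes: 3,
  keyConcept: 'All business expenses must be submitted within 30 days.',
  explanation:
    'The finance team closes the books monthly, so late submissions delay reimbursement ' +
    'and complicate budget tracking.',
  practicalExample:
    'You buy a $40 client lunch on March 3rd: photograph the receipt and submit it ' +
    'through the expense portal before April 2nd.',
  knowledgeCheck: {
    question: 'How long do you have to submit an expense after incurring it?',
    options: ['7 days', '30 days', '90 days'],
    answerIndex: 1,
  },
};
```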
by Msaid Mohamed el hadi
🤖 Instagram Automation Suite: AI Chatbot & Content Powerhouse

**Workflow Overview**
This n8n workflow is a comprehensive automation solution designed to streamline various Instagram operations. It combines an intelligent AI chatbot for direct message management, automated user following, and an advanced content generation system, all integrated to enhance your Instagram presence and efficiency.

This workflow automatically:

1. Manages Instagram Direct Messages via a Telegram Chatbot:
   - Listens for new messages on Telegram.
   - Routes messages from a specific Instagram user (Wolf23000) for processing.
   - Utilizes an AI agent (powered by OpenRouter's models) to determine the intent of the message (e.g., chat back, run an Instagram-related action like getting profile info, posting, or following).
   - Sends AI-generated responses back to the user via Telegram.
2. Automates Instagram User Following:
   - Scheduled to run at regular intervals (hourly).
   - Processes a list of usernames (likely from a Google Sheet; this is not explicit in the provided JSON, but it is the common pattern implied by the "Auto Follow users from sheet" sticky note).
   - Initiates following actions on Instagram for the specified users.
3. Generates & Schedules Instagram Posts:
   - Scheduled to run monthly.
   - Leverages an AI agent (powered by OpenRouter) to generate 30 or 31 Instagram post ideas for the current month, based on a predefined "Instagram personality profile."
   - Each post idea includes an imagePrompt (for AI image generation), a caption with emojis and hashtags, and a scheduledDate.
   - Refines these post ideas by enhancing the imagePrompt to be more vivid and detailed for AI image generation, and polishing the caption for optimal engagement.
   - Updates a Google Sheet ("posts generation plan") with the generated content, including the enhanced image prompts and the resulting image URLs (presumably from a separate image generation step not fully detailed in the provided JSON, but implied by the image_url updates).

**Key Benefits**
- Intelligent DM Management: Automate responses and actions for Instagram direct messages, ensuring timely and relevant interactions without manual effort.
- Effortless Audience Growth: Automatically follow target users, expanding your reach and potential engagement on Instagram.
- AI-Powered Content Creation: Generate a full month's worth of diverse, engaging Instagram post ideas tailored to a specific personality, complete with image prompts and captions.
- Content Optimization: Automatically enhance image prompts for better AI image generation and refine captions for maximum impact.
- Time-Saving: Significantly reduce the manual workload associated with Instagram management, from direct messages to content planning and execution.
- Consistent Brand Voice: Maintain a consistent and engaging presence on Instagram with AI-generated content aligned with your defined personality.

**Setup Requirements**
To set up and run this workflow, you'll need the following:

1. n8n Installation: Install n8n (cloud or self-hosted). The latest stable version, as of July 2025, is v1.101.1. Import the workflow configuration, configure API credentials for all integrated services, and set up scheduling preferences for continuous operation.
   - System requirements for self-hosting: a modern multi-core processor (2 cores minimum, 4 recommended), 2 GB RAM (4 GB or more recommended), and 20 GB of free SSD storage. Node.js version 16 or later (18.x LTS recommended) is required. PostgreSQL is the recommended database for production.
2. Telegram API Access: Create a Telegram bot via BotFather and obtain your API token. Configure the Telegram Trigger node with your bot's API credentials to receive messages. Pricing: Telegram's API is free to use.
3. OpenRouter API Access: Create an OpenRouter account and generate an API key. This key ({{your open router api key }} as seen in the code) is used to access their chat models (e.g., google/gemini-2.5-flash-preview) for AI agent operations. Pricing: OpenRouter offers a variety of models with different pricing structures, including some free models like DeepSeek R1. Most models operate on a pay-per-usage basis, with costs clearly displayed for each model and prompt.
4. Instagram Session ID: You'll need a valid Instagram session ID ({{ your instagram session ID }} as seen in the code) for the workflow to interact with Instagram. This usually involves extracting it from your browser's cookies after logging into Instagram. Caution: Instagram's terms of service generally prohibit automated interactions, and using session IDs for scraping or automation can lead to account suspension. Use with extreme caution and at your own risk.
5. Apify token setup: Replace {{ your apify token }} with your Apify token in the HTTP requests.
6. Google Sheets Credentials: A Google Cloud API key with access to Google Sheets. Set up OAuth2 authentication in n8n for read/write access to your "posts generation plan" spreadsheet (Document ID: 1XHNwAXR4USThaAzX1Y6M5PF2P8WqCBU8mi34FBLkV6M). This sheet is used to store and manage generated post ideas. Pricing: The Google Sheets API is generally free for most common use cases, with generous per-minute quotas (300 read and 300 write requests per minute per project, 60 per user per project); exceeding these quotas results in rate limiting rather than additional charges.
   - https://docs.google.com/spreadsheets/d/1Ze5SC1g6Q5VzMAKYx0zmqlT00Db1HOchUth1jrPyM2Y/edit?usp=sharing
   - https://docs.google.com/spreadsheets/d/1XHNwAXR4USThaAzX1Y6M5PF2P8WqCBU8mi34FBLkV6M/edit?usp=sharing
7. Predefined Instagram Personality JSON: The workflow relies on a detailed JSON object defining an "Instagram personality" (e.g., user_id, username, full_name, bio, content_preferences, personality_traits, unfulancer_attributes). This JSON needs to be correctly set within the Code nodes (Variables, Variables1, Variables2) to guide the AI content generation.
**Workflow Architecture**

Telegram chatbot path:
[Telegram New Message Trigger] → [Variables (Set OpenRouter API Key, Instagram Personality, Session ID)] → [Switch (Filter messages from 'Wolf23000' and ensure message text exists)] → [Edit Fields (Extract message text)] → [AI Agent (Determine action based on message intent)] → [Structured Output Parser (Parse AI agent's JSON output)] → [Switch1 (Route based on AI agent's determined action: chat_back, run_agent, get_instagram_profile)] → [Send a text message1 (Chat back)] / [Send a text message (Run agent confirmation)] / [Send a text message2 (Get profile confirmation)]

Hourly follow path:
[Schedule Trigger (Hourly for Instagram follow)] → [Variables (Set OpenRouter API Key, Instagram Personality, Session ID)] → [Code (Prepare usernames for following)] → [Code1 (Process followed usernames)]

Monthly post generation path:
[Schedule Trigger2 (Monthly for Instagram post generation)] → [AI Agent1 (Generate monthly Instagram post ideas)] → [OpenRouter Chat Model (AI model for content generation)] → [Code2 (Parse AI agent's JSON output)]

Daily refinement and auto-posting path:
[Schedule Trigger3 (Daily for post generation refinement and auto-posting)] → [AI Agent2 (Enhance image prompts and captions)] → [OpenRouter Chat Model2 (AI model for prompt refinement)] → [Update row in sheet1 (Update Google Sheet with enhanced content)] → [Get row(s) in sheet2 (Retrieve data from Google Sheet)]

**Connect With Me**
Exploring AI-powered social media automation?
📧 Email: mohamedgb00714@gmail.com
💼 LinkedIn: Mohamed el Hadi Msaid

Supercharge your Instagram presence with intelligent automation and AI-driven content! 🚀
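The Variables Code nodes referenced in the setup hold the credentials and personality profile that downstream nodes read via expressions. A minimal, hypothetical sketch of what such a node could return in n8n (placeholder values only, mirroring the {{ ... }} placeholders called out above):

```javascript
// Hypothetical sketch of a "Variables" Code node in n8n: it returns one item whose
// fields downstream nodes (AI agents, HTTP requests) can reference via expressions.
// All values are placeholders; never commit real keys or session IDs.
return [
  {
    json: {
      openRouterApiKey: '{{your open router api key }}',
      instagramSessionId: '{{ your instagram session ID }}',
      apifyToken: '{{ your apify token }}',
      instagramPersonality: {
        username: 'example_account',         // illustrative personality profile
        full_name: 'Example Account',
        bio: 'Daily inspiration and behind-the-scenes content.',
        content_preferences: ['travel', 'fitness', 'motivation'],
        personality_traits: ['friendly', 'energetic', 'authentic'],
      },
    },
  },
];
```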
by Sankalp Dev
This automation workflow transforms Meta advertising data into executive-ready presentation decks, eliminating manual report creation while ensuring stakeholders receive consistent performance insights. It generates professional Google Slides presentations from your ad campaigns and delivers them automatically via email to designated recipients. By combining scheduled data extraction with AI-powered analysis and automated presentation building, you'll receive polished, actionable reports that facilitate strategic advertising decisions and client communication.

**Key Features**
- Scheduled automated summary deck generation (daily, weekly, or monthly)
- AI-powered data analysis using advanced language models
- Intelligent presentation generation with actionable recommendations
- Direct email delivery of formatted summary decks

**Prerequisites**
- GoMarble MCP account and API access
- Anthropic account
- Google Slides, Google Drive & Gmail accounts
- n8n instance (cloud or self-hosted)

Configuration Time: ~15-20 minutes

**Step By Step Setup**
1. Connect GoMarble MCP to n8n: Follow the integration guide (GoMarble MCP Setup) and configure your Meta Ads account credentials in the GoMarble platform.
2. Configure the Schedule Trigger.
3. Customize the Ad Account Settings: Update the account name to match your ad account name.
4. Customize the Report Prompt (the workflow includes a pre-configured template report prompt): Define the specific metrics and KPIs to track, and set analysis parameters and report format preferences.
5. Set up the AI Agent Configuration: Configure the Anthropic Claude model with your API credentials and connect the GoMarble MCP tools for Meta advertising data.
6. Configure Google Services Integration: Set up the Google Slides OAuth2 API for presentation creation, configure the Google Drive OAuth2 API for file management, and link Gmail OAuth2 for automated email delivery.
7. Customize Email Delivery: Set recipient email addresses for stakeholders and customize the email subject line and message content.

**Advanced Configuration**
- Modify the report prompt to include specific metrics and KPIs
- Adjust the slide content structure (5-slide format: Executive Snapshot, Channel KPIs, Top Campaigns, Under-performers, Action Recommendations)

**What You'll Get**
- Automated Presentation Creation: Weekly Google Slides decks generated without manual intervention
- Professional Ads Analysis: Executive-ready performance summaries with key metrics and insights
- Structured Intelligence: Consistent 5-slide format covering spend, ROAS, campaign performance, and strategic recommendations
- Direct Stakeholder Delivery: Presentations automatically emailed as attachments to specified recipients
- Data-Driven Insights: AI-powered analysis of campaign performance with actionable next steps
- Scalable Reporting: Easy to modify timing, recipients, or content structure as business needs evolve

Perfect for marketing teams, agencies, and business owners who need regular Meta advertising performance updates delivered professionally without manual report creation.
by Typhoon Team
This n8n template demonstrates how to use Typhoon OCR + LLM to digitize business cards, enrich the extracted details, and save them directly into Google Sheets or any CRM. It works with both Thai and English business cards and even includes an optional step to draft greeting emails automatically.

Use cases: Automatically capture leads at events, enrich contact details before saving them into your CRM, or simply keep a structured database of your professional network.

**Good to know**
- Two versions of the workflow are provided:
  - 🟢 Without Search API → cost-free option using only Typhoon OCR + LLM
  - 🔵 With Search API → adds Google Search enrichment for richer profiles (may incur API costs via SerpAPI)
- The Send Email step is optional — include it if you want to follow up instantly, or disable it if not needed.
- Typhoon provides a free API for anyone to sign up and use → opentyphoon.ai

**How it works**
1. A form submission triggers the workflow with a business card image (JPG/PNG).
2. Typhoon OCR extracts text from the card (supports Thai & English).
3. Typhoon LLM parses the extracted text into structured JSON fields (e.g., name, job title, organization, email).
4. Depending on your chosen path:
   - Version 1: Typhoon LLM enriches the record with job type, level, and sector.
   - Version 2: The workflow calls the Search API (via SerpAPI) to add a profile/company summary.
5. The cleaned and enriched contact is saved to Google Sheets (can be swapped with your preferred CRM or database).
6. (Optional) Typhoon LLM drafts a short, friendly greeting email, which can be sent automatically via Gmail.

**How to use**
- The included form trigger is just one example. You can replace it with:
  - A webhook for uploads
  - A file drop in cloud storage
  - Or even a manual trigger for testing
- You can easily change the destination from Google Sheets to HubSpot, Notion, Airtable, or Salesforce.
- The enrichment prompt is customizable — adjust it to classify contacts based on your organization's needs.

**Requirements**
- Typhoon API key
- Google Sheets API credentials + a prepared spreadsheet
- (Optional) Gmail API credentials for sending emails
- (Optional) SerpAPI key for the Search API enrichment path

**Customising this workflow**
This AI-powered business card reader can be adapted to many scenarios:
- Event lead capture: Collect cards at conferences and sync them to your CRM automatically.
- Sales enablement: Draft instant greeting emails for new contacts.
- Networking: Keep a clean and enriched database of your professional connections.
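As an illustration of step 3 in "How it works", the structured contact record that the parsing step aims to produce might look something like this. The field names and values are illustrative, not the template's exact schema; adjust them to match your spreadsheet or CRM columns:

```javascript
// Illustrative example of a parsed business-card record. The actual field names
// produced by the Typhoon LLM prompt in this template may differ.
const exampleContact = {
  name: 'Somchai Prasert',
  job_title: 'Head of Partnerships',
  organization: 'Example Co., Ltd.',
  email: 'somchai@example.co.th',
  phone: '+66 2 123 4567',
  website: 'https://example.co.th',
  // Enrichment fields added by Version 1 (LLM classification):
  job_type: 'Business Development',
  level: 'Senior',
  sector: 'Technology',
};
```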