by Anirudh Aeran
This workflow is a complete, AI-powered content engine designed to help automation experts build their personal brand on LinkedIn. It transforms a technical n8n workflow (in JSON format) into a polished, engaging LinkedIn post, complete with a custom-generated AI image and a strategic call-to-action. This system acts as your personal content co-pilot, handling the creative heavy lifting so you can focus on building, not just writing.

Who's it for?
This template is for n8n developers, automation consultants, and tech content creators who want to consistently showcase their work on LinkedIn but lack the time or desire to write marketing copy and design visuals from scratch. If you want to turn your projects into high-quality content with minimal effort, this is your solution.

How it works
This workflow is divided into two main parts that work together through Telegram:

Content Generation & Image Creation: You send an n8n workflow's JSON file to your first Telegram bot. The workflow sends the JSON to Google Gemini with a sophisticated prompt, instructing it to analyze the workflow and write a compelling LinkedIn post in one of two high-engagement styles ("Builder" or "Strategist"). Gemini also generates a detailed prompt for an AI image model, including a specific headline to be embedded in the visual. This image prompt is then sent to the Cloudflare Workers AI model to generate a unique, high-quality image for your post. The final image and the AI-generated text prompt are sent back to you via Telegram for review.

Posting to LinkedIn: You use a second Telegram bot for publishing. Simply reply to the image you received from the first bot with the final, polished post text. The workflow triggers on your reply, grabs the image and the text, and automatically publishes it as a new post on your LinkedIn profile.

Why Two Different Workflows?
The first workflow sends you the image and the post content. You can edit the content or the image, then send the image to BOT-2. Next, copy the post content and send it to BOT-2 as a reply to the image. Both the image and the content will then be published on LinkedIn as a single post.

How to set up
1. Create Two Telegram Bots: You need two separate bots. Use BotFather on Telegram to create them and get their API tokens. Bot 1 (Generator): for submitting JSON and receiving the generated content/image. Bot 2 (Publisher): for replying to the image to post on LinkedIn (after human verification).
2. Set Up Accounts & Credentials: Add credentials for Google Gemini, Cloudflare (with an API token), Google Sheets, and LinkedIn. For Cloudflare, you will also need your Account ID.
3. Google Sheet for Tracking: Create a Google Sheet with the columns Keyword, Image Prompt, and Style Used to keep a log of your generated content.
4. Configure Nodes: In all Telegram nodes, select the correct credential for each bot. In the Google Gemini node, ensure your API credential is selected. In the Cloudflare nodes ("Get accounts" and "Get Flux Schnell image"), select your Cloudflare credential and replace the placeholder with your Account ID in the URL. In the LinkedIn node, select your credential and choose the author (your profile). In the Google Sheets node, enter your Sheet ID.
5. Activate: Activate both Telegram Triggers in the workflow.

Requirements
- An n8n instance
- Credentials for Google Gemini, Cloudflare, LinkedIn, and Google Sheets
- Two Telegram bots with their API tokens
- A Cloudflare Account ID
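For reference, a minimal sketch of the HTTP request the "Get Flux Schnell image" node sends to Cloudflare Workers AI. The model slug and payload shape follow Cloudflare's public Workers AI REST docs, but treat them as assumptions and verify against your account:

```javascript
// Assumed Workers AI model slug for Flux Schnell image generation.
const FLUX_MODEL = "@cf/black-forest-labs/flux-1-schnell";

function buildFluxRequest(accountId, apiToken, imagePrompt) {
  // The node issues a POST to /accounts/{account_id}/ai/run/{model},
  // which is where the Account ID placeholder must be replaced.
  return {
    method: "POST",
    url: `https://api.cloudflare.com/client/v4/accounts/${accountId}/ai/run/${FLUX_MODEL}`,
    headers: {
      Authorization: `Bearer ${apiToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ prompt: imagePrompt }),
  };
}
```

This makes it clear why the setup step asks for both an API token (the Authorization header) and an Account ID (part of the URL path).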
by Rahul Joshi
Description
Automate your content repurposing workflow by transforming long-form articles, blogs, and newsletters into short, high-signal, AI-ready social media snippets. This workflow fetches pending content from Airtable, generates 30-word snippets, data points, and quote-style insights using GPT-4o-mini, and updates the original record with all generated fields. If Facebook is selected as a target platform, the workflow automatically posts the best snippet via the Meta Graph API and logs the result. Perfect for content, marketing, and social media teams scaling daily publishing without manual rewriting.

What This Template Does
1. Fetches "pending" long-form content from Airtable.
2. Processes all records in batches to avoid rate limits.
3. Sends full content + metadata to GPT-4o-mini to generate structured snippets.
4. Ensures valid JSON output via the structured parser.
5. Updates Airtable with: 30-word snippets, data points, quote insights, a recommended primary snippet, and timestamps & status.
6. Checks if Facebook is selected as a posting platform.
7. Automatically publishes the recommended snippet using the Meta Graph API.
8. Updates Airtable again with post status + response.
9. Sends a success notification to Slack with full details.

Key Benefits
- Automates creation of platform-ready social media snippets
- Produces AI-friendly, high-signal content that works for LLM discovery
- Eliminates manual rewriting for LinkedIn, Facebook, Twitter, Instagram
- Automatically posts to Meta if selected - hands-free publishing
- Maintains clean, structured content in Airtable for future reuse
- Saves time for marketing, growth, and social teams

Features
- Airtable integration for content fetch + update
- GPT-4o-mini AI snippet generation
- Structured JSON parser for clean, reliable AI output
- Auto-detection of selected social platforms
- Facebook Graph API publishing
- Slack notifications for success
- Scheduled automation for hands-free daily processing
- Full audit trail with timestamps

Requirements
- Airtable Personal Access Token
- OpenAI API key (GPT-4o-mini)
- Facebook Graph API credentials (for auto-posting)
- Slack API credentials
- n8n with LangChain nodes enabled

Target Audience
- Content marketing teams repurposing long-form content
- Social media managers publishing daily posts
- Growth teams optimizing content for AI search engines
- Agencies producing content at scale for multiple clients
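To illustrate the "ensure valid JSON output" step, here is a minimal sketch of the kind of cleanup a Code node could apply to the model's response: strip accidental markdown fences, parse the JSON, and trim any snippet past the 30-word target. The `snippets` field name is illustrative, not the template's exact schema:

```javascript
function cleanSnippets(raw) {
  // Models sometimes wrap JSON in ``` fences; remove them before parsing.
  const text = raw.replace(/^```(?:json)?\s*/i, "").replace(/```\s*$/, "");
  const data = JSON.parse(text);
  // Hypothetical field: enforce the 30-word snippet limit word-by-word.
  data.snippets = (data.snippets || []).map((s) =>
    s.split(/\s+/).slice(0, 30).join(" ")
  );
  return data;
}
```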
by Artem Boiko
How it works
This template helps project managers collect task updates and photo reports from field workers via Telegram and stores everything in a Google Sheet. It enables daily project reporting without paper or back-office overhead.

High-level flow:
- Workers receive daily tasks via Telegram
- They respond with photo reports
- The bot auto-saves replies (photos + status) to a Google Sheet
- The system tracks task completion, adds timestamps, and maintains report history

Set up steps
Estimated setup time: 15-30 min. You'll need:
- A Telegram bot (via BotFather)
- A connected Google Sheet (with specific column headers)
- A set of preconfigured tasks
Detailed setup instructions and the required table structure are documented in sticky notes inside the workflow.

Consulting and Training
We work with leading construction, engineering, and consulting agencies and technology firms around the world to help them implement open data principles, automate CAD/BIM processing, and build robust ETL pipelines. If you would like to test this solution with your own data, or are interested in adapting the workflow to real project tasks, feel free to contact us.

Docs & Issues: Full Readme on GitHub
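The "auto-save replies" step boils down to mapping a Telegram photo message onto a sheet row with a timestamp. A rough sketch, assuming the standard Telegram Bot API message shape; the column names are hypothetical and should match the headers described in the workflow's sticky notes:

```javascript
function toSheetRow(msg, taskId) {
  const photos = msg.photo || [];
  // Telegram lists photo sizes smallest to largest; keep the largest.
  const best = photos[photos.length - 1];
  return {
    task_id: taskId,                                   // hypothetical column
    worker: msg.from.username || msg.from.first_name,
    status: msg.caption || "done",                     // caption doubles as status
    photo_file_id: best ? best.file_id : null,
    reported_at: new Date(msg.date * 1000).toISOString(), // Telegram dates are Unix seconds
  };
}
```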
by AppStoneLab Technologies LLP
Reddit Thread → AI-Powered X & LinkedIn Posts with Human Approval Gate

Turn any Reddit thread into polished, platform-optimized social media posts for X (Twitter) and LinkedIn - in minutes, not hours. This workflow reads a Reddit thread, extracts the full discussion (including nested comment threads sorted by score), feeds everything to Google Gemini for summarization and post generation, then pauses for your review before publishing anything live. No accidental posts. No context lost. Just high-quality content, on your terms.

Key Features
- **Any Reddit URL supported** - Standard, mobile (m.reddit.com), and short redd.it links all work
- **Full thread extraction** - Recursively pulls all comments and replies, sorted by score at every depth level
- **Two-stage AI pipeline** - Gemini first summarizes the thread, then generates platform-specific posts from that summary
- **X-optimized post** - Max 280 characters, punchy, curiosity-driven with relevant hashtags; auto-truncated if over the limit
- **LinkedIn-optimized post** - 150-300 words, professional tone, structured paragraphs, engagement question, and hashtags
- **Human-in-the-loop approval** - A review form shows both posts before anything is published; supports manual overrides per platform
- **Graceful rejection path** - If rejected, the workflow terminates cleanly with no content published

What This Workflow Does
This workflow solves a real content creation bottleneck: Reddit threads are goldmines of community insight, niche expertise, and trending discussions - but turning that raw discussion into polished, platform-appropriate social posts takes significant manual effort. This automation handles the entire pipeline: from a raw URL to live posts on two platforms, with you staying in full control via an approval gate.
It's ideal for content marketers, community managers, indie hackers, developers, and newsletter writers who want to repurpose Reddit content without losing quality or spending hours manually summarizing threads.

How It Works (Step-by-Step)
1. Submit a Reddit URL - A web form (Form Trigger) accepts any Reddit thread URL as input.
2. Parse the URL - A Code node validates and deconstructs the URL using regex, extracting the subreddit and post ID to build the Reddit JSON API endpoint.
3. Fetch the thread - An HTTP Request node calls Reddit's public JSON API (reddit.com/r/.../comments/id.json) with limit=100 and depth=3 to retrieve the full thread.
4. Extract & structure content - A Code node recursively traverses the entire comment tree, sorts comments by score at every depth level, and builds a clean flat text representation of the full thread - including post metadata (title, score, upvote ratio, flair, awards) - ready for AI injection.
5. Summarize with Gemini - The assembled thread content is passed to Google Gemini (3.1 Flash Lite), which returns a comprehensive markdown summary covering: Thread Overview, Key Topics, Notable Insights, Community Sentiment, and Actionable Takeaways.
6. Generate social posts - A second Gemini call uses the summary to craft a platform-optimized X post (≤280 chars) and a LinkedIn post (150-300 words), returning strict JSON output.
7. Parse & validate - A Code node safely extracts the JSON, strips any markdown fences, falls back to regex parsing if needed, and enforces the 280-character hard limit on the X post.
8. Human approval form - The workflow pauses and presents both posts in a review form. You can approve as-is, paste a manual override for either platform, or reject the entire run.
9. Resolve final content - A Code node merges your overrides (if any) with the AI versions; overrides always win, and the AI version is the fallback.
Route by decision - An IF node checks your approval decision: Approve & Publish → posts simultaneously to X and LinkedIn; Reject → the workflow ends cleanly and nothing is published.

How to Use This Workflow

Step 1 - Set up credentials
Click Use template, then configure the following credentials in n8n:

| Service | Credential Type | How to Get It |
|---|---|---|
| Google Gemini | Google PaLM API | Get API Key → Google AI Studio |
| X (Twitter) | Twitter OAuth2 | X Developer Portal → Create App → OAuth2 |
| LinkedIn | LinkedIn OAuth2 | LinkedIn Developer Portal → Create App |

> Note on Reddit: No API key required. This workflow uses Reddit's public JSON API (append .json to any thread URL), which is freely accessible without authentication.

Step 2 - Configure the LinkedIn node
Open the Post to LinkedIn node and replace the person field value (=ID) with your LinkedIn Person URN. You can retrieve it by calling the LinkedIn API: GET https://api.linkedin.com/v2/userinfo after authenticating.

Step 3 - Activate the workflow
Toggle the workflow to Active in your n8n instance. This enables the Form Trigger and the Wait node's webhook to function correctly.

Step 4 - Run it
1. Open the Form Trigger URL (found in the Reddit URL Input node)
2. Paste any Reddit thread URL
3. Wait for the approval form to arrive (check the execution log for the form URL)
4. Review, optionally edit, and approve or reject
5. Done! Your posts are live.

How to Customize
- **Swap the AI model** - Both Gemini nodes use gemini-3.1-flash-lite-preview. You can switch to gemini-3.1-pro-preview or claude-sonnet-4-6 for higher-quality output by updating the modelId in both Gemini nodes or by adding Anthropic nodes.
- **Change the post format** - Edit the prompt in the **Generate Social Posts** node to adjust tone, length, hashtag count, or add support for other platforms (Instagram, Threads, Facebook).
- **Add more platforms** - After the **Approved?** node's true branch, connect additional posting nodes (e.g., Facebook Graph API, Buffer, Telegram) in parallel.
- **Log to Google Sheets** - Add a Google Sheets node after the publish nodes to track published posts, Reddit thread URLs, dates, and engagement metrics.
- **Make it scheduled** - Replace the Form Trigger with a Schedule Trigger + a list of pre-configured Reddit URLs in Google Sheets for fully automated daily publishing.

Important Notes
- The Reddit public JSON API does not require authentication but is rate-limited. For high-volume use, consider adding a Reddit OAuth2 credential.
- The Wait node requires your n8n instance to be publicly accessible (or use n8n Cloud) so the approval form's webhook URL can be reached by your browser.
- LinkedIn's API requires your app to have the w_member_social permission scope to post on behalf of a user.
- X (Twitter) API v2 requires an approved developer account. The free tier allows limited monthly tweets.
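The URL-parsing step described above can be sketched in a few lines: normalize standard, mobile, and redd.it links into Reddit's public JSON endpoint. The exact regexes in the template's Code node may differ; this only illustrates the technique:

```javascript
function toJsonEndpoint(url) {
  // Standard and mobile links carry both subreddit and post ID.
  let m = url.match(/reddit\.com\/r\/([^/]+)\/comments\/([a-z0-9]+)/i);
  if (m) {
    return `https://www.reddit.com/r/${m[1]}/comments/${m[2]}.json?limit=100&depth=3`;
  }
  // Short links omit the subreddit; Reddit resolves the bare comments path.
  m = url.match(/redd\.it\/([a-z0-9]+)/i);
  if (m) {
    return `https://www.reddit.com/comments/${m[1]}.json?limit=100&depth=3`;
  }
  return null; // not a recognizable Reddit thread URL
}
```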
by Joseph
Overview
This n8n workflow creates an intelligent AI agent that automates browser interactions through Airtop's browser automation platform. The agent can control real browser sessions, navigate websites, interact with web elements, and maintain detailed session records - all while providing live viewing capabilities for real-time monitoring.

YouTube tutorial: https://www.youtube.com/watch?v=XoZqFY7QFps

What This Workflow Does
The AI agent acts as your virtual assistant in the browser, capable of:
- **Session Management**: Creates, monitors, and terminates browser sessions with proper tracking
- **Web Navigation**: Visits websites, clicks elements, fills forms, and performs complex interactions
- **Multi-Window Support**: Manages multiple browser windows within sessions
- **Live Monitoring**: Provides real-time viewing URLs so you can watch the automation
- **Data Tracking**: Maintains comprehensive records of all browser activities
- **Profile Integration**: Uses Airtop profiles for authenticated sessions
- **Email Notifications**: Sends live URLs and status updates via Gmail

Demo Use Case: Automated Reddit Posting
The tutorial demonstrates the agent's capabilities by:
- Logging into Reddit using pre-configured Airtop profile credentials
- Navigating to a specific subreddit based on user input
- Creating and publishing a new post with title and content
- Tracking the entire process with detailed session records
- Providing live viewing access throughout the automation

Core Workflow Components

1. Chat Interface Trigger
- **Node Type**: Chat Trigger
- **Purpose**: Accepts user commands for browser automation tasks
- **Input**: Natural language instructions (e.g., "Create a Reddit post in r/automation")

2. AI Agent Processing
- **Node Type**: OpenAI GPT-4
- **Purpose**: Interprets user requests and determines appropriate browser actions
- **System Message**: Contains the comprehensive agent instructions from your documentation
- **Capabilities**: Understands complex web interaction requests, plans multi-step browser workflows, manages session states intelligently, and handles error scenarios gracefully

3. Google Sheets Data Management
Multiple Google Sheets nodes manage different aspects of session tracking:
- Browser Sessions sheet - **Fields**: session_id, description, status, created_date. **Purpose**: Tracks active browser sessions. **Operations**: Create, read, update session records.
- Window Sessions sheet - **Fields**: session_id, window_id, description, airtop_live_view_url, status. **Purpose**: Tracks individual browser windows within sessions. **Operations**: Create, read, update window records.
- Airtop Profiles sheet - **Fields**: platform_name, platform_url, profile_name. **Purpose**: Stores available authenticated profiles. **Operations**: Read available profiles for session creation.

4. Airtop Browser Automation Nodes
Multiple specialized nodes for browser control:
- Session management: **create_session** (creates new browser sessions with optional profile authentication), **terminate_session** (closes browser sessions and updates records), **read_airtop_profiles** (retrieves available authentication profiles)
- Window management: **create_window** (opens new browser windows with specified URLs), **query_page** (analyzes page content and identifies interactive elements)
- Web interaction: **click_element** (clicks specific page elements based on AI descriptions), **type_text** (inputs text into form fields and input elements)

5. Gmail Integration
- **Node Type**: Gmail Send
- **Purpose**: Sends live viewing URLs and status updates
- **Recipients**: User email for real-time monitoring
- **Content**: Complete Airtop live view URLs for browser session observation

6. Error Handling & Validation
- **Input Validation**: Ensures required parameters are present
- **Session State Checks**: Verifies browser session status before operations
- **Error Recovery**: Handles failed operations gracefully
- **Data Consistency**: Maintains accurate session records even during failures

Technical Requirements

API credentials needed:
- Airtop.ai API key - sign up at airtop.ai and generate an API key from the dashboard; required for all browser automation functions
- OpenAI API key - an OpenAI account with GPT-4 access; required for AI agent intelligence and decision-making
- Google Sheets access - a Google account with Google Sheets API access; copy the provided template and get your sheet URL; required for session and profile data management
- Gmail OAuth - a Google account with Gmail API access; required for sending live viewing URLs and notifications

Airtable Base Structure
Create three tables in your Airtable base:
1. Browser Details (Sessions): session_id (single line text), description (single line text), status (single select: Open, Closed), created_date (date)
2. Window Details (Windows): session_id (single line text), window_id (single line text), description (single line text), airtop_live_view_url (URL), status (single select: Open, Closed)
3. Airtop Profiles: platform_name (single line text), platform_url (URL), profile_name (single line text)

Workflow Logic Flow

User request processing:
1. User input: natural language command via the chat interface
2. AI analysis: OpenAI processes the request and determines required actions
3. Session check: the agent reads the current browser session status
4. Action planning: the AI creates a step-by-step execution plan

Browser session lifecycle:
1. Session creation: check for existing open sessions, ask the user about profile usage if needed, create a new Airtop session, and record session details in Airtable
2. Window management: create a browser window with the target URL, capture the live viewing URL, record window details in Airtable, and send the live URL via Gmail
3. Web interactions: query page content for element identification, execute clicks, form fills, and navigation, monitor page state changes, and handle dynamic content loading
4. Session cleanup: terminate the browser session when complete, update all related records to "Closed" status, and send a completion notification

Data flow architecture: User Input → AI Processing → Session Management → Browser Actions → Data Recording → User Notifications

Key Features & Benefits

Intelligent automation:
- **Natural Language Control**: Users can describe tasks in plain English
- **Context Awareness**: AI understands complex multi-step workflows
- **Adaptive Responses**: Handles unexpected page changes and errors
- **Profile Integration**: Seamlessly uses stored authentication credentials

Real-time monitoring:
- **Live View URLs**: Watch browser automation as it happens
- **Status Updates**: Real-time notifications of task progress
- **Session Tracking**: Complete audit trail of all browser activities
- **Multi-Window Support**: Handle complex workflows across multiple tabs

Enterprise-ready features:
- **Error Recovery**: Robust handling of network issues and page failures
- **Session Persistence**: Maintains state across workflow interruptions
- **Data Integrity**: Consistent record-keeping even during failures
- **Scalable Architecture**: Can handle multiple concurrent automation tasks

Use Cases Beyond Reddit
This workflow architecture supports automation for any website:

Social media management:
- **Multi-platform posting**: Facebook, Twitter, LinkedIn, Instagram
- **Community engagement**: Responding to comments and messages
- **Content scheduling**: Publishing posts at optimal times
- **Analytics gathering**: Collecting engagement metrics

Business process automation:
- **CRM data entry**: Updating customer records across platforms
- **Support ticket management**: Creating, updating, and routing tickets
- **E-commerce operations**: Product listings, inventory updates
- **Report generation**: Gathering data from multiple web sources

Personal productivity:
- **Travel booking**: Comparing prices, making reservations
- **Bill management**: Paying utilities, checking statements
- **Job applications**: Submitting applications, tracking status
- **Research tasks**: Gathering information from multiple sources

Advanced Configuration Options

Custom profiles:
- Create Airtop profiles for different websites
- Store authentication credentials securely
- Switch between different user accounts
- Handle multi-factor authentication flows

Workflow customization:
- Modify AI system prompts for specific use cases
- Add custom validation rules
- Implement retry logic for failed operations
- Create domain-specific interaction patterns

Integration extensions:
- Connect to additional data sources
- Add webhook notifications
- Implement approval workflows
- Create audit logs and reporting

Getting Started
1. Copy the Google Sheets template - just click and make a copy!
2. Set up credentials for Airtop, OpenAI, and Gmail
3. Import the workflow into your n8n instance
4. Configure node credentials with your API keys and Google Sheets URL
5. Test with simple commands like "Visit google.com"
6. Expand to complex workflows as you become comfortable

Best Practices

Session management:
- Always check for existing sessions before creating new ones
- Properly terminate sessions to avoid resource waste
- Use descriptive names for sessions and windows
- Regularly clean up old session records

Error handling:
- Implement timeout handling for slow-loading pages
- Add retry logic for network failures
- Validate element existence before interactions
- Log detailed error information for debugging

Security considerations:
- Store sensitive credentials in Airtop profiles, not the workflow
- Use webhook authentication for production deployments
- Implement rate limiting to avoid being blocked by websites
- Regularly audit browser session activities

This workflow transforms n8n into a powerful browser automation platform, enabling you to automate virtually any web-based task while maintaining full visibility and control over the automation process.
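The "check for existing sessions before creating new ones" best practice is a simple guard over the Browser Sessions tracking rows. A minimal sketch, using the session columns described above:

```javascript
// Decide whether to reuse an open session or create a new one,
// based on rows read from the Browser Sessions tracking sheet.
function pickSession(rows) {
  const open = rows.filter((r) => r.status === "Open");
  if (open.length > 0) {
    return { action: "reuse", session_id: open[0].session_id };
  }
  return { action: "create", session_id: null };
}
```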
by Richard Nijsten
Create sprint goals using Pega Agile Studio with AI

Based on the Google Sheet data, the AI retrieves the user story IDs, fetches each user story's data and corresponding attachments, and creates sprint goals according to the defined system prompt.

Who's it for
Product owners and Scrum masters who use Pega Agile Studio.

How it works
It retrieves the data for each user story using the Agile Studio API, and uses the Google API to gather the attachments in the corresponding Docs, Sheets, and Slides. The AI then processes this input according to the system prompt and generates an email.

How to set up
1. Create a Google Sheet and add the user stories you want in a column named "Userstory".
2. Set the Google Sheet ID in the "Retrieve_Data_From_Sheet" node.
3. Set up the other credentials.
4. Publish the subworkflow for attachment handling.
5. Execute the workflow.

Requirements
- Access to the Pega Agile Studio OAuth2 API
- An AI API
- Access to Google Cloud for the Google APIs
by Rahul Joshi
Description
Automate daily founder intelligence from Hacker News without manual monitoring. This workflow scans Hacker News discussions (Show HN, launches, AI, startups, SaaS), filters out noise and non-discussion pages, and extracts only high-signal threads. AI then converts these discussions into a concise, founder-ready daily digest highlighting key trends, why they matter, and practical actions. The digest is delivered via email, while structured insights are logged to Google Sheets for long-term tracking and analysis.

Deployment Disclaimer
This template is designed for self-hosted n8n installations only. It relies on external MCP tools and custom AI orchestration that are not supported on n8n Cloud.

What This Template Does
1. Runs automatically on a daily schedule
2. Searches Hacker News discussions via Google using SerpAPI
3. Extracts titles, summaries, links, and metadata from results
4. Filters out guidelines, index pages, and non-discussion links
5. Aggregates valid discussion threads into a single dataset
6. Uses AI to identify key trends, problems, and founder-relevant signals
7. Generates a concise daily founder digest (trend, why it matters, actions)
8. Sends the digest automatically via email
9. Cleans and normalizes insights for storage
10. Appends structured founder intelligence to Google Sheets for tracking

Key Benefits
- Eliminates manual Hacker News scanning
- Surfaces only high-signal, founder-relevant discussions
- Converts raw discussions into clear, actionable insights
- Delivers a daily, skimmable founder digest automatically
- Builds a historical intelligence log in Google Sheets
- Creates a repeatable founder research workflow

Features
- Daily scheduled execution
- Hacker News discovery via Google Search (SerpAPI)
- Noise filtering with custom JavaScript logic
- AI-powered trend and insight extraction
- Founder-focused digest generation
- Email delivery via Gmail
- Insight archiving in Google Sheets

Requirements
- SerpAPI account
- Azure OpenAI credentials
- Gmail account connected to n8n
- Google Sheets account
- Self-hosted n8n instance

Target Audience
- Startup founders tracking early signals
- Product and growth leaders monitoring trends
- VCs and analysts scouting emerging tools
- Teams needing automated market and founder intelligence
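The noise-filtering step can be sketched as a small JavaScript predicate: keep only links that look like real Hacker News discussion threads (item?id=...) and drop guideline and index pages. The template's actual filter logic may be broader; the blocklist below is illustrative:

```javascript
function isDiscussion(link) {
  // Real HN threads live at news.ycombinator.com/item?id=<number>.
  if (!/news\.ycombinator\.com\/item\?id=\d+/.test(link)) return false;
  // Drop known non-discussion pages (assumed blocklist).
  return !/newsguidelines|newsfaq|front|newest/.test(link);
}

function filterResults(results) {
  return results.filter((r) => isDiscussion(r.link));
}
```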
by oka hironobu
Who is this for
E-commerce store owners and sales managers who want AI-powered insights from their Shopify data without manually crunching numbers every week.

What it does
This workflow automatically analyzes your Shopify sales performance every Monday morning and delivers intelligent insights via Slack and email. It pulls the last 7 days of orders, calculates key metrics (revenue, order count, average order value), and sends the data to Gemini AI for trend analysis and actionable recommendations. The AI identifies patterns in your sales data and provides specific suggestions for improving performance. All metrics are logged to Google Sheets for historical tracking, and the team receives instant alerts when revenue drops more than 20% compared to the previous week.

How to set up
1. Connect your Shopify store with order read permissions
2. Add your Gemini API key for AI analysis
3. Set up Slack integration and choose your target channels (#sales and #alerts)
4. Configure Gmail credentials for stakeholder email reports
5. Create a Google Sheets document with a "Weekly Metrics" tab for data tracking
6. Update the email recipients and spreadsheet ID in the respective nodes

Requirements
- Shopify store with API access
- Google Gemini API key
- Slack workspace
- Gmail account
- Google Sheets access

How to customize
Adjust the revenue drop threshold in the IF node (default: 20%), modify the schedule frequency, customize Gemini prompts for different analysis types, or add additional Slack channels for department-specific reports.
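The metric calculation and the 20% drop check feeding the IF node are plain arithmetic. A minimal sketch, assuming Shopify order objects with a `total_price` string field:

```javascript
function weeklyMetrics(orders, prevRevenue) {
  // Shopify returns total_price as a string, e.g. "10.00".
  const revenue = orders.reduce((sum, o) => sum + parseFloat(o.total_price), 0);
  const orderCount = orders.length;
  const avgOrderValue = orderCount ? revenue / orderCount : 0;
  // Percentage drop vs. the previous week; positive means revenue fell.
  const dropPct = prevRevenue ? ((prevRevenue - revenue) / prevRevenue) * 100 : 0;
  return { revenue, orderCount, avgOrderValue, alert: dropPct > 20 };
}
```

Changing the `> 20` comparison here corresponds to adjusting the threshold in the IF node.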
by Shashwat Singh
This workflow serves as a complete "AI receptionist" for mortgage brokers and other high-ticket service providers. It automates the messy process of qualifying leads, getting internal approval, and collecting sensitive client documents securely. The workflow is fully annotated with sticky notes to help you understand every step of the logic.

Key Features
- **Smart Qualification:** Automatically filters leads based on income thresholds (e.g., >$80k) before they reach your inbox.
- **Human-in-the-Loop:** Uses Telegram buttons to ask the broker for "Approve/Decline" confirmation before generating client emails.
- **Generative AI:** Uses **Google Gemini (2.5-flash-lite)** to write personalized, warm welcome emails based on the lead's specific loan type.
- **Google Workspace Automation:** Automatically creates a unique Google Drive folder for the client and drafts the email in Gmail (with the upload link pre-inserted).
- **Document Collection Portal:** Includes a validation loop that checks whether the client uploaded all required files (ID, payslip, bank statements) and notifies the broker of success or failure.
- **Database Logging:** Keeps a "source of truth" log in Supabase for every lead action.

Prerequisites
To use this template, you will need credentials for:
- Google Cloud Console: enable APIs for Drive, Gmail, and Gemini (PaLM)
- Telegram: a bot token and chat ID for notifications
- Supabase: a project URL and API key (the database schema is provided in the sticky notes)

How to Configure
1. Credentials: Set up your Google, Telegram, and Supabase credentials in n8n.
2. Supabase table: Create a table named leads_consolidated (schema details are in the workflow notes).
3. Triggers: Ensure your n8n instance is publicly accessible so the Webhook and Form triggers function correctly.
4. Telegram ID: Update the Chat ID fields in the Telegram nodes to your own ID.

This template is designed to be modular - you can easily swap Supabase for Airtable, or Gemini for OpenAI, if preferred.
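Two of the checks above are easy to picture in code: the income gate (>$80k) and the upload-completeness validation. A rough sketch; the required document keys and field names are assumptions, not the template's exact schema:

```javascript
// Hypothetical required-document keys matched against uploaded file names.
const REQUIRED_DOCS = ["id", "payslip", "bank_statement"];

function qualifies(lead) {
  // Income gate: strictly greater than the $80k threshold.
  return Number(lead.income) > 80000;
}

function missingDocs(uploadedNames) {
  const lower = uploadedNames.map((n) => n.toLowerCase());
  // A doc counts as present if any uploaded file name contains its key.
  return REQUIRED_DOCS.filter(
    (doc) => !lower.some((name) => name.includes(doc))
  );
}
```

An empty result from `missingDocs` would route the validation loop to the "success" notification; a non-empty one lists what to chase.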
by Kevin Meneses
What this workflow does
This workflow automates the discovery, evaluation, and notification of job opportunities based on a candidate's professional profile. It fetches remote job listings, compares each role against a structured candidate profile stored in Google Sheets, and uses AI to evaluate real alignment in terms of skills, seniority, salary, industry, and role complexity. Only the most relevant opportunities are kept, stored in Google Sheets, and delivered via email as a Top 5 shortlist.

Decodo - Web Scraper for n8n

How to configure (quick setup)
1. Define the candidate profile in Google Sheets (skills, salary expectations, preferences).
2. Configure credentials (Google Sheets, Gmail, Decodo, AI provider).
3. Set the matching threshold (e.g. skill match ≥ 90%).
4. Run the workflow manually or on a schedule.

Output
- Structured job match results in Google Sheets
- An automated email with the Top 5 best-matched job opportunities
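To make the ≥ 90% threshold concrete, here is a simple set-overlap variant of a skill-match score. The template's actual matching is AI-driven; this is only an illustration of how a threshold check like that could work:

```javascript
// Percentage of required skills the candidate covers (case-insensitive).
function skillMatch(required, candidate) {
  const have = new Set(candidate.map((s) => s.toLowerCase()));
  const hits = required.filter((s) => have.has(s.toLowerCase())).length;
  return required.length ? (hits / required.length) * 100 : 0;
}

function passes(required, candidate, threshold = 90) {
  return skillMatch(required, candidate) >= threshold;
}
```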
by Madame AI
# Scrape physician profiles from BrowserAct to Google Sheets

This workflow automates the process of building a targeted database of healthcare providers by scraping physician details for a specific location and syncing them to your records. It leverages BrowserAct to extract data from healthcare directories and keeps your database clean by preventing duplicate entries.

## Target Audience

Medical recruiters, pharmaceutical sales representatives, lead generation specialists, and healthcare data analysts.

## How it works

1. **Define Location:** The workflow starts by setting the target Location and State in a Set node.
2. **Scrape Data:** A BrowserAct node executes a task (using the "Physician Profile Enricher" template) to search a healthcare directory (e.g., Healow) for doctors matching the criteria.
3. **Parse JSON:** A Code node takes the raw string output from the scraper and parses it into individual JSON objects.
4. **Update Database:** A Google Sheets node appends new records or updates existing ones keyed on the physician's name, preventing duplicates.
5. **Notify Team:** A Slack node sends a message to a specific channel to confirm the batch job has finished successfully.

## How to set up

1. **Configure Credentials:** Connect your BrowserAct, Google Sheets, and Slack accounts in n8n.
2. **Prepare BrowserAct:** Ensure the Physician Profile Enricher template is saved in your BrowserAct account.
3. **Set up Google Sheet:** Create a new Google Sheet with the required headers (listed below).
4. **Select Spreadsheet:** Open the Google Sheets node and select your newly created file and sheet.
5. **Set Variables:** Open the Define Location node and enter your target Location (City) and State.
6. **Configure Notification:** Open the Slack node and select the channel where you want to receive alerts.

## Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers: Name, Specialty, Address.

## Requirements

- **BrowserAct** account with the **Physician Profile Enricher** template.
- **Google Sheets** account.
- **Slack** account.

## How to customize the workflow

- **Change the Data Source:** Modify the BrowserAct template to scrape a different directory (e.g., Zocdoc or WebMD) and update the Google Sheet columns accordingly.
- **Switch Notifications:** Replace the Slack node with a Microsoft Teams, Discord, or Email node to suit your team's communication preferences.
- **Enrich Data:** Add an AI Agent node after the Code node to format addresses or research the specific clinics listed.

## Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- Workflow Guidance and Showcase Video: Automate Medical Lead Gen: Scrape Healow to Google Sheets & Slack
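A minimal sketch of what the Parse JSON Code node might do, assuming the scraper returns a single JSON string containing an array of physician records (the sample data and field names here are illustrative; your BrowserAct output may differ):

```javascript
// Hypothetical raw output from the BrowserAct scraper: one JSON string
// containing an array of physician records.
const raw =
  '[{"Name":"Dr. Jane Roe","Specialty":"Cardiology","Address":"1 Main St"},' +
  '{"Name":"Dr. John Doe","Specialty":"Pediatrics","Address":"2 Oak Ave"}]';

// Parse the string and emit one item per physician, in the shape
// n8n's Code node expects: an array of { json: {...} } objects.
function parsePhysicians(rawString) {
  const records = JSON.parse(rawString);
  return records.map((record) => ({ json: record }));
}

const items = parsePhysicians(raw);
console.log(items.length);       // → 2
console.log(items[0].json.Name); // → Dr. Jane Roe
```

Returning one item per record lets the downstream Google Sheets node process each physician as its own row.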
by Websensepro
# Analyse YouTube Comments via AI Agent and Create Video Topics

## Description

Identifying what your audience wants can be a manual and time-consuming process. This workflow automates your content research by analyzing comments from your latest YouTube videos with an AI Agent. Every week, it fetches feedback, stores it in Google Sheets for backup, and uses OpenAI to suggest the top 5 SEO-optimized video topics based on real viewer requests. The final content strategy is delivered directly to your Gmail inbox, so you can focus on creating rather than researching.

## What it Does

1. **Weekly Trigger:** The workflow starts automatically every Friday (customizable) via the **Schedule Trigger**.
2. **Fetches Video Data:** It pulls the latest 5 videos from your specified YouTube playlist.
3. **Retrieves Comments:** Uses the **YouTube API (HTTP Request)** to fetch up to 100 comment threads per video.
4. **Data Archiving:** All retrieved comments are saved to a **Google Sheet**, giving you a permanent backup of audience feedback.
5. **AI Analysis:** A **JavaScript node** formats the comments and passes them to an **AI Agent**. Powered by **OpenAI (GPT-4o-mini)**, the agent identifies recurring themes and requests.
6. **Structured Output:** The **Output Parser** ensures the AI returns exactly 5 video titles in a clean, professional format.
7. **Email Delivery:** The **Gmail node** sends a summary of the suggested content plan directly to your inbox.

## Use Cases

- **YouTube Creators:** Automatically find out which tutorials or "How-to" videos your audience is asking for.
- **Social Media Managers:** Generate weekly reports on audience sentiment and demand for clients.
- **Educational Channels:** Identify specific pain points or questions students mention in the comments.

## Customization

- **YouTube Node:** Replace the Playlist ID with the ID of the playlist you want to monitor.
- **Google Sheets:** Select your own Spreadsheet and Sheet Name in the **Append row in sheet** node.
- **AI Agent Prompt:** Modify the prompt to change the tone of the titles (e.g., "Make them clickbaity" or "Make them educational").
- **Gmail Node:** Update the "To" email address to your own.

## Requirements

- An n8n instance (Cloud or Self-hosted).
- **Google Cloud Credentials** (with YouTube Data API v3, Google Sheets API, and Gmail API enabled).
- **OpenAI API Key** (for the AI Agent analysis).

## How to Set Up

1. **Connect Credentials:** Link your YouTube, Google Sheets, Gmail, and OpenAI accounts to their respective nodes.
2. **Configure Playlist:** In the "Get many playlist items" node, enter your YouTube Playlist ID.
3. **Select Sheet:** In the "Append row in sheet" node, pick your target Google Sheet.
4. **Update Gmail:** In the "Send a message" node, set your recipient email address.
5. **Activate:** Save the workflow and toggle it to Active.
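The AI Analysis formatting step could look roughly like this; a sketch assuming each incoming item carries a `commentThreads` response shaped like the YouTube Data API v3 (the exact input shape depends on how your HTTP Request node is configured, and the sample comments are invented):

```javascript
// Hypothetical commentThreads responses, shaped like the YouTube Data API v3.
const responses = [
  {
    items: [
      { snippet: { topLevelComment: { snippet: {
        authorDisplayName: "viewer1",
        textDisplay: "Please make a tutorial on n8n error handling!",
      } } } },
      { snippet: { topLevelComment: { snippet: {
        authorDisplayName: "viewer2",
        textDisplay: "How do you connect Google Sheets?",
      } } } },
    ],
  },
];

// Flatten every thread into one "author: comment" line so the AI Agent
// receives a single plain-text block to analyze for recurring themes.
function formatComments(apiResponses) {
  const lines = [];
  for (const response of apiResponses) {
    for (const thread of response.items) {
      const c = thread.snippet.topLevelComment.snippet;
      lines.push(`${c.authorDisplayName}: ${c.textDisplay}`);
    }
  }
  return lines.join("\n");
}

console.log(formatComments(responses));
```

Collapsing the threads into one text block keeps the prompt compact and lets the agent spot repeated requests across videos.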