by Basil Irfan
🚀 LinkedIn Lead-Gen Flywheel – Apify → GPT-4o → Google Sheets → Phantombuster

**What this workflow does**
1. Collect audience specs – a simple web form asks for your ideal company profile.
2. Generate a laser-targeted Apollo search URL with GPT-4o (no manual filtering).
3. Scrape the matching leads via an Apify actor (returns clean JSON).
4. Craft hyper-personalized icebreakers for each lead using GPT-4o (ultra-short, human-sounding).
5. Log everything to Google Sheets – name, LinkedIn URL, company site, summary, and the icebreaker (an example row is sketched at the end of this section).
6. (Optional) Auto-launch Phantombuster to fire off those connection requests at scale.

**Why it matters**
- **Zero grunt work:** audience research, scraping, copywriting, and outreach all run hands-free.
- **Punchy personalization:** micro-icebreakers outperform canned intros, boosting accept rates.
- **Scales with you:** flip a switch to go from 10 to 1,000+ connections/day.

**Node rundown**

| Step | Node | Key Inputs | Key Outputs |
|------|------|-----------|-------------|
| 1 | Form Trigger | Audience description | description_of_company |
| 2 | OpenAI (GPT-4o) | Audience text | SearchUrl |
| 3 | HTTP Request – Apify | SearchUrl, APIFY_TOKEN | Lead JSON |
| 4 | OpenAI (GPT-4o) | Lead JSON | Icebreaker |
| 5 | Google Sheets | Lead + Icebreaker | Row append/update |
| 6 | Aggregate | Sheet rows | Batched output |
| 7 | HTTP Request – Phantombuster | PHANTOM_KEY, AGENT_ID | Launch status |

**Prerequisites**
- **OpenAI API key** (GPT-4o access recommended)
- **Apify API token** with access to the actor ID
- **Google Service Account credentials** shared with your target sheet
- **Phantombuster API key** and Agent ID for your LinkedIn connector
- Active Apollo account to open the generated search URL (only required for debugging)

**Setup (5-minute sprint)**
1. Import the workflow into n8n.
2. Add the required credentials in Credentials → OpenAI, Apify, Google Sheets, Phantombuster.
3. Paste your Phantombuster Agent ID into the HTTP Request node URL.
4. Publish the Form Trigger URL—this is where you (or your SDRs) describe the target audience.
5. Hit Execute Workflow once to verify data flows end-to-end.

**Customization tips**
- **Titles & keywords:** tweak the prompt in the first GPT-4o node to lock in different roles or industries.
- **Icebreaker style:** adjust the second GPT-4o prompt to match your brand voice.
- **Data columns:** map extra fields from Apify into Google Sheets as needed.
- **Skip outreach:** disable the Phantombuster node if you only want the leads + icebreakers.
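For reference, a single logged lead could look like the sketch below. The field names mirror the columns listed in step 5, but they are illustrative; match them to whatever your Apify actor actually returns and to the headers in your sheet.

```json
{
  "name": "Jane Doe",
  "linkedin_url": "https://www.linkedin.com/in/janedoe",
  "company_site": "https://acme.example.com",
  "summary": "Head of Growth at a 50-person B2B SaaS company focused on sales automation.",
  "icebreaker": "Loved your post on cutting CAC with PLG experiments. Curious how you're scaling that at Acme."
}
```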
by Jay Hartley
**What this template does**
This workflow uses the Amadeus API every day to check for bargain flights for an itinerary and price target of your choice. It then automatically emails you once it finds a match.

**Setup**
1. Create an API account on https://developers.amadeus.com/
2. In Amadeus Flight Search, connect to the OAuth2 API:
   - Grant Type – Client Credentials
   - Access Token URL – https://test.api.amadeus.com/v1/security/oauth2/token
   - Client ID/Secret – from your account
3. Set your details in Gmail
4. Set your desired origin/destination airports in FromTo
5. Set the dates ahead you wish to search in Get Dates (default is 7 days and 14 days)
6. Set the price target in Under Price (a sketch of the resulting search request follows below)

**How to test it**
After completing the setup steps above, just hit 'Test workflow'!
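As a rough sketch of what FromTo, Get Dates, and Under Price feed into, the flight search boils down to a query like the one below. Parameter names follow Amadeus's Flight Offers Search API, but treat them as an assumption and verify them against the Amadeus docs for the API version your node calls; the airport codes, dates, and price are placeholders.

```json
{
  "originLocationCode": "LHR",
  "destinationLocationCode": "JFK",
  "departureDate": "2025-07-15",
  "returnDate": "2025-07-22",
  "adults": 1,
  "maxPrice": 400,
  "currencyCode": "USD"
}
```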
by ist00dent
This n8n template allows you to instantly generate QR codes from any text or URL by simply sending a webhook request. It's a versatile tool for creating dynamic QR codes for various purposes, from marketing campaigns to event registrations, directly integrated into your automated workflows.

**🔧 How it works**
- **Receive Data Webhook:** This node acts as the entry point for the workflow. It listens for incoming POST requests and expects a JSON body with a data property containing the text or URL you want to encode into the QR code.
- **Generate QR Code:** This node makes an HTTP GET request to the QR Server API (api.qrserver.com) to generate the QR code image. The content from your webhook is passed as the data parameter to the API.
- **Respond with QR Code:** This node sends the response from the QR Server API back to the service that initiated the webhook. The QR Server API directly returns the image data, so your webhook response will be the QR code image itself.

**👤 Who is it for?**
This workflow is ideal for:
- **Marketers:** Generate QR codes for product links, event registrations, or promotional materials on the fly.
- **Developers:** Integrate QR code generation into applications, websites, or internal tools.
- **Event Organizers:** Create dynamic QR codes for ticketing, information access, or check-ins.
- **Businesses:** Streamline processes requiring physical-to-digital transitions, like menu access or contact sharing.
- **Automation Enthusiasts:** Add QR code generation capabilities to any workflow.

**📑 Data Structure**
When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{
  "data": "https://www.yourwebsite.com/your-specific-page-or-text-to-encode"
}
```

The workflow will return the QR code image directly in the response.

**⚙️ Setup Instructions**
1. **Import Workflow:** In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path:** Double-click the Receive Data Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /generate-qr).
3. **Customize QR Code (Optional):** Double-click the Generate QR Code node. You can adjust the size parameter in the URL (e.g., size=200x200 for a larger QR code) or add other parameters supported by the QR Server API (e.g., bgcolor, color, qzone). A sketch of the underlying request appears at the end of this section.
4. **Activate Workflow:** Save and activate the workflow.

**📝 Tips**
- **Handling the Image Output:** Since the QR Server API directly returns the image, the webhook response will be the image data. Depending on your use case, you might want to:
  - **Save to File/Cloud:** Insert a node (e.g., Write Binary File, Amazon S3, Google Drive) after Generate QR Code to save the image to a file system or cloud storage.
  - **Embed in HTML/Email:** If you're building an HTML response or sending an email, you might need to convert the image data to a Base64 string or provide a URL to a saved image.
- **Error Handling:** Enhance workflow robustness by adding an Error Trigger node. This allows you to catch any issues during QR code generation and set up notifications or logging.
- **Dynamic Size/Color:** You can extend the Receive Data Webhook to accept parameters for size, color, or bgcolor in the incoming JSON, then dynamically pass these to the URL of the Generate QR Code node to create highly customizable QR codes.
- **Input Validation:** For more advanced use cases, you could add a Function node after the webhook to validate the incoming data (e.g., confirm it's a well-formed URL).
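Conceptually, the Generate QR Code node boils down to a GET request like the sketch below. The create-qr-code endpoint and the data/size/bgcolor/color/qzone parameters come from the public QR Server API; how you reference the webhook payload inside the node depends on your n8n expression setup, so the placeholder value here is illustrative.

```json
{
  "method": "GET",
  "url": "https://api.qrserver.com/v1/create-qr-code/",
  "queryParameters": {
    "data": "<text or URL received by the webhook>",
    "size": "300x300",
    "bgcolor": "ffffff",
    "color": "000000",
    "qzone": 2
  }
}
```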
by Richard Uren
**Task**
Read a list of customers from a Google Sheet and create them in Shopify using Shopify's Admin API (GraphQL).

**Why?**
- Generate test users for development stores.
- Migrate customers from other platforms.
- Easy intro to Shopify's GraphQL API.

**Setup**

**Setting up Google Sheets access**
Follow the instructions in the n8n docs for granting OAuth2 access to Google services. You'll need to grant API access to Google Sheets and Google Drive (to list available sheets).

**Setting up Shopify access**
Shopify's Admin API uses header auth with a key of X-Shopify-Access-Token and a value of your Shopify access token, which starts with shpat_.

**How to generate a Shopify Access Token**
To generate a Shopify access token, create an app, grant the app the necessary scopes, then generate a token. From inside a store, do the following:
1. Click Settings (nav link)
2. Click Apps and sales channels (nav link)
3. Click Develop apps (button)
4. Click Create app (button)
5. Give the app a name
6. Click Configure Admin API scopes (button)
7. At a minimum, grant the read_customers and write_customers scopes. Grant additional scopes if you plan on accessing other parts of the API.
8. Click Save

To generate the token:
1. Click Install app (button)
2. Click Install on the dialog that pops up (button)
3. Click 'Reveal token once' (button)
4. Copy the token into a password vault or somewhere secure.

**Template Updates**
To test this out you'll need to make the following changes:
1. Create a header credential where the key is X-Shopify-Access-Token and the value is your Shopify access token (it starts with shpat_).
2. In the GraphQL node, change the endpoint URL to your store, something like https://{your store goes here}.myshopify.com/admin/api/2025-04/graphql.json (an example request body is sketched after the example CSV below).

**Google Sheet Structure**
Columns can be in any order, because the rows will be mapped to fields in a JSON object. n8n will treat the first row in the sheet as column names, so at a minimum use the column names below in row 1 of your sheet:
- first_name: Any string
- last_name: Any string
- email: Valid email
- mobile_phone: International mobile phone format with no spaces, e.g. +61414708406 (Shopify will reject anything else).

**Example CSV**

```
"first_name","last_name","email","mobile_phone"
"Bob","Smith","bob@example.com","+61414999999"
```
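For orientation, the body the GraphQL node posts to the Admin API looks roughly like this. The customerCreate mutation, the CustomerInput fields, and the userErrors selection follow Shopify's Admin GraphQL reference, but treat the exact selection set as an assumption and check it against the schema for the API version in your endpoint URL (2025-04 in the example above).

```json
{
  "query": "mutation customerCreate($input: CustomerInput!) { customerCreate(input: $input) { customer { id email } userErrors { field message } } }",
  "variables": {
    "input": {
      "firstName": "Bob",
      "lastName": "Smith",
      "email": "bob@example.com",
      "phone": "+61414999999"
    }
  }
}
```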
by Bela
In this automation we first take a screenshot with a screenshot API called URLbox, then send that screenshot to the OpenAI API for analysis. You can extend this automation by changing how you ingest the website URLs and names into the workflow.

Options as data source:
- Postgres
- Google Sheets
- Your CRM
- ...

Setup:
1. Replace the website & URL in the Setup node
2. Put in your URLbox API key
3. Put in your OpenAI credentials

Click here for a blog article with more information on the automation.
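To make the analysis step concrete, the request to OpenAI ends up shaped roughly like the sketch below. The content-array format with an image_url entry follows OpenAI's chat completions API for vision input; the model name, prompt text, and the screenshot-URL placeholder are illustrative rather than taken from the workflow.

```json
{
  "model": "gpt-4o",
  "messages": [
    {
      "role": "user",
      "content": [
        { "type": "text", "text": "Describe the layout, key messaging, and calls to action in this website screenshot." },
        { "type": "image_url", "image_url": { "url": "<URLBOX_SCREENSHOT_URL>" } }
      ]
    }
  ]
}
```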
by Growth AI
**Automated trend monitoring for content strategy**

**Who's it for**
Content creators, marketers, and social media managers who want to stay ahead of emerging trends and generate relevant content ideas based on data-driven insights.

**What it does**
This workflow automatically identifies trending topics related to your industry, collects recent news articles about these trends, and generates content suggestions. It transforms raw trend data into actionable editorial opportunities by analyzing search volume growth and current news coverage.

**How it works**
The workflow follows a three-step automation process:
1. **Trend Analysis:** Examines searches related to your topics and identifies those with the strongest recent growth
2. **Article Collection:** Searches Google News for current articles about emerging trends and scrapes their full content
3. **Content Generation:** Creates personalized content suggestions based on collected articles and trend data

The system automatically excludes geo-localized searches to provide a global perspective on trends, though this can be customized.

**Requirements**
- SerpAPI account (for trend and news data)
- Firecrawl API key (for scraping article content from Google News results)
- Google Sheets access
- AI model API key (for content analysis and recommendations - you can use any LLM provider you prefer)

**How to set up**

**Step 1: Prepare your tracking sheet**
- Duplicate this Google Sheets template
- Rename your copy and ensure it's accessible

**Step 2: Configure API credentials**
Before running the workflow, set up the following credentials in n8n:
- SerpAPI: For trend analysis and Google News search
- Firecrawl API: For scraping article content
- AI Model API: For content analysis and recommendations (Anthropic Claude, OpenAI GPT, or any other LLM provider)
- Google Sheets OAuth2: For accessing and updating your tracking spreadsheet

**Step 3: Configure your monitoring topics**
In your Google Sheet "Query" tab:
- Query column: Enter the main topics/keywords you want to monitor for trending queries (e.g., "digital marketing", "artificial intelligence", "sustainable fashion")
- Query to avoid column: Optionally add specific queries you want to exclude from trend analysis (e.g., brand names, irrelevant terms, or overly specific searches that don't match your content strategy)

This step is crucial, as these queries will be the foundation for discovering related trending topics.

**Step 4: Configure the workflow**
- In the "Get Query" node, paste your duplicated Google Sheets URL in the "Document" field
- Ensure your Google Sheet contains your monitoring topics in the Query column

**Step 5: Customize language and location settings**
The workflow is currently configured for French content and France location. You can modify these settings in the SerpAPI nodes (see the parameter sketch at the end of this section):
- Language (hl): Change from "fr" to your preferred language code
- Geographic location (geo/gl): Change from "FR" to your target country code
- Date range: Currently set to "today 1-m" (last month) but can be adjusted

**Step 6: Adjust filtering (optional)**
The "Sorting Queries" node excludes geo-localized queries by default. You can modify the AI agent's instructions to include location-specific queries or change filtering criteria based on your requirements. The system will also automatically exclude any queries you've listed in the "Query to avoid" column.

**Step 7: Configure scheduling (optional)**
The workflow includes an automated scheduler that runs monthly (1st day of each month at 8 AM). You can modify the cron expression 0 8 1 * * in the Schedule Trigger node to change:
- Frequency (daily, weekly, monthly)
- Time of execution
- Day of the month

**How to customize the workflow**
- Change trend count: The workflow processes up to 10 related queries per topic but filters them through AI to select the most relevant non-geolocalized ones
- Adjust article collection: Currently collects exactly 3 news articles per query for analysis
- Content style: Customize the AI prompts in content generation nodes to match your brand voice
- Output format: Modify the Google Sheets structure to include additional data points
- AI model: Replace the Anthropic model with your preferred LLM provider
- Scraping options: Configure Firecrawl settings to extract specific content elements from articles

**Results interpretation**
For each monitored topic, the workflow generates a separate sheet named by month and topic (e.g., "January Digital Marketing").

Data structure (four columns):
- Query: The trending search term ranked by growth
- Évolution: Growth percentage over the last month
- News: Links to 3 relevant news articles
- Idée: AI-generated content suggestions based on comprehensive article analysis

The workflow provides monthly retrospective analysis, helping you identify emerging topics before competitors and optimize your content calendar with high-potential subjects.

**Workflow limitations**
- Processes up to 10 related queries per topic with AI filtering
- Collects exactly 3 news articles per query
- Results are automatically organized in monthly sheets
- Requires stable internet connection for API calls
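For reference, the related-queries lookup against SerpAPI comes down to a parameter set like the one below. The engine, data_type, hl, geo, and date names follow SerpAPI's Google Trends API, but verify them (and the accepted data_type values) against the SerpAPI docs before changing them; the topic is a placeholder pulled from the Query column.

```json
{
  "engine": "google_trends",
  "q": "digital marketing",
  "data_type": "RELATED_QUERIES",
  "hl": "fr",
  "geo": "FR",
  "date": "today 1-m",
  "api_key": "<YOUR_SERPAPI_KEY>"
}
```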
by Evervise
🎯 AI Landing Page Analyzer & Optimizer

Transform your landing page audits into a powerful lead generation machine with this professional n8n workflow powered by 4 specialized AI agents.

**What It Does**
This workflow analyzes any landing page and delivers a comprehensive, $2,000-value audit report in under 90 seconds. Perfect for agencies, consultants, and SaaS companies looking to generate high-quality leads while showcasing their expertise.

**🤖 Four Specialized AI Agents**
- **Design Critic** - Analyzes UX/UI, visual hierarchy, CTA placement, mobile responsiveness, and trust signals
- **Copywriter** - Reviews messaging, headlines, value propositions, and emotional triggers
- **SEO Specialist** - Audits technical SEO, meta tags, heading structure, and performance indicators
- **Growth Strategist** - Synthesizes all findings, assigns an A-F grade, and creates prioritized action plans

**✨ Key Features**
- **Comprehensive Scoring:** 6-dimension scorecard with detailed justifications
- **Actionable Insights:** Top 5 priorities ranked by impact vs. effort
- **Conversion Lift Estimates:** Conservative, realistic, and optimistic projections
- **Beautiful HTML Reports:** Professional email with all analyses, branded for your business
- **Natural Upsell Path:** Built-in CTA for paid optimization services
- **Fast Delivery:** Complete analysis in 60-90 seconds

**📊 What Gets Analyzed**

Design & UX
- Visual hierarchy and layout structure
- CTA design, placement, and effectiveness
- Color scheme and whitespace usage
- Trust signals and social proof
- Mobile responsiveness indicators

Copywriting
- Headline impact and hook effectiveness
- Value proposition clarity
- Benefits vs. features balance
- Emotional triggers and persuasion
- CTA copy strength and action orientation

SEO & Technical
- Title tag and meta description optimization
- Heading structure (H1, H2 hierarchy)
- Image optimization and alt tags
- Mobile-friendliness
- Analytics tracking setup
- Page speed indicators

Strategic Overview
- Overall grade (A+ to F) with detailed explanation
- Comprehensive scorecard across all dimensions
- Prioritized action items by ROI
- Estimated conversion lift potential
- Strategic recommendations and next steps

**💼 Perfect For**
- **Digital Marketing Agencies:** Offer free audits to generate leads
- **Freelance Consultants:** Showcase expertise and attract clients
- **SaaS Companies:** Lead magnet for conversion optimization tools
- **Web Development Agencies:** Pre-sales qualification and demonstrations
- **CRO Specialists:** Automated initial assessments

**📋 What You Need**

Required
- n8n instance (self-hosted or cloud)
- Anthropic API key (Claude Sonnet 4.5)
- Gmail account for sending reports (or any SMTP provider)

Optional Enhancements
- Screenshot API (UrlBox, Puppeteer) for visual captures
- Google PageSpeed Insights API for performance testing
- CRM integration (HubSpot, Salesforce, Pipedrive)
- Slack notifications for lead alerts
- Calendar booking integration (Calendly)

**⚙️ Technical Details**
- **AI Model:** Claude Sonnet 4.5 (4 separate agents)
- **Average Runtime:** 60-90 seconds
- **Cost Per Analysis:** ~$0.15-0.25 (API costs)
- **Form Fields:** 6 (landing page URL, industry, goal, conversion rate, frustration, email) - an illustrative submission payload appears at the end of this listing
- **Output:** Beautiful HTML email report with comprehensive analysis

**🎨 Customization Options**
The workflow includes detailed documentation and is fully customizable:
- Adjust agent prompts for your specific niche
- Modify scoring criteria and thresholds
- Customize email branding and design
- Add/remove form fields
- Integrate with your CRM
- Segment responses by score (different CTAs for different grades)

**📈 Expected Results**
- **Lead Quality:** High (users actively seeking optimization)
- **Conversion Rate:** 15-30% of audit recipients book calls
- **Time Saved:** 2-3 hours of manual analysis per audit
- **Perceived Value:** $2,000+ professional audit

**🔧 Setup Difficulty**
Intermediate - requires basic n8n knowledge and API key setup.

Setup Steps
1. Import workflow to n8n
2. Add Anthropic API credentials
3. Configure Gmail/SMTP credentials
4. Customize form and email branding
5. Test with sample landing page
6. Deploy form on your website

**📚 Included Documentation**
- Comprehensive sticky notes explaining each component
- Setup instructions and prerequisites
- Customization guide
- Monetization strategy breakdown
- Performance optimization tips
- Enhancement ideas for v2

**🌟 Use Cases**
- **Lead Magnet:** Embed the form on your website to capture qualified leads
- **Sales Tool:** Use during discovery calls to demonstrate value
- **Content Marketing:** Offer audits in LinkedIn posts and email campaigns
- **Partner Program:** Provide white-labeled audits to partners
- **Upsell Sequence:** Follow up with paid optimization services

**⚡ Why This Workflow?**
Unlike simple template-based audits, this workflow uses real AI intelligence to provide nuanced, contextual insights. Each analysis is unique and considers:
- Industry-specific best practices
- Stated business goals
- Current pain points
- Competitive landscape
- User's experience level

The result? Reports that feel hand-crafted by experts, not generic checklists.

**🤝 Support**
- 📖 Website: https://evervise.ai/
- ✨ Support: mark.marin@evervise.com

**📊 Version History**
- **v1.0** - Initial release with 4-agent analysis pipeline
- Coming soon: Screenshot capture, competitor comparison, visual mockups

Tags: lead-generation marketing-automation ai-agents conversion-optimization landing-pages anthropic claude audit seo copywriting design-analysis lead-magnet

Ready to turn landing page audits into a lead generation machine? Import this workflow and start attracting high-quality clients today.
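Purely as an illustration of what the six form fields feed into the agents, a submission might look like the JSON below. The keys and example values are assumptions for readability; the real field names are whatever labels you give the n8n form fields.

```json
{
  "landing_page_url": "https://acme.example.com/pricing",
  "industry": "B2B SaaS",
  "goal": "Increase demo bookings",
  "current_conversion_rate": "1.8%",
  "biggest_frustration": "High traffic but very few sign-ups",
  "email": "founder@acme.example.com"
}
```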
by Lucas Peyrin
**How it works**
This workflow automates your initial hiring pipeline by creating an AI-powered CV scanner. It collects job applications through a web form, uses AI to analyze the candidate's CV against your job description, and neatly organizes the results in a Google Sheet.

Here's the step-by-step process:
- **The Application Form:** A Form Trigger provides a public web form for candidates to submit their name, email, and CV (as a PDF).
- **Initial Logging:** As soon as an application is submitted, the candidate's name and email are added to a Google Sheet. This ensures every applicant is logged, even if a later step fails.
- **CV Text Extraction:** The workflow uses **Mistral's OCR** model to accurately extract all the text from the uploaded CV PDF.
- **AI Analysis:** The extracted text is sent to **Google Gemini**. A detailed prompt instructs the AI to act as a hiring assistant, scoring the CV against the specific requirements of your job role and providing a detailed explanation for its score.
- **Structured Output:** A JSON Output Parser ensures the AI's analysis is returned in a clean, structured format, making the data reliable (an illustrative output shape follows the setup steps below).
- **Final Record:** The AI-generated qualification score and explanation are added to the candidate's row in the Google Sheet, giving you a complete, analyzed list of applicants.

**Set up steps**
Setup time: ~15 minutes. You'll need API keys for Mistral and Google AI, and to connect your Google account.

1. **Get Your Mistral API Key:**
   - Visit the Mistral Platform at console.mistral.ai/api-keys. Create and copy your API key.
   - In the workflow, go to the Extract CV Text node, click the Credential dropdown, and select + Create New Credential. Paste your key into the API Key field and Save.
2. **Get Your Google AI API Key:**
   - Visit Google AI Studio at aistudio.google.com/app/apikey. Click "Create API key in new project" and copy the key.
   - In the workflow, go to the Gemini 2.5 Flash Lite node, click the Credential dropdown, and select + Create New Credential. Paste your key into the API Key field and Save.
3. **Connect Your Google Account:**
   - Select the Create 'CVs' Spreadsheet node. Click the Credential dropdown and select + Create New Credential to connect your Google account.
   - Repeat this for the Log Candidate Submission and Add CV Analysis nodes, selecting the credential you just created.
4. **Create Your Spreadsheet:** Click the "play" icon on the Start Here node to run it. This will create a new Google Sheet in your Google Drive named "CVs" with the correct columns.
5. **Customize the Job Role:** Go to the AI Qualification node. In the Text parameter, find the job_requirements section and replace the example job description with your own. Be as detailed as possible for the best results.
6. **Start Screening!**
   - Activate the workflow using the toggle at the top right.
   - Go to the Application Form node and click the "Open Form URL" button.
   - Fill out the form with a test application and upload a sample CV.
   - Check your Google Sheet to see the AI's analysis appear within moments.
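The parser keeps Gemini's answer down to a small, predictable shape along these lines. The exact field names and scoring scale live in the workflow's output-parser definition, so treat this as an illustrative shape rather than the literal schema.

```json
{
  "qualification_score": 82,
  "explanation": "Strong match on the required Python and SQL experience; missing the preferred 3+ years of team-lead experience."
}
```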
by Khairul Muhtadin
Say goodbye to endless applications and hello to more time for perfecting your interview skills! The JOB Hunter Agent uses the power of Google Gemini and SerpAPI to find the perfect job match and generate a personalized cover letter.

Result example:

**💡 Why Use JOB Hunter Agent?**
- **Save Precious Time:** Stop sifting through countless job boards; this agent does the heavy lifting, saving you hours every week.
- **Land Your Dream Job Faster:** Get laser-focused job matches and a custom-crafted cover letter that speaks directly to the hiring manager, increasing your chances of getting noticed.
- **Never Miss an Opportunity:** Your personal AI assistant works 24/7, ensuring you're always on top of the latest openings, even while you sleep!
- **Stand Out from the Crowd:** A perfectly tailored cover letter generated on the fly gives you an edge over generic applications, making you look like a superstar.

**⚡ Perfect For**
- **Job Seekers:** Anyone actively looking for a new role who wants to streamline their application process.
- **Busy Professionals:** Those with limited time who need an efficient way to find relevant opportunities.
- **Career Changers:** Individuals exploring new industries who need a helping hand in crafting compelling applications.

**🔧 How It Works**
1. ⏱ **Trigger:** You submit your CV and job preferences through a simple n8n form.
2. 📎 **Process:** Your CV is extracted from the PDF, and your preferences (location, job type, salary, email) are neatly organized.
3. 🤖 **Smart Logic:** The "Job Hunter Agent" uses Google Gemini and SerpAPI to find the single best job match for you and then drafts a bespoke cover letter based on your CV and the job description. It's like magic, but with more code!
4. 💌 **Output:** A beautifully formatted HTML email containing your profile summary, the best job match, your personalized cover letter, and handy application tips is sent straight to your inbox.
5. 🗂 **Storage:** All the initial data from your form submission is processed and used to craft your perfect job application package.

**🔐 Quick Setup**
1. Import the JSON file into your n8n instance.
2. Add credentials: Google Gemini (Gemini 2.5 Pro model) and SerpAPI.
3. Customize: Adjust the system prompt in the "Job Hunter Agent" to fine-tune the cover letter tone or length, update the email footer, and expand job filters.
4. Update: Ensure your Gmail OAuth2 credentials are valid for sending emails.
5. Test: Run the workflow with your own CV and preferences to see the magic happen!

**🧩 You'll Need**
- An active n8n instance
- Google Gemini API key (for Gemini 2.5 Pro)
- SerpAPI account for Google Jobs search results (a sample search query is sketched at the end of this section)
- A Gmail account for sending personalized job match emails

**🛠️ Level Up Ideas**
- Integrate with LinkedIn, Jobstreet, or Indeed APIs for a wider range of job sources.
- Allow the agent to find multiple job matches and present them as a curated list.
- Add an option to attach a parsed CV summary as a PDF to the email for quick reference.

Made by: khaisa Studio
Tags: AI, Gemini, Google Jobs, Job Search, Automation, Cover Letter
Category: job hunter
Need custom work? Contact me
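Under the hood, the agent's job-search tool issues a Google Jobs query roughly like the one below. The engine and parameter names follow SerpAPI's Google Jobs API, but confirm them against the SerpAPI docs; the query and location values are placeholders built from your form preferences.

```json
{
  "engine": "google_jobs",
  "q": "data analyst",
  "location": "Jakarta, Indonesia",
  "hl": "en",
  "api_key": "<YOUR_SERPAPI_KEY>"
}
```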
by Derek Cheung
**How it works:**
This project teaches you to create a personal AI assistant named Jackie that operates through Telegram. Jackie can summarize unread emails, check calendar events, manage Google Tasks, and handle both voice and text interactions. The assistant provides a comprehensive digital life management solution accessible via Telegram messaging.

**Key Features:**
- Supports hands-free voice interaction
- Maintains conversation memory
- Integrates with major Google services
- Provides personalized assistance for email management, scheduling, and task organization

**Step-by-step:**
1. **Telegram Trigger:** The workflow starts with a Telegram trigger that listens for incoming message events. The system determines if the incoming message is voice or text input (a sample incoming update is shown at the end of this section).
2. **Voice Processing:** If a voice message is received, the workflow retrieves the voice file from Telegram and uses OpenAI's transcription API to convert speech to text.
3. **AI Assistant:** The processed text (whether original text or transcribed voice) is passed to Jackie, the AI assistant powered by OpenRouter's language model.
4. **Tools Integration:** Jackie is equipped with several productivity tools:
   - **Get Email:** Uses the Gmail API to fetch unread emails from the inbox with sender, date, subject, and summary information
   - **Google Calendar:** Retrieves calendar events for specified dates, filtering out irrelevant future events
   - **Google Tasks:** Both creates new tasks and retrieves existing tasks from Google Tasks lists

**API Keys Required:**
- **Telegram Bot API:** Create a bot via @BotFather on Telegram to get your bot token
- **OpenAI API:** Required for voice-to-text transcription
- **OpenRouter API:** Powers the AI language model responses
- **Google OAuth2:** Needed for Gmail, Google Calendar, and Google Tasks integration

**Response Generation:**
The AI formulates intelligent responses based on the gathered information, current date context, and conversation history, then sends the response back to the user via Telegram in Markdown format.
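For context on the voice-vs-text branch: a Telegram update carries either a message.text field or a message.voice object, so the routing step only has to check which one is present. A trimmed voice-message update looks roughly like the sketch below (IDs and the file_id are dummy values).

```json
{
  "update_id": 123456789,
  "message": {
    "message_id": 42,
    "from": { "id": 1111111, "first_name": "Derek" },
    "chat": { "id": 1111111, "type": "private" },
    "date": 1700000000,
    "voice": {
      "file_id": "AwACAgQAAxkBAAIB...",
      "duration": 4,
      "mime_type": "audio/ogg"
    }
  }
}
```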
by MilanWR
**Telegram n8n workflow (de)activator**

**What does it do?**
This workflow helps you quickly activate or deactivate a workflow through Telegram. Sometimes we can't get to a PC to resolve an issue when something goes wrong with a workflow. If you, like me, use Telegram to send yourself error reports, you can react quickly in urgent cases. Just by sending '/stop' combined with the name you use for a workflow, you can deactivate that workflow, or reactivate it with '/start'. For example: '/stop marketing'.

Walkthrough: https://watch.screencastify.com/v/uWQ88gZKj57WTGOOqSW2 (6 min)

**Instructions**
1. Create a Telegram API key through BotFather (https://t.me/botfather). Add it to the Telegram credentials.
2. For the n8n nodes, go to Settings in your n8n instance, then 'n8n API' and 'Create an API key'.
3. To ensure that only we can send commands to the bot, we need the chat ID of our DM with the newly created bot. Open the Telegram trigger and click 'Listen to events'. Go to Telegram and send a direct message to the bot; this will trigger the Telegram node.
4. Go to the Filter node and fill in the chat ID you want to filter for, using the data from the test event in the Telegram node.
5. In the first Switch node you can find the commands, in this case '/start' and '/stop'. When you send a message to your bot starting with either of those, it will go to the next Switch nodes.
6. Next it checks what other word the message contains. As an example I have used the words 'marketing' and 'sales', corresponding to a marketing and a sales workflow.
7. The last nodes will either activate or deactivate a workflow (a sketch of the underlying API call follows below).
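A minimal sketch of what those final nodes call, assuming the standard n8n public API: the activate endpoint below flips a workflow on, and swapping /activate for /deactivate turns it off. Confirm the exact paths against the API reference for your n8n version; the URL, workflow ID, and key are placeholders.

```json
{
  "method": "POST",
  "url": "https://your-n8n-instance.example.com/api/v1/workflows/<WORKFLOW_ID>/activate",
  "headers": {
    "X-N8N-API-KEY": "<YOUR_N8N_API_KEY>"
  }
}
```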
by n8n Team
This workflow imports multiple CSV files and appends or updates them in a Google Sheets document. Here's a step-by-step breakdown:

1. When "Execute Workflow" is clicked, the process starts.
2. The "Read Binary Files" node reads all the '.csv' files from the specified directory.
3. The files are then split into batches (one file per batch) by the "Split In Batches" node.
4. For each file, the "Read CSV" node reads the data from the CSV file.
5. The "Assign source file name" node assigns the source file name to the data.
6. The data is then processed by the "Remove duplicates" node, which removes any duplicate entries based on the 'user_name' field.
7. The "Keep only subscribers" node filters the data to keep only those entries where the 'subscribed' field is set to 'TRUE'.
8. The data is then sorted by the 'date_subscribed' field using the "Sort by date" node.
9. Finally, the processed data is appended or updated in the specified Google Sheets document using the "Upload to spreadsheet" node. It matches on the 'user_name' field: if a row for that 'user_name' already exists, it updates the row; otherwise it appends a new one.
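To make the field references concrete, two parsed rows might look like this after the "Read CSV" node. Any columns beyond user_name, subscribed, and date_subscribed (such as email here) are illustrative; with this sample, only the first row would pass the subscriber filter.

```json
[
  { "user_name": "jdoe", "email": "jdoe@example.com", "subscribed": "TRUE", "date_subscribed": "2023-05-14" },
  { "user_name": "asmith", "email": "asmith@example.com", "subscribed": "FALSE", "date_subscribed": "2023-05-02" }
]
```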