by Oneclick AI Squad
This automated n8n workflow scrapes job listings from Upwork using Apify, processes and cleans the data, and generates daily email reports with job summaries. The system uses Google Sheets for data storage and keyword management, providing a comprehensive solution for tracking relevant job opportunities and market trends.

What is Apify?
Apify is a web scraping and automation platform that provides reliable APIs for extracting data from websites like Upwork. It handles the complexities of web scraping, including rate limiting, proxy management, and data extraction, while maintaining compliance with website terms of service.

Good to Know
- Apify API calls may incur costs based on usage; check Apify pricing for details
- Google Sheets access must be properly authorized to avoid data sync issues
- The workflow includes data cleaning and deduplication to ensure high-quality results
- Email reports provide structured summaries for easy review and decision-making
- Keyword management through Google Sheets allows for flexible job targeting

How It Works
The workflow is organized into three main phases:

Phase 1: Job Scraping & Initial Processing
This phase handles the core data collection and initial storage:
- Trigger Manual Run - Manually starts the workflow for on-demand job scraping
- Fetch Keywords from Google Sheet - Reads the list of job-related keywords from the All Keywords sheet
- Loop Through Keywords - Iterates over each keyword to trigger Apify scraping
- Trigger Apify Scraper - Sends an HTTP request to start the Apify actor for job scraping
- Wait for Apify Completion - Waits for the Apify actor to finish execution
- Delay Before Dataset Read - Waits a few seconds to ensure the dataset is ready for processing
- Fetch Scraped Job Dataset - Fetches the latest dataset from Apify
- Process Raw Job Data - Filters jobs posted in the last 24 hours and formats the data
- Save Jobs to Daily Sheet - Appends new job data to the daily Google Sheet
- Update Keyword Job Count - Updates the job count in the All Keywords summary sheet

Phase 2: Data Cleaning & Deduplication
This phase ensures data quality and removes duplicates:
- Load Today's Daily Jobs - Loads all jobs added in today's sheet for processing
- Remove Duplicates by Title/Desc - Removes duplicates based on title and description matching
- Save Clean Job Data - Saves the cleaned, unique entries back to the sheet
- Clear Old Daily Sheet Data - Deletes old or duplicate entries from the sheet
- Reload Clean Job Data - Loads clean data again after deletion for final processing

Phase 3: Daily Summary & Email Report
This phase generates summaries and delivers the final report:
- Generate Keyword Summary Stats - Counts job totals per keyword for analysis
- Update Summary Sheet - Updates the summary sheet with keyword statistics
- Fetch Final Summary Data - Reads the summary sheet for reporting purposes
- Build Email Body - Formats the email with statistics and a sheet link
- Send Daily Report Email - Sends the structured daily summary email to recipients

Data Sources
The workflow utilizes Google Sheets for data management:

AI Keywords Sheet - Contains keyword management data with columns:
- Keyword (text) - Job search terms
- Job Count (number) - Number of jobs found for each keyword
- Status (text) - Active/Inactive status
- Last Updated (timestamp) - When the keyword was last processed

Daily Jobs Sheet - Contains scraped job data with columns:
- Job Title (text) - Title of the job posting
- Description (text) - Job description content
- Budget (text) - Job budget or hourly rate
- Client Rating (number) - Client's rating on Upwork
- Posted Date (timestamp) - When the job was posted
- Job URL (text) - Direct link to the job posting
- Keyword (text) - Which keyword found this job
- Scraped At (timestamp) - When the data was collected

Summary Sheet - Contains daily statistics with columns:
- Date (date) - Report date
- Total Jobs (number) - Total jobs found
- Keywords Processed (number) - Number of keywords searched
- Top Keyword (text) - Most productive keyword
- Average Budget (currency) - Average job budget
- Report Generated (timestamp) - When the summary was created

How to Use
1. Import the workflow into n8n
2. Configure Apify API credentials and Google Sheets API access
3. Set up email credentials for daily report delivery
4. Create three Google Sheets with the specified column structures
5. Add relevant job keywords to the AI Keywords sheet
6. Test with sample keywords and adjust as needed

Requirements
- Apify API credentials and actor access
- Google Sheets API access
- Email service credentials (Gmail, SMTP, etc.)
- Upwork job search keywords for targeting

Customizing This Workflow
Modify the Process Raw Job Data node to filter jobs by additional criteria such as budget range, client rating, or job type. Adjust the email report format to include more detailed statistics or add visual aids, such as charts. Customize the data cleaning logic to better handle duplicate detection based on your specific requirements, or add additional data sources beyond Upwork for comprehensive job market analysis.
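The 24-hour filter and the title/description deduplication described in Phases 1 and 2 can be sketched as a single n8n Code node. This is an illustrative sketch, not the template's actual code; the field names (`title`, `description`, `postedDate`) are assumptions and should be matched to your Apify actor's real output schema.

```javascript
// Sketch of "Process Raw Job Data" + "Remove Duplicates by Title/Desc"
// combined into one helper. Field names are assumptions about the
// Apify dataset shape -- verify against a real run.
function processJobs(rawJobs, now = Date.now()) {
  const dayMs = 24 * 60 * 60 * 1000;

  // Keep only jobs posted within the last 24 hours
  const recent = rawJobs.filter(
    (job) => now - new Date(job.postedDate).getTime() <= dayMs
  );

  // Deduplicate on a normalized title + description key
  const seen = new Set();
  return recent.filter((job) => {
    const key = `${job.title}::${job.description}`.toLowerCase().trim();
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}
```

In an n8n Code node you would typically map the result back to `{ json: job }` items before returning.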
by Jonathan
This workflow uses a Hubspot Trigger to check for new contacts. It then validates each contact's email address using OneSimpleAPI. If any new contacts are found, a message is sent to Slack. To configure this workflow, you will need to set the credentials for the Hubspot, OneSimpleAPI, and Slack nodes. You will also need to select the Slack channel to use for sending the message.
by Sirhexalot
This n8n workflow enables you to export data from Zammad, including Users, Roles, Groups, and Organizations, into individual Excel files. It simplifies data handling and reporting by creating structured outputs for further processing or sharing.

Features
- Export Users with associated details such as email, firstname, lastname, role_ids, and group_ids.
- Export Roles and Organizations with their respective identifiers and names.
- Convert all data into separate Excel files for easy access and use.

Usage
1. Import this workflow into your n8n instance.
2. Configure the required Zammad API credentials (zammad_base_url and zammad_api_key) in the Basic Variables node.
3. Run the workflow to generate Excel files containing Zammad data.

Issues and Suggestions
If you encounter any issues or have suggestions for improvement, please report them on the GitHub repository. We appreciate your feedback to help enhance this workflow!
by Harshil Agrawal
This workflow allows you to send position updates of the ISS every minute to a table in Google BigQuery.

- Cron node: Triggers the workflow every minute.
- HTTP Request node: Makes a GET request to the API https://api.wheretheiss.at/v1/satellites/25544/positions to fetch the position of the ISS. This information gets passed on to the next node in the workflow.
- Set node: Ensures that only the data set in this node gets passed on to the next nodes in the workflow.
- Google BigQuery node: Sends the data from the previous node to the position table in Google BigQuery. If you have created a table with a different name, use that table instead.
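What the Set node accomplishes can be illustrated with a small shaping function: keep only the fields the BigQuery table expects and drop the rest of the API response. The field names below mirror the wheretheiss.at response as commonly documented, but verify them against a live call before relying on them.

```javascript
// Minimal sketch of the Set node's job: whitelist only the columns
// the BigQuery "position" table expects. Assumed response fields:
// latitude, longitude, timestamp (plus others we discard).
function toRow(apiResponse) {
  return {
    latitude: apiResponse.latitude,
    longitude: apiResponse.longitude,
    timestamp: apiResponse.timestamp,
  };
}
```

Keeping this projection explicit prevents extra fields (velocity, visibility, and so on) from reaching BigQuery and causing schema mismatches.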
by Didac Fernandez
🤖 Autonomous Email Assistant - AI-Powered Inbox Management

> Transform Your Email Workflow with Intelligent Automation

This advanced n8n workflow creates a fully autonomous email assistant that processes incoming emails through AI-powered classification, generates contextually-aware responses in your personal brand voice, and automatically organizes your inbox.

Perfect for: Professionals managing high email volumes who want to maintain response quality while saving hours each week.

🎯 What This Workflow Does

The Autonomous Email Assistant monitors your Outlook inbox and intelligently processes every incoming email through a sophisticated multi-stage pipeline:

- 🏷️ Smart Classification - Automatically categorizes emails into 7 distinct types (Commercial/Spam, Internal, Meeting, Newsletter, Notifications, Urgent, Other)
- ✍️ AI Response Generation - Creates draft responses tailored to the email type, maintaining your unique communication style
- 📅 Meeting Automation - Checks your calendar availability and handles meeting requests automatically
- ⚡ Priority Handling - Sends Slack notifications for urgent emails requiring immediate attention
- 📂 Inbox Organization - Files processed emails into categorized folders with AI tagging
- 📊 Comprehensive Logging - Records all processed emails and responses in Excel for audit trails

✨ Key Features

🔍 Dual Classification System
- Primary LLM classifier for fast categorization
- Secondary text classifier for validation
- 7 predefined categories with smart routing logic

🎨 Brand Voice Integration
- Maintains consistent communication style across all responses
- Customizable writing patterns and key phrases
- Professional tone with configurable formality levels

📆 Intelligent Meeting Handler
- Calendar integration with availability checking
- Automatic event creation for confirmed meetings
- Suggests alternative times when unavailable
- Maintains 15-minute buffers between meetings
- Respects working hours (8:30 AM - 5:00 PM)

👤 Human-in-the-Loop for Critical Emails
- Slack notifications for urgent messages
- Approval workflow with feedback incorporation
- Draft responses for review before sending

📥 Complete Inbox Management
- Auto-marking as read
- AI category tagging for tracking
- Organized folder archiving by email type
- Excel logging for analytics and compliance

🛠️ Workflow Requirements

🔐 Required Credentials
- **Microsoft Outlook OAuth2** - Email access, calendar permissions
- **Microsoft Excel 365** - For logging workbook
- **OpenRouter API** - GPT-5-mini model recommended
- **Slack OAuth2** - Optional, for urgent notifications

💻 Technical Stack

| Component | Technology |
|-----------|-----------|
| AI Model | OpenRouter GPT-5-mini |
| Email Provider | Microsoft Outlook |
| Data Storage | Microsoft Excel 365 |
| Notifications | Slack |
| Polling Interval | Every minute (configurable) |

⚙️ How It Works

Stage 1️⃣: Email Ingestion
Microsoft Outlook Trigger monitors the inbox → Information Extractor pulls sender details

Stage 2️⃣: Classification
Dual AI classifiers determine the email category → Routes to the appropriate handler

Stage 3️⃣: Response Generation
- **General emails** → emailReplier
- **Meeting requests** → AI Agent with calendar tools
- **Urgent emails** → urgentReplier + Slack notification
- **Others** → Context-aware handler

Stage 4️⃣: Brand Voice Application
All responses pass through brand voice nodes for style consistency

Stage 5️⃣: Organization
- ✅ Mark as read
- 🏷️ Apply AI category tag
- 📁 Archive to appropriate folder
- 📝 Log to Excel

🎛️ Customization Options

📋 Adjust Classification Categories
Modify the Virtual Postman categories to match your specific needs. Add industry-specific classifications or merge existing ones.
✏️ Personalize Brand Voice
The embedded brand voice prompts can be completely customized:
- Update key phrases and sign-offs
- Adjust sentence length preferences
- Modify formality and tone
- Add company-specific terminology

⚙️ Configure Response Behaviors
- Change meeting scheduling preferences
- Update working hours
- Modify urgent email criteria
- Adjust buffer times between meetings

🔔 Notification Preferences
- Switch Slack to email notifications
- Add multiple notification channels
- Customize urgency thresholds

💼 Use Cases

| Role | Benefits |
|------|----------|
| 🎯 Busy Executives | Handle routine correspondence while maintaining a personal touch |
| 🎧 Customer Support | First-line response generation with consistent brand voice |
| 💰 Sales Teams | Automated meeting scheduling and follow-up management |
| 📊 Project Managers | Internal communication routing and priority handling |
| 💡 Consultants | Client communication management across multiple projects |

🚀 Setup Guide
1. Import Workflow - Import the JSON into your n8n instance
2. Configure Credentials - Add all four required OAuth2 connections
3. Create Excel Workbook - Set up the "Email Automator" workbook with the specified columns
4. Create Outlook Folders - Add the 7 category folders to your Outlook
5. Customize Brand Voice - Update the brand voice prompts with your writing style
6. Test Classification - Send test emails to verify category routing
7. Activate Workflow - Enable the workflow to start processing

⚠️ Important Notes
- ⚡ All urgent emails require human approval before sending
- 📝 Most responses are saved as drafts for review
- 📊 Comprehensive Excel logging enables quality assurance
- 🏷️ AI tagging allows easy identification of automated processing
- 📅 Calendar integration respects existing commitments

🔒 Data Privacy & Security
This workflow processes emails locally within your n8n instance. Email content is sent to OpenRouter for AI processing. Review OpenRouter's data policies and ensure compliance with your organization's data handling requirements.

📜 Version History
v1.0 - Initial Release
- 7-category classification system
- Brand voice integration
- Meeting automation
- Excel logging
- Slack notifications

💬 Support & Community
For questions, customization help, or to share improvements, visit the n8n community forum. This workflow is designed to be highly customizable - adapt it to your specific needs!

Created by: Didac Fernandez Girona | AutoSolutions.ai - AI Consulting Services

Tags: email automation, AI assistant, outlook, calendar management, brand voice, inbox organization, meeting scheduler
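The category-to-folder routing at the heart of the organization stage can be sketched as a small lookup. This is a hypothetical illustration: the folder names below are assumptions, and should be aligned with the 7 Outlook folders you create during setup.

```javascript
// Hypothetical sketch of the 7-category routing logic. Folder names
// are assumptions -- match them to your actual Outlook folders.
const FOLDERS = {
  'Commercial/Spam': 'Commercial',
  'Internal': 'Internal',
  'Meeting': 'Meetings',
  'Newsletter': 'Newsletters',
  'Notifications': 'Notifications',
  'Urgent': 'Urgent',
  'Other': 'Other',
};

function routeEmail(category) {
  return {
    folder: FOLDERS[category] ?? 'Other', // unknown categories fall back to Other
    notifySlack: category === 'Urgent',   // only urgent mail pings Slack
  };
}
```

A defensive fallback like this keeps the pipeline from stalling if a classifier ever emits an unexpected label.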
by Yaron Been
This workflow automatically monitors competitor social media engagement on LinkedIn to track their content performance and posting strategies. It saves you time by eliminating the need to manually check competitor social media accounts and provides detailed analytics on their engagement metrics.

Overview
This workflow automatically scrapes LinkedIn company profiles to extract the latest 5 posts and analyzes their engagement metrics, including likes, comments, and content performance. It uses Bright Data to access LinkedIn without being blocked and AI to intelligently parse post data, calculating average engagement rates and storing detailed post information.

Tools Used
- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping LinkedIn company profiles without being blocked
- **OpenAI**: AI agent for intelligent post data extraction and analysis
- **Google Sheets**: For storing engagement metrics and detailed post information

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your competitor tracking spreadsheets
5. Customize: Enter target LinkedIn company URLs and adjust engagement tracking parameters

Use Cases
- **Social Media Marketing**: Analyze competitor content strategies and engagement patterns
- **Competitive Intelligence**: Track competitor posting frequency and content performance
- **Content Strategy**: Identify high-performing content types and messaging approaches
- **Brand Monitoring**: Monitor competitor social media presence and audience engagement

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #socialmedia #competitoranalysis #linkedin #brightdata #webscraping #socialmonitoring #engagementtracking #n8nworkflow #workflow #nocode #socialautomation #competitormonitoring #contentanalysis #socialmediamonitoring #linkedinanalytics #engagementmetrics #competitorresearch #socialintelligence #contentperformance #socialmediaanalytics #brandmonitoring #competitortracking #socialmediastrategy #contentmarketing #socialmediadata #engagementanalysis #competitiveanalysis #linkedinscraping
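The "average engagement rate" calculation mentioned in the overview can be sketched as a simple aggregation over the scraped posts. The `likes` and `comments` field names are assumptions about the Bright Data output; adjust them to the actual scraper response.

```javascript
// Sketch of the engagement averaging across the latest scraped posts.
// Field names (likes, comments) are assumed -- verify against the
// Bright Data / AI-parsed post objects.
function averageEngagement(posts) {
  if (posts.length === 0) return 0; // avoid division by zero
  const total = posts.reduce((sum, p) => sum + p.likes + p.comments, 0);
  return total / posts.length;
}
```

The result can then be written to Google Sheets alongside the per-post rows for trend comparison across competitors.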
by Sascha
Having a seamless flow of customer data between your online store and your marketing platform is essential. By keeping your systems synchronized, you can ensure that your marketing campaigns are accurately targeted and effective. The integration between Shopify, a leading e-commerce platform, and Mautic, an open-source marketing automation system, is not available out-of-the-box. However, with an n8n workflow you can bridge this gap.

This template will help you:
- enhance accuracy in marketing lists by ensuring that subscription changes in Shopify are instantly updated in Mautic
- improve compliance with data protection laws by respecting users' subscription preferences across platforms
- achieve integration without the need for additional plugins or software, minimizing complexity and potential points of failure

This template will demonstrate the following concepts in n8n:
- working with Shopify in n8n
- controlling flow with the IF node
- using Webhooks
- validating Webhooks with the Crypto node
- using the GraphQL node to call the Shopify Admin API

The template consists of two parts:
1. Sync Email Subscriptions from Shopify to Mautic
2. Sync Email Subscriptions from Mautic to Shopify

How to get started?
1. Create a custom app in Shopify and get the credentials needed to connect n8n to Shopify. This is needed for the Shopify Trigger.
2. Create Shopify Access Token API credentials in n8n for the Shopify Trigger node.
3. Create Header Auth credentials: use X-Shopify-Access-Token as the name and the Access Token from the Shopify app you created as the value. The Header Auth is necessary for the GraphQL nodes.
4. Enable the Mautic API under Configuration/API Settings. After the settings are saved, you will have an additional entry in your settings menu to create API credentials for n8n.
5. Create Mautic credentials in n8n.

Please make sure to read the notes in the template. For a detailed explanation please check the corresponding video: https://youtu.be/x63rrh_yJzI
by Paul-François GORIAUX
This workflow acts as your personal AI-powered analyst for Meta Ads. It's pretty straightforward: First, it grabs a list of Facebook Ad Library URLs you want to check out from a Google Sheet. Then, it automatically scrapes the active ads from those pages. Here's the cool part: it sends each ad's image and text to Google Gemini, which analyzes it like an expert marketer would. Finally, Gemini's full analysis—we're talking strengths, weaknesses, actionable suggestions, and a performance score—gets dropped neatly into another Google Sheet for you.

Set up steps
You should be ready to roll in about 5 minutes. There are no complex configurations; you just need to:
1. Connect your accounts: The workflow has placeholders waiting for your credentials for Google (for Sheets and the Gemini API) and ScrapingFlash.
2. Link your Google Sheets: Just point the first Google Sheets node to the sheet with your URLs, and tell the last node where you want to save the results.

All the nitty-gritty details and expressions are explained in the sticky notes inside the workflow itself!
by Priya Jain
This workflow provides an OAuth 2.0 auth token refresh process for better control. Developers can utilize it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. In this template, I've used the Pipedrive API, but users can apply it with any app that requires the authorization_code for token access. This resolves the issue of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.

What you need to replicate this
1. Your database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also employ a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
2. Create a client app for your chosen application that you want to access via the API.
3. After duplicating the template, add credentials to your database and connect the DB nodes in all 3 workflows.
4. Enable/Publish the first workflow, "1. Generate and Save Pipedrive tokens to Database."
5. Open your client app and follow the Pipedrive instructions to authenticate.
6. Click on Install and test. This will save your initial refresh token and access token to the database.

Please watch the YouTube video for a detailed demonstration of the workflow.

How it operates
- Workflow 1: Captures the authorization_code, generates the access_token and refresh token, and then saves the tokens to the database.
- Workflow 2: Your primary workflow, which fetches or posts data to/from your application. Note the logic that includes an IF condition when an error occurs with an invalid token; this triggers the third workflow to refresh the token.
- Workflow 3: Handles the token refresh. Remember to send the unique ID to the webhook to fetch the necessary tokens from your table.

Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
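Before calling the API in Workflow 2, it can help to check whether the stored access token is already (or nearly) expired and refresh proactively rather than only reacting to 401 errors. A minimal sketch, assuming your token table stores an `expires_at` ISO timestamp (a hypothetical column name, not one prescribed by the template):

```javascript
// Sketch of a proactive expiry check against the stored token row.
// `expires_at` as an ISO string is an assumed table column; adapt to
// your Supabase/MySQL schema. bufferMs refreshes slightly early to
// avoid using a token that expires mid-request.
function needsRefresh(tokenRow, now = Date.now(), bufferMs = 60_000) {
  return new Date(tokenRow.expires_at).getTime() - now <= bufferMs;
}
```

Combined with the IF-on-error branch the template already uses, this gives two layers of protection against stale tokens.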
by Sk developer
🚀 LinkedIn Video to MP4 Automation with Google Drive & Sheets | RapidAPI Integration

This n8n workflow automatically converts LinkedIn video URLs into downloadable MP4 files using the LinkedIn Video Downloader API, uploads them to Google Drive with public access, and logs both the original URL and the Google Drive link into Google Sheets. It leverages the LinkedIn Video Downloader service for fast and secure video extraction.

📝 Node Explanations (Single-Line)
1️⃣ On form submission → Captures the LinkedIn video URL from the user via a web form.
2️⃣ HTTP Request → Calls LinkedIn Video Downloader to fetch downloadable MP4 links.
3️⃣ If → Checks for API errors and routes the workflow accordingly.
4️⃣ Download mp4 → Downloads the MP4 video file from the API response URL.
5️⃣ Upload To Google Drive → Uploads the downloaded MP4 file to Google Drive.
6️⃣ Google Drive Set Permission → Makes the uploaded file publicly accessible.
7️⃣ Google Sheets → Logs successful conversions with the LinkedIn URL and a sharable Drive link.
8️⃣ Wait → Delays execution before logging failed attempts.
9️⃣ Google Sheets Append Row → Logs failed video downloads with an N/A Drive link.

📄 Google Sheets Columns
- **URL** → Original LinkedIn video URL entered in the form.
- **Drive_URL** → Publicly sharable Google Drive link to the converted MP4 file. For failed downloads, Drive_URL will display N/A.

💡 Use Case
Automate LinkedIn video downloading and sharing using LinkedIn Video Downloader for social media managers, marketers, and content creators without manual file handling.

✅ Benefits
**Time-saving** (auto-download & upload), **centralized tracking** in Sheets, **easy sharing** via Drive links, and **error logging** for failed downloads—all powered by **RapidAPI LinkedIn Video Downloader**.
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically tracks key sales pipeline metrics—new leads, deal stages, win rates—and sends actionable insights to your team. Eliminate manual CRM exports and stay on top of revenue health.

Overview
The automation queries your CRM API (HubSpot, Salesforce, or Pipedrive) on a schedule, pulls pipeline data, and feeds it into OpenAI for anomaly detection (e.g., stalled deals). Summaries and alerts appear in Slack, while daily snapshots are archived in Google Sheets for trend analysis.

Tools Used
- **n8n** – Pipeline orchestration
- **CRM API** – Connects to your chosen CRM
- **OpenAI** – Detects anomalies and highlights risks
- **Slack** – Notifies reps and managers in real time
- **Google Sheets** – Stores historical pipeline data

How to Install
1. Import the Workflow into n8n.
2. Connect Your CRM: Provide API credentials in the HTTP Request node.
3. Set Up OpenAI: Add your API key.
4. Authorize Slack & Google Sheets.
5. Customize Thresholds: Adjust what constitutes a stalled deal or low conversion.

Use Cases
- **Sales Management**: Monitor pipeline health without dashboards.
- **Revenue Operations**: Detect bottlenecks early.
- **Forecasting**: Use historical snapshots to improve predictions.
- **Rep Coaching**: Alert reps when deals stagnate.

Connect with Me
- **Website**: https://www.nofluff.online
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Get Bright Data**: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #salespipeline #crm #openai #slackalerts #n8nworkflow #nocode #revenueops
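A "stalled deal" threshold, the kind of rule tuned in the Customize Thresholds step, can be sketched as a simple filter over CRM deal objects. The field names (`stage`, `lastActivity`) are assumptions for illustration; real HubSpot, Salesforce, or Pipedrive payloads use their own property names.

```javascript
// Sketch of a stalled-deal rule: open deals with no activity for more
// than maxIdleDays. Field names are assumed -- map them to your CRM's
// actual deal schema.
function findStalledDeals(deals, now = Date.now(), maxIdleDays = 14) {
  const dayMs = 24 * 60 * 60 * 1000;
  return deals.filter(
    (d) =>
      d.stage !== 'closed' &&
      (now - new Date(d.lastActivity).getTime()) / dayMs > maxIdleDays
  );
}
```

Rules like this can pre-filter the pipeline before handing the shortlist to OpenAI, which keeps prompts small and anomaly detection focused.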
by Nabin Bhandari
This template uses VAPI and Cal.com to book appointments through a voice conversation. It detects whether the user wants to check availability or book an appointment, then responds naturally with real-time scheduling options.

Who is this for?
This workflow is perfect for:
- Voice assistant developers
- AI receptionists and smart concierge tools
- Service providers (salons, clinics, coaches) needing hands-free scheduling
- Anyone building voice-based customer experiences

What does it do?
This workflow turns a natural voice conversation into a working appointment system:
- It starts with a Webhook connected to your VAPI voice agent.
- The Set node extracts user intent (like "check availability" or "book now").
- A Switch node branches logic based on the intent.
- If the user wants to check availability, the workflow fetches available times from Cal.com.
- If the user wants to book, it creates a new event using Cal.com's API.
- The final result is sent back to VAPI as a conversational voice response.

How to use it
1. Import this workflow into your n8n instance.
2. Set up a Webhook node and connect it to your VAPI voice agent.
3. Add your Cal.com API token as a credential (use HTTP Header Auth).
4. Deploy and test using VAPI's simulator or real phone input.
5. (Optional) Customize the OpenAI prompt if you're using it to process or moderate inputs.

Requirements
- A working VAPI agent
- A Cal.com account with API access
- n8n (cloud or self-hosted)
- An understanding of how to configure webhook and API credentials in n8n

Customization Ideas
- Swap out Cal.com with another booking API (like Calendly)
- Add a Google Sheets or Supabase node to log appointments
- Use OpenAI to summarize or sanitize voice inputs before proceeding
- Build multi-turn conversations in VAPI for more complex bookings
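The Set + Switch combination described above can be sketched as a single routing function: pull the intent out of the webhook payload and decide which Cal.com call to make. Everything here is illustrative: the payload shape and the endpoint paths are assumptions, so check VAPI's tool-call format and Cal.com's API documentation before wiring this up.

```javascript
// Hypothetical sketch of the intent routing (Set node + Switch node).
// Payload shape and endpoint paths are assumptions for illustration.
function routeIntent(payload) {
  const intent = (payload.intent || '').toLowerCase();
  if (intent.includes('book')) {
    return { action: 'book', endpoint: '/bookings' };    // create a new event
  }
  if (intent.includes('avail')) {
    return { action: 'check', endpoint: '/slots' };      // fetch open times
  }
  return { action: 'clarify', endpoint: null };          // ask the caller again
}
```

The `clarify` fallback matters in voice flows: rather than failing silently on an unrecognized utterance, the agent can ask the caller to rephrase.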