by System Admin
Script to delete traces of Selenium in the browser
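A hypothetical browser-side sketch of what such a script typically does, assuming the usual Selenium/ChromeDriver fingerprints (the `navigator.webdriver` flag and `cdc_`-prefixed marker properties) are the targets; the original workflow's exact logic is not shown here:

```javascript
// Hide the navigator.webdriver flag that automation normally exposes.
Object.defineProperty(navigator, 'webdriver', { get: () => undefined });

// Remove ChromeDriver's "cdc_" marker properties, if any are present.
// (They have appeared on both window and document in different driver versions.)
for (const target of [window, document]) {
  for (const key of Object.getOwnPropertyNames(target)) {
    if (key.toLowerCase().includes('cdc_')) {
      delete target[key];
    }
  }
}
```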
by Paul Kobelke
Remove Duplicates & Update Google Sheets

How it Works
This workflow helps you keep your Google Sheets clean and up-to-date by automatically removing duplicate entries and re-uploading the cleaned data back to your sheet. It’s especially useful for large lead lists, email databases, or any dataset where duplicate rows can cause confusion and inefficiency.

The flow:
1. Trigger the workflow manually.
2. Fetch all rows from a specific Google Sheet.
3. Identify and remove duplicate rows based on the profileUrl field (sketched in the code example below).
4. Convert the cleaned dataset into a file.
5. Update the original Google Sheet with the new, deduplicated data.

Setup Steps
1. Connect your Google Sheets and Google Drive credentials in n8n.
2. Update the workflow with your desired spreadsheet and sheet ID.
3. Run the workflow by clicking “Execute Workflow” whenever you want to clean up your data.

The process only takes a few seconds and ensures your sheet stays organized without any manual effort.

Use Cases
- CRM lead management (avoiding duplicate prospects).
- Contact lists with scraped or imported data.
- Marketing databases with overlapping submissions.
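A minimal sketch of the deduplication step as it might look inside an n8n Code node. The profileUrl field name comes from the description above; everything else is an assumption about how the step could be written:

```javascript
// n8n Code node (Run Once for All Items): keep the first row per profileUrl.
const seen = new Set();
const deduped = [];

for (const item of $input.all()) {
  const key = item.json.profileUrl;   // field named in the workflow description
  if (key && !seen.has(key)) {
    seen.add(key);
    deduped.push(item);
  }
}

return deduped;
```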
by Robert Breen
This workflow checks a Google Sheet for new tasks (marked Added = No) and automatically creates them in a Monday.com board. Once added, the workflow updates the sheet to mark the task as Added = Yes.

⚙️ Setup Instructions

1️⃣ Prepare Your Google Sheet
- Copy this template to your own Google Drive: Google Sheet Template
- The first row should contain column names.
- Add your data in rows 2–100.
- Make sure each new task row starts with Added = No.

Connect Google Sheets in n8n
- Go to n8n → Credentials → New → Google Sheets (OAuth2).
- Log in with your Google account and grant access.
- In the workflow, select your Spreadsheet ID and Worksheet Name.
- Optional: you can connect Airtable, Notion, or your database instead of Google Sheets.

2️⃣ Connect Monday.com Node
- In Monday.com, go to Admin → API and copy your Personal API Token (docs: Generate Monday API Token).
- In n8n → Credentials → New → Monday.com API, paste your token and save.
- Open the Create Monday Task node → choose your credential → select your Board ID and Group ID. (The API call this node performs is sketched below.)

📬 Contact
Need help customizing this (e.g., mapping more fields, syncing statuses, or updating timelines)?
📧 robert@ynteractive.com
🔗 Robert Breen
🌐 ynteractive.com
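For reference, the Monday.com node is doing roughly this GraphQL call under the hood. A plain-JavaScript sketch; the board ID, group ID, token, and item name are placeholders, and argument types can differ slightly between Monday API versions:

```javascript
// Create one item on a Monday.com board via the GraphQL API (v2).
const query = `
  mutation ($boardId: ID!, $groupId: String!, $itemName: String!) {
    create_item(board_id: $boardId, group_id: $groupId, item_name: $itemName) {
      id
    }
  }`;

const response = await fetch('https://api.monday.com/v2', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'YOUR_MONDAY_API_TOKEN',   // personal token from Admin → API
  },
  body: JSON.stringify({
    query,
    variables: {
      boardId: '1234567890',                  // placeholder Board ID
      groupId: 'topics',                      // placeholder Group ID
      itemName: 'New task from Google Sheet',
    },
  }),
});

console.log(await response.json());           // contains the new item's id on success
```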
by Anurag
Description
This workflow automates the download of new or updated files from a Google Drive folder, processing only files changed since the last run using a timestamp control file.

How It Works
1. Triggered on a schedule.
2. Checks for an n8n_last_run.txt file in your Google Drive to read when the workflow last ran. If the file is missing, it defaults to processing changes from the last 24 hours (see the sketch below).
3. Searches for new or modified files in your specified folder.
4. Downloads the new or changed files.
5. Replaces the timestamp file with the current time for future runs.

Setup Steps
1. Set up your Google Drive credentials in n8n.
2. Find the Folder ID of the Google Drive folder you wish to monitor.
3. Edit all Google Drive nodes: select your credentials and paste the Folder ID.
4. Adjust the schedule trigger if needed.
5. Activate the workflow.

Features
- No duplicate file processing (idempotent)
- Handles a missing timestamp file
- Clear logical sticky notes in the editor
- Modular, extendable design

Prerequisites
- Google Drive API credentials connected to n8n
- Target Google Drive folder accessible by those credentials
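A minimal sketch of the timestamp logic as it might appear in an n8n Code node. The lastRunText field name is an assumption about how the downloaded control file's contents are passed in; the 24-hour fallback follows the description above:

```javascript
// Use the stored last-run time if the control file exists, otherwise
// fall back to "24 hours ago".
const items = $input.all();
const now = new Date();
const fallback = new Date(now.getTime() - 24 * 60 * 60 * 1000);

// Assume an earlier node placed the file's text content (if found) on json.lastRunText.
const lastRunText = items[0]?.json.lastRunText;
const since = lastRunText ? new Date(lastRunText) : fallback;

return [{
  json: {
    since: since.toISOString(),      // used to filter files by modifiedTime > since
    newLastRun: now.toISOString(),   // written back to n8n_last_run.txt afterwards
  },
}];
```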
by Milan Vasarhelyi - SmoothWork
What this workflow does
This workflow automates backend setup tasks for real estate client portals. When a new property transaction is added to your Google Sheets database with a buyer email but no document folder assigned, the workflow automatically creates a dedicated Google Drive folder, updates the spreadsheet with the folder URL, and adds an initial task prompting the client to upload documents.

This automation eliminates manual folder creation and task assignment, ensuring every new transaction has its documentation infrastructure ready from day one. Your clients can access their dedicated folder directly from the portal, keeping all property-related documents organized and accessible in one place.

Key benefits
- **Eliminate manual setup**: No more creating folders and tasks individually for each transaction
- **Consistent client experience**: Every buyer gets the same professional onboarding process
- **Organized documentation**: Each transaction has its own Google Drive folder automatically shared with the client
- **Time savings**: Focus on closing deals instead of administrative setup

Setup requirements
Important: You must make a copy of the reference Google Sheets spreadsheet to your own Google account before using this workflow. Your spreadsheet needs at minimum two tabs:
- **Transactions tab**: Columns for ID, Buyer Email, Documents URL, Property Address, and Status
- **Tasks tab**: Columns for Transaction ID, Task Name, Task Description, and Status

Configuration steps
1. Authenticate your Google Sheets and Google Drive accounts in n8n.
2. Update the Google Sheets trigger node to point to your copied spreadsheet.
3. Set the parent folder ID in the "Create Client Documents Folder" node (where transaction folders should be created).
4. Customize the initial task name and description in the "Add Initial Upload Task" node.
5. Verify all sheet names match your spreadsheet tabs.

The workflow triggers every minute, checking for new transactions that meet the criteria (has a buyer email, missing a documents URL), as sketched below.
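A minimal sketch of that filter as it could be written in an n8n Code node; the column names follow the Transactions tab described above, while the rest is an assumption:

```javascript
// Keep only transactions that have a buyer email but no documents folder yet.
const pending = $input.all().filter(item => {
  const row = item.json;
  const hasBuyerEmail = typeof row['Buyer Email'] === 'string' && row['Buyer Email'].trim() !== '';
  const hasFolder = typeof row['Documents URL'] === 'string' && row['Documents URL'].trim() !== '';
  return hasBuyerEmail && !hasFolder;
});

return pending;
```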
by Jannik Lehmann
GitLab Wrapped Generator ✨
Automatically generate your personalized GitLab Wrapped, a stunning year-in-review of your contributions, activity, and stats. Powered by gitlab-wrapped by @michaelangelorivera.

🚀 How it works
1. Forks the gitlab-wrapped project (or finds your existing fork)
2. Configures CI/CD environment variables
3. Triggers the GitLab pipeline (sketched below)
4. Monitors until completion (polls every 2 minutes)

🎉 Your wrapped will be available at: https://YOUR-USERNAME.gitlab.io/gitlab-wrapped

---

⚙️ Setup
1. Create a GitLab PAT with these scopes: api, read_repository, write_repository
2. Fill out the form:
   - Your GitLab username
   - Your PAT token
   - GitLab instance URL (defaults to gitlab.com)
   - Year (defaults to 2025)
3. Submit & relax! ☕ The workflow handles everything automatically.

---

💡 Works with GitLab.com and self-hosted instances
📅 Generate wrapped reports for any past year
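A rough plain-JavaScript sketch of the trigger-and-poll pattern against the GitLab REST API. The project ID, token, branch name, and polling details are placeholders, not the workflow's exact configuration:

```javascript
// Start a pipeline, then poll every 2 minutes until it finishes.
const base = 'https://gitlab.com/api/v4';
const headers = { 'PRIVATE-TOKEN': 'YOUR_GITLAB_PAT' };

// POST /projects/:id/pipeline creates a new pipeline on the given ref.
const created = await fetch(`${base}/projects/PROJECT_ID/pipeline?ref=main`, {
  method: 'POST',
  headers,
}).then(r => r.json());

let status = created.status;
while (!['success', 'failed', 'canceled'].includes(status)) {
  await new Promise(resolve => setTimeout(resolve, 2 * 60 * 1000));
  const pipeline = await fetch(`${base}/projects/PROJECT_ID/pipelines/${created.id}`, { headers })
    .then(r => r.json());
  status = pipeline.status;
}

console.log(`Pipeline ${created.id} finished with status: ${status}`);
```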
by Alysson Neves
Canvas: Send students their pending assignments

How it works
1. Trigger the workflow and set the Canvas base URL and target course name.
2. Fetch all instructor courses and locate the course ID that matches the name.
3. Retrieve enrolled students and their unsubmitted submissions for the course, handling paginated results (see the pagination sketch below).
4. Merge student records with submission data, convert due dates to local time, and build a per-student summary.
5. Send a Canvas conversation to each student with a personalized list of pending assignments and links.

Setup
- [ ] Connect Canvas API credentials (Bearer and header auth used by the workflow).
- [ ] Enter your Canvas base URL (e.g. https://your_educational_institution.instructure.com).
- [ ] Set the exact course name to check for pending work.
- [ ] Confirm the teacher account can view students and send conversations.
- [ ] Run the workflow manually to verify output and delivery.
- [ ] Edit the message subject or body template if you need different wording.
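A plain-JavaScript sketch of how Canvas pagination is typically handled: the API returns the next page URL in the Link response header as `<url>; rel="next"`. The base URL, course ID, endpoint, and token below are placeholders, not the workflow's exact node settings:

```javascript
// Follow rel="next" Link headers until there are no more pages.
const headers = { Authorization: 'Bearer YOUR_CANVAS_TOKEN' };
let url = 'https://your_educational_institution.instructure.com/api/v1/courses/COURSE_ID/users?enrollment_type[]=student&per_page=100';
const students = [];

while (url) {
  const response = await fetch(url, { headers });
  students.push(...(await response.json()));

  const link = response.headers.get('link') || '';
  const next = link.split(',').find(part => part.includes('rel="next"'));
  url = next ? next.match(/<([^>]+)>/)[1] : null;
}

console.log(`Fetched ${students.length} students`);
```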
by 福壽一貴
Who Is This For
Marketing teams, social media managers, and brand strategists who want to understand competitor visual strategies across multiple platforms. Perfect for agencies managing multiple client accounts or brands looking to benchmark their visual content.

What This Template Does
This workflow automates competitive visual intelligence gathering across Instagram and TikTok using AI-powered image analysis:
1. Collects recent posts from your account and up to 3 competitors via a simple form interface
2. Routes content to the appropriate Apify scrapers based on the selected platforms
3. Filters and processes image content from the scraped posts
4. Analyzes each image using GPT-4o Vision to extract color palettes, composition styles, mood/emotion, and text design elements (the underlying call is sketched below)
5. Generates a comprehensive competitive analysis report with actionable recommendations
6. Logs all analysis results to Google Sheets for historical tracking

Requirements
- **Apify account** with API access (for Instagram and TikTok scraping)
- **OpenAI API key** with GPT-4o access
- **Google Sheets** connected for logging results

How to Set Up
1. Configure Apify credentials: connect your Apify account in the credential settings.
2. Add your OpenAI API key: enter it in the "Workflow Configuration" node (replace YOUR_OPENAI_API_KEY).
3. Set up Google Sheets: create a new spreadsheet with the columns timestamp, own_account, competitors, platforms, posts_analyzed, summary, then update the "Log Results to Google Sheets" node with your document ID.
4. Activate the workflow and access the form via the provided webhook URL.

How to Customize
- **Adjust analysis depth**: Modify the postsCount variable to analyze more or fewer posts per account
- **Customize AI prompts**: Edit the prompt in "Analyze Images with GPT-4o Vision" to extract different visual attributes
- **Add more platforms**: Extend the Platform Router to include additional social networks
- **Change report format**: Modify the prompt in "Generate Competitive Analysis Report" for different output structures
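A plain-JavaScript sketch of one per-image GPT-4o Vision call; the prompt wording and the image URL are illustrative, not the workflow's exact "Analyze Images with GPT-4o Vision" configuration:

```javascript
// Ask GPT-4o to describe the visual attributes of a single scraped post image.
const imageUrl = 'https://example.com/scraped-post-image.jpg';

const response = await fetch('https://api.openai.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Bearer YOUR_OPENAI_API_KEY',
  },
  body: JSON.stringify({
    model: 'gpt-4o',
    messages: [{
      role: 'user',
      content: [
        {
          type: 'text',
          text: 'Describe the color palette, composition style, mood/emotion, and text design elements of this post image.',
        },
        { type: 'image_url', image_url: { url: imageUrl } },
      ],
    }],
  }),
});

const result = await response.json();
console.log(result.choices[0].message.content);
```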
by vinci-king-01
Content Gap Analyzer with AI-Powered Competitor Intelligence

Overview
This comprehensive workflow automatically analyzes competitor content strategies and identifies content gaps in your market. Using advanced AI-powered scraping and analysis, it provides actionable insights for content planning, SEO optimization, and competitive advantage.

Key Features

🔍 AI-Powered Content Analysis
- Scrapes and analyzes competitor websites using ScrapeGraphAI
- Extracts comprehensive content metadata (titles, keywords, engagement metrics)
- Identifies trending topics and content formats
- Analyzes your existing content library for comparison

📊 Advanced Gap Identification
- Identifies topic gaps where competitors are active but you're not
- Discovers keyword opportunities with low competition
- Analyzes content format gaps (videos, guides, case studies)
- Calculates opportunity scores based on engagement and competition (a simplified example follows this entry)

🎯 SEO Strategy Mapping
- Maps primary, secondary, and long-tail keywords for each opportunity
- Analyzes search intent (informational, commercial, transactional)
- Identifies keyword clusters for pillar content strategies
- Provides SEO difficulty assessments

📅 Strategic Content Planning
- Generates detailed content plans with specifications
- Creates 6-month editorial calendars with production timelines
- Provides resource planning and workload analysis
- Includes success metrics and performance tracking

🤝 Team Collaboration
- Exports the complete editorial calendar to Google Sheets
- Enables real-time team collaboration and progress tracking
- Includes writer assignments and milestone management
- Provides performance analytics and ROI tracking

Workflow Steps
1. Weekly Content Analysis Trigger - automated weekly execution
2. Competitor Content Scraping - AI-powered analysis of multiple competitors
3. Your Content Library Analysis - comprehensive audit of existing content
4. Data Processing & Merging - normalizes and combines competitor data
5. Advanced Gap Identification - identifies opportunities using scoring algorithms
6. SEO Keyword Mapping - strategic keyword planning and clustering
7. Content Planning & Roadmap - detailed content specifications and timelines
8. Editorial Calendar Generation - production schedules and team assignments
9. Google Sheets Integration - team collaboration and tracking platform

Benefits
- **Competitive Intelligence**: Stay ahead of competitor content strategies
- **Data-Driven Decisions**: Make content decisions based on real market data
- **SEO Optimization**: Target high-opportunity keywords with low competition
- **Resource Efficiency**: Optimize content production based on opportunity scores
- **Team Productivity**: Streamlined editorial calendar and workflow management
- **Performance Tracking**: Monitor content success and ROI

Use Cases
- **Content Marketing Teams**: Strategic content planning and competitive analysis
- **SEO Specialists**: Keyword research and content gap identification
- **Digital Marketing Agencies**: Client content strategy development
- **E-commerce Businesses**: Product content and educational material planning
- **B2B Companies**: Thought leadership and industry content strategies

Technical Requirements
- **ScrapeGraphAI Integration**: For competitor content analysis
- **Google Sheets API**: For editorial calendar storage
- **Weekly Automation**: Scheduled execution for continuous monitoring
- **Data Processing**: Advanced algorithms for opportunity scoring

This workflow transforms competitive intelligence into actionable content strategies, helping you identify and capitalize on content opportunities that your competitors are missing.
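A toy version of the opportunity-scoring idea, written as an n8n Code node. The field names and the formula are assumptions to illustrate the concept, not the workflow's actual algorithm:

```javascript
// Rank topic gaps: favor topics with proven engagement and low competition.
const opportunities = $input.all().map(item => {
  const gap = item.json;                     // assumed shape: { topic, avgEngagement, competition, coveredByUs }
  const engagement = gap.avgEngagement ?? 0; // higher = more proven demand
  const competition = gap.competition ?? 1;  // 0–1 scale; lower = easier to rank for

  const opportunityScore = gap.coveredByUs
    ? 0                                      // already covered, so no gap
    : engagement * (1 - competition);

  return { json: { ...gap, opportunityScore } };
});

// Highest-opportunity topics first.
opportunities.sort((a, b) => b.json.opportunityScore - a.json.opportunityScore);
return opportunities;
```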
by vinci-king-01
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works
This workflow automatically analyzes property maintenance costs by scraping contractor websites and provides comprehensive budget planning and recommendations.

Key Steps
1. Scheduled Trigger - runs weekly to update maintenance cost data from multiple sources.
2. Multi-Source Scraping - uses ScrapeGraphAI to extract service data from plumbing, electrical, and HVAC contractor websites.
3. Cost Analysis - JavaScript nodes process and categorize services by price level and urgency (a simplified example follows below).
4. Service Comparison - compares providers within each category to find the best-rated and most cost-effective options.
5. Budget Planning - creates an annual budget with a quarterly breakdown and service scheduling recommendations.
6. Property Manager Alerts - formats comprehensive reports with budget summaries and actionable recommendations.

Set up steps
Setup time: 10–15 minutes
1. Configure ScrapeGraphAI credentials - add your ScrapeGraphAI API key for web scraping.
2. Customize contractor websites - update the URLs in the scraping nodes to target specific local contractor directories.
3. Adjust schedule frequency - modify the trigger timing based on how often you want cost updates.
4. Review budget parameters - customize the budget planning logic in the JavaScript nodes if needed.
5. Test the workflow - run it manually first to ensure all scraping and analysis nodes work correctly.

Technologies Used
- **ScrapeGraphAI** - for extracting structured data from contractor websites
- **JavaScript Code Nodes** - for data processing, cost analysis, and budget planning
- **Schedule Trigger** - for automated weekly execution
- **JSON Data Processing** - for structured data handling and analysis
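A simplified sketch of the cost-categorization step as an n8n Code node. The price thresholds and field names are assumptions for illustration, not the workflow's own parameters:

```javascript
// Bucket scraped contractor services by price level and flag urgent work.
const LOW_MAX = 150;     // USD, assumed threshold
const MEDIUM_MAX = 500;  // USD, assumed threshold

const categorized = $input.all().map(item => {
  const service = item.json;                 // assumed shape: { name, category, price, urgent }
  const price = Number(service.price) || 0;

  const priceLevel =
    price <= LOW_MAX ? 'low' :
    price <= MEDIUM_MAX ? 'medium' : 'high';

  return {
    json: {
      ...service,
      priceLevel,
      urgency: service.urgent ? 'schedule this quarter' : 'plan for annual budget',
    },
  };
});

return categorized;
```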
by isa024787bel
No description available