by Hossein Karami
Who's it for
Teams that track absences in Everhour and want a shared Google Calendar view for quick planning. Ideal for managers, HR/ops, and teammates who need instant visibility into approved time off.

What it does
- Pulls approved time-off from Everhour on a schedule
- Creates/updates one all-day event per day of absence in Google Calendar
- Removes stale events if a request changes or is canceled
- Uses a stable external key (`everhour:ASSIGNMENT_ID:YYYY-MM-DD`) to avoid duplicates

How it works
A Schedule Trigger runs periodically: the workflow fetches Everhour assignments, filters approved time-off, expands multi-day requests into single-day items, then searches by external key to either create or update events. Separate cleanup steps list calendar events and delete any that are no longer present in Everhour.

How to set up
1. In n8n, create an HTTP Header Auth credential for Everhour with header `X-Api-Key: {YOUR_EVERHOUR_API_KEY}`.
2. Add a Google Calendar OAuth credential.
3. Open the Config node and set your `calendarId` (e.g., team@group.calendar.google.com).
4. Enable the workflow and choose your schedule.

Requirements
- Everhour account with API access
- Google Calendar (workspace or personal)
- n8n Cloud or self-hosted

How to customize the workflow
- Adjust the schedule (hourly/daily).
- Filter by user or time-off type.
- Tweak the event title/description templates.
- Point to multiple calendars (duplicate the create/update branch per calendar).
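The expand-and-key step above can be sketched as a small Code-node function. This is a minimal illustration only; the field names (`startDate`, `endDate`, `userName`) are assumptions about the Everhour payload, not the template's exact code:

```javascript
// Sketch: expand one Everhour time-off assignment into one item per day,
// each carrying the stable external key `everhour:ASSIGNMENT_ID:YYYY-MM-DD`
// used to dedupe calendar events. Field names are illustrative.
function expandAssignment(assignment) {
  const days = [];
  const start = new Date(assignment.startDate + 'T00:00:00Z');
  const end = new Date(assignment.endDate + 'T00:00:00Z');
  for (let d = new Date(start); d <= end; d.setUTCDate(d.getUTCDate() + 1)) {
    const day = d.toISOString().slice(0, 10); // YYYY-MM-DD
    days.push({
      externalKey: `everhour:${assignment.id}:${day}`,
      date: day,
      user: assignment.userName,
    });
  }
  return days;
}
```

Because the key is deterministic, re-running the workflow finds the same key and updates the existing event instead of creating a duplicate.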
by David Olusola
Send Daily Motivational Quote to Slack

This workflow automatically posts an inspiring motivational quote to your Slack channel every morning at 8 AM. It uses the free ZenQuotes.io API (no API key required) to fetch quotes and delivers them to your team in Slack.

How It Works
1. Trigger at 8 AM: A Cron node runs daily at 8 AM (America/New_York timezone by default).
2. Fetch a Random Quote: The workflow sends an HTTP Request to the ZenQuotes.io API to retrieve a motivational quote.
3. Format the Message: A Code node structures the quote into a Slack-friendly message, adding styling, emojis, and the author's name.
4. Post to Slack: The Slack node sends the motivational message to your chosen Slack channel (default: #general).

Setup Steps
1. Connect Slack App
   - Go to api.slack.com → Create a new app.
   - Add OAuth scopes: chat:write, channels:read.
   - Install the app to your Slack workspace.
   - Copy credentials into n8n.
2. Configure Slack Channel
   - Default is #general. Update the Slack node if you want to post to another channel.
3. Adjust Timezone (Optional)
   - The workflow is set to the America/New_York timezone. Change it under Workflow → Settings → Timezone if needed.

Example Slack Output
Daily Motivation
"Success is not final, failure is not fatal: it is the courage to continue that counts." – Winston Churchill

Once enabled, your team will receive a motivational quote in Slack every morning at 8 AM: simple, automatic, and uplifting!
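The formatting Code node might look roughly like this. It assumes the ZenQuotes response shape `[{ q: "<quote>", a: "<author>" }]`; the exact styling in the template may differ:

```javascript
// Sketch of the "Format the Message" Code node: turn the ZenQuotes response
// into a Slack-friendly payload with a title, quote block, and author line.
function formatSlackMessage(apiResponse) {
  const { q, a } = apiResponse[0]; // q = quote text, a = author
  return {
    text: [
      ':sunny: *Daily Motivation* :sunny:',
      '',
      `> "${q}"`,
      `> – _${a}_`,
    ].join('\n'),
  };
}
```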
by Nitin Garg
How it works
1. Form Trigger accepts a question and optional settings (folder ID, search depth)
2. Cookie Validation checks whether the Skool session is still active
3. BuildId Extraction dynamically extracts Skool's build ID from the homepage
4. Keyword Extraction uses Claude Haiku to extract 1–2 search keywords
5. Multi-Page Search fetches 1–10 pages of Skool search results
6. Post Aggregation collects all posts with content and comments
7. AI Analysis uses Claude Sonnet to analyze the posts and answer your question
8. Google Doc Report creates a detailed research document in your Drive
9. HTML Response returns a beautiful summary page

Key Features
- Auto BuildId Detection – no manual updates when Skool changes
- Cookie Expiration Handling – clear error messages when the session expires
- Configurable Search Depth – search 1–10 pages (default: 5)
- Token Protection – limits content to control API costs
- Dual AI Models – Haiku for keywords (cheap), Sonnet for analysis (powerful)

Set up steps
Time required: 10–15 minutes
1. Get your Skool session cookie from your browser's DevTools
2. Get an Anthropic API key from console.anthropic.com
3. Set up a Google Docs OAuth2 credential in n8n
4. Create a Google Drive folder for research docs
5. Update the Config node with your values:
   - COOKIE – your Skool session cookie
   - ANTHROPIC_API_KEY – your Claude API key
   - DEFAULT_FOLDER_ID – your Google Drive folder ID
   - COMMUNITY – your Skool community slug

Who is this for?
- Members of Skool communities searching past discussions
- Community managers researching common questions
- Anyone building knowledge bases from Skool content

Estimated costs
- **Per search**: $0.02–0.10 (Claude Haiku + Sonnet)
- Skool cookies expire every 7–14 days (requires refresh)

Suggested Tags
skool, community, search, ai, claude, anthropic, google-docs, research, knowledge-base, form
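One common way a build ID can be pulled from a homepage is shown below. This is an illustrative sketch only, assuming a Next.js-style `"buildId":"..."` field embedded in the page's `__NEXT_DATA__` JSON; the template's actual extraction logic may differ:

```javascript
// Illustrative sketch of build-ID extraction from homepage HTML.
// Assumes the page embeds a `"buildId":"..."` field (Next.js convention);
// this is an assumption, not confirmed Skool internals.
function extractBuildId(html) {
  const match = html.match(/"buildId"\s*:\s*"([^"]+)"/);
  if (!match) {
    throw new Error('Build ID not found – page layout may have changed or the session cookie expired');
  }
  return match[1];
}
```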
by ObisDev
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

LinkedIn Automation Outreach Workflow Documentation

Inline Notes for Each Node

1. On form submission (Trigger Node – Manual Start)
Note: "Manual trigger to start the LinkedIn scraping and outreach process. This node initiates the workflow when you want to begin lead processing."

2. Scrape profiles from a linkedin search (HTTP Request/Browserflow Node)
Note: "Scrapes LinkedIn profiles based on search criteria (e.g., automation specialists in Lagos). Returns a JSON array with profile data including names, URLs, taglines, locations, and summaries. Uses the scrapeProfilesFromSearch.linkedinSearch() function."

3. Split Out1 (Split Out Node)
Note: "Converts the JSON array of profiles into individual items for processing. Each profile becomes a separate execution path. Field to split: 'data'. This enables personalized message generation for each contact."

4. Limit (Limit Node)
Note: "Controls batch size for processing (currently set to 3 items). Prevents overwhelming the AI agent and helps with rate limiting. Adjust max items based on your subscription limits and testing needs."

5. AI Agent (LangChain AI Agent Node)
Note: "Generates personalized LinkedIn and email outreach messages from profile data. Uses the Groq Chat Model (llama3-8b-8192) for cost-effective text generation. Input: individual profile data. Output: structured JSON with personalized messages. The system prompt focuses on networking rather than sales."

6. Code1 (JavaScript Code Node)
Note: "Processes AI-generated messages and formats data for LinkedIn automation. Extracts the connection message and profile URL, and adds automation parameters. Includes error handling for malformed AI responses and random delay generation. Prepares a data structure compatible with Browserflow LinkedIn automation."

7. Send a linkedin message1 (Browserflow/HTTP Node)
Note: "Automates LinkedIn connection requests with personalized messages. Uses formatted data from the Code1 node, including the target URL and message content. Includes built-in delays and retry logic to avoid LinkedIn rate limiting. Warning: currently shows an error – check the Browserflow configuration and credentials."

Workflow Architecture Overview
Flow Type: Sequential processing with batch control
Purpose: Automated LinkedIn networking outreach for automation professionals
Target Audience: Lagos-based automation specialists and similar professionals

Detailed Workflow Description

LinkedIn Automation Outreach Workflow for Networking
This n8n workflow automates the process of discovering, analyzing, and reaching out to potential networking contacts in the automation industry. It is designed for automation agency owners and professionals looking to build meaningful connections within their local tech community.

Workflow Process:

Stage 1: Data Collection
The workflow begins with a manual trigger that initiates a LinkedIn profile scraping operation. It searches for automation specialists, particularly in the Lagos tech ecosystem, targeting professionals with expertise in tools like n8n, Make.com, AI automation, and workflow optimization.

Stage 2: Data Processing & Segmentation
Once the profile data is collected, the Split Out node transforms the bulk JSON response into individual processing items, enabling personalized treatment of each contact. The Limit node provides batch control, allowing you to test with smaller groups (currently 3 profiles) before scaling to larger datasets.

Stage 3: AI-Powered Personalization
The AI Agent is the workflow's intelligence core, using Groq's LLaMA model for cost-effective, high-quality text generation.
Each profile receives a customized analysis that identifies:
- Specific technical expertise and tools
- Geographic and industry connections
- Potential collaboration opportunities
- Shared professional interests

The AI generates both LinkedIn connection messages and email alternatives, giving you multiple touchpoint options. Messages focus on genuine networking value rather than sales pitches, emphasizing knowledge sharing, collaboration opportunities, and community building.

Stage 4: Message Optimization & Formatting
The JavaScript Code node is the workflow's data orchestrator, transforming AI-generated content into automation-ready formats. It handles:
- Response validation and error recovery
- LinkedIn-specific message formatting
- Automation parameter injection (delays, retry logic)
- Fallback email preparation
- Metadata tracking for campaign analysis

Stage 5: Automated Outreach Execution
The final Browserflow integration automates the actual LinkedIn connection process. It navigates to each target profile, sends personalized connection requests, and implements intelligent delays to maintain LinkedIn compliance. Built-in error handling keeps the workflow resilient even when individual requests fail.
Key Features:
- **Intelligent Batch Processing**: Controlled processing prevents rate limiting
- **Dual-Channel Approach**: LinkedIn + email backup ensures message delivery
- **Geographic Targeting**: Lagos-focused networking for local community building
- **AI-Driven Personalization**: Each message uniquely crafted based on profile analysis
- **Error Resilience**: Comprehensive error handling maintains workflow stability
- **Compliance-First Design**: Built-in delays and limits respect platform policies

Use Cases:
- Building local automation professional networks
- Identifying potential collaboration partners
- Market research on automation service providers
- Community building for tech meetups and events
- Knowledge-sharing network development

Technical Specifications:
- **Model**: Groq LLaMA3-8B for cost-effective AI generation
- **Processing Capacity**: 3-item batches (scalable)
- **Message Types**: LinkedIn connections + email alternatives
- **Automation Platform**: Browserflow for LinkedIn interaction
- **Error Handling**: Multi-layer validation and recovery
- **Personalization Depth**: 3–5 specific talking points per contact

This workflow balances efficiency with authentic relationship building. It is particularly valuable for automation professionals who value genuine connections over mass outreach tactics.
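The Code1 step described above might be sketched as follows. Field names such as `connectionMessage` are assumptions about the AI agent's output schema, not the template's exact code:

```javascript
// Sketch of the Code1 node: parse the AI agent's JSON reply, recover
// gracefully from malformed output, and attach a random delay for
// rate-limit-friendly sending. Shapes are illustrative.
function prepareOutreachItem(aiOutput, profileUrl) {
  let message;
  try {
    message = JSON.parse(aiOutput).connectionMessage;
  } catch (err) {
    message = null; // malformed AI response – flag for manual review
  }
  return {
    profileUrl,
    message,
    needsReview: message === null,
    delaySeconds: 30 + Math.floor(Math.random() * 90), // 30–120 s between sends
  };
}
```

The random delay is one simple way to make automated sends look less burst-like to rate limiters.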
by Harshil Agrawal
This workflow allows you to add, commit, and push changes to a git repository.
- Git node: adds the README.md file to the staging area. To add a different file, enter that file's path instead.
- Git1 node: commits all the changes that were added to the staging area by the previous node.
- Git2 node: returns the commit logs of your repository.
- Git3 node: pushes the changes to a remote repository.
by Harshil Agrawal
This workflow allows you to create, update, and get an item from Webflow.
- Webflow node: creates a new item in the Team Members collection in Webflow. To use a different collection, select that collection instead.
- Webflow1 node: updates the item created by the previous node.
- Webflow2 node: retrieves the information of the item created earlier.
by Yang
Who is this for?
This workflow is perfect for content strategists, SEO specialists, marketing agencies, and virtual assistants who need to quickly audit and collect blog content from client websites into a structured Google Sheet without manual crawling and copy-pasting.

What problem is this workflow solving?
Manually visiting a website, finding blog posts, and copying content into a spreadsheet is time-consuming and error-prone. This workflow automates the process: it crawls a website, filters only blog-related pages, scrapes the article content, and stores everything neatly in Google Sheets for easy analysis and content-strategy planning.

What this workflow does
The workflow starts when a client submits their website URL through a form. A Google Sheet is automatically created and headers are added for organizing the audit. Dumpling AI then crawls the website to discover all available pages, while the automation filters out only blog-related URLs. Each blog page is scraped for content, and the structured results (URL, crawled page, and website content) are appended row by row to the Google Sheet.

Nodes Overview
- Form Trigger → Form Submission (Client URL): Captures the client's website URL to start the workflow.
- Google Sheets → Create Blog Audit Sheet: Creates a new Google Sheet with a title based on the submitted URL.
- Set → Set Sheet Headers: Defines the headers: Url, Crawled_pages, website_content.
- Code → Format Header Row: Formats the headers before sending them to the sheet.
- HTTP Request → Insert Headers into Sheet: Updates the Google Sheet with the prepared header row.
- HTTP Request → Dumpling AI: Crawl Website: Crawls the submitted URL to discover internal pages.
- Code → Extract Blog URLs: Filters the crawl results and keeps only URLs that match common blog patterns (e.g., /blog/, /articles/, /posts/).
- HTTP Request → Dumpling AI: Scrape Blog Pages: Scrapes the text content from each filtered blog page.
- Set → Prepare Row Data: Maps the URL, blog page link, and scraped content into structured fields.
- Google Sheets → Save Blog Data to Google Sheets: Appends the structured data to the audit sheet row by row.

Notes
- Set up Dumpling AI and generate your API key from: Dumpling AI.
- Google Sheets must be connected with write permissions enabled.
- You can change the crawl depth or limit (currently set to 10 pages) in the Dumpling AI: Crawl Website node.
- The Extract Blog URLs node uses regex patterns to detect blog content. You can customize these patterns to match your website's URL structure.
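The blog-URL filter can be sketched as a few regex patterns. The three patterns below come from the description (/blog/, /articles/, /posts/); extend the list to match your own site's structure:

```javascript
// Sketch of the "Extract Blog URLs" Code node: keep only URLs whose path
// matches a common blog pattern. Patterns are easy to customize.
const BLOG_PATTERNS = [/\/blog(\/|$)/i, /\/articles?(\/|$)/i, /\/posts?(\/|$)/i];

function filterBlogUrls(urls) {
  return urls.filter((url) => BLOG_PATTERNS.some((re) => re.test(url)));
}
```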
by Harshil Agrawal
This workflow allows you to create, update, and retrieve a record from FileMaker.
- FileMaker node: creates a new record in FileMaker.
- FileMaker1 node: adds a new field to the record created by the previous node.
- FileMaker2 node: gets the information about the record created earlier.
by n8n Team
This workflow sends a file to a Notion database of your choosing when a new file is created in a specific Google Drive folder.

Prerequisites
- Notion account and Notion credentials.
- Google account and Google credentials.
- A Google Drive folder to monitor for new files.

How it works
When a file is created in the Google Drive folder you specified, the workflow sends it to the Notion database you created. The On file upload node triggers the workflow when a new file is created in the folder, and the Create database page node creates a new page in the Notion database.

Setup
1. Create a Notion database called "My Google Drive Files" with the following columns: Filename, Google Drive File.
2. Share the database with n8n.
3. In the n8n workflow, click on the Create database page node and select the database you created in step 1.
4. In Google Drive, create a folder and navigate to it.
5. Copy the URL of the Google Drive folder you are currently in.
6. In the n8n workflow, add the folder URL to the On file upload node.
by Eumentis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

What It Does
This workflow automatically discovers recently seed-funded startups by monitoring RSS feeds for funding announcements. It uses Bright Data to scrape full article content, then extracts structured company information using OpenAI (GPT). The data is exported to an Excel sheet on OneDrive, giving sales teams a real-time list of qualified leads without any manual effort.

How It Works
1. Trigger & Article Discovery: Monitors curated RSS feeds for articles mentioning seed funding and triggers the workflow when a new article is detected.
2. Content Scraping & Preparation: Scrapes the full article content and converts it into clean markdown for AI processing.
3. Data Extraction with AI: Uses OpenAI to extract structured details such as company name, website, LinkedIn profile, founders, and funding amount.
4. Structured Data Output & Storage: Appends the extracted data to an Excel sheet on OneDrive via the Microsoft Graph API.

Prerequisites
- **RSS Feed URL**: A valid RSS feed source that provides seed-funding articles for startups.
- **Bright Data Credentials**: An active Bright Data account with access credentials (API token) to enable article scraping.
- **OpenAI API Key**: An OpenAI account with an API key and access to the gpt-4.1-mini model for data extraction.
- **Microsoft OAuth2 API Credentials**: OAuth2 credentials (Client ID, Secret, Tenant ID) with the scopes needed to use the Microsoft Graph API for Excel integration.
- **Excel Sheet in SharePoint**: A pre-created Excel file hosted on OneDrive or SharePoint with the following column headers: createdAt, companyName, companyWebsite, companyLinkedIn, fundingAmount, founderName, founderLinkedIn, articleLink.
- **Excel File & Sheet Identifiers**: The Drive ID, File ID, and Sheet ID of your Excel sheet stored on OneDrive or SharePoint, required by the Microsoft Graph API for appending rows via the HTTP node in n8n.

Need help with the setup?
Feel free to contact us.

How to Set It Up
Follow these steps to configure and run the workflow:

1. Import the Workflow
   - Copy the provided n8n workflow template.
   - In your n8n instance, go to the Editor UI and paste the workflow.
2. Configure the RSS Feed Node
   - Open the RSS trigger node.
   - Replace the default URL with your RSS feed URL.
   - Ensure the polling interval matches your desired frequency (e.g., every 15 minutes or 1 hour).
3. Set Up the Bright Data Node
   - Add your Bright Data credentials.
   - Follow the documentation to complete the setup.
4. Configure the OpenAI Integration
   - Add your OpenAI API key as a credential in n8n.
   - Ensure the model is set to gpt-4.1-mini.
   - Follow the documentation to complete the setup.
5. Configure the Excel File Integration
   - Open the HTTP node responsible for sending data to the Excel sheet via the Microsoft Graph API.
   - Replace the placeholder values in the API endpoint URL with the actual Drive ID, File ID, and Sheet ID of your Excel file stored on OneDrive or SharePoint:
     https://graph.microsoft.com/v1.0/drives/{{drive-id}}/items/{{file-id}}/workbook/tables/{{sheet-id}}/rows
     This URL is used to append data to the specified Excel sheet.
   - Next, set up Microsoft OAuth2 credentials in n8n: go to n8n > Credentials > Microsoft OAuth2 API and provide the required values: Client ID, Client Secret, Tenant ID, Scope. Follow the documentation to complete the setup.
   - Once the credential is saved, connect it to the HTTP node making the Graph API call.
6. Activate the Workflow
   - Set the workflow status to Active in n8n so it runs automatically when a new article appears in the RSS feed.

Need Help?
Contact us for support and custom workflow development.
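The append-row call the HTTP node makes can be sketched as below. The IDs, token handling, and column order are placeholders you must match to your own sheet; this is a hedged illustration of the Graph API pattern, not the template's exact node configuration:

```javascript
// Sketch: build a one-row payload in the sheet's header order, then POST it
// to the Graph API table-rows endpoint. Requires a valid OAuth2 bearer token.
function buildRowPayload(lead) {
  return {
    values: [[
      new Date().toISOString(), // createdAt
      lead.companyName,
      lead.companyWebsite,
      lead.companyLinkedIn,
      lead.fundingAmount,
      lead.founderName,
      lead.founderLinkedIn,
      lead.articleLink,
    ]],
  };
}

async function appendRow(token, driveId, fileId, tableId, lead) {
  const url = `https://graph.microsoft.com/v1.0/drives/${driveId}/items/${fileId}/workbook/tables/${tableId}/rows`;
  const res = await fetch(url, {
    method: 'POST',
    headers: { Authorization: `Bearer ${token}`, 'Content-Type': 'application/json' },
    body: JSON.stringify(buildRowPayload(lead)),
  });
  if (!res.ok) throw new Error(`Graph API error: ${res.status}`);
  return res.json();
}
```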
by EmailListVerify
How to scrape emails from websites

This workflow will:
- Try to find emails by scraping the website via HTTP request
- If no result is found, use the EmailListVerify email finder API to guess an email address

Scraping emails via HTTP request is a cost-effective way to find email addresses, so it can save you a few bucks to try it before calling any email finder API.

Who it's for
This workflow helps you transform a list of websites into a list of leads with email addresses, making it handy for any lead-generation specialist. Note that it will usually return only generic emails like "contact@". Those generic emails are useful when you target small businesses, where the owner usually monitors them. However, I don't advise using this workflow to target enterprise customers.

Requirements
To use this workflow, you will need to:
- Copy this Google Sheet template
- Get an API key for EmailListVerify

Then edit the setup of the 3 stages highlighted with a yellow sticky note, and you will be good to go.
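The scrape-first step boils down to pulling email-looking strings out of a page's HTML. A minimal sketch (regex matching is approximate; the template's own extraction may differ):

```javascript
// Sketch: extract and dedupe email-looking strings from raw HTML,
// before falling back to the EmailListVerify finder API.
function extractEmails(html) {
  const re = /[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}/g;
  return [...new Set(html.match(re) || [])];
}
```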
by Muhammad Sajid
TruePeopleSearch Scraper for Skip Tracers

Enrich any list of people with verified contact info using this workflow. This n8n automation scrapes TruePeopleSearch using Zyte's extraction API to safely bypass bot protection and extract detailed profiles. It's built for data brokers, skip tracers, and real estate professionals who need clean contact data (phone, email, address) from names alone, even when the main profile is empty. If the original profile lacks a phone number, the workflow intelligently scrapes one of the listed relatives instead, giving you the best possible chance of finding a valid number.

What this workflow does
- Pulls lead data (first name, last name, and custom search URL) from a Google Sheet
- Sends the TruePeopleSearch search URL to Zyte's Scraping API to retrieve the search results HTML
- Parses the first matching profile link from the results (by full name > first name > last name)
- Visits that profile page and extracts:
  - Full Name
  - Age / Date of Birth
  - Primary Phone Number
  - Other Phone Numbers
  - Email Addresses
  - Current Address
- If no phone numbers are found:
  - Detects a relative's profile link
  - Scrapes the relative's profile for fallback contact data
- Writes all scraped information (or empty fields) back to the same row in Google Sheets

You'll need
- **n8n (self-hosted or cloud)**: To run and automate the workflow
- **Zyte Scraping API**: A Zyte account + API key to access their /extract endpoint (use HTTP Basic Auth in the HTTP Request node)
- **Google Sheets integration**: Your own lead sheet with headers like: row_number (used to write back to the correct row), First Name, Last Name, SearchURL (Search by Address)
- **Basic JavaScript familiarity (optional)**: To tweak the HTML parsing logic when the profile structure changes

Example Google Sheet
Use this Google Sheet as a template for your inputs and outputs: TruePeopleSearch Lead Template (Google Sheet)

Disclaimer
- TruePeopleSearch may change its structure or block heavy scraping; always test at small scale first
- This workflow is built to simulate human behavior via Zyte's smart rendering; scraping is still subject to site limitations
- Use ethically and within your local data-usage laws

Categories
Data Enrichment · Scraping Automation · Lead Generation · Skip Tracing

Feel free to drop me an email at sajid@marketingbyprof.com if you need help building a custom scraping automation for your business.
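The match-priority step (full name > first name > last name) can be sketched as below. The result shape `{ url, name }` and the matching heuristics are illustrative assumptions, not the template's exact parsing code:

```javascript
// Sketch of profile-link selection: prefer an exact full-name match,
// then a first-name match, then a last-name match; null if nothing matches.
function pickProfile(results, firstName, lastName) {
  const full = `${firstName} ${lastName}`.toLowerCase();
  const byFull = results.find((r) => r.name.toLowerCase() === full);
  if (byFull) return byFull;
  const byFirst = results.find((r) => r.name.toLowerCase().startsWith(firstName.toLowerCase()));
  if (byFirst) return byFirst;
  return results.find((r) => r.name.toLowerCase().includes(lastName.toLowerCase())) || null;
}
```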