by Harshil Agrawal
This workflow stores the output of a phantom in Airtable. It uses the LinkedIn Profile Scraper phantom; configure and launch this phantom from your Phantombuster account before executing the workflow.

The workflow uses the following nodes:

- **Phantombuster node**: Gets the output of the LinkedIn Profile Scraper phantom that ran earlier. You can select a different phantom from the Agent dropdown list, but make sure to configure the workflow accordingly.
- **Set node**: Sets the data for the workflow. The data set in this node is used by the subsequent nodes. Modify this node based on your use case.
- **Airtable node**: Appends the data to an Airtable base. Based on your use case, you can replace this node with any other node. Instead of storing the data in Airtable, you could store it in a database or a Google Sheet, or send it as an email using the Send Email node, Gmail node, or Microsoft Outlook node.
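The Set step above can be sketched as a small mapping function, as you might write it in an n8n Code node. The field names here are illustrative assumptions; the actual LinkedIn Profile Scraper output may use different keys, so adjust the mapping to match your phantom's result schema and your Airtable columns.

```javascript
// Hypothetical equivalent of the Set step: map a Phantombuster profile
// record onto the Airtable columns. All field names are assumptions.
function mapProfileToRow(profile) {
  return {
    Name: profile.fullName ?? '',
    Headline: profile.headline ?? '',
    Company: profile.company ?? '',
    ProfileUrl: profile.profileUrl ?? '',
  };
}

const row = mapProfileToRow({
  fullName: 'Ada Lovelace',
  headline: 'Engineer',
  company: 'Analytical Engines',
  profileUrl: 'https://www.linkedin.com/in/ada',
});
```

Keeping the mapping in one place makes it easy to swap Airtable for another destination later, since only the column names change.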
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically gathers and analyzes feature requests from multiple sources, including support tickets, user forums, and feedback platforms, to help prioritize product development. It saves you time by eliminating the need to manually monitor various channels and provides intelligent feature request analysis.

**Overview**

This workflow automatically scrapes support systems, user forums, social media, and feedback platforms to collect feature requests from customers. It uses Bright Data to access various platforms without being blocked and AI to intelligently categorize, prioritize, and analyze feature requests based on frequency and user impact.

**Tools Used**

- **n8n**: The automation platform that orchestrates the workflow
- **Bright Data**: For scraping support platforms and user forums without being blocked
- **OpenAI**: AI agent for intelligent feature request categorization and analysis
- **Google Sheets**: For storing feature requests and generating prioritization reports

**How to Install**

1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your feature request tracking spreadsheet
5. Customize: Define feedback sources and feature request identification parameters

**Use Cases**

- **Product Management**: Prioritize roadmap items based on customer demand
- **Development Teams**: Understand which features users need most
- **Customer Success**: Track and respond to feature requests proactively
- **Strategy Teams**: Make data-driven decisions about product direction

**Connect with Me**

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #featurerequests #productmanagement #brightdata #webscraping #productdevelopment #n8nworkflow #workflow #nocode #roadmapping #customervoice #productinsights #featureanalysis #productfeedback #userresearch #productdata #featuretracking #productplanning #customerneeds #featurediscovery #productprioritization #featurebacklog #uservoice #productintelligence #developmentplanning #featuremonitoring #productdecisions #feedbackgathering #productautomation
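The "prioritize by frequency and user impact" idea can be sketched as a small scoring function. This is a minimal illustration, not the workflow's actual logic; the scoring formula and the 1-3 impact scale are assumptions for the example.

```javascript
// Minimal sketch of frequency × impact prioritization: group requests by
// feature, count mentions, keep the highest reported impact, and rank by
// count * impact. Scale and formula are illustrative assumptions.
function prioritize(requests) {
  const buckets = new Map();
  for (const r of requests) {
    const b = buckets.get(r.feature) ?? { feature: r.feature, count: 0, impact: 0 };
    b.count += 1;
    b.impact = Math.max(b.impact, r.impact); // impact on a 1-3 scale
    buckets.set(r.feature, b);
  }
  return [...buckets.values()]
    .map((b) => ({ ...b, score: b.count * b.impact }))
    .sort((a, b) => b.score - a.score);
}

const ranked = prioritize([
  { feature: 'dark mode', impact: 1 },
  { feature: 'dark mode', impact: 2 },
  { feature: 'SSO', impact: 3 },
]);
```

In the workflow itself, the per-request impact labels would come from the OpenAI categorization step rather than being hand-assigned.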
by Davide
**How it Works**

This workflow automates the process of handling incoming emails: summarizing their content, generating appropriate responses, and obtaining approval before sending replies. The key operational steps:

1. **Email Reception and Summarization**: The workflow starts with an Email Trigger (IMAP) node that listens for new emails in a specified inbox. Once an email is received, its HTML content is converted to plain text by a Markdown node if necessary, followed by an Email Summarization Chain node, which uses AI to create a concise summary of the email's content with prompts tailored for this purpose.
2. **Response Generation and Approval**: A Write email node generates a professional response based on the summarized content, using predefined templates and guidelines such as keeping responses under 100 words and formatting them correctly in HTML. Before any automated reply goes out, the system sends the draft via Gmail for human review through a Gmail node configured with double-approval settings. If approved (Approve?), the finalized email is sent back to the original sender using the Send Email node; otherwise, it loops back for further edits or manual intervention.

**Set Up Steps**

To replicate this workflow in your own n8n environment:

1. **Configuration**: Set up an n8n instance, either locally or via the cloud service offered on the official site. Import the provided JSON configuration file into your workspace, making sure all required credentials (IMAP, SMTP, OpenAI API keys, etc.) are properly set up under the Credentials section, since multiple nodes rely on external integrations for reading emails, generating summaries, crafting replies, and managing approvals.
2. **Customization**: Adjust parameters according to your business needs, including the conditions used in the conditional checks performed by nodes like Approve?. Modify the template messages given to the AI models so they align with your organization's tone and style while maintaining the professionalism expected in business communications. Ensure correct field mappings when appending data to external systems such as Google Sheets, where records may need tracking after an interaction completes.
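The "under 100 words" guideline mentioned above can be enforced with a small guard before the draft reaches the approval step, for example in an n8n Code node. The word limit comes from the description; the tag-stripping approach is an illustrative assumption.

```javascript
// Sketch of a draft-length guard: strip HTML tags, count words, and flag
// drafts that exceed the limit so they can loop back for editing.
function checkDraft(html, maxWords = 100) {
  const text = html.replace(/<[^>]*>/g, ' '); // naive tag strip, fine for simple drafts
  const words = text.split(/\s+/).filter(Boolean);
  return { wordCount: words.length, withinLimit: words.length <= maxWords };
}

const result = checkDraft('<p>Thanks for reaching out. We will reply soon.</p>');
```

A failed check could route the item back to the Write email node with a note asking the model to shorten the reply.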
by Oneclick AI Squad
This automated n8n workflow performs weekly forecasting of restaurant sales and raw material requirements using historical data from Google Sheets and AI predictions powered by Google Gemini. The forecast is then emailed to stakeholders for efficient planning and waste reduction.

**What is Google Gemini AI?**

Google Gemini is an advanced AI model that analyzes historical sales data, seasonal patterns, and market trends to generate accurate forecasts for restaurant sales and inventory requirements, helping optimize purchasing decisions and reduce waste.

**Good to Know**

- Google Gemini AI forecasting accuracy improves over time with more historical data
- Weekly forecasting provides better strategic planning than daily predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications ensure timely review of weekly forecasts by stakeholders
- The system analyzes trends and predicts upcoming needs for efficient planning and waste reduction

**How It Works**

1. **Trigger Weekly Forecast** - Automatically starts the workflow every week at a scheduled time
2. **Load Historical Sales Data** - Pulls weekly sales and material usage data from Google Sheets
3. **Format Input for AI Agent** - Transforms raw data into a structured format suitable for the AI Agent
4. **Generate Forecast with AI** - Uses Gemini AI to analyze trends and predict upcoming needs
5. **Interpret AI Forecast Output** - Parses the AI's response into readable, usable JSON
6. **Log Forecast to Google Sheets** - Stores the new forecast data back in a Google Sheet
7. **Email Forecast Summary** - Sends a summary of the forecast via Gmail for stakeholder review

**Data Sources**

The workflow uses Google Sheets as the primary data source:

- **Historical Sales Data Sheet** - Weekly sales and inventory data with columns: Week/Date (date), Menu Item (text), Sales Quantity (number), Revenue (currency), Raw Material Used (number), Inventory Level (number), Category (text)
- **Forecast Output Sheet** - AI-generated predictions with columns: Forecast Week (date), Menu Item (text), Predicted Sales (number), Recommended Inventory (number), Material Requirements (number), Confidence Level (percentage), Notes (text)

**How to Use**

1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up Gmail credentials for forecast report delivery
4. Create the required Google Sheets with the specified column structures
5. Configure Google Gemini AI API credentials
6. Test with sample historical sales data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

**Requirements**

- Google Sheets API access
- Gmail API credentials
- Google Gemini AI API credentials
- Historical sales and inventory data for initial training

**Customizing This Workflow**

Modify the Generate Forecast with AI node to focus on specific menu categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences, and add further data sources, such as supplier information, weather data, or a special events calendar, for more accurate predictions.
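The Format Input for AI Agent step can be sketched as grouping raw sheet rows by menu item so Gemini receives a compact per-item history. The column names follow the Historical Sales Data Sheet structure above; everything else in this snippet is an illustrative assumption.

```javascript
// Sketch of "Format Input for AI Agent": group sheet rows by menu item
// into a compact weekly history. Keys match the sheet columns above.
function formatForAgent(rows) {
  const byItem = {};
  for (const r of rows) {
    (byItem[r['Menu Item']] ??= []).push({
      week: r['Week/Date'],
      sales: r['Sales Quantity'],
      materialUsed: r['Raw Material Used'],
    });
  }
  return byItem;
}

const payload = formatForAgent([
  { 'Week/Date': '2024-06-03', 'Menu Item': 'Margherita', 'Sales Quantity': 120, 'Raw Material Used': 30 },
  { 'Week/Date': '2024-06-10', 'Menu Item': 'Margherita', 'Sales Quantity': 135, 'Raw Material Used': 34 },
]);
```

Serializing this grouped structure into the prompt (rather than raw rows) keeps the token count down and makes per-item trends easier for the model to follow.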
by Jonathan
This workflow uses a Hubspot Trigger to check for new contacts. It then validates each contact's email using OneSimpleAPI. If there are any new contacts, a message is sent to Slack. To configure this workflow, you will need to set the credentials for the Hubspot, OneSimpleAPI, and Slack nodes. You will also need to select the Slack channel to use for sending the message.
by Lorena
This workflow is triggered when a meeting is scheduled via Calendly. Then, an activity is automatically created in Pipedrive and 15 minutes after the end of the meeting, a message is sent to the interviewer in Slack, reminding them to write down their notes and insights from the meeting.
by Obsidi8n
**How it Works**

This n8n template makes it possible to send emails directly from your Obsidian notes. It leverages the Obsidian Post Webhook plugin, allowing seamless integration between your notes and the email workflow.

What it does:

- Receives note content and metadata from Obsidian via a Webhook.
- Parses YAML frontmatter to define email recipients, subject, and more.
- Automatically processes attachments, encoding them into an email-friendly format.
- Sends emails via Gmail and confirms the status back to Obsidian.
- Includes a testing feature to verify everything works before going live.

**Set-up Steps**

1. **Webhook Configuration**: Set your n8n POST Webhook URL in the Obsidian Post Webhook plugin settings.
2. **Email Integration**: Add your Gmail credentials to the n8n email nodes.
3. **Test the Workflow**: Run a test from Obsidian to ensure the template functions correctly.
4. **Activate and Enjoy**: Start sending customized emails with attachments from your notes in no time!
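The frontmatter-parsing step can be sketched as below, assuming simple `key: value` YAML between `---` fences (no nesting). In the real workflow a proper YAML parser would be safer; the `to` and `subject` keys are assumptions about how the plugin's payload is structured.

```javascript
// Minimal sketch: split a note into YAML frontmatter (email metadata)
// and body. Handles only flat key: value pairs between --- fences.
function parseFrontmatter(note) {
  const m = note.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!m) return { meta: {}, body: note };
  const meta = {};
  for (const line of m[1].split('\n')) {
    const i = line.indexOf(':');
    if (i > 0) meta[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  }
  return { meta, body: m[2] };
}

const { meta, body } = parseFrontmatter(
  '---\nto: someone@example.com\nsubject: Hello\n---\nNote body here.'
);
```

Notes without frontmatter fall through cleanly: the whole note becomes the body and the metadata object stays empty, which the workflow could treat as "missing recipient" and skip.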
by Yaron Been
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors local event platforms (Eventbrite, Meetup, Facebook Events) and aggregates upcoming events that match your criteria. Never miss a networking or sponsorship opportunity again.

**Overview**

A scheduled trigger scrapes multiple event sites via Bright Data, filtering by location, date range, and keywords. OpenAI classifies each event (conference, meetup, workshop) and extracts key details such as venue, organizers, and ticket price. Updates are posted to Slack and archived in Airtable for quick lookup.

**Tools Used**

- **n8n** – Core automation engine
- **Bright Data** – Reliable multi-site scraping
- **OpenAI** – NLP-based event categorization
- **Slack** – Delivers daily event digests
- **Airtable** – Stores enriched event records

**How to Install**

1. Import the Workflow: Add the .json file to n8n.
2. Configure Bright Data: Provide your account credentials.
3. Set Up OpenAI: Insert your API key.
4. Connect Slack & Airtable: Authorize both services.
5. Customize Filters: Edit the initial Set node to adjust city, radius, and keywords.

**Use Cases**

- **Community Managers**: Curate a calendar of relevant events.
- **Sales Teams**: Identify trade shows and meetups for prospecting.
- **Event Planners**: Track competing events when choosing dates.
- **Marketers**: Spot speaking or sponsorship opportunities.

**Connect with Me**

- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

#n8n #automation #eventmonitoring #brightdata #openscraping #openai #slackalerts #n8nworkflow #nocode #meetup #eventbrite
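The date-range and keyword filtering configured in the initial Set node could be sketched like this. Field names (`title`, `date`) are assumptions about the scraped event records, not the workflow's actual schema.

```javascript
// Sketch of event filtering: keep events inside a date window whose
// title matches at least one keyword (case-insensitive).
function filterEvents(events, { from, to, keywords }) {
  return events.filter((e) => {
    const d = new Date(e.date);
    const inRange = d >= new Date(from) && d <= new Date(to);
    const matches = keywords.some((k) =>
      e.title.toLowerCase().includes(k.toLowerCase())
    );
    return inRange && matches;
  });
}

const kept = filterEvents(
  [
    { title: 'AI Meetup Berlin', date: '2024-07-10' },
    { title: 'Pottery Workshop', date: '2024-07-12' },
  ],
  { from: '2024-07-01', to: '2024-07-31', keywords: ['AI', 'automation'] }
);
```

Radius-based location filtering would need coordinates from the scraper and a distance calculation, so it is omitted here.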
by Oneclick AI Squad
This automated n8n workflow scrapes job listings from Upwork using Apify, processes and cleans the data, and generates daily email reports with job summaries. The system uses Google Sheets for data storage and keyword management, providing a comprehensive solution for tracking relevant job opportunities and market trends.

**What is Apify?**

Apify is a web scraping and automation platform that provides reliable APIs for extracting data from websites like Upwork. It handles the complexities of web scraping, including rate limiting, proxy management, and data extraction, while maintaining compliance with website terms of service.

**Good to Know**

- Apify API calls may incur costs based on usage; check Apify pricing for details
- Google Sheets access must be properly authorized to avoid data sync issues
- The workflow includes data cleaning and deduplication to ensure high-quality results
- Email reports provide structured summaries for easy review and decision-making
- Keyword management through Google Sheets allows for flexible job targeting

**How It Works**

The workflow is organized into three main phases.

*Phase 1: Job Scraping & Initial Processing* handles the core data collection and initial storage:

1. **Trigger Manual Run** - Manually starts the workflow for on-demand job scraping
2. **Fetch Keywords from Google Sheet** - Reads the list of job-related keywords from the All Keywords sheet
3. **Loop Through Keywords** - Iterates over each keyword to trigger Apify scraping
4. **Trigger Apify Scraper** - Sends an HTTP request to start the Apify actor for job scraping
5. **Wait for Apify Completion** - Waits for the Apify actor to finish execution
6. **Delay Before Dataset Read** - Waits a few seconds to ensure the dataset is ready for processing
7. **Fetch Scraped Job Dataset** - Fetches the latest dataset from Apify
8. **Process Raw Job Data** - Filters jobs posted in the last 24 hours and formats the data
9. **Save Jobs to Daily Sheet** - Appends new job data to the daily Google Sheet
10. **Update Keyword Job Count** - Updates the job count in the All Keywords summary sheet

*Phase 2: Data Cleaning & Deduplication* ensures data quality and removes duplicates:

1. **Load Today's Daily Jobs** - Loads all jobs added to today's sheet for processing
2. **Remove Duplicates by Title/Desc** - Removes duplicates based on title and description matching
3. **Save Clean Job Data** - Saves the cleaned, unique entries back to the sheet
4. **Clear Old Daily Sheet Data** - Deletes old or duplicate entries from the sheet
5. **Reload Clean Job Data** - Loads the clean data again after deletion for final processing

*Phase 3: Daily Summary & Email Report* generates summaries and delivers the final report:

1. **Generate Keyword Summary Stats** - Counts job totals per keyword for analysis
2. **Update Summary Sheet** - Updates the summary sheet with keyword statistics
3. **Fetch Final Summary Data** - Reads the summary sheet for reporting purposes
4. **Build Email Body** - Formats the email with statistics and the sheet link
5. **Send Daily Report Email** - Sends the structured daily summary email to recipients

**Data Sources**

The workflow uses Google Sheets for data management:

- **AI Keywords Sheet** - Keyword management data with columns: Keyword (text) - job search terms; Job Count (number) - jobs found for each keyword; Status (text) - active/inactive status; Last Updated (timestamp) - when the keyword was last processed
- **Daily Jobs Sheet** - Scraped job data with columns: Job Title (text); Description (text); Budget (text) - job budget or hourly rate; Client Rating (number) - client's rating on Upwork; Posted Date (timestamp); Job URL (text); Keyword (text) - which keyword found this job; Scraped At (timestamp)
- **Summary Sheet** - Daily statistics with columns: Date (date); Total Jobs (number); Keywords Processed (number); Top Keyword (text) - most productive keyword; Average Budget (currency); Report Generated (timestamp)

**How to Use**

1. Import the workflow into n8n
2. Configure Apify API credentials and Google Sheets API access
3. Set up email credentials for daily report delivery
4. Create three Google Sheets with the specified column structures
5. Add relevant job keywords to the AI Keywords sheet
6. Test with sample keywords and adjust as needed

**Requirements**

- Apify API credentials and actor access
- Google Sheets API access
- Email service credentials (Gmail, SMTP, etc.)
- Upwork job search keywords for targeting

**Customizing This Workflow**

Modify the Process Raw Job Data node to filter jobs by additional criteria such as budget range, client rating, or job type. Adjust the email report format to include more detailed statistics or visual aids such as charts. Customize the data cleaning logic to better handle duplicate detection for your specific requirements, or add data sources beyond Upwork for a more comprehensive job market analysis.
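The Remove Duplicates by Title/Desc step can be sketched as a first-seen-wins filter over a normalized title+description key. The normalization (lowercasing, collapsing whitespace) is an illustrative assumption about what "matching" means here.

```javascript
// Sketch of deduplication by title + description: normalize both fields
// into a key and keep only the first job seen for each key.
function dedupeJobs(jobs) {
  const seen = new Set();
  return jobs.filter((job) => {
    const key = `${job.title}\u0000${job.description}`
      .toLowerCase()
      .replace(/\s+/g, ' ');
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const unique = dedupeJobs([
  { title: 'Build n8n workflow', description: 'Automate  reports' },
  { title: 'build N8N Workflow', description: 'Automate reports' },
  { title: 'Logo design', description: 'Simple logo' },
]);
```

The `\u0000` separator prevents accidental collisions between a long title with a short description and vice versa; a fuzzier match (e.g. trigram similarity) would catch reposted jobs with small wording changes.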
by Jonathan
This workflow uses a Hubspot Trigger to check for new companies. It then checks that the company's website exists using the HTTP Request node. If it doesn't, a message is sent to Slack. To configure this workflow, you will need to set the credentials for the Hubspot and Slack nodes. You will also need to select the Slack channel to use for sending the message.
by Don Jayamaha Jr
A next-generation AI-powered DeFi health monitor that tracks wallet positions across Aave V3 using GPT-4o and LangChain. It delivers human-readable reports via Telegram and Gmail, triggered on a schedule or manually. Built for professionals monitoring multiple DeFi wallets.

🧩 **System Components**

| Component | Role |
| --- | --- |
| ✅ Scheduler | Triggers the workflow periodically |
| ✅ Google Sheets Wallet Loader | Loads all monitored wallet addresses |
| ✅ Set Variables | Injects dynamic wallet + date |
| ✅ AAVE Portfolio AI Agent | GPT-4o + LangChain AI that generates human-readable summaries |
| ✅ Moralis API Nodes (3) | Collect Aave V3 supply/borrow/collateral data |
| ✅ OpenAI Chat Model (gpt-4o-mini) | Interprets on-chain data and explains it |
| ✅ Telegram Delivery | Sends summary to Telegram chat |
| ✅ Gmail Email Sender | Sends full HTML report to email |
| ✅ HTML Formatter | Beautifies AI output into email structure |

⚙️ **How It Works**

1. Scheduled or manual trigger
2. Pulls wallet addresses from Google Sheets
3. For each wallet: pulls Aave data from Moralis, then the GPT-4o AI generates a report
4. Sends summary to Telegram
5. Sends full HTML report via Gmail

🛠 **Installation Steps**

1. Import the Workflow: Upload AAVE_Portfolio_Professional_AI_Agent.json to your n8n instance.
2. Connect These Credentials:

| Service | Required Credential Type |
| --- | --- |
| Moralis | HTTP Header Auth (X-API-Key) |
| OpenAI | GPT-4o via OpenAI API Key |
| Telegram | Telegram Bot API Token |
| Gmail | Gmail OAuth2 Credential |

3. Create the Google Sheet: the column name must be `wallet_address`. Add the wallet addresses you want monitored.

📬 **Output Format**

Telegram Message Example:

📊 Aave DeFi Health Report
Wallet: 0xABC...123
Date: 2025-05-21
▪️ Pool: Aave Ethereum USDC
• Supply: $10,040
• Borrowed: $5,500
• Health Factor: 3.43
• Liquidation Risk: No

Email Report: full HTML + plain text versions, auto-generated date, styled per wallet; includes notes and threshold warnings.

🧠 **Smart Features**

- GPT-4o generates clear human summaries
- Monitors multiple wallets in one run
- Flags liquidation risk dynamically
- Logs daily performance snapshots

💡 **Customization Ideas**

- Add a Telegram slash command `/aave <wallet>`
- Expand to monitor Compound, Lido, or Uniswap
- Export to Notion, Slack, or Data Studio

🧾 **Licensing & Attribution**

© 2025 Treasurium Capital Limited Company. The architecture, prompts, and report formatting are protected intellectual property. No unauthorized rebranding, redistribution, or resale permitted. For support or licensing inquiries: LinkedIn – Don Jayamaha

🚀 Track all your Aave DeFi positions using AI—delivered via Telegram + Gmail. Perfect for funds, traders, and DeFi power users.

🎥 Watch the Live Demo:
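The health factor and liquidation-risk flag in the report follow Aave's standard definition: health factor = (collateral value × liquidation threshold) / total borrowed, with values below 1 meaning the position can be liquidated. A minimal sketch of that check, assuming USD-denominated inputs; the 1.5 warning cutoff is an illustrative assumption, not a value from this workflow.

```javascript
// Sketch of Aave's health-factor check. Below 1 = liquidatable;
// the 1.5 warning band is an assumed, configurable threshold.
function healthFactor(collateralUsd, liquidationThreshold, borrowedUsd) {
  if (borrowedUsd === 0) return Infinity; // nothing borrowed, no liquidation risk
  return (collateralUsd * liquidationThreshold) / borrowedUsd;
}

function liquidationRisk(hf, warnBelow = 1.5) {
  return hf < 1 ? 'LIQUIDATABLE' : hf < warnBelow ? 'WARNING' : 'No';
}

const hf = healthFactor(12000, 0.8, 3000);
```

In the actual workflow these inputs come from the Moralis nodes, and the AI agent turns the numbers into the narrative "Liquidation Risk" line shown in the example report.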
by Abdullah
**What it does**

Automatically respond to Google Form entries submitted via Google Sheets. This workflow notifies your Slack team, sends a personalized Gmail response to the user, and adds the user to Google Contacts — all triggered instantly when a new row is added to your connected Sheet.

**Who's it for**

Perfect for lead capture forms, client inquiries, or feedback submissions.

- **Trigger**: When a new row is added to a connected Google Sheet (usually linked to a Google Form).
- **Slack Notification**: Sends a Slack message to your selected channel with the form data.
- **Gmail Message**: Sends an automatic email reply to the submitter (using their email from the form).
- **Add Google Contact**: Automatically creates a new contact in Google Contacts using the form data.

This setup is ideal for automating client communication and internal team alerts without manual input.
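Composing the Slack notification from a new sheet row could be sketched as below. The column names (`Name`, `Email`, `Message`) are assumptions about the form fields, not the workflow's actual schema.

```javascript
// Illustrative sketch: turn a new form-response row into a Slack message.
function buildSlackMessage(row) {
  return [
    ':inbox_tray: New form submission',
    `Name: ${row.Name}`,
    `Email: ${row.Email}`,
    `Message: ${row.Message}`,
  ].join('\n');
}

const msg = buildSlackMessage({
  Name: 'Sam',
  Email: 'sam@example.com',
  Message: 'Interested in a demo',
});
```

The same row object would feed the Gmail reply (addressed to `row.Email`) and the Google Contacts entry, so keeping the field names consistent across the three branches matters.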