by Yaron Been
Automated pipeline to collect and analyze investor data from Crunchbase, tracking investment patterns, funding history, and portfolio companies for market analysis and lead generation.

## 🚀 What It Does
- **Investor Profiling**: Collects comprehensive data on investors and VC firms
- **Investment Pattern Analysis**: Tracks funding history and investment preferences
- **Portfolio Monitoring**: Keeps tabs on investor portfolios and new investments
- **Data Enrichment**: Enhances raw data with additional context and metrics

## 🎯 Perfect For
- Startup founders seeking investors
- Market research analysts
- Investment professionals
- Business development teams
- Competitive intelligence

## ⚙️ Key Benefits
- ✅ Comprehensive investor profiles
- ✅ Real-time investment tracking
- ✅ Market trend analysis
- ✅ Data-driven investment decisions
- ✅ Time-saving automation

## 🔧 What You Need
- Crunchbase API access (a request sketch follows this listing)
- n8n instance
- Storage solution (database or spreadsheet)

## 📊 Data Points Collected
- Investor/firm details
- Investment history
- Portfolio companies
- Funding rounds participated in
- Investment focus areas
- Contact information (when available)

## 🛠️ Setup & Support
**Quick Setup**: Deploy in 30 minutes with our step-by-step configuration guide.
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Transform your investor research with automated data collection and analysis. Spend less time gathering data and more time making strategic decisions.
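As a rough illustration of the collection step, here is a minimal n8n Code node sketch that queries Crunchbase's v4 search API for investor organizations. The endpoint, request body shape, and field_ids are assumptions based on the v4 REST API and depend on your Crunchbase plan:

```javascript
// Hypothetical sketch of the Crunchbase collection step (n8n Code node,
// "Run Once for All Items"). Endpoint, body shape, and field_ids are
// assumptions; verify them against your Crunchbase API docs and plan.
const CRUNCHBASE_API_KEY = 'YOUR_API_KEY'; // placeholder

const response = await this.helpers.httpRequest({
  method: 'POST',
  url: 'https://api.crunchbase.com/api/v4/searches/organizations',
  headers: { 'X-cb-user-key': CRUNCHBASE_API_KEY },
  body: {
    field_ids: ['identifier', 'short_description', 'investor_type', 'num_investments'],
    query: [
      { type: 'predicate', field_id: 'investor_type', operator_id: 'includes', values: ['venture_capital'] },
    ],
    limit: 50,
  },
  json: true,
});

// Emit one n8n item per investor entity returned.
return (response.entities ?? []).map((e) => ({ json: e.properties }));
```

The downstream storage node (database or spreadsheet) can then map each item's properties to columns.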
by Batu Öztürk
# 🚀 Transform LinkedIn Post Reactions into Content Ideas with Airtable

## 📝 Description
This workflow helps you turn your LinkedIn activity into a powerful content ideation engine. It automatically captures your most recent post reactions on LinkedIn, filters them by recency, and structures the content into Airtable, ready for brainstorming, inspiration, or publication planning.

## ⚙️ What It Does
- **Fetches** the latest liked posts from LinkedIn via a public API (rapidapi.com/Real-Time Linkedin Scraper).
- **Filters** posts to include only those with your chosen reaction, posted in the last 7 days.
- **Extracts** the post text, author, links, and more.
- **Formats** the data into a database-friendly structure.
- **Saves** the output in Airtable for easy tracking, tagging, or team collaboration.

## 💡 Use Cases
- Build a content idea vault from posts you admire.
- Capture inspiration from thought leaders.
- Identify trends based on what you find insightful.
- Supercharge your personal brand or newsletter by turning likes into learning.

## 🛠 Prerequisites
Before using this template, make sure you have:
- ✅ A RapidAPI account and access to the linkedin-api8 endpoint.
- ✅ Your RapidAPI key and the target LinkedIn username.
- ✅ An Airtable account with a base/table set up.

## 🧰 Setup Instructions
1. Clone this template into your n8n instance.
2. Open the **Fetch LinkedIn Likes** node and enter:
   - Your LinkedIn username.
   - Your RapidAPI key in the headers.
3. Open the **Save to Airtable** node and:
   - Connect your Airtable account.
   - Link the correct base (Content Hub) and table (Ideas).
4. Set your desired schedule in the **Trigger** node.
5. Activate the workflow and you're done! A sketch of the recency filter is shown below.

## 📋 Airtable Setup
Create a base called **Content Hub** and a table named **Ideas** with the following columns:

| Column Name | Type             | Required | Notes                      |
|-------------|------------------|----------|----------------------------|
| Title       | Single line text | ✅       | Generated from author info |
| Description | Long text        | ✅       | Contains post content      |
| Source      | URL              | ✅       | Link to the original post  |
| Type        | Single select    | ✅       | Value: LinkedIn            |
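The recency filter could look like this hedged Code node sketch. The field names (postedAt, reactionType) depend on the RapidAPI response shape and are assumptions here; inspect your actual payload and adjust:

```javascript
// Sketch of the 7-day recency filter (n8n Code node, "Run Once for All Items").
// postedAt and reactionType are assumed field names; match them to the real
// response from the RapidAPI LinkedIn endpoint.
const SEVEN_DAYS_MS = 7 * 24 * 60 * 60 * 1000;
const cutoff = Date.now() - SEVEN_DAYS_MS;

const recent = $input.all().filter((item) => {
  const post = item.json;
  const postedAt = new Date(post.postedAt).getTime();
  return post.reactionType === 'LIKE' && postedAt >= cutoff;
});

return recent;
```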
by Yulia
This n8n workflow demonstrates how to create an agent using LangChain and SQLite. The agent can understand natural language queries and interact with a SQLite database to provide accurate answers. 💪

## 🚀 Setup
Run the top part of the workflow once. It downloads the example SQLite database, extracts it from a ZIP file, and saves it locally (chinook.db).

## 🗣️ Chatting with Your Data
1. Send a message in the chat window.
2. The locally saved SQLite database loads automatically.
3. The user's chat input is combined with the binary data.
4. The LangChain Agent node receives both and begins to work.

The AI Agent will process the user's message, perform the necessary SQL queries, and generate a response based on the database information. 🗄️

## 🌟 Example Queries
Try these sample queries to see the AI Agent in action:
- **"Please describe the database"** - Get a high-level overview of the database structure; only one or two queries are needed.
- **"What are the revenues by genre?"** - Retrieve revenue information grouped by genre; the LangChain agent iterates several times before producing the answer (see the SQL sketch below).

The AI Agent will store the final answer in its memory, allowing for context-aware conversations. 💬

Read the full article: 👉 https://blog.n8n.io/ai-agents/
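For a concrete sense of what the agent does behind the scenes, here is a sketch of the kind of SQL it typically generates for the revenue question, using the Chinook schema (InvoiceLine, Track, Genre), wrapped as an n8n Code node item. The agent's actual queries vary between runs:

```javascript
// Sketch only: the sort of SQL the agent tends to produce for
// "What are the revenues by genre?" on the Chinook database.
const revenueByGenre = `
  SELECT g.Name AS genre,
         ROUND(SUM(il.UnitPrice * il.Quantity), 2) AS revenue
  FROM InvoiceLine il
  JOIN Track t ON t.TrackId = il.TrackId
  JOIN Genre g ON g.GenreId = t.GenreId
  GROUP BY g.Name
  ORDER BY revenue DESC;
`;

return [{ json: { query: revenueByGenre } }];
```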
by Naveen Choudhary
## Description
This workflow automates scraping Google Events data with SerpApi and organizing it in Google Sheets for analysis and tracking.

## Who's it for
- **Event organizers** who need to monitor competitor events in their area
- **Marketing teams** tracking local events for partnership opportunities
- **Researchers** collecting event data for analysis
- **Business owners** monitoring industry events and conferences

## How it works
The workflow searches Google Events using SerpApi's Google Events engine, processes the returned data, and saves it to a Google Sheets spreadsheet. It handles pagination automatically to collect multiple events and flattens the nested API response into a structured format.

## What it does
1. **Configures search parameters** - Sets the search query, total events to fetch, and pagination settings
2. **Fetches events via SerpApi** - Makes paginated requests to the Google Events API with proper rate limiting (sketched after this listing)
3. **Processes and flattens data** - Transforms nested event data into a flat structure with all relevant fields
4. **Saves to Google Sheets** - Appends the processed events to a Google Sheets document for easy analysis

## Requirements
- **SerpApi account** with API key (Get one here)
- **Google Sheets API access** (OAuth2 credentials)
- **Google Sheets document** - Make a copy of this template sheet

## How to set up
1. Configure SerpApi credentials in the HTTP Request node
2. Set up Google Sheets OAuth2 authentication
3. Update the Google Sheets document ID in the final node to point to your copy
4. Modify search parameters in the "Set Search Parameters" node:
   - Change `query` to your desired search terms
   - Adjust `total_events` (10 events per page)
   - Set `start` position for pagination
5. Run the workflow using the manual trigger

## How to customize the workflow
- **Search terms**: Modify the query in the Set node (e.g., "conferences in New York", "music events Los Angeles")
- **Event count**: Adjust `total_events` to fetch more or fewer events
- **Output format**: Modify the Google Sheets column mapping to include/exclude specific fields
- **Rate limiting**: Adjust the `requestInterval` in the HTTP Request node if needed
- **Scheduling**: Replace the Manual Trigger with a Schedule Trigger for automated runs

## Output data includes
- Event title, description, and direct link
- Start date and timing information
- Venue and address details
- Ticket information and pricing
- Event location map links
- Event images
- Original search query for tracking

**Note**: This workflow respects SerpApi rate limits with built-in delays between requests and processes up to 10 events per API call.
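A minimal sketch of the paginated SerpApi call, assuming an n8n Code node in "Run Once for All Items" mode; the query and page count mirror what the "Set Search Parameters" node would provide:

```javascript
// Sketch of the paginated Google Events fetch via SerpApi (n8n Code node).
// API key and query are placeholders; adjust totalEvents to your needs.
const API_KEY = 'YOUR_SERPAPI_KEY'; // placeholder
const query = 'tech conferences in Austin';
const totalEvents = 30; // the API returns 10 events per page

const events = [];
for (let start = 0; start < totalEvents; start += 10) {
  const res = await this.helpers.httpRequest({
    method: 'GET',
    url: 'https://serpapi.com/search.json',
    qs: { engine: 'google_events', q: query, start, api_key: API_KEY },
    json: true,
  });
  events.push(...(res.events_results ?? []));
  await new Promise((r) => setTimeout(r, 1000)); // simple rate limiting between pages
}

// One n8n item per event, ready for flattening and the Sheets append.
return events.map((e) => ({ json: e }));
```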
by Jimleuk
This n8n workflow demonstrates how we can use multimodal LLMs to parse and extract data from PDF documents in n8n. In this particular scenario, we're passing a candidate's CV/resume to an AI which filters out unqualified applications. However, this sneaky candidate has added a hidden prompt to bypass our bot! Whatever will we do? No fret, using AI Vision is one approach to solve this problem... read on!

## How it works
1. Our candidate's CV/resume is a PDF downloaded via Google Drive for this demonstration.
2. The PDF is then converted into a PNG image using a tool called Stirling PDF (a hedged conversion sketch follows this listing). Since the hidden prompt has a white font color, it is invisible in the converted image.
3. The image is then forwarded to a Basic LLM node and processed by our multimodal model - in this example, Google's Gemini 1.5 Pro.
4. In the Basic LLM node, we set a User Message with the type of Binary. This allows us to send the image file directly in our request.
5. The LLM is now immune to the hidden prompt, and its response is as expected.

The example CV/resume with hidden prompt can be found here: https://drive.google.com/file/d/1MORAdeev6cMcTJBV2EYALAwll8gCDRav/view?usp=sharing

## Requirements
- Google Gemini API key. Alternatively, GPT-4 will also work for this use case.
- Stirling PDF or another service which can convert PDFs into images. Note: for data privacy, this example uses a public API, and it is recommended that you self-host and use a private instance of Stirling PDF instead.

## Customising the workflow
- Swap out the manual trigger for another trigger, such as a webhook, to integrate into your existing services.
- This example demonstrates a validation use case, i.e. "does the candidate look qualified?". You could additionally extract data points instead, such as years of experience, previous companies, etc.
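For reference, a hedged sketch of the PDF-to-PNG step against a self-hosted Stirling PDF instance. The /api/v1/convert/pdf/img path and form fields follow Stirling PDF's convert API but may differ between versions, and the sketch assumes global fetch/FormData/Blob are available in your n8n runtime (Node 18+):

```javascript
// Hedged sketch: convert the downloaded PDF to PNG via a self-hosted
// Stirling PDF instance (n8n Code node). The endpoint path and form field
// names are assumptions; verify them against your instance's API docs.
const pdfBuffer = await this.helpers.getBinaryDataBuffer(0, 'data');

const form = new FormData();
form.append('fileInput', new Blob([pdfBuffer], { type: 'application/pdf' }), 'cv.pdf');
form.append('imageFormat', 'png');

const res = await fetch('http://localhost:8080/api/v1/convert/pdf/img', {
  method: 'POST',
  body: form,
});

// Hand the PNG back to n8n as binary data for the Basic LLM node.
const png = Buffer.from(await res.arrayBuffer());
return [{
  json: {},
  binary: { data: await this.helpers.prepareBinaryData(png, 'cv.png') },
}];
```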
by Pavel Duchovny
## Who is this for?
This workflow is designed for:
- Database administrators and developers working with MongoDB
- Content managers handling movie databases
- Organizations looking to implement AI-powered search and recommendation systems
- Developers interested in combining LangChain, OpenAI, and MongoDB capabilities

## What problem does this workflow solve?
Traditional database queries can be complex and require specific MongoDB syntax knowledge. This workflow addresses:
- The complexity of writing MongoDB aggregation pipelines
- The need for natural language interaction with movie databases
- The challenge of maintaining user preferences and favorites
- The gap between AI language models and database operations

## What this workflow does
This workflow creates an intelligent agent that:
- Accepts natural language queries about movies
- Translates user requests into MongoDB aggregation pipelines (an example pipeline is sketched after this listing)
- Queries a movie database containing detailed information including:
  - Plot summaries
  - Genre classifications
  - Cast and director information
  - Runtime and release dates
  - Ratings and awards
- Provides contextual responses using OpenAI's language model
- Allows users to save favorite movies to the database
- Maintains conversation context using a window buffer memory

## Setup
1. **Required credentials**:
   - OpenAI API credentials
   - MongoDB connection details
2. **Node configuration**:
   - Configure the MongoDB connection in the MongoDBAggregate node
   - Set up the OpenAI Chat Model with your API key
   - Ensure the webhook trigger is properly configured for receiving chat messages
3. **Database requirements**:
   - A MongoDB collection named "movies" with the specified document structure
   - Proper indexes for efficient querying
   - Appropriate user permissions for read/write operations

## How to customize this workflow
1. **Modify the document structure**:
   - Update the tool description in the MongoDBAggregate node to match your collection schema
   - Adjust the aggregation pipeline templates for your specific use case
2. **Enhance the AI agent**:
   - Customize the prompt in the "AI Agent - Movie Recommendation" node
   - Modify the window buffer memory size based on your context needs
   - Add additional tools for more functionality
3. **Extend functionality**:
   - Add more MongoDB operations beyond aggregation
   - Implement additional workflows for different types of queries
   - Create custom error handling and validation
   - Add user authentication and rate limiting
4. **Integration options**:
   - Connect to external APIs for additional movie data
   - Add webhook endpoints for different platforms
   - Implement caching mechanisms for frequent queries
   - Add data transformation nodes for specific output formats

This workflow serves as a foundation that can be adapted to various use cases beyond movie recommendations, such as e-commerce product search, content management systems, or any scenario requiring intelligent database interaction.
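To make the translation step concrete, this is the sort of aggregation pipeline the agent might emit for a request like "show me the top five highest-rated sci-fi movies from the 1990s". The field names (genres, year, imdb.rating) follow MongoDB's sample movies dataset and are assumptions about your schema:

```javascript
// Hypothetical pipeline the agent might generate for:
// "Show me the top 5 highest-rated sci-fi movies from the 1990s".
// Field names assume a schema like MongoDB's sample_mflix movies collection.
const pipeline = [
  { $match: { genres: 'Sci-Fi', year: { $gte: 1990, $lte: 1999 } } },
  { $sort: { 'imdb.rating': -1 } },
  { $limit: 5 },
  { $project: { _id: 0, title: 1, year: 1, 'imdb.rating': 1, plot: 1 } },
];

return [{ json: { pipeline } }];
```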
by Lucas Peyrin
## How it works
This workflow is a hands-on tutorial for the Code node in n8n, covering both basic and advanced concepts through a simple data processing task.

1. **Provides Sample Data**: The workflow begins with a sample list of users.
2. **Processes Each Item (Run Once for Each Item)**: The first Code node iterates through each user to calculate their fullName and age. This demonstrates basic item-by-item data manipulation using $input.item.json.
3. **Fetches External Data (Advanced)**: The second Code node showcases a more advanced feature. For each user, it uses the built-in this.helpers.httpRequest function to call an external API (genderize.io) to enrich the data with a predicted gender. A combined sketch of these two steps is shown after the setup steps.
4. **Processes All Items at Once (Run Once for All Items)**: The third Code node receives the fully enriched list of users and runs only once. It uses $items() to access the entire list and calculate the averageAge, returning a single summary item.
5. **Creates a Binary File**: The final Code node takes the fully enriched list of users and creates a binary CSV file, showing how to work with binary data using a Buffer in JavaScript.

## Set up steps
**Setup time: < 1 minute.** This workflow is a self-contained tutorial and requires no setup.

1. **Explore the Nodes**: Click on each of the Code nodes to read the code and the comments explaining each step, from basic to advanced.
2. **Run the Workflow**: Click "Execute Workflow" to see it in action.
3. **Check the Output**: Click on each node after the execution to see how the data is transformed at each stage. Notice how the data is progressively enriched.
4. **Experiment!** Try changing the data in the "1. Sample Data" node, or modify the code in the Code nodes to see what happens.
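As a combined illustration of steps 2 and 3, here is a hedged sketch of a per-item Code node that computes fullName and age and then enriches the item via genderize.io. The input field names (firstName, lastName, birthYear) are assumptions; the tutorial's actual sample data may differ:

```javascript
// "Run Once for Each Item" sketch: per-item enrichment as described above.
// firstName, lastName, and birthYear are assumed input fields.
const user = $input.item.json;
const fullName = `${user.firstName} ${user.lastName}`;
const age = new Date().getFullYear() - user.birthYear;

// Enrich with a predicted gender from genderize.io (public API, no key needed).
const res = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://api.genderize.io?name=${encodeURIComponent(user.firstName)}`,
  json: true,
});

return { json: { ...user, fullName, age, gender: res.gender } };
```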
by Yaron Been
Automated pipeline that exports technology stack data from BuiltWith to Google Sheets for analysis, reporting, and team collaboration.

## 🚀 What It Does
- Extracts technology stack data
- Organizes data in Google Sheets
- Updates automatically on a schedule
- Supports tracking multiple companies
- Enables easy data sharing

## 🎯 Perfect For
- Sales teams
- Market researchers
- Business analysts
- Competitive intelligence
- Technology consultants

## ⚙️ Key Benefits
- ✅ Centralized technology database
- ✅ Easy data analysis
- ✅ Team collaboration
- ✅ Historical tracking
- ✅ Custom reporting

## 🔧 What You Need
- BuiltWith API access (a request sketch follows this listing)
- Google account
- n8n instance
- Google Sheets setup

## 📊 Data Exported
- Company information
- Web technologies
- Hosting details
- Analytics tools
- Marketing technologies
- Contact information

## 🛠️ Setup & Support
**Quick Setup**: Start exporting in 15 minutes with our step-by-step guide.
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Transform raw technology data into actionable business intelligence with automated exports.
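A hedged sketch of the BuiltWith lookup, assuming the Domain API's JSON endpoint; the API version (v21 here) and the exact response shape vary by plan, so verify both against your BuiltWith documentation:

```javascript
// Hedged sketch of the BuiltWith export step (n8n Code node). The Domain API
// version and the Results/Paths/Technologies response shape are assumptions;
// check your BuiltWith docs and plan before relying on them.
const BUILTWITH_API_KEY = 'YOUR_API_KEY'; // placeholder
const domain = 'example.com';

const res = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://api.builtwith.com/v21/api.json',
  qs: { KEY: BUILTWITH_API_KEY, LOOKUP: domain },
  json: true,
});

// Flatten technology names per path so each n8n item maps cleanly to a sheet row.
const techs = (res.Results?.[0]?.Result?.Paths ?? []).flatMap((p) =>
  (p.Technologies ?? []).map((t) => ({ json: { domain, name: t.Name, tag: t.Tag } }))
);

return techs;
```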
by Marketing Canopy
# Automate Sports Betting Data with TheOddsAPI

This workflow enables you to create and update a table using TheOddsAPI for sports betting data. It automatically pulls upcoming ice hockey games at the start of the day and updates the table with results at the end of the day. You can modify it to retrieve odds and game data for any sport.

This setup is particularly useful for sports betting applications, such as tracking the results of a predictive model. It leverages scheduled triggers to activate HTTP requests, which then create or update fields in Airtable by matching on the game ID.

## Prerequisites
Before implementing this workflow, ensure you have the following:

1. **TheOddsAPI account & API key**
   - Sign up at TheOddsAPI and obtain an API key.
   - Ensure you have the correct API permissions to access sports odds and results.
2. **Airtable account & API key**
   - Create an account at Airtable and set up a database.
   - Obtain an API key from the Account Settings page.
3. **API access & rate limits**
   - Review TheOddsAPI's rate limits and ensure your account tier allows for scheduled API calls.
   - Confirm that Airtable API limits align with your expected data retrieval frequency.

## Step-by-Step Guide to Integrating TheOddsAPI

### 1. Schedule API requests
Set up a trigger to automatically pull upcoming ice hockey games at the start of each day.

### 2. Fetch data from TheOddsAPI
Retrieve the latest sports betting data, including game details and odds, using TheOddsAPI (both daily calls are sketched after this guide).

### 3. Store data in Airtable
Insert or update records in Airtable by matching game IDs, ensuring data accuracy.

**Sample Airtable column setup for ice hockey** (the table can be adjusted depending on the sport and data needs; see TheOddsAPI documentation for more):
- **Game ID**
- **Sport**
- **League**
- **Game Date (UTC)**
- **Home Team**
- **Away Team**
- **Completed** (Boolean: TRUE/FALSE for game completion status)
- **Scores** (JSON or string for final scores)
- **Last Update** (timestamp of the latest update)

### 4. Schedule an end-of-day update
Configure another trigger to fetch final game results at the end of the day.

### 5. Update records in Airtable
Modify existing Airtable records with final scores and game outcomes for complete tracking.

### 6. Customize for other sports
Adjust API parameters to retrieve data for different sports and betting odds, making the system flexible for multiple use cases.

This structured workflow automates sports betting data collection and updates, ensuring accurate and real-time tracking of odds and game results. By integrating TheOddsAPI with Airtable, you can build scalable applications for predictive sports analytics and betting insights.
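A sketch of the two scheduled calls, assuming TheOddsAPI v4 and the icehockey_nhl sport key; swap in any key returned by the /v4/sports endpoint, and verify parameter names against the current API docs:

```javascript
// Sketch of the two daily TheOddsAPI calls (n8n Code node). The sport key,
// parameters, and response handling are a minimal illustration, not a
// definitive implementation.
const API_KEY = 'YOUR_ODDS_API_KEY'; // placeholder
const base = 'https://api.the-odds-api.com/v4/sports/icehockey_nhl';

// Morning run: upcoming games for the day.
const upcoming = await this.helpers.httpRequest({
  method: 'GET',
  url: `${base}/events`,
  qs: { apiKey: API_KEY },
  json: true,
});

// Evening run: recent final scores, later matched to Airtable on game id.
const scores = await this.helpers.httpRequest({
  method: 'GET',
  url: `${base}/scores`,
  qs: { apiKey: API_KEY, daysFrom: 1 },
  json: true,
});

return [{ json: { upcoming, scores } }];
```

In practice you would split these into two workflows (or two branches behind separate Schedule Triggers) so each runs at its own time of day.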
by Yaron Been
# 🔥 AI Lead Scoring Agent: Smart Contact Form Triager

Automatically score every contact form lead as Hot/Warm/Cold and alert your sales team instantly. This intelligent workflow captures contact form submissions, uses GPT-4 to analyze message content and score lead quality, then sends formatted alerts to Slack, ensuring your sales team always focuses on the hottest prospects first.

## 🚀 What It Does
- **Instant Lead Capture**: Automatically receives contact form submissions via webhook endpoint
- **AI-Powered Scoring**: GPT-4 analyzes message content and classifies leads as Hot 🔥, Warm 🌤, or Cold ❄️
- **Smart Data Extraction**: Cleanly extracts name, email, and message from form submissions (a sketch is shown at the end of this listing)
- **Real-Time Slack Alerts**: Sends formatted notifications to your sales team with lead details and AI scoring

## 🎯 Key Benefits
- ✅ **Never Miss Hot Prospects**: AI identifies urgent leads automatically
- ✅ **Save Sales Time**: Focus effort on highest-probability leads first
- ✅ **Instant Team Alerts**: Real-time notifications in Slack channels
- ✅ **Smart Prioritization**: AI scoring eliminates guesswork in lead quality
- ✅ **Zero Manual Work**: Complete automation from form to sales alert
- ✅ **Universal Integration**: Works with any contact form or landing page

## 🏢 Perfect For

### Sales & Marketing Teams
- SaaS companies managing inbound leads
- Service businesses qualifying prospects
- E-commerce stores identifying serious buyers
- Agencies prioritizing client inquiries

### Business Applications
- **Lead Qualification**: Identify purchase-ready prospects instantly
- **Sales Efficiency**: Focus team effort on highest-value opportunities
- **Response Prioritization**: Handle urgent inquiries first
- **Team Coordination**: Keep the entire sales team informed of new leads

## ⚙️ What's Included
- **Complete Workflow**: Ready-to-deploy lead scoring automation
- **Webhook Endpoint**: Receives submissions from any contact form
- **AI Classification**: GPT-4 powered lead interest analysis
- **Slack Integration**: Professional team notifications with emojis and formatting
- **Data Processing**: Clean extraction and formatting of lead information

## 🔧 Quick Setup Requirements
- **n8n Platform**: Cloud or self-hosted instance
- **OpenAI API**: GPT-4 access for lead scoring
- **Slack Workspace**: Team channel for lead notifications
- **Contact Form**: Any form that can POST to a webhook endpoint

## 📱 Sample Slack Alert
> 🔥 **New Lead: Sarah Johnson** (sarah@techstartup.com)
> Message: "We're looking for a project management solution for our 50-person team. Need to implement ASAP as we're scaling fast. Can we schedule a demo this week?"
> Triage: 🔥 Hot

> ❄️ **New Lead: John Smith** (john@email.com)
> Message: "Just browsing your website. Might be interested in learning more someday."
> Triage: ❄️ Cold

## 🎨 Customization Options
- **Scoring Criteria**: Adjust AI prompts for industry-specific lead qualification
- **Team Channels**: Route different lead types to specific Slack channels
- **Additional Fields**: Capture company size, budget, and timeline data
- **CRM Integration**: Connect to Salesforce, HubSpot, or Pipedrive
- **Follow-up Automation**: Trigger email sequences based on lead temperature
- **Analytics Tracking**: Monitor lead quality trends and conversion rates

## 🏷️ Tags & Categories
#lead-scoring #sales-automation #contact-form-processing #ai-qualification #slack-integration #prospect-management #inbound-marketing #sales-productivity #lead-generation #openai-integration #webhook-automation #crm-automation #sales-alerts #lead-triage #ai-agent

## 💡 Use Case Examples
- **SaaS Company**: Score demo requests based on company size and urgency mentions
- **Consulting Firm**: Identify clients ready to start projects vs. those still researching
- **E-commerce Store**: Spot bulk buyers and wholesale inquiries vs. casual browsers
- **Marketing Agency**: Prioritize clients with specific budgets and timelines mentioned

## 📈 Expected Results
- **70% faster** lead response times through smart prioritization
- **3x higher** conversion rates by focusing on Hot leads first
- **50% time savings** on manual lead qualification
- **100% lead coverage** - never miss or ignore a prospect again

## 🛠️ Setup & Support
- **5-Minute Setup**: Simple webhook configuration with any contact form
- **Universal Integration**: Works with WordPress, Webflow, custom forms, and landing pages
- **Team Training**: Clear Slack notification format anyone can understand
- **Scalable**: Handles unlimited form submissions automatically

## 📞 Get Help & Resources
- **YouTube**: https://www.youtube.com/@YaronBeen/videos
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
- **Email**: Yaron@nofluff.online - response within 24 hours

Ready to never miss another hot lead? Get this AI Lead Scoring Agent and transform your contact forms into intelligent lead qualification systems. Your sales team will always know which prospects to call first, and you'll never waste time on cold leads again.

Stop treating all leads equally. Start prioritizing the ones ready to buy.
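For reference, a minimal sketch of the extraction step together with a hypothetical scoring prompt for the GPT-4 node. The form field names (name, email, message) are assumptions; match them to whatever your contact form actually POSTs to the webhook:

```javascript
// Sketch of the extraction step plus the scoring prompt (n8n Code node,
// "Run Once for Each Item"). Field names are assumptions about the form payload.
const body = $input.item.json.body ?? $input.item.json;
const lead = {
  name: body.name ?? 'Unknown',
  email: body.email ?? '',
  message: (body.message ?? '').trim(),
};

// Hypothetical prompt for the downstream AI classification node.
const prompt = `Classify this contact-form lead as Hot, Warm, or Cold.
Hot = clear buying intent and urgency; Warm = genuine interest, no urgency;
Cold = browsing or irrelevant. Reply with one word.

Message: """${lead.message}"""`;

return { json: { ...lead, prompt } };
```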
by Anthony
## What this workflow does
LinkedIn tracks which Chrome extensions are installed in your browser. This workflow takes a large raw JSON of Chrome extension IDs, extracted from LinkedIn pages, and builds a clean Google Sheet listing these extensions. For each extension ID, it scrapes Google via a SERP API and extracts the first search result (sketched below).

## Setup
1. Clone this Google Sheet template: https://docs.google.com/spreadsheets/d/1nVtoqx-wxRl6ckP9rBHSL3xiCURZ8pbyywvEor0VwOY/edit?gid=0#gid=0
2. Get an API key for Google SERP API access here: https://rapidapi.com/restyler/api/serp-api1
3. Create an n8n header auth credential for the Google SERP API.

## Some context and discussion
https://www.linkedin.com/feed/update/urn:li:activity:7245006911807393792/

Follow the author and get the final Google Sheet with 1300+ Chrome extensions: https://www.linkedin.com/in/anthony-sidashin/
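A heavily hedged sketch of the per-extension lookup. The host follows RapidAPI's usual <slug>.p.rapidapi.com convention, but the path, parameters, and response fields here are hypothetical; check the serp-api1 documentation for the real names:

```javascript
// Hedged sketch of the per-extension Google lookup (n8n Code node).
// The URL path, query parameters, and response fields are hypothetical.
const RAPIDAPI_KEY = 'YOUR_RAPIDAPI_KEY'; // placeholder
const extensionId = $input.item.json.extensionId; // assumed input field

const res = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://serp-api1.p.rapidapi.com/search',
  qs: { q: `chrome extension ${extensionId}`, num: 1 },
  headers: {
    'X-RapidAPI-Key': RAPIDAPI_KEY,
    'X-RapidAPI-Host': 'serp-api1.p.rapidapi.com',
  },
  json: true,
});

// Keep only the first organic result for the sheet.
const first = res.results?.[0] ?? {};
return { json: { extensionId, title: first.title, link: first.link } };
```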
by Jaruphat J.
## Who is this for?
This workflow is ideal for businesses, accountants, and finance teams who receive bank slip images via LINE and want to automate the extraction of transaction details. It eliminates manual data entry and speeds up financial tracking.

## What problem does this workflow solve?
Many businesses receive bank transfer slips via LINE from customers, but manually recording transaction details in spreadsheets is time-consuming and error-prone. This workflow automates the entire process, extracting structured data from the bank slips and storing it in Google Sheets for seamless record-keeping.

## What this workflow does
- Receives bank slip images from a LINE BOT
- Extracts transaction details (sender, receiver, amount, transaction ID) using SpaceOCR
- Automatically logs extracted data into Google Sheets
- Works with standard bank slips and PromptPay transactions
- Eliminates manual data entry and reduces errors

## Setup Instructions

### 1. Prerequisites
- A LINE BOT with Messaging API enabled
- A SpaceOCR API key (get one from https://spaceocr.com/)
- A Google Sheets account to store extracted data
- An n8n instance running (Cloud or self-hosted)

### 2. Set up Google Sheets
Create a Google Sheet with the following structure:

| A (Date) | B (Time) | C (Sender) | D (Receiver) | E (Bank Name) | F (Amount) | G (Transaction ID) |
|----------|----------|------------|--------------|---------------|------------|--------------------|

Ensure your Google Sheets API is enabled and connected to n8n. For an example of the required format, check this Google Sheets template: Google Sheets Template

### 3. Configure the n8n Workflow
1. **Webhook node** (receives the bank slip from the LINE BOT)
   - Set method: POST (LINE delivers webhook events via HTTPS POST)
   - Set path: your chosen webhook path
2. **HTTP Request node** (downloads the image from the LINE message)
   - Retrieves the image content from the LINE message payload (see the sketch after these instructions)
3. **SpaceOCR node** (extracts text from the bank slip)
   - Input: the downloaded slip image
   - API key: your SpaceOCR API key
4. **Google Sheets node** (saves transaction data)
   - Select your Google Sheet
   - Map the extracted data (sender, receiver, amount, etc.) to the respective columns

### 4. Deploy & Test
1. Activate the workflow in n8n
2. Set the webhook URL in the LINE Developers Console
3. Send a test bank slip image to the LINE BOT
4. Check Google Sheets for the extracted transaction data
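A sketch of the image-download step, using LINE's standard Messaging API content endpoint; the channel access token is a placeholder, and the binary handling may need adjusting to your n8n version:

```javascript
// Sketch of the image-download step (n8n Code node). The content endpoint is
// LINE's standard Messaging API route; the token is a placeholder.
const CHANNEL_ACCESS_TOKEN = 'YOUR_LINE_CHANNEL_ACCESS_TOKEN';
const messageId = $input.item.json.body?.events?.[0]?.message?.id;

const image = await this.helpers.httpRequest({
  method: 'GET',
  url: `https://api-data.line.me/v2/bot/message/${messageId}/content`,
  headers: { Authorization: `Bearer ${CHANNEL_ACCESS_TOKEN}` },
  encoding: 'arraybuffer', // ask for raw image bytes; verify this option in your n8n version
});

// Hand the slip image to the OCR step as n8n binary data.
return [{
  json: { messageId },
  binary: { data: await this.helpers.prepareBinaryData(Buffer.from(image), 'slip.jpg') },
}];
```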