by explorium
# Google Sheets Company Enrichment with Explorium MCP Template

Download the following JSON file and import it into a new n8n workflow: google_sheets_enrichment.json

## Overview

This n8n workflow template enables automatic enrichment of company information in your Google Sheets. When you add a new company or update existing company details (name or website), the workflow automatically fetches additional business intelligence data using Explorium MCP and updates your sheet with:

- Business ID
- NAICS industry code
- Number of employees (range)
- Annual revenue (range)

## Key Features

- **Automatic Triggering**: Monitors your Google Sheet for new rows or updates to company name/website fields
- **Smart Processing**: Only processes new or modified rows, not the entire sheet
- **Data Validation**: Ensures both company name and website are present before processing
- **Error Handling**: Processes each row individually to prevent one failure from affecting others
- **Powered by AI**: Uses Claude Sonnet 4 with Explorium MCP for intelligent data enrichment

## Prerequisites

Before setting up this workflow, ensure you have:

- An n8n instance (self-hosted or cloud)
- A Google account with access to Google Sheets
- An Anthropic API key for Claude
- An Explorium MCP API key

## Installation & Setup

### Step 1: Import the Workflow

1. Create a new workflow and download the workflow JSON from above.
2. In your n8n instance, go to Workflows → Add Workflow → Import from File.
3. Select the JSON file and click Import.

### Step 2: Create the Google Sheet

Create a new Google Sheet (or make a copy of this template). Your Google Sheet must have the following columns (exact names):

- `name` - Company name
- `website` - Company website URL
- `business_id` - Will be populated by the workflow
- `naics` - Will be populated by the workflow
- `number_of_employees_range` - Will be populated by the workflow
- `yearly_revenue_range` - Will be populated by the workflow

### Step 3: Configure Google Sheets Credentials

You'll need to set up two Google credentials.

Google Sheets Trigger credentials:

1. Click on the Google Sheets Trigger node.
2. Under Credentials, click Create New.
3. On n8n Cloud, click the "Sign in with Google" button and grant permission to read and monitor your Google Sheets.
4. On a self-hosted n8n instance, follow the OAuth2 authentication process here and fill in the Client ID and Client Secret fields.

Google Sheets update credentials:

1. Click on the Update Company Row node.
2. Under Credentials, select the same credentials you created above, or create new ones.
3. Ensure the permissions include write access to your sheets.

### Step 4: Configure Anthropic Credentials

1. Click on the Anthropic Chat Model node.
2. Under Credentials, click Create New.
3. Enter your Anthropic API key and save the credentials.

### Step 5: Configure Explorium MCP Credentials

1. Click on the MCP Client node.
2. Under Credentials, click Create New (Header Auth).
3. Set the Name field to `api_key` and the Value field to your Explorium API key.
4. Save the credentials.

### Step 6: Link Your Google Sheet

In the Google Sheets Trigger node:

- Select your Google Sheet from the dropdown.
- Select the worksheet (usually "Sheet1").

In the Update Company Row node:

- Select the same Google Sheet and worksheet.
- Ensure the matching column is set to `row_number`.

### Step 7: Activate the Workflow

Click the Active toggle in the top right to activate the workflow. The workflow will now monitor your sheet every minute for changes.

## How It Works

### Workflow Process Flow

1. **Google Sheets Trigger**: Polls your sheet every minute for new rows or changes to the name/website fields.
2. **Filter Valid Rows**: Validates that both company name and website are present.
3. **Loop Over Items**: Processes each company individually.
4. **AI Agent**: Uses Explorium MCP to find the company's business ID and retrieve firmographic data (revenue, employees, NAICS code).
5. **Format Output**: Structures the data for Google Sheets (a sketch of this step appears at the end of this template).
6. **Update Company Row**: Writes the enriched data back to the original row.

### Trigger Behavior

- **First Activation**: May process all existing rows to establish a baseline.
- **Ongoing Operation**: Only processes new rows or rows where the name/website fields change.
- **Polling Frequency**: Checks for changes every minute.

## Usage

### Adding New Companies

1. Add a new row to your Google Sheet.
2. Fill in the `name` and `website` columns.
3. Within one minute, the workflow will automatically detect the new row, enrich the company data, and update the remaining columns.

### Updating Existing Companies

- Modify the `name` or `website` field of an existing row.
- The workflow will re-process that row with the updated information.
- All enrichment data will be refreshed.

### Monitoring Executions

In n8n, go to Executions to see workflow runs. Each execution shows which rows were processed, success/failure status, and detailed logs for troubleshooting.

## Troubleshooting

### Common Issues

All rows are processed instead of just new/updated ones:

- Ensure the workflow is activated, not just run manually; manual test runs process all rows.
- The first activation may process all rows once.

No data is returned for a company:

- Verify the company name and website are correct.
- Check whether the company exists in Explorium's database; some smaller or newer companies may not have data available.

Workflow isn't triggering:

- Confirm the workflow is activated (Active toggle is ON).
- Check that changes were made to the `name` or `website` columns.
- Verify the Google Sheets credentials have the proper permissions.

Authentication errors:

- Re-authenticate the Google Sheets credentials.
- Verify the Anthropic API key is valid and has credits.
- Check that the Explorium API key is correct and active.

### Error Handling

The workflow processes each row individually, so if one company fails to enrich:

- Other rows will still be processed.
- The failed row will retain its original data.
- Check the execution logs for specific error details.

## Best Practices

- **Data Quality**: Ensure company names and websites are accurate for best results.
- **Website Format**: Include full URLs (https://example.com) rather than just domain names.
- **Batch Processing**: The workflow handles multiple updates efficiently, so you can add several companies at once.
- **Regular Monitoring**: Periodically check execution logs to ensure smooth operation.

## API Limits & Considerations

- **Google Sheets API**: Subject to Google's API quotas.
- **Anthropic API**: Each enrichment consumes Claude Sonnet 4 tokens.
- **Explorium MCP**: Rate limits may apply based on your subscription.

## Support

For issues specific to:

- **n8n platform**: Consult the n8n documentation or community.
- **Google Sheets integration**: Check n8n's Google Sheets node documentation.
- **Explorium MCP**: Contact Explorium support for API-related issues.
- **Anthropic/Claude**: Refer to Anthropic's documentation for API issues.

## Example Use Cases

- **Sales Prospecting**: Automatically enrich lead lists with company size and revenue data.
- **Market Research**: Build comprehensive databases of companies in specific industries.
- **Competitive Analysis**: Track and monitor competitor information.
- **Investment Research**: Gather firmographic data for potential investment targets.
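For illustration, here is a minimal sketch of what the Format Output step's Code node might look like, assuming the AI Agent returns its enrichment under an `output` field. That field name and the enrichment shape are assumptions; adapt them to the actual agent output visible in your execution logs.

```javascript
// Hypothetical n8n Code node: shape the AI agent's enrichment output for the
// "Update Company Row" node. Output keys follow the sheet columns listed above.
const results = [];

for (const item of $input.all()) {
  const enrichment = item.json.output ?? {}; // agent output field name is an assumption

  results.push({
    json: {
      row_number: item.json.row_number, // used as the matching column
      business_id: enrichment.business_id ?? '',
      naics: enrichment.naics ?? '',
      number_of_employees_range: enrichment.number_of_employees_range ?? '',
      yearly_revenue_range: enrichment.yearly_revenue_range ?? '',
    },
  });
}

return results;
```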
by Alex Kim
# Automate Video Creation with Luma AI Dream Machine and Airtable (Part 2)

## Description

This is the second part of the Luma AI Dream Machine automation. It captures the webhook response from Luma AI after video generation is complete, processes the data, and automatically updates Airtable with the video and thumbnail URLs. This completes the end-to-end automation for video creation and tracking.

- Airtable Base Template
- Tutorial Video

## Setup

### 1. Luma AI Setup

- Ensure you've created an account with Luma AI and generated an API key.
- Confirm that the API key has permission to manage video requests.

### 2. Airtable Setup

Make sure your Airtable base includes the following fields (set up in Part 1). Use the Airtable Base Template linked above to simplify setup.

- **Generation ID**: used to match incoming webhook data.
- **Status**: workflow status (e.g., "Done").
- **Video URL**: stores the generated video URL.
- **Thumbnail URL**: stores the thumbnail URL.

### 3. n8n Setup

- Ensure that the n8n workflow from Part 1 is set up and configured.
- Import this workflow and connect it to the webhook callback from Luma AI.

## How It Works

### 1. Webhook Trigger

The Webhook node listens for a POST response from Luma AI once video generation is finished. The response includes:

- Video URL: direct link to the video.
- Thumbnail URL: link to the video thumbnail.
- Generation ID: used to match the record in Airtable.

### 2. Process Webhook Data

- The Set node extracts the video data from the webhook response (a sketch of this step follows below).
- The If node checks whether the video URL is valid before proceeding.

### 3. Store in Airtable

The Airtable node uses the Generation ID to match and update the correct record with:

- Video URL: direct link to the video.
- Thumbnail URL: link to the video thumbnail.
- Status: marked as "Done".

## Why This Workflow Is Useful

- Automates the completion step for video creation
- Ensures accurate record-keeping by matching generation IDs
- Simplifies the process of managing and organizing video content
- Reduces manual effort by automating the update process

## Next Steps

- **Future Enhancements**: add more complex post-processing, video trimming, and multi-platform publishing.
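As a reference, here is a minimal Code-node sketch of the extraction step. The payload paths shown (`body.id`, `body.assets.video`, `body.assets.thumbnail`) are assumptions about the Luma AI callback shape; check an actual callback in your executions log and adjust them.

```javascript
// Hypothetical alternative to the Set node: pull the fields the Airtable
// update needs out of the Luma AI webhook callback.
const body = $input.first().json.body ?? $input.first().json;

return [{
  json: {
    generationId: body.id ?? '',
    videoUrl: body.assets?.video ?? '',
    thumbnailUrl: body.assets?.thumbnail ?? '',
    isValid: Boolean(body.assets?.video), // mirrors the If node's URL check
  },
}];
```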
by Open Paws
## Who's it for

This workflow is ideal for outreach specialists, fundraisers, campaigners, and professionals who want to build authentic connections by researching prospects deeply and strategically. It helps users understand prospects' backgrounds, interests, and mutual connections to craft effective outreach.

## How it works / What it does

Using the Multi-tool Research Agent subworkflow, it analyzes both the prospector's and the prospect's profiles, social media, and online presence. The workflow verifies identities, uncovers key connection points, and generates a comprehensive HTML report with actionable insights, conversation starters, and suggested engagement tactics.

## How to set up

1. Import this workflow and the Multi-tool Research Agent subworkflow.
2. Configure the required API credentials.
3. Provide the inputs: prospector and prospect names, social media URLs, and the outreach goal (an example input is sketched below).
4. Test the workflow to ensure accurate research and report generation.

## Requirements

- n8n instance with internet access
- Valid API keys
- Multi-tool Research Agent subworkflow installed and linked
- Optional email node for sending reports directly

## How to customize

- Update the input parameters to suit your outreach use case.
- Modify the research prompts in the subworkflow for tone or focus.
- Customize the HTML report design for branding or format preferences.
- Attach an email node to send reports automatically, or route the output as needed.

Use this workflow to power personalized, strategic outreach with data-driven insights.
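For orientation, this is a hypothetical example of the input item the workflow expects. All field names here are assumptions based on the inputs listed above; match them to the fields actually defined in the workflow's trigger or Set node.

```javascript
// Hypothetical n8n Code node producing one example input item for testing.
return [{
  json: {
    prospectorName: 'Jane Doe',
    prospectorLinkedIn: 'https://www.linkedin.com/in/jane-doe',
    prospectName: 'John Smith',
    prospectLinkedIn: 'https://www.linkedin.com/in/john-smith',
    outreachGoal: 'Invite John to speak at our advocacy fundraiser',
  },
}];
```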
by Yaron Been
Automated system for monitoring and analyzing competitor activities, funding rounds, and market movements using CrunchBase data.

## What It Does

- Tracks competitor funding rounds
- Monitors leadership changes
- Analyzes investment patterns
- Identifies new market entries
- Tracks product launches

## Perfect For

- Startup founders
- Business strategists
- Market analysts
- Investment professionals
- Corporate development

## Key Benefits

- Competitive intelligence
- Early warning system
- Market trend analysis
- Strategic insights
- Time-saving automation

## What You Need

- CrunchBase API access
- n8n instance
- Google Sheets (for data storage)
- Notification preferences

## Tracking Metrics

- Funding amounts and rounds
- Investor networks
- Hiring trends
- Market expansion
- Product updates

## Setup & Support

Quick setup: start tracking in 20 minutes with our step-by-step guide.

- Watch Tutorial
- Get Expert Support
- Direct Help

Gain a competitive edge with automated tracking and analysis of your competitors' activities and strategies.
by Mirajul Mohin
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Automatically transform your video uploads into AI-powered summaries with key topic extraction and instant team notifications.

## What this workflow does

1. Monitors Google Drive for new video uploads
2. Downloads and processes videos using VLM Run AI
3. Generates intelligent summaries with key topics extracted
4. Posts results to Slack for immediate team access (a message-formatting sketch follows below)

## Setup

**Prerequisites:** Google Drive account, VLM Run API credentials, Slack workspace, self-hosted n8n. You need to install the VLM Run community node.

**Quick Setup:**

1. Configure Google Drive OAuth2 and create a video upload folder
2. Add VLM Run API credentials
3. Set up Slack integration for notifications
4. Update the folder/channel IDs in the workflow nodes
5. Test and activate

## Perfect for

- Meeting recordings and training videos
- Webinar summaries and educational content
- Content analysis and team collaboration
- Any video content requiring quick insights

## Key Benefits

- **Asynchronous processing** handles large files without timeouts
- **Multi-format support** for MP4, AVI, MOV, WebM, MKV
- **Instant team updates** via Slack notifications
- **Saves hours** of manual video review time

## How to customize

Extend by adding:

- Video categorization and tagging
- Integration with project management tools
- Email notifications alongside Slack
- Searchable video databases with summaries

This workflow transforms lengthy videos into actionable insights, making your content instantly accessible and shareable with your team.
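Here is a minimal Code-node sketch of the Slack formatting step. The response shape assumed here (a `summary` string plus a `topics` array, with a `fileName`) is an assumption; inspect a real VLM Run response in your execution log and adjust the paths.

```javascript
// Hypothetical n8n Code node: turn a VLM Run summary result into Slack text.
const result = $input.first().json;
const topics = (result.topics ?? []).map((t) => `• ${t}`).join('\n');

return [{
  json: {
    text:
      `*New video summarized:* ${result.fileName ?? 'unknown file'}\n\n` +
      `${result.summary ?? 'No summary returned.'}\n\n` +
      `*Key topics:*\n${topics || '• (none extracted)'}`,
  },
}];
```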
by Mohan Gopal
## Overview

This release introduces a Voice-Enabled Tour Recommendation System that leverages n8n, the ElevenLabs Voice Agent, OpenAI GPT-4o, and the Pinecone Vector DB to deliver personalized travel itineraries based on spoken input. Users speak their preferences to the ElevenLabs voice agent, which then triggers an n8n workflow that returns a tailored tour plan.

## Features

- Voice interaction with an AI-powered travel agent via ElevenLabs
- Uses GPT-4o for contextual understanding and generation
- Dynamic query handling with vector-based search using Pinecone
- Fast response generation using an n8n webhook
- Modular agent memory and role design for scalable enhancement

## Pre-requisites

- n8n account with workflow creation access
- ElevenLabs account with agent and webhook setup
- OpenAI API key (GPT-4o access)
- Pinecone account for the vector database
- A list of vectorized tour packages created using this n8n embedder: https://creators.n8n.io/workflows/5085

## Setup Instructions

### Step 1: Configure the Voice Agent Webhook in ElevenLabs

- Use the POST method
- Webhook URL: https://...
- Break the voice input down into: destination, type of tour, number of days, number of passengers (an example payload is sketched at the end of this template)

### Step 2: Set Up the AI Agent Prompt in ElevenLabs

Use a conversational style with summaries, clarifying questions, and affirmations. Example prompt: "You use a natural speech style and periodically summarize... Your goal is to help callers create a personalized tour plan."

### Step 3: Select the LLM

- LLM: GPT-4o Mini
- Memory window: up to 5 contexts

### Step 4: Integrate Tools

- Use a Custom Tool: n8n
- ID: tool_xxxxxx
- Tool description: "Generates the travel plan once the details are collected"

### Step 5: Build the n8n Workflow

1. Trigger: Webhook (POST)
2. Process the user input: Tour Recommendation AI Agent
3. Use the OpenAI Chat Model (GPT-4o) for reasoning
4. Query the Pinecone Vector Store using the Tour Builder Q&A node
5. Respond with a structured itinerary plan via the webhook response

## How to use

1. Execute the n8n workflow (the webhook waits for the voice trigger from ElevenLabs).
2. Start the ElevenLabs Voice Agent.
3. Request a tour plan to any destination, giving the details of your tour preferences.
4. Wait for the Voice Agent to respond with tour package suggestions after fetching the tour details from the n8n workflow.
5. Close the conversation.

| Area | Improvement |
| --- | --- |
| Voice UX | Natural-sounding travel agent using ElevenLabs |
| Personalization | GPT-4o adapts based on travel style & preferences |
| Knowledge Base | Pinecone-powered vector retrieval of real tour data |
| Reusability | Modular workflow with reusable embedding tools |
| System Design | Separation of memory, logic, and data layers |

## Who is this for?

- **Travel Agencies & DMCs**: Offer ultra-personalized packages based on customer queries. Let AI do the matching.
- **Tour Package Aggregators**: Auto-curate and send matching packages from your catalog, no manual searching needed.
- **Content & Marketing Teams**: Craft customized tour recommendations for email campaigns and newsletters.
- **Tech-enabled Travel Startups**: Embed this intelligence in your workflows, CRMs, or chatbots to delight customers.
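Here is a hypothetical example of the POST body the ElevenLabs agent might send to the n8n webhook once the caller's details are collected. The field names are assumptions; they must match the variables you configure in the ElevenLabs webhook tool.

```javascript
// Hypothetical n8n Code node emitting one example request for testing the
// webhook path. In the live workflow these fields arrive via the Webhook
// node as $json.body.destination, $json.body.tourType, and so on.
return [{
  json: {
    destination: 'Kyoto, Japan',
    tourType: 'cultural and food tour',
    numberOfDays: 5,
    numberOfPassengers: 2,
  },
}];
```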
by RedOne
## Who is this for?

This workflow is designed for e-commerce store owners, operations managers, and developers who use Shopify as their e-commerce platform and want an automated way to track and analyze their order data. It is particularly useful for businesses that:

- Need a centralized view of all Shopify orders
- Want to analyze order trends without logging into Shopify
- Need to share order data with team members who don't have Shopify access
- Want to build custom reports based on order information

## What Problem Is This Workflow Solving?

While Shopify provides excellent order management within its platform, many businesses need their order data available in other systems for various purposes:

- **Data accessibility**: Not everyone in your organization may have access to Shopify's admin interface
- **Custom reporting**: Google Sheets allows for flexible analysis and report creation
- **Data integration**: Having orders in Google Sheets makes it easier to combine with other business data
- **Backup**: Creates an additional backup of your critical order information

## What This Workflow Does

This n8n workflow creates an automated bridge between your Shopify store and Google Sheets:

1. Listens for new order notifications from your Shopify store via webhooks
2. Processes the incoming order data and transforms it into a structured format (a sketch of this step appears at the end of this template)
3. Stores each new order in a dedicated Google Sheets spreadsheet
4. Sends real-time notifications to Telegram when new orders are received or errors occur

## Setup

### Create a Google Sheet

1. Create a new Google Sheet to store your orders.
2. Add a sheet named "orders" with the following columns: orderId, orderNumber, created_at, processed, processed_at, json, customer, shippingAddress, lineItems, totalPrice, currency.

### Set Up a Telegram Bot

1. Create a Telegram bot using BotFather (send /newbot to @BotFather).
2. Save your bot token for use in n8n credentials.
3. Start a chat with your bot and get your chat ID (you can use @userinfobot).

### Configure the Workflow

1. Set your Google Sheet ID in the "Edit Variables" node.
2. Enter your Telegram chat ID in the "Edit Variables" node.
3. Set up your Telegram API credentials in n8n.

### Configure the Shopify Webhook

1. In your Shopify admin, go to Settings > Notifications > Webhooks.
2. Create a new webhook for "Order creation".
3. Set the URL to your n8n webhook URL (from the "Receive New Shopify Order" node).
4. Set the format to JSON.

## How to Customize This Workflow to Your Needs

- **Additional data**: Modify the "Transform Order Data to Standard Format" function to extract more Shopify data.
- **Multiple sheets**: Duplicate the Google Sheets node to store different aspects of orders in separate sheets.
- **Telegram messages**: Customize the text in the Telegram nodes to include more details or rich formatting.
- **Data processing**: Add nodes to perform calculations or transformations on order data.
- **Additional notifications**: Add more channels like Slack, Discord, or SMS.
- **Integrations**: Extend the workflow to send order data to other systems like CRMs, ERPs, or accounting software.

## Final Notes

This workflow serves as a foundation that you can build upon to create a comprehensive order management system tailored to your specific business needs.
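As a reference, here is a minimal sketch of what the "Transform Order Data to Standard Format" Code node might contain, mapping the Shopify "Order creation" webhook payload onto the sheet columns listed above. The Shopify field names used here (`order_number`, `line_items`, etc.) follow its standard webhook payload, but verify them against a real execution before relying on them.

```javascript
// Hypothetical n8n Code node: flatten a Shopify order into one sheet row.
const order = $input.first().json.body ?? $input.first().json;

return [{
  json: {
    orderId: order.id,
    orderNumber: order.order_number,
    created_at: order.created_at,
    processed: false,
    processed_at: '',
    json: JSON.stringify(order), // keep the raw payload for reference
    customer: `${order.customer?.first_name ?? ''} ${order.customer?.last_name ?? ''}`.trim(),
    shippingAddress: order.shipping_address
      ? `${order.shipping_address.address1}, ${order.shipping_address.city}, ${order.shipping_address.country}`
      : '',
    lineItems: (order.line_items ?? [])
      .map((li) => `${li.quantity}x ${li.title}`)
      .join('; '),
    totalPrice: order.total_price,
    currency: order.currency,
  },
}];
```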
by Tony Duffy
# Read and store IoT sensor data with the MQTT Trigger and InfluxDB

tonyduffy@protonmail.com

This workflow is for users who want a practical example of how to obtain data from remote IoT systems using the MQTT protocol in an n8n environment. The template provides a typical n8n node implementation and the configuration settings necessary to read and store IoT data.

The workflow reads temperature and humidity data from a remote IoT system, in this case a DHT22 sensor connected to an ESP32 microcontroller. The data is parsed into the correct JSON format and then ingested into an InfluxDB data bucket. From there the stored temperature and humidity values can be displayed in real time. The workflow can be easily modified to read data from any MQTT-driven device.

## Remote IoT Sensor Setup

The ESP32 controller with the DHT22 sensor runs on a Wokwi simulator. The simulator uses MicroPython to publish an MQTT "wokwi-weather" topic with the temperature and humidity payloads to an online Mosquitto MQTT broker. The n8n MQTT Trigger node subscribes to the topic on the broker and reads the payload values when any changes are published. The Code node then prepares the payload in JSON format, and the HTTP Request node ingests the data into an InfluxDB bucket.

## How to customise this workflow to your needs

### Wokwi IoT ESP32 simulator

1. Set up a free account at Wokwi.com.
2. Once created, search for the project "Micro-Python MQTT Weather Logger (ESP32)".
3. When the MQTT weather logger project is open, change lines 28 and 29 to the following:

   28 MQTT_CLIENT_ID = ""
   29 MQTT_BROKER = "test.mosquitto.org"

4. Start the simulation by clicking on the green arrow; it will connect to the Mosquitto broker and the "wokwi-weather" topic will be published.
5. Click on the DHT22 sensor to show the temperature and humidity bars; changing the values sends updated payloads to the broker.

### InfluxDB

You will require access to a functioning InfluxDB database to use this workflow. Note: you will have to provide the following for the HTTP Request node to connect to InfluxDB.

- The URL and port of the target InfluxDB instance (in this case InfluxDB is running locally on port 8086, i.e. http://localhost:8086).
- An InfluxDB bucket for the data (in this case the created bucket is named "wokwi-data").
- The Organization ID of the InfluxDB instance, which can be obtained from the InfluxDB admin page.
- A generated API token with permission to read and write to the InfluxDB bucket, created from the InfluxDB admin page.

### n8n workflow

- The MQTT Trigger node is configured to subscribe to the "wokwi-weather" topic on the test Mosquitto MQTT broker. It reads the temperature and humidity data sent by the ESP32.
- The Code node uses JavaScript to move the temperature and humidity payloads into JSON format. This is flexible and can easily be modified (a sketch follows below).
- The HTTP Request node posts the JSON payloads to the InfluxDB bucket.

When the above is configured, the workflow should function correctly.

Thanks to the many who have downloaded this template. Let me know what you would like to build next. Contact me at tonyduffy@protonmail.com
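Here is a minimal sketch of the Code node, assuming the MQTT Trigger delivers a message like `{"temp": 24.5, "humidity": 40.2}` on the "wokwi-weather" topic. The exact payload keys are assumptions; adjust them to your sensor. InfluxDB's v2 write API expects line protocol, so the node builds that string for the HTTP Request node to POST to `http://localhost:8086/api/v2/write?org=<orgID>&bucket=wokwi-data&precision=s`.

```javascript
// Hypothetical n8n Code node: convert the MQTT payload to InfluxDB line protocol.
const msg = $input.first().json;
const payload = typeof msg.message === 'string' ? JSON.parse(msg.message) : msg;

const temperature = Number(payload.temp);
const humidity = Number(payload.humidity);

return [{
  json: {
    // Line protocol: measurement,tag=value field=value [timestamp]
    lineProtocol: `weather,source=wokwi temperature=${temperature},humidity=${humidity}`,
  },
}];
```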
by Guillaume Duvernay
Unlock a new level of sophistication for your AI agents with this template. While the native n8n Think Tool is great for giving an agent an internal monologue, it's limited to one instance. This workflow provides a clever solution using a sub-workflow to create multiple, custom thinking tools, each with its own specific purpose.

This template provides the foundation for building agents that can plan, act, and then reflect on their actions before proceeding. Instead of just reacting, your agent can now follow a structured, multi-step reasoning process that you design, leading to more reliable and powerful automations.

## Who is this for?

- **AI and automation developers**: Anyone looking to build complex, multi-tool agents that require robust logic and planning capabilities.
- **LangChain enthusiasts**: Users familiar with advanced agent concepts like ReAct (Reason-Act) will find this a practical way to implement similar frameworks in n8n.
- **Problem solvers**: If your current agent struggles with complex tasks, giving it distinct steps for planning and reflection can dramatically improve its performance.

## What problem does this solve?

- **Bypasses the single "Think Tool" limit**: The core of this template is a technique that allows you to add as many distinct thinking steps to your agent as you need.
- **Enables complex reasoning**: You can design a structured thought process for your agent, such as "Plan the entire process," "Execute Step 1," and "Reflect on the result," making it behave more intelligently.
- **Improves agent reliability and debugging**: By forcing the agent to write down its thoughts at different stages, you can easily see its line of reasoning, making it less prone to errors and much easier to debug when things go wrong.
- **Provides a blueprint for sophisticated AI**: This is not just a simple tool; it's a foundational framework for building state-of-the-art AI agents that can handle more nuanced and multi-step tasks.

## How it works

1. **The re-usable "thinking space"**: The magic of this template is a simple sub-workflow that does nothing but receive text. This workflow acts as a reusable "scratchpad" (a sketch of it follows below).
2. **Creating custom thinking tools**: In the main workflow, we use the Tool (Workflow) node to call this "scratchpad" sub-workflow multiple times, giving each of these tools a unique name (e.g., Initial thoughts, Additional thoughts).
3. **The power of descriptions**: The key is the description you give each of these tool nodes. This description tells the agent when and how it should use that specific thinking step. For example, the Initial thoughts tool is described as the place to create a plan at the start of a task.
4. **Orchestration via system prompt**: The main AI Agent's system prompt acts as the conductor, instructing the agent on the overall process and telling it about its new thinking abilities (e.g., "Always start by using the Initial thoughts tool to make a plan...").
5. **A practical example**: This template includes two thinking tools to demonstrate a "Plan and Reflect" cycle, but you can add many more to fit your needs.

## Setup

1. **Add your own "action" tools**: This template provides the thinking framework. To make it useful, you need to give the agent something to do. Add your own tools to the AI Agent, such as a web search tool, a database lookup, or an API call.
2. **Customize the thinking tools**: Edit the descriptions of the existing Initial thoughts and Additional thoughts tools. Make them relevant to the new action tools you've added, for example: "Plan which of the web search or database tools to use."
3. **Update the agent's brain**: Modify the system prompt in the main AI Agent node. Tell it about the new action tools you've added and how it should use your customized thinking tools to complete its tasks.
4. **Connect your AI model**: Select the OpenAI Chat Model node and add your credentials.

## Taking it further

- **Create more granular thinking steps**: Add more thinking tools for different stages of a process, like a "Hypothesize a solution" tool, a "Verify assumptions" tool, or a "Final answer check" tool.
- **Customize the thought process**: You can change *how* the agent thinks by editing the prompt inside the fromAI('Thoughts', ...) field within each tool. You could ask for thoughts in a specific format, like bullet points or a JSON object.
- **Change the workflow trigger**: Switch the chat trigger for a Telegram trigger, email, Slack, whatever you need for your use case!
- **Integrate with memory**: For even more power, combine this framework with a long-term memory solution, allowing the agent to reflect on its thoughts from past conversations.
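For concreteness, here is a minimal sketch of the "scratchpad" sub-workflow's only logic, written as a Code node after an Execute Workflow Trigger. It assumes the Tool (Workflow) nodes pass a single `Thoughts` field via fromAI; that field name is an assumption, so match it to your own tool configuration. Echoing the text back is enough: the value of the tool is that the agent wrote its reasoning down at all.

```javascript
// Hypothetical Code node inside the scratchpad sub-workflow.
const thoughts = $input.first().json.Thoughts ?? '';

return [{
  json: {
    response: `Thoughts recorded: ${thoughts}`,
  },
}];
```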
by Matheus Pedrosa
## Who is this template for?

This template is ideal for n8n instance administrators, developers, and DevOps teams who need a proactive and organized way to monitor the health of their automations. If you want to be notified about failures as soon as they happen, without having to manually check execution logs, this workflow is for you.

## What does this template do?

This workflow automates error monitoring on your n8n instance. Every hour, it performs the following steps:

1. Queries the n8n API to fetch all executions that have failed in the last hour.
2. Groups the errors by workflow to consolidate the information.
3. Builds a rich message for each failed workflow, including the error count (a sketch of this step follows below).
4. Sends an alert to a Slack channel with a button to open the workflow directly, allowing for immediate investigation.

## Requirements

Before you start, you will need the following configured in your n8n instance:

- **n8n API credentials**: Generate an API key in your n8n instance settings so the workflow can query execution data.
- **Slack credentials**: A configured Slack (OAuth2 API) credential to allow n8n to send messages to your workspace.

## How to set it up

Setup is simple and only takes a few minutes:

1. **Config node**: In the node named "Config", set the value of baseUrl to your n8n instance's URL (e.g., https://n8n.yourdomain.com). This is crucial for generating the correct workflow links in the Slack message.
2. **Schedule Trigger**: The workflow is pre-configured to run every hour. You can adjust the frequency in this node to fit your needs.
3. **"Get Failed Executions" node (HTTP Request)**: Under Authentication, select 'Header Auth'. In the Credentials field, select your n8n API credential.
4. **"Post to Slack" node (Slack)**: Select your Slack credential. In the Channel field, enter the name of the channel where error notifications should be sent (e.g., #n8n-alerts).
5. **Activate the workflow!** After these steps, just activate the workflow to start automatic error monitoring.

## How to customize the workflow

You can easily customize this template:

- **Change the schedule**: Modify the Schedule Trigger node to run at different intervals (every 15 minutes, once a day, etc.).
- **Change the notification channel**: Instead of Slack, you can replace the last node to send notifications to Discord, Microsoft Teams, Telegram, or even email.
- **Add more information**: Modify the MakeMessage node, which generates the message, to include more details about the errors, such as the error message or the exact time of failure.
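Here is a minimal sketch of the grouping and message-building step. It assumes the "Get Failed Executions" node returned the n8n API's execution list under a `data` array with `workflowId` and `workflowData.name` fields; verify this against your instance's actual API response. `baseUrl` is read from the Config node.

```javascript
// Hypothetical Code node: group failed executions by workflow and build one
// Slack-ready message per failed workflow, with a direct link.
const baseUrl = $('Config').first().json.baseUrl;
const executions = $input.first().json.data ?? [];

const byWorkflow = {};
for (const exec of executions) {
  const id = exec.workflowId;
  byWorkflow[id] ??= { name: exec.workflowData?.name ?? id, count: 0 };
  byWorkflow[id].count += 1;
}

return Object.entries(byWorkflow).map(([id, info]) => ({
  json: {
    text: `:rotating_light: *${info.name}* failed ${info.count} time(s) in the last hour.\n<${baseUrl}/workflow/${id}|Open workflow>`,
  },
}));
```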
by Artur
## Overview

This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via Airtable, and sends new job opportunities to Slack.

Key features:

- **Automated job retrieval** from Upwork via the Apify API
- **Duplicate filtering** using Airtable to store only unique jobs
- **Slack notifications** for new job postings
- **Runs every 20 minutes** during working hours (9 AM - 5 PM)

This workflow requires an active Apify subscription to function, as it uses the Apify Upwork API to fetch job listings.

## Who is This For?

This workflow is ideal for:

- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

## What Problem Does This Solve?

Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:

- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

## How the Workflow Works

1. **Schedule Trigger (every 20 minutes)**: Triggers the workflow at 20-minute intervals and ensures job searches are only executed during working hours (9 AM - 5 PM).
2. **Query Upwork for jobs**: Uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python").
3. **Find existing jobs in Airtable**: Searches Airtable to check if a job (based on title and link) already exists.
4. **Filter out duplicate jobs**: The Merge node compares Upwork jobs with Airtable data, and the IF node filters out jobs that are already stored in the database (see the sketch at the end of this template).
5. **Save only new jobs in Airtable**: The Insert node adds only new job listings to the Airtable table.
6. **Send a Slack notification**: If a new job is found, a Slack message is sent with the job details.

## Setup Guide

Required API keys:

- **Upwork scraper (Apify token)**: Get your token from Apify
- **Airtable credentials**
- **Slack API token**: Connect Slack to n8n and set the channel ID (default: #general)

Configuration steps:

1. Modify the search keywords in the 'Assign Parameters' node (startUrls).
2. Adjust the working hours in the 'If Working Hours' node.
3. Set your Slack channel in the Slack node.
4. Ensure Airtable is connected properly; you'll need to create a table with 'title' and 'link' columns.
5. Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates around the clock.

## How to Customize the Workflow

- **Change keywords**: Update the startUrls in the 'Assign Parameters' node to track different job categories.
- **Change 'If Working Hours'**: Modify the conditions in the IF node to filter times based on your needs.
- **Modify Slack notifications**: Adjust the Slack message format to include additional job details.

## Why Use This Workflow?

- Automated job tracking without manual searches
- Prevents duplicate entries in Airtable
- Instant Slack notifications for new job opportunities
- Customizable: adapt the workflow to different job categories

## Next Steps

- Run the workflow and test with a small set of keywords.
- Expand the job categories for better coverage.
- Enhance notifications by integrating Telegram, email, or a dashboard.

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
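Here is a minimal sketch of the duplicate check, written as a single Code node alternative to the Merge + IF pair described above. The node names referenced ('Query Upwork for Jobs', 'Find Existing Jobs') and the `title`/`link` fields are assumptions; match them to the actual nodes and the Airtable columns in your workflow.

```javascript
// Hypothetical Code node: drop any scraped job already present in Airtable,
// keyed by title + link so re-posted duplicates are filtered out.
const scraped = $('Query Upwork for Jobs').all().map((i) => i.json);
const existing = $('Find Existing Jobs').all().map((i) => i.json);

const seen = new Set(existing.map((j) => `${j.title}::${j.link}`));

return scraped
  .filter((job) => !seen.has(`${job.title}::${job.link}`))
  .map((job) => ({ json: job }));
```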
by Anurag
## Description

This workflow automates the extraction of structured data from invoices or similar documents using Docsumo's API. Users can upload a PDF via an n8n form trigger, which is then sent to Docsumo for processing and structured parsing. The workflow fetches key document metadata and all line items, reconstructs each invoice row with combined header and item details, and finally exports all results as an Excel file. Ideal for automating invoice data entry, reporting, or integrating with accounting systems.

## How It Works

1. A user uploads a PDF document using the integrated n8n form trigger.
2. The workflow securely sends the document to Docsumo via REST API.
3. After uploading, it checks and retrieves the parsed document results.
4. Header information and table line items are extracted and mapped into structured records (a sketch of this step follows below).
5. The complete result is exported as an Excel (.xls) file.

## Setup Steps

1. **Docsumo account**: Register and obtain your API key from Docsumo.
2. **n8n credentials manager**: Add your Docsumo API key as an HTTP header credential (never hardcode the key in the workflow).
3. **Workflow configuration**: In the HTTP Request nodes, set the authentication to your saved Docsumo credentials. Update the file type or document type in the request (e.g., "type": "invoice") as needed for your use case.
4. **Testing**: Enable the workflow and use the built-in form to upload a sample invoice for extraction.

## Features

- Supports PDF uploads via n8n's built-in form or via API/webhook extension.
- Sends files directly to Docsumo for document data extraction using secure credentials.
- Extracts invoice-level metadata (number, date, vendor, totals) and full line item tables.
- Consolidates all data in an easy-to-use Excel format for download or integration.
- Modular node structure, easily extensible for further automation.

## Prerequisites

- Docsumo account with API access enabled.
- n8n instance with Form, HTTP Request, Code, and Excel/Convert to File nodes.
- Working Docsumo API key stored securely in n8n's credentials manager.

## Example Use Cases

| Scenario | Benefit |
|---------------------|-----------------------------------------|
| Invoice Automation | Extract line items and metadata rapidly |
| Receipts Processing | Parse and digitize business receipts |
| Bulk Bill Imports | Batch process bills for analytics |

## Notes

- **Credentials security**: Do not store your API key directly in the HTTP Request nodes; always use the n8n credentials manager.
- **Sticky notes**: The workflow includes sticky notes for the setup, input, API call, extraction, and output steps to assist template users.
- **Custom columns**: You can customize the header or line item extraction by editing the Code node as needed.
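Here is a minimal sketch of the Code node that flattens the parsed result into one record per invoice line item, repeating the header fields on each row. The response shape shown (`data.invoice_number`, `data.line_items`, etc.) is an assumption; inspect an actual Docsumo extracted-data response and adjust the paths before use.

```javascript
// Hypothetical Code node: combine invoice header fields with each line item
// so the Convert to File node produces one Excel row per line item.
const doc = $input.first().json.data ?? {};

const header = {
  invoiceNumber: doc.invoice_number ?? '',
  invoiceDate: doc.invoice_date ?? '',
  vendor: doc.vendor_name ?? '',
  total: doc.total_amount ?? '',
};

return (doc.line_items ?? []).map((li) => ({
  json: {
    ...header,
    description: li.description ?? '',
    quantity: li.quantity ?? '',
    unitPrice: li.unit_price ?? '',
    amount: li.amount ?? '',
  },
}));
```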