by Nicolas
**What is it**

This workflow builds a simple bot that sends a message to a Telegram channel every time a new item is saved to the Reader. Thanks to existing n8n nodes, the workflow can easily be modified to send the notification through other channels.

**Warning:** This is only for folks who already have access to the Reader; it won't work if you don't. Also, this workflow uses a file to store the last update time so that it doesn't sync everything every time (see the sketch below).

**Setup**

The config node:
- contains the Telegram channel ID
- contains the file used as storage

To get the header auth, you have to:
1. Go to the Reader.
2. Open the devtools: Option + ⌘ + J (on macOS), or Shift + CTRL + J (on Windows/Linux).
3. Go to the Network tab, find a `profile_details/` request, and click on it.
4. Go to Request Headers.
5. Copy the value for "Cookie".
6. In n8n, set the name of the Header Auth credential to `Cookie` and the value to the one you copied before.
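Here is a minimal sketch of the incremental-sync idea in an n8n Code node. It assumes the stored file has already been read and exposed as `lastSyncTime`, and that each saved item carries a `saved_at` field; both names are hypothetical.

```javascript
// Keep only items saved after the last recorded sync time.
// `lastSyncTime` and `saved_at` are assumed field names for illustration.
const lastSyncTime = new Date($json.lastSyncTime ?? 0);

const newItems = $input.all().filter(
  (item) => new Date(item.json.saved_at) > lastSyncTime
);

return newItems;
```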
by Jimleuk
This n8n template imports an XLSX containing term dates for a university, extracts the relevant events using AI, and converts the events to an ICS file which can be imported into iCal, Google Calendar, or Outlook.

Manually adding important term dates to your calendar by hand? Stop! Automate it with this simple AI/LLM-powered document understanding and extraction template. This cool use case can be applied to many scenarios where Excel files are predominantly used.

**How it works**
- The term dates Excel file (xlsx) is imported into the workflow from the university's website using the HTTP Request node.
- To parse the Excel file, we use an external service: Cloudflare's Markdown Conversion Service. This converts the Excel's sheets into markdown tables which our LLM can read.
- To extract the events and their dates from the markdown, we use the Information Extractor node for structured output. LLMs are great for this use case because they can understand the layout; one row may hold many data points.
- With our data, there are endless possibilities! For this demonstration, we generate an ICS file so that we can import the extracted events into our calendar. We use the Python Code node to combine the events into the ICS spec and the "Convert to File" node to create the ICS binary (see the sketch below).
- Finally, we distribute the ICS file by email to other students or instructors who may also find it incredibly helpful for the upcoming semester!

**How to use**
- Ensure you're downloading the correct Excel file and amend the URL parameter of the "Get Term Dates Excel" node as necessary.
- Update the Gmail node with your email or other emails as required. Alternatively, send the ICS file to Google Drive or a student portal.

**Requirements**
- Cloudflare account, required to use the Markdown Conversion Service.
- Gemini, for LLM document understanding and extraction.
- Gmail, for email sending.

**Customising the workflow**
This template should work for other Excel files, of which - for a university - there are many. Some will be more complicated than others, so experiment with different parsers, extraction tools, and strategies.
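The ICS-building step can be pictured with a short sketch. The template itself uses a Python Code node; the JavaScript below shows the same idea, and the `title` and `date` field names are assumptions for illustration.

```javascript
// Combine extracted events into a minimal ICS (RFC 5545) calendar.
// `title` and `date` (YYYY-MM-DD) are assumed field names for illustration.
const events = $input.all().map((item) => item.json);

const vevents = events.map((ev) =>
  [
    'BEGIN:VEVENT',
    `SUMMARY:${ev.title}`,
    `DTSTART;VALUE=DATE:${ev.date.replace(/-/g, '')}`,
    'END:VEVENT',
  ].join('\r\n')
);

const ics = [
  'BEGIN:VCALENDAR',
  'VERSION:2.0',
  'PRODID:-//n8n//Term Dates//EN',
  ...vevents,
  'END:VCALENDAR',
].join('\r\n');

return [{ json: { ics } }];
```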
by Agentick AI
This n8n template demonstrates how to automate invoice data extraction from PDF attachments received via Gmail. Using LlamaParse and the Gemini LLM, this workflow parses structured fields like PO numbers, line items, tax amounts, and totals, and stores them neatly in a Google Sheet.

Perfect for use cases such as:
- 💼 Finance teams managing vendor invoices
- 📊 Bookkeeping workflows
- 🔄 Automating monthly reconciliation

**Good to Know**
- At the time of writing, LlamaParse and Gemini may involve API usage costs depending on your subscription tier. Check LlamaIndex Pricing and Gemini Pricing for updated info.
- LlamaParse produces Markdown-formatted parsed output, which is then passed to an LLM for structured field extraction.
- Gemini models may be geo-restricted. If you encounter "model not found" errors, your region might not be supported.

**How it Works**
1. Trigger: Watches your Gmail for new emails with PDF attachments.
2. Email Filter: Ensures we only parse fresh emails not already labeled as "invoice synced".
3. LlamaParse Upload: Uploads the PDF to LlamaParse's parsing endpoint.
4. Status Polling: Periodically checks whether the parsing is complete.
5. Download Markdown: Once ready, fetches the parsed invoice in Markdown format.
6. AI Parsing with Gemini: Sends the Markdown to the Gemini LLM to extract structured JSON (PO number, line items, taxes, etc.) using a predefined schema (see the example below).
7. Google Sheets Upload: Stores the extracted data in a predefined spreadsheet.
8. Labeling: Marks the email as "invoice synced" to avoid reprocessing.

**How to Use**
The trigger is based on Gmail, but you can replace it with a webhook or manual trigger for testing.

**Setup Instructions**
- Gmail API: Enable the Gmail API in Google Cloud Console, connect your Gmail account in n8n credentials, and allow read + modify access.
- Google Sheets: Create a new Google Sheet with the following headers (row 1): Date | Vendor Name | Invoice Number | PO Number | Line Items | Subtotal | Tax | Total Amount. Connect Google Sheets in n8n and paste the Sheet ID into the node. You can customise the Google Sheet to suit your requirements.
- LlamaParse: Get a LlamaIndex API key from LlamaIndex. Use the LlamaParse upload and polling nodes to process your PDFs.
- Gemini (via Vertex AI): Set up Gemini access in GCP, use the Gemini 2.5 model, and construct a structured prompt to extract the required fields.
- Labeling: Create a Gmail label named "Invoice Synced" for tracking processed emails.

**Requirements**
- Gmail account with API access
- LlamaParse (LlamaIndex) account with an API key
- Google Sheets API credentials
- Access to the Gemini 2.5 model via Google Vertex AI

**Customising This Workflow**
This template is just the beginning. You can expand it to:
- Auto-generate invoices back to vendors
- Run duplicate checks before inserting into Sheets
- Integrate with accounting tools like Zoho, QuickBooks, or Tally
- Trigger Slack/email notifications for specific vendors or high invoice amounts
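As an illustration, the structured JSON returned by Gemini might look like the following. The field names are assumptions chosen to mirror the sheet headers above, and the values are dummy data, not the template's exact schema.

```json
{
  "date": "2025-01-15",
  "vendor_name": "Acme Supplies Ltd",
  "invoice_number": "INV-00123",
  "po_number": "PO-4567",
  "line_items": [
    { "description": "Widgets", "quantity": 10, "unit_price": 12.5, "amount": 125.0 }
  ],
  "subtotal": 125.0,
  "tax": 12.5,
  "total_amount": 137.5
}
```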
by James Francis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Overview**
When applying for freelance jobs on Upwork, minutes matter. The first quality application is more often than not the one that's ultimately selected. Subscribers to Upwork's Freelancer Plus receive email job alerts, but the filters are very limited. As a result, it takes a lot of time to manually go through each email and determine whether each job fits your criteria. This workflow scans your Gmail every few minutes, finds all Upwork job alerts, scores them based on your profile and preferences, and sends a Slack channel message for jobs that are strong potential matches.

**How it works**
- Scans Gmail for Upwork job alerts every few minutes
- Extracts all available job data from each email
- Scores each job based on the profile information and criteria you provide
- Sends a Slack notification for all jobs that meet a given score threshold (see the sketch below)

**Disclaimers**
- This workflow polls Gmail for new messages every 10 minutes. A workflow execution is used each time, regardless of whether the Gmail scan finds anything. You may want to adjust this frequency based on the number of workflow executions you want to use.
- The AI matching process is based only on the information included in the email body (job title, description snippet, and metadata). It is against Upwork's Terms of Service to scrape a full job posting. Despite this, the quality of the results in our testing is high for most use cases.

**Required Setup**
1. Subscribe to Upwork's Freelancer Plus plan to enable job alerts ($19.99/mo at the time of this posting).
2. Create Gmail and OpenRouter (or an LLM provider of your choice) credentials and select them in the Gmail / LLM Model nodes.
3. Create a Slack app that has at least the chat:write.public and channels:read scopes, install it into your workspace, and use your app's OAuth token to create a Slack API credential in n8n.
4. IMPORTANT: In the "Opportuntity Scorer" node, replace the text between the <my_profile> tags with your freelancer bio. For best results, include as much detail as possible about your skillset, experience, tool familiarity, and job preferences.
5. Update the filter with your notification threshold preference(s) and set the Slack channel to send notifications to in the last Slack node.

If you have any questions or feedback about this workflow, or would like me to build custom workflows for your business, email me at n8n@paperjam.agency.
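A minimal sketch of the score-threshold filter as it might appear in a Code node. The `score` field name and the threshold value of 7 are illustrative, not the template's exact settings.

```javascript
// Pass along only jobs whose AI-assigned score meets the notification threshold.
// `score` is an assumed field name; 7 is an illustrative threshold.
const THRESHOLD = 7;

return $input.all().filter((item) => item.json.score >= THRESHOLD);
```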
by sateshcharan
**Who is this template for?**
This workflow template is designed for DevOps, Engineering, and Managed Service Provider professionals seeking alerts on various channels, with each channel chosen based on the severity of the event.

**How it works**
- Each time a new event occurs, the workflow runs (powered by TwentyCRM's native Webhooks feature).
- After filtering the required data from the webhook, the filtered data is logged to Google Sheets.
- Based on the eventType from the webhook, we conditionally select a predefined messaging channel and send updates or alerts through it (see the sketch below).

**Set up instructions**
1. Complete the Set up credentials step when you first open the workflow. You'll need Google OAuth2.0 credentials with Gmail API and Google Sheets scopes, and Slack OAuth2.0 credentials with the chat:write scope.
2. Set up the webhook in TwentyCRM, linking the On new TwentyCRM event trigger with your TwentyCRM app.
3. Set the correct channel to send to in the Post message in channel step.
4. After testing your workflow, swap the Test URL for the Production URL in TwentyCRM and activate your workflow.

Template was created in n8n v1.63.4.
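The severity-based routing can be pictured as a simple lookup; here is a sketch for a Code node, where the eventType keys and channel names are illustrative, not TwentyCRM's actual event names.

```javascript
// Route an alert to a channel based on the webhook's eventType.
// The eventType keys and channel names below are illustrative only.
const routes = {
  'record.deleted': 'slack', // higher severity: immediate chat alert
  'record.updated': 'gmail', // lower severity: email notification
};

const channel = routes[$json.eventType] ?? 'gmail';

return [{ json: { ...$json, channel } }];
```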
by ibrhdotme
This is a simple workflow that grabs Hacker News front-page headlines from today's date across every year since 2007, uses a little AI magic (Google Gemini) to sort 'em into themes, and sends a neat Markdown summary on Telegram.

**How it works**
- Runs daily and grabs the Hacker News front page for this day across every year since 2007 (see the sketch below).
- Pulls headlines & dates.
- Uses Google Gemini to sort headlines into topics & spot trends.
- Sends a Markdown summary to Telegram.

**Set up steps**
1. Clone the workflow.
2. Add your Google Gemini API key.
3. Add your Telegram bot token and chat ID.

**Built on Day-01 as part of #100DaysOfAgenticAi. Fork it, tweak it, have fun!**
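The per-year fetches can be generated in a Code node like this. The `/front?day=` URL format matches Hacker News's public front-page archive, but treat it as an assumption to verify.

```javascript
// Build one Hacker News front-page archive URL for today's date in each year since 2007.
const now = new Date();
const month = String(now.getMonth() + 1).padStart(2, '0');
const day = String(now.getDate()).padStart(2, '0');

const urls = [];
for (let year = 2007; year <= now.getFullYear(); year++) {
  urls.push({ json: { url: `https://news.ycombinator.com/front?day=${year}-${month}-${day}` } });
}

return urls;
```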
by Marth
**How it works**
This workflow runs on a daily schedule. It starts by scraping real estate-related queries from Google using Apify. The organic search results are parsed and summarized into a single text block. That text is then sent to an AI model (GPT-4o), which extracts the top 3 pain points faced by real estate agents based on current online sentiment. The workflow compares today's insights with yesterday's data stored in Airtable to detect recurring or new pain points. Finally, it sends a summary notification via Telegram and stores the current day's insights in Airtable for trend tracking (an example record is shown below).

**How to set up**
1. Clone or import the workflow into your n8n instance.
2. Get an Apify API token and insert it into the HTTP Request node.
3. Create an Airtable base with a table containing two fields: "Date" (text) and "Summary" (long text).
4. Copy the Base ID and Table ID into the Airtable nodes.
5. Connect your Telegram bot and replace the chat ID in the Telegram node.
6. Set up OpenAI credentials with GPT-4o or GPT-4o-mini for the LLM node.
7. Run once manually to test, then activate the schedule trigger to run daily.
8. (Optional) Extend the flow to generate cold outreach emails based on pain points, or sync to Notion/CRM.
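For reference, each daily Airtable record carries just the two fields described above; an illustrative example with dummy values:

```json
{
  "Date": "2025-01-15",
  "Summary": "Top pain points: 1) low-quality portal leads, 2) rising ad costs, 3) slow transaction coordination."
}
```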
by Matt Chong
**Who is this for?**
This workflow is ideal for freelancers, business owners, and finance teams who receive receipts via Gmail and want expenses logged automatically for tax, bookkeeping, and year-end audits.

**What problem is this workflow solving?**
When tax season hits, missing receipts create panic. This workflow keeps everything in one place. It uses AI to extract details from Gmail attachments, logs them in a Google Sheet, and stores the PDFs in Google Drive. No digging. No copying. Just everything where it should be.

**How it works**
1. Apply the label receipt to any incoming Gmail email. Do not mark it as read.
2. On a schedule (e.g. daily at 8:00 AM), the workflow triggers.
3. It searches for unread emails with the label receipt.
4. For each matching email, it downloads the attached receipt file.
5. It extracts text content from the receipt file.
6. It uploads the original receipt file to a specified folder in Google Drive.
7. It merges the extracted text with the email metadata.
8. It sends this combined data to OpenAI, which extracts structured fields: date, merchant, category, description, subtotal, tax, and total (see the example below).
9. The extracted data is appended as a new row in the specified Google Sheet.
10. Finally, the email is marked as read to prevent it from being processed again.

**How to set up**
1. Connect these services in your n8n credentials: Gmail (OAuth2), Google Drive, Google Sheets, and OpenAI.
2. Configure the Google Drive upload: in the "Upload File" node, select the target folder where you want receipt PDFs stored.
3. Set your execution schedule: open the "Schedule Trigger" node and choose when it should run (default is once daily at 8:00 AM).
4. Choose your Google Sheet and tab: in the "Append to Google Sheet" node, select your document and tab. Ensure the sheet contains these columns: Date, Merchant, Category, Description, Subtotal, Tax, Total.

**How to customize this workflow to your needs**
- **Change the Gmail label or search filter** to match your needs.
- **Modify the OpenAI schema** to extract additional fields like currency, project, or notes.
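An illustrative example of the structured output OpenAI returns for one receipt; the fields mirror the list above and the values are dummy data.

```json
{
  "date": "2025-01-15",
  "merchant": "Office Depot",
  "category": "Office Supplies",
  "description": "Printer paper and toner",
  "subtotal": 45.00,
  "tax": 4.50,
  "total": 49.50
}
```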
by Cyril Nicko Gaspar
**Amazon Price Monitoring Workflow**

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data's Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements
Before using this template, ensure you have the following:
- A Bright Data account and access to the Amazon Scraper API.
- An active API key from Bright Data.
- A Google Sheet set up with the required columns.
- An n8n account (self-hosted or cloud version).

⚙️ Setup
1. Create a Google Sheet with the following columns:
   - Product URL
   - ZIP Code (used for regional price variations)
   - ASIN (Amazon Standard Identification Number)
2. Extract the ASIN automatically using the following formula in the ASIN column, replacing A2 with the appropriate cell reference (a Code-node equivalent is sketched below):
   `=REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")`
3. Obtain an API key: sign in to your Bright Data account, go to the API section to generate an API key, and create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
   - Use a node (e.g., "Google Sheets") to read data from your sheet.
   - Use an HTTP Request node to send a query to Bright Data's Amazon API with the ASIN and ZIP code.
   - Parse the returned JSON response to extract the product price and other relevant data.
   - Optionally, write the output (e.g., current price, timestamp) back into the sheet or another data store.

**Workflow Functionality**
The workflow is triggered periodically (or manually) and reads the product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, keyed by ASIN. You can also match rows by product URL instead, but ensure that the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result.

💡 Use Cases
- E-commerce sellers monitoring competitors' prices.
- Consumers tracking price drops on wishlist items.
- Market researchers collecting pricing data across ZIP codes.
- Affiliate marketers ensuring accurate product pricing on their platforms.

🛠️ Customization
- Add columns for additional product data such as rating, seller, or stock availability.
- Schedule the workflow to run hourly, daily, or weekly depending on your needs.
- Implement email or Slack alerts for significant price changes.
- Filter by product category or brand to narrow your tracking focus.
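If you prefer to extract the ASIN inside n8n rather than in the sheet, the same pattern from the REGEXEXTRACT formula above can run in a Code node; the `productUrl` field name is an assumption for illustration.

```javascript
// Extract the 10-character ASIN from an Amazon product URL,
// mirroring the REGEXEXTRACT formula used in the Google Sheet.
const match = $json.productUrl.match(/\/(?:dp|gp\/product|product)\/([A-Z0-9]{10})/);

return [{ json: { ...$json, asin: match ? match[1] : null } }];
```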
by Sira Ekabut
This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts and image parameters, and effortlessly generate, monitor, and save the output directly to Google Drive. Streamline your creative automation with ease and precision.

**Who is this for?**
This template is for content creators, developers, automation experts, and creative professionals looking to integrate AI-based image generation into their workflows. It's ideal for generating custom visuals with the Fal.ai Flux API and automating storage in Google Drive.

**What problem is this workflow solving?**
Manually generating AI-based images, checking their status, and saving the results can be tedious. This workflow automates the entire process: requesting image generation, monitoring its progress, downloading the result, and saving it directly to a Google Drive folder.

**What this workflow does**
1. Sets custom image parameters: lets you define the prompt, resolution, guidance scale, and steps for AI image generation.
2. Sends a request to Fal.ai: initiates the image generation process using the Fal.ai Flux API (see the sketch below).
3. Monitors the image status: checks for completion and waits if needed.
4. Downloads the generated image: fetches the completed image once ready.
5. Saves to Google Drive: automatically uploads the generated image to a specified Google Drive folder.

**Setup**
1. Prerequisites:
   - Fal.ai API key: obtain it from the Fal.ai platform and set it as the Authorization header in HTTP Header Auth credentials.
   - Google Drive OAuth credentials: connect your Google Drive account in n8n.
2. Configuration:
   - Update the "Edit Fields" node with your desired image parameters:
     - Prompt: describe the image (e.g., "Thai young woman net idol 25 yrs old, walking on the street").
     - Width/Height: define the image resolution (default: 1024x768).
     - Steps: number of inference steps (e.g., 30).
     - Guidance Scale: controls how closely the image adheres to the prompt (e.g., 3.5).
   - Set your Google Drive folder ID in the "Google Drive" node to save the image.
3. Run the workflow:
   - Trigger the workflow manually to generate the image.
   - The workflow waits, checks the status, and saves the final output seamlessly.

**Customization**
- Modify image parameters: adjust the prompt, resolution, steps, and guidance scale in the "Edit Fields" node.
- Change the storage location: update the Google Drive node with a different folder ID.
- Add notifications: integrate an email or messaging node to alert you when the image is ready.
- Additional outputs: expand the workflow to send the generated image to Slack, Dropbox, or other platforms.

This workflow streamlines AI-based image generation and storage, offering flexibility and customization for creative automation.
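For orientation, the request body assembled from the "Edit Fields" values looks roughly like this. The parameter names (`image_size`, `num_inference_steps`, `guidance_scale`) are assumptions to verify against Fal.ai's current Flux API reference.

```json
{
  "prompt": "Thai young woman net idol 25 yrs old, walking on the street",
  "image_size": { "width": 1024, "height": 768 },
  "num_inference_steps": 30,
  "guidance_scale": 3.5
}
```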
by Adam Crafts
🎥 n8n Workflow: Generate AI Videos with HeyGen

🚀 Overview
This automation connects directly to HeyGen's powerful AI video generation platform. It allows you to programmatically create videos with digital avatars and voiceovers, perfect for scaling your content creation for social media, marketing campaigns, or personalized messages without ever opening a video editor.

😩 The Problem
Creating video content is incredibly time-consuming and expensive. You have to write scripts, record audio, find actors or create complex animations, piece everything together in an editor, and then wait for it to render. Every minor change or personalization requires repeating the entire frustrating process. This manual work is a major bottleneck, making it nearly impossible to produce large volumes of high-quality video content quickly and affordably.

✨ The Solution
This workflow acts as your personal, automated video production assistant! When you provide a script, the automation instantly sends instructions to HeyGen to begin creating your video. It tells the AI which avatar and voice to use and starts the generation process. Then it cleverly waits and periodically checks the status until your new video is finished and ready. It's a completely hands-off process that transforms simple text into professional AI videos on demand.

🔧 What It Does
- Sends a request to HeyGen's API to generate a video with a custom avatar, scripted voice-over, background color, and dimension (see the sketch below)
- Waits 30 seconds
- Checks the video status
- Loops until the video is completed, failed, or still processing

⚙️ Simple Setup
This workflow is a pre-built blueprint, designed to be up and running in minutes!
- **Upload:** Simply upload the provided JSON file into your n8n instance.
- **Connect:** Connect your app credentials (e.g., your HeyGen account). The workflow will show you exactly where.
- **Activate:** Turn the workflow on, and it's ready to go! Let your new automated employee get to work.

This free n8n workflow allows you to generate AI videos using HeyGen via their API.

🌐 Explore more workflows
❤️ Buy more workflows at: adamcrafts
🦾 Custom workflows at: adamcrafts@cloudysoftwares.com | adamaicrafts@gmail.com

> Build once, customize endlessly, and scale your video content like never before. 🚀
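As a rough sketch, a generation request carrying the elements this workflow sets (avatar, scripted voice, background color, dimension) might look like the following. The exact endpoint and field names are assumptions to check against HeyGen's API documentation, and the IDs are placeholders.

```json
{
  "video_inputs": [
    {
      "character": { "type": "avatar", "avatar_id": "YOUR_AVATAR_ID" },
      "voice": { "type": "text", "voice_id": "YOUR_VOICE_ID", "input_text": "Hello from n8n!" },
      "background": { "type": "color", "value": "#FFFFFF" }
    }
  ],
  "dimension": { "width": 1280, "height": 720 }
}
```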
by ist00dent
This n8n workflow provides a simple yet powerful utility to convert Unix timestamps (seconds since epoch) into the universally recognized ISO 8601 date and time format. This is crucial for harmonizing date data across different systems, databases, and applications.

🔧 How it works
1. Receive Timestamp Webhook: This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing a single property, timestamp, which should be a Unix timestamp in seconds (e.g., 1678886400).
2. Convert to ISO 8601: This node takes the timestamp received from the webhook. Since JavaScript's Date object typically uses milliseconds, it multiplies the Unix timestamp by 1000. It then uses new Date(...).toISOString() to convert this into an ISO 8601 formatted string (e.g., 2023-03-15T00:00:00.000Z) and assigns it to a new property called convertedTime.
3. Respond with Converted Time: This node sends the convertedTime property back as the response to the original webhook caller.

👤 Who is it for?
This workflow is extremely useful for:
- Developers & Integrators: When working with APIs or databases that return dates as Unix timestamps, and you need to display them in a human-readable or standardized format in your applications or dashboards.
- Data Analysts & Scientists: For cleaning and transforming raw timestamp data from logs, event streams, or legacy systems into a consistent format for analysis.
- System Administrators: For debugging logs where timestamps are often in Unix format.
- Anyone Managing Data Imports/Exports: Ensuring date compatibility when moving data between different platforms.
- Automators: As a building block in larger workflows where incoming data has Unix timestamps that need to be normalized before further processing (e.g., adding to a spreadsheet, sending in an email, or performing date calculations).

📑 Data Structure
When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{ "timestamp": 1678886400 }
```

The workflow will return a JSON response similar to this:

```json
{ "convertedTime": "2023-03-15T00:00:00.000Z" }
```

⚙️ Setup Instructions
1. Import Workflow: In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. Configure Webhook Path: Double-click the Receive Timestamp Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /convert-timestamp or /unix-to-iso).
3. Activate Workflow: Save and activate the workflow.

📝 Tips
This simple conversion workflow can be drastically enhanced and leveraged in many ways:

- Dynamic Output Formats:
  - Upgrade: Modify the Convert to ISO 8601 node (or add a Function node after it) to accept an optional format parameter in the webhook.
  - Leverage: Allow users to request formats like MM/DD/YYYY HH:mm:ss, YYYY-MM-DD, DD-MM-YYYY, or just the time, making the output directly usable in various contexts without further processing.
Example using a Function node:

```javascript
const date = new Date($json.timestamp * 1000);
const format = $json.format || 'iso'; // Default to ISO
let output;

switch (format.toLowerCase()) {
  case 'iso':
    output = date.toISOString();
    break;
  case 'locale': // e.g., "3/15/2023, 12:00:00 AM UTC"
    output = date.toLocaleString('en-US', { timeZone: 'UTC' });
    break;
  case 'dateonly': // e.g., "2023-03-15"
    output = date.toISOString().split('T')[0];
    break;
  case 'timeonly': // e.g., "00:00:00 UTC"
    output = date.toLocaleTimeString('en-US', { timeZone: 'UTC', hour12: false });
    break;
  default:
    output = date.toISOString(); // Fallback
}

return [{ json: { convertedTime: output } }];
```

- Timezone Conversion:
  - Upgrade: Combine this with the Time Zone Converter workflow (or integrate moment-timezone.js if using a Code node on a self-hosted instance). Accept an optional targetTimeZone parameter in the webhook.
  - Leverage: Convert the Unix timestamp directly into a human-readable date and time in a specific target timezone, which is incredibly valuable for global scheduling or reporting.
- Error Handling and Input Validation:
  - Upgrade: Add an IF node after the Receive Timestamp Webhook. Check if isNaN($json.body.timestamp) or if typeof $json.body.timestamp !== 'number'.
  - Leverage: If the input timestamp is invalid, branch to a Respond to Webhook node that returns a clear error message (e.g., "Invalid timestamp provided. Please provide a numeric Unix timestamp in seconds."). This makes your API more robust.
- Reverse Conversion (ISO to Unix):
  - Upgrade: Create a separate workflow, or add another branch to this one, to convert an ISO 8601 string back to a Unix timestamp. This provides a complete conversion utility. Example Set node value: `={{ new Date($json.body.isoString).getTime() / 1000 }}`
- Integration with Data Pipelines:
  - Upgrade: Use this workflow as a microservice in larger ETL (Extract, Transform, Load) pipelines.
  - Leverage: If you're pulling data from a source that provides Unix timestamps (e.g., a logging system, IoT device, or certain databases), send that data through this workflow to normalize the dates before loading them into your analytics database, CRM, or data warehouse.
- Automated Reporting:
  - Upgrade: If you have a system that generates reports with Unix timestamps, trigger this webhook for each timestamp.
  - Leverage: Produce reports with human-readable dates for better readability and decision-making for non-technical stakeholders.

This workflow is a cornerstone for any automation involving diverse date and time data. By implementing the suggested upgrades, you can transform it from a basic converter into a highly flexible and reliable date-time processing hub.