by PDF Vector
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**Description: Unified Academic Search Across Major Research Databases**

This powerful workflow enables researchers to search multiple academic databases simultaneously, automatically deduplicate results, and export formatted bibliographies. By leveraging PDF Vector's multi-database search capabilities, researchers can save hours of manual searching and ensure comprehensive literature coverage across PubMed, ArXiv, Google Scholar, Semantic Scholar, and ERIC.

## Target Audience & Problem Solved

This template is designed for:
- **Graduate students** conducting systematic literature reviews
- **Researchers** ensuring comprehensive coverage of their field
- **Librarians** helping patrons with complex searches
- **Academic teams** building shared bibliographies

It solves the critical problem of fragmented academic search by providing a single interface to query all major databases, eliminating duplicate results, and standardizing output formats.

## Prerequisites

- n8n instance with the PDF Vector node installed
- PDF Vector API credentials with search permissions
- Basic understanding of academic search syntax
- Optional: PostgreSQL for search history logging
- Minimum 50 API credits for comprehensive searches

## Step-by-Step Setup Instructions

1. **Configure PDF Vector Credentials**
   - Go to the n8n Credentials section
   - Create new PDF Vector credentials
   - Enter your API key from pdfvector.io
   - Test the connection to verify setup
2. **Import the Workflow Template**
   - Copy the template JSON code
   - In n8n, click "Import Workflow"
   - Paste the JSON and save
   - Review all nodes for any configuration needs
3. **Customize Search Parameters**
   - Open the "Set Search Parameters" node
   - Modify the default search query for your field
   - Adjust the year range (default: 2020-present)
   - Set the results-per-source limit (default: 25)
4. **Configure Export Options**
   - Choose your preferred export formats (BibTeX, CSV, JSON)
   - Set the output directory for files
   - Configure file naming conventions
   - Enable/disable specific export types
5. **Test Your Configuration**
   - Run the workflow with a sample query
   - Check that all databases return results
   - Verify deduplication is working correctly
   - Confirm export files are created properly

## Implementation Details

The workflow implements a sophisticated search pipeline:

- **Parallel Database Queries**: Searches all configured databases simultaneously for efficiency
- **Smart Deduplication**: Uses DOI matching and fuzzy title comparison to remove duplicates (a sketch appears at the end of this description)
- **Relevance Scoring**: Combines citation count, title relevance, and recency for ranking
- **Format Generation**: Creates properly formatted citations in multiple styles
- **Batch Processing**: Handles large result sets without memory issues

## Customization Guide

**Adding Custom Databases:**

```javascript
// In the PDF Vector search node, add to the providers array:
"providers": ["pubmed", "semantic_scholar", "arxiv", "google_scholar", "eric", "your_custom_db"]
```

**Modifying the Relevance Algorithm:** Edit the "Rank by Relevance" node to adjust scoring weights:

```javascript
// Adjust these weights for your needs:
const titleWeight = 10;    // Title match importance
const citationWeight = 5;  // Citation count importance
const recencyWeight = 10;  // Recent publication bonus
const fulltextWeight = 15; // Full-text availability bonus
```

**Custom Export Formats:** Add new format generators in the workflow:

```javascript
// Example: Add APA format export
const apaFormat = papers.map(p => {
  const authors = p.authors.slice(0, 3).join(', ');
  return `${authors} (${p.year}). ${p.title}. ${p.journal || 'Preprint'}.`;
});
```

**Advanced Filtering:** Implement additional filters:
- Journal impact factor thresholds
- Open-access-only options
- Language restrictions
- Methodology filters for systematic reviews

## Search Features

- Query multiple databases in parallel
- Advanced filtering and deduplication
- Citation format export (BibTeX, RIS, etc.)
- Relevance ranking across sources
- Full-text availability checking

## Workflow Process

1. **Input**: Search query and parameters
2. **Parallel Search**: Query all databases
3. **Merge & Deduplicate**: Combine results
4. **Rank**: Sort by relevance/citations
5. **Enrich**: Add full-text links
6. **Export**: Multiple format options
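The deduplication step described above can be reproduced in an n8n Code node. The following is a minimal sketch, assuming each paper item carries an optional `doi` and a `title`; the normalization rules and the 0.9 word-overlap threshold are illustrative assumptions, not the template's exact logic.

```javascript
// Minimal deduplication sketch: exact DOI match first, then fuzzy title match.
// Assumes items shaped like { doi?: string, title: string }; adjust to your schema.
const seenDois = new Set();
const kept = [];

// Normalize titles so case and punctuation differences don't defeat matching.
const normalize = t => t.toLowerCase().replace(/[^a-z0-9 ]/g, '').replace(/\s+/g, ' ').trim();

// Cheap similarity: shared-word overlap ratio (a stand-in for real fuzzy matching).
const similar = (a, b) => {
  const wa = new Set(normalize(a).split(' '));
  const wb = new Set(normalize(b).split(' '));
  const common = [...wa].filter(w => wb.has(w)).length;
  return common / Math.max(wa.size, wb.size) > 0.9;
};

for (const item of items.map(i => i.json)) {
  if (item.doi && seenDois.has(item.doi.toLowerCase())) continue; // exact DOI duplicate
  if (kept.some(k => similar(k.title, item.title))) continue;     // fuzzy title duplicate
  if (item.doi) seenDois.add(item.doi.toLowerCase());
  kept.push(item);
}

return kept.map(json => ({ json }));
```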
by Samuel Heredia
This n8n workflow securely processes contact form submissions by validating user input, formatting the data, and storing it in a MongoDB database. The flow ensures data consistency, prevents unsafe entries, and provides a confirmation response back to the user.

## Workflow

### 1. Form Submission Node

- **Purpose**: Serves as the workflow's entry point.
- **Functionality**: Captures user input from the contact form, which typically includes:
  - name
  - last name
  - email
  - phone number

### 2. Code Node (Validation Layer)

- **Purpose**: Ensures that collected data is valid and secure (a minimal sketch appears at the end of this description).
- **Validations performed**:
  - Removes suspicious characters to mitigate risks like SQL injection or script injection.
  - Validates the phone_number field format (numeric, correct length, etc.).
  - If any field fails validation, the entry is marked as "is_not_valid" to block it from database insertion.

### 3. Edit Fields Node (Data Formatting)

- **Purpose**: Normalizes data before database insertion.
- **Transformations applied**:
  - Converts field names to snake_case (first_name, last_name, phone_number).
  - Standardizes the field naming convention for consistency in MongoDB storage.

### 4. MongoDB Node (Insert Documents)

- **Purpose**: Persists validated data in MongoDB Atlas.
- **Process**: Inserts documents into the target collection with the cleaned and formatted fields. The connection is established securely using a MongoDB Atlas connection string (URI).

🔧 **How to Set Up the MongoDB Atlas Connection URL**

1. **Create a Cluster**: Log in to MongoDB Atlas and create a new cluster.
2. **Configure Database Access**: Add a database user with a secure username and password, and assign appropriate roles (e.g., Atlas Admin for full access, or Read/Write for limited access).
3. **Obtain the Connection String (URI)**: From Atlas, go to Clusters → Connect → Drivers and copy the provided connection string, which looks like:
   mongodb+srv://<username>:<password>@cluster0.abcd123.mongodb.net/myDatabase?retryWrites=true&w=majority
4. **Configure in n8n**: In the MongoDB node, paste the URI. Replace <username>, <password>, and myDatabase with your actual credentials and database name. Test the connection to ensure it is successful.

### 5. Form Ending Node

- **Purpose**: Provides closure to the workflow.
- **Functionality**: Sends a confirmation response back to the user, indicating that their contact details were successfully submitted and stored.

✅ **Result**: With this workflow, all contact form submissions are safely validated, normalized, and stored in MongoDB Atlas, ensuring both data integrity and basic security.
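As a reference for the validation layer, here is a minimal Code-node sketch. The exact stripping rules, phone format, and email check below are illustrative assumptions; adapt them to your form's fields and locale.

```javascript
// Validation-layer sketch for the Code node.
// Field names (name, last_name, email, phone_number) follow the template's schema;
// the sanitization rules and phone format are illustrative assumptions.
return items.map(item => {
  const data = { ...item.json };

  // Strip characters commonly abused in injection payloads.
  for (const key of ['name', 'last_name', 'email']) {
    if (typeof data[key] === 'string') {
      data[key] = data[key].replace(/[<>$;{}]/g, '').trim();
    }
  }

  // Phone check: digits only (optional leading +), 7-15 digits.
  const phoneOk = /^\+?\d{7,15}$/.test(String(data.phone_number || '').replace(/[\s-]/g, ''));

  // Basic email shape check.
  const emailOk = /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(data.email || '');

  // Mark invalid entries so a downstream node can block the MongoDB insert.
  data.is_not_valid = !(phoneOk && emailOk && data.name);

  return { json: data };
});
```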
by Davide
This workflow automates the process of monitoring Amazon product prices and sending alerts when a product's price drops below a defined threshold. It integrates ScrapeGraphAI, Google Sheets, and Telegram to provide a complete end-to-end price tracking system.

## Key Advantages

- 💡 **Intelligent Scraping**: Uses ScrapeGraphAI to extract structured data (product prices) from complex Amazon pages — even those with dynamic JavaScript rendering.
- 📊 **Centralized Tracking**: All products and price history are managed in a Google Sheet, making it easy to review and update data.
- ⚡ **Real-Time Alerts**: Sends instant Telegram notifications when a product's price drops below its previous minimum — helping users take quick advantage of deals.
- 🔁 **Fully Automated**: Once set up, it runs on a schedule with no manual input required, automatically updating and alerting users.
- 🧩 **Modular & Extensible**: Built entirely with n8n nodes, making it easy to customize — for example, adding new alert channels (email, Slack) or additional data checks.
- 🕒 **Time-Efficient**: Eliminates the need for manual price checking, saving significant time for users monitoring multiple products.

## How it Works

This automated workflow tracks Amazon product prices and sends an alert via Telegram when a product hits a new lowest price. Here's the process:

1. **Trigger & Data Fetch**: The workflow is initiated either on a scheduled basis (every 10 minutes) or manually. It first connects to a designated Google Sheet, which acts as a database, to fetch a list of products to monitor. Each product's details (Name, URL, and current "MIN PRICE") are read.
2. **Price Scraping & Comparison**: The workflow loops through each product from the sheet. For each product, it uses ScrapeGraphAI to navigate to the Amazon product page, render JavaScript-heavy content, and extract the current price. This newly scraped price is then compared to the "MIN PRICE" value stored in the Google Sheet for that product (a comparison sketch appears at the end of this description).
3. **Conditional Alert & Update**: If the new price is lower, two actions are triggered:
   a. **Sheet Update**: The Google Sheet is updated with the new, lower "MIN PRICE" and the current date.
   b. **Telegram Notification**: A message is sent to a specified Telegram chat, announcing that the product has hit a new lowest price, including the product name and a link.
   If the price is not lower, no action is taken for that product, and the workflow moves on to the next one in the loop.

## Set up Steps

To implement this workflow yourself, follow these steps:

1. **Prepare the Google Sheet**:
   - Create a copy of the provided template spreadsheet.
   - In the sheet, fill in the columns for PRODUCT (name), URL (the full Amazon product link), and MIN PRICE.
   - When adding a new product, set the MIN PRICE to a very high value (e.g., 9999) to ensure the first real price triggers an alert.
2. **Configure n8n Credentials**:
   - **Google Sheets**: Set up a "Google Sheets account" credential in n8n using OAuth2 to grant the workflow access to your copied spreadsheet.
   - **ScrapeGraphAI**: Configure the "ScrapegraphAI account" credential with your API key from the ScrapeGraphAI service.
   - **Telegram**: Set up a "Telegram account" credential with a Bot Token obtained from the BotFather in Telegram. You will also need your specific chatId for the node.

Need help customizing? Contact me for consulting and support or add me on Linkedin.
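The price comparison in step 2 can be expressed as a small Code-node check. This is a minimal sketch, assuming each item combines the scraped `price` and the sheet's `MIN PRICE` field; the dot-decimal parsing is an assumption to adjust for your locale.

```javascript
// Price-comparison sketch for a Code node (run once for all items).
// Assumes each item carries the scraped `price` and the sheet's `MIN PRICE`;
// the dot-decimal parsing is an assumption, adjust for your locale.
const drops = [];
for (const { json } of items) {
  const current = parseFloat(String(json.price).replace(/[^\d.]/g, ''));
  const min = parseFloat(json['MIN PRICE']);
  if (!Number.isNaN(current) && current < min) {
    // Forward only real drops; downstream nodes update the sheet and alert Telegram.
    drops.push({ json: { ...json, 'MIN PRICE': current, updatedAt: new Date().toISOString() } });
  }
}
return drops;
```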
by Jitesh Dugar
## Validated RSVP Confirmation with Automated Badge Generation

Overview: This comprehensive workflow automates the entire event RSVP process from form submission to attendee confirmation, including real-time email validation and personalized digital badge generation.

✨ KEY FEATURES:
• Real-time Email Validation - Verify attendee emails using the VerifiEmail API to prevent fake registrations
• Automated Badge Generation - Create beautiful, personalized event badges with attendee details
• Smart Email Routing - Send confirmation emails with badges for valid emails, rejection notices for invalid ones
• Comprehensive Logging - Track all RSVPs (both valid and invalid) in Google Sheets for analytics
• Dual Path Logic - Handle valid and invalid submissions differently with conditional branching (see the sketch at the end of this description)
• Anti-Fraud Protection - Detect disposable emails and invalid domains automatically

🔧 WORKFLOW COMPONENTS:
1. Webhook Trigger - Receives RSVP submissions
2. Email Validation - Verifies email authenticity using the VerifiEmail API
3. Conditional Logic - Separates valid from invalid submissions
4. Badge Creator - Generates HTML-based personalized event badges
5. Image Converter - Converts HTML badges to shareable PNG images using HTMLCssToImage
6. Email Sender - Delivers confirmation with badge or rejection notice via Gmail
7. Data Logger - Records all attempts in Google Sheets for tracking and analytics

🎯 PERFECT FOR:
• Conference organizers managing hundreds of RSVPs
• Corporate event planners requiring verified attendee lists
• Webinar hosts preventing fake registrations
• Workshop coordinators issuing digital badges
• Community event managers tracking attendance

💡 BENEFITS:
• Reduces manual verification time by 95%
• Eliminates fake email registrations
• Creates professional branded badges automatically
• Provides real-time RSVP tracking and analytics
• Improves attendee experience with instant confirmations
• Maintains clean, verified contact lists

🛠️ REQUIRED SERVICES:
• n8n (cloud or self-hosted)
• VerifiEmail API (https://verifi.email)
• HTMLCssToImage API (https://htmlcsstoimg.com)
• Gmail account (OAuth2)
• Google Sheets

📈 USE CASE SCENARIO:
When someone submits your event RSVP form, this workflow instantly validates their email, generates a personalized badge with their details, and emails them a confirmation—all within seconds. Invalid emails receive a helpful rejection notice, and every submission is logged for your records. No manual work required!

🎨 BADGE CUSTOMIZATION:
The workflow includes a fully customizable HTML badge template featuring:
• Gradient background with modern design
• Attendee name, designation, and organization
• Event name and date
• Email address and validation timestamp
• Google Fonts (Poppins) for professional typography

📊 ANALYTICS INCLUDED:
Track metrics like:
• Total RSVPs received
• Valid vs invalid email ratio
• Event-wise registration breakdown
• Temporal patterns
• Organization/company distribution

⚡ PERFORMANCE:
• Processing time: ~3-5 seconds per RSVP
• Scales to handle 100+ concurrent submissions
• Email delivery within 10 seconds
• Real-time Google Sheets updates

🔄 EASY SETUP:
1. Import the workflow JSON
2. Configure your credentials (detailed instructions included)
3. Create your form with required fields (name, email, event, designation, organization)
4. Connect the webhook
5. Activate and start receiving validated RSVPs!

🎓 LEARNING VALUE:
This workflow demonstrates:
• Webhook integration patterns
• API authentication methods
• Conditional workflow branching
• HTML-to-image conversion
• Email automation best practices
• Data logging strategies
• Error handling techniques
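For the dual-path logic, a small Code node can flag each submission before an IF node routes it. This is a minimal sketch; the response field names `deliverable` and `disposable` are assumptions, so inspect your actual VerifiEmail payload and adjust.

```javascript
// Flag each RSVP so a downstream IF node can route it.
// `deliverable` and `disposable` are assumed field names; verify them
// against the actual VerifiEmail response payload.
return items.map(({ json }) => ({
  json: {
    ...json,
    emailValid: json.deliverable === true && json.disposable !== true,
  },
}));
```

An IF node checking `{{ $json.emailValid }}` then sends valid entries to the badge/confirmation path and everything else to the rejection-notice path.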
by Mohammed Abid
## Shopify Order Data to Airtable

This n8n template demonstrates how to capture incoming Shopify order webhooks, transform the data into a structured format, and insert each product line item as a separate record in an Airtable sheet. It provides both high-level order information and detailed product-level metrics, making it ideal for analytics, reporting, inventory management, and customer insights.

### Good to Know

- **Airtable API Rate Limits**: By default, Airtable allows 5 requests per second per base. Consider batching or adding delays if you process high volumes of orders.
- **Shopify Webhook Configuration**: Ensure you have configured the orders/create webhook in your Shopify Admin to point to the n8n webhook node.
- **Field Mapping**: The template maps standard Shopify fields; if your store uses custom order or line item properties, update the Function nodes accordingly.

### How It Works

1. **Webhook Trigger**: A Shopify orders/create webhook fires when a new order is placed.
2. **Normalize Order Data**: The Function node extracts core order, customer, shipping, and billing details and computes financial totals (subtotal, tax, shipping, discounts).
3. **Line Item Breakdown**: A second Function node builds an array of objects—one per line item—calculating per-item totals, tax/shipping allocation, and product attributes (color, size, material). A sketch of this step appears at the end of this description.
4. **Check Customer Record**: Optionally check against an Airtable "Customers" sheet to flag new vs existing customers.
5. **Auto-Increment Record ID**: A Function node generates a running serial number for each Airtable record.
6. **Insert Records**: The Airtable node writes each line item object into the target base and table, creating rich records with both order-level and product-level details.

### How to Use

1. **Clone the Template**: Click "Use Template" in your n8n instance to import this workflow.
2. **Configure Credentials**:
   - **Shopify Trigger**: Add your Shopify store domain and webhook secret.
   - **Airtable Node**: Set up your Airtable API key and select the base and table.
3. **Review Field Names**: Match the field names in the Function nodes to the columns in your Airtable table.
4. **Activate Workflow**: Turn on the workflow and place a test order in your Shopify store.
5. **Verify Records**: Check your Airtable sheet to see the new order and its line items.

### Requirements

- n8n@latest
- Shopify store with the orders/create webhook configured
- Airtable account with a base and table ready to receive records

### Customizing This Workflow

- **Add Custom Fields**: Extend the Function nodes to include additional Shopify metafields, discounts, or customer tags.
- **Alternative Destinations**: Replace the Airtable node with Google Sheets, Supabase, or another database by swapping in the corresponding node.
- **Error Handling**: Insert If/Wait nodes to retry on API failures or send notifications on errors.
- **Multi-Currency Support**: Adapt the currency logic to convert totals based on dynamic exchange rates.
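Here is a minimal sketch of the line-item breakdown (step 3). The `line_items`, `price`, `quantity`, and `total_tax` fields are standard Shopify order-webhook fields, but the proportional tax allocation shown is an illustrative assumption, not the template's exact code.

```javascript
// Line-item breakdown sketch: one output item per product line.
// `line_items`, `price`, `quantity`, and `total_tax` are standard Shopify
// webhook fields; the proportional tax split is an assumption.
const order = items[0].json;
const lines = order.line_items || [];

const orderSubtotal = lines.reduce((sum, li) => sum + parseFloat(li.price) * li.quantity, 0);

return lines.map(li => {
  const lineTotal = parseFloat(li.price) * li.quantity;
  const share = orderSubtotal > 0 ? lineTotal / orderSubtotal : 0;
  return {
    json: {
      order_number: order.order_number,
      product: li.title,
      variant: li.variant_title, // often encodes color/size, e.g. "Blue / M"
      quantity: li.quantity,
      unit_price: parseFloat(li.price),
      line_total: lineTotal,
      tax_allocated: parseFloat(order.total_tax || 0) * share,
    },
  };
});
```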
by Oneclick AI Squad
This n8n workflow automates email blasts with follow-ups and response tracking: it reads contact data from a Google Sheet daily, loops through contacts to send personalized emails based on follow-up stages via Gmail, updates the sheet with status changes, and monitors replies for logging.

## Why Use It

This workflow streamlines email marketing campaigns by automating personalized email distribution, managing follow-up sequences, and tracking responses without manual intervention, saving time, improving engagement, and providing actionable insights into contact interactions.

## How to Import It

1. **Download the Workflow JSON**: Obtain the workflow file from the n8n template or create it based on this document.
2. **Import into n8n**: In your n8n instance, go to "Workflows," click the three dots, select "Import from File," and upload the JSON.
3. **Configure Credentials**: Set up Gmail and Google Sheets credentials in n8n.
4. **Run the Workflow**: Activate the scheduled trigger and test with a sample Google Sheet.

## System Architecture

**Email Blast Pipeline**:
1. **Daily Trigger - 9 AM**: Initiates the workflow daily at 9 AM via Cron.
2. **Read Contact Data from Google Sheet**: Fetches contact details from the sheet.
3. **Loop Through Contacts**: Processes each contact individually.
4. **Determine Follow-Up Stage**: Identifies the current stage for each contact (a sketch of this node's logic appears at the end of this description).
5. **Send Main/Follow-Up Email**: Delivers the appropriate email via Gmail.
6. **Update Sheet Status**: Updates the Google Sheet with the latest status.

**Response Tracking Flow**:
1. **Check Gmail for Replies**: Monitors Gmail for email responses.
2. **Log Responses**: Records responses in the Google Sheet.

## Google Sheet File Structure

- **Sheet Name**: EmailCampaign
- **Range**: A1:F10 (or adjust based on needs); columns A-F hold the fields below

| name       | email            | stage       | last_email_date | status  | response          |
|------------|------------------|-------------|-----------------|---------|-------------------|
| John Doe   | john@example.com | Initial     | 2025-08-07      | Pending |                   |
| Jane Smith | jane@example.com | Follow-Up 1 | 2025-08-06      | Sent    | "Interested"      |
| Bob Jones  | bob@example.com  | Follow-Up 2 | 2025-08-05      | Replied | "Follow up later" |

**Columns**:
- name: Contact's full name.
- email: Contact's email address for sending emails.
- stage: Current follow-up stage (e.g., Initial, Follow-Up 1, Follow-Up 2).
- last_email_date: Date of the last email sent.
- status: Current status (e.g., Pending, Sent, Replied).
- response: Logged response from the contact (updated after reply detection).

## Customization Ideas

- **Adjust Schedule**: Change the Cron trigger to hourly or weekly based on campaign needs.
- **Add Email Templates**: Customize email content for different stages or audiences.
- **Incorporate SMS**: Add WhatsApp or SMS follow-ups using additional nodes.
- **Enhance Tracking**: Integrate a dashboard (e.g., Google Data Studio) for real-time campaign analytics.
- **Automate Segmentation**: Add logic to segment contacts by industry or interest for targeted emails.

## Requirements to Run This Workflow

- **Google Sheets Account**: For storing and managing contact data and responses.
- **Gmail Account**: For sending emails and checking replies (requires IMAP enabled).
- **n8n Instance**: With Google Sheets and Gmail connectors configured.
- **Cron Service**: For scheduling the daily trigger.
- **Internet Connection**: To access Google Sheets and Gmail APIs.
- **API Credentials**: Gmail OAuth2 and Google Sheets API credentials set up in n8n.

## Notes

- Ensure the Google Sheet is shared with the n8n service account or has appropriate permissions.
- Test the workflow with a small contact list to verify email delivery and response logging.
- Adjust the stage logic in the "Determine Follow-Up Stage" node to match your campaign structure.
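A minimal sketch of the "Determine Follow-Up Stage" node might look like this. The stage names mirror the sheet structure above, but the three-day wait between emails and the progression rules are assumptions to adapt to your campaign.

```javascript
// Stage-progression sketch for the "Determine Follow-Up Stage" node.
// Stage names match the sheet above; the 3-day wait between emails is an assumption.
const STAGES = ['Initial', 'Follow-Up 1', 'Follow-Up 2'];
const WAIT_DAYS = 3;

return items.flatMap(({ json }) => {
  // Contacts who already replied drop out of the sequence.
  if (json.status === 'Replied') return [];

  const daysSinceLast = json.last_email_date
    ? (Date.now() - new Date(json.last_email_date).getTime()) / 86_400_000
    : Infinity;

  const idx = STAGES.indexOf(json.stage);

  // Send the first email immediately; advance a stage only after the wait period.
  if (json.status === 'Pending') {
    return [{ json: { ...json, next_stage: STAGES[0] } }];
  }
  if (idx >= 0 && idx < STAGES.length - 1 && daysSinceLast >= WAIT_DAYS) {
    return [{ json: { ...json, next_stage: STAGES[idx + 1] } }];
  }
  return []; // nothing to send for this contact today
});
```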
by WeblineIndia
## Automate Video Upload → Auto-Thumbnail → Google Drive

This workflow accepts a video via HTTP upload, verifies it's a valid video, extracts a thumbnail frame at the 5-second mark using FFmpeg (auto-installing a static build if missing), uploads the image to a specified Google Drive folder, and returns a structured JSON response containing the new file's details.

### Who's it for

- **Marketing / Social teams** who need ready-to-publish thumbnails from raw uploads.
- **Developers** who want an API-first thumbnail microservice without standing up extra infrastructure.
- **Agencies / Creators** standardizing assets in a shared Drive.

### How it works

1. **Accept Video Upload (Webhook)**: Receives multipart/form-data with the file in field media at /mediaUpload. The response is deferred until the final node.
2. **Validate Upload is Video (IF)**: Checks that {{$binary.media.mimeType}} contains video/. Non-video payloads can be rejected with HTTP 400.
3. **Persist Upload to /tmp (Write Binary File)**: Writes the uploaded file to /tmp/<originalFilename or input.mp4> for stable processing.
4. **Extract Thumbnail with FFmpeg (Execute Command)**: Uses the system ffmpeg if available; otherwise downloads a static binary to /tmp/ffmpeg. Runs:
   ffmpeg -y -ss 5 -i <input file> -frames:v 1 -q:v 2 /tmp/thumbnail.jpg
5. **Load Thumbnail from Disk (Read Binary File)**: Reads /tmp/thumbnail.jpg into the item's binary as thumbnail.
6. **Upload Thumbnail to Drive (Google Drive)**: Uploads to your target folder. The file name is <original>-thumb.jpg.
7. **Return API Response (Respond to Webhook)**: Sends JSON to the client including the Drive file id, name, links, size, and checksums (if available).

### How to set up

1. **Import** the workflow JSON into n8n.
2. **Google Drive**
   - Create (or choose) a destination folder; copy its Folder ID.
   - Add Google Drive OAuth2 credentials in n8n and select them in the Drive node.
   - Set the folder in the Drive "Upload" node.
3. **Webhook**
   - The endpoint is POST /webhook/mediaUpload.
   - Test:
     curl -X POST https://YOUR-N8N-URL/webhook/mediaUpload \
       -F "media=@/path/to/video.mp4"
4. **FFmpeg**
   - Nothing to install manually: the Execute Command node auto-installs a static ffmpeg if it's not present.
   - (Optional) If running n8n in Docker and you want permanence, use an image that includes ffmpeg.
5. **Response body**
   - The Respond node returns JSON with file metadata. You can customize the fields as needed.
6. **(Optional) Non-video branch**
   - On the IF node's false output, add a Respond node with HTTP 400 and a helpful message.

### Requirements

- n8n instance with the Execute Command node enabled (self-hosted/container/VM).
- **Outbound network** access (to fetch static FFmpeg if not installed).
- **Google Drive OAuth2** credential with permission to the destination folder.
- Adequate temp space in /tmp for the uploaded video and generated thumbnail.

### How to customize

- **Timestamp**: change -ss 5 to another second, or parameterize it via query/body (e.g., timestamp=15).
- **Multiple thumbnails**: duplicate the FFmpeg + Read steps with -ss 5, -ss 15, -ss 30, and suffix names -thumb-5.jpg, etc.
- **File naming**: include the upload time or Drive file ID: {{ base + '-' + $now + '-thumb.jpg' }}.
- **Public sharing**: add a **Drive → Permission: Create** node (Role: reader, Type: anyone) and return webViewLink.
- **Output target**: replace the Drive node with **S3 Upload** or **Zoho WorkDrive** (HTTP Request) if needed.
- **Validation**: enforce a max file size/MIME whitelist in a small Function node before writing to disk (see the sketch at the end of this description).
- **Logging**: append a row to Google Sheets/Notion with sourceFile, thumbId, size, duration, status.

### Add-ons

- **Slack / Teams notification** with the uploaded thumbnail link.
- **Image optimization** (e.g., convert to WebP or resize variants).
- **Retry & alerts** via an error trigger workflow.
- **Audit log** to a database (e.g., Postgres) for observability.

### Use Case Examples

- **CMS ingestion**: Editors upload videos; the workflow returns a thumbnail URL to store alongside the article.
- **Social scheduling**: Upload longform video to generate a quick hero image for a post.
- **Client portals**: Clients drop raw footage; you keep thumbnails uniform in one Drive folder.

### Common troubleshooting

| Issue | Possible Cause | Solution |
| --- | --- | --- |
| ffmpeg: not found | System lacks ffmpeg and the static build couldn't download | Ensure outbound HTTPS is allowed; keep the auto-installer lines intact; or use a Docker image that includes ffmpeg. |
| Webhook returns 400 "not a video" | Wrong field name or non-video MIME | Send the file in the media field; ensure it's video/*. |
| Drive upload fails (403 / insufficient permissions) | OAuth scope or account lacks folder access | Reconnect the Drive credential; verify the destination Folder ID and sharing/ownership. |
| Response missing webViewLink / webContentLink | Drive node not returning link fields | Enable link fields in the Drive node or build URLs using the returned id. |
| 413 Payload Too Large at reverse proxy | Proxy limits on upload size | Increase body size limits in your proxy (e.g., Nginx client_max_body_size). |
| Disk full / ENOSPC | Large uploads filling /tmp | Increase temp storage; keep the Cleanup step; consider size caps and early rejection. |
| Corrupt thumbnail or black frame | Timestamp lands on a black frame | Change -ss, or use -ss before -i vs. after; try different seconds (e.g., 1-3s). |
| Slow extraction | Large or remote files; cold FFmpeg download | Warm the container; host near the upload source; keep static ffmpeg cached in the image. |
| Duplicate outputs | Repeat requests with the same video/name | Add a de-dup check (query Drive for an existing <base>-thumb.jpg before upload). |

### Need Help?

Want this wired to S3 or Zoho WorkDrive, or to generate multiple timestamps and public links out of the box? We're happy to help.
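As referenced under "How to customize", a validation node can gate uploads before they hit disk. A minimal Code-node sketch; the 200 MB cap and MIME whitelist are illustrative assumptions, and the binary helper call should be verified against your n8n version.

```javascript
// Upload-validation sketch for a Code node placed before "Persist Upload to /tmp".
// The 200 MB limit and allowed types are illustrative assumptions.
const MAX_BYTES = 200 * 1024 * 1024;
const ALLOWED = ['video/mp4', 'video/quicktime', 'video/webm'];

const media = items[0].binary?.media;
if (!media) {
  throw new Error('No file found in the "media" field.');
}
if (!ALLOWED.some(t => media.mimeType.startsWith(t))) {
  throw new Error(`Unsupported type ${media.mimeType}; expected one of: ${ALLOWED.join(', ')}`);
}

// getBinaryDataBuffer reads the binary whether it is held in memory or on disk;
// confirm this helper is available in your n8n version.
const buffer = await this.helpers.getBinaryDataBuffer(0, 'media');
if (buffer.length > MAX_BYTES) {
  throw new Error(`File too large: ${buffer.length} bytes (max ${MAX_BYTES}).`);
}

return items;
```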
by AI/ML API | D1m7asis
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

## 🌅 Daily AI Inspiration (n8n + AI/ML API + Telegram)

This n8n workflow sends a short, original AI-generated quote and a matching cinematic image to your Telegram chat every morning. It auto-captures your chat ID from the first message you send to the bot, then runs on a daily schedule.

### 🚀 Key Features

- **Zero-Friction Delivery** — Just send any message once; the chat ID is saved for daily drops.
- **AI Quote Writer** — GPT-4o crafts concise, uplifting quotes (no author, no quotes).
- **Cinematic Visuals** — flux-pro turns each quote into a mood-rich illustration.
- **Hands-Off Scheduling** — Runs at a set time every day via the Schedule Trigger.
- **Telegram Ready** — Sends the image + caption directly to your chat.

### 🛠 Setup Guide

1. **Create AI/ML API Credentials**
   - Get your API key from AI/ML API.
   - In n8n → Credentials, add AI/ML API (Bearer token, Base URL https://api.aimlapi.com/v1).
2. **Create Telegram Credentials**
   - In Telegram, open @BotFather → /newbot → save the bot token.
   - In n8n → Credentials → Telegram API, paste the token.
3. **Capture the Chat ID**
   - Start the workflow and message your bot once (the Telegram Trigger will store the chat ID automatically).
4. **Schedule & Test**
   - Set your preferred time in the Schedule Trigger (e.g., 07:30).
   - Execute once to confirm delivery, then enable the workflow.

### 💡 How It Works

1. **Trigger** — Runs daily via the Schedule Trigger (or manually after the first chat message to capture the chat ID).
2. **Quote Generation** — AI/ML API (GPT-4o) produces a short, original, uplifting line (a request sketch appears at the end of this description).
3. **Image Creation** — AI/ML API (flux-pro) renders a cinematic image inspired by the quote.
4. **Telegram Delivery** — Sends the image to your chat with the quote as the caption (🌅 prefix).

Optional: tweak the image size (1024×1024 by default), add logging (Google Sheets), or extend with moderation, model switching, or multi-chat routing.
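For reference, the quote-generation call can be reproduced outside n8n with a plain HTTP request. This sketch assumes an OpenAI-compatible /chat/completions endpoint under the Base URL from the setup guide; the prompt and token limit are illustrative.

```javascript
// Quote-generation sketch against the Base URL from the setup guide.
// Assumes an OpenAI-compatible /chat/completions endpoint; verify the exact
// shape against the AI/ML API docs before relying on it.
const response = await fetch('https://api.aimlapi.com/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.AIML_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-4o',
    messages: [
      {
        role: 'user',
        content: 'Write one short, original, uplifting quote. No author, no quotation marks.',
      },
    ],
    max_tokens: 60,
  }),
});

const data = await response.json();
console.log(data.choices[0].message.content); // the caption text for Telegram
```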
by CryptooChai
This workflow fetches the latest trending cryptocurrency searches from CoinGecko and automatically sends them to your Telegram group/channel.

- ✅ No account or API key required
- ✅ Uses CoinGecko's free public API
- ✅ Sends formatted daily updates to Telegram
- ✅ Easy to customize the schedule with a Cron node

Perfect for community managers, traders, or anyone who wants to keep their Telegram group updated with the hottest crypto trends.

## How it works

1. A Manual Trigger (for testing) and a Schedule Trigger (runs automatically at set times) start the workflow.
2. It calls the CoinGecko API: https://api.coingecko.com/api/v3/search/trending
3. It extracts each coin's name, symbol, price (if available), and market cap rank.
4. It formats the results into a readable Telegram message and sends the message to your configured group/channel (a formatting sketch appears at the end of this description).

## How to use

1. **Import the workflow**: Download the JSON file and import it into your n8n instance.
2. **Connect your Telegram account**: Add Telegram credentials in n8n using your Bot Token. Replace the placeholder Telegram Chat ID (chatId) with your group/channel ID.
3. **Adjust the schedule (optional)**: By default, the workflow runs at 8:30 AM and 8:30 PM IST. You can change this in the Schedule Trigger node.
4. **Activate the workflow**: Once configured, activate it, and your Telegram group will start receiving daily trending coin updates.

## Requirements

- An n8n instance (self-hosted or cloud)
- A Telegram Bot Token (create via BotFather)
- A Telegram group or channel ID where messages should be sent
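The extraction and formatting (steps 3-4 above) fit in a single Code node. A minimal sketch: the `coins[].item` shape matches CoinGecko's documented trending response, but treat the nested price field as an assumption, since it has moved between API versions.

```javascript
// Format CoinGecko trending results into a Telegram-ready message.
// `coins[].item.{name, symbol, market_cap_rank}` follows the documented
// response; the nested `data.price` field is a newer addition, so verify it.
const { coins } = items[0].json;

const lines = coins.map(({ item }, i) => {
  const rank = item.market_cap_rank ?? 'n/a';
  const price = item.data?.price != null ? ` / $${item.data.price}` : '';
  return `${i + 1}. ${item.name} (${item.symbol}), rank #${rank}${price}`;
});

const message = `🔥 Trending on CoinGecko today:\n\n${lines.join('\n')}`;

return [{ json: { message } }];
```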
by Paul Kobelke
## Remove Duplicates & Update Google Sheets

### How it Works

This workflow helps you keep your Google Sheets clean and up-to-date by automatically removing duplicate entries and re-uploading the cleaned data back to your sheet. It's especially useful for large lead lists, email databases, or any dataset where duplicate rows can cause confusion and inefficiency.

The flow:
1. Trigger the workflow manually.
2. Fetch all rows from a specific Google Sheet.
3. Identify and remove duplicate rows based on the profileUrl field (a Code-node alternative is sketched at the end of this description).
4. Convert the cleaned dataset into a file.
5. Update the original Google Sheet with the new, deduplicated data.

### Setup Steps

1. Connect your Google Sheets and Google Drive credentials in n8n.
2. Update the workflow with your desired spreadsheet and sheet ID.
3. Run the workflow by clicking "Execute Workflow" whenever you want to clean up your data.

The process only takes a few seconds and ensures your sheet stays organized without any manual effort.

### Use Cases

- CRM lead management (avoiding duplicate prospects).
- Contact lists with scraped or imported data.
- Marketing databases with overlapping submissions.
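If you prefer a Code node over a built-in Remove Duplicates node, a keyed dedup takes only a few lines. A minimal sketch that keeps the first occurrence of each profileUrl:

```javascript
// Keep only the first row seen for each profileUrl.
const seen = new Set();
return items.filter(({ json }) => {
  const key = (json.profileUrl || '').trim().toLowerCase();
  if (!key) return true;           // keep rows without a profileUrl untouched
  if (seen.has(key)) return false; // drop repeats, keeping the first occurrence
  seen.add(key);
  return true;
});
```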
by M Sayed
## Telegram Restaurant Bot Workflow

This workflow creates a Telegram bot that fetches the top 5 rated restaurants for any specified area in Egypt using SerpAPI's Google Maps search. It's designed to provide quick, detailed, and richly formatted information directly in your chat.

### Key Features

- **Simple Command Trigger**: Activate the search with a straightforward command (e.g., التحرير, "Tahrir")
- **Real-Time Restaurant Data**: Utilizes SerpAPI to pull live data, ratings, and details from Google Maps
- **Top 5 Ranking**: Automatically sorts restaurants by their rating in descending order and presents the top five
- **Richly Formatted Replies**: Generates a clean, user-friendly Markdown message with essential details:
  - Rating ⭐
  - Phone Number ☎️
  - Website 🌐
  - Service Options (Dine-in ✅ | Takeaway ❌)
  - A direct link to the location on a map 📍
- **Arabic Language Focused**: The workflow is initially configured to process requests and format replies in Arabic

### How It Works

1. A user sends a place name like التحرير to the Telegram bot.
2. The Parse Area node extracts the location name from the message text.
3. The Geocode (Nominatim) node finds the geographic coordinates for the area (this can be adapted for more precise location searches).
4. The Find Restaurants (SerpAPI) node uses this area to perform a Google Maps search.
5. The Format Reply node processes the search results: it sorts them by rating, takes the top 5, and builds a detailed Markdown-formatted string (a sketch appears at the end of this description).
6. The Send to Telegram node delivers the final formatted message back to the user who made the request.

### Setup

- **Telegram**: Configure your credentials in the Telegram Trigger and Send to Telegram nodes
- **SerpAPI**: Add your free or paid API key in the Find Restaurants (SerpAPI) HTTP Request node
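A minimal sketch of the Format Reply node. The field names follow SerpAPI's Google Maps response (`local_results`, `rating`, `service_options`, `gps_coordinates`), but treat them as assumptions and check your actual payload:

```javascript
// Format Reply sketch: sort by rating, take the top 5, build a Markdown reply.
// Field names (local_results, rating, phone, website, service_options,
// gps_coordinates) follow SerpAPI's Google Maps response; verify against
// your actual payload.
const results = items[0].json.local_results || [];

const top5 = [...results]
  .sort((a, b) => (b.rating || 0) - (a.rating || 0))
  .slice(0, 5);

const text = top5.map((r, i) => {
  const dineIn = r.service_options?.dine_in ? '✅' : '❌';
  const takeaway = r.service_options?.takeout ? '✅' : '❌';
  const map = r.gps_coordinates
    ? `https://www.google.com/maps?q=${r.gps_coordinates.latitude},${r.gps_coordinates.longitude}`
    : '';
  return [
    `*${i + 1}. ${r.title}*`,
    `⭐ ${r.rating ?? 'n/a'}`,
    `☎️ ${r.phone ?? '-'}`,
    `🌐 ${r.website ?? '-'}`,
    `Dine-in ${dineIn} | Takeaway ${takeaway}`,
    map && `📍 ${map}`,
  ].filter(Boolean).join('\n');
}).join('\n\n');

return [{ json: { text } }];
```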
by Robert Breen
This workflow checks a Google Sheet for new tasks (marked Added = No) and automatically creates them in a Monday.com board. Once added, the workflow updates the sheet to mark the task as Added = Yes. A sketch of the row filter appears at the end of this description.

## ⚙️ Setup Instructions

### 1️⃣ Prepare Your Google Sheet

- Copy this template to your own Google Drive: Google Sheet Template
- The first row should contain column names.
- Add your data in rows 2-100. Make sure each new task row starts with Added = No.

**Connect Google Sheets in n8n**
- Go to n8n → Credentials → New → Google Sheets (OAuth2)
- Log in with your Google account and grant access.
- In the workflow, select your Spreadsheet ID and Worksheet Name.
- Optional: You can connect Airtable, Notion, or your database instead of Google Sheets.

### 2️⃣ Connect the Monday.com Node

- In Monday.com, go to Admin → API and copy your Personal API Token.
- Docs: Generate Monday API Token
- In n8n → Credentials → New → Monday.com API, paste your token and save.
- Open the Create Monday Task node → choose your credential → select your Board ID and Group ID.

## 📬 Contact

Need help customizing this (e.g., mapping more fields, syncing statuses, or updating timelines)?
- 📧 robert@ynteractive.com
- 🔗 Robert Breen
- 🌐 ynteractive.com
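The "new task" check can be expressed as a small Code-node filter between the Google Sheets read and the Monday.com node. A minimal sketch, assuming each row exposes the `Added` column from the template and a `Task` name column (the latter is an assumed header, so match it to your sheet):

```javascript
// Forward only rows not yet pushed to Monday.com.
// `Added` comes from the sheet template; `Task` is an assumed name column.
return items.filter(({ json }) =>
  String(json.Added).trim().toLowerCase() === 'no' && json.Task
);
```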