by 1Shot API
Launch ERC-20 Tokens with a Telegram Bot and 1Shot API Custodial Wallets

Find the full walk-through tutorial video for this workflow on YouTube. This Telegram bot template demonstrates useful patterns for building crypto-powered, multi-user bot applications, including:

- How to create custodial wallets tied to specific Telegram users.
- How to submit onchain transactions from a specific user's wallet when they interact with your bot.
- Storing transaction logs in n8n Data Tables.
- Setting up inline keyboards for triggering specific onchain actions.

Starting from this workflow, you can extend it for many different crypto/onchain applications, or introduce fees to generate revenue in return for performing onchain actions for your users.

NOTE: This is a somewhat advanced setup. Because it relies on Data Tables, you will need to self-host this workflow.

Setup

You'll need to perform the following steps to get this Telegram bot workflow up and running (see the data-table sketch below for the expected columns):

1. Create a free 1Shot API account and generate an API Key & Secret.
2. Use the API credentials to authenticate the 1Shot API nodes.
3. Create a new Telegram bot using the @BotFather bot to get an API key.
4. Use the bot API key to authenticate the Telegram nodes.
5. Create two Data Tables in your n8n account: a User Table and a Token Table. The User Table should have chatid, wallet, and walletid as columns. The Token Table should have wallet, token, name, and chatid as columns. These tables store the mappings between your bot users and their wallets and tokens.
6. Import the 1Shot API Token Factory contract into your 1Shot API account. This smart contract is needed to deploy ERC-20 tokens on behalf of your bot users.

You can also watch the associated YouTube tutorial for a step-by-step walkthrough of these points.
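A minimal sketch of the two Data Table row shapes and a user-to-wallet lookup, assuming the column names listed above; the interface and helper names are illustrative, not part of the template itself:

```typescript
// Row shapes matching the columns described in the setup steps.
interface UserRow {
  chatid: string;   // Telegram chat ID of the bot user
  wallet: string;   // address of the user's custodial wallet
  walletid: string; // 1Shot API wallet identifier
}

interface TokenRow {
  wallet: string; // wallet that deployed the token
  token: string;  // deployed ERC-20 contract address
  name: string;   // token name
  chatid: string; // Telegram chat ID of the owner
}

// Resolve the custodial wallet for an incoming Telegram message,
// e.g. inside an n8n Code node after reading the User Table.
function findWalletForChat(users: UserRow[], chatid: string): UserRow | undefined {
  return users.find((row) => row.chatid === chatid);
}

// Example usage with an in-memory list standing in for the Data Table rows.
const users: UserRow[] = [
  { chatid: "123456789", wallet: "0x0000000000000000000000000000000000000001", walletid: "wallet_01" },
];
console.log(findWalletForChat(users, "123456789")?.wallet);
```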
by Sheikh Masem Mandal
🚀 Cybersecurity News Automation Workflow

This n8n automation workflow fetches daily cybersecurity news, cleans it, summarizes it with AI, and posts it automatically to a Telegram channel.

🔎 Workflow Steps

1. Triggering the Workflow
   - 9 AM - Schedule Trigger: Starts the workflow every day at 9 AM.
2. Fetching Cybersecurity News
   - Bleeping Computer Security Bulletin: Pulls the latest news from the RSS feed.
3. Processing Articles
   - HTTP Request → Filter Body → Extract Images: Retrieves the full article, cleans the HTML, and pulls image links (see the sketch below).
   - AI Agent (OpenRouter Grok): Summarizes the article snippet into two short sentences.
   - Memory Buffer: Maintains short-term context across articles.
4. Merging Data
   - Merge Node: Combines images, cleaned text, and AI-generated summaries.
5. Filtering Sponsored Content
   - Sponsored Removal: Excludes articles with "Sponsored" in the creator field.
6. Posting to Telegram
   - Loop Over Items + Send Photo Message: Publishes sponsor-free, summarized articles to the @DailySecurityNewss Telegram channel. Each post includes: title, author, date, AI summary, categories, image (if available), and the "Read more" link.
   - Wait 1 min: Adds a short delay to avoid spamming Telegram.

🎯 Outcome

✅ Daily feed of cybersecurity news
✅ Clean, AI-simplified summaries
✅ Images & links preserved
✅ Sponsored posts filtered
✅ Auto-posted to Telegram at 9 AM
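A minimal sketch of the "clean the HTML and pull image links" step from item 3, assuming the full article HTML is available as a string; the function names are illustrative, not the actual node configuration:

```typescript
// Collect the src attribute of every <img> tag in the article body.
function extractImageUrls(html: string): string[] {
  const matches = html.matchAll(/<img[^>]+src=["']([^"']+)["']/gi);
  return Array.from(matches, (m) => m[1]);
}

// Drop scripts/styles, then remove remaining tags and collapse whitespace.
function stripHtml(html: string): string {
  return html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();
}

const article = `<p>New ransomware campaign targets hospitals.</p><img src="https://example.com/cover.png">`;
console.log(extractImageUrls(article)); // ["https://example.com/cover.png"]
console.log(stripHtml(article));        // "New ransomware campaign targets hospitals."
```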
by DuyTran
Overview

This n8n workflow automatically fetches the latest news articles from multiple RSS sources, filters for the last 24 hours, summarizes them into a concise ~400-word digest in Vietnamese, and then delivers the result directly to Zalo and Telegram. It's designed for professionals who need a quick, AI-curated overview of daily news without manually browsing multiple websites.

🧩 Key Features

⏰ Triggers
- Schedule Trigger: Run at specific times (e.g., morning briefing).
- Zalo & Telegram Triggers: Start the workflow when requested via chat.

🌐 News Collection
- Fetches news from 4 RSS sources (RSS.app, Google News, etc.).
- Extracts standardized fields (title, pubDate, content).

🔍 Filtering & Processing
- Keeps only news published in the last 24h (see the sketch below).
- Limits to the 20 most recent items.
- Aggregates multiple feeds into one dataset.

🧠 AI Summarization
- Uses OpenAI Assistant to generate 15–19 highlights (~400 words).
- Translates into Vietnamese, removes special symbols.
- Optionally calls Perplexity AI to refine content into a financial–economic–political style summary.
- Maintains short-term context with a Memory Buffer for improved output.

📲 Delivery Channels
- Sends the digest directly to Zalo (personal & group chat).
- Sends the digest to a Telegram bot.

⚙️ Workflow Steps
1. Trigger – schedule or chat command (Zalo/Telegram).
2. RSS Fetchers (4 feeds) – collect news.
3. Edit Fields – normalize title, date, and content.
4. Merge & Filter – unify feeds, keep only the last 24h.
5. Limit & Aggregate – select the top 20 articles.
6. AI Summarization – generate the digest via OpenAI + Perplexity.
7. Delivery – send results to Zalo & Telegram.

🔐 Requirements
- ✅ RSS source URLs (already set in the workflow).
- 🔑 OpenAI API key.
- 🔑 Perplexity API key.
- 📲 Zalo User API + Telegram Bot API credentials.

📥 Example Use Case

A financial analyst or business leader wants a daily briefing in Vietnamese. At 7 AM, they automatically receive a curated 400-word digest via Telegram and Zalo. They can also trigger the report on demand from chat.

🛠 Customization Options
- Add/remove RSS sources.
- Adjust summary length (short/medium/long).
- Output to other channels (Email, Slack, Notion).
- Change the language from Vietnamese → English.

⚠️ Limitations
- RSS feeds must be valid & accessible.
- Summaries depend on AI quality and may vary slightly.
- The Perplexity API requires an active subscription.

📌 SEO Tags

n8n workflow, rss news summarizer, daily news digest, telegram news bot, zalo ai assistant, openai news summary, perplexity ai, financial political economic news
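A minimal sketch of the Merge & Filter and Limit steps (keep items published within the last 24 hours and cap the result at the 20 most recent), assuming the standardized fields mentioned above; the item shape is illustrative:

```typescript
interface FeedItem {
  title: string;
  pubDate: string; // ISO or RFC-2822 date string from the RSS node
  content: string;
}

function lastDayTop20(items: FeedItem[]): FeedItem[] {
  const cutoff = Date.now() - 24 * 60 * 60 * 1000;
  return items
    .filter((item) => new Date(item.pubDate).getTime() >= cutoff)
    .sort((a, b) => new Date(b.pubDate).getTime() - new Date(a.pubDate).getTime())
    .slice(0, 20);
}

// Example: feed the filtered items into the AI summarization step.
const digestInput = lastDayTop20([
  { title: "Market update", pubDate: new Date().toISOString(), content: "Central bank holds rates." },
]);
console.log(digestInput.length);
```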
by Dart
Automatically generate a meeting summary from your meetings through Fireflies.ai, save it to a Dart document, and create a review task with the meeting link attached.

What it does

This workflow activates when a Fireflies.ai meeting is processed (via a webhook). It retrieves the meeting transcript via the FirefliesAI transcript node and uses an AI model to generate a structured summary.

Who's it for
- Teams or individuals needing automatic meeting notes.
- Project managers tracking reviews and actions from meetings.
- Users of Fireflies.ai and Dart who want to streamline their documentation and follow-up process.

How to set up
1. Import the workflow into n8n.
2. Connect your Dart account (it will need workspace and folder access).
3. Add your PROD webhook link from the webhook node to your Fireflies.ai API settings.
4. Add your Fireflies.ai API key on the Fireflies Transcript node.
5. Replace the dummy Folder ID and Dartboard ID with your actual target IDs.
6. Choose your preferred AI model for generating the summaries.

Requirements
- n8n account
- Connected Dart account
- Connected Fireflies.ai account (with access to API key and webhooks)

How to customize the workflow
- Edit the AI prompt to adjust the tone, style, or format of the meeting summaries (see the prompt sketch below).
- Add, remove, or change the summary sections to match your needs (e.g., Key takeaways, Action Items, Summary).
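A minimal sketch of how a structured-summary prompt for the AI model might be assembled from the transcript text; the section names come from the customization note above, but the function and wording are illustrative, not the template's actual prompt:

```typescript
const SECTIONS = ["Summary", "Key takeaways", "Action Items"] as const;

// Build a prompt that asks the model for a Markdown summary with fixed sections.
function buildSummaryPrompt(transcript: string): string {
  return [
    "You are a meeting assistant. Summarize the transcript below.",
    `Return Markdown with these sections: ${SECTIONS.join(", ")}.`,
    "Keep action items as a checklist with owners where mentioned.",
    "",
    "Transcript:",
    transcript,
  ].join("\n");
}

console.log(buildSummaryPrompt("Alice: Let's ship the beta on Friday. Bob: I'll prepare the release notes."));
```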
by Amir Tadrisi
Direct Booking Site Generator Workflow

This workflow instantly transforms any Airbnb listing into a polished, mobile-ready direct booking site hosted on the Netlify platform.

Requirements

1. Install the Airbnb Scraper Node

This workflow depends on the community package n8n-nodes-airbnb-scraper. Install it on your n8n instance (Settings → Community Nodes) before importing the workflow.

2. Generate the Required API Tokens

| Credential | Purpose | Where to create it |
| --- | --- | --- |
| Airbnb Scraper API Token | Authenticates the Airbnb Scraper node so it can fetch listing data. | Sign up at scraper.shortrentals.ai and copy your API token from the dashboard. |
| Netlify Personal Access Token | Allows the workflow to create sites and deploy ZIP assets through the Netlify API. | Go to Netlify User Settings → Applications → Personal access tokens and generate a token with Deploy sites permissions. |

Store both tokens as credentials in n8n (Airbnb Scraper API and Netlify API Token) before executing the workflow.

How the Workflow Works

1. Manual Trigger & Listing Input – Provide any Airbnb listingId and run the workflow.
2. Data Collection – n8n-nodes-airbnb-scraper pulls rich listing data (photos, amenities, host details, pricing, reviews, etc.).
3. Static Site Generation – The Generate HTML Site node transforms that data into a premium, mobile-responsive landing page with a sticky booking card, amenities grid, gallery, and shortrentals.ai credit.
4. ZIP Packaging – Prepare Binary and Create ZIP convert the HTML into a Netlify-ready archive (rooted index.html).
5. Netlify Deploy – Create Netlify Site spins up a new site (unique subdomain per run). Deploy ZIP uploads the packaged site via Netlify's deploy API (see the sketch below).
6. Output – The final node returns the public URL, admin dashboard link, site ID, and deploy metadata so you can verify or reuse the site later.

Need More Functionality?

If you require conversion-ready sites with payments, calendar sync, or a booking engine, head to sitebuilder.shortrentals.ai to explore the full product suite.

Questions or Custom Builds?

Visit shortrentals.ai for product info and tutorials. Reach our team anytime at hello@shortrentals.ai. Happy hosting! 🚀
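A minimal sketch of the two Netlify API calls behind the Create Netlify Site and Deploy ZIP steps, assuming Node 18+ with global fetch; the token handling and file path are illustrative, and Netlify's API docs remain the reference for current response fields:

```typescript
import { readFile } from "node:fs/promises";

const NETLIFY_TOKEN = process.env.NETLIFY_TOKEN!; // personal access token

// Create a new site; Netlify assigns a unique subdomain automatically.
async function createSite(): Promise<string> {
  const res = await fetch("https://api.netlify.com/api/v1/sites", {
    method: "POST",
    headers: { Authorization: `Bearer ${NETLIFY_TOKEN}` },
  });
  const site = await res.json();
  return site.id;
}

// Deploy a ZIP archive (rooted at index.html) to the site.
async function deployZip(siteId: string, zipPath: string): Promise<void> {
  const zip = await readFile(zipPath);
  const res = await fetch(`https://api.netlify.com/api/v1/sites/${siteId}/deploys`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${NETLIFY_TOKEN}`,
      "Content-Type": "application/zip",
    },
    body: zip,
  });
  const deploy = await res.json();
  console.log("Deployed:", deploy.ssl_url ?? deploy.url);
}

const siteId = await createSite();
await deployZip(siteId, "./site.zip"); // illustrative path to the packaged archive
```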
by rana tamure
Overview

This n8n workflow, "Keyword Search for Blogs," automates the process of gathering and organizing keyword research data for SEO purposes. It integrates with Google Sheets and Google Drive to manage input and output data, and leverages the DataForSEO API to fetch comprehensive keyword-related information, including related keywords, keyword suggestions, keyword ideas, autocomplete suggestions, subtopics, and SERP (Search Engine Results Page) analysis. The workflow is designed to streamline SEO research by collecting, processing, and storing data in an organized manner for blog content creation.

Workflow Functionality

The workflow performs the following key functions:

- Trigger: Initiated manually via the "When clicking 'Test workflow'" node, allowing users to start the process on demand.
- Input Data Retrieval: Reads primary keywords, location, and language data from a specified Google Sheet ("SEO PRO").
- Spreadsheet Creation: Creates a new Google Sheet with a dynamic title based on the current date (e.g., "YYYY-MM-DD-seo pro") and predefined sheet names for organizing different types of keyword data (e.g., keyword, SERP, Content, related keyword, keyword ideas, suggested keyword, subtopics, autocomplete).
- Google Drive Integration: Moves the newly created spreadsheet to a designated folder ("seo pro") in Google Drive for organized storage.
- API Data Collection (see the request sketch below):
  - Related Keywords: Fetches related keywords using the DataForSEO API (/v3/dataforseo_labs/google/related_keywords/live), including SERP information and keyword metrics like search volume, CPC, and competition.
  - Keyword Suggestions: Retrieves keyword suggestions via the DataForSEO API (/v3/dataforseo_labs/google/keyword_suggestions/live).
  - Keyword Ideas: Collects keyword ideas using the DataForSEO API (/v3/dataforseo_labs/google/keyword_ideas/live).
  - Autocomplete Suggestions: Gathers Google autocomplete suggestions through the DataForSEO API (/v3/serp/google/autocomplete/live/advanced).
  - Subtopics: Generates subtopics for the primary keyword using the DataForSEO API (/v3/content_generation/generate_sub_topics/live).
  - People Also Ask & Organic Results: Pulls "People Also Ask" questions and organic SERP results via the DataForSEO API (/v3/serp/google/organic/live/advanced).
- Data Processing:
  - Uses Split Out nodes to break down API responses into individual items for processing.
  - Employs Edit Fields nodes to map and format data, extracting relevant fields like keyword, search intent, search volume, CPC, competition, keyword difficulty, and SERP item types.
  - Filters SERP results to separate "People Also Ask" and organic results for targeted processing.
- Data Storage: Appends processed data to multiple tabs in the destination Google Sheet (e.g., "2025-06-08-seo pro"):
  - Master Sheet: Stores comprehensive data including keywords, search intent, related keywords, SERP analysis, and more.
  - Related Keywords: Stores related keyword data with metrics.
  - Suggested Keywords: Stores suggested keyword data.
  - Keyword Ideas: Stores keyword ideas with relevant metrics.
  - Autocomplete: Stores autocomplete suggestions.
  - Subtopics: Stores generated subtopics.
  - Organic Results: Stores organic SERP data with details like domain, URL, title, and description.

Key Features
- Automation: Eliminates manual keyword research by automating data collection and organization.
- Scalability: Processes multiple keywords and their related data in a single workflow run, with a limit of 100 related items per API call.
- Dynamic Organization: Creates and organizes data in a new Google Sheet with a timestamped title, ensuring easy tracking of research over time.
- Comprehensive SEO Insights: Collects diverse SEO metrics (e.g., keyword difficulty, search intent, SERP item types) to inform content strategy.
- Error Handling: Uses filters to ensure only relevant data (e.g., "people_also_ask" or "organic" results) is processed and stored.

Use Case

This workflow is ideal for SEO professionals, content creators, and digital marketers who need to perform in-depth keyword research for blog content. It provides a structured dataset that can be used to identify high-potential keywords, understand search intent, analyze SERP competition, and generate content ideas, all of which are critical for optimizing blog posts to rank higher on search engines.

Inputs
- Google Sheet ("SEO PRO"): Contains primary keywords, location names, and language names.
- Google Drive Folder: Destination folder ("seo pro") for storing the output spreadsheet.
- DataForSEO API Credentials: Requires HTTP Basic Authentication credentials for accessing DataForSEO API endpoints.

Outputs

A new Google Sheet titled with the current date (e.g., "2025-06-08-seo pro") containing multiple tabs:
- Master Sheet: Aggregated data for all keyword types.
- Related Keywords: Detailed metrics for related keywords.
- Suggested Keywords: Suggested keywords with metrics.
- Keyword Ideas: Keyword ideas with metrics.
- Autocomplete: Google autocomplete suggestions.
- Subtopics: Generated subtopics for content planning.
- Organic Results: Organic SERP data including domains, URLs, titles, and descriptions.

Benefits
- Time-Saving: Automates repetitive tasks, reducing manual effort in keyword research.
- Organized Data: Stores all data in a structured Google Sheet for easy access and analysis.
- Actionable Insights: Provides detailed SEO metrics to guide content creation and optimization strategies.
- Scalable and Reusable: Can be reused for different keywords by updating the input Google Sheet.

Technical Details
- Nodes: Utilizes n8n nodes including manualTrigger, googleSheets, googleDrive, httpRequest, splitOut, set, and filter.
- API Integration: Leverages the DataForSEO API for real-time keyword and SERP data.
- Credentials: Requires Google Sheets OAuth2 and Google Drive OAuth2 credentials, along with DataForSEO HTTP Basic Authentication.
- Data Mapping: Uses set nodes to map API response fields to desired output formats, ensuring compatibility with Google Sheets.

Potential Enhancements
- Add error handling for API failures or invalid inputs.
- Include additional DataForSEO API endpoints for more granular data (e.g., competitor analysis).
- Implement deduplication logic to avoid redundant keyword entries.
- Add a scheduling node to run the workflow automatically at regular intervals.

This workflow is a powerful tool for SEO-driven content planning, providing a robust foundation for creating optimized blog content.
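A minimal sketch of one of the DataForSEO calls listed above (the related-keywords live endpoint), roughly as the workflow's HTTP Request node would issue it with HTTP Basic Authentication; the keyword, location, and language values are illustrative and come from the input sheet in the real workflow:

```typescript
const DATAFORSEO_LOGIN = process.env.DATAFORSEO_LOGIN!;
const DATAFORSEO_PASSWORD = process.env.DATAFORSEO_PASSWORD!;

// DataForSEO live endpoints accept an array of task objects in the POST body.
async function relatedKeywords(keyword: string) {
  const auth = Buffer.from(`${DATAFORSEO_LOGIN}:${DATAFORSEO_PASSWORD}`).toString("base64");
  const res = await fetch(
    "https://api.dataforseo.com/v3/dataforseo_labs/google/related_keywords/live",
    {
      method: "POST",
      headers: {
        Authorization: `Basic ${auth}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify([
        {
          keyword,
          location_name: "United States", // illustrative; read from the "SEO PRO" sheet in the workflow
          language_name: "English",
          limit: 100, // matches the 100-item cap noted under Key Features
        },
      ]),
    },
  );
  return res.json();
}

const data = await relatedKeywords("home coffee roasting");
console.log(JSON.stringify(data.tasks?.[0]?.result?.[0], null, 2));
```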
by The O Suite
The Bug Bounty Target Recon n8n workflow is a powerful automation tool for security professionals and ethical hackers. It efficiently automates the time-consuming process of external attack surface mapping. Given a domain, the workflow performs DNS lookups to identify all associated IP addresses, and then utilizes the Shodan API to query:

- Detailed service banners
- Open ports
- Technologies
- Known vulnerabilities

This system delivers crucial, organized OSINT data, saving the user hours of manual scripting and reconnaissance, and providing a clear, actionable map of a target's exposed infrastructure.
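A minimal sketch of the DNS-lookup-plus-Shodan step, assuming Node's built-in resolver and Shodan's host lookup endpoint; error handling and the workflow's actual node configuration are omitted, and the result fields are trimmed for illustration:

```typescript
import { resolve4 } from "node:dns/promises";

const SHODAN_API_KEY = process.env.SHODAN_API_KEY!;

// Resolve the target domain to its A records, then ask Shodan about each IP.
async function reconDomain(domain: string) {
  const ips = await resolve4(domain);
  const results = [];
  for (const ip of ips) {
    const res = await fetch(`https://api.shodan.io/shodan/host/${ip}?key=${SHODAN_API_KEY}`);
    const host = await res.json();
    results.push({
      ip,
      ports: host.ports,        // open ports observed by Shodan
      vulns: host.vulns ?? [],  // known CVEs, when available
      banners: (host.data ?? []).map((svc: any) => svc.product).filter(Boolean),
    });
  }
  return results;
}

console.log(await reconDomain("example.com"));
```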
by Sk developer
Threads Video Downloader & Google Drive Logger

Automate downloading Threads videos from URLs, upload them to Google Drive, and log results in Google Sheets using n8n.

API Source: Threads Downloader on RapidAPI

Workflow Explanation

| Node | Explanation |
| --- | --- |
| On form submission | Triggers the workflow when a user submits a Threads URL via a form. |
| Fetch Threads Video Data | Sends the submitted URL to the Threads Downloader API to get video info. |
| Check If Video Exists | Checks if the API returned a valid downloadable video URL. |
| Download Threads Video File | Downloads the video from the API-provided URL. |
| Upload Video to Google Drive | Uploads the downloaded video to a designated Google Drive folder. |
| Set Google Drive Sharing Permissions | Sets sharing permissions so the uploaded video is accessible via a link. |
| Log Success to Google Sheets | Records the original URL and Google Drive link in Google Sheets for successful downloads. |
| Wait Before Logging Failure | Adds a pause before logging failed downloads to avoid timing issues. |
| Log Failed Download to Google Sheets | Logs URLs with "N/A" for videos that failed to download. |

How to Obtain a RapidAPI Key
1. Go to the Threads Downloader API on RapidAPI.
2. Sign up or log in to RapidAPI.
3. Subscribe to the API (free or paid plan).
4. Copy the X-RapidAPI-Key from your dashboard and paste it into the n8n HTTP Request node (see the request sketch below).

✅ Note: Keep your API key private.

How to Configure Google Drive & Google Sheets

Google Drive
1. Go to Google Drive and create a folder for videos.
2. In n8n, create Google Drive OAuth2 credentials and connect your account.
3. Configure the Upload Video node to target your folder.

Google Sheets
1. Create a spreadsheet with columns: URL | Drive_URL.
2. Create Google API credentials in n8n (service account or OAuth2).
3. Map the nodes to log successful or failed downloads.

Google Sheet Column Table Example

| URL | Drive_URL |
| --- | --- |
| https://www.threads.net/p/abc123 | https://drive.google.com/file/d/xyz/view |
| https://www.threads.net/p/def456 | N/A |

Use Case & Benefits

Use Case: Automate downloading Threads videos for marketing, content archiving, or research.

Benefits:
- Saves time with automated downloads.
- Centralized storage in Google Drive.
- Keeps a clear log in Google Sheets.
- Works with multiple Threads URLs without manual effort.
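A minimal sketch of what the "Fetch Threads Video Data" HTTP request could look like. The two X-RapidAPI-* headers are the standard RapidAPI authentication headers, but the host, path, and query parameter below are placeholders; substitute whatever the API's RapidAPI page specifies:

```typescript
const RAPIDAPI_KEY = process.env.RAPIDAPI_KEY!;

// Hypothetical host and path: copy the real values from the RapidAPI listing.
const RAPIDAPI_HOST = "threads-downloader.example.p.rapidapi.com";

async function fetchThreadsVideo(threadsUrl: string) {
  const url = `https://${RAPIDAPI_HOST}/download?url=${encodeURIComponent(threadsUrl)}`;
  const res = await fetch(url, {
    headers: {
      "X-RapidAPI-Key": RAPIDAPI_KEY,
      "X-RapidAPI-Host": RAPIDAPI_HOST,
    },
  });
  return res.json(); // expected to contain a downloadable video URL on success
}

console.log(await fetchThreadsVideo("https://www.threads.net/p/abc123"));
```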
by Anchor
Find Company LinkedIn URLs Directly in Google Sheets

This n8n template shows how to populate a Google Spreadsheet with LinkedIn company URLs automatically, using the Apify LinkedIn Company URL Finder actor from Anchor. It creates a new sheet with the matched LinkedIn URLs. You can use it to speed up lead research, keep CRM records consistent, or prep outreach lists, all directly inside n8n.

Who is this for
- Sales Teams: Map accounts to their official LinkedIn pages fast.
- Recruiters: Locate company pages before sourcing.
- Growth Marketers: Clean and enrich account lists at scale.
- Researchers: Track competitors and market segments.
- CRM Builders: Normalize company records with an authoritative URL.
- Lead-Gen Agencies: Deliver verified company URLs at volume.

How it works
1. Write a list of company names in Google Sheets (one per row).
2. The Apify node resolves each name to its LinkedIn company page (see the sketch below).
3. The results are stored in a new Google Sheet.

How to use
- In Google Sheets: Create a Google Sheet, rename the sheet to companies, and add all the company names you want to resolve (one per row).
- In this workflow: Open "Set google sheet URL & original sheet name" and replace the example Google Sheet URL and the name of the sheet where your company names are.
- In the n8n credentials: Connect your Google Sheets account with read and write privileges. Connect your Apify account.
- In Apify: Sign up for this Apify Actor.

Requirements
- Apify account with access to LinkedIn Company URL Finder.
- A list of company names to process.

Need Help?

Open an issue directly on Apify! Average answer time is less than 24h. Happy URL finding!
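A minimal sketch of running an Apify actor synchronously and reading its dataset items, which is roughly what the Apify node does under the hood; the actor ID and input field name are placeholders, since the actual input schema is defined on the actor's Apify page:

```typescript
const APIFY_TOKEN = process.env.APIFY_TOKEN!;

// Placeholder actor ID: use the LinkedIn Company URL Finder actor's real ID from Apify.
const ACTOR_ID = "anchor~linkedin-company-url-finder";

async function findLinkedInUrls(companyNames: string[]) {
  const res = await fetch(
    `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ companyNames }), // input field name is illustrative
    },
  );
  return res.json(); // one item per company, each expected to carry the matched LinkedIn URL
}

console.log(await findLinkedInUrls(["Acme Corp", "Globex"]));
```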
by PDF Vector
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

High-Volume PDF to Markdown Conversion

Convert multiple PDF documents to clean, structured Markdown format in bulk. Perfect for documentation teams, content managers, and anyone needing to process large volumes of PDFs.

Key Features:
- Process PDFs from multiple sources (URLs, Google Drive, Dropbox)
- Intelligent LLM-based parsing for complex layouts
- Preserve formatting, tables, and structure
- Export to various destinations

Workflow Components:
- Input Sources: Multiple file sources supported
- Batch Processing: Handle hundreds of PDFs efficiently
- Smart Parsing: Auto-detect when LLM parsing is needed
- Quality Check: Validate conversion results
- Export Options: Save to cloud storage or database

Ideal For:
- Converting technical documentation
- Migrating legacy PDF content
- Building searchable knowledge bases
by Anton Bezman
Telegram RSS Autoposter

This workflow monitors multiple RSS feeds, detects new posts, rewrites them using AI, and sends the formatted output to your Telegram account. It helps you keep your Telegram feed active with clear, polished, and ready-to-publish content, all fully automated.

How It Works
1. A scheduled trigger runs every 30 minutes and loads all RSS feed URLs from an n8n Data Table.
2. Each feed is fetched, and only the most recent item is selected.
3. The workflow checks whether this post has already been processed. If it's new, the link is stored in a second Data Table used for deduplication (see the sketch below).
4. A Groq LLM rewrites the article text so it stays within Telegram's limits and has clean, readable formatting.
5. The final message is sent to your Telegram account.

Requirements
- Groq API key
- Telegram bot token and your Telegram user ID
- Two n8n Data Tables:
  - One for RSS feed URLs
  - One for processed post links

How to Set Up
1. Add your RSS URLs to the Data Table.
2. Insert your Groq API key into the Groq node.
3. Add your Telegram bot token and user ID to the Telegram node.
4. Adjust the schedule trigger interval if needed.

Customization Ideas
- Replace the schedule trigger with a webhook for manual on-demand checks.
- Change the AI prompt for a different tone or formatting style.
- Send messages to a Telegram channel instead of a private chat.
- Add keyword filters to send only specific articles.
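A minimal sketch of the deduplication check in step 3, written as it might appear in an n8n Code node; the column name on the Data Table rows is illustrative:

```typescript
// Rows from the "processed post links" Data Table, reduced to a lookup set.
interface ProcessedRow {
  link: string; // illustrative column name
}

function isNewPost(latestLink: string, processed: ProcessedRow[]): boolean {
  const seen = new Set(processed.map((row) => row.link));
  return !seen.has(latestLink);
}

// Example: only pass the item onward (and store its link) when it hasn't been posted yet.
const processed: ProcessedRow[] = [{ link: "https://example.com/old-post" }];
console.log(isNewPost("https://example.com/new-post", processed)); // true
```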
by SerpApi
Sync Google Maps Reviews to Google Sheets for Any Google Maps Query

How it works

This workflow accepts any query you might run on actual Google Maps to search for places. The search happens through SerpApi's Google Maps API. Once the workflow receives place results from Google Maps, it loops through each place, fetching reviews using SerpApi's Google Maps Reviews API. By default, the workflow is limited to fetching up to 50 reviews per place. This can be customized in the 'Set Review Limit' node. The first page of reviews for a place returns only 8 reviews; all subsequent pages return up to 20 reviews. The fetched reviews are sent to a connected Google Sheet. A sketch of the two SerpApi calls follows below.

How to use
1. Create a free SerpApi account here: https://serpapi.com/
2. Add SerpApi credentials to n8n. Your SerpApi API key is here: https://serpapi.com/manage-api-key
3. Connect your Google Sheets account to n8n. Help available here: https://n8n.io/integrations/google-sheets/
4. Create a Google Sheet with these column headers: name, iso_date, rating, snippet
5. Connect your Google Sheet in the 'Append Reviews' Google Sheets node
6. Update the Search Query in the 'Search Google Maps' node to set your own query
7. (Optional) Update the review limit from the default 50 in the 'Set Review Limit' node. Set it to a very high number (e.g. 50000) to get all possible reviews.
8. Hit 'Test Workflow' to manually trigger the workflow.

Limitations

The workflow can only retrieve the top 20 results from Google Maps; it won't paginate to get more results. It could be extended to support Google Maps pagination.

Warning

Each request to SerpApi consumes 1 search credit. Be mindful of how many search credits your account has before requesting more reviews than your account supports. For example, if a Google Maps query returns 20 results and you fetch the default limit of 50 reviews per place, this can use up to 61 SerpApi search credits.

Documentation
- Google Maps API
- Google Maps Reviews API
- SerpApi n8n Node Intro Guide
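A minimal sketch of the two SerpApi calls (places search, then paginated reviews per place) with a review cap like the 'Set Review Limit' node; the query and field handling are illustrative, and SerpApi's documentation remains the reference for the exact response structure:

```typescript
const SERPAPI_KEY = process.env.SERPAPI_KEY!;

// Search Google Maps for places matching the query.
async function searchPlaces(query: string) {
  const params = new URLSearchParams({ engine: "google_maps", type: "search", q: query, api_key: SERPAPI_KEY });
  const res = await fetch(`https://serpapi.com/search.json?${params}`);
  const data = await res.json();
  return data.local_results ?? []; // each place carries a data_id used by the reviews API
}

// Page through reviews for one place until the limit is reached or pages run out.
async function fetchReviews(dataId: string, limit = 50) {
  const reviews: unknown[] = [];
  let nextPageToken: string | undefined;
  while (reviews.length < limit) {
    const params = new URLSearchParams({ engine: "google_maps_reviews", data_id: dataId, api_key: SERPAPI_KEY });
    if (nextPageToken) params.set("next_page_token", nextPageToken);
    const res = await fetch(`https://serpapi.com/search.json?${params}`);
    const data = await res.json();
    reviews.push(...(data.reviews ?? []));
    nextPageToken = data.serpapi_pagination?.next_page_token;
    if (!nextPageToken) break; // no more review pages for this place
  }
  return reviews.slice(0, limit);
}

const places = await searchPlaces("coffee shops in Austin");
if (places[0]) console.log((await fetchReviews(places[0].data_id, 50)).length);
```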