by Christian Moises
Since the **Get Many Subreddit** node often blocks requests (Reddit requires proper authentication headers), this workflow provides a reliable alternative. It uses the Reddit OAuth2 API through the **HTTP Request** node, processes the results, and outputs cleaned subreddit data. Use it if the **Get Many Subreddit** node gives you this error: *"n8n: You've been blocked by network security. To continue, log in to your Reddit account or use your developer token."*

**Use case:** especially useful when you want to search multiple subreddits programmatically and apply filtering for members, descriptions, and categories.

**How It Works**

1. **Trigger Input**: the workflow is designed to be called by another workflow using the **Execute Workflow Trigger** node. Input is passed in JSON format:

   ```json
   {
     "Query": "RealEstateTechnology",
     "min_members": 0,
     "max_members": 20000,
     "limit": 50
   }
   ```

2. **Fetch Subreddits**: the **HTTP Request (Reddit OAuth2)** node queries the Reddit API (`/subreddits/search`) with the given keyword and limit. Because it uses OAuth2 credentials, the request is properly authenticated and accepted by Reddit.
3. **Process Results**: **Split Out** iterates over each subreddit entry (`data.children`); **Edit Fields** extracts the subreddit URL, description, 18+ flag, and member count; **Aggregate** recombines the processed data into a structured output array.
4. **Output**: returns a cleaned dataset with only the relevant subreddit details (this saves tokens if the workflow is attached to an AI Agent).

**How to Use**

1. Import this workflow into n8n.
2. In your main workflow, replace the **Get Many Subreddit** node with an **Execute Workflow** node and select this workflow.
3. Pass in the required query parameters (`Query`, `min_members`, `max_members`, `limit`).
4. Run your main workflow; results will now come through authenticated API requests without being blocked.

**Requirements**

- **Reddit OAuth2 API credentials** (set up in n8n under **Credentials**).
- Basic understanding of JSON parameters in n8n.
- An existing workflow that calls this one using **Execute Workflow**.

**Customizing This Workflow**

You can adapt this workflow to your specific needs by:

- **Filtering by member range:** add logic to exclude subreddits outside `min_members`/`max_members` (see the sketch below).
- **Expanding extracted fields:** include additional subreddit properties such as `created_utc`, `lang`, or `active_user_count`.
- **Changing authentication:** switch to different Reddit OAuth2 credentials if managing multiple Reddit accounts.
- **Integrating downstream apps:** send the processed subreddit list to Google Sheets, Airtable, or a database for storage.
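The member-range filter mentioned above is straightforward to add. Here is a minimal Code-node sketch, assuming the items still carry Reddit's `subscribers` field name; adjust it to whatever your Edit Fields node actually outputs:

```javascript
// Hypothetical n8n Code node: drop subreddits outside the requested
// member range before the Aggregate step. The "subscribers" field name
// follows Reddit's listing schema; match it to your Edit Fields output.
const { min_members, max_members } = $('Execute Workflow Trigger').first().json;

return $input.all().filter((item) => {
  const members = item.json.subscribers ?? 0;
  return members >= min_members && members <= max_members;
});
```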
by Đỗ Thành Nguyên
**Get Long-Lived Facebook Page Access Token with Data Table**

**Good to Know**

This workflow automatically solves the common issue of Facebook Page Access Tokens expiring. It proactively renews your Page Tokens and stores them in an n8n Data Table. It runs every two months by default, ensuring your Page Access Tokens remain valid and guaranteeing seamless, uninterrupted automation for all your Facebook API integrations.

**How It Works**

1. **Schedule Trigger**: the workflow runs on a set schedule (every two months by default).
2. **Set Parameters**: it initializes the required credentials: `client_id`, `client_secret`, a short-lived `user_access_token`, and the `app_scoped_user_id` (all obtained from Facebook Developer Tools).
3. **Get Long-Lived User Token**: it exchanges the short-lived User Access Token for a long-lived one.
4. **Get Page Tokens**: using the long-lived User Token, it fetches all pages you manage and their corresponding Page Access Tokens (both calls are sketched at the end of this description).
5. **Update Data Table**: for each page, it extracts the `access_token`, `name`, and `id`, then performs an Upsert operation to update or insert rows in your n8n Data Table, ensuring the stored tokens are always current.

**How to Use**

1. **Import**: import this JSON file into your n8n instance.
2. **Configure Credentials**: open the **Set Parameters** node and replace the placeholder values for `client_id`, `client_secret`, `user_access_token`, and `app_scoped_user_id` with your actual credentials from Facebook.
3. **Configure Data Table**: open the **Upsert row(s)** node. Select or create an n8n Data Table to store your tokens. Make sure the column mapping (`token`, `name_page`, `id_page`) matches your table schema.
4. **Activate**: save and activate the workflow. It will now run automatically based on your configured schedule.

**Requirements**

- **n8n instance**: a self-hosted n8n instance (the author suggests setting one up via a Tino.vn VPS; code VPSN8N gives up to 39% off, affiliate link).
- **Facebook App**: a Facebook Developer App to generate `client_id` and `client_secret`, a short-lived `user_access_token`, and the `app_scoped_user_id`.
- **Data Table**: an n8n Data Table configured with columns to store token information (e.g., `token`, `name_page`, `id_page`).

**Customizing This Workflow**

- **Change schedule**: to modify how often tokens are renewed, edit the **Schedule Trigger** node. You can change the interval from 2 months to 1 month, or schedule it for a specific day.
- **Filter pages**: if you only want to store tokens for specific pages, insert a **Filter** node right after **Split Out**. Use the page name or ID to filter before sending data to **Upsert row(s)**.
- **Alternative storage**: instead of an n8n Data Table, you can replace the **Upsert row(s)** node with another option (e.g., Google Sheets, a database, or a **Set** node) to store tokens elsewhere.
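For reference, the two Graph API calls behind steps 3 and 4 look roughly like this. This is a sketch, assuming Graph API v19.0; the `fb_exchange_token` grant and the `/{user-id}/accounts` listing are Facebook's documented token-exchange flow, and the placeholder values stand in for the Set Parameters node's fields:

```javascript
// Sketch of the token-exchange flow, assuming Graph API v19.0.
// Placeholders below correspond to the Set Parameters node's fields.
const client_id = '<APP_ID>';
const client_secret = '<APP_SECRET>';
const user_access_token = '<SHORT_LIVED_USER_TOKEN>';
const app_scoped_user_id = '<APP_SCOPED_USER_ID>';
const base = 'https://graph.facebook.com/v19.0';

// 1) Exchange the short-lived user token for a long-lived one (~60 days).
const longLived = await fetch(
  `${base}/oauth/access_token?grant_type=fb_exchange_token` +
    `&client_id=${client_id}&client_secret=${client_secret}` +
    `&fb_exchange_token=${user_access_token}`
).then((r) => r.json()); // -> { access_token, token_type, expires_in }

// 2) List managed pages; each entry carries a long-lived Page Access Token.
const pages = await fetch(
  `${base}/${app_scoped_user_id}/accounts?access_token=${longLived.access_token}`
).then((r) => r.json()); // -> { data: [{ access_token, name, id, ... }] }
```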
by Robert Breen
This n8n workflow pulls campaign data from Google Sheets and creates two pivot tables automatically each time it runs.

✅ **Step 1: Connect Google Sheets**

1. In n8n, go to **Credentials** → click **New Credential**.
2. Select **Google Sheets OAuth2 API**.
3. Log in with your Google account and authorize access.
4. Use this sheet: 📄 Campaign Data Sheet

Make sure the sheet includes:

- A **Data** tab (row 1 = headers, rows 2+ = campaign data)
- A tab for each pivot view (e.g. by Channel, by Campaign; see the sketch below for what each pivot computes)

📬 **Need Help?**
Feel free to reach out:
📧 robert@ynteractive.com
🔗 LinkedIn
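For a sense of what each pivot step boils down to, here is a hypothetical Code-node sketch that totals spend per channel. The column names (`Channel`, `Spend`) are assumptions and should be matched to the headers in your Data tab:

```javascript
// Hypothetical pivot: total Spend per Channel across the Data tab rows.
// Column names (Channel, Spend) are assumptions; use your sheet's headers.
const totals = {};
for (const { json: row } of $input.all()) {
  const key = row.Channel ?? 'Unknown';
  totals[key] = (totals[key] ?? 0) + Number(row.Spend ?? 0);
}
// Emit one item per channel, n8n-style ({ json: ... } wrappers).
return Object.entries(totals).map(([Channel, Spend]) => ({ json: { Channel, Spend } }));
```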
by automedia
**Automated Blog Monitoring System with RSS Feeds and Time-Based Filtering**

**Overview**

This workflow provides a powerful yet simple foundation for monitoring blogs using RSS feeds. It automatically fetches articles from a list of your favorite blogs and filters them based on their publication date, separating new content from old. It is the perfect starting point for anyone looking to build a custom content aggregation or notification system without needing any API keys.

This template is designed for developers, hobbyists, and marketers who want a reliable way to track new blog posts and then decide what to do with them. Instead of including a specific final step, this workflow intentionally ends with a filter, giving you complete freedom to add your own integrations.

**Use Cases**

Why would you need to monitor and filter blog posts?

- **Build a custom news feed**: send new articles that match your interests directly to a Discord channel, Slack, or Telegram chat.
- **Power a newsletter**: automatically collect links and summaries from industry blogs to curate your weekly newsletter content.
- **Create a social media queue**: add new, relevant blog posts to a content calendar or social media scheduling tool like Buffer or Hootsuite.
- **Archive content**: save new articles to a personal database like Airtable, Notion, or Google Sheets to build a searchable knowledge base.

**How It Works**

1. **Manual Trigger**: the workflow starts when you click "Execute Workflow". You can easily swap this for a Schedule Trigger to run it automatically.
2. **Fetch RSS Feeds**: it reads a list of RSS feed URLs that you provide in the "blogs to track" node.
3. **Process Each Feed**: the workflow loops through each RSS feed individually.
4. **Filter by Date**: it checks the publication date of every article and compares it to a timeframe you set (default is 60 days); this check is sketched after the setup steps below.
5. **Split New from Old**: new articles are sent down the true path of the "Filter Out Old Blogs" node; old articles are sent down the false path. The true path is left empty so you can add your desired next steps.

**Setup and Customization**

This workflow requires minimal setup and is designed for easy customization.

1. **Add your blog feeds**: find the "blogs to track" node. In the `source_identifier` field, replace the example URLs with the RSS feeds you want to monitor.

   ```javascript
   // Add your target RSS feed URLs in this array
   ['https://blog.n8n.io/rss', 'https://zapier.com/blog/feeds/latest/']
   ```

2. **Set the time filter**: go to the "max_content_age_days" node and change the value from the default 60 to your desired number of days. For example, use 7 to only get articles published in the last week.
3. **Customize your output (required next step)**: this is the most important part! Drag a new node and connect it to the true output of the "Filter Out Old Blogs" node. Example idea: to save new articles to a Google Sheet, add a Split In Batches node followed by a Google Sheets node to append each new article as a new row.
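The date check inside "Filter Out Old Blogs" amounts to the comparison below. A sketch, assuming the RSS Feed Read node's usual `isoDate`/`pubDate` output fields:

```javascript
// Sketch of the "Filter Out Old Blogs" check: keep items published
// within the last maxAgeDays days. isoDate/pubDate mirror the RSS Feed
// Read node's typical output; maxAgeDays mirrors the workflow default.
const maxAgeDays = 60;
const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;

return $input.all().filter((item) => {
  const published = new Date(item.json.isoDate ?? item.json.pubDate);
  return !Number.isNaN(published.getTime()) && published.getTime() >= cutoff;
});
```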
by InfraNodus
**Basic AI Chatbot that Retrieves Answers From Knowledge Base Using GraphRAG**

The easiest setup: no vector database, external knowledge base, or OpenAI API keys. All you need is an InfraNodus graph with your knowledge. In this workflow, the user sends a request to the InfraNodus GraphRAG system, which extracts a reasoning ontology from a graph that you create (or copy from our repository of public graphs) and generates a response directly to the user.

**How it works**

1. Receives a request from a user, either via n8n chat or via a publicly available chat bot URL (replace the Chat Trigger with a webhook connected to the embeddable n8n Chat Widget, which you can expose via a URL or add to any website).
2. Sends the request to the knowledge graph in your InfraNodus account that contains a reasoning ontology represented as a knowledge graph. You can also use a standard graph: InfraNodus will use its underlying GraphRAG technology to generate the most relevant response.
3. Sends the answer back to the user via chat or webhook, which is then delivered back via the n8n chat widget.

Note: this is a simple example that will work well for occasionally providing responses to users. For a more advanced setup, you might want to build a more sophisticated workflow with an AI Agent node that orchestrates among different InfraNodus expert graphs and chat memory, so the context of the conversation can be maintained. See our other workflows for examples.

**How to use**

- Get an InfraNodus API key and add API authentication to your InfraNodus GraphRAG node.
- In the same InfraNodus GraphRAG node, provide the name of the graph you want to use for retrieval. Note: the reasoning ontology graph and the retrieval graph can be two different graphs.

**Support**

If you want to create your own reasoning ontology graphs, please refer to this article on generating your own knowledge graph ontologies. You may also be interested in watching this video that explains the logic of this approach in detail.

Help article on this specific workflow: Building expert ontology for InfraNodus GraphRAG n8n expert node.
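If you do swap the Chat Trigger for a webhook plus the n8n Chat Widget, the payload shapes involved look roughly like this. A sketch only: the incoming field names follow the standard Chat Trigger output, while the reply shape is an assumption to adapt to your widget configuration:

```javascript
// Sketch of the chat payload shapes, assuming the default n8n chat schema.
// The Chat Trigger emits chatInput and sessionId; the workflow should
// answer with an object the widget can render (here, an "output" field).
const incoming = {
  chatInput: 'What does the knowledge graph say about GraphRAG?',
  sessionId: 'user-session-123', // useful later if you add chat memory
};

const reply = {
  output: 'Answer generated by the InfraNodus GraphRAG node goes here.',
};
```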