by Fan Luo
**Daily Company News Bot**

This n8n template demonstrates how to use the free FinnHub API to retrieve company news for a list of stock tickers and post messages to a Slack channel at a pre-scheduled time.

**How it works**
- First, define the list of stock tickers you are interested in.
- Loop over the items, calling the FinnHub API to get the latest company news for each ticker.
- Format the company news as Markdown text that can be sent to Slack.
- Post a new message to the Slack channel.
- Wait 5 seconds, then move on to the next ticker.

**How to use**
Simply set up a schedule trigger to run the workflow automatically.

**Requirements**
- FinnHub API key
- Slack channel webhook

**Need help?**
Contact me via My Blog or ask in the Forum! Happy Hacking!
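If you want to prototype the FinnHub call before wiring up the HTTP Request node, a minimal sketch might look like this. The `FINNHUB_TOKEN` environment variable and the ticker list are illustrative assumptions; the `/company-news` endpoint itself (with `symbol`, `from`, and `to` query parameters) is FinnHub's documented API.

```javascript
// Minimal sketch (Node 18+, ESM for top-level await): fetch the last day
// of company news per ticker and format it as Slack-style Markdown.
const tickers = ["AAPL", "MSFT", "NVDA"]; // placeholder list
const token = process.env.FINNHUB_TOKEN;  // placeholder credential

const today = new Date().toISOString().slice(0, 10);
const yesterday = new Date(Date.now() - 864e5).toISOString().slice(0, 10); // 864e5 ms = 1 day

for (const symbol of tickers) {
  const url = `https://finnhub.io/api/v1/company-news?symbol=${symbol}&from=${yesterday}&to=${today}&token=${token}`;
  const news = await (await fetch(url)).json();
  // FinnHub returns objects with headline/url/summary fields.
  const lines = news.slice(0, 5).map((n) => `- *${n.headline}* <${n.url}|link>`);
  console.log(`*${symbol}*\n${lines.join("\n")}`);
  // Mirror the workflow's 5-second pause between tickers.
  await new Promise((r) => setTimeout(r, 5000));
}
```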
by Airtop
**Automating LinkedIn Competitive Monitoring**

**Use Case**
Automatically track and summarize LinkedIn posts from key executives at competitor companies. This agent provides structured insights into hiring trends, product announcements, strategic shifts, and thought leadership, helping teams stay informed and responsive without manual monitoring.

**What This Automation Does**
This automation monitors and summarizes LinkedIn posts from competitor profiles and shares the results on Slack. It uses the following input parameters:
- **Airtop Profile**: A browser profile authenticated to LinkedIn. Create one.
- **Google Sheet**: A document listing LinkedIn profile URLs of competitors; copy this one.
- **Slack Channel**: The destination for sharing summarized post insights.

**How It Works**
1. Trigger: The workflow is scheduled to run weekly at a specific time.
2. Data Collection: Retrieves the list of competitor LinkedIn URLs from a Google Sheet.
3. Browser Automation: Uses Airtop to navigate to each LinkedIn profile and analyze up to 5 recent posts.
4. Summarization: Summarizes the number of recent posts, main topics, and engagement levels using Airtop's AI.
5. Slack Notification: Posts a formatted summary to a predefined Slack channel (see the sketch after this section).

**Setup Requirements**
- Airtop API Key – free to generate.
- An Airtop Profile authenticated to LinkedIn.
- Google Sheet with competitor profile URLs; copy this one.
- Slack Bot credentials with access to the target channel.

**Next Steps**
- **Expand Coverage**: Add more competitor profiles to the Google Sheet to scale monitoring.
- **Integrate with CRM**: Feed summarized insights into your CRM for competitor tracking.
- **Enhance Analysis**: Include post-level engagement metrics over time for trend analysis.

Read more about competitive analysis using LinkedIn.
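As a rough illustration of the Slack formatting step, a Code node could shape per-profile summaries like this. The input field names (`profileUrl`, `postCount`, `topics`, `engagement`) are purely illustrative assumptions, not Airtop's actual output schema.

```javascript
// Illustrative sketch: turn per-profile summary objects into a Slack message.
const profiles = [
  {
    profileUrl: "https://www.linkedin.com/in/example-cto", // placeholder
    postCount: 4,
    topics: ["AI roadmap", "hiring"],
    engagement: "high",
  },
];

const blocks = profiles.map(
  (p) =>
    `*${p.profileUrl}*\n` +
    `- Recent posts: ${p.postCount}\n` +
    `- Main topics: ${p.topics.join(", ")}\n` +
    `- Engagement: ${p.engagement}`
);

// This string can be passed to the Slack node's message field.
console.log(blocks.join("\n\n"));
```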
by Rodrigue Gbadou
**What this workflow does**
This n8n workflow collects client feedback through a form (Tally, Typeform, or Google Forms) and uses AI to analyze it. It automatically generates a summary of the positive points, highlights areas for improvement, and drafts a short social media post based on the feedback.

Ideal for:
- Freelancers
- Customer support teams
- Online service providers
- Coaches and educators

**Setup steps**
1. Connect your form tool to the Webhook node (POST method) and make sure it sends a feedback field.
2. Add your DeepSeek (or other GPT-compatible) API key to the AI request node.
3. Configure the email node with your SMTP credentials and desired recipient address.
4. Replace the Telegram node with Slack, Buffer, or another integration if you prefer.
5. (Optional) Customize the prompt in the Function node for a different tone or language (a sample prompt builder is sketched below).

Estimated setup time: ~15 minutes. Sticky notes are included and clearly positioned to guide you.

**Technologies used**
- n8n Webhook node
- n8n Function node
- DeepSeek Chat or compatible AI API
- Email node (SMTP)
- Telegram node (or other integration)
- Sticky Notes for setup guidance

**Use cases**
- Analyze feedback from onboarding or satisfaction surveys
- Create ready-to-publish social media content from real customer praise
- Help support or marketing teams act on feedback immediately
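A minimal sketch of the Function node that builds the AI prompt from the incoming webhook payload. The `feedback` field name matches the setup step above; the prompt wording itself is an illustrative assumption you should adapt.

```javascript
// n8n Function/Code node sketch: build the analysis prompt from the webhook body.
const feedback = $json.feedback || "";

const prompt = [
  "You are analyzing client feedback. Respond with three sections:",
  "1. Positive points (bullet list)",
  "2. Areas for improvement (bullet list)",
  "3. A short social media post quoting the praise (max 280 characters)",
  "",
  `Feedback: """${feedback}"""`,
].join("\n");

// Pass the prompt on to the AI request node.
return [{ json: { prompt } }];
```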
by Emad
This workflow automatically sends you a list of your daily meetings every morning via a Telegram bot.

**Use Cases**
This workflow is useful for anyone who wants to be automatically informed of their daily meetings, especially busy professionals, students, and anyone with a hectic schedule.

**Setup**
- Google Calendar connected to n8n
- A Telegram bot created and connected to n8n
- Your Telegram user ID specified

**Notes**
- You need to replace the placeholder in the Telegram node with your actual Telegram user ID.
- You can customize the formatting of the Telegram message in the JavaScript Code node (a sample formatter is sketched below).
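A minimal sketch of such a Code node, assuming Google Calendar items arrive upstream. The event fields (`summary`, `start.dateTime`, `start.date`) follow the Google Calendar API; the message layout is an illustrative choice.

```javascript
// n8n Code node sketch: format today's calendar events as a Telegram message.
const events = $input.all().map((item) => item.json);

const lines = events.map((e) => {
  const start = e.start?.dateTime || e.start?.date || "";
  // dateTime values contain "T"; date-only values are all-day events.
  const time = start.includes("T")
    ? new Date(start).toLocaleTimeString([], { hour: "2-digit", minute: "2-digit" })
    : "all day";
  return `- ${time}: ${e.summary || "(no title)"}`;
});

const message = lines.length
  ? `Good morning! Today's meetings:\n${lines.join("\n")}`
  : "Good morning! No meetings today.";

return [{ json: { message } }];
```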
by Alex Kim
**Automate Video Creation with Luma AI Dream Machine and Airtable (Part 2)**

**Description**
This is the second part of the Luma AI Dream Machine automation. It captures the webhook response from Luma AI after video generation is complete, processes the data, and automatically updates Airtable with the video and thumbnail URLs. This completes the end-to-end automation for video creation and tracking.

- Airtable Base Template
- Tutorial Video

**Setup**

1. **Luma AI Setup**
- Ensure you've created an account with Luma AI and generated an API key.
- Confirm that the API key has permission to manage video requests.

2. **Airtable Setup**
Make sure your Airtable base includes the following fields (set up in Part 1). Use the Airtable Base Template linked above to simplify setup.
- **Generation ID** – to match incoming webhook data.
- **Status** – workflow status (e.g., "Done").
- **Video URL** – stores the generated video URL.
- **Thumbnail URL** – stores the thumbnail URL.

3. **n8n Setup**
- Ensure that the n8n workflow from Part 1 is set up and configured.
- Import this workflow and connect it to the webhook callback from Luma AI.

**How It Works**

1. **Webhook Trigger**
The Webhook node listens for a POST response from Luma AI once video generation is finished. The response includes:
- Video URL – direct link to the video.
- Thumbnail URL – link to the video thumbnail.
- Generation ID – used to match the record in Airtable.

2. **Process Webhook Data**
- The Set node extracts the video data from the webhook response (sketched after this section).
- The If node checks that the video URL is valid before proceeding.

3. **Store in Airtable**
The Airtable node updates the record with:
- Video URL – direct link to the video.
- Thumbnail URL – link to the video thumbnail.
- Status – marked as "Done."
It uses the Generation ID to match and update the correct record.

**Why This Workflow is Useful**
- Automates the completion step for video creation
- Ensures accurate record-keeping by matching generation IDs
- Simplifies the process of managing and organizing video content
- Reduces manual effort by automating the update process

**Next Steps**
- **Future Enhancements** – adding more complex post-processing, video trimming, and multi-platform publishing.
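A rough sketch of the extraction step in a Code node. The exact callback payload shape (`id`, `assets.video`, `assets.thumbnail`) is an assumption for illustration; verify it against the webhook body your Luma AI account actually sends.

```javascript
// n8n Code node sketch: pull the three fields this workflow needs
// from the Luma AI webhook body. Field paths are assumptions.
const body = $json.body || $json;

return [{
  json: {
    generationId: body.id ?? null,
    videoUrl: body.assets?.video ?? null,
    thumbnailUrl: body.assets?.thumbnail ?? null,
  },
}];
```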
by Shahrear
This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Transform your expense tracking with automated AI receipt processing that extracts data and organizes it instantly.

**What this workflow does**
- Monitors Google Drive for new receipt uploads (images/PDFs)
- Downloads and processes files automatically
- Extracts key data using the VLM Run community node (merchant, amount, currency, date)
- Saves structured data to Google Sheets for easy tracking

**Setup**
Prerequisites: Google Drive/Sheets accounts, VLM Run API credentials, an n8n instance. You need to install the VLM Run community node: go to Settings -> Community Nodes -> Install, then search for @vlm-run/n8n-nodes-vlmrun.

Quick setup:
1. Configure Google Drive OAuth2 and create a receipt upload folder
2. Add VLM Run API credentials
3. Create a Google Sheet with columns: Customer, Merchant, Amount, Currency, Date
4. Update the folder/sheet IDs in the workflow nodes
5. Test and activate

**How to customize this workflow to your needs**
Extend functionality by:
- Adding expense categories and approval workflows
- Connecting to accounting software (QuickBooks, Xero)
- Including Slack notifications for processed receipts
- Adding data validation and duplicate detection

This workflow transforms manual receipt processing into an automated system that saves hours while improving accuracy.
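As a rough illustration of the final step, a Code node could shape the extracted receipt data into one row matching the sheet columns above. The input field names (`merchant`, `total`, `currency`, `date`) are illustrative assumptions, not the VLM Run node's documented output schema.

```javascript
// Illustrative sketch: map extracted receipt fields to the sheet columns.
const receipt = $json; // assumed output of the VLM Run node

return [{
  json: {
    Customer: receipt.customer ?? "N/A",
    Merchant: receipt.merchant ?? "Unknown",
    Amount: receipt.total ?? 0,
    Currency: receipt.currency ?? "USD",
    Date: receipt.date ?? new Date().toISOString().slice(0, 10),
  },
}];
```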
by Lucas Peyrin
**How it works**

This workflow is a robust and forgiving JSON parser designed to handle malformed or "dirty" JSON strings often returned by AI models or scraped from web pages. It takes a text string as input and attempts to extract and parse a valid JSON object from it.

- **Cleans Input**: It starts by trimming whitespace and removing common Markdown code fences (like ```json wrappers).
- **Applies Multiple Fixes**: It systematically attempts to correct common JSON errors in a specific order:
  - Escapes unescaped control characters (like newlines) within strings.
  - Fixes invalid backslash escape sequences.
  - Removes trailing commas.
  - Intelligently attempts to fix unescaped double quotes inside string values.
- **Parses Strategically**: If a direct parse fails, it tries to extract a potential JSON object from the text (e.g., finding a {...} block inside a larger sentence) and then re-applies the cleaning logic to that extracted portion.
- **Outputs Clean Data**: If successful, it outputs the parsed JSON fields. By default, it removes the detailed parsing_status object, but you can deactivate the final "Set" node to keep it for debugging.

**Set up steps**

Setup time: ~1 minute

This workflow is designed to be used as a sub-workflow and requires no internal setup.
1. In your main workflow, add an Execute Sub-Workflow node where you need to parse a messy JSON string.
2. In the Workflow parameter, select this "Robust JSON Parser" workflow.
3. Ensure the data you send to the node is a JSON object containing a text field, where the value of text is the string you want to parse. For example: { "text": "{\\\"key\\\": \\\"some broken json...\\\"}" }.

The workflow will return the successfully parsed data. To see a detailed log of the cleaning process, simply deactivate the final Remove parsing_status node inside this workflow.
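For reference, a condensed sketch of the "clean, try, extract, retry" strategy described above. This is a simplified illustration, not the template's exact code: it only handles fences and trailing commas, while the real workflow also repairs in-string control characters, bad escape sequences, and unescaped quotes.

```javascript
// Simplified sketch of the parser's strategy: clean, try, extract, retry.
function cleanJsonString(raw) {
  let s = raw.trim();
  // Strip Markdown code fences such as ```json ... ```
  s = s.replace(/^```(?:json)?\s*/i, "").replace(/\s*```\s*$/, "");
  // Remove trailing commas before } or ].
  s = s.replace(/,\s*([}\]])/g, "$1");
  return s;
}

function robustParse(raw) {
  try {
    return JSON.parse(cleanJsonString(raw));
  } catch (err) {
    // Fall back: extract the first {...} block and re-clean it.
    const match = raw.match(/\{[\s\S]*\}/);
    if (match) return JSON.parse(cleanJsonString(match[0]));
    throw err;
  }
}

// Example: fenced JSON with a trailing comma parses cleanly.
console.log(robustParse('```json\n{"key": "value",}\n```'));
```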
by Kunsh
A streamlined AI-powered tool that extracts actionable technical insights from HackerOne security reports for advanced bug bounty hunters.

**How It Works**
Send any HackerOne report URL (e.g., https://hackerone.com/reports/123456) to the chat interface. The AI agent will:
- Fetch the report JSON automatically
- Analyze it for unique techniques, payloads, and root causes
- Extract reusable insights in a structured format
- Summarize with practical pentesting value

**Setup Requirements**
- Google Gemini API credentials configured
- Chat interface deployed and accessible
- HackerOne report URLs

**Output Format**
- Summary: one-liner impact statement
- Techniques: payloads, code snippets, exploitation steps
- Pro Tips: reusable insights for future hunts

Perfect for rapid triage and building your personal exploit knowledge base.
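A minimal sketch of the fetch step. Publicly disclosed HackerOne reports are commonly retrievable as JSON by appending `.json` to the report URL; treat that and the response field names as assumptions to verify against your target reports.

```javascript
// Sketch: fetch a disclosed report as JSON (Node 18+, ESM).
const reportUrl = "https://hackerone.com/reports/123456"; // placeholder ID

const res = await fetch(`${reportUrl}.json`, {
  headers: { Accept: "application/json" },
});
if (!res.ok) throw new Error(`Fetch failed: ${res.status}`);

const report = await res.json();
// Hand the interesting parts to the AI agent for analysis.
console.log(report.title, report.vulnerability_information);
```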
by Rodrigue Gbadou
**What this workflow does**
This n8n workflow connects to Google Search Console to fetch SEO performance data (clicks, impressions, CTR, and average position) for the last 7 days. It formats the results into a clean weekly summary and automatically sends it to your email inbox every Monday morning.

Ideal for:
- Website owners
- Bloggers
- SEO consultants

who want to track site performance over time without manual reporting.

**Setup steps**
1. Replace YOUR_SITE_URL in the HTTP Request node with your verified domain from Google Search Console.
2. Connect your Google OAuth2 credentials to the HTTP Request node.
3. Set up your SMTP credentials in the "Send Email" node.
4. Adjust the recipient email address and subject line if necessary.
5. (Optional) Customize the Function node to include more queries or format the report as a PDF.

Estimated setup time: ~10 minutes. Sticky notes are included on the workflow canvas to guide you step by step.

**Technologies used**
- Google Search Console API
- SMTP Email node
- n8n Function node
- n8n HTTP Request node
- n8n Sticky Notes
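For orientation, a sketch of the Search Analytics request the HTTP Request node sends. The endpoint and body fields follow Google's documented `searchAnalytics.query` API; YOUR_SITE_URL is the placeholder from the setup steps, and OAuth is handled by the node's Google credentials in n8n.

```javascript
// Sketch: build the 7-day Search Analytics query for the HTTP Request node.
const siteUrl = encodeURIComponent("YOUR_SITE_URL"); // placeholder
const endpoint = `https://www.googleapis.com/webmasters/v3/sites/${siteUrl}/searchAnalytics/query`;

const end = new Date();
const start = new Date(Date.now() - 7 * 864e5); // 7 days back

const body = {
  startDate: start.toISOString().slice(0, 10),
  endDate: end.toISOString().slice(0, 10),
  dimensions: ["query"],
  rowLimit: 25,
};

// In n8n, this JSON goes in the HTTP Request node's body field.
console.log(endpoint, JSON.stringify(body, null, 2));
```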
by SamirLiu
**Overview**
This workflow leverages Google Gemini 2.0 Flash multimodal AI to automatically generate detailed descriptions of video content from any public URL. It streamlines video understanding, making it ideal for content cataloging, accessibility, and content moderation.

**Use Cases**
- Accessibility: automatically generate detailed video descriptions for visually impaired users.
- Content Moderation: detect inappropriate or off-brand material without manual watching.
- Media Cataloging: enrich your media library with automatically extracted metadata.
- Marketing & Branding: gain fast insights into key elements, tone, and branding in video content.

**Setup Instructions**
1. Get a Gemini API key: register at ai.google.dev and create an API key. Before running the workflow, set your Gemini API key as an environment variable named GeminiKey for secure access within the workflow. In the Set Input node, reference this environment variable instead of hardcoding the key.
2. Configure the video URL: replace the sample URL in the Set Input node with your desired public video URL. Ensure the video is directly accessible (no login or special permissions required).
3. Optional: customize the analysis by editing the prompt in the Analyze video Gemini node to focus on the most relevant video details for your use case (e.g., branding, key actions, visual elements).

**Security Tip**
Use n8n's credentials manager or environment variables (like GeminiKey) to store your API key securely. Avoid hardcoding API keys directly in workflow nodes, especially in production environments.

**How It Works**
1. Download the video from the provided URL.
2. Upload the video to Gemini's server for processing.
3. Wait for Gemini to complete processing.
4. Analyze the video with Gemini AI using your customized prompt.
5. Output a comprehensive description of the video as videoDescription.

**Technical Details**
- Uses HTTP Request nodes to interact with Gemini API endpoints.
- Handles file download, upload, status checking, and result retrieval.
- Customizable Gemini AI parameters for fine-tuned responses.
- Main output: videoDescription (detailed text describing the video content).

**Quickstart**
1. Set your Gemini API key as the GeminiKey environment variable and configure your video URL in the workflow.
2. Execute the workflow.
3. Retrieve your rich, AI-generated video description for downstream use such as automation, tagging, or reporting.
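A sketch of the final analysis request, assuming the video has already been uploaded via the Gemini Files API and its file URI is known. The endpoint and `file_data` structure follow Gemini's documented REST API; the `fileUri` value is a placeholder for illustration.

```javascript
// Sketch: ask Gemini 2.0 Flash to describe an already-uploaded video.
const apiKey = process.env.GeminiKey;
const fileUri = "https://generativelanguage.googleapis.com/v1beta/files/abc123"; // placeholder

const res = await fetch(
  `https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent?key=${apiKey}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{
        parts: [
          { file_data: { mime_type: "video/mp4", file_uri: fileUri } },
          { text: "Describe this video in detail: key elements, tone, and branding." },
        ],
      }],
    }),
  }
);

const data = await res.json();
const videoDescription = data.candidates?.[0]?.content?.parts?.[0]?.text;
console.log(videoDescription);
```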
by Jimleuk
This n8n template demonstrates how to build a simple PostgreSQL MCP server to manage a PostgreSQL database for domains such as HR, payroll, sales, inventory, and more!

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/postgres

**How it works**
- An MCP server trigger is used and connected to 5 tools: 2 PostgreSQL and 3 custom workflow.
- The 2 PostgreSQL tools are simple read-only queries, so the PostgreSQL tool can be used directly.
- The 3 custom workflow tools handle select, insert, and update queries, as these are operations which require a bit more discretion. While it may be easier to allow the agent to write raw SQL queries, it is a little safer to accept only parameters instead. The custom workflow tool lets us define this restricted schema for tool input, which we then use to construct the SQL statement ourselves (see the sketch after this section).
- All 3 custom workflow tools trigger the same "Execute workflow" trigger in this very template, which has a switch to route the operation to the correct handler.
- Finally, we use the standard PostgreSQL node to handle select, insert, and update operations. The responses are then sent back to the MCP client.

**How to use**
This PostgreSQL MCP server allows any compatible MCP client to manage a PostgreSQL database by supporting select, create, and update operations. You will need to have a database available before you can use this server.

Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/#integrating-with-claude-desktop

Try the following queries in your MCP client:
- "Please help me check if Alex has an entry in the users table. If not, please help me create a record for her."
- "What was the top selling product in the last week?"
- "How many high priority support tickets are still open this morning?"

**Requirements**
- PostgreSQL for the database. This can be an external database such as Supabase or one you host internally.
- An MCP client or agent, such as Claude Desktop: https://claude.ai/download

**Customising this workflow**
If the scope of schemas or tables is too open, try restricting it so the MCP server serves a specific business purpose, e.g., confine querying and editing to HR-only tables before providing access to people in that department.

Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
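To make the "restricted schema" idea concrete, here is a sketch of how a handler could build a parameterized statement from structured tool input instead of raw SQL. The input field names (`operation`, `table`, `values`, `where`) and the table whitelist are illustrative assumptions, not the template's exact schema.

```javascript
// Sketch: construct a parameterized SQL statement from restricted tool input.
// In production, validate column names against the schema too, since
// identifiers cannot be bound as parameters.
function buildStatement({ operation, table, values = {}, where = {} }) {
  const allowedTables = ["users", "products", "tickets"]; // illustrative whitelist
  if (!allowedTables.includes(table)) throw new Error(`Table not allowed: ${table}`);

  const params = [];
  const ph = (v) => { params.push(v); return `$${params.length}`; };

  if (operation === "insert") {
    const cols = Object.keys(values);
    const placeholders = cols.map((c) => ph(values[c]));
    return { text: `INSERT INTO ${table} (${cols.join(", ")}) VALUES (${placeholders.join(", ")})`, params };
  }
  if (operation === "update") {
    const sets = Object.keys(values).map((c) => `${c} = ${ph(values[c])}`);
    const conds = Object.keys(where).map((c) => `${c} = ${ph(where[c])}`);
    return { text: `UPDATE ${table} SET ${sets.join(", ")} WHERE ${conds.join(" AND ")}`, params };
  }
  // Default: select
  const conds = Object.keys(where).map((c) => `${c} = ${ph(where[c])}`);
  return { text: `SELECT * FROM ${table}${conds.length ? ` WHERE ${conds.join(" AND ")}` : ""}`, params };
}

// Example: { text: 'SELECT * FROM users WHERE name = $1', params: ['Alex'] }
console.log(buildStatement({ operation: "select", table: "users", where: { name: "Alex" } }));
```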
by Martech Mafia
**Problem**
Monitoring SEO performance from Google Search Console (GSC) manually is repetitive and prone to human error. For marketers or analysts managing multiple domains, checking reports manually and copying data into spreadsheets or databases is time-consuming. There is a strong need for an automated solution that collects, stores, and updates SEO metrics regularly for easier analysis and dashboarding.

**Solution**
This workflow automatically pulls performance metrics from Google Search Console (queries, pages, CTR, impressions, positions, and devices) and stores them in a structured format inside a NocoDB table. It's ideal for SEO specialists, marketing teams, or data analysts who need to automate SEO reporting and centralize data for analytics or dashboards (like Superset or Metabase).

**Setup Instructions**
1. Authorize your Google Search Console account: connect via OAuth2 (requires GSC API access).
2. Create a NocoDB table with fields matching the GSC response:
   - query (text)
   - page (URL)
   - device (text)
   - clicks (number)
   - impressions (number)
   - ctr (percentage)
   - position (number)
3. Add credentials in n8n. Use credential nodes for both:
   - Google OAuth2
   - NocoDB API token
4. Customize the schedule trigger: set the frequency (e.g., weekly) and adjust the domain/date range as needed.
5. Generalize domains: replace specific domains like martechmafia.net with your-domain.com before submission.

**NocoDB Table Structure**
The NocoDB table must match the fields coming from GSC's Search Analytics API. Here's a sample schema:

```json
{
  "query": "string",
  "page": "string",
  "device": "string",
  "clicks": "number",
  "impressions": "number",
  "ctr": "number",
  "position": "number"
}
```
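A minimal sketch of flattening the Search Analytics response into rows matching that schema. Per Google's API, when the request uses dimensions ["query", "page", "device"], each row's `keys` array lines up with those dimensions in order; the node wiring here is an illustrative assumption.

```javascript
// n8n Code node sketch: map GSC Search Analytics rows to NocoDB records.
const response = $json; // assumed output of the GSC HTTP Request node

const records = (response.rows || []).map((row) => ({
  query: row.keys[0],
  page: row.keys[1],
  device: row.keys[2],
  clicks: row.clicks,
  impressions: row.impressions,
  ctr: row.ctr,
  position: row.position,
}));

// One n8n item per record, ready for the NocoDB node.
return records.map((r) => ({ json: r }));
```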