by Manuel
Effortlessly optimize your workflow by automatically saving every file you receive on Telegram to a Google Drive folder.

How it works
- Retrieve a message sent to your Telegram bot containing a file
- Upload the file to your Google Drive folder

Set up steps
- Create a Telegram account and a Telegram bot, then connect the bot to n8n by following the official n8n instructions
- Create a Google Drive folder
- Connect your Google Drive with n8n following the official n8n instructions
- Select the target folder in the Google Drive node

Use case examples
- Backup and recovery
- Cross-platform access
- File organization and management
- File collaboration and sharing
- Storage space management
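The "retrieve a message containing a file" step can be sketched as a small helper in the style of an n8n Code node. A minimal sketch, assuming Telegram Bot API field names (`document`, `photo`, `file_id`); the `pickFile` helper itself is illustrative and not part of the template:

```javascript
// Pick the file to forward to Google Drive from an incoming Telegram message.
// Documents carry a file_id and file_name directly; photos arrive as an
// array of sizes, so we take the largest (last) one.
function pickFile(message) {
  if (message.document) {
    return { fileId: message.document.file_id, name: message.document.file_name };
  }
  if (message.photo) {
    const largest = message.photo[message.photo.length - 1];
    return { fileId: largest.file_id, name: `photo_${message.message_id}.jpg` };
  }
  return null; // no file attached -> nothing to upload
}
```

The returned `fileId` is what Telegram's "get file" operation (and the subsequent Google Drive upload) would consume.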
by Karol
How it works
This workflow automates publishing content from any RSS feed directly to Facebook and Instagram. It reads new RSS entries, extracts the article content, generates a short social-media-friendly summary using an AI model, and then creates an AI-generated image based on the topic. The post is uploaded to Facebook and Instagram (via the Graph API) and logged in Google Sheets for reference. Finally, a Telegram bot sends you a notification with links to the published posts.

Set up steps
1. Insert your RSS feed URL in the RSS Feed Trigger node.
2. Configure Google Sheets credentials and replace the example sheet with your own.
3. In Supabase Config, insert your Supabase URL and bucket name.
4. In the Facebook/Instagram nodes, replace [INSERT_YOUR_SITE_ID] with your own page or account ID.
5. Connect your Facebook Graph API credentials (remove hardcoded tokens).
6. Connect your OpenAI / Anthropic / Gemini credentials for text and image generation.
7. Set up your Telegram bot credentials if you want to receive notifications.

Notes
- Sticky notes inside the workflow explain each section (RSS trigger, filtering, content generation, posting, logging, notifications).
- No credentials are saved in the template – you must connect your own before running.
- All generated content (text + images) is fully automated but can be customized (e.g. change the AI prompts for your preferred style).
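To illustrate the Graph API publishing step, here is a sketch that only builds the request (it does not send it). The endpoint shape follows the Graph API's `/{page-id}/photos` photo-publishing pattern; the API version, page ID, and token are placeholders you would replace with your own:

```javascript
// Assemble the Graph API request for publishing an image post to a Facebook
// Page. pageId and accessToken are placeholders - supply your own credentials.
function buildFacebookPhotoPost(pageId, imageUrl, caption, accessToken) {
  return {
    url: `https://graph.facebook.com/v19.0/${pageId}/photos`,
    method: 'POST',
    qs: { url: imageUrl, caption, access_token: accessToken },
  };
}
```

In the workflow this corresponds to the HTTP Request node's URL and query parameters.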
by Trung Tran
🎧 IT Voice Support Automation Bot – Telegram Voice Message to JIRA Ticket with OpenAI Whisper

> Automatically process IT support requests submitted via Telegram voice messages by transcribing, extracting structured data, creating a JIRA ticket, and notifying relevant parties.

🧑‍💼 Who's it for
- Internal teams that handle IT support and want to streamline voice-based requests.
- Employees who prefer using mobile/voice to report incidents or ask for support.
- Organizations aiming to integrate conversational AI into existing support workflows.

⚙️ How it works / What it does
1. A user sends a voice message to a Telegram bot.
2. The system checks whether it's an audio message. If valid, the audio is downloaded, transcribed via OpenAI Whisper, and backed up to Google Drive.
3. The transcription and file metadata are merged.
4. The merged content is processed by an AI agent (GPT) to extract structured request info.
5. A JIRA ticket is created from the extracted data.
6. The IT team is notified via Slack (or other channels).
7. The requester receives a Telegram confirmation message with the JIRA ticket link.
8. If the input is not audio, a polite rejection message is sent.

📌 Key Features
- Supports voice-based ticket creation
- Accurate transcription using Whisper
- Context-aware request parsing using GPT-4.1 mini
- Fully automated ticket creation in JIRA
- Notifies both IT and the original requester
- Cloud backup of original voice messages (Google Drive)

🛠️ Setup Instructions

Prerequisites

| Component | Required |
|----------|----------|
| Telegram Bot & API Key | ✅ |
| OpenAI Whisper / Transcription Model | ✅ |
| Google Drive Credentials (OAuth2) | ✅ |
| Google Sheets or other storage (optional) | ⬜ |
| JIRA Cloud API Access | ✅ |
| Slack Bot or Webhook | ✅ |

Workflow Steps
1. Telegram Voice Message Trigger: starts the flow when a user sends a voice message.
2. Is Audio Message?: if false, reply "only voice is supported".
3. Download Audio: download the .oga file from Telegram.
4. Transcribe Audio: use OpenAI Whisper to get a text transcript.
5. Backup to Google Drive: upload the original voice file with metadata.
6. Merge Results: combine transcript and metadata.
7. Pre-process Output: clean formatting before AI extraction.
8. Transcript Processing Agent: a GPT-based agent extracts the requester's name and department, the request title and description, and the priority and request type.
9. Submit JIRA Request Ticket: create a ticket from the AI-extracted data.
10. Setup Slack / Email / Manual Steps: optional internal routing or approvals.
11. Inform Reporter via Telegram: sends a confirmation message with the JIRA ticket link.

🔧 How to Customize
- Replace JIRA with Zendesk, GitHub Issues, or other ticketing tools.
- Change Slack to Microsoft Teams or Email.
- Add Notion/Airtable logging.
- Enhance the agent to extract the department from the user ID or metadata.

📦 Requirements

| Integration | Notes |
|-------------|-------|
| Telegram Bot | Used for input/output |
| Google Drive | Audio backup |
| OpenAI GPT + Whisper | Transcript & extraction |
| JIRA | Ticketing platform |
| Slack | Team notification |

Built with ❤️ using n8n
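The "Submit JIRA Request Ticket" step maps the AI-extracted fields onto a create-issue payload. A minimal sketch: the project key and issue type are hypothetical placeholders, and the plain-text description matches the Jira Cloud REST API v2 shape (the v3 API expects Atlassian Document Format instead):

```javascript
// Map the AI-extracted request fields onto a Jira "create issue" payload.
// 'ITSUP' and 'Service Request' are placeholders for your own project key
// and request type.
function buildJiraIssue(extracted) {
  return {
    fields: {
      project: { key: 'ITSUP' },
      issuetype: { name: 'Service Request' },
      summary: `[${extracted.requestType}] ${extracted.title}`,
      description:
        `Requester: ${extracted.requester} (${extracted.department})\n\n` +
        extracted.description,
      priority: { name: extracted.priority },
    },
  };
}
```

The Jira response's issue key is what the final Telegram confirmation message links back to.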
by Aditya Gaur
Who is this template for?
This template is designed for teams who need to automate data retrieval from SharePoint lists using n8n. It is ideal for users who want to authenticate via OAuth and then use the token to access SharePoint API endpoints, pulling list data directly into n8n.

How it works
The template first generates an OAuth token using the Microsoft OAuth API. This token is then used to authenticate requests to the SharePoint List API, allowing the workflow to fetch data from a specified SharePoint list. By following the n8n workflow, the user can configure the necessary credentials and endpoints to automate SharePoint data access securely.

Setup steps
1. Replace {tenant_id}, {client_id}, and {client_secret} with your Azure AD details for OAuth authentication.
2. Specify the SharePoint list API endpoint in the template (under the "SharePoint List Fetch" node).
3. Configure the SharePoint list URL and adjust specific data fields if necessary.
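The token-generation step uses the Microsoft identity platform's client-credentials flow. A sketch that only assembles the request (the placeholders mirror the template's {tenant_id}, {client_id}, and {client_secret}; `sharepointHost` is an assumed parameter for your `*.sharepoint.com` host):

```javascript
// Build the client-credentials token request used by the OAuth step.
// The scope targets SharePoint's .default permission set for the app
// registration; values passed in are placeholders for your Azure AD details.
function buildTokenRequest(tenantId, clientId, clientSecret, sharepointHost) {
  const form = new URLSearchParams({
    grant_type: 'client_credentials',
    client_id: clientId,
    client_secret: clientSecret,
    scope: `https://${sharepointHost}/.default`,
  });
  return {
    url: `https://login.microsoftonline.com/${tenantId}/oauth2/v2.0/token`,
    body: form.toString(),
  };
}
```

The `access_token` from the response is then sent as a Bearer header to the SharePoint List API.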
by Keith Rumjahn
Who's this for?
- Anyone who wants to improve the SEO of their website
- Umami users who want insights on how to improve their site
- SEO managers who need to generate reports weekly

Case study
Watch the YouTube tutorial here. Get my SEO A.I. agent system here. You can read more about how this works here.

How it works
1. This workflow calls the Umami API to get data.
2. Then it sends the data to A.I. for analysis.
3. It saves the data and analysis to Baserow.

How to use this
1. Input your Umami credentials.
2. Input your website property ID.
3. Input your OpenRouter.ai credentials.
4. Input your Baserow credentials.
You will need to create a Baserow database with columns: Date, Summary, Top Pages, Blog (name of your blog).

Future development
Use this as a template. There are a lot more Umami stats you can pull from the API. Change the A.I. prompt to give even more detailed analysis.

Created by Rumjahn
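The Umami call can be sketched as a URL builder for a trailing date window. This assumes the `/api/websites/{id}/stats` route with Unix-millisecond `startAt`/`endAt` parameters, which Umami's HTTP API exposes; verify the path against your Umami version:

```javascript
// Build the Umami stats request for the last `days` days. `now` is
// injectable so the result is deterministic; in the workflow it would
// default to the current time.
function buildUmamiStatsUrl(baseUrl, websiteId, days, now = Date.now()) {
  const startAt = now - days * 24 * 60 * 60 * 1000;
  return `${baseUrl}/api/websites/${websiteId}/stats?startAt=${startAt}&endAt=${now}`;
}
```

The JSON this endpoint returns (pageviews, visitors, bounces, etc.) is what gets handed to the A.I. prompt for analysis.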
by Airtop
Monitoring Job Changes on LinkedIn

Use Case
This automation tracks job changes among your LinkedIn connections and extracts relevant details. It's ideal for triggering timely outreach, updating CRM records, or feeding lead scoring workflows based on new roles.

What This Automation Does
It scrapes your LinkedIn "Job Changes" feed and returns:
- Name of the person
- Their new position
- LinkedIn profile URL
- Functional category (e.g., marketing, sales, HR, executive)

Each run processes 5 job changes at a time.

How It Works
1. Manual Trigger: starts the workflow when the user clicks "Test workflow."
2. Airtop Enrichment: navigates to the LinkedIn job changes page and extracts name, new_position, linkedin_profile_url, and position_function (a classification such as marketing, sales, HR, etc.).
3. Formatting: output is structured into clean JSON for use in further workflows.

Setup Requirements
- Airtop Profile connected to LinkedIn
- Airtop API key configured in n8n
- A LinkedIn account with a populated "Job Changes" feed

Next Steps
- **Automate Alerts**: Add Slack, email, or CRM integrations to notify your team.
- **Enrich and Score Leads**: Chain this with your ICP scoring workflow to evaluate new roles.
- **Customize Scope**: Expand extraction to more than 5 job changes or add filters based on job titles or functions.

Read more about Monitoring Job Changes on LinkedIn.
by Rodrigue Gbadou
How it works
- **Continuous monitoring**: Real-time surveillance of supplier performance, financial health, and operational status
- **Risk scoring**: AI-powered assessment of supplier risks across multiple dimensions (financial, operational, geopolitical)
- **Automated alerts**: Instant notifications when supplier risk levels exceed predefined thresholds
- **Contingency activation**: Automatic triggering of backup suppliers and alternative sourcing plans

Set up steps
- **Supplier database**: Connect your ERP/procurement system with complete supplier information
- **Financial data sources**: Integrate with credit monitoring services (Dun & Bradstreet, Experian)
- **News monitoring**: Configure news APIs for real-time supplier-related news tracking
- **Performance metrics**: Set up KPI tracking (delivery times, quality scores, compliance)
- **Alert systems**: Configure Slack, Teams, or email notifications for risk alerts
- **Backup protocols**: Define alternative supplier activation procedures

Key Features
- 🔍 **360° supplier visibility**: Complete view of supplier ecosystem health and performance
- ⚡ **Real-time risk detection**: Immediate identification of potential supply chain disruptions
- 📊 **Predictive analytics**: Forecasting potential supplier issues before they impact operations
- 🚨 **Automated escalation**: Risk-based alert system with appropriate stakeholder notifications
- 📈 **Performance benchmarking**: Continuous comparison against industry standards and peers
- 🔄 **Contingency management**: Automated backup supplier activation and procurement rerouting
- 🌍 **Geopolitical monitoring**: Tracking of regulatory changes and political risks by region
- 💰 **Cost impact analysis**: Financial impact assessment of supplier disruptions

Risk categories monitored
- **Financial stability**: Credit scores, payment delays, bankruptcy indicators
- **Operational performance**: Delivery reliability, quality metrics, capacity utilization
- **Compliance status**: Regulatory adherence, certifications, audit results
- **Geopolitical risks**: Political instability, trade restrictions, regulatory changes
- **Environmental factors**: Natural disasters, climate risks, sustainability metrics
- **Cyber security**: Security breaches, data protection compliance

Automated responses
- **Low risk (0-30)**: Routine monitoring and performance tracking
- **Medium risk (31-60)**: Enhanced monitoring with supplier engagement
- **High risk (61-80)**: Immediate supplier contact and mitigation planning
- **Critical risk (81-100)**: Emergency protocols and backup supplier activation

Integration capabilities
- **ERP systems**: SAP, Oracle, Microsoft Dynamics for procurement data
- **Risk platforms**: Resilinc, Riskmethods, Prewave for specialized risk intelligence
- **Financial services**: Credit monitoring and financial health assessment
- **News APIs**: Real-time news monitoring and sentiment analysis
- **Communication tools**: Slack, Teams, email for stakeholder notifications

This workflow provides comprehensive supply chain visibility and proactive risk management, enabling companies to maintain operational continuity while minimizing disruption costs.
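The automated-response bands map directly to a small routing helper. A minimal sketch of that mapping (the function name and return shape are illustrative, not the template's exact node logic):

```javascript
// Map a 0-100 supplier risk score to the response tiers described above.
// Each tier carries the action the workflow should escalate to.
function riskTier(score) {
  if (score <= 30) return { tier: 'low', action: 'routine monitoring and performance tracking' };
  if (score <= 60) return { tier: 'medium', action: 'enhanced monitoring with supplier engagement' };
  if (score <= 80) return { tier: 'high', action: 'immediate supplier contact and mitigation planning' };
  return { tier: 'critical', action: 'emergency protocols and backup supplier activation' };
}
```

In n8n this would typically live in a Code node feeding a Switch node that routes to the matching notification or contingency branch.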
by Ricardo Espinozaas
Use Case
Whenever someone shows interest in your offerings by subscribing to a list in ConvertKit, they could be a potential new customer. Typically you need to gather more detailed information about them (data enrichment) and then update their profile in your CRM system to better manage and nurture your relationship with them. This workflow does all of this for you!

What this workflow does
The workflow runs every time a user subscribes to a ConvertKit list. It then filters out personal emails before enriching the email address. If the email is attached to a company, it enriches the company and upserts it in your HubSpot CRM.

Setup
1. Add Clearbit, HubSpot, and ConvertKit credentials.
2. Click on Test workflow.
3. Subscribe to a list on ConvertKit to trigger the workflow.

Be aware that you can adapt this workflow to work with your enrichment tool, CRM, and email automation tool of choice.
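The "filter out personal emails" step boils down to a domain check before anything is sent to Clearbit. A minimal sketch, with an illustrative free-mail domain list you would extend for your audience:

```javascript
// Skip personal (free-mail) addresses so only company domains are enriched.
// The domain list is illustrative - add providers common among your subscribers.
const PERSONAL_DOMAINS = new Set([
  'gmail.com', 'yahoo.com', 'hotmail.com', 'outlook.com', 'icloud.com',
]);

function isCompanyEmail(email) {
  const domain = email.split('@').pop().toLowerCase();
  return !PERSONAL_DOMAINS.has(domain);
}
```

Only subscribers for whom `isCompanyEmail` returns true continue to the enrichment and HubSpot upsert branch.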
by David Ashby
🛠️ Demio Tool MCP Server

Complete MCP server exposing all Demio Tool operations to AI agents. Zero configuration needed – all 4 operations are pre-built.

⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator – all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
- MCP Trigger: serves as your server endpoint for AI agent requests
- Tool Nodes: pre-configured for every Demio Tool operation
- AI Expressions: automatically populate parameters via $fromAI() placeholders
- Native Integration: uses the official n8n Demio node with full error handling

📋 Available Operations (4 total)
Every possible Demio Tool operation is included:

📅 Event (3 operations)
- Get an event
- Get many events
- Register an event

🔧 Report (1 operation)
- Get a report

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
- Resource IDs and identifiers
- Search queries and filters
- Content and data payloads
- Configuration options

Response Format: native Demio API responses with the full data structure
Error Handling: built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
- Claude Desktop: add the MCP server URL to its configuration
- Custom AI Apps: use the MCP URL as a tool endpoint
- Other n8n Workflows: call MCP tools from any workflow
- API Integration: direct HTTP calls to MCP endpoints

✨ Benefits
- Complete Coverage: every Demio Tool operation available
- Zero Setup: no parameter mapping or configuration needed
- AI-Ready: built-in $fromAI() expressions for all parameters
- Production Ready: native n8n error handling and logging
- Extensible: easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
by Mikal Hayden-Gates
Overview
Automates your complete social media content pipeline: sources articles from Wallabag RSS, generates platform-specific posts with AI, creates contextual images, and publishes via the GetLate API. Built with 63 nodes across two workflows to handle LinkedIn, Instagram, and Bluesky – with easy expansion to more platforms.

Ideal for: content marketers, solo creators, agencies, and community managers maintaining a consistent multi-platform presence with minimal manual effort.

How It Works
Two-workflow architecture:

1. Content Aggregation Workflow
- Monitors Wallabag RSS feeds for tagged articles (#to-share-linkedin, #to-share-instagram, etc.)
- Extracts and converts content from HTML to Markdown
- Stores structured data in Airtable with a platform assignment

2. AI Generation & Publishing Workflow
- A scheduled trigger queries Airtable for unpublished content
- Routes to platform-specific sub-workflows (LinkedIn, Instagram, Bluesky)
- An LLM generates optimized post text and image prompts based on custom brand parameters
- Optionally generates AI images and hosts them on the Imgbb CDN
- Publishes via the GetLate API (immediate or draft mode)
- Updates Airtable with publication status and metadata

Key Features:
- Tag-based content routing using Wallabag's native tag system
- Swappable AI providers (Groq, OpenAI, Anthropic)
- Platform-specific optimization (tone, length, hashtags, CTAs)
- Modular design – duplicate sub-workflows to add new platforms in ~30 minutes
- Centralized Airtable tracking with 17 data points per post

Set Up Steps
Setup time: ~45-60 minutes for initial configuration

1. Create accounts and get API keys (~15 min)
- Wallabag (with RSS feeds enabled)
- GetLate (social media publishing)
- Airtable (create a base with the provided schema – see sticky notes)
- LLM provider (Groq, OpenAI, or Anthropic)
- Image service (Hugging Face, Fal.ai, or Stability AI)
- Imgbb (image hosting)

2. Configure n8n credentials (~10 min)
- Add all API keys in n8n's credential manager
- Detailed credential setup instructions are in the workflow sticky notes

3. Set up the Airtable database (~10 min)
- Create the "RSS Feed - Content Store" base
- Add the 19 required fields (schema provided in the workflow sticky notes)
- Get the Airtable base ID and API key

4. Customize brand prompts (~15 min)
- Edit the "Set Custom SMCG Prompt" node for each platform
- Define brand voice, tone, goals, audience, and image preferences
- Platform-specific examples are provided in the sticky notes

5. Configure platform settings (~10 min)
- Set GetLate account IDs for each platform
- Enable/disable image generation per platform
- Choose immediate publish vs. draft mode
- Adjust the schedule trigger frequency

6. Test and deploy
- Tag test articles in Wallabag
- Monitor the first few executions in draft mode
- Activate the workflows when satisfied with the output

Important: this is a proof-of-concept template. Test thoroughly in draft mode before production use. Detailed setup instructions, troubleshooting tips, and customization guidance are in the workflow's sticky notes.

Technical Details
- **63 nodes**: 9 Airtable operations, 8 HTTP requests, 7 code nodes, 3 LangChain LLM chains, 3 RSS triggers, 3 GetLate publishers
- **Supports**: multiple LLM providers, multiple image generation services, unlimited platforms via the modular architecture
- **Tracking**: 17 metadata fields per post, including publish status, applied parameters, character counts, hashtags, image URLs

Prerequisites
- n8n instance (self-hosted or cloud)
- Accounts: Wallabag, GetLate, Airtable, LLM provider, image generation service, Imgbb
- Basic understanding of n8n workflows and credential configuration
- Time to customize prompts for your brand voice

Detailed documentation, the Airtable schema, prompt examples, and troubleshooting guides are in the workflow's sticky notes.

Category Tags
#social-media-automation, #ai-content-generation, #rss-to-social, #multi-platform-posting, #getlate-api, #airtable-database, #langchain, #workflow-automation, #content-marketing
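The tag-based routing between the two workflows can be sketched as a small helper that maps Wallabag's `#to-share-<platform>` tags to the supported platforms. The function itself is illustrative; the tag convention comes from the template:

```javascript
// Route a Wallabag article to platforms based on its tags. Tags following
// the "#to-share-<platform>" convention are resolved; anything else is ignored.
function platformsForTags(tags) {
  const supported = new Set(['linkedin', 'instagram', 'bluesky']);
  return tags
    .filter((t) => /^#?to-share-/.test(t))
    .map((t) => t.replace(/^#?to-share-/, ''))
    .filter((p) => supported.has(p));
}
```

Adding a new platform then means extending the `supported` set and duplicating one sub-workflow, which is the ~30-minute expansion path the template describes.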
by Jimleuk
This n8n template combines an AI agent with n8n's multi-page forms to create a novel interaction that allows automated question-and-answer sessions. One of the more obvious use cases of this interaction is what I'm calling the AI interviewer.

You can read the full post here: https://community.n8n.io/t/build-your-own-ai-interview-agents-with-n8n-forms/62312
Live demo here: https://jimleuk.app.n8n.cloud/form/driving-lessons-survey

How it works
1. A form trigger is used to start the interview, and a new session is created in Redis to capture the transcript.
2. An AI agent is then tasked with asking the user questions about the topic of the interview. This is set up as a loop, so the questions never stop unless the user wishes to end the interview.
3. Each answer is recorded between questions in the session set up earlier.
4. When the user requests to end the interview, we break the loop and show the interview completion screen.
5. Finally, the session is saved to a Google Sheet, which can then be shared with team members and used for data analysis.

How to use
- You'll need an n8n instance that is accessible to your target audience. Not technical enough to set up your own server? Try out n8n cloud and instantly deploy this template!
- Remember to activate the workflow so the form trigger is published and available for users.

Requirements
- Groq LLM for the AI agent. Feel free to swap this out for any other LLM.
- Redis(-compatible) storage for capturing sessions

Customising this workflow
The next step would be adding tools! AI interviews with knowledge retrieval could definitely open up other possibilities, e.g. an onboarding wizard generating questions by pulling facts from an internal knowledge base.
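To make the session handling concrete, here is a sketch of how a transcript might be accumulated between form pages. The session shape and field names are illustrative assumptions, not the template's exact Redis schema:

```javascript
// Append one question/answer pair to the interview session. The session
// object stands in for the JSON value stored under the Redis session key;
// returning a new object keeps the update side-effect free.
function appendAnswer(session, question, answer) {
  return {
    ...session,
    transcript: [...session.transcript, { turn: session.transcript.length + 1, question, answer }],
  };
}
```

In the workflow, the updated session would be written back to Redis after each form page, and the full `transcript` array is what lands in the Google Sheet at the end.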
by Nukeador
Who is this for?
BlueSky users who want to send a welcome message to their new followers as a private message.

What this workflow does
This workflow checks for new followers on BlueSky every 60 minutes and sends a private message to the new ones.

Setup
1. Create a BlueSky app password with private messages access.
2. Fill in your credentials and the message text in the corresponding nodes (see sticky notes).
3. Manually run the `Save followers to file` node once to generate your initial followers list.
4. Enable the workflow.

How to customize this workflow to your needs
You can adjust the check frequency, but be careful to avoid hitting the 100 createSession-per-day rate limit.

Feedback or comments
You can leave comments, feedback, or improvements about this workflow on the n8n forums.
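The core of each hourly run is diffing the current follower list against the saved baseline from `Save followers to file`. A minimal sketch of that diff, using follower DIDs as the identifier (an assumption; any stable follower ID works):

```javascript
// Return the followers present in the current list but absent from the
// saved baseline - these are the accounts that should receive the
// welcome message on this run.
function newFollowers(currentIds, savedIds) {
  const known = new Set(savedIds);
  return currentIds.filter((id) => !known.has(id));
}
```

After messaging, the saved list would be overwritten with `currentIds` so the same follower is never welcomed twice.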