by Alfred Nutile
## How it works
This workflow provides a streamlined process for uploading files to Digital Ocean Spaces and making them publicly accessible. The process happens in three main steps:
1. The user submits the form with a file. In my case, I needed it to upload images I use in my SEO tags.
2. The file is automatically uploaded to Digital Ocean Spaces using S3-compatible storage.
3. A form completion confirmation is provided.

## Setup steps
Initial setup typically takes 5-10 minutes:
- Configure your Digital Ocean Spaces credentials and bucket settings.
- Test the upload functionality with a small sample file.
- Verify that public access permissions are working as expected.

## Important notes
- Credentials are tricky; check the screenshot above for how I set the URL, bucket, etc. I am just using the S3 node.
- Set the ACL as seen below (a hedged code sketch follows).

## Troubleshooting
- The bucket name might be incorrect.
- The region might be wrong.
- Check Space permissions if uploads fail.
- Verify API credentials are correctly configured.

You can see a video here (live in 24 hours): https://youtu.be/pYOpy3Ntt1o
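As a reference for the credential setup, here is a minimal sketch of the equivalent S3-compatible upload outside n8n, using the AWS SDK for JavaScript. The region `nyc3` and bucket `my-space` are placeholders; substitute your own Spaces values.

```js
// Sketch: upload a file to Digital Ocean Spaces with a public-read ACL
// via the S3-compatible API. Region and bucket names are placeholders.
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const fs = require("fs");

const client = new S3Client({
  region: "nyc3", // your Spaces region, also part of the endpoint
  endpoint: "https://nyc3.digitaloceanspaces.com", // Spaces endpoint, not AWS
  credentials: {
    accessKeyId: process.env.SPACES_KEY,
    secretAccessKey: process.env.SPACES_SECRET,
  },
});

async function upload(path, key) {
  await client.send(new PutObjectCommand({
    Bucket: "my-space",
    Key: key,
    Body: fs.createReadStream(path),
    ACL: "public-read", // mirrors the ACL setting in the n8n S3 node
  }));
  // Public URL pattern for Spaces objects
  return `https://my-space.nyc3.digitaloceanspaces.com/${key}`;
}
```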
by Romain Jouhannet
This workflow imports Productboard data into Snowflake, automating data extraction, mapping, and updates for features, companies, and notes. It supports scheduled weekly updates, data cleansing, and Slack notifications summarizing the latest insights.

## Features
- Fetches data from Productboard (features, companies, notes).
- Maps and processes data for Snowflake tables.
- Automates table creation, truncation, and updates (see the reload sketch below).
- Summarizes new and unprocessed notes.
- Sends weekly Slack notifications with key insights.

## Setup
1. Configure Productboard and Snowflake credentials in n8n.
2. Update Snowflake table schemas to match your setup.
3. Replace the Slack channel ID and dashboard URL in the notification node.
4. Activate the workflow and set the desired schedule.
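For context, this is a hedged sketch of the truncate-and-reload pattern for one table, using the `snowflake-sdk` Node package. The `FEATURES` table name and column list are assumptions; match them to your own schema.

```js
// Sketch: weekly full refresh of a Snowflake table from Productboard records.
const snowflake = require("snowflake-sdk");

const conn = snowflake.createConnection({
  account: process.env.SNOWFLAKE_ACCOUNT,
  username: process.env.SNOWFLAKE_USER,
  password: process.env.SNOWFLAKE_PASSWORD,
  database: "PRODUCTBOARD", // placeholder database/schema
  schema: "PUBLIC",
});

function run(sqlText, binds = []) {
  return new Promise((resolve, reject) => {
    conn.execute({ sqlText, binds, complete: (err, _stmt, rows) => (err ? reject(err) : resolve(rows)) });
  });
}

async function reloadFeatures(features) {
  await new Promise((res, rej) => conn.connect((err) => (err ? rej(err) : res())));
  await run("TRUNCATE TABLE IF EXISTS FEATURES"); // full refresh each week
  for (const f of features) {
    await run("INSERT INTO FEATURES (ID, NAME, STATUS) VALUES (?, ?, ?)",
      [f.id, f.name, f.status]); // assumed column mapping
  }
}
```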
by Ludwig
## How it works
This workflow automates tagging for WordPress posts using AI:
1. Fetch blog post content and metadata.
2. Generate contextually relevant tags using AI.
3. Verify existing tags in WordPress and create new ones if necessary (see the sketch below).
4. Automatically update posts with accurate and optimized tags.

## Set up steps
Estimated time: ~15 minutes.
1. Configure the workflow with your WordPress API credentials.
2. Connect your content source (e.g., RSS feed or manual input).
3. Adjust tag formatting preferences in the workflow settings.
4. Run the workflow to ensure proper tag creation and assignment.

This workflow is perfect for marketers and content managers looking to streamline their content categorization and improve SEO efficiency.
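A hedged sketch of the "verify or create, then assign" step against the WordPress REST API. `SITE` and the application-password credentials are placeholders; authentication options vary by site.

```js
// Sketch: ensure each AI-generated tag exists in WordPress, then assign to a post.
const SITE = "https://example.com"; // placeholder site URL
const auth = "Basic " + Buffer.from(`${process.env.WP_USER}:${process.env.WP_APP_PASSWORD}`).toString("base64");
const headers = { Authorization: auth, "Content-Type": "application/json" };

async function ensureTag(name) {
  // Look for an existing tag with this name first
  const found = await fetch(`${SITE}/wp-json/wp/v2/tags?search=${encodeURIComponent(name)}`, { headers })
    .then((r) => r.json());
  const exact = found.find((t) => t.name.toLowerCase() === name.toLowerCase());
  if (exact) return exact.id;
  // Otherwise create it
  const created = await fetch(`${SITE}/wp-json/wp/v2/tags`, {
    method: "POST", headers, body: JSON.stringify({ name }),
  }).then((r) => r.json());
  return created.id;
}

async function tagPost(postId, tagNames) {
  const ids = [];
  for (const n of tagNames) ids.push(await ensureTag(n));
  await fetch(`${SITE}/wp-json/wp/v2/posts/${postId}`, {
    method: "POST", headers, body: JSON.stringify({ tags: ids }),
  });
}
```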
by Francis Njenga
# Workflow Documentation: HR Job Posting and Evaluation with AI

## Detailed Description
The HR Job Posting and Evaluation with AI workflow is designed to streamline and enhance recruitment for technical roles, such as Automation Specialists. By automating key stages in the hiring process, this workflow ensures a seamless experience for both candidates and HR teams. From collecting applications to evaluating candidates using AI and scheduling interviews, this workflow provides an end-to-end solution for recruitment challenges.

## Who is this for?
This workflow is ideal for:
- **HR Professionals**: Managing multiple job postings and candidates efficiently.
- **Recruitment Teams**: Handling large volumes of applications for technical positions.
- **Hiring Managers**: Ensuring structured and objective candidate evaluations.

## What problem does this workflow solve?
- **Time-Consuming Processes**: Automates repetitive tasks like data entry, CV management, and scheduling.
- **Fair Candidate Evaluation**: Leverages AI to provide objective insights based on resumes and job descriptions.
- **Streamlined Communication**: Ensures timely and personalized candidate interactions, improving their experience.

## What this workflow does
This workflow automates the following steps:
1. **Form Submission**: Collects candidate information via a structured application form.
2. **Data Storage**: Stores applicant details in Airtable for centralized tracking.
3. **CV Management**: Automatically uploads resumes to Google Drive for easy access and organization.
4. **AI-Powered Candidate Evaluation**: Scores candidates based on their resumes and job descriptions using OpenAI, providing actionable insights (see the scoring sketch below).
5. **Interview Scheduling**: Automates scheduling based on candidate and interviewer availability.
6. **Communication**: Sends customized emails to candidates for interview invitations and feedback.

## Setup
### Prerequisites
To use this workflow, you'll need:
- **n8n Account**: To create and run the workflow.
- **Airtable Account**: For managing applicant data.
- **Google Drive Account**: For storing candidate CVs.
- **OpenAI API Key**: For AI-powered candidate scoring.
- **SMTP Email Account**: For sending candidate communications.

### Setup Process
1. **Airtable Configuration**: Create a base in Airtable with tables for Applicants and Job Positions.
2. **Google Drive Setup**: Create a folder for CV storage and ensure you have write permissions.
3. **Integrate Airtable in n8n**: Use the Airtable API key to connect Airtable to n8n.
4. **Integrate Google Drive in n8n**: Authorize Google Drive to enable CV storage automation.
5. **OpenAI Integration**: Add your OpenAI API key to n8n for candidate scoring.
6. **Email Configuration**: Set up your SMTP email account in n8n for sending notifications and invitations.

## How to customize this workflow
Tailor the workflow to fit your unique recruitment needs:
- **Edit Job Descriptions**: Adjust the form parameters to match the specific role and qualifications.
- **Refine AI Evaluation Criteria**: Modify OpenAI prompts to reflect the skills and competencies for the desired position.
- **Personalize Email Templates**: Update email content to match your organization's tone and branding.
- **Add New Features**: Incorporate additional steps like feedback collection or integration with other HR tools.

## Conclusion
The HR Job Posting and Evaluation with AI workflow simplifies and automates the recruitment process, enabling HR teams to focus on engaging with candidates rather than handling administrative tasks. With its powerful integrations and customization options, this workflow helps organizations hire efficiently while improving the candidate experience.
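As an illustration of the scoring step, here is a hedged sketch using the OpenAI Node SDK. The prompt wording and the 0-100 rubric are assumptions; adapt them to the role you are hiring for.

```js
// Sketch: score a candidate's resume against a job description with OpenAI,
// returning structured JSON for downstream Airtable storage.
const OpenAI = require("openai");
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function scoreCandidate(resumeText, jobDescription) {
  const res = await client.chat.completions.create({
    model: "gpt-4o-mini",
    response_format: { type: "json_object" },
    messages: [
      {
        role: "system",
        content: 'You are an HR assistant. Score the candidate 0-100 against the job description and explain briefly. Reply as JSON: {"score": number, "summary": string}.',
      },
      { role: "user", content: `Job description:\n${jobDescription}\n\nResume:\n${resumeText}` },
    ],
  });
  return JSON.parse(res.choices[0].message.content); // { score, summary }
}
```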
by Joseph LePage
# Multi-AI Agent Chatbot for Postgres/Supabase Databases and QuickChart Generation

## Who is this for?
This workflow is ideal for data analysts, developers, and business intelligence teams who need an AI-powered chatbot to query Postgres/Supabase databases and generate dynamic charts for data visualization.

## What problem does this solve?
It simplifies data exploration by combining conversational AI with database querying and chart generation. Users can interact with their database using natural language, retrieve insights, and visualize data without manual SQL queries or chart configuration.

## What this workflow does
- **AI-Powered Chat Interface**: Accepts natural language prompts to query databases or generate charts, and routes user requests through a tool agent system to determine the appropriate action (query or chart).
- **Database Querying**: Executes SQL queries on Postgres/Supabase databases based on user input, retrieving schema information, table definitions, and specific data records.
- **Dynamic Chart Generation**: Uses QuickChart to create bar charts, line charts, or other visualizations from database records, outputting a shareable chart URL or JSON configuration for further customization (see the sketch below).
- **Memory Integration**: Maintains chat history using Postgres memory nodes, enabling context-aware interactions.

*Workflow diagram showcasing AI agents, database querying, and chart generation paths.*

## Setup
### Prerequisites
- A Postgres-compatible database (e.g., Supabase).
- API credentials for OpenAI.

### Configuration Steps
1. Add your database connection credentials in the Postgres nodes.
2. Set up OpenAI credentials for GPT-4o-mini in the language model nodes.
3. Adjust the QuickChart schema in the "QuickChart Object Schema" node to fit your use case.

### Testing
- Trigger the chat workflow via the "When chat message received" node.
- Test with prompts like "Generate a bar chart of sales data" or "Show me all users in the database."

## How to customize this workflow
- **Modify AI Prompts**
- **Add Chart Types**
- **Integrate Other Tools**
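A minimal sketch of the chart-generation step: QuickChart renders any Chart.js config passed in its `c` query parameter, so building a shareable URL is just JSON plus URL encoding. The `{ label, value }` row shape is an assumption; map your SQL output accordingly.

```js
// Sketch: turn query results into a shareable QuickChart URL.
function chartUrl(rows, title) {
  const config = {
    type: "bar",
    data: {
      labels: rows.map((r) => r.label),
      datasets: [{ label: title, data: rows.map((r) => r.value) }],
    },
  };
  // QuickChart renders a Chart.js config passed in the "c" query parameter
  return "https://quickchart.io/chart?c=" + encodeURIComponent(JSON.stringify(config));
}

// e.g. chartUrl([{ label: "Jan", value: 120 }, { label: "Feb", value: 90 }], "Sales")
```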
by Geekaz / Kazen
## Who is this for?
This template is designed for social media managers, content creators, data analysts, and anyone who wants to automatically save and analyze their Meta Threads posts in Notion. It's particularly useful for:
- Building a personal archive of your Threads content.
- Training AI models using your social media data.
- Tracking your online presence and engagement.

## What this workflow does
This workflow uses the Meta Threads API to automatically retrieve your posts and import them into a Notion database. It retrieves the post content, date, and time, and stores them in designated properties within your Notion database. A hedged sketch of the retrieval call follows the setup steps below.

## Setup
1. **Get Threads Access Token and ID**: Obtain a long-lived access token and your Threads ID from the Meta Threads developer platform. This token auto-refreshes, ensuring uninterrupted workflow operation.
2. **Configure Credentials and Date Range**: In the "Set Credentials" node (using edit fields), enter your token and ID. Set the since and until parameters in the "Set Date Range" node to specify the post import period.
3. **Connect to Notion and Create a Database**: Connect to your Notion workspace and create a database with these properties (customize with the "Create Properties" node):
   a. Title: Threads post URL (Notion entry title).
   b. Threads ID: Unique post ID (prevents duplicate imports).
   c. Username: Post author (for future multi-account/source management).
   d. Post Date: Original post date.
   e. Source (Multi-Select): "Threads" tag (for future multi-platform import and filtering).
   f. Created: Import date and time.
   g. Import Check (Optional): For use with a separate post-categorization workflow.
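A hedged sketch of the posts retrieval, assuming the `graph.threads.net` endpoint and field names; verify both against the current Meta Threads API documentation.

```js
// Sketch: fetch Threads posts in a date range and map them onto the
// Notion properties listed above.
async function fetchThreadsPosts(userId, token, since, until) {
  const params = new URLSearchParams({
    fields: "id,text,permalink,timestamp,username",
    since, // e.g. "2024-01-01"
    until, // e.g. "2024-01-31"
    access_token: token,
  });
  const res = await fetch(`https://graph.threads.net/v1.0/${userId}/threads?${params}`);
  const { data } = await res.json();
  return data.map((p) => ({
    title: p.permalink, // Notion entry title: post URL
    threadsId: p.id,    // prevents duplicate imports
    username: p.username,
    postDate: p.timestamp,
  }));
}
```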
by Zacharia Kimotho
This workflow automates sentiment analysis of Reddit posts related to Apple's WWDC25 event. It extracts data, categorizes posts, analyzes the sentiment of comments, and updates a Google Sheet with the results.

## Prerequisites
- **Bright Data Account**: You need a Bright Data account to scrape Reddit data. Ensure you have the correct permissions to use their API. https://brightdata.com/
- **Google Sheets API Credentials**: Enable the Google Sheets API in your Google Cloud project and create credentials (OAuth 2.0 Client IDs).
- **Google Gemini API Credentials**: You need a Gemini API key to run the sentiment analysis. Ensure you have the correct permissions to use their API. https://ai.google.dev/ You can use any other model of choice.

## Setup
1. **Import the Workflow**: Import the provided JSON workflow into your n8n instance.
2. **Configure Bright Data Credentials**: In the 'scrap reddit' and 'get status' nodes, find the Authorization field in Header Parameters and replace `Bearer 1234` with your Bright Data API key. Apply this to every node that uses your Bright Data API key. (A hedged request sketch follows below.)
3. **Set up the Google Sheets API credentials**: In the 'Append Sentiments' node, set up the Google Sheets API by connecting your Google Sheets account through OAuth 2.0 credentials.
4. **Configure the Google Gemini credentials**: In the 'Sentiment Analysis per comment' node, set up the Google Gemini API by connecting your Google AI account through the API credentials.
5. **Configure additional parameters**:
   - In the 'scrap reddit' node, modify the JSON body to adjust the search term, date, or sort method.
   - In the 'Wait' node, alter the 'Amount' to adjust the polling interval for scraping status; it is set to 15 seconds by default.
   - In the 'Text Classifier' node, customize the categories and descriptions to suit your sentiment analysis needs. Review categories such as 'WWDC events' to ensure relevancy.
   - In the 'Sentiment Analysis per comment' node, modify the system prompt template to improve context.

## Customization options
- Bright Data API parameters to adjust scraping behavior.
- Wait node duration to optimize polling.
- Text Classifier categories and descriptions.
- Sentiment Analysis system prompt.

## Use case examples
- **Brand Monitoring**: Track public sentiment towards Apple during and after the WWDC25 event.
- **Product Feedback Analysis**: Gather insights into user reactions to new product announcements.
- **Competitive Analysis**: Compare sentiment towards Apple's announcements versus competitors.
- **Event Impact Assessment**: Measure the overall impact of the WWDC25 event on various aspects of Apple's business.

## Target audiences
Marketing professionals in the tech industry, brand managers, product managers, market research analysts, and social media managers.

## Troubleshooting
- **Workflow fails to start**: Check that all necessary credentials (Bright Data and Google Sheets API) are correctly configured and that the Bright Data API key is valid.
- **Data scraping fails**: Verify the Bright Data API key, ensure the dataset ID is correct, and inspect the Bright Data dashboard for any issues with the scraping job.
- **Sentiment analysis is inaccurate**: Refine the categories and descriptions in the 'Text Classifier' node. Check that you have the correct Google Gemini API key, as the original is a placeholder.
- **Google Sheets are not updating**: Ensure the Google Sheets API credentials have the necessary permissions to write to the specified spreadsheet and sheet. Check API usage limits.
- **Workflow does not produce the correct output**: Check the data connections by clicking the connections and looking at which data is being produced. Check all formulas for errors.

Happy productivity!
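For reference, a hedged sketch of what the 'scrap reddit' request looks like outside n8n, assuming Bright Data's datasets v3 trigger endpoint. The `dataset_id` and input fields are placeholders; check your Bright Data dashboard for the exact values your dataset expects.

```js
// Sketch: trigger a Bright Data scraping job with Bearer auth,
// equivalent to the 'scrap reddit' HTTP Request node.
async function triggerScrape(apiKey, datasetId) {
  const res = await fetch(
    `https://api.brightdata.com/datasets/v3/trigger?dataset_id=${datasetId}`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`, // replaces "Bearer 1234" in the node
        "Content-Type": "application/json",
      },
      body: JSON.stringify([
        { keyword: "WWDC25", date: "last_week", sort_by: "top" }, // assumed input shape
      ]),
    }
  );
  return res.json(); // typically returns a snapshot id, polled by the 'get status' node
}
```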
by David Ashby
# 🛠️ Pushover Tool MCP Server

Complete MCP server exposing all Pushover Tool operations to AI agents. Zero configuration needed, with 1 operation pre-built.

## ⚡ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance.
2. Activate the workflow to start your MCP server.
3. Copy the webhook URL from the MCP trigger node.
4. Connect AI agents using the MCP URL.

## 🔧 How it Works
• MCP Trigger: Serves as your server endpoint for AI agent requests
• Tool Nodes: Pre-configured for every Pushover Tool operation
• AI Expressions: Automatically populate parameters via $fromAI() placeholders (see the example below)
• Native Integration: Uses the official n8n Pushover Tool node with full error handling

## 📋 Available Operations (1 total)
Every possible Pushover Tool operation is included:

💬 Message (1 operation)
• Push a message

## 🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options

Response Format: Native Pushover Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

## 💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• Other n8n Workflows: Call MCP tools from any workflow
• API Integration: Direct HTTP calls to MCP endpoints

## ✨ Benefits
• Complete Coverage: Every Pushover Tool operation available
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n error handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
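A hedged example of what such a `$fromAI()` placeholder looks like inside a tool node field. The parameter name and description here are illustrative; n8n fills the value from the AI agent's request at runtime.

```js
// n8n expression, e.g. in the Pushover node's Message field:
{{ $fromAI('message', 'The notification text to push to the user', 'string') }}
```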
by Jimleuk
This n8n template lets you summarize team member activity on Slack for the past week and generates a report.

For remote teams, chat is a crucial communication tool to ensure work gets done, but with so many conversations happening at once and in multiple threads, ideas, information and decisions usually live in the moment and get lost just as quickly, all forgotten by the weekend! Using this template, this doesn't have to be the case. Have AI crawl through last week's activity, summarize all threads and generate a casual and snappy report to bring the team back into focus for the current week. A project manager's dream!

## How it works
1. A scheduled trigger is set to run every Monday at 6am to gather all team channel messages within the last week.
2. Each message thread is grouped by user and data-mined for replies (see the grouping sketch below).
3. Combined, an AI analyses the raw messages to pull out interesting observations and highlights.
4. The summarized threads of the user are then combined and passed to another AI agent to generate a higher-level overview of their week. These are referred to as the individual reports.
5. Next, all individual reports are summarized together into a team weekly report. This allows understanding of group and similar activities.
6. Finally, the team weekly report is posted back to the channel. The timing is important, as it should be the first message of the week, ready for the team to glance over coffee.

## How to use
- Works best per project and where most of the comms happen on a single channel. Avoid combining channels; instead, duplicate this workflow for more channels.
- You may need to filter for specific team members if you want specific team updates.
- Customise the report to suit your organisation, team or the channel. You may prefer to be more formal if clients or external stakeholders are also present.

## Requirements
- Slack for chat platform
- Gemini for LLM (or switch for other models)

## Customising this workflow
- If the Slack channel is busy enough already, consider posting the final report to email.
- Pull in project metrics to include in your report. As extra context, it may be interesting to tie the messages to production performance.
- Use an AI Agent to query for knowledgebase or tickets relevant to the messages. This may be useful for attaching links or references to add context.
- Channel not so busy, or way too busy for 1 week? Play with the scheduled trigger and set an interval which works for your team.
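An illustrative sketch of the grouping step: bucket raw Slack messages by user so each person's threads can be summarized separately. The message shape (`user`, `ts`, `text`, `thread_ts`) follows Slack's `conversations.history` output.

```js
// Sketch: group Slack channel messages by author for per-user summarization.
function groupByUser(messages) {
  const byUser = {};
  for (const msg of messages) {
    if (!msg.user || !msg.text) continue; // skip bot/system events
    (byUser[msg.user] ??= []).push({
      ts: msg.ts,
      text: msg.text,
      isThreadReply: Boolean(msg.thread_ts && msg.thread_ts !== msg.ts),
    });
  }
  return byUser; // { U012ABC: [ { ts, text, isThreadReply }, ... ], ... }
}
```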
by Matthieu
# LinkedIn Profile Tracker Automation

## Who is this for?
This template is ideal for sales teams, recruiters, business development professionals, and relationship managers who need to monitor changes in their network's LinkedIn profiles. Perfect for agencies tracking client personnel changes, HR teams monitoring talent movements, sales professionals staying updated on prospect job changes, and content teams tracking influencer activity.

## What problem does this workflow solve?
Manually checking LinkedIn profiles for updates like job changes, status modifications, profile edits, or latest posts is extremely time-consuming and easy to miss. This automation eliminates the need for constant manual monitoring while ensuring you never miss important changes that could signal new business opportunities, relationship updates, or content engagement opportunities.

## What this workflow does
This workflow automatically monitors a list of LinkedIn profiles on a weekly schedule and detects changes in:
- **Personal information** (name, headline, summary)
- **Job status** (hiring/open to work flags)
- **Latest work experience** (new positions, company changes)
- **Recent posts** (latest content activity)

When changes are detected, it immediately sends Slack notifications with before/after comparisons and updates your tracking database to maintain historical records of all profile evolution.

## Setup
1. Create a Ghost Genius API account and get your API key for LinkedIn profile scraping (a hedged request sketch follows below).
2. Configure HTTP Request nodes with Header Auth credentials using your Ghost Genius API key.
3. Set up your Google Sheets database with columns: Firstname, Lastname, LinkedIn URL, ID, Tagline, Summary, Latest experience, Open to work?, Hiring?, Latest post.
4. Configure Slack webhook integration for real-time notifications.
5. Set up credentials for Google Sheets and Slack following n8n documentation.
6. Add LinkedIn profile URLs to your Google Sheet to start monitoring.

## How to customize this workflow
- **Modify the schedule trigger** to check profiles daily, bi-weekly, or monthly based on your monitoring needs.
- **Customize Slack notification messages** to include additional context, mentions, or custom formatting.
- **Add email notifications** alongside Slack alerts for critical changes like job transitions.
- **Set up filtered notifications** to only alert on specific types of changes (e.g., job changes only, posts from key influencers).
- **Add post content analysis** to detect mentions of your company or competitors.
- **Integrate with CRM systems** to automatically update lead records when profile changes occur.
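A hedged sketch of the profile-scrape request with Header Auth. The base URL, endpoint path, and response fields here are hypothetical placeholders for illustration only; use the exact values from the Ghost Genius API documentation.

```js
// Sketch: fetch one LinkedIn profile via a header-authenticated API call,
// mirroring the n8n HTTP Request node's Header Auth credential.
async function fetchProfile(apiKey, profileUrl) {
  const endpoint = "https://api.example-ghostgenius.com/profile"; // hypothetical endpoint
  const res = await fetch(`${endpoint}?url=${encodeURIComponent(profileUrl)}`, {
    headers: { Authorization: `Bearer ${apiKey}` },
  });
  if (!res.ok) throw new Error(`Scrape failed: ${res.status}`);
  return res.json(); // compare fields against the previous row in Google Sheets
}
```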
by Angel Menendez
# Submission Overview for Voiceflow Demo Workflow
View the YouTube video for this workflow here.

## Who is this for?
This workflow is ideal for businesses and developers using Voiceflow to power AI voice chatbots. It benefits teams that want to enhance chatbot functionality through integrations with platforms like Zendesk, Google Calendar, and Airtable.

## What problem is this workflow solving?
The workflow addresses the need for seamless integration of chatbot interactions with backend systems. It automates customer service tasks such as ticket creation, meeting scheduling, and data reporting, reducing manual effort and enhancing efficiency.

## What does this workflow do?
- **Customer Lookup**: Checks the database for existing customers and returns relevant details or a "NOT_FOUND" status (see the sketch below).
- **Zendesk Ticket Creation**: Automates the creation of support tickets for customer issues.
- **Meeting Scheduling**: Integrates with Google Calendar to provide availability and schedule meetings.
- **Transcript Reporting**: Aggregates interaction data and sends it to Airtable for analysis by the product team.

## Setup
1. Configure your Voiceflow chatbot to connect to this workflow via a webhook.
2. Set up the required integrations:
   - Zendesk API: For ticket creation.
   - Google Calendar API: For scheduling.
   - Airtable API: For storing transcripts.
3. Customize the workflow's nodes to match your use case, such as database fields or API endpoints.
4. Deploy the workflow on your n8n instance and test the integrations.

## How to customize this workflow to your needs
- Adjust database queries to match your customer data schema.
- Modify the Zendesk ticket payload to include additional fields or custom formats.
- Update Google Calendar configurations for different scheduling requirements.
- Add or remove Airtable fields based on the product team's analysis needs.

This template adheres to n8n's submission guidelines, ensuring clarity, relevance, and broad applicability for users in customer service, product development, and automation.
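An illustrative sketch of the customer-lookup response contract the Voiceflow bot can branch on: either customer details or a "NOT_FOUND" status. The field names and the `findByEmail` helper are assumptions; match them to your own customer schema.

```js
// Sketch: look up a customer and return a response shape the chatbot can branch on.
async function lookupCustomer(db, email) {
  const customer = await db.findByEmail(email); // hypothetical data-access helper
  if (!customer) return { status: "NOT_FOUND" };
  return {
    status: "FOUND",
    id: customer.id,
    name: customer.name,
    plan: customer.plan,
  };
}
```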
by Todsaporn Sangboon
## 📈 How it works
This n8n workflow allows you to interact with the Binance Spot Trading API directly to:
- Place **Limit Buy** and **Limit Sell** orders
- Place **Market Buy** and **Market Sell** orders
- Query **account info** and **open orders**
- **Cancel all open orders** for a specific symbol

All requests are signed using Binance's HMAC SHA256 signature method for secure trading (see the signing sketch below).

## ⚙️ Setup Steps
1. Create Binance API credentials in n8n:
   - Go to Credentials > New
   - Choose Binance API
   - Add api_key and api_secret
   - Save as Binance API
2. Import this workflow into your n8n instance.
3. Update default values: in Set Parameter nodes like LimitBuy Parameter, change symbol (e.g. BTCUSDT), quantity, and price as needed.
4. Run the workflow manually via the Execute workflow trigger.

## ✅ Notes
- The credential node is marked with instructions.
- HMAC signatures are automatically calculated before making each request.
- HTTP nodes are preconfigured for Binance API v3.
- 🔒 No API key or secret is included.
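For reference, a minimal sketch of Binance's HMAC SHA256 request signing using Node's built-in crypto module. The symbol, quantity, and price values are placeholders; the signature is computed over the exact query string and appended as `signature`.

```js
// Sketch: sign a Binance Spot API request and place a limit buy order.
const crypto = require("crypto");

function signedQuery(params, apiSecret) {
  const qs = new URLSearchParams({ ...params, timestamp: Date.now() }).toString();
  const signature = crypto.createHmac("sha256", apiSecret).update(qs).digest("hex");
  return `${qs}&signature=${signature}`;
}

async function limitBuy(apiKey, apiSecret) {
  const query = signedQuery(
    { symbol: "BTCUSDT", side: "BUY", type: "LIMIT", timeInForce: "GTC", quantity: "0.001", price: "30000" },
    apiSecret
  );
  const res = await fetch(`https://api.binance.com/api/v3/order?${query}`, {
    method: "POST",
    headers: { "X-MBX-APIKEY": apiKey }, // API key goes in the header, secret only signs
  });
  return res.json();
}
```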