by Jah coozi
**AI Social Media Content Generator & Scheduler**

Transform your social media strategy with AI-powered content generation that creates platform-specific posts in seconds!

**🚀 What It Does**
This workflow uses AI to generate optimized content for multiple social media platforms from a single topic input. Perfect for marketers, content creators, and businesses looking to maintain a consistent social media presence.

**✨ Key Features**
- **Multi-Platform Support**: LinkedIn, Twitter/X, Instagram, Facebook, TikTok
- **AI-Powered Generation**: Uses GPT-4 for creative, engaging content
- **Platform Optimization**: Respects character limits and best practices
- **Hashtag Generation**: Platform-specific hashtag strategies
- **Posting Time Suggestions**: Optimal times for each platform
- **Tone Customization**: Professional, casual, friendly, or custom
- **Multi-Language Support**: Generate content in any language
- **Engagement Predictions**: Estimate reach and engagement
- **Daily Automation**: Schedule automatic content generation
- **Bulk Processing**: Generate content for multiple topics at once

**📊 Use Cases**
- Marketing Teams: Streamline content creation across channels
- Small Businesses: Maintain a consistent social presence
- Content Agencies: Scale content production efficiently
- Personal Brands: Build thought leadership
- E-commerce: Product launches and promotions

**🛠️ Setup Instructions**
1. Add OpenAI Credentials: Get an API key from OpenAI and add it to your n8n credentials.
2. Configure Webhook (Optional): Set a custom path if needed and enable it for external integrations (see the sketch at the end of this section).
3. Customize Settings: Adjust tone and style, set platform preferences, and configure the posting schedule.
4. Test Generation: Use the example prompts and verify output quality.

**💡 Example Inputs**
- "New product launch - eco-friendly water bottle"
- "Company milestone - 10 years in business"
- "Industry insights - Future of AI in healthcare"
- "Team spotlight - Meet our new developer"
- "Seasonal campaign - Summer sale 50% off"

**📈 Benefits**
- **10x Faster**: Create content in seconds vs hours
- **Consistency**: Maintain brand voice across platforms
- **Engagement**: Platform-optimized for maximum reach
- **Scalability**: Generate unlimited content
- **Cost-Effective**: Reduce content creation costs by 80%

**🔧 Customization Options**
- Custom brand voice training
- Industry-specific content rules
- Competitor analysis integration
- A/B testing capabilities
- Analytics webhook integration
- Auto-posting to platforms
- Image generation add-on
- Translation services

**🎯 Pro Tips**
- Train the AI with your best-performing posts
- Use platform analytics to refine strategies
- Test different tones for audience engagement
- Schedule content during peak hours
- Monitor and iterate based on performance

Start creating engaging social media content today!

**Categories:** Marketing & Growth, Content Creation, Social Media, AI & Automation, Productivity
**Difficulty:** Beginner
**Required Services:** OpenAI API (or compatible LLM), n8n instance
**Optional:** Social media APIs for auto-posting
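If you enable the optional webhook, triggering a generation run is just an HTTP call with a topic and a few preferences. Below is a minimal sketch in JavaScript; the webhook path and the field names (`topic`, `platforms`, `tone`, `language`) are assumptions for illustration, so match them to whatever your Webhook and settings nodes actually expect.

```javascript
// Minimal sketch: call the n8n webhook with one of the example topics.
// The webhook path and field names below are assumptions; use the ones
// configured in your own Webhook node.
const N8N_WEBHOOK_URL = "https://your-n8n-instance/webhook/social-content";

async function requestSocialPosts() {
  const response = await fetch(N8N_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      topic: "New product launch - eco-friendly water bottle",
      platforms: ["linkedin", "twitter", "instagram"],
      tone: "professional",
      language: "en",
    }),
  });
  return response.json(); // expected: generated posts per platform
}

requestSocialPosts().then((posts) => console.log(posts));
```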
by Miquel Colomer
This n8n workflow template checks for new major releases (tagged with .0) of the n8n project using its official GitHub releases feed. It runs multiple times a day and sends notifications via email and Telegram if a new release is found.

> ⚠️ Note: You must *activate the workflow* to start receiving release notifications.

**🚀 What It Does**
- Monitors the n8n GitHub releases feed
- Detects major versions (e.g., 1.0.0, 2.0.0)
- Sends alert messages via Telegram and email (SES) when a release is published

**⏰ Scheduling Details**
The Cron node checks for new releases three times per day: 10:00, 14:00, and 18:00 server time.

**🛠️ Step-by-Step Setup**
1. Configure Telegram Bot: Connect your Telegram bot and specify the chat ID where you want to receive notifications.
2. Set up AWS SES Credentials: Use a verified sender email and set up AWS SES credentials in your n8n instance.
3. Activate the Workflow: Enable the workflow in your instance to start receiving notifications.
4. Customize Notification Messages (Optional): You can modify the email subject, Telegram format, or filter logic.

**🧠 How It Works: Workflow Overview**
1. Cron Trigger: Runs the workflow at 10:00, 14:00, and 18:00 daily.
2. Read RSS Feed: Pulls data from https://github.com/n8n-io/n8n/releases.atom.
3. Filter by Current Day: Filters the feed for releases published in the last 4 hours with titles starting with n8n@ and ending with .0 (see the sketch at the end of this section).
4. Condition Check: Uses a regex to check if the filter result contains any release data.
5. Notifications: If a new major release is found, sends a Telegram message to a specified chat and an email via AWS SES with the release info.

**📨 Final Output**
You'll receive a Telegram message and email when a new major n8n version is released.

**🔐 Credentials Used**
- **Telegram API** – For sending chat notifications
- **AWS SES** – To send email alerts

**✨ Customization Tips**
- **Change Notification Channels**: Add Slack, Discord, or other preferred channels.
- **Adjust Cron Schedule**: Modify the Cron node to fit your check frequency.
- **Modify Filters**: Detect patch or beta versions by changing the .0 condition.
- **Send Release Notes**: Extend the feed parsing to include release content.

**❓ Questions?**
Template created by Miquel Colomer and n8nhackers.com. Need help customizing or deploying? Contact us for consulting and support.
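For reference, the "Filter by Current Day" step can be expressed as a small n8n Code node. This is a minimal sketch under the assumption that feed items expose `title` and `isoDate` fields, which is typical for the RSS Feed Read node; adjust the field names if your output differs.

```javascript
// Minimal sketch of the "Filter by Current Day" logic in an n8n Code node.
// Assumes feed items expose `title` and `isoDate`; check your RSS node output.
const FOUR_HOURS_MS = 4 * 60 * 60 * 1000;
const now = Date.now();

// Major releases look like "n8n@1.0.0" or "n8n@2.0.0": start with "n8n@", end with ".0".
const majorRelease = /^n8n@\d+\.\d+\.0$/;

return $input.all().filter((item) => {
  const { title, isoDate } = item.json;
  const publishedRecently = now - new Date(isoDate).getTime() <= FOUR_HOURS_MS;
  return publishedRecently && majorRelease.test(title);
});
```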
by n8n Team
This workflow automatically adds a note about a GitHub pull request to the matching Pipedrive contact whenever the PR author's GitHub email matches a Person in Pipedrive.

**Prerequisites**
- Pipedrive account and Pipedrive credentials
- GitHub account and GitHub credentials

**How it works**
1. The GitHub Trigger node activates the workflow when a GitHub user opens a PR.
2. The HTTP Request node fetches the user's data and passes it on.
3. The Pipedrive node searches Pipedrive for the GitHub user's email address.
4. The IF node checks whether a person with that email exists in Pipedrive (see the sketch below).
5. If such a person exists, the Pipedrive node creates a note on the person's profile.
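The IF check comes down to whether the Pipedrive search returned a person whose email equals the GitHub user's email. A minimal sketch of that comparison follows; the field names (`primary_email` on the Pipedrive side) are assumptions, so check the actual node outputs before relying on them.

```javascript
// Minimal sketch of the email-match check. Field names such as
// `primary_email` are assumptions; inspect the actual GitHub and
// Pipedrive node outputs and adjust accordingly.
function findMatchingPerson(githubEmail, pipedrivePersons) {
  const email = (githubEmail || "").toLowerCase();
  return (
    pipedrivePersons.find(
      (p) => (p.primary_email || "").toLowerCase() === email
    ) || null
  );
}

// Example usage with hypothetical data:
const person = findMatchingPerson("dev@example.com", [
  { id: 42, name: "Dev Example", primary_email: "dev@example.com" },
]);

console.log(person ? `Add note to person ${person.id}` : "No match, skip");
```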
by Yang
**Who is this for?**
This workflow is built for newsletter writers, marketers, content creators, or anyone who curates and summarizes web articles. It’s especially helpful for virtual assistants and founders who need to quickly turn web content into digestible, branded newsletters using AI.

**What problem is this workflow solving?**
Manually reading, summarizing, and formatting multiple articles into a newsletter takes time and focus. This workflow automates the process using Dumpling AI for crawling, GPT-4o for summarization, and Gmail for delivery—so you can go from raw URLs to a polished email in minutes.

**What this workflow does**
1. Starts manually (can also be scheduled)
2. Reads a list of article URLs from Google Sheets
3. Sends URLs to Dumpling AI to crawl and extract content
4. Splits each article into a single item for processing
5. Uses a Code node to clean and structure article data (see the sketch at the end of this section)
6. Uses an Edit Fields node to merge articles into one JSON block
7. GPT-4o summarizes and generates HTML content for the newsletter
8. Sends the formatted newsletter via Gmail

**Setup**
- Google Sheets: Create a sheet with a column (A) for article URLs. Update the Read URLs from Google Sheet node to use your Sheet ID and tab name. Connect your Google account in the credentials.
- Dumpling AI: Sign up at https://app.dumplingai.com. Create an agent for web crawling under /crawl. Add your Dumpling API key in the HTTP headers of the Crawl Content with Dumpling AI node.
- Split Node: Breaks apart the array of articles from Dumpling AI so each article is processed individually.
- Code Node: Structures each article as JSON with title, url, and cleaned text content.
- Edit Fields Node: Gathers all structured articles back into a single JSON array to prepare for AI summarization.
- OpenAI (GPT-4o): Processes the article list and returns a formatted subject line and HTML newsletter content.
- Gmail: Connect your Gmail account to send the AI-generated newsletter to your inbox or team. Update the recipient field in the Send HTML Email via Gmail node.

**How to customize this workflow to your needs**
- Replace the manual trigger with a Schedule node to send newsletters weekly
- Modify the GPT-4o prompt to change tone (e.g., more professional, funny, casual)
- Add filtering logic to skip low-value articles
- Connect Slack, Airtable, or Notion for internal team usage
- Change Gmail to SendGrid or Outlook if preferred

**Final Notes**
This workflow uses:
- **Dumpling AI** /crawl endpoint to extract article content
- **Split**, **Code**, and **Edit Fields** nodes to format multi-article input
- **GPT-4o** for summarization and HTML formatting
- **Gmail** for delivery

This setup eliminates manual steps and delivers fast, consistent newsletters powered by AI.
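The Code node from step 5 can be as small as the sketch below, which assumes each incoming item carries `title`, `url`, and raw `content` fields; rename them to match what the Dumpling AI crawl actually returns.

```javascript
// Minimal sketch of the "clean and structure article data" Code node.
// Field names (`title`, `url`, `content`) are assumptions; match them to
// what the Dumpling AI crawl actually returns.
return $input.all().map((item) => {
  const { title = "", url = "", content = "" } = item.json;

  // Strip leftover HTML tags and collapse whitespace for cleaner summarization.
  const cleanedText = content
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim();

  return {
    json: {
      title: title.trim(),
      url,
      text: cleanedText,
    },
  };
});
```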
by Artur
Streamline your accounting by automatically creating QuickBooks Online customers and sales receipts whenever a successful Stripe payment is processed. Ideal for businesses looking to reduce manual data entry and improve accounting efficiency.

**How it works**
1. Trigger: The workflow is triggered when a new successful payment intent event is received from Stripe.
2. Retrieve Customer Data: Fetches customer details from Stripe associated with the payment.
3. Check QuickBooks Customer: Searches QuickBooks Online to see if the customer already exists, using their email address.
4. Create or Use Existing Customer: If the customer doesn't exist in QuickBooks, they are created; otherwise, the existing customer is used.
5. Generate Sales Receipt: A sales receipt is created in QuickBooks Online with payment details, including item descriptions, amounts, and currency.

**Set up steps**
1. Connect Accounts: Authenticate both your QuickBooks Online and Stripe accounts in n8n.
2. Webhook Setup: Configure the Stripe webhook to send payment_intent.succeeded events to this workflow.
3. Test the Workflow: Trigger a test payment in Stripe to validate the integration.
4. Customize Details: Adjust item descriptions or other fields in the QuickBooks sales receipt JSON body as needed (see the sketch below).

This workflow requires basic familiarity with n8n, but setup can be completed in under 15 minutes for most users.
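As a starting point for the "Customize Details" step, here is a rough sketch of building the sales receipt body from the Stripe payment data in an n8n Code node. The `quickbooksCustomerId` field and the placeholder `ItemRef` are assumptions, and the receipt schema should be verified against the QuickBooks Online API reference.

```javascript
// Rough sketch: build a QuickBooks sales receipt body from a Stripe
// payment_intent in an n8n Code node ("Run Once for Each Item" mode).
// `quickbooksCustomerId` is an assumed field set by the customer-lookup step;
// verify the receipt schema against the QuickBooks Online API reference.
const payment = $json;

const salesReceipt = {
  CustomerRef: { value: payment.quickbooksCustomerId },
  CurrencyRef: { value: (payment.currency || "usd").toUpperCase() },
  Line: [
    {
      Amount: payment.amount / 100, // Stripe amounts are in the smallest currency unit
      DetailType: "SalesItemLineDetail",
      Description: payment.description || "Stripe payment",
      SalesItemLineDetail: {
        ItemRef: { value: "1" }, // placeholder item reference; use your own item ID
      },
    },
  ],
};

return { json: salesReceipt };
```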
by Corentin Ribeyre
This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template.

**How it works**
This workflow can be divided into four steps:
1. The workflow initiates with a manual trigger (On clicking ‘execute’).
2. It reads your Google Sheet file.
3. It connects to your Icypeas account.
4. It performs an HTTP request to scan the domains/companies (see the sketch after this list).

**Set up steps**
- You will need a formatted Google Sheet file with company/domain names.
- You will need a working Icypeas account to run the workflow and get your API Key, API Secret and User ID.
- You will need domain/company names to scan them.
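To give a feel for the HTTP request step, the sketch below maps sheet rows to request payloads. The endpoint URL, header names, and body fields are hypothetical placeholders rather than the real Icypeas API, so replace them with the values from your Icypeas dashboard and the official API documentation.

```javascript
// Illustrative sketch only: maps Google Sheet rows to HTTP request payloads.
// ENDPOINT, header names, and body fields are hypothetical placeholders;
// use the real values from the Icypeas API documentation.
const ENDPOINT = "https://app.icypeas.com/api/your-scan-endpoint"; // placeholder
const API_KEY = "your_api_key"; // from your Icypeas account
const USER_ID = "your_user_id"; // from your Icypeas account

return $input.all().map((row) => ({
  json: {
    url: ENDPOINT,
    headers: {
      "Content-Type": "application/json",
      Authorization: API_KEY, // placeholder auth scheme
    },
    body: {
      userId: USER_ID, // placeholder field name
      domainOrCompany: row.json.Company, // column name from your sheet
    },
  },
}));
```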
by Rudi Afandi
**Description**
This n8n workflow enables users to send an image to a Telegram bot and receive the extracted text using Tesseract OCR (via the n8n-nodes-tesseractjs Community Node). It's a quick and straightforward way to convert images into readable text directly through chat.

**How it Works**
1. The workflow listens for new image messages coming in via the Telegram bot.
2. Once an image is received, it downloads the image file from Telegram (which initially arrives as application/octet-stream).
3. The image data, now properly identified, is then sent to the Tesseract OCR node to extract the text (a sketch of this relabeling step follows below).
4. Finally, the recognized text is sent back as a reply to the Telegram user.

**Setup Steps**
1. Install Community Node: Ensure you have installed n8n-nodes-tesseractjs in your n8n instance.
2. Connect Telegram Bot: Configure the Telegram Trigger node with your Telegram bot.
3. Bot Token: Add your Telegram bot token to the Send Message node to send replies.
4. Deploy & Test: Activate (deploy) the workflow and send an image to your Telegram bot to test.
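Since Telegram delivers the file as application/octet-stream, the image is relabeled before OCR. A minimal sketch of that step as an n8n Code node, assuming the downloaded file sits in the default `data` binary property; adjust the property name if your Telegram node uses another one.

```javascript
// Minimal sketch: relabel the downloaded Telegram file as a JPEG so the
// Tesseract node treats it as an image. Assumes the binary lives under the
// default `data` property; adjust if your Telegram node uses another name.
const items = $input.all();

for (const item of items) {
  if (item.binary && item.binary.data) {
    item.binary.data.mimeType = "image/jpeg";
    item.binary.data.fileExtension = "jpg";
    item.binary.data.fileName = item.binary.data.fileName || "photo.jpg";
  }
}

return items;
```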
by Emad
This workflow automatically sends you a list of your daily meetings every morning via a Telegram bot.

**Use Cases:**
This workflow is useful for anyone who wants to be automatically informed of their daily meetings, especially busy professionals, students, and anyone with a hectic schedule.

**Setup:**
- Google Calendar connected to n8n
- A Telegram bot created and connected to n8n
- Your Telegram user ID specified

**Notes:**
- You need to replace the placeholder in the Telegram node with your actual Telegram user ID.
- You can customize the formatting of the Telegram message in the JavaScript Code node (a sketch of that formatting follows below).
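A minimal sketch of what that Code node might contain, assuming the Google Calendar node outputs events with `summary`, `start.dateTime`, and `end.dateTime` fields (the usual Google Calendar event shape):

```javascript
// Minimal sketch: turn Google Calendar events into one Telegram message.
// Assumes each item has `summary`, `start.dateTime`, and `end.dateTime`,
// which is the usual Google Calendar event shape.
const events = $input.all().map((item) => item.json);

const formatTime = (d) =>
  d.toLocaleTimeString("en-US", { hour: "2-digit", minute: "2-digit" });

const lines = events.map((event) => {
  const start = new Date(event.start.dateTime || event.start.date);
  const end = new Date(event.end.dateTime || event.end.date);
  return `• ${formatTime(start)}-${formatTime(end)}  ${event.summary || "(no title)"}`;
});

const message = lines.length
  ? `📅 Your meetings today:\n${lines.join("\n")}`
  : "📅 No meetings today. Enjoy the free time!";

return [{ json: { message } }];
```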
by Sarfaraz Muhammad Sajib
**What this workflow does**
This workflow helps HR teams screen CVs with AI, store compatibility ratings in Google Sheets, and send email notifications to candidates and HR. It simplifies the recruitment process.

1. CV Submission Form: Candidates submit their details and CV (PDF) through a web form, triggering the workflow in n8n.
2. PDF Extraction & AI Rating: The submitted CV is processed to extract text, and AI analyzes it to generate a compatibility rating.
3. Results Storage & Notifications: Ratings are stored in a Google Sheet for easy access and organization. Confirmation emails are automatically sent to both HR and the candidate.

**Setup**
- Use the provided template to configure your form and connect it to n8n.
- Ensure your Google Sheets and email service integrations are active.

**Customization Instructions:**
- Modify the email template to match your organization’s branding.
- Adjust the AI compatibility rating thresholds based on your requirements (see the sketch below).
- Ensure you have updated the prompt for CV screening.
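One way to express the threshold adjustment is a small Code node after the AI step. The sketch below assumes the AI returns a numeric `rating` between 0 and 100; the field name and the cutoff values are assumptions to adapt to your own prompt and requirements.

```javascript
// Minimal sketch: map an AI compatibility rating to a screening decision.
// The `rating` field and the cutoff values are assumptions; align them with
// the output format of your own CV-screening prompt.
const rating = Number($json.rating) || 0;

let decision;
if (rating >= 75) {
  decision = "shortlist";
} else if (rating >= 50) {
  decision = "review";
} else {
  decision = "reject";
}

return [{ json: { ...$json, rating, decision } }];
```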
by Manu
**How it works**
1. Triggered weekly.
2. Fetches all previous executions of a given workflow.
3. Filters for failures and aggregates them into a single report (see the sketch after this list).
4. Sends the report to a given Telegram chat.

**Set up steps**
1. Create a new n8n API token in the settings panel.
2. Add new n8n credentials in the credentials panel.
3. Add new Telegram credentials in the credentials panel.
4. Select the n8n credentials and select the workflow ID in the "Get all previous executions" node.
5. Select the Telegram credentials and enter the chat ID in the "Telegram" node.
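The filter-and-aggregate step can be a short Code node. A minimal sketch, assuming executions expose `status`, `id`, and `startedAt` fields; verify against the actual output of the "Get all previous executions" node.

```javascript
// Minimal sketch: keep only failed executions and build one report message.
// Assumes executions expose `status`, `id`, and `startedAt`; check the
// actual output of the "Get all previous executions" node.
const executions = $input.all().map((item) => item.json);

const failures = executions.filter((e) => e.status === "error");

const report = failures.length
  ? [
      `⚠️ ${failures.length} failed execution(s) in the last week:`,
      ...failures.map((e) => `- #${e.id} at ${e.startedAt}`),
    ].join("\n")
  : "✅ No failed executions in the last week.";

return [{ json: { report, failureCount: failures.length } }];
```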
by AlQaisi
**Template Information**

**Who is this template for?**
This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data.

🎥 Quick set up video

**How it works**
The template utilizes a series of nodes to fetch email information from LinkedIn profiles. It starts with a Schedule Trigger node that sets the interval for the workflow. The Conditional Check node verifies if certain fields like Name, Gender, Job Title, Summary, and LinkedIn URL are not empty. The HTTP Request node sends a POST request to the specified URL with the API key and profile information. The Data Merge node merges the data collected. The Field Editing node modifies the fields as needed. Finally, the Google Sheets Update node updates the Google Sheets with the gathered information.

**Set Up Instructions**
1. Make sure to have the necessary credentials and permissions for accessing LinkedIn and Google Sheets.
2. Set up the API key required for the HTTP Request node.
3. Configure the Google Sheets Update node with the appropriate document ID and sheet name.
4. Check and adjust field mappings in the Field Editing node according to your needs.
5. Run the workflow and monitor the updates in your Google Sheets document.

**Overview:**
The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here.

Copy this template to get started: Google Sheets

**Using Prospeo.io LinkedIn Email Finder API with cURL**
To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, follow these steps. Use the cURL command with the following parameters:

```
curl -X POST \
  -H "Content-Type: application/json" \
  -H "X-KEY: your_api_key" \
  -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
  "https://api.prospeo.io/linkedin-email-finder"
```

- Replace "your_api_key" with your actual API key.
- Update the "url" field in the JSON data with the LinkedIn profile URL for which you want to find the email address.

To get access to this API and obtain your API key, you need to sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once you have subscribed, you will receive an API key that you can use to authenticate your requests to the API endpoint.

**Description:**
- **Schedule Trigger:** Triggers the workflow based on a defined schedule interval, in this case based on minutes. (Schedule Trigger Node Documentation)
- **Google Sheets Read:** Reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. (Google Sheets Node Documentation)
- **Conditional Check:** Checks multiple conditions based on the input data and performs actions accordingly. (Conditional Node Documentation)
- **HTTP Request:** Sends an HTTP POST request to a specified URL with headers and body parameters. (HTTP Request Node Documentation)
- **No Operation, do nothing:** Placeholder node that does not perform any operation.
- **Data Merge:** Merges data based on specified mode and combination settings. (Merge Node Documentation)
- **Field Editing:** Edits fields by setting specific values for each field based on input data. (Set Node Documentation)
- **Google Sheets Update:** Updates data in a Google Sheets document and sheet based on specified columns and values. (Google Sheets Node Documentation)
by Artur
**Overview**
This automated workflow fetches Upwork job postings using Apify, removes duplicate job listings via MongoDB, and sends new job opportunities to Slack.

**Key Features:**
- **Automated job retrieval** from Upwork via Apify API
- **Duplicate filtering** using MongoDB to store only unique jobs
- **Slack notifications** for new job postings
- **Runs every 20 minutes** during working hours (9 AM - 5 PM)

This workflow requires an active Apify subscription to function, as it uses the Apify Upwork API to fetch job listings.

**Who is This For?**
This workflow is ideal for:
- Freelancers looking to track Upwork jobs in real time
- Recruiters automating job collection for analytics
- Developers who want to integrate Upwork job data into their applications

**What Problem Does This Solve?**
Manually checking Upwork for jobs is time-consuming and inefficient. This workflow:
- Automates job discovery based on your keywords
- Filters out duplicate listings, ensuring only new jobs are stored
- Notifies you on Slack when new jobs appear

**How the Workflow Works**
1. Schedule Trigger (Every 20 Minutes): Triggers the workflow at 20-minute intervals and ensures job searches are only executed during working hours (9 AM - 5 PM).
2. Query Upwork for Jobs: Uses the Apify API to scrape Upwork job posts for specific keywords (e.g., "n8n", "Python").
3. Find Existing Jobs in MongoDB: Searches MongoDB to check if a job (based on title and budget) already exists.
4. Filter Out Duplicate Jobs: The Merge Node compares Upwork jobs with MongoDB data, and the IF Node filters out jobs that are already stored in the database (see the sketch at the end of this section).
5. Save Only New Jobs in MongoDB: The Insert Node adds only new job listings to the MongoDB collection.
6. Send a Slack Notification: If a new job is found, a Slack message is sent with job details.

**Setup Guide**

Required API Keys:
- Upwork Scraper (Apify Token) – Get your token from Apify
- MongoDB Credentials – Set up MongoDB in n8n using your connection string
- Slack API Token – Connect Slack to n8n and set the channel ID (default: #general)

Configuration Steps:
1. Modify search keywords in the 'Assign Parameters' node (startUrls)
2. Adjust the Working Hours in the 'If Working Hours' node
3. Set your Slack channel in the Slack node
4. Ensure MongoDB is connected properly

Adjust the 'If Working Hours' node to match your timezone and hours, or remove it altogether to receive notifications and updates constantly.

**How to Customize the Workflow**
- Change keywords: update the startUrls in the 'Assign Parameters' node to track different job categories
- Change 'If Working Hours': Modify conditions in the IF Node to filter times based on your needs
- Modify Slack Notifications: Adjust the Slack message format to include additional job details

**Why Use This Workflow?**
- Automated job tracking without manual searches
- Prevents duplicate entries in MongoDB
- Instant Slack notifications for new job opportunities
- Customizable – adapt the workflow to different job categories

**Next Steps**
- Run the workflow and test with a small set of keywords
- Expand job categories for better coverage
- Enhance notifications by integrating Telegram, Email, or a dashboard

This workflow ensures real-time job tracking, prevents duplicates, and keeps you updated effortlessly.
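The duplicate check in steps 3-5 comes down to comparing each scraped job against what is already stored. Here is a minimal sketch of that comparison as a plain function (usable inside a Code node), keyed on title and budget; the field names are assumptions to adapt to your Apify output and MongoDB documents.

```javascript
// Minimal sketch of the duplicate check. Jobs are keyed on title + budget;
// adjust the field names to your Apify output and MongoDB documents.
function filterNewJobs(scrapedJobs, existingJobs) {
  const seen = new Set(
    existingJobs.map((job) => `${job.title}::${job.budget}`)
  );
  return scrapedJobs.filter(
    (job) => !seen.has(`${job.title}::${job.budget}`)
  );
}

// Example usage with hypothetical data:
const scraped = [
  { title: "Build an n8n workflow", budget: 200 },
  { title: "Python scraper", budget: 150 },
];
const stored = [{ title: "Python scraper", budget: 150 }];

console.log(filterNewJobs(scraped, stored));
// -> [{ title: "Build an n8n workflow", budget: 200 }]
```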