by Yang
**Who is this for?**
This workflow is built for newsletter writers, marketers, content creators, or anyone who curates and summarizes web articles. It's especially helpful for virtual assistants and founders who need to quickly turn web content into digestible, branded newsletters using AI.

**What problem is this workflow solving?**
Manually reading, summarizing, and formatting multiple articles into a newsletter takes time and focus. This workflow automates the process using Dumpling AI for crawling, GPT-4o for summarization, and Gmail for delivery, so you can go from raw URLs to a polished email in minutes.

**What this workflow does**
- Starts manually (can also be scheduled)
- Reads a list of article URLs from Google Sheets
- Sends URLs to Dumpling AI to crawl and extract content
- Splits each article into a single item for processing
- Uses a Code node to clean and structure article data
- Uses an Edit Fields node to merge articles into one JSON block
- GPT-4o summarizes and generates HTML content for the newsletter
- Sends the formatted newsletter via Gmail

**Setup**
- **Google Sheets**
  - Create a sheet with a column (A) for article URLs
  - Update the Read URLs from Google Sheet node to use your Sheet ID and tab name
  - Connect your Google account in the credentials
- **Dumpling AI**
  - Sign up at https://app.dumplingai.com
  - Create an agent for web crawling under /crawl
  - Add your Dumpling API key in the HTTP headers of the Crawl Content with Dumpling AI node
- **Split node** - Breaks apart the array of articles from Dumpling AI so each article is processed individually
- **Code node** - Structures each article as JSON with title, url, and cleaned text content (a sketch appears at the end of this description)
- **Edit Fields node** - Gathers all structured articles back into a single JSON array to prepare for AI summarization
- **OpenAI (GPT-4o)** - Processes the article list and returns a formatted subject line and HTML newsletter content
- **Gmail**
  - Connect your Gmail account to send the AI-generated newsletter to your inbox or team
  - Update the recipient field in the Send HTML Email via Gmail node

**How to customize this workflow to your needs**
- Replace the manual trigger with a Schedule node to send newsletters weekly
- Modify the GPT-4o prompt to change tone (e.g., more professional, funny, casual)
- Add filtering logic to skip low-value articles
- Connect Slack, Airtable, or Notion for internal team usage
- Change Gmail to SendGrid or Outlook if preferred

**Final Notes**
This workflow uses:
- **Dumpling AI** /crawl endpoint to extract article content
- **Split**, **Code**, and **Edit Fields** nodes to format multi-article input
- **GPT-4o** for summarization and HTML formatting
- **Gmail** for delivery

This setup eliminates manual steps and delivers fast, consistent newsletters powered by AI.
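The Code node's contents aren't included above, so the following is only a minimal sketch of what the cleaning/structuring step might look like in an n8n Code node (JavaScript). The Dumpling AI field names (`title`, `url`, `content`) are assumptions about the crawl response, not documented facts - adjust them to match what the Crawl Content node actually returns.

```javascript
// Hypothetical sketch: normalize each crawled article into { title, url, content }.
// Field names on item.json are assumptions about the Dumpling AI response shape.
return $input.all().map((item) => {
  const article = item.json;
  const rawText = article.content || article.text || '';

  return {
    json: {
      title: (article.title || 'Untitled').trim(),
      url: article.url,
      // Strip leftover HTML tags and collapse whitespace before summarization.
      content: rawText.replace(/<[^>]+>/g, ' ').replace(/\s+/g, ' ').trim(),
    },
  };
});
```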
by Yaron Been
**🎤 Audio-to-Insights: Auto Meeting Summarizer**

Transform your meeting recordings into actionable insights automatically. This powerful n8n workflow monitors your Google Drive for new audio files, transcribes them using OpenAI's Whisper, generates intelligent summaries with ChatGPT, and logs everything in Google Sheets - all without lifting a finger.

**🔄 How It Works**
This workflow operates as a seamless 6-step automation pipeline:
- **Step 1: Smart Detection** - The workflow continuously monitors a designated Google Drive folder (polls every minute) for newly uploaded audio files.
- **Step 2: Secure Download** - When a new audio file is detected, the system automatically downloads it from Google Drive for processing.
- **Step 3: AI Transcription** - OpenAI's Whisper converts your audio recording into an accurate text transcription, supporting multiple audio formats.
- **Step 4: Intelligent Summarization** - ChatGPT processes the transcript using a specialized prompt that extracts:
  - Key discussion points and decisions
  - Action items with assigned persons and deadlines
  - Priority levels and follow-up tasks
  - Clean, professional formatting
- **Step 5: Timestamp Generation** - The system automatically adds the current date and formats it consistently for tracking purposes.
- **Step 6: Automated Logging** - The final summary is appended to your Google Sheets document with the date, creating a searchable archive of all meeting insights.

**⚙️ Setup Steps**

*Prerequisites* - Before setting up the workflow, ensure you have:
- Active Google Drive account
- OpenAI API key with credits
- Google Sheets access
- n8n instance (cloud or self-hosted)

*Configuration Steps*

1. **Credential Setup**
   - **Google Drive OAuth2**: Required for folder monitoring and file downloads
   - **OpenAI API Key**: Needed for both transcription (Whisper) and summarization (ChatGPT)
   - **Google Sheets OAuth2**: Essential for writing summaries to your spreadsheet

2. **Google Drive Configuration**
   - Create a dedicated folder in Google Drive for meeting recordings
   - Copy the folder ID from the URL (the long string after /folders/)
   - Update the folderToWatch parameter in the workflow

3. **Google Sheets Preparation**
   - Create a new Google Sheet or use an existing one
   - Ensure it has columns: Date and Meeting Summary
   - Copy the spreadsheet ID from the URL
   - Update the documentId parameter in the workflow

4. **Audio Requirements**
   - **Supported Formats**: MP3, WAV, M4A, MP4
   - **Recommended Size**: Under 100MB for optimal processing
   - **Language**: Optimized for English (customizable for other languages)
   - **Quality**: Clear audio produces better transcriptions
5. **Workflow Activation**
   - Import the workflow JSON into your n8n instance
   - Configure all credential connections
   - Test with a sample audio file
   - Activate the workflow trigger

**🚀 Use Cases**

*Project Management*
- **Team Standup Summaries**: Convert daily standups into actionable task lists
- **Sprint Retrospectives**: Extract improvement points and action items
- **Stakeholder Updates**: Generate concise reports for leadership

*Sales & Customer Success*
- **Discovery Call Notes**: Capture prospect pain points and requirements
- **Demo Follow-ups**: Track questions, objections, and next steps
- **Customer Check-ins**: Monitor satisfaction and expansion opportunities

*Consulting & Professional Services*
- **Client Strategy Sessions**: Document recommendations and implementation plans
- **Requirements Gathering**: Organize complex project specifications
- **Progress Reviews**: Track deliverables and milestone achievements

*HR & Training*
- **Interview Debriefs**: Standardize candidate evaluation notes
- **Training Sessions**: Create searchable knowledge bases
- **Performance Reviews**: Document development plans and goals

*Research & Development*
- **Brainstorming Sessions**: Capture innovative ideas and concepts
- **Technical Reviews**: Log decisions and architectural choices
- **User Research**: Organize feedback and insights systematically

**💡 Advanced Customization Options**

*Enhanced Summarization* - Modify the ChatGPT prompt to focus on specific elements:
- Add speaker identification for multi-person meetings
- Include sentiment analysis for customer calls
- Generate department-specific summaries (technical, sales, legal)
- Extract financial figures and metrics automatically

*Integration Expansions*
- **Slack Integration**: Auto-post summaries to relevant channels
- **Email Notifications**: Send summaries to meeting participants
- **CRM Updates**: Push action items directly to Salesforce/HubSpot
- **Calendar Integration**: Schedule follow-up meetings based on action items

*Quality Improvements*
- **Audio Preprocessing**: Add noise reduction before transcription
- **Multi-language Support**: Configure for international teams
- **Custom Templates**: Create industry-specific summary formats
- **Approval Workflows**: Add human review before final storage

**🛠️ Troubleshooting & Best Practices**

*Common Issues*
- **Large File Processing**: Split recordings over 100MB into smaller segments
- **Poor Audio Quality**: Use noise reduction tools before uploading
- **API Rate Limits**: Implement delay nodes for high-volume usage
- **Formatting Issues**: Adjust ChatGPT prompts for consistent output

*Optimization Tips*
- Upload files in supported formats only
- Ensure a stable internet connection for cloud processing
- Monitor OpenAI API usage and costs
- Regularly back up your Google Sheets data
- Test workflow changes with sample files first

**📊 Expected Outputs**

Sample summary format:

Meeting Summary - March 15, 2024

Key Discussion Points:
- Q1 budget review and allocation decisions
- New product launch timeline and milestones
- Team restructuring and role assignments

Action Items:
- John: Finalize budget proposal by March 20th (High Priority)
- Sarah: Schedule product demo sessions for March 25th
- Team: Submit org chart feedback by March 18th

Decisions Made:
- Approved additional marketing budget of $50K
- Delayed product launch to April 15th for quality assurance
- Promoted Lisa to Senior Developer role

**📞 Questions & Support**

For any questions, customizations, or technical support regarding this workflow:

*📧 Email Support*
- **Primary Contact**: Yaron@nofluff.online
- **Response Time**: Within 24 hours on business days
- **Best For**: Setup questions, customization requests, troubleshooting
*🎥 Learning Resources*
- **YouTube Channel**: https://www.youtube.com/@YaronBeen/videos
  - Step-by-step setup tutorials
  - Advanced customization guides
  - Workflow optimization tips

*🔗 Professional Network*
- **LinkedIn**: https://www.linkedin.com/in/yaronbeen/
  - Connect for ongoing support
  - Share your workflow success stories
  - Get updates on new automation ideas

*💡 What to Include in Your Support Request*
- Describe your specific use case
- Share any error messages or logs
- Mention your n8n version and setup type
- Include sample audio file characteristics (if relevant)

Ready to transform your meeting chaos into organized insights? Download the workflow and start automating your meeting summaries today!
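For reference, here is a minimal sketch of what the date-stamping step (Step 5) and the row written to Google Sheets (Step 6) might look like inside an n8n Code node. The incoming field name (`summary`) and the column names are assumptions - match them to your ChatGPT node's output and your sheet's columns.

```javascript
// Hypothetical sketch of the "Timestamp Generation" step in an n8n Code node.
// The incoming field name (summary) and the output column names are assumptions -
// adjust them to your ChatGPT node output and your Google Sheets columns.
const formattedDate = new Date().toISOString().split('T')[0]; // e.g. "2024-03-15"

return $input.all().map((item) => ({
  json: {
    Date: formattedDate,
    'Meeting Summary': item.json.summary ?? item.json.text ?? '',
  },
}));
```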
by please-open.it
**Intro**
This workflow requires the user to authenticate against an OpenID Connect provider before calling the webhook. If the user is not authenticated, it starts a login flow using Authorization Code with PKCE (https://datatracker.ietf.org/doc/html/rfc7636), a standard way to authenticate users with OpenID Connect. After the user logs in, the webhook is refreshed and reads the user's token from a cookie. With this token, all details about the user are requested through the identity provider's userinfo endpoint.

**How to set up with Keycloak**
Keycloak is an open-source identity and access management solution. Feel free to get a demo realm at https://please-open.it or get your own Keycloak server up and running.
- After creating a realm, go to "Realm Settings" and click on "OpenID Endpoint Configuration". Retrieve the authorization_endpoint, token_endpoint and userinfo_endpoint values, and set those variables in the "Set variables" node.
- In Keycloak, create a new client (name it as you want). Disable client authentication and check only "standard flow".
- At the third step, put the webhook URL in "Valid redirect URIs" and fill "Web origins" with a "+".

You're done: open the webhook and it asks you to authenticate.

**Usage**

*User information*
The userinfo node returns this structure for the logged-in user:

    [
      {
        "sub": "73a6543f-f420-4fa6-9811-209e903c348b",
        "email_verified": true,
        "preferred_username": "mathieu.passenaud@please-open.it",
        "email": "mathieu.passenaud@please-open.it"
      }
    ]

You can use this information in your workflow for custom operations.

*API calls*
The "code" node returns a cookie named "n8n-custom-auth", which holds the access_token returned by the identity provider. This access_token can be used to call APIs connected to the same identity provider (for example, the workflow calls the userinfo API with this token). Example: ask a user to log in with their Google account, then call an API (Gmail, Drive...) with their own token.

**How it works**
We published a blog post about this flow, how it works and how you can use it: https://blog.please-open.it/n8n-openid-client/
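For context on the PKCE part of the flow, this is roughly what generating the code_verifier and the S256 code_challenge looks like in JavaScript (Node.js), following RFC 7636. It is a sketch for orientation, not the template's actual implementation:

```javascript
// Sketch of RFC 7636 PKCE values (code_verifier + S256 code_challenge) in Node.js.
// This mirrors what the login redirect has to produce; it is not the template's code.
const crypto = require('crypto');

const base64url = (buf) =>
  buf.toString('base64').replace(/\+/g, '-').replace(/\//g, '_').replace(/=+$/, '');

// 32 random bytes -> a 43-character verifier, within the 43-128 char range the RFC requires.
const codeVerifier = base64url(crypto.randomBytes(32));

// S256 challenge: BASE64URL(SHA256(verifier)), sent on the authorization request.
const codeChallenge = base64url(crypto.createHash('sha256').update(codeVerifier).digest());

console.log({ codeVerifier, codeChallenge });
```

The verifier stays on the client side of the flow and is only sent later, on the token request, which is what lets a public client (no client secret) prove it initiated the login.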
by Aurélien P.
**📈 Daily Crypto Market Summary Bot (Binance to Telegram)**

This workflow fetches 24h price change data from Binance for selected crypto pairs (BTC/USDC, ETH/USDC, SOL/USDC) every hour using a cron schedule. It performs in-depth analysis - including volatility, volume, bid-ask spread, momentum, and market comparison - then formats a detailed market summary. The final report is sent to a Telegram chat using HTML formatting, highlighting top gainers, losers, and key metrics in a clean, readable layout.

**🔑 Key Features**
- ⏱ Runs every hour (cron: 5 * * * *)
- 🔍 Filters and analyzes major coins: BTC, ETH, SOL
- 📊 Calculates market metrics: volatility, bid-ask spread, momentum, estimated market cap, market average comparison
- 📈 Highlights gainers, losers, and top coins by volume
- ✂️ Splits messages to fit Telegram's 4096-character limit
- 💬 Sends output in rich HTML format to a Telegram group or chat

**🎯 Use Cases**
- ✅ Crypto traders wanting hourly performance insights
- ✅ Telegram groups needing automated market updates
- ✅ Analysts monitoring key coin metrics in real time
- ✅ Bot developers creating crypto dashboards or alerts

**🛠 Technical Details**
- **Data Source**: Binance 24hr ticker API (/api/v3/ticker/24hr)
- **Coins Monitored**: BTCUSDC, ETHUSDC, SOLUSDC (can be expanded)
- **Metrics Calculated**: price change percentage, volatility (high vs. low price), bid-ask spread %, momentum (vs. weighted average), estimated market cap, number of trades, market average movement (a sketch of these calculations appears at the end of this description)
- **Message Format**: HTML with emojis, bold styling, and section headings; messages are auto-split when they exceed Telegram's 4096-char limit
- **Error Handling**: retry on HTTP failure (up to 5 times with a 5s delay); message length is checked and split for Telegram compatibility

**⚙️ Setup Requirements**
- Telegram Bot Token - create a bot via @BotFather on Telegram
- Chat ID - use a personal ID or group chat ID (add the bot to the group)
- n8n instance - either cloud or self-hosted
- (Optional) Modify relevantSymbols in the Function node to track different coins

**🧠 Notes**
This workflow is highly customizable - feel free to modify the analytics, tracked pairs, or formatting. It's a great base for alerting systems or crypto dashboards.

**📷 Example Output (Telegram)**

📊 Crypto Market Summary — 2025-04-20 14:05:05 UTC

🌐 Market Overview (BTC, ETH, SOL)
- Average Change: -1.54%
- 24h Volume: $850,358,765.46
- Most Volatile: SOLUSDC (4.53%)
- Most Liquid: BTCUSDC (0.0000% spread)

💹 Top by Volume
- ETHUSDC: $403,860,356.75 | -1.640%
- SOLUSDC: $279,241,338.60 | -1.706%
- BTCUSDC: $167,257,070.12 | -1.261%

📉 Losers

SOLUSDC
- 🔻 Change: -1.71% (24h)
- 💰 Current: $137.10
- 📊 Range: $135.82 - $141.97
- 📈 Volatility: 4.53%
- 🔄 Volume: 2.01M | $279,241,338.60
- ⚖️ Bid-Ask Spread: 0.0073%
- ⬇️ vs Market Avg: -0.17%
- 🔽 Momentum: -1.42%
- 🔢 Trades: 366,119

ETHUSDC
- 🔻 Change: -1.64% (24h)
- 💰 Current: $1,577.42
- 📊 Range: $1,565.60 - $1,631.98
- 📈 Volatility: 4.24%
- 🔄 Volume: 252.11K | $403,860,356.75
- ⚖️ Bid-Ask Spread: 0.0044%
- ⬇️ vs Market Avg: -0.10%
- 🔽 Momentum: -1.53%
- 🔢 Trades: 596,801

BTCUSDC
- 🔻 Change: -1.26% (24h)
- 💰 Current: $84,336.65
- 📊 Range: $83,963.35 - $85,634.50
- 📈 Volatility: 1.99%
- 🔄 Volume: 1.97K | $167,257,070.12
- ⚖️ Bid-Ask Spread: 0.0000%
- ⭐ vs Market Avg: 0.27%
- 🔽 Momentum: -0.68%
- 🔢 Trades: 124,202
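The Function node's exact code isn't reproduced here, but the metrics map directly onto the fields returned by Binance's /api/v3/ticker/24hr endpoint. Below is a minimal JavaScript sketch of that per-coin math, written in the newer Code-node style (`$input.all()`); the template's Function node may compute the same values slightly differently, so treat the formulas as illustrative.

```javascript
// Sketch of the per-coin metric math, using field names from Binance /api/v3/ticker/24hr.
// Formulas are illustrative - the template's Function node may differ in detail.
const relevantSymbols = ['BTCUSDC', 'ETHUSDC', 'SOLUSDC'];

return $input.all()
  .filter((item) => relevantSymbols.includes(item.json.symbol))
  .map((item) => {
    const t = item.json;
    const last = parseFloat(t.lastPrice);
    const high = parseFloat(t.highPrice);
    const low = parseFloat(t.lowPrice);
    const bid = parseFloat(t.bidPrice);
    const ask = parseFloat(t.askPrice);
    const vwap = parseFloat(t.weightedAvgPrice);

    return {
      json: {
        symbol: t.symbol,
        changePct: parseFloat(t.priceChangePercent),
        volatilityPct: ((high - low) / low) * 100,   // high vs. low price
        spreadPct: ((ask - bid) / ask) * 100,        // bid-ask spread %
        momentumPct: ((last - vwap) / vwap) * 100,   // last price vs. weighted average
        quoteVolume: parseFloat(t.quoteVolume),      // 24h volume in quote currency
        trades: t.count,                             // number of trades
      },
    };
  });
```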
by Nskha
**n8n Creators Template: Creator Profile Stats Updater**

This n8n workflow template automates the process of updating a creator's profile statistics, including total workflows, complex workflows, approved workflows, pending workflows, total nodes, and total views. It uses various nodes to fetch data, process it, and update an SVG file hosted on GitHub so it reflects the latest stats.

**Workflow Overview**
- **Schedule Trigger**: Triggers the workflow execution at specified intervals.
- **Config**: Sets up configuration details like creator username and colors for text, icons, border, and card.
- **Get Workflows**: Fetches workflows associated with the creator from the n8n API.
- **Workflows Data**: Processes the fetched data to calculate various statistics (see the sketch at the end of this description).
- **Get User**: Fetches user details from the n8n API.
- **Download Image**: Downloads the creator's profile image.
- **Extract From File**: Extracts binary data from the downloaded image file.
- **SVG**: Generates an SVG file with the updated stats and visual representation.
- **GitHub**: Commits the updated SVG file to the specified GitHub repository.
- **Final**: Prepares the final data set for further processing or output.
- **Sticky Note**: Provides a visual note or reminder within the workflow editor.

**Embed & Live Preview**
Since the output is an .SVG file, you can host it anywhere and treat it like a normal image: embed it on any site, forum, or page that supports posting images (for example with a standard markdown image tag), or serve it through a CDN with caching.

**Setup Instructions**
- **GitHub Credentials**: Ensure you have GitHub credentials set up in your n8n instance so the workflow can commit changes to your repository.
- **Configure Trigger**: Adjust the Schedule Trigger node to set the desired execution intervals for the workflow.
- **Set Configuration**: Customize the Config node with your GitHub username and preferred aesthetic options for the SVG.
- **Deploy Workflow**: Import the workflow into your n8n instance and deploy it.

**Customization Options**
- **Text and Icon Colors**: Customize the colors used in the SVG by modifying the respective fields in the Config node.
- **Profile Image Size**: Adjust the image size in the Download Image node URL if needed.
- **Commit Messages**: Modify the commit messages in the GitHub nodes to suit your version control conventions (the template uses the $now function to include the current time in the message, which always gives a different commit value).

**Requirements**
- n8n (self-hosted or Cloud version compatible with 2024 releases and up)
- GitHub account and repository
- Basic understanding of n8n workflow configuration

**Support and Contributions**
For support, please refer to the n8n community forum or the official n8n documentation. Contributions to the template are welcome: you're allowed to reuse this workflow and reshare it with edits (like a new design or colors) under your name.
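The "Workflows Data" node isn't reproduced above. As a rough illustration only, aggregating the stats from a list of workflow objects might look like the sketch below in an n8n Code node. Every field name here (`nodes`, `totalViews`, `approved`) is an assumption about the API response shape, not documented behavior - adjust to the actual payload returned by "Get Workflows".

```javascript
// Hypothetical aggregation of creator stats from a list of workflow objects.
// All field names below are assumptions about the response shape of "Get Workflows".
const workflows = $input.all().map((item) => item.json);

const stats = {
  totalWorkflows: workflows.length,
  totalNodes: workflows.reduce((sum, wf) => sum + (wf.nodes?.length ?? 0), 0),
  totalViews: workflows.reduce((sum, wf) => sum + (wf.totalViews ?? 0), 0),
  approvedWorkflows: workflows.filter((wf) => wf.approved === true).length,
  pendingWorkflows: workflows.filter((wf) => wf.approved !== true).length,
  // "Complex" is an arbitrary threshold here: workflows with more than 10 nodes.
  complexWorkflows: workflows.filter((wf) => (wf.nodes?.length ?? 0) > 10).length,
};

return [{ json: stats }];
```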
by Yaron Been
An automated pipeline to collect and analyze investor data from Crunchbase, tracking investment patterns, funding history, and portfolio companies for market analysis and lead generation.

**🚀 What It Does**
- **Investor Profiling**: Collects comprehensive data on investors and VC firms
- **Investment Pattern Analysis**: Tracks funding history and investment preferences
- **Portfolio Monitoring**: Keeps tabs on investor portfolios and new investments
- **Data Enrichment**: Enhances raw data with additional context and metrics

**🎯 Perfect For**
- Startup founders seeking investors
- Market research analysts
- Investment professionals
- Business development teams
- Competitive intelligence

**⚙️ Key Benefits**
- ✅ Comprehensive investor profiles
- ✅ Real-time investment tracking
- ✅ Market trend analysis
- ✅ Data-driven investment decisions
- ✅ Time-saving automation

**🔧 What You Need**
- Crunchbase API access
- n8n instance
- Storage solution (database or spreadsheet)

**📊 Data Points Collected**
- Investor/firm details
- Investment history
- Portfolio companies
- Funding rounds participated in
- Investment focus areas
- Contact information (when available)

**🛠️ Setup & Support**
- Quick setup: deploy in 30 minutes with our step-by-step configuration guide
- 📺 Watch Tutorial
- 💼 Get Expert Support
- 📧 Direct Help

Transform your investor research with automated data collection and analysis. Spend less time gathering data and more time making strategic decisions.
by Jihene
**AI-Agent Code Review for GitHub Pull Requests**

**Description**
This n8n workflow automates the process of reviewing code changes in GitHub pull requests using an OpenAI-powered agent. It connects to your GitHub repo, extracts modified files, analyzes diffs, and uses an AI agent to generate a code review based on your internal code best practices (fed from a Google Sheet). It ends by posting the review as a comment on the PR and tagging it with a visual label like ✅ Reviewed by AI.

**🔧 What It Does**
- Triggered on PR creation
- Extracts code diffs from the PR
- Formats and feeds them into an OpenAI prompt
- Enriches the prompt using a Google Sheet of Swift best practices
- Posts an AI-generated review as a comment on the PR
- Applies a PR label to visually mark reviewed PRs

**✅ Prerequisites**
Before deploying this workflow, ensure you have the following:
- n8n instance (self-hosted or Cloud)
- GitHub repository with PR activity
- **OpenAI API Key** for GPT-4o, GPT-4-turbo, or GPT-3.5
- **GitHub OAuth App** (or PAT) connected to n8n to post comments and access PR diffs
- (Optional) Google Sheets API credentials if using the code best practices lookup node

**⚙️ Setup Instructions**

1. **Import the Workflow**
   - In n8n, click on Workflows → Import from file or JSON
   - Paste or upload the JSON code of this template

2. **Configure Triggers and Connections**
   - 🔁 **GitHub Trigger**
     - Node: PR Trigger
     - Repository: select the GitHub repo(s) to monitor
     - Events: set to pull_request
     - Auth: use GitHub OAuth2 credentials
   - 📥 **HTTP Request**
     - Node: Get file's Diffs from PR
     - No authentication needed; it uses a dynamic path from the trigger
   - 🧠 **OpenAI Model**
     - Node: OpenAI Chat Model
     - Model: select gpt-4o, gpt-4-turbo, or gpt-3.5-turbo
     - Credential: provide your OpenAI API Key
   - 🧑‍💻 **Code Review Agent**
     - Node: Code Review Agent
     - Connected to OpenAI and optionally to tools like Google Sheets
   - 💬 **GitHub Comment Poster**
     - Node: GitHub Robot
     - Uses the GitHub API to post review comments back on the PR
     - Credential: use the agent's GitHub account (OAuth or PAT)
     - Repo: pick your own GitHub repository
   - 🏷️ **PR Labeler (optional)**
     - Node: Add Label to PR
     - Adds the label ReviewedByAI after a successful comment
     - Label: you can customize the label text of your own tag
   - 📊 **Google Sheet Best Practices config (optional)**
     - Connects to a Google Sheet for coding guideline lookups; you can replace Google Sheets with another tool or database
     - First prepare your best practices list with a clear description and good/bad code examples
     - Add all the best practices to your Google Sheet
     - Configure the Code Best Practices node in the template:
       - Credential: use your Google Sheets account via OAuth2
       - URL: add your Google Sheet document URL
       - Sheet: add the name of the best practices sheet
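For orientation, the "Get file's Diffs from PR" step corresponds to GitHub's pull request files endpoint, which returns the filename, status, and unified diff (`patch`) for each modified file. A minimal JavaScript sketch of that call and of flattening the diffs into one block for the OpenAI prompt follows; the owner, repo, and PR number would normally come from the trigger payload, and the exact text shaping is an assumption rather than the template's literal code.

```javascript
// Sketch: fetch a PR's changed files and flatten their diffs into one prompt block.
// GET /repos/{owner}/{repo}/pulls/{pull_number}/files returns filename, status and
// patch per modified file. Owner/repo/number usually come from the PR trigger payload.
async function getPullRequestDiffs(owner, repo, pullNumber, token) {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/pulls/${pullNumber}/files`,
    { headers: { Authorization: `Bearer ${token}`, Accept: 'application/vnd.github+json' } }
  );
  const files = await res.json();

  // Concatenate per-file patches into a single text block for the OpenAI prompt.
  return files
    .filter((f) => f.patch) // binary files have no patch
    .map((f) => `File: ${f.filename} (${f.status})\n${f.patch}`)
    .join('\n\n');
}
```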
by Corentin Ribeyre
This template can be used to verify email addresses with Icypeas. Be sure to have an active account to use this template.

**How it works**
This workflow can be divided into four steps:
1. The workflow initiates with a manual trigger (On clicking 'execute').
2. It reads your Google Sheet file.
3. It connects to your Icypeas account.
4. It performs an HTTP request to scan the domains/companies.

**Set up steps**
- You will need a formatted Google Sheet file with company/domain names.
- You will need a working Icypeas account to run the workflow and get your API Key, API Secret and User ID.
- You will need domain/company names to scan.
by Kees Bosch - Browserflow
**Auto find & invite LinkedIn Leads**

This n8n template automates LinkedIn lead generation by scraping profiles, filtering out existing connections, and sending connection requests - all in a controlled, looped workflow. Ideal for outreach campaigns, recruitment, or lead gen efforts.

**⚠️ Disclaimer – Community Node Notice**
This template uses a verified community node available inside the n8n cloud environment. To use it, go to "Nodes" → search for: Browserflow for Linkedin …and click Install. It's officially verified and accessible directly from n8n cloud. If you wish to run this template locally, go to the settings, click Community Nodes and search for n8n-nodes-browserflow. After installing, you can start using the actions in this node.

**🛠️ How to Use**
1. **Trigger: Manual Start** - Initiates the workflow manually via the "Test workflow" button, giving you full control.
2. **Scrape LinkedIn Profiles** - Uses the Browserflow automation to extract profile links from a LinkedIn search or keyword query.
3. **Split Out Results** - Converts the list of profiles into individual items for single-profile processing.
4. **Loop Through Each Profile** - Ensures each LinkedIn profile is handled one at a time, avoiding simultaneous actions.
5. **Check Existing Connection** - Verifies if you're already connected with the lead on LinkedIn.
6. **Conditional Logic**
   - ✅ Already connected → skip to next profile
   - ❌ Not connected → continue to next step
7. **Send Connection Invite** - Sends a LinkedIn connection request, optionally with a personalized message.

**📦 Requirements**
- n8n (cloud or self-hosted)
- Installed community node: Browserflow for Linkedin
- LinkedIn account
- Valid Browserflow account (you can set up a free 7-day trial at https://browserflow.io)

**⚙️ Setup Instructions**
1. **Install the Browserflow Community Node** - Search "Browserflow for Linkedin" > Install.
2. **Get your API key** - Get your API key at https://browserflow.io.
3. **Set up your Browserflow account** - After registering, set up Browserflow and connect it with LinkedIn using the wizard at https://browserflow.io.
4. **Connect with Browserflow by creating a credential** - Click on the Browserflow actions to set up a connection with Browserflow by adding your API key to a credential.

**🧩 Customization Tips**
- **Targeting**: Adjust the Browserflow actions to scrape specific roles, industries, or locations.
- **Messaging**: You can add a message to the connection invite, but keep in mind that LinkedIn limits the number of messages that can be sent each month. Use variables in the message for personalization (e.g., {firstName}) - see the sketch below.
- **Trigger**: Replace the manual trigger with a cron node for scheduled outreach.
- **Integration**: Combine with CRM tools (e.g., HubSpot, Notion, Airtable) for syncing leads or integrate with AI Agents.
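As an illustration of the {firstName} personalization mentioned above, a small n8n Code node placed before the invite step could fill the placeholder. The profile field name (`firstName`) is an assumption about the scraped data, not Browserflow's documented output:

```javascript
// Hypothetical sketch: fill a {firstName} placeholder in the invite message.
// item.json.firstName is an assumption about the scraped profile fields.
const template = "Hi {firstName}, I came across your profile and would love to connect!";

return $input.all().map((item) => ({
  json: {
    ...item.json,
    inviteMessage: template.replace('{firstName}', item.json.firstName || 'there'),
  },
}));
```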
by Lucas Perret
This workflow enriches new accounts in Pipedrive using the Datagma API by adding data about the ICP (ideal customer profile). Instead of Pipedrive, you can use any other CRM. In this example, ideal buyers are heads of sales/business development.

**Prerequisites**
- Pipedrive account and Pipedrive credentials

**How it works**
1. The Pipedrive Trigger node starts the workflow when a new company is created.
2. The HTTP Request node queries data from Datagma.
3. The Pipedrive node updates the Pipedrive contact with new data from Datagma.
4. The Item Lists node simplifies returned data from Datagma that contains lists (arrays), enabling you to easily modify the structure for further processing without the need to use Function nodes and write custom JavaScript.
5. The IF node identifies whether the lead matches the ICP (see the sketch below for the kind of check involved).
6. The HTTP Request node searches for emails in Datagma.
7. The Set node prepares data for further merging.
8. The Merge node combines data from multiple streams.
9. The Pipedrive node adds a new person in Pipedrive.
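The template does this with an IF node rather than code, but the ICP condition boils down to a job-title check for "head of sales / business development". A rough JavaScript equivalent, with the `jobTitle` field name assumed rather than taken from the Datagma response:

```javascript
// Rough equivalent of the IF node's ICP check: is the contact a head of sales or
// head of business development? The jobTitle field name is an assumption.
const icpKeywords = ['head of sales', 'head of business development'];

return $input.all().map((item) => {
  const title = (item.json.jobTitle || '').toLowerCase();
  return {
    json: {
      ...item.json,
      isIcp: icpKeywords.some((keyword) => title.includes(keyword)),
    },
  };
});
```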
by AlQaisi
**Template Information**

**Who is this template for?**
This template is for users looking to retrieve email information from LinkedIn profiles and update Google Sheets with the collected data. 🎥 quick set up video

**How it works**
The template utilizes a series of nodes to fetch email information from LinkedIn profiles:
- It starts with a Schedule Trigger node that sets the interval for the workflow.
- The Conditional Check node verifies that certain fields like Name, Gender, Job Title, Summary, and LinkedIn URL are not empty.
- The HTTP Request node sends a POST request to the specified URL with the API key and profile information.
- The Data Merge node merges the data collected.
- The Field Editing node modifies the fields as needed.
- Finally, the Google Sheets Update node updates the Google Sheets document with the gathered information.

**Set Up Instructions**
- Make sure to have the necessary credentials and permissions for accessing LinkedIn and Google Sheets.
- Set up the API key required for the HTTP Request node.
- Configure the Google Sheets Update node with the appropriate document ID and sheet name.
- Check and adjust field mappings in the Field Editing node according to your needs.
- Run the workflow and monitor the updates in your Google Sheets document.

**Overview**
The workflow is designed to find contact information for LinkedIn profile URLs stored in a Google Sheet. It involves various nodes for different operations such as making HTTP requests, scheduling triggers, reading from and updating Google Sheets, field editing, data merging, and conditional checks. A video demonstrating the workflow process can be accessed here. Copy this template to get started: Google Sheets

**Using the Prospeo.io LinkedIn Email Finder API with cURL**
To use the API endpoint "https://api.prospeo.io/linkedin-email-finder" with cURL, follow these steps:

Use the cURL command with the following parameters:

    curl -X POST \
      -H "Content-Type: application/json" \
      -H "X-KEY: your_api_key" \
      -d '{ "url": "https://www.linkedin.com/in/john-doe/" }' \
      "https://api.prospeo.io/linkedin-email-finder"

- Replace "your_api_key" with your actual API key.
- Update the "url" field in the JSON data with the LinkedIn profile URL for which you want to find the email address.

To get access to this API and obtain your API key, sign up on the Prospeo platform and subscribe to their LinkedIn email finder service. Once you have subscribed, you will receive an API key that you can use to authenticate your requests to the API endpoint.

**Node Descriptions**
- **Schedule Trigger**: Triggers the workflow based on a defined schedule interval, in this case based on minutes. (Schedule Trigger Node Documentation)
- **Google Sheets Read**: Reads data from a Google Sheets document and sheet based on the provided document ID and sheet name. (Google Sheets Node Documentation)
- **Conditional Check**: Checks multiple conditions based on the input data and performs actions accordingly. (Conditional Node Documentation)
- **HTTP Request**: Sends an HTTP POST request to a specified URL with headers and body parameters. (HTTP Request Node Documentation)
- **No Operation, do nothing**: Placeholder node that does not perform any operation.
- **Data Merge**: Merges data based on specified mode and combination settings. (Merge Node Documentation)
- **Field Editing**: Edits fields by setting specific values for each field based on input data. (Set Node Documentation)
- **Google Sheets Update**: Updates data in a Google Sheets document and sheet based on specified columns and values. (Google Sheets Node Documentation)
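For reference, the same request expressed in JavaScript (for example inside an n8n Code node or any Node.js script); this simply mirrors the cURL call above:

```javascript
// JavaScript equivalent of the cURL call above (Prospeo LinkedIn Email Finder).
// Replace your_api_key and the profile URL just as described for the cURL version.
async function findLinkedInEmail(profileUrl, apiKey) {
  const response = await fetch('https://api.prospeo.io/linkedin-email-finder', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-KEY': apiKey,
    },
    body: JSON.stringify({ url: profileUrl }),
  });
  return response.json();
}

// Example usage:
// const result = await findLinkedInEmail('https://www.linkedin.com/in/john-doe/', 'your_api_key');
```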
by Martijn Smit
This workflow template helps Todoist users get a weekly overview of their completed tasks via email, making it easier to review their past week.

**Why use this workflow?**
Todoist doesn't provide completed-task reports or filters in its built-in reports or n8n app. This workflow solves that by using Todoist's public API to fetch your completed tasks.

**How it works**
- Runs every Friday afternoon (or manually).
- Uses the Todoist public API to retrieve completed tasks.
- Excludes specific projects you set (e.g., a grocery list).
- Sends an email summary, grouping tasks by the day they were completed.

**Set up steps**
1. Copy your Todoist API token (found here).
2. Create a Todoist API credential in n8n.
3. Create an SMTP credential in n8n. Alternatively, use a preferred email service like Brevo, Mailjet, etc.
4. Import this workflow template.
5. In the Get completed tasks via Todoist API step, select your Todoist API credential.
6. In the Send Email step:
   - Select your SMTP credential.
   - Set the sender and recipient email addresses.
7. Run the workflow manually and check your inbox!

**Ignoring specific projects**
If you do not want your grocery list, workouts, or other tasks from specific Todoist projects showing up in your weekly summary, modify the step called Optional: Ignore specific projects and change this line:

    const ignoredProjects = ['2335544024'];

This should be an array with the ID of each project you'd like to ignore. You can find a list of your projects (including their IDs) by visiting this link: https://api.todoist.com/rest/v2/projects
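For anyone curious what the grouping step amounts to, here is a minimal JavaScript sketch of filtering out ignored projects and grouping completed tasks by day. The field names (`project_id`, `completed_at`, `content`) are assumptions about the completed-tasks payload, so check them against the actual API response the workflow receives:

```javascript
// Hypothetical sketch of the summary-building step: drop ignored projects and
// group the remaining completed tasks by the day they were completed.
// project_id, completed_at and content are assumed field names - verify against
// the actual Todoist API response used by the workflow.
const ignoredProjects = ['2335544024'];

const tasks = $input.all().map((item) => item.json);

const byDay = {};
for (const task of tasks) {
  if (ignoredProjects.includes(String(task.project_id))) continue;

  const day = (task.completed_at || '').split('T')[0]; // e.g. "2024-03-15"
  if (!byDay[day]) byDay[day] = [];
  byDay[day].push(task.content);
}

return [{ json: { byDay } }];
```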