by James Francis
## Overview

Slack quietly released an update to their API that allows developers to build "AI Apps & Agents", a special classification of apps with access to several special capabilities, including:

- Multiple simultaneous chat threads with one user
- A loading "three dots" UI while your agent is thinking
- The option for users to pin your app to their top bar for quick chat access

This workflow demonstrates how to build a Slack agent that takes advantage of all of these features. For a full video walkthrough of this workflow, watch this YouTube tutorial.

## Setup Instructions

All of the steps below are required for this workflow to function properly unless otherwise noted.

### Create a Slack App

1. Visit api.slack.com and click "Your Apps"
2. Create a new app from scratch and follow the setup instructions
3. In the Agents & AI Apps tab, enable the toggle and give your app a brief description
4. In the OAuth & Permissions tab, enable the following bot token scopes: assistant:write, chat:write, channels:read, im:history
5. Install the app into your workspace and grant the requested permissions
6. In your Slack workspace, right-click your app's name in the sidebar, click "View app details", and make note of your app's Channel ID - you'll need this later
7. Copy your app's Bot User OAuth Token - you'll need it to create your n8n credentials
8. In the Event Subscriptions tab, enable events and paste the workflow's PRODUCTION webhook URL (from this workflow's trigger node) into the input
9. In the same tab, under "Subscribe to bot events", select message.im

### Create a Postgres database

To save the chat history and give your agent a working memory, you'll need your own Postgres database. You can use Supabase, Neon, or any other Postgres database provider. Once you've added your database's credentials to n8n, you can select them in the Postgres Chat Memory node. This workflow saves all chat history in a table called chat_histories, but you can name the table whatever you want.

### Create n8n Credentials

You'll need to create the following credentials:

- Slack API: use your Bot User OAuth Token referenced above
- Bearer Auth: use the same Bot User OAuth Token
- Postgres: use the connection string or config from your database provider
- OpenRouter (or any other LLM provider for the agent's model node)

### Wire Everything Up

Now that you've created your Slack app, set up your Postgres database, and created credentials, follow these steps to wire up your workflow:

1. In the "On Message Received" trigger, use your Slack API credential and enter your app's Channel ID in the "Channel To Watch" field
2. In the "Set Thinking Status" node, use your Bearer Auth credential
3. In the "Postgres Chat Memory" node, use your Postgres credential
4. In the "Send Reply" node, use your Slack API credential

## Using the Chatbot

Once you've completed the setup process and added your credentials, you'll have a fully functional Slack chatbot complete with threads, a loading UI, and the ability to pin your app to your workspace's top bar.

## Taking the Next Steps

Now that this skeleton app is in place, it's up to you to add horsepower to the AI agent at the center of it all. Customize the prompts and add whatever tools you'd like. The sky is the limit!

If you have any questions or feedback about this workflow, or would like me to build custom workflows for your business, email me at n8n@paperjam.agency.
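As a reference for the "Set Thinking Status" node above: under the hood it is a single authenticated HTTP call to Slack's assistant.threads.setStatus method. A minimal sketch (Node 18+, inside an async context), where the channel and thread values are hypothetical placeholders that the workflow actually pulls from the incoming message event:

```javascript
// Minimal sketch of the "Set Thinking Status" call.
// channel_id and thread_ts are placeholders; in the workflow they come
// from the message.im event payload.
const response = await fetch("https://slack.com/api/assistant.threads.setStatus", {
  method: "POST",
  headers: {
    "Content-Type": "application/json; charset=utf-8",
    Authorization: `Bearer ${process.env.SLACK_BOT_TOKEN}`, // Bot User OAuth Token
  },
  body: JSON.stringify({
    channel_id: "D0123456789",      // the DM channel with the user
    thread_ts: "1712345678.000100", // thread timestamp from the event
    status: "is thinking...",       // text shown beside the "three dots" UI
  }),
});
console.log(await response.json()); // { ok: true } on success
```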
by Oneclick AI Squad
This n8n template demonstrates how to create a comprehensive marketing automation and booking system that combines Excel-based lead management with voice-powered customer interactions. The system utilizes VAPI for voice communication and Excel/Google Sheets for data management, making it ideal for restaurants seeking to automate marketing campaigns and streamline booking processes through intelligent voice AI technology.

## Good to know

- Voice processing requires an active VAPI subscription with per-minute billing
- Excel operations are handled in real time with immediate data synchronization
- The system can handle multiple simultaneous voice calls and lead processing
- All customer data is stored securely in Excel with proper formatting and validation
- Marketing campaigns can be scheduled and automated based on lead data

## How it works

### Lead Management & Marketing Automation Workflow

1. **New Lead Trigger:** Excel triggers capture new leads when customers are added to the lead management spreadsheet
2. **Lead Preparation:** The system processes and formats lead data, extracting relevant details (name, phone, preferences, booking history)
3. **Campaign Loop:** An automated loop processes multiple leads for batch marketing campaigns
4. **Voice Marketing Call:** VAPI initiates personalized voice calls to leads with tailored restaurant offers and booking invitations
5. **Response Tracking:** All call results and lead responses are logged back to Excel for campaign analysis

### Booking & Order Processing Workflow

1. **Voice Response Capture:** A VAPI webhook triggers when customers respond to marketing calls or make direct booking requests
2. **Response Storage:** Customer responses and booking preferences are immediately saved to Excel sheets
3. **Information Extraction:** The system processes natural language responses to extract booking details such as party size, preferred times, and special requests (a sketch of this step appears at the end of this description)
4. **Calendar Integration:** Booking information is automatically scheduled in restaurant management systems
5. **Confirmation Loop:** Automated follow-up voice messages confirm bookings and provide additional restaurant information

## Excel Sheet Structure

### Lead Management Sheet

| Column | Description |
|--------|-------------|
| lead_id | Unique identifier for each lead |
| customer_name | Customer's full name |
| phone_number | Primary contact number |
| email | Customer email address |
| last_visit_date | Date of last restaurant visit |
| preferred_cuisine | Customer's food preferences |
| party_size_typical | Usual number of guests |
| preferred_time_slot | Preferred dining times |
| marketing_consent | Permission for marketing calls |
| lead_source | How customer was acquired |
| lead_status | Current status (new, contacted, converted, inactive) |
| last_contact_date | Date of last marketing contact |
| notes | Additional customer information |
| created_at | Lead creation timestamp |

### Booking Responses Sheet

| Column | Description |
|--------|-------------|
| response_id | Unique response identifier |
| customer_name | Customer's name from call |
| phone_number | Contact number used for call |
| booking_requested | Whether customer wants to book |
| party_size | Number of guests requested |
| preferred_date | Requested booking date |
| preferred_time | Requested time slot |
| special_requests | Dietary restrictions or special occasions |
| call_duration | Length of VAPI call |
| call_outcome | Result of marketing call |
| follow_up_needed | Whether additional contact is required |
| booking_confirmed | Final booking confirmation status |
| created_at | Response timestamp |

### Campaign Tracking Sheet
| Column | Description |
|--------|-------------|
| campaign_id | Unique campaign identifier |
| campaign_name | Descriptive campaign title |
| target_audience | Lead segments targeted |
| total_leads | Number of leads contacted |
| successful_calls | Calls that connected |
| bookings_generated | Number of bookings from campaign |
| conversion_rate | Percentage of leads converted |
| campaign_cost | Total VAPI usage cost |
| roi | Return on investment |
| start_date | Campaign launch date |
| end_date | Campaign completion date |
| status | Campaign status (active, completed, paused) |

## How to use

1. **Setup:** Import the workflow into your n8n instance and configure VAPI credentials
2. **Excel Configuration:** Set up Excel/Google Sheets with the required sheet structure provided above
3. **Lead Import:** Populate the Lead Management sheet with customer data from various sources
4. **Campaign Setup:** Configure marketing message templates in the VAPI nodes to match your restaurant's branding
5. **Testing:** Test voice commands such as "I'd like to book a table for tonight" or "What are your specials?"
6. **Automation:** Enable triggers to automatically process new leads and schedule marketing campaigns
7. **Monitoring:** Track campaign performance through the Campaign Tracking sheet and adjust strategies accordingly

The system can handle multiple concurrent voice calls and scales with your restaurant's marketing needs.

## Requirements

- **VAPI account** for voice processing and natural language understanding
- **Excel/Google Sheets** for storing lead, booking, and campaign data
- **n8n instance** with Excel/Sheets and VAPI integrations enabled
- **Valid phone numbers** for lead contact and compliance with local calling regulations

## Customising this workflow

- **Multi-location Support:** Adapt the voice AI automation for restaurant chains with location-specific offers
- **Seasonal Campaigns:** Try popular use cases such as holiday promotions, special event marketing, or loyalty program outreach
- **Integration Options:** The workflow can be extended to include CRM integration, SMS follow-ups, and social media campaign coordination
- **Advanced Analytics:** Add nodes for detailed campaign performance analysis and customer segmentation
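As promised above, here is a minimal sketch of the "Information Extraction" step as an n8n Code node (Run Once for All Items). It assumes the VAPI webhook delivers a plain-text `transcript` field; the field names are assumptions, not the template's actual schema:

```javascript
// Hypothetical sketch of the "Information Extraction" step.
// Assumes the VAPI webhook payload includes a plain-text `transcript` field;
// real payloads and field names may differ.
const transcript = $input.first().json.transcript ?? "";

// Naive pattern matching for the booking fields; a production workflow would
// use an LLM step or VAPI's structured outputs instead.
const partySize = transcript.match(/(?:table|party|group)\s+(?:of|for)\s+(\d+)/i)?.[1] ?? null;
const time = transcript.match(/\b(\d{1,2}(?::\d{2})?\s*(?:am|pm))\b/i)?.[1] ?? null;
const wantsBooking = /\b(book|reserve|reservation)\b/i.test(transcript);

return [{
  json: {
    booking_requested: wantsBooking, // maps to the Booking Responses sheet
    party_size: partySize ? Number(partySize) : null,
    preferred_time: time,
    created_at: new Date().toISOString(),
  },
}];
```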
by Max aka Mosheh
## How it works

1. Trigger the workflow manually via the n8n UI.
2. Define key parameters like the image prompt, number of images, size, quality, and model.
3. Send a POST request to OpenAI's image generation API using those inputs.
4. Split the API response to handle multiple images.
5. Convert the base64 image data into downloadable binary files.

## Set up steps

Initial setup takes around 5–10 minutes. You'll need an OpenAI API key, a configured HTTP Request node with credentials, and to customize the prompt/parameter fields in the "Set Variables" node. No advanced config or external services needed.

## Important Note

You have to make sure to complete OpenAI's new verification requirements to use their new image API: https://help.openai.com/en/articles/10910291-api-organization-verification It only takes a few minutes and does not cost any money.
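For orientation, the POST request from step 3 could look like the sketch below (Node 18+), assuming the gpt-image-1 model; the prompt and parameter values are placeholders mirroring the "Set Variables" node:

```javascript
// Sketch of the image-generation request the HTTP Request node performs.
// Endpoint and fields follow OpenAI's images API; parameter values are
// placeholders to replace with your own.
const response = await fetch("https://api.openai.com/v1/images/generations", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-image-1",
    prompt: "A watercolor fox in a misty forest", // your image prompt
    n: 2,              // number of images
    size: "1024x1024", // image size
    quality: "high",   // low | medium | high for gpt-image-1
  }),
});
const { data } = await response.json();
// Each item carries base64 data that the workflow converts to binary files.
data.forEach((img, i) => console.log(`image ${i}: ${img.b64_json.slice(0, 32)}...`));
```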
by Santhej Kallada
## Who is this for?

- Marketers, lead generation agencies, freelancers, consultants, and sales teams who need to collect business leads from Google Maps.
- Small business owners looking to build targeted local business lists.
- Anyone interested in automating web scraping without coding skills.

## What problem is this workflow solving?

Manually scraping business data from Google Maps is time-consuming and repetitive. This automation simplifies the process by:

- Collecting business details based on search terms and location.
- Filtering out irrelevant results.
- Delivering qualified leads directly to your inbox.

## What this workflow does

This workflow automates Google Maps lead scraping using APIFY and sends the gathered leads via email. The steps include:

1. Collecting user input through a simple form (business type, location, recipient email).
2. Sending an HTTP request to APIFY to run a Google Maps scraper (actor); see the sketch at the end of this description.
3. Filtering results to include only businesses with email addresses.
4. Converting results to CSV format.
5. Sending an automated email with the leads as a CSV attachment via Gmail.

## Setup

1. **Create an APIFY account:** Sign up at apify.com (https://apify.com/). You get $5 in free credits (~1,000 leads).
2. **Get your API key:** Copy your API key from APIFY.
3. **Prepare n8n:** Create a new workflow, add an HTTP Request node to interact with APIFY, and configure authentication with your API key.
4. **Customize the form:** Build a simple form inside n8n to collect user inputs: business type, city, country, recipient email.
5. **Filter results:** Use IF and Filter nodes to remove entries without email addresses.
6. **Convert to CSV:** Use a "Spreadsheet File" node to generate a CSV from the filtered leads.
7. **Send email:** Use the Gmail node (or any email node) to send the CSV file to the provided recipient.

## How to customize this workflow to your needs

- Change search parameters to target different business niches or locations.
- Add filters to only include businesses with websites.
- Customize the email subject and body.
- Integrate with CRM or marketing platforms for direct lead injection.
- Expand filtering logic for more refined targeting.

## Notes

This template uses APIFY (a paid service after the free credits). You will need an APIFY API key and a Gmail account (or SMTP credentials) to run this automation. For self-hosted n8n users: ensure you have internet access and proper credentials set up for external HTTP requests.

Want a video tutorial on how to set up this automation? https://www.youtube.com/watch?v=Kz_Gfx7OH6o
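For reference, the APIFY call from step 2 could look like the sketch below (Node 18+). The actor ID and input fields are assumptions based on Apify's popular Google Maps scraper; check the actor's input schema before relying on them:

```javascript
// Sketch of the HTTP request that starts the Apify Google Maps scraper.
// Actor ID and input fields are assumptions; verify against the actor's docs.
const actorId = "compass~crawler-google-places";
const response = await fetch(
  `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${process.env.APIFY_TOKEN}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      searchStringsArray: ["dentist"], // business type from the form
      locationQuery: "Austin, USA",    // city and country from the form
      maxCrawledPlacesPerSearch: 50,
      scrapeContacts: true,            // include emails/websites when available
    }),
  }
);
const leads = await response.json(); // one object per place (name, address, email, ...)
console.log(`${leads.length} places scraped`);
```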
by Bright Data
# 🔍 Glassdoor Job Finder: Bright Data Scraping + Keyword-Based Automation

A comprehensive n8n automation that scrapes Glassdoor job listings using Bright Data's web scraping service based on user-defined keywords, location, and country parameters, then automatically stores the results in Google Sheets.

## 📋 Overview

This workflow provides an automated job search solution that extracts job listings from Glassdoor using form-based inputs and stores organized results in Google Sheets. Perfect for recruiters, job seekers, market research, and competitive analysis.

**Workflow description:** Automates Glassdoor job searches using Bright Data's web scraping capabilities. Users submit keywords, location, and country via a form trigger. The workflow scrapes job listings, extracts company details, ratings, and locations, then automatically stores organized results in Google Sheets for easy analysis and tracking.

## ✨ Key Features

- 🎯 **Form-Based Input:** Simple web form for job type, location, and country
- 🔍 **Glassdoor Integration:** Uses Bright Data's Glassdoor dataset for accurate job data
- 📊 **Smart Data Processing:** Automatically extracts key job information
- 📈 **Google Sheets Storage:** Organized data storage with automatic updates
- 🔄 **Status Monitoring:** Built-in progress tracking and retry logic
- ⚡ **Fast & Reliable:** Professional scraping with error handling
- 🎯 **Keyword Flexibility:** Search any job type with location filters
- 📝 **Structured Output:** Clean, organized job listing data

## 🎯 What This Workflow Does

### Input

- **Job Keywords:** Job title or role (e.g., "Software Engineer", "Marketing Manager")
- **Location:** City or region for job search
- **Country:** Target country for job listings

### Processing

1. Form Submission
2. Data Scraping via Bright Data
3. Status Monitoring
4. Data Extraction
5. Data Processing
6. Sheet Update

### Output Data Points

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title from listing | Senior Software Engineer |
| Company Name | Employer name | Google Inc. |
| Location | Job location | San Francisco, CA |
| Rating | Company rating score | 4.5 |
| Job Link | Direct URL to listing | https://glassdoor.com/job/... |
## 🚀 Setup Instructions

### Prerequisites

- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Glassdoor scraping dataset access
- 5–10 minutes for setup

### Step 1: Import the Workflow

1. Copy the JSON workflow code from the provided file
2. In n8n: Workflows → + Add workflow → Import from JSON
3. Paste the JSON and click Import

### Step 2: Configure Bright Data

1. Set up Bright Data credentials in n8n
2. Ensure access to dataset: gd_lpfbbndm1xnopbrcr0
3. Update API tokens in the "Scrape Job Data", "Check Delivery Status of Snap ID", and "Getting Job Lists" nodes (a sketch of the trigger call appears at the end of this description)

### Step 3: Configure Google Sheets Integration

1. Create a new Google Sheet (e.g., "Glassdoor Job Tracker")
2. Set up Google Sheets OAuth2 credentials in n8n
3. Prepare columns: A: Job Title, B: Company Name, C: Location, D: Rating, E: Job Link

### Step 4: Update Workflow Settings

1. Update the "Update Job List" node with your Sheet ID and credentials
2. Test the form trigger and webhook URL

### Step 5: Test & Activate

1. Submit test data (e.g., "Software Engineer" in "New York")
2. Activate the workflow
3. Verify Google Sheet updates and field extraction

## 📖 Usage Guide

### Submitting Job Searches

1. Navigate to your workflow's webhook URL
2. Fill in: Search Job Type, Location, Country
3. Submit the form

### Reading the Results

- Real-time job listing data
- Company ratings and reviews
- Direct job posting links
- Location-specific results
- Processing timestamps

## 🔧 Customization Options

- **More Data Points:** Add job descriptions, salary, company size, etc.
- **Search Parameters:** Add filters for salary, experience, remote work
- **Data Processing:** Add validation, deduplication, formatting

## 🚨 Troubleshooting

- **Bright Data connection failed:** Check API credentials and dataset access
- **No job data extracted:** Validate search terms and location format
- **Google Sheets permission denied:** Re-authenticate and check sharing
- **Form submission failed:** Check webhook URL and form config
- **Workflow execution failed:** Check logs, add retry logic

### Advanced Troubleshooting

- Check execution logs in n8n
- Test individual nodes
- Verify data formats
- Monitor rate limits
- Add error handling

## 📊 Use Cases & Examples

- **Recruitment Pipeline:** Track job postings, build a talent database
- **Market Research:** Analyze job trends, hiring patterns
- **Career Development:** Monitor opportunities, salary trends
- **Competitive Intelligence:** Track competitor hiring activity

## ⚙️ Advanced Configuration

- **Batch Processing:** Accept multiple keywords, loop logic, delays
- **Search History:** Track trends, compare results over time
- **External Tools:** Integrate with CRM, Slack, databases, BI tools

## 📈 Performance & Limits

- **Single search:** 2–5 minutes
- **Data accuracy:** 95%+
- **Success rate:** 90%+
- **Concurrent searches:** 1–3 (depends on plan)
- **Daily capacity:** 50–200 searches
- **Memory:** ~50MB per execution
- **API calls:** 3 Bright Data + 1 Google Sheets per search

## 🤝 Support & Community

- **n8n Community Forum:** community.n8n.io
- **Documentation:** docs.n8n.io
- **Bright Data Support:** Via your dashboard
- **GitHub Issues:** Report bugs and features

**Contributing:** Share improvements, report issues, create variations, document best practices.

**Need Help?** Check the full documentation or visit the n8n Community for support and workflow examples.
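As referenced in Step 2, here is a sketch of the "Scrape Job Data" request to Bright Data's dataset trigger API (Node 18+). The dataset ID is the one used by this template; the input field names and the discover_by parameter are assumptions to verify against your Bright Data dataset schema:

```javascript
// Sketch of triggering a Bright Data dataset collection.
// Input field names are assumptions; check your dataset's schema.
const response = await fetch(
  "https://api.brightdata.com/datasets/v3/trigger" +
    "?dataset_id=gd_lpfbbndm1xnopbrcr0&type=discover_new&discover_by=keyword",
  {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.BRIGHT_DATA_API_TOKEN}`,
    },
    body: JSON.stringify([
      { keyword: "Software Engineer", location: "New York", country: "US" },
    ]),
  }
);
const { snapshot_id } = await response.json();
// The "Check Delivery Status of Snap ID" node then polls this snapshot until
// it is ready, and "Getting Job Lists" downloads the finished rows.
console.log("snapshot:", snapshot_id);
```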
by Ahmed Alnaqa
## Who is this template for?

This workflow template is designed for content creators, researchers, educators, and professionals who need quick, accurate summaries of YouTube videos. It's ideal for those looking to save time, extract key insights, or repurpose video content into concise formats for reports, studies, or social media.

## What does it do?

The workflow automates the process of summarizing YouTube videos by extracting the transcript, analyzing the content, and generating a concise summary. It leverages AI tools to ensure accuracy and relevance, making it easier to digest lengthy videos in seconds.

## Why is it useful?

This template saves hours of manual effort by automating video summarization, enabling users to focus on analyzing or sharing insights rather than watching entire videos. It's particularly useful for staying updated with trends, conducting research, or creating content efficiently.

## How does it work?

The workflow integrates with a YouTube transcript API powered by an Apify Actor to fetch video transcripts, processes the text using AI-powered summarization tools, and delivers a clear, concise summary.

## Setup Instructions

You need an Apify account and an API key to connect with the Actor. Follow the steps below:

1. Create a free account.
2. Choose the appropriate Actor from the Apify search.
3. Under the Integration tab, click on "Use API endpoints."
4. Select the API that best suits your needs.
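Once you have picked an Actor, the API endpoint from step 4 typically looks like the sketch below (Node 18+). The actor ID and input field are placeholders; substitute the Actor you chose and consult its input schema for the exact field names:

```javascript
// Hypothetical sketch of calling the chosen Apify Actor for a transcript.
// The actor ID and input field are placeholders, not a real recommendation.
const actorId = "someuser~youtube-transcript-scraper"; // placeholder
const res = await fetch(
  `https://api.apify.com/v2/acts/${actorId}/run-sync-get-dataset-items?token=${process.env.APIFY_TOKEN}`,
  {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      videoUrl: "https://www.youtube.com/watch?v=dQw4w9WgXcQ", // video to transcribe
    }),
  }
);
const items = await res.json();
console.log(items[0]); // transcript data to feed into the summarization step
```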
by David Olusola
This plug-and-play n8n workflow automates medical record digitization using Mistral's OCR API and stores clean, structured data in Google Sheets. Whether you run a clinic or a healthtech product, this no-code solution simplifies data entry from scanned or uploaded medical documents.

📌 Works seamlessly on both self-hosted and cloud-based n8n environments.

## 👥 Who is this for?

- Hospitals and private clinics
- Healthtech platforms & startups
- Medical admin and document processing teams
- Clinical researchers and labs

## 😓 What problem does it solve?

- ❌ Manual entry from printed forms
- ❌ Unstructured, scattered records
- ❌ Errors in data transcription
- ❌ Inconsistent document storage

✅ This automation brings consistency, structure, and speed to the way you handle medical documents.

## ✅ What this workflow does

1. Captures uploaded documents through a public form
2. Uploads the file to Mistral for OCR processing
3. Extracts clean text from each page (PDF or image)
4. Parses patient fields (Name, DOB, Diagnosis, Medications, etc.)
5. Saves records into a structured Google Sheet

## 🛠️ Setup Instructions

### Step 1: Google Sheet Prep

Create a Google Sheet with these columns (case-sensitive): Name, Date of Birth, Patient ID, Date of Visit, Referring Physician, Department, Symptoms, Blood Pressure, Heart Rate, Temperature, Lab Results, Diagnosis, Medications, Next Appointment, Notes

### Step 2: Mistral API Access

1. Sign up at Mistral AI
2. Get your API key
3. Ensure your plan supports the file upload & OCR endpoints

### Step 3: Google OAuth Credentials (Self-hosted or Cloud)

Go to n8n → Settings → Credentials, and add Google Sheets OAuth2. Scope needed: https://www.googleapis.com/auth/spreadsheets

### Step 4: Import Workflow

1. Go to Workflows > Import from File
2. Upload your JSON file
3. Replace the Google Sheet document ID in the "Google Sheets" node and your Mistral API key in HTTP Header Auth

### Step 5: (Optional) Make Form Public

In cloud-based n8n you can expose the form as a public page; otherwise, connect it to your website form via webhook.

## 🧩 Customization Tips

**Extract more fields.** Update the "Data cleaning" node and extend the list of fields (see the sketch at the end of this description):

```javascript
const fields = ["Name", "Diagnosis", "Medications", "Symptoms", ...];
```

**Add EHR or database integration.** After Google Sheets, chain your custom system: PostgreSQL, Airtable, Supabase, MongoDB.

**Change output format.** Want JSON or Markdown output for internal tools? Use the Set or Code node before the final output step.

## 🧪 Troubleshooting

| Issue | Fix |
|-------|-----|
| File upload fails | Check Mistral API key and file type |
| Google Sheets not updating | Verify credentials and document ID |
| No data parsed | Check OCR quality; verify field labels in document |
| Workflow not triggering | Ensure webhook or form is configured correctly |

## 🌐 Self-Hosted vs Cloud Comparison

| Feature | Self-Hosted | n8n Cloud |
|---------|-------------|-----------|
| Public Form Access | Manual setup | Built-in |
| OAuth App Config | Required | Pre-configured |
| Storage Limits | Depends on server | Included with plan |
| Scalability | Fully customizable | Scales automatically |

## 📣 Getting Support

- n8n Docs
- Mistral API Docs
- n8n Community
- Or reach out to: David Olusola (dimejicole21@gmail.com)

🌟 Like this template? Give it a star in the template library and help other no-code builders discover it.

"Turn scanned documents into structured data with zero code."
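To make the "Extract more fields" tip concrete, here is a hypothetical sketch of the "Data cleaning" Code node (Run Once for All Items). It assumes the OCR step returns the document as plain text in an `ocrText` field with "Label: value" lines; both the field name and the labels are assumptions to adapt to your forms:

```javascript
// Hypothetical sketch of the "Data cleaning" step.
// `ocrText` and the label list are assumptions; match them to your documents.
const text = $input.first().json.ocrText ?? "";

const fields = ["Name", "Date of Birth", "Patient ID", "Diagnosis", "Medications"];
const record = {};
for (const field of fields) {
  // Capture everything after "Label:" up to the end of the line.
  const match = text.match(new RegExp(`${field}\\s*:\\s*(.+)`, "i"));
  record[field] = match ? match[1].trim() : "";
}

return [{ json: record }]; // one row, ready for the Google Sheets node
```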
by ist00dent
This n8n template enables you to instantly retrieve detailed geolocation information for any given IP address by simply sending a webhook request. Leverage the power of IP-API.com to gain insights into user locations, personalize experiences, or enhance security protocols within your automated workflows.

## 🔧 How it works

1. **Receive IP Webhook:** This node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing an ip property with the IP address you wish to look up.
2. **Get IP Geolocation:** This node makes an HTTP GET request to the IP-API.com service, passing the IP address from your webhook. The API responds with a comprehensive JSON object detailing the IP's location (country, city, region), ISP, organization, and more.
3. **Respond with Geolocation Data:** This node sends the full geolocation data received from IP-API.com back to the service that initiated the webhook.

## 👤 Who is it for?

This workflow is ideal for:

- **Marketing & Sales Teams:** Personalize website content, offers, or ads based on a visitor's geographic location. Tailor email campaigns by region.
- **Customer Support:** Quickly identify a customer's location to provide more localized or relevant assistance.
- **Security & Fraud Detection:** Analyze incoming connection IPs to identify suspicious activity, block known malicious regions, or flag potential fraud.
- **Analytics & Reporting:** Augment your analytics data with geographical insights about your users or traffic.
- **Developers & Integrators:** Easily add IP lookup functionality to custom applications, internal tools, or monitoring systems.
- **Content Delivery Networks (CDNs):** Route users to the closest servers for faster content delivery (though advanced CDNs usually handle this automatically).

## 📑 Data Structure

When you trigger the webhook, send a POST request with a JSON body structured as follows:

```json
{
  "ip": "8.8.8.8" // Replace with the IP address you want to look up
}
```

The workflow will return a JSON response similar to this (data will vary based on IP):

```json
{
  "status": "success",
  "country": "United States",
  "countryCode": "US",
  "region": "VA",
  "regionName": "Virginia",
  "city": "Ashburn",
  "zip": "20149",
  "lat": 39.0437,
  "lon": -77.4875,
  "timezone": "America/New_York",
  "isp": "Google LLC",
  "org": "Google Public DNS",
  "as": "AS15169 Google LLC",
  "query": "8.8.8.8"
}
```

## ⚙️ Setup Instructions

1. **Import Workflow:** In your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
2. **Configure Webhook Path:** Double-click the Receive IP Webhook node. In the 'Path' field, set a unique and descriptive path (e.g., /ip-lookup).
3. **Activate Workflow:** Save and activate the workflow.

## 📝 Tips

This workflow, while simple, is a powerful building block. Here's how you can make it even more useful:

- **Conditional Logic:** Add IF nodes after "Get IP Geolocation" to create conditional branches. For example: if countryCode is 'CN' or 'RU', send an alert to your security team; if city is 'New York', route the request to a specific sales representative.
- **Data Enrichment:** Integrate this workflow into larger automations. For instance: when a new sign-up occurs, pass their IP address to this workflow, then save the returned geolocation data (country, city, ISP) alongside their user profile in your CRM or database. For e-commerce, use the location data to pre-fill shipping fields or suggest local currency/language.
- **Logging & Analytics:** Store the lookup results in a spreadsheet (Google Sheets), database (PostgreSQL, Airtable), or logging service. This can help you track where your users are coming from over time.
- **Rate Limiting:** IP-API.com has rate limits for its free tier. If you anticipate high usage, consider adding a Delay node or implementing a caching mechanism with a Cache node to avoid hitting limits. For heavy use, you might need to upgrade to a paid plan.
- **Dynamic Response:** Instead of returning the full JSON, you could use a Function node to extract only specific pieces of information (e.g., just the country and city) and return a more concise response.
- **Input Validation:** For robust production use, add a Function node after the webhook to validate that the incoming ip value is indeed a valid IP address. If it's not, you can return an error message to the caller. A sketch of such a check follows below.
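Here is a minimal sketch of that input-validation step as an n8n Code node (Run Once for All Items). It checks IPv4 only, so extend it if you also expect IPv6:

```javascript
// Minimal IPv4 validation placed right after the webhook.
// Assumes the webhook item exposes the request body under `body`.
const ip = $input.first().json.body?.ip ?? "";

// Four octets, each 0-255.
const ipv4 = /^(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)(\.(25[0-5]|2[0-4]\d|1\d\d|[1-9]?\d)){3}$/;

if (!ipv4.test(ip)) {
  // Let the Respond node report the problem back to the caller.
  return [{ json: { status: "fail", message: `Invalid IPv4 address: ${ip}` } }];
}
return [{ json: { ip } }];
```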
by Jah coozi
# AI Social Media Content Generator & Scheduler

Transform your social media strategy with AI-powered content generation that creates platform-specific posts in seconds!

## 🚀 What It Does

This workflow uses AI to generate optimized content for multiple social media platforms from a single topic input. Perfect for marketers, content creators, and businesses looking to maintain a consistent social media presence.

## ✨ Key Features

- **Multi-Platform Support:** LinkedIn, Twitter/X, Instagram, Facebook, TikTok
- **AI-Powered Generation:** Uses GPT-4 for creative, engaging content
- **Platform Optimization:** Respects character limits and best practices
- **Hashtag Generation:** Platform-specific hashtag strategies
- **Posting Time Suggestions:** Optimal times for each platform
- **Tone Customization:** Professional, casual, friendly, or custom
- **Multi-Language Support:** Generate content in any language
- **Engagement Predictions:** Estimate reach and engagement
- **Daily Automation:** Schedule automatic content generation
- **Bulk Processing:** Generate content for multiple topics at once

## 📊 Use Cases

- **Marketing Teams:** Streamline content creation across channels
- **Small Businesses:** Maintain a consistent social presence
- **Content Agencies:** Scale content production efficiently
- **Personal Brands:** Build thought leadership
- **E-commerce:** Product launches and promotions

## 🛠️ Setup Instructions

1. **Add OpenAI Credentials:** Get an API key from OpenAI and add it to n8n credentials.
2. **Configure Webhook (Optional):** Set a custom path if needed and enable it for external integrations.
3. **Customize Settings:** Adjust tone and style, set platform preferences, configure the posting schedule.
4. **Test Generation:** Use the example prompts and verify output quality.

## 💡 Example Inputs

- "New product launch - eco-friendly water bottle"
- "Company milestone - 10 years in business"
- "Industry insights - Future of AI in healthcare"
- "Team spotlight - Meet our new developer"
- "Seasonal campaign - Summer sale 50% off"

## 📈 Benefits

- **10x Faster:** Create content in seconds vs hours
- **Consistency:** Maintain brand voice across platforms
- **Engagement:** Platform-optimized for maximum reach
- **Scalability:** Generate unlimited content
- **Cost-Effective:** Reduce content creation costs by 80%

## 🔧 Customization Options

- Custom brand voice training
- Industry-specific content rules
- Competitor analysis integration
- A/B testing capabilities
- Analytics webhook integration
- Auto-posting to platforms
- Image generation add-on
- Translation services

## 🎯 Pro Tips

- Train the AI with your best-performing posts
- Use platform analytics to refine strategies
- Test different tones for audience engagement
- Schedule content during peak hours
- Monitor and iterate based on performance

Start creating engaging social media content today!

**Categories:** Marketing & Growth, Content Creation, Social Media, AI & Automation, Productivity

**Difficulty:** Beginner

**Required Services:** OpenAI API (or compatible LLM), n8n instance, optional social media APIs for auto-posting
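To illustrate how the platform-optimization feature can be enforced after generation, here is a sketch of a per-platform limits table. The numbers are commonly cited public limits and should be treated as assumptions to verify against each platform's current documentation:

```javascript
// Illustrative per-platform limits a post-generation step can enforce.
// Values are assumptions; verify against each platform's current docs.
const platforms = {
  twitter:   { maxChars: 280,  hashtags: 2 },
  linkedin:  { maxChars: 3000, hashtags: 3 },
  instagram: { maxChars: 2200, hashtags: 10 },
  facebook:  { maxChars: 2200, hashtags: 2 },  // practical limit, not the hard cap
  tiktok:    { maxChars: 2200, hashtags: 4 },
};

// Trim an AI draft to fit a platform before publishing.
function fitToPlatform(draft, platform) {
  const { maxChars } = platforms[platform];
  return draft.length <= maxChars ? draft : draft.slice(0, maxChars - 1) + "...";
}

console.log(fitToPlatform("Our eco-friendly water bottle launches today!", "twitter"));
```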
by Teddy
# Retrieve 20 Latest TechCrunch Articles

## Who is this for?

This workflow is designed for developers, content creators, and data analysts who need to scrape recent articles from TechCrunch. It's perfect for anyone looking to aggregate news articles or create custom feeds for analysis, reporting, or integration into other systems.

## What problem is this workflow solving?

This workflow automates the process of scraping recent articles from TechCrunch. Manually collecting article data can be time-consuming and inefficient, but with this workflow, you can quickly gather up-to-date news articles with relevant metadata, saving time and effort.

## What this workflow does

This workflow retrieves the latest 20 news articles from TechCrunch's "Recent" page. It extracts the article URLs, metadata (such as titles and publication dates), and main content for each article, allowing you to access the information you need without any manual effort.

## Setup

1. Clone or download the workflow template.
2. Ensure you have a working n8n environment.
3. Configure the HTTP Request nodes with your desired parameters to connect to TechCrunch.
4. (Optional) Customize the workflow to target specific sections or topics of interest.
5. Run the workflow to retrieve the latest 20 articles.

## How to customize this workflow to your needs

- Modify the HTTP request to pull articles from different pages or sections of TechCrunch.
- Adjust the number of articles to retrieve by changing the selection criteria.
- Add additional processing steps to further filter or analyze the article data.

## Workflow Steps

1. Send an HTTP request to the TechCrunch "Recent" page.
2. Parse the posts box that holds the list of articles.
3. Parse all posts to extract the individual articles.
4. Split out the posts into one item per article (see the sketch at the end of this description).
5. Extract the URL and metadata from each article.
6. Send an HTTP request for each article using its URL.
7. Locate and parse the main content of each article.

Note: Be sure to update the HTTP Request nodes with any necessary headers or authentication to work with TechCrunch's website.
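The "split out" step referenced above could be a simple Code node (Run Once for All Items). The `posts` property and its fields depend on your HTML-extraction selectors and are assumptions:

```javascript
// Hypothetical sketch of splitting extracted posts into individual items.
// `posts` and its fields depend on the upstream HTML Extract node.
const posts = $input.first().json.posts ?? [];

// Emit one n8n item per article so each one gets its own follow-up request.
return posts.map((post) => ({
  json: {
    url: post.url,
    title: post.title,
    publishedAt: post.publishedAt ?? null,
  },
}));
```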
by Dvir Sharon
# Goodreads Quote Extraction with Bright Data and Gemini

This workflow demonstrates how to fetch data specifically from Goodreads web pages using Bright Data and then extract specific information (quotes) from that data using a Google Gemini AI model.

## How it works

1. The workflow is triggered manually.
2. It sends a request to a Bright Data collector to scrape data from a predefined list of Goodreads URLs.
3. The collected text data from Goodreads is then passed to a Google Gemini AI node.
4. The AI node processes the text and extracts quotes based on a specified JSON schema output format (see the sketch at the end of this description).

## Set up steps

Setting up this workflow should take only a few minutes.

1. You will need a Bright Data API key to configure the 'Header Auth' credential.
2. You will need a Google Gemini API key to configure the 'Google Gemini(PaLM) Api account' credential.
3. Ensure the correct Bright Data collector ID is set in the 'Perform Bright Data Web Request' node URL.
4. Make sure the full list of target Goodreads URLs is correctly added to the 'Perform Bright Data Web Request' node's body.
5. Link your created credentials to the respective nodes ('Perform Bright Data Web Request' and 'Quotes Extractor').
6. Keep detailed descriptions for specific node configurations in the sticky notes inside your workflow canvas.
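For illustration, the JSON schema handed to the Gemini node could look like the sketch below; this is an assumed shape for quote extraction, not the template's actual schema:

```javascript
// Assumed structured-output schema for the "Quotes Extractor" step.
const quoteSchema = {
  type: "object",
  properties: {
    quotes: {
      type: "array",
      items: {
        type: "object",
        properties: {
          quote: { type: "string", description: "The quote text" },
          author: { type: "string", description: "Person the quote is attributed to" },
          tags: { type: "array", items: { type: "string" } },
        },
        required: ["quote", "author"],
      },
    },
  },
  required: ["quotes"],
};
console.log(JSON.stringify(quoteSchema, null, 2));
```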
by Miquel Colomer
This n8n workflow template checks for new major releases (tagged with .0) of the n8n project using its official GitHub releases feed. It runs multiple times a day and sends notifications via email and Telegram if a new release is found.

> ⚠️ Note: You must *activate the workflow* to start receiving release notifications.

## 🚀 What It Does

- Monitors the n8n GitHub releases feed
- Detects major versions (e.g., 1.0.0, 2.0.0)
- Sends alert messages via Telegram and email (SES) when a release is published

## ⏰ Scheduling Details

The Cron node checks for new releases three times per day: 10:00, 14:00, and 18:00 server time.

## 🛠️ Step-by-Step Setup

1. **Configure Telegram Bot:** Connect your Telegram bot and specify the chat ID where you want to receive notifications.
2. **Set up AWS SES Credentials:** Use a verified sender email and set up AWS SES credentials in your n8n instance.
3. **Activate the Workflow:** Enable the workflow in your instance to start receiving notifications.
4. **Customize Notification Messages (Optional):** You can modify the email subject, Telegram format, or filter logic.

## 🧠 How It Works: Workflow Overview

1. **Cron Trigger:** Runs the workflow at 10:00, 14:00, and 18:00 daily.
2. **Read RSS Feed:** Pulls data from https://github.com/n8n-io/n8n/releases.atom.
3. **Filter by Current Day:** Filters the feed to releases published in the last 4 hours whose titles start with n8n@ and end with .0.
4. **Condition Check:** Uses a regex to check whether the filter result contains any release data (see the sketch at the end of this description).
5. **Notifications:** If a new major release is found, sends a Telegram message to a specified chat and an email via AWS SES with the release info.

## 📨 Final Output

You'll receive a Telegram message and email when a new major n8n version is released.

## 🔐 Credentials Used

- **Telegram API** – For sending chat notifications
- **AWS SES** – To send email alerts

## ✨ Customization Tips

- **Change Notification Channels:** Add Slack, Discord, or other preferred channels.
- **Adjust Cron Schedule:** Modify the Cron node to fit your check frequency.
- **Modify Filters:** Detect patch or beta versions by changing the .0 condition.
- **Send Release Notes:** Extend the feed parsing to include release content.

## ❓ Questions?

Template created by Miquel Colomer and n8nhackers.com. Need help customizing or deploying? Contact us for consulting and support.
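As referenced in the "Condition Check" step, the filter logic boils down to something like the sketch below, written as an n8n Code node (Run Once for All Items). The `isoDate` and `title` field names follow the RSS Feed Read node's typical output and are assumptions:

```javascript
// Sketch of the major-release filter: keep feed entries published in the
// last 4 hours whose title starts with "n8n@" and ends with ".0".
const FOUR_HOURS = 4 * 60 * 60 * 1000;
const majorRelease = /^n8n@\d+\.\d+\.0$/; // matches "n8n@1.0.0", not "n8n@1.0.1"

const fresh = $input.all().filter((item) => {
  const publishedRecently =
    Date.now() - new Date(item.json.isoDate).getTime() < FOUR_HOURS;
  return publishedRecently && majorRelease.test(item.json.title);
});

return fresh; // an empty result means no notification is sent
```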