by Lorena
This workflow imports data from Google Sheets into a MySQL database.
by David
This workflow extracts the domain from an email address.
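The extraction step can be sketched in a few lines of JavaScript, e.g. inside an n8n Code node. The function name and the null return for malformed input are illustrative choices, not taken from the template itself:

```javascript
// Extract the domain portion of an email address.
// Returns null when the input has no "@" (illustrative choice).
function extractDomain(email) {
  const at = email.lastIndexOf("@");
  return at === -1 ? null : email.slice(at + 1).toLowerCase();
}
```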
by Harshil Agrawal
This workflow allows you to create a screenshot of a website and send it to a telegram channel.
by rrezart
Workflow JSON TO CSV
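The conversion this template performs can be sketched roughly as follows. This is a simplified illustration: it assumes flat objects sharing the same keys, and the quoting handles commas and double quotes but not embedded newlines:

```javascript
// Turn an array of flat JSON objects into CSV text.
// Quoting handles commas and double quotes only (simplified sketch).
function jsonToCsv(rows) {
  if (rows.length === 0) return "";
  const headers = Object.keys(rows[0]);
  const escape = (value) => {
    const s = String(value ?? "");
    return /[",]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const body = rows.map((row) => headers.map((h) => escape(row[h])).join(","));
  return [headers.join(","), ...body].join("\n");
}
```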
by Lorena
This workflow exports the contents of a JSON file to Google Sheets.
by Mauricio Perera
This n8n workflow converts invoices in PDF format into structured, ready-to-use JSON, using AI and an XML transformation, without writing any code.

How it works:
- Upload form: the user uploads a PDF file.
- Text extraction: the PDF content is extracted as plain text.
- XML schema definition: a standard invoice structure is defined with fields such as invoice number, customer and issuer details, items with description, quantity, and price, totals and taxes, and bank account details.
- AI (Gemini): the model rewrites the PDF text into valid XML following the predefined schema.
- XML cleanup: removes extra tags, line breaks, and unnecessary formatting.
- JSON conversion: the XML is transformed into a clean, structured JSON object, ready for integrations, APIs, or storage.

Benefits:
- Transforms unstructured PDFs into normalized JSON data.
- No coding required, only n8n nodes.
- Scales to different invoice formats with minimal adjustments.
- Leverages AI to interpret complex textual content.

Use cases:
- Automating invoice data capture.
- Integration with ERPs, CRMs, or databases.
- Generating financial reports from PDFs.
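The XML cleanup step described above can be illustrated with a short sketch: AI models often wrap their output in Markdown code fences and indentation, which must be stripped before parsing. The exact cleanup rules in the template may differ; this is an assumption for illustration (FENCE is built programmatically so the example contains no literal fence):

```javascript
// Strip Markdown fence lines and inter-tag whitespace from AI-generated XML.
const FENCE = "`".repeat(3); // a literal fence would break this listing

function cleanXml(raw) {
  return raw
    .split("\n")
    .filter((line) => !line.trim().startsWith(FENCE)) // drop fence lines
    .join("\n")
    .replace(/>\s+</g, "><") // remove formatting whitespace between tags
    .trim();
}
```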
by Guilherme Campos
This n8n workflow turns Slack messages into actionable Notion todos using nothing more than a simple emoji reaction. By reacting to any Slack message with the :memo: emoji, the workflow automatically captures the message, extracts its content, and creates a new Notion to_do item with a link back to the original message. A daily scheduled Slack message then reminds you of any tasks left unchecked in Notion. Perfect for async teams who live in Slack but organize work in Notion, this template helps you bridge the gap between communication and execution without switching tools or relying on memory.

Who's it for:
- Teams that use Slack and Notion daily
- Product managers, leads, and async-first teams who want quick capture of action items
- Anyone tired of copy-pasting Slack messages into Notion manually

How it works:
- Emoji-triggered Notion capture: the workflow listens for the reaction_added event from Slack. When a user reacts to a message with :memo:, it fetches the full message content and permalink, then creates a to_do block in Notion with the message and a direct link to the original Slack thread.
- Daily Slack reminder: every day at 08:00, the workflow scans all to_do blocks in a designated Notion page, keeps only those that are still unchecked, aggregates them, and sends a single Slack message to the user with the list of open todos.

How to set up:
- Connect your Slack account and configure the trigger to watch for the :memo: reaction.
- Connect your Notion account and select a page where todos should be created.
- Customize the schedule time if needed (default is 08:00).
- (Optional) Set up the final Slack message node to send reminders to yourself or a specific team channel.

Requirements:
- A Slack app with permission to read messages and reactions
- A Notion integration with access to edit the target page
- A Notion page with blocks enabled (or create one manually)

How to customize the workflow:
- Change the emoji from :memo: to another (e.g. :fire: for urgent, :idea: for brainstorms)
- Add logic to assign Notion tasks to specific team members
- Use Slack threads, tags, or message metadata to auto-categorize tasks
- Modify the daily reminder to include due dates, priorities, or Slack buttons to check off tasks
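The daily-reminder filtering step can be sketched like this. The block shape mirrors Notion's to_do block objects (type, to_do.checked, to_do.rich_text with plain_text), but treat the field names and message wording here as assumptions for illustration:

```javascript
// Keep only unchecked to_do blocks and build a single Slack reminder message.
function openTodosMessage(blocks) {
  const open = blocks.filter((b) => b.type === "to_do" && !b.to_do.checked);
  const lines = open.map(
    (b) => "- " + b.to_do.rich_text.map((t) => t.plain_text).join("")
  );
  return lines.length === 0
    ? "No open todos today."
    : "You have " + lines.length + " open todo(s):\n" + lines.join("\n");
}
```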
by TUMUSIME David
Who's it for:
This template is for sales teams, marketing professionals, recruiters, and business development specialists who need to efficiently collect targeted lead data from multiple sources. Whether you're building prospect lists, conducting market research, or expanding your contact database, this automated solution saves hours of manual data collection.

How it works:
The workflow uses an n8n built-in form to collect search parameters, then automatically scrapes targeted leads from Google Maps and/or LinkedIn via Apify Actors based on your criteria. All collected data is cleaned and structured using Edit Fields (Set) nodes, then stored in your Supabase database for easy access and analysis.

Key features:
- Interactive form for easy lead targeting
- Dual-source scraping (Google Maps, LinkedIn, or both)
- Automatic data collection, cleaning, and structuring from Apify actors
- Direct database storage in Supabase
- Flexible result limits and location targeting

How to set up:

Step 1: Set up an Apify account
- Create an account at apify.com
- Navigate to Settings > Integrations > API tokens
- Create a new API token and copy it
- Ensure you have access to these actors: LinkedIn Profile Search Scraper (harvestapi/linkedin-profile-search) and Google Maps Scraper (compass/crawler-google-places)

Step 2: Configure the Supabase database
- Create a new project at supabase.com
- Go to SQL Editor in your Supabase dashboard
- Run the SQL scripts from the Requirements section to create the tables
- Navigate to Settings > API and copy the Project URL (starts with https://...) and the service role key (the secret key, not the anon public key)

Step 3: Import and configure the workflow
- Download the workflow JSON file
- In n8n Cloud, go to Workflows > Import from File and select the downloaded JSON file
- Configure credentials by clicking on each node that shows a warning:
- For the Apify nodes (linkedin_dataset and googlemaps_dataset): click "Create New Credential", name it "Apify account", paste your Apify token from Step 1, and save.
- For the Supabase nodes (save_linkedin and save_googlemaps): click "Create New Credential", name it "Supabase account", enter your Supabase project URL from Step 2 as the host and your service role key as the Service Role Secret, and save.

Step 4: Test the workflow
- Click on the "On form submission" node and copy the Production URL (webhook link)
- Open this URL in a new browser tab and fill out the test form: Title/Industry: "restaurants", Location: "San Francisco, CA", Source: "Google Maps", Number of results: "3"
- Click Submit and wait for processing
- Check your Supabase tables to verify the data was saved

Step 5: Verify setup success
- In Supabase, go to Table Editor
- Check the googlemaps table for new entries and verify all fields are populated correctly
- Test with the LinkedIn source to confirm both paths work

The workflow is designed to be plug-and-play once credentials are configured. Simply share the form URL with your team or bookmark it for regular lead generation tasks.

Requirements:

External services:
- Apify account with credits (for web scraping)
- Supabase account (for data storage)

Supabase table schemas:
- LinkedIn table (LinkedIn): publicidentifier (text), linkedinurl (text), name (text), headline (text), about (text), premium (boolean), verified (boolean), openprofile (boolean), topskills (text), connectionscount (integer), followercount (integer), latest_experience (text), education (text)
- Google Maps table (googlemaps): title (text), category_name (text), address (text), neighborhood (text), street (text), city (text), postal_code (text), state (text), country_code (text), website (text), phone (text), phone_unformatted (text), location (text), total_score (numeric)

Node requirements:
- Form Trigger (built-in); you may use others, or a webhook
- Switch (built-in)
- Set (built-in)
- Supabase (built-in)
- Apify (community node); you'll have to install it first

How to customize the workflow:

Form customization:
- Modify the form fields in the "On form submission" node to add additional search criteria
- Adjust the dropdown options for different lead sources
- Add validation rules for better data quality

Search parameters:
- Customize the Apify actor configurations in the dataset nodes for different scraping behaviours
- Adjust the search query formats for better targeting
- Modify the number of results per search and add additional location targeting options

Data processing:
- Edit the Set nodes to extract additional fields (e.g. more LinkedIn profile data)
- Add data validation and cleaning steps
- Implement duplicate detection logic

Storage options:
- Replace Supabase with other databases (Airtable, PostgreSQL, Google Sheets, etc.)
- Add data enrichment steps before storage and data export to CSV/Excel
- Implement real-time notifications

Advanced features:
- Add email notifications when scraping completes
- Implement error handling and retry logic
- Create data quality scoring mechanisms

Support:
- Check the n8n community forum for help
- Review the Apify actor documentation

Let's connect: TUMUSIME David
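The duplicate-detection customization mentioned above could be sketched like this, deduplicating scraped leads by a stable key before the Supabase insert. The linkedinurl field comes from the table schema in the Requirements; the helper itself is illustrative, not part of the template:

```javascript
// Drop leads whose key field is empty or already seen (case-insensitive).
function dedupeLeads(leads, key = "linkedinurl") {
  const seen = new Set();
  return leads.filter((lead) => {
    const id = String(lead[key] || "").toLowerCase();
    if (id === "" || seen.has(id)) return false;
    seen.add(id);
    return true;
  });
}
```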
by jason
Workflow to clean out unwanted email from Gmail.
by amudhan
Companion workflow for the Spreadsheet File node docs.
by Jan Oberhauser
Read an XLS file, convert it to JSON, and insert it into Postgres.
by Jan Oberhauser
Read data from Postgres, convert it to XLS, and save it to disk.