by dev
Every 10 minutes, look at the published news in your Tiny Tiny RSS public feed and make a toot on your Mastodon account. You'll need: your Mastodon instance URL, your Mastodon access token, and your Tiny Tiny RSS public published feed URL.
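The posting step boils down to building a status string and sending it to the Mastodon API (POST to `{instanceUrl}/api/v1/statuses` with the access token as a Bearer header). A minimal sketch of the text-building part, assuming the feed item exposes `title` and `link` fields (field names are illustrative, not taken from the workflow):

```javascript
// Build the toot text for one RSS item. Mastodon's default status
// limit is 500 characters, so the title is truncated to guarantee the
// link always fits on its own line.
function buildToot(title, link, maxLen = 500) {
  const suffix = `\n${link}`;
  const room = maxLen - suffix.length;
  const text = title.length > room ? title.slice(0, room - 1) + '…' : title;
  return text + suffix;
}
```

The resulting string would be sent as the `status` field of the request body.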
by Harshil Agrawal
This workflow demonstrates the use of static data in n8n. The workflow is built on the concept of polling. Cron node: The Cron node triggers the workflow every minute. You can configure the time based on your use case. HTTP Request node: This node makes an HTTP request to an API that returns the position of the ISS. Set node: In the Set node we set the information that we need in the workflow. Since we only need the timestamp, latitude, and longitude, we set these in the node. If you need other information, you can set it here as well. Function node: The Function node checks whether the incoming data is the same as the data returned in the previous execution. If the data is different, the Function node returns this new data; otherwise, it returns the message 'No New Items'. The data is also stored as static data with the workflow. Based on your use case, you can build the workflow further. For example, you can use it to send updates to Mattermost or Slack.
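The Function node's dedupe logic can be sketched as follows. In n8n the persistent store would come from `this.getWorkflowStaticData('node')`; here it is a plain object so the logic runs standalone (a sketch, not the workflow's exact code):

```javascript
// Compare the new ISS position against the one remembered in workflow
// static data; only pass it on when it changed.
function checkNewPosition(staticData, position) {
  const { latitude, longitude } = position;
  if (staticData.latitude === latitude && staticData.longitude === longitude) {
    return { json: { message: 'No New Items' } };
  }
  // Remember the latest position for the next execution.
  staticData.latitude = latitude;
  staticData.longitude = longitude;
  return { json: position };
}
```

Because static data persists between executions, the workflow only emits an item when the ISS has actually moved since the last poll.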
by Jonathan
Task: Make sure that data is in the right format before injecting it into a database/spreadsheet/CRM/etc. Why: Spreadsheets and databases require the incoming data to have the same fields as the headers of the destination table. You can decide which fields you would like to send to the database and rename them using the Set node. Main use cases: change field names to match a database or spreadsheet table structure; keep only the fields that are needed at the destination table.
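What the Set node does here, sketched as a plain function (the source and target field names below are made-up examples, not taken from the template):

```javascript
// Keep only the fields the destination table needs and rename them to
// match its column headers.
function renameFields(item) {
  return {
    first_name: item.firstName,
    last_name: item.lastName,
    email: item.emailAddress,
    // Any other incoming fields are simply not copied over, which
    // matches "keep only the fields that are needed at the destination".
  };
}
```

In the actual Set node this mapping is configured in the UI rather than written as code, with "Keep Only Set" enabled to drop the unmapped fields.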
by Eduard
This workflow demonstrates how a CSV file can be automatically imported into an existing MySQL database. Before running the workflow, please make sure you have a file on the server at /home/node/.n8n/concerts-2023.csv with the following content:

Date,Band,ConcertName,Country,City,Location,LocationAddress,
2023-05-28,Ozzy Osbourne,No More Tours 2 - Special Guest: Judas Priest,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-05-08,Elton John,Farewell Yellow Brick Road Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-05-26,Hans Zimmer Live,Europe Tour 2023,Germany,Berlin,Mercedes-Benz Arena Berlin,"Mercedes-Platz 1, 10243 Berlin-Friedrichshain",
2023-07-07,Depeche Mode,Memento Mori World Tour 2023,Germany,Berlin,Olympiastadion Berlin,"Olympischer Platz 3, 14053 Berlin-Charlottenburg",

The detailed process is explained in the tutorial: https://blog.n8n.io/import-csv-into-mysql
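Note that the sample rows contain double-quoted fields with embedded commas (the street addresses), so a naive `split(',')` would mangle them. A minimal sketch of quote-aware line parsing (the workflow itself reads the file with n8n's built-in spreadsheet handling; this only illustrates why quoting matters):

```javascript
// Split one CSV line into fields, keeping commas that appear inside
// double-quoted values. Escaped quotes ("") are not handled; this is
// a sketch, not a full RFC 4180 parser.
function parseCsvLine(line) {
  const fields = [];
  let cur = '';
  let inQuotes = false;
  for (const ch of line) {
    if (ch === '"') inQuotes = !inQuotes;
    else if (ch === ',' && !inQuotes) { fields.push(cur); cur = ''; }
    else cur += ch;
  }
  fields.push(cur);
  return fields;
}
```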
by Eduard
This workflow demonstrates how easy it is to automatically export the results of a SQL query to CSV! Before running the workflow, please make sure you have access to a local or remote MSSQL server with a sample AdventureWorks database. The detailed process is explained in the tutorial: https://blog.n8n.io/sql-export-to-csv/
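The export direction in miniature: serializing query result rows into CSV text, quoting any value that contains a comma, quote, or newline (RFC 4180 style). This is only a sketch of the underlying idea; the workflow uses n8n's Spreadsheet File node for the real conversion:

```javascript
// Turn an array of row objects (as returned by a SQL query node) into
// CSV text with a header line derived from the first row's keys.
function toCsv(rows) {
  const esc = (v) => {
    const s = String(v);
    return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
  };
  const headers = Object.keys(rows[0]);
  const lines = [headers.map(esc).join(',')];
  for (const row of rows) lines.push(headers.map((h) => esc(row[h])).join(','));
  return lines.join('\n');
}
```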
by Tom
This workflow is the opposite of this one. It transforms multiple items, each with one binary object named data, into a single item with multiple binary objects. This can be useful when creating a single .zip archive, for example. It uses the updated Code node instead of the older Function node.
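The core of that Code node can be sketched standalone (a sketch of the technique, with the merged binaries stored under numbered keys; the exact key naming in the template may differ):

```javascript
// Fold many items that each carry one binary property named `data`
// into a single item whose binary object holds them all.
function mergeBinaryItems(items) {
  const binary = {};
  items.forEach((item, i) => {
    binary[`data_${i}`] = item.binary.data;
  });
  // n8n Code nodes return an array of items; here it has length 1.
  return [{ json: {}, binary }];
}
```

A downstream Compression node can then pick up all binary properties of that single item and pack them into one archive.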
by Oleg Ivaniv
If you previously upgraded to n8n version 0.214.3, some of your workflows might have been accidentally rewired in the wrong way. This issue affected nodes with more than one output, such as If, Switch, and Compare Datasets. This workflow helps you identify potentially affected workflows and nodes that you should check. ❗️Please ensure that you run this workflow as the instance owner.❗️
by Yulia
This workflow demonstrates the conversion of a CSV file to Excel format. First, an example CSV file is downloaded via a direct link. The source file is taken from the European Open Data Portal: https://data.europa.eu/data/datasets/veranstaltungsplaetze-potsdam-potsdam?locale=en The binary data is then imported via the Spreadsheet File node and converted to Excel format. N.B. As of n8n version 1.23.0, the Spreadsheet File node has been redesigned and is now called the Convert to File node. Learn more on the release notes page: https://docs.n8n.io/release-notes/#n8n1230
by Lucas Perret
Get recent funding rounds from Crunchbase in Google Sheets, along with 10+ data points (LinkedIn URL, monthly traffic, company size, etc.). You'll be able to: create a custom database, reach out to interesting leads at the right time, and send custom alerts to your tools. This workflow scrapes recent funding rounds from Crunchbase and adds them to Google Sheets. It uses the Piloterr API to get this data with ease. The full guide can be found here: https://lempire.notion.site/Get-recent-fundraising-in-Google-Sheets-dafbbda2635544b4925c4fb04abac8f5?pvs=74
by David Roberts
This workflow allows you to ask questions about data stored in a database using AI. To use it, you'll need an OpenAI API key (although you could also swap in a model from another service). Supported databases: Postgres, MySQL, SQLite. The workflow uses n8n's embedded chat, but you could also modify it to work with a chat service such as Slack, MS Teams, or WhatsApp. Note that to use this template, you need to be on n8n version 1.19.4 or later.
by Ranjan Dailata
1. Who this is for
This workflow is designed for recruiters, HR analytics teams, and data-driven talent-acquisition professionals seeking deeper insights from candidate resumes. It is also valuable for HR tech developers, ATS/CRM engineers, and AI-driven recruitment platforms aiming to automate candidate research, and it helps organizations build predictive hiring models and gain actionable talent intelligence.

2. What problem this workflow solves
Recruiters often face information overload when analyzing candidate resumes: manually reviewing experience, skills, and cultural fit is slow and inconsistent. Traditional scraping tools extract raw data but fail to produce actionable intelligence such as career trajectory, skills alignment, and fit for a role. This workflow solves that by: automating candidate resume data extraction through Decodo; structuring it into the JSON Resume Schema; running deep AI-driven analytics using OpenAI GPT-4o-mini; and delivering comprehensive candidate intelligence ready for ATS/CRM integration or HR dashboards.

3. What this workflow does
This n8n workflow combines Decodo's web scraping with OpenAI GPT-4o-mini to produce advanced recruitment intelligence. Flow breakdown: Manual Trigger – start the workflow manually or schedule it in n8n. Set Input Fields – define the resume URL, location, and job description. Decodo node – scrapes the candidate's profile (experience, skills, education, achievements, etc.). Structured Data Extractor (GPT-4o-mini) – converts the scraped data into a structured JSON Resume Schema.
Advanced Data Mining Engine (GPT-4o-mini) – performs: skills analysis (strengths, gaps, transferable skills); experience intelligence (career trajectory, leadership, project complexity); cultural fit insights (work style, communication style, agility indicators); career trajectory forecasting (promotion trends, growth velocity); and competitive advantage analysis (market positioning, salary expectations). Summarizer node – produces an abstractive and comprehensive AI summary of the candidate profile. Google Sheets node – saves the structured insights automatically into your recruitment intelligence sheet. File Writer node (optional) – writes the JSON report locally for offline storage or integration. The result is a data-enriched candidate intelligence report far beyond what traditional resume parsing provides.

4. Setup
Prerequisites: If you are new to Decodo, please sign up at visit.decodo.com. You also need an n8n account with workflow editor access, Decodo API credentials, an OpenAI API key, and a Google Sheets account connected via OAuth2. Make sure to install the Decodo community node.
Setup steps: Import the workflow JSON into your n8n workspace. Set credentials for the Decodo account, the OpenAI API (GPT-4o-mini), and Google Sheets OAuth2. In the "Set the Input Fields" node, update: url → resume link; geo → candidate region or country; jobDescription → target job description for matching. Ensure the Google Sheet ID and tab name are correct in the "Append or update row in sheet" node. Click Execute Workflow to start.

5. How to customize this workflow
You can adapt this workflow for different recruitment or analytics scenarios: Add sentiment analysis – add another LLM node to perform sentiment analysis on candidate recommendations or feedback notes. Enrich with job board data – use additional Decodo nodes or APIs (Indeed, Glassdoor, etc.) to compare candidate profiles to live job postings.
Add predictive fit scoring – insert a Function node to compute a numerical "fit score" by comparing skill vectors and job requirements. Automate candidate reporting – connect Gmail, Slack, or Notion nodes to automatically send summaries or reports to hiring managers.

6. Summary
The Advanced Resume Intelligence & Data Mining via Decodo + OpenAI GPT-4o-mini workflow transforms traditional candidate sourcing into AI-driven intelligence gathering. It integrates: Decodo → to perform web scraping of the data; GPT-4o-mini → to interpret, analyze, and summarize with context; Google Sheets → to store structured results for real-time analysis. With this system, recruiters and HR analysts can move from data collection to decision intelligence, unlocking faster and smarter talent insights.
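One possible shape for the suggested fit-score Function node is a simple overlap score between the candidate's extracted skills and the skills named in the job description. The field names and the scoring choice below are illustrative assumptions, not part of the original workflow:

```javascript
// Score (0–100) how many of the required skills the candidate has,
// comparing case-insensitively.
function fitScore(candidateSkills, requiredSkills) {
  const have = new Set(candidateSkills.map((s) => s.toLowerCase()));
  const need = requiredSkills.map((s) => s.toLowerCase());
  if (need.length === 0) return 0;
  const matched = need.filter((s) => have.has(s)).length;
  return Math.round((matched / need.length) * 100);
}
```

A more sophisticated variant could compare embedding vectors of skills instead of exact strings, which would also catch near-synonyms.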
by Yaron Been
This workflow provides automated access to the Paigedutcher2 Paige AI model through the Replicate API. It saves you time by eliminating the need to manually interact with AI models, and it integrates generation tasks seamlessly into your other n8n automation workflows.

Overview
This workflow automatically handles the complete generation process using the Paigedutcher2 Paige model. It manages API authentication, parameter configuration, request processing, and result retrieval, with built-in error handling and retry logic for reliable automation.
Model description: "Custom AI model trained on Paige — bold, curvy, confident energy. Think Barbie meets boss. Great for glam, fantasy, seductive, and influencer-style prompts. Use trigger word CharacterPGE to activa...
Key capabilities: specialized AI model with unique capabilities; advanced processing and generation features; custom AI-powered automation tools; artistic style control and customization.
Tools used: n8n – the automation platform that orchestrates the workflow; Replicate API – access to the Paigedutcher2/paige AI model; Paigedutcher2 Paige – the core AI model for generation; built-in error handling – automatic retry logic and comprehensive error management.
How to install: Import the workflow – download the .json file and import it into your n8n instance. Configure the Replicate API – add your Replicate API token to the 'Set API Token' node. Customize parameters – adjust the model parameters in the 'Set Other Parameters' node. Test the workflow – run the workflow with your desired inputs. Integrate – connect this workflow to your existing automation pipelines.
Use cases: specialized processing – handle specific AI tasks and workflows; custom automation – implement unique business logic and processing; data processing – transform and analyze various types of data; AI integration – add AI capabilities to existing systems and workflows.
Connect with me: Website: https://www.nofluff.online YouTube: https://www.youtube.com/@YaronBeen/videos LinkedIn: https://www.linkedin.com/in/yaronbeen/ Get Replicate API access: https://replicate.com (sign up to access powerful AI models)
#n8n #automation #ai #replicate #aiautomation #workflow #nocode #aiprocessing #dataprocessing #machinelearning #artificialintelligence #aitools #digitalart #contentcreation #productivity #innovation