by Mark Shcherbakov
Video guide
I prepared a detailed video guide that walks through the whole process of building this workflow.

Who is this for?
This workflow is ideal for developers, data analysts, and business owners who want to enable conversational interactions with their database. It is particularly useful when users need to extract, analyze, or aggregate data without writing SQL queries manually.

What problem does this workflow solve?
Accessing and analyzing database data often requires SQL expertise or dedicated reports, which can be time-consuming. This workflow lets users interact with a database conversationally through an AI-powered agent that dynamically generates SQL queries based on user requests, streamlining data retrieval and analysis.

What this workflow does
This workflow integrates OpenAI with a Supabase database, enabling users to interact with their data via an AI agent. The agent can:
- Retrieve records from the database.
- Extract and analyze JSON data stored in tables.
- Provide summaries, aggregations, or specific data points based on user queries.

Key capabilities:
- Dynamic SQL querying: the agent turns user prompts into SQL queries and executes them against the database.
- JSON structure discovery: the workflow identifies the JSON schema from sample records, enabling the agent to parse and analyze JSON fields effectively.
- Database schema exploration: the agent has tools to retrieve table structures, column details, and relationships for precise query generation (see the sketch below).

Setup
Preparation
1. Create accounts:
   - n8n: for workflow automation.
   - Supabase: for database hosting and management.
   - OpenAI: for the conversational AI agent.
2. Configure the database connection:
   - Set up a PostgreSQL database in Supabase.
   - Use the appropriate credentials (username, password, host, and database name) in your workflow.

n8n workflow
The AI agent is equipped with the following tools:
- Code tool: executes SQL queries based on user input.
- Database schema tool: retrieves a list of all tables in the database, using a predefined SQL query to fetch table definitions, including column names, types, and references.
- Table definition tool: retrieves the list of columns with types for a single table.
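Below is a minimal sketch, in TypeScript with the pg client, of the kind of information_schema query the database schema and table definition tools could run against a Supabase Postgres instance. The Supabase host, credentials, and table name are placeholders, not values taken from the template.

```typescript
// Sketch only: the kind of SQL a "Table Definition" tool could execute.
import { Client } from "pg";

async function describeTable(tableName: string): Promise<void> {
  const client = new Client({
    host: "db.<project-ref>.supabase.co", // hypothetical Supabase host
    port: 5432,
    user: "postgres",
    password: process.env.SUPABASE_DB_PASSWORD,
    database: "postgres",
    ssl: { rejectUnauthorized: false },
  });
  await client.connect();

  // Columns, types, nullability and constraint types for one table,
  // similar to what the agent needs for precise query generation.
  const { rows } = await client.query(
    `SELECT c.column_name,
            c.data_type,
            c.is_nullable,
            tc.constraint_type
       FROM information_schema.columns c
       LEFT JOIN information_schema.key_column_usage kcu
         ON kcu.table_name = c.table_name AND kcu.column_name = c.column_name
       LEFT JOIN information_schema.table_constraints tc
         ON tc.constraint_name = kcu.constraint_name
      WHERE c.table_schema = 'public' AND c.table_name = $1
      ORDER BY c.ordinal_position`,
    [tableName]
  );
  console.table(rows);
  await client.end();
}

describeTable("invoices").catch(console.error); // "invoices" is an example table name
```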
by Jimleuk
This n8n template leverages n8n's multi-form feature to build a two-part job application submission journey that eliminates the need for applicants to re-enter data already found on their CVs/resumes.

How it works
- The application submission process starts with an n8n Form Trigger that accepts CV files as PDFs.
- The PDF is validated using the Text Classifier node to determine whether it is a valid CV; otherwise the applicant is asked to re-upload.
- A Basic LLM node extracts the relevant information from the CV as data capture. A copy of the original job post is included to ensure relevancy.
- The applicant's data is then sent to an ATS for processing. For this demo, Airtable is used because it can attach PDFs to rows.
- Finally, a second Form Trigger serves the actual application form. It is prefilled to save the applicant time while still letting them amend any of the generated application fields (see the sketch below).

How to use
- Make sure to change the redirect URL in the Form Ending node to use the host domain of your n8n instance.

Requirements
- OpenAI for the LLM
- Airtable to capture applicant data

Customising the workflow
- The application form is fairly basic for this demonstration but could be extended to ask more in-depth questions.
- If it fits the job, ask applicants to upload portfolio work and have AI describe or caption it.
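As a rough illustration of the prefill step, here is a TypeScript sketch of how a redirect URL for the second form could be assembled from the extracted CV data. It assumes the application form accepts prefill values as query parameters whose keys match the form field names; the URL, field names, and CV shape are all hypothetical, not the template's actual values.

```typescript
// Sketch of building a prefilled form URL in a Code node (assumed mechanism).
interface ExtractedCv {
  name: string;
  email: string;
  phone: string;
  summary: string;
}

function buildPrefilledFormUrl(baseUrl: string, cv: ExtractedCv): string {
  // Assumes query-parameter prefill where keys match form field labels.
  const params = new URLSearchParams({
    Name: cv.name,
    Email: cv.email,
    Phone: cv.phone,
    Summary: cv.summary,
  });
  return `${baseUrl}?${params.toString()}`;
}

// Example usage with placeholder data:
const url = buildPrefilledFormUrl(
  "https://your-n8n-instance.com/form/job-application-step-2", // hypothetical form URL
  { name: "Jane Doe", email: "jane@example.com", phone: "+1 555 0100", summary: "5 years in QA" }
);
console.log(url);
```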
by Extruct AI
Who's it for
Sales and business development professionals who want to monitor company news, hiring trends, and business signals for their leads.

How it works / What it does
Add a company through the form, and the workflow automatically searches for the latest news, recent hires, company stage, and LinkedIn activity. The results are sent straight to your Google Sheet, helping you stay up to date with your leads and prospects.

How to set up
1. Register for Extruct at www.extruct.ai/.
2. Open the Extruct table template and copy the table ID from the browser's address bar.
3. Make a copy of the Google Sheets template to your Drive.
4. Enter the table ID into the variables node in your n8n flow.
5. Set up Bearer authentication in all HTTP Request nodes using your Extruct API token (see the sketch below).
6. In the Google Sheets node, paste your template link and connect your Google account.
7. Run the flow once to load the mapping fields, then match each output to the correct column.
8. Activate the flow and start adding companies through the form.

Requirements
- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

How to customize the workflow
To track more business development signals, add new columns in both the Extruct table and your Google Sheet, then map them in the Google Sheets node.
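For orientation, here is a TypeScript sketch of the Bearer-authenticated HTTP call each HTTP Request node makes. Only the authentication pattern (an Authorization: Bearer header carrying your Extruct API token) reflects the setup above; the endpoint path and request body are hypothetical placeholders, not Extruct's documented API.

```typescript
// Sketch of a Bearer-authenticated request; endpoint and payload are hypothetical.
const EXTRUCT_API_TOKEN = process.env.EXTRUCT_API_TOKEN ?? "";
const TABLE_ID = "tbl_xxxxxxxx"; // copied from the Extruct table URL

async function addCompanyToTable(companyDomain: string): Promise<unknown> {
  const response = await fetch(
    `https://api.extruct.ai/v1/tables/${TABLE_ID}/rows`, // hypothetical endpoint
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${EXTRUCT_API_TOKEN}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ company: companyDomain }),
    }
  );
  if (!response.ok) throw new Error(`Extruct request failed: ${response.status}`);
  return response.json();
}

addCompanyToTable("example.com").then(console.log).catch(console.error);
```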
by Extruct AI
Who's it for
Sales teams, marketers, and analysts who need quick access to all the social media and public profile links for any company.

How it works / What it does
When you enter a company into the form, this workflow automatically searches for and collects all available links to the company's social media accounts, review sites, and public profiles from sources such as Crunchbase and ZoomInfo. All discovered URLs are added directly to your Google Sheet.

How to set up
1. Create an Extruct account at www.extruct.ai/.
2. Open the Extruct table template, find the table ID in your browser's address bar, and copy it.
3. Make a copy of the provided Google Sheets template to your own Google Drive.
4. In n8n, paste the table ID into the variables node of your flow.
5. Set up Bearer authentication in every HTTP Request node using your Extruct API token (found on the API page in Extruct).
6. In the Google Sheets node, paste the link to your copied template and connect your Google account.
7. Run the flow once to load the fields, then map the output fields to the correct columns in your sheet.
8. Activate the flow and start adding companies via the form.

Requirements
- Extruct account and API token
- Extruct table template
- Google account with Google Sheets

How to customize the workflow
You can add your own columns to the Extruct table and your Google Sheet. Add the new column in both places, then map it in the Google Sheets node in n8n.
by Bao Duy Nguyen
Who is this for?
This template is ideal for developers, DevOps engineers, and automation managers who manage their n8n workflows using GitHub. It helps teams streamline their CI/CD automation by syncing changes from GitHub directly into n8n after a pull request (PR) is merged.

What problem is this workflow solving?
Manually restoring workflows after reviewing and merging code in GitHub can be tedious and error-prone. This workflow automates the restore process, ensuring that any new or updated workflow committed to your GitHub repo is automatically imported into your n8n environment.

What this workflow does
- Triggers when a GitHub pull request is closed and merged.
- Fetches the details of the merge commit.
- Retrieves the list of added and modified workflow files.
- Downloads and decodes each workflow file.
- Creates or updates the corresponding workflow in your n8n instance automatically (see the sketch below).

Setup
- Connect GitHub: use the GitHub Trigger node and configure GitHub API credentials. Note: I'd recommend using a classic GitHub PAT (Personal Access Token) with the repo and admin:repo_hook permission scopes enabled.
- Connect the n8n API: provide your n8n API credentials in the n8n nodes (check the n8n API credentials doc).
- Set repository variables: update github_owner and repo_name in the Define Local Variables node.
- Enable the webhook: make sure your GitHub repository has a webhook for pull_request events pointing to this workflow.

How to customize this workflow to your needs
- Modify filters to handle only certain branches or file paths.
- Add Slack or email notifications to confirm successful imports.
- Insert logging or version tagging for better traceability.
- Extend with conditional logic to test workflows before applying changes.

This automated flow provides a seamless CI/CD loop between GitHub and n8n, empowering teams to manage workflow versioning efficiently and securely.
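A minimal TypeScript sketch of the download, decode, and create-or-update steps is shown below. It assumes a GitHub personal access token and the n8n public REST API (X-N8N-API-KEY header); the owner, repo, and instance URL are placeholders, and the existence check by workflow id is an illustrative simplification rather than the template's exact logic.

```typescript
// Sketch: fetch a workflow JSON file from GitHub and upsert it via the n8n API.
const GH_TOKEN = process.env.GITHUB_TOKEN ?? "";
const N8N_API_KEY = process.env.N8N_API_KEY ?? "";
const N8N_BASE = "https://your-n8n-instance.com/api/v1"; // placeholder instance URL

async function restoreWorkflow(owner: string, repo: string, path: string, ref: string) {
  // 1. Download the committed workflow file (GitHub returns content base64-encoded).
  const ghRes = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/contents/${path}?ref=${ref}`,
    { headers: { Authorization: `Bearer ${GH_TOKEN}`, Accept: "application/vnd.github+json" } }
  );
  const file = await ghRes.json();
  const workflow = JSON.parse(Buffer.from(file.content, "base64").toString("utf8"));

  // 2. Decide between create and update, keyed on the workflow id stored in the file.
  const exists = workflow.id
    ? (await fetch(`${N8N_BASE}/workflows/${workflow.id}`, {
        headers: { "X-N8N-API-KEY": N8N_API_KEY },
      })).ok
    : false;

  const url = exists ? `${N8N_BASE}/workflows/${workflow.id}` : `${N8N_BASE}/workflows`;
  const res = await fetch(url, {
    method: exists ? "PUT" : "POST",
    headers: { "X-N8N-API-KEY": N8N_API_KEY, "Content-Type": "application/json" },
    body: JSON.stringify({
      name: workflow.name,
      nodes: workflow.nodes,
      connections: workflow.connections,
      settings: workflow.settings ?? {},
    }),
  });
  if (!res.ok) throw new Error(`n8n import failed: ${res.status}`);
  return res.json();
}
```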
by Joseph LePage
Transform your local n8n instance into a powerful chat interface using any local, private Ollama model, with zero cloud dependencies ☁️. This workflow creates a structured chat experience that processes messages locally through a language model chain and returns formatted responses 💬.

How it works 🔄
- 💭 Chat messages trigger the workflow
- 🧠 Messages are processed through Llama 3.2 via Ollama (or any other Ollama-compatible model)
- 📊 Responses are formatted as structured JSON (see the sketch below)
- ⚡ Error handling ensures robust operation

Set up steps 🛠️
- 📥 Install n8n and Ollama
- ⚙️ Download the Llama 3.2 model (or another model)
- 🔑 Configure the Ollama API credentials
- ✨ Import and activate the workflow

This template provides a foundation for building AI-powered chat applications while maintaining full control over your data and infrastructure 🚀.
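For reference, here is a TypeScript sketch of the kind of local call the workflow effectively makes: a chat request to Ollama running on its default port, asking the model to reply as JSON. The response schema in the system prompt is illustrative only, not the template's actual output format.

```typescript
// Sketch: chat with a local Ollama model and parse a structured JSON reply.
async function chatLocally(userMessage: string) {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      stream: false,
      format: "json", // ask the model to emit valid JSON
      messages: [
        {
          role: "system",
          content: 'Reply as JSON: {"answer": string, "confidence": "low"|"medium"|"high"}',
        },
        { role: "user", content: userMessage },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // Ollama returns the assistant turn under message.content; parse the JSON string.
  return JSON.parse(data.message.content);
}

chatLocally("What does this workflow do?").then(console.log).catch(console.error);
```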
by Oliver Bardenheier
🛠️ Setup Guide: 'Get OVH Invoices to Google Sheets'
Author: Oliver Bardenheier

Who is this for?
This workflow is for all users who have services (domains, bare metal, VPS, cloud, etc.) with the provider OVH.com (European API). It automatically retrieves invoice data and files and puts the data into a Google spreadsheet for further processing.

What problem is this workflow solving? / Use case
Currently, invoices from OVH do not arrive as an email attachment, only as a link, so the recipient has to be logged in to the OVH account to download the file, which takes even more effort when 2FA is enabled. This workflow retrieves all of the information through the OAuth2 token.

What this workflow does
This workflow automatically retrieves invoice data and files from your OVH.com account and puts the data into a Google spreadsheet for further processing. It also saves each invoice PDF to a (yearly) folder in your Google Drive.

Setup
1. Make a copy of the Google Sheet template.
2. Set the timeframe for the query to your liking in "Query Latest OVH Invoices". You could add an email trigger before it and narrow the frame to a single day.
3. Log in to your OVH account and get your credentials there. Authenticate using the OAuth2 authorization code flow ("Login with OVHcloud SSO"); you need to authorize the OVHcloud API console. If this worked, you'll see the green text "Access Token Received".
4. Head over to the OVH API console to get your token.
5. Set up header auth in the HTTP Request nodes (see the sketch below):
   - Authentication: Generic Credential Type
   - Generic Auth Type: Header Auth
   - Header Auth: your OVH header credentials
   a) In every API call in the console you'll find a curl example; take the data from the line containing: -H "authorization: Bearer eyJhxxxxxxxxxxxxxxxxxxxxxxxxxxxxx......"
   b) Create a new credential in n8n for the header auth. Put authorization in the 'name' field and copy your token, including Bearer, into the value field: 'Bearer eyJhxxxxxxxxxxxxxxxxxxxxxxxxxxxxx......'

How to customize this workflow to your needs
- Add a mail trigger that activates on every incoming invoice mail from OVH.
- Adjust the timeframe to get invoices from a certain period, or remove the time variables completely to get ALL invoices.
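Below is a rough TypeScript sketch of the two OVH API calls behind the workflow: list invoice IDs for a time frame, then fetch each invoice's details. It assumes the European endpoint and the bearer token copied from the OVHcloud API console; exact query parameters and response field names may differ from your account's responses, so treat this as an outline rather than the template's implementation.

```typescript
// Sketch: list OVH invoices in a date range, then fetch each invoice's details.
const OVH_AUTH = "Bearer eyJh..."; // value of the 'authorization' header credential
const API = "https://eu.api.ovh.com/1.0";

async function listRecentInvoices(fromIso: string, toIso: string) {
  const headers = { authorization: OVH_AUTH };

  // 1. List invoice IDs in the requested time frame (assumed date filter parameters).
  const idsRes = await fetch(
    `${API}/me/bill?date.from=${encodeURIComponent(fromIso)}&date.to=${encodeURIComponent(toIso)}`,
    { headers }
  );
  const billIds: string[] = await idsRes.json();

  // 2. Fetch details for each invoice (amount, date, PDF download URL, etc.).
  const bills = [];
  for (const id of billIds) {
    const billRes = await fetch(`${API}/me/bill/${id}`, { headers });
    bills.push(await billRes.json());
  }
  return bills;
}

listRecentInvoices("2024-01-01", "2024-12-31").then(console.log).catch(console.error);
```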
by Mutasem
Use case
Track all Linear tickets in Google Sheets. Useful if you want to do custom analysis that Linear's paid Plus features (Linear Insights) don't cover, or that you don't want to pay for.

Setup
- Add the Linear API header key.
- Add Google Sheets credentials.
- Update which teams to get tickets from in the GraphQL nodes (see the sketch below).
- Update which Google Sheets page to write all the tickets to. You only need to add one column, id, in the sheet; the Google Sheets node in automatic mapping mode will handle adding the rest of the columns.
- Set any custom data on each ticket.
- Activate the workflow 🚀

How to adjust this template
Set any custom fields you want to get out of this; it's quick to do in n8n.
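As a reference for the GraphQL nodes, here is a TypeScript sketch of a request to Linear's GraphQL API that pulls a team's issues. The team ID is a placeholder and the selected fields are just an example of what you might map into the sheet; adjust the query to your team and fields.

```typescript
// Sketch: fetch issues for one Linear team via the GraphQL API.
const LINEAR_API_KEY = process.env.LINEAR_API_KEY ?? ""; // personal API key

async function fetchTeamIssues(teamId: string) {
  const query = `
    query TeamIssues($teamId: String!) {
      team(id: $teamId) {
        issues(first: 50) {
          nodes { id identifier title createdAt state { name } assignee { name } }
        }
      }
    }`;

  const res = await fetch("https://api.linear.app/graphql", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: LINEAR_API_KEY, // personal API keys are sent as-is in this header
    },
    body: JSON.stringify({ query, variables: { teamId } }),
  });
  const { data } = await res.json();
  return data.team.issues.nodes;
}

fetchTeamIssues("TEAM_UUID_HERE").then(console.log).catch(console.error);
```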
by Ranjan Dailata
Who is this for?
Extract & Summarize Yelp Business Reviews is an automated workflow that extracts Yelp business reviews using Bright Data Web Unlocker, processes and formats the raw data, summarizes it with Google Gemini's LLM, and forwards the concise summary along with the review response to a specified webhook endpoint.

This workflow is tailored for:
- Local SEO specialists who need structured insights from Yelp reviews to optimize listings.
- Business owners wanting quick summaries of what customers love or complain about.
- Reputation managers who monitor brand sentiment and identify customer pain points.
- Data analysts & researchers extracting Yelp review patterns at scale.
- AI product builders needing clean Yelp review data as input for their LLMs or recommender systems.

What problem is this workflow solving?
Yelp reviews are rich in customer sentiment but messy to work with manually. This workflow solves:
- The pain of scraping Yelp review content manually.
- The challenge of building structured data alongside the summary.
- The need for structured outputs suitable for analysis, reports, or AI input.

What this workflow does
This automated pipeline does the following:
- Bright Data integration: queries Yelp and scrapes business listing data using Bright Data's Web Unlocker (see the sketch below).
- Structured data formatting: formats the Yelp review data into a structured JSON response.
- Google Gemini summarization: sends the cleaned reviews to Google Gemini to produce a concise summary.
- Output delivery: returns the structured response with the concise summary over the webhook endpoint.

Setup
1. Sign up at Bright Data.
2. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
3. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication). The value field should be set to Bearer XXXXXXXXXXXXXX, where XXXXXXXXXXXXXX is replaced by the Web Unlocker token.
4. In n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).
5. Update the Yelp business review URL and the Bright Data zone in the Set Yelp URL with the Bright Data Zone node.
6. Update the Webhook Notifier for the merged response node with the webhook endpoint of your choice.

How to customize this workflow to your needs
This workflow is built to be flexible. Whether you're a market researcher, entrepreneur, or data analyst, here's how you can adapt it to your specific use case:
- Target specific business categories: update the Yelp business review input to scrape different businesses, such as gyms or salons.
- Limit reviews: add filters by description, location, or page range to get only the top reviews.
- Tweak the data extraction node: update the Structured Data Extractor node's output parser to build the JSON response with the fields or attributes you need.
- Tweak the summarization prompt: modify the Gemini prompt to generate a more comprehensive summary.
- Send output to other destinations: replace the webhook URL to forward the output to Google Sheets, Airtable, Slack or Discord, or custom API endpoints.
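Here is a TypeScript sketch of the Web Unlocker call made before the Gemini summarization step. The zone name and target Yelp URL are placeholders, and the request options shown are a plausible shape for Bright Data's request API rather than the template's exact node configuration; check Bright Data's current API reference before relying on it.

```typescript
// Sketch: fetch a Yelp page through Bright Data Web Unlocker.
const BRIGHT_DATA_TOKEN = process.env.BRIGHT_DATA_TOKEN ?? "";

async function fetchYelpPage(yelpBusinessUrl: string): Promise<string> {
  const res = await fetch("https://api.brightdata.com/request", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${BRIGHT_DATA_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      zone: "web_unlocker1",   // your Web Unlocker zone name
      url: yelpBusinessUrl,    // e.g. a Yelp business reviews page
      format: "raw",           // return the page body as-is
    }),
  });
  if (!res.ok) throw new Error(`Bright Data request failed: ${res.status}`);
  return res.text(); // raw HTML to be parsed and structured downstream
}

fetchYelpPage("https://www.yelp.com/biz/some-business").then((html) => console.log(html.length));
```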
by Manu
In Grist, when I mark a row as confirmed (via a toggle), a webhook notifies n8n, and this workflow creates derived records in the destination table.

Design decisions
- Confirmation-based: the source table has a boolean column "Confirmed" that triggers the transfer. This keeps a manual check in the loop and makes triggering the workflow a conscious step.
- Runs once: if the destination table already contains an entry, it is not re-created or updated (as it might have already been changed manually).

Setup
1. Create a boolean column Confirmed in the source table.
2. Add a webhook in the Grist settings.
3. Add Grist API credentials in n8n (see the sketch below).
4. Set the document ID and source table ID/name in the 'get existing' node.
5. Set the docID, the destination table ID/name, and the columns & values you want in the Create Row node.
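The TypeScript sketch below outlines the two Grist API calls the workflow relies on: check whether a derived record already exists (the "runs once" rule), and create it only if it does not. The doc ID, table name, and the SourceRef field are placeholders, not names from the actual tables.

```typescript
// Sketch: create a derived Grist record only when it does not already exist.
const GRIST_API_KEY = process.env.GRIST_API_KEY ?? "";
const BASE = "https://docs.getgrist.com/api"; // or your self-hosted Grist URL
const DOC_ID = "yourDocId"; // placeholder document ID

async function createIfMissing(destTable: string, sourceRowId: number, fields: Record<string, unknown>) {
  const headers = {
    Authorization: `Bearer ${GRIST_API_KEY}`,
    "Content-Type": "application/json",
  };

  // "Runs once": look for an existing destination record referencing the source row.
  const existing = await fetch(
    `${BASE}/docs/${DOC_ID}/tables/${destTable}/records?filter=${encodeURIComponent(
      JSON.stringify({ SourceRef: [sourceRowId] }) // SourceRef is a hypothetical column
    )}`,
    { headers }
  ).then((r) => r.json());
  if (existing.records?.length) return existing.records[0];

  // Otherwise create the derived record.
  const created = await fetch(`${BASE}/docs/${DOC_ID}/tables/${destTable}/records`, {
    method: "POST",
    headers,
    body: JSON.stringify({ records: [{ fields: { SourceRef: sourceRowId, ...fields } }] }),
  });
  return created.json();
}

createIfMissing("Confirmed_Items", 42, { Name: "Example" }).then(console.log);
```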
by ist00dent
This n8n template empowers you to instantly summarize long pieces of text by sending a simple webhook request. By integrating with ApyHub's summarization API, you can distill complex articles, reports, or messages into concise summaries, significantly boosting efficiency across various domains.

🔧 How it works
- Receive Content Webhook: this node acts as the entry point, listening for incoming POST requests. It expects a JSON body containing content (the long text you want to summarize) and an optional summary_length ('short', 'medium', or 'long'; defaults to 'medium'), plus a header containing your apy-token for the ApyHub API.
- Start Summarization Job: this node sends a POST request to ApyHub's summarization endpoint (api.apyhub.com/sharpapi/api/v1/content/summarize). It passes the content and summary_length from the webhook body, along with your apy-token from the headers. ApyHub processes the text asynchronously, and this node immediately returns a job_id.
- Get Summarization Result: since ApyHub's summarization is asynchronous, this node is crucial. It polls ApyHub's job status endpoint (api.apyhub.com/sharpapi/api/v1/content/summarize/job/status/{{job_id}}) using the job_id obtained in the previous step. It keeps checking the status until the summarization is finished, then retrieves the final summarized text.
- Respond with Summarized Content: this node sends the final, distilled summary back to the service that initiated the webhook.

👤 Who is it for?
This workflow is extremely useful for:
- Content creators & marketers: quickly summarize articles for social media snippets, email newsletters, or blog post intros.
- Researchers & students: efficiently get the gist of academic papers, reports, or long documents without reading every word.
- Customer support & sales teams: summarize customer inquiries, long email chains, or call transcripts to quickly understand key issues or discussion points.
- News aggregators & media monitoring: automatically generate summaries of news articles from various sources for quick consumption.
- Business professionals: condense lengthy reports, meeting minutes, or project updates into digestible summaries for busy stakeholders.
- Legal & compliance: summarize legal documents or regulatory texts to highlight critical clauses or changes.
- Anyone dealing with information overload: save time and extract key information from overwhelming amounts of text.

📑 Data structure
When you trigger the webhook, send a POST request with a JSON body and an apy-token in the headers:

{
  "content": "Your very long text goes here. This could be an article, a report, a transcript, or any other textual content you want to summarize. The longer the text, the more valuable summarization becomes!",
  "summary_length": "medium"
}

summary_length is optional: "short", "medium", or "long".

Headers:
apy-token: YOUR_APYHUB_API_KEY

Note: you'll need to obtain an API key from ApyHub to use their API services. They typically offer a free tier for testing.

The workflow will return a JSON response similar to this (the summary content will vary based on input):

{
  "summary": "Max Verstappen believes the Las Vegas Grand Prix is '99% show and 1% sporting event', not looking forward to the razzmatazz. Other drivers, like Fernando Alonso, were more equivocal about the hype, acknowledging the investment and spectacle. Lewis Hamilton praised the city's energy but emphasized it's 'a business, ultimately', believing there will still be good racing.",
  "status": "finished",
  "result_file_id": "..."
}

ApyHub might provide a result_file_id for larger results.

⚙️ Setup instructions
1. Get an ApyHub API key: go to https://apyhub.com/ and sign up to get your API key.
2. Import the workflow: in your n8n editor, click "Import from JSON" and paste the provided workflow JSON.
3. Configure the webhook path: double-click the Receive Content Webhook node and, in the 'Path' field, set a unique and descriptive path (e.g., /summarize-content).
4. Activate the workflow: save and activate the workflow.

📝 Tips
This content summarizer is a powerful component. Here's how to supercharge it and make it an indispensable part of your automation arsenal:
- Integrate with document/file storage: automatically summarize documents uploaded to Google Drive, Dropbox, or OneDrive. Add a Watch New Files trigger (if available for your service) or a Cron node to regularly check for new files, read the file content, pass it to this summarizer, and save the summary back to a designated folder or as a comment on the original file. For CRM/CMS systems, pull long notes, customer interactions, or article drafts, summarize them, and update the records with the concise version.
- Email processing & triage: use an Email node to trigger the workflow when new emails arrive. Extract the email body, summarize it, and then send a shortened summary as a notification to Slack or Telegram, add it to a task management tool (e.g., Trello, Asana) for quicker triaging, or include it in an email digest.
- Slack/Discord bot integration: create a Slack/Discord command (using a custom webhook or a dedicated Slack/Discord node) where users can paste long text; the bot then sends the summarized version back to the channel.
- Dynamic summary length & options: allow the user to specify summary_length (short, medium, long) in the webhook body, as already implemented. Explore ApyHub's documentation for more parameters (if any) and pass them dynamically.
- Error handling & user feedback: add an IF node after Get Summarization Result to check for status: 'failed' or error messages. If an error occurs, send a helpful message back to the webhook caller or an internal alert. For very long texts that might exceed API limits, add a Function node to truncate the input content and notify the user.
- Multi-language support (if ApyHub offers it): if ApyHub supports summarization in multiple languages, extend the webhook to accept a language parameter and pass it to the API.
- Web scraping & article summaries: combine this with an HTTP Request node to scrape content from a web page (e.g., a news article), then pass the extracted article text to this summarizer for quick insights.
- Data storage & archiving: store the original content alongside its summary in a database (e.g., PostgreSQL, MongoDB) or a simple spreadsheet (Google Sheets, Airtable) to create a searchable, summarized archive of your content.
- Automated report generation: if you receive daily or weekly reports, summarize the key sections, then compile those summaries into a concise digest or dashboard using a Merge node and send it out automatically.
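To make the asynchronous pattern concrete, here is a TypeScript sketch of the start-then-poll flow using the same ApyHub endpoints named above. The polling interval and the response field names (job_id, status) follow this description; consult ApyHub's documentation for the exact response shape.

```typescript
// Sketch: start an ApyHub summarization job, then poll until it finishes.
const APY_TOKEN = process.env.APY_TOKEN ?? "";
const BASE = "https://api.apyhub.com/sharpapi/api/v1/content/summarize";

async function summarize(content: string, summaryLength: "short" | "medium" | "long" = "medium") {
  const headers = { "apy-token": APY_TOKEN, "Content-Type": "application/json" };

  // 1. Start the asynchronous summarization job.
  const startRes = await fetch(BASE, {
    method: "POST",
    headers,
    body: JSON.stringify({ content, summary_length: summaryLength }),
  });
  const { job_id } = await startRes.json();

  // 2. Poll the job status endpoint until the job is finished.
  for (let attempt = 0; attempt < 30; attempt++) {
    const statusRes = await fetch(`${BASE}/job/status/${job_id}`, { headers });
    const status = await statusRes.json();
    if (status.status === "finished") return status; // contains the summary
    if (status.status === "failed") throw new Error("Summarization job failed");
    await new Promise((resolve) => setTimeout(resolve, 2000)); // wait before retrying
  }
  throw new Error("Timed out waiting for the summarization job");
}

summarize("Your very long text goes here...").then(console.log).catch(console.error);
```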
by Jimleuk
This n8n template demonstrates how to use AI to compose or "stitch" separate images together, generating a new image that retains the source assets and a consistent style. Use cases are many: try producing storyboard scenes with consistent characters, marketing material with existing product assets, or virtual try-ons of different articles of clothing!

Good to know
- At the time of writing, each image generated costs $0.039 USD. See Gemini pricing for updated info.
- The model used in this workflow is geo-restricted! If it says the model was not found, it may not be available in your country or region.

How it works
- The required assets are imported from cloud storage using the HTTP Request node.
- The images are converted to base64 strings and aggregated so they can be used with the AI model.
- Gemini's image generation model is used, taking all three images plus a prompt that we define. The prompt instructs the model on how to compose the final image (see the sketch below).
- Gemini generates a new image but uses the original three assets to do so. Consistency with the source images is very high, with little sign of hallucination!
- Gemini's output is base64, so a "Convert to File" node converts the data to binary.
- The final binary image is then uploaded to Google Drive to complete the demonstration.

How to use
- The Manual Trigger node is used as an example, but feel free to replace it with other triggers such as a webhook or even a form.
- Technically, you should be able to compose even more images, but of course the generation will take longer and cost more.

Requirements
- Gemini account for the LLM and image generation
- Google Drive for the upload

Customising this workflow
AI image editing has many use cases. Try a popular one such as virtual try-on for fashion, or applying branding to existing image assets.
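The TypeScript sketch below shows the shape of the Gemini call at the heart of this template: three base64 source images plus a composition prompt in a single generateContent request, with the generated image coming back as a base64 part. The model name, the responseModalities setting, and the file names are assumptions; check Gemini's documentation for the image generation model available in your region.

```typescript
// Sketch: compose three source images into one with a Gemini image generation model.
import { readFileSync, writeFileSync } from "node:fs";

const GEMINI_API_KEY = process.env.GEMINI_API_KEY ?? "";
const MODEL = "gemini-2.5-flash-image-preview"; // placeholder model name

async function composeImages(paths: string[], prompt: string) {
  const imageParts = paths.map((p) => ({
    inlineData: { mimeType: "image/png", data: readFileSync(p).toString("base64") },
  }));

  const res = await fetch(
    `https://generativelanguage.googleapis.com/v1beta/models/${MODEL}:generateContent?key=${GEMINI_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [{ parts: [{ text: prompt }, ...imageParts] }],
        generationConfig: { responseModalities: ["IMAGE", "TEXT"] }, // assumed setting
      }),
    }
  );
  const data = await res.json();

  // The generated image comes back as a base64 inlineData part; convert it to binary.
  const imagePart = data.candidates?.[0]?.content?.parts?.find((part: any) => part.inlineData);
  if (!imagePart) throw new Error("No image returned");
  writeFileSync("composed.png", Buffer.from(imagePart.inlineData.data, "base64"));
}

composeImages(
  ["character.png", "background.png", "product.png"], // placeholder source assets
  "Place the character in the background holding the product, matching the art style."
).catch(console.error);
```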