by Javier Quilez Cabello
Automate syncing of offline donor data from Google Sheets into SinergiaCRM for fast, error-free face-to-face fundraising tracking.

Who is this workflow for?
This workflow is perfect for nonprofit organizations that run face-to-face fundraising campaigns and collect donor data offline (e.g. via tablets or spreadsheets). It helps fundraising and CRM teams ensure donor records are correctly and automatically stored in SinergiaCRM, a popular CRM platform used by NGOs.

What it does / How it works
- Watches a Google Sheet for new rows (also works offline if synced later).
- Filters rows marked To CRM = Yes and Processed = No.
- Checks if the donor already exists in SinergiaCRM (based on NIF).
- If the contact exists: creates a "member" relationship and a payment commitment.
- If not: creates the contact first, then adds the relationship and commitment.
- Finally, marks the original row as "Processed" to prevent re-importing.

Requirements
- An active Google Sheets account with the donor spreadsheet.
- A SinergiaCRM account with API/OAuth access.
- The fields First name, Last name, Email, NIF, To CRM, and Processed must exist in the sheet.
- SinergiaCRM modules enabled: Contacts, stic_Contacts_Relationships, stic_Payment_Commitments.

How to set up
- Connect your Google Sheets account and set the correct Document ID and Sheet name in the trigger and update nodes.
- Connect your SinergiaCRM account using OAuth credentials.
- Adjust the assigned_user_id field if required by your CRM instance.

How to customize the workflow
- Change the filter conditions in the IF nodes to match your business logic.
- Customize fields like relationship type, amount, and periodicity to suit your campaign.
- Add or remove nodes if you want additional steps (like sending Slack notifications).

📌 Category: CRM & Customer Management
📘 Learn more about SinergiaCRM
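As a rough illustration, the filter and the exists/doesn't-exist branch could be sketched like this in an n8n Code node. The field names match the sheet columns listed above; the CRM lookup is mocked as a set of known NIFs, and the branch labels are hypothetical, not the actual node names.

```javascript
// Keep only rows flagged for the CRM that have not been processed yet.
function rowsToSync(rows) {
  return rows.filter(r => r['To CRM'] === 'Yes' && r['Processed'] === 'No');
}

// Decide which branch a row takes, given the NIFs already present in the CRM.
function branchFor(row, existingNifs) {
  return existingNifs.has(row['NIF'])
    ? 'create-relationship-and-commitment'   // contact already exists
    : 'create-contact-then-relationship';    // contact must be created first
}
```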
by Robert Breen
Eventbrite → Pipedrive Lead‑Sync

Bring your Eventbrite attendee data into Pipedrive automatically: no spreadsheets, CSVs, or manual uploads.

🚀 What the Workflow Does
- Polls Eventbrite on a schedule (default 30 min) for new registrations.
- Creates or updates matching Person and Deal records in Pipedrive.
- Deduplicates by email and stores a timestamp so each attendee is processed only once.
- Easily configurable field mapping lets you decide exactly which attendee data lands in Pipedrive.

📋 Key Features

| Feature | Benefit |
|---------|---------|
| Incremental Sync | Processes only registrations created since the last run. |
| Person + Deal Linking | Keeps contacts and sales opportunities in one place. |
| No Community Nodes | 100% official n8n nodes, simple to import and run. |
| Fully Editable Code Node | Swap your Eventbrite token, organization ID, and field mappings in seconds. |

🔑 Prerequisites
- Eventbrite Personal OAuth Token
- Eventbrite Organization ID
- Pipedrive API Token
- n8n 1.25 or later

🛠 Quick Start
1. Import the workflow JSON.
2. Open the Code node and paste your Eventbrite token and organization ID.
3. Add your Eventbrite and Pipedrive credentials in their respective nodes.
4. Activate the workflow and watch new registrants appear in Pipedrive within minutes.

Contact
- Email: rbreen@ynteractive.com
- Website: https://ynteractive.com
- YouTube: https://www.youtube.com/@ynteractivetraining
- LinkedIn: https://www.linkedin.com/in/robertbreen
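The incremental-sync and dedupe logic described above could be sketched roughly like this inside the editable Code node. The attendee shape (`email`, `created` as an ISO timestamp) is an assumption for illustration, not the exact Eventbrite payload.

```javascript
// Keep only attendees registered after the last run, deduplicated by email
// (case-insensitive), so each attendee is processed once.
function newAttendees(attendees, lastRunIso) {
  const seen = new Set();
  return attendees.filter(a => {
    const email = a.email.toLowerCase();
    if (a.created <= lastRunIso || seen.has(email)) return false;
    seen.add(email);
    return true;
  });
}
```

Comparing ISO-8601 UTC strings lexically works because the format sorts chronologically, which is why the stored timestamp can be compared directly.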
by Sk developer
📊 n8n Workflow: Email Validation & Google Sheets Update

This workflow automates the process of validating email addresses stored in a Google Sheet. It reads each email, checks whether it is disposable or fake using Email Validator AI via RapidAPI, and updates the same row with the result. All interactions with Google Sheets are done securely using a Google Service Account.

✅ Node-wise Overview
- 🔘 Manual Trigger: starts the workflow manually from the n8n editor.
- 📄 Google Sheets (Read): reads all rows (including emails) from a specified Google Sheet.
- 🔁 Split In Batches: processes each row (email) one at a time.
- 🌐 HTTP Request (RapidAPI): sends the email to Email Validator AI for validation.
- 🟩 Google Sheets (Update Row): updates the corresponding row in the sheet with the validation result.

📋 Google Sheet Columns Required

| Column Name | Purpose |
|---------------|----------------------------------------------|
| email | Email address to validate |
| is_disposable | Flags if the email is a disposable address |

> 📝 You can rename columns, but ensure the node mappings in n8n match accordingly.

💡 Use Cases
- 📧 Email List Cleanup: keep your lead or contact lists free of fake, temporary, or disposable emails.
- 🧼 Data Quality Enhancement: ensure your Google Sheets contain only validated, high-quality email addresses.
- 🔁 Automated Data Enrichment: add metadata (like is_disposable) to your contacts without manual review.
- 📥 Lead Qualification Filtering: automatically flag or remove junk leads before importing into CRMs or email tools.

🚀 Benefits of This Workflow
- 🧼 Cleans your email data automatically: detects fake or throwaway email addresses using Email Validator AI and flags them right in your spreadsheet.
- ⏳ Saves you time: fully automates email validation, with no more copy-pasting into online tools.
- 📈 Improves marketing and outreach accuracy: focuses your efforts on real contacts, improving delivery, open, and conversion rates.
- 💡 Lets you focus on high-value leads: filters out low-quality leads so you can prioritize those with actual potential.
- 🔄 Works seamlessly within Google Sheets: no import/export headaches; updates happen directly in your existing sheet.
- 🔐 Runs securely without manual intervention: uses a Google Service Account for safe access and can be scheduled to run automatically.

🛠️ Requirements
- A Google Sheet with the required columns listed above
- A RapidAPI key for Email Validator AI
- A Google Service Account with access to the sheet
- Proper credentials configured in your n8n instance

🧪 Tips for Enhancement
- Add a status or note column for better tracking.
- Filter only unvalidated rows to improve performance.
- Send a Slack or email notification when invalid emails are found.
- Schedule this workflow with a Cron Trigger to run daily or weekly.
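The update step boils down to copying one field from the API response into the row. A hypothetical sketch, assuming the validator returns a boolean `disposable` flag (the real response shape depends on the Email Validator AI endpoint you configure):

```javascript
// Merge the validation verdict into the sheet row without touching other
// columns; the is_disposable column stores TRUE/FALSE as text.
function annotateRow(row, apiResponse) {
  return { ...row, is_disposable: apiResponse.disposable ? 'TRUE' : 'FALSE' };
}
```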
by Zakwan
This workflow is a user-friendly tool that automates the creation of high-quality advertising images for products. It takes a simple product image uploaded by a user and uses AI to transform it into a professional, photorealistic advertisement featuring a fashion model actively using the product. The final image is then made available for the user to download.

Step-by-Step Breakdown
1. Form Submission: the workflow is triggered by a public form. The user uploads a product image and selects a character model (male or female) from a dropdown menu.
2. Image Processing: the uploaded image file is extracted and prepared for the AI, including converting the binary file data into a format the AI model can understand.
3. AI Image Generation: an HTTP request is sent to a large language model (Google's Gemini via OpenRouter). The request includes a prompt that combines the user's selected character model and the uploaded product image. The AI is instructed to generate a new, photorealistic image of the model using the product.
4. Data Conversion: the AI's output, a base64-encoded image string, is then processed. The workflow separates the image data from its metadata.
5. Final Image Delivery: the base64 data is converted back into a binary file, which is then provided to the user for automatic download via a completion form.
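The data-conversion step (4) can be sketched as splitting a data URL into its MIME metadata and base64 payload, then decoding the payload back to binary with a Node Buffer. This is an illustrative helper, not the workflow's actual node configuration:

```javascript
// Split "data:image/png;base64,AAAA..." into its MIME type and binary bytes.
function dataUrlToBinary(dataUrl) {
  const [meta, b64] = dataUrl.split(',');
  const mime = meta.slice(meta.indexOf(':') + 1, meta.indexOf(';'));
  return { mime, buffer: Buffer.from(b64, 'base64') };
}
```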
by Arjan ter Heegde
n8n Placeholdarr for Plex (BETA)

This flow creates dummy files for every item added in your *Arrs (Radarr/Sonarr) with the tag unprocessed-dummy. It's useful for maintaining a large Plex library without needing the actual movies or shows to be present on your Debrid provider.

How It Works
- When a dummy file is played, the corresponding item is automatically monitored in *Arr and added to the download queue. This ensures the content becomes available for playback within roughly 3 minutes.
- If the content finishes downloading while the dummy is still being played, Tautulli triggers a webhook that stops the stream and notifies the user.

Requirements
- Each n8n node must have the correct URL and authorization headers configured.
- The SSH host (used to create dummy files) must have FFmpeg installed.
- A Trakt.TV API key is required if you're using Trakt collections.

Warning
> ⚠️ This flow is currently in BETA and under active development.
> It is not recommended for users without technical experience.
> Keep an eye on the GitHub repository for updates: https://github.com/arjanterheegde/n8n-workflows-for-plex
by Cooper
DMARC Reporter

Gmail and Yahoo send DMARC reports as .zip or .gz XML attachments that can be hard to read. This workflow unpacks them on a schedule, turns the data into a simple table, and emails you an easy-to-read report.

DMARC insights at a glance
- Confirm that your published policy is correct and consistent.
- Quickly spot unknown or suspicious IPs trying to send as you.
- Distinguish between legitimate high-volume senders (e.g. your ESP) and one-off or small-scale abuse.
- Easily confirm that your legitimate servers are authenticating correctly, and detect spoofed mail that fails DKIM/SPF.

Who is this for?
- Email marketing teams (Mailchimp, Sensorpro, Omnisend users)
- Compliance teams

Customize
- Adjust the Gmail node to include other DMARC reporters by changing the search params.
- If you are not using Gmail, you can use any of the n8n email nodes.
- To keep a record, add an Airtable node after the Set node.
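To give a feel for the table-building step, here is a deliberately simplified sketch that pulls the source IP, message count, and evaluated DKIM/SPF results out of a DMARC aggregate report's `<record>` blocks with naive regexes. Real reports carry more fields (header_from, policy_published, etc.), and a proper XML parser is preferable in production:

```javascript
// Extract a simple table row per <record> block in a DMARC aggregate report.
function dmarcRows(xml) {
  const pick = (block, tag) => {
    const m = block.match(new RegExp(`<${tag}>([^<]*)</${tag}>`));
    return m ? m[1] : '';
  };
  const rows = [];
  for (const m of xml.matchAll(/<record>[\s\S]*?<\/record>/g)) {
    const block = m[0];
    rows.push({
      sourceIp: pick(block, 'source_ip'),
      count: Number(pick(block, 'count')),
      dkim: pick(block, 'dkim'),
      spf: pick(block, 'spf'),
    });
  }
  return rows;
}
```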
by Oneclick AI Squad
Description
Automates weekly checks for broken links on a website. Scans the site using HTTP requests, filters broken links, sends Slack alerts for detected broken URLs, and creates a list for tracking.

Essential Information
- Runs weekly to monitor website link integrity.
- Identifies broken links and notifies the team via Slack.
- Generates a list of broken links for further action.

System Architecture
Link Checking Pipeline:
- Weekly Cron Trigger: schedules the workflow to run weekly.
- Scan Blog with HTTP: performs HTTP GET requests to check website links.
Alert and Tracking Flow:
- Filter Broken Links: identifies and separates broken links.
- Send Slack Alert: notifies the team via Slack about broken URLs.
- Create Broken Links List: compiles a list of broken links.
Non-Critical Handling:
- No Action for Valid Links: skips valid links with no further action.

Implementation Guide
1. Import the workflow JSON into n8n.
2. Configure the HTTP node with the target website URL (e.g., https://yourblog.com).
3. Set up Slack credentials for alerts.
4. Test the workflow with a sample website scan.
5. Monitor link-checking accuracy and adjust HTTP settings if needed.

Technical Dependencies
- HTTP request capability for link scanning.
- Slack API for team notifications.
- n8n for workflow automation and scheduling.

Database & Sheet Structure
No specific database or sheet is required; the workflow relies on HTTP response data. Example payload:
{"url": "https://yourblog.com/broken", "status": 404, "time": "2025-07-29T20:21:00Z"}

Customization Possibilities
- Adjust the Cron trigger to run at a different frequency (e.g., daily).
- Customize the HTTP node to scan specific pages or domains.
- Modify Slack alert messages in the Send Slack Alert node.
- Enhance the Create Broken Links List node to save results to a Google Sheet or Notion.
- Add email notifications for additional alert channels.
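The Filter Broken Links step could look roughly like this, treating HTTP 4xx/5xx responses (and failed requests, represented here as status 0) as broken. The payload shape matches the example above; this is an illustrative sketch, not the shipped node code:

```javascript
// Partition link-check results into broken (4xx/5xx or request failure)
// and valid, so the broken set can feed the Slack alert and tracking list.
function splitByStatus(results) {
  const broken = results.filter(r => r.status === 0 || r.status >= 400);
  const valid = results.filter(r => r.status > 0 && r.status < 400);
  return { broken, valid };
}
```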
by curseduca.com
📘 Curseduca – User Creation & Access Group Assignment

How it works
This workflow automates the process of creating a new user in Curseduca and granting them access to a specific access group. It works in two main steps:
1. Webhook – captures user details (name, email, and group information).
2. HTTP Request – sends the data to the Curseduca API, creating the user, assigning them to the correct access group, and sending an email notification.

Setup steps
1. Deploy the workflow: copy the webhook URL generated by n8n, then send a POST request with the required fields: name, email, groupId.
2. Configure API access: add your API Key and Bearer token in the HTTP Request node headers (replace the placeholders), and replace <GroupId> in the request body with the correct group ID.
3. Notifications: by default, the workflow triggers an email notification to the user once their account is created.

Example use cases
- Landing pages: automatically register leads who sign up on a product landing page and grant them immediate access to a course, training, or bundle.
- Product bundles: offer multiple products or services together and instantly grant access to the correct group after purchase.
- Chatbot integration: connect tools like Manychat to capture name and email via chatbot conversations and create the user directly in Curseduca.
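A minimal sketch of building the webhook's POST body, with a guard for the three required fields. The field names come from the description above; any other structure the Curseduca API expects would need to be confirmed against its documentation:

```javascript
// Build the JSON body for the webhook call, rejecting empty required fields.
function buildWebhookPayload({ name, email, groupId }) {
  for (const [key, value] of Object.entries({ name, email, groupId })) {
    if (value === undefined || value === '') {
      throw new Error(`Missing required field: ${key}`);
    }
  }
  return JSON.stringify({ name, email, groupId });
}
```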
by Stéphane Heckel
Copy n8n workflows to a slave n8n repository

Inspired by Alex Kim's workflow, this version adds the ability to keep multiple versions of the same workflow on the destination instance. Each copied workflow's name is prefixed with the date (YYYY_MM_DD_), enabling simple version tracking. Process details and workflow counts are recorded centrally in Notion.

How it works
- Workflows from the source n8n instance are copied to the destination using the n8n API node.
- On the destination, each workflow name is prefixed with the current date (e.g., 2025_08_03_PDF Summarizer), so you can keep multiple daily versions.
- The workflow tracks and saves the date of execution and the number of workflows processed; both details are recorded in Notion.

Rolling retention policy example
- Day 1: workflows are saved with the 2025_08_03_ prefix.
- Day 2: a new set is saved with 2025_08_04_.
- Day 3: Day 1's set is deleted, and a new set is saved as 2025_08_05_.
- To keep more days, adjust the "Subtract From Date" node.

How to use
1. Create a Notion database with one page and three fields:
   - sequence: should contain "prefix".
   - Value: today's date as YYYY_MM_DD_.
   - Comment: number of saved workflows.
2. Configure the Notion node: enter your Notion credentials and link to the created database/page.
3. Update the "Subtract From Date" node: set how many days' versions you want to keep (default: 2 days).
4. Set the limit to 1 in the "Limit" node for testing.
5. Input credentials for both source and destination n8n instances.

Requirements
- Notion for tracking execution date and workflow count.
- n8n API keys for both source and destination instances; ensure you have the necessary API permissions (read, create, delete workflows).
- n8n version: this workflow was tested on 1.103.2 (Ubuntu).

Need help? Comment on this post, contact me on LinkedIn, or ask in the Forum!
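The renaming convention can be sketched as a small helper: build the `YYYY_MM_DD_` prefix from the run date and prepend it to the workflow name. A sketch for illustration (UTC is assumed here; the actual workflow's timezone handling may differ):

```javascript
// Prefix a workflow name with the run date, e.g. "2025_08_03_PDF Summarizer".
function prefixedName(workflowName, date) {
  const pad = n => String(n).padStart(2, '0');
  const prefix = `${date.getUTCFullYear()}_${pad(date.getUTCMonth() + 1)}_${pad(date.getUTCDate())}_`;
  return prefix + workflowName;
}
```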
by David Soden
Extract and Upload Files from Zip to Google Drive

How it works
This workflow automatically extracts all files from an uploaded zip archive and uploads each file individually to Google Drive.

Flow:
1. User submits a zip file via form.
2. The zip file is temporarily saved to disk (a workaround for a compression-node limitation).
3. The zip file is read back and decompressed.
4. A Split Out node separates each file into individual items.
5. Each file is uploaded to Google Drive with its original filename.

Key features:
- Handles zip files with any number of files dynamically.
- Preserves original filenames from inside the zip.
- No hardcoded file counts: works with 1 or 100 files.

Set up steps
1. Connect Google Drive: add your Google Drive OAuth2 credentials to the "Upload to Google Drive" node.
2. Select destination folder: in the Google Drive node, choose which folder to upload files to (default is root).
3. Update temp path (optional): change the temporary file path in the "Read/Write Files from Disk" node if needed (default: c:/temp_n8n.zip).

Requirements
- Google Drive account and OAuth2 credentials.
- Write access to the local filesystem for temporary zip storage.

Tags
automation, file processing, google drive, zip extraction, file upload
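Conceptually, the Split Out step turns one item holding several binary files into one item per file. A rough sketch of that fan-out, using a simplified stand-in for n8n's item shape (the real node operates on n8n's `json`/`binary` item structure, so treat the property names here as illustrative):

```javascript
// Fan out one decompressed item with multiple binary files into one item per
// file, carrying each original filename along for the upload step.
function splitFiles(item) {
  return Object.entries(item.binary).map(([key, file]) => ({
    json: { fileName: file.fileName },
    binary: { data: file },
  }));
}
```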
by Davide
This automated workflow generates a video featuring a talking AI avatar from a single image and automatically publishes it to TikTok with Postiz. The process chains two main AI services: ElevenLabs v3 and Infinitalk.

Key Benefits
✅ Full Automation – from text input to TikTok publication, the process is completely automated.
✅ Time-Saving – eliminates manual video editing, voice-over recording, and social media posting.
✅ Scalable – can generate multiple avatar videos daily with minimal effort.
✅ Customizable – flexible inputs (image, voice, text, prompts) allow adaptation for different content types (weather forecasts, product promos, tutorials, etc.).
✅ Engagement-Oriented – uses AI to optimize video titles for TikTok, increasing the chances of visibility and audience interaction.
✅ Consistent Branding – ensures uniform style and messaging across multiple video posts.

How It Works
1. Text-to-Speech (TTS) Generation: the workflow starts by sending a predefined text script and a selected voice (e.g., "Alice") to the Fal.ai service, which uses ElevenLabs' AI to generate a high-quality audio file. The workflow then polls the API until the audio generation is COMPLETED and retrieves the URL of the generated audio file.
2. Talking Avatar Video Generation: the workflow takes a static image URL and the newly created audio URL and sends them to another Fal.ai service (Infinitalk). This AI model animates the avatar in the image to lip-sync and match the provided audio. A prompt guides the avatar's expression (e.g., "You are a girl giving a weather forecast and you must be expressive"). The workflow again polls for status until the video is COMPLETED.
3. Title Generation & Publishing: once the video is ready, its URL is fetched. Simultaneously, an OpenAI (GPT-4o-mini) node generates an optimized, engaging title (under 60 characters) for the TikTok post based on the original script and avatar prompt. The final video file is downloaded and uploaded to Postiz (a social media scheduling service), which then posts it to a pre-configured TikTok account.

Set Up Steps
Before executing this workflow, configure the following third-party service credentials and node parameters within n8n:
1. Fal.ai API credentials: create an account on Fal.ai and obtain an API key. Create a new credential of type "HTTP Header Auth" in n8n named "Fal.run API", with the Header Name set to Authorization and the Value set to Key <YOUR_FAL_AI_API_KEY>.
2. OpenAI API credentials: you need an OpenAI API key. Create a credential in n8n of type "OpenAI API", name it (e.g., "OpenAi account"), and enter your API key.
3. Postiz API credentials: create an account on Postiz, connect your TikTok account, and get your API key from the Postiz dashboard. In n8n, create an "HTTP Header Auth" credential named "Postiz" with the Header Name set to X-API-Key and the Value set to your Postiz API key. Also create a "Postiz API" credential in n8n with the same API key.
4. Configure the Postiz node: in the "TikTok" (Postiz) node, replace "XXX" in the integrationId field with the actual ID of your connected TikTok account from your Postiz dashboard.
5. (Optional) Customize inputs: modify the default values in the "Set text input" node (the script and voice) and the "Set Video Params" node (the image_url and the prompt for the avatar's expression) to create different videos without changing the workflow's structure.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
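Both generation steps rely on the same poll-until-COMPLETED pattern. A generic sketch of that pattern, where `checkStatus` stands in for whatever status request the workflow makes (the interval and retry limit are arbitrary illustrative values):

```javascript
// Repeatedly call checkStatus (an async function returning { status, url })
// until it reports COMPLETED, or give up after maxTries attempts.
async function pollUntilComplete(checkStatus, { intervalMs = 2000, maxTries = 60 } = {}) {
  for (let i = 0; i < maxTries; i++) {
    const res = await checkStatus();
    if (res.status === 'COMPLETED') return res;
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  throw new Error('Polling timed out before the job completed');
}
```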