How to Connect Coreflux MQTT Broker with MongoDB - Complete IoT Automation Guide | Flash Demo

Estimated Time: 30-45 minutes | Difficulty Level: Intermediate

What if connecting your MQTT broker to a database took 15 minutes instead of 15 hours?

We put this to the test. Tiago Abreu from our team recorded the entire process of connecting Coreflux to MongoDB, from empty Docker containers to data flowing into the database. The result? A surprisingly quick setup that actually scales.

Watch the full 15 minute Flash Demo:

https://youtu.be/h554ybSD_oc

Now, let us dive into the complete tutorial.

Introduction

MQTT brokers are essential for modern IoT infrastructure and automation systems, where a centralized, unified, and fast data hub is key to system interoperability and data exchange. Coreflux is a powerful, low code MQTT broker that extends the traditional MQTT broker into a system with advanced features for real time data processing, transformation, and seamless integration with databases including MongoDB, PostgreSQL, MySQL, and OpenSearch.

In this comprehensive tutorial, you will deploy a complete IoT automation pipeline: a Coreflux MQTT broker integrated with MongoDB, both running in Docker containers. This scalable storage and processing solution lets you collect, transform, and store IoT data efficiently while maintaining enterprise grade reliability and performance.

What You Will Build

By the end of this automation guide, you will have deployed:

  • A MongoDB database instance running in Docker
  • A Coreflux MQTT broker running in Docker
  • Real time data simulation using the LoT Notebook extension
  • Low code data transformation models and database integration routes
  • A complete data integration and transformation pipeline for IoT automation

Need Help? Throughout this tutorial, you can get instant assistance at docs.coreflux.org using the Ask AI feature. Simply describe what you are trying to do, and get personalized guidance on LoT syntax, troubleshooting, and best practices.

About the Technologies

What is MQTT?

MQTT (Message Queuing Telemetry Transport) is a lightweight publish/subscribe network protocol widely adopted in IoT ecosystems. Designed for constrained devices and for networks with low bandwidth, high latency, or unreliable connections, MQTT enables efficient, real time messaging even in resource constrained environments.

About Coreflux

Coreflux offers a lightweight MQTT broker that enables efficient, real time communication between IoT devices and applications, with built in data transformation capabilities for each use case. Built for scalability and reliability, Coreflux is tailored for environments where low latency and high throughput are critical.

With Coreflux, you get:

  • Data Processing: Centralize your data processing where your data lives, so transformations happen in real time.
  • Data Integration: Easily integrate with databases, ensuring a single and simple ecosystem for all your data needs.
  • Scalability: Easily handle growing amounts of data and devices without compromising performance.
  • Reliability: Ensure consistent and dependable messaging across all connected devices.

Tutorial Overview

  1. Deploy with Docker Compose (5 min)
  2. Connect LoT Notebook (3 min)
  3. Create Simulated Data (5 min)
  4. Build Data Models (10 min)
  5. Integrate MongoDB (5 min)
  6. Verify Pipeline (5 min)
  7. Explore Coreflux Docs and Use Cases (optional)

Total: approximately 35 minutes

Prerequisites

Before you begin this MQTT broker deployment tutorial, you will need:

  • Docker and Docker Compose installed on your system
  • Understanding of MQTT protocol concepts and IoT architecture
  • Visual Studio Code (for LoT Notebook extension)

Step 1: Deploying MongoDB and Coreflux with Docker Compose

Docker Compose simplifies the deployment by managing both MongoDB and Coreflux containers together with proper networking.

Creating the Docker Compose Configuration

Create a docker-compose.yml file in your project directory:

services:
  mongodb:
    image: mongo:latest
    container_name: mongodb
    ports:
      - "27017:27017"
    environment:
      MONGO_INITDB_ROOT_USERNAME: admin
      MONGO_INITDB_ROOT_PASSWORD: coreflux_password
      MONGO_INITDB_DATABASE: coreflux_data
    volumes:
      - mongodb_data:/data/db
    networks:
      - iot_network

  coreflux:
    image: coreflux/coreflux-mqtt-broker:latest
    container_name: coreflux
    ports:
      - "1883:1883"
      - "1884:1884"
      - "5000:5000"
      - "443:443"
    depends_on:
      - mongodb
    networks:
      - iot_network

volumes:
  mongodb_data:

networks:
  iot_network:
    driver: bridge

This Docker Compose configuration:

  • Defines two services, mongodb and coreflux
  • Sets up MongoDB with persistent storage using a named volume
  • Configures MongoDB credentials and creates an initial database called coreflux_data
  • Exposes all necessary ports for both services
  • Creates a shared network for container communication
  • Starts Coreflux after the MongoDB container has started (note that depends_on only orders startup; it does not wait until MongoDB is fully ready)

Starting the Services

Start both containers with a single command:

docker-compose up -d

The -d flag runs containers in detached mode (in the background).

Stopping the Services

When you need to stop the services:

docker-compose down

To stop and remove all data (including the MongoDB volume):

docker-compose down -v

Verifying Both Services are Running

Check that both containers are running correctly:

docker ps

You should see both MongoDB and Coreflux containers in the list of running containers.

Testing MongoDB Connection

You can test the MongoDB connection using MongoDB Compass or the MongoDB shell. The connection string will be:

mongodb://admin:coreflux_password@localhost:27017
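
If you prefer a scripted check over a GUI, here is a minimal sketch using the pymongo Python driver (not part of the tutorial; pip install pymongo). It assumes the credentials from the docker-compose.yml above:

# Minimal MongoDB connectivity check (pymongo sketch).
# Credentials match the docker-compose.yml from Step 1; adjust if you changed them.
from pymongo import MongoClient

client = MongoClient(
    "mongodb://admin:coreflux_password@localhost:27017/?authSource=admin",
    serverSelectionTimeoutMS=5000,  # fail fast if the container is not reachable
)
print(client.admin.command("ping"))  # prints {'ok': 1.0} on success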

Validating MQTT Broker Connection

You can connect to the MQTT broker with a client like MQTT Explorer to validate access, using the default credentials:

  • Host: localhost
  • Port: 1883
  • User: root
  • Password: coreflux

Security Note: These are default credentials for development. In production, use strong passwords and enable authentication and encryption.
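
If you want to script this check as well, here is a minimal sketch with the paho-mqtt Python client (pip install paho-mqtt; written against the 2.x callback API):

# Minimal MQTT connectivity check against the Coreflux broker (paho-mqtt 2.x sketch).
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, reason_code, properties):
    # A reason_code of 0 / "Success" means the broker accepted the credentials
    print(f"Connected to Coreflux: {reason_code}")
    client.disconnect()

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set("root", "coreflux")  # default development credentials
client.on_connect = on_connect
client.connect("localhost", 1883)
client.loop_forever()  # exits once disconnect() is called in the callback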

Step 2: Setting Up IoT Data Integration with LoT

Installing the LoT Notebook Extension

The LoT (Language of Things) Notebook extension for Visual Studio Code provides an integrated low code development environment for MQTT broker programming and IoT automation.

  1. Open Visual Studio Code
  2. Go to Extensions (Ctrl+Shift+X)
  3. Search for "LoT Notebooks"
  4. Install the LoT VSCode Notebooks Extension by Coreflux

Connecting to Your MQTT Broker

Configure the connection to your Coreflux MQTT broker using default credentials:

  • Host: localhost (or your server IP)
  • Port: 1883
  • User: root
  • Password: coreflux

Assuming no errors, the MQTT connection status appears in the status bar at the bottom left.

Progress Check

At this point you should have:

  • MongoDB running on port 27017
  • Coreflux MQTT broker running on port 1883
  • LoT Notebook connected and showing "Connected" in the status bar

If any of these are not working, see the Troubleshooting section at the end.

Step 3: Creating Simulated IoT Data with Actions

For this use case, we will build an integration that moves raw data through a transformation pipeline into MongoDB. Since we are not connected to actual MQTT devices, we will use LoT capabilities to simulate device data.

In LoT, an Action is executable logic triggered by specific events such as timed intervals, topic updates, or explicit calls from other actions or system components. Actions allow dynamic interaction with MQTT topics, internal variables, and payloads, facilitating complex IoT automation workflows.

You can write an Action in a LoT Notebook file (.lotnb) inside VS Code and then deploy it from there. You can also add Markdown cells to document your code.

Learning LoT Syntax: If you are new to the Language of Things (LoT), the Coreflux Documentation provides comprehensive guides and examples. Use the Ask AI feature to get instant answers about syntax, see working examples, or troubleshoot any issues you encounter.

Create your LoT Notebook

To run your LoT Actions, Models, and Routes, you will need to create a LoT Notebook. As in Jupyter Notebooks, you can write LoT code and add Markdown sections to document and organize your work.

Create a yourfile.lotnb file and add your first code cell.

Generating Simulated IoT Data

Create an Action to generate simulated sensor data using the LoT interface:

DEFINE ACTION RANDOMIZEMachineData
ON EVERY 10 SECONDS DO
    PUBLISH TOPIC "raw_data/machine1" WITH RANDOM BETWEEN 0 AND 10
    PUBLISH TOPIC "raw_data/station2" WITH RANDOM BETWEEN 0 AND 60

When you run this Action, it will deploy automatically to the MQTT broker, generate simulated IoT sensor data every 10 seconds, publish real time data to specific MQTT topics, and show sync status in the LoT Notebook interface.
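
Before moving on, you can confirm the simulated values are arriving, either in MQTT Explorer or with a small subscriber sketch like this one (paho-mqtt, same assumptions as the earlier check):

# Subscribe to the simulated raw topics to confirm the Action is publishing.
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, reason_code, properties):
    client.subscribe("raw_data/#")  # matches raw_data/machine1 and raw_data/station2

def on_message(client, userdata, msg):
    print(f"{msg.topic}: {msg.payload.decode()}")

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set("root", "coreflux")
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883)
client.loop_forever()  # expect a new value on each topic every 10 seconds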

Step 4: Creating Data Transformation Models

Defining Data Models with Language of Things

Models in Coreflux are used to transform, aggregate, and compute values from input MQTT topics, publishing the results to new topics. They serve as the foundation for creating the UNS (Unified Namespace) of your system.

A Model allows you to define how raw IoT data should be structured and transformed, both for a single device or for multiple devices simultaneously (through the use of the wildcard +). A model also serves as the key data schema used for storage to the database.

DEFINE MODEL MachineData WITH TOPIC "Simulator/Machine/+/Data" 
    ADD "energy" WITH TOPIC "raw_data/+" AS TRIGGER
    ADD "energy_wh" WITH (energy * 1000)
    ADD "production_status" WITH (IF energy > 5 THEN "active" ELSE "inactive")
    ADD "stoppage" WITH (IF production_status EQUALS "inactive" THEN 1 ELSE 0)
    ADD "maintenance_alert" WITH (IF energy > 50 THEN TRUE ELSE FALSE)
    ADD "timestamp" WITH TIMESTAMP "UTC"

This model uses the + wildcard to apply to all machines automatically, converts energy to Wh, calculates the production status from the energy level, adds a timestamp to every data point, and publishes the structured result to the Simulator/Machine/+/Data topics.

As we generated two simulated sensors with the Action, we can see the Model structure being applied automatically to both, generating both a JSON object and individual topics.
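
For example, the payload published to Simulator/Machine/machine1/Data should look similar to this (values are illustrative):

{
  "energy": 3,
  "energy_wh": 3000,
  "production_status": "inactive",
  "stoppage": 1,
  "maintenance_alert": false,
  "timestamp": "2026-01-07 19:32:50"
}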

Step 5: Setting Up MongoDB Integration

Creating a Database Route

Routes define how processed real time data flows to external systems like databases. They are defined with the following format:

DEFINE ROUTE mongo_route WITH TYPE MONGODB
    ADD MONGODB_CONFIG
        WITH CONNECTION_STRING "mongodb://admin:coreflux_password@mongodb:27017/coreflux_data?authSource=admin"
        WITH DATABASE "coreflux_data"

Note: Replace with your MongoDB information and credentials. If you are running Coreflux outside of Docker or without container linking, replace mongodb in the connection string with localhost or your server IP address.

Updating the Model for Database Storage

Modify your LoT model to use the database route for storage by adding this to the end of the Model:

STORE IN "mongo_route"
    WITH TABLE "MachineProductionData"

Additionally, add a property named device_name so each entry carries a unique device identifier. The corresponding line in the model below extracts the device name from the MQTT topic structure: for example, with topic raw_data/machine1, it extracts machine1 as the device name.

Complete Model with storage:

DEFINE MODEL MachineData WITH TOPIC "Simulator/Machine/+/Data"
    ADD "energy" WITH TOPIC "raw_data/+" AS TRIGGER
    ADD "device_name" WITH REPLACE "+" WITH TOPIC POSITION 2 IN "+"
    ADD "energy_wh" WITH (energy * 1000)
    ADD "production_status" WITH (IF energy > 5 THEN "active" ELSE "inactive")
    ADD "stoppage" WITH (IF production_status EQUALS "inactive" THEN 1 ELSE 0)
    ADD "maintenance_alert" WITH (IF energy > 50 THEN TRUE ELSE FALSE)
    ADD "timestamp" WITH TIMESTAMP "UTC"
    STORE IN "mongo_route"
        WITH TABLE "MachineProductionData"

After you deploy this updated model, each model update is stored in MongoDB automatically.

Step 6: Verifying the Complete IoT Automation Pipeline

Monitoring Real Time Data Flow

  1. MQTT Explorer: Use an MQTT client to verify real time data publication
  2. MongoDB Compass: Connect to verify data storage

Checking Database Storage

Connect to your MongoDB database using MongoDB Compass to verify storage:

  1. Use the connection string:
     mongodb://admin:coreflux_password@localhost:27017/coreflux_data?authSource=admin

  2. Navigate to the coreflux_data database

  3. Check the MachineProductionData collection for stored documents

You should see real time data documents with a structure similar to:

{
  "_id": {
    "$oid": "695eb4e25c3f70134a8452cf"
  },
  "energy": 3,
  "energy_wh": 3000,
  "production_status": "inactive",
  "stoppage": 1,
  "maintenance_alert": false,
  "timestamp": "2026-01-07 19:32:50",
  "device_name": "station2"
}

All of the data is also available in the MQTT Broker for other uses and integrations.
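
If you want to check the collection from code instead of Compass, here is a short pymongo sketch (same assumptions as earlier):

# Read back the most recent stored documents from MongoDB.
from pymongo import MongoClient

client = MongoClient("mongodb://admin:coreflux_password@localhost:27017/?authSource=admin")
collection = client["coreflux_data"]["MachineProductionData"]

# _id encodes insertion time, so sorting on it descending gives the newest documents first
for doc in collection.find().sort("_id", -1).limit(5):
    print(doc["device_name"], doc["energy"], doc["production_status"])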

Step 7: Expand Your Use Case and Integrations

Test LoT Capabilities

  • Publish Sample Data: Use MQTT Explorer, or a script like the sketch after this list, to publish sample datasets to your Coreflux broker. Experiment with different payload structures and different Models and Actions to see how they are processed and stored in MongoDB.
  • Data Validation: Verify that the data in MongoDB matches the payloads you published. Check for consistency and accuracy using MongoDB Compass, ensuring your IoT automation integration is working as expected.
  • Real Time Monitoring: Set up a continuous real time data feed using actual MQTT devices or sensors. Watch how Coreflux and MongoDB handle incoming IoT data streams and explore response times.
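
As a starting point for publishing sample data, a short paho-mqtt sketch (the values and topic are illustrative):

# Publish a few sample energy readings to a raw topic; the MachineData model
# processes them exactly like the simulated Action output.
import time
import paho.mqtt.client as mqtt

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.username_pw_set("root", "coreflux")
client.connect("localhost", 1883)
client.loop_start()      # run the network loop in a background thread
time.sleep(1)            # give the connection a moment to complete

for value in (2.5, 7.1, 55.0):  # the last value should trip maintenance_alert
    client.publish("raw_data/machine1", str(value))
    time.sleep(1)

client.loop_stop()
client.disconnect()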

Build Analytics and Visualizations

  • Create Dashboards: Integrate with visualization tools like Grafana to create dynamic dashboards that display your IoT data, both from the MQTT Broker live data or from the stored data in MongoDB.
  • Trend Analysis: Leverage MongoDB's aggregation framework to analyze trends over time (see the sketch after this list). Look for patterns, spikes, or anomalies in your real time data that could indicate system issues or optimization opportunities.
  • Multi Database Integration: Explore integrating additional databases like PostgreSQL for relational data, MySQL for structured queries, or OpenSearch for advanced analytics. Use Coreflux routes to send data to multiple destinations simultaneously.
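
For the trend analysis idea above, here is a minimal pymongo sketch using the aggregation framework (field names follow the model from Step 5):

# Average energy and total stoppages per device across all stored readings.
from pymongo import MongoClient

client = MongoClient("mongodb://admin:coreflux_password@localhost:27017/?authSource=admin")
collection = client["coreflux_data"]["MachineProductionData"]

pipeline = [
    {"$group": {
        "_id": "$device_name",
        "avg_energy": {"$avg": "$energy"},
        "total_stoppages": {"$sum": "$stoppage"},
    }},
    {"$sort": {"avg_energy": -1}},  # busiest devices first
]
for row in collection.aggregate(pipeline):
    print(row)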

Optimize and Scale Your IoT Infrastructure

  • Load Testing: Simulate high traffic by publishing many messages simultaneously using LoT Notebook or automated scripts. Monitor how your Coreflux MQTT broker and MongoDB instance handle the load and identify any bottlenecks.
  • Production Deployment: For production environments, consider adding resource limits, health checks, and restart policies to your Docker Compose configuration, as in the sketch after this list.
  • Monitoring: Set up monitoring and logging solutions to track system performance, error rates, and data throughput in real time.
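
For the production deployment point, here is one way such hardening might look, layered onto the Step 1 compose file (a sketch only: tune limits and intervals to your environment, and note that newer Compose versions may prefer deploy.resources.limits over mem_limit):

services:
  mongodb:
    restart: unless-stopped
    healthcheck:
      # mongosh ships inside the official mongo image; ping does not require auth
      test: ["CMD", "mongosh", "--quiet", "--eval", "db.adminCommand('ping')"]
      interval: 30s
      timeout: 5s
      retries: 3

  coreflux:
    restart: unless-stopped
    mem_limit: 512m  # illustrative cap; size to your expected message volume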

Troubleshooting

MongoDB Connection Issues

Error: "connection refused on port 27017"

Solution:

  • Verify container is running: docker ps | grep mongodb
  • Check if port is already in use: lsof -i :27017
  • Try restarting the container: docker restart mongodb

Coreflux Broker Issues

Error: "cannot connect to MQTT broker"

Solution:

  • Verify container is running: docker ps | grep coreflux
  • Check broker logs: docker logs coreflux
  • Ensure firewall allows port 1883

LoT Code Not Deploying

Error: Code runs but nothing appears in MongoDB

Solution:

  • Check Route deployment status in LoT Notebook
  • Verify MongoDB connection string uses mongodb (container name) not localhost

Data Not Appearing in MongoDB

  • Wait at least 10 seconds for first Action trigger
  • Verify Model is publishing to correct topic (check MQTT Explorer)
  • Verify Route EVENT configuration matches Model output topic
  • Check MongoDB credentials in connection string

Conclusion

Integrating Coreflux MQTT broker with MongoDB provides a powerful solution for real time IoT data processing and scalable storage. Following this tutorial, you have set up a seamless automation pipeline that allows you to collect, process, and store IoT data efficiently using low code development practices.

With Coreflux's scalable architecture and MongoDB's robust document storage capabilities, you can handle large volumes of real time data and gain valuable insights instantly. Whether you are monitoring industrial systems, tracking environmental sensors, or managing smart infrastructure, this IoT automation integration empowers you to make data driven decisions quickly and effectively.

The Language of Things (LoT) approach creates an IoT ready foundation that scales with your needs. Your MQTT broker deployment is now ready for production workloads and can be extended to support PostgreSQL, MySQL, OpenSearch, and other database technologies as your requirements evolve.

What Next?

You have a working pipeline. Now what?

  1. Connect real devices: Replace the simulated data action with actual MQTT devices or use Routes to bridge existing systems
  2. Build dashboards: Connect Grafana to MongoDB for real time visualization
  3. Scale up: Add more machines and watch the wildcard models handle them automatically
  4. Explore other databases: Coreflux also integrates with PostgreSQL, MySQL, and OpenSearch

Have questions? Use the Ask AI feature at docs.coreflux.org or drop a comment below.
