Kafka Connectors have been deprecated and will be removed on October 1st, 2024. Please refer to the deprecation notice for more information.

The MongoDB Sink Connector allows you to continuously write the data that appears in your Kafka topics to a MongoDB database. In this guide, we will walk you through connecting a MongoDB Sink Connector to your Upstash Kafka cluster.

Get Started

Create a Kafka Cluster

If you do not have a Kafka cluster and/or topic already, follow these steps to create one.

Create the Connector

Go to the Connectors tab, and create your first connector by clicking the New Connector button.

Choose MongoDB Connector Sink

Enter a connector name and the MongoDB URI (connection string). Select one or more of your existing topics to read data from.
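If you are unsure whether your connection string is well-formed, a quick sanity check can help before pasting it into the form. The sketch below uses Python's standard library to parse a hypothetical URI; the user, password, and host are placeholders you would replace with your own MongoDB deployment's values.

```python
from urllib.parse import urlparse

# A hypothetical MongoDB connection string; replace the user, password,
# and host with the values from your own MongoDB deployment.
uri = "mongodb+srv://myuser:mypassword@cluster0.example.mongodb.net/?retryWrites=true&w=majority"

parsed = urlparse(uri)

# The connector expects a URI using the mongodb:// or mongodb+srv:// scheme.
assert parsed.scheme in ("mongodb", "mongodb+srv")
print(parsed.hostname)
```

A URI missing the scheme or the `://` separator will fail this check, which usually also means the connector would reject it.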

Enter the Database and Collection that the selected topics will be written into. We entered “new” as the Database and “test” as the Collection. The database and collection do not need to exist in MongoDB beforehand; they will be created automatically.
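To illustrate what ends up in that collection, the sketch below shows a hypothetical JSON record value produced to one of the selected topics and the document shape the sink would store for it. The record contents are made up for illustration; the exact document structure can also depend on the converter settings you choose.

```python
import json

# A hypothetical record value produced to one of the selected topics.
record_value = '{"name": "Alice", "visits": 3}'

# The sink connector stores the record value as a document in the
# "new" database's "test" collection, roughly like this:
document = json.loads(record_value)
print(document)  # {'name': 'Alice', 'visits': 3}
```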

The advanced screen is for any other configuration options that the selected connector supports. At the top of this screen, you can find a link to the related documentation. We can proceed with the defaults and click the Connect button directly.
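For reference, the settings entered on the previous screens roughly correspond to a Kafka Connect configuration like the sketch below. The property names follow the MongoDB Kafka Connector's sink configuration; the topic name and connection string values here are hypothetical placeholders.

```json
{
  "connector.class": "com.mongodb.kafka.connect.MongoSinkConnector",
  "connection.uri": "mongodb+srv://myuser:mypassword@cluster0.example.mongodb.net",
  "database": "new",
  "collection": "test",
  "topics": "mytopic"
}
```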

Congratulations! You have created your MongoDB Sink Connector. As you produce data to the selected topics, it will be written to your MongoDB database.