A hands-on demo showing how to download, install, configure and use the ScyllaDB CDC Source Connector.
So let’s go straight into the demo. In the previous demo, we read data from Kafka and inserted it into ScyllaDB. In this demo, the opposite happens: the Connector reads data from ScyllaDB and inserts it into Kafka.
Similarly to the Sink Connector, we first have to install the Source Connector. The Source Connector is also open source; you can visit the repository to learn more about it. And similarly, the easiest way to install it is to use the Confluent Hub tool: it’s as easy as invoking confluent-hub install and specifying the name of the Connector.
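For reference, that step looks roughly like this; the Hub coordinates below, scylladb/scylla-cdc-source-connector, are an assumption based on the connector’s listing, so check Confluent Hub for the exact name and current version:

    # Install the ScyllaDB CDC Source Connector into the Kafka Connect plugin path
    confluent-hub install scylladb/scylla-cdc-source-connector:latest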
Since we have installed a new Connector, we have to restart the Kafka Connect cluster. I have stopped it and started it back up.
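If you are running the local Confluent Platform quickstart, the restart might look like this (the exact subcommands vary between Confluent CLI versions, so treat this as a sketch):

    # Restart Kafka Connect so it picks up the newly installed plugin
    confluent local services connect stop
    confluent local services connect start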
While it’s starting up, let’s go back to our local ScyllaDB cluster and create a table that will be the source of data for the CDC Source Connector. As you can see, we have enabled the CDC functionality on this table. Let’s also insert some data into the table, so we will see it later as it’s processed by the Connector.
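A minimal sketch of this step: the keyspace, table name, and schema below are hypothetical placeholders, while WITH cdc = {'enabled': true} is the ScyllaDB syntax for turning on CDC. The inserted values match the rows we will see in the Kafka topic at the end of the demo:

    cqlsh 127.0.0.1 -e "
      CREATE TABLE ks.t (pk int, ck int, v int, PRIMARY KEY (pk, ck))
        WITH cdc = {'enabled': true};
      INSERT INTO ks.t (pk, ck, v) VALUES (4, 5, 6);
      INSERT INTO ks.t (pk, ck, v) VALUES (7, 8, 9);
      INSERT INTO ks.t (pk, ck, v) VALUES (10, 11, 12);
    "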
The Kafka Connect cluster has started, so we can go back to the Confluent Control Center. As you can see, a new Connector has appeared: the Source Connector for ScyllaDB. After we click OK, we can configure the Connector.
Similarly to our previous example, we’ll use the JSON converter, and, similarly, we’ll disable the schemas to make the messages a bit smaller, using the key.converter.schemas.enable configuration.
Next, in the ScyllaDB category, we specify the host, that is, the IP address and port of ScyllaDB. Then, in the events section, we have to specify the table name, so we’ll use the table we have just created. We also have to specify the namespace. The namespace allows you to logically partition your data and name your different deployments; we’ll call ours the ScyllaDBU deployment. And that’s all.
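Put together, the equivalent configuration submitted through the standard Kafka Connect REST API would look roughly like this. The connector class and the scylla.* property names follow the connector repository’s documentation as I understand it, and the connector name, address, and table are placeholders from this demo:

    curl -X POST -H "Content-Type: application/json" http://localhost:8083/connectors -d '
    {
      "name": "ScyllaDBUSource",
      "config": {
        "connector.class": "com.scylladb.cdc.debezium.connector.ScyllaConnector",
        "key.converter": "org.apache.kafka.connect.json.JsonConverter",
        "value.converter": "org.apache.kafka.connect.json.JsonConverter",
        "key.converter.schemas.enable": "false",
        "value.converter.schemas.enable": "false",
        "scylla.cluster.ip.addresses": "127.0.0.1:9042",
        "scylla.table.names": "ks.t",
        "scylla.name": "ScyllaDBU"
      }
    }'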
So, Piotr, two things: one, we have five minutes left, and two, there are two questions. The first one, I think, is easier to answer: can we hook up a custom converter?
Looking at the question: yes, you could use a custom converter.
Okay, and the second one is regarding Confluent Cloud.
So, currently it’s not available as a one-click installation on Confluent Cloud, but we’ll be working with Confluent to hopefully have it enabled. You can, however, use it with Confluent Cloud if you have your own Kafka Connect deployment. And just to finish up the presentation:
If we look at the topic that was created by the Source Connector, you can now see the three messages that were created. They contain the data we have inserted: the rows with 10, 11, 12; 7, 8, 9; and, above that, 4, 5, 6.
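To inspect those messages outside Control Center, you could also consume the topic from the command line. The topic name here assumes the Debezium-style convention of <scylla.name>.<keyspace>.<table>, using the placeholder names from this demo:

    # Read the CDC messages produced by the Source Connector
    kafka-console-consumer --bootstrap-server localhost:9092 \
      --topic ScyllaDBU.ks.t --from-beginning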