Service Accounts¶
All Confluent Cloud connectors require credentials to operate and access Kafka. You can either create and use a Kafka API key and secret or use a service account API key and secret. This section provides the steps to create a service account and an API key and secret.
You create and manage service accounts using the Confluent Cloud CLI. See Service Accounts for Confluent Cloud for detailed information about service accounts.
Note
The service account is separate from the cloud provider platform service account that may be required for your connector to access cloud platform resources. For example, a sink connector sending data to a Google Cloud bucket requires both a service account for Kafka and a service account to access the bucket.
Example: Configuring a service account¶
The following examples show how to set up a service account. These steps can be used for a cluster running on any cloud provider.
Note
Sink connectors autogenerate internal topics when launched. You must set access control list (ACL) READ and WRITE access for these topics.
Sink connector service account¶
This example assumes the following:
- You have a Kafka cluster with cluster ID lkc-gqgvx.
- You want the sink connector to read from a topic named pageviews.
Use the following example steps to create a service account, add an API key and secret, and set ACLs on topics.
Note
The following steps show basic ACL entries for sink connector service accounts. Make sure to review the Sink connector success and error topics section for additional ACL entries that may be required for certain connectors.
Create a service account named myserviceaccount:
ccloud service-account create myserviceaccount --description "test service account"
Find the service account ID for myserviceaccount:
ccloud service-account list
Set a DESCRIBE ACL to the cluster:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "DESCRIBE" --cluster-scope
Set a READ ACL to pageviews:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "READ" --topic "pageviews"
Set a CREATE ACL to the following topic prefix:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "dlq-lcc"
Set a WRITE ACL to the following topic prefix:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "dlq-lcc"
Set a READ ACL to a consumer group with the following prefix:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "READ" --prefix --consumer-group "connect-lcc"
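These prefix ACLs cover resources that Confluent Cloud names after the connector ID once the connector is provisioned. For example, assuming a connector with the hypothetical ID lcc-a1b2c3, the entries above cover resources such as:
dlq-lcc-a1b2c3 (the connector's dead letter queue topic)
connect-lcc-a1b2c3 (the consumer group the sink connector uses to read from pageviews)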
Create a Kafka API key and secret for <service-account-id>:
ccloud api-key create --resource "lkc-gqgvx" --service-account "<service-account-id>"
Save the API key and secret. You need this to configure your client applications. This is the only time you can get this information.
Important
Client applications that connect to the cluster must have at least the following three parameters configured:
- API key: available when you create the API key/secret pair
- API secret: available only when you create the API key/secret pair
- bootstrap.servers: set to the Endpoint value shown in the output of ccloud kafka cluster describe
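For reference, a client configuration that uses these values typically looks like the following sketch. The SASL/PLAIN settings shown are the standard mechanism Kafka clients use to authenticate to Confluent Cloud with an API key and secret; the endpoint and credential values are placeholders:
bootstrap.servers=<Endpoint from ccloud kafka cluster describe>
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required username="<api-key>" password="<api-secret>";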
Source connector service account¶
This example assumes the following:
- You have a Kafka cluster with cluster ID lkc-gqgvx.
- You want the source connector to write to a topic named passengers.
Use the following example steps to create a service account and a service account API key and secret.
Note
The following steps show basic ACL entries for source connector service accounts. Make sure to review the Debezium source connectors section and the JDBC-based source connectors and the MongoDB Atlas source connector section for additional ACL entries that may be required for certain connectors.
Create a service account named myserviceaccount:
ccloud service-account create myserviceaccount --description "test service account"
Find the service account ID for myserviceaccount:
ccloud service-account list
Set a DESCRIBE ACL to the cluster:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "DESCRIBE" --cluster-scope
Set a WRITE ACL to passengers:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --topic "passengers"
Create a Kafka API key and secret for <service-account-id>:
ccloud api-key create --resource "lkc-gqgvx" --service-account "<service-account-id>"
Save the API key and secret. You need this to configure your client applications. This is the only time you can get this information.
Important
Client applications that connect to the cluster must have at least the following three parameters configured:
- API key: available when you create the API key/secret pair
- API secret: available only when you create the API key/secret pair
- bootstrap.servers: set to the Endpoint value shown in the output of ccloud kafka cluster describe
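If you supply the service account API key to a managed connector rather than a client application, the key and secret typically appear in the connector configuration instead. The following is an illustrative fragment; the kafka.api.key and kafka.api.secret property names follow the Confluent Cloud managed connector convention, and the values are placeholders:
"kafka.api.key": "<api-key>",
"kafka.api.secret": "<api-secret>"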
For the CLI command reference, see ccloud kafka acl create.
Additional ACL entries¶
Certain connectors require additional ACL entries.
Debezium source connectors¶
The Source connector service account section provides basic ACL entries for source connector service accounts. Debezium source connectors require additional ACL entries. Add the following ACL entries for Debezium source connectors:
ACLs to create and write to table-related topics prefixed with <database.server.name>. Use the following commands to set these ACLs:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "CREATE" --prefix --topic "<database.server.name>"
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "WRITE" --prefix --topic "<database.server.name>"
ACLs to describe configurations at the cluster scope. Use the following commands to set these ACLs:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --cluster-scope --operation "DESCRIBE"
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --cluster-scope --operation "DESCRIBE-CONFIGS"
The Debezium MySQL CDC Source and the Debezium Microsoft SQL Source connectors require the following additional ACL entries:
ACLs to create and write to database history topics prefixed with dbhistory.<database.server.name>.lcc-. For example, the server name is cdc in the configuration property "database.server.name": "cdc". Use the following commands to set these ACLs:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "CREATE" --prefix --topic "dbhistory.<database.server.name>.lcc-"
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "WRITE" --prefix --topic "dbhistory.<database.server.name>.lcc-"
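As a concrete illustration, with "database.server.name": "cdc" the placeholder resolves so the commands become:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "CREATE" --prefix --topic "dbhistory.cdc.lcc-"
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "WRITE" --prefix --topic "dbhistory.cdc.lcc-"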
ACLs to read the database history consumer group named <database.server.name>-dbhistory. For example, the server name is cdc in the configuration property "database.server.name": "cdc". Use the following command to set this ACL:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "READ" --consumer-group "<database.server.name>-dbhistory"
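Continuing the cdc example, the consumer group ACL becomes:
ccloud kafka acl create --allow --service-account "<service-account-id>" \
  --operation "READ" --consumer-group "cdc-dbhistory"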
JDBC-based source connectors and the MongoDB Atlas source connector¶
The Source connector service account section provides basic ACL entries for source connector service accounts. Several source connectors allow a topic prefix. When a prefix is used, you must add additional ACL entries. The following source connectors require additional ACL entries if a prefix is used.
- MySQL Source (JDBC) Connector for Confluent Cloud
- PostgreSQL Source (JDBC) Connector for Confluent Cloud
- Microsoft SQL Server Source (JDBC) Connector for Confluent Cloud
- Oracle Database Source Connector for Confluent Cloud
- MongoDB Atlas Source Connector for Confluent Cloud
Add the following ACL entries for these source connectors:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "<topic.prefix>"
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "<topic.prefix>"
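For example, if the connector configuration sets a hypothetical prefix "topic.prefix": "mysql-01-", the commands become:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "mysql-01-"
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "mysql-01-"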
Sink connector success and error topics¶
The Sink connector service account section provides basic ACL entries for sink connector service accounts. Several sink connectors create additional success-lcc and error-lcc topics when the connector is launched. The following sink connectors create these topics and require additional ACL entries:
- AWS Lambda Sink Connector for Confluent Cloud
- Azure Functions Sink Connector for Confluent Cloud
- Google Cloud Functions Sink Connector for Confluent Cloud
- Datadog Metrics Sink for Confluent Cloud
Add the following ACL entries for these sink connectors:
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "success-lcc"
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "success-lcc"
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "CREATE" --prefix --topic "error-lcc"
ccloud kafka acl create --allow --service-account "<service-account-id>" --operation "WRITE" --prefix --topic "error-lcc"
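After adding the entries, you can verify them by listing the ACLs bound to the service account (assuming your ccloud CLI version supports filtering by service account):
ccloud kafka acl list --service-account "<service-account-id>"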