Overview
SSL (Secure Sockets Layer) is a protocol used to establish encrypted connections, while Kerberos is a protocol for secure authentication between network services. Kafka can be configured to use SSL and Kerberos both for communication between Kafka brokers and producers/consumers and for inter-broker communication. This article describes how to configure SSL and Kerberos for Kafka in a BigInsights IOP cluster. This feature is supported in IOP 4.2 and later.
SSL and Kerberos are supported only with the new Kafka producer and consumer APIs. The new producer API was added in Kafka 0.8 and the new consumer API was added in Kafka 0.9. The old Scala-based producer and consumer APIs will be deprecated in a future release of Kafka in favor of the new Java-based APIs. For more information, see the Apache Kafka 0.9 documentation.
Set Up SSL for Kafka Brokers
The steps below show how to configure SSL for Kafka brokers and clients on a two-node cluster. The two nodes will be referred to as brokerhost1 and brokerhost2. Before running the commands below, replace <broker-hostname> with the fully qualified domain name of the broker where the command is run.
On all nodes, create the directory /etc/kafka/conf/security, where the certificates will be stored.
mkdir /etc/kafka/conf/security
cd /etc/kafka/conf/security
Generate an SSL key and certificate on each node by running the following command on brokerhost1 and brokerhost2. This generates kafka.server.keystore.jks.
keytool -keystore kafka.server.keystore.jks -alias <broker-hostname> -validity 365 -genkey
Enter keystore password:
Re-enter new password:
What is your first and last name?
  [Unknown]:  hostname.abc.com
What is the name of your organizational unit?
  [Unknown]:  iop
What is the name of your organization?
  [Unknown]:  ibm
What is the name of your City or Locality?
  [Unknown]:  san jose
What is the name of your State or Province?
  [Unknown]:  california
What is the two-letter country code for this unit?
  [Unknown]:  US
Is CN=hostname.abc.com, OU=iop, O=ibm, L=san jose, ST=california, C=US correct?
  [no]:  yes
Enter key password for <broker-hostname> (RETURN if same as keystore password):
Create a certificate authority (CA) used to sign certificates by running the following command on brokerhost1. The command prompts for a password for the CA and generates the files ca-cert and ca-key.
openssl req -new -x509 -keyout ca-key -out ca-cert -days 365
Copy ca-cert to brokerhost2.
Import the CA's certificate into each broker's truststore. Run the following command on brokerhost1 and brokerhost2 to generate the broker's truststore, kafka.server.truststore.jks.
keytool -keystore kafka.server.truststore.jks -alias CARoot -import -file ca-cert
Export the certificate from the keystore for each broker by running the following command on brokerhost1 and brokerhost2. This generates <cert-file-brokerhost> (make sure the name is unique per broker).
keytool -keystore kafka.server.keystore.jks -alias <broker-hostname> -certreq -file <cert-file-brokerhost>
Copy <cert-file-brokerhost2> from brokerhost2 to brokerhost1, the node where the CA is located.
Sign the certificates with the CA by running the following command on brokerhost1 for each broker's <cert-file-brokerhost>. Replace <ca-password> with the CA password entered previously. This generates <cert-signed-brokerhost> (make sure the output file name is unique per broker).
openssl x509 -req -CA ca-cert -CAkey ca-key -in <cert-file-brokerhost> -out <cert-signed-brokerhost> -days 365 -CAcreateserial -passin pass:<ca-password>
Copy the signed certificate <cert-signed-brokerhost> from brokerhost1 to brokerhost2.
Import the CA’s certificate into each node’s keystore by running the following command on brokerhost1 and brokerhost2.
keytool -keystore kafka.server.keystore.jks -alias CARoot -import -file ca-cert
Import the signed certificate into each broker's keystore by running the following command on brokerhost1 and brokerhost2 (<cert-signed-brokerhost> is unique per broker).
keytool -keystore kafka.server.keystore.jks -alias <broker-hostname> -import -file <cert-signed-brokerhost>
Set Up SSL for Kafka Clients (Producers and Consumers)
If the Kafka brokers are configured to require client authentication by setting ssl.client.auth to required or requested, you must create a client keystore. Run the following command on each client node where producers and consumers will run, replacing <client-hostname> with the node's fully qualified domain name. This generates kafka.client.keystore.jks.
keytool -keystore kafka.client.keystore.jks -alias <client-hostname> -validity 365 -genkey
Enter keystore password:
Re-enter new password:
What is your first and last name?
  [Unknown]:  hostname.abc.com
What is the name of your organizational unit?
  [Unknown]:  iop
What is the name of your organization?
  [Unknown]:  ibm
What is the name of your City or Locality?
  [Unknown]:  san jose
What is the name of your State or Province?
  [Unknown]:  california
What is the two-letter country code for this unit?
  [Unknown]:  US
Is CN=client-hostname.abc.com, OU=iop, O=ibm, L=san jose, ST=california, C=US correct?
  [no]:  yes
Enter key password for <client-hostname> (RETURN if same as keystore password):
Create a truststore on each client node by running the following command. This generates kafka.client.truststore.jks.
keytool -keystore kafka.client.truststore.jks -alias CARoot -import -file ca-cert
Similar to the steps for the Kafka brokers, export each client’s certificate from its keystore and use the same CA to sign it. Finally, import the CA’s certificate and the client’s signed certificate into each client’s keystore.
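As a sketch, the client-side commands mirror the broker steps above. The file names client-cert-file and client-cert-signed are illustrative; replace <client-hostname> and <ca-password> as before. The signing step must run on brokerhost1, where ca-cert and ca-key reside.

```shell
cd /etc/kafka/conf/security

# Export a certificate signing request from the client keystore
keytool -keystore kafka.client.keystore.jks -alias <client-hostname> \
  -certreq -file client-cert-file

# Copy client-cert-file to brokerhost1 and sign it with the same CA
openssl x509 -req -CA ca-cert -CAkey ca-key -in client-cert-file \
  -out client-cert-signed -days 365 -CAcreateserial -passin pass:<ca-password>

# Copy client-cert-signed back to the client node, then import the CA's
# certificate and the signed certificate into the client keystore
keytool -keystore kafka.client.keystore.jks -alias CARoot -import -file ca-cert
keytool -keystore kafka.client.keystore.jks -alias <client-hostname> \
  -import -file client-cert-signed
```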
Enable SSL
In the Ambari UI, under the “Advanced kafka-broker” section, update the protocol in the listeners property from PLAINTEXT to SSL. Note that the hostname in the listeners property should remain localhost; when Kafka is started from Ambari, “localhost” is replaced with the actual hostname the broker is running on.
listeners=SSL://localhost:6667
Add the following SSL properties in “Custom kafka-broker”.
ssl.keystore.location=/etc/kafka/conf/security/kafka.server.keystore.jks
ssl.keystore.password=bigdata
ssl.key.password=bigdata
ssl.truststore.location=/etc/kafka/conf/security/kafka.server.truststore.jks
ssl.truststore.password=bigdata
ssl.client.auth=required
security.inter.broker.protocol=SSL
If security.inter.broker.protocol is not set, inter-broker communication will use PLAINTEXT. In this case, the listeners property must be updated to support both the PLAINTEXT and SSL protocols.
listeners=PLAINTEXT://localhost:6667,SSL://localhost:6668
Note: There is an issue with starting Kafka after enabling SSL in IOP 4.2. Please refer to this technote on how to resolve the problem: Kafka Fails to Start if SSL is Enabled.
Restart the Kafka service from Ambari for the changes to take effect.
Verify that Kafka has started with the SSL endpoint, indicated by the following message in /var/log/kafka/server.log on any Kafka broker node.
INFO Registered broker 1001 at path /brokers/ids/1001 with addresses: SSL -> EndPoint(hostname.abc.com,6667,SSL) (kafka.utils.ZkUtils)
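A quick way to check for this message is to search the broker log directly (log path as given above; the broker id will vary per node):

```shell
# Look for the SSL endpoint registration on a broker node
grep "Registered broker" /var/log/kafka/server.log
```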
If Kerberos was enabled prior to SSL, refer to the section SSL+Kerberos on how to support both protocols.
Configuring SSL for Broker-Client Communication
For clients to communicate with Kafka brokers using SSL, the following properties need to be set.
If using the console producer or consumer, create a configuration file /usr/iop/current/kafka-broker/conf/client-ssl.properties with the following properties:
security.protocol=SSL
ssl.truststore.location=/etc/kafka/conf/security/kafka.client.truststore.jks
ssl.truststore.password=bigdata
ssl.keystore.location=/etc/kafka/conf/security/kafka.client.keystore.jks
ssl.keystore.password=bigdata
ssl.key.password=bigdata
If running a Java producer or consumer, set the following properties in the Java program:
Properties props = new Properties();
props.put("security.protocol", "SSL");
props.put("ssl.truststore.location", "/etc/kafka/conf/security/kafka.client.truststore.jks");
props.put("ssl.truststore.password", "bigdata");
props.put("ssl.keystore.location", "/etc/kafka/conf/security/kafka.client.keystore.jks");
props.put("ssl.keystore.password", "bigdata");
props.put("ssl.key.password", "bigdata");
...
To verify that the console producer and consumer are working, run the following commands. The client configuration file must be passed in as a parameter.
# Create the topic
/usr/iop/current/kafka-broker/bin/kafka-topics.sh --create --zookeeper hostname1.abc.com:2181,hostname2.abc.com:2181 --replication-factor 2 --partitions 2 --topic mytopic

# Start the console producer
/usr/iop/current/kafka-broker/bin/kafka-console-producer.sh --broker-list hostname1.abc.com:6667,hostname2.abc.com:6668 --topic mytopic --producer.config /usr/iop/current/kafka-broker/conf/client-ssl.properties

# Start the console consumer
/usr/iop/current/kafka-broker/bin/kafka-console-consumer.sh --zookeeper hostname1.abc.com:2181,hostname2.abc.com:2181 --topic mytopic --from-beginning --new-consumer --bootstrap-server hostname1.abc.com:6667,hostname2.abc.com:6668 --consumer.config /usr/iop/current/kafka-broker/conf/client-ssl.properties
Enable Kerberos
To enable Kerberos for Kafka, follow the steps from the IBM Knowledge Center: Setting up Kerberos for IBM Open Platform.
The Kerberos wizard sets the correct values for all required properties if SSL was not previously enabled. Note that the listeners property in the Ambari UI will continue to show PLAINTEXT://localhost:6667 after Kerberos is enabled; view /etc/kafka/conf/server.properties for the actual values.
authorizer.class.name=kafka.security.auth.SimpleAclAuthorizer
listeners=SASL_PLAINTEXT://hostname.abc.com:6667
principal.to.local.class=kafka.security.auth.KerberosPrincipalToLocal
super.users=User:kafka
security.inter.broker.protocol=SASL_PLAINTEXT
For the console command-line producers and consumers, it is recommended to run kinit to authenticate.
kinit -kt /etc/security/keytabs/kafka.service.keytab kafka/hostname.abc.com
For long-running processes such as Java producers and consumers, it is recommended to use keytabs to authenticate. Update the KafkaClient section in /etc/kafka/conf/kafka_jaas.conf on the Kafka client nodes to include the following properties:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useTicketCache=true
  renewTicket=true
  serviceName="kafka"
  useKeyTab=true
  keyTab="/etc/security/keytabs/kafka.service.keytab"
  principal="kafka/hostname.abc.com@ABC.COM";
};
The system property java.security.auth.login.config is automatically set for the console producers and consumers after Kerberos is enabled in IOP. For Java producers and consumers, set this system property to the location of the kafka_jaas.conf file. If the Java producers and consumers run outside the IOP cluster, copy this file to the machine where the Java program will be run.
System.setProperty("java.security.auth.login.config","/etc/kafka/conf/kafka_jaas.conf");
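Alternatively, the property can be supplied on the JVM command line when launching the client, instead of calling System.setProperty in code. The jar and class names below are illustrative:

```shell
# Pass the JAAS configuration to the JVM at startup
java -Djava.security.auth.login.config=/etc/kafka/conf/kafka_jaas.conf \
     -cp my-kafka-client.jar com.example.MyProducer
```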
SSL+Kerberos
If SSL was enabled before Kerberos, or Kerberos before SSL, Kafka will fail to start because the listeners property is not automatically set to the correct value. To support both SSL and Kerberos, the following properties must be manually updated in the Ambari UI to use the SASL_SSL protocol:
listeners=SASL_SSL://localhost:6667
security.inter.broker.protocol=SASL_SSL
The following property in the client configuration will also need to be updated to support both protocols.
security.protocol=SASL_SSL
Restart the Kafka service from Ambari for the changes to take effect.
Disable Security
To disable SSL, revert to the original settings in Ambari: delete all the newly added properties mentioned above and set security.inter.broker.protocol to PLAINTEXT, or to SASL_PLAINTEXT if Kerberos is still enabled.
To disable Kerberos for Kafka, set security.inter.broker.protocol to PLAINTEXT, or to SSL if SSL is still enabled.
Finally, update the listeners property to use the same protocol listed in security.inter.broker.protocol, for example PLAINTEXT://localhost:6667 if neither SSL nor Kerberos is enabled. Restart the Kafka service from Ambari.
Additional Information
This article provided a simple example of setting up SSL and enabling Kerberos for Kafka in IOP. Only one protocol was configured in the listeners property, but the property accepts a comma-separated list of values. From the Apache Kafka documentation for the listeners property:
“Listener List – Comma-separated list of URIs we will listen on and their protocols. Specify hostname as 0.0.0.0 to bind to all interfaces. Leave hostname empty to bind to default interface. Examples of legal listener lists: PLAINTEXT://myhost:9092,TRACE://:9091
PLAINTEXT://0.0.0.0:9092,TRACE://localhost:9093”
For more information about Kafka SSL, see the Apache Kafka documentation for Encryption and Authentication using SSL.
For more information about Kafka Kerberos, see the Apache Kafka documentation for Authenticating using SASL.