Up to now we’ve been using Kafka with JAAS (plain login/password) in the following way:
- yarnship the JAAS file
- add the JAAS file name into “flink-conf.yaml” using the “env.java.opts” property
How can we support multiple secured Kafka 0.10 consumers and producers (with different logins and passwords, of course)?
From what I saw in the Kafka sources, the entry name “KafkaClient” is hardcoded…
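To illustrate, our current (single-credential) setup looks roughly like this; the file name, path, and credentials below are placeholders, and the “KafkaClient” entry name is the fixed section that Kafka’s clients look up:

```
// kafka-jaas.conf (shipped via yarnship)
KafkaClient {
  org.apache.kafka.common.security.plain.PlainLoginModule required
  username="some-user"
  password="some-password";
};
```

with “flink-conf.yaml” pointing the JVM at it:

```yaml
env.java.opts: -Djava.security.auth.login.config=/path/to/kafka-jaas.conf
```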
I'm not a Kafka expert, but if something is hardcoded that shouldn't be, it might be worth opening an issue for it. I'm looping in somebody who might know more about your problem.
On 26/04/17 at 14:47, Gwenhael Pasquiers wrote:
Sorry for the very long delayed response on this.
As you noticed, the “KafkaClient” entry name seems to be a hardcoded thing on the Kafka side, so currently I don’t think what you’re asking for is possible.
It seems like this could be made possible with some of the new authentication features in Kafka 0.10 that seem related.
I’m not that deep into the authentication modules, but I’ll take a look and keep you posted on this.
Also looping in Eron (in CC) who could perhaps provide more insight on this at the same time.
On 26 April 2017 at 8:48:20 PM, Gwenhael Pasquiers ([hidden email]) wrote:
Follow-up for this:
It turns out that what you require is already available with Kafka 0.10, using dynamic JAAS configurations instead of a static JAAS file like what you’re currently doing.
The main thing to do is to set a “sasl.jaas.config” in the config properties for your individual Kafka consumer / producer.
This will override any static JAAS configuration used.
Note two things here: 1) a static JAAS configuration is a JVM process-wide installation, meaning that separate Kafka clients within the same process can only ever share the same credentials, and 2) “KafkaClient” is a fixed JAAS lookup section name used by the Kafka clients, which I don’t think is modifiable. So the static config approach would never work for this.
An example “sasl.jaas.config” value for plain logins:
org.apache.kafka.common.security.plain.PlainLoginModule required username="xxxx" password="yyyy";
Simply use different values for each of the Kafka consumers / producers you’re using.
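A minimal sketch of this in Java, building per-client properties with only the standard library (the helper name, credentials, and the SASL_PLAINTEXT/PLAIN settings are illustrative placeholders; each Properties object would then be passed to its own FlinkKafkaConsumer010 / FlinkKafkaProducer010 instance):

```java
import java.util.Properties;

public class PerClientJaas {
    // Hypothetical helper: builds client properties carrying a per-client
    // dynamic JAAS configuration (Kafka 0.10+). "sasl.jaas.config" set here
    // overrides any static, JVM-wide JAAS file for this client only.
    static Properties kafkaProps(String username, String password) {
        Properties props = new Properties();
        props.setProperty("security.protocol", "SASL_PLAINTEXT");
        props.setProperty("sasl.mechanism", "PLAIN");
        props.setProperty("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"" + username + "\" password=\"" + password + "\";");
        return props;
    }

    public static void main(String[] args) {
        // Two clients in the same JVM, each with its own credentials —
        // something a single static JAAS file cannot express.
        Properties consumerProps = kafkaProps("consumer-user", "consumer-secret");
        Properties producerProps = kafkaProps("producer-user", "producer-secret");
        System.out.println(consumerProps.getProperty("sasl.jaas.config"));
        System.out.println(producerProps.getProperty("sasl.jaas.config"));
    }
}
```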
On 8 May 2017 at 4:42:07 PM, Tzu-Li (Gordon) Tai ([hidden email]) wrote:
Gordon's suggestion seems like a good way to provide per-job credentials based on application-specific properties. In contrast, Flink's built-in JAAS features are aimed at making the Flink cluster's Kerberos credentials available to jobs.
I want to reiterate that all jobs (for a given Flink cluster) run with full privilege in the JVM, and that Flink does not guarantee isolation (of security information) between jobs. My suggestion is to not run untrusted job code in a shared Flink cluster.
On Wed, May 24, 2017 at 8:46 PM, Tzu-Li (Gordon) Tai <[hidden email]> wrote: