Kafka Supplier and Consumer need to provide separate Kafka configuration properties #557
Comments
Isn't that what is supposed to be configured at the binder level?
The binder has no authentication properties, but it tries to use the security properties that were set for the source application.
So there could be a problem in the binder code with what it picks up.
Right. So it feels like the fix has to be done in the Kafka Binder itself.
Let's add @sobychacko, since I'm not fully up to speed on how the Kafka Binder works.
I will have a look at this today.
@corneil The issue is that we have a Kafka supplier on one side consuming from the first Kafka instance, and an output binding that publishes to the other Kafka instance. However, you only provide the Spring Boot properties for the whole application, and those are used by the auto-configuration on the supplier side. We think you need to override the following properties on the output binding to override the values used by the supplier.
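A sketch of the kind of binder-level override described above, assuming a SCRAM-secured external cluster and an unauthenticated internal cluster (broker addresses are illustrative, not from the issue):

```properties
# Illustrative: point the binding's binder at the internal, unauthenticated
# Kafka instead of inheriting the boot-level spring.kafka.* settings.
spring.cloud.stream.kafka.binder.brokers=internal-kafka:9092
spring.cloud.stream.kafka.binder.configuration.security.protocol=PLAINTEXT

# The supplier keeps the boot-level properties for the external cluster:
spring.kafka.bootstrap-servers=external-kafka:9092
spring.kafka.security.protocol=SASL_PLAINTEXT
spring.kafka.properties.sasl.mechanism=SCRAM-SHA-256
```

The key point is that `spring.cloud.stream.kafka.binder.configuration.*` properties take precedence over the boot-level `spring.kafka.*` properties for the binding, so the two clusters can be configured independently.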
Adding these properties worked. I believe we need to change the configuration of the binders to use this mechanism by default, to avoid issues for other customer stream apps that may rely on the default Spring Kafka configuration. A similar change is needed for RabbitMQ.
In what version(s) of Spring Functions Catalog are you seeing this issue?
5.0.0
Describe the bug
When kafka-supplier is used as kafka-source-kafka, the properties provided to configure access to the 'external' Kafka cluster are also applied to the 'internal' Kafka cluster that the stream apps' inputs and outputs are bound to.
To Reproduce
Deploy a simple SCDF installation with a single-container Kafka that doesn't require authentication, in the same namespace as SCDF.
Deploy another Kafka cluster using the Bitnami Helm chart.
Configure a stream
kafka | log
with the following properties:
Expected behavior
A message created on topic ABC should be written to log.
The behaviour found is that the topic .kafka cannot be written to; the following error occurs:
Unexpected handshake request with client mechanism SCRAM-SHA-256, enabled mechanisms are []
**Required change**
The properties for configuring the Kafka instance should be prefixed with kafka.supplier or kafka.consumer, as follows:
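A hypothetical sketch of the proposed prefixing; the property names after the prefix are illustrative, mirroring standard `spring.kafka.*` property names, and are not confirmed by the issue:

```properties
# Proposed (illustrative): properties for the 'external' cluster the supplier reads from
kafka.supplier.bootstrap-servers=external-kafka:9092
kafka.supplier.security.protocol=SASL_PLAINTEXT
kafka.supplier.properties.sasl.mechanism=SCRAM-SHA-256

# Proposed (illustrative): equivalent prefix for the consumer-side application
kafka.consumer.bootstrap-servers=external-kafka:9092
```

Under this proposal, the unprefixed `spring.kafka.*` properties would remain dedicated to the 'internal' cluster the binder uses.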
Alternatively, the user would supply only the topic, and the same Kafka cluster configured for all stream applications would be used.