Interact | activityOrchestrator | receivers

The activity orchestrator receivers category specifies the event receivers for your Unica Interact inbound gateway activity.

Category name

Description

The name of your receiver.

Type

Description
The type of receiver. You can choose between Kafka and Custom. Custom requires you to use an implementation of the iReceiver interface.
Note: If you used Kafka in the previous version, you can set the value of type to Kafka in the upgraded version.

Enabled

Description
Select True to enable the receiver or False to disable it.

className

Description
This property defines the fully qualified class name of this receiver implementation. It is used only when the type is Custom. If the type is Kafka, then the value must be empty.

classPath

Description
This property defines the URI to the JAR file that includes the implementation of this receiver. If left empty, the class path of the hosting Unica Interact application is used. It is used only when the type is Custom. If the type is Kafka, then the value must be empty.
Interact | activityOrchestrator | receivers | Parameter Data
You can add receiver parameters, such as queueManager and messageQueueName, to define your receiver queue.

If the type is Kafka, the following parameters are supported.

  • providerUrl: A list of host/port pairs used to establish the initial connection with the Kafka cluster. This list must be in the form host1:port1,host2:port2,....
  • topic: A topic is a category or feed name to which messages are stored and published. All Kafka messages are organized into topics: producer applications write data to topics and consumer applications read from topics. To send a message, you send it to a specific topic; to read a message, you read it from a specific topic. A topic name can contain only ASCII alphanumeric characters, '.', '_', and '-', and can be at most 255 characters long. Because Kafka treats '.' and '_' as equivalent for collision purposes, use topics with either a period ('.') or an underscore ('_'), but not both. For example, if a topic named 'InteractTM_1' already exists and you try to create a topic named 'InteractTM.1', the following error is generated: "Topic InteractTM.1 collides with existing topics: InteractTM_1."
  • group.id: Specifies the name of the consumer group to which a Kafka consumer belongs.
  • zookeeper.connect: Specifies the Zookeeper connection string in the form hostname:port, where hostname and port are the host and port of a Zookeeper server.
  • authentication: Specifies the authentication mechanism used to connect to Kafka. See the Authentication mechanism section for the supported values.
  • amplifyConsumer: Specifies whether to start the consumer amplification utility (default value is false). The utility periodically analyzes consumer lag and adds new consumers if the existing consumers are overloaded and there is scope for adding new consumers within the same consumer group ID. Valid values are true and false.
  • analyzeLagIntervalInSec: Specifies the interval, in seconds, at which consumer lag is analyzed (default value is 10 seconds). Valid values are positive integers.
  • consumerLagThreshold: Specifies the threshold lag value used to determine whether consumers are overloaded (default value is 1000). Valid values are positive integers.
  • minAmplifyConsumers: Specifies the minimum number of consumers added initially during Interact startup (default value is 1). Valid values are positive integers.
  • maxAmplifyConsumers: Specifies the maximum number of consumers that can be added within the same group ID. This number is not in effect if the topic has fewer partitions (default value is 3). Valid values are positive integers.
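The topic-name rules above can be checked before a topic is created. The following is a minimal sketch; TopicNameCheck and its methods are illustrative helpers, not part of the Unica Interact API.

```java
// Sketch: validate a Kafka topic name against the constraints described above.
// This helper class is illustrative, not part of the Unica Interact API.
public class TopicNameCheck {
    private static final int MAX_TOPIC_LENGTH = 255;

    // A topic name may contain only ASCII alphanumerics, '.', '_' and '-',
    // and must be at most 255 characters long.
    public static boolean isValidTopicName(String name) {
        if (name == null || name.isEmpty() || name.length() > MAX_TOPIC_LENGTH) {
            return false;
        }
        return name.matches("[a-zA-Z0-9._-]+");
    }

    // Kafka treats '.' and '_' as equivalent for collision purposes, so
    // "InteractTM.1" collides with an existing "InteractTM_1".
    public static boolean collides(String candidate, String existing) {
        return candidate.replace('.', '_').equals(existing.replace('.', '_'));
    }
}
```

For example, isValidTopicName("InteractTM_1") returns true, while collides("InteractTM.1", "InteractTM_1") also returns true, which is why the error in the example above is raised.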
Mandatory parameters for subscribing to messages
By default, the Kafka server does not enable any authentication mechanism. If you start the Kafka server with authentication disabled, set the "authentication" parameter to "None". The following table lists the mandatory parameters required to subscribe to messages.
Parameters           Allowed/Sample parameter values
providerUrl          <host>:<port> (example: localhost:9092)
group.id             Any string (example: InteractTMGateway)
topic                Any string (example: InteractTM)
authentication       Any string (example: None)
zookeeper.connect    <host>:<port> (example: localhost:2181)
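A receiver that is missing any of these mandatory parameters fails to connect, so it can be useful to validate them up front. The following is an illustrative sketch; ReceiverParamCheck is not part of the product.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;

// Sketch: verify that all mandatory Kafka receiver parameters are present.
// This helper class is illustrative, not part of the Unica Interact API.
public class ReceiverParamCheck {
    static final List<String> MANDATORY = Arrays.asList(
            "providerUrl", "group.id", "topic", "authentication", "zookeeper.connect");

    // Returns the mandatory parameters that are missing or blank.
    public static List<String> missingParams(Map<String, String> params) {
        List<String> missing = new ArrayList<>();
        for (String key : MANDATORY) {
            String value = params.get(key);
            if (value == null || value.trim().isEmpty()) {
                missing.add(key);
            }
        }
        return missing;
    }
}
```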
Authentication mechanism
You can use Kafka by enabling different authentication mechanisms.
Authentication by SASL_PLAIN mechanism
If you want to use the SASL_PLAIN authentication mechanism, you must set the parameter "authentication" to "Plain" along with its supported parameters.

The following parameters are required when the SASL_PLAIN mechanism is used.

  • asmUserforMQAuth: The user name for logging into the server. It is required when the server enforces authentication.
  • authDS: The password associated with the user name for logging into the server.
  • username/password: The username and password of the Kafka server configured in the JAAS configuration file.

The following table provides the parameters required for the SASL_PLAIN mechanism.

Parameters          Allowed/Sample parameter values
authentication      Plain
asmUserforMQAuth    Any string (example: test_user)
authDS              Any string (example: authDS)
username            Any string (example: test_user)
password            Any string (example: test-secret)

If the "authentication" parameter is "Plain", you must either use asmUserforMQAuth/authDS or username/password parameters for authentication .

Create the data source (authDS) in the User section of the Platform configuration. The following example shows the data source details.

Datasource    Username     Password
authDS        test_user    test-secret
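In a standard Kafka client, SASL/PLAIN credentials like those above are supplied through the sasl.jaas.config, sasl.mechanism, and security.protocol properties. The sketch below shows that standard mapping under the assumption of a plaintext transport (SASL_PLAINTEXT); how Interact resolves the authDS data source into a password is internal to the product, and SaslPlainProps is an illustrative helper.

```java
import java.util.Properties;

// Sketch: map a SASL/PLAIN username/password pair onto standard kafka-clients
// properties. This helper is illustrative, not part of the Unica Interact API.
public class SaslPlainProps {
    public static Properties saslPlainProperties(String user, String secret) {
        Properties props = new Properties();
        // Assumes a plaintext transport; with SSL this would be SASL_SSL.
        props.put("security.protocol", "SASL_PLAINTEXT");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"" + user + "\" password=\"" + secret + "\";");
        return props;
    }
}
```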
Authentication by SSL mechanism
To use the SSL authentication mechanism, you must set the parameter "authentication" to "SSL" along with its supported parameters.

The following parameters are required to support the SSL mechanism.

  • ssl.keystore.location: The location of the keystore file. It is used for two-way authentication of the client.
  • ssl.truststore.location: The location of the truststore file.
  • SSLKeystoreDS: The keystore data source name, which stores the password of the SSL keystore.
  • SSLKeyDS: The key data source name, which stores the password of the SSL key.
  • SSLTruststoreDS: The truststore data source name, which stores the password of the SSL truststore.

The following table includes the supported parameters for SSL mechanism.

Parameters               Allowed/Sample parameter values
authentication           SSL
ssl.keystore.location    SSL keystore location (example: C:/SSL/kafka.client.keystore.jks)
ssl.truststore.location  SSL truststore location (example: C:/SSL/kafka.client.truststore.jks)
asmUserforMQAuth         Any string (example: test_user)
SSLKeystoreDS            Any string (example: SSLKeystoreDS)
SSLKeyDS                 Any string (example: SSLKeyDS)
SSLTruststoreDS          Any string (example: SSLTruststoreDS)

Create the data sources (SSLKeystoreDS, SSLKeyDS, and SSLTruststoreDS) in the User section of the Platform configuration. The following example shows the data source details.

Datasource         Username     Password
SSLKeystoreDS      keystore     keystore-secret
SSLKeyDS           key          key-secret
SSLTruststoreDS    truststore   truststore-secret
Note: The client keystore or truststore is required on the Producer or Consumer side of the Interact application (where the Interact application is installed). C:/SSL/kafka.client.keystore.jks and C:/SSL/kafka.client.truststore.jks are local locations on the machine where the Interact application is installed.
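In a standard Kafka client, the SSL parameters above correspond to the ssl.* client properties. The sketch below shows that standard mapping; resolving the SSLKeystoreDS, SSLKeyDS, and SSLTruststoreDS data sources into passwords is internal to Interact, and SslProps is an illustrative helper.

```java
import java.util.Properties;

// Sketch: express the SSL receiver parameters as standard kafka-clients
// properties. This helper is illustrative, not part of the Unica Interact API.
public class SslProps {
    public static Properties sslProperties(String keystoreLocation,
                                           String truststoreLocation,
                                           String keystorePassword,   // from SSLKeystoreDS
                                           String keyPassword,        // from SSLKeyDS
                                           String truststorePassword) // from SSLTruststoreDS
    {
        Properties props = new Properties();
        props.put("security.protocol", "SSL");
        props.put("ssl.keystore.location", keystoreLocation);
        props.put("ssl.truststore.location", truststoreLocation);
        props.put("ssl.keystore.password", keystorePassword);
        props.put("ssl.key.password", keyPassword);
        props.put("ssl.truststore.password", truststorePassword);
        return props;
    }
}
```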
Authentication by SASL_SSL mechanism
To use the SASL_SSL authentication mechanism, you must set the parameter "authentication" to "SASL_SSL" along with its supported parameters. The SASL_SSL mechanism is the combination of the SASL_PLAIN and SSL mechanisms. The following table lists the supported parameters for the SASL_SSL mechanism.
Parameters               Allowed/Sample parameter values
authentication           SASL_SSL
asmUserforMQAuth         Any string (example: test_user)
authDS                   Any string (example: authDS)
username                 Any string (example: test_user)
password                 Any string (example: test-secret)
ssl.keystore.location    SSL keystore location (example: C:/SSL/kafka.client.keystore.jks)
ssl.truststore.location  SSL truststore location (example: C:/SSL/kafka.client.truststore.jks)
SSLKeystoreDS            Any string (example: SSLKeystoreDS)
SSLKeyDS                 Any string (example: SSLKeyDS)
SSLTruststoreDS          Any string (example: SSLTruststoreDS)

If the "authentication" parameter is "SASL_SSL", you must either use asmUserforMQAuth/authDS or username/password.

Create the data sources (authDS, SSLKeystoreDS, SSLKeyDS, and SSLTruststoreDS) in the User section of the Platform configuration. The following example shows the data source details.

Datasource         Username     Password
authDS             admin        admin-secret
SSLKeystoreDS      keystore     test1234
SSLKeyDS           key          test1234
SSLTruststoreDS    truststore   test1234
Note: If you provide any data sources, such as authDS, SSLKeystoreDS, SSLKeyDS, or SSLTruststoreDS, in the Platform configuration, then you must also provide the asmUserforMQAuth parameter.

The client keystore/truststore is required on the Producer/Consumer side of the Interact application (where the Interact application is installed). C:/SSL/kafka.client.keystore.jks and C:/SSL/kafka.client.truststore.jks are local locations on the machine where the Interact application is installed.

Optional parameters for subscribing to messages
enable.auto.commit specifies that offsets are committed automatically, at a frequency controlled by the auto.commit.interval.ms setting. Because the poll interval is set to 1000, the value of auto.commit.interval.ms must not exceed 1000.

The following table includes the optional parameters for subscribing to messages.

Parameters                Default value   Allowed/Sample parameter values
enable.auto.commit        true            true, false
auto.commit.interval.ms   200             Positive integer
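The constraint above can be enforced with a simple check before the receiver starts. The following is an illustrative sketch; CommitIntervalCheck is not part of the product.

```java
// Sketch: enforce that auto.commit.interval.ms is positive and does not
// exceed the poll interval (1000 ms, as documented above).
// This helper class is illustrative, not part of the Unica Interact API.
public class CommitIntervalCheck {
    static final int POLL_INTERVAL_MS = 1000;

    public static boolean isValidCommitInterval(int autoCommitIntervalMs) {
        return autoCommitIntervalMs > 0 && autoCommitIntervalMs <= POLL_INTERVAL_MS;
    }
}
```

With the default of 200 ms the check passes; a value such as 1500 ms would be rejected.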
Optional thread management parameters
The following optional parameters can be used for thread management.
  • corePoolSize: The number of threads to keep in the pool for monitoring the Kafka service.
  • maxPoolSize: The maximum number of threads to keep in the pool for monitoring the Kafka service.
  • keepAliveTimeSecs: When the number of threads is greater than the core pool size, the maximum time that excess idle threads wait for new tasks before terminating.
  • queueCapacity: The size of the queue used by the thread pool for monitoring the Kafka service.

The following table includes the optional parameters for thread management.

Parameters          Default value   Allowed/Sample parameter values
corePoolSize        1               Positive integer
maxPoolSize         5               Positive integer
keepAliveTimeSecs   5               Positive integer
queueCapacity       100             Positive integer
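These four parameters correspond directly to the arguments of the standard java.util.concurrent.ThreadPoolExecutor constructor. The sketch below shows that mapping with the documented defaults; how Interact wires the pool internally is not shown, and MonitorPool is an illustrative helper.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch: build a thread pool from the four thread-management parameters.
// This helper class is illustrative, not part of the Unica Interact API.
public class MonitorPool {
    public static ThreadPoolExecutor create(int corePoolSize, int maxPoolSize,
                                            long keepAliveTimeSecs, int queueCapacity) {
        return new ThreadPoolExecutor(
                corePoolSize,                          // corePoolSize
                maxPoolSize,                           // maxPoolSize
                keepAliveTimeSecs, TimeUnit.SECONDS,   // keepAliveTimeSecs
                new ArrayBlockingQueue<>(queueCapacity)); // queueCapacity
    }
}
```

Creating the pool with the documented defaults would be MonitorPool.create(1, 5, 5, 100).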
Optional zookeeper parameters
The following optional parameter can be used for Zookeeper activities.

zookeeper.connection.timeout.ms: The maximum time that the client waits to establish a connection with Zookeeper. If not set, the value of zookeeper.session.timeout.ms is used.

The following table includes the optional parameters for zookeeper activities.

Parameter                         Default value   Allowed/Sample parameter values
zookeeper.connection.timeout.ms   6000            Positive integer
Optional parameters for creating a topic
The following optional parameters can be used for creating a topic.
  • num.partitions: The number of partitions for the created topic.
  • replication.factor: The replication factor for the created topic.

The following table includes the optional parameters for creating a topic.

Parameters           Default value   Allowed/Sample parameter values
num.partitions       1               Positive integer
replication.factor   1               Positive integer