Interact | triggeredMessage | dispatchers | <dispatcherName> | Parameter Data

The configuration properties in this category define parameters for a specific dispatcher in triggered messages.

You can choose between four types of dispatchers. InMemoryQueue is the internal dispatcher for Unica Interact. Custom is used for Kafka. JMSQueue is used to connect to a JMS provider via JNDI. Kafka is a distributed streaming platform, which is used to publish and subscribe to streams of records.

category name

Description

This property defines the name of this parameter. The name must be unique among all parameters for that dispatcher.

value

Description

This property defines the parameters, in the format of name-value pairs, that are needed by this dispatcher.

Note: All parameters for triggered messages are case sensitive and must be entered as shown here.

If the type is InMemoryQueue, the following parameter is supported.

  • queueCapacity: Optional. The maximum number of offers that can wait in the queue to be dispatched. When specified, this property must be a positive integer. If it is not specified or is invalid, the default value (1000) is used.
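The fallback rule for queueCapacity can be sketched as follows. This is an illustrative Python check, not Interact's actual implementation.

```python
def effective_queue_capacity(raw):
    """Return the queue capacity that would be used, per the rule above:
    a positive integer is honored; anything else falls back to 1000."""
    DEFAULT = 1000
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return DEFAULT
    return value if value > 0 else DEFAULT
```

For example, an unset value, a non-numeric value, or a negative value all resolve to the default of 1000.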
If the type is Custom, the following parameters are supported.
  • providerUrl: <hostname>:<port> (case sensitive)
  • queueManager: The name of the queue manager that was created on the Kafka server.
  • messageQueueName: The name of the message queue that was created on the Kafka server.
  • enableConsumer: This property must be set to true.
  • asmUserforMQAuth: The user name for logging into the server. It is required when the server enforces authentication. Otherwise, it should not be specified.
  • authDS: The password associated with the user name for logging into the server. It is required when the server enforces authentication. Otherwise, it should not be specified.

If the type is JMSQueue, the following parameters are supported.

  • providerUrl: The URL to the JNDI provider (case sensitive).
  • connectionFactoryJNDI: The JNDI name of the JMS connection factory.
  • messageQueueJNDI: The JNDI name of the JMS queue where the triggered messages are sent to and retrieved from.
  • enableConsumer: This property specifies whether a consumer of these triggered messages must be started in Unica Interact. This property must be set to true. If it is not specified, the default value (false) is used.
  • initialContextFactory: The fully qualified name of the JNDI initial context factory class. If you use WebLogic, the value of this parameter must be weblogic.jndi.WLInitialContextFactory.
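As a sketch, the JMSQueue parameters above might be entered as the following name-value pairs. The JNDI names and provider URL here are hypothetical placeholders for illustration only.

```python
# Hypothetical JMSQueue dispatcher parameters, expressed as the
# name-value pairs this category expects. The JNDI names and the
# provider URL are placeholders, not shipped defaults.
jms_dispatcher_params = {
    "providerUrl": "t3://appserver.example.com:7001",       # case sensitive
    "connectionFactoryJNDI": "jms/InteractConnectionFactory",
    "messageQueueJNDI": "jms/InteractTriggeredMessages",
    "enableConsumer": "true",   # must be true for Interact to consume
    "initialContextFactory": "weblogic.jndi.WLInitialContextFactory",
}
```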

If the type is Kafka, the following parameters are supported.

  • providerUrl: A list of host/port pairs to be used for establishing the initial connection to the Kafka cluster. This list must be in the form of host1:port1,host2:port2,....
  • topic: A topic is a category or feed name to which messages are stored and published. All Kafka messages are organized into topics. To send a message, you send it to a specific topic; to read a message, you read it from a specific topic. Producer applications write data to topics and consumer applications read from topics. A topic name can contain ASCII alphanumeric characters and the '.', '_', and '-' characters, and can be up to 255 characters long. Because '.' and '_' can collide internally, use topics with either a period ('.') or an underscore ('_'), but not both interchangeably. For example, if the topic 'InteractTM_1' exists and you create a topic named 'InteractTM.1', the following error is generated: "Topic InteractTM.1 collides with existing topics: InteractTM_1."
  • group.id: Specifies the name of the consumer group to which a Kafka consumer belongs.
  • zookeeper.connect: Specifies the ZooKeeper connection string in the form hostname:port, where hostname and port are the host and port of a ZooKeeper server.
  • authentication: Specifies the authentication mechanism that is used to connect to the Kafka server.
  • throttleProducer: Specifies whether to start the throttle producer utility (default value is false). The utility analyzes consumer lag periodically and adds a calculated wait time before producing the next record. Valid values are true and false.
  • analyzeLagIntervalInSec: Specifies the interval, in seconds, at which consumer lag is analyzed (default value is 10 seconds). Valid values are positive integers.
  • maxThrottleWaitInSec: Specifies the maximum throttle wait time in seconds (default value is 2 seconds). Valid values are positive integers.
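The topic-naming rules above can be sketched as a validation routine. This is illustrative Python, not Kafka's or Interact's actual check.

```python
import re

def topic_name_issues(name, existing):
    """Check a topic name against the rules above: ASCII alphanumerics
    plus '.', '_' and '-'; at most 255 characters; and no collision with
    an existing topic that differs only by '.' versus '_'."""
    issues = []
    if not re.fullmatch(r"[A-Za-z0-9._-]+", name):
        issues.append("invalid characters")
    if len(name) > 255:
        issues.append("longer than 255 characters")
    normalized = name.replace(".", "_")
    for other in existing:
        if other != name and other.replace(".", "_") == normalized:
            issues.append(f"collides with existing topic: {other}")
    return issues
```

With this sketch, creating 'InteractTM.1' while 'InteractTM_1' exists reports a collision, mirroring the error quoted above.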
Mandatory parameters for publishing and subscribing messages
By default, the Kafka server does not enforce any authentication mechanism. Users can start the Kafka server with authentication disabled. In this case, set the "authentication" parameter to the value "None".
Table 1. Mandatory parameters for publishing messages
Parameters Allowed/Sample Parameter Values
providerUrl <host>:<port> (example: localhost:9092)
topic Any string (example: InteractTM)
authentication none | Plain | SSL | SASL_SSL
zookeeper.connect <host>:<port> (example: localhost:2181)
Table 2. Mandatory parameters for subscribing messages
Parameters Allowed/Sample Parameter Value
providerUrl <host>:<port> (example: localhost:9092)
group.id Any string (example: InteractTMGateway)
topic Any string (example: InteractTM)
authentication none | Plain | SSL | SASL_SSL
zookeeper.connect <host>:<port> (example: localhost:2181)
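The two mandatory-parameter tables can be checked mechanically. The following is a minimal sketch, assuming the dispatcher parameters are held as a name-value dictionary.

```python
# Mandatory parameter names from Table 1 and Table 2 above.
MANDATORY_PUBLISH = {"providerUrl", "topic", "authentication", "zookeeper.connect"}
MANDATORY_SUBSCRIBE = MANDATORY_PUBLISH | {"group.id"}

def missing_params(params, mandatory):
    """Return the mandatory parameter names absent from a dispatcher config."""
    return sorted(mandatory - set(params))
```

For example, a configuration that is valid for publishing still needs group.id before it can subscribe.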
Authentication mechanism
You can use Kafka by enabling different authentication mechanisms.
Authentication by SASL_PLAIN mechanism
To use the SASL_PLAIN authentication mechanism, you must set the "authentication" parameter to the value "Plain" along with its supported parameters.

The following parameters are required when the SASL_PLAIN mechanism is used.

  • asmUserforMQAuth: The user name for logging into the server. It is required when the server enforces authentication.
  • authDS: The password associated with the user name for logging into the server.
  • username/password: The user name and password of the Kafka server, as configured in the JAAS configuration file.

The following table provides the parameters required for SASL_PLAIN mechanism.

Parameters Allowed/Sample parameter values
authentication Plain
asmUserforMQAuth Any string (example: test_user)
authDS Any string (example: authDS)
username Any string (example: test_user)
password Any string (example: test-secret)

If the "authentication" parameter is "Plain", you must use either the asmUserforMQAuth/authDS parameters or the username/password parameters for authentication.

Create the data sources (authDS) in the User section in Platform configuration. See the following example for data source details.

Datasource Username Password
authDS test_user test-secret
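For the username/password pair, the Kafka broker reads credentials from its JAAS configuration file. The fragment below is an illustrative sketch only: the login module is Kafka's standard PlainLoginModule, and the user name and secret are the placeholder values from the example above, not shipped defaults.

```
// Illustrative broker-side JAAS entry for SASL/PLAIN.
// The credentials are placeholders matching the example above.
KafkaServer {
    org.apache.kafka.common.security.plain.PlainLoginModule required
    username="test_user"
    password="test-secret"
    user_test_user="test-secret";
};
```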
Authentication by SSL mechanism
To use the SSL authentication mechanism, you must set the "authentication" parameter to the value "SSL" along with its supported parameters.

The following parameters are required to support SSL mechanism.

  • ssl.keystore.location: The location of the keystore file. You can use it for two-way authentication for the client.
  • ssl.truststore.location: The location of the truststore file.
  • SSLKeystoreDS: The keystore datasource name, which stores the password of the SSL keystore.
  • SSLKeyDS: The key datasource name, which stores the password of the SSL key.
  • SSLTruststoreDS: The truststore datasource name, which stores the password of the SSL truststore.

The following table includes the supported parameters for SSL mechanism.

Parameters Allowed/Sample Parameter Values
authentication SSL
ssl.keystore.location SSL Keystore location (example: C:/SSL/kafka.client.keystore.jks)
ssl.truststore.location SSL Truststore location (example: C:/SSL/kafka.client.truststore.jks)
asmUserforMQAuth Any string (example: test_user)
SSLKeystoreDS Any string (example: SSLKeystoreDS)
SSLKeyDS Any string (example: SSLKeyDS)
SSLTruststoreDS Any string (example: SSLTruststoreDS)

Create the data sources (SSLKeystoreDS, SSLKeyDS, and SSLTruststoreDS) in the User section in platform configuration. See the following example for data sources details.

Datasource Username Password
SSLKeystoreDS keystore keystore-secret
SSLKeyDS key key-secret
SSLTruststoreDS truststore truststore-secret
Note: A client keystore or truststore is required on the Producer or Consumer side of the Unica Interact application (where the Interact application is installed). C:/SSL/kafka.client.keystore.jks and C:/SSL/kafka.client.truststore.jks are local locations on the machine where the Interact application is installed.
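A sketch of how the parameters above might map onto the standard Kafka client SSL properties. The lookup_password callable is a hypothetical helper standing in for resolving a Platform datasource (for example, SSLKeystoreDS) to its stored password; the exact mapping inside Interact may differ.

```python
def kafka_ssl_properties(params, lookup_password):
    """Map the dispatcher parameters above onto standard Kafka client
    SSL properties. `lookup_password` resolves a datasource name to the
    password stored for it (a hypothetical helper)."""
    return {
        "security.protocol": "SSL",
        "ssl.keystore.location": params["ssl.keystore.location"],
        "ssl.truststore.location": params["ssl.truststore.location"],
        "ssl.keystore.password": lookup_password(params["SSLKeystoreDS"]),
        "ssl.key.password": lookup_password(params["SSLKeyDS"]),
        "ssl.truststore.password": lookup_password(params["SSLTruststoreDS"]),
    }
```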
Authentication by Kerberos

Kerberos is used as an authentication method in the Kafka receiver and the Kafka outbound gateway.

To use Kerberos, the following parameters and their values must be set on the activity orchestrator receiver or the triggered message outbound gateway, in addition to the parameters set for "Authentication by SSL mechanism".
  • authentication = SASL_SSL
  • sasl.mechanism = GSSAPI
In addition, the following JVM parameters must be added to the application server hosting the Interact runtime.
  • -Djava.security.auth.login.config=/path/to/jaas.conf
  • -Djava.security.krb5.conf=/path/to/krb5.conf
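The jaas.conf file referenced by the JVM parameter above would typically contain a KafkaClient login entry. The following fragment is an illustrative sketch: the login module is the standard JDK Krb5LoginModule, while the keytab path and principal are placeholders, not values shipped with Interact.

```
// Illustrative jaas.conf entry for a Kerberos (GSSAPI) Kafka client.
// The keytab path and principal are placeholders.
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/path/to/interact.keytab"
    principal="interact@EXAMPLE.COM";
};
```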
Authentication by SASL_SSL mechanism
To use the SASL_SSL authentication mechanism, you must set the "authentication" parameter to the value "SASL_SSL" along with its supported parameters. The SASL_SSL mechanism is the combination of the SASL_PLAIN and SSL mechanisms. The following table includes the supported parameters for the SASL_SSL mechanism.
Parameters Allowed/Sample Parameter Values
authentication SASL_SSL
asmUserforMQAuth Any string (example: test_user)
authDS Any string (example: authDS)
username Any string (example: test_user)
password Any string (example: test-secret)
ssl.keystore.location SSL Keystore location (example: C:/SSL/kafka.client.keystore.jks)
ssl.truststore.location SSL Truststore location (example: C:/SSL/kafka.client.truststore.jks)
SSLKeystoreDS Any string (example: SSLKeystoreDS)
SSLKeyDS Any string (example: SSLKeyDS)
SSLTruststoreDS Any string (example: SSLTruststoreDS)

If the "authentication" parameter is "SASL_SSL", you must use either the asmUserforMQAuth/authDS parameters or the username/password parameters.

Create the data sources (authDS, SSLKeystoreDS, SSLKeyDS and SSLTruststoreDS) in the User section in platform configuration. For data sources details, see the following example.

Datasource Username Password
authDS admin admin-secret
SSLKeystoreDS keystore test1234
SSLKeyDS key test1234
SSLTruststoreDS truststore test1234
Note: If you provide any data sources, such as authDS, SSLKeystoreDS, SSLKeyDS, or SSLTruststoreDS, in the Platform configuration parameter, you must also provide the asmUserforMQAuth parameter.

A client keystore/truststore is required on the Producer/Consumer side of the Interact application (where the Unica Interact application is installed). C:/SSL/kafka.client.keystore.jks and C:/SSL/kafka.client.truststore.jks are local locations on the machine where the Interact application is installed.

Optional parameter for publishing messages
The following optional parameters can be used for publishing messages.
  • acks: Controls the criteria under which requests are considered complete. The "all" setting blocks until the record is fully committed, which is the slowest but most durable setting.
  • retries: The number of times the producer retries a failed request. If retries is set to 0, failed sends are not retried. Enabling retries can lead to duplicate records.
  • batch.size: The batch size, in bytes, that is used when multiple records are batched together and sent to a partition.
  • linger.ms: The time, in milliseconds, that the producer waits to allow other records to be sent so that the sends can be batched together.
  • buffer.memory: The total bytes of memory that the producer can use to buffer records that are waiting to be sent to the server.

The following table includes the optional parameters for publishing messages.

Parameters Default value Allowed/Sample Parameter values
acks 1 0, 1, all
retries 3 Non-negative integer
batch.size 16384 Positive integer
linger.ms 0 Non-negative integer
buffer.memory 33554432 Positive integer
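One way to read the table above is as a set of defaults that user-supplied parameters override. The following is a sketch under that assumption; Interact's own merge logic may differ.

```python
# Default values from the publishing-parameters table above.
PRODUCER_DEFAULTS = {
    "acks": "1",
    "retries": 3,
    "batch.size": 16384,
    "linger.ms": 0,
    "buffer.memory": 33554432,
}

def producer_config(overrides=None):
    """Merge user-supplied optional parameters over the table defaults."""
    config = dict(PRODUCER_DEFAULTS)
    config.update(overrides or {})
    return config
```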
Optional parameters for subscribing messages
enable.auto.commit specifies that offsets are committed automatically, at a frequency controlled by the auto.commit.interval.ms configuration. The value of auto.commit.interval.ms must not exceed the poll interval, which is set to 1000 ms.

The following table includes the optional parameters for subscribing messages.

Parameters Default value Allowed/Sample parameter values
enable.auto.commit true True, False
auto.commit.interval.ms 200 Positive integer
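The constraint described above can be expressed as a simple check. This is an illustrative sketch, assuming a 1000 ms poll interval as stated.

```python
def valid_auto_commit_interval(interval_ms, poll_interval_ms=1000):
    """Check the rule above: auto.commit.interval.ms must be a positive
    value that does not exceed the consumer's poll interval (1000 ms)."""
    return 0 < interval_ms <= poll_interval_ms
```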
Optional thread management parameters
The following optional parameters can be used for thread management.
  • corePoolSize: The number of threads to keep in the pool for monitoring Kafka service.
  • maxPoolSize: The maximum number of threads to keep in the pool for monitoring Kafka service.
  • keepAliveTimeSecs: The maximum time, in seconds, that excess idle threads wait for new tasks before terminating, when the number of threads is greater than the core pool size.
  • queueCapacity: The size of the queue used by the thread pool to monitor Kafka service.

The following table includes the optional parameters for thread management.

Parameters Default value Allowed/Sample Parameter Values
corePoolSize 1 Positive integer
maxPoolSize 5 Positive integer
keepAliveTimeSecs 5 Positive integer
queueCapacity 100 Positive integer
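A sketch of validating the thread-management parameters above against their defaults. The final check mirrors the usual Java thread-pool requirement that maxPoolSize be at least corePoolSize; that relationship is an assumption here, not stated in the table.

```python
def validate_thread_pool(params):
    """Merge thread-management parameters over the table defaults and
    validate them: each must be a positive integer, and maxPoolSize
    must be at least corePoolSize (assumed, as in a Java thread pool)."""
    defaults = {"corePoolSize": 1, "maxPoolSize": 5,
                "keepAliveTimeSecs": 5, "queueCapacity": 100}
    merged = {**defaults, **params}
    for name, value in merged.items():
        if not (isinstance(value, int) and value > 0):
            raise ValueError(f"{name} must be a positive integer")
    if merged["maxPoolSize"] < merged["corePoolSize"]:
        raise ValueError("maxPoolSize must be >= corePoolSize")
    return merged
```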
Optional zookeeper parameters
The following optional parameters can be used for zookeeper activities.

zookeeper.connection.timeout.ms: The maximum time that the client waits to establish a connection with ZooKeeper. If it is not set, the value of zookeeper.session.timeout.ms is used.

The following table includes the optional parameters for Zookeeper activities.

Parameters Default Value Allowed/Sample Parameter Value
zookeeper.connection.timeout.ms 6000 Positive integer
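The fallback described above can be sketched as follows. This is illustrative Python, assuming the parameters are held as a name-value dictionary.

```python
def effective_connection_timeout(params):
    """Per the fallback rule above: use zookeeper.connection.timeout.ms
    when it is set, otherwise fall back to zookeeper.session.timeout.ms."""
    timeout = params.get("zookeeper.connection.timeout.ms")
    if timeout is not None:
        return timeout
    return params["zookeeper.session.timeout.ms"]
```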
Optional parameters for creating a topic
The following optional parameters can be used when creating a topic.
  • num.partitions: The number of partitions for the topic.
  • replication.factor: The replication factor for the topic.

The following table includes the optional parameters for creating a topic.

Parameters Default value Allowed/Sample Parameter Values
num.partitions 1 Positive integer
replication.factor 1 Positive integer
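Topics are commonly created with Kafka's standard kafka-topics.sh tool, which accepts both of the parameters above. The following sketch assembles such a command line; the topic name and broker address are placeholders, and the exact flags depend on your Kafka version (older releases use --zookeeper instead of --bootstrap-server).

```python
def topic_create_args(topic, bootstrap, partitions=1, replication_factor=1):
    """Assemble an argument list for kafka-topics.sh that creates a
    topic with the given partition count and replication factor."""
    return [
        "kafka-topics.sh", "--create",
        "--bootstrap-server", bootstrap,
        "--topic", topic,
        "--partitions", str(partitions),
        "--replication-factor", str(replication_factor),
    ]
```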