Integrating Unica Discover and Unica Journey

Unica Discover integrates with Unica Journey through the CEP configuration. The CEP configuration enables events to be sent to CEP and to external systems. Unica Discover-Unica Journey integration is supported from version 12.1.0.3 onwards.

To integrate Unica Discover and Unica Journey, complete the following procedures:

Activating Enable Event Bus

Procedure

  1. In the Discover Portal UI, select Discover > Managed Services.
    The Discover Services Management page appears.
  2. Expand Canister.
  3. Select Canister configuration (xxxxx) (Registry).
  4. Within the Config Actions section, select View/Edit.
    The Discover Canister Config popup appears.
  5. Select Services to Perform > Event Bus.
  6. Enable the Enable Event Bus option.

Activating Send to Event Bus

Procedure

  1. In the Discover Portal UI, select Event > Manage Events.
    The Event Manager page appears.
  2. Locate the required event by filtering on Name or ID.
  3. To edit an event, perform one of the following actions:
    1. Double-click the event.
    2. Right-click the event and select Edit Event.
  4. Select More options.
  5. Enable Send to Event Bus.

Configuring DiscoverEventBus.cfg

Procedure

  1. In the Discover Portal UI, select Discover > Managed Services.
    The Discover Services Management page appears.
  2. Expand Canister.
  3. Select Event Bus configuration (xxxxx) (Registry).
  4. Within the Config Actions section, select View/Edit.
    The Pipeline Editor - Event Bus configuration (xxxxx) popup appears.
  5. If CEP does not exist in the Event Bus configuration panel, drag-and-drop CEP from the Available SessionAgents panel to the Event Bus configuration panel. Ensure that you drop CEP after the Decouple option and before the Null option.
  6. Select CEP.
    The Edit Session Agent:CEP popup appears.
  7. For the OutputType field, select JOURNEY.
  8. Click Apply.

Configuring REST type of Entry Source

Procedure

  1. Select Journey Entry Source Type REST and click Apply.
  2. Enter Journey Entry Source API Server URL and click Apply. For example, http://localhost:8080/journey/api/
  3. Enter Journey Entry Source API Client ID and click Apply.
  4. Enter Journey Entry Source API Client Secret and click Apply.
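The REST entry-source settings above can be exercised with a short client sketch. This is a minimal illustration only: the endpoint path (/entrysources/events) and the header names used for the client ID and secret are assumptions, not the documented Journey contract; consult the Unica Journey REST API documentation for the actual endpoint and authentication headers.

```python
import json
import urllib.request

def build_journey_request(api_url, client_id, client_secret, payload):
    """Build a POST request object for the Journey Entry Source REST API.

    The endpoint path and the header names below are illustrative
    assumptions, not the documented Journey contract.
    """
    return urllib.request.Request(
        api_url.rstrip("/") + "/entrysources/events",  # hypothetical path
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "client_id": client_id,          # assumed header name
            "client_secret": client_secret,  # assumed header name
        },
        method="POST",
    )
```

The builder only assembles the request; sending it (for example, with urllib.request.urlopen) requires a reachable Journey API server.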

Configuring Unica Discover / Kafka type of Entry Source

Procedure

  1. Select Journey Entry Source Type Kafka and click Apply.
  2. Enter Journey Entry Source Kafka Server / Broker, along with the port number for listening, and click Apply. For example, localhost:9092
  3. Enter Journey Entry Source Kafka Security Type and click Apply.

Configurations applicable when the selected Entry Source Kafka Security Type is SASL_PLAINTEXT

Procedure

  1. Enter the Journey Entry Source Type Kafka SASL User and click Apply.
  2. Enter the Journey Entry Source Type Kafka SASL Password and click Apply.

Configurations applicable when the selected Entry Source Kafka Security Type is SSL

Procedure

  1. Enter the Journey Entry Source Type Kafka SSL Cert location and click Apply. For example, C:\KSSLCert\client_cert.pem.
  2. Enter the Journey Entry Source Type SSL CA location and click Apply. For example, C:\HCL\KSSLCert\ca-cert.
  3. Enter the Journey Entry Source Type SSL Key location and click Apply. For example, C:\HCL\KSSLCert\client_key.pem.

Configurations applicable when the selected Entry Source Kafka Security Type is SASL_SSL

Procedure

  1. Enter the Journey Entry Source Type Kafka SASL User and click Apply.
  2. Enter the Journey Entry Source Type Kafka SASL Password and click Apply.
  3. Enter the Journey Entry Source Type Kafka SSL Cert location and click Apply. For example, C:\KSSLCert\client_cert.pem.
  4. Enter the Journey Entry Source Type SSL CA location and click Apply. For example, C:\HCL\KSSLCert\ca-cert.
  5. Enter the Journey Entry Source Type SSL Key location and click Apply. For example, C:\HCL\KSSLCert\client_key.pem.
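Taken together, the SASL_SSL steps above result in DiscoverEventBus.cfg entries like the following. The user, password, and file paths shown here are illustrative placeholders only:

```ini
KafkaEntrySourceKafkaSecurityType=SASL_SSL
KafkaEntrySourceKafkaSASLUser=discover_user
KafkaEntrySourceKafkaSASLPassword=changeit
KafkaEntrySourceKafkaSSLCertLocation=C:\KSSLCert\client_cert.pem
KafkaEntrySourceSslCaLocation=C:\HCL\KSSLCert\ca-cert
KafkaEntrySourceSslKeyLocation=C:\HCL\KSSLCert\client_key.pem
```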

Configuring CEP for OutputType as Kafka and REST

About this task

The following procedure installs Kafka on a Microsoft Windows server:

Procedure

  1. Install the latest version of the JDK, or verify that it is already installed on your system.
  2. For the environment variable PATH, append the following value:
    C:\Program Files\Java\jdk-18.0.1\bin
  3. Add the environment variable JAVA_HOME with the following value:
    C:\Program Files\Java\jdk-18.0.1
    Restart the machine.
  4. Download the latest version of Kafka (Scala) and extract all the content to an appropriate folder.
  5. Within the folder named data, create two folders (for example, one named zookeeper and one named broker).
    CAUTION: Do not name either of the two folders data.
  6. Open config\zookeeper.properties and modify the path to reflect the path of the first folder. For example, if you named the folder zookeeper and it resides within the C:/Kafka/kafka_2.13-3.1.0/data/ location, set the path as follows:
    dataDir=C:/Kafka/kafka_2.13-3.1.0/data/zookeeper
  7. Open config\server.properties and modify the path to reflect the path of the second folder. For example, if you named the folder broker and the broker folder resides within the C:/Kafka/kafka_2.13-3.1.0/data/ location, set the path as follows:
    log.dirs=C:/Kafka/kafka_2.13-3.1.0/data/broker
  8. To start ZooKeeper, navigate to the bin\windows folder, open the command line interface, and run the following command:
    zookeeper-server-start.bat ..\..\config\zookeeper.properties
  9. To start the Kafka server, navigate to the bin\windows folder, open the command line interface, and run the following command:
    kafka-server-start.bat ..\..\config\server.properties
  10. To create a new topic, open another command line interface, navigate to the bin\windows folder, and run the following command:
    kafka-topics.bat --create --topic <STREAMING_IMPORT> --bootstrap-server localhost:9092
    If the topic creation is successful, you will see the following success message: Created Topic <STREAMING_IMPORT>.
  11. To view all the created topics, run the following command:
    kafka-topics.bat --list --bootstrap-server localhost:9092
  12. To run ZooKeeper and the broker as Windows services, you must use a service-wrapper utility. Complete the following steps:
    1. Download the latest version of nssm.
    2. To create the ZooKeeper service, complete the following steps:
      1. Navigate to the win64 folder within the downloaded nssm folder.
      2. Open the Command Line Interface and run the following command:
        nssm install "Kafka Zookeeper"
      3. The nssm window opens. In the default Application tab, set the following values:
        • Path: C:\Kafka\bin\windows\zookeeper-server-start.bat
        • Startup Directory: C:\Kafka\bin\windows
        • Arguments: C:\Kafka\config\zookeeper.properties
      4. To install the Kafka Zookeeper Windows service, click Install.
    3. To create the broker service, complete the following steps:
      1. Navigate to the win64 folder within the downloaded nssm folder.
      2. Open the Command Line Interface and run the following command:
        nssm install "Kafka Broker"
      3. The nssm window opens. In the default Application tab, set the following values:
        • Path: C:\Kafka\bin\windows\kafka-server-start.bat
        • Startup Directory: C:\Kafka\bin\windows
        • Arguments: C:\Kafka\config\server.properties
      4. To install the Kafka Broker Windows service, click Install.
    4. To check the status of the services, and for other basic commands, run the following from the nssm win64 folder (for example, C:\Users\Administrator\Downloads\nssm-2.24\nssm-2.24\win64):
      .\nssm.exe --help
      .\nssm.exe status "Kafka Broker"
      .\nssm.exe status "Kafka Zookeeper"
  13. Sample Configuration in DiscoverEventBus.cfg for Kafka:
    The following is a sample configuration for Kafka in DiscoverEventBus.cfg (the sample is only relevant for the Kafka output and not for the entire CEP section):
    [CEP]
    TypeName=CEP
    OutputType=KAFKA
    
    #KafkaEntrySourceType values (KAFKA, REST)
    KafkaEntrySourceType=KAFKA
    
    #KafkaEntrySourceAPIServer: value (http://x.x.x.x:xxxx/kafka/api/)
    KafkaEntrySourceAPIServer=
    
    #KafkaEntrySourceAPIClientId: applicable when KafkaEntrySourceType=REST 
    KafkaEntrySourceAPIClientId=
    
    #KafkaEntrySourceAPIClientSecret: applicable when KafkaEntrySourceType=REST
    KafkaEntrySourceAPIClientSecret=
    
    #KafkaEntrySourceKafkaServer: applicable when KafkaEntrySourceType=KAFKA
    #Example localhost:9092
    KafkaEntrySourceKafkaServer=
    
    #KafkaEntrySourceKafkaSecurityType: applicable when KafkaEntrySourceType=KAFKA, values (PLAINTEXT, SSL, SASL_PLAINTEXT, SASL_SSL)
    KafkaEntrySourceKafkaSecurityType=
    
    #KafkaEntrySourceKafkaSASLUser: applicable when KafkaEntrySourceType=KAFKA and KafkaEntrySourceKafkaSecurityType=SASL_PLAINTEXT OR  SASL_SSL
    KafkaEntrySourceKafkaSASLUser=
    
    #KafkaEntrySourceKafkaSASLPassword: applicable when KafkaEntrySourceType=KAFKA and KafkaEntrySourceKafkaSecurityType=SASL_PLAINTEXT OR  SASL_SSL
    KafkaEntrySourceKafkaSASLPassword=
    
    #KafkaEntrySourceKafkaSSLCertLocation: applicable when KafkaEntrySourceType=KAFKA and KafkaEntrySourceKafkaSecurityType=SSL OR  SASL_SSL
    #example : C:\KSSLCert\client_cert.pem
    KafkaEntrySourceKafkaSSLCertLocation=
    
    #KafkaEntrySourceSslKeyLocation: applicable when KafkaEntrySourceType=KAFKA and KafkaEntrySourceKafkaSecurityType=SSL OR  SASL_SSL
    #example : C:\HCL\KSSLCert\client_key.pem
    KafkaEntrySourceSslKeyLocation=
    
    #KafkaEntrySourceSslCaLocation: applicable when KafkaEntrySourceType=KAFKA and KafkaEntrySourceKafkaSecurityType=SSL OR  SASL_SSL
    #example : C:\HCL\KSSLCert\ca-cert
    KafkaEntrySourceSslCaLocation=
    
    #KafkaLogLevel: applicable when KafkaEntrySourceType=KAFKA. Values (ALL, DETAIL, ERROR, INFO, NONE, STATUS, TRACE, WARNING)
    KafkaLogLevel=ERROR
    
    #KafkaTopicName: applicable when KafkaEntrySourceType=KAFKA. Name of Kafka Topic. Example - DISCOVER
    KafkaTopicName=
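For orientation, the security-related entries in the sample above correspond roughly to standard librdkafka producer properties. The sketch below is an illustrative translation only; the librdkafka property names are real, but the mapping itself is an assumption (the canister performs its own translation internally):

```python
def to_librdkafka_config(cfg):
    """Translate Kafka-related DiscoverEventBus.cfg entries into
    librdkafka-style producer properties. Illustrative mapping only;
    the canister performs its own translation internally."""
    sec = cfg.get("KafkaEntrySourceKafkaSecurityType", "PLAINTEXT")
    props = {
        "bootstrap.servers": cfg.get("KafkaEntrySourceKafkaServer", ""),
        "security.protocol": sec,
    }
    # SASL credentials apply only to the SASL_* security types.
    if sec in ("SASL_PLAINTEXT", "SASL_SSL"):
        props["sasl.username"] = cfg.get("KafkaEntrySourceKafkaSASLUser", "")
        props["sasl.password"] = cfg.get("KafkaEntrySourceKafkaSASLPassword", "")
    # Certificate locations apply only when TLS is in play.
    if sec in ("SSL", "SASL_SSL"):
        props["ssl.certificate.location"] = cfg.get("KafkaEntrySourceKafkaSSLCertLocation", "")
        props["ssl.key.location"] = cfg.get("KafkaEntrySourceSslKeyLocation", "")
        props["ssl.ca.location"] = cfg.get("KafkaEntrySourceSslCaLocation", "")
    return props
```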
  14. Sample Configuration in DiscoverEventBus.cfg for REST

    The following is a sample configuration for REST in DiscoverEventBus.cfg (the sample is only relevant for the REST output and not for the entire CEP section):

    [CEP]
    TypeName=CEP
    OutputType=REST
    
    #Url for the REST API
    REST_Url=
    
    #Headers to be added as part of POST. Multiple headers (name:value) are separated by a pipe (|); the name and value are separated by a colon ( : )
    
    #eg : API-key:test|clientid:abc123
    REST_Header=
    
    #Enable logging for REST - set value to ON
    REST_Log=
    
    #Folder for log files
    REST_LogDir=
    
    #Set to ON for Debug logs or OFF to turn off Debug logs
    REST_Debug=
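The REST_Header format in the sample above (name:value pairs separated by a pipe) can be parsed as follows. This is an illustrative sketch of the format, not code shipped with Discover:

```python
def parse_rest_header(value):
    """Parse a REST_Header value such as 'API-key:test|clientid:abc123'
    into a dict: pairs are separated by '|', name and value by ':'."""
    headers = {}
    for pair in filter(None, (value or "").split("|")):
        name, _, val = pair.partition(":")
        headers[name.strip()] = val.strip()
    return headers
```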
  15. Optional: To view all the messages produced by the producer in a UI, install a UI tool for Kafka.
  16. To clean up the cluster, delete the content of the following folders:
    • C:\tmp\kafka-logs\
    • C:\tmp\zookeeper\

Configuring CEPCustomFields.cfg

Procedure

  1. In the Discover Portal UI, select Discover > Managed Services.
    The Discover Services Management page appears.
  2. Expand Canister.
  3. Select CEP custom fields configuration file (xxxxx) (CEPCustomFields.cfg).
  4. Within the Config Actions section, select View/Edit.
    The CEP custom fields configuration file (xxxxx) (CEPCustomFields.cfg) popup appears.
  5. Add the following properties at the end:
    
    [Journey]
    Type=Journey
    # Below is explicit mapping of Journey Events with Discover Event ID, and Journey Entry Source
    PRODUCT_ADDED=<Discover_Event_ID>
    PRODUCT_REMOVED=<Discover_Event_ID>
    ORDER_COMPLETE=<Discover_Event_ID>
    ORDER_ABANDONED=<Discover_Event_ID>
    FORM_SUBMITTED=<Discover_Event_ID>
    FORM_ABANDONED=<Discover_Event_ID>
    Journey_Entry_Source_Code=<Journey_Entry_source_Code>
    
    #Below is explicit mapping of Journey data definition fields with Discover traffic attribute
    email=<loginid or some other field>
    Name=<customer_name>
    formId=<form_id>
    formname=<form_name>
    CartId=<cart_id_field>
    CartValue=<cart_value_field>
    CookieID=<JSESSIONID>
    Note: For example, in the earlier configuration, email=<loginid or some other field> can be set as email=CustomVarN, where CustomVarN is the JavaScript name of the relevant session attribute. For more information, see the topic "Session attribute list" in the Unica Discover Event Manager Manual.

    In the earlier configuration example, you can use session attributes for all Journey data definition fields, except CookieID.

  6. Sample Configuration in CEPCustomFields.cfg for Kafka:
    The following is a sample configuration for Kafka in CEPCustomFields.cfg (the sample is only relevant for the Kafka output and not for the entire CEP section):
    [Kafka]
    Type=Kafka
    CustomField0=HOUR_OF_DAY
    CustomField1=DAY_OF_WEEK
    CustomField2=DAY_OF_MONTH
    CustomField3=WEEK_OF_MONTH
    CustomField4=DAY_OF_YEAR
    CustomField5=WEEK
    CustomField6=MONTH
    CustomField7=QUARTER
    CustomField8=YEAR
    CustomField9=HTTP_X_DCXVID
    CustomField10=<ABANDON_CART_SUBTOTAL_AMOUNT>,<CustomVar37>
    Note:
    • In the sample configuration provided earlier, replace each variable enclosed within < and > with an appropriate value. Do not include the < and > characters in the values.
    • Based on your requirement, you can add CustomFields from CustomField0 through CustomField99.
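The CustomField numbering convention above can be sketched as a small helper. build_custom_fields is a hypothetical name, assuming that indices CustomField0 through CustomField99 are available:

```python
def build_custom_fields(values):
    """Render CustomFieldN=value lines for CEPCustomFields.cfg,
    assuming indices CustomField0 through CustomField99."""
    if len(values) > 100:
        raise ValueError("only CustomField0 through CustomField99 are available")
    return ["CustomField{}={}".format(i, v) for i, v in enumerate(values)]
```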
  7. Sample Configuration in CEPCustomFields.cfg for REST

    The following is a sample configuration for REST in CEPCustomFields.cfg (the sample is only relevant for the REST output and not for the entire CEP section):

    [REST]
    Type=REST
    
    [RESTSections]
    Type=RESTSections
    #Comma separated list of sections to include
    Sections= <iamie,appdata,env>
    CustomField1=<EMAIL_ID>,<CustomVar0>
    Note:
    • In the sample configuration provided earlier, replace each variable enclosed within < and > with an appropriate value. Do not include the < and > characters in the values.
    • Based on your requirement, you can add CustomFields from CustomField0 through CustomField99.
  8. Access Portal > Event Manager to fetch the Event ID. Modify the Journey Section and specify the Discover Event ID for each of the following events:
    PRODUCT_ADDED=<Discover_Event_ID>
    PRODUCT_REMOVED=<Discover_Event_ID>
    ORDER_COMPLETE=<Discover_Event_ID>
    ORDER_ABANDONED=<Discover_Event_ID>
    FORM_SUBMITTED=<Discover_Event_ID>
    FORM_ABANDONED=<Discover_Event_ID>
  9. Provide a value for Journey_Entry_Source_Code=<Journey_Entry_source_Code>.
    For example,
    Journey_Entry_Source_Code=ES-0000005
    Note: If you do not map all the event IDs, ensure that you enter the mapped details correctly and remove the unmapped placeholder lines. Every Discover event that precedes Journey_Entry_Source_Code must have a real Event ID. For example, if you want to map ORDER_COMPLETE but not ORDER_ABANDONED, the incorrect and correct formats are as follows:
    INCORRECT FORMAT
    ORDER_COMPLETE=1098
    ORDER_ABANDONED=<Discover_Event_ID>
    Journey_Entry_Source_Code=ES-0000001
    CORRECT FORMAT
    ORDER_COMPLETE=1098
    Journey_Entry_Source_Code=ES-0000001
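The correct-format rule above amounts to: never leave a <Discover_Event_ID> placeholder on a line that precedes Journey_Entry_Source_Code. A minimal sketch of that cleanup (illustrative only; clean_journey_mapping is not a Discover utility):

```python
PLACEHOLDER = "<Discover_Event_ID>"

def clean_journey_mapping(lines):
    """Drop event lines that still carry the placeholder, so that every
    line preceding Journey_Entry_Source_Code has a real Event ID."""
    cleaned = []
    for line in lines:
        _, _, value = line.partition("=")
        if value.strip() == PLACEHOLDER:
            continue  # unmapped event: omit the line entirely
        cleaned.append(line)
    return cleaned
```

Applied to the incorrect format above, this yields exactly the correct format.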
  10. Provide the Discover traffic attribute for each of the following Journey Data Definition fields:
    
    email=<loginid or some other field>
    Name=<customer_name>
    formId=<form_id>
    formname=<form_name>
    CartId=<cart_id_field>
    CartValue=<cart_value_field>
    CookieID=<JSESSIONID>
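The data-definition mapping above pairs each Journey field with a Discover traffic attribute. The following is a purely illustrative sketch of how such a mapping could be resolved against a session's attribute values; the function name and payload shape are assumptions, not the Discover implementation:

```python
def resolve_data_definition(field_map, session_attrs):
    """Resolve each Journey data-definition field to the value of the
    Discover traffic attribute it is mapped to (illustrative only)."""
    return {field: session_attrs.get(attr, "") for field, attr in field_map.items()}
```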