Taking a snapshot of data for export to a table, file, or Kafka topic
Use the Snapshot process to capture data for export to a table, file, or Kafka topic. Select the source of the values that you want to capture, and define the output table, file, or topic name for those values.
Procedure
- Open a flowchart for editing.
- Drag the Snapshot process from the palette to your flowchart.
- Connect one or more configured processes to provide input to the Snapshot process.
Note: All of the cells that you select as input must have the same audience level.
- Double-click the Snapshot process in the flowchart workspace.
The Snapshot process configuration dialog box opens and the Snapshot tab is open by default.
- Use the Snapshot tab to specify how to capture data.
- To specify which fields to snapshot, use the controls to move selected fields from the Candidate fields list to the Fields to snapshot list. You can select multiple fields with Ctrl+Click or select a range of fields with Shift+Click.
Note: To view the values in a field, select a field in the Candidate fields list and click Profile.
- If you selected a table as the snapshot destination, the fields in that table appear in the Candidate fields list. Click the Match button to find matching fields automatically: fields whose names exactly match the table field names are added to the Fields to snapshot list. If multiple fields match, the first match is used. You can manually modify the pairings by using Remove << or Add >>.
- To include generated fields, expand the list of Unica Campaign generated fields in the Candidate fields list, select a field, then use the controls to move the field to the Fields to snapshot list.
- To work with derived fields, click the Derived fields button.
- You can reorder the Fields to snapshot list by selecting a field and clicking Up 1 or Down 1 to move it up or down in the list.
- To skip records with duplicate IDs or to specify the order in which records are output, click More to open the Advanced settings dialog.
- Use the General tab to set the following options:
- Process name: Assign a descriptive name. The process name is used as the box label on the flowchart. It is also used in various dialogs and reports to identify the process.
- Note: Use the Note field to explain the purpose or result of the process. The contents of this field appear when you rest your cursor over the process box in a flowchart.
- Click OK to save and close the configuration.
Results
The process is now configured. You can test run the process to verify that it returns the results you expect.
Campaign Node Configuration details
| Property | Description |
| --- | --- |
| KafkaBrokerURL | The Kafka servers used to export data. You can define more than one Kafka server, separated by commas. Example: IP-0A862D46:9092 or 11.22.33.44:9092,11.22.33.55:9092 |
| CommunicationMechanism | The connection mechanism used to connect to the Kafka server. Possible values: SASL_PLAINTEXT_SSL (connect to Kafka with a username/password and SSL enabled), NO_SASL_PLAINTEXT_SSL (connect to Kafka with no authentication and no SSL), SASL_PLAINTEXT (connect to Kafka with a username and password only), SSL (connect to Kafka without a username/password but with SSL). |
| KafkaKeyFile | The client key file, required when the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/client_key.pem |
| KafkaCertificateFile | The client certificate file, required when the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/client_cert.pem |
| CertificateAuthorityFile | The signed certificate of the Kafka server, required when the connection mechanism uses SSL. Example: /opt/Unica/Kafkakeys/ca-cert |
| UserForKafkaDataSource | The Marketing Platform user that contains the data source credentials for Kafka when connecting with a username/password. |
| KafkaDataSource | The data source containing the Kafka user credentials. |
| TopicName | The topic that Journey designates for Campaign to push data to Journey. Required value: CAMPAIGN_PB. Do not change this value; doing so would send data to a Kafka topic that Journey does not use. |
| NumberOfPartitions | The number of partitions that Kafka uses to hold exported user data. |
| NumberOfReplicas | The number of servers across which each partition is replicated for fault tolerance. |
| RetentionPeriodInSeconds | The maximum time that Kafka retains messages exported over the topic. When the retention period ends, Kafka clears all eligible exported messages. |
| SslKeyPasswordDataSource | If KafkaKeyFile is password protected, create a separate data source that includes that password (the user name is not used, so it can be anything), and specify that data source name as the value of this field. |
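To illustrate how the four CommunicationMechanism values relate to the standard Kafka client security settings, the sketch below maps each value to a librdkafka-style configuration fragment. This is a hypothetical helper for illustration only, not part of Campaign's implementation; the function name and configuration keys are assumptions based on common Kafka client conventions.

```python
# Hypothetical sketch: map Campaign's CommunicationMechanism values to the
# Kafka client security settings they imply. Keys follow librdkafka naming
# conventions (security.protocol, ssl.key.location, etc.) for illustration.

def build_kafka_security_config(mechanism, username=None, password=None,
                                key_file=None, cert_file=None, ca_file=None):
    """Return a config fragment for the given CommunicationMechanism value."""
    if mechanism == "NO_SASL_PLAINTEXT_SSL":
        # No authentication, no SSL.
        return {"security.protocol": "PLAINTEXT"}
    if mechanism == "SASL_PLAINTEXT":
        # Username/password only, no SSL.
        return {
            "security.protocol": "SASL_PLAINTEXT",
            "sasl.mechanism": "PLAIN",
            "sasl.username": username,
            "sasl.password": password,
        }
    if mechanism == "SSL":
        # SSL only, no username/password; uses the three certificate files.
        return {
            "security.protocol": "SSL",
            "ssl.key.location": key_file,
            "ssl.certificate.location": cert_file,
            "ssl.ca.location": ca_file,
        }
    if mechanism == "SASL_PLAINTEXT_SSL":
        # Username/password and SSL together.
        return {
            "security.protocol": "SASL_SSL",
            "sasl.mechanism": "PLAIN",
            "sasl.username": username,
            "sasl.password": password,
            "ssl.key.location": key_file,
            "ssl.certificate.location": cert_file,
            "ssl.ca.location": ca_file,
        }
    raise ValueError(f"Unknown CommunicationMechanism: {mechanism}")

# Example: the fully secured mechanism uses the paths shown in the table above.
cfg = build_kafka_security_config(
    "SASL_PLAINTEXT_SSL",
    username="kafka_user", password="secret",      # from KafkaDataSource
    key_file="/opt/Unica/Kafkakeys/client_key.pem",
    cert_file="/opt/Unica/Kafkakeys/client_cert.pem",
    ca_file="/opt/Unica/Kafkakeys/ca-cert",
)
print(cfg["security.protocol"])  # SASL_SSL
```

Note that the SSL-related files (KafkaKeyFile, KafkaCertificateFile, CertificateAuthorityFile) are needed only for the two mechanisms that include SSL, and the credential-related properties (UserForKafkaDataSource, KafkaDataSource) only for the two that include SASL.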