Introduced in Feature Pack 2

Troubleshooting: Preprocessing large indexes fails

When you run the index preprocessor (the di-preprocess utility), it can fail if the catalog data is large, for example, 500,000 catalog entries spread across 500 Extended Sites.

To resolve this issue, update the preprocessor configuration file (wc-dataimport-preprocess-attribute.xml) with larger settings, then run the di-preprocess utility again. For example:
  • Change the column type from VARCHAR2(4000) to CLOB.
  • Increase the batch size from 500 to 5000.
  • If necessary, re-create the table in a table space with a larger page size. For example, TAB16K.
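As a sketch of the batch size change in the second bullet: in these preprocessor configuration files the batch size is typically carried as a batchSize attribute on the data-processing-config element. The processor class and masterCatalogId values below are illustrative placeholders; verify the element layout against your own copy of wc-dataimport-preprocess-attribute.xml before editing.

```xml
<!-- Hedged example: increase batchSize from the default 500 to 5000.
     processor and masterCatalogId values are placeholders. -->
<_config:data-processing-config
    processor="com.ibm.commerce.foundation.dataimport.preprocess.StaticAttributeDataPreProcessor"
    masterCatalogId="10101"
    batchSize="5000">
```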
For example, to increase the CLOB size in the TI_ATTR_0_1 table, update the wc-dataimport-preprocess-attribute.xml file with the following change:

<_config:table
  definition="CREATE TABLE TI_ATTR_0_#lang_tag# (CATENTRY_ID BIGINT NOT NULL,
    ATTRS CLOB(20485760), ATTRI CLOB(20485760), ATTRF CLOB(20485760),
    PRIMARY KEY (CATENTRY_ID))"
  name="TI_ATTR_0_#lang_tag#" />
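If re-creating the table fails because the wider row no longer fits the default 4K page size, the table can be placed in a table space with a larger page size, as the third bullet above suggests. The following is a hedged DB2 sketch; the buffer pool name BP16K and its size are illustrative, and TAB16K matches the example name used above:

```sql
-- A 16K table space requires a buffer pool with a matching page size.
CREATE BUFFERPOOL BP16K SIZE 1000 PAGESIZE 16K;

-- Create the larger table space backed by that buffer pool.
CREATE TABLESPACE TAB16K PAGESIZE 16K
  MANAGED BY AUTOMATIC STORAGE
  BUFFERPOOL BP16K;
```

To use the new table space, append an IN TAB16K clause to the CREATE TABLE statement in the table definition in wc-dataimport-preprocess-attribute.xml.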

See Preprocessing and building the search index and Preprocessing the WebSphere Commerce search index data for more information.