Data import errors

This topic lists the errors that might appear when you perform data imports. You can also find solutions to these errors in this topic.

Error messages

During the initial import, the following error is written in the logs: Error was getaddrinfo: name or service not known (SocketError).
During the initial import, the following error is written in the logs:
ERROR: Datasource file citsearch_0_4580013_cit.xml.bz2 raised an exception
while reading from {:port=>"52311", :path=>"/UploadReplication",
:query=>{:BaseDirectory=>1,
:Name=>"\\13\\4580013\\citsearch_0_4580013_cit.xml.bz2",
:sha1=>"5B0FE15F7E097171566F0AC3B9BE93826FDC0D41", :offset=>0}}.
Error was getaddrinfo: name or service not known (SocketError)
The problem might be caused by incorrect DNS name settings. To solve the problem, ensure that BigFix Inventory can ping the BigFix server by using the DNS name that is specified in the fixlet site. To find the DNS name, on the computer where the BigFix server is installed, go to the C:\Program Files (x86)\BigFix Enterprise\BES Installers\Server directory, and find the ActionSite.afxm file. If BigFix Inventory cannot ping the BigFix server by using this DNS name, add the name to the etc\hosts file on the BigFix Inventory server.
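For example, on a Linux BigFix Inventory server you can verify name resolution and, if needed, map the name manually. The host name and IP address below are placeholders; use the DNS name from ActionSite.afxm and the actual IP address of your BigFix server.

  # Check that the DNS name from the fixlet site resolves and responds.
  ping -c 3 bigfix-server.example.com

  # If the name does not resolve, add it to the hosts file
  # (on Windows, the file is C:\Windows\System32\drivers\etc\hosts).
  echo "192.0.2.10  bigfix-server.example.com" >> /etc/hosts
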
The import fails. After you rerun the import, the software inventory is empty.
One of the scenarios in which the problem occurs is when you run an import while the BigFix server is not running. After you restart the server and rerun the import, the software inventory is empty. To solve the problem, manually initiate the software scan to gather fresh scan data, and then run the import.

As an alternative, you can run the Force Reupload of Software Scan Results task and then run the import of data. This task forces reupload of inventory data that was gathered by the software inventory and the file system scans to the BigFix server. The data is then imported to BigFix Inventory.

The import fails because the transaction log is full.
After a failed import, the import log contains the following error:

Batch failure. The batch was submitted, but at least one exception occurred on an individual member of the batch. Use getNextException() to retrieve the exceptions for specific batched elements.
ERRORCODE=-4229,
Also, the following error can be found in the tema.log file:
Batch execution error: Error for batch element #903:
The transaction log for the database is full.
To solve the problem, complete the following steps:
  1. Increase the size of the transaction log for the database, as shown in the example after these steps.
  2. Restart the DB2® and BigFix Inventory servers.
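For example, on DB2 you can check and increase the transaction log configuration with commands similar to the following ones. TEMADB is the database name that is used elsewhere in this topic, and the values are examples only; tune them to your environment.

  db2 connect to TEMADB
  # Display the current log configuration.
  db2 get db cfg for TEMADB | grep -i log
  # Increase the size of each log file (in 4 KB pages) and the number of secondary log files.
  db2 update db cfg for TEMADB using LOGFILSIZ 16384
  db2 update db cfg for TEMADB using LOGSECOND 50
  db2 terminate
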
The import fails because the Java heap size is too low.
After a failed import, the import log contains the following error:
E SRVE0777E: Exception thrown by application class
'java.lang.StringBuilder.ensureCapacityImpl:342'
java.lang.OutOfMemoryError: Java heap space

To solve the problem, increase the Java heap size.
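One possible way to do this, assuming the default WebSphere Liberty (wlp) layout that is referenced elsewhere in this topic, is to raise the maximum heap in the server's jvm.options file. The file location and the 8 GB value are assumptions; check the BigFix Inventory documentation for the method and value that are recommended for your version.

  # Append a larger maximum heap size to the Liberty JVM options (assumed location).
  echo "-Xmx8g" >> installation_directory/wlp/usr/servers/server1/jvm.options
  # Restart the BigFix Inventory server so that the new limit takes effect.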

The import fails and the following message is written in the logs: Overflow occurred during numeric data type conversion.
The problem occurs when you create a contract custom field that requires an integer value, and then enter a value that is greater than 32767. To solve the problem, enter a smaller value.
The import fails and the following message is written in the logs: The SQL statement or command failed because of a database system error. (Reason "optBitFilterSize less than min".). SQLCODE=-901, SQLSTATE=58004, DRIVER=4.14.111.
To solve the problem, create a script that reorganizes internal database tables and keeps statistics up-to-date, and run it against the BigFix Inventory database.
  1. Create the reorg.sh script.
    $ cat reorg.sh
    #!/bin/bash
    # Generate and run REORG statements for all user tables in the TEMADB database.
    db2 connect to TEMADB
    db2 -x "select 'reorg table', substr(rtrim(tabschema)||'.'||rtrim(tabname),1,50),
    'allow no access;' from syscat.tables where type = 'T' and tabschema not in
    ('NULLID','SYSCAT','SYSFUN','SYSIBM','SYSPROC','SYSSTAT') order by tabschema,tabname
    " > reorgs.sql
    db2 -tvf reorgs.sql
    db2 terminate

    # Generate and run RUNSTATS statements to refresh table and index statistics.
    db2 connect to TEMADB
    db2 -x "select 'runstats on table', substr(rtrim(tabschema)||'.'||rtrim(tabname),1,50),
    ' and indexes all;' from syscat.tables where type = 'T' and tabschema not in
    ('NULLID','SYSCAT','SYSFUN','SYSIBM','SYSPROC','SYSSTAT') order by tabschema,tabname
    " > runstats.sql
    db2 -tvf runstats.sql
    db2 terminate
  2. Log in as the database instance owner to the computer where DB2® is installed, and run the script, as shown in the example below.
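    A minimal usage sketch, assuming that the instance owner is db2inst1 (the user name that is also shown in the lock example later in this topic) and that reorg.sh was created in that user's home directory; adjust the user and path to your environment.

    su - db2inst1
    chmod +x ~/reorg.sh
    # The script writes reorgs.sql and runstats.sql to the current directory.
    ~/reorg.sh
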
The import fails and the following message is written in the logs: INFO: ETL from Data Source data_source_name - RawDatasourceFixletResult: Failed
The problem occurs because there is not enough disk space on the computer where the BigFix Inventory database is installed. To solve the problem, free some disk space.
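For example, on a Linux database server you can check which file systems are running out of space before you decide what to remove; the database path below is only a placeholder.

  # Show the free space on all mounted file systems.
  df -h
  # Check the size of the directory that holds the BigFix Inventory database
  # (placeholder path; use the actual database path of your installation).
  du -sh /home/db2inst1
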
The import fails and the following message is written in the logs: Snapshot isolation transaction failed accessing database 'TEMADB' because snapshot isolation is not allowed in this database. Use ALTER DATABASE to allow snapshot isolation.
To solve the problem, enable snapshot isolation in the MS SQL Server. For more information, see the Microsoft SQL Server documentation.
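For example, you can enable snapshot isolation with a statement similar to the following one, run with sqlcmd or from SQL Server Management Studio. The server name is a placeholder, and TEMADB is the database name that is used elsewhere in this topic; adjust both to your environment and add authentication options as required.

  # -E uses Windows authentication; replace it with -U and -P if you use SQL Server authentication.
  sqlcmd -S your_sql_server -E -Q "ALTER DATABASE TEMADB SET ALLOW_SNAPSHOT_ISOLATION ON"
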
After adding a large data source, the import fails and the following message is written in the logs: 500 Internal Server Error.
The problem might occur because there is not enough disk space on the computer where the BigFix database is installed. To calculate the amount of required disk space, perform the following steps:
  1. Optimize the import of data from BigFix. Log in to BigFix Inventory and go to Management > Advanced Server Settings. Then, change the value of the schema_next parameter to true.
  2. To calculate the required disk space, check how many objects exist in all fixlet sites that you have enabled in the BigFix console. Objects include every computer group, analysis, fixlet, and task that exists in the console, including the ones that are not relevant. Every 1000 objects requires 1 GB of free disk space. For example, if you have 500 fixlets and tasks, 300 analyses, and 20 computer groups, you have 820 objects in total. In this case, the BigFix database requires 1 GB of disk space.
The import fails and the following message is written in the logs: INFO: ETL from Datasource - RawDatasourceAnalysis (0x000000 - 0x00000035): Failed.
During the import, the following error is written in the logs:
INFO: ETL from Datasource - RawDatasourceAnalysis (0x000000 - 0x00000035): Failed
ERROR: Sequel::SerializationFailure: Java::ComMicrosoftSqlserverJdbc::SQLServerException:
Transaction (Process ID 1111) was deadlocked on lock resources with another process
and has been chosen as the deadlock victim. Rerun the transaction
To solve the problem, ensure that no actions such as a backup or recovery are taking place on the BigFix database. Then, rerun the import.
The import fails after you change the host name of the BigFix Inventory server.
Changing the host name of the BigFix Inventory server is not supported. When the application cannot recognize the original host name, the ETL step of a data import fails and you cannot gather and process data from your endpoints. A new host name requires a new installation of BigFix Inventory.

The host name of the BigFix server also cannot be changed, because it is recorded in your license certificate during the installation. To change the host name, you must create a new license certificate, which requires a new installation.

The import hangs after the database connection is lost and recovered
During the import of data, the database connection is lost and information about connection problems is displayed on the user interface. After you restart the database, the user interface is refreshed but the import hangs. Additionally, the following or a similar error is written in the logs.
ERROR: Sequel::DatabaseError: DBNAME: temadb25 - Java::ComIbmDb2JccAm::SqlNonTransientConnectionException:
[jcc][t4][10335][10366][4.22.29] Invalid operation: Connection is closed. ERRORCODE=-4470, SQLSTATE=08003
To solve the problem, restart the BigFix Inventory server.
A successful software scan is run but there are no changes to the data after the data import to the BigFix Inventory server.
Check the maximum archive file size to ensure that it is greater than the scan file size.
  1. Log on to the BigFix console.
  2. In the left navigation, click Computers, right-click the name of the appropriate computer, and then click Edit Computer Settings.
  3. Check that the _BESClient_ArchiveManager_MaxArchiveSize setting is greater than the size of the largest scan file. If needed, edit the value to increase the maximum archive size.
Verify that there are no software scan errors.
  1. In the BigFix console navigation, click Sites > External Sites > BigFix Inventory v10 > Analyses.
  2. Select the Software Scan Status analysis.
  3. In the lower pane, click the Results tab, and verify that the status of the software scan is OK for the computers.
The import is delayed when there are more than 20 computer groups and all computers have metrics calculation enabled.
Some of the products are not discovered.
If you optimized the volume of scanned file data, either during the post-upgrade configuration or manually, you must run an import for the changes to take effect. After the import, some software items might not be visible on the reports. This is expected behavior. To ensure that the software inventory is properly reported, perform the following steps.
  1. Ensure that the catalog that is uploaded to BigFix Inventory is in the canonical format. If the catalog is in the native format, upload a new catalog. If the catalog is in canonical format but a new version is available, upload the new catalog. To check the format of the uploaded catalog, click Management > Catalog Upload and check the Catalog Format column.
  2. Stop the current scan.
    1. Log in to the BigFix console and in the left navigation tree, click Actions.
    2. In the upper-right pane, click Initiate Software Scan and then click Stop.
  3. Initiate a new software scan. Wait for the scan to finish.
  4. Wait for the scheduled import or run it manually.
Files with a particular extension are not reported on the Scanned File Data report.
The problem might occur if you optimized the volume of scanned file data and removed the extension from the list of monitored extensions. To solve the problem, contact HCL Support to add the file extension that you want to monitor.
After a failed import of scan data, the data that is displayed in reports is inconsistent
If there are inconsistencies in the data, and the last data import failed, the data inconsistencies might be the result of the import failure. To solve the problem, run the import again.

As an alternative, you can run the Force Reupload of Software Scan Results task and then run the import of data. This task forces reupload of inventory data that was gathered by the software inventory and the file system scans to the BigFix server. The data is then imported to BigFix Inventory.

The data import log cannot be viewed in the user interface.
The problem might occur if you enabled debug logging for the import process. Debug information significantly increases the size of the import log file and it might happen that the log file cannot be displayed in the user interface. To solve the problem, consider the following solutions:
  • Instead of viewing the log file in the user interface, open it as a file (see the example after this list). The log file is in one of the following directories:
    • Linux: installation_directory/wlp/usr/servers/server1/logs/imports
    • Windows: installation_directory\wlp\usr\servers\server1\logs\imports

    To avoid problems in the user interface, move the log file to a different location so that BigFix Inventory does not load it into the user interface.

  • Disable debug logging for the import process to limit the amount of information that is saved in the log file.
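
For example, on Linux you can open the most recent import log directly from the imports directory; installation_directory is the placeholder that is used in the list above.

  cd installation_directory/wlp/usr/servers/server1/logs/imports
  # Open the newest import log in a pager instead of the user interface.
  less "$(ls -t | head -n 1)"
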
Software is properly discovered by the scan but is not reported in BigFix Inventory after the import.
The problem occurs on BigFix 9.0 that is installed on Linux. If the value of the sequence in the BigFix database changed and is higher than the sequence that was imported, scan results are not imported during that import. To solve the problem, wait for the next scheduled import or run the import manually.
Data about a computer was not imported from the BigFix server.

Because BigFix Inventory imports live data that constantly changes on the BigFix server, some data, for example data about computers, might not be imported. Only the scope of data that is calculated at the beginning of an import is processed during that import.

Example: If a computer is saved in the BigFix database while a data import from BigFix to BigFix Inventory is in progress, the computer is not imported during that import.
Important: It might also happen that after the second data import, the imported computers do not have important properties such as Computer Name, IP Address, or Operating System. To fix this problem:
  1. In the navigation tree of the BigFix console, click Computers and then in the list pane, select the computer for which the properties are missing.
  2. In the lower-right pane, click Send Refresh three times. All missing computer properties are imported during the next data import.
BigFix Inventory takes longer to import data because of the File Facts ETL step
To improve the data import and the overall file scan, the default list of directories that are excluded from scanning on Windows endpoints is extended. For a newly installed scanner, the list of exclusions is extended with directories that contain large files that are not related to software discovery. The excluded directories are listed in the KB article. To add your own exclusion entries for upgraded scanners, follow the instructions that are provided in the KB article.
If an import fails because of a locked table in the database, the import log shows details about the active locks on that table. For example:
2023-04-05 12:56:00 (+0:03:07.842) ERROR:  (ImportThread) Application tries (try 1 out of 5) to get exclusive access to the adm.computer_membership table but it has been denied and following active locks have been found:
Note: "SQL Query" might be empty if query concluded but transaction is not committed.
Lock 1:
-> Database Name: TEMABVT, 
-> Schema Name: ADM, 
-> Table Name: COMPUTER_MEMBERSHIP, 
-> IP Address: 10.190.82.231, 
-> User Name: DB2INST1, 
-> SQL Query: 'null', 
-> Elapsed Time: null, 
-> Application Status: UOWWAIT
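
To find out which application holds the lock, you can query DB2 directly, for example with the following commands, run as the DB2 instance owner. TEMABVT is the database name from the example above. Forcing an application off the database is shown only as an option; use it only when you are sure that the connection can be safely interrupted.

  # List the applications that are connected to the database, with their handles.
  db2 list applications for database TEMABVT show detail
  # Show details about the locks that are currently held.
  db2pd -db TEMABVT -locks showlocks
  # Optionally, disconnect the blocking application by its handle.
  db2 "force application (application-handle)"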