Good practices for running scans and imports

Apply these good practices when running scans and imports to ensure that your infrastructure works efficiently.

Initial configuration task

  • Run the initial import

    It is a good practice to run the initial import before you schedule scans and activate analyses. This import uploads the software catalog from the installation directory to BigFix Inventory and extracts basic data about endpoints from the BigFix server. Make sure that the scan data from the first scan group is available on the BigFix server, and then run the second import. When the scan results from the second scan group are available on the BigFix server, run the next data import.

Scanning-related activities

  • Plan scan frequency

    After you find the optimal size of scan groups, set the frequency of software scans. The most common frequency is weekly, so that every endpoint is scanned once a week. If your environment has more than 100 000 endpoints, consider scanning less frequently, for example monthly. If scans run daily, take system updates into account: when many files are modified, the next data import takes longer.
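As a rough illustration, splitting endpoints into daily scan groups for a weekly cadence can be sketched in Python. This helper is hypothetical and not part of BigFix Inventory; the 100 000-endpoint figure is the threshold mentioned above.

```python
# Sketch: divide endpoints into scan groups so that each endpoint is
# scanned once per cycle, with one group scanned per day.
# plan_scan_groups is a hypothetical helper, not a BigFix Inventory tool.

def plan_scan_groups(endpoint_count, cycle_days=7):
    """Return the number of endpoints to scan on each day so that
    every endpoint is covered exactly once per cycle_days."""
    base = endpoint_count // cycle_days
    remainder = endpoint_count % cycle_days
    # Spread the remainder over the first few days of the cycle.
    return [base + (1 if day < remainder else 0) for day in range(cycle_days)]

# 100 000 endpoints scanned weekly: roughly 14 286 scans per day.
daily_load = plan_scan_groups(100_000, cycle_days=7)
print(daily_load)       # [14286, 14286, 14286, 14286, 14286, 14285, 14285]
print(sum(daily_load))  # 100000
```

Keeping the daily load even in this way prevents one day's scans from flooding a single import with results.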

  • Avoid scanning when it is not needed

    Scan frequency depends on how often software changes on the endpoints and on your reporting needs. Group together endpoints on which software changes frequently, and scan them more often, for example once a week. Group together endpoints with a more stable set of software, and scan them less often, for example once a month.

  • Limit the number of computer properties gathered during scans

    By default, BigFix Inventory retrieves four computer properties from the BigFix server: computer name, DNS name, IP address, and operating system. Imports take much longer if you extract more properties from BigFix during each import. It is a good practice to limit the number of computer properties to 10 or fewer.

  • Limit the number of BigFix Inventory computer groups

    Create only as many computer groups as are needed. Data imports take longer as the number of computer groups grows. If the size of your environment requires that you create many computer groups, consider skipping the calculation of extended software aggregates. Skipping these calculations can shorten data imports in very large environments. For more information, see: Disabling the calculation of extended software aggregates.

  • Enable the collection of checksums for a small group of endpoints

    Each change to the configuration of cryptographic hash collection (enabling, disabling, adding new types) significantly lengthens the first data import that follows the change. Because many files on the file system are modified, the new configuration triggers a complete data import instead of a delta import, in which only modifications are imported. This first data import might take up to three times longer, and subsequent imports about 10% longer, than data imports without file hashes. The impact on subsequent data imports is moderate. Before you enable the collection of file hashes, divide your environment into scan groups to distribute the load of the imported data. For more information, see: Enabling the collection of checksums.
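To gauge the effect, the figures above can be put into a quick back-of-the-envelope calculation. The 2-hour baseline is an assumed example, not a measured value:

```python
# Estimate import duration after changing checksum collection.
# The 3x and 10% factors come from the guidance above; the 2-hour
# baseline is a hypothetical example value for your environment.
baseline_hours = 2.0

first_import_hours = baseline_hours * 3.0    # full import, up to 3x longer
subsequent_hours = baseline_hours * 1.10     # delta imports, about 10% longer

print(f"First import after the change: up to {first_import_hours:.1f} h")
print(f"Subsequent imports: about {subsequent_hours:.1f} h")
```

With a 2-hour baseline, the first import after the change could run up to 6 hours, and later imports about 2.2 hours, which is why staggering scan groups before enabling checksums matters.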

Import-related activities

  • Maintain frequent imports

    After the installation, imports are scheduled to run once a day. Do not change this configuration, although you might want to change the hour at which the import starts. If your import takes longer than 24 hours, improve the scan group configuration. Alternatively, you can keep the current configuration, because BigFix Inventory handles overlapping imports: if an import is running, no other import is started.

  • Consider web user interface

    Data import is a computation-intensive task, so expect slower user interface response times while an import is running. It is therefore better to schedule imports for times when you are not likely to use the application web UI.

    If, in large deployments, loading reports takes a substantial amount of time, see: User interface problems.