Distribution of scans for improved performance

The performance of importing data from the BigFix server to BigFix Inventory depends on the number of scan files, usage analyses, and package analyses that are processed during a single import. By properly scheduling scans and distributing them over the computers in your infrastructure, you can shorten the data import.

Use the following guidelines to improve the performance of the data import:
Divide computers into groups
  • After you install BigFix Inventory, do not run the scans on all computers in your infrastructure. Divide the computers into groups in the BigFix console and start by gathering the default properties from a single computer group. For more information about creating computer groups in the BigFix console, see: Computer groups.
  • Consider creating computer groups that are based on stability. In a stable environment, scans can be run less frequently than once a week. A minimal grouping sketch follows this list.
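
The following Python sketch illustrates the idea of splitting endpoints by stability before you create the corresponding groups in the BigFix console. The computer names and stability labels are purely hypothetical; the actual grouping is done with manual or automatic computer groups in the console.

```python
from collections import defaultdict

# Hypothetical endpoint records; in a real deployment this information lives
# in the BigFix console, where the groups themselves are created.
computers = [
    {"name": "srv-db-01", "stability": "stable"},
    {"name": "srv-app-02", "stability": "stable"},
    {"name": "wks-dev-17", "stability": "volatile"},
    {"name": "wks-dev-23", "stability": "volatile"},
]

# Split endpoints by how often their software changes: stable servers can be
# scanned less frequently than once a week, volatile workstations more often.
groups = defaultdict(list)
for computer in computers:
    groups[computer["stability"]].append(computer["name"])

for stability, members in groups.items():
    print(f"{stability}: {len(members)} computers -> {', '.join(members)}")
```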
Schedule scans to run at different times
  • Schedule scans to run on different days for different computer groups, as in the sketch after this list.
  • Avoid scanning multiple groups on the same day or on consecutive days. Otherwise, the scans and data imports might interfere with one another.
  • Reduce the frequency of scans. In most cases, it is sufficient to scan the infrastructure once a week, which is the default frequency. In large environments, you can disable the option to automatically run scans and initiate them only when necessary. The minimum scan frequency is once per month.
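
As a rough illustration of why staggering matters, the following Python sketch assigns each computer group a different scan day and compares the number of scan files that a single import must process when scans are spread over the week versus run all at once. The group names and sizes are hypothetical; the actual schedule is configured when you initiate the software scan for each group in the BigFix console.

```python
# Hypothetical group sizes; the real schedule is set in the BigFix console
# when the software scan is initiated for each computer group.
group_sizes = {
    "servers-prod": 6000,
    "servers-test": 4000,
    "workstations-emea": 8000,
    "workstations-amer": 7000,
    "workstations-apac": 5000,
}

weekdays = ["Mon", "Tue", "Wed", "Thu", "Fri"]

# Assign each group a different weekday so that no single import has to
# process scan files from the whole infrastructure.
schedule = {group: weekdays[i % len(weekdays)] for i, group in enumerate(group_sizes)}

# Compare the per-import workload: all groups scanned together versus
# scans distributed over the week.
all_at_once = sum(group_sizes.values())
per_day = {
    day: sum(size for group, size in group_sizes.items() if schedule[group] == day)
    for day in weekdays
}

print(f"Scan files per import if all groups are scanned together: {all_at_once}")
for day in weekdays:
    print(f"{day}: about {per_day[day]} scan files in the next import")
```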
Limit the amount of gathered data
  • In the initial deployment phase, or if you do not need the information, disable the collection of software usage. For more information, see: Disabling the collection of software usage.
  • Limit the number of computer properties that are gathered during scans.
For more information about distributing scans and other actions that you can undertake to improve the application performance, see: Tuning performance.