Track These Software Asset Management Program Metrics for Success

Nobody plays a sport without keeping score. Yet plenty of organizations implement a software asset management and license optimization program without any structured way to measure actual outcomes. This article offers some ideas on how to measure and track the data, operations and outcomes of a program focused on optimizing the value and usage of software in an organization.

Business cases for a software asset management and license optimization program will present broad ideals of what ‘success’ looks like, such as:

  • Reduce risk of unplanned payments from using unlicensed software
  • Reduce risk of security breaches arising from use of unpatched software
  • Save money on unneeded maintenance
  • Avoid over-purchasing of software licenses
  • Reduce the time and effort required to fulfill requests for new software

Bridging the gap between high-level objectives like these and concrete, measurable outcomes can take some thought. The key principle is to work out what is meaningful and important, and then seek ways to measure those things. [1]

Best-in-class organizations have a robust approach to monitoring and improving a range of metrics. Metrics will often be used as the basis for key performance indicators (KPIs) for teams who have responsibility for managing and influencing those metrics.

With that in mind, some sample metrics that may be useful to help with software license optimization are listed below. The metrics have been broken down into three areas:

  1. Input data metrics
  2. Operational metrics
  3. Financial metrics

Input data metrics

A software license optimization program heavily relies upon comprehensive and accurate input data about IT hardware and software assets within the organization. This data is best held within a central Asset Management Database (AMDB) containing all of the data gathered from multiple sources. Input data focused metrics are useful for assessing the trustworthiness of this asset data. They can be used to highlight areas where data may not be clean, and to help identify data gaps for remediation.

Examples are:

  1. % of active devices for which current hardware and software inventory is available.
  2. % of devices found on the network which are not recognized as known assets.
  3. % of devices found active on the network which have a status indicating they should not be active (for example, they are classified as being in storage or retired).
  4. % of devices found on the network with a bad (blank, duplicate or blacklisted) serial number.
  5. % of all IT assets in the organization that are represented in the AMDB.
  6. Number of IT asset purchases that are not yet represented in the AMDB.
  7. % of assets with no recorded location, business unit, etc.
  8. % of computer records with missing processor information (auto-discovery tools which gather inventory do not always reliably collect this data).
  9. Number of active individual-use assets with no assigned user.
  10. Number of assets which are assigned to people who have left the organization.
  11. % of virtual machines for which physical host information is not known.

Operational metrics

Operational metrics help to measure the operational performance of software license optimization activities. Examples are:

  1. % of application installations that have been mapped to normalized application details.
  2. Number of installations of commercial applications without a license.
  3. % of application installations which have not been used within the last (say) 90 days.
  4. Average number of different versions deployed of each application.
  5. % of assets which are not actively deployed (for example, hardware assets on the shelf, or software licenses not used).
  6. % of installations which are not using the latest patch level available from the software publisher.
  7. Number of installations of applications with known security vulnerabilities.
  8. Number of installations of applications which are unauthorized (prohibited) for use.
  9. % of installations which are of applications that have reached their end of support life.
  10. Average time taken to fulfill an end user’s request for new software.
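To make a couple of these concrete, here is a hedged sketch of metrics 3 and 4 (stale installations and version sprawl). The input shapes are assumptions; in practice the usage dates would come from a software metering tool:

```python
from datetime import date, timedelta

def stale_install_pct(installs, today, threshold_days=90):
    """Metric 3: % of installations not used within the last `threshold_days`.

    `installs` is a list of (application, last_used) pairs, where
    last_used is a date or None if usage was never recorded.
    """
    if not installs:
        return 0.0
    cutoff = today - timedelta(days=threshold_days)
    stale = sum(1 for _, last_used in installs
                if last_used is None or last_used < cutoff)
    return round(100.0 * stale / len(installs), 1)

def versions_per_app(installs_with_version):
    """Metric 4: average number of distinct versions deployed per application.

    `installs_with_version` is a list of (application, version) pairs.
    """
    versions = {}
    for app, ver in installs_with_version:
        versions.setdefault(app, set()).add(ver)
    return round(sum(len(v) for v in versions.values()) / len(versions), 2)
```

A high version count per application is often an early warning sign for both patching gaps (metric 6) and end-of-support exposure (metric 9).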

Financial metrics

Financial metrics are useful for measuring the outcomes from a software asset management and license optimization program in terms of the value or cost to the organization. Examples are:

  1. Value of software installations proactively removed.
  2. Value of maintenance not renewed.
  3. Value of software requests fulfilled from existing software license entitlements (i.e. without having to purchase new licenses).
  4. Value of unlicensed deployments identified and remediated.
  5. % of software licenses under maintenance which are currently used.
  6. Value of contingency on the company’s balance sheet or allocated in budgets for unplanned software license liabilities (i.e. due to software audit true-up fees).
  7. Number of software vendor audit notifications received per year.
  8. Labor costs associated with responding to software license audits (and/or average cost per audit).
  9. Costs paid arising from software license audit findings per year.
  10. Number of security incidents arising from software vulnerabilities per quarter.
  11. % of license and maintenance costs that are charged back to other parts of the business.
  12. Ratio of the value delivered to the organization from the SLO program to the full time equivalent people supporting the program.
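Two of these are simple enough to sketch directly; the figures and field shapes are hypothetical and would in practice come from finance and the SAM tool:

```python
def program_value_per_fte(value_delivered: float, ftes: float) -> float:
    """Metric 12: ratio of value delivered by the SLO program to the
    full-time-equivalent headcount supporting it."""
    return value_delivered / ftes if ftes else 0.0

def maintenance_utilization(licenses) -> float:
    """Metric 5: % of licenses under maintenance that are currently used.

    `licenses` is a list of (under_maintenance, in_use) boolean pairs.
    """
    maintained = [in_use for under, in_use in licenses if under]
    return (round(100.0 * sum(maintained) / len(maintained), 1)
            if maintained else 0.0)
```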

What are your favorite SAM metrics? How do you go about measuring them?

To learn more, please visit our website.

[1] Of course, some meaningful things in life are very hard to measure. However, don’t snatch defeat from the jaws of victory by letting that get in the way of measuring what can be measured.