Looking for Software Savings in all the Wrong Places
May 17, 2016 · 4 min read


It has been over a year since the disappearance of Malaysia Airlines Flight 370, bound from Kuala Lumpur to Beijing. In that time, 26 countries and several private contractors have spent $93 million searching 2.96 million square miles. The flight recorder occupies about 0.3 cubic feet, while the current search area covers 150 billion cubic feet. Looking for something that small in a large, dark ocean is like searching blindfolded for a broken M&M in Yankee Stadium. So why can’t thousands of workers, millions of data points, and the smartest and brightest engineers, scientists, mathematicians, and oceanographers find an airplane measuring 209 by 199 feet?

Having analyzed 100 million pieces of data representing over $15 billion in software costs from more than 1,000 data centers across the globe, we have found that a data center’s search for software savings, much like the search for MH370, is difficult if you don’t know where to look. The brightest and best-intentioned people can still be looking in the wrong place, and without data showing precisely where to look, money will be spent without success. Data centers spend heavily on tools to provide data; they hire consultants with very sophisticated processes and maturity models; and they seek negotiators with years of experience at securing a perceived “best price.” Do data, processes, pricing, and compliance find all the available savings? Some, but rarely the majority.

Some data centers collect so much data in their search for savings that much of it isn’t meaningful or actionable. Error-filled reports are commonly handed to someone with little or no background in hardware configuration to interpret. When one report identified a specific server as having a CPU core count of 16, no one realized that the maximum core count for that server is eight. Without knowing what to look for in the data and where the errors are typically found, many of those errors are missed. We have found errors in 100% of the reports that expert auditors have validated. Having the right data, and knowing where to look in it, is powerful.
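
As a minimal sketch of the kind of sanity check that catches errors like the 16-core report above, the snippet below compares reported core counts against a table of maximum cores per server model. The model names, limits, and inventory rows are illustrative assumptions, not real hardware or inventory data.

```python
# Minimal sketch: flag inventory rows whose reported core count exceeds
# the hardware maximum for that server model. Model names and limits
# here are hypothetical placeholders, not real inventory data.

MAX_CORES_BY_MODEL = {
    "ExampleServer-X1": 8,
    "ExampleServer-X2": 16,
}

inventory_report = [
    {"hostname": "db01", "model": "ExampleServer-X1", "reported_cores": 16},
    {"hostname": "app02", "model": "ExampleServer-X2", "reported_cores": 16},
]

def find_core_count_errors(report, max_cores_by_model):
    """Return rows where the reported core count is impossible for the model."""
    errors = []
    for row in report:
        max_cores = max_cores_by_model.get(row["model"])
        if max_cores is not None and row["reported_cores"] > max_cores:
            errors.append(row)
    return errors

for bad_row in find_core_count_errors(inventory_report, MAX_CORES_BY_MODEL):
    print(f"{bad_row['hostname']}: reported {bad_row['reported_cores']} cores, "
          f"but {bad_row['model']} supports at most "
          f"{MAX_CORES_BY_MODEL[bad_row['model']]}")
```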

Another common practice in data centers is to look for savings through “mature” software asset management (SAM) processes. The problem is that SAM maturity has become the goal rather than the means. A process is only mature if it gets you where you need to be. The most sophisticated processes ever developed belong to large governments and militaries, yet they have not found MH370, because a process is only as mature as the knowledge of where and how to apply it. Data centers have built mature processes, yet they continue to overspend and have become handicapped by those very processes. If the processes are not getting the data center to industry “best in class” status in its cost structure, they become a liability.

One of the most common problems in data centers is using software vendor discounts as the means of managing software costs. Hundreds of former software vendor employees have become consultants, offering to help you get the best discount from their previous employer. When one of the most recognized global financial services companies renegotiated with its largest software vendor, it pushed to increase a discount rate that was already an impressive 92%. The problem was that the company was paying $60 million over list price and didn’t know it. Another data center, getting a 35% discount from a vendor that historically gave only 20%, thought it had negotiated a great deal; it didn’t realize it was paying 28% above list price. Most data centers pursue the discount as the ultimate measure of success, but it is rarely an accurate barometer, because there is a strong correlation between discount and waste: the more waste in a data center, the greater the discount.
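
To see how a deep discount and a large overpayment can coexist, consider a purely hypothetical purchase. All figures below are illustrative assumptions, not the contract details from the cases above: the discount is applied to an oversized bundle, so the effective cost still exceeds the list price of what was actually needed.

```python
# Hypothetical illustration: a 92% discount on an oversized bundle can still
# cost more than paying full list price for only what is actually needed.
# All figures are assumptions for the sake of arithmetic, not real contract data.

list_price_per_license = 10_000      # assumed list price per license
licenses_purchased = 5_000           # oversized bundle sold at a deep discount
licenses_actually_needed = 300       # what the environment really requires
discount_rate = 0.92                 # the "amazing" negotiated discount

amount_paid = licenses_purchased * list_price_per_license * (1 - discount_rate)
needed_at_list = licenses_actually_needed * list_price_per_license

print(f"Paid with 92% discount:        ${amount_paid:,.0f}")
print(f"Needed licenses at full list:  ${needed_at_list:,.0f}")
print(f"Overpayment despite discount:  ${amount_paid - needed_at_list:,.0f}")
```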

A growing trend is to look for software savings in compliance. Because every data center is being audited by at least one software vendor, data centers are scrambling to become compliant with their licenses, and they measure success by the degree to which they are compliant. Although software compliance has real legal and financial implications, being compliant on a bloated cost structure is still wasted money.

So where should a data center look for software savings? It should benchmark its software costs and identify where they deviate most from an industry “best in class” cost structure. A data center should always know its own cost structure, the industry average, the best-in-class costs, and why they differ. Is it spending too much with a particular vendor, too much on a given functionality (e.g., development tools, database, performance management), or too much on a given platform? Is its product count, product mix, vendor mix, or vendor pricing off? Are its unit costs per capacity unit high, and if so, where? Every sports team judges its players by their stats against the best at that position in that sport. Unfortunately, data centers avoid the critical software cost benchmarks that could serve as their guide. A data center that does not know where its costs should be is repeating the $93 million search for the lost MH370.
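
As a rough sketch of the comparison this implies, the snippet below computes how far a data center’s unit cost per capacity unit sits above a best-in-class benchmark in each functional category. The categories, benchmark figures, and field names are illustrative assumptions, not actual benchmark data.

```python
# Rough sketch: compare a data center's unit cost per capacity unit against
# assumed best-in-class benchmarks, by functional category. All numbers and
# category names are illustrative placeholders, not real benchmark data.

my_unit_costs = {          # $ per capacity unit, by functionality (assumed)
    "database": 480,
    "development tools": 95,
    "performance management": 160,
}

best_in_class = {          # assumed industry best-in-class unit costs
    "database": 310,
    "development tools": 90,
    "performance management": 105,
}

for category, my_cost in sorted(my_unit_costs.items()):
    benchmark = best_in_class[category]
    gap_pct = (my_cost - benchmark) / benchmark * 100
    print(f"{category:<25} ${my_cost:>4} vs best-in-class ${benchmark:>4} "
          f"({gap_pct:+.0f}%)")
```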
