The importance & value of Trustworthy Data – Part 1

Accurate and insightful intelligence based on Trustworthy Data is an essential component of successful ITAM strategies. Only when organizations understand and manage this data can they use it to effectively inform compliance, contract, consumption and control – the four stages of ITAM maturity that are key for all successful ITAM programs – and make intelligence-led decisions that will benefit both cost and risk.

Traditionally, ITAM has focused heavily on compliance and audits, on mitigating risk and avoiding undue, unbudgeted outlays. More recently, however, the focus has shifted from risk to the deal or transaction itself. Contract renewals present an opportunity every three to five years to renegotiate a better, more optimized deal based on actual consumption. Trustworthy Data plays a vital part here, particularly as cloud consumption and investment continue to grow.

To make a successful, cost-effective and low-risk move to the cloud, large organizations in particular require a huge amount of Trustworthy Data, along with a specific set of information and analytics, to ensure their cloud services are managed effectively. While it is harder to be non-compliant in the cloud, it is very easy to overspend, which is why the focus has moved from risk to cost control and optimization – and why having Trustworthy Data is invaluable.

In Part 1 of our ‘The importance and value of Trustworthy Data’ series, we outline how to access and measure Trustworthy Data.


Accessing accurate data

Accurate, insightful data is invaluable and must be at the heart of your ITAM strategy. It is therefore essential to know how to obtain this data and, even more crucially, know how to extract its maximum value. To do this, structured processes need to be implemented to keep on top of contract lifecycles and to better manage software and cloud assets.

The challenge is that cloud and software environments grow more complex every day, with data sources increasing as a result. Organizations are increasingly turning to SAM tools to help them organize and optimize this growing amount of information – but to do so effectively, they must collect the right data.

Indeed, it is important to remember that technology platforms can collect and analyze data, but they cannot decide what data needs to be collected. Complement the tools with a layer of human expertise and knowledge to provide intelligence-driven insight into what should be collected; knowing what to look for is essential to optimizing every IT function.


The challenges of collecting Trustworthy Data

To obtain a good representation of your environment, you need to take 100 per cent of your estate into account; otherwise it is very difficult to make an informed decision about risk, opportunity or migration.

In addition, vast amounts of data need to be collected. Entire estates are complex, and capturing data, measuring coverage and processing all of that information can be an overwhelming task. Collection can, and should, be automated and run on an ongoing basis to build a more accurate picture and ensure no sources are missed; with automated services, the collection frequency can be adjusted, which is particularly helpful in more dynamic cloud environments. This process also enables you to establish a baseline against which further information can be measured as a meaningful benchmark.

This baseline should be created from a combination of sources. While having multiple sources is useful and necessary, it also introduces a fair amount of duplication, so setting up an ongoing process to easily measure coverage is essential.
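As a rough illustration only, the sketch below shows one way such coverage measurement could work: inventory records from multiple hypothetical discovery sources are merged, de-duplicated on a device identifier, and compared against a baseline of known hosts. All source names, field names and record shapes here are assumptions for illustration, not the output of any specific SAM tool.

```python
def merge_sources(sources):
    """Combine inventory records from multiple discovery sources,
    de-duplicating on a device identifier (hostname, in this sketch)."""
    merged = {}
    for source_name, records in sources.items():
        for record in records:
            key = record["hostname"].lower()
            # Keep the first sighting of each device, but note every
            # source that reported it, so overlaps remain visible.
            entry = merged.setdefault(key, {**record, "seen_by": []})
            entry["seen_by"].append(source_name)
    return merged


def coverage(merged, baseline_hosts):
    """Percentage of the known estate baseline present in collected data."""
    found = sum(1 for host in baseline_hosts if host.lower() in merged)
    return 100.0 * found / len(baseline_hosts)


# Hypothetical example: two sources that partially overlap.
sources = {
    "agent_scan": [{"hostname": "Host1"}, {"hostname": "host2"}],
    "cmdb_export": [{"hostname": "HOST1"}],
}
merged = merge_sources(sources)
```

Run on an ongoing basis, a measure like this gives you the benchmark described above: a stable coverage figure to compare each collection cycle against.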


Measure your coverage

It is important to track the data you are collecting, in whatever cycle or frequency you use. The aim is to see consistency in the levels of data collected: if you notice a big drop or major variances, having the collection processes in place means you can find where the discrepancies lie.
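The consistency check described above can be sketched very simply: compare record counts across collection cycles and flag any cycle where the count drops by more than some tolerance. The data shape and the 20 per cent threshold below are assumptions chosen for illustration, not a recommendation from any particular tool.

```python
def flag_variances(counts_by_cycle, drop_threshold=0.2):
    """Given record counts per collection cycle (oldest first), return
    the indices of cycles where the count dropped by more than the
    threshold relative to the previous cycle."""
    flagged = []
    for i in range(1, len(counts_by_cycle)):
        prev, curr = counts_by_cycle[i - 1], counts_by_cycle[i]
        # A drop beyond the tolerance suggests a source went missing
        # or a collector failed, and is worth investigating.
        if prev > 0 and (prev - curr) / prev > drop_threshold:
            flagged.append(i)
    return flagged


# Hypothetical monthly device counts: the third cycle shows a sharp drop.
suspect_cycles = flag_variances([1000, 980, 700, 710])
```

In practice you would run a check like this per data source, so a flagged cycle points you directly at the source where the discrepancy lies.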

It is also important to keep in mind that automated tools should not be relied upon alone, as some cannot capture all the necessary information, such as non-instance data. To fill these gaps, you may have to communicate with administrators, capture screenshots and, as stated above, add a layer of human knowledge, for example from external third parties. A combination of automation and manual effort is key to knowing what needs to be collected and how to extract its maximum value.


Keep an eye out for Part 2 of ‘The importance and value of Trustworthy Data’, where we take a look at the cleansing and normalization process for obtaining Trustworthy Data.

To learn more, download our Trustworthy Data Guide or visit our dedicated Trustworthy Data page. You can also listen to our webinar, Visualize your ITAM data.


ABOUT THE AUTHOR

Simon Leuty - Chief Technology Officer

Simon is a founder of Livingstone Technologies. He sits on the Executive Leadership Team as Head of Platform Development and is responsible for the development roadmap of our client portal, LUCE.

He has over 15 years’ experience of the Software Asset Management and cloud markets, working closely with our growing customer base to help them take control of their software and cloud costs. Simon helps customers eliminate unacceptable spend and enforce tight governance standards that keep them compliant, agile and secure.
