How AI and Real-Time ITAM Can Revolutionize Your Organization

Host: Kris Johnson, Anglepoint Chief Product Officer

Guest: Ron Brill, Anglepoint President & Chairman, Chair at ITAM Forum, Chair at ISO ITAM Standards Committee

In this episode of The ITAM Executive, Kris Johnson and Ron Brill discuss the necessity and increasing importance of AI and real-time IT Asset Management.

Traditionally, ITAM has focused on audits for software compliance and contract negotiation. These practices often rely on manual processes and Excel sheets. With the rapid pace of technological changes, ITAM needs to evolve from a point-in-time practice to real-time data management.

This evolution will empower organizations to use generative AI technologies to better manage licenses, enhance cybersecurity, and reduce costs. Technologies such as automation, artificial intelligence, and intelligent tooling will be crucial in achieving real-time ITAM.

Listen as Ron and Kris discuss this critical transition (and the strides that Anglepoint has made) to real-time ITAM.

By listening to this episode, you’ll learn about:

  • The evolution and future of AI and real-time ITAM
  • The need for intelligent document processing (and Anglepoint’s experience with it so far)
  • The organizational impacts of automation and performing tasks with AI
  • The future landscape of ITAM
  • And more

Additionally, for a limited time, don’t miss complimentary access to the latest Gartner AI research: Defining AI and Setting Realistic Expectations.

Episode Transcript

Kris Johnson:

Hi, welcome to another episode of the ITAM Executive. I’m Kris Johnson, Chief Product Officer at Anglepoint.

With me is Ron Brill, President and Chairman of Anglepoint, also Chairman of the ITAM family of standards with the ISO committee, WG 21, as well as President of the ITAM Forum. Welcome, Ron. It’s certainly not your first time here, but it’s great to have you back.

Ron Brill:

Good to be here with you, Kris.

Kris Johnson:

Ron, you recently published an article in the German affiliate of the Harvard Business Review, which we can provide a link to, discussing the realities and necessity of real-time ITAM, or near-real-time data from an ITAM function. The necessity of that and its growing importance, along with an outlook on how we get there. I thought it would be good to dive a little deeper into some of the details. You and I have talked a bit about this, and I thought we could explore more on some of those areas. Could you give a recap of where you see real-time data becoming more and more important to ITAM executives?

Ron Brill:

Absolutely. Historically, IT asset management evolved in response to the need to help with contract renewals or to defend against software publisher audits, and so forth. It was initially meant to be a one-time activity.

Kris Johnson:

Yeah, it was really in response to compliance audits.

Ron Brill:

Compliance audits and contract negotiations, to some degree, as well. But yes, absolutely. ITAM was never meant to be an evergreen, near-real-time provider of data. That may have been sufficient initially, but nowadays, with the fast pace of technology changes, particularly in the cloud where assets are more ephemeral, their shelf life can be days, hours, or even minutes. That’s just not good enough anymore. There is a growing need to have more real-time information, which would serve all functions dependent on such ITAM data, from license management to cybersecurity. They all need more real-time information.

Kris Johnson:

Yeah. So where the original impetus was perhaps compliance assessments in response to an audit or the need to have data to make a decision around a renewal—very point-in-time, very project-based—now, the pace of decision-making has increased beyond that particular use case of a one-time event.

Decision-makers need information at their fingertips to know what direction to move, the impact of other changes within their environment as it relates to licensing, security implications, and so forth. Not to mention the fact that cash is king and more scarce these days with inflation and so forth. The need to reduce waste is ever-present. We go through macroeconomic cycles, and we’re in a cycle right now where everyone’s motivated to reduce waste and save money.

Ron Brill:

Yeah, absolutely. Another good example is FinOps, which was born in the new real-time world. It was meant to help with cloud financial management and reducing costs in the cloud.

There’s a lot of talk about ITAM and FinOps converging. Gartner predicted that by 2026, 50 percent of organizations will have converged the two functions. To achieve that, ITAM needs to catch up to FinOps in terms of providing real-time information. FinOps is not going to go down to the level of ITAM. That’s not going to happen. ITAM needs to catch up to FinOps to provide cost savings at the overall enterprise level, for both on-prem and cloud environments, covering the totality of the hybrid digital infrastructure.

Kris Johnson:

Yeah, if you think about it, cloud resources are being built by the minute, literally. Your cost position is changing by the minute. To manage ephemeral infrastructure and cloud costs effectively, you have to be real-time or near-real-time. The expectation landscape has changed. Why do we only get periodic reports related to our licensing costs or exposure in our non-cloud environments or our cloud environments in the hybrid case?

Ron Brill:

Absolutely.

Kris Johnson:

Yeah, it makes total sense. So how do we get software asset management and IT asset management in an on-premises world out of a point-in-time look-back methodology into a more real-time approach? How do we help ITAM as a discipline catch up to FinOps, as you say?

Ron Brill:

Yeah, it’s a great question.

I think the short answer is that this can only be achieved with technology, automation, artificial intelligence, and so forth. The days of using Excel as the primary tool and relying on a lot of manual processes to collect entitlement information, deployment information, translate that to license consumption information, and perform manual reconciliation—those days are over.

Using Excel is not scalable, and it cannot be translated to real-time information. So, there must be a greater reliance on tools, which are going to become much more important in this process. These tools will also need to become smarter than they are today. They should be able to ingest a lot more information from various sources, providing a complete picture of entitlements, deployments, and consumption. Many of these tools will need to incorporate generative AI capabilities.

Kris Johnson:

Yeah, I think about it in terms of a two-by-two matrix. You have the frequency of tasks and the sophistication of tasks. Up to this point, automation has focused on high-frequency, relatively low-sophistication tasks. With AI, you can now automate highly sophisticated tasks that are very complex. They still need high frequency for training data, but it changes the landscape of what is possible to automate and make repeatable.

One of the areas we’ve been discussing is intelligent document processing. This involves extracting invoice data and PO line items from electronic documents, PDFs, and different document stores, then putting it into an entitlement repository. Applying license rules to those transaction histories translates into what a client is actually entitled to use.

It’s ironic to me that AI applied to this problem is what will actually solve it, as opposed to the earlier promises of e-invoicing or blockchain, like we’ve discussed in the past. Rather than preventing the problem through e-invoicing or structured document transfer, applying machine learning to the extraction of static documents is what will ultimately win out.

Ron Brill:

Yeah, that’s a good point. Historically, the vision was to turn everything into machine-readable data, right? With invoicing and so forth, let’s make it machine-readable at the time it’s generated. That was supposed to solve the problem. Well, that didn’t happen.

But what we have now with AI is that everything is really machine-readable. You can take a picture of something, and AI can read the text from it, understand that text, and interpret clauses like order clauses, license clauses, or clauses that describe usage rights and so forth. We almost don’t need the source document to be machine-readable because AI can just figure it out.

Kris Johnson:

Especially with multimodal AI, combining vision with document processing and good data analysis.

In our research and development department, which I oversee, we’ve been testing various intelligent document processing solutions and options, some of which are commercially available today, and some are in alpha or beta stages. We’ve seen very mixed results.

We’ve even explored developing our own intelligent document processing and different machine learning models for that, with, again, very mixed results. You can categorize these solutions: some try to process all types of documents. Whatever your document type is, if it’s in a PDF, it’s static, and the promise is, we will read it with a machine, make sense of it, and give it to you in a format you can use and search. What we’ve found, not surprisingly, is that solutions hyper-focused on a very narrow use case are far better at addressing that use case than those trying to train a very generic model.

For example, Oracle order documents. We have access to solutions that can process Oracle order documents, not just extract fancy OCR table data. Yes, they can do that, but they also understand that, for example, a subscription end date referenced elsewhere, outside of that table, needs to be applied to the start dates here, or that your actual entitlement is your unit of measure multiplied by your quantity. It goes beyond just fancy OCR data extraction; it’s actual intelligent document processing. Very interesting.
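To make the idea concrete, here is a minimal sketch of the kind of post-extraction normalization Kris describes: multiplying quantity by unit of measure, and applying a subscription end date found elsewhere in the document to each line item. All field names here (quantity, unit_of_measure, document_end) are illustrative assumptions, not taken from any real Oracle document schema or vendor tool.

```python
# Hypothetical sketch: turning raw extracted order lines into entitlement
# records. Field names are illustrative, not a real document schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class OrderLine:
    product: str
    quantity: int
    unit_of_measure: int          # e.g. a pack covering 25 licenses
    start: date
    end: Optional[date] = None    # often stated outside the line-item table

def normalize(lines: list[OrderLine], document_end: Optional[date]) -> list[dict]:
    """Effective entitlement = quantity x unit of measure; a subscription
    end date stated elsewhere in the document is applied to lines that
    lack their own."""
    records = []
    for line in lines:
        records.append({
            "product": line.product,
            "entitled_quantity": line.quantity * line.unit_of_measure,
            "start": line.start,
            "end": line.end or document_end,
        })
    return records

# Example: 4 packs of 25 licenses, end date taken from the document body.
lines = [OrderLine("DB Enterprise", quantity=4, unit_of_measure=25,
                   start=date(2024, 1, 1))]
print(normalize(lines, document_end=date(2026, 12, 31)))
```

The point of the sketch is that the intelligence lives in relating fields across the document, not in the OCR step itself.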

We’re looking at solutions from the likely suspects in our industry, from tool platform providers, but also bespoke tools specifically targeting intelligent document processing, even outside of an ITAM use case.

What we’ve also found is that there are off-the-shelf solutions promising to automatically analyze your contract. Those are harder to get right, but again, those specifically targeting that use case can actually do a decent job. So, beware the promises of anyone peddling an AI solution, just as you would be skeptical of anyone selling any technological solution. Focus on what problem they are actually trying to solve and assess objectively how well they solve it.

If anyone’s interested in the results of that research, I don’t know if we can publish it in a very public forum like this, but I’m happy to have a conversation about it. It allows us, as a service provider and managed service provider, to have various options at our disposal, enabling us to gain greater efficiencies that we can pass on to our clients to do more at scale.

Ron Brill:

I think a really important part is not just the tool and the platform, but the training you apply, right? It’s the consultant who knows what they’re doing, helping train the model, providing feedback until it gets good enough. That kind of training cycle is absolutely key with whatever platform you choose. There isn’t anything out of the box that will provide a hundred percent solution.

Kris Johnson:

And from a solution development standpoint, the compute for that training is the upfront cost of developing a solution like that. If you’re out there and intelligent document processing is a problem you need to fix, again, reach out. We can maybe get you included as part of a beta program. If you’re willing to contribute your documents as part of a training exercise, there can be a lot of benefits from doing that.

Ron Brill:

Absolutely. It’s really exciting to see the impact of real-time ITAM. What do we do as consultants? What do ITAM programs do?

I think this will help us transform the profession from being focused on a lot of manual activities and bean-counting tasks to letting the machines do that, letting AI do that, letting technology do that. Let’s focus the consultants or the practitioners on higher value-add activities. What happens next? How to take those recommendations and insights and translate them into actual opportunities to reduce costs. How to get those recommendations implemented within the organization, working with stakeholders, and so forth. That’s where the value is to the organization, right?

And that’s the higher value-add activities that many ITAM programs don’t currently have time for because they’re focused on the manual process of generating ELPs and so forth.

Kris Johnson:

Yeah, exactly. Those higher-order tasks, as well as organizational change management. We talk about this all the time in FinOps, right? Creating a cloud cost-conscious culture within the organization is achieved through transparency of data, dashboarding, chargeback, and so forth. ITAM needs to follow suit.

If we can help people be software cost-conscious within the organization, that’s organizational change—changing hearts and minds, right? It’s not just bean-counting.

We have a client we were talking about yesterday. There’s a lot of disruption happening in many industries right now with macroeconomic pressures and inflation. This particular client was going through a round of layoffs, and a significant portion of their ITAM staff was being let go. It turns out it’s not as alarming to the director of that program because of the efficiencies we’re gaining using these techniques to automate previously manual processes and streamline operations to achieve economies of scale.

These efficiencies pay dividends. Sometimes organizational change has to happen, and people get let go. But sometimes that’s the right decision to make if you have too much labor cost. Hopefully, you can redirect those resources within the organization to higher-order tasks or different functions. The net result is that the ITAM function becomes significantly more cost-effective to run because it requires fewer people.

Ron Brill:

It takes fewer people, and those people are able to generate a much bigger impact.

Kris Johnson:

Because they’re not doing manual processes and swivel-chair tasks that just take a lot of time with low impact.

Ron Brill:

Right, absolutely.

Kris Johnson:

Yeah, the tools need to get better. We need to help them get better. Clients can help them get better by contributing data for model training. What else should ITAM executives have in mind as they try to move towards real-time SAM?

Ron Brill:

Yeah, I mean, it’s really a lot more focus on technology. It’s technology-driven insights, right? Whatever could not be automated could not be leveraged. So, it’s really the importance of tools in the marketplace, like Flexera, like ServiceNow, and others, which is going to increase significantly in this whole process. That’s a big impact.

It’s about aligning some of the practices of ITAM with FinOps, which is one of the things we’re trying to do in the next edition of the ISO standard as well, Dash one. Because FinOps has evolved to provide a solution for the cloud environments, right? To provide near real-time information in dashboards and…

Kris Johnson:

FinOps as a discipline, if you’re in the run phase, yeah, you’re there, right?

Ron Brill:

For sure. And I think real-time ITAM and the convergence of ITAM and FinOps really go hand in hand, as well as with what we’re seeing with AI and support, like we spoke about. All those things are leading in that direction. I think we’re also going to see a lot more integration with cybersecurity and some of those practices that require near real-time information.

Historically, they’ve had to resort to other means or use other tools because ITAM was not able to provide that information. Now, ITAM is moving in that direction and should be able to be that source of truth that’s not only more complete and accurate, but also more timely and relevant. That could be relied upon for those purposes as well. So, I think all these things are converging and will help increase the value that ITAM programs can continue to provide to their organizations and increase their relevance.

Kris Johnson:

Yeah, it reminds me of a mantra that you and I live by and remind each other of: adapt or die.

Ron Brill:

Yeah.

Kris Johnson:

And ITAM needs to adapt, or it’s going to become increasingly irrelevant if it languishes in this world of point-in-time, retroactive ELPs. The SAM programs that fail to adapt will fade into irrelevance.

Thank you, Ron, for helping the discipline of ITAM move towards that evolution in the ISO standard work. With the new version of Dash one, the prior version started as a SAM standard. You defined it as an ITAM standard. We’ve now incorporated in the new version of Dash one for 2024 the definition of a cloud resource as an IT asset, the alignment with the FinOps methodology, and the inform, optimize, and operate phases to help with that convergence.

But we need practitioners, ITAM executives in the field, to embrace that change and adapt. Thank you for your leadership there. We need everyone to do the hard work of adapting their organization to real-time SAM, build those bridges with your FinOps programs, and lead that convergence effort.

Ron Brill:

Yeah, absolutely.

Kris Johnson:

Well, thank you for the time, Ron. It’s always a pleasure to have you, and hopefully, this has been helpful for our audience. We look forward to seeing you again soon.

Ron Brill:

Absolutely. Thanks, Kris.

 

For more insights, check out these resources.

If you’re interested in learning more about Ron or Kris, connect with them on LinkedIn.

Listen in on our latest podcasts by checking out the ITAM Executive.

Dig into more insights from ITAM executives by subscribing on Apple Podcasts, Spotify, or wherever you listen to podcasts.