When it comes to implementing transaction automation, managing the trade-off between the speed of execution and the granularity of data is a challenge…
There are many factors that require careful consideration to bring about effective cognitive solutions.
It’s akin to conducting a group of musicians – it might be possible (easy even!) to attain a pleasant sound from a solo instrument…
But, if expertly managed, you could accomplish a symphony from the entire orchestra!
This week, our podcast series will guide you through the five steps required to conduct a dazzling cognitive symphony.
On Day 3 of Conducting a Cognitive Symphony, Anna Madarasz, Analytics & Cognitive Lead, IBM Global Procurement, discusses the importance of appropriately applied transaction automation, how to strike a balance between speed of execution and granularity of data, and how to avoid common landmines.
The importance of transaction automation
Marco Romano, Procurement Chief Analytics Officer, Global Procurement, Transformation Technology, IBM, addresses this in his white paper: “Transaction automation is a business necessity.
“We all want to spend less time doing repetitive lower-value work and use our skills to provide higher-value services to the business. However, as with many good things, badly applied transaction automation results in poor data and, ultimately, lost productivity and analytics effectiveness down the road.”
Transaction automation landmines
Procurement organisations are usually well intentioned when it comes to implementing transaction automation, but that’s not to say the process is without its challenges. We asked Anna to describe some of the landmines she’s seen procurement professionals hit.
Catalogs and other automation processes that allow the editing of item descriptions and prices can make life easier for clients and buyers alike.
As companies see the positive effects, they are likely to route a higher percentage of their transactions and spend through catalogs.
The risk with this, as Anna points out, is setting yourself unrealistic targets: “There is always a logical threshold over which it is a risk to apply automation. Of course, you will not implement a catalog line if you only have two purchase orders of the same nature in a year.
“With wrongly defined targets, a catalog isn’t going to see much action, and then, of course, you spend more time creating and maintaining your catalogs than creating your purchase orders.”
Anna also advises avoiding catalog lines that allow bulk purchases.
“Many times it is really not easy to identify the purchase in a fixed line. Let’s say you are buying server configurations [or] storage configurations. Those are made up of multiple parts, so you have hardware, software and services elements in it.
“A configuration can be made up of 50, 100 lines. If you allow your clients and your buyers to raise purchase orders simply as a one line item, this server [could cost] one million US dollars!”
“Of course, it’s a really sensitive balance because you also want to avoid the workload of raising incredibly granular purchase orders, so it is really your call at what level you would like to analyse [a given category.]”
“If this is a category which is your main area of focus, then try to go granular, try to get the data. If it’s not, then it’s your call if you are allowing these bulk purchases.”
The trade-off
“There is always going to be a trade-off between speed of execution and granularity of data,” says Marco.
“Finding the right balance again takes us back to developing an understanding of what data we need to achieve our desired cognitive and analytics state. There is no doubt that teaming with the right technology and innovation provider, and selecting the right tools, is critical to that balance.”
Striving to conduct a cognitive symphony but in need of some expert guidance? Our podcast series runs throughout this week and will have you orchestrating cognitive success in no time! Register here.