While the success or failure of a digital transformation project typically becomes apparent only at adoption stage, the warning signs appear much earlier in the process.
The question is, how do you pick up on them before it’s too late to course-correct?
At Fundipedia, we use the DATA framework to objectively assess a firm’s current state, identify potential weaknesses, and nip them in the bud before they can derail the project.
In this post, we’ll walk you through each of the four principles underpinning this framework, and explain how they can help you make sure you meet the objectives you’ve set out to achieve.
What’s the DATA framework?
The DATA framework is a set of four key performance indicators designed to tell you how likely it is that your digital transformation project will succeed and, if not, why not:
- D stands for Data
- A stands for Autonomy
- T is for Technology
- A is for Accountability
There’s a set of assessment questions under each KPI.
Depending on your answers, your organisation earns a score between one and five. A score of one means the project is probably doomed to fail, while five means there's a high chance of success.
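As a rough illustration, you can think of the scoring as averaging the answers under each KPI, then averaging the four KPI scores into an overall readiness score. The sketch below assumes equal weighting and made-up example answers; it isn't a formal specification of the framework:

```python
# Illustrative sketch of the DATA framework scoring.
# Assumption: every answer is a 1-5 rating and all questions weigh equally.

def kpi_score(answers):
    """Average a list of 1-5 answers for one KPI, rounded to one decimal."""
    return round(sum(answers) / len(answers), 1)

def data_framework_score(assessment):
    """Return per-KPI scores and an overall score across the four DATA KPIs."""
    scores = {kpi: kpi_score(answers) for kpi, answers in assessment.items()}
    overall = round(sum(scores.values()) / len(scores), 1)
    return scores, overall

# Hypothetical assessment: each KPI maps to that KPI's answer ratings.
assessment = {
    "Data": [2, 3, 2],
    "Autonomy": [4, 4],
    "Technology": [3, 5, 4],
    "Accountability": [5, 4],
}

scores, overall = data_framework_score(assessment)
print(scores, overall)
```

A low score on a single KPI (here, Data at 2.3) is exactly the kind of early warning sign the framework is designed to surface, even when the overall score looks healthy.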
In the following paragraphs, we’ll look at each KPI in more detail. But, first, a brief disclaimer.
We’ve tailored the DATA framework to digital data management projects, because these are the projects we typically do for our clients.
That said, the DATA framework can be useful in any digital transformation project that involves data. Which — let’s face it — is most, if not all of them, these days.
With that out of the way, let’s dive in.
Mapping out the landscape: D is for Data
The two most common data issues we encounter are silos and poor quality.
All too often, data is locked in a series of spreadsheets and other documents that are jealously guarded by gatekeepers.
Even when the data is stored in a central location, legacy systems, unclear policies, and practical issues like inconsistent file-naming can make it difficult to extract that data and migrate it to a new system.
On the quality front, when data collection and management are heavily manual, human error is almost inevitable. Research suggests that 88% of spreadsheets, for instance, contain at least one error.
Inaccuracies can also become an issue when the original source is a third party and your organisation doesn't have a verification process in place.
It sounds obvious, but to deliver the results you expect, software has to be configured correctly. In a data management context, that entails knowing exactly how much data you have, where it is, and what shape it’s in.
With this in mind, it’s worth asking yourself three questions:
- Do you have access to all the data you need for the project?
- Is the data complete?
- Can you trust its accuracy?
If your answer to any of these questions is ‘No’, you’ll need to do some prep work before the project can start. Depending on the state of your data, this could include:
- Coming up with naming, categorisation, and filing conventions so data is quicker and easier to track down
- Putting validation rules in place so you can flag problems straight away, including who is responsible for verifying accuracy and how often data should be checked
- Identifying and addressing gaps
- Building in additional time for data extraction, particularly if your current process is heavily manual or involves legacy systems that don’t play well with modern technologies
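To make the first two prep steps more concrete, here's a minimal sketch of what enforcing a naming convention and a set of validation rules might look like. The file-naming pattern, field names, and rules below are illustrative assumptions, not a prescription:

```python
import re

# Assumed file-naming convention for illustration:
# fund code, report type, YYYY-MM period, e.g. "ABC123_NAV_2024-06.xlsx".
NAMING_PATTERN = re.compile(r"^[A-Z0-9]+_[A-Z]+_\d{4}-\d{2}\.xlsx$")

# Hypothetical required fields for a fund data record.
REQUIRED_FIELDS = ["fund_code", "nav", "as_of_date"]

def validate_filename(name):
    """True if the file name follows the assumed convention."""
    return bool(NAMING_PATTERN.match(name))

def validate_record(record):
    """Return a list of problems found in one data record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    nav = record.get("nav")
    if isinstance(nav, (int, float)) and nav < 0:
        problems.append("nav is negative")
    return problems

# Example batch: one clean record, one with gaps and a suspect value.
records = [
    {"fund_code": "ABC123", "nav": 101.5, "as_of_date": "2024-06-30"},
    {"fund_code": "", "nav": -4.0, "as_of_date": "2024-06-30"},
]

for r in records:
    print(r["fund_code"] or "<unknown>", validate_record(r))
print(validate_filename("ABC123_NAV_2024-06.xlsx"))
```

Running checks like these over your existing files before migration is a cheap way to surface the gaps and inaccuracies mentioned above, rather than discovering them mid-project.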
Setting standards: A is for Autonomy
How do you handle data?
Do different teams create their own documents, reports, and rules around management and storage? Or do you have a centralised team and strict top-down data management policy?
In our experience, both can create challenges when putting a modern digital system in place.
Too much autonomy is the leading cause of silos and data quality issues.
Teams can start caring only about their data or become overly protective of it.
Equally, because data management is one more item on a jam-packed to-do list, they can struggle to keep it up to date, or mistakes and inaccuracies can fall through the cracks.
By contrast, the biggest issue with too much centralisation is that the data team can become a bottleneck.
Here again, this can create data quality issues, as well as gaps. Plus, the lack of flexibility can hamstring individual teams and prevent them from using data in ways that enable them to work more efficiently and effectively.
On this last point, over the past few years there's been a move away from storing data in one large, centralised warehouse and towards a more decentralised approach called a data mesh.
This brings together the best of both worlds. Anyone with access can find the data they need from one convenient location. At the same time, the data is owned by the team that’s most reliant on it, and organised in the way that makes it easiest for them to do their jobs.
Finding the right partner: T is for Technology
In any digital transformation project, your choice of technology partner is make or break. So, a critical question to ask yourself before you embark on a project is whether the platform you’ll be working with is fit for purpose.
Many firms opt for one of the large, incumbent firms in the belief that, because they've been the asset management industry's go-to partners for close to 40 years, they must be a safe pair of hands.
Leaving aside the obvious — they’re eye-wateringly expensive — the way these firms are set up means they can be painfully slow, bureaucratic, and inflexible. Their experience also tends to be broad, rather than deep, so they may lack the technical know-how required to address specialised problems.
Of course, not all small, specialised fintechs are what they seem, either. That's why having a solid vendor selection process in place is crucial.
For best results, start with your goals.
What are you hoping to achieve? Technology is a tool, not an end in itself. Having clear objectives from the outset will help you ensure you pick a platform that meets your needs, instead of adapting your needs to fit the platform.
More to the point, make sure you involve end-users early on.
Ultimately, the acid test for success is whether end-users actually use the new platform. Involving them in the vendor selection process and listening to what they have to say increases the odds of this happening.
Getting the project over the line: A is for Accountability
You’ve worked through your data issues, set goals, and put your vendor through a thorough vetting process. But who will be in charge of seeing the project through?
Most reputable software vendors will have a dedicated project manager whose job it is to make sure the project stays on track. But it's just as important to have someone from your own firm take ownership of the project, ideally a senior manager, for two reasons.
First, it means there’s somebody who can liaise with the technology vendor, senior management, and end-users, ensure the project gets the attention and resources it needs, and make key decisions.
This will keep the project moving smoothly and reduce the risk of things getting stuck while different teams argue over who is responsible for a particular area. Or, worse, over what the project goals are.
Second, and more important, the project owner’s enthusiasm can help keep staff motivated when the going gets tough. At the end of the day, leaders set the tone. If the person in charge is keen on the project and genuinely believes it’s for the better, their attitude will spread.
You need to build on solid foundations if you want digital transformation to stick
An often-repeated statistic states that 80% of digital transformation projects either fail or fall short. But while this can make the process seem daunting or even unworthwhile, the reality is that making technology stick isn’t as hard as you’d think.
The trick is to go into it with both eyes wide open.
You wouldn't paint over a surface without priming it properly first. Or go camping in the wilderness without the right equipment (well, unless you're Bear Grylls).
Using the DATA framework helps you make sure you've properly assessed your current state, tackled problems that could cause issues later on, and put structures in place to get the project over the line.
Have more questions about our DATA framework, or wondering what it would be like to work with us?