tokenisation done right
Preparing the path
“If one learning point stands out above all others in the tokenisation journey, it is that preparation is the key. We have seen so many examples of firms rushing into a technology-build without having taken the time to define the need, the ecosystem and the target operating model fully – but the due diligence in this first phase is the difference between success and failure in our tokenisation projects.”
(Kelly Mathieson, Chief Client Experience Officer – Digital Asset)
1. Stakeholder education and shifting mindsets
“This is something you have to be spending time on today”
The tokenisation journey begins well before any technology solutions are even in the frame. It begins with the ability of a whole firm to see a new model beyond today’s (legacy) infrastructure and processes.
With new applications and usage cases of tokenisation emerging every day, there is an abundance of creativity and innovation being realised across our industry – and firms that have succeeded in their tokenisation journeys have taken the time to educate themselves on the technology and the innovations it is enabling.
More than just an innovation sprint, this needs to be a broad-based, continuing effort that touches all levels of the organisation and that questions the status quo. Board members need to be continually briefed on what is possible and on how they need to be thinking in order to drive meaningful change. Risk, Compliance, Legal and Operations – who manage the daily business – equally need to be briefed and thinking about how blockchains can transform their lives tomorrow. The central question is not how tokenisation can help fix one problem – it is how a digital asset roadmap can transform your business.
This is no overnight journey – nor can it be done passively. Today’s successful firms have been experimenting, sharing and thinking through tokenisation for up to ten years – often thanks to a small number of empowered, visionary senior managers who have forged a consensus that studying digitisation today is unavoidable.
“We have to change our mindset around what we want to achieve with digital assets. The tendency today is to focus on building the platform that will lead the world. But tokenisation doesn’t mean digitisation – it means connecting to a bigger ecosystem. It’s about focusing on core competencies and acknowledging that I’m not the captain of every piece of the process any more. Instead, I am becoming the master of my application within a much bigger ecosystem”
(Eric Saraniecki, Co-Founder and Head of Strategic Initiatives – Digital Asset)
2. Have I defined the problem? Really?
“The majority of DLT projects today are still strongly driven by the ‘coolness’ factor – but those are the ones that don’t last past the pilot.”
Given that DLT is still a new technology, the general preference of many firms has been to learn about the technology through doing. We apply DLT, we learn a little and we try a bit more.
But learning is very different from deployment and, as we move into an era of commercialisation and live, scalable DLT deployments, it is critical that we be able to describe the quantifiable problem that tokenisation is going to solve – and all of the dependencies that drive the problem.
“X users struggle with problem Y – because of A, B and C factors. All told, this is costing us Z on a yearly basis”.
Two factors make this extremely challenging today. First is a lack of outcome-based KPIs from comparable tokenisation projects (i.e. knowing that a peer’s blockchain has reduced reconciliation costs by 20%) – giving us only anecdotal evidence to build on.
Second is the fact that our view of end-to-end processes across our firms is heavily impaired. With half of us missing over 20% of the cost of a trade, it is extremely hard for organisations to fully detail the impact and nature of a particular problem – as we invariably can all only see part of the problem.
Yet without this clarity on the exact problem (and its drivers), tokenisation projects can often become discovery projects. Significant resources can end up focused on exploring infrastructure, underlying processes and unknown dependencies – all in parallel with an actual build. Changing problem statements then lead to constantly changing requirements…
As parents we are comfortable answering “why” five or six times to our children in succession. We need to apply the same discipline to our own infrastructures.
3. Is DLT the right solution? Why?
"There’s been a lot of focus on fractionalising funds – but we’ve already had the solution for years in feeder funds. Does the technology add more value here?"
It is no accident that the most advanced examples of tokenisation and digital deployments today include a blend of different technologies – including smart contracts, APIs and blockchains.
In a world of regulatory complexity, legacy technology and intense cost pressures, it is critical that the use of blockchains and tokens be a uniquely compelling solution to a particular problem.
Given the high costs of transition to a blockchain, market centralisation (not decentralisation) has sometimes proven to be a better solution to a market problem than DLT. In many cases, the use of smart contracts alone has helped to deliver significant efficiencies without a blockchain.
It’s all about relative value and, before entering into a project, users need to be able to define exactly which unique DLT characteristics will help them resolve the problem statement – and how.
4. Business case and management approvals: what’s the main event?
“What is the main event? What is the usage case that will trigger client migrations?”
"This cannot be a like-for-like"
Few business cases for DLT today are entirely empirical. There is simply not enough information available on the results of tokenisation to make a clear and complete estimate of the end-to-end value of a project.
Yet competition for investment spend in firms today is increasingly intense. After mandatory changes and regulatory updates, firms struggle to allocate remaining investment budgets across client-sponsored developments and strategic developments such as multi-year tokenisation projects.
Like all projects today, tokenisation projects need to deliver results quickly and often – but firms need to pursue incremental savings from tokenisation whilst avoiding the trap of focusing only on incremental change.
Every firm also needs to define the point in the future when these projects will reach their full potential. They need to define not only the journey but also the ‘main event’.
“How to prioritise? When you look at it all through a profitability lens, it all becomes very clear very quickly”
This generally means a 3-part business case:
- Short term: Quantitative estimations of tactical, operational savings that can be achieved in today’s environment (and regulatory frameworks) – mostly through tokenising existing securities.
- Medium term: A mixture of quantitative and qualitative estimations of how tokenisation can deliver both savings and revenue benefits. These usually focus on removing external (inefficient) processes or redefining operating models and relationships – and are often subject to one or two dependencies (such as regulatory change). This is often when native digital issuance becomes part of the plan.
- Longer term: Entirely qualitative estimations of the potential for market restructuring, disintermediation and entirely new levels of innovation (e.g. DeFi, etc.)
Investment approvals for tokenisation are inevitably based on support for a journey that may include many steps – but defining the root driver for tokenisation is critical.
5. Pilot or MVP? Preparing for scale
“People are tired of proof-of-concepts. It’s hard to convince people internally to do another pilot”
One of the major shifts highlighted in our 2022 DLT in the Real World survey is a move away from experimentation towards commercialisation as the key driver for tokenisation projects.
With much still to be learned about DLT, we face a challenge in making sure that we can learn from our tokenisation work, whilst ensuring that the same work scales to deliver real, P&L benefits.
“Our MVP criteria changes every quarter”
Herein lies the key difference between pilots (or proof-of-concepts) and minimum-viable-product builds (MVPs). There appears to be little management time left for the former – meaning that successful tokenisation projects today need to evidence preparedness for a journey from small to scale.
In design terms, this means addressing issues such as bridging the digital–legacy technology divide and managing regulatory dependencies from the outset.