
From Strategy to Implementation: Leveraging Business Capabilities for Effective Transformation

Introduction

In today’s rapidly evolving business landscape, many organizations struggle to bridge the gap between strategy and implementation. A useful framework to drive this transformation is the concept of business capabilities*. By defining what an organization must be able to do to execute its strategy, business capabilities provide a concrete, actionable bridge from strategic goals to implementation.

Capabilities as a Vehicle for Change

Business capabilities are in fact the essential building blocks of the organization’s business value streams. They define what the organization needs – in terms of processes, people/skills and technology – in order to deliver value and achieve its strategic objectives. Using capabilities as a framework for change helps focus the organization on what truly matters — aligning resources, technology, and processes toward achieving its strategic goals.

Business capabilities, like value streams, are stable, but how they are delivered evolves over time. This makes capabilities a powerful tool to manage transformation, as they provide a consistent lens through which both current and future needs can be viewed.

Capabilities offer a clear focus for change, as they cut across traditional silos of people, processes, and technology, aligning these elements toward achieving strategic outcomes.

Step 1: Conducting an As-Is Capability Analysis

The first step in any change or transformation is to understand the organization’s current capabilities, or the “as-is” state. Organizations often lack a detailed understanding of their current capabilities, which makes this analysis crucial.

The as-is analysis focuses on identifying the organization’s current capabilities in relation to the goals of the transformation. Rather than being an exhaustive assessment, it should prioritize areas critical to addressing the challenges or opportunities driving the transformation. This means concentrating on which capabilities are strong, which need improvement, and which may be missing—always with the overarching transformation objectives in mind.

This focused approach streamlines the analysis, avoiding unnecessary complexity, and provides a clear, actionable baseline that lays the groundwork for meaningful and goal-oriented transformation efforts.

Step 2: Developing a Target Capability Model

Once the current state has been assessed, the next step is to develop the “to-be” or target capability model. This model serves as a blueprint of the capabilities and value streams needed to achieve the organization’s strategic objectives in the future.

Each capability should be designed to meet future demands by refining its core components: aligning people (skills and roles), enhancing processes (workflows and practices), and upgrading tools and systems (technology and infrastructure), where applicable.

The target capability model is a powerful communication tool, clearly articulating the future vision and ensuring alignment among leadership, stakeholders, and teams. By detailing how capabilities will evolve, it provides a shared understanding of the transformation’s direction and steps required to achieve it.

Step 3: Identifying and Analyzing Gaps

The gap between the as-is and to-be capabilities becomes the basis for the transformation. These gaps highlight where the organization needs to improve, invest, or re-engineer processes, technologies, and structures.

The clear definition of these gaps enables the organization to break down the transformation into manageable work packages, ensuring that each effort is focused, actionable, and aligned with the broader strategic goals. Different gaps will require different levels of effort to close, and they won’t all be equally critical. Therefore, it’s essential to prioritize them based on the potential value they deliver and the cost of execution.

Step 4: Prioritizing Transformation Activities

The next step is to prioritize the gaps and the activities needed to close them. This prioritization should be based on two factors:

  1. Value Delivery: What is the expected impact on the organization’s ability to execute its strategy once the gap is closed?
  2. Cost of Execution: How complex or resource-intensive is it to close the gap?

By focusing on high-value/lower-cost activities first, organizations can start realizing value early in the transformation process while managing risk.

Step 5: Creating a Transformation Roadmap

With the gaps identified and prioritized, the next step is to create a transformation roadmap. This roadmap serves as a detailed guide, breaking down the work into actionable packages that address specific gaps. By structuring these work packages, the roadmap provides a clear sequence of activities required to enhance and evolve business capabilities to meet the transformation goals.

Each work package in the roadmap should include defined timelines, required resources, and key dependencies to ensure smooth execution. This granular approach not only makes the transformation more manageable but also helps stakeholders stay focused on achieving measurable outcomes while staying aligned with the organization’s strategic priorities.

The transformation roadmap should be structured in a way that maximizes value delivery early and reduces risks. This approach builds momentum, generates quick results, and lays a strong foundation for sustained progress throughout the transformation.

Step 6: Building a Business Case and Value Delivery Plan

The value of closing capability gaps should be quantified in a business case to justify the organization’s investment in the transformation. This business case should outline the anticipated returns, such as time or cost savings, improved efficiency, enhanced customer experiences, or faster time-to-market.

In parallel, a value delivery plan ensures that the transformation remains focused on delivering measurable outcomes, ensuring that each closed gap contributes to the overall strategic goals.

Conclusion

Transforming an organization’s capabilities is essential for successfully executing its strategy and achieving sustainable competitive advantage. By using the concept of business capabilities to guide this transformation, organizations can bridge the gap between strategy and execution, ensuring a continuous focus on strengthening the business value streams.

The step-by-step approach outlined — from conducting an as-is analysis to developing a target capability model, identifying gaps, prioritizing activities, creating a transformation roadmap, and building a business case — provides a clear and actionable path to transformation. Each step builds on the previous one, ensuring alignment with strategic goals while breaking down complex changes into manageable, prioritized work packages.

This structured process not only drives measurable value but also empowers organizations to adapt, innovate, and thrive in an ever-changing business environment, turning strategic vision into operational reality.

* A business capability is a high-level representation of what an organization is able to do to achieve its strategic goals and fulfill its mission. It defines the organization’s capacity to deliver specific value — whether internal or external — independent of how it is achieved. Capabilities are relatively stable over time, even as processes and technology evolve, and they are modular, often spanning organizational structures and crossing functional boundaries.


Authors

Maria Nelsson, Mario Gonzalez Muñoz & Mikael Palm

Maria Nelsson is a seasoned business leader with over 25 years of experience in strategic IT and governance, sourcing, and leading IT/business transformations.
Mario Gonzalez Muñoz is a Senior Consultant at Opticos with strong Strategic IT consulting experience.
Mikael Palm is a seasoned business leader and Certified Enterprise Architect with a focus on Strategy, Digital Transformation and Innovation.


Enhancing Generative AI with Your Own Data

Generative AI is great, but it lacks precision and can be very wrong

Since ChatGPT was introduced in November 2022, the subject of AI and its possibilities has gained significant attention in business and technology sectors. Our previously published articles and insights have explored both the potential business value of various AI types and the challenges associated with generative AI.

Generative AI, particularly Large Language Models (LLMs), can assist organizations with daily tasks and enhance product offerings by producing human-like text and answering general knowledge questions. Having been trained on large sets of data written by humans, these models perform rather well at these tasks. However, challenges arise when LLMs are expected to handle queries beyond publicly available information or to cover events after the data cut-off date. In addition, the assertive tone of LLM-generated text can be convincing for users even when the generated text is false; this is usually called hallucination.

In this article, we will explore the concept of Retrieval-Augmented Generation (RAG) as a method to address these challenges with LLMs. We start with an overview of different methods for improving LLM output before looking at what RAG is and how it works. We conclude by looking at the benefits RAG can bring to an organization.

Which methods can be used to get more precise and reliable results?

Figure 1. Popular AI methods to improve accuracy of Generative AI and large language models

 

To improve the output of the LLM and address these challenges, several approaches can be employed. It’s worth bearing in mind that no single technique is universally superior; the best approach depends on the specific context.

A commonly suggested way of increasing the accuracy of an LLM is prompt engineering. In some cases, the interaction and phrasing of the questions may result in better output; however, the output still relies on the data the model was trained on. Prompt engineering is a key capability within generative AI, but it does not solve the issues related to non-public data or data created after the cut-off date.

Both custom models and fine-tuned models will perform better for queries requiring specific data. There are, however, two major drawbacks: they are costly, and the time gap between training and user prompting remains.

Neither custom models, fine-tuned models, nor unmodified LLMs can provide text based on information produced after the latest update of the underlying model. The cut-off-date limitation remains, but if the accuracy of the output does not depend on recent information, or the information changes only slowly, these models may still provide value for the user.

Between prompt engineering and custom or fine-tuned models sits an approach that does not rely on training a model but still provides output based on data for a specific use case. This approach is called retrieval-augmented generation (RAG). It combines different types of AI: one part uses AI technologies to retrieve the parts of your data that are relevant to the prompt (the retriever), and the second part is generative AI (the generator).

 

What is RAG?

When feeding the LLM with data through the prompt, a basic approach is to paste a section of text into a standard prompt together with some prompt engineering. This text provides the context on which the user wants the AI-generated answer to be based.

RAG takes this basic approach a step further. In a RAG setup, the prompt is prepared with context that has been retrieved automatically from your own data source. The LLM then answers the user question based on this prompt.
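As a minimal sketch of this idea, the snippet below assembles a prompt from retrieved context sections before it is sent to an LLM. The prompt wording and the example question are illustrative assumptions, not a specific product’s API or template.

```python
def build_rag_prompt(question: str, context_sections: list[str]) -> str:
    """Combine retrieved text sections and the user question into a single prompt."""
    context = "\n\n".join(context_sections)
    return (
        "Answer the question using only the context below. "
        "If the answer is not in the context, say that you do not know.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# Example usage with a section retrieved from your own documents;
# the resulting prompt would then be sent to the LLM of your choice.
prompt = build_rag_prompt(
    "How do I reset the device?",
    ["Section 4.2: Hold the power button for ten seconds to reset the device."],
)
```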

The idea of the retriever is to fetch the section(s) of relevant text from your own source data. To achieve this, the text is converted into a numerical format (an embedding) that captures its meaning. This format allows different text sections to be compared based on their meaning. Both the reference text from the source documents and the user question undergo this conversion. The comparison then selects the section(s) of text from the source documents that are closest in meaning to the user question.
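A minimal, self-contained sketch of such a retriever is shown below. For simplicity it uses a toy bag-of-words vector and cosine similarity; a real RAG application would instead use a trained embedding model that captures meaning, so treat the vectorization here purely as an illustrative stand-in.

```python
import numpy as np

def embed(text: str, vocab: list[str]) -> np.ndarray:
    """Toy bag-of-words vector over a fixed vocabulary; a real retriever would use
    a trained embedding model that captures meaning rather than word overlap."""
    words = text.lower().split()
    return np.array([words.count(term) for term in vocab], dtype=float)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(np.dot(a, b) / denom) if denom else 0.0

def retrieve(question: str, sections: list[str], top_k: int = 1) -> list[str]:
    """Return the source section(s) closest to the question in the vector space."""
    vocab = sorted({w for text in sections + [question] for w in text.lower().split()})
    q_vec = embed(question, vocab)
    ranked = sorted(sections, key=lambda s: cosine_similarity(q_vec, embed(s, vocab)), reverse=True)
    return ranked[:top_k]

sections = [
    "Hold the power button for ten seconds to reset the device.",
    "The warranty covers manufacturing defects for two years.",
]
print(retrieve("How do I reset the device?", sections))  # selects the reset instruction
```

The retrieved section(s) would then be passed to the prompt-assembly step shown earlier.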

Figure 2. Basic RAG architecture

An example where we have seen RAG being used is chatbots for product support. Product user guides and manuals are commonly kept in a knowledge base. When a user needs support through a RAG app, the user query is used to search for relevant information in the user manuals or product documentation. The information is retrieved first and then provided in the prompt, so that the LLM can give the user advice on how to use the product.

The architecture above and the example described represent a basic form of RAG. There are various methods of creating RAG applications for different purposes, and these methods vary in complexity. Generally, more complex RAG methods give higher accuracy at the cost of greater effort to create and manage the application, while simpler approaches are less accurate but easier to create and manage.

 

What does this mean in the world of utilizing generative AI?

More trusted AI applications

RAG makes it possible to get context-specific output from LLMs based on our non-public data and information, by providing the LLM with this data as context in the prompt. It lowers the risk of hallucinations, and we should be able to trust the output from the AI model to a higher degree than we normally can. If we manage the knowledge base ourselves, we should also know where the context for each query comes from.

Accuracy and precision from RAG

In the bigger picture, RAG offers a way to get more precise output from generative AI using our own data. It combines the human-like text generation of generative AI with other AI technologies that search for and retrieve relevant information. With this approach, RAG applications leverage different forms of AI specialized for different tasks, such as writing human-like text or finding relevant text. This enables higher accuracy in the output than an LLM alone can produce.

Adaptability and flexibility from RAG

RAG offers a relatively flexible knowledge base: the model does not need retraining or fine-tuning to adapt when the context changes. It is still less flexible than plain prompt engineering, because RAG requires a knowledge base and will generate answers related to the information it contains. With proper management of the knowledge base, we should be able to maintain and update the databases that are used as knowledge sources. This requires effort, but if we want generative AI to provide recent information and insights on enterprise data, a RAG app with a well-maintained knowledge base is a way to get this up-to-date information.

 

Strengthen your data science and data management capabilities

RAG has existed as a methodology since the early days of generative AI. It offers a way to increase the accuracy and timeliness of information by leveraging the strengths of various AI models currently on the market. Additionally, it allows us to use enterprise or context-specific documents to provide useful information to the generative AI model. The cost is that, in many cases, data scientists and data engineers are needed to set up and maintain the RAG app, including the knowledge base and retriever. While this overview might give the impression that building a RAG application is easy, several considerations need to be carefully addressed to create a good one. Once RAG is explored more deeply, there are several ways to build the app and optimize performance for the specific use case. An organization starting with RAG needs to understand its use cases and data needs, have proper data management practices in place, and establish and maintain AI and data science capabilities.

At Opticos we enable organizations to leverage the business benefits of new emerging technologies. Drawing from our extensive client experience and methodology in business strategy, change management, data management and AI governance, we’re here to support you in your AI journey from strategy to implementation. Through our strategic partnership with Algorithma, a company offering data science and hosting services for AI, we provide end-to-end AI capabilities to our clients.

Write to us to discuss your organizational AI goals.


Authors

Eric Wallhoff and Tatiana Schön

Eric Wallhoff is a Senior Consultant with experience in Data Management and Data Analytics. He is also part of the Strategic IT capability.
Tatiana Schön is a Management consultant with a focus on Strategy, Business analysis, Data & Analytics and IT management.



Preparing for ESPR Compliance
The DPP Chicken Race: Key CIO Challenges and Strategic Actions for Digital Product Passports

Introduction:

In July 2024 the European Parliament made a historic decision to approve years of work that went into the legislation named “Ecodesign for Sustainable Products Regulation” (ESPR). This will have large-scale effects on the requirements placed on companies selling or producing products intended for the European market. One requirement that is part of the ESPR is that products should have a so-called Digital Product Passport (DPP). In this viewpoint, we at Opticos describe the ESPR, the DPP, and the potential challenges they may pose to a manufacturing company’s CIO.

In short, these CIOs will enter a waiting game in which they can act as forerunners, risking getting it wrong, or as followers, risking being late. So, CIOs will find themselves in a DPP Chicken Race.

 

Background on ESPR

The ESPR legislation is a cornerstone in the European Commission’s strategy to establish environmentally sustainable and circular products as the new norm. This will be achieved by requiring companies to provide sustainability and circularity data on their products using a Digital Product Passport (DPP), leading to voluntary data sharing and expansion to other sustainability aspects under Union legislation.

Uncertainties remain about which product groups will initially be prioritized. Some groups, identified by the EU as suitable first movers, are expected to be impacted as early as 2027. Most industries and product categories, however, are expected to be subject to the regulation by 2030. The legislation applies equally to products intended for consumer (B2C) and professional (B2B) markets.

The first delegated act specifying the product groups is expected to come into force soon. These groups will likely include those highlighted in the Circular Economy Action Plan (CEAP), such as electronics and ICT, batteries and vehicles, textiles, plastics, construction, and buildings. Although packaging is mentioned in CEAP, it is not expected to be covered by a separate delegated act.[1]

Furthermore, even though the ESPR is European legislation, it is expected to have a global impact. Given the inherently global nature of supply chains, it is likely that other regions will follow. This is partly due to political ambitions to fight climate change, but also to keeping domestic manufacturing companies competitive: because European legislation applies to anyone who wants to sell to or supply the European market, those companies will need to follow the ESPR regardless of where they are based. By introducing similar legislation, other regions put pressure on their own manufacturers to level up and remain able to export to regions where such legislation is in place. Of the followers, the expected upcoming US legislation will most likely have the biggest impact.

As noted, the ESPR regulation is currently in development, implying several uncertainties such as:

  • Scope: How should requirements vary based on company size, and what is the appropriate level for applying Digital Product Passports?
  • Technology: Questions arise about the storage and management of data, the choice of carriers, and the methods for accessing data
  • Data: What specific information will be required, and to what extent? Additionally, considerations include identifying who will be responsible for collecting, updating, and verifying the data

These aspects are still evolving and will be clarified as the ESPR regulation progresses. This makes it difficult to balance your efforts accurately today. As a CIO, it is essential to be prepared to act and to understand the potential challenges ahead.

 

Digital Product Passport Overview:

A Digital Product Passport (DPP) serves as a tool to gather and share information about a product throughout its entire lifecycle. It identifies the product and showcases basic information as well as detailed data on its sustainability, environmental impact, and recyclability. The DPP records data from the entire supply chain, detailing where the raw materials come from and how the product was made. This information is then shared among different stakeholders, unlocking product insights and highlighting advantages and practical applications.

The data can be shared using different carriers such as QR codes, barcodes, or NFC technology. The most likely option initially is QR codes, given their widespread adoption across diverse industries, their ease of use, their robustness, and their flexibility in connecting with smart devices. QR codes do, however, carry security risks due to their simple nature: a fraudulent code pointing to a different web link can be pasted on top of the original one, sending users to fraudulent sites. It is therefore likely that more robust technologies will be developed over time.
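As a simple illustration of the QR-code carrier, the sketch below encodes a hypothetical DPP link into a QR image using the open-source Python qrcode library; the URL format is an assumption made for illustration only, not a format prescribed by the ESPR.

```python
import qrcode  # open-source library: pip install "qrcode[pil]"

# Hypothetical resolver URL combining a product's article number and serial number.
dpp_url = "https://dpp.example.com/products/ART-12345/serial/SN-000123"

# Generate and save a QR code image that a smart device can scan
# to reach the product's Digital Product Passport.
img = qrcode.make(dpp_url)
img.save("dpp_qr.png")
```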

All information requirements for a digital product passport should be relevant and serve a purpose. Each piece of information should have a clear scope and provide tangible user benefits throughout the product life cycle. At this stage, there are uncertainties regarding the information required for different industries and product groups. However, the European Commission will further elaborate on this in delegated acts [2], aiming to enhance the following product aspects:

  • Product durability and reliability
  • Product reusability
  • Product upgradability, reparability, maintenance and refurbishment
  • The presence of substances of concern in products
  • Product energy and resource efficiency
  • Recycled content in products
  • Product remanufacturing and recycling
  • The product’s carbon and environmental footprint
  • The product’s expected generation of waste material

On top of this, producers might use the digital product passport for linking to relevant documentation on the products such as user manuals, support pages, software download pages, etc.

 

Challenges for the CIO

In most companies, the implications of this new legislation will not be owned by the CIO or the IT organization. Ownership will likely land in R&D, Product Management, Operations, or the Legal or Sustainability departments. However, much as with the GDPR legislation, which was often owned by HR while many of the required actions landed on the IT organization to resolve, it is logical to assume the same will happen with the ESPR/DPP legislation.

Business requirements on R&D, Operations and Legal will be broken down into IT requirements and put to the IT department as change or transformation requests.

Like other major legislative agendas in the past (e.g., GDPR), the ESPR will significantly impact the IT landscape of large companies. It will affect applications that handle product data. In organizations focused on developing, manufacturing, and selling products, experience at Opticos indicates that this typically encompasses 30-40% of the total IT landscape [3]. In large manufacturing companies, this can equate to hundreds of applications. While not all applications will be affected, many may need to be adjusted to contribute data to the Digital Product Passport (DPP).

What kind of changes might be needed in those systems?

  • Product Data Architecture: To keep and publish product data to end consumers as regulated in the ESPR and DPP, it is essential to be able to consolidate product data in one place and publish it. When multiple applications handle data to be used in such a publication platform, establishing a consistent metadata structure and standardizing attributes, both in terms of descriptions and data content, becomes essential. This can pose a significant challenge for large companies, particularly those with competing silos of product information, often a result of historical factors or mergers
  • Product Data Identifiers: To ensure accurate data retrieval, each product requires unique identifiers, such as model numbers, article numbers, and serial numbers. The combination of these identifiers should be unique within the company and, ideally, globally. Standards exist in this area, such as GTIN from GS1; however, it is worth noting that some of these standards are published by organizations with commercial interests
  • Product Data Quality & Supply: Even after addressing the challenges related to product data architecture and unique product identifiers, the most significant hurdle remains: populating product data with high-quality information for critical attributes. While this may seem straightforward, many of these attributes likely consist of data not directly controlled internally, requiring input from suppliers. Ensuring both the delivery and quality of this data will necessitate updated contracts with suppliers, as obtaining this information can involve substantial effort on their part. Most manufacturers and their suppliers may even need to trace several steps down the value chain to gather data on metrics such as CO₂ emissions or water usage (an illustrative sketch of a consolidated DPP record follows this list)
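To make the data-architecture and identifier points above more concrete, here is a minimal, hypothetical sketch of what a consolidated DPP record could look like; the attribute names and the article/serial-number combination are illustrative assumptions, not a structure prescribed by the ESPR or its delegated acts.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalProductPassportRecord:
    """Illustrative, simplified DPP record consolidating data from several source systems."""
    article_number: str              # model/article-level identifier (e.g. a GS1 GTIN)
    serial_number: str               # item-level identifier, if individual-product data is required
    product_name: str
    recycled_content_pct: float      # share of recycled material in the product
    carbon_footprint_kg_co2e: float  # footprint data, typically supplied from the value chain
    substances_of_concern: list[str] = field(default_factory=list)
    repair_manual_url: str | None = None  # link to documentation such as user or repair manuals

    def passport_id(self) -> str:
        """Combine identifiers into a key that is unique within (and ideally beyond) the company."""
        return f"{self.article_number}/{self.serial_number}"
```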

Even with strong efforts to address the challenges outlined above, implementation within existing IT systems can still encounter obstacles, particularly with older legacy applications that may be unable to handle necessary changes without disrupting other operational functions. Additionally, there is the question of how much to invest in outdated platforms, especially those that are already past their intended lifecycle.

Many companies will investigate mitigating solutions, building translating and aggregating data lakes or data fabrics instead of fixing the issues in the source systems. This might work for publishing DPP data to end consumers; however, it introduces other issues. These data lakes or data fabrics may become business-critical for other departments with direct customer contact, such as product documentation and aftersales. These departments need to see the same information that end consumers see, and suddenly these solutions become part of the day-to-day operational IT landscape, creating transactional data such as tickets and spare-part sales. Getting these normalized and aggregated solutions to provide data in a non-normalized, de-aggregated structure to those operational systems can be quite cumbersome. For many companies, however, such aggregated solutions built on top of the old backbone will be needed to meet the DPP requirements in time at all. When establishing these solutions, it is important to understand their long-term risks.

 

Our recommendations for the CIO and initial actions

So how should you as a CIO go about taking on these challenges? Opticos has a 7-step approach that can give you a better understanding of the challenges ahead for your organization. Step 1 is about understanding the stakeholder situation, and steps 2-7 are about understanding the magnitude of the data and IT challenges. Even if ownership of the data challenges most likely resides in the business, it is unlikely that the business itself will have the capability to conduct the analysis required in steps 2-7. It is therefore likely that the IT department will be involved, and as CIO it is wise to be prepared for this.

The 7-step approach:

  1. Investigate how and when ESPR will affect your company. As a CIO this might initially mean understanding who is responsible for this question and how your company’s products are affected.
  2. Get an understanding of what data you need to publish and what potential gaps you have in supplying them
  3. Understand whether data is needed at the product model/article level or at the individual product level. The latter dramatically increases complexity and traceability demands; if it can be avoided, significant effort can be saved
  4. Understand and set the data architecture needed to meet the requirements, and determine whether this architecture can be achieved with existing data sources, modified data sources, or new data layers that aggregate and transform the data into the architecture you need
  5. For the data you already have available, investigate whether the format and granularity are consistent across data sources and in line with the expected ESPR publishing formats
  6. For the data you do not have available, investigate whether it can be supplied internally or by your suppliers; if the latter, start discussions and negotiations with them to get that data provided in the future
  7. Ensure that the data collected and published is of consistently high quality, since it will be visible to end consumers and authorities, and bad data will damage market reputation in the same way as bad product quality

The time these challenges will require depends on your as-is situation as well as the complexity of your PLM/ERP landscape. It also depends on how big and complex your product portfolio is and, lastly, how large your network of component suppliers is. For a large-scale manufacturing company selling complex, assembled products, we at Opticos estimate that this is years of work rather than months. Our advice is therefore to start sooner rather than later to meet the regulation timeline.

 

Business Enablement and Risks

EU legislators mostly motivate this type of legislation with reduced environmental impact and the fight against climate change. However, there is also a competitive advantage for companies that adopt it in a timely and appropriate way. By complying with the legislation, companies protect their products from being banned on the European market. Furthermore, if less serious competitors’ products are banned, there will be less competition in the market for those that get this right. In the long term, when governments or legislative bodies in other parts of the world eventually put similar legislation in place, companies with this infrastructure already in place will have an advantage over those that chose not to make the effort up front.

The primary risk for companies starting early and attempting to be early adopters is that they might invest in solutions that do not ultimately fulfil the actual requirements, once clarified and formalized. Solutions in this field remain relatively immature, and there are opportunistic players aiming to be first on the market, often without offering effective, comprehensive solutions. Conducting thorough evaluations and relying on neutral, unbiased advice is essential to avoid missteps.

Being an early adopter also means risking overinvestment if the final legislation has lower requirements than originally communicated. However, simply waiting for an industry standard to emerge also carries significant risk, as delays could ultimately lead to products being barred from the European market due to non-compliance. At Opticos, we believe that our 7-step approach described above is a balanced way of preparing without making unnecessary commitments and investments early on.

 

Conclusion

Aligning with ESPR requirements is crucial for companies aiming for sustainability and market success. CIOs, as well as other business leaders, should strive to be well prepared for the upcoming challenges. For most manufacturing companies, ESPR/DPP will require substantial system upgrades or changes and strengthened data management. It will also require a completely new level of data supply from suppliers. Acting proactively as a CIO, in line with the Opticos recommendations in this viewpoint, reduces the risk of IT ending up on the critical path for future compliance. A proactive CIO will also gain trust within the business and strengthen their position as a strategic partner.

For CIOs or business owners seeking guidance in streamlining this process, Opticos offers expertise to navigate ESPR compliance effectively. Please reach out to see how we can support your organization in this sustainable production transition.

 

[1] The proposal for the ESPR regulation encompasses all industries except for food, feed, medicinal products, veterinary medicinal products, living plants, animals and micro-organisms, products of human origin, and products of plants and animals directly related to their future reproduction.

[2] Delegated acts are a type of legal instrument that allows the Commission to make specific changes or clarifications to existing legislation without requiring a full legislative process (such as passing a new law through the European Parliament and Council each time).

[3] Figures are based on Opticos’ experience from at least three major Swedish companies in the global manufacturing segment. All of them have more than 40% of their applications containing product data, mainly within R&D, Operations/ERP, SCM, Purchase, Sales, e-commerce and Aftersales.


Authors

Hans Bergström, Teodor Danielsson and Nils Andersson

Hans Bergström is a senior business leader and Partner at Opticos. He has been a Developer, Project Manager, IT Strategist and Enterprise Architect, and is currently associated with the Strategic IT capability at Opticos.
Teodor Danielsson is a seasoned business leader and Partner at Opticos. He is also heading the Strategic IT capability.
Nils Andersson is a senior consultant at Opticos who specializes in sustainability.


Sourcing of Agile Development with Shared Risk

Summary

For organizations looking to gain the benefits and flexibility of agile development, while still maintaining predictability and cost control, shifting to a risk-sharing contract model is a key strategy when sourcing application maintenance and development services from external suppliers.

Introduction

In the evolving landscape of digitalization, agile development has become a preferred approach for delivering high-quality solutions. The flexible and iterative nature of agile makes it an attractive approach in an environment where requirements often evolve over time. However, this flexibility can also introduce uncertainty in delivery timelines, costs, and outcomes for an organization.

Challenges

When sourcing Application Maintenance and Development (AM/AD) services in an agile context through external suppliers, a common challenge arises. Agile teams and development services are generally sourced on a time and material basis – often without a common supplier strategy. Time and material contracts allow for a great deal of flexibility, as suppliers provide resources based on the number of hours worked and customers adjust priorities as new insights emerge during development. However, this approach places the bulk of the risk on the customer, who pays for the supplier’s time regardless of the quality or functionality delivered. Without a common supplier strategy, there is also a risk of misalignment between the suppliers (where they may have different objectives) as well as with the customer’s objectives.

Risk-sharing through commitment to results

A more effective approach to outsourcing agile AM/AD services is through contracts focused on risk sharing, clear responsibilities as well as commitment to common ways of working.

Risk should be distributed between the customer and the suppliers, where the supplier is committed to supporting the customer’s strategic objectives and incentivized to deliver tangible and predictable results.

These risk-sharing focused contracts can include outcome-based milestones or other mechanisms that align the supplier’s performance with delivery success.

Mechanisms to ensure commitment, shared risk and predictability

Below are some examples of mechanisms that could be introduced to ensure that both predictability and shared risk are built into the agile sourcing agreements:

  • Preferred suppliers with a formal responsibility to enable and support the customer’s transformation or innovation agenda
  • Agreed team size and cost: The customer and supplier agree on a baseline for the configuration and size of the agile team(s) – and thus a fixed running cost for the team(s). Changes to team configuration or size, and thus running cost, should be agreed upon by both parties.
  • Handshake agreements at the sprint level: At the beginning of each sprint, both the customer and supplier agree on the scope of functionality to be delivered. These sprint-level handshakes act as micro-call offs, ensuring the supplier commits to delivering within the agreed scope and timeframe
  • Increment-level follow-ups: At the end of each increment (typically four to six sprints), both parties conduct a formal follow-up on sprint delivery. These follow-ups serve as checkpoints where committed delivery within the increment is compared with actual delivery. By tying payments to the performance over an increment, the customer can better control costs and ensure predictability.
  • Increment delivery precision: Increment delivery precision is a powerful risk-sharing tool. It holds the supplier accountable for delivering at least a pre-agreed percentage of the functionality committed in the sprint handshakes, at a pre-agreed quality level, as a condition for full payment. If the delivery falls short, part of the payment can be deducted, with the size of the deduction depending on both the functionality and the quality delivered. On repeated increment under-performance, an even larger amount can be deducted (a simplified payment-calculation sketch follows this list).
  • Earn-back mechanism: The supplier could be given the possibility to recover part of a deducted payment if its performance significantly exceeds the committed delivery during an agreed number of increment follow-ups after a follow-up that resulted in a payment deduction due to under-performance.
  • Fixed price bundling of base-services: To achieve a high degree of predictability, base-services such as upgrades, incident management, planning and reporting could be bundled together as a fixed-price commitment for the supplier. This would also put pressure on the supplier to automate and simplify services within the fixed-price scope.
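To illustrate how an increment delivery precision clause could translate into payment, here is a minimal sketch under assumed, illustrative parameters; the threshold, deduction rate, and the extra factor for repeated under-performance are examples only and would be defined in the actual contract.

```python
def increment_payment(
    base_fee: float,                   # agreed running cost for the increment
    committed_points: float,           # functionality committed in the sprint handshakes
    delivered_points: float,           # functionality actually delivered at the agreed quality
    precision_threshold: float = 0.9,  # illustrative: at least 90% must be delivered for full payment
    deduction_rate: float = 0.5,       # illustrative: 50% of the shortfall is deducted
    repeated_underperformance: bool = False,
) -> float:
    """Return the payment for one increment under an illustrative delivery-precision clause."""
    precision = delivered_points / committed_points
    if precision >= precision_threshold:
        return base_fee  # full payment when the committed scope and quality are met
    shortfall = precision_threshold - precision
    deduction = base_fee * shortfall * deduction_rate
    if repeated_underperformance:
        deduction *= 1.5  # illustrative: larger deduction on consecutive under-performance
    return base_fee - deduction

# Example: 100 points committed, 80 delivered, against a 1,000,000 SEK increment fee.
print(increment_payment(base_fee=1_000_000, committed_points=100, delivered_points=80))
```

An earn-back clause would work the same way in reverse, returning part of a previous deduction when later increments exceed their commitments.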

 

The ultimate goal of these mechanisms is to enhance cost efficiency in agile AM/AD sourcing by ensuring predictable delivery and creating a fair distribution of risk between the customer and suppliers. By focusing on agreed team and scope, performance-based payments, and rigorous follow-up mechanisms, customers can limit the financial risk of delays or scope creep while still benefiting from agile’s adaptability.

This shared risk approach motivates suppliers to deliver value while safeguarding the customer from overpaying for underperformance. Additionally, by linking full payment to outcome, organizations can ensure that their investments yield tangible results, promoting both cost efficiency and high-quality results.


Authors

Maria Nelsson, Maria Wennerberg

Maria Nelsson is a seasoned business leader with over 25 years of experience in strategic IT and governance, sourcing, and leading IT/business transformations.
Maria Wennerberg is a senior independent business leader with over 25 years of experience in Advisory Services and Sourcing. She is associated with Opticos and enjoys working with large transformations and change management.


Tools for Managing Cultural Differences in Global Teams

In today’s global business environment, companies increasingly partner across borders, forming diverse teams that bring together different nationalities and cultural perspectives. While these collaborations offer significant opportunities for innovation, they also present challenges, particularly when it comes to managing change. Ensuring the success of change initiatives in culturally diverse teams requires businesses to address the unique hurdles posed by cultural differences, communication styles, and geographically dispersed teams.

From our consulting experience, we’ve seen that managing these challenges demands structured approaches that address cultural and communication barriers. It’s not about prescribing a one-size-fits-all solution but finding the right tools to adapt to the specific cultural contexts of the teams involved.

Common Challenges in Global Teams

Global organizations often face recurring challenges when working with culturally diverse teams. Some of them are:

 

Managing Cultural Differences in Practice

Here are some practical tools and strategies that we’ve found effective in managing cultural diversity:

  1. Cultural Awareness Workshops
    Cultural awareness is more than a formality—it’s an essential starting point for collaboration. We’ve found that offering targeted cultural awareness training helps team members understand diverse perspectives and workplace behaviours. For example, in hierarchical cultures, employees may defer to authority, while in more egalitarian cultures, decision-making tends to be participatory. By educating employees on these differences, workshops enable better communication across hierarchical and cultural divides.
  2. Social Engagement and Team-Building Activities
    Building trust and rapport is crucial for change initiatives to succeed, particularly in collectivist cultures where group harmony is prioritized. Organizing social activities or team-building exercises helps break down barriers and fosters a culture of openness. These activities also contribute to work-life balance in more indulgent cultures, improving team satisfaction and cohesion.
  3. Establishing Ground Rules and A Common Vision
    Establishing clear ground rules—such as prioritizing the company’s success, encouraging accountability, and fostering collaboration—ensures that all stakeholders are aligned and working towards the same goal. This helps streamline interactions and creates a structured framework for effective teamwork across cultural boundaries.
  4. Stakeholder Mapping and Engagement Plans
    Identifying key stakeholders and understanding their influence is vital in high power distance cultures, where leadership approval may hold significant weight. In cultures with high uncertainty avoidance, early and frequent stakeholder engagement reduces resistance to change. Effective engagement plans ensure that stakeholders are kept informed and that their feedback is integrated throughout the change process.
  5. Unified Communication Platforms and Escalation Processes
    Clear communication is a cornerstone of managing change. A unified communication platform allows teams to share updates, raise concerns, and provide feedback, no matter their location. In high uncertainty avoidance cultures, defined escalation processes offer security and clarity, while in more individualist cultures, such platforms empower team members to voice concerns directly.
  6. Empowering Local Leadership and Change Agents
    Local leadership plays a key role in bridging cultural gaps. Leaders familiar with local customs can adapt change initiatives to meet their team’s expectations. In hierarchical cultures, empowering local leadership ensures that change efforts respect the existing structure, while in collectivist cultures, local leaders can foster group consensus and collaboration, making change more sustainable.
  7. Feedback Loops and Continuous Improvement Mechanisms
    Regular feedback loops are essential to address concerns early and adjust change initiatives based on real-time input. We’ve found that continuous improvement mechanisms are especially effective in long-term-oriented cultures, where gradual growth is highly valued. In high uncertainty avoidance cultures, frequent feedback reduces fear and builds confidence in the change process.

 

Case Study: Managing Collaboration Across Multiple Stakeholders in a Global Transformation

In one of our recent engagements, we worked with a client facing the challenge of managing collaboration across multiple stakeholders—both internal and external—spanning several regions, including the UK, Sweden, the US, and India. The complexity of this setup was compounded by the fact that different cultures, priorities, and working styles needed to be aligned for the transformation to succeed.

To address these challenges, the company implemented several ground rules that ensured effective collaboration:

  1. Put the Company First: This principle guided all stakeholders to prioritize resolving business problems over individual disagreements. It helped to maintain focus on the company’s overall success.
  2. Be Accountable: Everyone was expected to demonstrate accountability for their roles and responsibilities. This fostered a sense of mutual responsibility, encouraging transparency and ownership.
  3. Work Collaboratively: The mantra was simple but powerful—everyone’s efforts collectively contributed to the company’s success. This reinforced the importance of partnership across internal teams and external suppliers.
  4. Share Knowledge: Knowledge sharing was emphasized as essential to the success of the transformation. All stakeholders, whether internal or external, were encouraged to ensure that no part of the ecosystem was left unprepared.

Alongside these ground rules, we facilitated cultural awareness training to help the teams understand the diversity in communication styles, decision-making processes, and approaches to authority across different regions. This holistic approach—combining structured ground rules and cultural sensitivity—ensured that all stakeholders, regardless of location, could work together cohesively and achieve the desired outcomes.

 

Conclusion: Communication, Relationships, and Psychological Safety as Catalysts for Success

Managing cultural differences in global teams requires more than just awareness—it demands flexibility, trust, and a commitment to continuous learning. From our experience, the most successful teams are those that understand the pivotal role of communication and relationships in bridging cultural divides. Open, flexible communication fosters trust and builds psychological safety, where team members feel comfortable expressing their ideas and concerns without fear of judgment. This environment of psychological safety is crucial in diverse teams, as it encourages open dialogue and supports collaborative problem-solving.

Equally important are the relationships within the team. Strong personal connections, built through both formal and informal interactions, help break down barriers and create a sense of unity. These relationships make it easier for team members to navigate cultural differences and collaborate effectively, even in high-pressure situations.

By embedding communication, psychological safety, and relationship-building into the core of your change initiatives, organizations can turn cultural diversity into a strength. When teams embrace cultural differences as an asset, not a challenge, they unlock the full potential of their global workforce. This inclusive, adaptive approach not only enhances collaboration but ensures the long-term success of your initiatives.


 

Jasmeen Kaur & Maria Ekberg

Jasmeen Kaur is a Senior Consultant with experience in Intelligent Automation, Process Transformation and Program Management.
Maria Ekberg is a seasoned business leader with over 25 years of experience in strategic business planning, sales and change management.


Measuring the Impact of Digital Transformation Projects

As digital business transformation becomes increasingly vital, management is willing to embark on initiatives that are expensive and both time and resource-intensive. But are these initiatives successful? How can progress and results be assessed? Many organizations invest heavily in digital technologies but are unable to determine if the investment has generated value or if they are headed in the right direction.

Digital transformation can go beyond technology and envelop the entire business. Successful digital initiatives require a strategic approach, strong leadership, employee engagement, and a focus on meeting customer needs. According to Harvard Business Review, although 89% of large companies globally have a digital and AI transformation underway, only 31% have captured the expected revenue lift and 25% have seen expected cost savings from the effort.

As digital projects become more complex and resource-intensive, it becomes necessary for organizations to quantify their impact effectively. A clear framework can help align initiatives with strategic goals, optimize resource allocation, manage risks, and make informed decisions. This in turn can foster a culture of continuous improvement and innovation. It can further ensure that digital projects not only deliver immediate benefits but also contribute to long-term organizational growth. This enables the management team to prioritize scalable initiatives capable of substantially improving the organization’s performance with fast, minimally viable outcomes that can be improved over time.

 

A measurement framework is key to digital transformation success

Unfortunately, neglecting measurement is an oversight that undermines many transformations before they even begin. With a framework in place to identify weak points and monitor progress, it becomes much easier to course-correct. Every digital initiative is different, and there are no universal metrics that make measurement easy.

Figure 1: Building a framework for measuring impact of digital transformation projects

So, where do we start to identify the right metrics and KPIs? Organizations must focus on what they want to achieve with the transformation efforts and assess their digital maturity level to identify the right set of KPIs. Once the KPIs are identified, one must have the tracking and monitoring systems in place along with a robust reporting mechanism to enable analysis and decision making.

 

Focus on the right metrics to guide your initiative to success

Before introducing any KPIs, it makes sense to set the context by answering some preliminary questions. For instance, what are the key drivers for the digital project? Do we have the right systems to collect and analyse the data necessary for these KPIs? Organizations should understand their digital maturity levels and the drivers for the project before establishing the right set of KPIs.

Figure 2: Measuring Digital Business Transformation

Organizations can have different objectives for their transformation initiatives, and they should begin by aligning KPIs with organizational goals, ensuring they reflect desired business outcomes. They can start by involving key stakeholders to identify relevant metrics that impact various departments. Every stakeholder may look at different metrics depending on their roles, and it is thus important to keep them informed, take feedback and incrementally course correct if necessary.

As the business objectives evolve, regular review and adjustment of KPIs can enable continuous improvement and successful digital transformation. When discussing the importance of measuring the success of any transformation project, we can draw on our experiences from previous projects at Opticos.

Every organization tracking its digital transformation progress will naturally employ various measurement tools. Given that the selection of these tools impacts the efficiency of measurement efforts, as well as the workloads and workflows of their users, selecting the right tools can significantly enhance the overall effectiveness of the transformation program.

Opticos provides expert knowledge and helps organizations to develop successful frameworks, analyse current reporting mechanisms, establish robust KPIs and identify the right tools for measurement. Contact us to learn more.


Authors

Abhishek Kale, Matilda Johansson

Abhishek Kale is a Senior Consultant with experience in Process Excellence, Digital Transformation, and ERP Consulting across multiple business sectors.
Matilda Johansson is a consultant with experience in Digital Transformation projects across various sectors. She has been involved in ERP implementations and has taken on roles within change management and communication.


The Four Pillars to Succeed in Change Management

Introduction

Change management is a crucial process for any organization preparing to transition from one state to another. At Opticos, we understand that driving change means not just implementing new strategies but anchoring them within the organization to achieve long-lasting effects. Our experience in change management has shown us that the key to success lies in addressing four fundamental areas: Motivate, Involve, Enable, and Empower. These areas are applicable irrespective of the change management framework chosen, ensuring their universal relevance and effectiveness.

Motivate: Creating the Drive for Change

The first stage of any change management process is motivation. At Opticos, we believe in preparing minds through clear communication of the benefits and necessity of the change. This is done through:

  • Establishing a Clear Vision: We develop a concise, inspiring vision of the future state that resonates with all stakeholders.
  • Aligning Objectives: By aligning the change with the organization’s strategic goals, we ensure everyone understands its importance.
  • Creating Enthusiasm: Through inspirational leadership and communication, we generate excitement and commitment towards the change.

Involve: Engaging Stakeholders

Successful change cannot happen in isolation. Involving stakeholders at every level is crucial. We do this through:

  • Psychological Safety: We create a safe environment where team members feel comfortable sharing their perspectives and concerns.
  • Inclusive Decision Making: By involving diverse viewpoints, we ensure decisions are well-rounded and widely accepted.
  • Active Participation: Encouraging active participation from all stakeholders helps build ownership and accountability for the change.

Enable: Facilitating the Change Process

Enabling involves creating an environment that supports the change and makes it sustainable. Our approach includes:

  • Training and Development Programs: Comprehensive training programs equip employees with the skills needed for new ways of working.
  • Support Systems: Building robust support systems, including mentoring and coaching, guides employees through the change.
  • Delegating Authority: Empowering teams with the authority to make decisions gives them a sense of ownership and helps accelerate the change process.

Empower: Equipping for Success

Empowerment involves providing the necessary tools, authority, and confidence to embrace change. We achieve this by:

  • Infrastructure and Resources: Providing the necessary infrastructure and resources to support new processes and technologies.
  • Continuous Improvement: Implementing feedback loops to continuously refine and improve the change process.
  • Reinforcement: Reinforcing new behaviours through recognition and rewards ensures that the change is anchored and becomes part of the organizational culture.

Opticos’ Proven Framework

At Opticos, our approach to change management is framework-agnostic, allowing us to tailor it to the specific needs of each organization. We leverage well-known frameworks such as Kotter’s 8-Step Process and Prosci’s ADKAR model, integrating them into our unique toolbox to drive successful outcomes.

Conclusion

Change is not a straight line, and navigating it requires careful planning and execution. At Opticos, our expertise in motivating, involving, enabling, and empowering teams has helped numerous organizations achieve their desired business benefits through effective change management. Our commitment to sustaining change ensures that it goes beyond implementation and has a lasting impact on the organization.


Maria Ekberg & Jasmeen Kaur

Maria Ekberg is a seasoned business leader with over 25 years of experience in strategic business planning, sales and change management.
Jasmeen Kaur is a senior consultant with experience in Intelligent Automation, Process Transformation and Program Management.


news

AI in Predictive Manufacturing

Introduction

Manufacturing companies face increasing pressure to optimize operations, reduce costs, and enhance competitiveness. To meet these challenges, manufacturing and supply chain companies are turning to predictive manufacturing. This approach leverages advanced analytics and AI algorithms to anticipate disruptions, optimize production processes, and enhance overall efficiency.

Moving effectively towards predictive manufacturing requires a strategic approach and the right tech solutions. In this article, we’ll explore the role of AI in predictive manufacturing, its key use cases, best practices, and steps for businesses to get started.

Industry 4.0 and predictive manufacturing

Industry 4.0 is the ongoing automation of traditional manufacturing and industrial practices using smart tech. It is a fusion of technologies that blurs the lines between the physical and digital domains. Six primary technologies are driving Industry 4.0: the Internet of Things, cloud computing, artificial intelligence, federated AI, cybersecurity, and digital twins.

Convergence of technologies enabling predictive manufacturing

The convergence of these technologies creates a more intelligent and connected manufacturing environment, offering real-time analytics, increased flexibility and efficiency, and customer-centric manufacturing.

Enabled by these tech developments, predictive manufacturing is a data-driven approach that uses historical and real-time data to forecast future events and outcomes in the manufacturing process. It involves leveraging advanced analytics techniques, such as machine learning and predictive modeling, to analyze data from various sources, including sensors, equipment, and production systems. By identifying patterns, trends, and anomalies in the data, predictive manufacturing enables companies to make informed decisions, optimize processes, and mitigate risks.
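To make the pattern-and-anomaly point concrete, below is a minimal sketch (in Python, assuming pandas and scikit-learn are available) of one common technique: an unsupervised anomaly detector trained on historical sensor readings. The file name and columns are hypothetical placeholders for whatever a plant historian exports; this is an illustration, not a production pipeline or any vendor’s specific solution.

```python
# Minimal sketch: flag unusual sensor readings with an unsupervised detector.
# "sensor_readings.csv" and its column names are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import IsolationForest

readings = pd.read_csv("sensor_readings.csv")             # timestamp, temperature, vibration, pressure
features = readings[["temperature", "vibration", "pressure"]]

# Assume roughly 1% of historical readings are anomalous.
detector = IsolationForest(contamination=0.01, random_state=0)
readings["anomaly"] = detector.fit_predict(features)      # -1 = anomaly, 1 = normal

alerts = readings[readings["anomaly"] == -1]
print(f"{len(alerts)} anomalous readings out of {len(readings)}")
```

In practice, the flagged readings would be routed to maintenance or quality teams for inspection, and the assumed contamination rate tuned against confirmed incidents.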

When manufacturing machinery is fully integrated with digital systems, real-time data collection and analysis become possible, and this data can then be used to optimize production processes, predict maintenance needs, and improve overall quality control. Automation drives process efficiency and predictive algorithms add supply chain flexibility, so that ultimately the entire supply chain becomes agile enough to support customer-centric manufacturing.

Building resilient, sustainable and efficient supply chains through digital twins

There are many applications and use cases to be explored with these technologies, but the ability to build resilient, sustainable and efficient supply chains is perhaps one of the most interesting. Imagine a supply chain fully modeled as a ‘digital twin’, including the manufacturing process, where the physical equipment is fully integrated with digital systems, offering a real-time view of your supply chain flows. While this is a daunting task, value can be realized in a stepwise manner. For example, benefits can already be created with an isolated view of 1) the manufacturing process alone, or 2) a limited extension downstream or upstream.

Digital twins leverage the data and connectivity of Industry 4.0 to create a dynamic digital counterpart that reflects the real world. This allows for better decision making, improved efficiency, and overall optimization across the entire manufacturing process. A digital twin is essentially a virtual representation of a physical object, process, or system. It’s constantly updated with data from the physical counterpart using IoT sensors. This creates a bridge between the physical world and the digital world, allowing for better monitoring and analysis. This can also be extended with concepts like federated AI/machine learning to further enhance dynamic adaptability, privacy and data security. In essence, the combination of federated AI and digital twins creates a dynamic and collaborative manufacturing ecosystem. Factories can leverage the power of AI and real-time data, while still maintaining data privacy, to optimize processes, improve efficiency, and drive innovation across the industry.
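As a minimal sketch of that bridge, the hypothetical class below keeps a virtual machine object in sync with incoming sensor messages and evaluates a simple rule on the virtual state instead of on the physical asset. The message format and threshold are illustrative assumptions; real digital-twin platforms add historical storage, simulation models and 3D representations.

```python
# Minimal sketch of a digital twin's core loop: mirror sensor data, reason on the mirror.
from dataclasses import dataclass, field

@dataclass
class MachineTwin:
    machine_id: str
    state: dict = field(default_factory=dict)      # latest known value per sensor

    def ingest(self, reading: dict) -> None:
        """Update the virtual state from one (hypothetical) IoT message."""
        self.state[reading["sensor"]] = reading["value"]

    def needs_attention(self, vibration_limit: float = 4.0) -> bool:
        """Example rule evaluated on the twin rather than the physical machine."""
        return self.state.get("vibration", 0.0) > vibration_limit

twin = MachineTwin("press-01")
twin.ingest({"sensor": "vibration", "value": 5.2})
print(twin.needs_attention())                      # True -> e.g. raise an inspection work order
```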

Creating value from digital twins

There are four pockets of distinct value to be explored: efficiency gains, innovation, supply chain resilience and sustainability. This value can be illustrated through a set of straightforward use cases: how to sharpen foresight, optimize production, secure delivery and reduce environmental impact.

Sharpening foresight: AI for predictive maintenance

The factory utilizes digital twins for equipment health monitoring. Historical maintenance data is analyzed to identify common failure patterns and lead times for specific equipment types. This allows for:

  • Developing predictive maintenance schedules based on equipment usage and sensor data from the digital twin (see the sketch after this list).

  • Proactive stocking of spare parts most likely to fail, minimizing downtime.

  • Optimizing technician training by focusing on the most common maintenance tasks identified through historical data analysis.
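One way to frame the failure-pattern analysis above, sketched under assumed data: train a classifier on historical machine runs to estimate whether a machine will fail within the coming week, then score the same features computed from the twin’s live data. The CSV file, feature columns and label are hypothetical, and the model choice is illustrative rather than a prescribed method.

```python
# Minimal sketch: predict "failure within 7 days" from aggregated sensor features.
# "maintenance_history.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

history = pd.read_csv("maintenance_history.csv")
X = history[["avg_temperature", "avg_vibration", "run_hours_since_service"]]
y = history["failed_within_7_days"]                     # 1 = failed, 0 = did not fail

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In operation, the resulting failure probabilities can drive the maintenance schedule and the spare-parts stocking described above.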

Optimizing production: AI for efficiency 

The factory leverages digital twins to track production processes in real-time. Data analysis from the digital twin identifies bottlenecks and inefficiencies. This allows for:

  • Simulating production scenarios using the digital twin to test potential solutions for bottlenecks (a minimal simulation sketch follows this list).

  • Investing in targeted automation solutions to address specific bottlenecks identified by the digital twin.

  • Optimizing production line layouts based on data insights to improve material flow and reduce processing times.
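A minimal sketch of such a what-if simulation, assuming a hypothetical two-station line with exponential processing times: double the capacity at the suspected bottleneck and compare the time needed to finish the same batch of jobs. A real digital twin would calibrate these rates from live data instead of the made-up numbers used here.

```python
# Minimal sketch: compare makespan for a two-station line before/after a capacity upgrade.
import random

def simulate(rate_station_b: float, n_jobs: int = 1000, seed: int = 42) -> float:
    """Push n_jobs through two stations in series; return the makespan (total time)."""
    random.seed(seed)
    a_free = 0.0                                        # when station A is next available
    b_free = 0.0                                        # when station B is next available
    for _ in range(n_jobs):
        end_a = a_free + random.expovariate(1.0)        # station A: ~1 job per time unit
        a_free = end_a
        start_b = max(end_a, b_free)                    # B waits for the job and for itself
        b_free = start_b + random.expovariate(rate_station_b)
    return b_free

# Baseline: station B is the bottleneck (0.8 jobs/time unit); scenario: capacity doubled.
print(f"Baseline makespan: {simulate(rate_station_b=0.8):.0f}")
print(f"Upgraded makespan: {simulate(rate_station_b=1.6):.0f}")
```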

Securing delivery: AI for supply chain resilience

The digital twin tracks inventory levels and production capacity. Advanced data analytics are implemented on historical supplier and logistics data to predict potential disruptions. This allows for:

  • Identifying alternative suppliers and negotiating backup agreements with the orchestrating party to ensure parts availability.

  • Maintaining buffer stock of critical materials based on historical usage patterns and lead times (see the sketch after this list).

  • Developing contingency plans to adjust production schedules or source materials quickly in case of disruptions.
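The buffer-stock point can be made concrete with a standard safety-stock calculation driven by historical daily usage and supplier lead time. The numbers and the roughly 95% service level below are illustrative assumptions, and the formula assumes approximately normal demand with a fixed lead time.

```python
# Minimal sketch: safety stock and reorder point from historical usage data.
from statistics import mean, stdev

daily_usage = [120, 135, 110, 150, 128, 140, 132, 118, 145, 125]   # units/day (historical, illustrative)
lead_time_days = 12            # average supplier lead time (illustrative)
z_service_level = 1.65         # ~95% service level under a normal-demand assumption

avg_demand = mean(daily_usage)
demand_std = stdev(daily_usage)

safety_stock = z_service_level * demand_std * lead_time_days ** 0.5
reorder_point = avg_demand * lead_time_days + safety_stock

print(f"Safety stock:  {safety_stock:.0f} units")
print(f"Reorder point: {reorder_point:.0f} units")
```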

Reducing environmental impact: AI for sustainability

The factory’s digital twin monitors energy consumption. Data analysis identifies areas for improvement. This allows for:

  • Investing in energy-efficient machinery based on recommendations from the digital twin’s analysis of energy usage patterns.

  • Optimizing production scheduling to minimize peak energy consumption periods.

  • Exploring alternative energy sources like solar or wind power based on data on the factory’s location and energy needs.

Now imagine the potential if this factory operates within a supply chain ecosystem together with other parties represented by similar digital twins. Each factory or party leverages a digital twin, a virtual representation of its physical machinery and processes. By orchestrating these parties and leveraging federated data and insights, the overall resilience and efficiency of the supply chain can be strengthened significantly. A collaborative approach would foster continuous improvement and pave the way for a more sustainable future for all participants.

 

Taking the first steps towards predictive manufacturing

The potential of AI in predictive manufacturing is undeniable, but transitioning to this data-driven approach requires careful planning and execution. AI and new tech are not a silver bullet, and doing the necessary groundwork will increase the chances of success:

  • Assess readiness: Conduct a thorough evaluation of your current manufacturing processes, data collection capabilities, and infrastructure. Identify areas where data collection can be improved and invest in necessary sensors and IT systems.

  • Set target KPIs: Clearly define your goals for implementing predictive manufacturing. What areas do you want to improve – efficiency, maintenance, sustainability? Establish key performance indicators (KPIs) to track progress towards these goals.

  • Craft data strategy: Plan how you will collect, store, and analyze data.  Ensure data security and establish a governance framework for responsible data use.

  • Build capabilities: Consider hiring data scientists or partnering with AI specialists who can help develop and implement AI models for predictive maintenance, process optimization, and supply chain management.

  • Start small and scale: Begin with a pilot project focused on a specific area like predictive maintenance for a single machine or production line.  This allows you to test the technology, refine your approach, and demonstrate the value proposition before scaling up across your entire operation.

  • Continuous improvement: Predictive manufacturing is an ongoing journey. Regularly evaluate your AI models and data analysis processes to ensure they remain accurate and effective.  Be prepared to adapt and refine your approach as your needs and the manufacturing landscape evolve.

In addition, look beyond the boundaries of your own organization. The magic happens when connecting a series of federated models to create a dynamic, data-driven supply chain.

Algorithma’s digital twin solution for the shipping industry

Author

This article is authored by our strategic partner Algorithma, a pioneering strategy, transformation and AI firm committed to driving business impact through the latest in AI and tech.



news

Strategic Considerations While Addressing Your Operating Model

Introduction

Companies on a growth journey often reach a point where they decide to set aside time to refine or reassess their operating model. Initiating such a program is often associated with multiple challenges, but it can also bring multiple benefits if successful. Regardless of whether your business model is built on flexibility and tailored solutions or aimed at streamlined operations focusing on efficiency, volume, and cost – a well-oiled operating model is needed and will help your business execute.

To get off to a good start and to make sure an operating model program stays on track, several factors need to be addressed. The complexity and scope of such a task may be daunting. To help power program efficiency and limit the risk of not reaching the desired target, we propose paying extra attention to the aspects listed below.

Laying the Groundwork: Defining Processes, Scope, and Objectives

 

1. Agree on a high-level process landscape

Without strong agreement on what your company’s key business processes are, there is a high risk of getting stuck in debates on scope, (functional) responsibilities, and the overall ambition level for the operating model program.

Ensure agreement on scope with key stakeholders early on and assign ownership to each key process, to help make sure key decisions are taken while developing the future state. Below is an example of how a high-level process landscape might look for one company. However, what is considered a supporting process for one company can be considered a core business process for another.

2. Identify and Address Core Pain Points

Understanding and prioritising the real pain points across the organisation is important to ensure that the focus is on solving the most pressing issues, which helps the business see the true value of the work.

Engage with people in the organisation to identify, categorise, and prioritise pain points in appropriate categories (e.g., process, people, IT, governance). Typical pain points within, for example, the travel & expense process in finance might be a time-consuming process for handling expense receipts, a document-intensive process for applying for corporate credit cards, and a lack of personnel to handle the requests.

 

3. Tackling High-Impact Areas First: Do not try to solve everything at once

Trying to solve everything at once, rather than just the most important pain points, will likely result in a prolonged and inefficient operating model program in which the desired target is never achieved.

Build your future state by continuously focusing on the true pain points and manage your scope to ensure progress (e.g., in sprints). It is not possible to address all management processes, core business processes and supporting processes at once. If, for example, finance is considered one of the areas with the most urgent pain points, it should be addressed first; pain points within other areas can be addressed later.

 

4. Collaborative Redesign: Involve relevant stakeholders

Cross-company involvement is vital to ensure that the organisation understands the desired target and what is expected of them, and to capture the current issues they experience (which need to be addressed). Change is embraced best when people feel involved.

Make sure to involve people across the business and ensure a combination of functional and operational representation. For instance, when addressing a process such as order to cash, many stakeholders (finance controller, treasury, tax, project manager, etc.) need to be involved as they are all part of the process.

 

5. Set up a Governance Framework

Inefficiencies and ambiguities often come down to issues related to decision-making at various levels in the company. Agree on what the key decisions are, who should take them, and when. Avoid bottleneck situations and limit the number of roles involved when possible.

For example, within finance there are many processes, and the processes in turn have multiple decision points or approvals. There should be one role responsible for the order to cash process to ensure that the process runs smoothly and without bottlenecks. The decision points within the process can have other roles assigned to them, e.g., treasury to decide on funds flow.

Conclusion

In summary, to ensure program efficiency and achieve the target when reassessing the operating model, make sure to agree on a high-level process landscape, find the true pain points, address what is important, and do not try to solve everything at once. Involve the business as much as possible, and ensure that governance and decision-making are clear in the processes.

At Opticos, our team of experienced management consultants specializes in guiding companies through complex operating model transformations. With our strong background in IT management and business process optimization, we offer tailored solutions to align your strategy, people, processes, technology, and governance for sustainable growth. If you’re embarking on an operating model reassessment, Opticos can be your trusted partner.


Authors

Jessica Grundström & Rickard Holmkvist

Jessica Grundström is a senior consultant with experience in working with companies and their operating model journey as well as project and change management across various industries.
Rickard Holmkvist is a seasoned leader, COO & Head of consulting services at Opticos. He has over 20 years of experience in strategy, sourcing and business consulting across industries.


news

Ten Digital and AI Innovations Redefining Business Dynamics

In the fast-paced realm of business, a silent revolution is coming in the form of a digital wave. Digital and Artificial Intelligence (AI) innovations are reshaping the landscape of how organisations operate, unlocking unprecedented efficiency, agility, and strategic insights. Digital technology now touches almost every part of our lives thanks to new technologies like cloud computing and artificial intelligence, along with new ways of building software. As long as technology keeps advancing, your business will need to adapt. So, it’s important to see digital and AI transformation as something ongoing, not just a one-time thing. This journey involves constantly improving competitiveness by quickly adopting new technologies. As technology becomes more important, the difference between business and tech leaders will become less clear. That means all senior executives will need to know how to use technology effectively in their areas.

The substantial increase in spending on digital initiatives clearly suggests that organizations worldwide are investing in digital transformation to modernize their operations, improve efficiency, enhance customer experiences, and stay competitive in an increasingly digital world.

In this exploration, let’s shed some light on the ten digital and AI innovations that are reshaping businesses worldwide.

1. Generative AI: Data scientists globally are working on a branch of AI that focuses on making machines creative, like humans. These AI programs, called generative models, can take different kinds of data – like videos, pictures, sounds, or even computer code – and use it to create completely new content that has never been seen before. One famous example is GPT-3, made by OpenAI, which can write text so similar to what humans write that you can hardly tell the difference. A related OpenAI model, DALL-E, can even generate images.

2. Cybersecurity Reinforced by AI: Business leaders worldwide rank cyber-attacks among the biggest threats to their organizations. In today’s world, where cyber dangers keep growing, using AI for cybersecurity has been a game-changer. AI helps examine large amounts of data to find unusual activities or possible breaches, making digital systems stronger against cyber threats.

3. Blockchain beyond cryptocurrencies: Blockchain technology is experiencing dynamic growth beyond its association with cryptocurrencies. While Non-Fungible Tokens have surged in popularity, redefining ownership in digital art and entertainment, decentralized finance (DeFi) platforms are reshaping traditional banking and finance systems. Alongside these trends, blockchain is quietly revolutionizing business operations by enhancing transparency and accountability in areas like supply chain management and secure data sharing. The integration of blockchain with emerging technologies such as artificial intelligence and the Internet of Things (IoT) is further fueling innovation and expanding its applications across various sectors.

4. Augmented Reality (AR): Augmented Reality (AR) is experiencing a dynamic evolution, marked by several key trends. It’s revolutionizing retail with immersive shopping experiences and aiding remote collaboration across industries like manufacturing. In healthcare, AR is enhancing surgical precision and patient care while also transforming education through interactive learning experiences. The emergence of spatial computing, combining AR with AI, promises even more seamless integration of digital content into our physical world. As AR gets better and more popular, it’s going to really change how we do things, learn, and talk with each other in the future.

5. Digital twins transforming industries and reshaping operations – Digital twins are rapidly becoming a key trend across industries, including manufacturing, where they are quietly revolutionizing production processes. By creating virtual replicas of physical assets, businesses can continuously monitor, analyze, and optimize operations in real-time, leading to reduced downtime and enhanced efficiency. Recent advancements in IoT sensors and data analytics have made digital twins more sophisticated, enabling predictive maintenance and AI-driven optimizations. Cloud-based solutions are also gaining prominence, facilitating remote access and collaboration among stakeholders. Moreover, digital twins are finding applications beyond manufacturing, such as in urban planning and smart city initiatives, where they allow authorities to simulate and optimize infrastructure projects before implementation. As digital twin technology continues to evolve, its transformative capabilities are poised to drive innovation and efficiency across various sectors. The digital-twin technology market is projected to experience a rapid annual growth rate of approximately 60 percent over the next five years, culminating in a valuation of $73.5 billion by 2027. Companies have observed a 25 percent reduction in quality issues for products that originate as digital twins before entering production. Furthermore, nearly 75 percent of companies have embraced digital-twin technologies, with many achieving at least medium levels of complexity. Source : “Digital twins: The key to smart product development,” McKinsey, July 31, 2023.

6. Cloud computing for real-time insights: Cloud computing is the foundation upon which businesses build their digital transformation initiatives. Its power lies in its agility, adjusting to workloads and adapting to ever-changing markets, new customer demands and trends. With cloud, scaling applications up and down is fast and easy, leading to much more efficient operations and getting your products or services to market faster than before. Cloud is the backbone of today’s digital world and the base for companies’ digital transformation.

7. Big data analytics driving informed decision making: Data has become the lifeline of every thriving business, and treating it strategically is no longer optional. Big data analytics is emerging as a game-changer. By applying advanced algorithms to historical data, businesses can use predictive analytics to anticipate future trends, customer behaviors, and market shifts. This enables informed decision making, minimizing risks and maximizing opportunities. Harnessing the power of big data is key to surviving in the new-age business world and leading the charge with every data-driven step.

8. Natural Language Processing (NLP) and AI-powered chatbots in customer service: AI-powered chatbots are increasingly being used in customer service. These chatbots are available 24/7 and enable businesses to respond to customer queries faster and more consistently. Using Natural Language Processing, chatbots can engage in seamless conversations with customers. This innovation transforms customer interactions, providing personalized and efficient service. Overall, chatbots have been a game changer in customer service by automating customer support tasks. This has enabled businesses to reduce their costs substantially by automating standard queries, freeing time and resources to focus on more critical tasks.

9. Robotic Process Automation (RPA) in business processes: Behind the scenes, software robots are taking on repetitive tasks, liberating human resources for more strategic endeavors. RPA is revolutionizing data entry, invoice processing, and other routine tasks, resulting in increased efficiency and cost savings. This is helping businesses scale their operations more quickly, taking on more work and transactions without adding more manpower.

 

10. Explainable AI for ethical decision making: The increased reliance on AI has delivered greater accuracy, but the transparency and reasoning behind AI judgments are often just as crucial for trust and reliability in both AI and human decision-making. Explainable AI serves as a bridge between humans and AI, offering a set of methods that make clear how a model arrived at a specific conclusion. The emphasis on interpretability to support better decision-making is expected to become more pronounced in industries such as healthcare and human resources. With AI becoming more common, it’s crucial that its decisions are ethical. Explainable AI plays a critical role in ensuring transparency and understanding in algorithmic decision-making, thereby mitigating the risks associated with biased or unethical outcomes.
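As one small illustration of what such methods look like in practice, the sketch below uses permutation importance, a common model-agnostic explainability technique, to rank which inputs a trained model actually relies on. The public dataset and model choice are placeholders, not tied to any specific industry case.

```python
# Minimal sketch: rank feature influence with permutation importance (scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)   # placeholder public dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much test accuracy drops.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranking = sorted(zip(X.columns, result.importances_mean), key=lambda p: p[1], reverse=True)
for name, score in ranking[:5]:
    print(f"{name:30s} {score:.3f}")
```

Features whose shuffling barely changes accuracy carry little weight in the model, which gives reviewers a starting point for questioning whether the influential inputs are appropriate ones.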

While the headlines may focus on flashy new technologies, the real change in business happens quietly. These ten digital and AI innovations are like unsung heroes, changing how companies work. They’re making things run better and smarter, while also considering what’s right. As businesses keep growing, adopting these behind-the-scenes changes will be important to stay ahead in a digital world that’s always changing.


Ankita Bhargava & Jasmeen Kaur

Ankita Bhargava is a Senior Manager with experience in IT strategy consulting for process improvement, IT operating model, digital transformation and delivering business solutions.
Jasmeen Kaur is a senior consultant with experience in Intelligent Automation, Process Transformation and Program Management.
