news

Are You Ready for NIS2?

The European Union’s NIS2 Directive (Network and Information Security 2), set to take effect in 2024, represents an evolution in cybersecurity requirements for organizations across member states.

As an update to the original NIS Directive, NIS2 expands the scope of industries and strengthens the baseline security standards for critical infrastructure, technology providers, and essential service operators.

In the simplest of terms: “NIS2 is about stricter cyber requirements and incident reporting.” For organizations, this brings both challenges and opportunities: enhancing cybersecurity frameworks, ensuring compliance, and mitigating risk.

One aspect that is becoming increasingly clear is the shift of responsibility. Traditionally, cybersecurity has been viewed as an IT issue. With NIS2, however, it is clear that cybersecurity is as much a governance and leadership issue as a technical one. The ultimate responsibility for cybersecurity shifts from IT departments to company executives and boards. This puts increased emphasis on educating management in cybersecurity so they better understand their role. At the same time, CISOs need to communicate cyber risks clearly and direct that communication to the business.

“With NIS2, cyber is further established as a board-level issue”

 

What is going on now?

While the NIS2 Directive is still being interpreted and transposed into national law by each EU member state, companies are already starting to prepare to align with the forthcoming regulation and ensure compliance when the deadline arrives.

In our experience engaging with clients, current NIS2 compliance efforts are not focused on groundbreaking technical enhancements, architectural changes or large investments. Rather, organizations focus on reviewing and aligning their existing policies and processes with the new requirements, integrating them into the broader cybersecurity management system and thereby laying the foundation for a risk-based, continuous improvement approach.

 

Aligning an Organization-Wide Business Continuity Framework

With the NIS2 Directive, there is a clear emphasis on resilience: the ability to withstand and recover from disruptions. This is also tied to the requirement for swift reporting to authorities.

While many organizations have processes in place for incident response, recovery, business continuity planning and communication strategies, our experience is that these are not always aligned. Processes and plans are often scattered across the organization, documented at varying levels of detail, and carry untested dependencies between departments. To build resilience, organizations can unify these elements under a cohesive business continuity framework.

A business continuity framework provides a structured approach to managing and coordinating all activities related to maintaining business operations during and after a disruption. By formalizing the processes, roles, and responsibilities related to resilience, organizations can increase resilience and deliver real business value while adhering to NIS2’s requirements. One key advantage of this approach is the ability to test plans, verify their effectiveness, and drive continuous improvement. We propose a set of key components for building an effective business continuity framework.

The sooner an organization starts to prepare for NIS2, the better. We are in contact with many organizations where the preparations are at full speed. If you wish to learn more about this, please don’t hesitate to contact us!

Get Access to More Insights from Opticos

Subscribe

Authors

William Hansson, Madeleine Tornard

William Hansson is a consultant with experience in IT and strategic management. He has a strong passion for leveraging new technology to create superior business value.
Madeleine Tornard is an experienced Project Manager and Information Security Officer. She often takes on the role of interim CISO at various companies. She is responsible for the Opticos Chief Data Officer (CDO) network, an invite-only forum for CxOs, and she leads the Information Security area within Opticos.

You might find this content relevant as well:


Data Fabric: A Wet Blanket, or A Resilient Bedspread Over Your Data?

Introduction

What if there was one place where you could go to get all the data you need? No matter what kind of data you are looking for you will find it in this place. It could be your company’s master data, transactional data, analytics data, IoT data or even a document or a video. Sure, you will need to authenticate yourself and have the rights to access the data, but if that’s the case, then you just need to look for it in one place. And best of all, you could rely on the data being presented in this place to be accurate. Like a bedspread wrapped over all your data assets giving shelter and at the same time providing a common interface to the surrounding world. Wouldn’t that be wonderful? Well, that is the main idea behind the concept of “Data Fabric”.

What is Data Fabric?

Data Fabric as a concept has been developing during the past years as an answer to the increasing challenge of getting full benefits out of enterprise-wide data where:

  • Companies are consuming IT services provided by multiple cloud platforms like AWS, Azure and GCP, and are increasingly using various SaaS offerings.
  • The amount of data generated by IoT, apps and social media is exploding.
  • More evolving business models depend on the ability to have easy access to data and share more data with end consumers.
  • End consumers evaluate products and services by the attached digital services as a key differentiator in their final selection.
  • Data regulations like GDPR, DPP and the AI Act focus on what specific data companies should share with regulating authorities and end consumers.
  • Many companies still struggle with data in silos and locked-in legacy systems.

The ideas behind Data Fabric have been in development for many years. It started with collecting data in data warehouses, data lakes and BI platforms, and was further developed by adding integration, security, data lineage and master data aspects. In the beginning, this was branded as enterprise information hubs or sometimes even as API platforms, but that wave was unfortunately driven by software vendors who had a hard time delivering on their promises. So the more general architectural pattern of Data Fabric was born, though the definition is somewhat blurry and depends on who you ask.

The figure below shows a conceptual architecture of the data fabric concept in relationship to other established concepts.

Fig 1: Conceptual architecture of the data fabric concept in relationship to other established concepts

For example, the exhibit below gives an interpretation of how different vendors in this area define Data Fabric:

Table 1: Interpretation of Data Fabric by different vendors

As evidenced by the diverse interpretations provided in the table above, gaining a comprehensive understanding of the concept of Data Fabric is not straightforward, as it involves multiple perspectives from various sources and vendors. Analyzing existing discussions surrounding the definition of a Data Fabric, however, reveals a consensus that it serves as a mechanism for generating data pipelines and integrations from diverse sources within a unified platform. Views diverge on whether a Data Fabric is an architecture or a broader concept encompassing various technologies and architectures.

Furthermore, when it comes to what constitutes a Data Fabric, some suggestions center on metadata and metadata analysis. In a metadata-driven Data Fabric, metadata is “activated”: it is pushed to users as they create pipelines, and new metadata is suggested when data arrives from external sources. The metadata is also enriched with semantics, adding meaning and context through knowledge graphs. By applying artificial intelligence and machine learning on top of these knowledge graphs, we arrive at the concept of active metadata, which many consider a key feature of a true Data Fabric architecture.

There are a few contrasting views on how Data Fabric is defined by market participants. Some call it a design concept that could be interpreted as an architecture, while others view the Data Fabric as a ‘solution’, thereby interpreting it as an instantiated architecture. Interestingly, most market participants agree that data management should be an integral part of the Data Fabric definition; one even goes as far as viewing Data Fabric as a data management approach, a broad view that leaves a lot of room for individual interpretation.

What challenges are the Data Fabric envisioned to solve?

One of the challenges in organizations is the diversity of sources and systems dealing with data. Data is generated at an increasing pace, driven by new technologies, regulations, and business needs. Growing data volumes and numbers of data sources make the landscape more complex, so finding the right data and understanding it in context becomes more difficult. Additionally, with different systems and user groups, it is not unusual for data to become siloed. Each system or user group has access to and understands the data within its respective silo, but its knowledge about data outside of the organizational unit is limited.

This may also lead to difficulties in harmonizing data and establishing consistent data categorization across the organization. Instances may arise where identical data objects exist in multiple locations but with varying identification formats or, worse, different taxonomies. A prime example is product or customer information, which may be scattered across numerous systems and often lacks consistency or may even be contradictory.

Another common problem is that the depth of the data architecture needed is underestimated. Large companies tend to strive to simplify the complexity of their own operations to be able to achieve a higher degree of freedom in their business processes. Consequently, processes are not adequately documented, and data is not captured and stored with the requisite granularity and quality. When these companies are faced with higher requirements on data quality by external parties like end-customers, regulating authorities or business partners, it could be a painful wake-up call. Attempting to implement a better order in the base data of a fully operating business is comparable to performing engine repairs on an airplane while it is airborne. Frequently, the solution involves implementing a new IT platform, such as a more robust ERP or master data platform. This enables the enhancement of data quality to coincide with the implementation of the new platform.

Yet another problem is simplifying data architecture work by putting it in the hands of large commercially available off-the-shelf platforms. To avoid tedious data management work, organizations rely on the data architecture presented to them by large solution platforms. The argument is that these platforms serve many kinds of customers and hence have probably already thought all data architecture aspects through. Later, it is not uncommon for these organizations to discover the hard way that the complexity of their own business does not fit into this architecture. Then costly adjustments are made to the solution platforms, data lakes are installed to bridge the gaps, and analytical tools are installed to try to understand the data. If these mistakes are repeated frequently across the organization, the end state will be characterized by disorder and disintegration. Data Fabric has been presented as the cure for these problems, but the question managers should ask themselves is:

Will a Data Fabric really solve the problems, or will it rather attempt to minimize the damage done?

Well, a Data Fabric focused on connecting all of a business’s data in one easily accessible platform will mostly amount to damage control. To increase data quality and reliability, it is not enough to just connect the data with its surroundings. This is where a structured approach to cleansing, normalizing, and analyzing the data on the fly can make a real difference. However, even if we could attain a higher level of data quality by incorporating capabilities for on-the-fly data quality improvement such as AI, active metadata, and machine learning, it is essential to remain skeptical about the reliability of the output. Data-driven decision-making assumes that the data is as accurate as possible. But would you entirely trust machine-generated data to make critical decisions?
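To make “cleansing and normalizing on the fly” more concrete, here is a minimal sketch in Python. The record layout, field names, and cleansing rules are our own invented illustration, not part of any vendor’s Data Fabric offering:

```python
import re

def normalize_record(raw: dict) -> dict:
    """Cleanse and normalize a raw customer record on the fly.

    Rules are illustrative: trim whitespace, unify casing,
    and strip non-digits from phone numbers.
    """
    return {
        "customer_id": str(raw.get("customer_id", "")).strip(),
        "name": " ".join(str(raw.get("name", "")).split()).title(),
        "country": str(raw.get("country", "")).strip().upper(),
        "phone": re.sub(r"\D", "", str(raw.get("phone", ""))),
    }

def deduplicate(records: list[dict]) -> list[dict]:
    """Keep one record per customer_id, filling empty fields from later copies."""
    merged: dict[str, dict] = {}
    for rec in map(normalize_record, records):
        key = rec["customer_id"]
        if key not in merged:
            merged[key] = rec
        else:
            for field_name, value in rec.items():
                if value and not merged[key][field_name]:
                    merged[key][field_name] = value
    return list(merged.values())

# The "same" customer held in two source systems in different formats
crm = {"customer_id": " C-1001 ", "name": "anna  svensson", "country": "se", "phone": ""}
erp = {"customer_id": "C-1001", "name": "Anna Svensson", "country": "SE", "phone": "+46 70-123 45 67"}

clean = deduplicate([crm, erp])
```

Even this trivial example shows the pattern: the consuming layer sees one harmonized record, while the source systems remain untouched.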

Given this rationale, it’s probable that Data Fabric architectures will initially see implementations in domains characterized by high-volume public data flows. In these contexts, data can be interpreted and acted upon with a lower risk of serious consequences in the event of faulty actions. However, as algorithms improve, Data Fabric architectures are likely to ascend in the value chain, eventually becoming a substantial data source in executive decision-making.

Another common problem is establishing and operating a working data governance organization. Many consider data governance a tedious and time-consuming task that often does not receive adequate attention. Furthermore, even when it is prioritized, it is frequently assigned to individuals with limited understanding of the potential business consequences of poor or inaccurate data. By implementing a sophisticated Data Fabric architecture, some of the problems generated by poor data governance could potentially be addressed through a self-healing active metadata architecture. While this may seem utopian at the moment, accomplishing it would yield significant potential.

What Data Fabric related technologies are there on the market?

When it comes to technologies related to Data Fabric, these can be classified into different categories:

  • To start with, there are metadata technologies, such as data catalogs. Providers of data catalogs, such as Atlan and Informatica, tend to view metadata as a central part of the Data Fabric. These companies also seem to have come the farthest in AI capabilities related to Data Fabrics.
  • Other companies like TIBCO, IBM and Microsoft tend to view data pipeline platforms or data integration platforms as central to Data Fabrics.
  • Companies like Denodo and Cloudera which have a legacy in data virtualization over several cloud platforms provide offerings that tackle the issue by handling data across multi-cloud environments.

Not surprisingly, the different vendors tend to fold the Data Fabric concept into their own domains as much as possible. In the table below we highlight some of the vendors in this domain and their current offerings in the Data Fabric area.

Table 2: Offerings of different vendors in Data Fabric area

How, and to what extent, do these technologies address the challenges above?

Depending on how we view Data Fabrics, and what we need to get out of them, we could look at the following three components:

  • A data discovery part, which in essence means a metadata model which can push information to the user.
  • A data pipelining/integration part, where data is pulled from the source system and prepared and served to the user that needs the data.
  • An active metadata part, where the organization might cater and compensate for some of the worst data quality issues in the underlying data by actively and virtually creating the metadata from inbound data, instead of matching inbound data against static metadata in the background.
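As a toy illustration of how these three parts could fit together, consider the sketch below. All names and structures are hypothetical, and a real Data Fabric platform would be far more sophisticated; the point is only to show discovery driven by a metadata catalog, a pipeline serving data, and a crude stand-in for the “active” registration of metadata from unseen sources:

```python
# A toy metadata catalog: dataset name -> descriptive metadata
catalog = {
    "sales_orders": {"owner": "ERP", "fields": ["order_id", "amount"], "tags": ["transactional"]},
    "web_clicks": {"owner": "Analytics", "fields": ["session", "url"], "tags": ["behavioral"]},
}

def discover(keyword: str) -> list[str]:
    """Data discovery: push matching datasets to the user based on metadata."""
    keyword = keyword.lower()
    return [
        name for name, meta in catalog.items()
        if keyword in name or any(keyword in tag for tag in meta["tags"])
    ]

def run_pipeline(name: str, rows: list[dict]) -> list[dict]:
    """Pipelining: serve data; if the source is unknown, derive and register
    its metadata from the inbound rows (a crude stand-in for active metadata)."""
    if name not in catalog:
        fields = sorted({key for row in rows for key in row})
        catalog[name] = {"owner": "unknown", "fields": fields, "tags": ["auto-registered"]}
    return rows

# A previously unseen IoT source registers itself on first use
run_pipeline("iot_readings", [{"sensor": "t1", "value": 21.5}])
```

After the pipeline call, `discover` would also surface the new `iot_readings` dataset, which is the essence of metadata being created from inbound data rather than curated up front.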

However, merely focusing on data discovery and automation for data pipelines would only address part of the issue as outlined earlier. The common perspective is that a Data Fabric platform should encompass data discovery, data catalog, and hybrid integration tools to effectively tackle these challenges.

Some vendors offer AI capabilities to achieve the active metadata setup and argue that without this, a platform cannot really be considered a Data Fabric. The main argument is that if we know which data assets exist and what level of quality they hold, we can address data quality issues in real time, using AI and machine learning to provide a shield of automated data improvements that results in better quality than the underlying sources. Nevertheless, to make decisions based on this aggregated data, you must have a high level of trust in your algorithms.

Working with metadata has traditionally been a time-consuming and relatively static process, involving the analysis of required attributes and characteristics of data objects. However, in the future, advancements in technology could enable machines to interpret metadata dynamically, analyzing input data and utilizing AI algorithms to meta-tag data based on similarities with previous instances. This would undoubtedly represent a paradigm shift in data interpretation, capture, and analytics, unlocking the potential for advanced data handling in a fraction of the time compared to current methods.

Some providers are also taking this one step further and are integrating AI tools to create a spoken natural language interface to the data discovery module making the data even more accessible.

 

What do we see as challenges with Data Fabric?

Implementing a seamless data layer on top of a large and scattered data landscape brings its fair share of challenges. It is likely that organizations with complex IT/data landscapes will be those with the most to gain from investing in a Data Fabric architecture. On the other hand, the implementation of such a fabric project could be an arduous and time-consuming process, increasing the possibility of failure.

While legacy issues such as data accessibility and presenting the data in a consumer-friendly format remain, modern hybrid platforms have the tools built-in to address them.

If a company has successfully implemented technologies for data access and publication, the primary challenge moving forward will be ensuring the trustworthiness of the outcomes. It is tempting to see an enterprise-wide Data Fabric as a cure for a poor underlying structure. In reality, what you achieve by implementing a Data Fabric on top of this mess is a centralized mess in which the underlying bad data is easy to access. The poor data quality becomes more visible rather than staying confined to a backbone legacy environment, and business decisions are then likely to be taken based on incorrect data!

So, the real concern will be around data quality. The most effective approach to addressing this concern is by improving data quality at the source through processes such as data cleansing and enrichment. Further, an understanding of metadata and data structures will be required. AI can assist in automating manual and time-consuming tasks, not only in tidying up data sources but also during real-time data consumption.

What are our conclusions and recommendations?

To summarize, we feel the following observations are noteworthy:

  • Consider Data Fabric to be an architecture rather than a tool.
  • Currently, we do not observe any vendor in the market offering a comprehensive tool or platform for building a Data Fabric architecture. However, several vendors provide platforms and tools that could serve as essential components of your overall Data Fabric architecture.
  • Investing in a Data Fabric architecture must be seen as a research proposition where parts of the investment can be attributed to building knowledge and gaining experience. Prepare for the need to replace components in your architecture later, as the technologies mature.

Data Fabric architecture will find its first successful and cost-saving implementations in organizations with a high volume of data exchange with the external world and where the quality level of the data exchange is not critical. For example, the distribution of reviews on hotels, products and restaurants to many different sites and platforms.


Authors

Hans Bergström, Mattias Gustrin & Eric Wallhoff

Hans Bergström is an experienced Strategic Advisor and Partner at Opticos. He is also heading the Opticos Service Offering on Enterprise Architecture & Technology.
Mattias Gustrin is a Senior Advisor within Data Management, Advanced Analytics, and technology strategies across multiple industries.
Eric Wallhoff is a Senior Consultant with experience in Data Management and Data Analytics. He is also part of the Strategic IT capability.

 


Value Realization in a Modern IT Organization

Introduction

In today’s rapidly evolving digital landscape, IT and technology have become pivotal to business success and as a result, navigating IT investments has become a critical challenge for modern organizations. Aligning these investments with business strategy has become not just a necessity but a strategic imperative to ensure that each investment not only supports but also drives strategic objectives.

Furthermore, continuous improvement and operational excellence initiatives driven within the business or IT organization are often expected to drive out cost, releasing funds that can be used for transformation and change rather than operations.

In this article, we present an introductory overview of the Opticos model, designed to address these key challenges. The model assists organizations in adopting a value governance approach for IT investments, supporting the reader in connecting tangible business values to IT investments and ensuring a clear link between technology initiatives and business strategy. This alignment is crucial for realizing the full potential of IT in driving business success.

We’ll explore how the Opticos model facilitates this alignment, transforming IT investments into strategic assets that propel organizational growth and operational excellence.

The Opticos Model

The Opticos model is an evolution from common concepts addressing areas such as value governance, portfolio and investment management, and business impact value tracking, in combination with customer assignment experiences.

Opticos has built its service offering and advisory around the three-step approach and the four key dimensions. The three-step approach forms the foundation of the model: done right, it ensures that IT investments within an organization always support the strategic objectives of the business and thereby secure the realization of business value. The four key dimensions are interconnected and essential to driving business value by identifying and prioritizing the IT investments that best support the business based on its priorities. Both concepts are briefly introduced below.

The three-step approach

The three-step approach can be used regardless of operating models or organizational design. However, it is important to keep in mind that to navigate through these steps, different tools need to be used to successfully implement the model, and that the tools are chosen with regard to organizational procedures and existing governance.

What the three-step approach offers is a structural approach to aligning IT investments with business strategy and ensuring that each investment has been approved and that it is providing the value that was intended by value tracking through the entirety of an investment’s lifecycle.

1. Business strategy change reflects that each organization’s business strategy normally contains elements where change initiatives have been identified as required for the strategy to materialize and be achieved.

2. Strategic objectives define the requirements for executing the identified changes into a select number of objectives that can be further detailed and prioritized. These objectives can put focus on any function or area within the organization such as HR, Sales, Supply, R&D, ESG, etc.

3. Business capability mapping breaks down the strategic objectives from a business capability perspective and, by doing so, also provides the foundation for the IT needs required to secure the business capabilities at an enterprise level.

4. Investment domains define the specific IT requirements at a platform or solution level, closely aligning with the conceptual mapping of business processes. This approach ensures that the IT investments support the operational and strategic needs of the business.

5. Investment management is where the investment domains within IT have been converted into specific IT solutions, platforms, domains or transformation activities that will support the strategic objectives to materialize and subsequently enable the business strategy to be fulfilled.

At this stage, identified change initiatives have been converted into programs or projects that need to be priced in order to be prioritized. To facilitate this, an initial business case is typically required. This business case should originate from the originally envisioned business benefits and expected value outcomes.

6. Value tracking is needed during the entire lifecycle of each investment in order to ensure that the expected values identified and showcased in the approved business cases are realized. For this, clear ownership and metrics are key.
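As a minimal illustration of what value tracking could look like in practice, the sketch below links an investment to a strategic objective and compares realized values against the business case. The structure and figures are hypothetical examples, not a prescribed part of the Opticos model:

```python
from dataclasses import dataclass, field

@dataclass
class Investment:
    """An IT investment linked to a strategic objective, with its
    business-case target and the values realized over its lifecycle."""
    name: str
    objective: str            # the strategic objective it supports
    expected_value: float     # from the approved business case, e.g. MEUR
    realized: list[float] = field(default_factory=list)

    def record(self, value: float) -> None:
        """Record a realized value, e.g. after a yearly follow-up."""
        self.realized.append(value)

    def realization_ratio(self) -> float:
        """Share of the business-case value realized so far."""
        return sum(self.realized) / self.expected_value

# Hypothetical example: an automation program expected to free up
# capacity worth 2.0 MEUR, tracked yearly after go-live
inv = Investment("Invoice automation", "Operational excellence", expected_value=2.0)
inv.record(0.6)   # year 1
inv.record(0.9)   # year 2
```

The point of such a structure is that ownership and metrics survive the project phase: the realization ratio keeps being updated during operations, not only at go-live.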

Four dimensions

The four dimensions can be used to manage IT-driven change where IT is leveraged or required to fulfill the business strategy and the expected values. The tool can be used as an end-to-end process to ensure value realization, as well as a means to transform the current IT organization, focusing on operational efficiency and the release of IT funding that can be used for transformation instead of operations.

DEFINE – The desired outcome of the business strategy provides guidance for how funding should be allocated to ensure that the expected business values can be realized. To ensure the right focus, the three-step approach can be used to convert the business strategy into tangible strategic objectives and business capabilities that are easily mapped to enablement and requirements within IT.

TRANSFORM – Focuses on achieving operational excellence and cost optimization within both the business and the IT organization. Traditionally, standardization, consolidation, and sunsetting of the current IT landscape contribute to releasing funding by lowering operational costs; this funding can then be used for change initiatives requested by the business.

INVEST – This dimension is the culmination of DEFINE and TRANSFORM, where available funding is distributed to IT investments that contribute to fulfilling the business strategy and the strategic objectives set by the business. By doing so, IT investments will only be realized where tangible values have been identified that can be measured and tracked in a cohesive way.

TRACK – Focuses on accountability and responsibility for the investment, business case ownership, and tracking value realization over the entire lifecycle of the investment. It is not uncommon for organizations to focus on the project phase and take control over project execution, capitalization, and cash flow, but lose themselves in the actual operational phase of the outcome from the investment in question.

It is important, for instance, to reflect on questions such as: “Did we manage to automate the intended processes so that x number of FTEs could be used elsewhere? Did we manage to decrease our storage footprint by consolidating and lifting local storage to the cloud and applying active management of the solution?” Securing the full potential of value realization is only possible by adopting a lifecycle perspective.

Putting it together

Jointly, the three-step approach and the four dimensions offer a powerful toolset that can be used to align IT investments with the business strategy where transformation requirements are driven or leveraged through IT and new technologies. While the three-step approach addresses how the business strategy in a structured way manifests itself in IT investments, measured and prioritized by business value, the four dimensions put the realization of the business strategy in an IT strategic and operational perspective.

Given the output from the three-step approach, how should IT (and the business) distribute its funding given current operational requirements and future transformation expectations? How should the IT organization balance funding of its own service portfolio against the portfolio of investments, e.g. portfolio, program, and project management, and how should the expected and realized values be quantified, prioritized, and tracked? With the Opticos model, users can choose to apply the whole model or to cherry-pick relevant or prioritized parts. The model can be used as a toolbox of governance activities and metrics, as well as a means to build understanding and collaboration between IT and the business.

Summary

Organizations of today need to ensure that IT investments are made based on what business values they will contribute to realizing. Strategic objectives of the business strategy need to be linked to business capabilities, processes and supporting IT solutions. With limited funding, only investments in IT that contribute to fulfilling strategic objectives should be approved and prioritized. To raise and enable funding, many organizations need to optimize and streamline their current business and IT operations, often through digital transformation or automation, consolidation, and modernization.

Regardless of where you are in your business transformation journey, whether scaling and leveraging new technologies and IT to become more competitive, more innovative, or more agile, Opticos can help you ensure that your IT investment portfolio is governed, tracked, and aligned with the long-term strategic objectives of your business.

 


Authors

Mario Gonzalez Muñoz & Teodor Danielsson

Mario Gonzalez Muñoz is a Senior Consultant with experience in Information Security, IT governance, and IT sourcing. He is also part of the Strategic IT capability.
Teodor Danielsson is a seasoned business leader and Partner at Opticos. He is also heading the Strategic IT capability.


Implementing and Maintaining a Global ERP Solution

In today’s business environment, even traditional mid-market organizations have become global enterprises. More and more companies are implementing a global ERP to drive optimization and integrate data and information across multiple countries into connected applications. Developing and maintaining a global ERP solution comes with its own set of challenges – but with the right ERP strategy, companies can navigate through the barriers to meet their objectives.

For any organization, the typical goals of a global ERP implementation are as follows:

  • Standardized business processes on a global level
  • Improved data quality and consolidated reporting
  • Simplification of the IT landscape and reduction in operating costs

Before embarking on the ERP implementation journey, organizations need to have a clear vision of where they are and where they want to be. It is a huge transformation effort and involves significant changes to business processes, technologies, and culture. The success of the implementation does not solely depend on the technical aspects. It also requires detailed attention towards organizational change management.

The ERP universe is ever-expanding, but ERPs can only deliver results when the fundamentals are in place. An executable strategy can maximize your return on investment and set your organization up for long-term success.

 

ERP Strategy

A pre-study or a conceptual phase is often carried out to prepare for a large-scale ERP implementation exercise. In our experience, a conceptual phase can help in defining the scope, goals and expected outcome that can be communicated to all the internal and external stakeholders. Some outcomes are as follows:

  • Clear vision and blueprint of business needs, alignment of all stakeholders
  • Final scope, time plan and roadmap for the implementation exercise
  • Program Methodology, Governance and Team Structure
  • Sourcing Principles and Strategy
  • Data Strategy and Key Design Decisions

 

1. Sourcing Strategy

ERP programs are often very large investments for organizations, are associated with operational and financial risk, and are drivers of major change. That is why it is essential to form a solid strategy for sourcing an ERP implementation partner (or partners) early on. There are many factors to consider, for instance, make-versus-buy strategy, pricing model, sourcing objects, and supplier capacity and capabilities, in order to engage the right partner(s) in a timely manner and to help deliver on the transformation objectives.

These areas are generally part of an ERP Implementation Partner Sourcing Strategy:

  • Strategic Drivers or Business Value
  • Sourcing Direction (Sourcing Objects, Supplier Selection, Make or Buy decision)
  • Application Maintenance and Support Services
  • Program Governance
  • Commercial Engagement Model

To learn more, you can read our article here.

 

2. Data and Reporting Strategy

Today, many organizations are flooded with data and are looking for better ways to develop real-time reports and insights. Technologies are often implemented without yielding the envisioned results and become overly complex to maintain. Why? To put it simply, it is the lack of an overall data and reporting strategy to integrate, synchronize, and govern the data. What’s more, while companies are inundated with data, much of that data sits in disparate, non-integrated systems. Master data can be all over the map in global organizations with layers of different ERP systems. For multinational enterprises, this issue is further amplified by the number of subsidiaries.

A robust data and reporting strategy – focused on clean, consistent, and well-defined data – can help companies drive growth and overall competitiveness. To build a successful ERP data and reporting strategy, companies must start by accounting for the different stakeholders across the organization.

Fig 1: Data & Reporting requirements of Stakeholders involved

Program Methodology

ERP implementations are often multi-year initiatives that involve significant capital investments, long-term planning, and business process re-engineering. The implementation approach you select for your company must fit your company’s complexity and culture to maximize the chances of a successful implementation.

Many companies try to derive the best value from different methodologies to identify the best fit for their organization. For instance, a waterfall approach can provide good structure and milestones, which can then be combined with agile work within each phase to iteratively create the phase deliverables. It is always good practice to maintain an MVP mindset and to get user feedback as early and as frequently as possible.

  • Fit To Standard:

Organizations now view ERP as the backbone of their operations. ‘Fit to Standard’ is hardly a new concept, but there is renewed focus on it as companies strive to reduce cost of ownership, implement best practices, and create a clean core solution. A stable core is essential, as it is relatively easy to add niche capabilities on top.

A business change impact assessment, carried out while building the strategy, can help organizations study existing processes and identify what they can or cannot live without. Only the truly essential processes should be customized, if at all. This gives organizations better control over their IT landscape and makes way for easier governance.

  • Global Template:

When you finally decide to implement an ERP solution, there comes a point when you need to determine whether deploying a global ERP template across multiple sites, divisions, and regions is worth it. The answer to this question can drive the overall design, development, and deployment decisions, along with the long-term support services going forward.

International organizations often have non-standardized business processes, and deciding to implement a global ERP system across an enterprise is not easy. For one, the decision to globalize or localize isn’t black and white. While standardization allows an organization to scale for growth by consolidating business processes and driving operational efficiency, organizations also need the flexibility to serve diverse customers, employees, economies, and regulatory bodies. Striking the right balance between standardization and flexibility thus becomes essential. To learn more, you can read our article here.

Figure 2: Risk for Global Template Rollout

Program Management and Governance

Any ERP implementation is a complex and costly undertaking that requires careful planning, execution, and monitoring. Project management and governance are thus key factors that can determine the success or failure of an ERP implementation. With global implementations, the complexity for the program management teams is higher, as several projects run in parallel across different regions. Additionally, several implementation partners may be working simultaneously, leading to more participants and a need for even better coordination. Organizations thus need a set of governing principles that help them define the PMO teams and structure.

A key aspect of ERP project management and governance is monitoring and controlling the progress and performance of the project. Organizations need to use appropriate tools and techniques, such as dashboards, reports, meetings, and feedback, to do so. The PMO needs to track and measure the project’s key performance indicators (KPIs) and milestones, compare them with the baseline and expectations, and communicate the results to all stakeholders, internal and external.

Another critical aspect of ERP project management and governance is to ensure the quality and testing of the ERP system. You need to verify and validate that the system meets the functional and non-functional requirements and complies with the standards and regulations. It is important to involve the end-users, business owners, and vendors in the testing process, and document and resolve any defects or issues.

 

Change Management is Critical

Human factors like organizational behaviour and training are often ignored but are just as crucial to a successful ERP implementation as addressing technological requirements. Local entities may be extremely resistant to globalized processes. They need to understand the reasons for transformation and how it will personally benefit them.

It’s important to create a communication plan that addresses how, when, and where you’ll communicate key updates. This includes adjusting your messages as required to accommodate different languages, cultures, and time zones. Communication is just one part of organizational change management.

A plan at the local and global levels is needed to ensure your organization is prepared for a global ERP rollout. Without properly training employees across headquarters and local establishments, it is nearly impossible for a global ERP rollout to occur smoothly.

Throughout the ERP programme, change management needs to ensure that organizations, leadership, and employees are:

  • Clear on how the change will impact the existing ways of working
  • Aware of why the change is needed and what are the benefits of change
  • Ready, willing, and able to implement change

 

Application Management Services

Creating or implementing an ERP is not the end: organizations need specific expertise, robust operating methods, and tools that seamlessly enable work across a global team environment. Decisions around Application Management Services (AMS) can have a profound impact on the total cost of ownership, user satisfaction, solution agility, and ultimately the success or failure of the ERP solution.

Organizations should prioritize AMS while developing the ERP sourcing strategy and evaluate the benefits of the different available options: developing an in-house team, using an external supplier, or a combination of the two.

 

Guiding you through your Organizational Change Journey

A global ERP implementation is not just a technical IT project but a huge change effort. It is an organizational journey leading to new ways of working. Organizations therefore need to create a business change program and mobilize a team to focus on communication and change strategy.

While you may want to rush into it to maintain a competitive advantage or keep up with shifts in customer demand, it is important to ensure your business is prepared for this endeavour. The key is to create a balance between central and local requirements, prioritize communication, and strategize the rollout.

Drawing from our extensive client experience and proven methodologies, we can guide your organizational journey from strategy to implementation and support.

Abhishek Kale & Johan Saks

Abhishek Kale is a Senior Consultant with experience in Process Excellence, Digital Transformation, and ERP Consulting across multiple business sectors.
Johan Saks is a seasoned leader and a Partner at Opticos. He specialises in guiding clients through the challenges of digital transformation, strategic planning, and performance enhancement.


news

Setting AI on Your Company’s Agenda

What will your company’s business look like in a world influenced by AI? Do you consider the technological shifts imposed by AI transformative or disruptive for your industry and business? This article provides a general framework and some practical recommendations on how to get started with mapping the technological impact of AI and begin crafting your AI journey.

How does your company address the technological shift imposed by AI?

Perhaps, you see AI as a subset of the advanced analytics domain and have a team of highly skilled professionals working on developing algorithms for business needs. Maybe, your vision of AI’s role at your company is similar to autopilot systems in airplanes which manage a slew of routine tasks so pilots can focus on essential decision-making. Perhaps, you have just started conversations on AI topics within your technical or business teams. Regardless of where you are in your AI journey, this is the right time to employ a structured approach to managing the impact of AI technology at your company. While integration of AI into your business processes may seem complex, there are a few simple things you can do to get started.

AI is an umbrella term for several technologies, methods, and their combinations that in one way or another, attempt to replicate aspects of human intelligence. One of the technologies is Machine Learning (ML), which is primarily useful for data analytics and predictive modeling. Companies use it to forecast sales, manage inventory, and detect fraud. Another common AI method is Natural Language Processing (NLP), which powers chatbots and customer service automation. We also see AI in the form of robotics that streamline manufacturing processes. Recent advancements in the field of generative AI such as ChatGPT and Midjourney have made it possible for everyone to see AI in a new, content creator role.

One of the first actions you might take in the beginning of your AI journey is to initiate a process for formulating the organization-wide AI strategy. It will enable you to think about your goals with AI and create a high-level action plan for achieving these goals. A good strategy will build a common understanding of what the main principles for the expansion of AI at your organization are and, importantly, what you have chosen not to do. The whole purpose of AI strategy should be to provide a formula to align all AI projects and activities with business capabilities and value creation in the organization.

Here are some recommendations for how to make your strategy more actionable:

1. Get yourself acquainted with AI

Understand what it is capable of and assess its relevance for your business in the short and long term. You could allocate a team within your innovation department or a cross-functional team, engage vendors to provide ideas, or organize an ideation workshop, such as a hackathon with experts, to help you map your ideas to available technologies. The choice you make will depend on your organization’s size, industry, culture, availability of resources and skills, budget, and other factors.

2. Set a time plan for your AI initiatives

Define the time horizon for your strategic goals. Do not look too far into the future. As there is little evidence that AI will outperform humans in all aspects of intelligence in the next ten years, you do not need to focus too much on what the world will look like then. Instead, put your efforts into exploring how AI can improve your business, and consider the following phases.

Figure 1: Example of an AI roadmap

3. Focus on areas that can make an immediate impact on your business

For example, you could consider the impact of AI technology in the following areas:

3.1. Improving operational efficiency and reducing costs: AI as a productivity tool

Operational efficiency is about the means of reaching business goals and improving productivity. AI will not replace humans, but humans not using AI in their work will be replaced by humans using AI. The choice of use cases will depend on the type of tasks performed by your employees. If customer service is an important part of your business, then AI-powered automation offers tremendous efficiency gains at relatively low risk. If you develop software internally, consider using an AI-powered programming co-pilot to accelerate the delivery of features and bug fixes. Test your AI innovation ideas here first. Showcase the success stories and lessons learned to inform decisions on upcoming AI projects. For example, Morgan Stanley Wealth Management deployed generative AI to power an internal chatbot that streamlines access to their extensive knowledge base, providing advisors with instant access to expert-level knowledge.

The main idea of AI as a productivity tool is to do the old thing with a new tool. So, look for tasks that you can outsource to AI.

3.2. Launch new business lines and products: AI and business development

Your company’s competitive strategy was formulated in an environment with certain industry opportunities and threats, as well as broader societal expectations. Changes in this environment, driven by AI advancements, could be so substantial that you will need to revisit your offering. For example, a clothing brand could utilize a personalized recommendation engine together with an AI-powered chatbot to deliver styling advice to its customers. If your company offers knowledge-intensive products and services, such as risk advisory, you might find a business case in leveraging AI to deliver analytics in real time instead of static reports.

Use the knowledge obtained from your first AI projects where AI was used as a productivity tool. Now it is time to do the new thing with a new tool. What new customer needs can be met by AI-powered products and services? How should you combine human-centric and AI elements to stay competitive, attract talent and improve your position on the market?

3.3. Become familiar with AI risks: create a foundation for AI governance

AI is already regulated in several geographies, and more extensive regulations, such as the European AI Act, are on the way. In addition to regulations, there are ethical considerations that can serve as guidance for AI governance. If you are at the beginning of your AI journey, make it a responsible AI journey from the start. The guardrails will pave the way for trust in AI, address common fears, and encourage the use of AI in your business.

Figure 2: One of the basic purposes of AI governance is to enable responsible AI

The three steps above provide you with a good ground to stand on when formulating and executing your initial AI strategy.

Final notes

AI awareness program

An AI awareness program is a good way to take the lead in the AI area and create common standards. If AI will have an impact on organizational culture, an awareness program is a way to soft-start this shift.

Cross-functional cooperation to enable AI

Successful integration and use of AI in your business is not the achievement of a single function, but rather the result of the collective efforts of several functions, including so-called AI enablers such as Digital, Data, Ethics, Legal, and business process owners.

Democratization of AI

There are several signs of AI becoming an integral part of our work and existence. If democratization of AI is what the future holds, it might be relevant to prepare your organization for new processes involving decentralized decision making, increased role of citizen data scientists and relevant data governance practices.

At Opticos, we enable organizations to leverage the business benefits of new emerging technologies. Drawing from our extensive client experience and methodology in business strategy, change management, data management and AI governance, we’re here to support you in your AI journey from strategy to implementation.

Tatiana Schön & Johan Valeur

Tatiana Schön is a management consultant with a focus on Strategy, Business Analysis, Data & Analytics, and IT management.
Johan Valeur is a management consultant with a focus on Innovation, Artificial Intelligence, and entrepreneurial business development.


news

Information Security: How Secure is Secure Enough?

All organizations need to manage information security carefully to survive in an era of complex threats and repeated large-scale security breaches affecting all types of organizations, for instance, T-Mobile, The Swedish Transport Agency, and LinkedIn. But how does your organization know how secure is secure enough? How much should you spend on security?

In an increasingly digitalized business world and with a sharp increase in cybersecurity spending across the board, ensuring confidentiality, integrity, and availability for your organization’s information and IT systems is, in many cases, becoming a major issue not only for CIOs and CISOs but for the entire C-suite. It is a complex challenge to strike the correct balance between, on the one hand, security and, on the other, sound investments and resource allocation.

Investments and initiatives within cybersecurity and information security should be proactive, ensuring the ability to respond quickly while optimizing security spending. By introducing different types of security measures, we decrease our exposure and avoid disturbances, and even severe incidents and attacks with tangible financial impact. These measures could, for example, include security audits, vulnerability assessments, employee training, and regular software patching.

While these investments are crucial for minimizing risk, they all have a cost associated with them. With the inherent proactive nature of these types of security investments, determining how much to invest is particularly difficult, as the benefits or Return on Investment (ROI) may be challenging to track and quantify with traditional monetary metrics, as you would, for example, for advertisement spend.

This article explores the dilemma of security investment, highlighting the need for a shift to a risk-based mindset on all levels and outcome-based performance metrics on proactive security measures to enable prioritization of initiatives and proactive resource allocation.

Risk-based mindset

Deciding how much to invest in security will often begin with a fundamental discussion of “How much should we spend?” or simply become the result of how much is left in the budget. However, how does an organization ensure this is the right or appropriate amount to invest? In security, there is always something more that could be done, one more measure that could be taken to avoid incidents or attacks.

No one can fully secure themselves from every threat, so to make a conscious decision, every organization that is not limited by regulatory constraints should instead focus on the question, “How secure can we be? What risks can we accept?” With this mindset, an organization will be equipped with a straightforward way to communicate and discuss what an initiative implies for the risk level accepted and, additionally, what cost is associated with reaching a certain level of protection.
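To make the question “What risks can we accept?” concrete, one common way to operationalize it is to score each risk as likelihood times impact and compare the score against an accepted threshold. The sketch below illustrates this; the risk entries, scores, and appetite threshold are all hypothetical examples, not figures from this article.

```python
# Illustrative risk-register sketch. All names, likelihoods, impacts, and the
# appetite threshold are hypothetical assumptions for demonstration only.

risks = [
    {"name": "Phishing leading to credential theft", "likelihood": 0.6, "impact": 8},
    {"name": "Ransomware on file servers",           "likelihood": 0.3, "impact": 10},
    {"name": "Laptop theft",                         "likelihood": 0.2, "impact": 3},
]

RISK_APPETITE = 2.0  # maximum risk score the organization is willing to accept

def risk_score(risk):
    """Expected impact: probability of occurrence times business impact."""
    return risk["likelihood"] * risk["impact"]

# Risks above the appetite call for a mitigating investment; the rest are
# consciously accepted, which is itself a documented decision.
needs_mitigation = [r["name"] for r in risks if risk_score(r) > RISK_APPETITE]
accepted = [r["name"] for r in risks if risk_score(r) <= RISK_APPETITE]
```

Framing decisions this way gives the organization a shared vocabulary: each proposed security measure can be discussed in terms of which risks it moves below the accepted threshold, and at what cost.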

Outcome-based performance metrics on proactive security measures

While a risk-based mindset is important and a prerequisite, it is not enough to make informed decisions. Organizations need hard metrics that can be assessed and compared, and the size of an investment is not an indicator of IT protection. The challenge lies in the fact that not all investments of the same size result in the same levels of protection. This discrepancy stems from factors such as the effectiveness of the chosen security solutions, the organization’s unique threat landscape, and its overall security level.

To bridge this gap, organizations should use outcome-based performance metrics for evaluating security investments and measures. Instead of using the size of investments as an indicator of security, organizations should use metrics that focus on measurable protection level outcomes. These metrics provide a better overview of the actual impact of security initiatives and investments. By analyzing how these metrics change with each proposed investment or configuration, organizations can objectively discuss and determine acceptable levels of security and balance that with the associated cost.

Examples of performance metrics

Time to resolve

Definition: Time to resolve is the average duration it takes to resolve a cybersecurity incident or breach from when it is detected or reported.

Purpose: This metric evaluates the organization’s ability to not only respond quickly but also to effectively contain and eliminate the threat. A shorter time to resolve implies a more effective incident handling process.

Average patching frequency

Definition: How often an organization applies security patches and updates to its systems and software on average in terms of a time interval, such as days or weeks.

Purpose: The metric reflects the organization’s commitment to keeping its software and systems up-to-date with the latest security patches. A higher patching frequency indicates a proactive approach to addressing known vulnerabilities and reducing the attack surface.
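Both example metrics are straightforward to compute once incident and patch records are collected. The sketch below shows one possible calculation; the data structures, field names, and dates are hypothetical assumptions, not a prescribed format.

```python
# Illustrative sketch: computing the two example metrics from simple logs.
# The records and field names below are hypothetical.
from datetime import datetime

incidents = [
    {"detected": datetime(2024, 1, 3, 9, 0),  "resolved": datetime(2024, 1, 3, 15, 0)},
    {"detected": datetime(2024, 2, 10, 8, 0), "resolved": datetime(2024, 2, 11, 8, 0)},
]

patch_dates = [datetime(2024, 1, 1), datetime(2024, 1, 15), datetime(2024, 2, 5)]

def mean_time_to_resolve_hours(incidents):
    """Average duration from detection to resolution, in hours."""
    total_seconds = sum((i["resolved"] - i["detected"]).total_seconds() for i in incidents)
    return total_seconds / len(incidents) / 3600

def average_patching_interval_days(dates):
    """Average number of days between consecutive patch rollouts."""
    dates = sorted(dates)
    gaps = [(later - earlier).days for earlier, later in zip(dates, dates[1:])]
    return sum(gaps) / len(gaps)
```

Tracked over time, these values form the historical baseline referred to in the approach below, against which a proposed investment’s effect can be compared.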

Approach

For an organization to assess the result of an outcome-based metric and use it in a decision, the metric must be adapted and analyzed in relation to the specific organization. To construct and adapt a metric, an organization can follow these steps:

1. Baseline Assessment

Clearly define the metrics important to the organization and conduct a baseline assessment based on historical data. This data will serve as a benchmark against which the potential impact of an investment can be measured.

2. Estimating the Change

Construct an estimate of the potential positive effect that could be achieved if the investment, initiative, or configuration change were made. This could, for instance, involve using the supplier’s other customers as benchmarks to assess what benefit could be achieved.

3. Evaluation and Informed Decision-Making

Evaluate the Return on Investment (ROI) of the proposed security investment by comparing the projected protection level achieved against the organization’s risk appetite, expressed in hard metrics, and against the investment cost.

It is important to note that one investment in a security measure may have a potential positive impact on more than one security metric.
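The three steps above can be illustrated with a small worked example. The sketch below uses purely hypothetical numbers to compare a measured baseline (step 1) with an estimated post-investment value (step 2) against a target level and a cost (step 3); none of the figures come from this article.

```python
# Illustrative decision sketch with hypothetical numbers.

baseline_ttr_hours = 48.0    # step 1: baseline, measured from historical data
projected_ttr_hours = 12.0   # step 2: estimate, e.g. from supplier benchmarks
target_ttr_hours = 24.0      # risk appetite expressed as a hard metric
investment_cost = 250_000    # annual cost of the proposed measure

# Step 3: evaluation in comparable, fact-based terms.
improvement_hours = baseline_ttr_hours - projected_ttr_hours
meets_target = projected_ttr_hours <= target_ttr_hours
cost_per_hour_improved = investment_cost / improvement_hours
```

The output is a decision basis rather than a verdict: does the investment bring the metric within the accepted level, and at what cost per unit of improvement? The same structure can be reused for any of the metrics discussed, which also makes competing investments comparable.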

Example: Illustrating the Concept with Time to Respond (TTR)

A powerful example of an outcome-based security performance metric is the “Time to Respond” (TTR). TTR measures the speed at which an organization can detect and mitigate a cyber threat once it is identified. Organizations with shorter TTRs are better equipped to minimize the potential damage of a breach. By utilizing TTR as a metric, organizations can quantify the effectiveness of their incident response strategies and identify areas for improvement.

This concrete data allows for informed decision-making that goes beyond abstract risk assessments. The metric data can also be leveraged when considering an investment that could potentially reduce the TTR significantly but would imply a substantial cost. Comparing these aspects can give organizations a powerful tool to make informed and fact-based decisions.

Optimizing Budget for Maximum Protection

Consider a scenario where two different organizations allocate the same budget to cybersecurity. While this might seem like an equal investment, the actual protection level could significantly differ based on the configuration of security resources and initiatives. Outcome-based metrics as discussed provide a framework for organizations to optimize the security budget according to their risk appetite and support the prioritization of resources that is inevitable. By identifying the most effective security measures for their unique threat landscape, organizations can maximize their protection levels within the same budget constraints.

Conclusion

All organizations, and the roles responsible for cybersecurity, should establish a risk-based mindset when assessing investments; being secure enough requires a focus on outcomes. With a risk-based mindset in the process of deciding how much to invest in cybersecurity and information security, the organization is well equipped to hold constructive discussions on a common basis. In addition, by using outcome-based metrics, decisions will be further supported by facts that are comparable both internally and externally.

With this, management and those responsible for cybersecurity can make sound decisions on how much to invest, while also understanding what a decision not to invest entails for the level of threat. Ultimately, this ensures that investments have an advantageous impact on the company’s level of protection.

Where are you in your path regarding strategies for information security and cybersecurity? We offer support in your specific journey, and we invite you to discuss this further with us.

Madeleine Tornard, Malin Rydén & William Hansson

Madeleine Tornard is an experienced Project Manager and Information Security Officer. She is responsible for the Opticos Chief Data Officer (CDO) network, an invite-only CxO forum. Contact us to learn more.
Malin Rydén is a consultant with experience in Information Security, System management and Project and Program management across various industries.
William Hansson is a consultant with experience in IT and Strategic management. He has a strong passion for leveraging new technology to create superior business value.


news

A Technology-Driven Approach to Sustainability: An Interview With Göran Kördel, CIO of Boliden

Even as sustainability takes centre stage across the value chain in a conscious effort to minimise the impact on the environment, governments are leveraging regulatory frameworks as powerful catalysts for fostering eco-conscious change. At this pivotal juncture, corporate leaders face an essential decision: to acknowledge sustainability as both a corporate duty and a strategic necessity.

Nils Andersson, Sustainability Consultant at Opticos, sat down with Göran Kördel to discuss various aspects of sustainability, including opportunities and challenges, and to get an insight into Boliden’s journey in this important area.

Contents 

  1. About Boliden 
  2. Guest Spotlight: Göran Kördel 
  3. Focus on Digitisation 
  4. The Approach to Sustainability 
  5. The Double Role of IT in Sustainability
  6. Data-Driven Approach to Sustainability Reporting 
  7. The Quest for Data and EU Sustainability Regulations 
  8. The Evolving Role of the CIO 
  9. Top 3 Priorities for CIOs in Sustainability 

About Boliden

Boliden is a leader in the sustainable extraction and processing of base and precious metals. Through technical innovation and expertise in mining and smelting, Boliden delivers high-quality products such as zinc and lead ingots, copper cathodes, gold, and silver granules. Their operations span across the Nordics and include prospecting, mining, smelting, and delivery to industrial customers in Europe. Read more about Boliden.

Guest Spotlight: Göran Kördel

Under the leadership of Göran Kördel, Chief Information Officer (CIO) of Boliden, the company has a unified IT organisation, where the CIO is responsible for aligning IT with the business goals. With a career spanning over two decades, Göran’s journey through the IT industry is marked by transformative roles and influential leadership, making him a driving force in adapting technology to modern challenges. Göran Kördel’s journey into the IT world began at Ericsson and continued at Sandvik, where he held positions as President, CIO, and Vice President. In his current role as CIO at Boliden, he continues to be a leading force in adapting technology and IT for the mining industry and beyond.

Focus on Digitisation

The strategic adoption of digitalisation is a fundamental driver for achieving sustainability goals within any organisation. Recognising the significance of digitisation in the pursuit of sustainability, Boliden has focused on transforming its operations to become more sustainable, with the IT function playing a crucial role in the company’s success.

The company’s primary focus is on production, resulting in an emphasis on Operational Technology (OT). While we discussed the digitisation of Boliden’s operations, Göran emphasised three important aspects of digitising the company’s operations:

  • Supporting production with automation: This involves connecting machines and sensors to optimise production.  
  • Developing data analysis: The company collects comprehensive data from connected machines, enabling advanced analysis. 
  • Digitising the employee ecosystem: Today’s digitisation includes all employees, not just executives and IT users as in the past.

The Approach to Sustainability 

Sustainability within an organisation can’t be the sole responsibility of a single department; it requires a collaborative effort. When it comes to inter-departmental collaboration to support the company’s sustainability goals, Göran identifies two cornerstones that shape his work:

Focus on sustainable production: Boliden aims not only for electrification but also for sustainable metal extraction. Despite having already established itself as a leading player in sustainable mining globally, the company strives to challenge itself by raising the bar. “We are currently producing metals with a significantly lower CO2 footprint than the industry average, but we must continue to evolve and improve,” Göran emphasised, with goals such as a 40% reduction in absolute CO2 emissions in Scope 1 & 2 by 2030 (compared to the 2021 base year) and carbon neutrality by 2050 as guiding principles for the corporate strategy.

Contribution to the circular economy: “We are a big contributor to the circular economy by recycling electronic scrap, car batteries, and steel mill dust,” Göran noted. Exploring the role of technology in advancing the company's sustainability objectives, he highlighted two significant challenges Boliden has encountered on its journey.

Firstly, the deployment of electric vehicles for transportation, while technologically feasible, presents a formidable challenge: replacing the existing fleet of diesel-based vehicles.

Secondly, the transformation of core processes such as smelting and blasting, particularly in endeavours like green steel production, requires a protracted process-development timeline. These challenges underscore the nuanced interplay between technology and operational change that is essential for sustainability success in the mining industry.

Source: www.boliden.com 

The Double Role of IT in Sustainability  

IT can play an important role in organisations' sustainability operations. Göran identifies two cornerstones that shape his work:

  1. Technology's contribution to environmental efforts: Göran emphasises the crucial role IT can play in the two primary areas of carbon dioxide reduction. By electrifying transportation and optimising production processes to reduce CO2 emissions, the IT team creates opportunities to use technological innovations like artificial intelligence (AI) to drive improvements.
  2. Sustainable IT delivery: The second area is sustainability within the IT department itself. Through actions like hardware reclamation and extending the lifespan of devices such as laptops and servers, IT can contribute to more sustainable technology consumption. “But we can do much more,” Göran explained, mentioning opportunities like further prolonging hardware lifetimes, moving data centers to cloud services, and optimising energy consumption by shutting down systems during idle times.


Data-Driven Approach to Sustainability Reporting  

With the implementation of a data-driven sustainability program, sustainability reporting has become an increasingly important part of business operations. “We've reported on sustainability for many years and have various environmental permits that need to be reported to authorities. Also, we have investors demanding more and more sustainability reporting,” Göran shared. With new regulation, this emphasis is now taking a more formal shape.

The European Union's Corporate Sustainability Reporting Directive (CSRD) sets requirements for sustainability reporting, and the demands on quality and follow-up are becoming increasingly equivalent to those for financial reporting. “Sustainability reporting isn't new, but the new legislation increases the requirements on what should be reported, as well as on traceability and frequency. Previously, sustainability reporting could be done using standard tools like Excel and was somewhat less structured. Now it has to be done in a more controlled way, which requires systems,” he explained.

In line with increased transparency and accessibility, data reliability has become more important. “We've been working on setting up a data platform in a mesh structure where we can conduct data governance. It's about defining who owns the data, quality-assuring the data, and so on,” he explained. He further described their efforts to automate more of the process to reduce manual effort. “We're trying to automate wherever possible, so there's less manual intervention. Of course, manual uploads are still needed sometimes, but we're trying to automatically gather data from primary sources or sensors.” These efforts encompass not only general data management but now also emphasise sustainability data, reflecting the growing importance of data management for the company.

 

The Quest for Data and EU Sustainability Regulations 

While discussing the amount of data to be collected in sustainability reporting and its complex nature, Göran said, “The actual challenge lies in collecting the data and knowing where to fetch and harmonise it. The challenge isn't the output, but rather the collection of data. When talking about CSRD, it covers the entire spectrum of data. For example, the EU demands hundreds of KPIs across the entire ESG (Environmental, Social, Governance) spectrum. This means a double materiality analysis must be carried out to identify what's important – both financial and other influencing factors. Therefore, it involves a significant amount of environmental data, like CO2 emissions and different types of water and air quality, which are important for us.”

When asked about Boliden's preparedness for the EU's new sustainability legislation, such as CSRD, and upcoming legislation like the Eco-design for Sustainable Products Regulation (ESPR), Göran said, “We have been preparing for the EU's sustainability reporting, CSRD, for a year, as we recognised that it would take a long time. Further, we've been using basic reporting tools until now; moving forward, a comprehensive solution will be required in line with the growing complexity.”

The Evolving Role of the CIO 

When asked how his role as CIO has changed with the increased importance of sustainability, Göran said, “The significance of IT has grown and become more prominent over time, and sustainability is the latest important metric in IT.” This trend is visible across many companies, and Boliden is no exception. With the increased presence of IT within the business, however, collaboration between IT and other departments needs to be closer and more integrated than ever before. While sustainability holds a significant place in the company's strategy, according to Göran it hasn't fundamentally changed the CIO's role. He sees it as an opportunity for IT to step forward and make meaningful contributions.

Future Trends

Speaking about the focus of CIOs moving forward, Göran said, “It's hard not to mention AI and advanced analytics, as these technologies enable more selective and enhanced process management. They assist in identifying, understanding, and optimising changes that can lead to a better environment. As for other trends, I believe an extremely critical success factor is close collaboration between IT and the business. This isn't unique to sustainability itself, but I consider it of tremendous importance.”

Top 3 Priorities for CIOs in Sustainability

Finally, Göran believes that the three most important aspects a CIO needs to consider while addressing sustainability goals are: 

  1. Listen to business executives in order to understand and support their pain points regarding sustainability.
  2. Proactively work on pilots, development, and improvements, using analytics to conduct analyses and enhancements.
  3. Evaluate how IT can be leveraged to address sustainability goals, and integrate such discussions into the IT roadmap.

You might find this content relevant as well:


Supplier Optimisation: A Strategic Lever in Your Sourcing Lifecycle

Even as businesses continuously innovate and scale new growth trajectories, adopting best practices in your sourcing lifecycle can be a key enabler of business objectives. Supplier optimisation remains a key focus area in a CPO's Sourcing Toolkit, and finding the right balance in your supplier management strategy can make a significant difference to your bottom line while delivering a sustainable competitive advantage. This article offers a perspective on the importance of supplier optimisation in your sourcing lifecycle.

 

Introduction

In today's dynamic and rapidly evolving business landscape, strategic sourcing has become a crucial function and an integral part of CxO-level discussions. In a nutshell, strategic sourcing is a procurement strategy that focuses on obtaining goods and services in a way that aligns with the organisation's overall business objectives. It involves a systematic and holistic approach to identifying, evaluating, and selecting suppliers in order to optimise value, reduce costs, and mitigate risks.

The Procurement Lifecycle may be best visualised as illustrated in Exhibit 1. The upstream areas usually fall under the realm of ‘Strategic Sourcing’ while the tactical downstream functions are classified as ‘Operational Procurement.’

Exhibit 1: The Procurement Lifecycle

As part of the procurement lifecycle, supplier optimisation plays an important role in achieving operational efficiency, cost reduction, and overall business success, irrespective of the maturity of your sourcing function. So, how can one ensure the optimal number of suppliers for a particular good or service while balancing risk and quality? For example, a long tail of suppliers may lead to operational inefficiencies and value leakage, while too few may significantly increase your business risk. How do you arrive at an optimal solution?

We offer a few perspectives towards achieving this objective.  

1. Develop a Robust Supplier Evaluation Framework 

One of the initial steps in supplier optimisation is to establish a comprehensive supplier evaluation framework. This framework should include key performance indicators (KPIs) aligned with business objectives, such as quality, delivery, cost, innovation, and sustainability. By objectively assessing suppliers based on these criteria, organisations can make informed decisions and identify the most suitable partners for their strategic sourcing initiatives. 
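As a minimal sketch of what such a framework might look like in practice, the weighted scorecard below scores suppliers against the KPIs mentioned above. The weights, supplier names, and scores are purely illustrative assumptions, not a prescribed model; a real framework would calibrate both the KPIs and their weights to the category and business objectives.

```python
# Hypothetical weighted supplier scorecard. Weights sum to 1.0 and
# each KPI is scored on a 0-100 scale; both are illustrative only.
KPI_WEIGHTS = {
    "quality": 0.30,
    "delivery": 0.25,
    "cost": 0.20,
    "innovation": 0.15,
    "sustainability": 0.10,
}

def score_supplier(kpi_scores: dict[str, float]) -> float:
    """Weighted average of a supplier's KPI scores."""
    return sum(KPI_WEIGHTS[kpi] * kpi_scores[kpi] for kpi in KPI_WEIGHTS)

suppliers = {
    "Supplier A": {"quality": 90, "delivery": 80, "cost": 70,
                   "innovation": 60, "sustainability": 85},
    "Supplier B": {"quality": 75, "delivery": 85, "cost": 90,
                   "innovation": 50, "sustainability": 70},
}

# Rank suppliers by weighted score, best first.
ranked = sorted(suppliers, key=lambda s: score_supplier(suppliers[s]), reverse=True)
```

With these illustrative numbers, Supplier A ranks first on overall value despite Supplier B's better cost score, which is exactly the kind of trade-off an objective framework makes visible.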

2. Leverage Sourcing tools and best-practice frameworks in your Sourcing Process 

While several tools may be implemented at various stages of the sourcing lifecycle, we highlight a few well-known tools that are extensively used in the industry.  

2.1. The Kraljic Matrix:

Perhaps the most widely used tool among procurement and supply chain professionals for supplier portfolio evaluation, the Kraljic Matrix maps suppliers along two parameters: supply risk and profit impact. Based on these two parameters, a supplier is classified into one of the matrix's four quadrants. For example, a supplier is classified as strategic if the sourced product or service is business-critical and has the greatest impact on the company's bottom line. Using the tool to classify suppliers' products and services helps address supply risks while supporting strategy development.
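To illustrate the classification logic, the sketch below assigns an item to a quadrant from the two parameters. The 0–1 scales and the 0.5 thresholds are hypothetical simplifications; in practice, both axes come from a structured assessment rather than a single number.

```python
# Sketch of Kraljic Matrix classification. Scales and thresholds
# are illustrative assumptions, not part of the formal method.
def kraljic_quadrant(supply_risk: float, profit_impact: float) -> str:
    """Classify a sourced item by supply risk and profit impact (0-1 each)."""
    high_risk = supply_risk >= 0.5
    high_impact = profit_impact >= 0.5
    if high_risk and high_impact:
        return "Strategic"      # business-critical: secure via partnerships
    if high_risk:
        return "Bottleneck"     # hard to source: reduce dependency
    if high_impact:
        return "Leverage"       # high spend, many suppliers: negotiate
    return "Non-critical"       # routine items: standardise and automate

# e.g. a business-critical, high-spend item falls in the Strategic quadrant
quadrant = kraljic_quadrant(supply_risk=0.8, profit_impact=0.9)
```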

The Kraljic Matrix is an integral part of the Opticos consultative toolkit. For example, stratification of suppliers into strategic and non-strategic at a global Swedish conglomerate client of Opticos, enabled a differentiated approach to the procurement optimisation challenge, and ultimately led to a total initial cost saving of ~120 MUSD, and a projected annual saving of ~15 MUSD.

Exhibit 2: Kraljic Matrix for Supplier Portfolio Evaluation

 

2.2. Total Cost of Ownership (TCO)

TCO analysis is a comprehensive approach that goes beyond the initial purchase price of a product or service and considers the entire lifecycle costs. It involves evaluating various cost components such as acquisition costs, operational costs, maintenance costs, and disposal costs.  

By conducting TCO analysis, organisations can make informed decisions based on the total cost impact rather than focusing solely on the purchase price. This helps identify opportunities for cost reduction and value enhancement throughout the sourcing process. 
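A simple worked example makes the point concrete. In the hypothetical comparison below, the offer with the lower purchase price turns out to be the more expensive one once lifecycle costs are included; all figures are invented purely for illustration.

```python
# Hypothetical lifecycle cost figures for two competing offers.
def total_cost_of_ownership(costs: dict[str, float]) -> float:
    """Sum all lifecycle cost components: acquisition, operations,
    maintenance and disposal."""
    return sum(costs.values())

offer_x = {"acquisition": 100_000, "operations": 40_000,
           "maintenance": 25_000, "disposal": 5_000}
offer_y = {"acquisition": 120_000, "operations": 25_000,
           "maintenance": 15_000, "disposal": 2_000}

tco_x = total_cost_of_ownership(offer_x)  # higher TCO despite lower price
tco_y = total_cost_of_ownership(offer_y)  # cheaper over the full lifecycle
```

Here Offer Y costs 20,000 more to acquire but 8,000 less in total, reversing the ranking a purchase-price comparison would give.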

3. Foster Strong Supplier Relationships 

Building strong and collaborative relationships with suppliers is fundamental to supplier optimization. By establishing open lines of communication, organisations can enhance transparency, trust, and mutual understanding. Regularly engaging with suppliers, conducting face-to-face meetings, and involving them in the product development process (especially the strategic category) can foster innovation and drive continuous improvement. Additionally, organisations should consider conducting periodic supplier performance reviews to address any concerns and reinforce the importance of meeting expectations. 

4. Embrace Technology and Data Analytics 

Leveraging technology such as e-sourcing platforms, e-procurement systems, and supplier relationship management tools can improve process efficiency and help your organisation build trust with current and prospective suppliers by bringing transparency to your sourcing and procurement processes. Further, advanced analytics can help identify patterns, trends, and potential risks, enabling organisations to make data-driven decisions, negotiate better contracts, and optimise supplier portfolios.

5. Practice Supplier Diversity and Risk Mitigation 

Supplier optimisation should extend beyond cost reduction and performance improvement. Organisations should actively promote supplier diversity by considering businesses owned by underrepresented groups, fostering a more inclusive and sustainable supply chain. Diversifying the supplier base can enhance innovation, support local economies, and contribute to a positive brand image. 

Furthermore, effective risk management strategies are essential for supplier optimisation. Organisations should identify and assess potential risks associated with suppliers, such as geopolitical instability, natural disasters, or financial vulnerabilities. Developing contingency plans and alternative sourcing options can help mitigate potential disruptions and ensure business continuity. 

6. Encourage Continuous Improvement and Innovation 

Strategic sourcing is not a one-time exercise but a continuous process of improvement and innovation. Organisations with mature Sourcing and Procurement functions encourage suppliers to embrace continuous improvement methodologies, such as Lean Six Sigma, to eliminate waste, reduce costs, and enhance quality. Further, engaging suppliers in collaborative innovation initiatives and seeking their expertise in product development, process optimisation, and sustainability practices are important levers for achieving competitive advantage. By fostering a culture of continuous improvement, organisations can stay ahead of the competition and drive long-term supplier optimisation.

The Final Word

Adopting best practices across the sourcing lifecycle is a key enabler of business objectives, even as businesses continuously innovate and scale new growth trajectories. Supplier optimisation remains a central lever in the CPO's toolkit, and finding the right balance in your supplier management strategy can make a significant difference to your bottom line while delivering a sustainable competitive advantage.

Authors:

Jan-Vidar Hugsted & Aravind Venkatesh

Jan-Vidar is a seasoned leader and Management Consultant at Opticos. Working closely with clients, he navigates complex challenges across the sourcing and procurement lifecycle, contracting, financial governance, market intelligence and change management.
Aravind is an experienced leader with several years of cross-functional experience within management consulting, sourcing, strategic advisory, and business development.

You might find this content relevant as well:


A “Capability First” Perspective for Sourcing Success

As organizations look for cost-effective ways of working, outsourcing non-core functions appears an alluring prospect. But before you hand over your operations to third parties, it is worth pausing to consider the risks associated with your sourcing approach. Starting with an overall capability perspective offers several benefits over a service-first approach, and it enables you to build a strategic approach to sourcing.

Outsourcing is an effective way for companies to focus on their core competencies, reduce costs, and access specialised expertise. However, the success of outsourcing initiatives is not guaranteed, and it is essential to take a strategic approach where one of the most important elements is to start with an overall capability perspective rather than a service focus. 

When starting with a service focus, companies tend to concentrate solely on the services to be provided by the supplier. Such an approach does not take into account the company's broader capability portfolio, which can lead to gaps in capabilities, ineffective management of supplier relationships, and suboptimal outcomes.

Figure 1. Potential value leakages in the sourcing process

In contrast, starting with an overall capability perspective allows companies to take a more holistic approach to sourcing. This perspective involves identifying and assessing the key capabilities necessary to ensure an optimally functioning and strategically aligned service delivery.  

An overall capability perspective also allows companies to develop a more strategic sourcing plan. The initial capability assessment should serve as the foundation for the transformation roadmap. Any capability gaps identified should be visualised, for example in a heat map, then prioritised and addressed as part of the overall service delivery transformation plan – outlining the steps required to enhance or build the necessary capabilities, such as governance, processes, and technology, to close the gaps. This transformation plan will thus focus to a greater degree on achieving the company's strategic objectives, rather than just accomplishing specific service targets.

Figure 2. An example of the heatmap illustrating capabilities and their maturity levels
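The gap analysis behind such a heat map can be sketched as follows. The capability names, maturity scales, and scores below are hypothetical; a real assessment would use the organization's own capability model.

```python
# Hypothetical capability maturity assessment (1 = initial, 5 = optimised).
# The gap between target and current maturity drives roadmap priorities.
capabilities = {
    "Supplier governance": {"current": 2, "target": 4},
    "Service integration": {"current": 3, "target": 4},
    "Demand management":   {"current": 1, "target": 3},
    "Contract management": {"current": 3, "target": 3},
}

def prioritised_gaps(caps: dict) -> list:
    """Return (capability, gap) pairs, largest gap first; zero gaps dropped."""
    gaps = ((name, m["target"] - m["current"]) for name, m in caps.items())
    return sorted(((n, g) for n, g in gaps if g > 0), key=lambda x: -x[1])

# Capabilities with the largest gaps head the transformation roadmap;
# those already at target maturity drop out of the plan.
roadmap = prioritised_gaps(capabilities)
```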

The capability assessment can also be a useful tool in service provider selection, where the selection criteria should include the ability to proactively support providing and building the key capabilities. The transformation plan then enables a joint focus and common strategic priorities.

In essence, starting with an overall capability perspective enables organizations to take a more strategic approach to sourcing and how to outsource. It involves considering the long-term implications of the sourcing approach, including the impact on the organization’s capabilities and the ability to meet future business needs. This approach also allows organizations to identify the right service providers who can help build capabilities while delivering the required services.

Maria Nelsson & Abhishek Kale

Maria Nelsson is a seasoned business leader with diverse experience spanning over three decades in the consulting domain.
Abhishek Kale, Senior Consultant with experience in Process Excellence, Digital Transformation, and ERP Consulting across multiple business sectors.

You might find this content relevant as well:


A first step to an actionable Data Strategy

This is how you can take your first step to an actionable Data Strategy

 

To stay competitive, companies harness data to enhance customer experience, streamline operations, and carve out a sustainable advantage. Leveraging AI and advanced analytics, they tackle challenges from predictive maintenance and pricing models to organizational optimization and digitizing products.

 

Success with advanced analytics hinges on data accuracy and sustainable data sourcing. At Opticos, we have found that a pragmatic, focused data strategy is crucial for companies aiming to become data- and analytics-empowered. This strategy facilitates the extraction of value from data, paving the road to success.

Many businesses grapple with fragmented IT landscapes, point solutions, lack of data ownership, and poor integration. As a result, analytics teams are burdened with data collection and cleaning rather than focusing on valuable analysis and insights.

Concerns about data quality frequently surface. Businesses periodically launch data quality and master data initiatives, but without robust data governance, data quality tends to decline over time. Hence, achieving accurate data often depends more on luck than on solid business practice. The remedy? An actionable data strategy.

 

To align the organization on data-related objectives and to overcome the challenges above, Opticos provides three recommendations:

  1. Use Case-Driven Approach: Identify, evaluate, and prioritize use cases that tie business needs to value, targeting relevant data assets and datasets. For instance, forecasting customer order volumes in supply chain management could use historical order and weather data.
  2. Prioritized Data Asset Governance: Instead of tackling organization-wide data ownership head-on, establish ownership of the data assets behind prioritized use cases. Ideally, process or function owners own the data from source to consumption, regardless of the storage and processing systems. For customer order forecasting, for example, the assets include product inventory, article details, orders, and sales forecasts.
  3. Transparent Data Architecture: Establish a clear blueprint detailing the capabilities to capture, ingest, store, process, share, and consume data for prioritized use cases. Setting up systems for data discovery, monitoring, and governance is also crucial. Emphasize transparency and communication so all stakeholders understand their roles in delivering high-quality data. For instance, use diagrams to map out the data flow for each prioritized data asset: from source systems like a CRM, through data storage like a Data Lake, to Business Intelligence reporting tools.
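As a minimal sketch of what prioritized data asset governance could look like for a customer order forecasting use case, the register below maps each asset to an owner from source to consumption. The system names and owner roles are illustrative assumptions, not recommendations.

```python
# Hypothetical data-asset register for one prioritized use case.
# Each asset has a single owner accountable from source to consumption,
# independent of the systems the data passes through.
data_assets = [
    {"asset": "customer orders",   "owner": "Head of Sales Ops",
     "source": "ERP",          "consumers": ["BI reporting", "forecast model"]},
    {"asset": "product inventory", "owner": "Supply Chain Manager",
     "source": "WMS",          "consumers": ["forecast model"]},
    {"asset": "weather history",   "owner": "Analytics Lead",
     "source": "external API", "consumers": ["forecast model"]},
]

def owners_for_consumer(register: list, consumer: str) -> set:
    """Which owners are accountable for the data feeding a given consumer?"""
    return {a["owner"] for a in register if consumer in a["consumers"]}

# All three owners share accountability for the forecast model's input quality.
forecast_owners = owners_for_consumer(data_assets, "forecast model")
```

Even a simple register like this makes ownership and data flow explicit per use case, which is the precondition for the quality assurance and automation discussed above.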

Furthermore, we recommend a phased data strategy implementation, detailing the first phase and keeping subsequent phases indicative. As you begin implementing, the roadmap’s details become more distinct and defined.

 

Illustration 1 – Illustrative template for a phased, use-case-driven Data Strategy execution roadmap

 

An actionable data strategy implements data management practices and governance structures that enable efficient data sharing and continuous quality improvement. Once initial success is evident, the process can be scaled and replicated across other use cases.

Guiding your Data Management Journey

At Opticos, we enable organizations to realize business benefits by building pragmatic, holistic data management practices. Drawing on our extensive client experience and methodology, we're here to guide your data management journey from strategy to implementation.

Tatiana Schön & Mattias Gustrin

Mattias Gustrin, Associate Director with a focus on Data Management, Advanced Analytics, and technology strategies across multiple sectors.
Tatiana Schön, Manager with experience in consulting, project management and business analysis within AI governance, Data privacy, IT and IT Finance.
