Posts Tagged Expertise

How to Break Down the Data Silos in Life Sciences Companies

In the Life Sciences industry, making sense of data is becoming increasingly challenging due to its growing volume, the introduction of new and complex technologies, and the growing number of stakeholders.

Introducing the right data reporting and analytics solution at the corporate level allows Life Sciences companies not only to streamline collaboration by breaking down the silos between departments, but also to improve decision-making and optimise technology investments.

On July 8, Arithmos hosted a complimentary webinar on using data reporting and visualisation for breaking down the silos in Life Sciences companies. The webinar was conducted by Silvia Gabanti, Managing Director of Arithmos, and Massimo Businaro, CEO of E-project.

Continue reading to learn how effective data reporting and visualisation can improve decision-making at multiple levels, increase operational efficiency, and break down departmental silos.

Register now to get free access to the webinar recording

Data Ecosystems in Life Sciences Companies

Rapid technology advancement and new regulations benefit patients and speed up clinical trials. However, they also increase the amount of data generated daily. Real World Data, the Internet of Things, ePRO – all these technologies multiply the amount of data that a company deals with on a daily basis.

Data is the new fuel for Life Sciences businesses and a source of competitive advantage; it drives innovation and stimulates growth.

Currently, despite the enormous potential of this newly generated data, many pharmaceutical companies suffer from the data silo issue. It manifests itself in an inability to share data across departments, the presence of multiple data repositories, and the absence of a clear picture of the company’s data.

As data is often stored in multiple tools and locations, data silos severely impact the departments’ ability to collaborate and harm scientific discovery and drug development.

Data silos are caused by multiple factors:

  • Every department has its own data repository, isolated from the rest of the organisation
  • Silos tend to arise naturally in organisations over time because each department or business function has different goals, priorities, responsibilities, processes, and data repositories
  • Legacy solutions don’t prioritise sharing data with other departments; they maintain data segregation and reduce the flow of information to what is considered strictly necessary

The data silo challenge is usually recognised but often not addressed in a strategic, planned way, especially by small and medium companies.

Eliminating data silos and creating a free flow of data between departments is the secret to transforming data into fuel for the company’s growth.

Breaking Down the Silos

If a pharmaceutical company plans to break down its data silos and use data to improve its performance and deliver its solutions to the market faster and more efficiently, it must handle a cultural and structural change.

This can be done in 5 steps:

  1. Identify the cause of the company’s silo problem and the main impacted areas (e.g. the pharmacovigilance department)
  2. Get management to buy in
  3. Define a scalable technology strategy that covers structural and cultural aspects of the change
  4. Embrace a corporate advanced reporting and analytics solution to create a bridge between the departments
  5. Invest in a process review and cross-functional training

Step 1: Identify the Cause of Silos and the Main Impacted Areas

Identifying the data management inefficiencies that impact a company’s performance is very challenging. This is particularly true in small and medium companies without a Chief Data Officer who advocates the importance of collecting and leveraging data in decision-making.

In this case, the first step is to identify the business area that relies on data the most and rationalise the way it collects and manages information. This department will be the “front door” of the change.

A good example is the pharmacovigilance department. It is often perceived as a pure cost centre for the company, but it is a key business unit in terms of regulatory compliance and patient safety.

Pharmacovigilance departments need to continuously improve their processes to reduce costs. At the same time, they are under great pressure because of strict regulations and the need to work closely with other departments (such as regulatory affairs and clinical research) that have their own, completely different business goals.

Breaking down the data silos between pharmacovigilance and other departments eases cooperation and allows faster and more robust decisions based on data analysis and visualisation, safeguarding patients’ health more efficiently.

Step 2: Get the Management to Buy In

To break down the data silos by introducing a cross-departmental reporting solution, you need to get the upper management to buy in.

Getting rid of data silos helps each individual department and the entire organisation by:

  • offering a complete picture of the company’s data
  • aligning the company’s long-term goals with department objectives
  • reducing conflict between departments by providing clear guidance on corporate priorities and avoiding clashes between personal objectives and corporate growth
  • engaging corporate stakeholders such as IT, who become more conscious of business goals and can act as opinion leaders

Step 3: Define a Scalable Strategy

Pharmaceutical companies can find it challenging to decide which technology solution to choose or when to eliminate redundant technology.

This challenge can arise from biases in business decision-making, which in turn can be caused by the following:

  • advocating previous choices (technologies, consultants, processes)
  • sticking to the company’s previous habits
  • lack of appropriate decision-support tools

To ensure the appropriate allocation of funds and minimal disruption to the company’s work during the period of change, it is critical to treat the change as a program. Optimisation of data management and data reporting should be included in the company’s defined roadmap.

This means that a Work Breakdown Structure (WBS) approach should be followed – below you can find an example of a WBS defining the activities of a project aimed at breaking down the data silos.

[Figure: example Work Breakdown Structure (WBS) for a project aimed at breaking down the data silos]
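
Since the diagram itself is not reproduced here, the snippet below gives a minimal sketch, in Python, of how such a WBS could be outlined. The phases and work packages are hypothetical examples for illustration, not a prescribed plan.

  # Minimal sketch of a WBS for a silo-elimination program.
  # Phases and work packages are hypothetical examples.
  wbs = {
      "1. Assessment": [
          "1.1 Map existing data repositories",
          "1.2 Identify causes of silos and impacted areas",
      ],
      "2. Strategy": [
          "2.1 Secure management buy-in",
          "2.2 Define the technology roadmap",
      ],
      "3. Implementation": [
          "3.1 Deploy the corporate reporting and analytics solution",
          "3.2 Connect departmental business systems",
      ],
      "4. Adoption": [
          "4.1 Review processes",
          "4.2 Deliver cross-functional training",
      ],
  }

  for phase, work_packages in wbs.items():
      print(phase)
      for wp in work_packages:
          print("    " + wp)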

Step 4: Embrace Advanced Reporting and Analytics Tools

Most pharmaceutical companies manage different systems that contain heterogeneous and disparate data.

This gives rise to two challenges:

  • Optimising access to the information held in specific business systems (QA, PhV, RA databases)
  • Increasing the ability to share data by rationalising and connecting these systems

Introducing an advanced reporting and analytics solution allows pharmaceutical companies to:

  • Support operational teams with data analysis and reporting, reducing manual elaboration of the data, which is not only time-consuming but also increases the risk of regulatory non-compliance
  • Allow managers to effectively oversee operational teams. This includes:
    • Identifying challenging and ineffective processes
    • Improving KPIs
    • Obtaining data that supports the decision-making process
  • Integrate external data in a smooth manner
  • Merge and evaluate data coming from different business areas to identify inconsistencies in real time, carry out data reconciliation, avoid duplicating information between systems, speed up regulatory reporting, and increase data quality (see the sketch below)
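
As a rough illustration of the last point, the sketch below (Python with pandas; the extracts and column names are invented for the example) shows how case lists from two siloed systems could be merged to surface records that are missing on either side or that disagree between systems:

  import pandas as pd

  # Hypothetical case extracts from two siloed systems
  safety = pd.DataFrame({
      "case_id": ["C-001", "C-002", "C-003"],
      "seriousness": ["serious", "non-serious", "serious"],
  })
  clinical = pd.DataFrame({
      "case_id": ["C-001", "C-003", "C-004"],
      "seriousness": ["serious", "non-serious", "serious"],
  })

  # An outer merge gives one view over both systems
  merged = safety.merge(clinical, on="case_id", how="outer",
                        suffixes=("_safety", "_clinical"), indicator=True)

  # Cases recorded in only one of the two systems
  print(merged[merged["_merge"] != "both"])

  # Cases present in both systems but with conflicting values to reconcile
  both = merged[merged["_merge"] == "both"]
  print(both[both["seriousness_safety"] != both["seriousness_clinical"]])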

Step 5: Invest in Process Review and Cross-functional Training

Social change is a key step in breaking down the silos – all stakeholders must be fully committed to a successful result from the very beginning.

It is extremely important to share the company objectives with both management and operational teams. Management can define the strategic and economic advantages, but only the people involved in data management can identify the real bottlenecks that the solution should remove.

Involving the teams in requirements gathering, and organising cross-functional workshops and training throughout the entire program, can improve the quality of the new processes and tools. More importantly, it can also break down the cultural silos (together with the data ones), reducing resistance to change and building relationships and corporate culture.

Conclusion

Companies can unlock their data’s full potential, optimise internal processes, and improve decision-making by breaking down the data silos and allowing data to flow freely. An advanced data reporting and visualisation solution is a key step in this process, alongside cultural and structural change.

Breaking down the data silos in Life Sciences companies can result in streamlined delivery of drugs to the market and more efficient work across all departments, benefiting patients and allowing companies to safeguard their health more effectively.

Register now to get free access to the webinar recording and learn more about breaking down the silos in Life Sciences companies

Q&A: Kairos – Data Analytics and Reporting Solution

Last year, Arithmos, a provider of innovative IT solutions with a specific focus on the Life Sciences, announced a partnership agreement with E-project, a consulting and system integration company. Together they have brought to the market Kairos, a unique Data Analytics and Reporting solution for pharmacovigilance.

Kairos gives Life Science companies full visibility of their safety data through real-time reports and dashboards, easily ensures regulatory compliance, and creates value from the advanced analysis of safety data.

We have talked to Silvia Gabanti, Arithmos’ Managing Director, to learn more about this new solution.

How can Kairos improve the pharmacovigilance processes and ease the life of the pharmacovigilance team?

First of all, Kairos allows you to produce PSUR/PBRER and other safety reports automatically, in a simple and error-free manner, removing the factor of human error.

Secondly, it allows you to optimise the performance of the pharmacovigilance department by creating reports to measure KPIs.

Thirdly, it removes the silos between the Pharmacovigilance, Regulatory, and Clinical departments, supporting data reconciliation in the most effective way.

Can other departments use Kairos?

Absolutely, and this makes Kairos unique. It helps to make sense of data regardless of its origin. Now Life Science companies gather enormous amounts of data from different sources, like EDCs and safety systems, and mostly use it to ensure compliance. However, data is much more than this.

Data analysis allows companies to optimise processes while reducing the team’s effort and the gap between operational units. This helps them make faster and safer decisions and improves business efficiency. Life Science companies can grow their business and act proactively, not only reactively.

Although Kairos was born as a pharmacovigilance reporting and data analytics solution, it can also be used horizontally across the company. Departments from Clinical to Marketing can use it to analyse their data, create reports, and share them easily.

Kairos allows you to break down the silos between the departments.

What makes Kairos a unique solution on the market?

Kairos is the result of a synergy between teams with a decade of experience in business intelligence, data analytics, pharmacovigilance, and Life Sciences in general. It is more than just a piece of technology: Kairos includes the solution itself, consultancy, report building, and fine-tuning.

Such a combination makes Kairos a powerful tool delivered with low license and maintenance costs.

How can you describe Kairos’ price model?

It is scalable. You start with a minimum investment in licenses and configuration. If you would like to add new reports or extend Kairos to other departments, you can easily do so.

Can Kairos be customised?

Yes, absolutely. The whole system is customisable. The first thing we do is understand our client’s business needs and the type of reporting required. Then we customise Kairos based on these expectations.

The next step is to give the client detailed training. We want them to be completely independent and able to change the format, design, or data themselves to fit their organisation.

How does Kairos fit the bigger Arithmos goals?

We strive to provide 360-degree support to our clients on the road to Digital Transformation. Kairos is another step towards this goal: it allows Arithmos to give more support to its clients.

We also believe in a proactive approach to safety and regulatory compliance; Kairos contributes to it by harnessing the power of data.

Last but not least, we preach efficiency in operations and processes. Introducing a tool that operates horizontally rather than vertically allows companies to save on IT investments and optimise departments’ work by breaking down the silos.

Want to learn more about Kairos? Contact Arithmos now to see a demo of the solution.

How to Perform a Successful Data Migration in Life Sciences

Data Migration is defined as the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another.

Data migration has a reputation for being risky and difficult, and it is certainly not an easy process. It is time-consuming, with many planning and implementation steps, and there is always risk involved in projects of this magnitude.

Without a sufficient understanding of both source and target, transferring data into a more sophisticated application will amplify the negative impact of any incorrect or irrelevant data, perpetuate hidden legacy problems, and increase exposure to risk. A data migration project can be a challenge because administrators must maintain data integrity, time the project correctly to minimize the impact on the business, and keep an eye on costs.

However, following a structured methodology will reduce the pain of managing complex data migration.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar was conducted by Alessandro Longoni, Arithmos Senior Project Manager & Data Analyst, and focused on the challenges in data migration and ways of overcoming them.

Continue reading for tips on performing a successful data migration in Life Sciences.

Register now to get free access to the webinar recording

Tip 1 – Understanding the data

Before starting the data migration, you have to prepare your data: carry out an assessment of what is present in the legacy system, understand clearly which data needs to be migrated, avoid duplication, and promote quality and standardization.

We can divide the assessment of the legacy system into two macro categories:

  • Assessment of the data meaning
  • Assessment of the data quality

Every piece of data that you move has to be validated, cleaned, and transformed. In data migration projects, migrating only relevant data ensures efficiency and cost control.

Understanding how the source data will be used in the target system is necessary for defining what to migrate. It is important to look at how people are using existing data elements. Are people using specific fields in a variety of different ways that need to be considered when mapping out the new system’s fields?

The second macro area is the assessment of data quality. It is very important to define a process to measure data quality early in the project, in order to obtain details of each single piece of data and identify unused fields or obsolete records that may have undergone multiple migrations. It is also important to avoid migrating duplicate or irrelevant records.
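
A minimal sketch of such a quality assessment is shown below (Python with pandas; the file name and the 95% threshold are assumptions for illustration):

  import pandas as pd

  # Hypothetical flat extract of the legacy system
  legacy = pd.read_csv("legacy_export.csv")

  # Share of empty values per field: fields that are almost never
  # filled in are candidates for exclusion from the migration
  null_rates = legacy.isna().mean().sort_values(ascending=False)
  print(null_rates[null_rates > 0.95])

  # Fully duplicated records that should not be migrated twice
  print(legacy[legacy.duplicated(keep=False)])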

The quality analysis typically leads to a data cleaning activity.

Cleaning the source data is a key element in reducing the data migration effort. This is usually the client’s responsibility. However, the client can be supported by the provider, who can perform specific data extractions, aggregations, and normalizations to reduce the client’s effort.

Another way to clean data is to adopt migration scripts for cleaning purposes. It is important to understand that this kind of activity could create validation issues: by modifying the source data, we leave the perimeter of a pure data migration and create potential data integrity issues.
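
Below is a minimal sketch of what such a cleaning script could look like (pandas; field and file names are invented). Every removed record is logged, precisely because the activity modifies source data and must remain traceable and validatable:

  import pandas as pd

  # Hypothetical extract of the legacy system
  legacy = pd.read_csv("legacy_export.csv")

  # Normalise heterogeneous date strings into a single representation;
  # unparseable values become NaT and can be reviewed separately
  legacy["onset_date"] = pd.to_datetime(legacy["onset_date"],
                                        errors="coerce", dayfirst=True)

  # Log exact duplicates before dropping them, so the cleaning
  # activity stays documented and can be validated
  legacy[legacy.duplicated()].to_csv("removed_duplicates.csv", index=False)
  cleaned = legacy.drop_duplicates()
  cleaned.to_csv("cleaned_export.csv", index=False)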

Tip 2 – Project Governance

The best way to approach a data migration project is to clearly define roles and responsibilities and avoid overlapping accountability. This can be done in several steps:

  • Define the owner of the data in the target system
  • Include the business users in decision-making. They understand the history, the structure, and the meaning of the source data. Additionally, if business users are engaged in the data migration project, it will be easier for them to interact with migrated cases after go-live.
  • Rely on data migration experts. Each data migration requires assistance from experts, who can fill the gap between business and IT: both are key stakeholders, but they are often unable to understand each other.

Based on our experience, what makes the difference is the presence of a business analyst. This is a person who acts as a bridge between the technical staff involved in the implementation of the migration and the businesspeople. The business analyst can explain technical requirements clearly, which really helps the business define the migration rules based on how the target system will use the migrated data.

Tip 3 – Roll back & Dry Run

A rollback strategy has to be put in place to mitigate the risks of potential failures. Access to the source data has to be read-only. This prevents any kind of data modification and ensures its integrity. Backups have to be performed on the target system so that it can be restored in case of failure.

An accurate data migration dry run allows you to execute the validation and production migrations without incidents or deviations. Procedures and processes have to be tested to check the completeness of the records and to ensure integrity and authenticity in accordance with the data migration rules and purposes.

Tip 4 – The Importance of the Data Mapping Specification Document

The Data Mapping Specifications document is the core of the data migration. It ensures a complete field mapping and is used to collect all mapping rules and exceptions.

This project phase is usually long and tiring for a number of reasons:

  • Volume and amount of data details
  • Technical activity with technical documents
  • Little knowledge of the dynamics of the target database
  • Compromises that have to be made

The Data Mapping Specifications document details all the rules related to the data being migrated. The following tips can help you do it in the most efficient way:

  • Clarify what has to be migrated and what shouldn’t be migrated
  • Clean source data – this will reduce the number of fields to migrate
  • Liaise with a business analyst who will translate technical requirements and help explain how the data will work in the target system
  • Rely on data migration experts who have already performed similar data migrations in the past
  • Avoid using an Excel sheet for mapping table fields – a more visual document with pictures will ease the conversation with the business users (a minimal sketch of what the mapping entries capture follows below)
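
Whatever format is chosen, each entry of the document should tie a source field to a target field and a transformation rule. A minimal sketch of such entries, with invented field names and rules, might look like this:

  # Each mapping entry: source field, target field, transformation rule.
  # Field names and rules are hypothetical examples.
  FIELD_MAP = [
      {"source": "PAT_DOB", "target": "patient.birth_date",
       "rule": "reformat DD/MM/YYYY to ISO 8601"},
      {"source": "SERIOUS", "target": "case.seriousness",
       "rule": "map 1 -> 'serious', 0 -> 'non-serious'; blank -> exception log"},
      {"source": "OLD_NOTES", "target": None,
       "rule": "not migrated (obsolete free-text field)"},
  ]

  for entry in FIELD_MAP:
      print(f"{entry['source']:>10} -> {entry['target']}: {entry['rule']}")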

Tip 5 – Perform comprehensive validation testing

To ensure that the goals of the data migration strategy are achieved, a company needs to develop a solid data migration verification process. The verification strategy needs to include ways to prove that the migration was completed successfully and that data integrity was maintained.

Tools or techniques that can be used for data migration verification include source vs target data integrity checks. A combination of manual and automatic checks can be used to cover all verification needs. The verification can include qualitative and quantitative checks.

To carry out data verification, the migrated data must be processed through a workflow. This ensures that it interacts properly with the target system’s functionalities.

The sampling strategy is defined in the Data Migration Plan and should be driven by the risk impact of the data. 100% sampling is neither feasible nor adequate; therefore, standards such as ANSI/AQL are usually used to define the risk-based sampling strategy.
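
As a rough sketch of how the quantitative and qualitative checks could be combined (pandas; file names, the key column, and the sample size are assumptions – in practice the sample size comes from the risk-based sampling plan):

  import pandas as pd

  source = pd.read_csv("source_extract.csv")   # hypothetical, keyed by case_id
  target = pd.read_csv("target_extract.csv")

  # Quantitative check: full record-count reconciliation
  assert len(source) == len(target), "record counts differ"

  # Qualitative check: field-by-field comparison on a sample of records
  sample_ids = source["case_id"].sample(n=50, random_state=1)
  s = source[source["case_id"].isin(sample_ids)].set_index("case_id").sort_index()
  t = target[target["case_id"].isin(sample_ids)].set_index("case_id").sort_index()

  # An empty result means the sampled records match exactly
  print(s.compare(t))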

This article is based on a webinar presented by Alessandro Longoni on May 14. Register today to get access to the webinar recording.

About Alessandro Longoni, Senior Project Manager & Business Analyst

Alessandro Longoni is a Senior Business Analyst & Project Manager of Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Learn more about Alessandro Longoni, his career path and current role in Arithmos in our “Career Insights” section.

Webinar Q&A: How to Perform a Successful Data Migration in Life Sciences

In the Life Sciences industry, a structured but lean approach to data migration is vital for success. Such an approach not only ensures data integrity and consistency, but also minimizes the impact of the data migration on day-to-day business operations.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar described the main risks and challenges of the data migration, ways to overcome them, and presented a case study of data migration in pharmacovigilance.

The webinar was conducted by Alessandro Longoni, a Senior Business Analyst & Project Manager of Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Continue reading to discover the questions and answers from the webinar’s Q&A session.

Register today to get access to the webinar recording

During a pharmacovigilance migration, should only the last version of the cases be migrated, or should we migrate all the versions?

It depends on the legacy and the target system. Some classical safety systems save each version of each case as an independent case.

If you use an ETL tool to migrate to a new system that does not operate according to this logic, we advise migrating only the last version of each case in order to avoid duplicate cases. It is also advisable to migrate the case history (case workflow) if possible.

However, in all cases, it is technically possible to migrate all the versions of a case if you find the right workaround – for example, using the following naming format: <case_number>_<caseversion>.

The best practice in data migration is adopting the E2B XML approach – regardless of the number of case versions, only the last version is exported from the legacy system.
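
As a rough sketch of both options (pandas; column names are assumptions, and case versions are assumed to be numeric):

  import pandas as pd

  # Hypothetical legacy extract with one row per case version
  cases = pd.read_csv("legacy_cases.csv")

  # Option 1: keep only the highest version of each case
  last_versions = (cases.sort_values("case_version")
                        .groupby("case_number")
                        .tail(1))

  # Option 2 (the workaround above): make every version a unique record
  cases["migration_id"] = (cases["case_number"].astype(str)
                           + "_" + cases["case_version"].astype(str))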

Which steps can be omitted in a semi-automated data migration compared to an automated one?

Data migration, regardless of the approach used (ETL, XML import, re-entry), must be included in a validation exercise.

The main steps and deliverables are the same regardless of the approach. However, the documents and activities in semi-automated and automated data migration projects require different levels of accuracy and effort:

  • Data Migration Plan: it should include the strategy of the migration project and the main rules for data migration. It covers the mapping needs for re-entry; in the case of an ETL tool, a dedicated data mapping document is highly suggested.

In the case of an XML migration, the format depends on the number of cases and the complexity (due to differences between the configurations of the target and legacy systems). For simple migrations with a low number of cases, the data migration plan could be sufficient. For complex migrations with a high number of cases, we suggest a dedicated data mapping document.

  • Data Mapping Specifications: see above
  • Data Migration Validation: a protocol that must be executed in the validation environment to verify the migration process. It is more extensive in the case of ETL, but we suggest using it regardless of the data migration approach.
  • Data Migration Production: the execution of the protocol described above, conducted to migrate data in production. In the case of an ETL approach, this phase requires 2-5 days of technical activities plus 1-2 days of business data verification (production downtime period). In the case of an XML or re-entry process, this phase can take up to several months of business work (no production downtime period).
  • Data Migration Report

When you access the source data in read-only mode, is it possible to also access the files attached to the ICSR, such as publications (in PDF format) or ACKs from Regulatory Authorities (in XML format)? / How can we migrate attachments from one system to another? Should the client send the attachments separately from the XML files if the migration is performed by import/export?

Yes, it is possible. Of course, this possibility depends heavily on the structure of the legacy system database. Sometimes, technology constraints don’t allow you to extract some particular information, such as attachments or audit trail.

If the legacy system stores the attachments in the database with proper references to the relevant case, and if the target system supports the case attachment functionality, the best idea would be to perform the migration using an ETL tool.

If I have an E2B XML file as source of a case recorded in an Excel database, can I use it as a source for the migration into the new database in order to avoid doing a migration manually from the old database?

The standard approach to data migration is migrating data from the legacy system to the target system, not from the source to the target. Nevertheless, a different approach can be chosen if it grants better data quality (as in the question) and efficiency. This approach must be described and justified in the Data Migration Plan. We also advise carrying out a final verification to detect any discrepancies between the migrated data and the data stored in the legacy system.

How do you calculate the number of records from the source and target systems that need to be validated?

An ETL tool can reduce the need for data verification, especially because a migration script works on specific fields independently of the number of records to be migrated. For example, if my script correctly moves the laboratory data, the number of tests executed for a specific patient is not relevant. This ensures the quality of the migrated data.

However, a minimum quality check of the fields has to be performed, based on the recommendations of your CSV manager and the risk assessment.

A numeric check of the records moved (100%) is always required to ensure that there were no exceptions during the script run.
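
A minimal sketch of such a numeric check, using SQLite purely for illustration (table and file names are placeholders; note the read-only access to the source system, as discussed above):

  import sqlite3

  # Open the legacy database strictly read-only, the target normally
  src = sqlite3.connect("file:legacy.db?mode=ro", uri=True)
  tgt = sqlite3.connect("target.db")

  n_src = src.execute("SELECT COUNT(*) FROM cases").fetchone()[0]
  n_tgt = tgt.execute("SELECT COUNT(*) FROM cases").fetchone()[0]

  # 100% numeric check: every record must have been moved
  assert n_src == n_tgt, f"{n_src - n_tgt} records missing after migration"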

This article is based on a webinar presented by Alessandro Longoni on 14th May 2020. Register today to get access to the webinar recording.

Webinar: How to Perform a Successful Data Migration in Life Sciences

In the Life Sciences industry, a structured but lean approach to data migration is vital for success. Such an approach not only ensures data integrity and consistency, but also minimizes the impact of the data migration on day-to-day business operations.

The webinar will describe the main risks and challenges of the data migration, ways to overcome them, and present a case study of data migration in pharmacovigilance.

Click here to learn more about the webinar and register your attendance

Date and time:

Thursday, May 14th, 10:00 AM – 12:00 PM CET

The key learning outcomes will be:

  • How to define an adequate data migration strategy including data migration verification
  • The main steps in data migration
  • Understanding of the major challenges during the data migration and ways to overcome them

Presenter: 

Alessandro Longoni is a Senior Business Analyst & Project Manager of Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.
