Archive for the RESOURCES Category

How to Break Down the Data Silos in Life Sciences Companies

In the Life Sciences industry, making sense of data is increasingly challenging due to its growing volume, the introduction of new and complex technologies, and the growing number of stakeholders.

Introducing the right data reporting and analytics solution at the corporate level allows Life Sciences companies not only to streamline collaboration by breaking down the silos between departments, but also to improve decision-making and optimise technology investments.

On July 8, Arithmos hosted a complimentary webinar on using data reporting and visualisation for breaking down the silos in Life Sciences companies. The webinar was conducted by Silvia Gabanti, Managing Director of Arithmos, and Massimo Businaro, CEO of E-project.

Continue reading to learn how effective data reporting and visualisation can improve decision-making at multiple levels, increase operational efficiency, and break down departmental silos.

Register now to get free access to the webinar recording.

Data Ecosystems in Life Sciences companies

Rapid technology advancement and new regulations benefit patients and speed up clinical trials. However, they also increase the amount of data generated daily. Real World Data, the Internet of Things, ePRO – all these technologies multiply the amount of data that a company deals with on a daily basis.

Data is the new fuel for Life Sciences businesses and a source of competitive advantage; it drives innovation and stimulates growth.

Currently, despite the enormous potential of this newly generated data, many pharmaceutical companies suffer from data silos. The issue manifests itself in an inability to share data across departments, the presence of multiple data repositories, and the absence of a clear picture of the company’s data.

As data is often stored in multiple tools and locations, data silos severely impact the departments’ ability to collaborate and harm scientific discovery and drug development.

Data silos are caused by multiple reasons:

  • Every department has its own data repository, isolated from the rest of the organisation
  • Silos tend to arise naturally in organisations over time, because each department or business function has different goals, priorities, responsibilities, processes, and data repositories
  • Legacy solutions prioritise data segregation over sharing data with other departments, reducing the flow of information to what is considered strictly necessary

The data silo challenge is usually recognised but often not addressed in a strategic, planned way, especially by small and medium companies.

Eliminating data silos and creating a free flow of data between departments is the secret to transforming data into fuel for the company’s growth.

Breaking Down the Silos

If a pharmaceutical company plans to break down its data silos and use data to improve its performance and deliver its solutions to the market faster and more efficiently, it must manage both cultural and structural change.

This can be done in 5 steps:

  1. Identify the cause of the company’s silo problem and the main impacted areas (e.g. the pharmacovigilance department)
  2. Get management to buy in
  3. Define a scalable technology strategy that covers structural and cultural aspects of the change
  4. Embrace a corporate advanced reporting and analytics solution to create a bridge between the departments
  5. Invest in a process review and cross-functional training

Step 1: Identify the Cause of Silos and the Main Impacted Areas

Identifying the data management inefficiencies that impact a company’s performance is very challenging. This is particularly true in small and medium companies without a Chief Data Officer who advocates the importance of collecting and leveraging data in decision-making.

In this case, the first step is to identify the business area that relies most on data and rationalise the way it collects and manages information. This department will be the “front door” of the change.

A good example is the pharmacovigilance department. It is often perceived as a pure cost centre, but it is a key business unit in terms of regulatory compliance and patient safety.

Pharmacovigilance departments need to continuously improve their processes to reduce costs. At the same time, they are under great pressure because of strict regulations and the need to work closely with other departments (like regulatory affairs and clinical research) that have their own, completely different business goals.

Breaking down the data silos between pharmacovigilance and other departments eases cooperation and allows faster and more robust decisions based on data analysis and visualisation, safeguarding patients’ health more efficiently.

Step 2: Get the Management to Buy In

To break down the data silos by introducing a cross-departmental reporting solution, you need to get the upper management to buy in.

Getting rid of data silos helps each individual department and the entire organisation by:

  • offering a big picture of the company’s data
  • aligning the company’s long-term goals with department objectives
  • reducing conflict between departments by providing clear guidance on corporate priorities and avoiding clashes between personal objectives and corporate growth
  • engaging corporate stakeholders such as IT, who become more conscious of business goals and act as opinion leaders

Step 3: Define a Scalable Strategy

Pharmaceutical companies can find it challenging to decide which technology solution to choose or when to eliminate redundant technology.

This challenge can arise from biases in business decision-making, which in turn can be caused by:

  • advocating previous choices (technologies, consultants, processes)
  • sticking to the company’s previous habits
  • lack of appropriate decision-support tools

To ensure the appropriate allocation of funds and minimal disruption to the company’s work during the period of change, it is critical to consider the change as a program. Optimisation of data management and data reporting should be included in a company’s defined roadmap.

This means following a Work Breakdown Structure (WBS) approach – below is an example of a WBS for defining the activities of a project aimed at breaking down the data silos.

[Figure: example WBS for a project aimed at breaking down the data silos]

Step 4: Embrace Advanced Reporting and Analytics Tools

Most pharmaceutical companies manage different systems that contain heterogeneous and disparate data.

This gives rise to two challenges:

  • Optimising access to the information held in specific business systems (QA, PhV, RA databases)
  • Increasing the ability to share data by rationalising and connecting these systems

Introducing an advanced reporting and analytics solution allows pharmaceutical companies to:

  • Support operational teams with data analysis and reporting, reducing manual data processing, which is not only time-consuming but also increases the risk of regulatory non-compliance
  • Allow managers to effectively oversee operational teams. This includes:
    • Identifying challenging and ineffective processes
    • Improving KPIs
    • Obtaining data that supports the decision-making process
  • Integrate external data in a smooth manner
  • Merge and evaluate data coming from different business areas, making it possible to identify inconsistencies in real time, carry out data reconciliation, avoid duplicating information between systems, speed up regulatory reporting, and increase data quality (a sketch of such a reconciliation check follows this list)
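To make the reconciliation point concrete, here is a minimal sketch (in Python with pandas, using entirely hypothetical field names and data, not an excerpt of any specific product) of how a reporting layer might flag inconsistencies between a clinical extract and a safety extract:

```python
import pandas as pd

# Hypothetical extracts: adverse events recorded in the EDC and cases
# recorded in the safety database for the same study.
edc_aes = pd.DataFrame({
    "subject_id": ["001", "002", "003"],
    "event_term": ["Headache", "Nausea", "Rash"],
})
safety_cases = pd.DataFrame({
    "subject_id": ["001", "003", "004"],
    "event_term": ["Headache", "Rash", "Dizziness"],
})

# An outer join with an indicator column flags records that exist in only
# one of the two systems: exactly the discrepancies reconciliation targets.
recon = edc_aes.merge(safety_cases, on=["subject_id", "event_term"],
                      how="outer", indicator=True)
print(recon[recon["_merge"] != "both"])
```

A real solution would run such checks continuously against the live systems and surface the discrepancies on a dashboard rather than in a console.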

Step 5: Invest in Process Review and Cross-functional Training

Cultural change is a key step in breaking down the silos – all stakeholders must be fully committed to a successful result from the very beginning.

It is extremely important to share the company objectives with both management and operational teams. Management can define the strategic and economic advantages, but only the people involved in data management can identify the real bottlenecks that the solution should remove.

Involving the teams in requirements gathering and organising cross-functional workshops and training throughout the programme can improve the quality of the new processes and tools. More importantly, it can also break down the cultural silos together with the data ones, reducing resistance to change and building relationships and corporate culture.

Conclusion

Companies can unlock their data’s full potential, optimise internal processes, and improve decision-making by breaking down the data silos and allowing data to flow freely. An advanced data reporting and visualisation solution is a key step in this process, alongside cultural and structural change.

Breaking down the data silos in Life Sciences companies can result in streamlined delivery of drugs to the market and more efficient work across all departments, benefiting patients and allowing companies to safeguard their health more effectively.

Register now to get free access to the webinar recording and learn more about breaking down the silos in Life Sciences companies.

Navigating Business in Times of COVID-19: Interview with Paolo Morelli

The COVID-19 pandemic forced companies to explore how to survive while ensuring business continuity and protecting the health of their employees. We talked to Paolo Morelli, CEO of PM Holding (Arithmos, CROS NT, seQure), to understand what business owners can do to mitigate the impact of the pandemic on their operations, ensure business continuity, and deliver better outcomes in times of crisis.

What were your initial steps when the outbreak started?

We had two major concerns when the outbreak started: the safety of our team, and business continuity.

To deal with these concerns, we formed a risk management team and carried out an operational and commercial revision. The former was necessary to mitigate the risks for our clients and employees, while the latter allowed us to adjust the way our company interacts with the market.

The operational and commercial revision was a way to quickly reassess the situation from a business point of view. COVID-19 changed the way the Life Sciences industry operates, so we needed to evaluate the regulatory aspects of the new situation and understand the new operational constraints. For example, we had to assess how to conduct GCP audits under travel restrictions, or how to do on-site monitoring. All of these questions needed urgent answers if we wanted to continue delivering services to our clients.

How did the transition to the home office go?

Smoothly. Over the last five years we have created a structure that allowed our teams to move to their home offices without any obstacles. All the companies across PM Holding have a very clear organisational structure. Each company has a managing director, line managers, and team leads who work closely with their units and have clear work objectives. We have recurring meetings and reviews in place to ensure that projects are delivered on time, and many of our colleagues already had experience working from home.

All these factors, combined with a reliable IT infrastructure and underpinned by our company values – trust and communication – allowed us to switch to home office working in the blink of an eye.

What is the biggest challenge in managing a business in pandemic times?

I believe that regardless of the industry, the most challenging part is change management. All of us are more comfortable doing things in a familiar way, and we can struggle when we need to adopt new paradigms, as some people are very risk averse.

However, times like the COVID-19 pandemic require us to be flexible and to adopt a smart strategy effectively.

Are you happy with the way PM Holding companies are coping with the COVID-19 challenge?

Yes. We have worked hard to build a company culture based on trust, innovation, and communication, and I now see very clearly the results of these efforts and investments.

What have we learnt?

First of all, we now have proof that our processes and internal structure are really effective. In calm periods these processes, rules, and procedures may seem unnecessary and too strict, but in a crisis like the COVID-19 pandemic they turned out to be hugely advantageous. They have allowed us to provide a business-as-usual approach to our clients and ensure a smooth transition to the home office for our teams.

Secondly, we have seen the importance of “glocal” management. Thinking globally and acting locally allows us to have a global strategy and a global mindset while, at the same time, considering local events and staying as close to our clients and employees as possible.

How can businesses deliver better outcomes in times of a crisis?

There are three tips that I would like to share:

  1. Analyse the company organisation and understand the new risks
  2. Define the risk mitigation plan and make the necessary changes in the company organisation
  3. Identify opportunities that the crisis may give rise to

I also can’t emphasise enough the importance of innovation. Innovation needs to be part of the company culture, as this is one of the only ways to find new solutions, take risks, and cope with adverse circumstances successfully. I am not talking only about technology here – innovation is a mindset! We need to develop this mindset and bring the younger generation into our businesses, as they are like sponges: they take on board so much information, have open minds, and are able to think outside the box.

What do you think will be the impact of COVID-19 on Life Sciences?

Before COVID-19, working from home in our industry was the exception rather than the rule; in the future, it may be the other way around. I also think there will be much more focus on decentralised clinical trials and on further enhancing technology, with buy-in from both sponsors and patients.


Q&A: Kairos – Data Analytics and Reporting Solution

Last year, Arithmos, a provider of innovative IT solutions with a specific focus on Life Sciences, announced a partnership agreement with E-project, a consulting and system integration company. Together they have brought to market Kairos, a unique data analytics and reporting solution for pharmacovigilance.

Kairos gives Life Science companies full visibility of their safety data through real-time reports and dashboards, makes regulatory compliance easy to ensure, and creates value from the advanced analysis of safety data.

We have talked to Silvia Gabanti, Arithmos’ Managing Director, to learn more about this new solution.

How can Kairos improve the pharmacovigilance processes and ease the life of the pharmacovigilance team?

First of all, Kairos produces PSURs/PBRERs and other safety reports automatically, in a simple and error-free manner, eliminating the factor of human error.

Secondly, it helps optimise the performance of the pharmacovigilance department by creating reports to measure KPIs.

Thirdly, it removes the silos between the pharmacovigilance, regulatory, and clinical departments by supporting data reconciliation in the most effective way.

Can other departments use Kairos?

Absolutely, and this makes Kairos unique. It helps to make sense of data regardless of its origin. Now Life Science companies gather enormous amounts of data from different sources, like EDCs and safety systems, and mostly use it to ensure compliance. However, data is much more than this.

Data analysis optimises processes while reducing the team’s effort and the gap between operational units. This helps to make faster and safer decisions and improves business efficiency. Life Science companies can grow their business and act proactively, not only reactively.

Although Kairos was born as a pharmacovigilance reporting and data analytics solution, it can also be used horizontally across the company. Departments from Clinical to Marketing can use it to analyse their data, create reports, and share them easily.

Kairos allows you to break down the silos between the departments.

What makes Kairos a unique solution on the market?

Kairos is the result of a synergy between teams with a decade of experience in business intelligence, data analytics, pharmacovigilance, and Life Sciences in general. It is more than just a piece of technology. Kairos includes the solution itself, consultancy, report building, and fine tuning.

Such a combination makes Kairos a powerful tool that comes with low license and maintenance costs.

How can you describe Kairos’ price model?

It is scalable. You start with a minimum investment in licenses and configuration. If you would like to add new reports or extend Kairos to other departments, you can easily do so.

Can Kairos be customised?

Yes, absolutely. The whole system is customisable. The first thing we do is understand our clients’ business needs and the type of reporting required. Then we customise Kairos based on these expectations.

The next step is to give the clients detailed training. We want them to be completely independent and able to change the format, design, or data themselves to fit their organisation.

How does Kairos fit the bigger Arithmos goals?

We strive to provide our clients with 360-degree support on the road to Digital Transformation, and Kairos is another step towards this goal. It allows Arithmos to give more support to our clients.

We also believe in a proactive approach to safety and regulatory compliance, and Kairos contributes to it by harnessing the power of data.

Last but not least, we champion efficiency in operations and processes. Introducing a tool that operates horizontally rather than vertically saves on IT investments and optimises the departments’ work by breaking down the silos.

Want to learn more about Kairos? Contact Arithmos now to see the demo of the solution.

How to Perform a Successful Data Migration in Life Sciences

Data Migration is defined as the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another.

Data migration has a reputation for being risky and difficult, and it is certainly not an easy process. It is time-consuming, with many planning and implementation steps, and there is always risk involved in projects of this magnitude.

Without a sufficient understanding of both source and target, transferring data into a more sophisticated application will amplify the negative impact of any incorrect or irrelevant data, perpetuate hidden legacy problems, and increase exposure to risk. A data migration project can be a challenge because administrators must maintain data integrity, time the project correctly to minimise the impact on the business, and keep an eye on costs.

However, following a structured methodology will reduce the pain of managing complex data migration.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar was conducted by Alessandro Longoni, Arithmos Senior Project Manager & Data Analyst, and focused on the challenges in data migration and ways of overcoming them.

Continue reading to learn tips for performing a successful data migration in Life Sciences.

Register now to get free access to the webinar recording.

Tip 1 – Understanding the data

Before starting the data migration, you have to prepare your data: carry out an assessment of what is present in the legacy system, understand clearly which data needs to be migrated, avoid duplication, and promote quality and standardization.

We can divide the assessment of the legacy system into two macro categories:

  • Assessment of the data meaning
  • Assessment of the data quality

Every piece of data that you move has to be validated, cleaned, and transformed. In data migration projects, migrating only relevant data ensures efficiency and cost control.

Understanding how the source data will be used in the target system is necessary for defining what to migrate. It is important to look at how people use the existing data elements. Are specific fields used in a variety of different ways that need to be considered when mapping out the new system’s fields?

The second macro area is the assessment of data quality. It is very important to define a process to measure data quality early in the project, in order to obtain details of each single piece of data and identify unused fields or obsolete records that may have undergone multiple migrations. It is also important to avoid migrating duplicate or irrelevant records.

The quality analysis typically leads to a data cleaning activity.

Cleaning the source data is a key element in reducing the data migration effort. This is usually the client’s responsibility. However, the client can be supported by the provider, who can perform specific data extractions, aggregations, and normalizations to reduce the client’s effort.

Another way to clean data is to adopt migration scripts for cleaning purposes. It is important to understand that this kind of activity can create validation issues: modifying the source data leaves the perimeter of a pure data migration and creates potential data integrity issues.
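As an illustration, here is a minimal sketch (in Python with pandas, with hypothetical column names) of cleaning rules applied to a legacy extract; any such transformation modifies the source data and would therefore need to be documented and validated as part of the migration:

```python
import pandas as pd

def clean_legacy_extract(df: pd.DataFrame) -> pd.DataFrame:
    """Illustrative cleaning rules applied to a legacy extract before loading."""
    df = df.copy()
    # Normalise free-text fields: trim whitespace, collapse case variants.
    df["country"] = df["country"].str.strip().str.upper()
    # Normalise legacy day-first date strings to ISO dates; invalid values become NaT.
    df["onset_date"] = pd.to_datetime(df["onset_date"], dayfirst=True,
                                      errors="coerce").dt.date
    # Drop exact duplicate records instead of migrating them.
    return df.drop_duplicates()

legacy = pd.DataFrame({
    "country": [" italy", "ITALY ", "france"],
    "onset_date": ["14/05/2020", "14/05/2020", "20/05/2020"],
})
print(clean_legacy_extract(legacy))  # two rows remain after deduplication
```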

Tip 2 – Project Governance

The best approach to a data migration project is to clearly define roles and responsibilities and avoid overlapping accountability. This can be done in several steps:

  • Define the owner of the data in the target system
  • Include the business users in decision-making. They understand the history, the structure, and the meaning of the source data. Additionally, if business users are engaged in the data migration project, it will be easier for them to interact with migrated cases after go-live.
  • Rely on data migration experts, who can bridge the gap between business and IT – both are key stakeholders, but they are often unable to understand each other.

Based on our experience, what makes a difference is the presence of a business analyst. This is a person who acts as a bridge between the technical staff involved in the implementation of the migration and the businesspeople. The business analyst can explain technical requirements in a clear way, which really helps the business define the migration rules based on how the target system will use the migrated data.

Tip 3 – Roll Back & Dry Run

A roll-back strategy has to be put in place to mitigate the risk of potential failures. Access to the source data has to be read-only: this prevents any kind of data modification and ensures its integrity. Backups have to be performed on the target system so that it can be restored in case of failure.

An accurate data migration dry run allows validation and production migrations to be executed without incidents or deviations. Procedures and processes have to be tested to check the completeness of the records and to ensure integrity and authenticity in accordance with the data migration rules and purposes.

Tip 4 – The Importance of the Data Mapping Specification Document

The Data Mapping Specifications document is the core of data migration. It ensures complete field mapping and is used to collect all mapping rules and exceptions.

This project phase is usually long and tiring for a number of reasons:

  • The volume and amount of data details
  • A technical activity with technical documents
  • Limited knowledge of the dynamics of the target database
  • Compromises that have to be made

The Data Mapping Specifications document details all the rules related to the data being migrated. The following tips can help you write it in the most efficient way (a minimal sketch of such mapping rules follows the list):

  • Clarify what has to be migrated and what shouldn’t be migrated
  • Clean the source data – this will reduce the number of fields to migrate
  • Liaise with a business analyst who will translate the technical requirements and help explain how the data will work in the target system
  • Rely on data migration experts who have already performed similar data migrations in the past
  • Avoid using an Excel sheet for mapping table fields – a more visual document with pictures will ease the conversation with the business users
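Here is a minimal sketch of how such mapping rules might be captured in executable form (plain Python, with entirely hypothetical legacy and target field names):

```python
# Hypothetical excerpt of a data mapping specification, expressed as
# declarative rules: source field, target field, optional transform.
FIELD_MAP = [
    {"source": "PT_INITS",   "target": "patient_initials", "transform": str.upper},
    {"source": "SERIOUSNSS", "target": "serious",          "transform": lambda v: v == "Y"},
    {"source": "NARRATIVE",  "target": "narrative",        "transform": None},
]

def apply_mapping(legacy_record: dict) -> dict:
    """Apply the mapping rules to one legacy record; unmapped fields are dropped."""
    target = {}
    for rule in FIELD_MAP:
        value = legacy_record.get(rule["source"])
        if rule["transform"] is not None and value is not None:
            value = rule["transform"](value)
        target[rule["target"]] = value
    return target

print(apply_mapping({"PT_INITS": "ab", "SERIOUSNSS": "Y", "UNUSED": "x"}))
# {'patient_initials': 'AB', 'serious': True, 'narrative': None}
```

Keeping the rules in one declarative table mirrors the role of the specification document itself: every rule and exception lives in a single reviewable place.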

Tip 5 – Perform comprehensive validation testing

To ensure that the goals of the data migration strategy are achieved, a company needs to develop a solid data migration verification process. The verification strategy needs to include ways to prove that the migration was completed successfully and that data integrity was maintained.

Tools or techniques that can be used for data migration verification include source vs target data integrity checks. A combination of manual and automatic checks can be used to cover all verification needs. The verification can include qualitative and quantitative checks.

To carry out data verification, the migrated data must be processed through a workflow. This ensures that the migrated data properly interacts with the target system’s functionalities.

The sampling strategy is defined in the Data Migration Plan and should be driven by the risk impact of the data. 100% sampling is neither feasible nor adequate; therefore, standards such as ANSI/AQL are usually used to define a risk-based sampling strategy.
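Here is a minimal sketch (plain Python; the sampling fraction is hypothetical, as a real one would come from the risk assessment and an AQL table in the Data Migration Plan) combining the two kinds of check: a 100% record-count reconciliation plus a risk-based sample for manual review:

```python
import random

def verify_migration(source_ids: set, target_ids: set, sample_fraction: float) -> list:
    """Quantitative check: every record is counted; qualitative check:
    a risk-based sample is drawn for manual field-by-field review."""
    missing = source_ids - target_ids      # in source but never migrated
    unexpected = target_ids - source_ids   # in target without a source record
    if missing or unexpected:
        raise ValueError(f"count mismatch: missing={missing}, unexpected={unexpected}")
    sample_size = max(1, round(len(target_ids) * sample_fraction))
    return random.sample(sorted(target_ids), sample_size)

# Usage: counts must match exactly; 5% (hypothetical) go to manual review.
cases = {"C-0001", "C-0002", "C-0003", "C-0004"}
print(verify_migration(cases, set(cases), 0.05))
```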

This article is based on a webinar presented by Alessandro Longoni on May 14. Register today to get access to the webinar recording.

About Alessandro Longoni, Senior Project Manager & Business Analyst

Alessandro Longoni is a Senior Business Analyst & Project Manager at Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Learn more about Alessandro Longoni, his career path and current role in Arithmos in our “Career Insights” section.

Interview with Paolo Morelli: Artificial Intelligence in Clinical Research

On the 20th of May, Paolo Morelli, CEO of Arithmos, joined the Scientific Board of Italian ePharma Day 2020 to discuss the growing role of new technologies in clinical trials. We took this opportunity to talk to him about one of the most debated technologies of the last few years – Artificial Intelligence (AI) – to understand how it is currently used in the industry and what future it has.

There are so many discussions around AI right now. What are the biggest doubts related to it?

There are a number of questions that we are asking ourselves now. Here are just a few examples:

  • Will AI change and automate processes in clinical research, making it more efficient?
  • Will researchers have to change the way they do their work?
  • Will pharma companies have to reorganize their R&D departments?
  • Will the approach to the training of clinical research personnel change?
  • Will patient privacy be protected?

Companies performing clinical trials are certainly more and more interested in AI. They want higher-quality data, faster recruitment, and more efficiency in general. Also, when it comes to the financial side of the projects, the journey has just begun, and a lot is yet to come.

What does the notion of AI encompass?

I suggest we start with what is not AI. For example, automation is not AI. Automation is the use of machines that follow a precise program defined by human intelligence. Artificial intelligence (AI), sometimes called machine intelligence, stands in contrast to human intelligence (HI). We can define AI as any device that collects information and takes actions that maximise its chance of successfully achieving its goals.

Is AI already used in clinical trials?

Yes, I can outline the two biggest areas where it is already in use. The first is the use of AI over eSource/mSource data to support decision-making (for example, feasibility and communication with the patient); the second is the use of AI to reduce data entry and the burden of paper documentation (for example, eConsent, regulatory document completion, and SDV).

How will the use of AI be expanded in the future? Where else can we apply AI in clinical trials?

AI will certainly allow continuous communication with the patient, even though the human touch cannot be replaced by technology. I also believe that, at some point, SDV will be fully automated thanks to NLP technologies. With a proper training data set, AI will be more successful in detecting data issues than human intelligence.

But even if patients and the industry are typically open to innovative solutions, regulators will play a big role in the spread of innovation, and they can support it by setting up a modern regulatory framework.

Once AI becomes an integral part of the clinical trials industry, what will be the potential benefits for patients?


The combination of AI with scientific content will support better communication with the patient: eConsent allows a better understanding of the texts compared to the paper version of the informed consent. AI will answer questions that subjects may have about the study, and patients will be better informed about how their data is treated. I believe that we will see a bigger presence and influence of patients in clinical research. It will be important, though, to overcome the privacy challenges before introducing the technology, and to inform patients about the use of all their data.

Do you think AI will replace human professionals in the clinical trials of the future?

The omnipresence of overly abundant data (both clinical data and project data) can be successfully triaged and managed with the assistance of AI. And yet, it is human intelligence (HI) that is needed to make the final decision from the options offered. Think about our air traffic control system: highly automated, advanced technology, and yet there remains the need for HI to make the final decisions regarding all the simultaneous active flights (projects).

What are the aspects that can determine the success of AI in clinical trials?

There are three main aspects that will determine the success of AI in Clinical Trials. Availability of the technology is not one of them.

One critical aspect is the availability of System Integrators as a bridge between technology and researchers. Their goal will be to understand clinical processes and to shape all this new technology around the clinical trial activities.

The second important aspect will be the ethics, morals, and governance around AI and ML. Regulators and Life Science companies will need to set up a framework before going too far down the road of AI.

And the last thing is data. AI needs data – accurate data, and lots of it. We need to start collecting organised data and training AI before it can successfully achieve the desired result of making clinical trials faster, more efficient, and of higher quality.

At Arithmos, have you already started exploring such innovative technologies?

Arithmos, like the other PM Holding companies, has innovation as a core value; it is at the heart of everything we do. Looking at the impact of innovative technology on the Life Sciences industry, I believe that investing in this field was the right choice.

At Arithmos we are already supporting our clients with digital technologies that optimise clinical trial processes and increase patient centricity, ensuring that patients take an active role in clinical trials.

Would you like to learn how Arithmos can support you in innovating your clinical trials? Contact us by clicking here.


Webinar Q&A: How to Perform a Successful Data Migration in Life Sciences

In the Life Sciences industry, a structured but lean approach to data migration is vital for success. Such an approach not only ensures data integrity and consistency, but also minimises the impact of the data migration on day-to-day business operations.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar described the main risks and challenges of data migration and ways to overcome them, and presented a case study of data migration in pharmacovigilance.

The webinar was conducted by Alessandro Longoni, a Senior Business Analyst & Project Manager at Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Continue reading to discover the questions from the webinar’s Q&A session.

Register today to get access to the webinar recording

During a pharmacovigilance migration, should only the last version of the cases be migrated, or should we migrate all the versions?

It depends on the legacy and the target system. Some classical safety systems save each version of each case as an independent case.

If you use an ETL tool to migrate to a new system that does not operate according to this logic, we advise migrating only the last version of each case in order to avoid duplicate cases. It is also advisable to migrate the case history (case workflow) if possible.

However, in all cases it is technically possible to migrate all the versions of a case if you find the right workaround, for example by using the following naming format: <case_number>_<caseversion>.

The best practice in data migration is adopting the E2B XML approach – regardless of the number of case versions, only the last version of each case is exported from the legacy system.
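The following sketch (plain Python, with hypothetical case numbers) illustrates both options mentioned above: keeping only the latest version per case, or preserving every version under a <case_number>_<caseversion> identifier:

```python
# Hypothetical (case_number, case_version) pairs exported from a legacy
# safety system that stores every version as an independent case.
versions = [("IT-0001", 1), ("IT-0001", 2), ("IT-0001", 3), ("IT-0002", 1)]

# Option 1: keep only the latest version of each case to avoid duplicates.
latest = {}
for case_number, case_version in versions:
    latest[case_number] = max(latest.get(case_number, 0), case_version)
print(latest)    # {'IT-0001': 3, 'IT-0002': 1}

# Option 2: preserve every version under a unique identifier.
all_ids = [f"{case_number}_{case_version}" for case_number, case_version in versions]
print(all_ids)   # ['IT-0001_1', 'IT-0001_2', 'IT-0001_3', 'IT-0002_1']
```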

What steps can be omitted in a semi-automated data migration compared to an automated one?

Data migration, regardless of the approach used (ETL, XML import, re-entry), must be included in a validation exercise.

The main steps and deliverables are the same regardless of the approach. However, the documents and activities in semi-automated and automated data migration projects require different levels of accuracy and effort:

  • Data Migration Plan: it should include the strategy of the migration project and the main rules for data migration. This covers the mapping needs for re-entry; in the case of an ETL tool, a dedicated data mapping document is highly recommended.

In the case of an XML migration, the format depends on the number of cases and on the complexity (due to the differences between the configurations of the target and legacy systems). For simple migrations with a low number of cases, the data migration plan may be sufficient; for complex migrations with a high number of cases, we suggest a dedicated data mapping document.

  • Data Mapping Specifications: see above
  • Data Migration Validation: a protocol that must be executed in the validation environment to verify the migration process. It is more extensive in the case of ETL, but we suggest using it regardless of the data migration approach.
  • Data Migration Production: the execution of the protocol described above, conducted to migrate data in production. With an ETL approach, this phase requires 2-5 days of technical activities plus 1-2 days of business data verification (a production downtime period). With an XML or re-entry process, this phase can take up to several months of business work (with no production downtime period).
  • Data Migration Report

When you access the source data in read-only mode, is it possible to also access the files attached to the ICSR such as publication (in PDF format) or ACK from Regulatory Authorities (in XML format)? / How can we migrate attachments from one system to another? Should the client send the attachments separately from the XML files if the migration is performed by import/export migration?

Yes, it is possible. Of course, this depends heavily on the structure of the legacy system’s database. Sometimes technology constraints don’t allow you to extract particular information, such as attachments or the audit trail.

If the legacy system stores the attachments in the database with proper references to the relevant case, and if the target system supports the case attachment functionality, the best approach would be to migrate them using the ETL tool.

If I have an E2B XML file as the source of a case recorded in an Excel database, can I use it as the source for the migration into the new database, in order to avoid migrating manually from the old database?

The standard approach to data migration is migrating data from the legacy system to the target system, not from the source to the target. Nevertheless, a different approach can be chosen if it grants better data quality (as in the question) and efficiency. This approach must be described and justified in the Data Migration Plan. Also, we advise carrying out a final verification to detect any discrepancy between the migrated data and the data stored in the legacy system.

How can you calculate the number of records from the source and target systems that need to be validated?

An ETL tool can reduce the need for data verification, especially because a migration script works on specific fields independently of the number of records to be migrated. For example, if my script correctly moves the laboratory data, the number of tests executed by a specific patient is not relevant. This ensures the quality of the migrated data.

However, a minimum quality check of the fields has to be performed, based on the recommendations of your CSV manager and on the risk assessment.

A numeric check of 100% of the records moved is always required, to ensure that there were no exceptions during the script run.

This article is based on a webinar presented by Alessandro Longoni on 14th May 2020. Register today to get access to the webinar recording.

Webinar Q&A: Agatha – Quality and Content Management Solution

Biotechnology, pharmaceutical, and medical device companies develop highly complex products. For them, accuracy, consistency, efficiency, and quality are not goals; they are imperatives, because the medicines, therapies, and devices they make can improve the quality of patients’ lives and safeguard their health.

On April 17, Arithmos and its partner Agatha Inc. hosted a complimentary webinar on Agatha – a cloud-based quality and content management tool for Life Sciences and healthcare organizations. Agatha addresses these imperatives for core business processes like managing clinical trials, optimising quality processes, and organising regulatory submissions. The system is highly configurable, allowing for tailored customisations that fit the company’s workflow.

The webinar was conducted by:

  • Silvia Gabanti, Managing Director of Arithmos. Silvia has an impressive 15-year track record in the industry. She began her career in the CRO environment, where she developed experience in applications for pharmacovigilance and clinical trials. She also has extensive experience with Oracle applications – as an analyst she worked with pharmacovigilance and clinical systems like Oracle Clinical.
  • Guillaume Gerard, Chief Operating Officer of Agatha Inc. For the past 15 years Guillaume has been helping Life Sciences organizations worldwide deliver cloud-based content management applications. His areas of expertise include the compliance aspects and architecture of such systems, as well as functional expertise on the clinical document management side (Trial Master File) and Quality Management.

Continue reading to discover the questions from the webinar’s Q&A session.

Register today to get access to the webinar recording

What is Agatha’s policy for data backup?

Agatha handles all aspects of the data center operations, including regular backups of all data. All customer data is backed up daily, and daily backups are retained for 15 days.

It also provides customers with the ability to export all their data (files, audit logs, metadata) at any time.

Is Agatha compliant with GDPR?

GDPR is a comprehensive privacy regulation. Agatha is fully compliant with its requirements regarding how it stores data and collects information. Some aspects of GDPR remain the responsibility of the business entity, for example naming a Privacy Officer.

Are document lifecycle statuses customizable?

Yes, lifecycle stages and other aspects of the review and approval workflows can be fully configured. That includes changing labels, adding steps, reordering steps and changing workflows from serial to parallel structures.

Are electronic signatures compliant with 21 CFR Part 11?

Yes, electronic signatures within Agatha are fully compliant with health authority requirements. Beyond signatures, Agatha is completely compliant with the FDA’s 21 CFR Part 11. We maintain a 21 CFR Part 11 compliance checklist for every release of our service, and any customer can receive a compliance letter that can be used during audits.

Are activities tracked into an audit trail?

Yes, all activities are captured. This includes operations on documents as well as access and any change of settings. When an audit is performed, everything that has been collected in the audit trail process and stored can be provided directly to the auditor via a login with auditor access.

Is it possible to load data via an API from external applications (e.g. eTMF, CTMS, EDC, …), and if yes, what kind of data can be loaded – only documents, or also XML, JSON, and XLS files?

It is possible – a full set of APIs is available that allows data to be loaded and external applications to be integrated.

Can documents have an expiration date?

Yes, documents can have an expiration date. This is handled with a “valid until” metadata field, and processes can be triggered based on that metadata.

Could Agatha be used during the product development stages?

Absolutely. Many aspects of the Agatha solution are appropriate prior to the clinical trial phase, for example management of SOPs and collection of regulatory documentation.

Can Agatha manage the migration of legacy documents?

Yes, most projects include migration steps. There is an import tool that lets the client map data from a source system to Agatha and complete the import.

Is it possible to enable two-factor authentication (e.g. username & password plus SMS) in Agatha application?

Yes, two-factor authentication is the standard model for the use of Agatha. Two-factor authentication is enabled on a per-client basis.

Are there any penetration tests that could be shared with customers?

Yes, as part of the Agatha hosting service, penetration tests are conducted yearly. Reports and results are made available to clients during audits.

During the webinar it was mentioned that Agatha is cost effective. How does its price compare to the competition?

Agatha provides the best value among similar products because, as a ready-to-use and pre-validated system, it is less expensive to bring into production and to onboard users. It also has lower subscription prices, because it does not operate on top of another product platform, and it offers truly packaged modules, as shown during the webinar. Typically, it is twice as affordable as the competition.

If you are interested in a specific quote, contact us for a brief conversation. We will make sure we understand the specific functionality you need and the number of users, and provide the best offer.

Clinical Oversight Solution: How to Select the Right Vendor

Introduction

In June 2017, ICH GCP E6 (R2) entered into force, introducing new guidelines and increasing the responsibility of the Sponsor when it comes to outsourcing activities to CROs.

 “The sponsor should ensure oversight of any trial-related duties and functions carried out on its behalf, including trial-related duties and functions that are subcontracted to another party by the sponsor’s contracted CRO(s).”

Clinical Trial Oversight – 5.2.2. Addendum

After ICH GCP E6 (R2) entered into force, sponsors faced the following obligations:

  • Checking whether the quality requirements agreed with the CROs have been met
  • Confirming whether the project execution by the CRO is aligned with expectations
  • Implementing a risk-based QMS throughout the clinical trial

The reasons that led the authorities to update the ICH GCP regulation are clear: on one side, the increasing complexity, scale, and overall costs of clinical trials; on the other, the strong shift from a paper-based clinical trial process to a digital one.

Challenges in ensuring oversight

Although CRO oversight is now considered a top priority for sponsors, designing and implementing an oversight process is extremely challenging. Introducing a technology solution is the most efficient way to ensure oversight, given the amount and complexity of data that needs to be analysed.

In order to perform oversight efficiently, this solution needs to:

  • Give a global overview of all the company’s vendors involved in each study
  • Convert non-homogeneous data into a homogeneous format that allows easy comparison, control, and performance analysis – for example, visit frequency analysis

As very few sponsors already possess such a solution, it usually needs to be acquired externally through a vendor selection process.

Technologies for ensuring oversight

There are a number of tools that sponsors can adopt as part of their oversight strategy. The two most widespread are the Clinical Trial Management System (CTMS) and the dedicated oversight solution.

CTMS

A CTMS is a type of software that merges data from different sources and allows sponsors to analyse it. It is one of the most common tools on the clinical market and has planning and reporting functions that include participant contact information, deadlines, and milestones.

The main CTMS constraint is its low level of customisation, due to limited built-in customisation capabilities. With each CRO having its own report template, it is difficult to accommodate this variety with one CTMS, so only a limited number of studies and CROs can be overseen with it.

Additionally, the CTMS was born as a clinical tool, which limits the inclusion of other data, such as procurement and pharmacovigilance data.

Clinical Oversight solution

A clinical oversight solution is another option that can be used by a sponsor in order to comply with the Addendum of ICH GCP E6 (R2).

An oversight solution is more than software; it is a complex package that includes the following elements:

  • Business analysis activities to understand the needs of the sponsor and review the types of data submitted by CROs
  • Software that covers the clinical, procurement, and pharmacovigilance aspects of the study
  • Maintenance services and configuration of the reports of new CROs

An oversight solution allows the creation of multiple dashboards and can thus accommodate different formats of incoming data, which makes it a perfect tool for sponsors with multiple CROs and multiple studies.

Being more complex than a CTMS, an oversight solution also comes at a higher price.

In this article we focus on the selection of a vendor for an oversight solution, as it is more flexible and allows the inclusion of a larger number of CROs and studies.

Oversight solution: vendor selection

Step 1: Kick Off Meeting

During this step the sponsor plans the project activities and defines roles and responsibilities, the timeline, budget restrictions, and the expected outputs. It is critical to include the following participants, who will be the key figures in the process:

  • Project manager
  • Business owner
  • IT team representative (system owner)
  • Key user that uses the data provided by the CROs; for example, a data manager

Step 2: Defining Business Requirements

The sponsor should list the main requirements for the oversight system. The best approach is to start with the higher-level expectations and define only 5-7 main points. These help to understand the key areas and to create a shortlist of the solutions available on the market.

Step 3: Vendor Selection and Assessment

At this step the sponsor works with the previously compiled shortlist of oversight solution vendors. Normally, the shortlist includes 3-5 solutions. The following actions will help to filter out the least suitable vendors and identify the main candidates:

  • Send the vendor candidates a survey that gathers information on the size of the company, its capabilities, and its ISO certificates
  • Set up a demo appointment and share the list of oversight requirements during the demo; the advised duration is 1-1.5 hours
  • Request a ballpark estimate
  • Create an evaluation matrix based on the survey and demo outcomes, in order to understand which vendor scores highest (a minimal sketch of such a matrix follows this list)
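Here is a minimal sketch of such an evaluation matrix (plain Python; the criteria, weights, and scores are entirely illustrative and would in practice come from the survey and demo outcomes):

```python
# Hypothetical weighted evaluation matrix: weights reflect the business
# requirements from Step 2; scores (1-5) come from the survey and the demo.
WEIGHTS = {"functional_fit": 0.40, "flexibility": 0.25, "compliance": 0.20, "cost": 0.15}

vendors = {
    "Vendor A": {"functional_fit": 4, "flexibility": 5, "compliance": 4, "cost": 3},
    "Vendor B": {"functional_fit": 5, "flexibility": 3, "compliance": 5, "cost": 2},
    "Vendor C": {"functional_fit": 3, "flexibility": 4, "compliance": 4, "cost": 5},
}

totals = {name: sum(scores[c] * w for c, w in WEIGHTS.items())
          for name, scores in vendors.items()}
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {total:.2f}")   # highest score first
```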

When it comes to requesting the ballpark estimate, it is vital to keep the following in mind:

  • All the potential vendors should receive the same description with the same requirements. The more precise and coherent the request, the more comparable the different proposals will be.
  • The more flexible the solution the vendor offers, the fewer technical adjustments will be needed, allowing the sponsor to lower the expenses.
  • Aside from the main ballpark estimate, it is important to keep in mind additional costs, such as CRO onboarding and the personalisation of data dashboards.

Step 4: Vendor Confirmation

Once the vendor is chosen, the sponsor conducts an audit and verifies the vendor’s compliance and internal processes. This can either be done directly by the sponsor or outsourced to an external consultant.

In case of a successful audit, the collaboration with the vendor is formalized with an appropriate contract.

Conclusion

Each of the four steps is very important for making the right choice and ensuring compliance. However, the most critical step is defining the requisites and the expected results. A correct definition at this stage is the secret to successful oversight and smooth collaboration.

Sponsors need to make sure that they involve all the stakeholders in the definition of the requisites and avoid silos between the departments. If key stakeholders such as the business department and the IT team are interested in different characteristics and have different expectations, the sponsor needs to ensure that it shortlists solutions that are a good trade-off for both.

Given the increasing outsourcing trend in clinical trials, it is not surprising that ICH E6 (R2) was introduced, as it addresses the relations between sponsors and third parties in more detail. ICH E6 (R2) has changed the way information circulates between sponsors and CROs and has pushed companies towards adopting new technology to collect, organise, and share information in a more efficient and cost-effective way.

How can we support you?

Arithmos has extensive expertise in oversight vendor selection. We support you at every stage of the vendor selection process, from business analysis to solution integration. Our extensive knowledge of oversight allows us to choose and customise an oversight solution for you that guarantees efficiency and compliance.

Contact us to learn more about our oversight vendor selection support.

Career Insights: Interview with Michele Montanari, Life Science Applications Manager

This is the second blog in our ‘Career Insights’ series; we talk to Michele Montanari, Life Science Applications Manager, to find out more about his background, career path, and interests outside of work.


Have you always been passionate about the Life Sciences field?

I decided to work in the field of Life Sciences back in school, because I was so passionate about it. One of my first Christmas presents was a toy chemistry set!

I studied Chemistry and Pharmacy at university and then started looking for a job in the pharmaceutical industry. That was 14 years ago. I tried interviewing for Medical Representative positions, but it didn’t click. Then I switched to the clinical field, and it was a perfect match!

What has been your journey to your current role at Arithmos?

I have known Paolo Morelli, the CEO of Arithmos, for many years. I was inspired by Paolo’s idea to create an innovative company with an entrepreneurial spirit. Besides, I have always loved technology, so the idea of adding it to my profile was very appealing.

At Arithmos I started as a Project Manager, then became a Senior Project Manager, and last year I was promoted to Life Science Applications Manager.

What are your duties as Life Science Applications Manager?

I am responsible for supervising all the activities of the Life Science team, including managing the team and its resources, the quality delivery of projects, and adherence to deadlines. We implement existing applications, such as Symphony EDC and Argus BluePrint, carry out vendor selections, and support our clients with tasks such as data migration and system integration.

What career achievements are you most proud of?

Becoming a Life Science Applications Manager. It gives me an opportunity to learn every day. I get to work closely with clients and manage our portfolio. I wasn’t involved in these tasks so much before, and I love exploring these sides of Arithmos.

As Applications Manager I also act as a brand ambassador and represent Arithmos during meetings with clients and partners. I enjoy helping to shape our image and convey our values.

What most inspires you about working in this field?

Life Sciences is a very dynamic industry. At Arithmos I have the opportunity to use my creativity and suggest innovative solutions for our clients. Technology in Life Sciences has reached such a level of development that it allows me to go off the beaten path and think outside the box to solve tasks.

What would be your top tips for early career Life Sciences technology specialists looking to develop in this field?

I have two pieces of advice:

  • Invest in communication skills. Contrary to a widespread stereotype, working with technology doesn’t mean that you won’t talk to people. You will be in touch with your team and with clients. Besides, your interlocutors will not always be people who understand technical jargon; you will need to understand the profile of the person you are talking to before using specialised terms.
  • Always evaluate the risks. There are 10 ways to do the same project, at 10 different costs and with 10 different timelines. To ensure the best performance, you need to evaluate the risks of each option and choose the least risky one. Of course, you can also choose to disregard a risk, but it should be a conscious decision, and making it requires great skill.

What are your personal values?

I believe in respecting others, regardless of their age and position. For me this is the basis of any successful collaboration.

What do you love to do for fun?

I am a huge music fan. Not only do I listen to music, but I also play it. I am a member of a rock band, where I play the synthesizer. We write our own music and lyrics, and we have performed all around Italy and in European countries such as France, Switzerland, Austria, and Slovenia.

My second hobby is beer brewing. I am almost a full-fledged professional at it! I started 8 years ago, and I would say it was born from my passion for chemistry. There are a bunch of different beers in my portfolio, such as British, American, and Irish styles.

Arithmos careers

We are always looking for talented and motivated professionals ready to join us. Would you like to be part of our successful team? To find out more about our job openings click here.

Choosing the Right EDC: Advice from a Data Manager

How do you select the best Electronic Data Capture (EDC) system for your study? What are the must-have features, and what are the nice-to-haves? We asked Pedro M. Lledó, a Clinical Data Management professional with over 20 years of experience, to share his tips for choosing the right EDC based on its functionality.

Key factors to consider

When you are looking for a new EDC system for a study, you should consider the following key factors:

  • Vendor – find a reliable, seasoned company with years of experience. They know how to avoid the most common errors when it comes to selling and implementing an EDC.
  • Type of study – the EDC choice depends on the type of study and the type of users involved. You don’t always need the most expensive and powerful EDC on the market. For example, for Phase I or observational studies you shouldn’t look for the complexity of an eCRF designed for an oncology study.
  • System functionalities, including how the system is hosted (cloud or on-premises).

When choosing a system, make sure you involve all the teams you’ll need during the study. Talk to them in advance and let them know what the adoption of a new EDC system means. For instance, if you involve only the IT and DM departments, they may choose a system that does not support partial SDV. The Clinical Operations department will then learn about the system right before the study starts, and when they discover that partial SDV is not available, they will be concerned, as it is very important for them and for the sponsor.

The features listed below will ensure efficiency, data integrity, a smooth workflow, and your independence in training clinical personnel.

EDC Must-Haves

When you have identified several EDC candidates, check whether they meet the most important criteria. Your future EDC should be:

  • Compliant – it must be a system that follows the 21 CFR Part 11 rules, ensuring it is FDA and EMA compliant.
  • Internet browser agnostic – it should run in the most popular browsers.
  • Easy to program with different types of queries (intra-page, inter-page/visit) – it should be possible to address queries not only to the investigator site, but also to CRAs, DMs, and MMs.
  • Able to provide per-user access control – every user, depending on the assigned role(s), should have access only to the allowed data and actions.
  • Flexible – you should be able to create on-page queries and offline periodic listings and reports. This is important because you need to receive warnings at the datapoint where the problem is located. In addition, you need some standard periodic listings to group all the queries by type, page, module, etc.
  • Easy to monitor – this includes integrated reporting, standard study performance reports, KPIs, and metrics. You need to see how the people involved in data cleaning are performing.
  • Easy to master independently – alongside a user-friendly interface, it is important to also have online help resources.

EDC Nice-Haves

The functionalities listed above are critical for an efficient EDC that allows you to capture data in a smart way. However, if you would like to reduce the number of errors in your data, further boost data integrity, and ease the life of your Data Managers, I suggest looking for an EDC that allows you to:

  • Easily configure registers of data managers, CRAs, sites, and user accounts.
  • Carry out partial source data verification (SDV monitoring). This function allows you to save resources and focus only on the critical variables during the verification process.
  • Configure the query workflow. This gives you additional flexibility and control when it comes to defining the query workflow, thus reducing risks such as overridden queries.
  • Use it on portable devices with nice page rendering. A lot of medical personnel use mobile devices for activity control and CRF data entry.
  • Optimize data management with a dynamic CRF environment per protocol or patient data. In such an EDC, different visits, pages, or modules appear or are hidden dynamically in a patient’s CRF, and the CRF automatically adjusts the number of pages or modules to the amount of information.
  • Gather data from and share data with other systems, like eTMFs, CTMS, ePRO systems, patient wearable data management systems, PhV systems, BI tools, and reporting systems. This reduces the effort during data input and ensures data integrity.
  • Carry out training online through an e-Learning module.

Don’t forget to ask about the technology layer behind the system. Trust a reliable, performant database with a dynamic front end that loads and renders pages quickly.

Finally, do not rush! Take your time to find the appropriate EDC. It will later save you time, money, and stress.

Good luck!

For further information about choosing the right EDC and our proprietary EDC solutions please contact us at info@arithmostech.com or click here.

About Pedro M. Lledó

Pedro M. Lledó is a physician by training who has been participating in clinical research projects for more than 20 years as an IT specialist, database administrator, and clinical research data manager. Prior to joining Arithmos, he held leadership positions within CROs and biotechnology companies as Head of Data Management and Biometrics.
