Archive for the RESOURCES Category

Does your Clinical Technology Provider Have ISO 27001 Certification?

Webinar Q&A: Best Practices for Choosing the Right Safety Database

In a resource-constrained environment, pharmacovigilance departments should leverage technology in a highly efficient manner in order to achieve the best results for patients and for the business. This means focusing on such aspects of a safety system as the impact of continuously evolving regulations, the integration of new data sources, reliability, and scalability.

On the 11th of October, we hosted a complimentary webinar titled “Best Practices for Choosing the Right Safety Database”. During this webinar, we proposed a consistent and cost-effective approach to choosing the right safety database for small and medium-sized companies. We discussed how to balance the need to stay aligned with continuously evolving regulations against scalability and cost considerations, and shared best practices for choosing the solution that best fits a company’s needs.

Continue reading to discover the questions from the Q&A session of the webinar.

Register today to get access to the webinar recording

During the webinar you presented three safety database models: software to be installed from scratch, preconfigured solutions to be implemented in hosting or on premises, and SaaS solutions. Do you foresee a fourth model?

During the webinar we presented the main software supply approaches that we directly manage and encounter in the market. Nevertheless, every vendor and/or provider could define its own preferred market approach, which could be totally different from what we presented. It can also be a mix of the presented approaches.

For example, in our presentation we talked about SaaS as a model suitable for small and medium-sized companies, but the latest market trend is that some top safety database vendors now propose SaaS/cloud solutions for big companies as well.

We did not approach this topic in our presentation because moving to this type of database adds complexity to the vendor selection process and requires a selection approach different from the ones presented. Even if the technology is ready for this approach, the network of consultants, system integrators, and support companies is still immature, and these aspects should be carefully considered during vendor selection.

In a pre-configured system, is it still possible to have some customisation (and of course OQ testing on this specific customisation)?

A preconfigured system is usually chosen in order to reduce implementation costs and speed up the setup phase. Nevertheless, you could implement specific configurations, customisations, and integrations, considering the relevant costs for re-testing (if needed).

The key points to be considered are:

  • The impact of required changes. If a complete/massive re-testing of the solution is required (e.g. in the case of major workflow changes), an installation from scratch could be preferable
  • Avoid customisations. Customisations could provide adherence to your current requirements, but at the same time they can incur unpredictable costs in the future. Top market-leading software solutions are usually configurable and allow you to adapt the system’s functionalities to your needs without massive system changes (or integrations with external software or code).

In this manner you can mitigate the risk of losing your customisations or of generating data inconsistencies in case of future upgrades. In addition, configurable solutions can be easily changed to support evolving business and regulatory requirements.

You spoke about modular solutions supporting AI. Will this become a mainstream solution?

AI is definitely the future of safety databases. Pharmacovigilance is evolving, moving from a manual, reactive process to a proactive source of insight.

Robotic processes (Rule-Based/Static technologies) are already available and well established in safety databases.

Database vendors are nowadays upgrading their platforms with AI-based static technologies that include AI-informed components. These systems must be trained to imitate the human ability to sense, learn, deduce and communicate during the pre-release phase.

These technologies can be added to your application suite as standalone tools or as modules of integrated platforms, but the main software development trend is towards unified platforms with a unified architecture and data model, which reduces support and maintenance complexity.

But this is only the starting point. The software vendor market is strongly investing in R&D projects on AI-based dynamic systems (components that are AI-informed and capable of continuously adjusting their behaviour in the production phase as well, using a defined learning process).

The main obstacle to the widespread use of these systems is the regulatory environment: while rule-based/static technologies (RPA) follow the current system validation rules, AI-based static technologies require an extension of the current validation guidelines.

Vendors, companies and regulators as part of the same ecosystem should work together to streamline and speed up the process and to create guidelines tailored to the new technologies.

How should I approach the software selection process?

Successful technology vendor selection in the Life Science environment requires the right level of resource commitment, coordination with the business, and due diligence.

The choice of technology vendors for vigilance is a strategic matter, since complex data flows, decentralised environments, and the regulatory focus on data quality and integrity are expanding traditional roles and processes.

Companies can improve results and minimise risks through a structured vendor selection approach, which should start with a kick-off meeting.

The kick-off meeting aims at defining the project scope, the stakeholders, and the communication flow and tools, and at collecting available SOPs about the process. Following the kick-off meeting, you should define your project plan, with tasks and the timeline in which you want to complete the vendor selection.

Then, you should approach the elicitation phase. To collect the requirements, you should organise interviews with the main stakeholders.

  • To avoid delays, opt for interviews instead of asking stakeholders to draft documentation
  • Define an interview template document to collect information about the requirements of the main stakeholders
  • Take into account two strategies: the high-level strategy of your company and the IT strategy (for example, your IT department may want to install new systems on premises to reduce internal IT effort)
  • Conduct the interviews, filling in the template with the collected information

Before drafting your business requirements document, it is useful to translate the collected requirements into a high-level solution design, supported by a graphical representation of the desired system workflow.

The result of your analysis can then be presented to and approved by the main project stakeholders before drafting the business requirements and starting the selection process.

As soon as your workflow has been approved, finalise your business requirements document. This document represents the starting point of the vendor selection process. You can submit your requirements to multiple market vendors to identify who can satisfy them, and how.

Do the safety databases on the market have Artificial Intelligence functionalities? Is this a key driver for choosing a safety database?

Top market leaders provide systems that support small process automations, mainly in the case assessment, distribution, and reporting phases.

Before investing in AI or new automations, we suggest first trying to optimise the existing automations through proper system configuration and usage. Our suggestion is to choose a safety system based on its scalability.

“Nice to have” extensions include:

  • Case intake phase: Duplicate Check, Prioritisation/Triage
  • Case processing phase: Full Data Entry and Medical Assessment

Register today to get access to the recording of the “Best Practices for Choosing the Right Safety Database”

Do you have other questions related to choosing the right safety database for your pharmacovigilance department? Contact us and we will get back to you with an answer as soon as we can. 

Webinar Q&A: Digital Transformation in Pharma: Challenges and Enablers

Digital Transformation is the transformation of all activities, processes, and competencies in a way that allows us to harness the advantages of new digital technologies. Right now, pharmaceutical companies face a difficult choice: either evolve with the new era by building a digital organisation, or risk becoming less competitive on the market as they fail to embrace the change.

On the 21st of October, we hosted a complimentary webinar on Digital Transformation titled “Digital Transformation in Pharma: Challenges and Enablers”. During this webinar, we discussed the challenges that pharmaceutical companies face on the road to digital success and how to overcome them.

Continue reading to discover the questions from the Q&A session of the webinar.

Register today to get access to the webinar recording

Should Artificial Intelligence be included in the pharmacovigilance digital transformation plan? How should companies approach it?

Artificial Intelligence is a term generally used to identify different technologies such as natural language processing (NLP), natural language generation (NLG), machine learning (ML) and many others. In order to understand what AI in pharmacovigilance is and how to approach it, it is fundamental to distinguish between Robotic Process Automation (RPA) and real Artificial Intelligence (AI).

  • Robotic processes are tools based on a fixed/static set of rules. The logic is predefined by humans and given to the system to mimic human actions. Rule-Based/Static technologies are already available and well established.
  • AI refers to leveraging a machine’s capability to imitate the human ability to sense, learn, deduce and communicate. In this category we can distinguish between two types of AI:
    • AI-based “Static technologies”, which include components that are AI-based, but whose “training” is limited to the pre-release phase. Re-training of the model is limited to incidents/errors. These are emerging technologies.
    • AI-based “Dynamic Systems”, which include components that are AI-based and capable of continuously adjusting their behaviour in the production phase as well, using a defined learning process. These are “the technologies of the future”.

Keeping this in mind, a company should identify the areas in which automation delivers the greatest benefit relative to risk and effort. For this reason, pharmacovigilance process owners should approach artificial intelligence through a progressive adoption of new technologies, starting with process automation.

Starting from the best return-on-investment business cases in pharmacovigilance, which are the recommended areas for digital transformation?

The objective should be optimisation while keeping your eye on evolution and revolution. The first investments should go into keeping your safety systems up to date with current regulations. Another key point is to obtain the highest value from the information collected, through efficient reporting and analysis tools.

Considering that a leading market safety system already includes some process automations, mainly supporting the case assessment, distribution, and reporting phases, before investing in AI or new automations you should try to optimise the existing ones through proper system configuration and usage.

Taking into account all the details mentioned above, further investments should go into automating:

  • Case intake phase: Duplicate Check, Prioritisation/Triage can be considered good targets for automation
  • Case processing phase: Full Data Entry and Medical Assessment are time-consuming tasks with high automation benefit and a currently low level of automation.
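The rule-based automations mentioned above can be illustrated with a minimal sketch of a duplicate check at case intake: incoming cases are normalised into a comparison key and matched against cases already seen. The field names, case structure, and matching rule below are illustrative assumptions, not the logic of any specific safety database.

```python
# Minimal rule-based (RPA-style) duplicate check for case intake.
# Field names (patient_initials, event_date, product) are illustrative only.

def normalise(case):
    """Build a comparison key from a case record."""
    return (
        case["patient_initials"].strip().upper(),
        case["event_date"],
        case["product"].strip().lower(),
    )

def find_duplicates(cases):
    """Return pairs of case IDs that share the same comparison key."""
    seen = {}
    duplicates = []
    for case in cases:
        key = normalise(case)
        if key in seen:
            duplicates.append((seen[key], case["case_id"]))
        else:
            seen[key] = case["case_id"]
    return duplicates

cases = [
    {"case_id": "C-001", "patient_initials": "ab", "event_date": "2021-03-01", "product": "DrugX"},
    {"case_id": "C-002", "patient_initials": "AB", "event_date": "2021-03-01", "product": "drugx "},
    {"case_id": "C-003", "patient_initials": "CD", "event_date": "2021-03-02", "product": "DrugY"},
]
print(find_duplicates(cases))  # → [('C-001', 'C-002')]
```

A real triage step would use weighted or fuzzy matching across more fields; the point here is that the logic is fixed in advance by humans, which is what distinguishes RPA from AI.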

What is the weight of the lack of regulation, such as a specific mention in the GVP, on the use of AI in pharmacovigilance?

When it comes to Rule-Based/Static technologies (RPA), current validation guidelines are applicable.

As regards AI-based “Static technologies”, there is a need to extend the current validation guidelines: white papers from the FDA, presentations from the MHRA, and collaborative initiatives such as TransCelerate and the Oracle ARGUS Safety Consortium can be useful.

Companies that are approaching these technologies are following the most conservative approach. This means that even if these technologies are available and in use, their implementation requires a significant effort in validation, testing and maintenance.

Vendors, companies and regulators, as parts of the same ecosystem, should work together to streamline and speed up the process of creating guidelines tailored to the newly available technologies.

When it comes to AI-based “Dynamic Systems”, no guidelines are available yet. Future guidelines will eventually be required for these more transformative systems.

What are the main digital transformation projects implemented in the clinical research area?

Based on our experience, clinical research units are mainly evaluating the possibility to implement:

  • Oversight platforms in order to fulfill ICH E6(R2) regulatory requirements about Sponsor Study Oversight
  • Remote monitoring platforms and eTMF. This represents an interesting opportunity, since it could open the door to a paper-free approach. Electronic document management systems can support not only remote monitoring platforms and eTMF, but also QMS, Regulatory, and CSV processes. Starting from the main requirements (clinical trials), pharmaceutical companies could enlarge the scope of the new technology.
  • Evolution of EDC platforms: from on premises-hosted application suites to integrated or unified platforms that are cloud-based

What are the tools that can support change management process?

To support change management and make sure that the resources are committed to the change, we recommend involving them in the introduction of the new technology from the very beginning.

During the user requirements phase, pharmaceutical companies can propose surveys to internal users about their needs (this helps make them part of the decision-making process).

In addition, training and workshops on the beta version of the new systems could be useful. They help to adapt the configuration of the system to the real requirements and make the change smoother.

We also recommend defining internal marketing initiatives to promote the change and raise expectations. This could result in users who “want to change” rather than users who are “obliged to change”.

Register today to get access to the recording of “Digital Transformation in Pharma: Challenges and Enablers” webinar

 

Risk Management Requirements for Post-market Surveillance for Medical Devices

Medical Device Regulation: what is it about?

The EU’s Medical Device Regulation (MDR) has been a hot topic in healthcare and a major concern for companies since 2017. It was officially published on 5 May 2017 and entered into force on 25 May 2017. The MDR replaces the previous EU documents, the Medical Device Directive (93/42/EEC) and the Directive on active implantable medical devices (90/385/EEC).

Manufacturers of currently approved medical devices were given a transitional period of three years, until 26 May 2020, during which they had to reorganise their operations to meet the requirements of the MDR. An amendment to the MDR, adopted on 24 April 2020, postponed the application of most of its provisions by one year, until 26 May 2021. However, certain devices that meet special requirements can be granted an extension of the transition period until 26 May 2024.

Post-market surveillance: what’s new

Articles 83 through 86 and Annex III of the EU MDR describe the requirements for a post-market surveillance (PMS) system, making PMS mandatory. Manufacturers who want to remain compliant with the new MDR are obliged to reorganise their PMS and vigilance systems following the new requirements.

The PMS process is the collection and analysis of data coming from the various sources listed in Annex III, and it is carried out according to a PMS plan for each product. This data can be used for various purposes, such as:

  • Update of the benefit-risk determination and improvement of the risk management;
  • Update of the design and manufacturing information, the instructions for use and the labeling;
  • Update of the clinical evaluation;
  • Update of the summary of safety and clinical performance;
  • Identification of needs for preventive, corrective or field safety corrective action;
  • Identification of options to improve the usability, performance and safety of the device;
  • Contribution to the post-market surveillance of other devices (when relevant);
  • Detection and reporting of trends.

Risk management requirements for post-market surveillance for medical devices

With PMS becoming a duty for medical device manufacturers, an effective risk management system becomes a priority as well, being one of the three basic elements that ensure compliance and safety, alongside PMS and clinical evaluation (see Image 1).

According to the MDR, manufacturers are expected to provide evidence of a risk management plan created for the whole lifecycle of products. Such plans should be used for tracking and reducing any potential hazards and ensuring the safety of the devices.

The MDR refers to the following key risk-related notions:

  • Risk is defined in Article 2 as “the combination of the probability of occurrence of harm and the severity of that harm”;
  • Benefit-Risk Determination is defined in Article 2 as “the analysis of all assessments of benefit and risk of possible relevance for the use of the device for the intended purpose, when used in accordance with the intended purpose given by the manufacturer”;
  • General obligations are defined in Article 10 in the following way: “Manufacturers shall establish, document, implement and maintain a system for risk management as described in Section 3 of Annex I”;
  • The Quality Management System shall address the following matter – “risk management as set out in Section 3 of Annex I”[1]
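To make the “combination of the probability of occurrence of harm and the severity of that harm” concrete, here is a minimal sketch of matrix-style risk scoring in the spirit of ISO 14971-based risk analysis. The ordinal 1–5 scales, the multiplicative combination, and the acceptability thresholds are illustrative assumptions; neither the MDR nor ISO 14971 prescribes a specific scoring scheme.

```python
# Illustrative risk scoring: risk as a combination of probability and
# severity. The scales and thresholds below are assumptions, not mandated
# by the MDR or ISO 14971.

PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4, "catastrophic": 5}

def risk_score(probability, severity):
    """Combine the two ordinal ratings into a single score (1..25)."""
    return PROBABILITY[probability] * SEVERITY[severity]

def acceptability(score):
    """Map a score onto illustrative acceptability regions."""
    if score <= 4:
        return "acceptable"
    if score <= 12:
        return "reduce as far as possible"
    return "unacceptable"

score = risk_score("occasional", "serious")  # 3 * 3 = 9
print(score, acceptability(score))
```

In practice each manufacturer defines its own matrix and acceptability criteria in the risk management plan; the value of a scheme like this is that the same rule is applied consistently across hazards and re-evaluated as PMS data arrives.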

Risk Management for Medical Devices

The following requirements by the MDR should be addressed in order to ensure compliance and correct benefit/risk management:

  • establish and document a risk management plan for each device;
  • identify and analyse the known and foreseeable hazards associated with each device;
  • estimate and evaluate the risks associated with, and occurring during, the intended use and during reasonably foreseeable misuse;
  • eliminate or control the risks referred to in point (c) in accordance with the requirements of Section 4;
  • evaluate the impact of information from the production phase and, in particular, from the post-market surveillance system, on hazards and the frequency of occurrence thereof, on estimates of their associated risks, as well as on the overall risk, benefit-risk ratio and risk acceptability;
  • amend control measures if necessary.

What else is there to keep in mind?

In 2019, a new draft revision of ISO 14155 is expected to be published, containing changes to pre- and post-market clinical investigations for medical devices. The new, third revision is expected to provide more explicit and thorough guidance on risk management, and to be closely tied to the risk management requirements outlined in ISO 14971.

Other significant changes in the new ISO 14155 draft include:

  • Guidance on clinical quality management, clinical investigation audits and ethics committees
  • Risk-based monitoring requirements
  • Registration of clinical investigations in publicly accessible databases
  • Clarifications on how ISO 14155 requirements apply to each stage of clinical development
  • Annexes relating ISO 14155 to the European Medical Devices Regulation, and to the Medical Devices Directive (MDD) and Active Implantable Medical Devices Directive (AIMDD).

Useful Medical Device Regulation terminology

  • MDR – Medical Device Regulation
  • PMS – Post-Market Surveillance
  • PIP – Poly Implant Prosthesis
  • MDD – Medical Device Directive
  • FDA – Food and Drug Administration
  • PMCF – Post-Market Clinical Follow-up
  • CER – Clinical Evaluation Report
  • RM – Risk Management
  • PSUR – Periodic Safety Update Report
  • PMSR – Post-Market Surveillance Report
  • SSCP – Summary of Safety and Clinical Performance
  • SAE – Serious Adverse Event
  • IFU – Instructions For Use

Are you looking for technological solutions to facilitate clinical trials and adverse event management for your Medical Device products? Arithmos offers solutions such as Symphony, a flexible and easy-to-set-up EDC system, and Argus BluePrint, a pre-validated and pre-configured version of Oracle Argus Safety, that ensure compliance and security of processes for Medical Device companies. Arithmos, alongside its sister company seQure Life Sciences, can also support companies in a consultative way by making sense of the MDR and analysing a company’s needs in terms of quality assurance and regulatory compliance. We can support you with an initial gap analysis and risk assessment regarding the MDR.

Contact us to learn more about our Medical Device solutions.

[1] BSI: MDR – Risk and Clinical Requirements

How to Break Down the Data Silos in Life Sciences Companies

In the Life Sciences industry, making sense of data becomes increasingly challenging due to its growing volume, the introduction of new and complex technologies, and the increase in the number of stakeholders.

Introducing the right data reporting and analytics solution at the corporate level allows Life Sciences companies not only to streamline collaboration by breaking down the silos between departments, but also to improve decision-making and optimise technology investments.

On July 8, Arithmos hosted a complimentary webinar on using data reporting and visualisation for breaking down the silos in Life Sciences companies. The webinar was conducted by Silvia Gabanti, Managing Director of Arithmos, and Massimo Businaro, CEO of E-project.

Continue reading to learn how effective data reporting and visualisation can improve the decision making at multiple levels, increase operational efficiency, and break the departmental silos.

Register now to get free access to webinar recording

Data Ecosystems in Life Sciences companies

Rapid technology advancement and new regulations benefit patients and speed up clinical trials. However, they also increase the amount of data generated daily. Real World Data, the Internet of Things, ePRO – all these technologies multiply the amount of data that a company deals with on a daily basis.

Data is the new fuel for Life Sciences businesses and a source of competitive advantage; it drives innovation and stimulates growth.

Currently, despite the enormous potential of this newly generated data, a lot of pharmaceutical companies are suffering from the data silos issue. It manifests itself in an inability to share data across departments, the presence of multiple data repositories, and the absence of a clear picture of the company’s data.

As data is often stored in multiple tools and locations, data silos severely impact the departments’ ability to collaborate and harm scientific discovery and drug development.

Data silos are caused by multiple reasons:

  • Every department has its own data repository isolated from the rest of the organisation
  • Silos tend to arise naturally in organisations over time because each department or business function has different goals, priorities, responsibilities, processes, and data repositories.
  • Legacy solutions don’t prioritise sharing data with other departments; in maintaining data segregation, they reduce the flow of information to what is considered strictly necessary

The data silo challenge is usually recognised but often not addressed in a strategic and planned way, especially by small and medium-sized companies.

Elimination of data silos and the creation of free data flow between the departments is the secret to transforming data into fuel for the company’s growth.

Breaking Down the Silos

If a pharmaceutical company plans to break down the data silos and use data to improve its performance and deliver its solutions to the market in a faster and more efficient manner, it should handle a cultural and structural change.

This can be done in 5 steps:

  1. Identify the cause of the company’s silo problem and the main impacted areas (e.g. the pharmacovigilance department)
  2. Get management to buy in
  3. Define a scalable technology strategy that covers structural and cultural aspects of the change
  4. Embrace a corporate advanced reporting and analytics solution to create a bridge between the departments
  5. Invest in a process review and cross-functional training

Step 1: Identify the Cause of Silos and the Main Impacted Areas

Identifying the data management inefficiencies that impact a company’s performance is very challenging. This is particularly true in small and medium-sized companies without a Chief Data Officer who advocates the importance of collecting and leveraging data in decision-making.

In this case, the first step is to identify a business area that relies the most on data and rationalise the way it collects and manages information. This department will be the “Front door” of the change.

A good example is the pharmacovigilance department. It is often perceived as a pure “cost” for the company, but it is a key business unit in terms of regulatory compliance and patient safety.

Pharmacovigilance departments need to continuously improve their processes to reduce department costs. At the same time, they are under great pressure because of strict regulations and the necessity to work closely with other departments (like regulatory affairs and clinical research) that have their own, completely different business goals.

Breaking down the data silos between pharmacovigilance and other departments eases cooperation and allows faster, more robust decisions based on data analysis and visualisation, safeguarding patients’ health more efficiently.

Step 2: Get the Management to Buy In

To break down the data silos by introducing a cross-departmental reporting solution, you need to get the upper management to buy in.

Getting rid of data silos helps each individual department and the entire organisation by:

  • offering them the big picture of the company’s data
  • aligning company’s long-term goals and department objectives
  • reducing conflict between departments by providing clear guidance on corporate priorities and avoiding conflict between personal objectives and corporate growth
  • engaging corporate stakeholders, like IT, which should be more conscious of the business goals and act as opinion leaders

Step 3: Define a Scalable Strategy

Pharmaceutical companies can find it challenging to decide which technology solution to choose or when to eliminate redundant technology.

This challenge can arise from biases in business decision making, which in turn can be caused by the following reasons:

  • advocating previous choices (technologies, consultants, processes)
  • sticking to the company’s previous habits
  • lack of appropriate decision-support tools

To ensure the appropriate allocation of funds and minimal disruption to the company’s work during the period of change, it is critical to consider the change as a program. Optimisation of data management and data reporting should be included in a company’s defined roadmap.

This means that a Work Breakdown Structure (WBS) approach should be followed – below you can find an example of a WBS for defining the activities of a project aimed at breaking down the data silos.

(Image: example WBS for breaking down the data silos in a Life Sciences company)
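As a rough illustration of the WBS approach, the sketch below models a silo-elimination program as nested work packages with effort estimates and rolls them up per phase. The phase names, task names, and person-day figures are invented for illustration only.

```python
# A hedged sketch of a Work Breakdown Structure (WBS) for a data silo
# elimination program: phases contain work packages with effort estimates
# (person-days). All names and figures are illustrative assumptions.

WBS = {
    "Assessment": {"Map data repositories": 10, "Interview departments": 8},
    "Technology": {"Select reporting tool": 12, "Integrate source systems": 30},
    "Change management": {"Cross-functional training": 15, "Process review": 10},
}

def total_effort(wbs):
    """Roll up effort estimates across all work packages."""
    return sum(sum(tasks.values()) for tasks in wbs.values())

def phase_breakdown(wbs):
    """Effort per phase, useful for a first-pass project plan."""
    return {phase: sum(tasks.values()) for phase, tasks in wbs.items()}

print(total_effort(WBS))        # 85
print(phase_breakdown(WBS))
```

The value of the structure is that every activity belongs to exactly one work package, so scope, effort, and progress can be tracked at both the phase and program level.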

Step 4: Embrace Advanced Reporting and Analytics Tools

Most pharmaceutical companies manage different systems that contain heterogeneous and disparate data.

This gives rise to two challenges:

  • Optimising access to the information held in specific business systems (QA, PhV, RA databases)
  • Increasing the ability to share data by rationalising and connecting these systems

Introduction of an advanced reporting and analytic solution allows pharmaceutical companies to:

  • Support operational teams with data analysis and reporting, reducing manual elaboration of the data, which is not only time-consuming but also increases the risk of regulatory non-compliance
  • Allow managers to effectively oversee operational teams. This includes:
    • Identifying challenging and ineffective processes
    • Improving KPIs
    • Obtaining data that supports the decision-making process
  • Integrate external data in a smooth manner
  • Merge and evaluate data coming from different business areas to identify inconsistencies in real time, carry out data reconciliation, avoid duplicating information between systems, speed up regulatory reporting, and increase data quality.
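The reconciliation point above can be sketched in a few lines: compare per-product case totals held by two departmental systems and flag any disagreement. The system names, record layout, and counts are illustrative assumptions, not tied to any specific vendor platform.

```python
# A minimal sketch of cross-system data reconciliation: compare case
# counts per product as reported by two departmental systems and flag
# products whose totals disagree. All data below is illustrative.

def reconcile(pv_records, ra_records):
    """Return products whose totals disagree between the two systems."""
    mismatches = {}
    for product in set(pv_records) | set(ra_records):
        pv_count = pv_records.get(product, 0)
        ra_count = ra_records.get(product, 0)
        if pv_count != ra_count:
            mismatches[product] = {"pharmacovigilance": pv_count, "regulatory": ra_count}
    return mismatches

pv = {"DrugX": 12, "DrugY": 7}            # counts from the safety system
ra = {"DrugX": 12, "DrugY": 6, "DrugZ": 1}  # counts from the regulatory system
print(reconcile(pv, ra))
```

A corporate reporting layer does exactly this kind of comparison continuously across repositories, which is why it surfaces inconsistencies far earlier than periodic manual reconciliation.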

Step 5: Invest in Process Review and Cross-functional Training

Social change is a key step in breaking down the silos – all stakeholders must be fully committed to the successful result from the very beginning.

It is extremely important to share the company objectives with both management and operational teams. The management can define the strategic and economic advantages, but only the resources involved in data management can identify the real bottlenecks that the solution can eliminate.

Involving the teams in requirements gathering and organising cross-functional workshops and training throughout the program can improve the quality of the new processes and tools. More importantly, it can also break down the cultural silos (together with the data ones), reducing resistance to change and building relationships and corporate culture.

Conclusion

Companies can unlock their data’s full potential, optimise internal processes, and improve decision-making by breaking down the data silos and allowing data to flow freely. An advanced data reporting and visualisation solution is a key step in this process, alongside cultural and structural change.

The destruction of data silos in Life Sciences companies can result in streamlined delivery of drugs to the market and more efficient work across all departments, benefiting patients and allowing companies to safeguard patients’ health more effectively.

Register now to get free access to webinar recording to learn more about breaking down the silos in Life Sciences companies

Navigating Business in Times of COVID-19: Interview with Paolo Morelli

The COVID-19 pandemic forced companies to explore how to survive while ensuring business continuity and protecting the health of their employees. We talked to Paolo Morelli, CEO of PM Holding (Arithmos, CROS NT, seQure), in order to understand what business owners can do to mitigate the impact of the pandemic on their operations, ensure business continuity, and deliver better outcomes in times of crisis.

What were your initial steps when the outbreak started?

We had two major concerns when the outbreak started: safety of our team and business continuity.

In order to address these concerns, we formed a risk management team and carried out an operational and commercial revision. The former was necessary to mitigate the risks for our clients and employees, while the latter allowed us to adjust the way our company interacts with the market.

The operational and commercial revision was a way to quickly reassess the situation from the business point of view. COVID-19 had changed the way our Life Science industry operates, so we needed to evaluate regulatory aspects of the new situation and understand new operational constraints. For example, we had to assess how to conduct GCP audits with travel restrictions or how to do on-site monitoring. All of these questions needed urgent answers if we wanted to continue delivering services to our clients.

How did the transition to the home office go?

Smoothly. In the last 5 years we have created a structure that allowed our teams to move to their home office without any obstacles. All the companies across PM Holding have a very clear organisational structure. Each company has a managing director, line managers, team leads that work closely with their units and have clear work objectives. We currently have in place recurring meetings and revisions to ensure that projects are delivered on time and many of our colleagues already have experience with working from home.

All these factors combined with the reliable IT infrastructure and underpinned by our company values – trust and communication – allowed us to switch to home office working in the blink of an eye.

What is the biggest challenge in managing a business in pandemic times?

I believe that regardless of the industry, the most challenging part is change management. All of us are more comfortable doing things in a familiar way, and we can struggle when we need to adopt new paradigms, as some people are very risk averse.

However, times like the COVID-19 pandemic are exactly when we need to be flexible and execute a smart strategy effectively.

Are you happy with the way PM Holding companies are coping with the COVID-19 challenge?

Yes. We have worked hard to build a company culture based on trust, innovation, and communication, and I now see very clearly the results of these efforts and investments.

What have we learnt?

First of all, we now have proof that our processes and internal structure are really effective. In calm periods these processes, rules and procedures may seem unnecessary and too strict, but in times of crisis like the COVID-19 pandemic, they turned out to be hugely advantageous. They have allowed us to provide a business-as-usual service to our clients and ensure a smooth transition to the home office for our teams.

Secondly, we have seen the importance of glocal management. Thinking globally and acting locally allows us to have a global strategy and global mindset, and at the same time, consider local events and be as close to our clients and employees as possible.

How can businesses deliver better outcomes in times of a crisis?

There are three tips that I would like to share:

  1. Analyse the company organisation and understand the new risks
  2. Define the risk mitigation plan and make the necessary changes in the company organisation
  3. Identify opportunities that the crisis may give rise to

I also can’t emphasise enough the importance of innovation. Innovation needs to be part of company culture, as this is one of the only ways to find new solutions, take risks and cope with adverse circumstances successfully. I am not talking only about technology here – innovation is a mindset! We need to develop this mindset and bring the younger generation into our businesses, as they are like sponges and take onboard so much information, have open minds and are able to think outside the box.

What do you think the impact of COVID-19 on Life Sciences will be?

Pre-COVID-19, working from home in our industry was the exception rather than the rule; it may be vice versa in the future. I also think there will be much more focus on decentralised clinical trials and on further enhancing technology, with buy-in from both sponsors and patients.


Q&A: Kairos – Data Analytics and Reporting Solution

Last year, Arithmos, a provider of innovative IT solutions with a specific focus on the Life Sciences, announced a partnership agreement with E-project, a consulting and system integration company. Together they have brought to the market Kairos, a unique Data Analytics and Reporting solution for pharmacovigilance.

Kairos gives Life Science companies full visibility of their safety data with real-time reports and dashboards, helps them ensure regulatory compliance with ease, and creates value from the advanced analysis of their safety data.

We have talked to Silvia Gabanti, Arithmos’ Managing Director, to learn more about this new solution.

How can Kairos improve the pharmacovigilance processes and ease the life of the pharmacovigilance team?

First of all, Kairos allows teams to produce PSURs/PBRERs and other safety reports automatically, in a simple and error-free manner, eliminating the risk of human error.

Secondly, it helps optimize the performance of the pharmacovigilance department by creating reports that measure KPIs.

Thirdly, it removes the silos between Pharmacovigilance and the Regulatory and Clinical departments, supporting data reconciliation in the most effective way.

Can other departments use Kairos?

Absolutely, and this makes Kairos unique. It helps to make sense of data regardless of its origin. Now Life Science companies gather enormous amounts of data from different sources, like EDCs and safety systems, and mostly use it to ensure compliance. However, data is much more than this.

Data analysis makes it possible to optimize processes while reducing the team’s effort and the gap between operational units. This helps to make faster and safer decisions and improves business efficiency. Life Science companies can grow their business and act proactively, not only reactively.

Although Kairos was born as a pharmacovigilance reporting and data analytics solution, it can also be used horizontally across the company. Departments from Clinical to Marketing can use it to analyse their data, create reports, and share them easily.

Kairos allows you to break down the silos between the departments.

What makes Kairos a unique solution on the market?

Kairos is the result of a synergy between teams with a decade of experience in business intelligence, data analytics, pharmacovigilance, and Life Sciences in general. It is more than just a piece of technology. Kairos includes the solution itself, consultancy, report building, and fine tuning.

Such a combination makes Kairos a powerful tool that is delivered with low license and maintenance costs.

How can you describe Kairos’ price model?

It is scalable. You start with a minimum investment in licenses and configuration. If you would like to add new reports or extend Kairos to other departments, you can easily do it.

Can Kairos be customised?

Yes, absolutely. The whole system is customisable. The first thing we do is understand our clients’ business needs and the type of reporting required. Then we customize Kairos based on these expectations.

The next step is to give the clients detailed training. We want them to be completely independent and able to change the format, design, or data themselves to fit their organisation.

How does Kairos fit the bigger Arithmos goals?

We strive to provide our clients with 360-degree support on the road to Digital Transformation. Kairos is another step towards this goal, allowing Arithmos to give more support to its clients.

We also believe in a proactive approach to safety and regulatory compliance, and Kairos contributes to it by harnessing the power of data.

Last but not least, we value efficiency in operations and processes. Introducing a tool that operates horizontally rather than vertically allows companies to save on IT investments and optimize the departments’ work by breaking down the silos.

Want to learn more about Kairos? Contact Arithmos now to see the demo of the solution.

How to Perform a Successful Data Migration in Life Sciences

Data Migration is defined as the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another.

Data migration has the reputation of being risky and difficult and it’s certainly not an easy process. It is time-consuming with many planning and implementation steps, and there is always risk involved in projects of this magnitude.

Without a sufficient understanding of both source and target, transferring data into a more sophisticated application will amplify the negative impact of any incorrect or irrelevant data, perpetuate any hidden legacy problem and increase exposure to risk. A data migration project can be a challenge because administrators must maintain data integrity, time the project correctly to minimize the impact on the business and keep an eye on costs.

However, following a structured methodology will reduce the pain of managing complex data migration.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar was conducted by Alessandro Longoni, Arithmos Senior Project Manager & Data Analyst, and focused on the challenges in data migration and ways of overcoming them.

Continue reading to learn the tips on successful performance of data migration in Life Sciences.

Register now to get free access to webinar recording

Tip 1 – Understanding the data

Before starting the data migration, you have to prepare your data: carry out an assessment of what is present in the legacy system, understand clearly which data needs to be migrated, avoid duplication, and promote quality and standardization.

We can divide the assessment of the legacy system into two macro categories:

  • Assessment of the data meaning
  • Assessment of the data quality

Every piece of data that you move has to be validated, cleaned, and transformed. In data migration projects, migrating only relevant data ensures efficiency and cost control.

Understanding how the source data will be used in the target system is necessary for defining what to migrate. It is important to look at how people are using existing data elements. Are people using specific fields in a variety of different ways that need to be considered when mapping out the new system’s fields?

The second macro area is the assessment of the quality of the data. It is very important to define a process to measure data quality early in the project, in order to obtain details of each single piece of data and identify unused fields or obsolete records that may have undergone multiple migrations. It is also important to avoid migrating duplicate or irrelevant records.

The quality analysis typically leads to a data cleaning activity.
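As an illustration, the quality assessment described above can be sketched as a small profiling script. The record structure, field names, and case numbers below are hypothetical assumptions, not taken from any specific safety system.

```python
# Hypothetical data-quality profiling sketch: find duplicate case numbers
# and fields that are never populated in the legacy extract.
from collections import Counter

def profile_records(records, fields):
    """Return (duplicate case numbers, fields never populated)."""
    counts = Counter(r["case_number"] for r in records)
    duplicates = sorted(n for n, c in counts.items() if c > 1)
    unused = [f for f in fields
              if all(not r.get(f) for r in records)]  # empty in every record
    return duplicates, unused

legacy = [
    {"case_number": "IT-001", "reporter": "MD", "fax": ""},
    {"case_number": "IT-002", "reporter": "",   "fax": ""},
    {"case_number": "IT-001", "reporter": "MD", "fax": ""},
]
dups, unused = profile_records(legacy, ["reporter", "fax"])
print(dups)    # ['IT-001']
print(unused)  # ['fax']
```

A report like this gives the project team concrete candidates for deduplication and for fields to exclude from the migration scope.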

Cleaning the source data is a key element in reducing the data migration effort. This is usually the client’s responsibility. However, the client can be supported by the provider, who can perform specific data extractions, aggregations, and normalizations in order to reduce the client’s effort.

Another way to clean data is to adopt migration scripts for cleaning purposes. It is important to understand that this kind of activity could create validation issues, because modifying the source data goes beyond the perimeter of a pure data migration and creates potential data integrity issues.

Tip 2 – Project Governance

The best way to approach a data migration project is to clearly define roles and responsibilities and avoid overlapping accountability. This can be done in several steps:

  • Define the owner of the data in the target system
  • Include the business users in decision-making. They understand the history, the structure and the meaning of the source data. Additionally, if business users are engaged in the data migration project, it will be easier for them to interact with migrated cases after GoLive.
  • Rely on data migration experts. Each data migration requires assistance from experts, who can fill the gap between business and IT – both are key stakeholders, but they are often unable to understand each other.

Based on our experience, what makes a difference is the presence of a business analyst. This is a person who acts as a bridge between the technical staff involved in implementing the migration and the businesspeople. The business analyst can explain technical requirements clearly, which really helps the business define the migration rules based on how the target system will use the migrated data.

Tip 3 – Roll back & Dry Run

A roll-back strategy has to be put in place in order to mitigate the risks of potential failures. Source data has to be accessed in read-only mode; this prevents any kind of data modification and ensures its integrity. Backups have to be performed on the target system so that it can be restored in case of failure.

An accurate data migration dry run makes it possible to execute the validation and production migrations without incidents or deviations. Procedures and processes have to be tested in order to check the completeness of the records and to ensure integrity and authenticity in accordance with the data migration rules and purposes.

Tip 4 – The Importance of the Data Mapping Specification Document

The Data Mapping Specifications document is the core of the data migration. It ensures complete field mapping and is used to collect all mapping rules and exceptions.

This project phase is usually long and tiring for a number of reasons:

  • Volume and amount of data details
  • Technical activity with technical documents
  • Little knowledge of dynamics of target database
  • Compromises that have to be made

The Data Mapping Specifications document details all the rules related to the data being migrated. The following tips can help you do it in the most efficient way:

  • Clarify what has to be migrated and what shouldn’t be migrated
  • Clean source data – this will reduce the number of fields to migrate
  • Liaise with a business analyst that will translate technical requirements and help to explain how data will work in the target system
  • Rely on data migration experts who have already performed similar data migrations in the past
  • Avoid using an Excel sheet for mapping table fields – a more visual document with pictures will ease the conversation with the business users
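To make the idea of mapping rules and exceptions more concrete, here is a hedged sketch of how they might be captured in executable form alongside the Data Mapping Specifications document. All field names, transformations, and values are hypothetical assumptions.

```python
# Illustrative mapping rules: each legacy field maps to a target field,
# optionally with a transformation; a None target means "not migrated".
MAPPING_RULES = {
    "PT_SEX":  {"target": "patient.sex",
                "transform": lambda v: {"M": "1", "F": "2"}.get(v, "0")},
    "RPT_DT":  {"target": "report.date",
                "transform": lambda v: v.replace("/", "-")},
    "OLD_FAX": {"target": None, "transform": None},  # excluded from scope
}

def apply_mapping(source_row):
    """Apply the mapping rules to one legacy record."""
    target_row = {}
    for field, rule in MAPPING_RULES.items():
        if rule["target"] is None:  # field not in migration scope
            continue
        value = source_row.get(field, "")
        target_row[rule["target"]] = rule["transform"](value)
    return target_row

row = apply_mapping({"PT_SEX": "F", "RPT_DT": "2020/05/14", "OLD_FAX": "n/a"})
print(row)  # {'patient.sex': '2', 'report.date': '2020-05-14'}
```

Keeping the rules in one declarative structure like this makes the document’s “what is migrated and what isn’t” decision auditable in one place.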

Tip 5 – Perform comprehensive validation testing

To ensure that the goals of the data migration strategy are achieved, a company needs to develop a solid data migration verification process. The data migration verification strategy needs to include ways to prove that the migration was successfully completed, and data integrity was maintained.

Tools or techniques that can be used for data migration verification include source vs target data integrity checks. A combination of manual and automatic checks can be used to cover all verification needs. The verification can include qualitative and quantitative checks.

To carry out data verification, migrated data must be processed through a workflow. This ensures that the migrated data interacts properly with the target system’s functionalities.

The sampling strategy is defined in the Data Migration Plan, and it should be driven by the risk impact of the data. 100% sampling is neither feasible nor adequate; therefore, standards such as ANSI/AQL are usually used to define the risk-based sampling strategy.
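A risk-based sampling strategy of this kind can be sketched as follows. The risk tiers and sampling percentages are illustrative assumptions for the sketch, not the actual AQL tables from the standard.

```python
# Hedged sketch of risk-based sampling for migration verification:
# higher-risk data gets a proportionally larger manual-check sample.
import math
import random

SAMPLE_RATES = {"high": 0.20, "medium": 0.05, "low": 0.01}  # assumed tiers

def sample_for_verification(case_ids, risk, seed=42):
    """Draw a deterministic, risk-weighted sample of cases to verify."""
    rate = SAMPLE_RATES[risk]
    size = max(1, math.ceil(len(case_ids) * rate))  # at least one record
    rng = random.Random(seed)  # fixed seed keeps the sample reproducible
    return sorted(rng.sample(case_ids, size))

cases = [f"CASE-{i:04d}" for i in range(1, 501)]
print(len(sample_for_verification(cases, "high")))  # 100
print(len(sample_for_verification(cases, "low")))   # 5
```

A fixed seed is used so that the sample drawn during the dry run can be reproduced exactly during the validation review, which matters for the audit trail.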

This article is based on a webinar presented by Alessandro Longoni on May 14. Register today to get access to the webinar recording.

About Alessandro Longoni, Senior Project Manager & Business Analyst

Alessandro Longoni is a Senior Business Analyst & Project Manager at Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Learn more about Alessandro Longoni, his career path and current role in Arithmos in our “Career Insights” section.

Interview with Paolo Morelli: Artificial Intelligence in Clinical Research

On the 20th of May Paolo Morelli, CEO of Arithmos, joined the Scientific Board of Italian ePharma Day 2020 to discuss the growing role of the new technologies in clinical trials. We have taken this opportunity to talk to him about one of the most debated technologies of the last few years – Artificial Intelligence (AI) – to understand how it is currently used in the industry and what future it has.

There are so many discussions around AI right now. What are the biggest doubts related to it?

There are a number of questions that we are asking ourselves now. Here are just a few examples:

  • Will AI change and automate processes in clinical research, making it more efficient?
  • Will researchers have to change the way they do their work?
  • Will pharma companies have to reorganize their R&D departments?
  • Will the approach to the training of clinical research personnel change?
  • Will patient privacy be protected?

For sure, companies performing clinical trials are more and more interested in AI. They want to have data of higher quality, perform recruitment faster, and, in general, be more efficient. Also, when it comes to the financial side of the projects, the journey has just begun, and a lot is yet to come.

What does the notion of AI encompass?

I suggest we start with what is not AI. For example, automation is not AI. Automation is the use of machines that follow a precise program defined by human intelligence. Artificial intelligence (AI), sometimes called machine intelligence, stands in contrast to human intelligence (HI). We can define AI as any device that collects information and takes actions that maximize its chance of successfully achieving its goals.

Is AI already used in clinical trials?

Yes, I can outline two of the biggest areas where it is already in use. The first is the use of AI over eSource/mSource data to support decision making (for example, feasibility or communication with the patient), and the second is the use of AI to reduce data entry and the burden of paper documentation (for example, eConsent, regulatory document completion, or SDV).

How will the use of AI expand in the future? Where else can we apply AI in clinical trials?

For sure AI will allow a continuous communication with the patient, even though the human touch cannot be replaced by technology. I also believe that, at some point, SDV will be fully automated thanks to NLP technologies. With the proper training data set AI will be more successful in detecting data issues than Human Intelligence.

But even if patients and the industry are typically open to the innovative solutions, regulators will have a big role in the spread of the innovation, and can support it by setting up a modern regulatory framework.

Once AI becomes an integral part of the clinical trials industry, what will the potential benefits for patients be?


The combination of AI with scientific content will support better communication with the patient: eConsent will allow a better understanding of the texts compared to the paper version of the Informed Consent. AI will answer questions that subjects may have about the study, and patients will be better informed about how their data is treated. I believe that we will see a bigger presence and influence of patients in clinical research. It will be important, though, to overcome privacy challenges before introducing the technology, and to inform patients about the use of all their data.

Do you think the AI will replace the human professionals in clinical trials of the future?

The omnipresence of overly abundant data (both clinical data and project data) can be successfully triaged and managed with the assistance of AI. And yet, it is HI (human intelligence) that is needed to make the final decision from the options offered. Think about our air traffic control system, highly automated, advanced technology, and yet there remains the need for HI to make the final decisions regarding all these simultaneous active flights (projects).

What are the aspects that can determine the success of AI in clinical trials?

There are three main aspects that will determine the success of AI in Clinical Trials. Availability of the technology is not one of them.

One critical aspect is the availability of System Integrators as a bridge between technology and researchers. Their goal will be to understand clinical processes and to shape all this new technology around the clinical trial activities.

The second important aspect will be the ethics, morals, and governance around AI and ML. Regulators and Life Science companies will need to set up a framework before going too far down the road of AI.

And the last thing is data. AI needs data. Accurate data. Lots of it. We need to start collecting organised data and train AI before it can successfully achieve the desired result of making the clinical trials faster, more efficient, and ensuring higher quality.

At Arithmos, did you already start exploring such innovative technologies?

Arithmos, like the other PM Holding companies, has innovation as a core value; it is at the heart of everything we do. Looking at the impact of innovative technology on the Life Sciences industry, I believe that investing in this field was the right choice.

At Arithmos we are already supporting our clients with the digital technologies that optimise clinical trial processes and increase patient centricity, ensuring that patients get an active role in the clinical trials.

Would you like to learn how Arithmos can support you in innovating your clinical trials? Contact us by clicking here.

Interested in more materials on Digital Transformation?

Webinar Q&A: How to Perform a Successful Data Migration in Life Sciences

In the Life Sciences industry, a structured but lean approach to data migration is vital for success. Such an approach not only ensures data integrity and consistency, but also minimizes the impact of data migration on day-to-day business operations.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar covered the main risks and challenges of data migration and ways to overcome them, and presented a case study of data migration in pharmacovigilance.

The webinar was conducted by Alessandro Longoni, a Senior Business Analyst & Project Manager of Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Continue reading to discover the questions from the Q&A session from the webinar.

Register today to get access to the webinar recording

During a pharmacovigilance migration, should only the last version of the cases be migrated, or should we migrate all the versions?

It depends on the legacy and the target system. Some classical safety systems save each version of each case as an independent case.

If you use an ETL tool to migrate to a new system that does not operate according to this logic, we advise migrating only the last version of each case in order to avoid duplicate cases. It is also advisable to migrate the case history (case workflow) if possible.

However, in any case, it is technically possible to migrate all the versions of a case if you find the right workaround – for example, using the following naming format: <case_number>_<caseversion>.
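The naming workaround described above can be sketched in a couple of lines. The case numbers are hypothetical; only the `<case_number>_<caseversion>` format comes from the answer itself.

```python
# Minimal sketch of the versioning workaround: combine case number and
# version so every version becomes a unique record in the target system.
def migrated_case_id(case_number, case_version):
    return f"{case_number}_{case_version}"

# All three versions of the same case can now coexist after migration.
versions = [migrated_case_id("IT-2020-0001", v) for v in (1, 2, 3)]
print(versions)  # ['IT-2020-0001_1', 'IT-2020-0001_2', 'IT-2020-0001_3']
```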

The best practice in data migration is adopting the E2B XML approach – regardless of the number of case versions, only the last version is exported from the legacy system.

What steps can be omitted in a semi-automated data migration compared to an automated one?

Data migration, regardless of the approach used (ETL, XML import, re-entry), must be included in a validation exercise.

The main steps and deliverables are the same regardless of the approach. However, the documents and activities in semi-automated and automated data migration projects require different levels of accuracy and effort:

  • Data Migration Plan: it should include the strategy of the migration project and the main rules for data migration. This covers the mapping needs in the case of re-entry; in the case of an ETL tool, a dedicated data mapping document is highly recommended.

In the case of XML migration, the format depends on the number of cases and the complexity (due to the differences between the configurations of the target and legacy systems). For simple migrations with a low number of cases, the data migration plan could be sufficient. For complex migrations with a high number of cases, we suggest a dedicated data mapping document.

  • Data Mapping Specifications: see above
  • Data Migration Validation: a protocol that must be executed in the validation environment to verify the migration process. It is more extensive in the case of ETL, but we suggest using it regardless of the data migration approach.
  • Data Migration Production: the execution of the protocol described above, conducted to migrate data in production. In the case of an ETL approach, this phase will require 2-5 days for technical activities plus 1-2 days for business data verification (production downtime period). In the case of an XML or re-entry process, this phase can take up to several months of business work (no production downtime period).
  • Data Migration Report

When you access the source data in read-only mode, is it possible to also access the files attached to the ICSR such as publication (in PDF format) or ACK from Regulatory Authorities (in XML format)? / How can we migrate attachments from one system to another? Should the client send the attachments separately from the XML files if the migration is performed by import/export migration?

Yes, it is possible. Of course, this possibility depends heavily on the structure of the legacy system database. Sometimes, technology constraints don’t allow you to extract some particular information, such as attachments or audit trail.

If the legacy system stores the attachments in the database with proper references to the relevant case, and if the target system supports the case attachment functionality, the best approach would be to perform the migration using the ETL tool.

If I have an E2B XML file as source of a case recorded in an Excel database, can I use it as a source for the migration into the new database in order to avoid doing a migration manually from the old database?

The standard approach to data migration is migrating data from the legacy system to the target system, not from the source to the target. Nevertheless, a different approach can be chosen if it grants better data quality (as in the question) and efficiency. This approach must be described and justified in the Data Migration Plan. Also, we advise carrying out a final verification to detect any discrepancies between the migrated data and the data stored in the legacy system.

How can you calculate the number of the records from the source and target systems that need to be validated?

An ETL tool can reduce the need for data verification, especially because a migration script works on specific fields independently of the number of records to be migrated. For example, if my script correctly moves the laboratory data, the number of tests executed by a specific patient is not relevant. This ensures the quality of the migrated data.

However, a minimum quality check of the fields has to be performed, and it should be based on the recommendations of your CSV manager and the risk assessment.

A numeric check of 100% of the records moved is always required to ensure that there were no exceptions during the script run.
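Such a 100% numeric check can be sketched as a simple count reconciliation. In practice the counts would come from queries against the legacy and target databases; the entity names and figures below are hypothetical.

```python
# Illustrative post-migration reconciliation: verify that every record
# extracted from the legacy system arrived in the target system.
def reconcile_counts(source_counts, target_counts):
    """Return the entities whose record counts differ between systems."""
    mismatches = {}
    for entity, expected in source_counts.items():
        actual = target_counts.get(entity, 0)
        if actual != expected:
            mismatches[entity] = (expected, actual)
    return mismatches

source = {"cases": 1250, "attachments": 310, "audit_trail": 9800}
target = {"cases": 1250, "attachments": 309, "audit_trail": 9800}
print(reconcile_counts(source, target))  # {'attachments': (310, 309)}
```

An empty result means the numeric check passed; any mismatch points directly at the entity whose script run needs to be investigated.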

This article is based on a webinar presented by Alessandro Longoni on 14th May 2020. Register today to get access to the webinar recording.
