Posts Tagged Digital transformation

Webinar: Digital Transformation in Pharma: Challenges and Enablers

We are excited to invite you to our new webinar, “Digital Transformation in Pharma: Challenges and Enablers”. Over the last 10 years, the managerial debate in Italy and internationally has focused heavily on corporate digitalisation. For pharma and healthcare professionals, the digital age brings new uncertainties.

Companies face a difficult choice:

  • Evolve with the new era by building a digital organisation
  • Risk becoming less competitive on the market as they fail to embrace the change

In this webinar we will discuss the challenges that pharmaceutical companies face on the road to digital success and how to overcome them.

Click here to learn more about the webinar and register your attendance

Who should attend?

CIOs, Clinical, Pharmacovigilance and R&D professionals looking to harness the power of new digital technologies

Date and time:

Wednesday, October 21st, 2020, 4:00 PM – 5:00 PM CEST

The key learning outcomes will be:

  • What digital maturity is
  • The main challenges of embracing digital transformation in the pharma industry
  • Enablers of digital transformation

Speaker: Silvia Gabanti, Managing Director of Arithmos

Silvia has an impressive 15-year track record in the industry. She began her career in the CRO environment as a pharmacovigilance and clinical research specialist. She also has extensive experience with Oracle applications: as an analyst, she worked with pharmacovigilance and clinical systems such as Oracle Clinical. A frequent speaker at industry events, she held the position of Service Delivery Manager for two years before becoming Managing Director.

How to Power Clinical Trials with Internet of Things

On July 16, Arithmos’ CEO Paolo Morelli joined a complimentary webinar organised by Avenga, a global IT and digital transformation company. This exclusive webinar was prepared for pharmatech, Life Sciences, pharmaceutical, drug discovery & development and healthcare companies and focused on how the Internet of Things (IoT) transforms clinical trials.

In this blog we share some of the webinar highlights.

Continue reading to learn how IoT is used to combine disparate data, improve clinical trials through higher patient retention, and shorten trial timelines.

Avenga (A): How has the introduction of IoT changed your perspective on the industry?

Paolo Morelli (PM): Nowadays a wide variety of patient data is available, including data coming from IoT devices and what I call process data (i.e. KPIs of clinical trial processes). This data is becoming more and more valuable for the industry and increases the demand for specialised service providers that are able to work with such big data volumes and harness their potential.

A: What are the key items one should focus on when integrating IoT in a clinical trial?

PM: I would outline four key elements:

  • A study team that includes a digital manager to support the integration of the technology into the design of the study and study tools
  • Patient associations to ensure that patients have a smooth experience with the new technology
  • Senior data managers and statisticians to cope with regulatory challenges related to data collection and statistical methodology
  • A privacy expert

A: When it comes to IoT, what is the difference between a CRO role and a pharmatech company role?

PM: CROs support pharmaceutical companies in conducting clinical trials with clinical and data services. However, most CROs are only starting to explore the world of IoT and learning how to integrate it into their traditional services.

Pharmatech companies, on the other hand, already have a broad understanding of the technological framework. Thus, they are able to support both pharmaceutical companies and CROs in integrating new IT solutions into clinical trials.

A: How do sponsors see IoT in clinical trials?

PM: Sponsors see IoT as a big opportunity as they understand the value of the data that can be generated and used to support clinical evidence and safety of their products. This is valid not only for pharmaceutical and medical device companies, but also for organisations that conduct non-profit studies.

However, the growing importance of IoT is also challenging to tackle, as many industry players still do not possess the necessary analytical and technical capacities.

A: Have you had to adjust the compliance framework as well as other SOPs for the IoT related projects that Arithmos has conducted so far?

PM: Not too much, as the processes are independent of the type and quantity of source data.

A: What are the new skills that young professionals should add to their skillset to be relevant in this labour market?

PM: Analytical skills are becoming increasingly important for both data scientists and clinical researchers, as they will need to understand the massive amount of reports available in every clinical trial.

Digital managers will also become indispensable for the correct selection and integration of IoT tools in clinical trials processes.

A: What is the connection between big data and data science and IoT?

PM: Data scientists have an important role in bringing together the huge mass of data that arrives from IoT devices.

A: How do you think Covid-19 impacted use of IoT in clinical trials?

PM: I think there will be much more focus on decentralised clinical trials that involve IoT devices, and the technology will be further enhanced by buy-in from both sponsors and patients.

Would you like to learn how Arithmos can support you in innovating your clinical trials? Contact us by clicking here.

Interested in more materials on Digital Transformation?

How to Break Down the Data Silos in Life Sciences Companies

In the Life Sciences industry, making sense of data is becoming increasingly challenging due to its growing volume, the introduction of new and complex technologies, and the growing number of stakeholders.

Introducing the right data reporting and analytics solution at the corporate level allows Life Sciences companies not only to streamline collaboration within the company by breaking down the silos between departments, but also to improve decision-making and optimise technology investments.

On July 8, Arithmos hosted a complimentary webinar on using data reporting and visualisation for breaking down the silos in Life Sciences companies. The webinar was conducted by Silvia Gabanti, Managing Director of Arithmos, and Massimo Businaro, CEO of E-project.

Continue reading to learn how effective data reporting and visualisation can improve the decision making at multiple levels, increase operational efficiency, and break the departmental silos.

Register now to get free access to webinar recording

Data Ecosystems in Life Sciences companies

Rapid technology advancement and new regulations benefit patients and speed up clinical trials. However, they also increase the amount of data generated daily. Real World Data, the Internet of Things, ePRO – all these technologies multiply the amount of data that a company deals with on a daily basis.

Data is the new fuel for Life Sciences businesses and source of competitive advantage; it drives innovation and stimulates growth.

Currently, despite the enormous potential of this newly generated data, a lot of pharmaceutical companies suffer from data silos. This manifests itself in an inability to share data across departments, the presence of multiple data repositories, and the absence of a clear picture of the company’s data.

As data is often stored in multiple tools and locations, data silos severely impact the departments’ ability to collaborate and harm scientific discovery and drug development.

Data silos are caused by multiple reasons:

  • Every department has its own data repository, isolated from the rest of the organisation
  • Silos tend to arise naturally in organisations over time because each department or business function has different goals, priorities, responsibilities, processes, and data repositories
  • Legacy solutions prioritise data segregation over sharing, reducing the flow of information to other departments to what is considered strictly necessary

The challenge of data silos is usually recognised but often not addressed in a strategic, planned way, especially by small and medium-sized companies.

Elimination of data silos and the creation of free data flow between the departments is the secret to transforming data into fuel for the company’s growth.

Breaking Down the Silos

If a pharmaceutical company plans to break down the data silos and use data to improve its performance and deliver its solutions to the market in a faster and more efficient manner, it should handle a cultural and structural change.

This can be done in 5 steps:

  1. Identify the cause of the company’s silo problem and the main impacted areas (e.g. the pharmacovigilance department)
  2. Get management to buy in
  3. Define a scalable technology strategy that covers structural and cultural aspects of the change
  4. Embrace a corporate advanced reporting and analytics solution to create a bridge between the departments
  5. Invest in a process review and cross-functional training

Step 1: Identify the Cause of Silos and the Main Impacted Areas

Identifying the data management inefficiencies that impact a company’s performance is very challenging. This is particularly true in small and medium-sized companies without a Chief Data Officer who advocates the importance of collecting and leveraging data in decision-making.

In this case, the first step is to identify the business area that relies most on data and rationalise the way it collects and manages information. This department will be the “front door” of the change.

A good example is the pharmacovigilance department. It is often perceived as a pure “cost” for the company, but it is a key business unit in terms of regulatory compliance and patient safety.

Pharmacovigilance departments need to continuously improve their processes to reduce department costs. At the same time, they are under great pressure because of strict regulations and the necessity to work closely with other departments (such as regulatory affairs and clinical research) that have their own, completely different business goals.

Breaking down the data silos between pharmacovigilance and other departments eases the cooperation and allows faster and more robust decisions based on data analysis and visualisation, safeguarding patient’s health more efficiently.

Step 2: Get the Management to Buy In

To break down the data silos by introducing a cross-departmental reporting solution, you need to get the upper management to buy in.

Getting rid of data silos helps each individual department and the entire organisation by:

  • offering them the big picture of the company’s data
  • aligning the company’s long-term goals and department objectives
  • reducing conflict between departments by providing clear guidance on corporate priorities and avoiding conflict between personal objectives and corporate growth
  • engaging corporate stakeholders like IT, which should be more conscious of the business goals and act as opinion leaders

Step 3: Define a Scalable Strategy

Pharmaceutical companies can find it challenging to decide which technology solution to choose or when to eliminate redundant technology.

This challenge can arise from biases in business decision making, which in turn can be caused by the following reasons:

  • advocating previous choices (technologies, consultants, processes)
  • sticking to the company’s previous habits
  • lack of appropriate decision-support tools

To ensure the appropriate allocation of funds and minimal disruption to the company’s work during the period of change, it is critical to consider the change as a program. Optimisation of data management and data reporting should be included in a company’s defined roadmap.

This means that a Work Breakdown Structure (WBS) approach should be followed – below you can find an example of a WBS for defining the activities of a project aimed at breaking down the data silos.

(Example of a Work Breakdown Structure for a data silo elimination project)

Step 4: Embrace Advanced Reporting and Analytics Tools

Most pharmaceutical companies manage different systems that contain heterogeneous and disparate data.

This gives rise to two challenges:

  • Optimising access to the information in specific business systems (QA, PhV, RA databases)
  • Increasing the ability to share data by rationalising and connecting these systems

Introduction of an advanced reporting and analytic solution allows pharmaceutical companies to:

  • Support operational teams with data analysis and reporting, reducing manual elaboration of the data, which is not only time-consuming but also increases the risk of regulatory non-compliance
  • Allow managers to effectively oversee operational teams. This includes:
    • Identifying challenging and ineffective processes
    • Improving KPIs
    • Obtaining data that supports the decision-making process
  • Integrate external data in a smooth manner
  • Merge and evaluate data coming from different business areas, making it possible to identify inconsistencies in real time, carry out data reconciliation, avoid duplicating information between systems, speed up regulatory reporting, and increase data quality
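The cross-system reconciliation mentioned above can be sketched in a few lines of code. This is a hedged illustration, not a feature of any specific product: the record layout, field names, and the `case_id` key are invented for the example.

```python
# Hypothetical sketch: reconciling case records exported from a safety
# system and a clinical system. Returns records missing from either
# source and records whose shared fields disagree.

def reconcile(safety_cases, clinical_cases, key="case_id"):
    """Surface cross-system discrepancies in a single pass."""
    safety = {rec[key]: rec for rec in safety_cases}
    clinical = {rec[key]: rec for rec in clinical_cases}

    missing_in_clinical = sorted(safety.keys() - clinical.keys())
    missing_in_safety = sorted(clinical.keys() - safety.keys())

    mismatches = {}
    for case_id in safety.keys() & clinical.keys():
        s, c = safety[case_id], clinical[case_id]
        # Compare every field present in both records
        diff = {f: (s[f], c[f]) for f in s.keys() & c.keys()
                if f != key and s[f] != c[f]}
        if diff:
            mismatches[case_id] = diff

    return missing_in_clinical, missing_in_safety, mismatches
```

A report built on such output would let both departments see, in real time, which cases need alignment before regulatory submission.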

Step 5: Invest in Process Review and Cross-functional Training

Social change is a key step in breaking down the silos – all stakeholders must be fully committed to the successful result from the very beginning.

It is extremely important to share the company objectives with both management and operational teams. Management can define the strategic and economic advantages, but only the people involved in day-to-day data management can identify the real bottlenecks that the solution can remove.

Involving the teams during requirements gathering and organising cross-functional workshops and training throughout the program can improve the quality of the new processes and tools. More importantly, it can also break down the cultural silos (together with the data ones), reducing resistance to change and building relationships and corporate culture.

Conclusion

Companies can unlock the full potential of their data, optimise internal processes, and improve decision-making by breaking down the data silos and allowing data to flow freely. An advanced data reporting and visualisation solution is a key step in this process, alongside cultural and structural change.

The destruction of the data silos in Life Sciences companies can result in streamlined delivery of drugs to the market and more efficient work of all the departments, benefiting patients and allowing companies to safeguard their health more effectively.

Register now to get free access to webinar recording to learn more about breaking down the silos in Life Sciences companies

Q&A: Kairos – Data Analytics and Reporting Solution

Last year, Arithmos, a provider of innovative IT solutions with a specific focus on Life Sciences, announced a partnership agreement with E-project, a consulting and system integration company. Together they have brought to market Kairos, a unique Data Analytics and Reporting solution for pharmacovigilance.

Kairos gives Life Science companies full visibility of their safety data with real-time reports and dashboards, easily ensures regulatory compliance and creates value from the advanced analysis of their safety data.

We have talked to Silvia Gabanti, Arithmos’ Managing Director, to learn more about this new solution.

How can Kairos improve the pharmacovigilance processes and ease the life of the pharmacovigilance team?

First of all, Kairos makes it possible to produce PSURs/PBRERs and other safety reports automatically, in a simple manner, eliminating the factor of human error.

Secondly, it makes it possible to optimize the performance of the pharmacovigilance department by creating reports to measure KPIs.

Thirdly, it removes the silos between Pharmacovigilance and the Regulatory and Clinical departments, supporting data reconciliation in the most effective way.

Can other departments use Kairos?

Absolutely, and this makes Kairos unique. It helps to make sense of data regardless of its origin. Now Life Science companies gather enormous amounts of data from different sources, like EDCs and safety systems, and mostly use it to ensure compliance. However, data is much more than this.

Data analysis makes it possible to optimize processes while reducing the team’s effort and the gap between operational units. This helps to make faster and safer decisions and improves business efficiency. Life Science companies can grow their business and act proactively, not only reactively.

Although Kairos was born as a pharmacovigilance reporting and data analytics solution, it can also be used horizontally across the company. Departments from Clinical to Marketing can use it to analyse their data, create reports, and share them easily.

Kairos allows you to break down the silos between the departments.

What makes Kairos a unique solution on the market?

Kairos is the result of a synergy between teams with a decade of experience in business intelligence, data analytics, pharmacovigilance, and Life Sciences in general. It is more than just a piece of technology. Kairos includes the solution itself, consultancy, report building, and fine tuning.

Such a combination makes Kairos a powerful tool that is delivered with low license and maintenance costs.

How can you describe Kairos’ price model?

It is scalable. You start with a minimum investment in licenses and configuration. If you would like to add new reports or extend Kairos to other departments, you can easily do so.

Can Kairos be customised?

Yes, absolutely. The whole system is customisable. The first thing we do is understand our client’s business needs and the type of reporting required. Then we can customize Kairos based on these expectations.

The next step is to give the clients detailed training. We want them to be completely independent and able to change the format, design or data themselves, to fit their organisation.

How does Kairos fit the bigger Arithmos goals?

We strive to provide 360-degree support to our clients on the road to Digital Transformation. Kairos is another step towards this goal. It allows Arithmos to give more support to its clients.

We also believe in a proactive approach to safety and regulatory compliance, and Kairos contributes to it by harnessing the power of data.

Last but not least, we preach efficiency in operations and processes. Introducing a tool that operates horizontally rather than vertically saves on IT investments and optimizes the departments’ work by breaking down the silos.

Want to learn more about Kairos? Contact Arithmos now to see the demo of the solution.

How to Perform a Successful Data Migration in Life Sciences

Data Migration is defined as the process of selecting, preparing, extracting, and transforming data and permanently transferring it from one computer storage system to another.

Data migration has a reputation for being risky and difficult, and it is certainly not an easy process. It is time-consuming, with many planning and implementation steps, and there is always risk involved in projects of this magnitude.

Without a sufficient understanding of both source and target, transferring data into a more sophisticated application will amplify the negative impact of any incorrect or irrelevant data, perpetuate any hidden legacy problem and increase exposure to risk. A data migration project can be a challenge because administrators must maintain data integrity, time the project correctly to minimize the impact on the business and keep an eye on costs.

However, following a structured methodology will reduce the pain of managing complex data migration.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar was conducted by Alessandro Longoni, Arithmos Senior Project Manager & Data Analyst, and focused on the challenges in data migration and ways of overcoming them.

Continue reading to learn the tips on successful performance of data migration in Life Sciences.

Register now to get free access to webinar recording

Tip 1 – Understanding the data

Before starting the data migration, you have to prepare your data by carrying out an assessment of what is present in the legacy system, understanding clearly which data needs to be migrated, avoiding duplication, and promoting quality and standardization.

We can divide the assessment of the legacy system into two macro categories:

  • Assessment of the data meaning
  • Assessment of the data quality

Every piece of data that you move has to be validated, cleaned and transformed. In data migration projects, migrating only relevant data ensures efficiency and cost control.

Understanding how the source data will be used in the target system is necessary for defining what to migrate. It is important to look at how people are using existing data elements. Are people using specific fields in a variety of different ways, which need to be considered when mapping out the new system’s fields?

The second macro area is the assessment of data quality. It is very important to define a process to measure data quality early in the project, in order to obtain details of each single piece of data and to identify unused fields or obsolete records that may have undergone multiple migrations. It is also important to avoid migrating duplicate or irrelevant records.

The quality analysis typically leads to a data cleaning activity.

Cleaning the source data is a key element in reducing data migration effort. This is usually the client’s responsibility. However, the client can be supported by the provider, who will perform specific data extractions, aggregations and normalizations in order to reduce the client’s effort.

Another way to clean data is the adoption of migration scripts for cleaning purposes. It is important to understand that this kind of activity can create validation issues, because modifying the source data steps outside the perimeter of a pure data migration and creates potential data integrity issues.
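As a rough illustration of the quality assessment and cleaning checks discussed above, the sketch below flags fields that are never populated and exact duplicate records in a legacy extract. It is a generic example under invented assumptions: the record structure and field names are hypothetical, and a real assessment would cover many more checks.

```python
# Illustrative legacy-data quality assessment: report unused fields and
# exact duplicate records, two issues that commonly drive cleaning work.

def assess_quality(records):
    """Return unused fields and the duplicate count for a legacy extract."""
    if not records:
        return {"unused_fields": [], "duplicate_count": 0}

    # A field is "unused" when it is empty/missing in every record
    fields = records[0].keys()
    unused = [f for f in fields
              if all(not rec.get(f) for rec in records)]

    # A record is a duplicate when its full content was already seen
    seen, duplicates = set(), 0
    for rec in records:
        fingerprint = tuple(sorted(rec.items()))
        if fingerprint in seen:
            duplicates += 1
        else:
            seen.add(fingerprint)

    return {"unused_fields": unused, "duplicate_count": duplicates}
```

Running such a report before mapping begins shrinks the migration scope and gives the data cleaning activity a concrete worklist.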

Tip 2 – Project Governance

The best way to approach a data migration project is to clearly define roles and responsibilities and avoid overlapping accountability. This can be done in several steps:

  • Define the owner of the data in the target system
  • Include the business users in decision-making. They understand the history, the structure and the meaning of the source data. Additionally, if business users are engaged in the data migration project, it will be easier for them to interact with migrated cases after GoLive.
  • Rely on data migration experts. Each data migration requires assistance from experts, who can fill the gap between business and IT, where both are key stakeholders but are often unable to understand each other.

Based on our experience, what makes the difference is the presence of a business analyst: a person who acts as a bridge between the technical staff implementing the migration and the businesspeople. The business analyst can explain technical requirements in a clear way, which really helps the business define the migration rules based on how the target system will use the migrated data.

Tip 3 – Roll back & Dry Run

A roll-back strategy has to be put in place in order to mitigate the risk of potential failures. Access to source data has to be read-only; this prevents any kind of data modification and ensures its integrity. Backups of the target system have to be performed in order to restore it in case of failure.

An accurate data migration dry run makes it possible to execute the validation and production migrations without incidents or deviations. Procedures and processes have to be tested in order to check the completeness of the records and to ensure integrity and authenticity in accordance with the data migration rules and purposes.

Tip 4 – The Importance of the Data Mapping Specification Document

The Data Mapping Specification document is the core of a data migration. It ensures complete field mapping and is used to collect all mapping rules and exceptions.

This project phase is usually long and tiring for a number of reasons:

  • Volume and amount of data details
  • Technical activity with technical documents
  • Little knowledge of dynamics of target database
  • Compromises that have to be made

The Data Mapping Specification document details all the rules related to the data being migrated. The following tips can help you draft it in the most efficient way:

  • Clarify what has to be migrated and what shouldn’t be
  • Clean the source data – this will reduce the number of fields to migrate
  • Liaise with a business analyst who will translate technical requirements and help explain how the data will work in the target system
  • Rely on data migration experts who have already performed similar migrations in the past
  • Avoid using an Excel sheet for mapping table fields – a more visual document with pictures will ease the conversation with the business users
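One way to keep mapping rules and exceptions testable rather than buried in a spreadsheet is to express them as executable data. The sketch below is purely illustrative: the source and target field names and the value map are invented, and a real specification would cover many more rule types (defaults, concatenations, date reformatting, and so on).

```python
# Hypothetical field-mapping table: each legacy field maps to a target
# field plus an optional value map. Fields absent from the table are
# intentionally not migrated (see "clarify what shouldn't be migrated").

FIELD_MAP = {
    "PAT_SEX":  ("patient_gender", {"M": "Male", "F": "Female"}),
    "RPT_DATE": ("report_date", None),  # copied as-is
}

def apply_mapping(source_record, field_map=FIELD_MAP):
    """Translate one legacy record into the target layout, applying
    per-field value maps and dropping unmapped fields."""
    target = {}
    for src_field, (dst_field, value_map) in field_map.items():
        if src_field not in source_record:
            continue
        value = source_record[src_field]
        if value_map is not None:
            # Exception handling: unknown codes pass through unchanged
            value = value_map.get(value, value)
        target[dst_field] = value
    return target
```

Keeping the rules in this form means the same artifact drives both the migration scripts and the documentation reviewed with business users.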

Tip 5 – Perform comprehensive validation testing

To ensure that the goals of the data migration strategy are achieved, a company needs to develop a solid data migration verification process. The data migration verification strategy needs to include ways to prove that the migration was successfully completed, and data integrity was maintained.

Tools or techniques that can be used for data migration verification include source vs target data integrity checks. A combination of manual and automatic checks can be used to cover all verification needs. The verification can include qualitative and quantitative checks.

In order to carry out data verification, migrated data must be processed through a workflow. This ensures that the migrated data interacts properly with the target system’s functionalities.

The sampling strategy is defined in the data migration plan and should be driven by the risk impact of the data. 100% sampling is neither feasible nor necessary, therefore standards such as ANSI/AQL are usually used to define a risk-based sampling strategy.
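A simplified sketch of the source-vs-target checks described above combines a quantitative record count with a qualitative content comparison via hashing on a sample of records. The deterministic slice used for sampling here is only a stand-in for a proper risk-based AQL plan, and the record layout and `case_id` key are hypothetical.

```python
# Hedged illustration of migration verification: count check plus
# content check on sampled records via stable hashing.
import hashlib

def record_hash(record):
    """Stable fingerprint of one record's content."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode()).hexdigest()

def verify_migration(source, target, key="case_id", sample_every=1):
    """Quantitative check: record counts match.
    Qualitative check: sampled source records exist in the target
    with identical content. Returns (counts_match, failed_keys)."""
    counts_match = len(source) == len(target)
    target_by_key = {rec[key]: rec for rec in target}
    failures = []
    for rec in source[::sample_every]:
        twin = target_by_key.get(rec[key])
        if twin is None or record_hash(rec) != record_hash(twin):
            failures.append(rec[key])
    return counts_match, failures
```

In practice the manual checks mentioned above would complement this automated pass, for example by reviewing sampled cases on screen in the target application.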

This article is based on a webinar presented by Alessandro Longoni on May 14. Register today to get access to the webinar recording.

About Alessandro Longoni, Senior Project Manager & Business Analyst

Alessandro Longoni is a Senior Business Analyst & Project Manager at Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Learn more about Alessandro Longoni, his career path and current role in Arithmos in our “Career Insights” section.

Arithmos and E-project Launch Kairos, Data Analytics and Reporting Solution for Pharmacovigilance

Verona, Italy, May 26th, 2020 – Arithmos, a provider of innovative IT solutions for Life Sciences, together with its partner E-project, a consulting and system integration company, announced the launch of Kairos, an innovative Data Analytics and Reporting solution for pharmacovigilance.

Kairos gives Life Sciences companies full visibility of their safety data with real-time reports and dashboards, easily ensures regulatory compliance and creates value from the advanced analysis of their safety data.

Born as a platform for the pharmacovigilance department, Kairos can also serve as a horizontal solution that fosters collaboration between departments and gives a company a bigger picture of its data. The innovative solution helps Life Sciences companies make faster and more robust decisions based on data analysis and visualisation, safeguarding patients’ health more efficiently.

The capabilities of Kairos include:

  • Minimised risks as clear data visualisation makes complex pharmacovigilance data easier to grasp and act on
  • Improved decision making, as data can be easily combined, loaded, and visualised
  • Reduced investments in reporting across the business, as the same tool can be adopted by multiple departments

Features:

  • System agnostic – can be connected to any data source
  • Flexible reporting system – creation and fine tuning of any reports
  • Visual – the solution gives clear visual representation of the data in multiple formats

“We have been providing pharmacovigilance technology for 10 years and we know the most common pains of pharmacovigilance teams. Limited reporting capability is one of them and Kairos solves it by offering a whole range of advanced reports and data visualization dashboards”, said Arithmos Managing Director Silvia Gabanti. “We also believe in lean operations and processes, so we wanted to introduce a solution that can be versatile, and be used across the departments, breaking down the silos between them and allowing more collaboration”.

“At E-project, we are committed to applying our expertise in process engineering and data analysis to help our customers harness the power of data. Kairos empowers Life Science organisations to make better decisions daily and become truly data-driven”, said Massimo Businaro, president of E-project.

Contact Arithmos now to see the demo

Arithmos is part of PM Holding, a group of companies operating in the Life Sciences sector. PM Holding provides a complete platform of products and services for end-to-end drug and device development.

E-project is a consulting and system integration company founded in 2001 by professionals from the world of consulting. In its work, E-project addresses issues related to the re-engineering of business processes of customers.

Francesco Danzi
Inside Sales Associate
francesco.danzi@arithmostech.com

Interview with Paolo Morelli: Artificial Intelligence in Clinical Research

On the 20th of May Paolo Morelli, CEO of Arithmos, joined the Scientific Board of Italian ePharma Day 2020 to discuss the growing role of the new technologies in clinical trials. We have taken this opportunity to talk to him about one of the most debated technologies of the last few years – Artificial Intelligence (AI) – to understand how it is currently used in the industry and what future it has.

There is so much discussion around AI right now. What are the biggest doubts related to it?

There are a number of questions that we are asking ourselves now. Here are just a few examples:

  • Will AI change and automate processes in clinical research, making it more efficient?
  • Will researchers have to change the way they do their work?
  • Will pharma companies have to reorganize their R&D departments?
  • Will the approach to the training of clinical research personnel change?
  • Will patient privacy be protected?

Certainly, companies performing clinical trials are increasingly interested in AI. They want to have data of higher quality, perform recruitment faster, and, in general, be more efficient. Also, when it comes to the financial side of the projects, the journey has just begun, and a lot is yet to come.

What does the notion of AI encompass?

I suggest we start with what AI is not. For example, automation is not AI. Automation is the use of machines that follow a precise program defined by human intelligence. Artificial intelligence (AI), sometimes called machine intelligence, stands in contrast to human intelligence (HI). We can define AI as any device that collects information and takes actions that maximise its chance of successfully achieving its goals.

Is AI already used in clinical trials?

Yes, I can outline two of the biggest areas where it is already in use. The first is the use of AI over eSource/mSource data to support decision making (e.g. feasibility, communication with the patient), and the second is the use of AI to reduce data entry and the burden of paper documentation (e.g. eConsent, regulatory document completion, SDV).

How will the use of AI expand in the future? Where else can we apply AI in clinical trials?

AI will certainly allow continuous communication with the patient, even though the human touch cannot be replaced by technology. I also believe that, at some point, SDV will be fully automated thanks to NLP technologies. With a proper training data set, AI will be more successful at detecting data issues than human intelligence.

But even if patients and the industry are typically open to innovative solutions, regulators will play a big role in the spread of innovation, and they can support it by setting up a modern regulatory framework.

Once AI becomes an integral part of the clinical trials industry, what will be the potential benefits for patients?

The combination of AI with scientific content will support better communication with the patient: eConsent will allow a better understanding of the texts compared to the paper version of the Informed Consent. AI will answer questions that subjects may have about the study, and patients will be better informed about how their data is treated. I believe that we will see a bigger presence and influence of patients in clinical research. It will be important, though, to overcome privacy challenges before introducing the technology, and to inform patients about the use of all their data.

Do you think AI will replace human professionals in the clinical trials of the future?

The omnipresence of overly abundant data (both clinical data and project data) can be successfully triaged and managed with the assistance of AI. And yet, it is HI (human intelligence) that is needed to make the final decision from the options offered. Think about our air traffic control system, highly automated, advanced technology, and yet there remains the need for HI to make the final decisions regarding all these simultaneous active flights (projects).

What are the aspects that can determine the success of AI in clinical trials?

There are three main aspects that will determine the success of AI in Clinical Trials. Availability of the technology is not one of them.

One critical aspect is the availability of System Integrators as a bridge between technology and researchers. Their goal will be to understand clinical processes and to shape all this new technology around the clinical trial activities.

The second important aspect will be the ethics, morals, and governance around AI and ML. Regulators and Life Science companies will need to set up a framework before going too far down the road of AI.

And the last thing is data. AI needs data. Accurate data. Lots of it. We need to start collecting organised data and train AI before it can successfully achieve the desired result of making the clinical trials faster, more efficient, and ensuring higher quality.

At Arithmos, have you already started exploring such innovative technologies?

Arithmos, like the other PM Holding companies, has innovation as a core value; it is at the heart of everything we do. Looking at the impact of innovative technology on the Life Sciences industry, I believe that investing in this field was the right choice.

At Arithmos we are already supporting our clients with the digital technologies that optimise clinical trial processes and increase patient centricity, ensuring that patients get an active role in the clinical trials.

Would you like to learn how Arithmos can support you in innovating your clinical trials? Contact us by clicking here.

Interested in more materials on Digital Transformation?

Career Insights: Interview with Marco Pastore, Business Analyst & Project Manager

In this new blog from the ‘Career Insights’ series we talk to Marco Pastore, Arithmos’ Business Analyst and Project Manager, to find out more about his background, career path, and interests outside of work.

Marco Pastore Business Analyst & Project Manager

Have you always been passionate about the Life Sciences field?

I developed an interest in this industry a few years ago. I hold a degree in Computer Engineering, and people with this background mainly work in the banking or insurance industries. I wanted to work in a very dynamic and fast-growing industry, and the pharmaceutical industry seemed like a perfect option. It was also a great opportunity not only to do my job, but also to help patients achieve better and healthier lives.

What has been your journey to your current role at Arithmos?

After graduation I concentrated on cutting-edge technologies. I worked as a System Integration Specialist in one of the leading international technological service providers. I also worked for an industrial automation company where I had the opportunity to explore the world of the Internet of Things and Industry 4.0.

After that I started my journey in the Life Sciences by joining a company that produced pharmaceutical ingredients. There I was appointed National Coordinator of Telemetry and explored the world of healthcare by working closely with hospitals. This was a great way to obtain industry-specific knowledge of validation processes and regulations for specialised software. This experience came in very handy when I joined Arithmos!

My meeting with Arithmos happened when I decided to look for a job that would give me the possibility to mix business and technology approaches. I came in for the interview and loved the Arithmos spirit! What impressed me the most was the expertise of my potential colleagues and the structured approach to projects. I knew that in this way we could deliver superior services with clear objectives and deadlines.

How did you identify Project Management as the right career for you?

I was looking for a career path that would allow me not only to perform operational tasks, but also to be engaged in the business side of the project. Project Management gives me a 360-degree view of the project, both technical and business, so I have a greater possibility to influence the outcome of the project, offer new solutions, and be innovative. I love the innovation part, because my technical background allows me to come up with new ideas for our clients.

Why were you assigned a dual role of Business Analyst and Project Manager?

Ideally, the first step in the collaboration with a client is understanding their needs and circumstances and, based on these, suggesting a solution. This work is done by a Business Analyst. Once the collaboration starts, each client is assigned a Project Manager who makes sure that the project is conducted timely, efficiently, and according to plan.

At Arithmos we have opted to introduce the joint position of Project Manager & Business Analyst. This ensures that the client does not have to deal with a change of reference person, and it makes the collaboration smoother and more convenient for the client.

What most inspires you about working within this field?

The possibility to work with clients, as one team, and bring to life complex projects that improve the way clinical trials are managed. Also, the Life Sciences industry is constantly evolving, so there is a lot of space for growth and innovation!

What are the top skills that a Project Manager / Business Analyst in Life Sciences should develop?

  • Analytical skills – you need to have a structured approach to the tasks and know how to step back, see the whole picture, and analyse the situation.
  • Attention to detail – as a Project Manager you need to keep track of multiple tasks and liaise with multiple stakeholders at the same time.
  • Technical knowledge – although having a technical background is not obligatory, it helps ensure a high level of quality for the project. You also need to stay at the forefront of the newest developments in the technology field.

What are your personal values?

Reliability – if you have taken on a task, you need to deliver it, because if one piece is missing from a project, it puts the rest of the work at risk. I also believe that things should be done with passion. Loving what you do allows you to always strive for excellence.

What are your main interests outside of work?

I am addicted to technology! I have a whole collection of gadgets. Most of them are related to sports, like smart watches and activity trackers. I have used them a lot during my training, as I am very keen on marathon running and triathlons. For example, I have done a half Ironman triathlon and have run the full Valencia and Padova marathons.

Of course, I must mention that I am also a cat fan and I have 4 lovely kitties at home.

Arithmos careers

We are always looking for talented and motivated professionals ready to join us. Would you like to be part of our successful team? To find out more about our job openings click here.

 

Webinar Q&A: How to Perform a Successful Data Migration in Life Sciences

In the Life Sciences industry, a structured but lean approach to data migration is vital for success. Such an approach not only ensures data integrity and consistency, but also minimises the impact of the data migration on day-to-day business operations.

On May 14, Arithmos hosted a complimentary webinar on performing a successful data migration in Life Sciences. The webinar described the main risks and challenges of the data migration, ways to overcome them, and presented a case study of data migration in pharmacovigilance.

The webinar was conducted by Alessandro Longoni, Senior Business Analyst & Project Manager at Arithmos. He has been working in the field of Life Sciences technology for 13 years and has particular expertise in pharmacovigilance and electronic data management.

Continue reading to discover the questions from the webinar’s Q&A session.

Register today to get access to the webinar recording

During a pharmacovigilance migration, should only the last version of the cases be migrated, or should we migrate all the versions?

It depends on the legacy and the target system. Some classical safety systems save each version of each case as an independent case.

If you use an ETL tool to migrate to a new system that does not operate according to this logic, we would advise migrating only the last version of each case in order to avoid duplicate cases. It is also advisable to migrate the case history (case workflow) if possible.

However, it is always technically possible to migrate all the versions of a case if you find the right workaround, for example, using the following naming format: <case_number>_<caseversion>.

The best practice in data migration is adopting the E2B XML approach: regardless of the number of case versions, only the last version is exported from the legacy system.
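As a rough illustration of the deduplication rule above, here is a minimal Python sketch that keeps only the latest version of each case and also shows the naming workaround for migrating every version. The field names case_number and case_version are illustrative only, not the schema of any real safety system:

```python
from collections import defaultdict

def latest_versions(cases):
    """Group exported cases by case number and keep only the highest version."""
    by_number = defaultdict(list)
    for case in cases:
        by_number[case["case_number"]].append(case)
    return [max(versions, key=lambda c: c["case_version"])
            for versions in by_number.values()]

def workaround_id(case):
    """Unique identifier used when *all* versions of a case must be migrated."""
    return f'{case["case_number"]}_{case["case_version"]}'

cases = [
    {"case_number": "IT-001", "case_version": 1},
    {"case_number": "IT-001", "case_version": 2},
    {"case_number": "IT-002", "case_version": 1},
]
print(len(latest_versions(cases)))   # one entry per case number
print(workaround_id(cases[1]))       # IT-001_2
```

The same grouping logic would typically live inside the ETL tool’s transformation step rather than in a standalone script.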

What steps can be omitted in a semi-automated data migration compared to an automated one?

Data migration, regardless of the approach used (ETL, XML import, re-entry), must be included in a validation exercise.

The main steps and deliverables are the same regardless of the approach. However, the documents and activities in semi-automated and automated data migration projects require different accuracy and effort:

  • Data Migration Plan: it should include the strategy of the migration project and the main rules for data migration. This covers the mapping needs for re-entry; in the case of an ETL tool, a dedicated data mapping document is highly suggested.

In the case of XML migration, the format depends on the number of cases and their complexity (due to differences between the configurations of the target and legacy systems). For simple migrations with a low number of cases, the data migration plan may be sufficient. For complex migrations with a high number of cases, we suggest a dedicated data mapping document.

  • Data Mapping Specifications: see above
  • Data Migration Validation: a protocol that must be executed in the validation environment to verify the migration process. It is more extensive in the case of ETL, but we suggest using it regardless of the data migration approach.
  • Data Migration Production: the execution of the protocol described above, conducted to migrate data in production. In the case of an ETL approach, this phase requires 2-5 days of technical activities plus 1-2 days of business data verification (a production downtime period). In the case of an XML or re-entry process, this phase can take up to several months of business work (with no production downtime period).
  • Data Migration Report

When you access the source data in read-only mode, is it possible to also access the files attached to the ICSR such as publication (in PDF format) or ACK from Regulatory Authorities (in XML format)? / How can we migrate attachments from one system to another? Should the client send the attachments separately from the XML files if the migration is performed by import/export migration?

Yes, it is possible. Of course, this possibility depends heavily on the structure of the legacy system database. Sometimes, technology constraints don’t allow you to extract certain information, such as attachments or the audit trail.

If the legacy system stores the attachments in the database with proper references to the relevant case, and if the target system supports the case attachment functionality, the best approach would be to perform the migration using an ETL tool.
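To illustrate the idea, here is a minimal, hypothetical Python sketch of reading attachments from a legacy database in read-only fashion. The table and column names are assumptions for illustration only, not the schema of any real safety system:

```python
import sqlite3

def extract_attachments(conn, case_number):
    """Read attachment blobs for one case from a legacy database."""
    rows = conn.execute(
        "SELECT file_name, content FROM attachments WHERE case_number = ?",
        (case_number,),
    ).fetchall()
    return [{"case_number": case_number, "file_name": name, "content": blob}
            for name, blob in rows]

# Demo on an in-memory database standing in for the legacy system
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE attachments (case_number TEXT, file_name TEXT, content BLOB)")
conn.execute("INSERT INTO attachments VALUES ('IT-001', 'publication.pdf', X'255044462D')")
print([a["file_name"] for a in extract_attachments(conn, "IT-001")])  # ['publication.pdf']
```

In a real project the extracted blobs would then be pushed through the ETL tool to the target system’s attachment store, keyed by the migrated case.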

If I have an E2B XML file as source of a case recorded in an Excel database, can I use it as a source for the migration into the new database in order to avoid doing a migration manually from the old database?

The standard approach to data migration is migrating data from the legacy system to the target system, not from the original source to the target. Nevertheless, a different approach can be chosen if it grants better data quality (as in the question) and efficiency. This approach must be described and justified in the Data Migration Plan. We also advise carrying out a final verification to detect any discrepancy between the migrated data and the data stored in the legacy system.
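The final verification mentioned above can be sketched as a simple field-by-field comparison. The structure below (case lists keyed by case_number) is purely illustrative:

```python
def find_discrepancies(legacy, migrated, fields):
    """Compare selected fields case by case between legacy and migrated data."""
    migrated_by_id = {c["case_number"]: c for c in migrated}
    issues = []
    for case in legacy:
        target = migrated_by_id.get(case["case_number"])
        if target is None:
            issues.append((case["case_number"], "missing in target system"))
            continue
        for field in fields:
            if case.get(field) != target.get(field):
                issues.append((case["case_number"], f"mismatch in '{field}'"))
    return issues

legacy = [{"case_number": "IT-001", "onset_date": "2020-01-10"},
          {"case_number": "IT-002", "onset_date": "2020-02-03"}]
migrated = [{"case_number": "IT-001", "onset_date": "2020-01-10"}]
print(find_discrepancies(legacy, migrated, ["onset_date"]))
# [('IT-002', 'missing in target system')]
```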

How can you calculate the number of records from the source and target systems that need to be validated?

An ETL tool can reduce the need for data verification, especially because a migration script works on specific fields independently of the number of records to be migrated. For example, if my script correctly moves the laboratory data, the number of tests executed by a specific patient is not relevant. This ensures the quality of the migrated data.

However, a minimum quality check of the fields has to be performed, based on the recommendations of your CSV manager and on the risk assessment.

A numeric check of 100% of the records moved is always required to ensure that there were no exceptions during the script run.
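A minimal sketch of such a 100% numeric check, assuming per-table record counts have already been collected from both systems (the table names and counts below are invented for the example):

```python
def reconcile_counts(source_counts, target_counts):
    """Per-table numeric check: report any table whose record count differs."""
    return {table: (n, target_counts.get(table, 0))
            for table, n in source_counts.items()
            if n != target_counts.get(table, 0)}

source = {"cases": 1250, "lab_tests": 8400}
target = {"cases": 1250, "lab_tests": 8399}
print(reconcile_counts(source, target))  # {'lab_tests': (8400, 8399)}
```

An empty result means 100% of the records arrived; any entry flags a table where the script run must be investigated.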

This article is based on a webinar presented by Alessandro Longoni on 14th May 2020. Register today to get access to the webinar recording.

Webinar Q&A: Agatha – Quality and Content Management Solution

Biotechnology, pharmaceutical, and medical device companies develop highly complex products. For them, accuracy, consistency, efficiency, and quality are not goals; they are imperatives because the medicines, therapies, and devices they make can improve the quality of patients’ lives and safeguard their health.

On April 17, Arithmos and its partner Agatha Inc. hosted a complimentary webinar on Agatha, a cloud-based Quality and Content Management tool for Life Sciences and healthcare organizations. Agatha addresses these imperatives in core business processes such as managing clinical trials, optimising quality processes, and organising regulatory submissions. The system is highly configurable, allowing for tailored customizations that fit the company’s workflow.

The webinar was conducted by:

  • Silvia Gabanti, Managing Director of Arithmos. Silvia has an impressive 15-year track record in the industry. She began her career in the CRO environment, where she developed experience with applications for pharmacovigilance and clinical trials. She also has extensive experience with Oracle applications: as an analyst she worked with pharmacovigilance and clinical systems such as Oracle Clinical.
  • Guillaume Gerard, Chief Operating Officer of Agatha Inc. For the past 15 years Guillaume has been helping Life Sciences organizations worldwide deliver cloud-based content management applications. His areas of expertise include the compliance aspects and architecture of such systems, as well as functional expertise on the clinical document management side (Trial Master File) and Quality Management.

Continue reading to discover the questions from the webinar’s Q&A session.

Register today to get access to the webinar recording

What is Agatha’s policy for data backup?

Agatha handles all aspects of data center operations, including regular backups of all data. All customer data is backed up daily, and daily backups are retained for 15 days.

It also gives customers the ability to export all their data (files, audit logs, metadata) at any time.

Is Agatha compliant with GDPR?

GDPR is a comprehensive privacy regulation. Agatha is fully compliant with its requirements with regard to how it stores data and collects information. Some aspects of GDPR are the responsibility of the business entity, for example, naming a Privacy Officer.

Are document lifecycle statuses customizable?

Yes, lifecycle stages and other aspects of the review and approval workflows can be fully configured. That includes changing labels, adding steps, reordering steps, and changing workflows from serial to parallel structures.

Are electronic signatures compliant with 21 CFR Part 11?

Yes, electronic signatures within Agatha are fully compliant with health authority requirements. Beyond signatures, Agatha is fully compliant with the FDA’s 21 CFR Part 11. We maintain a 21 CFR Part 11 compliance checklist for every release of our service, and any customer can receive a compliance letter that can be used during audits.

Are activities tracked in an audit trail?

Yes, all activities are captured. This includes operations on documents as well as access and any change of settings. When an audit is performed, everything that has been collected in the audit trail process and stored can be provided directly to the auditor via a login with auditor access.

Is it possible to load data via an API from external applications (e.g. eTMF, CTMS, EDC, …), and if so, what kind of data can be loaded: only documents, or also XML, JSON, and XLS files?

It is possible – there is a full set of APIs available that allows data to be loaded, and external applications to be integrated.

Can documents have an expiration date?

Yes, documents can have an expiration date. This can be done using a “valid until” metadata field, and processes can be triggered based on that metadata.
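As a rough illustration of how such a check might work (the valid_until field name and ISO date format below are assumptions for the example, not Agatha’s actual metadata model):

```python
from datetime import date

def expired_documents(documents, today):
    """Return documents whose 'valid_until' date has passed."""
    return [d for d in documents
            if date.fromisoformat(d["valid_until"]) < today]

docs = [{"name": "SOP-001", "valid_until": "2020-03-31"},
        {"name": "SOP-002", "valid_until": "2021-03-31"}]
print([d["name"] for d in expired_documents(docs, date(2020, 5, 1))])  # ['SOP-001']
```

A scheduled process could run a query like this daily and trigger a review workflow for every expired document it finds.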

Could Agatha be used during the product development stages?

Absolutely. Many aspects of the Agatha solution are appropriate prior to the clinical trial phase, for example management of SOPs and collection of regulatory documentation.

Can Agatha manage the migration of legacy documents?

Yes, most projects include migration steps. There is an import tool that lets the client map data from a source system to Agatha and complete the import.

Is it possible to enable two-factor authentication (e.g. username & password plus SMS) in Agatha application?

Yes, two-factor authentication is the standard model for the use of Agatha. It is enabled on a per-client basis.

Are there any penetration tests that could be shared with customers?

Yes, as part of the Agatha hosting service, penetration tests are conducted yearly. Reports and results are made available to clients during audits.

During the webinar it was mentioned that Agatha is cost effective. How does its price compare to the competition?

Agatha provides the best value among similar products because, as a ready-to-use and pre-validated system, it is less expensive to bring into production and to on-board users. It also has lower subscription prices, because it does not operate on top of another product platform. And it has truly packaged modules, as shown during the webinar. Typically, it is two times more affordable than the competition.

If you are interested in a specific quote, contact us for a brief conversation. We will make sure we understand the specific functionality you need and the number of users, and we will provide the best offer.
